Information Theory • Entropy: H(X) = -\sum_x p(x) \log p(x) • Conditional Entropy: H(X \mid Y) = -\sum_{x,y} p(x,y) \log p(x \mid y) • Mutual Information: I(X;Y) = H(X) - H(X \mid Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}
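A minimal numerical sketch (not from the original slides; the joint distribution p_xy below is an invented example) showing how the three quantities are computed for a discrete pair (X, Y) and how they relate:

import numpy as np

p_xy = np.array([[0.30, 0.10],
                 [0.15, 0.45]])                     # assumed joint p(x, y); rows = x, columns = y

p_x = p_xy.sum(axis=1)                              # marginal p(x)
p_y = p_xy.sum(axis=0)                              # marginal p(y)

def entropy(p):
    # H(P) = -sum p log2 p, skipping zero-probability entries
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_x = entropy(p_x)                                  # H(X)
H_x_given_y = entropy(p_xy.ravel()) - entropy(p_y)  # H(X | Y) = H(X, Y) - H(Y)
I_xy = H_x - H_x_given_y                            # I(X; Y) = H(X) - H(X | Y)

print(f"H(X) = {H_x:.3f} bits, H(X|Y) = {H_x_given_y:.3f} bits, I(X;Y) = {I_xy:.3f} bits")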
Optimal Sensor Parameter Selection • MMI: Maximum Mutual Information
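In symbols, the MMI criterion selects the sensor parameter that maximises the mutual information between the state and the resulting observation. This is a standard formulation; the symbols (X for the state, Z for the observation, \rho for the sensor parameter) are assumed here rather than taken from the original slides:

\rho^{*} = \arg\max_{\rho} I(X; Z \mid \rho) = \arg\max_{\rho} \left[ H(X) - H(X \mid Z, \rho) \right]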
Problem • Need to learn: the observation model, i.e. how observations depend on the state and the sensor parameters • Need to solve: the argmax problem, i.e. find the sensor parameters that maximise the mutual information
Observation Model • Can be learnt over many experiments • Or, modelled by a recognition system
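A minimal sketch of the first option, learning the observation model by frequency counting over repeated experiments. The state/observation space sizes, the helper learn_observation_model and the simulated trials are illustrative assumptions, not taken from the original work:

import numpy as np

NUM_STATES, NUM_OBS = 3, 4                       # assumed sizes of the state and observation spaces

def learn_observation_model(trials):
    # Estimate p(z | x, rho) for one fixed sensor parameter rho from a list of
    # (state, observation) pairs recorded in repeated experiments.
    counts = np.ones((NUM_STATES, NUM_OBS))      # start at 1 (Laplace smoothing) to avoid zeros
    for x, z in trials:
        counts[x, z] += 1
    return counts / counts.sum(axis=1, keepdims=True)   # normalise each row into a distribution

# Simulated example: 500 trials recorded for one parameter setting
rng = np.random.default_rng(0)
trials = list(zip(rng.integers(NUM_STATES, size=500), rng.integers(NUM_OBS, size=500)))
p_z_given_x = learn_observation_model(trials)    # each row sums to 1
print(p_z_given_x)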
Solve argmax problem • The MI integral is difficult to compute: I(X; Z \mid \rho) = \iint p(x, z \mid \rho) \log \frac{p(x, z \mid \rho)}{p(x)\, p(z \mid \rho)} \, dz \, dx • Discretise • Or, use Monte Carlo methods to estimate it • Even if we can compute the MI, we still need to maximise it over the sensor parameters • Local maxima are possible
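A sketch of both difficulties, under an assumed toy observation model (the prior p_x, the function obs_model and all numbers below are illustrative, not from the original work): the MI is estimated by Monte Carlo sampling, and the maximisation is done by brute force over a discretised grid of candidate parameters, which sidesteps rather than solves the local-maxima issue:

import numpy as np

rng = np.random.default_rng(0)
p_x = np.array([0.7, 0.3])                          # assumed prior over a binary state X

def obs_model(rho):
    # Toy p(z | x, rho): a binary sensor whose reliability peaks near rho = 1.
    r = 0.5 + 0.45 * np.exp(-(rho - 1.0) ** 2)      # probability that z equals x
    return np.array([[r, 1.0 - r],                  # row x = 0
                     [1.0 - r, r]])                 # row x = 1

def mi_monte_carlo(rho, n_samples=20_000):
    # Estimate I(X; Z | rho) = E[ log2( p(z | x, rho) / p(z | rho) ) ] by sampling.
    p_zx = obs_model(rho)
    p_z = p_x @ p_zx                                # marginal p(z | rho)
    xs = rng.choice(2, size=n_samples, p=p_x)       # sample states from the prior
    zs = (rng.random(n_samples) < p_zx[xs, 1]).astype(int)   # sample z given each x
    return float(np.mean(np.log2(p_zx[xs, zs] / p_z[zs])))

# Discretise the parameter space and keep the grid point with the largest estimated MI.
grid = np.linspace(-2.0, 4.0, 25)
mi_values = [mi_monte_carlo(r) for r in grid]
best_rho = grid[int(np.argmax(mi_values))]
print(f"estimated best rho ~ {best_rho:.2f}")

A gradient-based optimiser could replace the grid search, but it would need several restarts because of the local maxima noted above.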
Experimental Results • [Figure: plots of the MI and the maximum-MI selection, not reproduced here]