Hypothesis Testing
Hypothesis testing is a statistical method for making decisions from experimental data. A hypothesis is an assumption we make about a population parameter, and the test uses the observed data to decide whether that assumption should be accepted or rejected.
Binary and M-ary Hypothesis Testing
Detection problems can usually be cast as binary or M-ary hypothesis testing problems.
Binary hypothesis testing decides between two hypotheses based on a random observation. The goal of M-ary hypothesis testing is to decide among M possible hypotheses.
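As a rough illustration (not part of the course material), the sketch below decides among M candidate Gaussian means by choosing the hypothesis that makes the observation most likely; with M = 2 it reduces to the binary case. The candidate means, noise variance, and observation values are assumptions made for the example.

```python
import math

def gaussian_log_likelihood(x, mean, sigma=1.0):
    """Log-likelihood of a single observation x under N(mean, sigma^2)."""
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mean)**2 / (2 * sigma**2)

def m_ary_decision(x, candidate_means):
    """Pick the index of the hypothesis whose mean makes x most likely."""
    scores = [gaussian_log_likelihood(x, m) for m in candidate_means]
    return max(range(len(scores)), key=scores.__getitem__)

# Binary case (M = 2): decide between mean 0 and mean 1.
print(m_ary_decision(0.8, [0.0, 1.0]))            # -> 1

# M-ary case (M = 4): decide among four candidate means.
print(m_ary_decision(2.3, [0.0, 1.0, 2.0, 3.0]))  # -> 2
```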
Estimation
Estimation is concerned with inferring the numerical values of unknown population quantities from incomplete data, such as a sample.
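As a minimal illustrative sketch (the sample size, true mean, and noise level are assumptions, not course values), the sample mean below serves as a point estimate of an unknown population mean.

```python
import random

# Hypothetical setup: the true mean is unknown to the estimator;
# we only get to see a finite sample drawn from the population.
random.seed(0)
true_mean = 5.0
sample = [random.gauss(true_mean, 2.0) for _ in range(1000)]

# Sample mean as a point estimate of the population mean.
estimate = sum(sample) / len(sample)
print(f"estimate of the mean from {len(sample)} observations: {estimate:.3f}")
```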
Log-Likelihood Ratio Test
The log-likelihood ratio test is a statistical test that assesses the relative fit of two models by comparing the logarithm of the ratio of their likelihoods to a threshold.
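A minimal sketch of the idea, assuming two fully specified Gaussian models with known variance; the observations and the zero threshold are illustrative choices, not taken from the course.

```python
import math

def log_likelihood(data, mean, sigma):
    """Total Gaussian log-likelihood of the data under N(mean, sigma^2)."""
    return sum(
        -0.5 * math.log(2 * math.pi * sigma**2) - (x - mean)**2 / (2 * sigma**2)
        for x in data
    )

data = [0.9, 1.4, 0.7, 1.1, 1.3]          # illustrative observations

# Log-likelihood ratio between model 1 (mean 1) and model 0 (mean 0).
llr = log_likelihood(data, 1.0, 1.0) - log_likelihood(data, 0.0, 1.0)
decision = 1 if llr > 0 else 0            # threshold 0: pick the better-fitting model
print(f"log-likelihood ratio = {llr:.3f}, decide model {decision}")
```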
Neyman-Pearson Lemma
The Neyman-Pearson Lemma states that, for a fixed false-alarm (significance) level, the likelihood ratio test is the most powerful test for deciding between two simple hypotheses. It is therefore a way to check whether the hypothesis test you are using is the one with the greatest statistical power.
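The sketch below illustrates the Neyman-Pearson construction for a single Gaussian observation: fix the false-alarm probability, set the matching threshold, and compare. The two hypotheses, the target false-alarm rate, and the tested value are assumptions made for the example.

```python
from statistics import NormalDist

# H0: x ~ N(0, 1) (noise only), H1: x ~ N(1, 1) (signal present).
# For Gaussian hypotheses with equal variance, the likelihood ratio test
# reduces to comparing the observation x to a threshold.
alpha = 0.05                                           # assumed target false-alarm probability
threshold = NormalDist(0.0, 1.0).inv_cdf(1 - alpha)    # P(x > threshold | H0) = alpha

def neyman_pearson_decide(x):
    """Most powerful test of size alpha for these two simple hypotheses."""
    return 1 if x > threshold else 0

detection_prob = 1 - NormalDist(1.0, 1.0).cdf(threshold)   # power of the test under H1
print(f"threshold = {threshold:.3f}, detection probability = {detection_prob:.3f}")
print(neyman_pearson_decide(1.8))   # -> 1 (observation exceeds the threshold)
```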
Kullback-Leibler Divergence
Kullback–Leibler Divergence is a way to measure the difference between two probability distributions.
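A small sketch for two discrete distributions; the distributions themselves are made up for illustration.

```python
import math

def kl_divergence(p, q):
    """D(P || Q) in nats for discrete distributions given as lists of probabilities."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]     # illustrative distributions
q = [0.4, 0.4, 0.2]

print(f"D(P||Q) = {kl_divergence(p, q):.4f} nats")
print(f"D(Q||P) = {kl_divergence(q, p):.4f} nats   # not symmetric")
```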
Properties of Kullback-Leibler Divergence
• Data processing inequality
• Pinsker’s inequality (a numerical check appears after this list)
• Additivity
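As a numerical illustration of Pinsker’s inequality (using the same made-up distributions as above), the Kullback-Leibler divergence in nats is bounded below by twice the squared total variation distance.

```python
import math

def kl_divergence(p, q):
    """D(P || Q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def total_variation(p, q):
    """Total variation distance between two discrete distributions."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

p = [0.5, 0.3, 0.2]     # same illustrative distributions as above
q = [0.4, 0.4, 0.2]

# Pinsker's inequality (D in nats): D(P||Q) >= 2 * TV(P, Q)^2
lhs = kl_divergence(p, q)
rhs = 2 * total_variation(p, q) ** 2
print(f"D(P||Q) = {lhs:.4f} >= 2*TV^2 = {rhs:.4f}: {lhs >= rhs}")
```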