Ch3. Probability and Information Theory

DDME · September 19, 2020

TIMESTAMP

@200919 Started

Probability Theory

Why Probability?

Random Variables

Probability Distributions

Discrete Variables and Probability Mass Functions

[Slide: Probability Mass Function]
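
For reference, the defining properties of a PMF $P$ over a discrete variable $x$ (standard definitions, in the book's notation):

$$
0 \le P(x) \le 1 \ \text{for every state } x, \qquad \sum_{x} P(x) = 1 \ \text{(normalization)}
$$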

Continuous Variables and Probability Density Functions

[Slide: Probability Density Function]
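
A density $p$ only needs to be nonnegative and integrate to one; $p(x)$ itself may exceed 1, and probabilities come from integrating over a set:

$$
p(x) \ge 0, \qquad \int p(x)\,dx = 1, \qquad P(x \in [a, b]) = \int_{[a,b]} p(x)\,dx
$$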

Marginal Probability

[Slide: Computing Marginal Probability with the Sum Rule]
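
The sum rule recovers a marginal from a joint distribution, with the sum replaced by an integral in the continuous case:

$$
P(x) = \sum_{y} P(x, y), \qquad p(x) = \int p(x, y)\,dy
$$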

Conditional Probability

[Slide: Conditional Probability]
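
The conditional is a ratio of the joint to the marginal, defined only when the conditioning event has nonzero probability:

$$
P(y \mid x) = \frac{P(y, x)}{P(x)}, \qquad P(x) > 0
$$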

The Chain Rule of Conditional Probabilities

[Slide: Chain Rule of Probability]
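
Any joint distribution factors into a product of conditionals:

$$
P(x^{(1)}, \ldots, x^{(n)}) = P(x^{(1)}) \prod_{i=2}^{n} P(x^{(i)} \mid x^{(1)}, \ldots, x^{(i-1)})
$$

For example, $P(a, b, c) = P(a)\,P(b \mid a)\,P(c \mid a, b)$.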

Independence and Conditional Independence

[Slide: Independence]
[Slide: Conditional Independence]
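
In symbols, writing $x \perp y$ for independence and $x \perp y \mid z$ for conditional independence:

$$
x \perp y \iff P(x, y) = P(x)\,P(y) \ \text{for all } x, y
$$

$$
x \perp y \mid z \iff P(x, y \mid z) = P(x \mid z)\,P(y \mid z) \ \text{for all } x, y, z
$$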

Expectation, Variance and Covariance

[Slide: Expectation]
[Slide: Variance and Covariance]
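
The three definitions side by side (the sum becomes an integral for continuous variables):

$$
\mathbb{E}_{x \sim P}[f(x)] = \sum_{x} P(x)\,f(x)
$$

$$
\mathrm{Var}(f(x)) = \mathbb{E}\!\left[(f(x) - \mathbb{E}[f(x)])^2\right]
$$

$$
\mathrm{Cov}(f(x), g(y)) = \mathbb{E}\!\left[(f(x) - \mathbb{E}[f(x)])\,(g(y) - \mathbb{E}[g(y)])\right]
$$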

Common Probability Distributions

Bernoulli Distribution

[Slide: Bernoulli Distribution]
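
A single binary variable controlled by one parameter $\phi \in [0, 1]$:

$$
P(x = 1) = \phi, \qquad P(x) = \phi^{x}(1 - \phi)^{1 - x}, \qquad \mathbb{E}[x] = \phi, \qquad \mathrm{Var}(x) = \phi(1 - \phi)
$$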

Multinoulli Distribution
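
In short: the multinoulli (categorical) distribution covers a single discrete variable with $k$ finite states, parameterized by $\boldsymbol{p} \in [0, 1]^{k-1}$ with the last state's probability implied:

$$
P(x = i) = p_i \ \text{for } i < k, \qquad P(x = k) = 1 - \boldsymbol{1}^\top \boldsymbol{p}
$$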

Gaussian Distribution

[Slide: Gaussian Distribution]
[Slide: Multivariate Gaussian]
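
The univariate density and its multivariate generalization with mean $\boldsymbol{\mu}$ and covariance $\boldsymbol{\Sigma}$:

$$
\mathcal{N}(x; \mu, \sigma^2) = \sqrt{\frac{1}{2\pi\sigma^2}}\, \exp\!\left(-\frac{1}{2\sigma^2}(x - \mu)^2\right)
$$

$$
\mathcal{N}(\boldsymbol{x}; \boldsymbol{\mu}, \boldsymbol{\Sigma}) = \sqrt{\frac{1}{(2\pi)^n \det \boldsymbol{\Sigma}}}\, \exp\!\left(-\frac{1}{2}(\boldsymbol{x} - \boldsymbol{\mu})^\top \boldsymbol{\Sigma}^{-1} (\boldsymbol{x} - \boldsymbol{\mu})\right)
$$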

Exponential and Laplace Distributions

[Slide: More Distributions]
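
Both put a sharp peak at a single point: the exponential at $x = 0$, the Laplace at an arbitrary $\mu$:

$$
p(x; \lambda) = \lambda\,\boldsymbol{1}_{x \ge 0}\exp(-\lambda x), \qquad \mathrm{Laplace}(x; \mu, \gamma) = \frac{1}{2\gamma}\exp\!\left(-\frac{|x - \mu|}{\gamma}\right)
$$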

The Dirac Distribution and Empirical Distribution

[Slide: Empirical Distribution]
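
The empirical distribution puts mass $1/m$ on each of the $m$ training points using Dirac deltas:

$$
\hat{p}(\boldsymbol{x}) = \frac{1}{m} \sum_{i=1}^{m} \delta(\boldsymbol{x} - \boldsymbol{x}^{(i)})
$$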

Mixtures of Distributions

[Slide: Mixture Distribution]
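
A mixture first draws a component identity $c$ from a multinoulli, then samples from that component:

$$
P(x) = \sum_{i} P(c = i)\, P(x \mid c = i)
$$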

Useful Properties of Common Functions

[Slide: Logistic Sigmoid]
[Slide: Softplus Function]
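
The two functions and a few identities that come up constantly:

$$
\sigma(x) = \frac{1}{1 + \exp(-x)}, \qquad \zeta(x) = \log(1 + \exp(x))
$$

$$
\frac{d}{dx}\sigma(x) = \sigma(x)(1 - \sigma(x)), \qquad 1 - \sigma(x) = \sigma(-x), \qquad \frac{d}{dx}\zeta(x) = \sigma(x), \qquad \zeta(x) - \zeta(-x) = x
$$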

Bayes' Rule

[Slide: Bayes' Rule]
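
The rule itself, with the denominator computed from quantities we usually already have:

$$
P(x \mid y) = \frac{P(x)\,P(y \mid x)}{P(y)}, \qquad P(y) = \sum_{x} P(y \mid x)\,P(x)
$$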

Technical Details of Continuous Variables

[Slide: Change of Variables]
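
For an invertible map $y = g(x)$, the density must be rescaled so that probability mass is preserved; in higher dimensions the factor is the absolute Jacobian determinant:

$$
p_x(x) = p_y(g(x)) \left| \frac{\partial g(x)}{\partial x} \right|, \qquad p_x(\boldsymbol{x}) = p_y(g(\boldsymbol{x})) \left| \det\!\left(\frac{\partial g(\boldsymbol{x})}{\partial \boldsymbol{x}}\right) \right|
$$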

Information Theory

[Slide: Information Theory]
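
The core quantities: self-information of one event, and entropy as its expectation:

$$
I(x) = -\log P(x), \qquad H(x) = \mathbb{E}_{x \sim P}[I(x)] = -\mathbb{E}_{x \sim P}[\log P(x)]
$$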

[Slide: Entropy of a Bernoulli Variable]
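
The slide plots binary entropy against the Bernoulli parameter; a minimal sketch to reproduce the values in NumPy (the helper name `bernoulli_entropy` is mine):

```python
import numpy as np

def bernoulli_entropy(p):
    """Shannon entropy of a Bernoulli(p) variable, in nats."""
    p = np.clip(p, 1e-12, 1 - 1e-12)  # avoid log(0) at p = 0 or 1
    return -(1 - p) * np.log(1 - p) - p * np.log(p)

print(bernoulli_entropy(0.5))  # maximal: log 2 ≈ 0.693
print(bernoulli_entropy(0.9))  # near-deterministic, so low entropy
```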

[Slide: The KL Divergence is Asymmetric]
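
The divergence is an expectation under $P$ of the log-ratio, so swapping the arguments changes which distribution's support dominates; in general $D_{\mathrm{KL}}(P \,\|\, Q) \ne D_{\mathrm{KL}}(Q \,\|\, P)$:

$$
D_{\mathrm{KL}}(P \,\|\, Q) = \mathbb{E}_{x \sim P}\!\left[\log \frac{P(x)}{Q(x)}\right], \qquad H(P, Q) = H(P) + D_{\mathrm{KL}}(P \,\|\, Q)
$$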

Structured Probabilistic Models

[Slide: Directed Model]
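
A directed model factorizes the joint into one conditional per variable given its parents $Pa_{\mathcal{G}}(x_i)$ in the graph; for a chain $a \to b \to c$ this gives $p(a, b, c) = p(a)\,p(b \mid a)\,p(c \mid b)$:

$$
p(\boldsymbol{x}) = \prod_{i} p(x_i \mid Pa_{\mathcal{G}}(x_i))
$$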

[Slide: Undirected Model]
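
An undirected model instead uses unnormalized factors $\phi^{(i)}$ over cliques $\mathcal{C}^{(i)}$, divided by a partition function $Z$:

$$
p(\boldsymbol{x}) = \frac{1}{Z} \prod_{i} \phi^{(i)}\!\left(\mathcal{C}^{(i)}\right)
$$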
