Motivation

Consider generic maximum likelihood estimation.

Parametric distribution estimation: suppose we have a family of densities p_{x}\left(y\right) indexed by a parameter x; we take p_{x}\left(y\right) = 0 for invalid values of x. Maximum likelihood estimation: choose the x that maximizes p_{x}\left(y\right) for the observed data y.

Linear measurement with IID noise

Suppose the measurements come from a linear model with additive noise:

\begin{equation} y_{i} = a_{i}^{T}x + v_{i} \end{equation}

where the v_{i} are IID noise and the a_{i} are known model vectors. The density of y as a function of x is then:

\begin{equation} p_{x}\left(y\right) = \prod_{i=1}^{m} p\left(y_{i} - a_{i}^{T}x\right) \end{equation}

for some density p modeling the noise v. Since the log is monotone, the maximum likelihood estimate is found by maximizing the log-likelihood:

\begin{align} \max_{x}\quad & \sum_{i=1}^{m} \log p\left(y_{i} - a_{i}^{T}x\right) \end{align}
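As a concrete sketch (the sizes, data, and noise level below are made-up illustrations, not from the notes): when the noise is Gaussian, maximizing the log-likelihood is equivalent to minimizing \left\|y - Ax\right\|_{2}^{2}, so the ML estimate is exactly the least-squares solution.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 50, 3                        # illustrative sizes: measurements, parameters
A = rng.standard_normal((m, n))     # rows of A are the measurement vectors a_i^T
x_true = rng.standard_normal(n)     # made-up "true" parameter
y = A @ x_true + 0.1 * rng.standard_normal(m)   # y_i = a_i^T x + v_i, Gaussian v_i

# For Gaussian noise, maximizing sum_i log p(y_i - a_i^T x) is the same as
# minimizing ||y - A x||_2^2, so the ML estimate is the least-squares solution.
x_ml, *_ = np.linalg.lstsq(A, y, rcond=None)

# By optimality, the ML estimate fits the data at least as well as x_true does.
assert np.linalg.norm(y - A @ x_ml) <= np.linalg.norm(y - A @ x_true)
```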

given the observed data y and the known vectors a_{i}.

Some noise models

Gaussian noise: the ML estimate is the least-squares solution.
Laplacian noise: the ML estimate is the \ell_{1}-norm (least absolute deviations) solution.

Logistic regression

Consider random variables y \in \left\{0,1\right\} with distribution:

\begin{equation} \operatorname{prob}\left(y = 1\right) = \frac{\exp \left(a^{T}u + b\right)}{1 + \exp \left(a^{T}u + b\right)} \end{equation}

where u is a vector of explanatory variables and a, b are the model parameters. The log-likelihood is concave in \left(a, b\right), so maximum likelihood estimation of the parameters is also a concave maximization problem.
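A minimal numerical sketch (the data, sizes, and choice of Newton's method are illustrative assumptions, not from the notes): fit a and b by maximizing the concave log-likelihood \sum_{i} \left[y_{i} z_{i} - \log\left(1 + \exp z_{i}\right)\right], where z_{i} = a^{T}u_{i} + b.

```python
import numpy as np

rng = np.random.default_rng(1)
m = 200
u = rng.standard_normal(m)                 # scalar explanatory variable (assumption)
a_true, b_true = 2.0, -0.5                 # made-up "true" parameters
p_true = 1.0 / (1.0 + np.exp(-(a_true * u + b_true)))
y = (rng.random(m) < p_true).astype(float)  # Bernoulli outcomes in {0, 1}

X = np.column_stack([u, np.ones(m)])       # design matrix: one column each for a, b

def log_lik(theta):
    """Concave log-likelihood sum_i [y_i z_i - log(1 + exp(z_i))], z = X theta."""
    z = X @ theta
    return y @ z - np.logaddexp(0.0, z).sum()

# Newton's method on the concave log-likelihood; a handful of steps converges.
theta = np.zeros(2)
for _ in range(25):
    z = X @ theta
    p = 1.0 / (1.0 + np.exp(-z))
    grad = X.T @ (y - p)                          # gradient of the log-likelihood
    H = -(X * (p * (1.0 - p))[:, None]).T @ X     # Hessian (negative definite)
    theta = theta - np.linalg.solve(H, grad)      # Newton step (ascent direction)

a_hat, b_hat = theta
```

Because the log-likelihood is concave, Newton's method from any starting point finds the global maximizer; the fitted \left(\hat{a}, \hat{b}\right) attains a log-likelihood at least as large as that of the true parameters.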
