17-10-14 Interesting Facts in Machine Learning (Neural Networks)
Category: Idea Lists (Upon Request)
- Learns compositional (bottom-up) hierarchical structure
- Model complexity overcomes the curse of dimensionality
- Expressive power is combinatorial in depth and in width
- Requires high signal-to-noise ratio
- ‘Just’ adaptive basis function regression
- Optimizers are improved by an exponentially weighted average of the gradient (momentum) and adaptive learning rates
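A minimal numpy sketch of the momentum idea, with the update direction as an exponentially weighted average of past gradients. The objective, step size, and β below are illustrative choices, not from these notes:

```python
import numpy as np

def sgd_momentum_step(w, grad, velocity, lr=0.1, beta=0.9):
    """One SGD-with-momentum step: the velocity is an exponentially
    weighted average of past gradients, and the weights follow it."""
    velocity = beta * velocity + (1 - beta) * grad
    return w - lr * velocity, velocity

# Toy objective f(w) = w^2, whose gradient is 2w.
w, v = np.array([5.0]), np.array([0.0])
for _ in range(200):
    w, v = sgd_momentum_step(w, 2 * w, v)
print(float(w[0]))  # approaches the minimum at 0
```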
- Covariate Shift
- Close-to-linear behavior leads to failure to generalize, e.g. adversarial examples
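The adversarial-examples point can be sketched on a purely linear scorer: because the score is linear in the input, a tiny per-coordinate perturbation aligned with the weights shifts the score by ε·‖w‖₁, which grows with dimension. All values below (dimension, weight scale, ε) are made-up illustrative numbers:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 1000
w = rng.choice([-1.0, 1.0], size=dim) * 0.01   # many small weights
x = rng.normal(size=dim)

eps = 0.05                                     # tiny per-coordinate change
x_adv = x - eps * np.sign(w)                   # fast-gradient-sign-style step

# The score drops by exactly eps * sum(|w|) = 0.05 * 1000 * 0.01 = 0.5,
# large relative to the score's typical spread.
print(w @ x, w @ x_adv)
```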
- Dimensionality of the representation increases with depth of a convnet.
- Softmax leads to extreme solutions
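A quick sketch of softmax saturation: as the logits are scaled up (e.g. by growing weight norms during training), the output is pushed toward a one-hot extreme. The logits and scales are arbitrary examples:

```python
import numpy as np

def softmax(z):
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.5])
for scale in (1, 5, 20):
    # Larger scales drive the distribution toward one-hot.
    print(scale, softmax(scale * logits).round(3))
```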
- Non-convex optimization surface is dominated by saddle points.
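The standard toy saddle, f(x, y) = x² − y², as a quick check: the gradient vanishes at the origin, but the Hessian has eigenvalues of both signs, so it is a saddle point rather than a minimum (illustrative only):

```python
import numpy as np

# Hessian of f(x, y) = x^2 - y^2 at the origin.
hessian = np.array([[2.0, 0.0],
                    [0.0, -2.0]])
eigvals = np.linalg.eigvalsh(hessian)
print(eigvals)  # mixed signs => indefinite Hessian => saddle point
```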
- Convnets are:
  - Parameter sharing, which leads to translation equivariance
  - Locality (sparse connectivity)
  - Composition
  - Not equivariant to scale or rotation
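Translation equivariance from parameter sharing, sketched with a hand-rolled 1-D "valid" sliding window (toy input and kernel, not from the notes): shifting the input shifts the output, up to the border:

```python
import numpy as np

def conv1d_valid(x, k):
    """Slide one shared kernel over every position ('valid' padding)."""
    n = len(x) - len(k) + 1
    return np.array([x[i:i + len(k)] @ k for i in range(n)])

x = np.array([0.0, 1.0, 3.0, 2.0, 0.0, 0.0])
k = np.array([1.0, -1.0])
shifted = np.roll(x, 1)   # translate the input by one step

# Because the weights are shared across positions, translating the input
# translates the output: conv(shift(x)) == shift(conv(x)) in the interior.
print(conv1d_valid(x, k))
print(conv1d_valid(shifted, k))
```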
- Many machine learning libraries implement cross-correlation but call it convolution
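A sketch of the distinction with toy hand-rolled versions: mathematical convolution flips the kernel before sliding it, while what deep learning libraries typically implement (and call "convolution") slides the kernel as-is, i.e. cross-correlation:

```python
import numpy as np

def cross_correlate(x, k):
    # Slide the kernel as-is -- what ML libraries usually compute.
    n = len(x) - len(k) + 1
    return np.array([x[i:i + len(k)] @ k for i in range(n)])

def true_convolve(x, k):
    # Mathematical convolution flips the kernel first.
    return cross_correlate(x, k[::-1])

x = np.array([1.0, 2.0, 3.0, 4.0])
k = np.array([1.0, 0.0, -1.0])
print(cross_correlate(x, k))  # [-2., -2.]
print(true_convolve(x, k))    # [ 2.,  2.]
```

The two coincide only for symmetric kernels; since convnet kernels are learned, the flip is irrelevant and libraries skip it.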
- Achieving the global minimum of the training loss would overfit the training data.
Source: Original Google Doc