17-10-14 Interesting Facts in Machine Learning (Neural Networks)

Category: Idea Lists (Upon Request)


1. Learns compositional (bottom-up) hierarchical structure
2. Model complexity overcomes the curse of dimensionality
3. Combinatorial in depth and in width
4. Requires a high signal-to-noise ratio
5. ‘Just’ adaptive basis function regression
6. Optimizers are improved by exponentially weighted averages of the gradient (momentum) and per-parameter learning rates (e.g. RMSProp, Adam)
7. Covariate shift
8. Close-to-linear models fail to generalize, e.g. to adversarial examples
9. The dimensionality of the representation increases with the depth of a convnet
10. Softmax leads to extreme solutions
11. The non-convex optimization surface is dominated by saddle points
12. Convnets:
    1. Parameter sharing leads to translation equivariance
    2. Locality (sparse connectivity)
    3. Composition
    4. Not equivariant to scale or rotation
    5. Many machine learning libraries implement cross-correlation but call it convolution
13. Achieving the global minimum would overfit the training data
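The optimizer item above can be sketched in a few lines: momentum replaces the raw gradient with an exponentially weighted average of past gradients. This is a minimal sketch on an assumed toy 1-D quadratic loss, not any particular library's implementation.

```python
def grad(w):
    # Gradient of the toy loss f(w) = (w - 3)^2, minimized at w = 3.
    return 2.0 * (w - 3.0)

def sgd_momentum(w0, lr=0.1, beta=0.9, steps=100):
    w, v = w0, 0.0
    for _ in range(steps):
        # v is an exponentially weighted average of past gradients:
        # recent gradients dominate, older ones decay by a factor of beta.
        v = beta * v + (1.0 - beta) * grad(w)
        w -= lr * v
    return w

w = sgd_momentum(0.0)  # converges near the minimum at 3
```

Adam extends this idea with a second exponentially weighted average, of the squared gradient, to scale the learning rate per parameter.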
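The softmax item can be seen numerically: softmax never outputs exactly 0 or 1, so minimizing cross-entropy keeps pushing the logits apart, driving weights toward extreme values. A small illustration with assumed example logits:

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

# Scaling the same logits sharpens the output toward one-hot,
# so the loss always rewards making the logits more extreme.
for scale in (1, 5, 25):
    probs = softmax([scale * z for z in (2.0, 1.0, 0.1)])
    # scale=1 gives a soft distribution; scale=25 is nearly one-hot.
```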
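The cross-correlation point is easy to verify by hand: true convolution flips the kernel before sliding it, while the "convolution" in most deep-learning libraries does not. A minimal 1-D sketch (hypothetical helper names, "valid" padding assumed):

```python
def cross_correlate(x, k):
    # Slide the kernel over x without flipping it ("valid" positions only).
    # This is what most deep-learning libraries call "convolution".
    n = len(k)
    return [sum(x[i + j] * k[j] for j in range(n))
            for i in range(len(x) - n + 1)]

def convolve(x, k):
    # True discrete convolution = cross-correlation with a flipped kernel.
    return cross_correlate(x, k[::-1])

x = [1, 2, 3, 4]
k = [1, 0, -1]
cross_correlate(x, k)  # → [-2, -2]
convolve(x, k)         # → [2, 2]
```

For learned kernels the distinction is harmless (the network simply learns a flipped kernel), which is why libraries keep the looser name.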

Source: Original Google Doc
