20-07-13 Deep Ideas in Class Discovery
Category: Idea Lists (Upon Request)
1. Transfer objective
2. David’s layer type (anisotropic)
3. Training in sim
4. Recursive self-improvement
- Training leads to improved class creation
- Modifying the ontology dynamically
- Class merging, splitting, creation
- Learnability as the prior
- Start with contrastive learning (to get a good clusterable representation)
- One-vs-all / Object Detection / Escape the one class per image assumption
- Internet scale training
- Discover everything for incredible transfer
- Transfer because you’ve already seen and classified similar data
- Curriculum as you transfer to similar classes
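The contrastive starting point above could look something like this minimal NumPy sketch of an InfoNCE-style loss over two augmented views of the same batch (function name, batch shapes, and the temperature value are illustrative, not from the source):

```python
import numpy as np

def info_nce_loss(z_a, z_b, temperature=0.5):
    """InfoNCE-style contrastive loss between two views of a batch.

    z_a, z_b: (N, D) embeddings of two augmentations of the same N images.
    Positive pairs are (z_a[i], z_b[i]); every other row acts as a negative.
    """
    # L2-normalize so the dot product is cosine similarity
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)
    logits = z_a @ z_b.T / temperature  # (N, N) similarity matrix
    # Cross-entropy with the diagonal (the matching pair) as the target
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

Minimizing this pulls the two views of each image together and pushes different images apart, which is what makes the resulting representation clusterable.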
Mathematical possibilities:
- David’s Anisotropic Layer
- Semi-supervised clustering
- Geometry of metric space
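One concrete form the semi-supervised clustering idea could take is seeded k-means: a handful of labeled points initialize the centroids and stay pinned to their classes, while the rest of the data is clustered freely. A minimal sketch (the function name, the pinned-labels simplification, and the iteration count are assumptions for illustration):

```python
import numpy as np

def seeded_kmeans(X, seeds, n_iter=20):
    """Seeded k-means: semi-supervised clustering from a few labeled points.

    X: (N, D) data. seeds: dict mapping a class label to a list of indices
    of labeled points in X. Centroids start from the labeled seeds, and
    labeled points keep their class assignment on every iteration.
    """
    keys = sorted(seeds)
    centroids = np.stack([X[seeds[k]].mean(axis=0) for k in keys])
    # Map each labeled index to its fixed cluster id
    fixed = {i: c for c, k in enumerate(keys) for i in seeds[k]}
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # Assign every point to its nearest centroid
        dists = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        for i, c in fixed.items():
            labels[i] = c  # labeled points never change cluster
        # Recompute centroids from the current assignment
        for c in range(len(keys)):
            members = X[labels == c]
            if len(members):
                centroids[c] = members.mean(axis=0)
    return labels, centroids
```

This is also a natural place for dynamic ontology changes: merging two clusters whose centroids drift together, or splitting a cluster whose members spread out.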
Obstacles and Limitations:
- Deep metric clustering outperformed by raw image space clustering
- Computational expense of transfer evaluations
- Noisy label training
Source: Original Google Doc