Retrieval (4 papers)
- CLIP-MoE: Towards Building Mixture of Experts for CLIP with Diversified Multiplet Upcycling
- SMEC: Rethinking Matryoshka Representation Learning for Retrieval Embedding Compression
- FLAME: Frozen Large Language Models Enable Data-Efficient Language-Image Pre-training
- Randomly Removing 50% of Dimensions in Text Embeddings has Minimal Impact on Retrieval and Classification Tasks
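The last title can be illustrated with a minimal sketch: embed a corpus, keep a random half of the dimensions (the same mask for queries and documents), and measure how much the top-k retrieval results overlap with full-dimension retrieval. The toy Gaussian embeddings, the `top_k` helper, and all parameters below are illustrative stand-ins, not the paper's actual setup or datasets.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for text embeddings: 100 "documents" and 1 "query", 256 dims.
corpus = rng.standard_normal((100, 256))
query = rng.standard_normal(256)

def top_k(q, docs, k=10):
    """Return the indices of the k most cosine-similar rows of `docs`."""
    q = q / np.linalg.norm(q)
    docs = docs / np.linalg.norm(docs, axis=1, keepdims=True)
    sims = docs @ q
    return set(np.argsort(-sims)[:k])

# Randomly keep 50% of dimensions; the same mask must be applied to
# both queries and documents so the truncated spaces are comparable.
mask = rng.permutation(256)[:128]

full = top_k(query, corpus)
half = top_k(query[mask], corpus[:, mask])

# Fraction of the full-dimension top-10 recovered after dropping half the dims.
overlap = len(full & half) / 10
print(overlap)
```

On real text embeddings the paper's claim is that this overlap stays high; on random Gaussian vectors like these the number is only a demonstration of the mechanics, not of the result.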