GRACE: A Dynamic Coreset Selection Framework for Large Language Model Optimization

📰 ArXiv cs.AI

arXiv:2604.11810v1 Announce Type: cross Abstract: Large Language Models (LLMs) have demonstrated remarkable capabilities in natural language understanding and generation. However, their immense number of parameters and complex transformer-based architectures result in significant resource demands and computational complexity during training, making it challenging to optimize them efficiently on large datasets. To reduce training costs while preserving performance, researchers have investigated coreset selection…
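The abstract is cut off before it describes GRACE itself, but the general idea of dynamic coreset selection is to periodically re-score training examples and train only on a small, informative subset. The sketch below is a generic loss-based version of that idea in PyTorch, not the paper's method; the function name `select_coreset`, the top-loss scoring rule, and the `fraction` parameter are illustrative assumptions.

```python
import torch
from torch.utils.data import DataLoader, Subset

def select_coreset(model, dataset, fraction=0.1, batch_size=32, device="cpu"):
    """Score every example by its current loss and keep the highest-loss
    fraction as the coreset for the next training phase.

    Generic loss-based heuristic for illustration only; the scoring rule
    GRACE actually uses is described in the paper, not here.
    """
    model.eval()
    losses = []
    loader = DataLoader(dataset, batch_size=batch_size)
    with torch.no_grad():
        for inputs, labels in loader:
            inputs, labels = inputs.to(device), labels.to(device)
            logits = model(inputs)
            # Per-example loss so each sample gets its own score.
            loss = torch.nn.functional.cross_entropy(
                logits, labels, reduction="none"
            )
            losses.append(loss.cpu())
    scores = torch.cat(losses)
    k = max(1, int(fraction * len(dataset)))
    top_idx = torch.topk(scores, k).indices.tolist()
    return Subset(dataset, top_idx)
```

Calling this every few epochs and training only on the returned `Subset` is the "dynamic" part: the selected examples change as the model's losses change.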

Published 15 Apr 2026