Preconditioned Test-Time Adaptation for Out-of-Distribution Debiasing in Narrative Generation

📰 ArXiv cs.AI

arXiv:2603.13683v2 Announce Type: replace-cross

Abstract: Although debiased large language models (LLMs) excel at handling known or low-bias prompts, they often fail on unfamiliar, high-bias prompts. We demonstrate via out-of-distribution (OOD) detection that these high-bias prompts cause a distribution shift that degrades static model performance. To enable real-time correction, we propose CAP-TTA, a test-time adaptation framework. CAP-TTA triggers context-aware LoRA updates only when a bias-ri
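The core idea the abstract describes, adapting only when an OOD signal flags a bias-risky prompt, can be sketched as a gated low-rank update. This is an illustrative toy, not the paper's method: the Mahalanobis-distance detector, the threshold value, and the rank-1 update form are all assumptions chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for in-distribution prompt embeddings (hypothetical encoder features).
train_feats = rng.normal(0.0, 1.0, size=(500, 8))
mu = train_feats.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(train_feats, rowvar=False) + 1e-3 * np.eye(8))

def ood_score(x):
    """Mahalanobis distance of a prompt embedding to the training distribution."""
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

THRESHOLD = 5.0  # bias-risk threshold; a tunable assumption, not from the paper

def maybe_adapt(W, x, lr=0.1):
    """Apply a rank-1 (LoRA-style) correction to W only when the prompt is OOD."""
    if ood_score(x) <= THRESHOLD:
        return W, False  # in-distribution: keep the static model untouched
    # Toy low-rank correction A @ B built from the prompt embedding itself.
    A = x[:, None] / np.linalg.norm(x)           # shape (d, 1)
    B = (W.T @ x)[None, :] / np.linalg.norm(x)   # shape (1, d)
    return W - lr * (A @ B), True

W = rng.normal(size=(8, 8))
in_dist = rng.normal(0.0, 1.0, size=8)    # looks like training data
far_ood = rng.normal(10.0, 1.0, size=8)   # far from the training distribution

_, adapted_in = maybe_adapt(W, in_dist)
_, adapted_ood = maybe_adapt(W, far_ood)
print(adapted_in, adapted_ood)
```

The gate keeps the model static on familiar prompts and spends adaptation compute only on detected distribution shifts, which matches the real-time-correction motivation in the abstract.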

Published 17 Apr 2026