Unlocking the Potential of Diffusion Language Models through Template Infilling

📰 ArXiv cs.AI

arXiv:2510.13870v2 Announce Type: replace-cross Abstract: Diffusion Language Models (DLMs) have emerged as a promising alternative to Autoregressive Language Models, yet their inference strategies remain limited to prefix-based prompting inherited from the autoregressive paradigm. In this paper, we propose Template Infilling (TI), a tailored conditioning methodology for DLMs. Unlike conventional prefix prompting, TI flexibly aligns structural anchors across the entire target response space, esta…
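The abstract contrasts prefix-based prompting with conditioning on structural anchors placed throughout the response. A minimal sketch of that distinction, purely illustrative (the mask convention, function names, and template format below are assumptions, not the paper's implementation):

```python
# Illustrative sketch: contrast prefix prompting with template infilling
# for a diffusion LM. The "<mask>" token and template format are
# assumptions for illustration, not the paper's actual interface.

MASK = "<mask>"

def prefix_prompt(prompt_tokens, response_len):
    """Prefix prompting: condition only on a left-side prefix;
    the entire response region starts fully masked."""
    return prompt_tokens + [MASK] * response_len

def template_infill(prompt_tokens, template):
    """Template infilling: fixed structural anchors sit inside the
    response region; only the gaps (None entries) are masked and
    left for the diffusion model to denoise."""
    response = [tok if tok is not None else MASK for tok in template]
    return prompt_tokens + response

prompt = ["Q:", "What", "is", "2+2", "?"]

# Prefix prompting leaves the whole response unconstrained.
free = prefix_prompt(prompt, response_len=6)

# Template infilling anchors "A:" and "." within the response space,
# constraining generation to that structure.
anchored = template_infill(prompt, ["A:", None, None, None, None, "."])
```

Here `free` ends in six mask tokens, while `anchored` fixes the first and last response tokens and masks only the four interior positions.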

Published 8 Apr 2026