Domain-Adapted Retrieval for In-Context Annotation of Pedagogical Dialogue Acts
📰 arXiv cs.AI
Domain-adapted retrieval improves in-context annotation of pedagogical dialogue acts using a lightweight embedding model
Action Steps
- Fine-tune a lightweight embedding model on tutoring corpora
- Index dialogues at the utterance level to retrieve labeled few-shot demonstrations
- Integrate the domain-adapted retrieval pipeline with a generative model for in-context annotation
- Evaluate the pipeline across multiple real-world tutoring datasets
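The steps above can be sketched in miniature. This is a toy illustration, not the paper's implementation: the labeled utterances are hypothetical, and a simple bag-of-words similarity stands in for the fine-tuned lightweight embedding model. It shows the core loop of indexing labeled utterances individually, retrieving the nearest ones for a new utterance, and assembling them into a few-shot prompt for a generative model:

```python
# Sketch of utterance-level retrieval for few-shot annotation prompts.
# A toy bag-of-words "embedding" stands in for a domain-adapted encoder
# fine-tuned on tutoring corpora; the labeled examples are hypothetical.
from collections import Counter
import math

def embed(utterance):
    # Placeholder embedding; a fine-tuned lightweight model would go here.
    return Counter(utterance.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Labeled tutoring utterances indexed one by one (utterance-level index).
index = [
    ("Can you explain why you chose that step?", "probing_question"),
    ("Great job, that's exactly right!", "positive_feedback"),
    ("Try factoring the equation first.", "hint"),
]

def retrieve(query, k=2):
    # Rank labeled utterances by similarity to the query utterance.
    q = embed(query)
    return sorted(index, key=lambda ex: cosine(q, embed(ex[0])), reverse=True)[:k]

def build_prompt(query, k=2):
    # Retrieved examples become few-shot demonstrations for the generator.
    shots = "\n".join(f"Utterance: {u}\nDialogue act: {a}" for u, a in retrieve(query, k))
    return f"{shots}\nUtterance: {query}\nDialogue act:"

print(build_prompt("Why did you choose to factor first?"))
```

The assembled prompt would then be sent to the generative model, which completes the final `Dialogue act:` line; no fine-tuning of the generator is required.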
Who Needs to Know This
NLP researchers and AI engineers working on LLMs or educational technology, who can use this approach to make dialogue-act annotation more accurate and efficient.
Key Insight
💡 Domain-adapted retrieval can improve the accuracy of in-context annotation of pedagogical dialogue acts without requiring fine-tuning of the generative model
Share This
📚 Improve pedagogical dialogue annotation with domain-adapted retrieval! 👉 Fine-tune embeddings, index utterances, and integrate with generative models
DeepCamp AI