Textbooks, Not the Internet, Trained This Powerful AI

📰 Hackernoon

phi-1.5 is a 1.3B-parameter Transformer trained mainly on synthetic, textbook-quality data. Despite its small size, it matches or beats much larger models on commonsense reasoning, grade-school math, and coding benchmarks. The results suggest data quality—not scale alone—drives reasoning ability in LLMs.

Published 30 Mar 2026