Blog
Textbooks, Not the Internet, Trained This Powerful AI
phi-1.5 is a 1.3B-parameter Transformer trained primarily on synthetic, textbook-quality data rather than web-scraped text. Despite its small size, it matches or beats much larger models on benchmarks for commonsense reasoning, grade-school math, and coding. The results suggest that data quality, not scale alone, drives reasoning ability in LLMs.
Source: HackerNoon