
TRM (Tiny Recursive Model)

Samsung's 7M-parameter recursive reasoning model that outperforms LLMs more than 10,000 times its size on abstract reasoning benchmarks such as ARC-AGI.

The Tiny Recursive Model (TRM), developed by Samsung AI Lab (SAIL) Montreal and published in October 2025, is a remarkably small model with just 7 million parameters that challenges the assumption that bigger models are always better at reasoning. The key insight behind TRM is recursive reasoning: instead of scaling up model size, it applies the same small network repeatedly in loops, refining its answer with each pass over the problem.
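The loop structure can be illustrated with a minimal sketch. This is not the paper's architecture (TRM uses a small trained transformer over token grids); the `net` function, dimensions, and loop counts below are hypothetical stand-ins chosen only to show how one shared network can alternate between refining a latent reasoning state and updating the current answer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for TRM's single tiny network: a fixed
# random linear map with a tanh nonlinearity. In TRM proper this
# is one small trained transformer reused at every step.
W = rng.normal(scale=0.1, size=(8, 8))

def net(state):
    # One pass of the shared network over the combined state.
    return np.tanh(state @ W)

def trm_step(x, y, z, n_latent=6):
    # Refine the latent reasoning state z several times against
    # the problem x and the current answer y ...
    for _ in range(n_latent):
        z = net(x + y + z)
    # ... then use the refined z to update the answer y.
    y = net(y + z)
    return y, z

x = rng.normal(size=8)   # problem embedding (toy)
y = np.zeros(8)          # initial answer embedding
z = np.zeros(8)          # initial latent state

# Outer improvement loop: the same weights W are reused on
# every pass, so depth of reasoning grows without adding parameters.
for _ in range(3):
    y, z = trm_step(x, y, z)
```

The point of the sketch is that compute scales with the number of loop iterations while the parameter count stays fixed at the size of the single shared network.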

Despite being less than 0.01% the size of leading LLMs, TRM achieves 44.6% accuracy on ARC-AGI-1 and 7.8% on ARC-AGI-2, surpassing models like DeepSeek-R1, Gemini 2.5 Pro, and o3-mini on these abstract reasoning benchmarks. On Sudoku-Extreme with only 1,000 training examples, TRM reaches 87.4% test accuracy, and on Maze-Hard it scores 85.3%, demonstrating strong generalization from minimal data.

The research paper "Less is More: Recursive Reasoning with Tiny Networks" by Alexia Jolicoeur-Martineau demonstrates that recursive computation can be a powerful alternative to parameter scaling for certain reasoning tasks. TRM's success suggests that the path to better AI reasoning may not always require trillion-parameter models, opening up possibilities for efficient, specialized reasoning systems that can run on consumer hardware. The model and code are fully open-source.

Last updated: February 22, 2026