An academic-grade journal of artificial intelligence research — peer-reviewed analysis of large language models, machine learning frameworks, and the engineering behind today’s most consequential AI systems.
In just 48 hours, DeepMind’s Harbor open-source harness has seen 11 labs submit results, with a leaderboard showing significant performance gaps.
A new algorithm from the University of Hawaiʻi significantly enhances physics-informed machine learning by enforcing conservation laws as hard constraints.
MIT researchers have developed a method using control theory to reduce compute costs in training large language models by up to 35% without quality loss.
A Cornell University research team has demonstrated a neuro-symbolic AI system that reduces energy consumption by up to 100 times while outperforming transformer baselines on reasoning benchmarks…
The European Commission unveiled a draft regulation this morning requiring developers of high-risk AI systems to submit comprehensive ethical impact statements before deployment…
Introduction: Bridging Artificial and Biological Intelligence. The remarkable ability of large language models (LLMs) to perform new tasks from just a few examples, a phenomenon known as in-context…