Scientists Turn AI Transformer Into a Fully Functional Computer Capable of Running Real Software

Mar 13, 2026
Percepta

Summary

Researchers have turned an AI transformer model into a fully functional computer by embedding a WebAssembly interpreter directly into its weights, enabling real C programs to run token-by-token, with 100% accuracy on complex problems and no external tools required.

Key Points

  • Researchers built a working computer inside a transformer by implementing a WebAssembly interpreter directly in the model weights, enabling arbitrary C programs (compiled to WebAssembly) to execute token-by-token within the model's own inference loop, with no external tools or interpreters required.
  • A new decoding technique called Exponentially Fast Attention restricts attention heads to two dimensions, allowing memory lookups to be recast as convex hull queries that run in O(log t) time instead of O(t). This makes million-step execution traces practical and yields throughput exceeding 30,000 tokens per second on a CPU.
  • The approach produces 100% accurate solutions to hard computational problems such as the Arto Inkala Sudoku and 10×10 min-cost matching. It also points toward hybrid AI architectures in which large language models reason while a compiled execution engine handles exact computation, and in which transformer weights themselves become a direct deployment target for software logic.
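The article does not spell out the convex-hull query itself, but the intuition behind a logarithmic-time lookup in two dimensions can be sketched in a simplified setting. Assuming unit-norm 2D key vectors (a simplification not stated in the article, under which the hull query degenerates to a nearest-angle search), a hard-attention argmax of q·k over t keys reduces to a binary search on keys sorted by angle; the function and variable names below are illustrative, not from the paper:

```python
import math
import bisect

def build_index(keys):
    """Sort unit-norm 2D key vectors by angle so each lookup costs O(log t).

    Assumption (not from the article): keys lie on the unit circle, so the
    maximum-inner-product query reduces to finding the nearest angle.
    """
    tagged = sorted((math.atan2(y, x), (x, y)) for x, y in keys)
    angles = [a for a, _ in tagged]
    vecs = [v for _, v in tagged]
    return angles, vecs

def hard_attention_lookup(angles, vecs, query):
    """Return the key maximizing q.k via binary search on sorted angles."""
    n = len(angles)
    qa = math.atan2(query[1], query[0])
    i = bisect.bisect_left(angles, qa)
    # For unit-norm keys, the maximizer is one of the two angular
    # neighbors of the query's angle (modulo wraparound at +/- pi).
    candidates = [(i - 1) % n, i % n]
    dot = lambda j: vecs[j][0] * query[0] + vecs[j][1] * query[1]
    return vecs[max(candidates, key=dot)]
```

Each lookup touches O(log t) entries via `bisect`, versus the O(t) scan of ordinary attention over a length-t trace; the general case in the paper replaces the sorted-angle structure with convex hull queries over arbitrary 2D keys.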
