Tiiny AI has introduced what it says is the world’s smallest personal AI supercomputer, a claim now officially verified by Guinness World Records. Known as the Tiiny AI Pocket Lab, the device is roughly the size of a power bank, yet it is designed to deliver AI performance levels usually associated with the large, costly machines found in data centres or research labs. By shrinking this capability into a compact, portable form, the company aims to make advanced artificial intelligence accessible to individual users rather than only to large organisations.
The announcement positions the Pocket Lab as a response to the high cost of existing compact AI systems. Devices such as NVIDIA’s Project Digits, priced around US$3,000, and the DGX Spark, priced around US$4,000, are often beyond the reach of hobbyists, students, and smaller teams. Tiiny AI believes there is room in the market for a far smaller and potentially more affordable option that still offers serious computing power. While the company has not yet shared pricing details, its messaging suggests a focus on broadening access to local AI computing.
Beyond its size, the Pocket Lab reflects a broader shift in how AI systems are expected to be used. Much of today’s AI relies heavily on cloud services, which require constant internet access and raise concerns around cost, latency, and data privacy. Tiiny AI argues that this dependence has become one of the most significant barriers to broader AI adoption, especially for users seeking greater control over their data and workflows.
The company’s go-to-market (GTM) director, Samar Bhoj, has framed the Pocket Lab as part of a move away from centralised infrastructure. “Intelligence shouldn’t belong to data centres, but to people,” he says. According to Tiiny AI, running large models locally can reduce reliance on remote servers, limit data being sent off-device, and make AI tools feel more immediate and personal. This idea of personal ownership over AI capabilities is central to how the company is positioning its new product.
A supercomputer designed to fit in your hand
At first glance, the physical dimensions of the Tiiny AI Pocket Lab appear almost at odds with its ambitions. The device measures 14.2 by 8 by 2.53 centimetres and weighs just 300 grams. Despite this, Tiiny AI claims it can deploy large language models with up to 120 billion parameters. Models of this scale are typically associated with racks of servers or high-end professional graphics cards, not something that can be carried in a backpack or even a pocket.
The Pocket Lab is built around a modern 12-core Armv9.2 CPU, providing a balance between performance and power efficiency. This choice reflects the company’s focus on running complex AI workloads locally without excessive energy consumption. The system supports a wide range of well-known open-source models, including GPT-OSS, Llama, Qwen, DeepSeek, Mistral, and Phi. This compatibility is intended to give developers and researchers the flexibility to experiment with different architectures and approaches without being locked into a single ecosystem.
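As a rough illustration of what that model compatibility means in practice, the snippet below shows how an open-weights model is typically run locally with an off-the-shelf runtime such as llama-cpp-python. Tiiny AI has not disclosed the Pocket Lab’s own software stack, and the model file name here is a placeholder, so treat this purely as a sketch of local, offline inference with open models.

```python
# Generic example of running an open-weights model locally with
# llama-cpp-python. This is NOT the Pocket Lab's runtime; the model
# file path below is a placeholder for any locally stored GGUF model.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/qwen2.5-7b-instruct-q4_k_m.gguf",  # hypothetical local file
    n_ctx=4096,     # context window
    n_threads=12,   # matches a 12-core CPU
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarise why on-device inference helps privacy."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```

The point of the example is simply that nothing leaves the machine: the weights, the prompt, and the generated text all stay on local storage and in local memory.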
A key component of the Pocket Lab is its discrete neural processing unit (NPU), which Tiiny AI says can deliver up to 190 trillion operations per second. This specialised hardware is purpose-built for AI workloads, enabling far faster inference than a general-purpose CPU alone. The device also includes 80 gigabytes of LPDDR5X memory, an unusually large amount for hardware of this size. That capacity supports aggressive quantisation techniques, which reduce the precision of model parameters in a controlled way so that very large models can run efficiently on local hardware.
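A quick back-of-the-envelope calculation shows why the memory figure matters. Assuming standard weight-only quantisation (the exact formats Tiiny AI uses have not been disclosed), a 120-billion-parameter model’s weights only fit within 80 GB once they are stored at roughly 4 bits each:

```python
# Back-of-the-envelope memory footprint for a 120B-parameter model at
# common weight precisions. Illustrative only; Tiiny AI has not said
# which quantisation formats the Pocket Lab actually uses.

PARAMS = 120e9  # 120 billion parameters

for name, bits in [("FP16", 16), ("INT8", 8), ("INT4", 4)]:
    gigabytes = PARAMS * bits / 8 / 1e9  # bits -> bytes -> GB
    fits = "fits within" if gigabytes <= 80 else "exceeds"
    print(f"{name}: ~{gigabytes:.0f} GB of weights ({fits} the 80 GB of LPDDR5X)")
```

At 16-bit precision the weights alone would occupy around 240 GB; at 4-bit they shrink to roughly 60 GB, which is why quantisation is central to the 120-billion-parameter claim.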
Tiiny AI argues that this combination of processing power and memory enables users to work with advanced models without offloading tasks to the cloud. For developers, this could mean faster iteration and testing. For privacy-conscious users, it could reduce the risk of sensitive data being transmitted to external servers. The company sees these benefits as increasingly important as AI becomes more deeply embedded in everyday tools and workflows.
Software innovations behind the hardware
Hardware alone does not enable the Pocket Lab to function as a miniature supercomputer. Tiiny AI has also developed its own software technologies to improve efficiency and performance. One of these is TurboSparse, a neuron-level sparse activation method designed to increase inference efficiency without reducing model intelligence. By activating only the most relevant parts of a neural network during computation, TurboSparse aims to cut unnecessary processing while maintaining output quality.
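TurboSparse itself is proprietary and undocumented here, but the general principle of neuron-level sparse activation can be sketched: identify the small fraction of neurons in a feed-forward layer that fire strongly for a given input, and spend compute only on those. The toy function below illustrates that idea under simplifying assumptions; it is not Tiiny AI’s implementation, and a production system would predict the active set cheaply rather than computing all pre-activations first.

```python
import numpy as np

def sparse_ffn(x, w_in, w_out, keep_ratio=0.1):
    """Toy feed-forward layer that only uses the strongest-firing neurons.

    Illustrates the general idea of neuron-level sparse activation; it is
    NOT TurboSparse, whose details are unpublished.
    """
    pre = x @ w_in                                   # pre-activations, shape (hidden,)
    k = max(1, int(keep_ratio * pre.shape[-1]))
    active = np.argpartition(np.abs(pre), -k)[-k:]   # indices of the k strongest neurons
    h = np.maximum(pre[active], 0.0)                 # ReLU on the active subset only
    return h @ w_out[active]                         # down-projection touches only k rows of w_out

rng = np.random.default_rng(0)
x = rng.standard_normal(1024)
w_in = rng.standard_normal((1024, 4096)) * 0.02
w_out = rng.standard_normal((4096, 1024)) * 0.02
y = sparse_ffn(x, w_in, w_out)   # ~10% of the down-projection work of a dense layer
print(y.shape)                   # (1024,)
```

The appeal on memory-constrained hardware is that the skipped neurons’ weights need not even be read for a given token, reducing both arithmetic and memory traffic.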
Another core technology is PowerInfer, a heterogeneous inference engine that divides AI workloads between the CPU and the NPU. This approach allows each component to focus on the tasks it handles best, resulting in higher overall performance while keeping power consumption low. Tiiny AI says this balance is crucial for delivering server-grade capabilities in a portable device that does not require active cooling or a large power supply.
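Again as a hedged illustration rather than a description of Tiiny AI’s actual engine, the heart of heterogeneous inference is a placement policy: each operator is routed to whichever processor executes it most efficiently. The schematic below invents its device names, cost model, and operator list purely to show the shape of such a policy.

```python
# Schematic sketch of a heterogeneous inference scheduler. Device names,
# throughput figures, and operators are invented for illustration; this
# is not PowerInfer or the Pocket Lab's actual runtime.

from dataclasses import dataclass

@dataclass
class Op:
    name: str
    flops: float        # arithmetic work
    irregular: bool     # gather-heavy / control-flow ops suit the CPU better

# Hypothetical relative throughput (operations per second)
THROUGHPUT = {"npu": 190e12, "cpu": 2e12}

def place(op: Op) -> str:
    """Route dense, regular tensor work to the NPU; keep irregular or
    very small ops on the CPU, where dispatch overhead would dominate."""
    if op.irregular or op.flops < 1e8:
        return "cpu"
    return "npu"

layer = [
    Op("embedding_lookup", 5e7, irregular=True),
    Op("attention_matmul", 4e11, irregular=False),
    Op("sparse_ffn_active", 8e10, irregular=False),
    Op("sampling", 1e6, irregular=True),
]

for op in layer:
    dev = place(op)
    est_ms = op.flops / THROUGHPUT[dev] * 1e3
    print(f"{op.name:>20} -> {dev.upper()} (~{est_ms:.3f} ms)")
```

The design goal such a split serves is the one Tiiny AI describes: keep the NPU saturated with the heavy, regular matrix work while the CPU handles the awkward pieces, so the whole pipeline stays fast without the power draw of a large accelerator.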
Together, these technologies are intended to make the Pocket Lab suitable for a wide range of use cases. Researchers could use it to test and refine models locally before deploying them at scale. Robotics developers might integrate it into autonomous systems that need on-device intelligence without constant connectivity. Others may use it for advanced reasoning tasks, creative applications, or personal AI assistants that operate entirely offline.
The company is careful to present the Pocket Lab as an experimental and exploratory tool rather than a replacement for large data centre systems. However, by showing what is possible in such a small form factor, Tiiny AI is challenging assumptions about where powerful AI computing must live. The idea that a single user can own and operate a machine capable of running very large models marks a significant shift from the cloud-first approach that has dominated recent years.
Tiiny AI plans to publicly demonstrate the Pocket Lab at CES 2026, where it is expected to attract attention from both industry professionals and enthusiasts. While details around pricing, availability, and exact performance benchmarks have yet to be revealed, the unveiling has already sparked discussion about the future of personal AI hardware. If the device performs as promised in real-world scenarios, it could signal a move towards more decentralised and user-controlled AI computing.
As AI continues to evolve, products like the Pocket Lab highlight growing interest in bringing advanced capabilities closer to the end user. Whether Tiiny AI’s approach will reshape the market remains to be seen. Still, the company’s record-breaking device has clearly made a statement about how small, and how personal, a powerful AI system can become.