Arm has unveiled a major step forward in on-device AI with the introduction of dedicated neural accelerators built directly into its future GPUs. Announced at SIGGRAPH, this new architecture is aimed at delivering console-quality, AI-enhanced graphics to mobile devices while cutting GPU workloads by as much as 50 percent.
The company’s first application of this approach is Neural Super Sampling (NSS), an AI-powered upscaler that can double rendered resolution at a processing cost of just 4 milliseconds per frame. A game rendered at 540p could therefore be upscaled to 1080p with near-native quality, preserving surface detail, lighting, and motion clarity, while giving developers the flexibility to boost frame rates, improve visual fidelity, or reduce power consumption.
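To make that trade-off concrete, here is a back-of-the-envelope sketch of the frame-time arithmetic. The 4 ms NSS cost and the 540p-to-1080p resolutions come from the announcement; the native 1080p render time and the assumption that shading cost scales linearly with pixel count are illustrative simplifications, not Arm figures.

```python
# Back-of-the-envelope frame-budget check for NSS-style upscaling.
# From the announcement: ~4 ms NSS cost, 540p -> 1080p upscale.
# The base render time below is a hypothetical illustrative figure, not an Arm number.

RENDER_MS_AT_1080P = 14.0              # assumed native 1080p render time per frame
NSS_COST_MS = 4.0                      # NSS processing cost per frame (from the announcement)
FRAME_BUDGET_60FPS_MS = 1000.0 / 60.0  # ~16.7 ms per frame at 60 fps

def pixels(width: int, height: int) -> int:
    return width * height

native_px = pixels(1920, 1080)
internal_px = pixels(960, 540)
pixel_ratio = internal_px / native_px  # 0.25: a quarter of the shading work

# Assume shading cost scales roughly with pixel count (a simplification).
render_ms_at_540p = RENDER_MS_AT_1080P * pixel_ratio
total_with_nss = render_ms_at_540p + NSS_COST_MS

print(f"Native 1080p render:     {RENDER_MS_AT_1080P:.1f} ms")
print(f"540p render + 4 ms NSS:  {total_with_nss:.1f} ms")
print(f"60 fps frame budget:     {FRAME_BUDGET_60FPS_MS:.1f} ms")
print(f"Fits the 60 fps budget:  {total_with_nss <= FRAME_BUDGET_60FPS_MS}")
```

With these placeholder numbers, rendering internally at 540p and upscaling costs about 7.5 ms per frame versus 14 ms for native 1080p, and the headroom can be spent on higher frame rates, richer visuals, or lower clocks and power, which is the flexibility the announcement describes.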
Open Tools for Early Development
Hardware with neural acceleration will not ship until 2026, but Arm is releasing a fully open Neural Graphics Development Kit immediately so developers can start integrating the technology well ahead of launch. The kit is designed to fit directly into existing workflows and includes:
- An Unreal Engine plugin for rapid adoption in game pipelines
- PC-based Vulkan emulation for prototyping
- Updated profiling tools for AI-augmented rendering
- Fully open neural models hosted on GitHub and Hugging Face
- Arm ML extensions for Vulkan, introducing a dedicated Graph Pipeline for neural network inference alongside traditional graphics and compute pipelines
The decision to keep the models, weights, and tools open is deliberate. It allows studios to retrain and adapt them for their own needs without licensing barriers, making it easier for both AAA developers and smaller studios to experiment with neural rendering.
From Gaming to Broader On-Device AI
While the initial emphasis is on mobile gaming, Arm expects neural acceleration to benefit other workloads such as real-time camera processing, AI-driven productivity tools, and even mobile path tracing. The roadmap includes Neural Frame Rate Upscaling, which can double frame rates without doubling rendering load, and Neural Super Sampling and Denoising, aimed at enabling real-time ray tracing on mobile devices. Both are planned for release ahead of the supporting hardware.
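As a rough illustration of how frame generation changes the maths, the sketch below compares the GPU work needed to present 60 fps by generating every other frame against rendering all 60 frames natively. The "double frame rates without doubling rendering load" claim is from the roadmap; the render and generation timings are hypothetical placeholders, not published Arm figures.

```python
# Illustrative arithmetic for Neural Frame Rate Upscaling (frame generation).
# Placeholder timings below are assumptions for the sake of the example.

RENDER_MS_PER_FRAME = 25.0   # assumed GPU cost to render one frame
GEN_MS_PER_FRAME = 3.0       # assumed cost to synthesize one in-between frame
RENDERED_FPS = 30            # frames the GPU actually renders each second

# One generated frame is inserted after each rendered frame.
presented_fps = RENDERED_FPS * 2

gpu_ms_with_generation = RENDERED_FPS * (RENDER_MS_PER_FRAME + GEN_MS_PER_FRAME)
gpu_ms_rendering_all = presented_fps * RENDER_MS_PER_FRAME

print(f"Presented frame rate:            {presented_fps} fps")
print(f"GPU work with frame generation:  {gpu_ms_with_generation:.0f} ms per second")
print(f"GPU work rendering every frame:  {gpu_ms_rendering_all:.0f} ms per second")
# With these placeholder numbers, frame generation reaches 60 fps using ~840 ms of
# GPU time per second, whereas rendering every frame natively would need 1500 ms,
# i.e. more than a second of work per second, so it could not hit 60 fps at all.
```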
By reducing GPU load and making AI inference a native part of the graphics pipeline, Arm’s approach addresses one of the biggest challenges in mobile computing: delivering high-quality visuals without draining the battery or triggering thermal throttling.
Industry Support and Potential Impact
Early support for the development kit comes from some of the biggest names in gaming technology. Enduring Games, Epic Games (Unreal Engine), NetEase Games, Sumo Digital, Tencent Games, and Traverse Research have all committed to exploring and optimising the platform. These partnerships suggest that when hardware becomes available, titles will be ready to take advantage of neural acceleration from day one.
For developers, the combination of open tools, early access, and a clear hardware roadmap provides an opportunity to design for the next generation of mobile graphics today. For players, it could mean experiencing desktop-class rendering and AI-enhanced visuals on smartphones and tablets within the next few years.
By embedding AI acceleration directly into GPUs and keeping the development path open, Arm is positioning neural graphics as a foundational capability for the future of on-device computing.
Learn more and read the original article on www.arm.com