
The Ongoing Memory Challenge
Unless you have been living under a rock, you will be familiar with the current industry strain on memory. The prices of DDR5 and DDR4 have become unaffordable for consumer applications, largely because much of the available capacity has been reserved by the AI industry as it continues to expand its datacentres. It is not just the supply of existing chips that is an issue; chips still in development and planned production capacity have also been reserved, reducing availability for other users. The result is a ripple effect across the electronics industry, with RAM prices skyrocketing and the cost of finished products rising with them.
With all of this in mind, it is clear that engineers are going to be under immense pressure to find alternatives. If they continue to rely on the latest, most advanced memory technologies, the price of electronics will continue to rise, and availability will be far from guaranteed. So, what will engineers do? Will they simply resort to using older parts, or will they develop new methods to work around the shortage?
Well, as it turns out, engineers are already starting to look towards older technologies to provide solutions. In fact, it was only recently that we predicted here at The Component Club that engineers would start to turn their attention towards DDR3, and it looks like we were right!
QSMP-20 Module Utilises DDR3 Memory To Avoid Ongoing AI Pressure
Recently, Direct Insight announced that it now offers development support for its QSMP-20 system-on-module (SoM), a drop-in, pin-compatible upgrade for the QSMP-15 that helps engineers avoid the ongoing impact of AI demand on the DRAM market.
Designed for AI and related edge applications, the latest system-on-module from Direct Insight incorporates an industrial-grade STM32MP2-family MPU and uses lower-cost DDR3 RAM with shorter lead times than higher-end memory. This allows engineers to sidestep the significant pressure that AI infrastructure demand has placed on DDR4 and DDR5 supply.
The new QSMP-20 is manufactured by Ka-Ro Electronics and integrates STMicroelectronics’ STM32MP235 microprocessor, which pairs a dual-core ARM Cortex-A35 application processor with a 400MHz ARM Cortex-M33 coprocessor. Additional features include a 0.6 TOPS NPU AI accelerator, a 3D GPU, 512MB of DDR3L RAM, and 4GB of eMMC Flash, all integrated in a 27mm x 27mm solder-down package measuring just 2.6mm high.
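For readers wondering how the two processor domains cooperate in practice: on ST’s Linux platforms, firmware for a Cortex-M coprocessor is typically loaded and started from the application cores through the kernel’s generic remoteproc framework. The following is a minimal sketch of that pattern, not taken from the QSMP-20 documentation; the remoteproc node number and the firmware name (m33_firmware.elf) are assumptions that will depend on the actual BSP.

    /* Sketch: booting the Cortex-M33 from Linux on the Cortex-A35 cores
     * via the kernel's standard remoteproc sysfs interface.
     * Assumptions (not from the announcement): the M33 is exposed as
     * remoteproc0, and "m33_firmware.elf" is a hypothetical firmware
     * image placed in the kernel's firmware search path. */
    #include <stdio.h>
    #include <stdlib.h>

    static int write_sysfs(const char *path, const char *value)
    {
        FILE *f = fopen(path, "w");
        if (!f) {
            perror(path);
            return -1;
        }
        /* sysfs attributes expect a single short write */
        if (fprintf(f, "%s", value) < 0) {
            perror(path);
            fclose(f);
            return -1;
        }
        fclose(f);
        return 0;
    }

    int main(void)
    {
        /* Tell remoteproc which firmware image to load... */
        if (write_sysfs("/sys/class/remoteproc/remoteproc0/firmware",
                        "m33_firmware.elf") != 0)
            return EXIT_FAILURE;

        /* ...then start the coprocessor (writing "stop" halts it again) */
        if (write_sysfs("/sys/class/remoteproc/remoteproc0/state",
                        "start") != 0)
            return EXIT_FAILURE;

        puts("Cortex-M33 coprocessor started");
        return EXIT_SUCCESS;
    }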
The module's wide range of I/O interfaces includes CAN-FD, UART, SPI, I2C, audio, Gigabit Ethernet, SD card, USB, and a MIPI-DSI display interface. The QSMP-20's support for Linux and Android development makes it ideal for size- and power-sensitive applications requiring quick time-to-market.
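Because these interfaces are exposed through standard Linux kernel subsystems, application code can stay largely portable across modules. As a rough illustration, assuming the BSP brings the CAN-FD controller up as a SocketCAN device named can0 (our assumption, not stated in the announcement), receiving a single CAN-FD frame could look like this:

    /* Sketch: receiving one CAN-FD frame via Linux SocketCAN.
     * Assumes the BSP exposes the controller as "can0" and that the
     * interface has already been configured and brought up. */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <net/if.h>
    #include <sys/ioctl.h>
    #include <sys/socket.h>
    #include <linux/can.h>
    #include <linux/can/raw.h>

    int main(void)
    {
        int s = socket(PF_CAN, SOCK_RAW, CAN_RAW);
        if (s < 0) { perror("socket"); return 1; }

        /* Opt in to CAN-FD frames (up to 64 data bytes) */
        int enable_fd = 1;
        setsockopt(s, SOL_CAN_RAW, CAN_RAW_FD_FRAMES,
                   &enable_fd, sizeof(enable_fd));

        /* Resolve "can0" to an interface index and bind to it */
        struct ifreq ifr;
        strcpy(ifr.ifr_name, "can0");
        ioctl(s, SIOCGIFINDEX, &ifr);

        struct sockaddr_can addr = {0};
        addr.can_family = AF_CAN;
        addr.can_ifindex = ifr.ifr_ifindex;
        bind(s, (struct sockaddr *)&addr, sizeof(addr));

        /* Block until one frame arrives, then print its ID and length */
        struct canfd_frame frame;
        if (read(s, &frame, sizeof(frame)) > 0)
            printf("ID 0x%03X, %d bytes\n", frame.can_id, frame.len);

        close(s);
        return 0;
    }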
Additionally, the module's advanced security and power management features support IoT and industrial applications, and the design is ready for low-power, CRA-compliant (EU Cyber Resilience Act) system implementations. The QSMP-20's extended industrial operating temperature range of -25°C to +85°C and requirement for a single 3.3V supply make it ideal for applications ranging from industrial controls to low-power, fan-less, embedded systems.
The QSMP-20 is supported by the dedicated SYS-03-Q SBC development kit, which features a Yocto Linux BSP (board support package) with a graphical user interface and command-line tools for creating custom applications.
Does This Latest Announcement Mean That “The Latest Isn’t Always Greatest”?
The QSMP-20's use of DDR3 memory is not exactly groundbreaking news, but it does demonstrate that older memory technologies and process nodes can still be used effectively. Despite what many engineers and tech experts would have you believe, the latest process nodes are not essential; they are nice to have, but not strictly required.
Older technologies are more mature, cheaper, and easier to deploy, meaning that they often face fewer supply chain issues. But this leads us to ask the question: is modern technology too bloated?
In all honesty, the answer is a resounding yes, and it is something we really need to start thinking about. Computers of the late 1990s ran operating systems, office programs, and web browsers on 128MB of RAM or less, and yet managed to be useful. Many modern systems ship with 8–16GB of RAM, an increase of roughly two orders of magnitude, and modern operating systems and applications consume substantially more memory than their predecessors.
If the base functionality of these services hasn't changed, why do we need so much memory? Why do we need tens of cores per CPU? Why do we need 20TB disks? What practical applications require all of this computing power? Maybe we should return to an era in which RAM was limited, optimisation mattered, and we spent less time bloating our software and more time refining it.