EDACafe Industry Predictions for 2025 – Weebit Nano

January 24th, 2025, by Sanjay Gangal

Sanjay Gangal is the President of IBSystems, the parent company of AECCafe.com, MCADCafe, EDACafe.Com, GISCafe.Com, and ShareCG.Com.
By Coby Hanoch, CEO, Weebit Nano

**2025: A Year of Change for Embedded Memory**

As we enter 2025, I want to share a few of the trends we are seeing and the impact they will have on the use of non-volatile memory (NVM) in electronic systems. As a developer and licensor of innovative NVM technology – Weebit ReRAM – we are seeing a great deal of disruption in the industry, and we predict 2025 will mark a turning point for embedded NVM.

**2025 will be the year the industry standardizes on ReRAM as an embedded NVM.**

For more than 20 years, the semiconductor industry has been looking for the next NVM technology to replace flash. Many options have been considered, including FeRAM, PCM (and its derivatives Optane and 3D XPoint), MRAM, CBRAM and ReRAM. Over the years, most of these technologies were dropped due to cost or manufacturing complexity. MRAM is now in mass production, but it is expensive and susceptible to magnetic fields. In recent years, the industry has been shifting toward ReRAM, which is lower cost and simpler to manufacture. While flash memory is still the most popular NVM, the move to ReRAM is now clear, both because embedded flash can't feasibly scale beyond 28nm and because of ReRAM's cost, power, and performance advantages.
ReRAM technologies are already shipping in volume, and a growing number of foundries, IDMs and semiconductor companies are committing to this innovative technology as the leading embedded NVM in their roadmaps.

**Edge AI architectures will get faster and more efficient with ReRAM.**

ReRAM is the ideal NVM for edge AI inference. In addition to NVM's traditional code storage function, ReRAM can store the synaptic weights needed for neural network (NN) computations on the same chip as the AI engine. Since most AI applications are implemented at 28nm and below, embedded flash is not an option, so the weights must be stored on a separate chip. Beyond the cost issues, this also introduces a security risk, as the stored data can easily be read during communication between the two chips. Storing the weights on-chip means a lower-cost, more secure solution. In addition, because ReRAM is non-volatile, the inference chip can be shut down to save power whenever there is nothing to process, while keeping the on-chip dataset intact. In 2025, we will see a growing number of designs using ReRAM for near-memory compute architectures, as well as a great deal of investment in developing in-memory compute solutions using ReRAM.

**Automotive solutions with embedded ReRAM will enter the market.**

The big trend in automotive right now is software-defined vehicles (SDVs), as automakers integrate interactive AI systems, advanced displays, new comfort and personalization features, and more – all powered by software. NVM enables local storage and updating of the complex software functions that power the SDV. It must support frequent OTA updates, fast boot and instant response, often in harsh conditions. Due to the significant performance requirements of SDVs, automotive MCUs are being pushed down to lower geometries and have already hit the wall at 28nm.
Since embedded flash can't scale to meet this need, the NVM needed for these architectures is ReRAM, and several major automotive chip vendors are already making the move. Based on announcements made in the past couple of years by companies like Infineon and TSMC, in 2025 we will see the first automotive chips with ReRAM come to market. Weebit ReRAM is expected to complete AEC-Q100 automotive qualification (150°C lifetime operation and 100K-cycle endurance) in 2025.

**More BCD processes will get embedded NVM.**

If you're creating smart power management designs or other high-voltage products, especially in a Bipolar-CMOS-DMOS (BCD) process, you know that designs are increasingly complex. Many of today's power management ICs (PMICs) are evolving from simple functions to support trends like enhanced wireless charging and intelligent motor control. To manage this level of sophistication, PMICs must be smart and capable of running numerous algorithms, requiring an MCU coupled with NVM that is low-power, high-density, and cost-effective. In BCD processes, where significant attention is invested in optimizing Front-End-of-Line (FEOL) power components, integrating flash – which is also FEOL – becomes problematic because it impacts the analog circuitry. In addition, flash requires many added masks and manufacturing steps. All this forces companies to make compromises, resulting in degraded performance, larger size, and higher cost. In 2025, we will see more and more BCD processes integrating ReRAM as their embedded NVM of choice, as it is a Back-End-of-Line (BEOL) technology that is also much simpler and lower cost to manufacture. PMICs, audio CODECs and other high-voltage designs with ReRAM are already in production. BCD offerings using Weebit ReRAM will move ahead this year through Weebit customers including DB HiTek and onsemi.
As Weebit celebrates our 10th company anniversary this year, we already have several commercial agreements in place, and many more in the making. Our technology is available for design-in and is constantly improving, reaching higher endurance and temperature levels. An increasing number of companies are looking to embed ReRAM into their product offerings. In 2025, Weebit is well placed to deliver ultra-low-power, cost-effective ReRAM technology for a new generation of connected, AI-driven products.

Tags: AI architectures, automotive NVM, embedded memory, non-volatile memory, ReRAM, Weebit Nano