Why Weebit’s ReRAM tech looks ready for the next era of smart devices
AI is moving to the edge, and Weebit’s ReRAM could quietly power the billions of devices leading the charge.
Stockhead
AI is moving to edge devices fast
Weebit’s ReRAM solves the chip bottleneck
Massive opportunity as billions of devices “get smart”
Special Report: AI inference is shifting from the cloud to edge devices like phones and cars, and Weebit’s ReRAM is shaping up as the memory tech that could quietly power billions of them
If you want to understand where the next great tech leap is happening, don’t look at the data centres.
Look at your wrist, your phone, your car. Your smart fridge.
We’re entering an era where artificial intelligence doesn’t just live in the cloud, it lives at the edge, and this shift has massive implications for memory technology.
Let’s start with a quick 101.
AI’s work falls into two camps: training and inference. Training is what happens in massive data centres, where models learn from huge datasets.
But once trained, those models are used – that’s inference.
And increasingly, inference happens on edge devices like smartphones, smartwatches, autonomous cars, drones, factory sensors, and wearables – right where the action is.
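To make the distinction concrete, here’s a toy sketch of what inference looks like once training is done: the “model” is just a set of numbers (weights) produced by training, and the device simply applies them to fresh sensor data, locally and instantly. The weights, inputs and function names below are invented purely for illustration.

```c
#include <stdio.h>

/* Toy "model": weights produced by training (done in the cloud, on big
 * datasets). On an edge device they would be stored in non-volatile memory. */
static const float WEIGHTS[3] = {0.8f, -0.5f, 0.3f};
static const float BIAS = 0.1f;

/* Inference: apply the already-trained weights to a new sensor reading.
 * No learning happens here - just fast, local arithmetic. */
static float infer(const float input[3]) {
    float score = BIAS;
    for (int i = 0; i < 3; i++) {
        score += WEIGHTS[i] * input[i];
    }
    return score;
}

int main(void) {
    const float sensor_reading[3] = {1.0f, 0.2f, 0.5f};
    printf("Inference score: %.2f\n", infer(sensor_reading));
    return 0;
}
```

Real edge models have millions of weights rather than three, which is exactly why where those weights live matters so much.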
This shift to the edge is happening because latency, power, privacy, and bandwidth all matter.
You don’t want your smartwatch sending data to the cloud every time it counts a step. And you can’t afford a 300-millisecond delay when your autonomous car needs to spot a pedestrian.
Edge inference is a juggernaut in motion.
According to ABI Research, the edge AI hardware market will grow to over US$70 billion by 2027, while McKinsey projects over 75 billion connected devices globally by the end of the decade.
But today’s chips can’t keep up
Right now, most edge AI systems rely on a two-chip setup: one chip to handle the computing (the processor), and a second chip, usually external Flash memory, to store the AI model weights.
On paper, this works. But in practice, it’s an expensive, power-hungry solution.
Communication between the two chips is slow and opens up security risks, since the link between them is easy to eavesdrop on. The ideal solution would be to integrate the memory and the processor on a single chip.
The real dealbreaker is that embedded Flash memory can’t scale below 28nm, while most AI applications already require smaller geometries.
That’s a problem, because chip designs are heading to more advanced nodes fast. So even if you wanted to integrate Flash on-chip with the processor, you simply can’t.
Flash has hit a brick wall.
Weebit solves this memory bottleneck
This is where Weebit Nano (ASX:WBT)’s Resistive RAM (ReRAM) technology comes into the picture.
ReRAM is a next-generation form of non-volatile memory (NVM) that stores data faster, uses less power and offers stronger security than Flash.
"ReRAM can scale to the most advanced semiconductor manufacturing processes where Flash isn’t a viable option.
“It also has much lower power consumption and is simpler and easier to integrate," said Weebit Nano’s CEO, Coby Hanoch.
NVM, meanwhile, is essentially memory that retains data even when power is switched off.
Think of your phone – when the battery dies, you don’t lose your messages or stored contacts. That’s non-volatile memory at work.
Instead of storing data via electric charge like Flash, ReRAM works by altering the resistance of materials.
This change in architecture means it’s less vulnerable to tampering, better suited for storing AI model weights, and easier to embed right onto the same chip as the processor.
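In greatly simplified terms, reading a resistive cell means sensing whether the material is in a low-resistance or high-resistance state and mapping that to a bit, rather than sensing stored charge the way Flash does. The threshold and values in this toy sketch are made up purely for illustration; real read circuits are far more sophisticated.

```c
#include <stdio.h>

/* Illustrative threshold only - real ReRAM resistance ranges and read
 * circuits vary by material, process and design. */
#define RESISTANCE_THRESHOLD_OHMS 10000.0

/* A resistive cell is read by sensing its resistance state, not by sensing
 * stored electric charge the way a Flash cell is. */
static int read_bit_from_resistance(double resistance_ohms) {
    return resistance_ohms < RESISTANCE_THRESHOLD_OHMS ? 1 : 0;
}

int main(void) {
    printf("Low-resistance cell  -> bit %d\n", read_bit_from_resistance(2000.0));
    printf("High-resistance cell -> bit %d\n", read_bit_from_resistance(500000.0));
    return 0;
}
```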
At the Embedded World 2025 event, Weebit’s ReRAM was shown in action, running live inside a real-time gesture recognition system developed with EMASS.
This setup combined Weebit’s ReRAM with EMASS’ ultra-low-power AI chip to run the kind of edge inference that’s becoming more common in smart devices.
The results were impressive.
The system could wake instantly, consume 50 times less power, and provide 10 times the memory bandwidth compared to traditional setups using external Flash memory.
"This ultra-low power edge AI system recognises simple hand gestures,” an EMASS engineer explained at the event.
“The weights for the detection model are being stored in the ReRAM, and our processor is doing the inferencing.”
In short, the demo showed that ReRAM isn’t just a future possibility, it’s ready for today.
Why ReRAM is a no-brainer for the future
So, Flash memory is reaching its limits. It’s power-hungry, increasingly insecure, and can’t scale into the smaller, faster world of tomorrow’s chips.
ReRAM, on the other hand, is doing what Flash can’t.
It stores the AI weights directly on-chip, so you don’t need to load them into SRAM every time the device powers on.
That cuts the power draw, reduces silicon real estate, and gets rid of a bunch of external components.
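For the technically inclined, here’s a simplified sketch of that difference. In the two-chip setup described earlier, the processor typically copies the model weights from the external Flash chip into RAM every time the device boots, over a bus that burns power and can be probed; with the weights embedded on the same die, they can be read in place. The sizes, names and simulated memories below are illustrative only, not anyone’s actual firmware.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define MODEL_SIZE_BYTES 64  /* a tiny stand-in for a real model */

/* Simulated contents of an external Flash chip (two-chip setup). */
static const uint8_t external_flash[MODEL_SIZE_BYTES] = {7, 3, 9};

/* Simulated on-chip non-volatile memory, e.g. embedded ReRAM (single chip). */
static const uint8_t onchip_nvm[MODEL_SIZE_BYTES] = {7, 3, 9};

static uint8_t ram_copy[MODEL_SIZE_BYTES];

/* Two-chip setup: every boot, the weights cross a slow external bus into RAM
 * before the processor can use them. */
static const uint8_t *load_weights_external(void) {
    memcpy(ram_copy, external_flash, MODEL_SIZE_BYTES);  /* the bus transfer */
    return ram_copy;
}

/* Single-chip setup: the weights already sit in on-chip memory mapped into
 * the processor's address space, so they are read in place - no copy at
 * boot, and no external link to eavesdrop on. */
static const uint8_t *load_weights_onchip(void) {
    return onchip_nvm;
}

int main(void) {
    printf("External Flash path: first weight byte %d\n", load_weights_external()[0]);
    printf("On-chip path:        first weight byte %d\n", load_weights_onchip()[0]);
    return 0;
}
```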
"ReRAM is also a much more secure solution compared to other non-volatile memories. It’s a natural fit for sectors where security is paramount, like automotive and IoT,” said Hanoch.
Weebit Nano is already working with leading foundries and integrated device manufacturers, including major deals with South Korea’s DB HiTek and US semiconductor company onsemi, and is expecting to close additional agreements with other manufacturers over the next six months and beyond.
The company is well funded, with a cash position of $93.7 million as of March 31 to help fund further R&D and commercial activities.
Hanoch also confirmed that despite the unfolding conflict with Iran, the company has refreshed its Business Continuity Plan and its Israel-based operations remain unaffected.
He added that adoption takes time, but once these memory technologies are embedded in a chipmaker’s flow, they tend to be extremely sticky.
“These agreements normally last decades, not years.”
The biggest challenge, he said, isn’t the tech itself. “Our biggest competitor is human fear.”
“But I think eventually all of them will be our customers.
“I don’t think it’s a matter of if; I don’t think they have a choice. I mean, they will all need ReRAM,” said Hanoch.
This article was developed in collaboration with Weebit Nano, a Stockhead advertiser at the time of publishing.
This article does not constitute financial product advice. You should consider obtaining independent advice before making any financial decisions.