Unshackling Edge AI Performance and LLMs

December 14, 2023

I recently joined Hall Martin of TEN Capital Networks during a digital event to discuss the rapid growth of AI and the need to shift away from cloud computing towards computing at the edge. We delved into how cloud computing has significant limitations in latency and bandwidth, making it unsuitable for high-speed or data-intensive AI applications (e.g., manufacturing, robotics, surveillance). At the same time, edge platforms often have shackled computing capabilities due to strict power, size, cooling, and cost limitations. This means that you don’t see the biggest and most capable AI being deployed in edge platforms, which has held back a renaissance in edge AI. But we can break through that.

The surge of Large Language Models (LLMs) in 2023 has shown what removing those computing shackles can do. Companies like OpenAI have built bigger and better models and released the best versions to the public. As a result, they've reached a tipping point where the AI capabilities are hugely impactful, which has led to a surge of interest and further innovation. In the cloud space, there are fewer barriers between developing a cutting-edge neural network and then serving it to millions of users.

By many accounts, a machine used to serve GPT-4 cost more than the median home price in the USA this year. Though, to be fair, GPUs are one of the few resources that are scarcer than housing lately. This high cost is feasible mainly due to two things: the same system serves many users, and the bandwidth required for each user (just text or individual images) is very low. Edge applications typically have a much smaller number of users per system (perhaps even just one), and the data is often high bandwidth, like HD video or RF sensors.

With the M1076, Mythic uses analog in-memory computing to break these shackles at the edge and usher in a new renaissance in edge AI computing. The M1076 already enables full-HD AI at <4 watts of power. Built on years of experience, customer feedback, and innovations in hardware and software, Mythic’s M2000 series will bring another order of magnitude leap beyond that in power and cost efficiency. Analog in-memory computing is a new technology, and it will be exciting to see where it can lead us in the coming years.

What attracts Mythic to edge computing is that the latency and bandwidth challenges are so significant and showcase the advantages of Mythic's technology. The enormous challenges of LLMs are similarly exciting. With upwards of a trillion parameters to move through the processor, systems struggle to provide enough memory bandwidth to execute LLMs efficiently. Mythic's in-memory computing eliminates memory bandwidth problems, so there is a natural fit between these technologies. Our early analysis has indicated order-of-magnitude improvements across cost, performance, and power over Nvidia and leading LLM startups.
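To see why bandwidth, not raw compute, becomes the bottleneck, a rough back-of-envelope calculation helps. The numbers below (a 1-trillion-parameter model, 16-bit weights, 20 tokens per second) are illustrative assumptions, not figures from Mythic or any vendor:

```python
# Back-of-envelope: memory bandwidth needed for autoregressive LLM decoding.
# All numbers are illustrative assumptions, not measured vendor figures.

params = 1.0e12          # assume a 1-trillion-parameter model
bytes_per_param = 2      # 16-bit weights
tokens_per_sec = 20      # a comfortable interactive generation rate

# Decoding one token touches every weight once, so the weight traffic is:
bytes_per_token = params * bytes_per_param      # 2 TB read per token
required_bw = bytes_per_token * tokens_per_sec  # bytes per second

print(f"{required_bw / 1e12:.0f} TB/s of weight bandwidth")  # → 40 TB/s
```

A single high-end datacenter GPU offers on the order of a few TB/s of HBM bandwidth, which is why cloud deployments lean on large batches and many devices, and why computing directly inside the weight memory sidesteps the problem entirely.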

This makes us wonder: what will we unlock if we have those order-of-magnitude improvements in LLM computing? Could we marry high-performance LLMs with edge computing? Google published an exciting paper on the combination of robotics, computer vision, and LLMs earlier this year. They have also recently shown what a multi-modal LLM could do while watching video. LLMs enable systems to respond reliably to novel situations instead of relying only on pre-set routines. This makes for fascinating possibilities. If someone has a business full of LLM+CV-enabled robots, they'll need to do that processing at least on-prem. I am okay with waiting a few seconds for GitHub Copilot to respond with coding suggestions, but edge devices need to respond instantaneously so that today's robots don't move the same way jerky '80s robots did.

Use cases for edge AI keep growing. Between unshackled edge computing and the arrival of LLMs, the impact spans the industrial, vehicle, health and safety, AR/VR, RF and signal processing, and intelligent systems segments. As 2024 nears, edge AI lets consumers and designers integrate devices more naturally into our lifestyles, but we will need continued breakthroughs in AI computing to see the full benefits of AI at the edge.

To learn more about the latest AI and edge computing trends and watch the presentation for yourself, visit this webpage.
