Chip startup Mythic Inc. today announced that it has closed a $125 million funding round led by DCVC.
The venture capital firm was joined by NEA, Softbank KR, Honda Motor Co. and a long list of other backers. The investment brings Mythic’s total outside funding to more than $175 million.
Standard processors represent information as a series of individual electrical signals that each correspond to 1 or 0. Information processed in this manner is known as digital data. Mythic has developed a so-called analog processing unit, or APU, that takes a different approach. It represents data as fluctuations in a single, continuous electrical signal rather than as multiple individual signals.
Analog chips are mainly used for tasks such as distributing power to a computer’s components and filtering Wi-Fi interference. Mythic’s APU, in contrast, is designed to run artificial intelligence models. The company claims that its chip can run AI models with 100 times the performance per watt of traditional graphics cards.
Mythic’s APU is based on a design that it describes as a compute-in-memory architecture. The chip is made of memory circuits that not only store information but also process it. Calculations are carried out using resistors, tiny electronic components that impede the flow of electrical current.
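The principle behind compute-in-memory designs like this can be illustrated numerically. In a typical analog in-memory array (this is a generic sketch, not Mythic's actual circuit design), each cell's conductance encodes a weight, an input voltage encodes an activation, Ohm's law (I = G·V) performs the multiplication, and currents summing on a shared wire perform the addition:

```python
import numpy as np

# Hypothetical illustration of analog multiply-accumulate: weights are stored
# as cell conductances (siemens), activations arrive as voltages (volts).
conductances = np.array([[0.2, 0.5, 0.1],
                         [0.4, 0.3, 0.6]])
voltages = np.array([1.0, 0.5, 2.0])

# Each output wire's current is the dot product of its row of conductances
# with the input voltages -- an entire matrix-vector multiply in one step,
# without shuttling weights between separate memory and compute units.
currents = conductances @ voltages  # amperes
print(currents)                     # [0.65 1.75]
```

Because the weights never leave the memory array, this style of architecture avoids the memory-to-processor data movement that dominates the energy cost of conventional digital accelerators.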
An AI model is a collection of artificial neurons, code snippets that are organized into groups called layers. Each layer carries out a small portion of the calculations involved in analyzing a prompt and then passes its output to the next layer, which repeats the process. Mythic says its APU can run an AI model’s layers in parallel rather than one after another to speed up processing.
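The layer-by-layer flow described above can be sketched in a few lines. This is a minimal illustration with arbitrary weights, not a trained model: each layer consumes the previous layer's output, which is the sequential baseline that dedicated per-layer hardware can pipeline instead.

```python
import numpy as np

# Three illustrative layers, each represented by a random 4x4 weight matrix.
rng = np.random.default_rng(0)
layers = [rng.standard_normal((4, 4)) for _ in range(3)]

def forward(x, layers):
    # Sequential execution: every layer waits for the one before it.
    # Hardware that dedicates separate circuitry to each layer can instead
    # keep all layers busy on different inputs at once (pipelining).
    for w in layers:
        x = np.maximum(w @ x, 0.0)  # linear step followed by a ReLU
    return x

out = forward(np.ones(4), layers)
print(out.shape)
```

In the sequential version, total latency is the sum of all layer times; with one input streaming in behind another, a pipelined design approaches the throughput of its slowest single layer.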
The company is turning its technology into a product with a device called Starlight. According to Mythic, it contains multiple APUs that cumulatively consume less than one watt of power. Starlight can be embedded in edge systems such as robots to enhance the quality of the data collected by their image sensors.
Mythic also sees customers deploying its silicon in data centers. Ahead of today’s funding announcement, it tested the APU’s ability to run large language models. It determined that APU-powered servers can process up to 750 times more tokens per second than graphics cards.
The company provides a software toolkit that makes it easier for developers to adapt their LLMs to its chips. The toolkit optimizes AI models using quantization, a method that compresses neural network parameters to reduce their memory footprint. Mythic’s software can further boost an LLM’s performance by retraining it for the APU.
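Quantization in general works by mapping high-precision floating-point parameters onto a small integer range plus a scale factor. The sketch below shows generic int8 quantization (Mythic has not published its toolkit's internals, so this is an assumption about the standard technique, not its specific implementation):

```python
import numpy as np

# Illustrative float32 weights; real models have millions of these.
weights = np.array([0.82, -0.33, 0.05, -1.20, 0.47], dtype=np.float32)

# Per-tensor scale: map the largest-magnitude weight to +/-127.
scale = np.abs(weights).max() / 127.0

# Quantize to 8-bit integers -- roughly a 4x reduction in memory footprint
# versus float32 storage.
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)

# Dequantize to check how much precision was lost.
dequant = q.astype(np.float32) * scale
print(np.max(np.abs(weights - dequant)))  # worst-case rounding error
```

The rounding error is bounded by half a quantization step, which is why quantized models usually retain most of their accuracy; retraining, as Mythic's software reportedly does, can recover much of the rest.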
The company disclosed that it’s working on a new version of the chip. According to Mythic, future APUs will enable relatively low-power devices such as smartphones to locally run LLMs on par with GPT-3.
