Ex-Apple Engineers Launch ElastixAI to Tackle AI’s Costliest Challenge

June 17, 2025
ElastixAI

At IGF, we support the bold ideas that reshape industries, and today we have another compelling example to share. In this article, we look at ElastixAI, a new Seattle-based startup founded by former Apple and Xnor engineers. Their goal is to solve one of the most complex challenges in AI: making large language model (LLM) inference more efficient, scalable, and cost-effective.

The company recently raised $16 million in a funding round led by FUSE, with participation from investors including Catapult, Tyche, Liquid 2 Ventures, and DNX Ventures. ElastixAI is still operating in stealth, but it is already making an impact.

ElastixAI Visionaries Building for the Future

ElastixAI was founded by CEO Mohammad Rastegari, a co-founder of Xnor; Saman Naderiparizi, a former senior engineering manager at Apple; and Mahyar Najibi. Drawing on their experience at Apple and Waymo, they built a software-first AI inference platform designed to run anywhere, from edge devices to massive cloud systems.

Inference is one of the most costly and technically demanding parts of deploying LLMs, and ElastixAI aims to close that performance gap. The platform lets users configure systems at a granular level and promises better results across a range of hardware.

IGF Vision

As AI adoption grows, infrastructure has become a top priority for many startups, and ElastixAI is addressing this head-on, setting the stage for smarter, leaner AI deployments. At IGF, we connect investors and startups to bring projects like this to life. We turn bold ideas into reality. Contact us now.

As this startup shows, some of the biggest opportunities lie beneath the surface, in the systems that make AI actually work.