World’s Leading AI Inference Selected by Innovation Zone Attendees at TSMC’s North America Technology Symposium
Cerebras Systems, maker of the fastest AI infrastructure, announced that Cerebras AI Inference has been named Demo of the Year at the 2025 TSMC North America Technology Symposium. Voted on by attendees from TSMC's customers and partners, the award recognizes the most compelling and impactful innovation demonstrated in the Innovation Zone at TSMC's annual Technology Symposium.
“Wafer-scale computing was considered impossible for fifty years, and together with TSMC we proved it could be done,” said Dhiraj Mallick, COO, Cerebras Systems. “Since that initial milestone, we’ve built an entire technology platform to run most important AI workloads more than 20x faster than GPUs, transforming a semiconductor breakthrough into a product breakthrough used around the world.”
“At TSMC, we support all our customers of all sizes—from pioneering startups to established industry leaders—with industry-leading semiconductor manufacturing technologies and capacities, helping turn their transformative ideas into realities,” said Lucas Tsai, Vice President of Business Management, TSMC North America. “We are glad to work with industry innovators like Cerebras to enable their semiconductor success and drive advancements in AI.”
In 2019, Cerebras introduced the industry’s first functional wafer-scale processor—a single-die chip 50 times larger than conventional processors—breaking a half-century of semiconductor assumptions through its partnership with TSMC. The Cerebras CS-3 extends this lineage and continues a scaling law unique to Cerebras.
A Showcase of Innovation and Partnership
Cerebras demonstrated CS-3 inference in the TSMC North America Technology Symposium’s Innovation Zone, a curated exhibition area highlighting breakthrough technologies from across TSMC’s emerging customers. Cerebras AI Inference received the highest number of votes at the North America event, reflecting both the technical achievement and the excitement it generated among attendees.
Cerebras AI Inference Leading the Industry
Cerebras AI Inference is now used across the world’s most demanding environments. It is available through AWS, IBM, Hugging Face, and other cloud platforms. It supports cutting-edge national scientific research at U.S. Department of Energy laboratories and the Department of Defense, and global enterprises across healthcare, biotech, finance, and design have adopted Cerebras to accelerate their most complex AI workloads with real-time performance that GPUs cannot deliver.
Cerebras is also the fastest platform for AI coding—one of the fastest growing and most strategic AI verticals. It generates code more than 20 times faster than competing solutions.
Cerebras has been a pioneer in supporting open-source models from OpenAI, Meta, G42 and others, consistently achieving the fastest inference speeds as verified by independent benchmarking firm Artificial Analysis.
Cerebras now serves trillions of tokens per month across the Cerebras Cloud, on-premises deployments, and leading partner platforms.