Cerebras Systems is redefining the future of AI with the world’s fastest inference engine, enabling breakthroughs in agentic intelligence, real-time reasoning, and AI coding assistants. Andrew Feldman, CEO and Co-Founder of Cerebras Systems, will share how ultra-fast inference, at speeds surpassing 3,000 tokens per second, is transforming enterprise AI and unlocking dramatic productivity gains in AI coding and development workflows. By accelerating state-of-the-art open-source models from Meta, OpenAI, Qwen, DeepSeek, and others, Cerebras delivers both unmatched performance and tangible ROI for enterprises worldwide. This session will explore the next 12 months of AI infrastructure innovation and why ultra-fast inference is the foundation for a new era of super-intelligent applications, where AI doesn’t just assist with coding but redefines how software itself is imagined, written, and deployed.

Sean Lie
Sean is Chief Technology Officer and co-founder of Cerebras Systems. Prior to Cerebras, Sean was Lead Hardware Architect of the I/O virtualization fabric ASIC at SeaMicro. After AMD acquired SeaMicro, Sean was named an AMD Fellow and Chief Data Center Architect. Earlier in his career, he spent five years on AMD’s advanced architecture team. He holds a BS and an MEng in Electrical Engineering and Computer Science from MIT and has authored 16 patents in computer architecture.