Cerebras Systems Accelerates Global Growth with New India Office

Led by Former Intel Leader Lakshmi Ramachandran, Cerebras Expands R&D Operations and Strengthens Local Customer Growth

SUNNYVALE, Calif. & BANGALORE, India--(BUSINESS WIRE)--#AI--Cerebras Systems, the pioneer in accelerating artificial intelligence (AI) compute, today announced its continued global expansion with the opening of a new office in Bangalore, India. Led by industry veteran Lakshmi Ramachandran, the new engineering office will focus on accelerating R&D efforts and supporting local customers. With more than twenty engineers currently employed and a target of more than sixty by year-end, Cerebras is looking to rapidly build its presence in Bangalore.

“India in general, and Bangalore in particular, is extremely well-positioned to be a hotbed for AI innovation. It has world-leading universities, pioneering research institutions and a large domestic enterprise market,” said Andrew Feldman, CEO and Co-Founder of Cerebras Systems. “Cerebras is committed to being a leader in this market. Under Lakshmi’s leadership, we are rapidly hiring top-notch engineering talent for Cerebras Systems India, as well as supporting sophisticated regional customers who are looking to do the most challenging AI work more quickly and easily.”

As part of the India expansion, Cerebras appointed Ramachandran as Head of Engineering and India Site Lead at Cerebras India. Based in Bangalore, Ramachandran brings more than 24 years of technical and leadership experience in software engineering. Prior to joining Cerebras, she held various engineering and leadership roles at Intel. Most recently, she was Senior Director at Intel's Data Center and AI group, responsible for delivering key capabilities of deep learning software for AI accelerators. She has extensive experience in scaling business operations and establishing technical engineering teams in India.

“I am honored to be part of Cerebras Systems’ mission to change the future of AI compute, and to work with the extraordinary team that made wafer-scale compute a reality,” said Ramachandran. “We have already begun to build a world-class team of top AI talent in India, and we are excited to be building core components here that are critical to the success of Cerebras’ mission. We look forward to adding more technology and engineering talent as we support the many customer opportunities we have countrywide.”

The Cerebras CS-2 is the fastest AI system in existence, and it is powered by the largest processor ever built: the Cerebras Wafer-Scale Engine 2 (WSE-2), which is 56 times larger than the nearest competitor. As a result, the CS-2 delivers more AI-optimized compute cores, more fast memory, and more fabric bandwidth than any other deep learning processor in existence. It was purpose-built to accelerate deep learning workloads, reducing the time to answer by orders of magnitude.

With a CS-2, Cerebras recently set a world record for the largest AI model trained on a single device. This is important because in natural language processing (NLP), larger models trained on larger datasets have been shown to be more accurate. Traditionally, however, only the largest and most sophisticated technology companies had the resources and expertise to train massive models across hundreds or thousands of graphics processing units (GPUs). By making it possible to train GPT-3 models on a single CS-2, Cerebras is enabling the entire AI ecosystem to set up and train large models in a fraction of the time.

Customers around the world are already leveraging the Cerebras CS-2 to accelerate their AI research across drug discovery, clean energy exploration, cancer treatment research and more. For example, pharmaceutical leader GSK is now able to train complex epigenomic models on a previously prohibitively large dataset, something made possible for the first time with Cerebras. AstraZeneca is iterating and experimenting in real time by running queries on hundreds of thousands of abstracts and research papers, a process that previously took over two weeks with a GPU cluster and is now completed in just over two days with Cerebras. And Argonne National Laboratory is using a CS-2 to study how the virus that causes COVID-19 works, running simulations on a single CS-2 that would otherwise require 110-120 GPUs.

The recent international expansion in India comes on the heels of Cerebras’ global growth across Japan and Canada in the past two years. With customers in North America, Asia, Europe and the Middle East, Cerebras is delivering industry-leading AI solutions to a growing roster of customers in the enterprise, government, and high-performance computing (HPC) segments, including GlaxoSmithKline, AstraZeneca, TotalEnergies, nference, Argonne National Laboratory, Lawrence Livermore National Laboratory, Pittsburgh Supercomputing Center, Leibniz Supercomputing Centre, National Center for Supercomputing Applications, Edinburgh Parallel Computing Centre (EPCC), National Energy Technology Laboratory, and Tokyo Electron Devices.

For more information on Cerebras India, please visit www.cerebras.net.

About Cerebras Systems

Cerebras Systems is a team of pioneering computer architects, computer scientists, deep learning researchers, and engineers of all types. We have come together to build a new class of computer system, designed for the singular purpose of accelerating AI and changing the future of AI work forever. Our flagship product, the CS-2 system, is powered by the world’s largest processor, the 850,000-core Cerebras WSE-2, which enables customers to accelerate their deep learning work by orders of magnitude over graphics processing units.

Contacts

Media Contact:
Kim Ziesemer

Email: [email protected]
