Cerebras CEO says at Davos that Nvidia's AI dominance is being challenged

TECH
By Kavya Nair | Published at:
Overview

Cerebras Systems CEO Andrew Feldman says Nvidia's long-standing dominance in AI hardware is now being challenged. Speaking from Davos, Feldman pointed to Google training its Gemini AI without Nvidia GPUs and to the growing importance of fast inference as key shifts in the industry. Cerebras positions its Wafer Scale Engine as a viable alternative to Nvidia's products.

AI Chip Market Trends Are Shifting

The notion that artificial intelligence is tied exclusively to Nvidia's graphics processing units (GPUs) is now being actively dismantled, according to Andrew Feldman, co-founder and CEO of AI chip designer Cerebras Systems. Feldman's remarks at the World Economic Forum in Davos signal a significant potential disruption to Nvidia's current market leadership.

"I think we have already seen Nvidia's dominance being challenged," Feldman said. "The GPU is not the only machine that can be used for AI. The mental moat of those who thought AI meant only Nvidia has been broken."

Evidence of a Transition

Feldman cited Nvidia's reported $20 billion acquisition of Groq as evidence that GPUs have limitations. "Nvidia admitted it did not have a solution and spent $20 billion to buy the number-two player. That shows there are things GPUs cannot do well," he explained. Nvidia's move to acquire Groq's technology is a strategic effort, while Groq continues to operate as an independent entity.

Google's Shift and the Focus on Inference

According to Feldman, a major industry inflection point was Google's decision to train its Gemini AI model using its own Tensor Processing Units (TPUs) instead of Nvidia hardware. "We saw a foundation model trained entirely without Nvidia. That was a big moment," he said.

The Cerebras CEO emphasized the market's rapid pivot toward fast inference, the critical stage at which AI models are deployed for real-world applications. "Inference is where AI meets the real economy. That is where coding, agentic work, and deep research happen," Feldman explained. Cerebras Systems, founded in 2016, offers its Wafer Scale Engine (WSE), described as the world's largest AI accelerator, and aims to compete directly with established giants such as Nvidia.

Dismissing AI Bubble Concerns

Feldman also addressed fears of a bubble in the artificial intelligence market, dismissing the idea. "This is not a bubble. This is how a new technology spreads through the entire economy... People want to speed up AI. They do not want to wait. Fast inference is what users care about most today," he concluded.

Disclaimer: This content is for educational and informational purposes only and does not constitute investment, financial, or trading advice, nor a recommendation to buy or sell any securities. Readers should consult a SEBI-registered advisor before making investment decisions, as markets involve risk and past performance does not guarantee future results. The publisher and authors accept no liability for any losses. Some content may be AI-generated and may contain errors; accuracy and completeness are not guaranteed. Views expressed do not reflect the publication's editorial stance.