Google launches AI chip to speed up training of LLMs

Google has introduced Cloud TPU v5e, which it calls its most cost-efficient and versatile TPU to date. The new tensor processing unit is aimed at the growing demand for computing infrastructure that can handle workloads such as generative AI and large language models. Google states that Cloud TPU v5e delivers up to 2x higher training performance per dollar and up to 2.5x higher inference performance per dollar than its predecessor, Cloud TPU v4.
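For readers who want to check whether a Cloud TPU instance is actually visible to their training framework, the sketch below is a minimal example using JAX. It assumes a TPU VM with JAX preinstalled and simply enumerates the accelerator cores the runtime reports; it does not benchmark anything.

```python
# Minimal sketch, assuming a Cloud TPU VM with JAX preinstalled.
# It only lists the accelerator cores the JAX runtime can see.
import jax

devices = jax.devices()
print(f"Detected {len(devices)} accelerator core(s)")
for d in devices:
    # device_kind reports the hardware generation string for the chip
    print(d.platform, d.device_kind)
```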

from Gadgets News – Latest Technology News, Mobile News & Updates https://ift.tt/BAgT2Ed
