Google launches AI chip for faster, cheaper LLM training

Google has introduced Cloud TPU v5e, which it calls its most cost-efficient and versatile TPU to date. The new tensor processing unit targets the growing demand for computing infrastructure that can handle workloads such as generative AI and large language models (LLMs). According to Google, Cloud TPU v5e delivers up to 2x higher training performance per dollar and up to 2.5x higher inference performance per dollar than its predecessor.

from Gadgets News – Latest Technology News, Mobile News & Updates https://ift.tt/BAgT2Ed
