Google launches AI chip for faster, cheaper LLM training and inference

Google has introduced Cloud TPU v5e, calling it its most cost-efficient and versatile TPU to date. The new tensor processing unit is aimed at the growing demand for compute infrastructure that can handle generative AI and LLM workloads. Google states that Cloud TPU v5e delivers up to 2x higher training performance per dollar and up to 2.5x higher inference performance per dollar compared to its predecessor, TPU v4.
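For readers who want to try the hardware, the sketch below shows a minimal JAX smoke test that one might run on a Cloud TPU v5e VM to confirm the TPU backend is visible and can execute a compiled computation. This is an illustrative assumption on my part (the article does not include code), not Google's benchmark setup; it assumes JAX is installed with TPU support on the VM.

```python
# Minimal sketch: verify that JAX sees the TPU backend and can run a
# jit-compiled matrix multiply (the core operation in LLM training and
# inference). Assumes a Cloud TPU VM with the TPU build of JAX installed.
import jax
import jax.numpy as jnp

# On a TPU host this should report "tpu" and list the TPU cores.
print("Backend:", jax.default_backend())
print("Devices:", jax.devices())

@jax.jit
def matmul(a, b):
    # A single matrix multiply, compiled by XLA for the TPU.
    return a @ b

key = jax.random.PRNGKey(0)
a = jax.random.normal(key, (1024, 1024), dtype=jnp.bfloat16)
b = jax.random.normal(key, (1024, 1024), dtype=jnp.bfloat16)

out = matmul(a, b)
print(out.shape, out.dtype)
```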

from Gadgets News – Latest Technology News, Mobile News & Updates https://ift.tt/BAgT2Ed
