While GPUs (such as Nvidia's H100) are the industry-standard AI accelerator, Google designs its own accelerators called TPUs (Tensor Processing Units).
TPUs are purpose-built for the matrix multiplication operations at the heart of neural networks, and Google reports significant cost-performance benefits, particularly for training large foundation models.
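To make the workload concrete: the core of almost every neural-network layer is a matrix multiplication, which is exactly the operation a TPU's matrix units accelerate. The sketch below uses NumPy and illustrative shapes (not tied to any real model or to TPU hardware) to show a single dense layer's forward pass as one matmul plus a bias add.

```python
import numpy as np

# A single dense (fully connected) layer is one matrix multiplication
# plus a bias add -- the kind of workload TPU matrix units accelerate.
# Shapes are illustrative assumptions, not from any real model.
rng = np.random.default_rng(0)

batch, d_in, d_out = 32, 512, 256
x = rng.standard_normal((batch, d_in)).astype(np.float32)  # activations
w = rng.standard_normal((d_in, d_out)).astype(np.float32)  # weights
b = np.zeros(d_out, dtype=np.float32)                      # bias

y = x @ w + b      # the core matmul a neural-network layer performs
print(y.shape)     # (32, 256)
```

Training a large model repeats operations like this trillions of times across many layers, which is why hardware specialized for dense matmuls pays off.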
The latest generation, Cloud TPU v5p, is what Google uses to train the Gemini models.