Is there any point in using a Colab TPU for inference?

I have a Colab Pro+ subscription and, like many, I'm finding that the GPU allowance is rather small for the price. While I wait for GPU access, I was wondering whether the TPU VM would be a substitute. It's running now and seems slower; I have not adjusted my code. Is there any point in this? To be honest, I'm not quite clear on the difference between a TPU and a GPU. I ran lscpu in the console, and it looks like the processor is a 20-core Xeon, compared to the 8-core Xeon on the GPU runtime. Is that all the TPU is, a monster CPU chip?
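For what it's worth, the Xeon that lscpu reports is only the host VM's CPU; the TPU itself is a separate accelerator attached to that host, so it doesn't show up in lscpu at all. A minimal sketch of one way to check for it, assuming an older-style Colab TPU runtime that exposes the accelerator's address through the `COLAB_TPU_ADDR` environment variable (newer TPU VM runtimes may not set this, so treat it as a heuristic):

```python
import os

def tpu_attached() -> bool:
    """Heuristic check for a Colab TPU runtime.

    lscpu only describes the host VM's CPU (e.g. a 20-core Xeon).
    The TPU is a separate matrix-multiply accelerator; on older Colab
    TPU runtimes its gRPC address is exposed via COLAB_TPU_ADDR
    (assumption: newer TPU VM runtimes may not set this variable).
    """
    return "COLAB_TPU_ADDR" in os.environ

print("TPU attached:", tpu_attached())
```

If this returns True but your code runs slowly anyway, that is expected: unlike a GPU, a TPU only speeds things up when the framework explicitly compiles and dispatches work to it, so unmodified CPU/GPU code falls back to the host Xeon.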

Thanks
