External GPU vs. internal GPU for machine learning

What are the pros/cons of using external GPUs (e.g., connected through Thunderbolt) vs. internal GPUs for machine learning?

Topic: GPU performance

Category: Data Science


Here is a good discussion of a similar question on Reddit:

https://www.reddit.com/r/MachineLearning/comments/twnz1o/d_which_external_nvidia_gpu_should_i_buy_for/

Essentially, when connecting an external GPU, a lot depends on the PC you are connecting it to. That PC needs a decent CPU and general hardware for the GPU to perform well, which is why many people recommend building a powerful GPU desktop from the start instead of buying an external GPU. That, however, comes at a much higher cost. An alternative to an external GPU is using cloud GPUs from services like Colab or more powerful offerings. They come at a monthly fee, but can be easier, quicker, and cheaper than building your own PC.


External GPUs are often easier to swap out than internal GPUs, so there are more options for hardware and software upgrades over time.

Internal GPUs usually have a faster connection to the rest of the system (e.g., a PCIe x16 slot rather than Thunderbolt), so moving data is less likely to become a constraint (i.e., GPU starvation).
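Whether the link is actually a bottleneck can be checked empirically. Below is a minimal sketch, assuming PyTorch and a CUDA-capable GPU, that times repeated host-to-device copies from pinned memory and reports the effective bandwidth; the function name measure_h2d_bandwidth and the 1 GiB buffer size are arbitrary choices for illustration.

```python
import time
import torch

def measure_h2d_bandwidth(size_mb=1024, repeats=10):
    """Copy a pinned host tensor to the GPU repeatedly and report average GB/s."""
    assert torch.cuda.is_available(), "No CUDA device found"
    n_bytes = size_mb * 1024 * 1024
    # Pinned (page-locked) host memory gives the most realistic transfer numbers.
    host = torch.empty(n_bytes, dtype=torch.uint8, pin_memory=True)
    device = torch.empty(n_bytes, dtype=torch.uint8, device="cuda")

    # Warm-up copy so one-time setup overhead is not measured.
    device.copy_(host, non_blocking=True)
    torch.cuda.synchronize()

    start = time.perf_counter()
    for _ in range(repeats):
        device.copy_(host, non_blocking=True)
    torch.cuda.synchronize()
    elapsed = time.perf_counter() - start

    gb_per_s = (n_bytes * repeats) / elapsed / 1e9
    print(f"Host-to-device bandwidth: {gb_per_s:.1f} GB/s")

if __name__ == "__main__":
    measure_h2d_bandwidth()
```

For a rough sense of scale, a Thunderbolt 3 enclosure typically exposes about a PCIe 3.0 x4 link (on the order of 3-4 GB/s usable), while an internal PCIe 3.0 x16 slot offers around 16 GB/s, so if your training loop streams a lot of data per step, the external link can starve the GPU.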
