GPU vs CPU: Which One Do You Need If You Want to Learn Deep Learning

Let us explain the difference between a CPU and a GPU in the context of deep learning.

Recently, I had an interesting experience while training a deep learning model. To make a long story short, I'll tell you the result first: CPU-based computing took 42 minutes to train one epoch over 2,000 images, while GPU-based computing took only 33 SECONDS!

Believe it or not, this was actually the first time I experienced the difference between a CPU and a GPU in deep learning.

GPU vs CPU

As a data scientist or machine learning enthusiast, you will inevitably hear a statement like this over and over again:

Deep learning needs a lot of computational power.

The GPU is the key to solving this problem. But for someone like me, with no background in hardware or computer science, the question is: why?

Here is a video that I really like, which explains the mechanics of and differences between a GPU and a CPU.

Basically, a GPU is very powerful at processing massive amounts of data in parallel, while a CPU is good at sequential processing.

GPUs are usually used for graphics rendering (what a surprise). That's why gaming consoles (Xbox, PlayStation, Nintendo Switch) all require decent GPUs. CPUs are more often used for general computation.
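If you want to see this difference on your own machine, here is a minimal sketch (assuming PyTorch is installed and, for the GPU run, a CUDA-capable card) that times one large matrix multiplication on each processor:

```python
import time
import torch

def time_matmul(device: str, size: int = 4096) -> float:
    """Multiply two size x size random matrices on the given device, return seconds."""
    a = torch.randn(size, size, device=device)
    b = torch.randn(size, size, device=device)
    if device == "cuda":
        torch.cuda.synchronize()  # make sure setup work is done before timing
    start = time.time()
    _ = a @ b
    if device == "cuda":
        torch.cuda.synchronize()  # GPU kernels run asynchronously; wait for the result
    return time.time() - start

print(f"CPU: {time_matmul('cpu'):.3f} s")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul('cuda'):.3f} s")
else:
    print("No CUDA GPU detected; skipping the GPU run.")
```

The exact numbers depend on your hardware, but a matrix multiplication like this is exactly the kind of massively parallel workload where the GPU pulls ahead.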

More importantly, though: which one do we need for learning deep learning?

Is a CPU a must?

The short answer is yes. Most laptops and desktops, whether you're using a MacBook or a PC, come with a CPU. You must have seen "Dual-Core Intel i5", "i7", and the like somewhere while browsing for computers online or in a shop. The CPU is the electronic circuitry that executes the instructions that make up a computer program.

The next question is: if there's a CPU, is a GPU a must?

Yes and no.

You'll be surprised how much you can learn without a GPU machine on your machine learning or deep learning journey. Your laptop is perfectly fine if you're just starting out. Only if you plan to work with a large number of images, as we do, or to test extremely complicated models, will you need a GPU to speed things up.

From my experience:

1. If you're taking online machine learning courses, most of the datasets they use are subsets of bigger datasets or resampled images. My laptop, a six-year-old MacBook, has been able to handle all the Coursera or YouTube courses I've followed so far. Great pal.


2. You can always test your code without a GPU. Common libraries like TensorFlow and PyTorch can be told to use the CPU specifically. Even better, they can use the GPU too: the same script will work, and some libraries can detect the optimal device and switch between processors, as shown in the sketch after this list. (But be careful: you might need to install additional packages such as "tensorflow-gpu". A lesson learnt after training on the CPU for three days.)

3. Make good use of online platforms. Buying a good GPU can be expensive, but there are plenty of free GPU resources online; you can simply sign up for Kaggle or Google Colab.

4. There are also paid services if you'd like to try bigger platforms such as Amazon Web Services (AWS), Microsoft Azure, or Google Cloud.
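To make point 2 concrete, here is a minimal sketch (assuming both PyTorch and TensorFlow are installed) of how each library can detect whether a GPU is available and fall back to the CPU otherwise:

```python
import torch
import tensorflow as tf

# PyTorch: pick the device once, then move models and tensors to it explicitly.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"PyTorch will run on: {device}")
model = torch.nn.Linear(10, 1).to(device)  # the same line works on CPU or GPU

# TensorFlow: device placement is automatic, but you can check what it sees.
gpus = tf.config.list_physical_devices("GPU")
print(f"TensorFlow sees {len(gpus)} GPU(s)")
```

Depending on your library version, the separate "tensorflow-gpu" package mentioned above may or may not be needed; recent TensorFlow releases bundle GPU support in the main package.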

The conclusion today is that you don't need a GPU to learn machine learning, but it's recommended if you're planning to work in this field in the long term.
