Cloud Setup Tutorial Part-0
AWS Setup Tutorial
Why use Cloud compute?
Deep Learning is extremely computation intensive, and GPUs (Graphics Processing Units) perform really well on these computations. The reason: GPUs were originally developed for gaming, which involves a lot of matrix computations. Deep Learning involves a lot of matrix computations too, and recent advances in GPUs have led to their widespread use for Deep Learning.
Usually, GPU resources are not required in our everyday life, and most of us do not have laptops or 'rigs' (desktops) that can leverage this amount of power. So chances are you do not have a GPU at all, or have one that would take too long to train or work with Deep Learning models. An easy, hassle-free alternative is using cloud compute for Deep Learning.
Hassle free? You don't have to go through the pain of setting up every library and dependency; you get to jump right in and start with the lessons.
And to start out, it's pretty cheap. You can get access for $0.60 an hour (with decent power), which is far less than buying a 'rig' right away.
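To put "cheap" in perspective, here is a quick back-of-the-envelope break-even estimate. The $1,200 rig price below is an assumed figure for illustration, not a real quote:

```python
# Back-of-the-envelope break-even: cloud GPU hours vs. buying a desktop.
# The rig price is an assumption for illustration, not an actual quote.
cloud_rate = 0.60    # dollars per GPU-hour (typical entry-level rate)
rig_price = 1200.00  # assumed cost of a decent GPU desktop

break_even_hours = rig_price / cloud_rate
print(f"{break_even_hours:.0f} hours of cloud GPU time ~ one rig")
# → 2000 hours of cloud GPU time ~ one rig
```

Unless you already know you will train for thousands of hours, paying by the hour is the safer starting point.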
There are a huge number of options available now. I will mention a few that I think are good. Know that there are many more options this blog won't cover; feel free to discuss them in the comments below!
These are listed roughly in order of increasing setup difficulty, starting with the easiest.
Crestle is the simplest interface to play with. Once you log in, you have the option to start with either a GPU or a CPU, and then you are taken directly to a Jupyter notebook.
All of the DL libraries come pre-installed. There is no setting up, and no logging in through a terminal (which is intimidating when you start).
You still have access to a terminal if you want to add or download datasets and install other libraries.
The pricing is simple because there is nothing to configure: it's either GPU or CPU. The GPUs are Tesla K80s, for you benchmarking geeks.
GPU: $0.59 an hour. CPU: $0.059 an hour.
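At those rates, a single study session costs very little. A quick sketch (the three-hour session length is just an assumed example):

```python
# Rough cost of one study session at Crestle's listed rates.
GPU_RATE = 0.59   # dollars per hour
CPU_RATE = 0.059  # dollars per hour

hours = 3  # assumed length of one evening's session
gpu_cost = hours * GPU_RATE
cpu_cost = hours * CPU_RATE
print(f"GPU: ${gpu_cost:.2f}, CPU: ${cpu_cost:.2f}")
# → GPU: $1.77, CPU: $0.18
```

A useful habit: prototype on the cheap CPU instance, then switch to the GPU only when you actually train.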
Paperspace also offers a clean and intuitive interface, along with a few options for GPU compute. Another cool feature is that launching an instance takes you directly to a 'Desktop' view: you get to control a full GUI.
You can always SSH in from a terminal and use the command-line interface instead.
That said, the downside of Paperspace is that their servers are in only three regions, and the GUI feature requires a good internet connection. (Good is subjective; in my case, I use a 512 Kbps connection and can't use this feature at all. The internet options in your area are hopefully not this bad.)
Google Cloud Platform: one of the big players in the space. The coolest part is that they offer $300 of credit for signing up (valid for 12 months).
I encourage you to check out our AI researcher James Lee's blog post about it here.
It's a walkthrough of setting up a Google Cloud compute instance with a 500 GB SSD, a 3.75 GB RAM Broadwell CPU, and an NVIDIA Tesla K80 GPU. All of this can be done for free at the start, with some details. Repeating them here is unnecessary, but please feel free to comment below if you want to discuss anything.
AWS is arguably the number one cloud service in use right now, not just in academia but in industry as well.
This series will include a walkthrough of setting up and launching instances in the following parts.
AWS offers tons of options for all purposes: storage, CPU power, GPU power. It also hosts the fastest P3 instances in various regions, which are really good for huge Deep Learning models.
Costs start from $0.50 per hour for GPU compute. AWS also offers credits to students (if you sign up with a student ID).
I will skip the details of AWS here, since we will demonstrate setting up an AWS instance later in this series.
Feel free to ask for help with any other platforms too! Drop a comment below.
Choosing a Service
This blog post was intended to give a brief intro and has skipped over a lot of details, since much of the pricing and technical detail is highly dynamic. Choosing a vendor is truly a personal choice. I prefer Crestle and AWS, but I suggest you play with all of these options and pick your poison. At the point when you are sure you want to pursue the field, a local server setup may become the more viable option. Setting up a local server will also be covered in this series.
This is by no means an exhaustive or complete list; there are many other options to choose from.
Leave a comment below if there is anything you would want to discuss!
Subscribe to my Newsletter for a weekly-ish curated list of interesting Deep Learning reads.