Setup and Run in Amazon AWS

The philosophy of fast.ai is to get up and running with Deep Learning as soon as possible and figure out the details along the way, and they provide some wonderful resources to do just that. fast.ai has a promo code for Paperspace that lets you get started fairly easily, but you will soon realize that the $15 Paperspace promo barely lasts two lectures. So at some point you will want to move to a more powerful GPU provider, such as Amazon AWS. Setting up an AWS machine can be a little troublesome for beginners, so in this blog I am going to give a comprehensive guide on how to set up an AWS machine to fire up your deep learning code. Amazon AWS also offers a $150 bonus for signing up with a student email address; I will briefly discuss how to get it at the end.

In this tutorial I am assuming that you already have an AWS account and have set up the billing information and other necessary requirements. If you face any problems with those, you can contact customer service; they are very helpful in this regard.

Log in to Console:

Log in to the AWS console at console.aws.amazon.com.

Select the Oregon, Ohio, or N. Virginia region if you are on an AWS free tier account; otherwise you will get an error while creating the instance. If you are not a free tier user, you can choose any region and you are good to go.

Go to All Services > Select EC2

Request Limit Increase:

A new AWS account has a limit on how many instances of each machine type you can run. The kind of machine we are going to use for fast.ai is p2.xlarge. By default this limit is set to 0, which means you are not allowed to create any instance of this type. So the first thing you have to do is request an increase of this limit. If you already have this limit set to 1 or more, you can skip this part.

Click “Limits” on the left panel of EC2 Dashboard

Inside “Instance Limits” click on “Request Limit Increase”

Select Region (the same region you selected earlier); for example, I have chosen Oregon.
Primary Instance Type : p2.xlarge
Limit: <do nothing here>
New Limit Value: 1

In Use Case Description, type “fast.ai MOOC”

Select “Support Language” and “Contact Method” and hit “Submit”.
Customer care will reach out to you within 24-48 hours. Once this part is done, you can move on to the next sections.

Creating the Key Pair:

AWS requires you to generate a Key Pair to connect your computer to the cloud machine, so that you can use the cloud computer just as if it were your own PC. In this step you are going to set up the key pair that connects your PC to the AWS cloud computer.

Extra step for Windows:
To run this on Windows (works on Windows 10 only), you need to install Ubuntu Bash for Windows. Here you can find a comprehensive guide on how to install it. Once it is done, move on to the next part of the tutorial. From here on, when I mention bash, you should understand the Ubuntu Bash for Windows.

Open your Linux Bash or Mac Terminal. Type “ssh-keygen” into the terminal, then hit Enter 3 times to accept the defaults.

This will generate an SSH key pair, “id_rsa” (private) and “id_rsa.pub” (public), on your local computer inside a folder named ‘.ssh’. Now you need to copy the public key file to another directory that is easy for you to locate.

# Linux or Mac
cp ~/.ssh/id_rsa.pub /<A location in your computer>/
# For Bash for Windows
cp ~/.ssh/id_rsa.pub /mnt/<A location in your computer>/

For example, the following commands in Windows 10 will copy the file to your Desktop and E drive, respectively.

cp ~/.ssh/id_rsa.pub /mnt/c/Users/<yourUserName>/Desktop/
cp ~/.ssh/id_rsa.pub /mnt/e/
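If you prefer to script it, the key-generation and inspection steps above can be sketched non-interactively. The /tmp path here is purely illustrative; the interactive three-Enter run described above writes to ~/.ssh instead:

```shell
# Non-interactive sketch of the key-generation step; the /tmp path is
# illustrative -- the interactive run writes to ~/.ssh instead.
rm -rf /tmp/aws-key-demo && mkdir -p /tmp/aws-key-demo
ssh-keygen -t rsa -b 2048 -N "" -f /tmp/aws-key-demo/id_rsa -q
# id_rsa is the private key (keep it secret); id_rsa.pub is the public key
# you upload through AWS's "Import Key Pair" dialog in the next step.
ls /tmp/aws-key-demo
```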

Selecting this Key Pair from AWS:
Select “Key Pairs” from the left panel.

Click “Import Key Pair”

Choose the key pair file “id_rsa.pub” from your local storage (where you just copied it from the .ssh folder), and change the name of your Key Pair if you wish.

Launch Instance:

Go back to the EC2 dashboard and click “Launch Instance”

From the left panel, click “Community AMIs”

Search for “fastai” and select “fastai-part1v2-p2 — ami-xxxxxxxx”

Depending on your selected location, the “ami” would be one of the following:

  • Oregon: ami-8c4288f4
  • Sydney: ami-39ec055b
  • Mumbai: ami-c53975aa
  • N. Virginia: ami-c6ac1cbc
  • Ireland: ami-b93c9ec0

Filter by GPU Compute

Select p2.xlarge

Click “Review and Launch”

Click “Launch”

Select a key pair and click “Launch Instances”. You can set a different name for this particular Key Pair if you want.

And, done. You should see a confirmation message that your instance has launched.

A word of caution: the instance you have created starts running instantly, and as far as I know AWS bills by the hour, so even if you run the machine for only a few minutes you will probably be charged for a full hour (please let me know if I am wrong). So I would recommend you keep the instance running for the hour before you shut it down; maybe use the time to test whether some of the lessons are working.

Running the Instance:

To run the instance you have just created, go back to EC2 dashboard, click “Instances” from the left panel

You will see your instance listed there.

The first time, it will probably already be running. If not, to start it: right click > Instance State > Start. Use the same process to stop the instance when you are done. Do not click “Terminate” unless you know what you are doing.

To log in to this computer, you need to know its Public IP address, which is shown in the instance description. Copy this IP address.

To get to this computer you need to SSH into it. SSH (Secure Shell) opens a remote terminal session on the cloud computer, so it is as if you were typing on that machine directly. To do this, open your Linux Bash, Ubuntu Bash for Windows, or Mac Terminal and execute the following command. It will also forward Jupyter Notebook from the cloud instance to your PC. XXX.XXX.XXX.XXX stands for the Public IP address of the cloud instance.

ssh ubuntu@XXX.XXX.XXX.XXX -L8888:localhost:8888

It will ask for permission to connect to the cloud computer; type “yes”. After this step, you should be logged in to the instance.
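If you reconnect often, one option is to save these settings in an ~/.ssh/config entry so a short alias replaces the long command. The Host alias below is made up, and you should substitute your instance's current Public IP (it changes when the instance restarts):

```
# "fastai-aws" is a made-up alias; pick any name you like.
Host fastai-aws
    # Replace with the instance's current Public IP address.
    HostName XXX.XXX.XXX.XXX
    User ubuntu
    # Forward the remote Jupyter port 8888 to your local machine.
    LocalForward 8888 localhost:8888
```

With this in place, "ssh fastai-aws" does the same job as the long command above.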

Now go to the fastai directory:

cd fastai

The first thing you should do when you get here is make sure the repository is up to date:

git pull

For the first run, and maybe once a month afterwards, be sure to update the libraries as well:

conda env update

This might take a few minutes; don’t freak out!
The last step is to run Jupyter Notebook:

jupyter notebook

This will fire up Jupyter Notebook on the cloud computer, as if it were running on your local machine. It will print a URL containing an access token.

Copy the URL and paste it into your browser, and you will see a Jupyter Notebook window with the fastai files.

VOILA!! You have successfully created and run a cloud instance on Amazon AWS. Now you can do anything in this Jupyter Notebook that you could do on your local computer. Enjoy the power of the cloud, until you burn through your credits ;). Last time I checked, a p2.xlarge instance cost $0.90/hour. After you are done working, be sure to shut down the instance.
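Since billing is hourly, a quick back-of-the-envelope estimate of a study session helps. The $0.90/hour figure is the rate quoted above and may have changed since:

```shell
# Back-of-the-envelope cost estimate at the p2.xlarge rate quoted above
# ($0.90/hour at the time of writing; check the current AWS price list).
hours=10
awk -v h="$hours" 'BEGIN { printf "$%.2f\n", h * 0.90 }'   # prints $9.00
```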

AWS student pack:

You can get $150 of credit from Amazon AWS using your school email address.
This will only work if you haven’t signed up for the AWS Educate program with that email address before.

Go to the GitHub Student Developer Pack page.

Scroll down to “AWS Educate” and click “Get access using your unique link”. You’ll be asked to log in to GitHub if you are not logged in already.

Provide your student information and school email address to get the promo code, then enter the promo code in your AWS account to receive the credit. If you have trouble getting this credit, ask me in the comments and I will write a separate blog describing the process.


Image Credit:
Some of the images are screen grabs from the referenced video.



