100 Days of AWS — Day 22: Introduction to AWS CLI

  • Python 3+
  • Windows, Linux, macOS, or Unix
  • You can install the AWS Command Line Interface and its dependencies with pip (the package manager for Python)
  • To install pip on an RPM-based system, first enable EPEL (Extra Packages for Enterprise Linux) https://fedoraproject.org/wiki/EPEL
# yum -y install python34-pip.noarch
  • Now install awscli
# pip3 install awscli
  • To upgrade cli
# pip3 install awscli --upgrade
  • To verify aws cli installed successfully
# aws --version
aws-cli/1.18.147 Python/2.7.18 Linux/5.10.106-102.504.amzn2.x86_64 botocore/1.18.6
  • Let's start our journey into the AWS world using the command line tool. The first command we are going to use is aws configure, which configures the settings the AWS CLI uses when interacting with AWS (this includes security credentials and the default region).
  • Before using aws configure, we need to sign up for an AWS account and download security credentials. If we don't have access keys, we can generate them from the AWS Management Console: go to IAM (under Security, Identity & Compliance).
  • Go to the IAM console https://us-east-1.console.aws.amazon.com/iamv → Click on Users
  • Click on Add users and fill in all the details (make sure Programmatic access is checked)
  • On the next screen, choose Attach existing policies directly and select the AdministratorAccess policy (not the best choice from a security standpoint, but this is just a testing environment)
  • Skip the Tags, review, and hit Create user
  • As the final screen notes, this is the last time these credentials will be available to download; however, you can create new credentials at any time.
  • Now pass these values to aws configure command
# aws configure
AWS Access Key ID [None]: XXXXXXXX
AWS Secret Access Key [None]: XXXXXXXX
Default region name [None]: us-west-2
Default output format [None]: json
  • Default region is the name of the region you want to make calls against by default. This is usually the region closest to you, but it can be any region. For example, type us-west-2 to use US West (Oregon).
  • Default output format can be either json, text, or table. If you don't specify an output format, json is used.
# aws <command> <option> --output text
# aws ec2 describe-instances --output text
# aws <command> <option> --output table
# aws ec2 describe-instances --output table
  • The CLI stores the credentials specified with aws configure in a local file named credentials, inside a folder named .aws in your home directory
# ls -l ~/.aws
total 8
-rw-------. 1 root root  43 May 19 09:18 config
-rw-------. 1 root root 116 May 19 09:18 credentials
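The two files look roughly like this (the values shown are placeholders; the actual keys and region come from what you entered in aws configure):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = XXXXXXXX
aws_secret_access_key = XXXXXXXX

# ~/.aws/config
[default]
region = us-west-2
output = json
```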
  • We can also specify these settings with environment variables
  • Order of precedence (Configuration Settings and Precedence):
AWS command line options --> Environment variables --> CLI configuration file
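For example, the standard variables the CLI reads can be exported in the shell; they take precedence over the configuration file for the current session:

```shell
# These override values from ~/.aws/credentials and ~/.aws/config
# for the current shell session only.
export AWS_ACCESS_KEY_ID=XXXXXXXX
export AWS_SECRET_ACCESS_KEY=XXXXXXXX
export AWS_DEFAULT_REGION=us-west-2
export AWS_DEFAULT_OUTPUT=json

echo "$AWS_DEFAULT_REGION"
```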
  • The general command syntax is
aws <service(command)> <operation(subcommand)>
aws ec2 describe-instances
  • To enable command completion, we need two things: the name of the shell we are using and the location of the auto-completer script
# echo $SHELL
/bin/bash
# which aws_completer
/bin/aws_completer
# complete -C '/usr/bin/aws_completer' aws
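The complete command only lasts for the current session; to make it persistent, append it to your shell profile (the completer path comes from the which output above — adjust it if yours differs):

```shell
# Persist AWS CLI command completion across bash sessions.
echo "complete -C '/usr/bin/aws_completer' aws" >> ~/.bashrc
```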
# aws e<TAB>
ebs                    ec2-instance-connect   ecs                    eks                    elasticbeanstalk       elastictranscoder      elbv2                  es
ec2                    ecr                    efs                    elasticache            elastic-inference      elb                    emr                    events
# aws ec2 help
  • This gives you the man page, with syntax and examples
aws ec2 describe-instances help
  • There is one more tool I would like to point out here, called aws-shell. It's an integrated shell for working with the AWS CLI.
  • Installation
pip install aws-shell --upgrade --ignore-installed six
  • Use
# aws-shell
  • As you can see, aws-shell provides auto-completion of commands and options as we type
  • For more info https://github.com/awslabs/aws-shell
  • Generate CLI skeleton (this shows all the possible parameters of the sub-command)
aws ec2 run-instances --generate-cli-skeleton > ec2instance.json
  • Edit the generated JSON (for example, set "DryRun": false), then feed it back in as input
aws ec2 run-instances --cli-input-json file://ec2instance.json
  • Creating EC2 Instance using AWS CLI
# aws ec2 run-instances --image-id ami-0b36cd6786bcfe120 --region us-west-2 --key-name my-account-key --instance-type t2.micro --output text
  • We can check the status of the instance
# aws ec2 describe-instance-status --region us-west-2 --output text
  • To terminate an instance
# aws ec2 terminate-instances --instance-ids i-03dd64a3ec26c34ac
  • In the previous example, we used existing values (e.g., the key pair). What if we want to generate all these values from the command line?
  • To create an EC2 key pair
aws ec2 create-key-pair --key-name MyKeyPair --output text
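By default, create-key-pair prints the private key to the terminal. A common pattern is to use the CLI's --query option to save just the key material to a .pem file and lock down its permissions so ssh will accept it:

```shell
# Save only the private key material to a file and restrict permissions.
aws ec2 create-key-pair --key-name MyKeyPair \
    --query 'KeyMaterial' --output text > MyKeyPair.pem
chmod 400 MyKeyPair.pem
```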
  • To create a security group
# aws ec2 create-security-group --group-name MySecurityGroup --description "My security group" --vpc-id <vpc-id>
# aws ec2 describe-security-groups --group-names MySecurityGroup
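A newly created security group has no inbound rules, so nothing can reach an instance using it yet. To allow SSH, for example (the port and CIDR here are illustrative; 0.0.0.0/0 opens the port to the world, which is acceptable only in a test environment):

```shell
# Allow inbound SSH (TCP 22) from anywhere -- test environments only.
PORT=22
CIDR=0.0.0.0/0
aws ec2 authorize-security-group-ingress \
    --group-name MySecurityGroup \
    --protocol tcp --port "$PORT" --cidr "$CIDR"
```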
# aws ec2 describe-images --image-ids <ami-id>
aws ec2 run-instances --image-id ami-abc1234 --count 1 --instance-type t2.micro --key-name keypair --subnet-id subnet-abcd1234 --security-group-ids sg-abcd1234
  • To allocate an Elastic IP and associate it with an instance
aws ec2 allocate-address
aws ec2 associate-address --instance-id i-abc1234565 --public-ip <elastic-ip>
  • EC2 instance state codes returned by the API:
o 0  : pending
o 16 : running
o 32 : shutting-down
o 48 : terminated
o 64 : stopping
o 80 : stopped
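The codes above map one-to-one to state names; a small shell helper (state_name is purely illustrative, not part of the CLI) makes the mapping explicit — the same information comes back from describe-instances under State.Code and State.Name:

```shell
# Map an EC2 instance state code to its name (same table as above).
state_name() {
    case "$1" in
        0)  echo pending ;;
        16) echo running ;;
        32) echo shutting-down ;;
        48) echo terminated ;;
        64) echo stopping ;;
        80) echo stopped ;;
        *)  echo unknown ;;
    esac
}

state_name 16   # running
state_name 48   # terminated
```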
  • To create a bucket
# aws s3 mb s3://100daysofaws
make_bucket: 100daysofaws
  • To list this bucket from the command line
# aws s3 ls
2022-04-27 03:27:08 100daysofaws
  • If you want this bucket to be created in some specific region
aws s3 mb s3://100daysofaws --region us-west-1
  • Now copy some files to this newly created bucket. To upload a file, we are going to use the cp command
# aws s3 cp index.html s3://100daysofaws
upload: ./index.html to s3://100daysofaws/index.html
  • To verify the file was uploaded successfully
# aws s3 ls s3://100daysofaws
2022-04-27 03:28:56          0 index.html
  • To download the file from S3 to the local disk
# aws s3 cp s3://100daysofaws/index.html .
download: s3://100daysofaws/index.html to ./index.html
  • We can also do a recursive copy to a local directory. This recursively copies all objects under mybucket to the specified directory
# aws s3 cp s3://mybucket . --recursive
  • cp also supports include as well as exclude options (this copies all the files except .jpg files)
# aws s3 cp . s3://100daysofaws --recursive --exclude "*.jpg"
upload: ./test1.txt to s3://100daysofaws/test1.txt
upload: ./index.html to s3://100daysofaws/index.html
  • We can also sync files from the local drive to S3, or vice versa
# aws s3 sync . s3://100daysofaws
upload: ./test1.jpg to s3://100daysofaws/test1.jpg
upload: ./test2.txt to s3://100daysofaws/test2.txt
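Two sync behaviors are worth knowing (the bucket name is reused from the examples above): sync transfers only files that are new or changed, so re-running it is cheap, and the --delete flag additionally removes objects from the destination that no longer exist in the source:

```shell
BUCKET=s3://100daysofaws

# Only new or changed files are transferred on subsequent runs.
aws s3 sync . "$BUCKET"

# --delete also removes objects from the bucket that were deleted locally.
aws s3 sync . "$BUCKET" --delete
```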
  • To delete a particular file from an S3 bucket
# aws s3 rm s3://100daysofaws/test1.txt
delete: s3://100daysofaws/test1.txt
  • To remove a bucket, use the rb command (the bucket must be empty; add --force to delete a non-empty bucket along with all its contents)
# aws s3 rb s3://100daysofaws
# aws s3 rb s3://100daysofaws --force



Prashant Lakhera

AWS Community Builder, Ex-Redhat, Author, Blogger, YouTuber, RHCA, RHCDS, RHCE, Docker Certified,4XAWS, CCNA, MCP, Certified Jenkins, Terraform Certified, 1XGCP