
Getting started with AWS

We will learn how to set up an AWS account, how to access AWS resources using the AWS CLI, and how to leverage VS Code to view AWS resources.


AWS documentation links for getting started guides:

https://aws.amazon.com/getting-started/?e=gs2020&p=console/#Get_to_Know_the_AWS_Cloud

https://aws.amazon.com/getting-started/guides/setup-cdk/

https://aws.amazon.com/getting-started/?e=gs2020&p=console/#Launch_Your_First_Application



Setting up AWS account:

1. Create an Amazon Free Tier account: https://portal.aws.amazon.com/billing/signup?refid=ps_a131l0000085ejvqam&trkcampaign=acq_paid_search_brand&redirect_url=https%3A%2F%2Faws.amazon.com%2Fregistration-confirmation#/start

- Provide your details (email, username, billing information), and make sure you select the Basic support - Free option.

- Upon successful signup, we will see a confirmation page like this:



2. Sign in as the root user: provide your login information (email, password), and we will be able to see our AWS dashboard.


3. Access AWS Management Console:




4. Follow this great documentation provided by AWS: https://aws.amazon.com/getting-started/?e=gs2020&p=console/#Get_to_Know_the_AWS_Cloud

Using the documentation link above, we can find best practices on how to set up our AWS cloud account.

Let us start making progress by following this guide:

Setting up the AWS environment: https://aws.amazon.com/getting-started/guides/setup-environment/


Adding MFA:

- Select the "IAM" service and add MFA (Multi-Factor Authentication).



- Once we select "Add MFA", it takes us to a page where we need to select "Activate MFA".


- Select "Virtual MFA device" and hit Continue.

- I used the Google Authenticator app as my MFA device. Scan the QR code using the app and enter two consecutive MFA codes. Once the device is added successfully, we can see it listed under MFA.




- Once we add MFA, the IAM dashboard looks like this:
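The console flow above is the simplest path for the root user. For an IAM user, a similar MFA setup can be scripted with the CLI; this is only a sketch - the device name, account ID, user name, and the two authentication codes below are all placeholders, and it assumes the CLI is already configured with credentials that can manage IAM:

```shell
# Sketch only: create a virtual MFA device and attach it to an IAM user.
# Device name, account ID, user name, and both codes are placeholders.
aws iam create-virtual-mfa-device \
    --virtual-mfa-device-name soletechie-mfa \
    --outfile qr.png \
    --bootstrap-method QRCodePNG

# Scan qr.png with the authenticator app, then supply two consecutive codes:
aws iam enable-mfa-device \
    --user-name soletechie \
    --serial-number arn:aws:iam::123456789012:mfa/soletechie-mfa \
    --authentication-code1 123456 \
    --authentication-code2 789012
```

These commands run against a live account, so they are shown for reference rather than as copy-paste steps.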


Create IAM Group -> 

Now we can proceed with creating user groups, as it is not advisable to use the root user for everything. We should follow the principle of least privilege to keep our accounts more secure.



Enter user group name: admins

Attach permission policies: search for "AdministratorAccess" and select it.

Now we can see the admins group created.
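As a sketch, the same group setup can also be done from the CLI (this assumes credentials with IAM permissions are already configured; the group name matches the console steps above):

```shell
# Create the admins group and attach the AWS-managed AdministratorAccess policy.
aws iam create-group --group-name admins

aws iam attach-group-policy \
    --group-name admins \
    --policy-arn arn:aws:iam::aws:policy/AdministratorAccess

# Confirm the policy is attached.
aws iam list-attached-group-policies --group-name admins
```

These commands modify a live account, so treat them as a reference rather than a required step.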


Create IAM User ->



select "Add users"

username: soletechie

enable both programmatic access (to use AWS resources via the CLI) and a console password (to access the Management Console).
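For reference, a hedged CLI sketch of the same user setup (the temporary password is a placeholder, and this assumes the admins group from the previous step already exists):

```shell
# Create the IAM user and add it to the admins group.
aws iam create-user --user-name soletechie
aws iam add-user-to-group --group-name admins --user-name soletechie

# Console (password) access; the user must reset this placeholder on first sign-in.
aws iam create-login-profile \
    --user-name soletechie \
    --password 'ChangeMe-Placeholder-123!' \
    --password-reset-required

# Programmatic access: returns an access key ID and secret access key.
# Save the secret now - it cannot be retrieved again later.
aws iam create-access-key --user-name soletechie
```

Again, these run against a live account and are shown only to mirror the console steps.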
















Note

- We can create an account alias if we don't want to use our account ID to log in to the AWS console.

- To create an alias, go to the IAM dashboard; on the right-hand side you will find your AWS account ID information, along with the option to create an alias.

- Aliases must be unique. Once you set a unique alias name, you will be able to sign in to the AWS Management Console using that alias.
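A short sketch of the alias step from the CLI; the alias name here is a made-up example and must be globally unique across all AWS accounts:

```shell
# Create the account alias (fails if the name is already taken by any account).
aws iam create-account-alias --account-alias soletechie-demo

# The console sign-in URL then becomes:
#   https://soletechie-demo.signin.aws.amazon.com/console
```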


***********************************************************************************

Setting up the AWS CLI:

***********************************************************************************

- Use this link to set up the AWS CLI (latest version, v2) for your operating system: https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html

- To download the macOS package file: https://awscli.amazonaws.com/AWSCLIV2.pkg

- Once you run the installer, you should see that the software installed successfully.



- To verify that the AWS CLI is installed successfully:
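For example, running the version command should print the installed CLI version (the exact version string will vary by install and platform):

```shell
aws --version
# Prints something similar to:
#   aws-cli/2.x.x Python/3.x.x Darwin/... exe/x86_64
```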


Time to CONFIGURE:

- Type the command "aws configure" and provide your access key ID, AWS secret access key, default region name, and default output format.




More detailed information on how to configure the AWS CLI: https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html
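Under the hood, "aws configure" writes two small files under ~/.aws. The sketch below recreates them by hand using the example placeholder key values from the AWS docs (these are not real credentials); the AWS_DIR variable is only there so the sketch can target a different directory:

```shell
# Recreate the files "aws configure" writes. The key values below are the
# documented example placeholders - never commit real keys to these files.
AWS_DIR="${AWS_DIR:-$HOME/.aws}"
mkdir -p "$AWS_DIR"

# Credentials file: access key ID and secret access key per profile.
cat > "$AWS_DIR/credentials" <<'EOF'
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFJ/K7MDENG/bPxRfiCYEXAMPLEKEY
EOF

# Config file: default region and output format per profile.
cat > "$AWS_DIR/config" <<'EOF'
[default]
region = us-west-2
output = json
EOF
```

Editing these files directly and running "aws configure" are interchangeable; the command is just a guided way to produce them.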


***********************************************************************************

Setting up Cloud9: (use this only if we want a browser-based development tool)

***********************************************************************************

- Use this link to set up Cloud9: https://aws.amazon.com/getting-started/guides/setup-environment/module-four/?refid=ps_a131l0000085ejvqam&trkcampaign=acq_paid_search_brand

- Cloud9 is a cloud-based IDE that runs in the browser (there is no additional charge for Cloud9 itself; we pay only for the underlying EC2 instance and storage). It supports programming languages including Python and JavaScript, so we can work on our projects from the browser rather than dealing with environment setups specific to our home or office laptops.

- AWS CLI commands to spin up, access, and destroy a Cloud9 environment. The environment ID returned by the create call is what we use to access and delete the environment:

aws cloud9 create-environment-ec2 --name getting-started --description "Getting started with AWS Cloud9." --instance-type t3.micro --automatic-stop-time-minutes 60

Sample response:

{ "environmentId": "8a34f51ce1e04a08882f1e811bd706EX" }

aws cloud9 delete-environment --environment-id <environment ID>

- To access the Cloud9 environment in the browser: https://console.aws.amazon.com/cloud9/ide/<environment ID>?region=us-west-2
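The environment ID can be pulled out of the create call's JSON response and turned into the IDE URL. A small offline sketch using the sample response above (the ID is the sample value, not a real environment):

```shell
# Sample JSON response from "aws cloud9 create-environment-ec2".
# In a real session: RESPONSE=$(aws cloud9 create-environment-ec2 --name ... )
RESPONSE='{ "environmentId": "8a34f51ce1e04a08882f1e811bd706EX" }'

# Extract the environmentId field with sed.
ENV_ID=$(printf '%s' "$RESPONSE" | sed -n 's/.*"environmentId": *"\([^"]*\)".*/\1/p')

# Build the browser URL for the IDE (region must match the environment's region).
echo "https://console.aws.amazon.com/cloud9/ide/${ENV_ID}?region=us-west-2"
# → https://console.aws.amazon.com/cloud9/ide/8a34f51ce1e04a08882f1e811bd706EX?region=us-west-2

# Clean up later with:
#   aws cloud9 delete-environment --environment-id "$ENV_ID"
```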


Note:

To dive deeper: https://aws.amazon.com/getting-started/?e=gs2020&p=console/#Dive_Deeper
