What is AWS?
AWS (Amazon Web Services) is a comprehensive, easy-to-use cloud computing platform offered by Amazon. The platform combines infrastructure as a service (IaaS), platform as a service (PaaS), and packaged software as a service (SaaS) offerings.
AWS allows you to access and manage cloud services and resources provided by Amazon. It offers computing power, content delivery, database storage, and other services.

It offers flexible, reliable, scalable, easy-to-use, and cost-effective cloud computing solutions.
History of AWS
- 2003: In 2003, Chris Pinkham and Benjamin Black presented a paper on what Amazon's own internal infrastructure should look like. They suggested selling it as a service and prepared a business case for it. They wrote a six-page document, which was reviewed to decide whether or not to proceed, and the decision was made to go ahead.
- 2004: Amazon SQS (Simple Queue Service) was officially launched in 2004. A team launched this service from Cape Town, South Africa.
- 2006: AWS (Amazon Web Services) was officially launched.
- 2007: In 2007, over 180,000 developers had signed up for AWS.
- 2010: In 2010, the amazon.com retail web services were moved to AWS, i.e., amazon.com now runs on AWS.
- 2011: AWS suffered some major problems. Some EBS (Elastic Block Store) volumes became stuck and were unable to serve read and write requests. It took two days for the problem to be resolved.
- 2012: AWS hosted its first customer event, the re:Invent conference, at which new products were launched. In the same year, another major AWS outage affected popular sites such as Pinterest, Reddit, and Foursquare.
- 2013: In 2013, certifications were launched. AWS started a certification program for software engineers with expertise in cloud computing.
- 2014: AWS committed to achieving 100% renewable energy usage for its global footprint.
- 2015: AWS revenue reached $6 billion USD per annum, growing about 90% every year.
- 2016: By 2016, revenue doubled and reached $13 billion USD per annum.
- 2017: In 2017, AWS re:Invent released a host of artificial intelligence services, and AWS revenue doubled again, reaching $27 billion USD per annum.
- 2018: In 2018, AWS launched a Machine Learning Specialty certification, focused heavily on automating artificial intelligence and machine learning.
Applications of AWS services
Amazon Web services are widely used for various computing purposes like:
- Web site hosting
- Application hosting/SaaS hosting
- Media Sharing (Image/ Video)
- Mobile and Social Applications
- Content delivery and Media Distribution
- Storage, backup, and disaster recovery
- Development and test environments
- Academic Computing
- Search Engines
- Social Networking
Companies using AWS
- Instagram
- Zoopla
- Smugmug
- Pinterest
- Netflix
- Dropbox
- Etsy
- Talkbox
- Playfish
- Ftopia
Advantages of AWS
Following are the pros of using AWS services:
- AWS allows organizations to use already familiar programming models, operating systems, databases, and architectures.
- It is a cost-effective service that allows you to pay only for what you use, without any up-front or long-term commitments.
- Scalable & high performance: You can easily add or remove capacity and get cloud access quickly, with virtually limitless capacity.
- You do not need to spend money on running and maintaining data centers.
- Offers fast deployments

- Total Cost of Ownership is very low compared to any private/dedicated servers.
- Offers Centralized Billing and management
- Offers Hybrid Capabilities
- Allows you to deploy your application in multiple regions around the world with just a few clicks
Disadvantages of AWS
- If you need more immediate or intensive assistance, you'll have to opt for paid support packages.
- Amazon Web Services may have some common cloud computing issues when you move to the cloud, such as downtime, limited control, and limited backup protection.
- AWS sets default limits on resources which differ from region to region. These resources consist of images, volumes, and snapshots.
- AWS may make hardware-level changes to the underlying infrastructure, which may not offer the best performance or utilization for your applications.
Important AWS Services
Amazon Web Services offers a wide range of global cloud-based products for different business purposes. The products include storage, databases, analytics, networking, mobile, development tools, and enterprise applications, with a pay-as-you-go pricing model.
AWS Compute Services
Here are the cloud compute services offered by Amazon:
- EC2(Elastic Compute Cloud-Virtual server in the cloud) (IaaS) - EC2 is a virtual machine in the cloud on which you have OS level control. You can run this cloud server whenever you want.

- Lightsail (IaaS) - Lightsail is a virtual private server offering that bundles compute, storage, and networking so developers can deploy and manage websites or web applications in the cloud.
- Amazon Lightsail is a great way to get started with AWS for developers, small businesses, students, and other users who need a simple cloud platform solution.
- Lightsail includes everything you need to launch your project quickly: virtual machines, solid-state drive (SSD)-based storage, load balancers, and managed databases.

- Elastic Beanstalk (PaaS) — AWS Elastic Beanstalk is a pre-configured EC2 server to handle application deployment, load balancing, health monitoring, and more automatically

- AWS Lambda (serverless, FaaS) — This AWS service allows you to run code without thinking about servers and clusters (see the handler sketch after this list).

- EKS (Elastic Kubernetes Service) (PaaS): EKS is a fully managed service that you can use to run Kubernetes on AWS.

- AWS Fargate: It is a serverless compute engine for containers. It works with both Amazon ECS and Amazon EKS.
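For illustration, here is a minimal sketch of a Python Lambda handler; the function name is the standard Python entry point, while the event field `name` and the response shape are hypothetical, not taken from these notes:

```python
import json

def lambda_handler(event, context):
    # AWS invokes this entry point; 'event' carries the request payload,
    # 'context' carries runtime metadata such as remaining execution time.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

Uploaded as a function, code like this runs on demand with no server or cluster to manage, and you pay only for the compute time consumed.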


Network
- Amazon VPC - used to create an isolated virtual network environment in the AWS cloud.

- Internet gateway - used to make connections between a VPC and the internet.

- Virtual private gateway - used to allow protected internet traffic (such as VPN traffic) to enter the VPC.

- AWS Direct Connect - a network service that enables you to establish a dedicated private connection between your data center and a VPC.

- Network access control list (ACL) - a virtual firewall that controls inbound and outbound traffic at the subnet level.

- Security Group - a virtual firewall that controls inbound and outbound traffic for an Amazon EC2 instance (see the sketch below).
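As a rough sketch of how these pieces fit together, the boto3 snippet below creates a security group inside an existing VPC and opens inbound HTTP; the VPC ID, group name, and port are placeholders, not values from these notes:

```python
import boto3

ec2 = boto3.client("ec2")

# Create a security group inside an existing VPC (placeholder VPC ID).
sg = ec2.create_security_group(
    GroupName="web-servers",
    Description="Allow inbound HTTP",
    VpcId="vpc-0123456789abcdef0",
)

# Allow inbound HTTP (port 80) from anywhere; outbound traffic is allowed by default.
ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 80,
        "ToPort": 80,
        "IpRanges": [{"CidrIp": "0.0.0.0/0"}],
    }],
)
```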





Storage
- Amazon Instance store: An instance store provides temporary block-level storage for an Amazon EC2 instance.

- Amazon Elastic Block Store (Amazon EBS)- a service that provides block-level storage volumes.

- Amazon S3 - a service that provides object-level storage. Amazon S3 stores data as objects in buckets (see the sketch after this list).
- Amazon EFS - a service that provides a scalable file system used with AWS Cloud services and on-premises resources.

- Amazon Glacier - an extremely low-cost storage service that offers secure and durable storage for data archiving and backup.

- AWS Storage Gateway- This AWS service is used for connecting on-premises software applications with cloud-based storage. It offers secure integration between the company's on-premises and AWS's storage infrastructure.

- Snowball: a physical data transport device that allows you to transfer terabytes of data into and out of the AWS environment.


- Amazon CloudFront: It is a content delivery network (CDN) service built for high performance, security, and developer convenience.
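To make the object-storage idea concrete, here is a minimal boto3 sketch that creates a bucket and stores and retrieves an object; the bucket and key names are hypothetical:

```python
import boto3

s3 = boto3.client("s3")

# Buckets hold objects; bucket names must be globally unique (placeholder name).
# In regions other than us-east-1, a LocationConstraint must also be supplied.
s3.create_bucket(Bucket="example-notes-bucket")

# Store an object under a key, then read it back.
s3.put_object(Bucket="example-notes-bucket", Key="notes/aws.txt", Body=b"Hello, S3!")
obj = s3.get_object(Bucket="example-notes-bucket", Key="notes/aws.txt")
print(obj["Body"].read().decode())
```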


Database Services
- Amazon RDS (PaaS) - a service that enables you to run relational databases in the AWS Cloud.


Amazon RDS is a managed service that automates tasks such as hardware provisioning, database setup, patching, and backups.
- Amazon Aurora (database engine): an enterprise-class relational database engine that is compatible with MySQL and PostgreSQL.

- Amazon DynamoDB (NoSQL) - Amazon DynamoDB is a key-value database service. It delivers single-digit millisecond performance at any scale (see the sketch after this list).


- Amazon RedShift : Amazon Redshift is a data warehousing service that you can use for big data analytics. It offers the ability to collect data from many sources and helps you to understand relationships and trends across your data.
- Amazon ElastiCache-Amazon ElastiCache is a service that adds caching layers on top of your databases to help improve the read times of common requests.
- Amazon Neptune:Amazon Neptune is a graph database service. You can use Amazon Neptune to build and run applications that work with highly connected datasets, such as recommendation engines, fraud detection, and knowledge graphs.
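For the key-value model mentioned above, a minimal boto3 DynamoDB sketch might look like this; the table name, key schema, and item attributes are hypothetical, and the table is assumed to already exist:

```python
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("Music")  # assumed existing table with a composite primary key

# Write an item; attributes are stored as key-value pairs.
table.put_item(Item={"Artist": "No One You Know", "SongTitle": "Call Me Today"})

# Read it back by its primary key.
response = table.get_item(Key={"Artist": "No One You Know", "SongTitle": "Call Me Today"})
print(response.get("Item"))
```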


Security Services
- IAM (Identity and Access Management) — enables you to manage access to AWS services and resources securely.
- IAM users: an identity that you create in AWS
- Groups: a collection of IAM users
- Roles: an identity that you can assume to gain temporary access to permissions
- IAM policies: a document that allows or denies permissions to AWS services and resources (see the sketch at the end of this section)

- Multi-factor authentication: an extra layer of security that requires a second form of verification in addition to a password.

- AWS Artifact: a service that provides on-demand access to AWS security and compliance reports and select online agreements.

Denial-of-service (DoS) attack: a deliberate attempt to make a website or application unavailable to users.

Distributed denial-of-service (DDoS) attack: a deliberate attempt by multiple sources to make a website or application unavailable to users.

- Shield — a service that protects applications against DDoS attacks.

- KMS (Key Management Service) — used to create, manage, and use cryptographic keys.

- Certificate Manager —a service that makes it easy for you to centrally manage your SSL/TLS certificates

- WAF (Web Application Firewall) —a service that filters malicious web traffic.


- Inspector — a service that improves the security and compliance of applications by running automated security assessments.

- GuardDuty — a service that provides intelligent threat detection for your AWS infrastructure and resources.

- Macie — a data security service that uses machine learning (ML) and pattern matching to discover and help protect your sensitive data.
- Directory Service — host and manage Microsoft Active Directory in the cloud.
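To tie the IAM pieces together, here is a rough boto3 sketch that creates a customer-managed policy from a policy document and attaches it to a user; the bucket, policy, and user names are placeholders invented for illustration:

```python
import json
import boto3

iam = boto3.client("iam")

# A policy document that allows read-only access to a single (placeholder) S3 bucket.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": ["arn:aws:s3:::example-bucket", "arn:aws:s3:::example-bucket/*"],
    }],
}

policy = iam.create_policy(
    PolicyName="ExampleS3ReadOnly",
    PolicyDocument=json.dumps(policy_document),
)

# Attach the managed policy to an existing IAM user (placeholder user name).
iam.attach_user_policy(UserName="example-user", PolicyArn=policy["Policy"]["Arn"])
```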





Management Services:
- CloudWatch — a web service that enables you to monitor and manage various metrics and configure alarm actions based on data from those metrics.
For example, suppose that your company’s developers use Amazon EC2 instances for application development or testing purposes. If the developers occasionally forget to stop the instances, the instances will continue to run and incur charges.
In this scenario, you could create a CloudWatch alarm that automatically stops an Amazon EC2 instance when the CPU utilization percentage has remained below a certain threshold for a specified period. When configuring the alarm, you can specify to receive a notification whenever this alarm is triggered.
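A rough boto3 sketch of such an alarm is shown below; the alarm name, instance ID, and thresholds are placeholders rather than values from these notes, and the stop action uses CloudWatch's built-in EC2 stop action:

```python
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

cloudwatch.put_metric_alarm(
    AlarmName="stop-idle-dev-instance",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,              # evaluate 5-minute data points...
    EvaluationPeriods=12,    # ...over one hour
    Threshold=5.0,
    ComparisonOperator="LessThanThreshold",
    # Built-in action that stops the instance when the alarm fires.
    AlarmActions=["arn:aws:automate:us-east-1:ec2:stop"],
)
```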


- CloudTrail — the comprehensive API auditing tool. The recorded information includes the identity of the API caller, the time of the API call, the source IP address of the API caller, and more.

- AWS Trusted Advisor: a service that provides recommendations that help you follow AWS best practices.

- Config —a service that provides a detailed view of the configuration of AWS resources in your AWS account.

- Service Catalog — enables organizations to create and manage catalogs of IT services that are approved for use on AWS.
- AWS Auto Scaling — The service allows you to automatically scale your resources up and down based on given CloudWatch metrics.

- Organizations — a service that provides a central location to manage multiple AWS accounts.

- Systems Manager — allows you to safely automate common and repetitive IT operations and management tasks.

Deployment
- CloudFormation — an infrastructure as code (IaC) service that uses template files to automate the setup of AWS resources (see the sketch after this list).

- AWS CodeDeploy: a service that automates code deployments to any instance, including Amazon EC2 instances and instances running on-premises.

- OpsWorks (application management service) — a flexible application management solution with automation tools that enable you to model and control the complete lifecycle of your applications and the infrastructure on which they run.

- Elastic Beanstalk (PaaS) — AWS Elastic Beanstalk is a pre-configured EC2 server to handle application deployment, load balancing, health monitoring, and more automatically

- EKS (Elastic Kubernetes Service) (PaaS): EKS is a fully managed service that you can use to run Kubernetes on AWS.
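As a rough sketch of CloudFormation's template-driven model, the boto3 snippet below launches a stack from an inline YAML template; the stack name and the single S3 bucket resource are illustrative only:

```python
import boto3

# A minimal inline template describing one resource (an S3 bucket).
template = """
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  NotesBucket:
    Type: AWS::S3::Bucket
"""

cloudformation = boto3.client("cloudformation")
cloudformation.create_stack(StackName="example-stack", TemplateBody=template)
```

CloudFormation then creates, updates, or deletes every resource declared in the template as a single unit.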

Migration & Transfer
Services used to migrate data and workloads between your datacenter and AWS:
- Snowball: a physical data transport device that allows you to transfer terabytes of data into and out of the AWS environment.


- DMS (Database Migration Service) -DMS service helps you migrate databases to AWS quickly and securely. It helps you to migrate from one type of database to another — for example, Oracle to MySQL.
- SMS (Server Migration Service) - migrates your on-premises workloads to AWS easily and quickly.
Analytics
- Athena — a service that helps you analyze unstructured, semi-structured, and structured data stored in Amazon S3 (see the sketch after this list).

- CloudSearch — a fully managed search service in the cloud for your website or application. It makes it easy to set up, manage, and scale a search solution for your website or application.

- Elasticsearch — a modern search and analytics engine based on Apache Lucene. It is a full-text, distributed NoSQL datastore and also offers additional features such as application monitoring.

- Kinesis — a service used to collect, process, and analyze streaming data. You can use Amazon Kinesis to securely stream video from camera-equipped devices in homes, offices, factories, and public places to AWS. You can then use these video streams for video playback, security monitoring, face detection, machine learning, and other analytics.

- QuickSight — a machine learning-powered business intelligence service used to deliver easy-to-understand insights.

- EMR (Elastic MapReduce, hosted Hadoop framework) — This AWS analytics service is mainly used for big data processing with frameworks such as Spark and Hadoop.

- Data Pipeline — allows you to move data from one place to another, for example from DynamoDB to S3.
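As a rough sketch of querying data in S3 with Athena, the boto3 snippet below starts a SQL query; the database, table, and results bucket are placeholders, not objects defined in these notes:

```python
import boto3

athena = boto3.client("athena")

response = athena.start_query_execution(
    QueryString="SELECT status, COUNT(*) AS hits FROM web_logs GROUP BY status",
    QueryExecutionContext={"Database": "example_db"},
    # Athena writes query results to an S3 location you own (placeholder bucket).
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
print(response["QueryExecutionId"])
```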
Application Services
- Step Functions — Visual workflows for distributed applications
AWS Step Functions is a visual workflow service that helps developers use AWS services to build distributed applications, automate processes, orchestrate microservices, and create data and machine learning (ML) pipelines.

- SWF (Simple Workflow Service) — a fully managed workflow service for building scalable, resilient applications.
- SNS (Simple Notification Service) — You can use this service to send notifications, such as emails and SMS messages, based on events in your AWS services.
- SQS (Simple Queue Service) — Use this AWS service to decouple your applications. It is a pull-based service (see the sketch after this list).
- Elastic Transcoder — This AWS service helps you change a video's format and resolution to support various devices, such as tablets, smartphones, and laptops, at different resolutions.
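To illustrate the pull-based decoupling that SQS provides, here is a minimal boto3 sketch of a producer and a consumer sharing a queue; the queue name and message body are placeholders:

```python
import boto3

sqs = boto3.client("sqs")

# Create (or look up) a queue; the name is a placeholder.
queue_url = sqs.create_queue(QueueName="example-orders")["QueueUrl"]

# Producer: send a message without knowing anything about the consumer.
sqs.send_message(QueueUrl=queue_url, MessageBody="order-12345")

# Consumer: SQS is pull-based, so the consumer polls for messages.
messages = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1, WaitTimeSeconds=5)
for msg in messages.get("Messages", []):
    print(msg["Body"])
    # Delete the message once it has been processed successfully.
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```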
Best practices of AWS
- You need to design for failure, so that nothing actually fails.
- It's important to decouple all your components before using AWS services.
- You need to keep dynamic data closer to compute and static data closer to the user.
- It's important to know security and performance tradeoffs.
- Pay for computing capacity by the hourly payment method.
- For instances you want to reserve, make a one-time payment to receive a significant discount on the hourly charge.
How to Create EC2 Instance in AWS:

Creating an EC2 instance
- Sign in to the AWS Management Console.
- Click on the EC2 service.
- Click on the Launch Instance button to create a new instance.

- Now, we see the different Amazon Machine Images (AMIs). These are snapshots of pre-configured virtual machines. We will be using Amazon Linux AMI 2018.03.0 (HVM), as it has built-in tools such as Java, Python, Ruby, Perl, and especially the AWS command line tools.

- Choose an instance type, and then click Next. Here, I choose t2.micro as the instance type.

- The main setup page of EC2 is shown below where we define setup configuration.


Where,
Number of Instances: It defines how many EC2 instances you want to create. I leave it as 1 as I want to create only one instance.
Purchasing Option: In the purchasing option, you need to set the price, request from, request to, and persistent request. Right now, I leave it as unchecked.
Tenancy: Choose Shared - Run a shared hardware instance from the dropdown menu, as we are sharing hardware.
Network: Choose your network; leave it as the default, e.g., vpc-dacbc4b2 (default). A VPC (virtual private cloud) is an isolated virtual network in which you can launch AWS resources such as EC2 instances.
Subnet: It is a range of IP addresses in a virtual cloud. In a specified subnet, you can add new AWS resources.
Shutdown behavior: It defines what happens to the instance when you shut down the operating system. You can either stop or terminate the instance when you shut down the Linux machine. Here, I leave it as Stop.
Enable Termination Protection: It protects the instance against accidental termination.
Monitoring: We can monitor metrics such as CPU utilization. For now, I leave Monitoring unchecked.
User data: In Advanced details, you can pass bootstrap scripts to the EC2 instance. For example, you can have the instance download and install Apache, PHP, and so on at boot.
- Now, add the EBS volume and attach it to the EC2 instance. Root is the default EBS volume. Click on the Next.

Volume Type: For this example, we select Magnetic (standard).
Delete on termination: When checked, terminating the EC2 instance also deletes the EBS volume.
- Now, Add the Tags and then click on the Next.

In the above screen, we add two tags: the name of the server and the department. Create tags liberally, as they help you track and allocate costs.
- Configure Security Group. The security group allows some specific traffic to access your instance.

- Review an EC2 instance that you have just configured, and then click on the Launch button.

- Create a new key pair and enter the name of the key pair. Download the Key pair.

- Click on the Launch Instances button.

- To connect to an EC2 instance from Windows, you need to install both PuTTY and PuTTYgen.
- Download PuTTY and PuTTYgen.
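The same instance can also be launched programmatically. The boto3 sketch below mirrors the console steps above; the AMI ID and key pair name are placeholders that you would replace with values from your own account and region:

```python
import boto3

ec2 = boto3.client("ec2")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder: an Amazon Linux AMI in your region
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
    KeyName="my-key-pair",             # placeholder: key pair created beforehand
    InstanceInitiatedShutdownBehavior="stop",
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "demo-server"},
                 {"Key": "Department", "Value": "IT"}],
    }],
)
print(response["Instances"][0]["InstanceId"])
```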
Unit-1 & Unit-3 Quick Review
What is cloud computing?
- Cloud computing is the delivery of computing services over the internet.
Describe the shared responsibility model
Traditional corporate datacenter.
S.No | Company |
1 | Maintaining the physical space |
2 | Ensuring security |
3 | Maintaining or replacing the servers if anything happens |
4 | Maintaining all the infrastructure and software |
5 | keeping all systems patched and on the correct version. |
Shared responsibility model
- These responsibilities are shared between the cloud provider and the consumer.
S.No | Cloud provider | Consumer |
1 | Physical security | Data and information stored in the cloud |
2 | Power | Depends on the service model (IaaS, PaaS, or SaaS). For example, if you use a cloud SQL database, the cloud provider maintains the actual database, but you are still responsible for the data ingested into it. If you deploy a virtual machine and install a SQL database on it, you are responsible for database patches and updates, as well as the data and information stored in the database. |
3 | Network connectivity | |
4 | Cooling | |
You’ll always be responsible for:
- The information and data stored in the cloud
- Devices that are allowed to connect to your cloud (cell phones, computers, and so on)
- The accounts and identities of the people, services, and devices within your organization
The cloud provider is always responsible for:
- The physical datacenter
- The physical network
- The physical hosts
Your service model will determine responsibility for things like:
- Operating systems
- Network controls
- Applications
- Identity and infrastructure
Define cloud models
- Cloud models define the deployment type of cloud resources.
Private cloud
- Used by a single entity
- May be hosted from your on-site datacenter
- May also be hosted in a dedicated datacenter offsite,
- potentially even by a third party that has dedicated that datacenter to your company
Public cloud
- Built, controlled, and maintained by a third-party cloud provider
- Publicly available
Hybrid cloud
- Uses both public and private clouds in an interconnected environment
- Users can flexibly choose which services to keep in the public cloud and which to deploy to their private cloud infrastructure
The following table highlights a few key comparative aspects between the cloud models.
Public cloud | Private cloud | Hybrid cloud |
No capital expenditures to scale up | Organizations have complete control over resources and security | Provides the most flexibility |
Applications can be quickly provisioned and deprovisioned | Data is not collocated with other organizations’ data | Organizations determine where to run their applications |
Organizations pay only for what they use | Hardware must be purchased for startup and maintenance | Organizations control security, compliance, or legal requirements |
Organizations don’t have complete control over resources and security | Organizations are responsible for hardware maintenance and updates | |
Multi-cloud
- use multiple public cloud providers
- Maybe you use different features from different cloud providers
- Maybe you started your cloud journey with one provider and are in the process of migrating to a different provider.
Azure Arc
- A set of technologies that helps you manage your cloud environment, whether that environment is:
- a public cloud solely on Azure
- a private cloud in your datacenter
- a hybrid configuration
- a multi-cloud environment running on multiple cloud providers at once.
Azure VMware Solution
- VMware in a private cloud environment but want to migrate to a public or hybrid cloud
- Azure VMware Solution lets you run your VMware workloads in Azure with seamless integration and scalability.
Describe the consumption-based model
- Capital expenditure (CapEx)
- CapEx is typically a one-time, up-front expenditure to purchase or secure tangible resources.
- A new building, repaving the parking lot, building a datacenter, or buying a company vehicle are examples of CapEx.
- operational expenditure (OpEx).
- OpEx is spending money on services or products over time.
- Renting a convention center, leasing a company vehicle, or signing up for cloud services are all examples of OpEx.
- Cloud computing falls under OpEx because cloud computing operates on a consumption-based model
This consumption-based model has many benefits, including:
- No upfront costs.
- No need to purchase and manage costly infrastructure that users might not use to its fullest potential.
- The ability to pay for more resources when they're needed.
- The ability to stop paying for resources that are no longer needed.
Traditional datacenter | Cloud-based model |
With a traditional datacenter, you try to estimate the future resource needs. If you overestimate, you spend more on your datacenter than you need to and potentially waste money. If you underestimate, your datacenter will quickly reach capacity and your applications and services may suffer from decreased performance. Fixing an under-provisioned datacenter can take a long time. You may need to order, receive, and install more hardware. You'll also need to add power, cooling, and networking for the extra hardware. | In a cloud-based model, you don’t have to worry about getting the resource needs just right. If you find that you need more virtual machines, you add more. If the demand drops and you don’t need as many virtual machines, you remove machines as needed. Either way, you’re only paying for the virtual machines that you use, not the “extra capacity” that the cloud provider has on hand. |
Compare cloud pricing models
- Cloud computing uses a pay-as-you-go pricing model. You typically pay only for the cloud services you use, which helps you:
- Plan and manage your operating costs.
- Run your infrastructure more efficiently.
- Scale as your business needs change.
Benefits of cloud computing
High availability
- The ability to access resources at any time and from anywhere
- Availability is achieved with Auto scaling
- Azure is a highly available cloud environment with uptime guarantees depending on the service.
- These guarantees are part of the service-level agreements (SLAs).
Scalability
- the ability to adjust resources to meet demand
- Scalability is achieved with vertical and horizontal scaling
- Because you scale to match demand, you aren't overpaying for services
- Scaling types
- Vertical scaling is focused on increasing or decreasing the capabilities of resources.
- if you were developing an app and you needed more processing power, you could vertically scale up to add more CPUs or RAM to the virtual machine.
- if you realized you had over-specified the needs, you could vertically scale down by lowering the CPU or RAM specifications.
- Horizontal scaling is adding or subtracting the number of resources.
- you could add additional virtual machines or containers, scaling out
Reliability
- the ability of a system to recover from failures and continue to function
- Reliability is achieved with availability zones
- It's also one of the pillars of the Microsoft Azure Well-Architected Framework.
Predictability
- Performance and cost predictability are heavily influenced by the Microsoft Azure Well-Architected Framework.
- Performance predictability is supported by autoscaling, load balancing, and high availability
- Cost predictability is focused on predicting or forecasting the cost of the cloud spend
- Use the Total Cost of Ownership (TCO) Calculator or the Pricing Calculator to get an estimate of potential cloud spend
Governance
- The process of defining, implementing, and monitoring a framework of policies that guides an organization's cloud operations.
- Templates (Automation) help ensure that all your deployed resources meet corporate standards and government regulatory requirements.
- Cloud-based auditing helps flag any resource that’s out of compliance with your corporate standards and provides mitigation strategies.
Security
- The ability to protect against rapidly evolving threats
- Security is achieved with the shared responsibility model
- infrastructure as a service provides you with physical resources but lets you manage the operating systems and installed software, including patches and maintenance
- If you want patches and maintenance taken care of automatically, platform as a service or software as a service deployments may be the best cloud strategies for you.
Infrastructure as a Service (no installation of hardware/infrastructure - renting hardware)
Common Diagram

- Infrastructure as a service (IaaS) is the most flexible category of cloud services.
- The cloud provider is responsible for maintaining the hardware, network connectivity (to the internet), and physical security.
- You are responsible for operating system installation, configuration, and maintenance; network configuration; database and storage configuration; and so on.
Scenarios
- Lift-and-shift migration:
- Testing and development:
Platform as a Service (no installation of development tools/platform)
- The cloud provider is responsible for the physical infrastructure, physical security, the connection to the internet, and for maintaining the operating systems, middleware, development tools, and business intelligence services that make up a cloud solution
- You don't have to worry about licensing or patching for operating systems and databases.
- You are responsible for networking settings and connectivity within your cloud environment, application security, and the directory infrastructure.
Scenarios
- Development framework ( Application software development)
- Analytics or business intelligence (Database / data warehouse applications development)
Describe Software as a Service
- Software as a service (SaaS) is the most complete cloud service model from a product perspective.
- You’re essentially renting or using a fully developed application, such as email, financial software, messaging applications, or connectivity software
- least flexible
- Cloud provider (Most responsibility)
- Responsible for physical security of the data centers, power, network connectivity, and application development and patching
- Cloud consumer (Least responsibility)
- Responsible for the data that you put into the system, the devices that you allow to connect to the system, and the users that have access.
Scenarios
- Email and messaging.
- Business productivity applications.
- Finance and expense tracking. etc…