Automatically Remediate Noncompliant AWS Resources using Lambda

While enterprises can rapidly scale their infrastructure in the cloud, there’s a corresponding demand for scalable mechanisms that meet the security and compliance requirements set by corporate policies, auditors, and security teams.

For example, we can easily and rapidly launch hundreds of resources – such as EC2 instances – in the cloud, but we also need approaches for managing the security and compliance of these resources and the surrounding infrastructure. It’s not enough to passively monitor noncompliant resources; you need to automatically fix the configuration that caused the noncompliance.

Using a collection of AWS services, you can detect noncompliant resources and automatically remediate them to maintain compliance without human intervention.

In this post, you’ll learn how to automatically remediate noncompliant AWS resources using AWS services such as AWS Config Rules, Amazon CloudWatch Event Rules, and AWS Lambda. You’ll follow step-by-step instructions for configuring automated remediation in the AWS Console.

The diagram below shows the key AWS resources and relationships you’ll be creating.

Let’s get started!

Create an S3 Bucket for CloudTrail

In this section, you’ll create an Amazon S3 bucket for use with CloudTrail. If you’ve already established CloudTrail, this section is optional. Here are the steps:

  1. Go to the S3 console
  2. Click the Create bucket button
  3. Enter ccoa-cloudtrail-ACCOUNTID in the Bucket name field (replacing ACCOUNTID with your account id)
  4. Click Next on the Configure Options screen
  5. Click Next on the Set Permissions screen
  6. Click Create bucket on the Review screen

Create a CloudTrail Trail

In this section, you’ll create a trail for AWS CloudTrail. If you’ve already established CloudTrail, this section is optional. Here are the steps:

  1. Go to the CloudTrail console
  2. Click the Create trail button
  3. Enter ccoa-cloudtrail in the Trail name field
  4. Choose the checkbox next to Select all S3 buckets in your account in the Data events section
  5. Choose the No radio button for the Create a new S3 bucket field in the Storage location section.
  6. Choose the S3 bucket you just created from the S3 bucket dropdown.
  7. Click the Create button

Create an AWS Config Recorder

In this section, you’ll configure the settings for AWS Config, which includes turning on the Config recorder along with a delivery channel. If you’ve already configured AWS Config, this section is optional. Here are the steps:

  1. Go to the AWS Config console
  2. If it’s your first time using Config, click the Get Started button
  3. Select the Include global resources (e.g., AWS IAM resources) checkbox
  4. In the Amazon SNS topic section, select the Stream configuration changes and notifications to an Amazon SNS topic. checkbox
  5. Choose the Create a topic radio button in the Amazon SNS topic section
  6. In the Amazon S3 bucket section, select the Create a bucket radio button
  7. In the AWS Config role section, select the Use an existing AWS Config service-linked role radio button
  8. Click the Next button
  9. Click the Skip button on the AWS Config rules page
  10. Click the Confirm button on the Review page

Create an S3 Bucket in Violation of Compliance Rules

In this section, you’ll create an S3 bucket that allows anyone to put files into it. This is purely for demonstration purposes; you should never grant public access to your S3 buckets. Here are the steps:

  1. Go to the S3 console
  2. Click the Create bucket button
  3. Enter ccoa-s3-write-violation-ACCOUNTID in the Bucket name field (replacing ACCOUNTID with your account id)
  4. Click Next on the Configure Options screen
  5. Unselect the Block all public access checkbox and click Next on the Set Permissions screen
  6. Click Create bucket on the Review screen
  7. Select the ccoa-s3-write-violation-ACCOUNTID bucket and choose the Permissions tab
  8. Click on Bucket Policy and paste the contents from below into the Bucket policy editor text area (replace both MYBUCKETNAME values with the ccoa-s3-write-violation-ACCOUNTID bucket you just created)
  9. Click the Save button

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": "*",
      "Action": [
        "s3:Abort*",
        "s3:DeleteObject",
        "s3:GetBucket*",
        "s3:GetObject",
        "s3:List*",
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::MYBUCKETNAME",
        "arn:aws:s3:::MYBUCKETNAME/*"
      ]
    }
  ]
}

You’ll receive this message: You have provided public access to this bucket. We highly recommend that you never grant any kind of public access to your S3 bucket.
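The s3-bucket-public-write-prohibited managed rule you’ll enable later flags exactly this kind of policy: an Allow statement with a wildcard principal combined with a write action such as s3:PutObject. As a rough illustration of that logic, here is a hypothetical check in Python (a simplified local sketch only; the real managed rule also considers bucket ACLs, policy conditions, and Block Public Access settings):

```python
import json

def allows_public_write(policy_json):
    """Return True if any Allow statement lets an anonymous principal write objects."""
    policy = json.loads(policy_json)
    for stmt in policy.get("Statement", []):
        if stmt.get("Effect") != "Allow":
            continue
        principal = stmt.get("Principal")
        # Principal "*" or {"AWS": "*"} means "anyone"
        is_public = principal == "*" or (
            isinstance(principal, dict) and principal.get("AWS") == "*"
        )
        actions = stmt.get("Action", [])
        if isinstance(actions, str):
            actions = [actions]
        writes = any(a in ("s3:PutObject", "s3:*", "*") for a in actions)
        if is_public and writes:
            return True
    return False

# The violating statement from the bucket policy above, reduced to its essence
violating_policy = """{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": "*",
    "Action": ["s3:PutObject"],
    "Resource": "arn:aws:s3:::MYBUCKETNAME/*"
  }]
}"""
print(allows_public_write(violating_policy))  # True
```

Run against the bucket policy you just saved, this check would report the bucket as publicly writable, which is what trips the Config rule.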

Create an IAM Policy and Role for Lambda

In this section, you’ll create an IAM Policy and Role that establish the permissions that the Lambda function will use. Here are the steps:

  1. Go to the IAM console
  2. Click on Policies
  3. Click Create policy
  4. Click the JSON tab
  5. Copy and replace the contents below into the JSON text area
  6. Click the Review policy button
  7. Enter ccoa-s3-write-policy in the Name field
  8. Click the Create policy button
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:DeleteBucketPolicy",
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents"
            ],
            "Resource": "*"
        }
    ]
}
  1. Click on Roles
  2. Click the Create role button
  3. Click Lambda from the Choose the service that will use this role section
  4. Click the Next: Permissions button
  5. Click ccoa-s3-write-policy in the Filter policies search field
  6. Select the checkbox next to ccoa-s3-write-policy and click on the Next: Tags button
  7. Click the Next: Review button
  8. Enter ccoa-s3-write-role in the Role name field
  9. Click the Create role button
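Before wiring the role to Lambda, it can help to sanity-check that the policy document grants everything the remediation function needs: one action to delete the offending bucket policy, and three actions to write its own CloudWatch Logs. A small local sketch (a hypothetical helper, not an AWS API call; the action list mirrors the policy above):

```python
import json

# Actions the remediation Lambda needs: one to delete the offending
# bucket policy, three to write its logs to CloudWatch Logs.
REQUIRED_ACTIONS = {
    "s3:DeleteBucketPolicy",
    "logs:CreateLogGroup",
    "logs:CreateLogStream",
    "logs:PutLogEvents",
}

def missing_actions(policy):
    """Return any required action not granted by an Allow statement."""
    granted = set()
    for stmt in policy.get("Statement", []):
        if stmt.get("Effect") == "Allow":
            actions = stmt["Action"]
            granted.update([actions] if isinstance(actions, str) else actions)
    return REQUIRED_ACTIONS - granted

# The ccoa-s3-write-policy document from above
policy = json.loads("""
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:DeleteBucketPolicy",
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents"
            ],
            "Resource": "*"
        }
    ]
}
""")
print(missing_actions(policy))  # set() — nothing missing
```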

Create a Lambda Function to Auto-remediate S3 Buckets

In this section, you’ll create a Lambda function that is written in Node.js and performs the automatic remediation by deleting the S3 Bucket Policy associated with the bucket. Here are the steps:

  1. Go to the Lambda console
  2. Click the Create function button
  3. Keep the Author from scratch radio button selected and enter ccoa-s3-write-remediation in the Function name field
  4. Choose Node.js 10.x for the Runtime
  5. Under Permissions, click Choose or create an execution role
  6. Under Execution role, choose Use an existing role
  7. In the Existing role dropdown, choose ccoa-s3-write-role
  8. Click the Create function button
  9. Scroll to the Function code section and within the index.js pane, copy and replace the code from below
var AWS = require('aws-sdk');

exports.handler = function(event) {
  console.log("request:", JSON.stringify(event, undefined, 2));

  var s3 = new AWS.S3({apiVersion: '2006-03-01'});
  var evaluations = event['detail']['requestParameters']['evaluations'];
  console.log("evaluations:", JSON.stringify(evaluations, null, 2));

  // Delete the bucket policy on each noncompliant bucket
  for (var i = 0; i < evaluations.length; i++) {
    if (evaluations[i]['complianceType'] === 'NON_COMPLIANT') {
      console.log(evaluations[i]['complianceResourceId']);
      var params = {
        Bucket: evaluations[i]['complianceResourceId']
      };

      s3.deleteBucketPolicy(params, function(err, data) {
        if (err) console.log(err, err.stack); // an error occurred
        else     console.log(data);           // successful response
      });
    }
  }
};
  10. Click the Save button
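For reference, the same remediation logic can be sketched in Python with an injectable S3 client, which makes it easy to exercise locally with a stub before deploying (this is a hypothetical port, not part of the walkthrough; a real deployment would pass a boto3 S3 client to make_handler):

```python
def make_handler(s3_client):
    """Build a handler around an injectable S3 client (real boto3 or a stub)."""
    def handler(event, context=None):
        evaluations = event["detail"]["requestParameters"]["evaluations"]
        remediated = []
        for ev in evaluations:
            if ev["complianceType"] == "NON_COMPLIANT":
                bucket = ev["complianceResourceId"]
                print("Deleting bucket policy on:", bucket)
                s3_client.delete_bucket_policy(Bucket=bucket)
                remediated.append(bucket)
        return remediated
    return handler

class StubS3:
    """Records delete_bucket_policy calls instead of hitting AWS."""
    def __init__(self):
        self.deleted = []
    def delete_bucket_policy(self, Bucket):
        self.deleted.append(Bucket)

# Exercise the handler locally with a stubbed client and a sample event
stub = StubS3()
handler = make_handler(stub)
event = {"detail": {"requestParameters": {"evaluations": [
    {"complianceType": "NON_COMPLIANT",
     "complianceResourceId": "ccoa-s3-write-violation-123456789012"},
    {"complianceType": "COMPLIANT",
     "complianceResourceId": "some-other-bucket"},
]}}}
print(handler(event))  # ['ccoa-s3-write-violation-123456789012']
```

Only the NON_COMPLIANT evaluation triggers a delete_bucket_policy call; compliant resources are left untouched.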

Create an AWS Config Rule

In this section, you’ll create an AWS Config Rule that uses a Managed Config Rule to detect S3 buckets that allow public writes. The Managed Config Rule runs a Lambda function to detect when S3 buckets are not in compliance. Here are the steps:

  1. Go to the Config console
  2. Click Rules
  3. Click the Add rule button
  4. In the filter box, type s3-bucket-public-write-prohibited
  5. Choose the s3-bucket-public-write-prohibited rule
  6. Click on the Remediation action dropdown within the Choose remediation action section
  7. Choose the AWS-PublishSNSNotification remediation in the dropdown
  8. Click Yes in the Auto remediation field
  9. In the Parameters field, enter arn:aws:iam::ACCOUNTID:role/aws-service-role/ssm.amazonaws.com/AWSServiceRoleForAmazonSSM in the AutomationAssumeRole field (replacing ACCOUNTID with your AWS account id)
  10. In the Parameters field, enter s3-bucket-public-write-prohibited violated in the Message field
  11. In the Parameters field, enter arn:aws:sns:us-east-1:ACCOUNTID:ccoa-awsconfig-ACCOUNTID in the TopicArn field (replacing ACCOUNTID with your AWS account id)
  12. Click the Save button

Create a CloudWatch Event Rule

In this section, you’ll create an Amazon CloudWatch Event Rule which monitors when the S3_BUCKET_PUBLIC_WRITE_PROHIBITED Config Rule is deemed noncompliant. Here are the steps:

  1. Go to the CloudWatch console
  2. Click on Rules
  3. Click the Create rule button
  4. Choose Event pattern in the Event Source section
  5. In the Event Pattern Preview section, click Edit
  6. Copy the contents from below and replace in the Event pattern text area
  7. Click the Save button
  8. Click the Add target button
  9. Choose Lambda function
  10. Select the ccoa-s3-write-remediation function you previously created
  11. Click the Configure details button
  12. Enter ccoa-s3-write-cwe in the Name field
  13. Click the Create rule button

{
  "source":[
    "aws.config"
  ],
  "detail":{
    "requestParameters":{
      "evaluations":{
        "complianceType":[
          "NON_COMPLIANT"
        ]
      }
    },
    "additionalEventData":{
      "managedRuleIdentifier":[
        "S3_BUCKET_PUBLIC_WRITE_PROHIBITED"
      ]
    }
  }
}
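Event patterns match when every field listed in the pattern appears in the event with one of the allowed values, and an array field matches if any element does. The following simplified matcher illustrates why the pattern above fires only for NON_COMPLIANT evaluations of the S3_BUCKET_PUBLIC_WRITE_PROHIBITED rule (a local sketch covering just the constructs used here, not the full event pattern language):

```python
def matches(pattern, event):
    """Recursively check that the event satisfies the pattern.
    Lists in the pattern mean "any of these values"; dicts recurse;
    a list-valued event field matches if any element satisfies the pattern."""
    for key, expected in pattern.items():
        if key not in event:
            return False
        actual = event[key]
        if isinstance(expected, dict):
            if isinstance(actual, list):
                if not any(isinstance(item, dict) and matches(expected, item)
                           for item in actual):
                    return False
            elif not (isinstance(actual, dict) and matches(expected, actual)):
                return False
        else:  # list of allowed scalar values
            if isinstance(actual, list):
                if not any(a in expected for a in actual):
                    return False
            elif actual not in expected:
                return False
    return True

# The pattern from above
pattern = {
    "source": ["aws.config"],
    "detail": {
        "requestParameters": {
            "evaluations": {"complianceType": ["NON_COMPLIANT"]}},
        "additionalEventData": {
            "managedRuleIdentifier": ["S3_BUCKET_PUBLIC_WRITE_PROHIBITED"]},
    },
}
# A trimmed-down example of the Config event the rule emits
event = {
    "source": "aws.config",
    "detail": {
        "requestParameters": {
            "evaluations": [{"complianceType": "NON_COMPLIANT",
                             "complianceResourceId": "my-bucket"}]},
        "additionalEventData": {
            "managedRuleIdentifier": "S3_BUCKET_PUBLIC_WRITE_PROHIBITED"},
    },
}
print(matches(pattern, event))  # True
```

An event for a compliant evaluation, or for any other managed rule, would fail the match and never invoke the remediation function.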

View Config Rules

In this section, you’ll verify that the Config Rule has been triggered and that the S3 bucket resource has been automatically remediated:

  1. Go to the Config console
  2. Click on Rules
  3. Select the s3-bucket-public-write-prohibited rule
  4. Click the Re-evaluate button
  5. Go back to Rules in the Config console
  6. Go to the S3 console, choose the ccoa-s3-write-violation-ACCOUNTID bucket, and verify that the bucket policy has been removed
  7. Go back to Rules in the Config console and confirm that the s3-bucket-public-write-prohibited rule is Compliant

Summary

In this post, you learned how to set up a robust automated compliance and remediation infrastructure for noncompliant AWS resources using services such as S3, AWS Config & Config Rules, Amazon CloudWatch Event Rules, AWS Lambda, and IAM. With this approach, your AWS infrastructure can rapidly scale resources while ensuring they remain in compliance, without manual intervention.

This general approach can be replicated for many other types of security and compliance checks using managed and custom config rules along with custom remediations. This way your compliance remains in lockstep with the rest of your AWS infrastructure.

The post Automatically Remediate Noncompliant AWS Resources using Lambda appeared first on Stelligent.

Are You Well-Architected? – AWS Virtual Workshop

Most businesses depend on a portfolio of technology solutions to operate and be successful every day. How do you know if you and your team are following best practice, or what the risks in your architectures might be? In this virtual hands-on workshop, we will show how the AWS Well-Architected framework provides prescriptive architectural advice, and how the AWS Well-Architected Tool allows you to measure and improve your technology portfolio. We will explain how other customers are using AWS Well-Architected in their business and share insights into what we have learned from reviewing tens of thousands of architectures across Operational Excellence, Security, Reliability, Performance Efficiency, and Cost Optimization. We will also demonstrate and walk you through step-by-step how to use the Well-Architected Tool.

Learning Objectives:
– Learn the value proposition and components of Well-Architected
– Learn how to use the Well-Architected Tool to measure workloads
– Learn insights from reviews of tens of thousands of architectures

View on YouTube

AWS Office Hours: Amazon CloudFront – AWS Online Tech Talks

Just getting started with Amazon CloudFront and Lambda@Edge? Get your answers directly from our experts during this AWS Office Hours session. Learn what a content delivery network (CDN) such as Amazon CloudFront is and how it works. Find out about the benefits it provides, common challenges, how it performs, recently released features, and examples of how customers are using CloudFront. You will also learn about customizing content delivery through Lambda@Edge – a serverless compute service that lets you execute functions to customize the content delivered through CloudFront. Lastly, you will learn about best practices.

Learning Objectives:
– How do I accelerate both my static and dynamic content on AWS?
– How can I secure my content on AWS at the Edge?
– What are some best practices for Amazon CloudFront?

View on YouTube

Getting Hands-On with Machine Learning and Ready to Race in the AWS DeepRacer League

Developers, start your engines! This virtual workshop will provide developers of all skill levels an opportunity to get hands-on with AWS DeepRacer. Learn about the basics of machine learning and reinforcement learning (a machine learning technique, ideal for autonomous driving). During the workshop, you will build a reinforcement learning model and submit that model to the AWS DeepRacer League for a chance to win a trip to re:Invent 2019.

Learning Objectives:
– Learn the basics of machine learning
– Build a reinforcement learning model
– Learn how to get started with AWS DeepRacer and participate in the league

View on YouTube

Lower Costs by Right Sizing Your Instance with Amazon EC2 T3 General Purpose Burstable Instances

EC2 T3 instances are the next generation of low cost burstable general-purpose instance types that provide a baseline level of CPU performance with the ability to burst CPU usage at any time for as long as required. T3 instances are designed for applications with moderate CPU usage that experience temporary spikes in use. T3 instances also offer a balance of compute, memory, and network resources for a broad spectrum of general purpose workloads. These include micro-services, low-latency interactive applications, small and medium databases, virtual desktops, development environments, code repositories, and business-critical applications. In this tech talk, we will provide an overview of T3 instances, help you understand what workloads are ideal for them, and show how the T3 credit system works so that you can lower your EC2 instance costs today.

Learning Objectives:
– Understand what T3 instances are
– Understand what workloads are appropriate for T3 instances
– Understand how the credit system works

View on YouTube

Enterprise-Class Security, High-Availability, & Scalability with Amazon ElastiCache

Modern day enterprises face large scale challenges like security, global access, data availability, and scalability to meet their business needs. In this tech talk, you will learn about new enterprise-friendly enhancements you can leverage for your mission-critical workloads. From reader endpoint and customer managed keys for encryption to online scaling up or down, we continue to deliver the scalability, availability, compliance, and security that you care about.

Learning Objectives:
– Understand that Amazon ElastiCache is a fully-managed, enterprise-friendly in-memory database compatible with Redis and Memcached
– Learn how the newly released features work
– Explore these new features on your own in the console to build your enterprise apps

View on YouTube

Accelerate Cloud Adoption and Reduce Operational Risk with AWS Managed Services

Enterprises want to adopt AWS at scale, but their teams often lack the skills and experience needed to make the move. AWS Managed Services (AMS) operates AWS on your behalf, providing a secure and compliant Landing Zone, an enterprise operating model “out-of-the-box”, day-to-day infrastructure management, security and compliance control, and cost optimization. In this tech talk, we’ll show you how AMS accelerates your migration to AWS, reduces your operating costs, improves security and compliance, and enables you to focus on your differentiating business priorities.

Learning Objectives:
– Understand how to properly plan for operating your cloud infrastructure post migration
– Learn how AMS extends your security perimeter to the cloud, including Active Directory integration and compliance certification mapping
– Learn how AMS lowers the cost of operating AWS

View on YouTube

The Great Migration to Cloud Storage: Choosing the Right Storage Solution for Your Workload

Among your company’s top priorities should be ensuring that data is safe and secure. The prospect of moving data to the cloud can be daunting, and making sense of all of the services, tools, and protocols available to do it can be difficult. In this tech talk, we’ll help you demystify cloud storage while helping to identify which AWS storage service may be the right fit for your business. AWS offers a complete range of cloud storage services to support both application and archival compliance requirements. Select from object, file, and block storage services, as well as cloud data migration options, to start designing the foundation of your cloud IT environment. We’ll also discuss how to migrate your critical workloads, selecting volumes or file systems, maximizing performance, and designing for durability and availability. AWS provides storage services that fit any workload or application.

Learning Objectives:
– Learn how to get started moving data into AWS efficiently and securely
– Learn how to optimize for cost to make sure your “lift-and-shift” project is a complete success
– Learn which workloads and applications are best suited to each storage service

View on YouTube

Deep Dive on Amazon EventBridge – AWS Online Tech Talks

Amazon EventBridge is a serverless event bus service that launched in July 2019. EventBridge makes it easy to build event-driven architectures by using data from your own applications, Software-as-a-Service (SaaS) applications, and AWS services. This tech talk will provide a deep dive on how events flow from internal and external applications through EventBridge, and how you can use rules and policies to route, transform, and control access to these events. We’ll talk about how EventBridge integrates with other AWS services, like Amazon SQS and AWS Lambda, and help you get started building resilient, scalable, event-driven architectures in the cloud.

Learning Objectives:
– Learn how to build event-driven architectures with data from 3rd party SaaS apps
– Optimize event-driven applications for performance and availability
– Learn how to use rules to route and transform events

View on YouTube

Deep Dive on Running Active Directory on AWS – AWS Online Tech Talks

Microsoft Active Directory is an integral part of running any Windows workload. In this tech talk, we will deep dive into options for deploying Active Directory on AWS using AWS Managed Microsoft AD, as well as running Active Directory on EC2. We will also show you how you can easily deploy AWS Managed Microsoft AD from the AWS Console through a quick demo.

Learning Objectives:
– Learn different ways to run Active Directory on AWS
– Learn how to deploy AWS Managed AD through the AWS console
– Learn feature differences between AWS Managed AD and AD on EC2

View on YouTube