Unlocking Quantum Secrets: Shor's Algorithm

Today, we're diving into the fascinating world of Shor's Algorithm – a quantum magician that can crack the code of factoring large numbers faster than you can say "crypto-who?" So, grab your quantum popcorn, and let's unravel the mystery together!



The Classic Conundrum

Imagine you're in the digital realm, surrounded by classical computers doing their best with bits, the binary heroes that can only be 0 or 1. Now, our heroes struggle when faced with the colossal task of factoring large numbers into their prime buddies. It's like asking them to find a needle in a cosmic haystack!


Enter Shor's Quantum Wizardry

Cue the entrance of Shor's Algorithm, a quantum rockstar developed by the math maestro Peter Shor in 1994. This algorithm is like the superhero version of your regular factoring processes, but instead of bits, it employs quantum wonders called qubits – think of them as bits on steroids.


Quantum Bits: More Than Just 0s and 1s

Picture qubits as party animals, dancing in a superposition, existing in multiple states simultaneously. It's like having your cake and eating it too, but in the quantum realm!


Meet the Quantum Fourier Transform (QFT): The Cool Sidekick

Shor's Algorithm has this cool sidekick called the Quantum Fourier Transform (QFT). It's like the DJ at a quantum party, helping us spot patterns in a function at warp speed. Why? Because finding patterns reveals the secrets behind those tricky prime factors!



How Shor's Algorithm Throws the Ultimate Quantum Bash

Here's the quantum groove: pick a random number, slap on the QFT shades to find the period of a related function, and if that period is even, voila! You're on the express train to calculating potential factors. It's like solving a puzzle with the quantum version of a treasure map.
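To make that treasure map a little more concrete, here's a tiny classical sketch of the arithmetic for N = 15 with a = 7. The period-finding loop is the step a quantum computer (with the QFT) does exponentially faster; everything else is ordinary number crunching:

from math import gcd

# Classical toy run of the arithmetic behind Shor's algorithm for N = 15, a = 7
N, a = 15, 7

# Find the period r of f(x) = a^x mod N -- this is the part Shor's algorithm speeds up
r = 1
while pow(a, r, N) != 1:
    r += 1

print(r)                           # 4 -- the period, and it's even
print(gcd(pow(a, r // 2) - 1, N))  # 3
print(gcd(pow(a, r // 2) + 1, N))  # 5 -- together, the prime factors of 15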


Why Shor's Algorithm is the Quantum Life of the Party

While classical algorithms are huffing and puffing with exponential effort, Shor's Algorithm is strutting its stuff with polynomial finesse. It's like the quantum algorithm saying, "Hold my quantum lemonade – I've got this!"


Quantum Limitations and the Grand Finale

Shor's Algorithm isn't a party-crasher for all encryption, but watch out, RSA encryption – once large-scale quantum computers arrive, it's coming for you! And that could be a wrap for the public-key encryption era as we know it.


Quantum Future: Cryptographic Evolution

The quantum revolution isn't just for the cool kids in the lab coats. Shor's Algorithm hints at a brave new world of quantum-resistant cryptographic techniques, gearing up to defend our digital fortresses against quantum shenanigans.


Wrap Up!

In a nutshell, Shor's Algorithm is the quantum maestro playing the prime factor symphony in the world of quantum computing. It's faster, it's cooler, and it's shaking up the cryptographic dance floor.

So, folks, keep your eyes on the quantum horizon – there's more magic brewing in those qubits than meets the eye! Until then, keep calm and quantum on! 🚀✨

Securing Your APIs with Python Jose

Learn how to secure your serverless APIs with AWS Lambda Authorisers and Python Jose. We'll walk you
through the mechanics of OAuth and show you how to build a proof-of-concept with Terraform and Python.

When it comes to building serverless APIs, security is always a top concern. One way to ensure that only authorised users can access your APIs is by using an AWS Lambda Authoriser. An authoriser is a Lambda function that you can use to authenticate and authorise incoming requests to your API Gateway.

In this post, we will show you how to use AWS Lambda Authorisers in combination with Python Jose, a library for handling JSON Web Tokens (JWTs), to secure your serverless APIs. We'll start by discussing the basics of OAuth, the standard for authorisation on the web, and then move on to building a proof-of-concept with Terraform and Python.

First Steps!

First, let's take a look at the mechanics of OAuth. OAuth is an open standard for authorization that enables third-party applications to obtain limited access to an HTTP service. It works by allowing users to authenticate and authorize a third-party application to access their resources, without having to share their credentials.

There are four main actors in the OAuth flow: the resource owner (the user), the client (the third-party application), the authorization server, and the resource server (the API). The client directs the user to the authorization server, where the user grants permission for the client to access their resources. Once permission is granted, the authorization server issues an access token to the client, which the client can then present to the resource server to access the user's resources.

Now that we understand the mechanics of OAuth, let's take a look at how we can use AWS Lambda Authorisers and Python Jose to secure our serverless APIs.

AWS Lambda Authorizers are Lambda functions that you can use to authenticate and authorize incoming requests to your API Gateway. They can be configured to use various authentication mechanisms, such as JWTs, OAuth tokens, or custom authentication. In this example, we'll use Python Jose to verify JWTs.

Going Deeper

First, you'll need to create a new Lambda function and add it as an authorizer to your API Gateway. Here's an example of a simple Lambda authorizer written in Python:

from jose import jwt
from jose.exceptions import JWTError


def lambda_handler(event, context):
    # Get the JWT from the request
    token = event['authorizationToken']
    # Verify the JWT signature and decode the claims
    try:
        claims = jwt.decode(token, 'secret_key', algorithms=['HS256'])
    except JWTError:
        # API Gateway maps this exact message to a 401 response
        raise Exception('Unauthorized')
    # Return an IAM policy with the user's email address as the principal
    return {
        'principalId': claims['email'],
        'policyDocument': {
            'Version': '2012-10-17',
            'Statement': [
                {
                    'Effect': 'Allow',
                    'Action': 'execute-api:Invoke',
                    'Resource': event['methodArn']
                }
            ]
        }
    }



This is a simple example of a Lambda function that can be used as an authorizer for your API Gateway. The function takes an incoming request and extracts the JWT from the authorizationToken field. It then uses the jose library to verify the JWT. If the JWT is valid, the function returns the user's email address as the principal ID. If the JWT is invalid, the function raises an exception.

You'll need to replace 'secret_key' with the secret key that you've used to sign the JWT. The 'email' field should be the field that holds the user's email address in your JWT.
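If you want to exercise the authorizer end to end, you can mint a test token with the same library. This is a minimal sketch assuming the same placeholder 'secret_key' and an email claim:

from jose import jwt

# Sign a test token with the shared secret and an email claim
test_token = jwt.encode({'email': 'user@example.com'}, 'secret_key', algorithm='HS256')
print(test_token)  # pass this as the authorizationToken when testing the authorizer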

Once you have created your Lambda function, you can add it as an authorizer to your API Gateway. To do this, you'll need to go to the API Gateway console and select your API. Then, select the Authorizers option from the sidebar, and click the Create button. You'll then be prompted to select the Lambda function that you want to use as an authorizer.

With your Lambda Authorizer set up, any incoming requests to your API Gateway will be passed through the Lambda function for authentication and authorization.

Refining the idea

To take this a step further, you can use Terraform to build out a proof-of-concept environment that includes an API Gateway, a Lambda function, and an IAM role for the Lambda function. Here is an example of a Terraform configuration that creates an API Gateway, a Lambda function, and an IAM role for the Lambda function:

resource "aws_api_gateway_rest_api" "example" {
  name = "example"
}
resource "aws_api_gateway_resource" "example" {
  rest_api_id = aws_api_gateway_rest_api.example.id
  parent_id = aws_api_gateway_rest_api.example.root_resource_id
  path_part = "{proxy+}"
}
resource "aws_api_gateway_method" "example" {
  rest_api_id = aws_api_gateway_rest_api.example.id
  resource_id = aws_api_gateway_resource.example.id
  http_method = "ANY"
  authorization = "CUSTOM"
  authorizer_id = aws_api_gateway_authorizer.example.id
}
resource "aws_api_gateway_authorizer" "example" {
  name = "example"
  rest_api_id = aws_api_gateway_rest_api.example.id
  type = "TOKEN"
  identity_source = "method.request.header.Authorization"
  authorizer_uri = aws_lambda_function.example.invoke_arn
}
resource "aws_lambda_function" "example" {
  filename = "lambda_function.zip"
  function_name = "example"
  role = aws_iam_role.example.arn
  handler = "lambda_function.lambda_handler"
  runtime = "python3.8"
}
resource "aws_iam_role" "example" {
  name = "example"
  assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF
}
resource "aws_iam_role_policy" "example" {
  name = "example"
  role = aws_iam_role.example.id
  policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "logs:*",
      "Resource": "arn:aws:logs:*:*:*"
    }
  ]
}
EOF
}

This Terraform configuration creates an API Gateway, a Lambda function, and an IAM role for the Lambda function. The API Gateway is configured to use the Lambda function as an authorizer, and the Lambda function is configured to use the IAM role.
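One thing the configuration above leaves implicit is that API Gateway needs permission to invoke the authorizer function. A minimal sketch of that grant, reusing the resource names assumed above, looks like this:

resource "aws_lambda_permission" "example" {
  statement_id  = "AllowAPIGatewayInvoke"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.example.function_name
  principal     = "apigateway.amazonaws.com"
  source_arn    = "${aws_api_gateway_rest_api.example.execution_arn}/*"
}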


Wrap Up!

AWS Lambda Authorizers provide a powerful way to authenticate and authorize incoming requests to your serverless APIs. By using a Lambda function to authenticate and authorize requests, you can add an extra layer of security to your APIs without having to manage servers or infrastructure.

When building a Lambda Authorizer, it is a good idea to use a JWT library such as Python Jose to handle the token verification. The library provides a simple and secure way to work with JWTs, and JOSE implementations are available for most languages.

Exploring Advanced Widgets for Your GitHub Pages

In our previous blog post, we learned how to create an impressive landing page on your GitHub profile using a repository with the same name as your GitHub ID. We also briefly touched upon some advanced widgets that you can add to enhance your profile. In this follow-up blog post, we'll take a deeper dive into the various widgets available for your GitHub Pages. These widgets will help you showcase your skills, projects, and contributions in a visually appealing and interactive manner.


GitHub Readme Stats Widget

The GitHub Readme Stats widget by Anurag Hazra allows you to display detailed statistics about your GitHub profile. In addition to the top languages used, you can showcase your total commits, contributions, followers, and more. To add this widget to your profile, insert the following code into your README.md file:

```markdown
[![Paul's GitHub stats](https://github-readme-stats.vercel.app/api?username=your-github-username&show_icons=true&theme=radical)](https://github.com/anuraghazra/github-readme-stats)

```

You can customise the appearance of the widget by changing the theme parameter to one of the available options, such as "radical," "gruvbox," or "dracula." Explore the GitHub Readme Stats repository for additional customisation options.

GitHub Activity Graph Widget

The GitHub Activity Graph widget provides a visual representation of your GitHub activity over time. This widget displays a calendar heat-map, where each square represents a day, and the colour intensity represents the level of activity. To add this widget to your profile, insert the following code into your README.md file:

```markdown
[![Paul's github activity graph](https://activity-graph.herokuapp.com/graph?username=your-github-username&theme=github)](https://github.com/ashutosh00710/github-readme-activity-graph)
```

You can customise the appearance of the graph by changing the theme parameter to one of the available options, such as "github," "dark," or "rogue." Visit the GitHub Readme Activity Graph repository for more customisation options.

GitHub Repository Card Widget

The GitHub Repository Card widget allows you to showcase your repositories with a visually appealing card layout. You can display information such as the repository's name, description, stars, forks, and language. To add this widget to your profile, insert the following code into your README.md file:

```markdown
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=your-github-username&repo=your-repo-name)](https://github.com/your-github-username/your-repo-name)
```

Replace `your-github-username` with your actual username and `your-repo-name` with the name of the repository you want to display. You can customise the appearance of the card by changing the theme parameter and adding additional parameters such as `show_owner`, `show_icons`, or `hide_border`. Refer to the GitHub Readme Stats repository for detailed customisation options.

GitHub Profile Trophy Widget

The GitHub Profile Trophy widget allows you to showcase your achievements and milestones on your profile. It displays trophies for various accomplishments, such as the number of followers, repositories, stars received, and more. To add this widget to your profile, insert the following code into your README.md file:

```markdown
[![trophy](https://github-profile-trophy.vercel.app/?username=your-github-username)](https://github.com/ryo-ma/github-profile-trophy)
```

Replace `your-github-username` with your actual username. You can customise the appearance of the trophies by changing the theme parameter, adding a title, or excluding specific types of trophies. Visit the GitHub Profile Trophy repository for more customisation options.

Wrap Up!

By exploring the various advanced widgets available for your GitHub Pages, you can create a highly personalised and interactive landing page on your GitHub profile. These widgets allow you to showcase your skills, contributions, and achievements in a visually appealing and engaging manner. Experiment with different widgets, customise their appearance, and make your landing page truly reflect your coding journey. With these powerful widgets at your disposal, your GitHub profile will leave a lasting impression on visitors and potential collaborators!

Automating Compliance with Prowler and Lambda

Compliance monitoring is a critical aspect of maintaining a secure and well-managed infrastructure. Prowler, an open-source tool written in Python, allows you to automate compliance scanning and monitoring for AWS environments. In this blog post, we'll explore how to use Prowler on AWS Lambda using Terraform, eliminating the need for an EC2 instance and simplifying the compliance monitoring process.


Setting up the Infrastructure


To get started, we'll use Terraform to provision the necessary infrastructure components on AWS. Here's an example of the Terraform code required:

provider "aws" {
  region = "eu-west-2"
}
resource "aws_cloudwatch_event_rule" "prowler_schedule" {
  name                = "ProwlerSchedule"
  description         = "Scheduled rule for running Prowler"
  schedule_expression = "rate(1 day)"
}
resource "aws_cloudwatch_event_target" "prowler_target" {
  rule      = aws_cloudwatch_event_rule.prowler_schedule.name
  target_id = "ProwlerLambdaTarget"
  arn       = aws_lambda_function.prowler_lambda.arn
}
resource "aws_lambda_function" "prowler_lambda" {
  filename      = "prowler.zip"
  function_name = "ProwlerLambda"
  role          = aws_iam_role.prowler_lambda_role.arn
  handler       = "lambda_function.lambda_handler"
  runtime       = "python3.8"
  memory_size   = 128
  timeout       = 300
  source_code_hash = filebase64sha256("prowler.zip")
}
resource "aws_iam_role" "prowler_lambda_role" {
  name = "ProwlerLambdaRole"
  assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF
}
resource "aws_iam_role_policy_attachment" "prowler_lambda_policy_attachment" {
  role       = aws_iam_role.prowler_lambda_role.name
  policy_arn = "arn:aws:iam::aws:policy/AmazonEC2ReadOnlyAccess"
}

Let's break down the Terraform code:

We define the AWS provider and specify the region.

The aws_cloudwatch_event_rule resource creates a CloudWatch Event Rule to schedule the Prowler scan. In this example, we set it to run once a day, but you can customise the schedule as needed.

The aws_cloudwatch_event_target resource associates the CloudWatch Event Rule with the Lambda function. This ensures that the Lambda function is triggered based on the defined schedule.

The aws_lambda_function resource creates the Lambda function. We provide the Python code for the Lambda function, as well as the necessary configuration such as the runtime, memory size, and timeout.

The aws_iam_role resource creates an IAM role for the Lambda function, allowing it to access the required AWS services.

Finally, the aws_iam_role_policy_attachment resource attaches the necessary IAM policies to the IAM role. In this example, we attach the AmazonEC2ReadOnlyAccess managed policy so the function can read EC2 configuration; for a full Prowler scan you would typically attach broader read-only policies such as SecurityAudit or ViewOnlyAccess.

Once you have the Terraform code ready, run terraform init, terraform plan, and terraform apply to provision the infrastructure on AWS. Make sure you have the AWS CLI configured with the appropriate credentials.

Writing the Lambda Function

The next step is to write the Python code for the Lambda function. Here's an illustrative snippet; it assumes Prowler is packaged with the function as an importable library (referred to here as prowlerlib) – in practice you may instead need to invoke the Prowler CLI from the handler:

import prowlerlib


def lambda_handler(event, context):
    # Define the Prowler configuration
    config = prowlerlib.Config(config_file='my_config_file.yml', no_color=True)
    try:
        # Run Prowler with the specified configuration
        prowlerlib.run_prowler(config)
    except Exception as e:
        # Handle any errors that occur during the execution of Prowler
        print(f"Error executing Prowler: {e}")

Let's break down the Python code:

The lambda_handler function is the entry point for the Lambda function. It receives an event and context object as parameters.

Inside the handler, we build a Prowler configuration, pointing it at a configuration file bundled with the deployment package and disabling coloured output, which isn't useful in CloudWatch Logs.

We then run the scan with that configuration. Any exception raised while Prowler is running is caught and logged, so the invocation fails gracefully instead of crashing.

You can customise this Python code to include additional functionality, such as sending scan results to an S3 bucket or notifying stakeholders via SNS.

Deploying and Running Prowler on Lambda

To deploy and run Prowler on AWS Lambda, follow these steps:

Package the Python code and any dependencies into a zip file. For example, create a file called prowler.zip and include the lambda_function.py file and any additional Python modules required by Prowler.
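As a rough sketch of that packaging step (the exact dependencies will depend on how you invoke Prowler), it might look like this:

# Install the function's Python dependencies alongside the handler, then zip everything up
pip install -r requirements.txt -t package/
cp lambda_function.py package/
cd package && zip -r ../prowler.zip . && cd ..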

Run terraform init, terraform plan, and terraform apply to provision the infrastructure and deploy the Lambda function. Make sure to have the AWS CLI configured with appropriate credentials.

Once the infrastructure is provisioned and the Lambda function is deployed, the Prowler scan will be triggered based on the schedule defined in the CloudWatch Event Rule.

Monitor the CloudWatch Logs for the Lambda function to view the scan results and any logs generated by Prowler.

Remember to adjust the region, schedule, and other configurations as needed to match your requirements.

Note: This is a simplified example to demonstrate the concept. In production environments, consider adding error handling, logging, and security measures as necessary.

Wrap Up!

Automating compliance monitoring with Prowler on AWS Lambda can greatly simplify the process and help ensure the security and compliance of your AWS environment. By leveraging Terraform and Python, you can easily set up the infrastructure and deploy the Lambda function. This enables you to run Prowler scans on a regular schedule without the need for an EC2 instance.

You should experiment with different schedules, customise the Python code, and extend the solution to meet your specific compliance monitoring needs.

Happy compliance monitoring!

Building a Strong Partnership for Better Security

In today's organisational landscape, a significant challenge lies in bridging the gap between development and security teams. Operating in silos, these teams often face communication breakdowns and a lack of understanding of each other's needs and priorities. However, to construct secure software, it is crucial that development and security teams collaborate and integrate their efforts. Enter DevSecOps, a philosophy that acknowledges the significance of cooperation between these two teams, aiming to infuse security into the software development process from start to finish.

Facilitating Collaboration

DevSecOps offers several avenues to foster collaboration between development and security teams.

Integration of security into the development process: By integrating security into the software development process, DevSecOps breaks down the silos that hinder collaboration. This integration enables both teams to work closely and effectively, fostering a better understanding of each other's needs and priorities.

Automation of security testing: DevSecOps automates security testing, making it an integral part of the software development process. This proactive approach ensures that security is considered from the inception of development, rather than an afterthought. By automating security testing, DevSecOps reduces the time and effort required to address security issues, allowing more time for collaboration between development and security teams.

Shared security metrics: DevSecOps provides a shared set of security metrics, creating a common understanding of what constitutes good security. With this shared understanding, development and security teams can work together to achieve common security goals.




Continuous security feedback: DevSecOps offers continuous security feedback, empowering development and security teams to identify and address security issues in real-time. By keeping security at the forefront, this approach ensures that both teams are continuously improving the security of software applications.

Wrap Up!

Collaboration between development and security teams is paramount for building secure software. DevSecOps serves as a framework for this collaboration, seamlessly integrating security into the software development process. Whether you are starting fresh with DevSecOps or looking to enhance existing collaboration, embracing it in your security strategy is a strategic move that leads to more secure software and sets your DevSecOps initiatives up for success.

GitHub Landing Pages

Your GitHub profile is not only a place to showcase your projects and contributions but also an
opportunity to leave a lasting impression on visitors. By creating a landing page with advanced widgets, you can make your profile stand out and demonstrate your skills in a visually appealing way. In this blog post, we'll guide you through the process of creating a landing page on your GitHub profile using a repository with the same name as your GitHub ID. We'll also explore some advanced widgets that you can add to enhance your profile.

Sign in to GitHub

If you haven't already, sign in to your GitHub account. If you don't have an account, you can create one for free.

Create a New Repository

Click on the "+" icon in the top-right corner of the GitHub homepage and select "New repository" from the dropdown menu. Enter your GitHub ID as the repository name.

Customise Your Repository

Add a description to your repository to give visitors an idea of what your landing page is about. Be creative and let your personality shine through!

Create a README.md File

To create your landing page, you'll need a README.md file. Check the box next to "Initialize this repository with a README" to create the file automatically.

Design Your Landing Page

Open the README.md file by clicking on it within your repository. You can use Markdown syntax to format the text and add visual elements. Let's explore some advanced widgets you can add to your landing page.

GitHub Stats Widget

The GitHub Readme Stats project by Anurag Hazra allows you to display your GitHub stats, such as top languages used, on your profile. Add the following code to your README.md file, replacing `your-github-username` with your actual username:

```markdown
[![Top Langs](https://github-readme-stats.vercel.app/api/top-langs/?username=your-github-username)](https://github.com/anuraghazra/github-readme-stats)
```

You can customise this widget by changing the colour, limiting the number of languages shown, and more. Refer to the GitHub Readme Stats repository for detailed customisation options.

GitHub Activity Widget

The GitHub Activity widget allows you to display your recent GitHub activity on your landing page. Add the following code to your README.md file:

```markdown
<!--START_SECTION:activity-->
<!--END_SECTION:activity-->
```

This widget will automatically populate with your recent activity, such as commits, pull requests, and issues, once you pair it with a scheduled GitHub Action that rewrites the section between the markers.
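As a sketch, a workflow like the following (using the community github-activity-readme action, which looks for exactly those markers) can be dropped into `.github/workflows/` to refresh the section every hour; treat the schedule and action version as assumptions to adapt:

```yaml
name: Update README activity
on:
  schedule:
    - cron: '0 * * * *'   # hourly; adjust to taste
  workflow_dispatch:
jobs:
  update-readme:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: jamesgeorge007/github-activity-readme@master
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```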

GitHub Gists Widget

If you frequently use GitHub Gists to share code snippets, you can showcase them on your landing page. Add the following code to your README.md file:

```markdown
<!--START_SECTION:gists-->
<!--END_SECTION:gists-->
```

This widget will display your latest Gists, making it easy for visitors to explore your code snippets. Like the activity widget, it relies on a scheduled GitHub Action (or similar automation) to populate the section between the markers.

Feel free to explore other widgets and tools available to enhance your landing page further. You can add project cards, social media badges, or even embed interactive visualisations.

Commit and Publish

Once you've designed your landing page and added the desired widgets, it's time to commit your changes. Scroll down to the bottom of the README.md file, enter a commit message, and click on the "Commit changes" button.

Enable GitHub Pages

To make your landing page accessible to the world, enable GitHub Pages. Go to the main page of your repository, click on the "Settings" tab, and scroll down to the "GitHub Pages" section. Select the branch you want to use for your landing page (usually the main or master branch) and click on the "Save" button.

Visit Your Landing Page

Congratulations! You've successfully created an impressive landing page on your GitHub profile. To view your landing page, go back to the "GitHub Pages" section in your repository's settings. You'll find a link to your newly created landing page. Click on it and admire your beautifully designed profile!


Wrap Up

By creating a landing page on your GitHub profile and adding advanced widgets, you can showcase your skills and projects in an engaging and visually appealing way. The combination of a well-designed landing page and advanced widgets will leave a lasting impression on visitors and demonstrate your expertise as a developer. So go ahead, follow the steps outlined in this blog post, and create an impressive landing page that reflects your coding journey.

Happy coding!

Harnessing the Power of AI to Improve Security

In recent years, machine learning and AI have emerged as game-changers in the world of software
development, and DevSecOps is no exception. From detecting security threats to automating routine tasks, machine learning and AI have the potential to significantly improve the security of DevOps processes.


I noted down a few ways that would be interesting to explore with this new technology:

Threat detection: Machine learning algorithms can be trained to detect and alert on potential security threats, such as malicious code or unauthorised access attempts. By automating threat detection, machine learning can help organisations to respond to security incidents more quickly and effectively, reducing the risk of security breaches.


Vulnerability scanning: Machine learning algorithms can be used to automate the process of vulnerability scanning, helping organisations to identify and address security vulnerabilities more quickly and accurately.

By automating vulnerability scanning, machine learning can reduce the risk of security breaches and ensure that software development pipelines are secure.


Automated remediation: In addition to detecting security threats and vulnerabilities, machine learning algorithms can be used to automate the process of remediation. For example, machine learning algorithms can be used to automatically patch vulnerabilities or deploy security updates, reducing the time and effort required to address security incidents.


Continuous security testing: Machine learning algorithms can be used to continuously test software applications and infrastructure, identifying and addressing security vulnerabilities in real-time. By automating security testing, machine learning can help organisations to catch security risks early in the development process, reducing the risk of security breaches.


Compliance management: Finally, machine learning algorithms can be used to manage compliance with various security and privacy regulations. For example, machine learning algorithms can be used to automate the process of generating compliance reports, ensuring that organisations are meeting their compliance obligations in a timely and accurate manner.



Wrap Up!

Machine learning and AI have the potential to significantly improve the security of DevSecOps processes. By automating routine tasks, detecting security threats, and managing compliance, machine learning and AI can help organisations to build more secure software, reduce the risk of security breaches, and ensure compliance with various security and privacy regulations. Whether you're just getting started with DevSecOps or are looking to improve your existing security posture, incorporating machine learning and AI into your security strategy is a smart move that will pay dividends for years to come.

In future posts we can explore some of the code behind these tools for our ecosystem.





A Guide to Building a Comprehensive Security Strategy

In this blog post, we'll explore three core application security testing approaches – SAST, DAST, and IAST – their benefits, and how to integrate them into the DevSecOps process.

Static Application Security Testing (SAST)

Static application security testing (SAST) is a type of security testing that analyses the source code of software systems for vulnerabilities and security issues. SAST tools, such as Veracode and SonarQube, can help organisations to identify security issues early in the development process, so that they can be remediated before the software is deployed.




Dynamic Application Security Testing (DAST)

Dynamic application security testing (DAST) is a type of security testing that tests the behaviour of software systems while they are running. DAST tools, such as OWASP ZAP and Burp Suite, can help organisations to identify security issues that may not be apparent in the source code, such as cross-site scripting (XSS) and SQL injection attacks.


Interactive Application Security Testing (IAST)

Interactive application security testing (IAST) is a type of security testing that combines the benefits of SAST and DAST. IAST tools, such as Contrast Security and Synopsys Seeker, can help organisations to identify both source code and runtime vulnerabilities in real-time, so that they can be remediated quickly and effectively.


How to Integrate SAST, DAST, and IAST in DevSecOps

To effectively integrate SAST, DAST, and IAST into the DevSecOps process, organisations need to follow these steps:

Assess your security needs: Start by evaluating your organisation's security needs and determine which security testing tools and techniques will be most effective for your particular environment.

Choose the right tools: Select the SAST, DAST, and IAST tools that are best suited to your organisation's needs, taking into account factors such as ease of use, cost, and scalability.

Automate testing: Automate as much of the security testing process as possible, so that it can be performed quickly and efficiently. This can help to reduce the time it takes to identify and remediate security issues.

Incorporate security testing into the development process: Make security testing a key component of the DevSecOps process, so that security issues are identified and remediated early in the development process.

Regularly review and update: Regularly review and update your security testing processes to ensure that they are up-to-date and that your organisation is effectively identifying and mitigating security risks.


Wrap Up!

Integrating SAST, DAST, and IAST into the DevSecOps process is critical to ensuring the security and stability of software systems. By following these steps and regularly reviewing and updating your security testing processes, organisations can build a comprehensive security strategy that protects against threats and vulnerabilities. In future posts we can explore how to inject these concepts into your build pipeline in greater detail to ensure every build is tested and locked down.

Managing Security in a Microservices Architecture World

Microservices architecture has become an increasingly popular method for building and deploying
software applications, offering a range of benefits such as increased agility, scalability, and efficiency. However, with the increased complexity of microservices architecture comes new security challenges that organisations must address.

In this post, we'll explore some of the key security challenges of microservices architecture and how DevSecOps can help organisations to effectively manage security in this environment.


Security Challenges of Microservices Architecture

Increased complexity: One of the biggest challenges of microservices architecture is the increased complexity of the system. With multiple, interconnected microservices, it can be difficult to understand how security vulnerabilities in one service can impact the security of the entire system.

Configuration management: Another challenge of microservices architecture is ensuring that each service is configured securely. With many services to manage, organisations must have a robust and consistent configuration management process in place to ensure that services are configured securely and consistently.

Lack of visibility: Microservices architecture can also make it difficult to see the big picture and understand the security posture of the entire system. With many services and components, it can be difficult to get a comprehensive view of the system's security status.

Coordination between teams: A further challenge of microservices architecture is coordinating security efforts between teams. With multiple teams working on different services, it can be difficult to ensure that security practices are consistent across the entire system.


Addressing the Security Challenges of Microservices Architecture

Automated security testing: Automated security testing is a critical component of DevSecOps, and it can help organisations to identify security vulnerabilities in a microservices architecture more quickly and efficiently. Automated security testing tools can be used to test the security of individual services and the entire system, providing organisations with a comprehensive view of their security posture.

Security-focused collaboration: DevSecOps encourages collaboration between development and security teams, which can help organisations to better coordinate their security efforts and ensure that security practices are consistent across the entire system

Continuous integration and deployment: Continuous integration and deployment (CI/CD) is a key principle of DevSecOps, and it can help organisations to ensure that security practices are integrated into the development process and that security issues are addressed quickly and efficiently.

Centralised security management: DevSecOps can also help organisations to centralise their security management processes, making it easier to manage security in a microservices architecture and ensuring that security practices are consistent across the entire system.




Wrap Up!

The security challenges of microservices architecture can be daunting, but DevSecOps can help organisations to effectively manage security in this environment. By automating security testing, fostering collaboration between development and security teams, integrating security into the development process, and centralising security management processes, organisations can build a secure and stable microservices architecture that protects against threats and vulnerabilities.

Pythonizing our Web App!

Ok, you asked for it! You all wanted an update for the lambda functions to use another language and
luckily, my favourite one! Python!

So let's get started!

Update the Lambda function to fetch and store data

The first step is to update the Lambda function that fetches data from an external website and stores it in a serverless database. Here is an example of the same function written in Python:

import json

import boto3
import requests

dynamo = boto3.client('dynamodb')
def lambda_handler(event, context):
    # Fetch data from external website
    response = requests.get('https://example.com/data')
    data = response.json()
    
    # Store data in DynamoDB table
    item = {
        'id': {'S': data['id']},
        'data': {'S': data['data']}
    }
    dynamo.put_item(TableName='your-table-name', Item=item)
    return {
        'statusCode': 200,
        'body': json.dumps({'message': 'Data stored in DynamoDB'})
    }

This example uses the boto3 package to interact with DynamoDB and the requests package to fetch data from an external website, similar to the previous example. Note that the function uses the requests library's response.json() to parse the fetched data and Python's json module to serialise the response, instead of the JSON.parse()/JSON.stringify() functions in Node.js.

Update the Lambda function to retrieve data

The next step is to update the Lambda function that retrieves data from the serverless database and makes it accessible through the API Gateway. Here is an example of the same function written in Python:


import boto3
import json
dynamo = boto3.client('dynamodb')
def lambda_handler(event, context):
    # Retrieve data from DynamoDB table
    data = dynamo.scan(TableName='your-table-name')
    # Return data to user
    return {
        'statusCode': 200,
        'body': json.dumps(data)
    }

This function uses the boto3 package to interact with DynamoDB and returns the data to the user as a JSON string using json.dumps()
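Note that scan returns items in DynamoDB's typed attribute-value format ({'S': ...}). If you would rather return plain JSON to the caller, boto3 ships a deserializer you can run over each item; a small sketch:

from boto3.dynamodb.types import TypeDeserializer

deserializer = TypeDeserializer()

def to_plain(item):
    # Convert {'id': {'S': 'abc'}} into {'id': 'abc'}
    return {key: deserializer.deserialize(value) for key, value in item.items()}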

Re-Creating a Lambda Authorizer

A Lambda authorizer is a Lambda function that you build to authorize access to your API Gateway APIs using bearer token authentication, such as OAuth or JWT.

Here's an example of a simple Lambda authorizer function that can be used to authenticate a user using a bearer token. It sketches how such a function might use an Okta library to validate the token and return the appropriate policy allowing or denying access to the API Gateway endpoint; the exact module and class names will depend on the Okta SDK version you use.

import json
from okta.framework import OktaAuth
def lambda_handler(event, context):
    auth = OktaAuth(event)
    claims = auth.verify_token()
    if claims:
        auth_response = auth.get_auth_response(claims)
    else:
        auth_response = auth.get_auth_response(None, 'Unauthorized')
    return auth_response

This example uses the okta library to validate the token passed in the Authorization header and return the appropriate policy allowing or denying access to the API Gateway endpoint.

Updating Terraform Configuration

To make the Terraform configuration use the Python function, the runtime attribute of the Lambda function resource needs to be updated from nodejs14.x to python3.8. Additionally, to wire the newly created Lambda authorizer into the API Gateway, we update the authorization attribute of the aws_api_gateway_method resource to CUSTOM and point its authorizer_id at an aws_api_gateway_authorizer resource that references our authorizer function.

resource "aws_api_gateway_method" "example" {
  rest_api_id = aws_api_gateway_rest_api.api_name.id
  resource_id = aws_api_gateway_resource.example.id
  http_method = "GET"
  authorization = "CUSTOM"
  authorizer_id = aws_lambda_authorizer.authorizer_lambda.id

  request_parameters = {
    "method.request.header.Authorization": true
  }
  integration {
    type = "AWS_PROXY"
    uri = aws_lambda_function.data_lambda.invoke_arn
    http_method = "POST"
    request_templates = {
      "application/json" = "{\"statusCode\": 200}"
    }
  }
}

Even Re-Create a Hello World Lambda Function

To create a simple "Hello World" Lambda function, you can use the following code:

import json


def lambda_handler(event, context):
    message = 'Hello World!'
    return {
        'statusCode': 200,
        'body': json.dumps(message)
    }

This function returns a JSON object containing the message "Hello World!"

Verify the Deployment

To verify that everything is set up correctly, you can test the API Gateway endpoint using a tool like Postman and check the logs of all Lambda functions and the DynamoDB table to ensure that the data is being fetched and stored correctly, accessible when the user is logged in, and that the authorizer is properly handling token validation.

Additionally, you can test the "Hello World" lambda function by invoking it from the AWS Lambda console or by making a request to it via an API Gateway endpoint.
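For example, invoking it straight from the command line (assuming the function was deployed under the name example) might look like this:

aws lambda invoke --function-name example response.json
cat response.json   # should contain the statusCode and the "Hello World!" body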

Wrap Up!

There you have it! You wanted it, you got it. By updating these Lambda functions and the Terraform configuration to use Python instead of Node.js, you keep the same functionality as before, just in a different programming language.

Despite the change of language and dependencies, the function code remains ready for deployment with Terraform to create the same product, demonstrating how modular we can make these websites!

m04r Data for the S3 App!

Now in our journey we need to add more tasks to build out the front end. In this blog post, we will build two Lambda functions and a serverless database to fetch and store data from an external website and make it accessible to a logged-in user through an API Gateway. By doing this, we now have more data available to our application, kept out of the user's direct reach and safely behind our API, which will deal with any connection or credential issues.

Create a Lambda function to fetch and store data

The first step is to create a Lambda function that periodically fetches data from an external website and stores it in a serverless database. One way to achieve this is to set up a CloudWatch event to trigger the Lambda function on a regular schedule.
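If you are following along with Terraform, a minimal sketch of that schedule (with hypothetical resource names) looks something like this:

resource "aws_cloudwatch_event_rule" "fetch_schedule" {
  name                = "fetch-data-schedule"
  schedule_expression = "rate(1 hour)"
}

resource "aws_cloudwatch_event_target" "fetch_target" {
  rule = aws_cloudwatch_event_rule.fetch_schedule.name
  arn  = aws_lambda_function.fetch_lambda.arn
}

resource "aws_lambda_permission" "allow_events" {
  statement_id  = "AllowExecutionFromCloudWatch"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.fetch_lambda.function_name
  principal     = "events.amazonaws.com"
  source_arn    = aws_cloudwatch_event_rule.fetch_schedule.arn
}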

Here's an example of a Lambda function written in Node.js that fetches data from an external website using the request package and stores it in a DynamoDB table:

const AWS = require('aws-sdk');
const rp = require('request-promise-native');

const dynamo = new AWS.DynamoDB();

exports.handler = async (event) => {
    // Fetch data from the external website (request-promise resolves with the response body)
    const body = await rp.get('https://example.com/data');
    const data = JSON.parse(body);
    // Store data in DynamoDB table
    const params = {
        Item: {
            "id": { S: data.id },
            "data": { S: data.data }
        },
        TableName: "your-table-name"
    };
    await dynamo.putItem(params).promise();
    return {
        statusCode: 200,
        body: JSON.stringify({
            message: 'Data stored in DynamoDB',
        }),
    };
};

This function uses the request-promise-native package (a promise-friendly flavour of request) to fetch data from an external website, parses it as JSON, and stores it in a DynamoDB table using the AWS.DynamoDB client. The name of the table and the structure of the data being stored will depend on the external website you are fetching data from.

Create a Lambda function to retrieve data

The next step is to create a Lambda function that retrieves data from the serverless database and makes it accessible through the API Gateway. This function should be triggered when a user makes a request to a specific endpoint, such as /data.

Here's an example of a Lambda function written in Node.js that retrieves data from a DynamoDB table and returns it to the user:

const AWS = require('aws-sdk');
const dynamo = new AWS.DynamoDB();
exports.handler = async (event) => {
    // Retrieve data from DynamoDB table
    const params = {
        TableName: "your-table-name"
    };
    const data = await dynamo.scan(params).promise();
    // Return data to user
    return {
        statusCode: 200,
        body: JSON.stringify(data),
    };
};

This function retrieves data from the DynamoDB table by scanning the table and returns it to the user as a JSON string.

Create a serverless database

We will use DynamoDB as the Serverless database and use terraform to create the DynamoDB table for storing the data.

resource "aws_dynamodb_table" "example" {
  name = "your-table-name"
  billing_mode = "PAY_PER_REQUEST"
  hash_key  = "id"
  attribute {
    name = "id"
    type = "S"
  }
  attribute {
    name = "data"
    type = "S"
  }
}

Integrate with API Gateway

We will need to integrate the Lambda function created in step 2 with the API Gateway so that it is triggered when a user makes a request to a specific endpoint. This can be done by creating a new resource and method in the API Gateway and linking it to the Lambda function.

Here's an example of how to create a new resource and method in the API Gateway using Terraform:

resource "aws_api_gateway_resource" "example" {
  rest_api_id = aws_api_gateway_rest_api.api_name.id
  parent_id = aws_api_gateway_rest_api.api_name.root_resource_id
  path_part = "data"
}
resource "aws_api_gateway_method" "example" {
  rest_api_id = aws_api_gateway_rest_api.api_name.id
  resource_id = aws_api_gateway_resource.example.id
  http_method = "GET"
  authorization = "NONE"
  request_parameters = {
    "method.request.header.Authorization": true
  }
  integration {
    type = "AWS_PROXY"
    uri = aws_lambda_function.data_lambda.invoke_arn
    http_method = "POST"
    request_templates = {
      "application/json" = "{\"statusCode\": 200}"
    }
  }
}

This Terraform code creates a new resource, method, and integration in the API Gateway, with the path /data, linked to the Lambda function created in step 2. The Lambda function will be triggered when a user makes a GET request to the /data endpoint and will return the data stored in the DynamoDB table.
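One detail the snippet above glosses over: the API only becomes callable once it has been deployed to a stage. A minimal sketch of that last step (resource names are assumptions) would be:

resource "aws_api_gateway_deployment" "example" {
  rest_api_id = aws_api_gateway_rest_api.api_name.id
  depends_on  = [aws_api_gateway_integration.example]
}

resource "aws_api_gateway_stage" "example" {
  rest_api_id   = aws_api_gateway_rest_api.api_name.id
  deployment_id = aws_api_gateway_deployment.example.id
  stage_name    = "prod"
}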

Verify the Deployment

To verify that everything is set up correctly, you can test the API Gateway endpoint using a tool like Postman and check the logs of both Lambda functions and the DynamoDB table to ensure that the data is being fetched and stored correctly and accessible when the user is logged in.

Wrap Up!

By adding these two Lambda functions and a serverless database to our setup, you can fetch and store data from an external website or another service and make it accessible to our user through an API Gateway. Using what we learned in other posts, we could add the logic to make sure our users are logged in with an identity as well!

I got ID! Adding Authorisation to our App!

Why not show off a bit of my fondness towards Pearl Jam?

In the previous chapters of this series we have been building up our serverless application piece by piece.

That's right! Today we will add an additional layer of security to our Angular application by integrating an AWS Lambda Authorizer as an authentication and authorization method, using either AWS Cognito or Okta as a third-party provider.

Create a Lambda Authorizer function

The first step in creating a Lambda Authorizer is to create the Lambda function itself. The function will handle the authentication and authorization process by validating the incoming token and determining if the user is authorized to access the resources.

Here's an example of a simple Lambda Authorizer function written in Node.js that uses AWS Cognito as the third-party provider:

const AWS = require('aws-sdk');
const cognito = new AWS.CognitoIdentityServiceProvider();
exports.handler = async (event) => {
    // Extract the token from the request
    const token = event.authorizationToken;
    
    // Verify the token using the Cognito provider
    const params = {
        AccessToken: token
    };
    const data = await cognito.getUser(params).promise();
    
    // Determine if the user is authorized
    const user = data.UserAttributes;
    const isAuthorized = user.some(attr => attr.Name === 'custom:groups' && attr.Value === 'admin');
    
    // Return the policy statement
    if (isAuthorized) {
        return {
            principalId: user.find(attr => attr.Name === 'sub').Value,
            policyDocument: {
                Version: '2012-10-17',
                Statement: [
                    {
                        Action: 'execute-api:Invoke',
                        Effect: 'Allow',
                        Resource: event.methodArn
                    }
                ]
            }
        };
    } else {
        return {
            principalId: 'user',
            policyDocument: {
                Version: '2012-10-17',
                Statement: [
                    {
                        Action: 'execute-api:Invoke',
                        Effect: 'Deny',
                        Resource: event.methodArn
                    }
                ]
            }
        };
    }
};

If you are using Okta as the provider, you should use Okta's SDK and API to verify the token and determine if the user is authorized.

const OktaJwtVerifier = require('@okta/jwt-verifier');
const oktaJwtVerifier = new OktaJwtVerifier({
  issuer: 'https://{yourOktaDomain}/oauth2/default',
  clientId: '{clientId}'
});
exports.handler = async (event) => {
    // Extract the token from the request
    const token = event.authorizationToken;
    
    // Verify the token using the Okta provider
    try {
        const jwt = await oktaJwtVerifier.verifyAccessToken(token);
        // Verify that the user has the required scope
        if (jwt.claims.scp.split(" ").includes("admin")) {
            return {
                principalId: jwt.claims.sub,
                policyDocument: {
                    Version: '2012-10-17',
                    Statement: [
                        {
                            Action: 'execute-api:Invoke',
                            Effect: 'Allow',
                            Resource: event.methodArn
                        }
                    ]
                }
            };
        }
    } catch (err) {
        console.log(err);
    }
    // Return a policy statement denying access
    return {
        principalId: 'user',
        policyDocument: {
            Version: '2012-10-17',
            Statement: [
                {
                    Action: 'execute-api:Invoke',
                    Effect: 'Deny',
                    Resource: event.methodArn
                }
            ]
        }
    };
};

This example uses the @okta/jwt-verifier package to verify the access token and check for the required scope (in this case "admin").

Create the Lambda Authorizer on API Gateway

Once the Lambda Authorizer function is created, it needs to be added as an Authorizer on the API Gateway. This can be done using the AWS Management Console or using Terraform. 

resource "aws_api_gateway_authorizer" "authorizer_name" {
  name = "your-authorizer-name"
  rest_api_id = aws_api_gateway_rest_api.api_name.id
  type = "TOKEN"
  identity_source = "method.request.header.Authorization"
  authorizer_uri = "arn:aws:apigateway:us-east-1:lambda:path/2015-03-31/functions/${aws_lambda_function.authorizer_name.arn}/invocations"
}

The above code creates an Authorizer on the API Gateway and associates it with the Lambda function that we created at the start.

Update the Angular application

Once the Lambda Authorizer is set up, the Angular application needs to be updated to send the token in the Authorization header with each request. This can be done by adding an interceptor that adds the token to the headers before sending the request.

Here's an example of an Angular interceptor that adds the token to the headers:

import { Injectable } from '@angular/core';
import { HttpInterceptor, HttpRequest, HttpHandler, HttpEvent } from '@angular/common/http';
import { TokenService } from './token.service';
import { Observable } from 'rxjs';
@Injectable()
export class TokenInterceptor implements HttpInterceptor {
    constructor(private tokenService: TokenService) {}
    intercept(request: HttpRequest<any>, next: HttpHandler): Observable<HttpEvent<any>> {
        request = request.clone({
            setHeaders: {
                Authorization: `Bearer ${this.tokenService.getToken()}`
            }
        });
        return next.handle(request);
    }
}

This example assumes that you have a TokenService that retrieves the token from storage.
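If you don't have one yet, a minimal, purely illustrative TokenService could be as simple as reading the token out of local storage:

import { Injectable } from '@angular/core';

@Injectable({ providedIn: 'root' })
export class TokenService {
    // Where the token actually lives is up to you; local storage is used here for illustration only
    getToken(): string {
        return localStorage.getItem('access_token') ?? '';
    }
}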

You will also need to register this interceptor in the app.module.ts file.

import { HTTP_INTERCEPTORS } from '@angular/common/http';
import { TokenInterceptor } from './token.interceptor';

@NgModule({
  ...
  providers: [
    {
      provide: HTTP_INTERCEPTORS,
      useClass: TokenInterceptor,
      multi: true
    }
  ],
  ...
})
export class AppModule { }

This registers the interceptor in the AppModule, ensuring that the token is added to the headers of all HTTP requests made by the application.

Deploy the Terraform Configuration

After updating the Angular application, the Terraform configuration can be deployed to create and configure the necessary resources on AWS.
$ terraform init
$ terraform apply

Verify the Deployment

Now we can make sure that the Lambda Authorizer has been set up correctly: test the API Gateway endpoint using a tool like Postman and check the logs of the Lambda function to ensure that it is being invoked correctly.

Wrap Up!

By now you should be starting to see how we are building up this application and how you can start to implement your own logic into this magical paradigm. By using a Lambda Authorizer and integrating with a third-party provider like AWS Cognito, Okta, or many others, you can add an additional layer of security to your Angular application.

Playing with AWS SAM

Let's take a break and play with something fun and new! In this blog post, we will go over how to use
AWS Serverless Application Model (SAM) templates to build and deploy a serverless application that includes multiple Lambda functions, an Amazon DynamoDB table, and an API Gateway. Hilariously, SAM has a squirrel for its mascot... because the inventor likes squirrels... Ok!

Creating your SAM templates

Of course it all starts with the template!

History Lesson!

AWS SAM is an extension of AWS CloudFormation and uses the same template format. To create a SAM template, you need to define the AWS resources needed for your serverless application, including Lambda functions, DynamoDB tables, and API Gateway.

Example Template/Code

Here's an example SAM template that defines a Lambda function to fetch and store data, another Lambda function to retrieve data and make it accessible through an API Gateway, a DynamoDB table, and an API Gateway endpoint:

AWSTemplateFormatVersion: '2010-09-09'

Transform: AWS::Serverless-2016-10-31
Resources:
  DataFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: data_function
      Handler: data_function.lambda_handler
      Runtime: python3.8
      Events:
        Timer:
          Type: Schedule
          Properties:
            Schedule: rate(1 hour)
  API:
    Type: AWS::Serverless::Api
    Properties:
      StageName: Prod
      
  RetrieveFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: retrieve_function
      Handler: retrieve_function.lambda_handler
      Runtime: python3.8
      Events:
        GetResource:
          Type: Api 
          Properties:
            Path: /items
            Method: get
            RestApiId: !Ref API
  
  DataTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: myTable
      AttributeDefinitions:
        - AttributeName: id
          AttributeType: S
      KeySchema:
        - AttributeName: id
          KeyType: HASH
      ProvisionedThroughput:
        ReadCapacityUnits: 1
        WriteCapacityUnits: 1

Package and deploy your application

Once you have created your SAM template, you can use the AWS SAM CLI to package and deploy your application to the AWS Cloud.

sam package --template-file template.yaml --output-template-file packaged.yaml --s3-bucket my-bucket
sam deploy --template-file packaged.yaml --stack-name my-stack --capabilities CAPABILITY_IAM

The sam package command creates a new template file (packaged.yaml) that includes the location of your function code artefacts in an S3 bucket.

The sam deploy command deploys your application using the CloudFormation stack defined in the template. This process provisions all the necessary resources defined in the template and sets up the required permissions and triggers.

Test your application

You can use the AWS Management Console, the AWS CLI, or the AWS SDKs to test your application, for example by invoking the Lambda functions, making GET requests to the API Gateway endpoint, or querying the DynamoDB table.

Another way we can sneakily test our lambda functions is by using the in-built test feature for AWS SAM.

sam local start-api

This runs the Lambda functions locally inside Docker containers and lets us interact with them through the SAM CLI, as well as via the local port the API is hosted on.
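For example, with the template above you could hit the retrieve function locally (sam local start-api listens on port 3000 by default), or invoke a single function directly:

curl http://127.0.0.1:3000/items
sam local invoke RetrieveFunction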

Adding an Authorizer

In order to use the Lambda authorizer created in the previous blog post we will have to update the SAM template accordingly. Specifically, we'll need to create another AWS::Serverless::Function resource for the authorizer and register it with an AWS::Serverless::Api resource via its Auth property. The following example shows how to create a Lambda authorizer function and link it to an API Gateway using a SAM template.

Resources:
  Authorizer:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: authorizer_function
      Handler: authorizer_function.lambda_handler
      Runtime: python3.8

  AuthorizerAPI:
    Type: AWS::Serverless::Api
    Properties:
      StageName: Prod
      Auth:
        DefaultAuthorizer: LambdaTokenAuthorizer
        Authorizers:
          LambdaTokenAuthorizer:
            FunctionArn: !GetAtt Authorizer.Arn

  RetrieveFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: retrieve_function
      Handler: retrieve_function.lambda_handler
      Runtime: python3.8
      Events:
        GetResource:
          Type: Api
          Properties:
            Path: /items
            Method: get
            RestApiId: !Ref AuthorizerAPI
            Auth:
              Authorizer: LambdaTokenAuthorizer

This example creates a new Lambda function resource for the authorizer and another for the RetrieveFunction. Additionally, it creates a new API resource whose Auth section registers the authorizer function as a token authorizer, and the RetrieveFunction's API event references both the API and that authorizer.

Deploy your application with the authorizer function

Once you have updated your SAM template, you can use the AWS SAM CLI to package and deploy your application, including the authorizer function, to the AWS Cloud in the same way as before.

sam package --template-file template.yaml --output-template-file packaged.yaml --s3-bucket my-bucket
sam deploy --template-file packaged.yaml --stack-name my-stack --capabilities CAPABILITY_IAM

Verify the Deployment

To verify that everything is set up correctly, you can test the API Gateway endpoint using a tool like Postman or curl and check the logs of all Lambda functions and the DynamoDB table to ensure that the data is being fetched and stored correctly and that the authorizer is properly handling token validation.

Wrap Up!

That was fun! Using AWS SAM templates you can define your serverless application resources and their configuration in a simple YAML or JSON file, and use the SAM CLI commands to package, deploy, run, and test the application, which can be more convenient than using Terraform. Additionally, you can include an authorizer function, created as described in the previous blog post, in the stack, with a simple addition of the necessary resources in the template and referencing them accordingly in the rest of the resources.