Airflow on AWS: Set up a production-ready architecture

How do you set up Apache Airflow on AWS EKS? That’s one of the top questions I receive from my students. If you started with Airflow fairly recently, there is a good chance that you are still running it locally or on an EC2 instance with the Local Executor. That’s fine at first, but you will quickly reach the limits of that configuration: it doesn’t scale well and is definitely not recommended in production. Another setup you might have is Airflow running with the Celery Executor, but you would like to move to the Kubernetes Executor as it brings many advantages over Celery (cf: Kubernetes Executor). Well, you are in the right place. I’ve created a 9-hour course in which you are going to learn how to set up a production-ready architecture for Apache Airflow on AWS EKS.

Airflow on AWS EKS

Let me tell you, setting up a production-ready architecture on AWS EKS is hard. Supporting and monitoring that architecture is even harder. Indeed, you have to:

  • Understand DevOps principles
  • Learn what Kubernetes is and how it works
  • Discover and experiment with many different AWS services such as ECR, CodePipeline, CodeBuild, ALB, and so on
  • Configure Apache Airflow with Helm to run on EKS
  • Combine all of these AWS services to build your architecture and ensure that everything works as expected
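To give a taste of the kind of configuration involved, here is a minimal eksctl cluster manifest — a hedged sketch only: the cluster name, region, instance type, and node counts are illustrative assumptions, not values from the course.

```yaml
# Sketch of an eksctl manifest for an EKS cluster able to run Airflow.
# Name, region, and node sizes are placeholder assumptions.
apiVersion: eksctl.io/v1alpha5
kind: ClusterConfig
metadata:
  name: airflow-cluster     # placeholder name
  region: us-east-1         # placeholder region
nodeGroups:
  - name: airflow-workers
    instanceType: t3.large  # size depends on your workload
    desiredCapacity: 2
    minSize: 1
    maxSize: 4
```

You would apply such a manifest with `eksctl create cluster -f cluster.yaml`; in practice the course covers far more than this (networking, IAM, best practices, and so on).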

This list is far from exhaustive, but it’s enough to show you how hard the task can be.

In my course Apache Airflow on AWS EKS: The Hands-On Guide, you are going to learn how to build an architecture from the ground up, so that at the end you can use it as a reliable starting point for building your own architecture in your own company.

On top of that, you will discover and learn the concepts and features addressed while building the architecture — such as EKS, GitOps, Kubernetes, ECR, and load balancers, and how they work. Architecture schemas are provided as well.

Here are the different points tackled in the course:

  1. Configuring the EKS cluster following best practices
  2. Deploying changes automatically with GitOps
  3. Using Helm to configure and set up Airflow on Kubernetes
  4. Configuring the official Airflow Helm chart to use the Kubernetes Executor and many other features
  5. Deploying DAGs in Airflow with Git-Sync and AWS EFS
  6. Deploying DAGs/Airflow through CI/CD pipelines with AWS CodePipeline
  7. Testing your DAGs automatically
  8. Securing your credentials and sensitive data in a Secret Backend
  9. Enabling remote logging with AWS S3
  10. Creating three different environments: dev, staging, and prod
  11. Making the production environment scalable and highly available
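As an illustration of what some of those points look like in practice, here is a hedged sketch of an override file for the official Airflow Helm chart, enabling the Kubernetes Executor, Git-Sync for DAG deployment, and remote logging to S3. The repository URL, bucket name, and connection ID are placeholders, not values from the course.

```yaml
# Sketch of a values override for the official apache-airflow Helm chart.
# Repo URL, bucket, and connection ID below are placeholder assumptions.
executor: KubernetesExecutor
dags:
  gitSync:
    enabled: true
    repo: https://github.com/example/airflow-dags.git  # placeholder repo
    branch: main
    subPath: dags
config:
  logging:
    remote_logging: "True"
    remote_base_log_folder: s3://example-airflow-logs  # placeholder bucket
    remote_log_conn_id: aws_default                    # placeholder connection
```

Such a file would be applied with something like `helm upgrade --install airflow apache-airflow/airflow -f values.yaml`; the course walks through the full configuration step by step.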

A (very) broad overview of the architecture we are going to build:


Important Prerequisites

All you need to know before enrolling in the course


Interested in learning more? Stay tuned and get special promotions!
