
Apache Airflow 2.3.0 Release is Out Now- New Features That Everyone Should Know

Recently, Apache announced the release of Airflow 2.3.0. Since the previous feature release, Apache Airflow 2.2.0, this new release includes over 700 commits: 50 new features, 99 improvements, 85 bug fixes, and several documentation changes.

Here is a glimpse of major updates:

  • Dynamic Task Mapping (AIP-42)
  • Grid View Replaces Tree View
  • Purge History From the Metadata Database
  • LocalKubernetesExecutor
  • DagProcessorManager as a Standalone Process (AIP-43)
  • JSON Serialization for Connections
  • airflow db downgrade and Offline Generation of SQL Scripts
  • Reuse of Decorated Tasks

Let’s discuss these updates in detail.

Dynamic Task Mapping (AIP-42)

Dynamic Task Mapping provides a way for a workflow to create a number of tasks at runtime based on current data, instead of the DAG author having to know in advance how many tasks will be required.


This is similar to defining tasks in a for loop, but instead of the DAG file fetching the data and doing the iteration itself, the scheduler does it based on the output of a previous task. Right before the mapped task is executed, the scheduler creates n copies of the task, one for each input.


It is also possible to have a task operate on the collected output of a mapped task, a pattern commonly known as map and reduce.


Airflow now provides full support for dynamic tasks, meaning tasks can be generated at runtime. This works much like a for loop that creates a list of tasks, except that you don’t need to know the exact number of tasks ahead of time: a previous task can generate the list to iterate over, which a hard-coded for loop in the DAG file cannot do.


Here is an example:

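A minimal sketch of a mapped task, assuming Airflow 2.3+ with the TaskFlow API (the DAG and task names here are invented for illustration). The mapped task is created with .expand(), and a downstream task that receives the mapped result acts as the reduce step:

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(start_date=datetime(2022, 5, 1), schedule_interval=None, catchup=False)
def mapped_example():
    @task
    def make_list():
        # In a real DAG this list could come from an API call, a query, etc.
        return [1, 2, 3]

    @task
    def double(x):
        return 2 * x

    @task
    def total(values):
        # "Reduce" step: receives the collected outputs of all mapped copies.
        print(sum(values))

    # The scheduler creates one copy of `double` per list element at runtime.
    total(double.expand(x=make_list()))


mapped_example()
```

Arguments that should be the same for every mapped copy can be passed with .partial() before calling .expand().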

Grid View Replaces Tree View

In Airflow 2.3.0, the Grid view replaces the Tree view in the web UI. It presents DAG runs as columns and tasks as rows, which makes long-running DAGs and mapped tasks easier to read.

Purge History From the Metadata Database


A new airflow db clean command removes old data from the metadata database.

This command can be used to reduce the size of the metadata database.

Here is some more information: Purge history from metadata database


LocalKubernetesExecutor

Airflow 2.3.0 introduces a new executor named LocalKubernetesExecutor, which lets you run some tasks with the LocalExecutor and others with the KubernetesExecutor in the same deployment, based on each task’s queue.

Here is some more information: LocalKubernetesExecutor

DagProcessorManager as standalone process (AIP-43)

In Airflow 2.3.0, the DagProcessorManager can be run as a standalone process. Since the DagProcessorManager runs user code, it is better to separate it from the scheduler process and run it independently, optionally on a different host.

In Airflow 2.3.0, the dag-processor CLI command starts the DagProcessorManager in its own process. Before the DagProcessorManager can run standalone, the [scheduler] standalone_dag_processor option must be set to True.
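For example, in airflow.cfg (the same option can be set via the AIRFLOW__SCHEDULER__STANDALONE_DAG_PROCESSOR environment variable):

```ini
[scheduler]
standalone_dag_processor = True
```

With this set, the scheduler no longer parses DAG files itself, and running airflow dag-processor starts the DagProcessorManager on its own.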

Here is some more information: dag-processor CLI command


JSON Serialization for Connections

Connections can now be created using the JSON serialization format, as an alternative to the Airflow URI format.


JSON serialization format can also be used when setting connections in the environment variables.
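Airflow picks up connections from environment variables named AIRFLOW_CONN_&lt;CONN_ID&gt;, and as of 2.3.0 the value may be a JSON document instead of a URI. A minimal, runnable sketch using only the standard library (the connection id my_pg and its field values are made up for illustration):

```python
import json
import os

# Hypothetical connection definition; any conn_id and fields work the same way.
conn = {
    "conn_type": "postgres",
    "host": "localhost",
    "login": "user",
    "password": "secret",
    "schema": "mydb",
    "port": 5432,
}

# Airflow resolves the connection "my_pg" from the AIRFLOW_CONN_MY_PG variable.
os.environ["AIRFLOW_CONN_MY_PG"] = json.dumps(conn)

print(os.environ["AIRFLOW_CONN_MY_PG"])
```

Compared with the URI format, JSON avoids percent-encoding special characters in passwords and extras.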

Here is some more information: JSON serialization for connections

airflow db downgrade and Offline Generation of SQL Scripts

A new airflow db downgrade command in Airflow 2.3.0 downgrades the metadata database to the schema of a chosen Airflow version.

The downgrade/upgrade SQL scripts can also be generated offline, so you can either run them against your database manually or simply review the SQL queries that the downgrade/upgrade command would execute.

Here is some more information: Airflow DB downgrade and Offline generation of SQL scripts

Reuse of Decorated Tasks

Decorated tasks can now be reused across DAG files. A decorated task has an override method that lets you override its arguments, such as task_id, for each use.

Here’s an example:

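A sketch of the idea, assuming the TaskFlow API (the task and DAG names are invented; only @task, @dag, and .override() come from Airflow):

```python
from datetime import datetime

from airflow.decorators import dag, task


@task
def add(x, y):
    return x + y


@dag(start_date=datetime(2022, 5, 1), schedule_interval=None, catchup=False)
def reuse_example():
    # The same decorated task is instantiated twice; override() gives each
    # use its own task_id (and can override other task arguments too).
    add.override(task_id="add_once")(1, 2)
    add.override(task_id="add_again")(3, 4)


reuse_example()
```

Because add is just a decorated function, it can also be imported into other DAG files and reused there.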

Other small features

This is not an exhaustive list, but some noteworthy or interesting small features include:

  • Support for different timeout values per DAG file when parsing.
  • A new airflow dags reserialize command to reserialize DAGs.
  • The Events Timetable.
  • SmoothOperator – an operator that does literally nothing except log a YouTube link to Sade’s “Smooth Operator”. Enjoy!


