Apache Airflow CLI command reference

This page describes the supported and unsupported Apache Airflow CLI commands on Amazon Managed Workflows for Apache Airflow (MWAA).

Prerequisites

The following section describes the preliminary steps required to use the commands and scripts on this page.

Access

AWS CLI

The AWS Command Line Interface (AWS CLI) is an open source tool that enables you to interact with AWS services using commands in your command-line shell. To complete the steps on this page, you need the following:

What's changed in v2.0.2

  • New: Airflow CLI command structure. The Apache Airflow v2.0.2 CLI is organized so that related commands are grouped together as subcommands, which means you need to update Apache Airflow v1.10.12 scripts if you want to upgrade to Apache Airflow v2.0.2. For example, unpause in Apache Airflow v1.10.12 is now dags unpause in Apache Airflow v2.0.2. To learn more, see Airflow CLI changes in 2.0 in the Apache Airflow reference guide.
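The renames above follow a consistent pattern: standalone v1.10.12 commands become subcommands grouped under a noun such as dags or tasks. A minimal Python sketch of that mapping for a few renamed commands (the V1_TO_V2 table and to_v2 helper are illustrative helpers, not part of any Airflow or AWS API, and the table is not exhaustive):

```python
# Illustrative subset of Apache Airflow v1.10.12 -> v2.0.2 CLI renames.
V1_TO_V2 = {
    "pause": "dags pause",
    "unpause": "dags unpause",
    "trigger_dag": "dags trigger",
    "delete_dag": "dags delete",
    "task_state": "tasks state",
    "list_dags": "dags list",
}

def to_v2(v1_command: str) -> str:
    """Rewrite a v1-style command line using the grouped v2 subcommand name."""
    name, _, rest = v1_command.partition(" ")
    v2_name = V1_TO_V2.get(name, name)  # commands like "version" are unchanged
    return f"{v2_name} {rest}".strip()
```

A script that shells out to `airflow unpause my_dag` would call `to_v2("unpause my_dag")` and get back `dags unpause my_dag`.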

CLI commands

The following sections list the Apache Airflow CLI commands that are supported and unsupported on Amazon MWAA.

Supported commands

The following list shows the Apache Airflow CLI commands available on Amazon MWAA.

Airflow v2.0.2
Airflow version   Supported   Command

v2.0.2            Yes         cheat-sheet
v2.0.2            Yes         connections add
v2.0.2            Yes         connections delete
v2.0.2            Yes         dags delete
v2.0.2            Yes         dags list-jobs
v2.0.2            Yes         dags pause
v2.0.2            Yes         dags report
v2.0.2            Yes         dags show
v2.0.2            Yes         dags state
v2.0.2            Yes         dags test
v2.0.2            Yes         dags trigger
v2.0.2            Yes         dags unpause
v2.0.2            Yes         providers behaviours
v2.0.2            Yes         providers get
v2.0.2            Yes         providers hooks
v2.0.2            Yes         providers links
v2.0.2            Yes         providers list
v2.0.2            Yes         providers widgets
v2.0.2            Yes         roles create
v2.0.2            Yes         roles list
v2.0.2            Yes         tasks clear
v2.0.2            Yes         tasks failed-deps
v2.0.2            Yes         tasks list
v2.0.2            Yes         tasks render
v2.0.2            Yes         tasks state
v2.0.2            Yes         tasks states-for-dag-run
v2.0.2            Yes         tasks test
v2.0.2            Yes         variables delete
v2.0.2            Yes         variables get
v2.0.2            Yes         variables set
v2.0.2            Yes         variables list
v2.0.2            Yes         version

Airflow v1.10.12
Airflow version   Supported   Command

v1.10.12          Yes         clear
v1.10.12          Yes         delete_dag
v1.10.12          Yes         next_execution
v1.10.12          Yes         pause
v1.10.12          Yes         pool
v1.10.12          Yes         render
v1.10.12          Yes         run
v1.10.12          Yes         task_failed_deps
v1.10.12          Yes         trigger_dag
v1.10.12          Yes         unpause
v1.10.12          Yes         variables
v1.10.12          Yes         version

Unsupported commands

The following list shows the Apache Airflow CLI commands not available on Amazon MWAA.

Airflow v2.0.2
Airflow version   Supported   Command

v2.0.2            No*         dags backfill
v2.0.2            No*         dags list
v2.0.2            No*         dags list-runs
v2.0.2            No*         dags next-execution
v2.0.2            No          dags show --save
v2.0.2            No          dags test --save-dagrun
v2.0.2            No          dags test --show-dagrun
v2.0.2            No          db check
v2.0.2            No          db check-migrations
v2.0.2            No          db init
v2.0.2            No          db reset
v2.0.2            No          db upgrade
v2.0.2            No          flower
v2.0.2            No          kerberos
v2.0.2            No          pools delete
v2.0.2            No          pools list
v2.0.2            No          pools export
v2.0.2            No          pools set
v2.0.2            No          rotate-fernet-key
v2.0.2            No          scheduler
v2.0.2            No          shell
v2.0.2            No          sync-perm
v2.0.2            No          tasks run
v2.0.2            No          users create
v2.0.2            No          users delete
v2.0.2            No          users list
v2.0.2            No          webserver
v2.0.2            No          worker

* These commands parse DAGs and fail if the DAG uses plugins that depend on packages installed through a requirements.txt. For more information, see Using commands that parse DAGs.

Airflow v1.10.12
Airflow version   Supported   Command

v1.10.12          No*         backfill
v1.10.12          No          checkdb
v1.10.12          No          connections
v1.10.12          No          create_user
v1.10.12          No*         dag_state
v1.10.12          No          delete_user
v1.10.12          No          flower
v1.10.12          No          initdb
v1.10.12          No          kerberos
v1.10.12          No*         list_dag_runs
v1.10.12          No*         list_dags
v1.10.12          No          list_users
v1.10.12          No*         list_tasks
v1.10.12          No          resetdb
v1.10.12          No          rotate_fernet_key
v1.10.12          No          scheduler
v1.10.12          No          serve_logs
v1.10.12          No          shell
v1.10.12          No*         show_dag
v1.10.12          No          sync_perm
v1.10.12          No*         task_state
v1.10.12          No*         test
v1.10.12          No          upgradedb
v1.10.12          No          webserver
v1.10.12          No          worker

* These commands parse DAGs and fail if the DAG uses plugins that depend on packages installed through a requirements.txt. For more information, see Using commands that parse DAGs.

Using commands that parse DAGs

Apache Airflow CLI commands that parse DAGs will fail if the DAG uses plugins that depend on packages installed through a requirements.txt:

  • backfill

  • dag_state

  • dags backfill

  • dags list

  • dags list-runs

  • dags next-execution

  • list_dag_runs

  • list_dags

  • list_tasks

  • show_dag

  • task_state

  • test

You can use these CLI commands if your DAGs don't use plugins that depend on packages installed through a requirements.txt.

Sample code

The following section contains examples of different ways to use the Apache Airflow CLI.

Set, get or delete an Apache Airflow v2.0.2 variable

You can use the following sample code to set, get, or delete a variable. The script is invoked as <script> <mwaa-env-name> [get|set|delete] <variable> [variable-value].

#!/bin/bash
# Requires the AWS CLI, jq, and AWS_REGION set in the environment.
[ $# -eq 0 ] && echo "Usage: $0 MWAA environment name " && exit

# Build the Airflow CLI command to send to the environment.
if [[ $2 == "" ]]; then
    dag="variables list"
elif [ $2 == "get" ] || [ $2 == "delete" ] || [ $2 == "set" ]; then
    dag="variables $2 $3 $4 $5"
else
    echo "Not a valid command"
    exit 1
fi

# Create a CLI token, POST the command to the web server, and decode the
# base64-encoded stdout/stderr in the response.
CLI_JSON=$(aws mwaa --region $AWS_REGION create-cli-token --name $1) \
    && CLI_TOKEN=$(echo $CLI_JSON | jq -r '.CliToken') \
    && WEB_SERVER_HOSTNAME=$(echo $CLI_JSON | jq -r '.WebServerHostname') \
    && CLI_RESULTS=$(curl --request POST "https://$WEB_SERVER_HOSTNAME/aws_mwaa/cli" \
        --header "Authorization: Bearer $CLI_TOKEN" \
        --header "Content-Type: text/plain" \
        --data-raw "$dag") \
    && echo "Output:" \
    && echo $CLI_RESULTS | jq -r '.stdout' | base64 --decode \
    && echo "Errors:" \
    && echo $CLI_RESULTS | jq -r '.stderr' | base64 --decode
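The argument handling in the shell script above can also be expressed in Python. The sketch below builds only the raw CLI command string that would be POSTed to the /aws_mwaa/cli endpoint; build_payload is a hypothetical helper name, and the token/request flow itself is omitted (it matches the boto3 and requests pattern in the DAG-trigger sample later on this page):

```python
def build_payload(action: str = "", name: str = "", value: str = "") -> str:
    """Return the raw Airflow CLI command string for the /aws_mwaa/cli endpoint.

    Mirrors the shell script's logic: no action lists all variables;
    get/set/delete operate on a named variable.
    """
    if not action:
        return "variables list"
    if action not in ("get", "set", "delete"):
        raise ValueError("Not a valid command")
    # Drop empty parts so "get" requests don't carry a trailing space.
    return " ".join(part for part in ("variables", action, name, value) if part)
```

For example, build_payload("set", "my_var", "hello") yields the command string that the script would send as the request body.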

Add a configuration when triggering a DAG

You can use the following sample code with Apache Airflow v1.10.12 and Apache Airflow v2.0.2 to add a configuration when triggering a DAG, such as airflow trigger_dag 'dag_name' --conf '{"key":"value"}'.

import base64
import json

import boto3
import requests

mwaa_env_name = 'YOUR_ENVIRONMENT_NAME'
dag_name = 'YOUR_DAG_NAME'
key = "YOUR_KEY"
value = "YOUR_VALUE"
conf = "{\"" + key + "\":\"" + value + "\"}"

# Request a short-lived CLI token for the environment.
client = boto3.client('mwaa')
mwaa_cli_token = client.create_cli_token(
    Name=mwaa_env_name
)

mwaa_auth_token = 'Bearer ' + mwaa_cli_token['CliToken']
mwaa_webserver_hostname = 'https://{0}/aws_mwaa/cli'.format(
    mwaa_cli_token['WebServerHostname'])

# Apache Airflow v1.10.12 syntax; for v2.0.2 use "dags trigger {0} -c '{1}'".
raw_data = "trigger_dag {0} -c '{1}'".format(dag_name, conf)

mwaa_response = requests.post(
    mwaa_webserver_hostname,
    headers={
        'Authorization': mwaa_auth_token,
        'Content-Type': 'text/plain'
    },
    data=raw_data
)

# stdout and stderr come back base64-encoded.
mwaa_std_err_message = base64.b64decode(
    mwaa_response.json()['stderr']).decode('utf8')
mwaa_std_out_message = base64.b64decode(
    mwaa_response.json()['stdout']).decode('utf8')

print(mwaa_response.status_code)
print(mwaa_std_err_message)
print(mwaa_std_out_message)
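As the sample above shows, the stdout and stderr fields in the /aws_mwaa/cli response are base64-encoded. A minimal, self-contained sketch of just the decoding step (sample_response is a fabricated stand-in for a real response body, and decode_cli_response is a hypothetical helper):

```python
import base64
import json

# Fabricated response body in the shape the /aws_mwaa/cli endpoint returns:
# stdout and stderr are base64-encoded strings.
sample_response = json.dumps({
    "stdout": base64.b64encode(b"dag_run queued\n").decode(),
    "stderr": base64.b64encode(b"").decode(),
})

def decode_cli_response(body):
    """Decode the base64 stdout/stderr fields of an MWAA CLI response body."""
    payload = json.loads(body)
    out = base64.b64decode(payload["stdout"]).decode("utf8")
    err = base64.b64decode(payload["stderr"]).decode("utf8")
    return out, err
```

Splitting this step out makes it easy to reuse across scripts that call the endpoint with different CLI commands.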

Run CLI commands on an SSH tunnel to a bastion host

The following example shows how to run Airflow CLI commands using an SSH tunnel proxy to a Linux Bastion Host.

Using curl

  1. Open an SSH tunnel with dynamic port forwarding to your bastion host:

     ssh -D 8080 -f -C -q -N YOUR_USER@YOUR_BASTION_HOST

  2. Send the CLI command to the web server through the SOCKS proxy:

     curl -x socks5h://0:8080 --request POST https://YOUR_HOST_NAME/aws_mwaa/cli --header YOUR_HEADERS --data-raw YOUR_CLI_COMMAND

Samples in GitHub and AWS tutorials