Airflow installed on Docker Compose
Get the CONTAINER ID of the scheduler container.
docker ps
Open a bash shell (or another program) inside the container.
docker exec -it [CONTAINER ID of the scheduler] bash
For example, to see which Airflow and provider versions are installed,
pip list | grep airflow
The database should be one of the databases that already exist in the Postgres instance, which is either airflow or postgres on a fresh setup. Because Airflow runs inside Docker Compose, the host must be the Compose service name of the database container, not localhost.
Host: postgres
Database: airflow or postgres (case sensitive)
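These fields combine into a standard SQLAlchemy-style URI. A minimal sketch, assuming hypothetical credentials (the stock Airflow docker-compose file uses airflow/airflow, but check your own compose file):

```python
# Hypothetical credentials; check your docker-compose.yaml for the real ones.
user, password = "airflow", "airflow"
host, port = "postgres", 5432   # Compose service name, not localhost
database = "airflow"

# The URI Airflow ultimately builds from the connection fields.
uri = f"postgresql://{user}:{password}@{host}:{port}/{database}"
print(uri)
```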
MinIO doesn't require installation or download: running its Docker image suffices, and the access key ID and password can be set as you wish [ref]. However, when Airflow runs in Docker Compose while MinIO runs in plain Docker, the two are not on the same Docker network.
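A minimal way to start MinIO in plain Docker. The container name 'minio' and the credentials here are assumptions; pick your own, but they must match what you later put in the Airflow connection:

```shell
# Name and credentials are placeholders; choose your own.
docker run -d --name minio \
  -p 9000:9000 -p 9001:9001 \
  -e MINIO_ROOT_USER=miniouser \
  -e MINIO_ROOT_PASSWORD=miniopassword \
  quay.io/minio/minio server /data --console-address ":9001"
```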
docker network ls
Docker Compose names its default network '[project]_default', e.g. 'airflow_learning_default' for this Airflow project. To connect MinIO to this network,
# docker network connect [NETWORK NAME] [CONTAINER NAME]
docker network connect airflow_learning_default minio
To check whether MinIO joined the network successfully, either of the commands below can be used.
# Check 'Networks' at the end
docker inspect minio
# Check 'Containers' at the end
docker network inspect airflow_learning_default
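What a successful attach looks like in the `docker inspect minio` output can be sketched with a hypothetical, heavily trimmed sample (the real command prints a JSON array with far more detail):

```python
import json

# Hypothetical, trimmed `docker inspect minio` output.
sample = json.loads("""
[{"Name": "/minio",
  "NetworkSettings": {"Networks": {"bridge": {}, "airflow_learning_default": {}}}}]
""")

# After `docker network connect`, the compose network shows up
# alongside the default bridge network.
networks = list(sample[0]["NetworkSettings"]["Networks"])
print(networks)
```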
Then go to the Airflow web UI Connections page. The connection type is 'Amazon Web Services' (the separate 'S3' type no longer exists). Don't fill in the 'AWS Access Key ID' and 'AWS Secret Access Key' fields; instead put both, along with the endpoint, in the 'Extra' field as JSON.
{
"aws_access_key_id": "miniouser",
"aws_secret_access_key": "miniopassword",
"endpoint_url": "http://minio:9000"
}
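The Extra field is plain JSON, so a quick stdlib check that it parses and carries the three keys is easy. Values below are the same placeholders as in the snippet above:

```python
import json

# Same placeholder credentials as in the Extra field above.
extra = json.loads("""
{
  "aws_access_key_id": "miniouser",
  "aws_secret_access_key": "miniopassword",
  "endpoint_url": "http://minio:9000"
}
""")

# The host part of endpoint_url must be the MinIO container name.
assert extra["endpoint_url"] == "http://minio:9000"
print(sorted(extra))
```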
The key point: once MinIO joins the network, endpoint_url is http://[CONTAINER_NAME]:9000, with the container name resolved by Docker's internal DNS.