Closed
Bug 1325229
Opened 9 years ago
Closed 5 years ago
Testing on Airflow should reflect changes to files e.g. airflow.sh
Categories
(Data Platform and Tools :: General, defect, P2)
Data Platform and Tools
General
Tracking
(Not tracked)
RESOLVED
WONTFIX
People
(Reporter: amiyaguchi, Unassigned)
References
Details
When running tests on Airflow with a locally deployed instance, changes in `ansible/files` are not reflected. For example:
> ansible-playbook ansible/deploy_local.yml -e '@ansible/envs/dev.yml' -e "command='test churn churn 20161221'"
will fail to recognize a change to ansible/files/spark/airflow.sh. The job will pull the script from `s3://telemetry-test-bucket/steps/airflow.sh` [1].
The workaround is to upload the changed files directly to the bucket. Possible solutions include uploading the files as part of `deploy_local`, or maintaining a staging environment that mirrors production.
[1] https://github.com/mozilla/telemetry-airflow/blob/master/dags/operators/emr_spark_operator.py#L94
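The manual workaround could be sketched as follows. This is only a sketch: it assumes the AWS CLI is installed and configured with write access to the test bucket, and the bucket/key are taken from the path referenced above; the exact key your deployment reads from may differ.

```shell
# Sketch of the workaround: copy the locally modified bootstrap script
# to the S3 location the EMR job downloads it from, so a subsequent
# test run picks up the change. Bucket and key are as quoted in this
# bug report; verify them against your operator configuration.
aws s3 cp ansible/files/spark/airflow.sh \
    s3://telemetry-test-bucket/steps/airflow.sh
```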
Reporter
Comment 1 • 9 years ago
Another motivating case is keeping the airflow.sh in the test bucket up to date. I hit a UTF-8 error where the file could not be found after downloading because of filename corruption [1]. The problem went away once an updated script was uploaded to the bucket.
[1] https://gist.github.com/acmiyaguchi/5aff2b37df5113c3c9072b4ff58a5908#file-airflow-log-L185-L195
Updated • 9 years ago
Points: --- → 2
Priority: -- → P2
Updated • 8 years ago
Component: Metrics: Pipeline → Scheduling
Product: Cloud Services → Data Platform and Tools
Reporter
Updated • 5 years ago
Status: NEW → RESOLVED
Closed: 5 years ago
Resolution: --- → WONTFIX
Assignee
Updated • 3 years ago
Component: Scheduling → General