INTERMEDIATE AIRFLOW DEVELOPER
About the Role:
We are looking for an Intermediate Airflow Developer with at least 2 years of experience to help migrate our existing Windows Task Scheduler jobs to Apache Airflow DAGs. In this role, you’ll play a critical part in modernizing and optimizing our task automation by converting existing jobs into efficient, manageable, and scalable Airflow workflows. You will also work on security hardening, implementing data pipelines, and designing ETL processes.
Key Responsibilities:
* Convert Windows Task Scheduler Jobs to Airflow: Migrate existing Windows-based scheduled jobs to Airflow DAGs, ensuring smooth execution and reliability.
* Develop and Optimize DAGs: Author, schedule, and monitor DAGs (Directed Acyclic Graphs) to handle data workflows, ETL tasks, and various automation processes.
* Programming and Scripting: Use Python as the primary language for Airflow DAGs and task logic, with experience in SQL for data manipulation.
* Set Up and Configure Airflow: Provide comprehensive instructions and configurations for setting up Airflow environments, including deployment, resource allocation, and high availability.
* Security Hardening: Implement security best practices in Airflow, including role-based access control, network security, logging, and data protection.
* Data Pipelines and ETL: Design and implement data pipelines to move, transform, and aggregate data, ensuring data accuracy and performance.
* Troubleshooting and Optimization: Proactively monitor DAGs for performance issues, manage task dependencies, and optimize for performance and reliability.
* Documentation and Knowledge Transfer: Document migration steps, Airflow setup instructions, and security configurations for future reference and knowledge sharing within the team.
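For context, migrations like these typically start by translating each job's Task Scheduler trigger into a schedule an Airflow DAG can use. The sketch below is a minimal, hypothetical illustration of that first step (the function name and scope are illustrative, not part of any Airflow or Windows API), covering only the simplest "Daily at HH:MM" trigger:

```python
# Hypothetical helper sketching one migration step: translating a
# Windows Task Scheduler "Daily at HH:MM" trigger into the cron
# expression an Airflow DAG's schedule argument would use.
# Real migrations must also handle weekly/monthly triggers,
# repetition intervals, and time zones.

def daily_trigger_to_cron(hour: int, minute: int) -> str:
    """Translate a 'Daily at HH:MM' start time into a cron expression."""
    if not (0 <= hour <= 23 and 0 <= minute <= 59):
        raise ValueError("invalid start time")
    # cron field order: minute, hour, day-of-month, month, day-of-week
    return f"{minute} {hour} * * *"

# Example: a job scheduled daily at 02:30 becomes "30 2 * * *"
print(daily_trigger_to_cron(2, 30))
```

The resulting string would be passed to the DAG's schedule argument; anything beyond daily triggers generally needs a per-job mapping rather than a one-liner.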
Requirements:
* 2+ years of experience working with scheduling tools, task automation, and job orchestration.
* Proficiency in Apache Airflow: Experience authoring and managing DAGs, with a solid understanding of Airflow architecture and best practices.
* Strong Python Skills: Ability to write clear, maintainable Python code for data workflows and ETL processes.
* SQL Knowledge: Proficiency in SQL for data manipulation within ETL tasks.
* Experience with Windows Task Scheduler: Familiarity with converting Windows-scheduled jobs to Airflow and handling their dependencies.
* Security Awareness: Knowledge of security hardening principles within Airflow, including user roles, network configurations, and logging.
* Problem-Solving Skills: Strong analytical skills to debug and troubleshoot workflow issues and optimize DAGs for performance.
* Communication Skills: Ability to document and communicate setup processes, configurations, and migration steps clearly.
Preferred Qualifications:
* Experience with Airflow in cloud environments such as AWS, GCP, or Azure.
* Exposure to Docker and containerization for Airflow deployments.
* Familiarity with DevOps practices and CI/CD tools.
* Knowledge of additional programming languages like Bash, PowerShell, or JavaScript.