Job Information
Worldpay, LLC Senior Data Engineer - Python Airflow in Cincinnati, Ohio
Job Description

Are you ready to write your next chapter? Make your mark at one of the biggest names in payments. With proven technology, we process the largest volume of payments in the world, driving the global economy every day. When you join Worldpay, you join a global community of experts and changemakers, working to reinvent an industry by constantly evolving how we work and making the way millions of people pay easier, every day.

What makes a Worldpayer? It's simple: Think, Act, Win. We stay curious, always asking the right questions to be better every day, finding creative solutions to simplify the complex. We're dynamic: every Worldpayer is empowered to make the right decisions for their customers. And we're determined, always staying open, winning and failing as one.

We're looking for a Senior Data Engineer - Python Airflow to join our ever-evolving Cloud Data Platform team to help us unleash the potential of every business. Are you ready to make your mark? Then you sound like a Worldpayer.

About The Team

The Worldpay Cloud Data Platform team provides Worldpay's big data capabilities, including large-scale data processing, analytics, fraud platforms, machine learning engineering, stream processing, and data insights. The team maintains and advances cloud data platforms and services to build data products that power our business services and customers.

What You'll Own

* Develop and implement strategies for data engineering initiatives using Python, AWS, Airflow, and Snowflake technologies
* Monitor trends in the data engineering industry and stay up to date on current technologies
* Collaborate with the product team to develop solutions that meet their goals and objectives
* Act as a subject matter expert for Apache Airflow and provide technical guidance to team members
* Install, configure, and maintain Astronomer Airflow environments
* Build complex data engineering pipelines using Python and Airflow
* Design, develop, and maintain scalable workflows and orchestration systems using Astronomer
* Create and manage Directed Acyclic Graphs (DAGs) to automate data pipelines and processes (an illustrative DAG sketch appears at the end of this posting)
* Leverage AWS Glue, Step Functions, and other services for orchestrating data workflows
* Develop custom operators and plugins for Astronomer to extend its capabilities
* Integrate code with the defined CI/CD framework and the AWS services required for building secure data pipelines
* Manage user access and permissions, ensuring data security and compliance with company policies
* Implement and monitor security controls, including encryption, authentication, and network security
* Conduct regular security audits and vulnerability assessments
* Manage data ingestion and ETL processes
* Automate routine tasks and processes using scripting and automation tools
* Maintain comprehensive documentation for configurations, procedures, and troubleshooting steps

What You Bring

* 5+ years in a pivotal software/data engineering role with deep exposure to modern data stacks, particularly Snowflake, Airflow, dbt, and AWS data services
* Proficiency in cloud platforms such as AWS, Azure, or Google Cloud
* Experience with PySpark/Hadoop and/or AWS Glue ETL and/or Databricks with Python is preferred
* Must have experience with the full ETL development life cycle and ETL pipeline best practices, including thorough hands-on data warehouse work using a combination of Python, Snowflake, and AWS services
* Data engineering experience with AWS services (S3, Lambda, Glue, Lake Formation, EMR), Kafka, streaming, and Databricks is highly preferred
* Experience with Astronomer and/or Airflow
* Understanding of data pipelines and modern approaches to automating them using cloud-based testing, with the ability to clearly document implementations
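
To illustrate the kind of DAG-based pipeline work described above, here is a minimal sketch of a daily Airflow pipeline that extracts from S3 and loads into Snowflake, written with Airflow's TaskFlow API. This is not Worldpay code: every name (DAG ID, owner, bucket, file keys) is a hypothetical placeholder, and each task body is a stub standing in for real extract/transform/load logic.

```python
# Minimal, illustrative Airflow DAG. All names (DAG ID, owner, bucket,
# keys) are hypothetical placeholders, not Worldpay's actual pipeline.
from datetime import datetime, timedelta

from airflow.decorators import dag, task

default_args = {
    "owner": "data-platform",        # hypothetical team name
    "retries": 2,                    # retry failed tasks twice
    "retry_delay": timedelta(minutes=5),
}


@dag(
    dag_id="s3_to_snowflake_daily",  # hypothetical pipeline name
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args=default_args,
    tags=["example"],
)
def s3_to_snowflake_daily():
    @task
    def extract() -> list[str]:
        # Placeholder: a real pipeline might list newly arrived objects
        # in an S3 bucket (e.g. via S3Hook) and return their keys.
        return ["s3://example-bucket/payments/2024-01-01.csv"]

    @task
    def transform(keys: list[str]) -> list[str]:
        # Placeholder: clean and validate records, e.g. with pandas,
        # PySpark, or an AWS Glue job triggered from here.
        return keys

    @task
    def load(keys: list[str]) -> None:
        # Placeholder: a real pipeline might COPY the staged files into
        # a Snowflake table (e.g. via SnowflakeHook).
        print(f"Loading {len(keys)} file(s) into Snowflake")

    # TaskFlow wiring: extract >> transform >> load
    load(transform(extract()))


s3_to_snowflake_daily()
```

In an Astronomer deployment, a DAG like this would typically live in the project's dags/ folder and be shipped through the CI/CD framework mentioned above, with connections and credentials managed as Airflow connections rather than hard-coded values.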