We are looking for a Data Pipeline Engineer to support the Data Pipeline team in designing and developing data flows on Google Cloud Platform. The role focuses on creating standardized, simplified, and reliable data solutions that enable effective steering and planning across the business support domain.

You will build and maintain ETL/ELT workflows using Python, Airflow, and transformation tools such as dbt, working with core GCP services including BigQuery, GCS, Pub/Sub, IAM, and Cloud Functions. The assignment also covers deploying and operating workflows in Google Cloud Composer, applying infrastructure-as-code practices with Terraform, and collaborating with cross-functional teams to deliver clean, tested, and trustworthy datasets.

Strong Python backend development experience, solid SQL skills, and familiarity with Git, CI/CD, Docker-based development, and cloud data platforms are essential. You are expected to be on-site at least three days per week.