GCP Data Engineer (Python + OOP + Dataflow + Cloud Composer + BigQuery) 6–10 Years

Full-Time @HypTechie
  • India – Bangalore / Hyderabad / Chennai / Gurgaon / Pune
  • Post Date: November 25, 2025
  • Salary: ₹900,000.00 - ₹1,050,000.00 / Yearly

Job Detail

  • Job ID 61721

Job Description

Job ID: 6494

Location: India – Bangalore / Hyderabad / Chennai / Gurgaon / Pune

Job Type: Permanent

Work Mode: Office Only

Experience Required: 6–10 Years

Salary Range: ₹9,00,000 – ₹10,50,000 per annum


Job Overview

Our client is looking for an experienced GCP Data Engineer to build and maintain scalable, production-ready data pipelines on Google Cloud Platform. The ideal candidate must have strong hands-on knowledge of Python and object-oriented programming (OOP), along with extensive experience in Dataflow, Cloud Composer, and BigQuery. The role demands close collaboration with business and technical teams to enable seamless data migration and onboarding of diverse data products into the cloud.

Roles & Responsibilities

  • Develop and maintain scalable data pipelines using GCP services.
  • Collaborate with cross-functional teams to build complex data migration pipelines for onboarding new data products.
  • Optimize and streamline data workflows that feed and enrich client data hubs.
  • Follow agile software development practices to onboard and manage cloud data workloads.
  • Build scalable, well-documented data pipelines that cleanse, transform, and curate raw datasets into meaningful insights.
  • Develop reusable data ingestion frameworks to support multiple data patterns (see the illustrative sketch below).
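
To make the pipeline expectations above concrete, here is a minimal, illustrative sketch of a reusable ingestion step written as a Python Apache Beam pipeline, the programming model used by Dataflow. The RawEventParser class, the bucket paths, and the BigQuery table are hypothetical placeholders, not the client's actual code.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


class RawEventParser(beam.DoFn):
    """Hypothetical reusable cleanse-and-parse step, written OOP-style as a Beam DoFn."""

    def __init__(self, required_fields):
        self.required_fields = required_fields

    def process(self, line):
        record = json.loads(line)
        # Keep only records that carry every mandatory field; drop the rest.
        if all(field in record for field in self.required_fields):
            yield record


def run(argv=None):
    # Pass --runner=DataflowRunner --project=... --region=... to execute on Dataflow.
    options = PipelineOptions(argv)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadRaw" >> beam.io.ReadFromText("gs://example-bucket/raw/events-*.json")
            | "ParseAndCleanse" >> beam.ParDo(RawEventParser(["event_id", "event_ts"]))
            | "WriteCurated" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events_curated",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()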

Primary Skills (Minimum 5 Years of Strong Experience)

  • Python with OOP concepts
  • Dataflow
  • Cloud Composer
  • BigQuery
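
As an illustration of how these four primary skills typically fit together, below is a minimal Cloud Composer (Airflow) DAG sketch that launches a Dataflow job and then runs a BigQuery curation query. The operator choices, template path, schedule, and SQL are assumptions for illustration only.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.operators.dataflow import DataflowTemplatedJobStartOperator

with DAG(
    dag_id="daily_ingestion_example",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Launch the (hypothetical) Dataflow ingestion template for the execution date.
    run_dataflow = DataflowTemplatedJobStartOperator(
        task_id="run_dataflow_ingestion",
        template="gs://example-bucket/templates/ingestion_template",
        parameters={"input": "gs://example-bucket/raw/{{ ds }}/*.json"},
        location="asia-south1",
    )

    # Curate the freshly loaded data with a BigQuery SQL job (placeholder query).
    curate = BigQueryInsertJobOperator(
        task_id="curate_events",
        configuration={
            "query": {
                "query": "SELECT event_id, event_ts FROM `example-project.analytics.events_curated`",
                "useLegacySql": False,
            }
        },
    )

    run_dataflow >> curate

In Cloud Composer, a DAG like this is deployed by placing the file in the environment's dags/ folder in Cloud Storage.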

Secondary Skills (Added Advantage)

  • Agile methodology
  • Spinnaker
  • Jenkins
  • Git
  • GKE

Good to Have (Not Mandatory)

  • Data Modelling
  • Tableau
  • SQL Performance Tuning
