Senior Specialist – Data Engineering (DBT / GCP / ETL), 7–11 Years Experience
@Algae Services posted 6 hours ago
Job ID 60874
Job Description
Senior Specialist – Data Engineering (DBT / GCP / ETL)
Job ID: 803144
Experience: 7–11 Years
Location: Bengaluru
Type: Permanent
Openings: 1
We are seeking an experienced Senior DBT Technical Lead to join our 2025–2026 data transformation initiatives. This role demands deep expertise in dbt (data build tool), cloud data platforms, and complex ETL/ELT pipeline development. The ideal candidate will lead project design and development efforts while providing hands-on technical contributions across all stages of the data lifecycle.
Key Responsibilities
- Lead the design, development, and deployment of modern data pipelines using dbt Cloud and dbt Core.
- Develop scalable, reliable, and high-performance ELT frameworks to support enterprise data initiatives.
- Define and implement dbt project structures, including macros, tests, documentation, and deployment strategies.
- Optimize SQL queries for performance and maintainability across large-scale datasets.
- Design and develop data models aligned with dimensional modeling and data vault methodologies.
- Work closely with cross-functional teams — including data analysts, architects, and business stakeholders — to deliver end-to-end data solutions.
- Integrate real-time data pipelines using technologies like Kafka, Pub/Sub, and Dataflow (preferred).
- Manage ETL/ELT workloads across both modern tools (Spark, PySpark, Databricks) and traditional platforms (Informatica BDM).
- Leverage Airflow or Cloud Composer for orchestration and scheduling.
- Use Python for pipeline automation, data validation, and workflow enhancements.
- Ensure best practices in code management, CI/CD, and documentation.
 
Required Skills & Experience
- 8–12 years of experience in Data Engineering / Analytics Engineering.
- 3+ years of hands-on experience with dbt (data build tool), both dbt Cloud and dbt Core.
- Expert-level proficiency in dbt: macros, testing, documentation, and deployment pipelines.
- Advanced SQL (ANSI-SQL), with strong query optimization and performance-tuning skills.
- 3+ years of experience on GCP: BigQuery, Cloud Storage, Dataflow, Cloud Composer.
- Strong background in ETL/ELT design, data modeling, and data architecture principles.
- Proficient in Python for automation and pipeline development.
- Familiarity with real-time data streaming and data orchestration tools.
 
Preferred / Good-to-Have
- Experience with Kafka, Pub/Sub, and real-time analytics.
- Exposure to Spark, Scala, PySpark, or Databricks.
- Knowledge of Informatica BDM or other enterprise ETL platforms.
- Strong understanding of data warehousing, dimensional modeling, and data vault frameworks.
 
Mandatory Skills
- ANSI-SQL
- dbt (Data Build Tool)