Job Title: GCP Big Data Engineer
Experience: 7–8 Years
Employment Type: Full-Time
Job Summary
We are seeking a highly skilled GCP Big Data Engineer with 7–8 years of experience in designing, developing, and optimizing large-scale data solutions on Google Cloud Platform (GCP). The ideal candidate will have strong expertise in data engineering, cloud-native architectures, and building scalable batch and real-time data pipelines.
You will be responsible for designing and implementing robust data platforms that support analytics, reporting, and machine learning use cases.
Key Responsibilities
Design, develop, and maintain scalable data pipelines using GCP services.
Build and optimize batch and real-time processing frameworks.
Develop ETL/ELT workflows using Dataflow, Dataproc, Composer (Airflow), and Cloud Functions.
Implement data ingestion solutions using Pub/Sub, Cloud Storage, and BigQuery.
Design and manage data warehousing solutions in BigQuery.
Ensure adherence to data quality, governance, security, and compliance standards.
Optimize performance and cost of cloud data workloads.
Collaborate with data scientists, analysts, and business stakeholders to deliver data solutions.
Implement CI/CD pipelines for data engineering workflows.
Monitor and troubleshoot data platform issues.
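To give candidates a concrete sense of the pipeline work described above, here is a minimal, hedged sketch of the parse–filter–aggregate logic a batch pipeline step typically contains. It is pure Python with hypothetical record and field names; in practice this logic would run inside a Dataflow (Apache Beam) or Dataproc (Spark) job rather than a plain script.

```python
from collections import defaultdict

def parse_event(line):
    """Parse a raw 'user_id,amount' record; return None for malformed rows.

    The record format is hypothetical, for illustration only.
    """
    parts = line.strip().split(",")
    if len(parts) != 2:
        return None
    user_id, amount = parts
    try:
        return (user_id, float(amount))
    except ValueError:
        return None

def aggregate(lines):
    """Group-and-sum step, the per-key aggregation a Beam/Spark job would do."""
    totals = defaultdict(float)
    for line in lines:
        parsed = parse_event(line)
        if parsed is not None:
            user, amount = parsed
            totals[user] += amount
    return dict(totals)

raw = ["u1,10.0", "u2,5.5", "bad-row", "u1,2.5"]
print(aggregate(raw))  # {'u1': 12.5, 'u2': 5.5}
```

The same shape (parse, drop bad records, aggregate by key) carries over directly to Beam `ParDo`/`CombinePerKey` or Spark transformations; only the execution framework changes.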
Required Skills & Technical Expertise
Google Cloud Platform (Must Have)
BigQuery (advanced SQL, optimization, partitioning, clustering)
Dataflow (Apache Beam)
Dataproc (Spark, Hadoop)
Cloud Composer (Airflow)
Pub/Sub
Cloud Storage
Cloud Functions / Cloud Run
IAM & Security Best Practices
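As one example of the Pub/Sub experience listed above: a push subscription delivers a JSON envelope whose `data` field is base64-encoded. A minimal stdlib-only decoder for that envelope is sketched below; the payload values and subscription name are made up for illustration.

```python
import base64
import json

def decode_push_message(envelope: dict) -> tuple[str, dict]:
    """Decode a Pub/Sub push envelope: base64-encoded 'data' plus 'attributes'."""
    message = envelope["message"]
    payload = base64.b64decode(message["data"]).decode("utf-8")
    return payload, message.get("attributes", {})

# Example envelope (shape follows the Pub/Sub push format; values are made up).
envelope = {
    "message": {
        "data": base64.b64encode(b'{"order_id": 42}').decode("ascii"),
        "attributes": {"source": "orders"},
    },
    "subscription": "projects/my-project/subscriptions/my-sub",
}

payload, attrs = decode_push_message(envelope)
print(json.loads(payload))  # {'order_id': 42}
```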
Big Data Technologies
Apache Spark
Hadoop Ecosystem
Kafka (preferred)
Hive
Programming
Python (mandatory)
SQL (advanced level)
Scala or Java (preferred)
Other Skills
Strong understanding of Data Modeling (Star/Snowflake schemas)
Experience with CI/CD tools (Jenkins, GitLab CI, etc.)
Infrastructure as Code (Terraform preferred)
Experience with DevOps practices
Strong analytical and problem-solving skills
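To illustrate the star-schema modeling mentioned above: a central fact table holds numeric measures and foreign keys into surrounding dimension tables that hold descriptive attributes. A toy pure-Python sketch (table and column names are hypothetical) of that structure and a typical fact-to-dimension join:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DimCustomer:
    # Dimension table: descriptive attributes, one row per entity.
    customer_key: int
    name: str
    region: str

@dataclass(frozen=True)
class FactSale:
    # Fact table: numeric measures plus foreign keys into dimensions.
    customer_key: int
    date_key: int
    amount: float

customers = {1: DimCustomer(1, "Acme", "EU"), 2: DimCustomer(2, "Beta", "US")}
sales = [
    FactSale(1, 20240101, 100.0),
    FactSale(2, 20240101, 50.0),
    FactSale(1, 20240102, 25.0),
]

# A star-schema query joins the fact table to a dimension and aggregates a measure.
revenue_by_region = {}
for sale in sales:
    region = customers[sale.customer_key].region
    revenue_by_region[region] = revenue_by_region.get(region, 0.0) + sale.amount
print(revenue_by_region)  # {'EU': 125.0, 'US': 50.0}
```

In BigQuery the same join-and-aggregate would be a SQL GROUP BY over a fact table joined to its dimension tables; the in-memory version above only illustrates the schema shape.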
Qualifications
Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.
7–8 years of experience in data engineering.
At least 3 years of hands-on experience on Google Cloud Platform.
GCP certification (Professional Data Engineer) preferred.
Good to Have
Experience with real-time streaming architectures.
Exposure to data governance tools.
Experience in multi-cloud environments.
Knowledge of ML pipelines on GCP.