Data Engineer - GCP & BigQuery Expert
- Jerusalem
- Technology consulting
We are KPMG’s technology arm in Israel. KPMG harnesses emerging technologies and scientific breakthroughs to craft solutions, projects, and products for companies facing complex business challenges in today’s continuously changing world. By uniting groundbreaking technology with industry expertise, we apply the potential of cloud, AI, ML, digital, and cyber to design and implement top-of-the-line tailored solutions.
About the job
We are looking for an experienced and hands-on Data Engineer to lead the migration of enterprise data platforms to Google Cloud Platform (GCP).
In this role, you will design, build, and maintain scalable ETL/ELT pipelines, develop advanced data models in BigQuery, and contribute to the creation of a high-performance, reliable, and cost-efficient data architecture.
You will work closely with analysts, data scientists, and engineers, and have a real impact on how data is consumed across the organization.
What You Will Do
- Lead the migration of data from on-premise core systems to Google Cloud Platform (GCP).
- Design and develop processed data layers (Silver and Gold) and data marts in BigQuery, including complex business logic.
- Build, orchestrate and maintain data pipelines using Cloud Composer / Apache Airflow.
- Develop robust data transformations, including cleansing, enrichment and data quality improvements.
- Write efficient and optimized SQL queries in BigQuery with strong focus on performance and cost.
- Create and maintain clear and up-to-date technical documentation for data architecture and processes.
Requirements
- 3+ years of hands-on experience as a Data Engineer.
- Strong experience working with Google Cloud Platform (GCP) - mandatory.
- Proven experience with BigQuery, including data modeling, complex SQL and performance optimization - mandatory.
- Strong Python skills for ETL/ELT and data transformations.
- Experience with orchestration and workflow management tools such as Cloud Composer, Apache Airflow or similar.
- Experience working with Cloud Storage (GCS) and additional GCP data services such as Cloud SQL, Data Lakes and storage solutions.
Nice to Have
- Experience with GCP streaming technologies such as Cloud Pub/Sub and Dataflow.
- Familiarity with Git and CI/CD processes.
- Previous experience migrating data from legacy systems such as Mainframe or Oracle to the cloud.
Personal Skills
- Ability to work independently and lead projects end-to-end.
- Proactive mindset with strong technical curiosity and continuous learning attitude.
- Strong collaboration skills and ability to work with cross-functional teams.
The position is open for all genders as well as people with disabilities.