GCP Data Engineer
Xurpas
90 - 120K PHP
Contract
N/A
Data Engineering, Data Warehousing, Google Cloud Platform, BigQuery, Cloud Composer, Python, Star/Snowflake schemas, Security protocols
Responsibilities:
- Design, build, code, and support a variety of data pipelines and models on GCP cloud technology.
- Apply strong hands-on expertise with GCP services such as BigQuery and Cloud Composer.
- Partner with business/data analysts, architects, and other key project stakeholders to deliver data requirements.
- Develop data integration and ETL (Extract, Transform, Load) processes.
- Support existing data warehouses and related pipelines.
- Ensure data quality, security, and compliance.
- Optimize data processing and storage efficiency; troubleshoot issues in the data space.
- Learn new skills and tools used in the data space (e.g., dbt, Monte Carlo).
- Communicate excellently, both verbally and in writing; apply strong analytical skills with an Agile mindset.
- Demonstrate strong attention to detail and delivery accuracy.
- Be a self-motivated team player with the ability to overcome challenges and achieve desired results.
- Work effectively in a globally distributed environment.
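The ETL and data-quality responsibilities above can be sketched in plain Python. This is a minimal, hypothetical extract-transform-load step (all names, fields, and sources are illustrative, not from the posting; a real pipeline would read from a source system and load into BigQuery):

```python
from datetime import date

def extract():
    # Stand-in for reading from a source system (e.g., an API or a GCS export).
    return [
        {"order_id": "1001", "amount": "250.00", "order_date": "2024-05-01"},
        {"order_id": "1002", "amount": "99.50", "order_date": "2024-05-02"},
    ]

def transform(rows):
    # Cast types and enforce a simple data-quality rule: amounts must be positive.
    out = []
    for row in rows:
        amount = float(row["amount"])
        if amount <= 0:
            raise ValueError(f"bad amount in order {row['order_id']}")
        out.append({
            "order_id": int(row["order_id"]),
            "amount": amount,
            "order_date": date.fromisoformat(row["order_date"]),
        })
    return out

def load(rows, sink):
    # Stand-in for a BigQuery load job; here rows are appended to an in-memory sink.
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

In practice the `load` step would be replaced by a warehouse client call, and the whole function would run as a scheduled task in an orchestrator such as Cloud Composer.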
Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Technology, or a related quantitative field.
- 3–5+ years of hands-on experience in Data Engineering or Data Warehousing roles.
- At least 2 years specifically focused on Google Cloud Platform (GCP).
- Expert-level knowledge of BigQuery (SQL, optimization, partitioning) and Cloud Composer (Apache Airflow) for orchestration.
- Proficiency in Python (specifically for ETL and Airflow DAGs) and advanced SQL.
- Strong experience in designing Star/Snowflake schemas and dimensional modeling.
- Proven track record of building and maintaining scalable data pipelines and managing data integration from diverse sources.
- Experience implementing data validation, cleansing, and security protocols (IAM, encryption).
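The star-schema and dimensional-modeling skills listed above can be illustrated with a small sketch: flat sales records are split into a dimension table keyed by surrogate IDs and a fact table referencing them (the table layout and field names are hypothetical examples, not from the posting):

```python
def build_star_schema(records):
    """Split flat records into a customer dimension and a sales fact table."""
    dim_customer = {}   # natural key (email) -> surrogate key
    dim_rows = []
    fact_sales = []
    for rec in records:
        nk = rec["customer_email"]
        if nk not in dim_customer:
            sk = len(dim_customer) + 1          # simple sequential surrogate key
            dim_customer[nk] = sk
            dim_rows.append({
                "customer_sk": sk,
                "email": nk,
                "name": rec["customer_name"],
            })
        fact_sales.append({
            "customer_sk": dim_customer[nk],    # foreign key into the dimension
            "amount": rec["amount"],
        })
    return dim_rows, fact_sales

records = [
    {"customer_email": "a@x.com", "customer_name": "Ana", "amount": 10.0},
    {"customer_email": "b@x.com", "customer_name": "Ben", "amount": 5.0},
    {"customer_email": "a@x.com", "customer_name": "Ana", "amount": 7.5},
]
dims, facts = build_star_schema(records)
```

Repeat customers collapse into a single dimension row, while every sale lands in the fact table; this separation of descriptive attributes from measures is the core idea behind the star schema.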