GCP data engineer
Job description
We are looking for a highly skilled Data Engineer with strong expertise in Snowflake and Google Cloud Platform (GCP) to design, build, and optimize scalable data platforms and analytics solutions. The role involves developing robust data pipelines, managing cloud data warehouses, and enabling high-performance analytics for business and reporting needs.

Key responsibilities

Snowflake data engineering
- Design, develop, and maintain Snowflake data warehouse solutions
- Implement and optimize Snowflake objects, including databases, schemas, tables, views, and stages
- Develop and manage Snowflake SQL, stored procedures, tasks, and streams
- Optimize query performance, storage, and compute usage
- Implement data sharing, security roles, and access controls in Snowflake
- Support data modeling for analytical and reporting use cases

GCP data engineering
- Design and build end-to-end data pipelines on Google Cloud Platform
- Develop ETL/ELT pipelines using BigQuery, Cloud Storage, and Dataflow/Dataproc
- Integrate data from multiple sources (applications, APIs, files, streaming sources)
- Ensure scalability, reliability, and cost optimization of cloud data solutions
- Apply best practices for data governance, security, and compliance on GCP

Data integration & modeling
- Perform data ingestion, transformation, and validation
- Design dimensional and analytical data models for reporting and BI
- Handle structured and semi-structured data (CSV, JSON, Parquet, etc.)
- Ensure data quality checks, reconciliation, and monitoring

Collaboration & delivery
- Work closely with analytics, reporting, and business teams to understand data requirements
- Support UAT, production deployments, and ongoing enhancements
- Document data pipelines, models, and technical designs
- Participate in Agile ceremonies and sprint-based delivery
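As a rough illustration of the data quality and reconciliation duties above, a minimal Python sketch might look like the following. All names, fields, and thresholds here are hypothetical examples, not taken from the posting:

```python
import csv
import io

def validate_rows(rows, required_fields):
    """Split rows into valid and rejected based on required, non-empty fields."""
    valid, rejected = [], []
    for row in rows:
        if all(row.get(f) not in (None, "") for f in required_fields):
            valid.append(row)
        else:
            rejected.append(row)
    return valid, rejected

def reconcile(source_count, loaded_count):
    """Simple source-vs-target row-count reconciliation check."""
    return {
        "source": source_count,
        "loaded": loaded_count,
        "missing": source_count - loaded_count,
        "ok": source_count == loaded_count,
    }

# Example: ingest a small CSV extract, validate it, and reconcile counts.
raw = "id,email\n1,a@example.com\n2,\n3,c@example.com\n"
rows = list(csv.DictReader(io.StringIO(raw)))
valid, rejected = validate_rows(rows, ["id", "email"])
report = reconcile(source_count=len(rows), loaded_count=len(valid))
# Row 2 is rejected for its empty email, so the reconciliation flags one missing row.
```

In practice, checks like these would run inside the pipeline (e.g., an Airflow/Cloud Composer task) with the report written to a monitoring table rather than computed in-line.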
Requirements
Technical skills
- Strong hands-on experience with Snowflake
- Strong hands-on experience with Google Cloud Platform (GCP): BigQuery, Cloud Storage, Dataflow/Dataproc
- Advanced SQL (performance tuning, complex queries)
- Experience with ETL/ELT frameworks
- Data modeling experience (dimensional/analytical)
- Experience with version control tools (Git)

Good to have
- Python (or similar) for data processing and automation
- Experience with orchestration tools (e.g., Airflow/Cloud Composer)
- Experience working with BI tools (Looker, Tableau, Power BI, Qlik)
- Exposure to CI/CD for data pipelines
- Healthcare, financial, or large enterprise data platform experience

Diverse Lynx LLC is an Equal Employment Opportunity employer. All qualified applicants will receive due consideration for employment without any discrimination. All applicants will be evaluated solely on the basis of their ability, competence, and proven capability to perform the functions outlined in the corresponding role. We promote and support a diverse workforce across all levels in the company.