Engineer - Data Engineering
Role details
Job description
- Designing and developing data solutions for one of the world's largest corporations involved in the marketing and distribution of food products
- Implementing distributed, highly available data processing applications that scale to enterprise demands on cloud services such as GCP
- Designing, developing, and maintaining data pipelines that integrate multiple source systems and targets
- Ensuring high code quality by following software engineering best practices
- Working collaboratively in a cross-functional team in an Agile delivery environment
- Adhering to DevOps principles and being involved in projects throughout the full software lifecycle: development, QA, deployment, and post-production support
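The source-to-target pipeline work described above can be sketched, in greatly simplified form, with nothing but the Python standard library. All table and field names below are hypothetical, chosen only to illustrate the extract/transform/load stages:

```python
# Minimal ETL sketch (hypothetical names): extract rows from a CSV source,
# apply a simple transformation, and load them into a SQLite target table.
import csv
import io
import sqlite3

def extract(csv_text: str) -> list[dict]:
    """Extract: parse CSV rows coming from a source system."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: normalize product names and cast quantities to int."""
    return [(r["product"].strip().title(), int(r["qty"])) for r in rows]

def load(records: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: write records into the target table; return the row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (product TEXT, qty INTEGER)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", records)
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

source = "product,qty\n apples ,3\nbananas,5\n"
with sqlite3.connect(":memory:") as conn:
    count = load(transform(extract(source)), conn)
print(count)  # 2
```

In production these stages would be orchestrated by a scheduler such as Airflow or Cloud Composer rather than called inline, but the extract/transform/load separation stays the same.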
Requirements
- A Bachelor's degree in Computer Science or equivalent, and 1-2 years of experience developing enterprise-grade data processing applications
- A strong programming background in data operations (Python, Shell, SQL)
- Experience processing large volumes of data
- Hands-on experience with relational/NoSQL databases and distributed storage engines
- Hands-on experience in ETL/ELT design and development using tools such as Airflow, Google Cloud services (Cloud Composer, Cloud Dataflow, Cloud Dataproc), AWS services (Data Pipeline, Glue, Lambda, EMR), Spark, and Hive
- Hands-on experience with Informatica tools (PowerCenter) for enterprise data integration, workflow orchestration, and data transformation
- Experience working with streaming data (using tools such as Pub/Sub, Kafka, Storm, or Spark) will be an added advantage
- Hands-on experience with Google Cloud Platform (GCP) services and serverless functions (e.g., BigQuery, Datastream, Dataflow, Pub/Sub, Cloud Functions, Cloud Run, Cloud Composer) for data processing and orchestration will be an added advantage
- Experience working in a Scrum Agile delivery environment and with DevOps practices
- Experience with code management and CI/CD tools such as GitHub, GitLab, and Jenkins
- A strong desire to keep growing your skill set
- Strong, persuasive communication skills
- Experience with application monitoring tools (Datadog or equivalent) will be an added advantage
- A passion for building and maintaining data solutions (data warehouses, data marts, data lakes, data mesh) will be an added advantage
Benefits & conditions
- US dollar-linked compensation
- Performance-based annual bonus
- Performance rewards and recognition
- Agile Benefits - special allowances for Health, Wellness & Academic purposes
- Paid birthday leave
- Team engagement allowance
- Comprehensive Health & Life Insurance Cover, extendable to parents and in-laws
- Overseas travel opportunities and exposure to client environments
- Hybrid work arrangement
Sysco LABS is an Equal Opportunity Employer.