Data Engineer (Data Science)
Job description
The Analyst Expert is responsible for placing data at the heart of our operations, conducting cross-analysis of complex data to monitor and optimize the performance of clients' marketing strategies.

In this position, you'll play a vital role in delivering a wide variety of projects for our clients and internal teams. You'll be responsible for creating solutions to a range of problems, from bringing data together from multiple sources into centralised datasets to building predictive models that drive optimisation of our clients' digital marketing. We are a small, highly collaborative team, and we value cloud-agnostic technical fundamentals and self-sufficiency over specific platform expertise. The following responsibilities reflect the skills needed to contribute immediately and integrate smoothly with our existing workflow.

- Build and maintain data pipelines to integrate marketing platform APIs (Google Ads, Meta, TikTok, etc.) with cloud data warehouses, including custom API development where platform connectors are unavailable
- Develop and optimize SQL queries and data transformations in BigQuery and AWS to aggregate campaign performance data, customer behavior metrics, and attribution models for reporting and analysis
- Design and implement data models that combine first-party customer data with marketing performance data to enable cross-channel analysis and audience segmentation
- Deploy containerized data solutions using Docker and Cloud Run, ensuring pipelines run reliably at scale with appropriate error handling and monitoring
- Implement statistical techniques such as time series forecasting, propensity modeling, or multi-touch attribution to build predictive models for client campaign optimization
- Develop, test, and deploy machine learning models into production environments with MLOps best practices, including versioning, monitoring, and automated retraining workflows
- Translate client briefs and business stakeholder requirements into detailed technical specifications, delivery plans, and accurate time estimates
- Configure and maintain CI/CD pipelines in Azure DevOps to automate testing, deployment, and infrastructure provisioning for data and ML projects
- Create clear technical documentation, including architecture diagrams, data dictionaries, and implementation guides, to enable team knowledge sharing and project handovers
- Participate actively in code reviews, providing constructive feedback on SQL queries, Python code, and infrastructure configurations to maintain team code quality standards
- Provide technical consultation to clients on topics such as data architecture design, measurement strategy, and the feasibility of proposed ML applications
- Support Analytics and Business Intelligence teams by creating reusable data assets, troubleshooting data quality issues, and building datasets that enable self-service reporting
- Train and mentor junior team members through pair programming, code review feedback, and guided project work on data engineering and ML workflows
- Implement workflow orchestration using tools like Kubeflow to coordinate complex multi-step data pipelines with appropriate dependency management and retry logic
- Stay current with developments in cloud data platforms, digital marketing measurement, and ML techniques relevant to performance marketing optimization
- Identify and implement improvements to team infrastructure, development workflows, and data quality processes
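To give candidates a feel for the transformation work described above, here is a minimal sketch of rolling raw per-ad records up to daily channel totals. The field names, channel labels, and function name are illustrative assumptions, not part of any client schema.

```python
from collections import defaultdict

def aggregate_daily_performance(rows):
    """Roll raw per-ad rows up to (date, channel) totals.

    Each row is a dict with 'date', 'channel', 'spend', and 'clicks' --
    the shape a marketing-platform API connector might emit.
    """
    totals = defaultdict(lambda: {"spend": 0.0, "clicks": 0})
    for row in rows:
        key = (row["date"], row["channel"])
        totals[key]["spend"] += row["spend"]
        totals[key]["clicks"] += row["clicks"]
    # Derive cost-per-click, guarding against division by zero.
    return {
        key: {**agg, "cpc": agg["spend"] / agg["clicks"] if agg["clicks"] else None}
        for key, agg in totals.items()
    }

rows = [
    {"date": "2024-05-01", "channel": "google_ads", "spend": 120.0, "clicks": 60},
    {"date": "2024-05-01", "channel": "google_ads", "spend": 80.0, "clicks": 40},
    {"date": "2024-05-01", "channel": "meta", "spend": 50.0, "clicks": 0},
]
report = aggregate_daily_performance(rows)
```

In practice this aggregation would live in a BigQuery SQL transformation; the Python version simply shows the grouping and null-safe ratio logic in a testable form.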
Requirements
Do you have experience in SQL?

- Expert-level proficiency in Python for building robust APIs, scripting, and maintaining complex data/ML codebases.
- Strong SQL expertise and deep familiarity with data warehousing concepts relevant to tools like BigQuery.
- Practical experience with Docker and a firm grasp of Linux for managing local devcontainers, servers, and Cloud Run deployments.
- Advanced Git proficiency and active experience participating in PR reviews to maintain code quality.
- Solid understanding of CI/CD principles and practical experience defining or managing pipelines, preferably using a tool like Azure DevOps.
- Proven ability to quickly read, understand, and apply technical documentation, translating broad business requirements into precise technical specifications.
- Excellent written and verbal communication skills for proactive knowledge sharing, constructive PR feedback, daily standups, and process documentation.
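The "robust" pipeline code these requirements describe typically means defensive handling of flaky external calls. As one hedged illustration, here is a simple retry helper with linear backoff; the function names and the simulated failure are hypothetical, not a real platform client.

```python
import time

def with_retries(fn, attempts=3, base_delay=0.0):
    """Call fn, retrying on any exception with simple linear backoff.

    A common pattern for transient marketing-API failures; this is a
    sketch, not a substitute for a library like tenacity.
    """
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == attempts:
                raise  # out of attempts: surface the original error
            time.sleep(base_delay * attempt)

# Simulated flaky call: fails twice, then succeeds.
calls = {"n": 0}
def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return {"status": "ok"}

result = with_retries(flaky_fetch, attempts=3)
```

Production pipelines would usually log each failure and restrict the `except` clause to retryable error types rather than bare `Exception`.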
Beneficial skills and experience to have:

- Hands-on experience with any major cloud ML platform, focusing on MLOps workflow patterns.
- Practical experience with batch or stream processing tools like GCP Dataflow, or frameworks like Apache Beam.
- Familiarity with Python ML frameworks or data modeling tools like Dataform/dbt.
- Familiarity with the structure and core offerings of GCP or AWS.