Global IT Corporate Applications Big Data Engineer
Job description
The purpose of this role is to design, build, and maintain robust data processing pipelines and scalable storage systems for large and complex datasets. You will oversee the overall health, performance, and evolution of our global data platform architecture, while driving innovation in analytics and supporting the delivery of high-quality, data-driven solutions for the business.

Platform, Architecture & Engineering
- Design, build, and maintain robust data processing pipelines and scalable storage systems for large and complex datasets.
- Oversee the overall health, performance, and evolution of the global data platform architecture.
- Diagnose and resolve technical issues across big data systems, identifying root causes and recommending improvements.
- Evaluate emerging technologies (AI, ML, NLP, CV, GenAI, data engineering frameworks, etc.) and assess their applicability within Avolta's ecosystem.
- Create prototypes and proofs of concept for new data products and advanced analytics use cases.
Advanced Analytics & Collaboration
- Collaborate closely with Global Data Scientists, BI & Integration & AI Managers, and other stakeholders to design analytical solutions that address key business challenges.
- Support Data Scientists in developing algorithms, models, and experimentation workflows to extract insights from large datasets.
- Identify new analytical trends, data needs, and opportunities for improving business processes using modern data and AI technologies.

Data Quality, Observability, Governance & Review
- Perform quality assurance testing to ensure data products and analytical models are accurate, reliable, and aligned with business logic.
- Conduct peer reviews of engineers' work to ensure alignment with industry best practices, coding standards, and architectural guidelines.
- Collaborate with data sourcing teams to develop, test, implement, and optimize analytical models and data transformations.
- Partner with platform and infrastructure teams to integrate observability tools (log analytics, APM, pipeline observability, data quality monitors) and automate remediation where possible.
- Establish standards for instrumentation (at the pipeline, job, and dataset level), error budgets, incident runbooks, and post-mortems to drive continuous improvement.
- Maintain continuous monitoring of data platform systems and perform proactive maintenance to ensure availability, stability, data quality, and cost efficiency.

Data Discovery & Complex Analysis
- Lead data discovery activities, working with business teams to understand data sources, address inconsistencies, and validate business logic.
- Explore complex datasets to uncover hidden relationships, patterns, and insights that support data-driven decision making.
- Perform technical, quantitative, statistical, and operational analysis to support projects and business initiatives.
Delivery & Consumption
- Ensure high-quality data products are delivered through various consumption modes, including dashboards, reports, APIs, AI agents, and application integrations.
- Ensure data products are easily discoverable and consumable, supporting self-service analytics and integration into AI/GenAI use cases.
Requirements
- Strong experience designing and modeling big data products and relational/analytical data models (data modeling is mandatory).
- Proficiency in SQL, Python, and DAX for data engineering and analytics.
- Hands-on experience with Databricks and Power BI is mandatory.
- Experience with Kafka and orchestration tools such as Azure Data Factory is a plus.
- Solid understanding of the Big Data ecosystem, Azure cloud services, and modern data engineering trends.
- Knowledge of GenAI, AI/ML modeling, and MLOps/DataOps practices is an advantage.
Due to certain email system settings, some of our messages may occasionally land in your junk or spam folder. To ensure you don't miss any important updates regarding your application, please check these folders regularly and mark our emails as 'Not Spam' if needed.