Data Engineer
Job description
We are looking for a driven Data Engineer who can bridge the gap between complex data management and robust software engineering: someone who not only executes tasks but brings new perspectives to the table, helping to evolve our large-scale application landscape that processes millions of records.
As a Data Engineer, you are responsible for the end-to-end development and implementation of data pipelines and streaming-based integrations. You will take a lead role in designing how we consume data from diverse sources, applying complex transformations, and ensuring seamless delivery to target systems within the Azure cloud (a minimal sketch of such a pipeline follows the list below).
- Develop and perform peer reviews for data solutions deployed in the Azure cloud.
- Master and implement technologies including ADLS Gen2, Databricks, and Apache Kafka.
- Act as a core member of a diverse Agile/Scrum DevOps team, taking full ownership of managing and maintaining mission-critical functionalities.
- Architect and maintain a state-of-the-art Data Lake solution.
- Integrate time-critical fraud detection applications with multiple downstream systems via real-time and streaming interfaces (see the streaming sketch after this list).
- Mentor junior team members and address the high-level data needs of various business consumers.
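To make the pipeline responsibility concrete, here is a minimal PySpark sketch of the consume, transform, deliver pattern described above. It assumes a Databricks-style Spark environment with Delta Lake available; the ADLS Gen2 paths, the storage account, and all column names are hypothetical placeholders, not our actual pipeline code.

```python
# Minimal sketch: consume raw records, apply transformations, deliver to a curated zone.
# All abfss:// paths, the storage account "examplelake", and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Consume: raw JSON records landed in an ADLS Gen2 container.
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/orders/")

# Transform: deduplicate, derive a partition column, filter, and aggregate.
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
       .groupBy("order_date", "customer_id")
       .agg(F.sum("amount").alias("daily_amount"))
)

# Deliver: write to a curated zone as Delta, partitioned for downstream consumers.
(curated.write
        .format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .save("abfss://curated@examplelake.dfs.core.windows.net/orders_daily/"))
```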
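For the real-time fraud detection integration, a minimal Spark Structured Streaming sketch that reads from Apache Kafka and delivers to a downstream Delta sink might look as follows. The broker address, topic name, event schema, risk threshold, and checkpoint/sink paths are all hypothetical placeholders, and the spark-sql-kafka connector is assumed to be available on the cluster.

```python
# Minimal sketch: consume fraud-scoring events from Kafka, deliver high-risk events downstream.
# Broker, topic, schema, threshold, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("fraud_events_stream").getOrCreate()

event_schema = StructType([
    StructField("transaction_id", StringType()),
    StructField("risk_score", DoubleType()),
    StructField("event_ts", TimestampType()),
])

events = (
    spark.readStream
         .format("kafka")
         .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
         .option("subscribe", "fraud-scores")                 # placeholder topic
         .load()
         .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
         .select("e.*")
)

# Deliver: append high-risk events to a downstream Delta table with checkpointing.
query = (
    events.filter(F.col("risk_score") > 0.9)
          .writeStream
          .format("delta")
          .option("checkpointLocation", "/mnt/checkpoints/fraud_scores")  # placeholder
          .outputMode("append")
          .start("/mnt/curated/high_risk_events")                         # placeholder
)
```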
Who are you?
You are a technical specialist who lives and breathes data engineering. You stay ahead of the curve regarding the latest technology trends and thrive in a hybrid, flexible working environment that rewards proactivity and a "challenge-accepted" mindset.
Requirements
Do you have experience in Scrum?
- Big Data Mastery: Proven experience with PySpark and Databricks in large-scale environments.
- Programming Excellence: Strong command of Python and SQL for complex data manipulation.
- Cloud Infrastructure: Professional experience with ADLS Gen2 and Azure-based components.
- Agile Leadership: Deep understanding of Scrum/Agile principles and a DevOps mindset.
Profile
- Cloud Ecosystem: Strong exposure to Microsoft Azure (DevOps, Boards, Git, Pipelines) or AWS.
- Data Architecture: Solid understanding of data management, including data retention, quality control, metadata management, and data lineage.
- Automation: Proficiency in, or a strong drive to implement, workflow automation via Azure Data Factory, Databricks Workflows, or Apache Airflow (a minimal Airflow sketch follows this list).
- Soft Skills: Professional English proficiency; a persuasive communicator who can align stakeholders and lead technical discussions.
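As an illustration of the workflow automation mentioned above, a minimal Apache Airflow sketch that schedules an existing Databricks job could look like this. It assumes Airflow 2.x with the Databricks provider installed; the DAG id, schedule, connection id, and job id are hypothetical placeholders.

```python
# Minimal sketch: schedule a daily Databricks job run from Airflow.
# DAG id, schedule, connection id, and job_id are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Trigger an existing Databricks job that runs the transformation notebooks.
    run_curation = DatabricksRunNowOperator(
        task_id="run_orders_curation",
        databricks_conn_id="databricks_default",
        job_id=12345,  # placeholder Databricks job id
    )
```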