Senior Data Engineer
Ixceed
21 days ago
Role details
Contract type: Temporary contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Experience level: Senior
Job location:
Tech stack
Agile Methodologies
Test Automation
Big Data
Cloud Computing
Continuous Integration
Data Validation
Information Engineering
Data Governance
ETL
Data Profiling
Hadoop
Python
Metadata Management
NoSQL
Scrum
Regression Testing
SQL Databases
Workflow Management Systems
Snowflake
Spark
Kafka
Apache NiFi
Data Management
Data Pipelines
Databricks
Job description
- Design, build, and maintain scalable data pipelines using Snowflake, Hadoop, Spark, NiFi, and related big data technologies (see the sketch after this list).
- Implement data architectures and optimize workflows for massive financial datasets.
- Write high-quality, maintainable code in Python and SQL following best practices.
- Integrate data governance principles, metadata management, and lineage tracking into solutions.
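To make the pipeline responsibilities above a little more concrete, here is a minimal PySpark batch sketch. The dataset paths, column names, and cleansing rules are illustrative assumptions, not details of this role's actual systems.

```python
# Minimal PySpark batch pipeline sketch (illustrative only).
# Input/output paths and column names are assumptions; adjust to the real datasets.
from pyspark.sql import SparkSession, functions as F

def run_pipeline(input_path: str, output_path: str) -> None:
    spark = SparkSession.builder.appName("trades-curation").getOrCreate()

    # Extract: read the raw batch (Parquet assumed here).
    raw = spark.read.parquet(input_path)

    # Transform: basic cleansing and enrichment.
    curated = (
        raw.dropDuplicates(["trade_id"])
           .filter(F.col("amount").isNotNull())
           .withColumn("trade_date", F.to_date("trade_ts"))
    )

    # Load: write partitioned output for downstream consumers (e.g. Snowflake ingestion).
    curated.write.mode("overwrite").partitionBy("trade_date").parquet(output_path)

if __name__ == "__main__":
    run_pipeline("s3://bucket/raw/trades/", "s3://bucket/curated/trades/")
```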
Data Quality Assurance & Testing
- Develop automated testing frameworks and validation scripts for ETL processes and data transformations.
- Implement data quality checks, reconciliation processes, and regression testing suites to ensure accuracy, completeness, and timeliness.
- Perform unit, integration, and end-to-end testing for data pipelines and schema changes.
- Use tools such as dbt tests and custom Python utilities for automated validation (see the sketch below).
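As a hedged illustration of the custom Python validation utilities mentioned above (the table, column, and key names are invented for the example), a reconciliation-style check between a source extract and its loaded target could look like this:

```python
# Illustrative ETL validation utility (pandas-based sketch; names are assumptions).
import pandas as pd

def validate_load(source: pd.DataFrame, target: pd.DataFrame, key: str = "trade_id") -> list[str]:
    """Return a list of data quality failures found when reconciling source vs. target."""
    failures = []

    # Completeness: row counts must reconcile.
    if len(source) != len(target):
        failures.append(f"Row count mismatch: source={len(source)}, target={len(target)}")

    # Uniqueness: the business key must not be duplicated after load.
    if target[key].duplicated().any():
        failures.append(f"Duplicate keys found in target column '{key}'")

    # Accuracy: every key present in the source must arrive in the target.
    missing = set(source[key]) - set(target[key])
    if missing:
        failures.append(f"{len(missing)} source keys missing from target")

    return failures

if __name__ == "__main__":
    src = pd.DataFrame({"trade_id": [1, 2, 3]})
    tgt = pd.DataFrame({"trade_id": [1, 2]})
    print(validate_load(src, tgt))
```

In dbt, the uniqueness and completeness checks above would more typically be declared as generic schema tests (unique, not_null, relationships) on the model rather than hand-written code.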
Collaboration & Agile Delivery
- Work closely with Data Engineers, Product, and Data Science teams to embed testing into the development lifecycle.
- Participate in agile ceremonies (sprint planning, backlog refinement, retrospectives) with a focus on quality and delivery.
- Support production incident response with rapid data validation and root cause analysis.
Continuous Improvement
- Stay current with emerging data engineering and testing technologies.
- Contribute to team knowledge sharing, mentoring junior engineers, and improving technical standards.
- Shape best practices for data reliability, testing automation, and CI/CD integration.
Requirements
Core Technical Expertise
- Advanced SQL and experience with relational and NoSQL databases.
- Strong experience with Snowflake, Hadoop, Spark, Databricks, Kafka, and cloud data platforms.
- Proficiency in Python for both data engineering and test automation.
- Familiarity with orchestration tools and workflow management systems.
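The posting does not name a specific orchestrator; assuming Apache Airflow purely for illustration, a minimal DAG that chains an extract task into a validation task might look like the sketch below (the DAG id, schedule, and callables are invented):

```python
# Minimal Apache Airflow DAG sketch (Airflow itself is an assumption; the posting
# only asks for familiarity with workflow management systems in general).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_trades(**context):
    # Placeholder: pull the raw batch from the source system.
    print("extracting trades")

def validate_trades(**context):
    # Placeholder: run data quality checks on the extracted batch.
    print("validating trades")

with DAG(
    dag_id="trades_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ style; older releases use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_trades", python_callable=extract_trades)
    validate = PythonOperator(task_id="validate_trades", python_callable=validate_trades)

    extract >> validate  # validation runs only after extraction succeeds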
Testing & Quality
- Proven experience in data testing methodologies, ETL validation, and automated testing frameworks.
- Knowledge of data profiling, anomaly detection, and statistical validation techniques.
- Experience integrating testing into CI/CD pipelines.
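As a sketch of the statistical validation and CI integration points above, the example below flags an anomalous daily row count with a simple z-score rule and is written as a pytest test so it can run as a step in any CI pipeline; the history values and threshold are illustrative assumptions.

```python
# Simple z-score anomaly check on daily row counts, written as a pytest test so it
# can run inside a CI pipeline. History values and the threshold are illustrative.
import statistics

def is_row_count_anomalous(history: list[int], today: int, z_threshold: float = 3.0) -> bool:
    """Flag today's row count if it deviates more than z_threshold std devs from history."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return today != mean
    return abs(today - mean) / stdev > z_threshold

def test_daily_row_count_within_expected_range():
    # In practice the history would come from a metrics table, not a literal list.
    history = [10_120, 10_340, 9_980, 10_200, 10_410]
    today = 10_150
    assert not is_row_count_anomalous(history, today)
```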
Professional Attributes
- Strong problem-solving and analytical skills with attention to detail.
- Excellent communication skills for cross-functional collaboration.
- Ability to work independently and manage multiple priorities in fast-paced environments.
Required Experience
- Data Engineering: 5 years (required)
- Snowflake: 3 years (required)
- Hadoop: 3 years (required)
- Spark: 3 years (required)
- NiFi: 2 years (required)
- Python: 2 years (required)
- SQL: 2 years (required)
- Databricks: 3 years (required)
- Kafka: 1 year (required)
- Data testing methodologies: 3 years (required)
- CI/CD pipelines: 2 years (required)