Senior Data Platform Engineer (Remote)
Crystal Intelligence
Barcelona, Spain
19 days ago
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Experience level: Senior
Job location: Remote, Barcelona, Spain
Tech stack
Java
API
Unit Testing
Big Data
Code Review
Databases
Information Engineering
Data Infrastructure
Data Warehousing
Database Queries
Software Debugging
Distributed Systems
Python
Open Source Technology
POSIX
Ansible
Blockchain
Standard SQL
Software Engineering
Data Processing
Flask
Spark
Backend
FastAPI
Data Lake
Kubernetes
Apache Flink
Dask
Kafka
Terraform
Data Pipelines
Docker
Job description
We are looking for a talented Senior Data Platform Engineer to join our Data Engineering team and participate in the development and maintenance of data pipelines for blockchain data processing. This is a remote role, and we are flexible in considering applications from anywhere in Europe.
Responsibilities:
- Active participation in the development and maintenance of our data pipelines and backend services;
- Integration of blockchains, Automated Market Maker (AMM) protocols, and bridges within Crystal's platform;
- Integration of new technologies into our processes and tools;
- End-to-end feature design and implementation;
- Coding, debugging, testing, and delivering features and improvements on a continuous basis;
- Provide code review, assistance, and feedback for other team members.
Requirements
- 8+ years of experience in software engineering, developing backend services/APIs or data pipelines;
- Strong knowledge and experience with Python;
- Advanced knowledge of SQL: the ability to write, understand, and debug complex queries;
- Strong grasp of database architecture principles;
- Strong experience with data lakes, data warehouses, relational and analytical databases;
- POSIX/Unix/Linux ecosystem knowledge;
- Blockchain tech knowledge or willingness to learn;
- Experience with unit testing principles;
- Experience with Docker containers and proven ability to migrate existing services;
- Independent and autonomous way of working;
- Team-oriented work and good communication skills are an asset.
Would be a plus:
- Practical experience with big data frameworks (Kafka, Spark, Dask, Flink) and data lakes;
- Experience with Kubernetes and Infrastructure as Code tools such as Terraform and Ansible;
- Experience with API frameworks such as Flask or FastAPI;
- Experience with distributed systems;
- Experience with open-source solutions;
- Experience with Java or willingness to learn.