Data Engineer (Lambda)
Job description
- Track token balances and DeFi positions across multiple chains
- Analyze historical and real-time rewards
- Accurately calculate PnL and uncover hidden costs (e.g., slippage, rebalancing, fees)
- Compare strategies and pools across protocols with confidence
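To make the PnL bullet concrete, here is a minimal, hypothetical sketch of netting out the hidden costs the posting mentions; the `Position` fields and `net_pnl` function are illustrative assumptions, not Lambda's actual model.

```python
from dataclasses import dataclass

@dataclass
class Position:
    """A simplified DeFi position snapshot (illustrative only)."""
    entry_value_usd: float
    exit_value_usd: float
    rewards_usd: float
    gas_fees_usd: float
    slippage_usd: float
    rebalancing_cost_usd: float

def net_pnl(p: Position) -> float:
    """Gross PnL plus rewards, minus the hidden costs (fees, slippage, rebalancing)."""
    gross = p.exit_value_usd - p.entry_value_usd
    costs = p.gas_fees_usd + p.slippage_usd + p.rebalancing_cost_usd
    return gross + p.rewards_usd - costs

pos = Position(entry_value_usd=10_000.0, exit_value_usd=10_600.0,
               rewards_usd=150.0, gas_fees_usd=40.0,
               slippage_usd=25.0, rebalancing_cost_usd=35.0)
print(net_pnl(pos))  # 650.0
```

A position that looks 600 USD up gross yields 650 USD net here only because rewards outweigh the 100 USD of hidden costs; surfacing that gap is the point of the analysis.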
Our mission is to make crypto data transparent, reliable, and actionable, reducing the time to generate accurate performance reports from weeks to just a few hours. We're a fast-moving startup with a strong technical culture, building the backbone of crypto data infrastructure.
What You'll Do
- Design, maintain, and scale streaming ETL pipelines for blockchain data.
- Build and optimize ClickHouse data models and materialized views for high-performance analytics.
- Develop and maintain data exporters using orchestration tools.
- Implement data transformations and decoding logic.
- Establish and improve testing, monitoring, automation, and migration processes for pipelines.
- Ensure timely delivery of new data features in alignment with product goals.
- Combine multiple data sources, such as third-party indexers and Kafka topics, into aggregated tables that back our API.
- Build automation tools that keep data-analyst inputs, such as dictionaries, up to date.
- Collaborate within the team to deliver accurate, reliable, and scalable data services that power the Lambda app.
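As one hedged illustration of the "decoding logic" bullet above, the sketch below decodes an ERC-20 `Transfer(address indexed from, address indexed to, uint256 value)` log using only the standard ABI layout; the sample log values are fabricated for the example.

```python
# keccak256("Transfer(address,address,uint256)") -- the standard ERC-20 topic hash
TRANSFER_TOPIC = "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"

def decode_transfer(topics: list[str], data: str) -> dict:
    """Decode an ERC-20 Transfer log into a flat record (illustrative sketch)."""
    assert topics[0] == TRANSFER_TOPIC, "not a Transfer event"
    return {
        # indexed address args are right-aligned inside 32-byte topics
        "from": "0x" + topics[1][-40:],
        "to": "0x" + topics[2][-40:],
        # the non-indexed uint256 value lives in the data field
        "value": int(data, 16),
    }

# Fabricated sample log for demonstration only
log = {
    "topics": [
        TRANSFER_TOPIC,
        "0x000000000000000000000000a9059cbb0000000000000000000000000000dead",
        "0x000000000000000000000000000000000000000000000000000000000000beef",
    ],
    "data": "0x00000000000000000000000000000000000000000000000000000000000003e8",
}
print(decode_transfer(log["topics"], log["data"]))
```

In production this kind of decoder would typically run inside a Flink job consuming raw log events from Kafka, emitting flat rows suited to ClickHouse tables.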
Requirements
- Streaming & ETL: Managed Flink-based pipelines (real-time event & transaction processing), Apache Kafka
- Data Warehouse: ClickHouse (Cloud)
- Workflow orchestration: Airflow
- Programming: Python (data processing, services, automation)
- Domain: Multi-chain crypto data (EVM & non-EVM ecosystems)
- 4+ years in Data Engineering (ETL/ELT, data pipelines, streaming systems).
What We're Looking For
- Strong SQL skills with columnar databases (ClickHouse, Druid, BigQuery, etc.).
- Hands-on streaming frameworks experience (Flink, Kafka, or similar).
- Solid Python skills for data engineering and backend services.
- Proven track record of delivering pipelines and features to production on schedule.
- Strong focus on automation, reliability, maintainability, and documentation.
- Startup mindset: balancing speed with quality.
Nice to Have
- Experience operating ClickHouse at scale (performance tuning, partitioning, materialized views)
- Experience with CI/CD and automated testing for data pipelines (e.g., GitHub Actions, dbt)
- Knowledge of multi-chain ecosystems (EVM & non-EVM)
- Familiarity with blockchain/crypto data structures (transactions, logs, ABI decoding)
- Contributions to open-source or blockchain data infrastructure projects
Benefits & conditions
At P2P.org, we have a team of experts with a unique approach and a culture of ownership. Together we gain experience and make dreams come true!
- Fully remote
- Full-time contractor (Indefinite-term Consultancy Agreement)
- Competitive salary level in $ (we can also pay in crypto)
- Paid vacation and sick leave
- Well-being program
- Mental health care program
- Compensation for education, including foreign language & professional growth courses
- Equipment & co-working reimbursement program
- Overseas conferences, community immersion