Data Engineer
Job description
We are looking for a Data Engineer to scale our Data Team and help us improve our Data Platform & Data Integration processes. This role's mission will be to:
- Ensure we are able to keep all data in our Data Warehouse updated and accessible for transformation and analysis
- Partner with the Analytics Engineers and Data Analysts to support the requirements of the company in terms of reliability, quality and efficiency
- Develop and maintain data pipelines to extract data from different sources and integrate it in the Data Warehouse following best practices
What you will do
- Take charge of the required data processing while ensuring sustainable and organic growth of our data platform and infrastructure
- Keep our data infrastructure up to date and running like clockwork
- Integrate datasets from different sources
- Support our Data Analysts & Analytics Engineers to get the right data to build dashboards and complex analytical models
- Support Analytics Engineering teams in defining the best approaches for data modelling
- Use data to investigate and help resolve issues in our product or processes
- In collaboration with our Data Governance team, proactively suggest improvements to data security, reliability, efficiency and quality
- Champion a healthy data culture throughout the organisation
Requirements
- 3+ years of relevant experience as a Data Engineer, Analytics Engineer, Big Data Engineer, or similar role working with large-scale data systems
- Excellent communication skills, both written and spoken, in English
- You have mastered SQL, optimising queries for performance, scalability, and ease of maintenance
- You feel comfortable querying different types of databases (PostgreSQL, Redshift, Snowflake) and have knowledge of different AWS services
- You're accustomed to designing and implementing complex architectures, keeping an eye on their future evolution while accounting for the needs of multiple users
- You have experience building data pipelines using Python
- You have experience integrating data from multiple sources including databases, product tracking, and APIs. You get excited by seeing your jobs run like clockwork
- You have an instinct for automation
- You have the desire to work in an international multi-location environment with highly engaged individuals
Bonus points for…
- Experience with AWS Redshift or other distributed systems (Snowflake, BigQuery, Hadoop, Vertica, Exasol, etc.), plus basic DBA skills
- Experience with web analytics, web tracking, and event-based analytics tools
- Experience with workflow managers (Airflow, etc.)
- Experience with dbt
- Experience in queue and streaming systems (SNS, Kafka, Firehose…)
Benefits & conditions
- Competitive compensation including equity in the company
- Generous vacation days so you can rest and recharge
- Health perks such as private healthcare or gym allowance depending on your location
- "Flexible compensation plan" to help you diversify and increase your net salary
- Unforgettable Perk events, including travel to one of our hubs
- A mental health support tool for your wellbeing
- Exponential growth opportunities
Our Vision is for a world where Perk serves as the platform for human connection in-real-life (IRL). We take an IRL-first approach to work, where our team works together in-person 3 days a week. As such, this role requires you to be based within commuting distance of our Barcelona hub. We fundamentally believe in the value of meeting in-real-life to improve connectivity, productivity, creativity and ultimately making us a great place to work.