Lead Data Engineer
Job description
Permira is seeking a Lead Data Engineer to join our Data Team based in Madrid, Spain. The Data Team was established in 2022 to roll out a comprehensive Data Strategy across the Permira Business and implement the necessary Data Platform to support all data-driven processes and analytics.
As the Lead Data Engineer, you will be responsible for a dynamic and constantly evolving data environment, designing, implementing, and operating scalable and robust cloud-based data pipelines, interfaces, and models. You will ensure that internal and external data is captured, validated, consolidated, and made available to analytics users reliably and on time.
You will work closely with data architects, data analytics experts, and visualization specialists to ensure the implementation and operation of the data flows that fulfil business requirements. As a problem solver, you will assume technical leadership of the Data Engineers within the team, providing sound knowledge to address day-to-day implementation issues. Additionally, you will proactively monitor data interfaces, data pipeline execution, and data quality rules, while providing mechanisms and tools for data reprocessing and data quality exception handling.
This role offers an opportunity to be part of a transformational journey within the company, extracting maximum value from internal and external data to deliver added value for investment professionals and decision-makers.
Key Responsibilities:
- Work in close coordination with Data Architects to design, implement, and operate a robust, scalable, and flexible cloud-based data processing framework. This framework will ensure the successful capture, processing, validation, and delivery of the high-quality and timely data required to support data analytics and system integration requirements.
- Guarantee the implementation of best practices in terms of code versioning, technical documentation, and CI/CD of the technical artifacts
- Ensure the highest standards of quality, reliability, and scalability are met in the delivery of technical solutions
- Act as the gatekeeper of delivery quality assurance and data operations, and proactively monitor data pipelines and data quality
- Participate in all Agile events, proactively contribute to the team, and pick up user stories and tasks to ensure successful delivery
- Lead the onboarding process of new data sources into the data platform, supporting the implementation of the necessary data flows
Requirements
- Bachelor's degree in Computer Science, related fields, or equivalent work experience
- Specific know-how or previous exposure to projects within the Accounting domain, Financial Planning & Analysis, and Asset Portfolio Management & Reporting is highly valued.
- Demonstrable exposure and knowledge of designing and implementing cloud-based data solutions to support Information Management and Analytics.
- Exposure to the implementation of Data Lake and Data Warehouse architecture
- Proven hands-on experience with developing relational database solutions. Previous experience with Snowflake is highly valued.
- Experience in the implementation and rollout of ETL/ELT pipelines in a cloud environment using Python/PySpark and dbt as foundational technologies. Experience in Databricks implementations is highly valued.
- Previous experience in implementing CI/CD pipelines on cloud-based data platforms
- Systems integration know-how in the development of APIs and experience with event streaming technologies is highly valued.
- Strong problem-solving skills and a passion for data.
- Ability to work collaboratively in a global team environment, acting proactively to achieve team goals following Agile methodologies.
- Focus on continuous improvement of the data platform.
- Excellent listening and communication skills
- Highly proficient in spoken and written English. Ability to speak Spanish is valued.