Data Architect
Job description
Join Pictet Tech, a leader in innovative software solutions. Be part of a dynamic team driving transformative advancements in finance.
Tasks
- Contribute to the modernization of data platforms and solutions.
- Design and optimize scalable data architectures using cloud tools.
- Oversee data integration and governance, ensuring best practices.

As a Data Architect, you will contribute to the modernization of the group's current data platforms. With your ideas and initiatives, you will have the opportunity to propose and build innovative solutions while keeping the group's constraints in mind. You will play a key role in implementing solutions for data processing, management and governance.
Your key responsibilities will include:
- Leverage cloud platforms (AWS, Azure) and tools such as Redshift, Snowflake, S3/Parquet and Iceberg to develop and optimize data solutions.
- Design and enhance modern and scalable data architectures.
- Oversee data modelling and manage data integrations/workflows using patterns like ETL, ELT, APIs, batch processing, Change Data Capture and messaging technologies (e.g., Kafka, MQ).
- Play an active role in both planning and implementing initiatives, making results-driven and hands-on contributions.
- Play a strategic role when needed:
  - Develop and communicate target architecture roadmaps and present technical strategies to stakeholders and leadership.
  - Advocate for best practices in data management, ensure adherence to data quality standards, promote robust data governance practices and act as a champion for data-driven decision-making within the organisation.
  - Collaborate with and support technical teams, fostering knowledge-sharing and effective communication among data experts.
Requirements
- Bachelor's degree in computer science and relevant data experience.
- Proficient in Python, SQL, and cloud platforms like AWS and Azure.
- Strong knowledge of data management techniques and tools.

- Academic background in computer science.
- First relevant experience in the data field.
- Development skills (Python, Java) and knowledge of BI and data transformation tools; proficiency in SQL is a must.
- Expertise in data management: Data Hub, Data Warehouse, Data Lake.
- Proficiency in integration concepts (ETL, ELT, API, batch, streaming).
- Experience with cloud platforms (AWS, Azure) and associated tools.
- Good knowledge or experience with tools like DBT, Informatica, Tableau, or Power BI is a plus.
- Proficiency in French is essential, with a good command of English required.
- Swiss residency required.