Data Scientist
Job description
You will join the GFTOO Department (Group Finance Transformation Operation Office), whose purpose is to continuously improve how Finance operates, with three overarching targets:
- better serve the business,
- boost and motivate teams by simplifying processes and improving tools, and
- increase our effectiveness, including in terms of cost.
You will join the Dune Team, which operates with multiple suppliers (PBRC, RM, asset managers, Performance Controlling, Actuary, Group IT, etc.) and multiple clients (Group and local entities across finance, investment, risk, compliance, etc.).
The main challenges are:
- to manage the pricing of complex investments, with new types of investments arriving frequently,
- to manage several accounting norms and their evolution,
- to manage multiple stakeholders using different inputs, output features, and tools,
- to combine the solidity of a robust process under the scrutiny of various internal controls with the agility to adapt rapidly to market crises or other circumstances.
Provide Group Investment professionals and other stakeholders with an earnings-forecast process dedicated to investments, fed by quality data and producing reliable outputs.
The mission encompasses:
- Develop and improve the earnings-projection process for financial assets across the Group.
- Contribute to the development of a tool that takes a deterministic approach to forecasting earnings, invested assets, and cash flows, to facilitate the building of strategic plans, sensitivities, impairments, and other simulations.
- Design and develop this tool using SQL/Python in a Spark environment (Azure Databricks), write its documentation, and create a dynamic restitution of the results using a web-based reporting tool, with regular production of the outputs/reports, including quantitatively controlled data (see the illustrative sketch after this list).
- Help coordinate tool usage and the production of outputs/reports with local investment teams.
- Produce reliable, readable, version-controlled, documented, and commented code in SQL/Python.
- Maintain comprehensive, up-to-date documentation of the code/project.
- Ensure the integrity & consistency of the tool while integrating new features (such as pricing methods)
- Ensure the proper evolution of the architecture and infrastructure.
- Enhance the platform to support new accounting norms.
- Provide support to entities on the solution.
- Implement new functionalities based on user needs (tool ergonomics, outputs, API for local entity financial tools)
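
Purely as an illustration of the kind of deterministic, Spark-based earnings projection described in this mission (and not the team's actual tool), a minimal PySpark sketch might look like the following. The bond_positions table, its columns, the three-year horizon, and the flat-coupon rule are all hypothetical assumptions for the example.

```python
# Hypothetical sketch of a deterministic earnings projection in PySpark.
# The table name, column names, and projection rule are illustrative only;
# the real tool's schema and methodology are not described in this posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("earnings-projection-sketch").getOrCreate()

positions = spark.table("bond_positions")  # assumed columns: book_value, coupon_rate

horizon_years = 3  # assumed planning horizon

# Replicate each position once per projection year, then apply a deliberately
# simple deterministic rule: projected income = book value * coupon rate.
projections = (
    positions
    .crossJoin(
        spark.range(1, horizon_years + 1).withColumnRenamed("id", "year_ahead")
    )
    .withColumn("projected_income", F.col("book_value") * F.col("coupon_rate"))
)

# Aggregate per projection year, e.g. to feed a web-based reporting layer.
projections.groupBy("year_ahead").agg(
    F.sum("projected_income").alias("total_projected_income")
).show()
```

In practice, the actual tool would layer the pricing methods, accounting norms, and quality controls described above on top of such a base projection.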
Requirements
- Excellent knowledge of R/Python programming
- Excellent knowledge of Spark SQL
- Knowledge of data visualization and analytics tools (such as Spotfire, Tableau, R Shiny, …)
- Knowledge of Azure (Databricks, DevOps, Data Factory, Azure Storage Explorer) is a plus
- Knowledge of SAS & SAS Risk Dimension is a plus
- Good knowledge of financial markets / financial instruments and valuation is a plus
- Fluent English
Other skills:
- Excellent verbal and written communication skills
- Strong organization skills with the ability to manage multiple tasks and deadlines simultaneously
- Ability to work with attention to detail and follow-through