Informatica Platform Engineer - Customer Data
Role details
Job description
- Salary: Gross monthly salary between EUR 4,931 and EUR 7,043 (scale 09) for a 36-hour work week.
- Extras: a thirteenth month, 8% holiday allowance, and a 10% Employee Benefit Budget.
- Development budget: EUR 1,400 per year for your growth and development.
- Hybrid working: a balance between home and office work (possible for most roles).
- Pension: decide for yourself the amount of your personal contribution.
Shape the new MDM platform behind our Customer Data and support the CRM modernization program. You own the Informatica IDMC platform and the shared infrastructure and services that provide a secure, scalable foundation for data engineers, data scientists, DQ, and DevOps teams.
You & your role
In the coming years you will work directly on building a new MDM platform for customer data using Informatica MDM (IDMC), a SaaS solution that serves our data customers within the bank, and on the migration of the current Siebel environment to Salesforce (the CRM modernization program), also a SaaS solution. This work powers the bank's initiatives on Customer 360, MDM, Salesforce integration, event streaming, and analytical/ML workloads. As platform engineer you design and build the secure, automated, cloud-based data infrastructure that enables Data & DevOps teams to deliver reliable, secure, and scalable customer data products.
Examples from practice
- Involved in the creation of the new MDM platform using Informatica IDMC.
- Enabling the migration from Siebel to Salesforce, supporting the CRM modernization/migration program.
Facts & figures
- 36/40 hours a week
- Over 49,000 Rabobank colleagues worldwide.
Top responsibilities
- Design, build, and maintain the customer data platform architecture (Informatica IDMC, Event Hub/Kafka, APIM/microservices, Databricks).
- Implement platform automation and Infrastructure as Code for consistent, repeatable environment provisioning (dev/test/acc/prod).
- Ensure privacy by design and security by design (encryption, Key Vault policies, network isolation, RBAC/ABAC, secrets/identity management) and meet GDPR/PII requirements.
- Provide and operate shared platform services: schema registry, feature store infra, metadata & lineage (Collibra), orchestrators, DQ runtimes, job scheduling/service bus.
- Embed SRE practices: define SLIs/SLOs/SLAs, implement observability (Azure Monitor, Log Analytics, Grafana/Prometheus, Splunk), and drive fast incident response/RCA.
- Optimize performance, reliability, and cost (cluster sizing & autoscaling, storage tiering, Event Hub throughput, API/streaming tuning, scheduling).
- Establish platform standards, guardrails, reusable IaC modules, runbooks, and onboarding guides for Data Product teams.
- Work with Architects to establish a solid, sustainable data architecture for the Customer Data Area.
- Coach DevOps and platform engineers through ongoing knowledge sharing; define and lead workshops and set up training for the different roles within the Area.
Requirements
- At least 5 years of experience in a similar technical engineering role.
- Hands-on experience with modern cloud data platforms: Informatica, Databricks, Event Hub/Kafka, Azure.
- Strong IaC skills and CI/CD exposure (Azure DevOps/GitHub Actions).
- Informatica (IDMC) development, administration, and operations experience.
- Solid grasp of networking & security (private endpoints, managed identities, encryption).
- Familiarity with customer-data concepts (Customer 360/MDM, CRM integrations (Salesforce), consent & preference management, GDPR implications).
- Clear communication, a collaborative mindset, and an interest in reliability, automation, and developer self-service.
- A proactive, curious, and entrepreneurial mindset.
- Nice to have: Collibra, Grafana/Prometheus/Splunk.
- To be considered for this position you must be located in the Netherlands; relocation is not possible for this role.