Data Integration Specialist

Next Link

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Tech stack

API
Airflow
Amazon Web Services (AWS)
Data analysis
Cloud Computing
Data Cleansing
Data Integration
Data Transformation
Salesforce
SQL Databases
Data Streaming
Systems Integration
Parquet
Adobe Campaign
Snowflake
Data Layers
PySpark
Information Technology
Data Management
G Suite
Data Pipelines

Job description

We are looking for a Global Data Factory Specialist (Data Integration Specialist) to join the client's Data & Analytics organization.

This role will play a key part in enabling data excellence across the company, contributing to the standardization and centralization of data processes worldwide.

As a Global Data Factory Specialist, you will be responsible for designing and managing end-to-end data integrations from multiple sources into clean and structured data layers. You will collaborate closely with global teams across the Data & Analytics organization to support the company's data-driven strategy, ensuring data quality, performance, and reliability.

Main tasks

  • Design, develop, and maintain robust data pipelines that integrate data from diverse sources into curated layers (an illustrative sketch follows this list).
  • Implement and manage API and file-transfer (FT) integrations, ensuring seamless data flow between systems.
  • Perform data cleansing, transformation, and enrichment to deliver high-quality, business-ready data.
  • Collaborate with data analysts, data engineers, and stakeholders to translate business requirements into technical solutions.
  • Monitor and optimize data pipelines for performance, cost, and scalability.
  • Communicate proactively with data consumers about pipeline execution, incidents, and remediation plans.
  • Maintain comprehensive documentation for all data integration processes and workflows.
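
To give candidates a feel for the day-to-day work, here is a minimal PySpark sketch of a cleansing-and-curation step of the kind described above. It is illustrative only, not the client's codebase: the bucket paths, column names, and cleansing rules are all hypothetical.

```python
# Minimal sketch: ingest raw CSV, cleanse, and publish a curated Parquet
# layer. All paths, column names, and rules are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("curate_customers").getOrCreate()

# Raw layer: data as delivered by the source system (assumed CSV here).
raw = (
    spark.read
    .option("header", True)
    .csv("s3://example-bucket/raw/customers/")  # hypothetical location
)

# Cleansing: normalize strings, drop records missing the business key,
# and de-duplicate on that key.
curated = (
    raw
    .withColumn("email", F.lower(F.trim(F.col("email"))))
    .withColumn("country", F.upper(F.trim(F.col("country"))))
    .filter(F.col("customer_id").isNotNull())
    .dropDuplicates(["customer_id"])
)

# Curated layer: columnar Parquet, partitioned for downstream analytics.
(
    curated.write
    .mode("overwrite")
    .partitionBy("country")
    .parquet("s3://example-bucket/curated/customers/")  # hypothetical
)

spark.stop()
```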

Requirements

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • 5+ years of experience in data management, data modeling, or data integration.
  • Strong experience with APIs, FT integrations, and data integration tools.
  • Proficiency in PySpark, SQL, and cloud platforms such as AWS or Snowflake.
  • Hands-on experience with AWS services (Glue, Lambda, Airflow, Cloud9, Step Functions); see the orchestration sketch after this list.
  • Solid knowledge of data transformation techniques and columnar storage formats (e.g., Parquet).
  • Proven experience working with pharma datasets and applications (Veeva CRM, Adobe Campaign, G Suite, Veeva Vault, MDG, etc.).
  • Excellent problem-solving skills, attention to detail, and technical judgment.
  • Fluent in English (spoken and written).
  • Strong collaboration, communication, and stakeholder management skills.
  • Results-oriented, organized, and capable of managing multiple priorities.
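
For orientation only, the sketch below shows the orchestration style implied by the Airflow and Glue requirements: a daily Airflow DAG that triggers an AWS Glue job building the curated layer. The DAG id, Glue job name, region, and schedule are hypothetical assumptions, not the client's setup.

```python
# Minimal Airflow sketch: schedule a daily run of a (hypothetical) AWS Glue
# job that builds the curated layer. Names and schedule are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

with DAG(
    dag_id="curate_customers_daily",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Submits the named Glue job and, by default, waits for it to finish.
    run_glue_job = GlueJobOperator(
        task_id="run_curation_job",
        job_name="curate_customers",   # hypothetical Glue job name
        region_name="eu-west-1",       # hypothetical region
    )
```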
