Senior DBA & Data Modeller (12-Month Contract)
Robert Walters
Barcelona, Spain
Posted 2 days ago
Role details
Contract type: Temporary contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Job location: Barcelona, Spain
Tech stack
Airflow
Amazon Web Services (AWS)
Azure
Bash
Business Software
Cloud Computing
Software Documentation
Databases
Continuous Integration
Data Dictionary
Information Engineering
Data Governance
ETL
Data Mapping
Data Transformation
Data Warehousing
Relational Databases
DevOps
Identity and Access Management
Python
PostgreSQL
Microsoft SQL Server
MySQL
Online Analytical Processing
NoSQL
Online Transaction Processing
Operational Databases
Oracle Applications
Performance Tuning
Lucidchart
Scala
Software Engineering
SQL Databases
Talend
Data Processing
Scripting (Bash/Python/Go/Ruby)
Informatica PowerCenter
Snowflake
Spark
Caching
Reliability of Systems
Git
Data Layers
Data Lake
Data Lineage
Physical Data Models
Data Pipelines
Databricks
Job description
Engaged as a Senior Database Administrator & Data Modeller supporting enterprise data transformation initiatives. Responsible for designing scalable data models, administering operational databases, and enabling advanced analytical workloads across cloud platforms. Contributed to the delivery of robust data pipelines and modelling frameworks leveraging Databricks and cloud-native technologies, while ensuring system reliability, data quality, and strong governance.
Key Responsibilities
- Designed and developed conceptual, logical, and physical data models to support core business applications and analytical workloads.
- Administered and optimised relational databases, ensuring reliability, performance, and compliance with regulatory requirements.
- Developed and optimised Databricks data models and ETL/ELT pipelines, leveraging Delta Lake architecture for scalable processing.
- Built reusable data assets within Databricks, including gold/silver/bronze layer tables, enabling reporting and ML consumption (see the medallion sketch after this list).
- Implemented data governance and quality frameworks across cloud platforms to ensure accurate, secure, and trusted data.
- Collaborated with Cloud Engineers, Data Engineers, and Architects to design cloud-based integration and storage patterns.
- Performed performance tuning at database, pipeline, and data model levels (indexing, partitioning, caching, compute scaling).
- Managed security and compliance standards, including IAM roles, encryption policies, auditing, and least-privilege access.
- Documented data lineage, data dictionaries, ERD models, and metadata to improve discoverability and cross-team collaboration.
- Supported business stakeholders in requirement analysis, data mapping, modelling, and solution design.
- Contributed to backup and DR planning to ensure business continuity across cloud environments.
Technologies & Tools
- Databricks / Delta Lake / Spark
- Cloud Platforms: AWS / Azure / GCP
- RDBMS: PostgreSQL, MySQL, Oracle, SQL Server
- Data Modelling: ER/Studio, ERwin, dbt, Lucidchart
- ETL / ELT: Databricks, Informatica, Talend, Airflow
- DevOps: Git, CI/CD frameworks
- Scripting: Python, Scala, SQL, Bash
Key Achievements
- Delivered scalable Databricks data models and Delta Lake assets, improving end-to-end data processing speed and consistency.
- Reduced cloud compute costs by optimising Databricks cluster sizing, caching strategies, and query execution logic (see the tuning sketch after this list).
- Improved performance of mission-critical transactional systems through schema optimisations and advanced indexing.
- Defined modelling and documentation standards that strengthened data governance and platform usability across teams.
- Contributed to a more secure and resilient cloud data platform through structured IAM policies, auditability, and DR planning.
Requirements
- Proven ability to optimise existing OLTP entity-relationship (ER) data models and design new OLTP schemas from scratch.
- Proficient with ERD modelling tools (e.g., ER/Studio, Lucidchart, draw.io, or similar).
- Skilled in designing and maintaining OLAP, NoSQL, and Data Warehouse/Data Lake architectures.
- Strong understanding of common OLTP and OLAP modelling techniques, including Star, Snowflake, and Dimensional models (a minimal star-schema sketch follows this list).
- Deep technical knowledge of SQL and NoSQL databases, as well as modern Data Warehouse platforms and Databricks environments.
- Effective team player, experienced in collaborating with software development and data engineering teams to manage and evolve the data layer.
- Software development or data engineering experience is considered a valuable bonus, though not a core requirement.