Data Architect

Coforge
Municipality of Madrid, Spain

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Municipality of Madrid, Spain

Tech stack

Airflow
Amazon Web Services (AWS)
Azure
Big Data
Google BigQuery
Cloud Computing
Databases
ETL
Data Mining
Data Visualization
Data Warehousing
Database Design
Database Development
DevOps
Failover
Data Flow Control
NoSQL
Systems Development Life Cycle
Google Cloud Platform
Snowflake
Spark
Build Management
Deployment Automation
Kafka
Data Management
Terraform
Legacy Systems
Jenkins
Databricks

Job description

  • 16-18+ years of total experience in DWBI, Big Data, and Cloud Technologies.
  • Prepare accurate database design and architecture reports for management and executive teams.
  • Oversee the migration of data from legacy systems to new solutions.
  • Educate staff members through training and individual support.
  • Offer support by responding to system problems in a timely manner.

Requirements

  • Implementation and hands-on experience in at least two of the following cloud technologies: Azure, AWS, GCP, Snowflake, Databricks.
  • Must have hands-on experience with at least two hyperscalers (GCP/AWS/Azure), specifically their Big Data processing services (Apache Spark, Beam, or equivalent).
  • In-depth knowledge of key technologies such as BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs/Kafka, and Dataflow/Airflow/ADF.
  • Excellent consulting experience and ability to design and build solutions; actively contribute to RfP responses.
  • Ability to be a SPOC for all technical discussions across industry groups.
  • Excellent design experience, with entrepreneurship skills to own and lead solutions for clients.
  • Excellent ETL and data modeling skills.
  • Excellent communication skills
  • Ability to define monitoring, alerting, and deployment strategies for various services.
  • Experience providing solutions for resiliency, failover, monitoring, etc.
  • Good to have: working knowledge of Jenkins, Terraform, Stackdriver, or other DevOps tools.
  • Design and implement effective database solutions and models to store and retrieve data.
  • Examine and identify database structural necessities by evaluating client operations, applications, and programming.
  • Ability to articulate and write POVs on new and old technologies
  • Ability to recommend solutions to improve new and existing database systems.
  • Assess data implementation procedures to ensure they comply with internal and external regulations.
  • Strong knowledge of database structure systems and data mining.
  • Knowledge of systems development, including the system development life cycle, project management approaches, and requirements, design, and testing techniques.
  • Proficiency in data modeling and design, including SQL development and database administration
  • Ability to implement common data management and reporting technologies, as well as columnar and NoSQL databases, data visualization, and unstructured data.
