Data Architect with Data Modelling
Realign LLC
Morton Township, United States of America
1 month ago
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Experience level: Intermediate
Compensation: $146K
Job location: Remote (Morton Township, United States of America)
Tech stack
Third Normal Form
Artificial Intelligence
Azure
Databases
Data Architecture
Data Masking
Data Warehousing
SQL Azure
Data Streaming
Snowflake
Kafka
Spark Streaming
Databricks
Requirements
Do you have experience in technical documentation?
Must-Have Technical/Functional Skills
- Excellent data modelling skills; both dimensional and 3NF modelling required.
- Data modelling: must have created a conceptual model showing core entities from CRM, e-commerce, and finance systems, including relationships and grain for each table.
- Experience with catalog federation or query federation.
- Must have implemented streaming (Auto Loader/cloudFiles, Spark Streaming, Event Hubs/Kafka).
- Strong experience with Delta tables in Databricks.
- Extensive experience translating business requirements to data design requirements.
- Extensive data flow documentation experience.
- Azure experience - ADLS, Azure SQL.
- Snowflake or Databricks design experience.
- Excellent communication skills for both technical and business audiences.
- Excellent documentation skills.
- Eight years' experience as a data architect.
- Must have implemented PCI compliance, tokenization, data masking, and row-/column-/object-level security.
- Two years' experience working in Databricks.
- Databricks certification a plus.
- Two years' experience in Snowflake.
- Snowflake certifications a plus.
- Two years' experience building enterprise data warehouses in Azure.
- Azure solution architect certification preferred.
- Azure database certifications preferred.
- Azure AI certifications preferred.