Data Warehouse Developer/Engineer
UVS InfoTech LLC
yesterday
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Experience level: Senior
Job location: Remote
Tech stack
Query Performance
Java
API
Airflow
Amazon Web Services (AWS)
Data analysis
Azure
Big Data
Google BigQuery
Software as a Service
Cloud Computing
Cloud Database
Information Systems
Computer Programming
Databases
Data Architecture
Data Validation
Information Engineering
Data Governance
ETL
Data Warehousing
Document-Oriented Databases
Hadoop
Python
Performance Tuning
Power BI
Cloud Services
SQL Databases
Data Streaming
Tableau
Talend
Workflow Management Systems
Google Cloud Platform
Snowflake
Spark
Indexer
GIT
Pandas
Build Management
PySpark
Information Technology
Data Lineage
Kafka
Spark Streaming
Tools for Reporting
Terraform
Looker Analytics
Software Version Control
Data Pipelines
Redshift
Databricks
Job description
UVS InfoTech is looking for a skilled Data Warehouse Engineer (or Developer) to design, build, optimize, and maintain scalable data warehouse solutions that power business intelligence, analytics, and decision-making.
You will work closely with data engineers, analysts, and business stakeholders to transform raw data from multiple sources into clean, structured, and reliable datasets ready for analysis and reporting. This role is central to our modern data stack in a cloud-first environment.
- Design & Build Data Warehouses: Architect and implement cloud-based data warehouses (e.g., Snowflake, Amazon Redshift, Google BigQuery, Databricks) using best practices for scalability, performance, and cost-efficiency.
- Develop ETL/ELT Pipelines: Design, build, and maintain robust Extract, Transform, Load (or ELT) processes to ingest, clean, transform, and load data from diverse sources (databases, APIs, SaaS platforms, files).
- Data Modeling: Create and maintain dimensional models (star/snowflake schemas), fact and dimension tables, and slowly changing dimensions (SCD) using Kimball or hybrid methodologies.
- Performance Optimization: Tune queries, implement partitioning/clustering, indexing, and materialized views to ensure fast query performance at scale.
- Data Quality & Governance: Implement data validation, monitoring, lineage tracking, and quality rules to ensure accuracy, consistency, and compliance.
- Integration & Automation: Integrate the data warehouse with BI tools (Tableau, Power BI, Looker) and orchestration tools (Airflow, dbt, Prefect).
- Monitoring & Maintenance: Set up monitoring, alerting, and observability for pipelines and warehouse performance. Troubleshoot and resolve production issues.
- Collaboration: Work with stakeholders to gather requirements, translate business needs into technical solutions, and document data models and processes.
- Continuous Improvement: Evaluate new tools and technologies; migrate or modernize legacy data warehouses as needed.
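To make the Data Modeling responsibility concrete, here is an illustrative sketch (not part of the posting) of slowly changing dimension Type 2 logic, using a hypothetical customer dimension with made-up fields:

```python
from dataclasses import dataclass, replace
from datetime import date
from typing import Optional

@dataclass
class DimRow:
    customer_id: int
    city: str
    valid_from: date
    valid_to: Optional[date]  # None marks the current version
    is_current: bool

def apply_scd2(dimension: list[DimRow], updates: dict[int, str], load_date: date) -> list[DimRow]:
    """SCD Type 2: expire changed rows and insert new current versions."""
    result, seen = [], set()
    for row in dimension:
        if row.is_current and updates.get(row.customer_id, row.city) != row.city:
            # Expire the old version...
            result.append(replace(row, valid_to=load_date, is_current=False))
            # ...and insert the new current version.
            result.append(DimRow(row.customer_id, updates[row.customer_id], load_date, None, True))
        else:
            result.append(row)
        seen.add(row.customer_id)
    # Brand-new keys become fresh current rows.
    for cid, city in updates.items():
        if cid not in seen:
            result.append(DimRow(cid, city, load_date, None, True))
    return result

dim = [DimRow(1, "Berlin", date(2023, 1, 1), None, True)]
dim = apply_scd2(dim, {1: "Munich", 2: "Hamburg"}, date(2024, 6, 1))
# Customer 1 now has an expired Berlin row plus a current Munich row;
# customer 2 is inserted as a new current row.
```

In a real warehouse this pattern is usually expressed as a MERGE statement or a dbt snapshot rather than application code; the sketch just shows the history-preserving behavior the role requires.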
Requirements
- Strong expertise in SQL (advanced querying, optimization, window functions).
- Proficiency with cloud data platforms: Snowflake, BigQuery, Redshift, or Databricks.
- Experience with ETL/ELT tools: dbt, Apache Airflow, Fivetran, Talend, Informatica, or Spark.
- Programming skills: Python (pandas, PySpark) and/or Java/Scala.
- Deep understanding of dimensional data modeling and data architecture principles.
- Knowledge of data governance, security, and compliance (GDPR, SOC2, etc.).
- Familiarity with infrastructure-as-code (Terraform) and version control (Git).
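As an illustration of the window-function requirement, the common "latest row per key" pattern (`ROW_NUMBER() OVER (PARTITION BY key ORDER BY ts DESC) = 1` in SQL) can be sketched in Python with made-up event data:

```python
from itertools import groupby
from operator import itemgetter

# Hypothetical raw ingested events: several versions per order_id.
events = [
    {"order_id": 1, "status": "created", "updated_at": "2024-06-01"},
    {"order_id": 1, "status": "shipped", "updated_at": "2024-06-03"},
    {"order_id": 2, "status": "created", "updated_at": "2024-06-02"},
]

def latest_per_key(rows, key, order):
    """Keep the newest row per key, mirroring a partitioned ROW_NUMBER filter."""
    rows = sorted(rows, key=itemgetter(key, order))  # groupby needs sorted input
    return [list(group)[-1] for _, group in groupby(rows, key=itemgetter(key))]

current = latest_per_key(events, "order_id", "updated_at")
# One row per order_id, each carrying its latest status.
```

The warehouse itself would do this in SQL; the sketch only shows the deduplication semantics being asked for.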
Preferred Skills:
- Experience with real-time streaming (Kafka, Spark Streaming).
- Big Data technologies (Spark, Hadoop).
- BI/reporting tools integration.
- Cloud certifications (AWS, Azure, Google Cloud Platform, or Snowflake).
- Education: Bachelor's degree in Computer Science, Information Systems, Data Engineering, or related field (Master's preferred).
- Experience: 3–7+ years of hands-on experience in data warehousing, ETL development, or data engineering.
- Effective verbal and written communication skills.
- Ability to effectively communicate with all levels of users and team members.