Data Engineer

Vanderhouwen & Associates, Inc.
Portland, United States of America
1 month ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
$155K

Job location

Remote
Portland, United States of America

Tech stack

Java
.NET
API
Amazon Web Services (AWS)
Azure
Databases
Information Engineering
Data Integration
ETL
Data Transformation
Data Warehousing
Python
Software Engineering
SQL Databases
Snowflake
Backend
Build Management
PySpark
Data Management
Data Pipelines
Databricks

Job description

  • Design and build scalable data pipelines and ETL/ELT frameworks that ingest and transform data from enterprise ERP and financial systems into modern cloud data platforms.
  • Develop and maintain data lakehouse and warehouse architectures using Databricks, implementing layered data models aligned with medallion architecture (bronze, silver, gold).
  • Create high-performance data transformations and integrations using Python, PySpark, and advanced SQL.
  • Integrate and model data from a variety of sources including ERP systems, APIs, databases, and third-party platforms to support analytics and reporting.
  • Partner with business stakeholders, analysts, and engineering teams to gather requirements, deliver analytics-ready datasets, and support BI tools and dashboards.

Requirements

Our client is seeking a highly skilled Senior Data Engineer to design and implement modern data pipelines and scalable cloud-based data platforms. This role is ideal for a technically strong engineer who enjoys solving complex data integration challenges, building reliable data architectures, and working directly with stakeholders to translate business needs into high-value data solutions. The ideal candidate combines deep data engineering expertise with strong software development skills and thrives in an environment where they can work independently while delivering impactful, production-grade systems.

This role will begin fully remote; future assignments may require onsite work. Applicants must reside within one hour of the Portland, Oregon metro area.

  • 5+ years of experience in data engineering or software engineering roles building enterprise-scale data platforms and pipelines.

  • Expert-level experience with Databricks and strong knowledge of modern data warehouse and lakehouse architectures.
  • Advanced proficiency in Python and SQL, including experience developing complex data transformation logic and production-grade pipelines.
  • Strong experience working with cloud platforms such as AWS or Azure, along with experience integrating Snowflake or similar modern data platforms.
  • Experience developing backend systems or services using technologies such as .NET or Java, along with the ability to work independently and collaborate directly with client stakeholders.
