Data Engineer
Heitmeyer Consulting
Rogers, United States of America
7 days ago
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Experience level: Senior
Job location: Remote (Rogers, United States of America)
Tech stack
Information Engineering
Python
SQL Databases
Systems Integration
Google Cloud Platform
Data Build Tool (dbt)
Job description
- Design, build, and maintain end-to-end data pipelines from legacy and modern source systems into cloud platforms
- Extract data from FCRM (legacy BSA platform) and additional internal systems
- Use dbt to transform, model, and structure data for Verafin ingestion
- Develop and maintain data pipelines using Python and SQL
- Work within Arvest's GCP-native data environment
- Ensure data accuracy, reliability, and performance in a regulated financial environment
- Collaborate with data, compliance, and fraud stakeholders to support business needs
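The pipeline work described above follows a standard extract-transform-load shape. As a rough illustration only (this is a sketch, not part of the posting; the table names, fields, and the use of SQLite as a stand-in for the legacy source and cloud warehouse are all hypothetical), a single step might look like:

```python
import sqlite3  # stands in here for both the legacy source and the warehouse


def extract(conn):
    """Pull raw transaction rows from the (hypothetical) legacy source."""
    return conn.execute(
        "SELECT txn_id, amount_cents, txn_date FROM raw_transactions"
    ).fetchall()


def transform(rows):
    """Normalize amounts to dollars and drop non-positive transactions."""
    return [
        (txn_id, cents / 100.0, txn_date)
        for txn_id, cents, txn_date in rows
        if cents > 0
    ]


def load(conn, rows):
    """Stage transformed rows for downstream ingestion."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS staged_transactions "
        "(txn_id TEXT, amount REAL, txn_date TEXT)"
    )
    conn.executemany("INSERT INTO staged_transactions VALUES (?, ?, ?)", rows)


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE raw_transactions "
        "(txn_id TEXT, amount_cents INTEGER, txn_date TEXT)"
    )
    conn.executemany(
        "INSERT INTO raw_transactions VALUES (?, ?, ?)",
        [("t1", 12500, "2024-01-02"), ("t2", -300, "2024-01-03")],
    )
    load(conn, transform(extract(conn)))
    print(conn.execute("SELECT COUNT(*) FROM staged_transactions").fetchone()[0])
```

In the role itself, the transform layer would live in dbt models and the load target would be GCP (e.g. BigQuery) rather than SQLite, but the extract/transform/load separation is the same.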
Requirements
- Senior-level experience in data engineering
- Strong hands-on expertise in:
- Python
- SQL (advanced, production-level usage)
- dbt (data build tool) for transformations and modeling
- Google Cloud Platform (GCP)
- Proven experience building data pipelines from application and source systems into cloud platforms
- Ability to work independently in a fully remote environment while collaborating across teams
- Experience with fraud, AML, or BSA platforms
- Prior exposure to Verafin
- Experience working with financial services or banking data
- Familiarity with:
- Fiserv Signature (core banking platform)
- Legacy system integrations