Data Engineer

Q2 Software, Inc.
Cary, United States of America
4 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English

Job location

Cary, United States of America

Tech stack

Clean Code Principles
Airflow
Amazon Web Services (AWS)
Data analysis
Azure
Bash
C# (Programming Language)
Cloud Database
Software Documentation
Code Review
Continuous Integration
Data Architecture
Information Engineering
Data Transformation
Data Systems
Data Warehousing
Software Debugging
Distributed File Systems
Distributed Computing Environment
Distributed Systems
Amazon DynamoDB
Python
PostgreSQL
Machine Learning
Microsoft SQL Server
NoSQL
Octopus Deploy
Operational Databases
Standard SQL
DataOps
SQL Databases
Data Streaming
Snowflake
GitLab
Git
Pandas
Containerization
PySpark
Kubernetes
Infrastructure Automation Frameworks
Kafka
Database Replication
Terraform
Software Version Control
Data Pipelines
Docker
Databricks
Go

Job description

We are as passionate about our people as we are about our mission. We celebrate our employees in many ways, including our "Circle of Awesomeness" award ceremony and a day of employee celebration, among others! We invest in the growth and development of our team members through ongoing learning opportunities, mentorship programs, internal mobility, and meaningful leadership relationships. We also know that nothing builds trust and collaboration like having fun. We hold an annual Dodgeball for Charity event at Q2 Stadium in Austin, inviting other local companies to play and the community organizations we support to join us in raising money and awareness.

Our Team

The Risk & Fraud team at Q2 helps our customers take a proactive stance against fraud while managing the risks inherent to their business. We build and enhance products that evolve with the ever-changing fraud landscape, delivering tangible value to our customers. Our solutions allow financial institutions to focus more of their time and energy on their mission: serving their customers and communities.

The Role

In this role, you will take ownership of building and operating our data architecture to support new and evolving fraud solutions. You'll play a key role in ensuring data is reliable, scalable, and accessible to power the models, agents, and UIs that directly impact our customers' ability to detect and prevent fraud.

This is an opportunity to work on production systems with real-world impact while continuing to grow your skills in data engineering, cloud platforms, and distributed systems.

Your Key Responsibilities

  • Design, build, and maintain scalable data pipelines and workflows in a cloud environment
  • Deliver clean, well-structured datasets to support fraud analytics, machine learning models, and agentic solutions
  • Contribute to improving our data architecture, including ingestion, storage, and access patterns
  • Own data operations by monitoring data workflows, triaging failures, and resolving data issues
  • Enhance observability and performance by implementing monitoring and optimizing pipelines for reliability, scalability, and cost efficiency
  • Partner with product managers, data scientists, and engineers to translate fraud and risk requirements into data solutions
  • Write maintainable code; participate in code reviews; and help improve testing, deployment, and documentation standards

Requirements

Must Haves

  • Typically requires a Bachelor's degree in a relevant field and a minimum of 2 years of related experience; or an advanced degree without experience; or equivalent work experience.
  • Experience building and maintaining data pipelines and workflows in production environments
  • Proficiency in SQL and working with relational and/or analytical data stores
  • Experience with Python
  • Familiarity with data modeling, transformation, and orchestration concepts
  • Experience with data warehouses and distributed data processing systems
  • Experience with version control (e.g., Git) and CI/CD practices
  • Ability to troubleshoot data issues, debug pipelines, and work through ambiguous problems

Nice to Have

  • Experience with tools such as Apache Airflow, dbt, Kafka, Airbyte, or Fivetran
  • Experience with Snowflake or similar cloud data warehouses
  • Experience with SQL Server, PostgreSQL, or NoSQL systems like DynamoDB
  • Familiarity with infrastructure as code tools (e.g., Terraform)
  • Experience with Docker and/or Kubernetes
  • Exposure to platforms like Databricks, AWS Glue, Amazon SageMaker, or Snowpark

Responsibilities

  • Production Support: Start the day by reviewing production data pipeline executions, investigating and resolving failures
  • Development: Build and orchestrate data pipelines, defining data flow, transformations, and dataset relationships
  • Observability: Monitor and optimize data pipelines for performance and efficiency
  • Collaboration: Work closely with teams and stakeholders to understand data requirements and ensure platform solutions meet business needs

Experience and Knowledge

  • Typically requires a Bachelor's degree in a relevant field and a minimum of 2 years of related experience; or an advanced degree without experience; or equivalent work experience
  • Advanced knowledge of data transformations, data orchestration and pipelines, and data replication
  • Working knowledge of SQL and NoSQL databases, data warehouses, and distributed file storage and compute platforms
  • Experience with some of the following technologies is a plus:
      ◦ Data movement and pipelines: Apache Airflow, dbt, Kafka, Airbyte
      ◦ Data warehouse: Snowflake
      ◦ Databases: SQL Server, PostgreSQL, DynamoDB
      ◦ Languages: Python, C#, Go, Bash, SQL
      ◦ CI/CD tools and infrastructure as code: GitLab, Azure DevOps, Terraform, Argo CD
      ◦ Containerization: Kubernetes, Docker
      ◦ Data tools: PySpark, Snowpark, AWS Glue, Pandas, Databricks, SageMaker

This position requires fluent written and oral communication in English.

Applicants must be authorized to work for any employer in the U.S. We are unable to sponsor or take over sponsorship of an employment visa at this time.

Benefits & conditions

  • Hybrid Work Opportunities
  • Flexible Time Off
  • Career Development & Mentoring Programs
  • Health & Wellness Benefits, including competitive health insurance offerings and generous paid parental leave for eligible new parents
  • Community Volunteering & Company Philanthropy Programs
  • Employee Peer Recognition Programs - "You Earned it"

About the company

Q2 is a leading provider of digital banking and lending solutions to banks, credit unions, alternative finance companies, and fintechs in the U.S. and internationally. Our mission is simple: build strong and diverse communities through innovative financial technology, and we do that by empowering our people to help create success for our customers.

Apply for this position