Data Engineer & Database Administrator

STAFFING TECHNOLOGIES
Bellevue, United States of America
3 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Intermediate

Job location

Bellevue, United States of America

Tech stack

Clean Code Principles
Airflow
Amazon Web Services (AWS)
Backup Devices
Code Review
Databases
Continuous Integration
Information Engineering
Data Transformation
Data Retention
Data Systems
Database Security
Shard (Database Architecture)
DevOps
Disaster Recovery
Document-Oriented Databases
Systems Monitoring
Python
PostgreSQL
MongoDB
NoSQL
Performance Tuning
Query Optimization
Standard SQL
Data Streaming
Scripting (Bash/Python/Go/Ruby)
Indexer
CloudFormation
Data Lake
Information Technology
Kafka
CloudWatch
Terraform
Data Pipelines
Redshift

Job description

· Design and optimize data models in PostgreSQL, MongoDB, and Amazon Redshift

· Develop and manage data workflows using AWS (S3, Glue, Lambda, Step Functions, Kinesis)

· Administer and maintain database environments across development, staging, and production

· Monitor data pipelines and databases, troubleshoot issues, and implement alerts

· Optimize query performance, indexes, and configurations across relational and NoSQL systems

· Manage database provisioning, upgrades, backups, and disaster recovery (RDS, MongoDB, Redshift)

· Ensure database security, including access control, encryption, and role management

· Plan capacity and scale systems to support growing data needs

· Define and enforce data retention and archival policies

· Collaborate with analytics and product teams to support reporting and data needs

· Document data pipelines, database processes, and operational procedures

· Participate in code reviews and follow engineering best practices

Requirements

· Design, build, and maintain ETL/ELT pipelines across multiple systems

· Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience

· 3-5 years of data engineering experience or equivalent

Qualifications:

· 3-5 years of experience in data engineering, database administration, or similar roles

· Strong experience with PostgreSQL, MongoDB, and Amazon Redshift

· Solid SQL skills for both transactional and analytical workloads

· Experience with AWS data and database services (S3, Glue, Lambda, RDS, Redshift, etc.)

· Proficiency in Python or another scripting language

· Experience with workflow orchestration tools (Airflow, Step Functions, etc.)

· Hands-on database administration experience, including:

  · MongoDB (replica sets, sharding, indexing, backups)

  · Redshift (cluster management, query tuning, WLM, snapshots)

  · PostgreSQL (replication, performance tuning, connection pooling)

· Familiarity with monitoring tools (CloudWatch, pgBadger, MongoDB Atlas, etc.)

· Understanding of database security (encryption, auditing, least-privilege access)

· Strong problem-solving and analytical skills

· Ability to translate business needs into data solutions

· Comfortable working in a fast-paced, collaborative environment

· Clear communicator with both technical and non-technical audiences

· Self-motivated with a focus on clean, maintainable code

Nice to Have

· Experience with Kafka or Kinesis (streaming data)

· Familiarity with dbt for data transformation

· Knowledge of data lake/lakehouse architectures (Delta Lake, AWS Lake Formation)

· Experience with Terraform or CloudFormation

· CI/CD experience for data pipelines

· Basic DevOps skills

Apply for this position