Data Engineer

Randstad
Malvern, United States of America
1 month ago

Role details

Contract type
Contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
$146K

Job location

Malvern, United States of America

Tech stack

API
Artificial Intelligence
Amazon Web Services (AWS)
Big Data
Bioinformatics
Cloud Computing
Computer Programming
Data Validation
Information Engineering
Data Integrity
ETL
Data Systems
Distributed Systems
Amazon DynamoDB
Failover
Fault Tolerance
Python
Machine Learning
Prometheus
Swagger
SQL Databases
TypeScript
Web Services
OpenAPI
Data Processing
Load Balancing
Real Time Systems
Large Language Models
Grafana
Reliability of Systems
Backend
Data Lake
GraphQL
API Design
API Gateway
REST
Data Inconsistencies
Data Pipelines
Microservices

Job description

job summary: Required Skills - Top 3-5 Key Words

Required Technical Skills: Python, SQL, AWS services (Glue, S3, Lambda)

Core Programming Skills:

Expert proficiency in Python, with experience building data pipelines and back-end systems.

Advanced knowledge of SQL for querying and optimizing large datasets.

AWS Cloud Services Expertise:

DynamoDB, S3, Athena, Glue ETL, Lambda, ECS, Glue Data Quality, EventBridge, Redshift ML, OpenSearch, and RDS.

API and Resilience Engineering:

Proven expertise in designing fault-tolerant APIs using Swagger/OpenAPI, GraphQL, and RESTful standards.

Robust understanding of distributed systems, load balancing, and failover strategies.

Monitoring and Orchestration:

Hands-on experience with Prometheus and Grafana for observability and monitoring.

Job Duties

Senior Data Engineer - 7+ Years of Experience

We are seeking a highly experienced Senior Data Engineer with 7+ years of expertise in designing, building, and optimizing robust data solutions. The ideal candidate must possess top-tier skills in Python, AWS services, API development, and TypeScript, and have significant hands-on experience with anomaly detection systems.

The candidate should have a proven ability to work at both strategic and tactical levels, from designing data architectures to implementing them in the weeds.

location: Malvern, Pennsylvania
job type: Contract
salary: $65 - 70 per hour
work hours: 8am to 5pm
education: Bachelors

responsibilities: Job Requirements

Key Responsibilities:

Data Pipeline Development

  • Independently design, build, and maintain complex ETL pipelines, ensuring scalability and efficiency for large-scale data processing needs.

  • Manage pipeline complexity and orchestration, delivering high-performance data products accessible via APIs for business-critical applications.

  • Archive processed data products into data lakes (e.g., AWS S3) for analytics and machine learning use cases.

Anomaly Detection and Data Quality

  • Implement advanced anomaly detection systems and data validation techniques, ensuring data integrity and quality.

  • Leverage AI/ML methodologies, including Large Language Models (LLMs), to detect and address data inconsistencies.

  • Develop and automate robust data quality and validation frameworks.

Cloud and API Engineering

  • Architect and manage resilient APIs using modern patterns, including microservices, RESTful design, and GraphQL.

  • Configure API gateways, circuit breakers, and fault-tolerant mechanisms for distributed systems.

  • Ensure horizontal and vertical scaling strategies for API-driven data products.

Monitoring and Observability

  • Implement comprehensive monitoring and observability solutions using Prometheus and Grafana to optimize system reliability.

  • Establish proactive alerting systems and ensure real-time system health visibility.

Cross-functional Collaboration and Innovation

  • Collaborate with stakeholders to understand business needs and translate them into scalable, data-driven solutions.

  • Continuously research and integrate emerging technologies to enhance data engineering practices.


Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.

At Randstad Digital, we welcome people of all abilities and want to ensure that our hiring and interview process meets the needs of all applicants. If you require a reasonable accommodation to make your application or interview experience a great one, please contact HRsupport@randstadusa.com.

Pay offered to a successful candidate will be based on several factors including the candidate's education, work experience, work location, specific job duties, certifications, etc. In addition, Randstad Digital offers a comprehensive benefits package, including: medical, prescription, dental, vision, AD&D, and life insurance offerings, short-term disability, and a 401K plan (all benefits are based on eligibility).

This posting is open for thirty (30) days.

Any consideration of a background check would be an individualized assessment based on the applicant or employee's specific record and the duties and requirements of the specific job.


Apply for this position