Senior Data Engineer

Eliassen Group
Boston, United States of America

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Boston, United States of America

Tech stack

Java
Agile Methodologies
Amazon Web Services (AWS)
Databases
Information Engineering
ETL
Data Visualization
Data Warehousing
R
Python
PostgreSQL
Machine Learning
MySQL
Oracle Applications
Grafana
GitLab
Data Lake
Kubernetes
Information Technology
Data Analytics
Kafka
Apache NiFi
Data Management
CloudWatch
API Gateway
Kibana
REST
Splunk
Data Pipelines
DevSecOps
API Management
Docker
Programming Languages

Job description

· Develop, optimize, and maintain data ingest flows using Apache Kafka, Apache NiFi, and MySQL/PostgreSQL

· Develop components within the AWS cloud platform using services such as Redshift, SageMaker, API Gateway, QuickSight, and Athena

· Communicate with data owners to set up and verify configuration parameters

· Document SOPs related to streaming configuration, batch configuration, or API management, depending on role requirements

· Document details of each data ingest activity to ensure they can be understood by the rest of the team

· Develop and maintain best practices in data engineering and data analytics while following Agile DevSecOps methodology

Requirements

· Strong analytical skills, including statistical analysis, data visualization, and machine learning techniques

· Strong understanding of programming languages like Python, R, and Java

· Expertise in building modern data pipelines and ETL (extract, transform, load) processes using tools such as Apache Kafka and Apache NiFi

· Proficient in programming languages like Java, Scala, or Python

· Experience or expertise using, managing, and/or testing API Gateway tools and Rest APIs

· Experience with traditional database and data warehouse products such as Oracle and MySQL

· Experience with modern data management technologies such as data lake, data fabric, and data mesh architectures

· Experience creating DevSecOps pipelines using CI/CD/CT tools and GitLab

· Excellent written and oral communication skills, including strong technical documentation skills

· Strong interpersonal skills and ability to work collaboratively in a dynamic team environment

· Proven track record in a demanding, customer-service-oriented environment

· Ability to communicate clearly with all levels within an organization

· Excellent analytical skills, organizational abilities and problem-solving skills

· Experience implementing data observability solutions using tools such as Grafana, Splunk, AWS CloudWatch, and Kibana

· Experience with container technologies such as Docker, Kubernetes, and Amazon EKS

Qualifications:

· Ability to obtain an active Secret clearance or higher

· Bachelor's degree in Computer Science, Engineering, or another technical discipline required, OR a minimum of 8 years of equivalent work experience

· 8+ years of IT data/system administration experience

· AWS Cloud certifications are a plus
