Associate Data Engineer, AWS Cloud

Publicis Sapient
Los Angeles, United States of America
yesterday

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
$145K

Job location

Remote
Los Angeles, United States of America

Tech stack

Java
JavaScript
Agile Methodologies
Amazon Web Services (AWS)
Big Data
Code Review
Databases
Continuous Integration
Data Validation
ETL
Data Transformation
Data Warehousing
Amazon DynamoDB
Graph Database
Python
Metadata
Metadata Repositories
Microsoft SQL Server
MySQL
NoSQL
Oracle Applications
Cloud Services
SQL Databases
Data Streaming
Software Repository
Data Processing
Google Cloud Platform
Cloud Platform System
Real Time Systems
Spark
Cloudformation
Event Driven Architecture
Data Lake
PySpark
Information Technology
Data Lineage
Cosmos DB
Data Management
Vertica
Functional Programming
Terraform
Data Pipelines
Azure
Databricks

Job description

Senior Associate, AWS Data Engineering - Los Angeles (on-site 3x per week)

As a Senior Associate Data Engineer, you will be responsible for designing, developing, and maintaining scalable Big Data solutions. You will work with large datasets, real-time processing frameworks, and AWS cloud-based data platforms to enable data-driven decision-making for our clients.

Your Impact

  • Combine your technical expertise and problem-solving passion to work closely with clients, turning complex ideas into end-to-end solutions that transform our clients' business
  • Lead, design, develop, and deliver large-scale data systems, data processing, and data transformation projects that deliver business value for clients
  • Automate data platform operations and manage the post-production system and processes
  • Conduct technical feasibility assessments and provide project estimates for the design and development of the solution
  • Provide technical inputs to agile processes, such as epic, story, and task definition to resolve issues and remove barriers throughout the lifecycle of client engagements
  • Create and maintain infrastructure-as-code for cloud, on-prem, and hybrid environments using tools such as Terraform, CloudFormation, Azure Resource Manager, Helm, and Google Cloud Deployment Manager
  • Mentor, support, and grow junior team members

Requirements

  • Demonstrable experience in data platforms involving implementation of end-to-end data pipelines
  • Hands-on experience with Amazon Web Services Cloud Platform
  • Implementation experience with column-oriented database technologies (e.g., Redshift, Vertica), NoSQL database technologies (e.g., DynamoDB, Cosmos DB), and traditional database systems (e.g., SQL Server, Oracle, MySQL)
  • Experience implementing data pipelines for both streaming and batch integrations using tools/frameworks such as Glue ETL, Lambda, Spark, and PySpark streaming
  • Ability to handle module- or track-level responsibilities and contribute to tasks hands-on
  • Experience in data modeling, warehouse design and fact/dimension implementations
  • Experience working with code repositories and continuous integration
  • Data modeling, querying, and optimization for relational, NoSQL, timeseries, and graph databases and data warehouses and data lakes
  • Data processing programming using SQL, dbt, and Python
  • Experience with data processing platforms such as Databricks
  • Logical programming in Python, Spark, PySpark, Java, JavaScript, and/or Scala
  • Data ingest, validation, and enrichment pipeline design and implementation
  • Cloud-native data platform design with a focus on streaming and event-driven architectures
  • Test programming using automated testing frameworks, data validation and quality frameworks, and data lineage frameworks
  • Metadata definition and management via data catalogs, service catalogs, and stewardship tools such as AWS Glue Catalog, OpenMetadata, DataHub, Alation, and similar
  • Code review and mentorship
  • Bachelor's degree in Computer Science, Engineering, or a related field

Benefits & conditions

The range shown represents a grouping of relevant ranges currently in use at Publicis Sapient. The actual range for this position may differ, depending on location and the specific skillset required for the work itself.

  • An inclusive workplace that promotes diversity and collaboration.
  • Access to ongoing learning and development opportunities.
  • Competitive compensation and benefits package.
  • Flexibility to support work-life balance.
  • Comprehensive health benefits for you and your family.
  • Generous paid leave and holidays.
  • Wellness program and employee assistance.

About the company

Publicis Sapient is a digital transformation partner helping established organizations reach their future, digitally enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting, and customer experience with agile engineering and problem-solving creativity.

Apply for this position