Data Engineer

S R H MEDIA
Chicago, United States of America

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
$130K

Job location

Chicago, United States of America

Tech stack

Adobe Analytics
Amazon Web Services (AWS)
Data analysis
JIRA
Big Data
Cloud Engineering
Information Engineering
ETL
Data Transformation
Data Visualization
Data Warehousing
Linux
Python
Cloud Services
SQL Databases
Tableau
Software Repository
Google Cloud Platform
Microsoft Power Automate
Core API
Backend
GIT
Information Technology
Real Time Data
Data Delivery
Data Pipelines
Redshift
Programming Languages

Job description

This role reports to the Senior Director, Data Engineering and is part of the Data Engineering Team, which builds, enhances and optimizes ETL (Extract, Transform, Load) jobs, implements automated data warehouse solutions, conducts QA and manages day-to-day tasks. The candidate must have strong experience with data warehouses, data pipeline design/development, SQL, AWS Cloud and Big Data technologies. As an experienced Data Engineer, you will also support and mentor other team members, who will look to your experience and guidance to solve problems. Complete oversight of our data warehouse and the ability to work directly with stakeholders on the Investment, Strategy and Analytical teams are additional responsibilities expected of you.

  • Contribute to the growth of the enterprise data warehouse (EDW) by working alongside stakeholders to produce accurate information within database objects used in reporting, strategic planning and decision making

  • Assist the Senior Director in developing long-term strategies and capacity planning for meeting future data warehouse and data automation needs
  • Meet with end-users and data SMEs to clarify requirements and business rules needed to design, develop and implement ETL/ELT processes
  • Collaborate with peer Data Engineers to build upon and support our CI/CD pipelines hosted within our extensive GIT repository
  • Mentor our emerging Data Analysts, including providing feedback and suggestions for their planned workflows prior to deployment into production
  • Respond to, investigate and correct reported data inaccuracies in the EDW relative to source systems and business-expected results
  • Establish relationships with external and internal POCs to ensure seamless inbound and outbound Data Delivery expectations are met
  • Implement and support backend maintenance processes that enforce ongoing data optimization and corporate retention policies while ensuring the historical data needs of the client are met
  • Document and maintain all the processes and procedures pertaining to ETL and data warehouse usage/development, contributing to the team knowledgebase when applicable
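The ETL/ELT work described above can be illustrated with a minimal sketch in Python, using an in-memory SQLite database as a stand-in for the warehouse. The table, field names and sample data are hypothetical, not taken from the posting:

```python
import sqlite3

def extract(rows):
    """Extract: yield raw records from a source system (here, an in-memory list)."""
    yield from rows

def transform(record):
    """Transform: normalize field names and types before loading."""
    return (record["campaign"].strip().lower(), int(record["impressions"]))

def load(conn, records):
    """Load: write transformed rows into the warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_delivery (campaign TEXT, impressions INTEGER)"
    )
    conn.executemany("INSERT INTO fact_delivery VALUES (?, ?)", records)
    conn.commit()

source = [
    {"campaign": " Spring_Launch ", "impressions": "1200"},
    {"campaign": "Brand_Awareness", "impressions": "3400"},
]

conn = sqlite3.connect(":memory:")
load(conn, (transform(r) for r in extract(source)))
total = conn.execute("SELECT SUM(impressions) FROM fact_delivery").fetchone()[0]
print(total)  # → 4600
```

In production, the extract step would typically read from platform APIs or S3, and the load step would target Redshift rather than SQLite, but the three-stage shape is the same.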

Requirements

You will have the opportunity to contribute to end-to-end platform design for our cloud architecture and work multi-functionally with our Analytics, Data Science, IT and Investment teams to build batch and real-time data solutions. This is a very hands-on position, requiring extensive experience developing workflows in Python, a willingness to dive into legacy code repositories, and the ability to maintain complex data transformation processes on the AWS cloud. Good decision-making, time management, prioritization and escalation skills are needed to ensure continued success in this role.

  • Bachelor's degree in Computer Science or a similar technical degree within a STEM field

  • 5+ years of hands-on experience in building data pipelines using traditional ETL tech stack, platform APIs and big data technologies
  • 5+ years of experience in programming languages such as Python, working within a Linux-based environment
  • Experience designing, deploying and supporting production cloud services in the AWS ecosystem, including EC2, S3 and Redshift
  • Can explain concepts, designs and benefits to multiple audiences, as well as interpret high-level business requirements from conversation
  • Able to work within a cross-functional team environment with people from multiple business units, vendors, countries, and cultures
  • Working experience within the advertising/media industry with knowledge of basic media math
  • Experience in Social Media Datasets such as Twitter, Facebook, Instagram is a plus
  • Experience in Google Campaign Manager, Xandr, or Adobe Analytics datasets is a plus
  • Experience with Google Cloud products is a plus
  • Experience with Microsoft Power Automate is a plus
  • Experience using Project Management tools such as Jira is a plus
  • Experience creating data visualizations and using tools such as Tableau is a plus
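One recurring task the responsibilities call out is reconciling EDW contents against source systems. A minimal sketch of that kind of QA check might look like the following; the metric keys, values and tolerance are hypothetical illustrations, not details from the posting:

```python
def reconcile(source_totals, edw_totals, tolerance=0):
    """Compare per-key metric totals between a source system and the EDW.

    Returns a list of (key, source_value, edw_value) tuples for any key
    whose values differ by more than the allowed tolerance, or that is
    missing from the warehouse entirely.
    """
    mismatches = []
    for key, src in sorted(source_totals.items()):
        edw = edw_totals.get(key)
        if edw is None or abs(src - edw) > tolerance:
            mismatches.append((key, src, edw))
    return mismatches

# Daily impression totals as reported by the source vs. as loaded in the EDW
source_totals = {"2024-01-01": 1500, "2024-01-02": 1750}
edw_totals = {"2024-01-01": 1500, "2024-01-02": 1740}

print(reconcile(source_totals, edw_totals))  # → [('2024-01-02', 1750, 1740)]
```

In practice the two dictionaries would come from SQL aggregates run against the source and Redshift, and mismatches would feed an alerting or ticketing workflow rather than a print statement.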


About the company

Hearts & Science has been inspired by confident marketers seeking business advantage in a world of personalized digital marketing, where CRM and addressable channels converge, and decisions must be made in real time to aggregate effective reach and deliver the right message at the right time. Designed to inform brand strategies with real-time insights, Hearts & Science is a data-driven marketing agency with expert media planning and buying capabilities, among other services that include shopper marketing, marketing innovation and content activation.
