Analytics Engineer
Job description
Analytics Engineer, DoorDash, Inc., Seattle, WA. Build database solutions for various use cases, including operational reporting, product analytics, marketing optimization, and financial reporting. Implement dashboards, data structures, and data warehouse architecture to serve as the foundation for decision-making at DoorDash. Leverage knowledge of database fundamentals and experience with Hadoop or similar ecosystems to design and develop appropriate data models and pipelines for data lakes and data warehouses. Own and define business KPIs, their measurement plans, and data/reporting requirements. Collaborate with engineering, product teams, and third-party partners to ensure all required data is collected. Use experience with DBMS platforms such as Snowflake, Redshift, and/or PostgreSQL and tools such as Tableau, Sigma, Looker, and/or Superset to develop and implement scalable reporting and dashboarding solutions. Address ad-hoc reporting requirements and identify pathways for automation. Build and enforce common design patterns to increase report reusability, readability, and standardization. Telecommuting Permitted. (AE-S-102-WA)
Requirements
Master's degree (or foreign equivalent) in Computer Science, Computer Information Systems, Engineering (any), or related field of study plus two (2) years of experience in the field of software engineering/robotics, or related occupation.
In the alternative, employer will accept a Bachelor's degree (or foreign equivalent) in Computer Science, Computer Information Systems, Engineering (any), or related field of study plus five (5) years of progressive post-bachelor experience in the field of software engineering/robotics, or related occupation.
Qualifying experience must include two (2) years in the following skills (which may be gained concurrently):
- Writing SQL statements, including use of aggregate functions and joins;
- Conducting quantitative analyses on large data sets;
- Data warehouse architecture;
- Data modeling design;
- ETL pipeline design, implementation, and maintenance;
- BI visualization tools including Tableau, Looker, QlikView, Qlik Sense, or similar;
- Cloud-based computing services and data warehouses such as Snowflake, Google BigQuery, Amazon Redshift, or similar;
- Linux/OS X command line and version control software (Git);
- Python scripting to process data for modeling.
Any suitable combination of education, training, and experience is acceptable.
Up to 10% domestic travel may be required based on business need.
Benefits & conditions
40 hrs/week, Mon-Fri, 8:30 a.m. - 5:30 p.m. Salary range: $168,111 - $207,400/yr. Standard company benefits.