Senior Data Analyst
Job description
- Design and develop ETL/ELT pipelines in Snowflake (Snowpipe), integrating data from internal systems, Salesforce, SharePoint, and DocuSign
- Build and maintain dimensional data models in Snowflake using dbt, including data quality checks (Great Expectations, Deequ)
- Implement CDC patterns for near real-time data synchronization
- Manage and evolve the data platform across the S3 data lake (Apache Iceberg) and the Snowflake data warehouse
- Build and maintain a Medallion-architecture data lake in Snowflake
- Prepare ML features using SageMaker Feature Store
- Develop analytical dashboards and reports in Power BI
What we offer
- Continuous learning and career growth opportunities
- Professional training and English/Spanish language classes
- Comprehensive medical insurance
- Mental health support
- Specialized benefits program with compensation for fitness activities, hobbies, pet care, and more
Requirements
- 5+ years of experience in data analysis or data engineering
- 3+ years of hands-on experience building and supporting production ETL/ELT pipelines
- Advanced SQL skills (CTEs, window functions, performance optimization)
- Strong Python skills (pandas, API integrations)
- Proven experience with Snowflake (schema design, Snowpipe, Streams, Tasks, performance tuning, data quality)
- Solid knowledge of AWS services: S3, Lambda, EventBridge, IAM, CloudWatch, Step Functions
- Strong understanding of dimensional data modeling (Kimball methodology, SCDs)
- Experience working with enterprise systems (ERP, CRM, or similar)
Nice-to-haves
- Experience with data quality frameworks (Great Expectations, Deequ)
- Knowledge of CDC tools and concepts (AWS DMS, Kafka, Debezium)
- Hands-on experience with data lake technologies (Apache Iceberg, Parquet)
- Exposure to ML data pipelines and feature stores (SageMaker Feature Store)
- Experience with document processing tools such as Amazon Textract