Senior Data Analyst

Trinetix
2 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English, Spanish
Experience level
Intermediate

Job location

Tech stack

API
Amazon Web Services (AWS)
Data analysis
Apache HTTP Server
Information Engineering
ETL
Data Warehousing
Identity and Access Management
Python
Performance Tuning
Power BI
Salesforce
SharePoint
SQL Databases
Parquet
Snowflake
State Machines
Service Bus
Pandas
Data Lake
Debezium
Real Time Data
Kafka
CloudWatch
DocuSign
Data Pipelines

Job description

  • Design and develop ETL/ELT pipelines in Snowflake using Snowpipe, ingesting data from internal systems, Salesforce, SharePoint, and DocuSign

  • Build and maintain dimensional data models in Snowflake using dbt, including data quality checks (Great Expectations, Deequ)

  • Implement CDC patterns for near real-time data synchronization

  • Manage and evolve the data platform across S3 Data Lake (Apache Iceberg) and Snowflake data warehouse

  • Build and maintain a Medallion-architecture data lake in Snowflake

  • Prepare ML features using SageMaker Feature Store

  • Develop analytical dashboards and reports in Power BI

What we offer

  • Continuous learning and career growth opportunities

  • Professional training and English/Spanish language classes

  • Comprehensive medical insurance

  • Mental health support

  • Specialized benefits program with compensation for fitness activities, hobbies, pet care, and more

Requirements

  • 5+ years of experience in data analysis or data engineering

  • 3+ years of hands-on experience building and supporting production ETL/ELT pipelines

  • Advanced SQL skills (CTEs, window functions, performance optimization)

  • Strong Python skills (pandas, API integrations)

  • Proven experience with Snowflake (schema design, Snowpipe, Streams, Tasks, performance tuning, data quality)

  • Solid knowledge of AWS services: S3, Lambda, EventBridge, IAM, CloudWatch, Step Functions

  • Strong understanding of dimensional data modeling (Kimball methodology, SCDs)

  • Experience working with enterprise systems (ERP, CRM, or similar)

Nice-to-haves

  • Experience with data quality frameworks (Great Expectations, Deequ)

  • Knowledge of CDC tools and concepts (AWS DMS, Kafka, Debezium)

  • Hands-on experience with data lake technologies (Apache Iceberg, Parquet)

  • Exposure to ML data pipelines and feature stores (SageMaker Feature Store)

  • Experience with document processing tools such as Amazon Textract

About the company

Established in 2011, Trinetix is a dynamic tech service provider supporting enterprise clients around the world. Headquartered in Nashville, Tennessee, we have a global team of over 1,000 professionals and delivery centers across Europe, the United States, and Argentina. We partner with leading global brands, delivering innovative digital solutions across Fintech, Professional Services, Logistics, Healthcare, and Agriculture. Our operations are driven by a strong business vision, a people-first culture, and a commitment to responsible growth. We actively give back to the community through various CSR activities and adhere to international principles for sustainable development and business ethics. To learn more about how we collect, process, and store your personal data, please review our Privacy Notice: https://www.trinetix.com/corporate-policies/privacy-notice