Data Architect - INTL India

Insight Global
Atlanta, United States of America
5 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Shift work
Languages
English
Experience level
Senior
Compensation
$35K

Job location

Atlanta, United States of America

Tech stack

Query Performance
Artificial Intelligence
Audit Trail
Backup Devices
Continuous Integration
Data Architecture
Information Engineering
Data Governance
Data Infrastructure
Data Masking
Data Security
Data Sharing
Database Development
DevOps
Disaster Recovery
Electronic Data Interchange (EDI)
Monitoring of Systems
Performance Tuning
Software Architecture
Query Optimization
Role-Based Access Control
Power BI
Data Streaming
Tableau
Software Vulnerability Management
Data Logging
Cloud Platform System
Data Classification
Snowflake
Caching
Backend
Infrastructure Automation Frameworks
Data Lineage
Data Management
Virtual Agents
Software Version Control
Data Pipelines
API Management

Job description

  • Design and maintain the overall data platform architecture, integrating Snowflake, AtScale, and dbt components
  • Establish data quality frameworks and monitoring systems
  • Define platform scalability and performance optimization strategies
  • Design and implement Snowflake warehouse configurations and resource management policies
  • Manage user access controls, roles, and security policies within Snowflake
  • Optimize query performance through warehouse sizing, clustering, and partitioning strategies
  • Implement data sharing and collaboration features across business units
  • Monitor and control Snowflake costs through usage optimization and resource scheduling
  • Design and maintain data pipeline architectures for ingestion and transformation
  • Establish connection protocols between AtScale and various BI tools (Tableau, Power BI, etc.)
  • Optimize query performance through aggregate tables and caching strategies
  • Manage AtScale cluster configurations and scaling policies
  • Implement security models that align with organizational data access requirements
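To give a flavor of the warehouse-configuration and role-based access work described above, here is a minimal sketch that renders the corresponding Snowflake DDL as strings. All object names (ANALYTICS_WH, BI_READER, ANALYTICS.MARTS) are hypothetical examples, not part of this role's actual environment.

```python
# Sketch: render Snowflake warehouse + read-only RBAC DDL for a business unit.
# Every name used here is a made-up example.

def warehouse_ddl(name: str, size: str = "XSMALL", auto_suspend_s: int = 60) -> str:
    """Warehouse with auto-suspend/auto-resume to keep idle compute costs down."""
    return (
        f"CREATE WAREHOUSE IF NOT EXISTS {name} "
        f"WAREHOUSE_SIZE = '{size}' "
        f"AUTO_SUSPEND = {auto_suspend_s} "
        f"AUTO_RESUME = TRUE;"
    )

def read_only_grants(role: str, database: str, schema: str) -> list:
    """Minimal role-based access: usage on the schema plus SELECT on its tables."""
    return [
        f"GRANT USAGE ON DATABASE {database} TO ROLE {role};",
        f"GRANT USAGE ON SCHEMA {database}.{schema} TO ROLE {role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {database}.{schema} TO ROLE {role};",
    ]

if __name__ == "__main__":
    print(warehouse_ddl("ANALYTICS_WH", size="MEDIUM"))
    for stmt in read_only_grants("BI_READER", "ANALYTICS", "MARTS"):
        print(stmt)
```

A short auto-suspend window and scoped, role-based grants are the usual first levers for the cost-control and security responsibilities listed above.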

dbt Development & Operations

  • Design and maintain dbt project structures and development workflows
  • Create and enforce dbt coding standards and documentation requirements
  • Implement dbt testing frameworks for data quality validation
  • Manage dbt deployment pipelines and environment promotion processes
  • Design modular dbt models that support reusability and maintainability
  • Establish version control and collaboration practices for dbt development teams
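The "testing frameworks for data quality validation" item above refers to checks like dbt's built-in `not_null` and `unique` tests. As an illustration only, the same checks can be expressed in plain Python over rows; the `orders` data and column names are hypothetical.

```python
# Sketch of what dbt-style data quality tests verify (not_null, unique),
# expressed as plain Python over a list of row dicts. Sample data is made up.

def not_null(rows, column):
    """Rows failing a not_null test: the column is missing or None."""
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    """Values failing a unique test: they appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

orders = [
    {"order_id": 1, "customer_id": "a"},
    {"order_id": 2, "customer_id": None},
    {"order_id": 2, "customer_id": "b"},
]
# not_null(orders, "customer_id") flags the None row;
# unique(orders, "order_id") flags the duplicated id 2.
```

In an actual dbt project these checks would be declared in a model's YAML file and run by `dbt test`, not hand-written.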

Integration & Data Flow Management

  • Design end-to-end data pipelines connecting source systems to Snowflake via dbt transformations
  • Implement data lineage tracking across all three platforms
  • Create monitoring and alerting systems for data pipeline health and performance
  • Establish disaster recovery and backup procedures for the integrated platform
  • Design API integrations and data exchange protocols between platforms

Team Leadership & Collaboration

  • Mentor and train data engineers and analysts on platform best practices
  • Collaborate with data science teams to support ML/AI workload requirements
  • Coordinate with DevOps teams on infrastructure automation and deployment practices
  • Lead architectural review sessions and technical decision-making processes

Performance & Cost Optimization

  • Monitor and optimize compute resource utilization across all platforms
  • Implement cost allocation and chargeback models for different business units
  • Analyze query patterns and usage metrics to identify optimization opportunities
  • Design and implement automated scaling policies based on workload demands
  • Establish SLA monitoring and reporting for platform availability and performance
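At its simplest, the chargeback model mentioned above allocates metered Snowflake credit spend to business units in proportion to their warehouse usage. A minimal sketch, with entirely hypothetical usage numbers and a hypothetical per-credit price:

```python
# Sketch of a simple chargeback model: cost per business unit =
# credits consumed by that unit * price per credit. All figures are made up.

def chargeback(credits_by_unit: dict, credit_price_usd: float) -> dict:
    """Map each business unit to its dollar cost, rounded to cents."""
    return {
        unit: round(credits * credit_price_usd, 2)
        for unit, credits in credits_by_unit.items()
    }

usage = {"finance": 120.0, "marketing": 80.0, "data_science": 300.0}
costs = chargeback(usage, credit_price_usd=3.0)
# costs["finance"] -> 360.0
```

In practice the credit figures would come from Snowflake's usage views rather than a hand-built dict, but the allocation arithmetic is the same.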

Security & Compliance

  • Implement data encryption, masking, and anonymization policies
  • Ensure compliance with regulatory requirements (GDPR, HIPAA, SOX, etc.)
  • Design and maintain audit trails and data access logging
  • Coordinate security assessments and vulnerability management
  • Establish data classification and handling procedures across all platforms

The Architect will need to collaborate with all engineers here and work with Debbie and her team during US hours. The Architect is responsible for the entire platform: data flow (how data moves from one source to another and ends up in Snowflake), cost optimization, performance, and security.
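As one concrete example of the masking policies listed under Security & Compliance, PII columns such as email addresses can be masked deterministically so the value is no longer readable but still joins consistently across tables. This hashing scheme is an illustrative assumption, not this team's actual policy.

```python
# Sketch: deterministic masking of an email column. The local part is
# replaced by a short stable hash; the domain is kept for analytics.
import hashlib

def mask_email(email: str) -> str:
    """Mask 'alice@example.com' as '<8-hex-chars>@example.com'."""
    local, _, domain = email.partition("@")
    digest = hashlib.sha256(local.encode()).hexdigest()[:8]
    return f"{digest}@{domain}"
```

Because the hash is deterministic, the same input always masks to the same output, which preserves joins and group-bys on the masked column. In Snowflake itself this would typically be enforced with a masking policy attached to the column rather than application code.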

Someone who knows Snowflake, and not just developing in Snowflake but the backend of developing in Snowflake too. Someone who can look at the big picture and think forward.

Projects and goals are set by quarter. This quarter they have up to 22 different things they want to accomplish; some are business related and some are platform related:

  • Creating semantic models
  • Hardening our metrics
  • Automated anomaly detections
  • Want to create a conversational AI agent for Snowflake
  • Sunsetting systems
  • Cost Control

L1 interview: Sunil
L2 interview: Debbie or someone she nominates

Requirements

  • 10+ years of experience in data engineering and data platform architecture, with ownership of enterprise-scale data platforms
  • Expert-level Snowflake experience, including backend architecture (warehouses, RBAC, resource monitors, cost controls), not just SQL development
  • Proven ownership of end-to-end data flow from source systems through transformation, semantic layer, and consumption in Snowflake
  • Deep experience designing and optimizing Snowflake performance and scalability, including warehouse sizing, clustering, concurrency, and query tuning
  • Demonstrated success managing Snowflake cost optimization, usage monitoring, and chargeback/showback models
  • Hands-on experience designing and maintaining semantic models using AtScale, including metrics, aggregates, caching, and BI integrations
  • Strong understanding of enterprise metric governance and hardening metric definitions for consistent analytical usage
  • Advanced experience with dbt, including project structure, modular modeling, testing, version control, and CI/CD pipelines
  • Ability to architect and manage complex data pipelines integrating multiple source systems into Snowflake via dbt transformations
  • Experience implementing data quality frameworks, monitoring, alerting, and data lineage across platforms
  • Strong background in platform security and governance, including access controls, data masking, encryption, and audit logging
  • Experience supporting AI/ML and advanced analytics workloads on Snowflake, including performance and scalability considerations
  • Proven architectural leadership: the ability to see the big picture, design for the future, and guide platform evolution by quarter
  • Experience mentoring and collaborating with data engineers, analytics engineers, data science, and DevOps teams
  • Strong communication skills, with experience working from Chennai while collaborating with US-based stakeholders during overlapping hours

Benefits & conditions

Shift: 11am-8pm IST, though for the first 3 months the role may work a later shift (2pm-11pm). Targeting a notice period of 30 days or less.

Here is the intake code for Data Architect: when the REQ is ready for PZone, please copy the title "Data Architect - INTL India".

When submitting the DEEL PPW request, please put "0fd7a3d7" as the code and EOR in the notes!

Benefit packages for this role will start on the 1st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.
