Data Platform Architect - AIRLKLHV
Job description
We are seeking an experienced Data Platform Architect to design and implement scalable, high-performance data platforms that support enterprise analytics, machine learning, and reporting initiatives. This role is critical in building the data foundation that enables data-driven decision-making across the organization. The ideal candidate will bring deep expertise in modern data architectures, including lakehouse models and cloud-based ecosystems, along with the ability to align technical solutions with business objectives.
Responsibilities
- Design and implement enterprise-grade data platforms for large-scale analytics and AI use cases
- Develop and optimize data pipelines for ingestion, transformation, and storage (ETL/ELT)
- Evaluate and recommend data technologies based on business and technical requirements
- Define and implement data governance, security, and compliance frameworks
- Collaborate with cross-functional engineering teams to integrate data systems seamlessly
- Optimize data infrastructure for performance, scalability, and reliability
- Enable advanced analytics and machine learning workloads
- Provide technical leadership, guidance, and mentorship to data engineering teams
Requirements
- Extensive experience designing and implementing modern data architectures
- Strong knowledge of cloud data platforms (AWS, Azure, or GCP)
- Hands-on experience with ETL/ELT tools and data pipeline development
- Solid understanding of data governance, security, and compliance practices
- Experience working in large-scale enterprise data environments
- Strong expertise in database technologies and data modeling
- Ability to translate complex business requirements into scalable technical solutions
Must-Have Skills
- Proven experience designing enterprise-scale data platforms supporting AI/ML workloads
- Experience integrating legacy systems with modern cloud data platforms
- Strong problem-solving, architecture, and decision-making skills
- Excellent communication and collaboration abilities
- Experience with lakehouse architectures (e.g., Delta Lake, Iceberg, or similar)
- Familiarity with big data processing frameworks (e.g., Spark)
- Knowledge of data observability and monitoring tools
- Experience with MLOps and data platform optimization for AI use cases
Work Environment
- Remote / Hybrid flexibility
- Collaborative, fast-paced, and innovation-driven environment
Benefits & conditions
- SaidGig
- San Francisco, CA
- $90.00 per hour