Databricks Architect (Modern Big Data)
Role details
Job location
Tech stack
Job description
We are seeking a highly experienced and visionary Data Solutions Architect (Modern Big Data) to join our Data & AI practice. The successful candidate will bring extensive expertise in architecting and delivering modern big data platforms that are scalable, reliable, and business-aligned. This role is pivotal in enabling clients to harness the power of streaming data, data lakes, lakehouses, and advanced analytics platforms, while guiding them on their data modernisation journeys.
As a trusted advisor, you will collaborate with executives, stakeholders, and technical teams to define modern big data strategies, design cloud-native architectures, and implement industry-leading best practices. You will thrive in a fast-paced, evolving technology environment, continuously expanding your knowledge to ensure NTT DATA and our clients remain leaders in data-driven innovation.
What you'll be doing:
Primary Responsibilities:
- Client Engagement & Delivery
- Solution Design & Implementation
- Modernisation & Transformation
- Thought Leadership & Knowledge Sharing
- Collaboration & Leadership
Business Relationships:
- Client Partners
- Practice Leaders and Members
- Peer-level relationships within client organisations up to Head of Data Engineering, Chief Data Architect, CIO, and CDO level
Requirements
- 8+ years of data architecture experience - Enterprise-scale solutions across multiple sectors with a proven delivery track record
- Technical leadership at scale - Leading 15+ person cross-functional teams and serving as the technical escalation point for C-level stakeholders
- Full data lifecycle mastery - End-to-end expertise from ingestion to consumption, including governance, security, and both batch and real-time processing
- Business-technology translation - Ability to align data strategy with business objectives and communicate across all stakeholder levels
- Databricks platform expertise - Deep hands-on experience with Databricks Lakehouse architecture, Delta Lake, Unity Catalog, and multi-cloud implementations
Must be eligible for SC clearance
Nice to Have:
- Cloud-native architecture expertise - Hands-on experience with AWS/Azure/GCP, data lakes, real-time streaming, and infrastructure-as-code
- Presales & business development experience - Track record supporting opportunity qualification, bid reviews, proposal development, and client-facing sales activities
- Data governance & compliance - Strong background in security frameworks, regulatory compliance (GDPR), data lineage, and quality management
- AI/ML integration capabilities - Experience with MLOps, analytics platforms, and integrating AI/ML into data architectures
- Agile delivery & thought leadership - Proven agile/hybrid delivery experience with contribution to practice growth through proposition development and knowledge sharing
Experience & Qualifications:
- Experience: Minimum 8-12 years in data architecture, engineering, or consulting, with at least 4 years in modern big data solution architecture.
- Education: University degree required.
- Preferred: BSc/MSc in Computer Science, Data Engineering, or a related field.
- Relevant certifications in Databricks, Kafka, or cloud platforms are highly desirable.