Lead Data Engineer of IntelliScript's Data Platform
Role details
Job description
IntelliScript's Data Platform has been a key part of our success and is critical to our future. As Lead Data Engineer of IntelliScript's Data Platform, you will be a trusted advisor across leadership and engineering teams, responsible for technical decisions. You will ensure data systems work together across our hybrid environments with a focus on scale and performance. You will lead technical and architecture decisions in collaboration with architecture and data engineering teams while ensuring compliance with industry-leading data privacy standards.
What you will be doing
- Acts as a subject matter expert and thought leader within the Data Platform Domain
- Data Strategy: Serves as a thought leader in data processing design and implementation, defining advanced structure for moving, storing, and maintaining high-quality data.
- Team Leadership: Leads projects by managing timelines, coordinating teams, and communicating project statuses. Influences organizational direction through effective leadership and strategic collaboration
- Data Governance and Security: Serves as a subject matter expert on governance standards, continuously aligning data practices with evolving industry best practices and requirements
- Project Management and Scope of Work: Contributes to defining the overall vision and strategy for data engineering within the organization, ensuring alignment with organizational goals and long-term objectives
- Results Orientation: Establishes visionary goals, advises on strategic plans, employs advanced monitoring, influences high-level stakeholders, and delivers transformative results
- Data Platform: Expansion of our Data Warehouse(s) and Lakehouse solutions for a healthcare-data-focused enterprise
- Data Governance: Configuring and maintaining Unity Catalog to enable enterprise data lineage, data quality, auditability, and data stewardship
- Data Security: Building out data security protocols and best practices, including the management of solutions for identified and de-identified data (PHI/PII)
- Access Management: Ensuring a policy of least privilege is followed in everything that is implemented
- External Data Products: Building data solutions for clients while upholding the best standards for reliability, quality, and performance
- ETL: Building solutions within Delta Live Tables and automation of transformations
- Medallion Architecture: Building out performant enterprise-level medallion architecture(s)
- Streaming and Batch Processing: Building fit-for-purpose near real-time streaming and batch solutions
- Large Data Management: Building out performant and efficient enterprise solutions for internal and external users for both structured and unstructured healthcare data
- Platform Engineering: Building out Infrastructure as Code using Terraform and Asset Bundles
- Costs: Working with the business to build cost effective and cost transparent Data solutions
- Pipeline/ETL Management: You will help architect, build, and maintain robust and scalable data pipelines, monitoring, and optimizing performance
- Experience working with migration tools (e.g., Fivetran), AWS technologies, and custom solutions
- Identify and implement improvements to enhance data processing efficiency
- Design and implement reliable and resilient Event Driven data processing
- Experience with building out effective pipeline monitoring solutions
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Delta Live Tables, Python, Scala, and cloud-based 'big data' technologies
- API Development: Drive our design and implementation of internal APIs for integrating data between different systems and applications
- Integration with external systems utilizing API driven processes to ingest data
- Develop APIs built on top of datasets for internal systems to consume data from Databricks
- Experience integrating with external APIs including but not limited to Salesforce, Financial systems, HR systems and other external systems
- Data Modeling: Lead design, implementation, and maintenance of standards based (FHIR, OMOP, etc.) and efficient data models for both structured and unstructured data
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Develop and maintain data models, ensuring they align with business objectives and data privacy regulations
- Collaboration: Partner internally and externally with key stakeholders to ensure we are providing meaningful, functional, and valuable data
- Effectively work with Data, Development, Analysts, Data Science, and Business team members to gather requirements, propose, and build solutions.
- Communicate complex technical concepts to non-technical stakeholders and provide guidance on best practices.
- Ensure that technology execution aligns with business strategy and provides efficient, secure solutions and systems
- Gather requirements and build out project plans to implement those requirements with forecasted efforts to implement
- Processes and Tools: Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build analytics tools that utilize the data pipeline to provide actionable insights into operational efficiency and other key business performance metrics.
- Create data tools for clinical, analytics, and data science team members that help them build and optimize our product into an innovative industry leader.
- Lead investigation of new tooling, develop implementation plans, and deployment of necessary tooling
Requirements
- 10+ years of relevant experience in design, development, and testing of Data Platform solutions, such as Data Warehouses, Data Lakes, and Data Products
- Expert level experience working in Databricks and AWS
- Expert level experience working in both relational and non-relational databases such as SQL Server, PostgreSQL, DynamoDB, DocumentDB
- Experience managing and standardizing clinical data from structured and unstructured sources
- Experience building and managing solutions on AWS
- Expert knowledge in healthcare standards including FHIR, C-CDA, and traditional HL7
- Expert knowledge in clinical standards/ontologies including ICD-10, SNOMED, NDC, LOINC, and RxNorm
- Expert in building out data models, data warehouses, designing of data lakes for enterprise and product use cases
- Familiarity with designing and building APIs, ETL and data ingestion processes and utilization of tools to support enterprise solutions
- Experience in performance tuning, query optimization, security, monitoring, and release management
- Experience working with and managing large, disparate, identified and de-identified data sets from multiple data sources
- Experience building and deploying IaC using Terraform, asset bundles, and GitHub
- Experience collaborating with Data Science teams and building AI based solutions to drive efficiencies and business value
What you bring to the table
- Demonstrated "let's find a way to do it" attitude - no task is too big or too small
- Effective collaboration and communication across multiple technical and non-technical disciplines
- Comfortable working through ambiguous situations
- Able to teach & mentor others on new/emerging technologies
- Customer obsessed with a business-centric focus
- Able to understand both strategic and tactical needs and balance appropriately
- Driven, thorough and self-directed
- Able to lead through influence and persuasion
- Curiosity to explore industry trends and drive the organization to utilize fit for purpose solutions
Wish list
- Bachelor's or master's degree in computer science, data engineering, or a related field
- Health and Life Insurance business experience
- Professional level solution architecture certification in AWS
Benefits & conditions
Our team is smart, down-to-earth, and ready to listen to your best ideas. We reward excellence and offer competitive compensation and benefits. Visit our LinkedIn page for a closer look at our company, and learn more about our cultural values here. The overall salary range for this role is $117,500 - $222,985. For candidates residing in:
- Alaska, California, Connecticut, Illinois, Maryland, Massachusetts, New Jersey, New York City, Pennsylvania, Virginia, Washington, or the District of Columbia the salary range is $135,125 - $222,985.
- All other locations the salary range is $117,500 - $193,900.
We offer a comprehensive benefits package designed to support employees' health, financial security, and well-being. Benefits include:
- Medical, Dental and Vision - Coverage for employees, dependents, and domestic partners
- Employee Assistance Program (EAP) - Confidential support for personal and work-related challenges
- 401(k) Plan - Includes a company matching program and profit-sharing contributions
- Discretionary Bonus Program - Recognizing employee contributions
- Flexible Spending Accounts (FSA) - Pre-tax savings for dependent care, transportation, and eligible medical expenses
- Paid Time Off (PTO) - Begins accruing on the first day of work. Full-time employees accrue 15 days per year, and employees working less than full-time accrue PTO on a prorated basis
- Holidays - A minimum of 10 paid holidays per year
- Family Building Benefits - Includes adoption and fertility assistance
- Paid Parental Leave - Up to 12 weeks of paid leave for employees who meet eligibility criteria
- Life Insurance & AD&D - 100% of premiums covered by Milliman
- Short-Term and Long-Term Disability - Fully paid by Milliman