Principal Azure Cloud and Databricks Solutions Architect
Job description
Build and optimize databases for performance and efficiency.
Write code to extract, transform, and load data from various sources into data warehouses and data lakes.
Architect and optimize complex data processing pipelines using Databricks.
Implement data quality checks to ensure data accuracy and consistency.
Troubleshoot and resolve software and data-related issues.
Establish and promote best practices for data governance, security, and compliance, ensuring that R1's data infrastructure remains robust and resilient.
Design and deploy robust cloud-based applications on the Azure platform.
Improve the performance and cost efficiency of cloud-based applications.
Leverage the full spectrum of Azure services to enhance application capabilities and ensure seamless integration with existing systems.
Spearhead the architectural design and development of highly scalable software solutions using Python, Spark, Scala, and Databricks.
Define and implement technical strategies that align with business goals and ensure future scalability and maintainability.
Collaborate closely with cross-functional teams to drive data-driven decision-making and innovation, making a significant impact on R1's success.
The salary range is $158,787 - $248,837.87 per year.
Requirements
Must have a Master's degree or foreign equivalent in Computer Science, Computer Engineering, Electrical Engineering, or a related field, and 6 years of related work experience; OR a Bachelor's degree or foreign equivalent in Computer Science, Computer Engineering, Electrical Engineering, or a related field, and 8 years of post-bachelor's, progressive related work experience.
Of the required experience, must have 6 years of experience with the following: Cloud computing with Azure or AWS; Utilizing Agile and DevOps practices to deploy solutions to the cloud using GitHub Actions; CI/CD; Cloud data engineering, including designing, developing, and testing; and Maintaining data pipelines and multi-tier object-oriented applications using Scala and Python.
Of the required experience, must have 5 years of experience with the following: Utilizing Spark; Building data pipelines; Data governance, security, and compliance; and Jenkins.
Of the required experience, must have 4 years of experience with the following: Utilizing Terraform to deploy infrastructure; and Docker.
Of the required experience, must have 2 years of experience with utilizing Databricks and building databases.
Telecommuting permitted 5 days a week, depending on business need.
Employer will accept any suitable combination of education, training, or experience.
Benefits & conditions
WORK SCHEDULE: 40 hours per week, M-F (9:00 a.m. - 5:00 p.m.)
Bonus: Job eligible to participate in bonus plan: target 10%/yr. Bonuses are discretionary and not guaranteed.
Benefits: Medical; Dental; Vision; 401k matching; Paid time off (amount depends on years of service); Paid parental leave; 8 paid holidays per year; Disability coverage; Tuition reimbursement; Health savings account; Flexible spending account; Wellness benefits; Life insurance; and Accidental death and dismemberment insurance. Full details listed at go.r1rcm.com/benefits.
The healthcare system is always evolving - and it's up to us to use our shared expertise to find new solutions that can keep up. On our growing team you'll find the opportunity to constantly learn, collaborate across groups and explore new paths for your career.
Our associates are given the chance to contribute, think boldly and create meaningful work that makes a difference in the communities we serve around the world. We go beyond expectations in everything we do. Not only does that drive customer success and improve patient care, but that same enthusiasm is applied to giving back to the community and taking care of our team - including offering a competitive benefits package.