Sr Analyst, Risk Data Engineering
Role details
Job location
Tech stack
Job description
As a Risk Data Engineer and Architect, you will architect and lead the design and implementation of new technology solutions for the Risk Investments and Technology team within the Chief Risk Office (CRO). You will own technical roadmaps, champion SDLC best practices, and build production-grade systems that power analytics, machine learning, and cross-functional risk management. Working on a centralized cloud data platform (AWS, Databricks, or Dataiku), you'll design scalable data pipelines, implement data governance frameworks using modern cataloging tools, and develop advanced solutions. This role is critical to modernizing how we manage market risk, credit risk, and operational risk data at enterprise scale. You'll partner closely with risk analysts, quantitative researchers, and technology leaders to migrate business-owned processes into robust, production-controlled platforms while maintaining operational excellence.
What you'll be doing
- Architect cloud-native data solutions using AWS services (S3, Redshift, Glue, EMR, Lambda, Athena) and Databricks (Unity Catalog, Workflows, MLflow) or similar technology to support risk analytics at scale.
- Build and optimize data pipelines using Python, PySpark, and SQL to integrate structured and unstructured data from internal and external sources.
- Drive modernization initiatives by migrating legacy on-premises systems to cloud platforms while ensuring zero disruption to critical business services.
- Implement data governance frameworks, including cataloging, lineage tracking, quality monitoring, and access controls aligned with internal controls and regulatory requirements.
- Collaborate cross-functionally with data scientists, risk managers, and business stakeholders to translate requirements into scalable technical solutions and drive change management and process improvements.
- Develop production-grade code following DevOps practices, including CI/CD, automated testing, code reviews, and infrastructure-as-code.
- Manage and optimize relational databases (MySQL, Aurora PostgreSQL, Redshift) for performance, cost efficiency, and data integrity.
- Integrate structured and unstructured data from various internal and external sources.
- Mentor and influence technical standards and architectural decisions across the Risk Investments and Technology organization.
- Monitor and troubleshoot production systems, implementing observability practices and proactive performance tuning.

Applications for this position will be accepted through May 15th, 2026, subject to earlier closure due to applicant volume.

What's it like to work here?
At Lincoln Financial, we love what we do. We make meaningful contributions each and every day to empower our customers to take charge of their lives. Working alongside dedicated and talented colleagues, we build fulfilling careers and stronger communities through a company that values our unique perspectives, insights, and contributions and invests in programs that empower each of us to take charge of our own future.

What's in it for you:
- Clearly defined career tracks and job levels, along with associated behaviors for each of Lincoln's core values and leadership attributes
- Leadership development and virtual training opportunities
- PTO/parental leave
- Competitive 401K and employee benefits
- Free financial counseling, health coaching, and employee assistance program
- Tuition assistance program
- Work arrangements that work for you
- Effective productivity/technology tools and training

The pay range for this position is $96,900 - $176,200, with anticipated pay for new hires between the minimum and midpoint of the range; pay could vary above or below the listed range as permitted by applicable law. Pay is based on non-discriminatory factors including, but not limited to, work experience, education, location, licensure requirements, proficiency, and qualifications required for the role. Base pay is just one component of Lincoln's total rewards package for employees. In addition, the role may be eligible for the Annual Incentive Program, which is discretionary and based on the performance of the company, business unit, and individual. Other rewards may include long-term incentives, sales incentives, and Lincoln's standard benefits package.

This position may be subject to Lincoln's Political Contribution Policy. An offer of employment may be contingent upon disclosing to Lincoln the details of certain political contributions. Lincoln may decline to extend an offer or terminate employment for this role if it determines that political contributions made could have an adverse impact on Lincoln's current or future business interests, that misrepresentations were made, or that there was a failure to fully disclose applicable political contributions and/or fundraising activities.

Any unsolicited resumes or candidate profiles submitted through our web site or to personal e-mail accounts of employees of Lincoln Financial are considered property of Lincoln Financial and are not subject to payment of agency fees.

Lincoln Financial ("Lincoln" or "the Company") is an Equal Opportunity employer and, as such, is committed in policy and practice to recruit, hire, compensate, train, and promote, in all job classifications, without regard to race, color, religion, sex, age, national origin, or disability. Opportunities throughout Lincoln are available to employees, and applicants are evaluated on the basis of job qualifications.
If you are a person with a disability that impedes your ability to express your interest for a position through our online application process, or require TTY/TDD assistance, contact us by calling 260-455-2558.
Requirements
Screening questions:
- Do you have experience in YAML?
- Do you have a Bachelor's degree?

- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
- 5+ years of experience in data engineering or a similar role, working with a variety of on-premises and cloud data management, integration, and analytical technologies.
- Production experience with relational databases such as MySQL and SQL Server, including designing schemas, writing complex queries, optimizing performance, and ensuring data integrity for a variety of business applications.
- Strong experience with end-to-end data architecture, implementing systems that use AWS services for data storage and management; advanced proficiency in SQL, Python, YAML, and Bash is a must.
- Thorough understanding of the Software Development Life Cycle (SDLC), including DevOps practices, CI/CD processes, application resiliency, and security measures.
- Excellent communication skills with ability to translate technical concepts for business audiences.
- Data governance experience (catalog tools, data quality frameworks, lineage tracking, and compliance controls) is a plus.
- AWS certifications such as SAA, Associate Developer, or Data Analytics Specialty, or a Databricks certification, are a plus.
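To give candidates a concrete feel for the governance work described above (quality monitoring alongside Python proficiency), here is a minimal, hedged sketch of a rule-based data-quality check. The rule names, record fields, and thresholds are invented for illustration only and are not taken from any Lincoln system:

```python
# Hypothetical example: a tiny rule runner that counts data-quality
# failures per rule across a batch of records. Field names ("notional",
# "rating") and rules are illustrative assumptions, not a real schema.
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Rule:
    name: str
    check: Callable[[dict], bool]  # returns True when a record passes

def run_rules(records: list, rules: list) -> dict:
    """Return a mapping of rule name -> number of failing records."""
    failures = {r.name: 0 for r in rules}
    for rec in records:
        for r in rules:
            if not r.check(rec):
                failures[r.name] += 1
    return failures

rules = [
    Rule("notional_not_null", lambda r: r.get("notional") is not None),
    Rule("rating_in_scale",
         lambda r: r.get("rating") in {"AAA", "AA", "A", "BBB", "BB", "B"}),
]

records = [
    {"notional": 1_000_000, "rating": "AA"},   # passes both rules
    {"notional": None, "rating": "ZZZ"},       # fails both rules
]

print(run_rules(records, rules))  # {'notional_not_null': 1, 'rating_in_scale': 1}
```

In a production pipeline, checks like these would typically run as a scheduled job (e.g., a Databricks Workflow) with failure counts published to a monitoring dashboard; this sketch only shows the core rule-evaluation shape.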
Benefits & conditions
$96,900 - $176,200 a year

- Tuition reimbursement
- Parental leave
- Paid time off
- Employee assistance program

Hybrid work in Radnor, PA