Data Engineer

DLA Piper LLP (US)
Boston, United States of America
1 month ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Intermediate
Compensation
$160K

Job location

Boston, United States of America

Tech stack

Java
API
Agile Methodologies
Artificial Intelligence
Azure
Data as a Service
Information Engineering
Data Integration
Data Systems
Data Warehousing
Python
Microsoft SQL Server
PowerShell
Raw Data
DataOps
SQL Databases
SQL Server Integration Services
Data Streaming
Scripting (Bash/Python/Go/Ruby)
Information Technology
Data Pipelines
Databricks
Programming Languages

Job description

The Data Engineer, Solutions & Data role designs, builds, and operates data pipelines and data integration processes that turn raw data into trusted, usable datasets for analytics, reporting, and downstream solutions. The role focuses on operationalizing pipelines with governance and service expectations (SLAs), improving data quality and reusability, and enabling secure access to integrated data in support of business initiatives. In current initiatives, data engineering includes consolidating data from multiple sources into a central SQL-based integration point and performing field mapping and transformations so that solution teams can consume data consistently.

Data Pipeline Engineering & Integration

  • Build and operationalize data pipelines across heterogeneous environments, aligning to governance principles and service expectations (SLAs).
  • Build and maintain ingestion, transformation, and publication pipelines to deliver analytics-ready data.
  • Consolidate data from multiple sources into a centralized integration point (e.g., a single SQL Server instance) and manage field mappings and transformations to support consistent downstream consumption.
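The consolidation-and-mapping duty above can be sketched in miniature. This is an illustrative example only, not the firm's actual integration code: SQLite stands in for the central SQL Server instance, and the source names, field mappings, and table/column names (`FIELD_MAPS`, `customers_integrated`) are all hypothetical.

```python
import sqlite3

# Hypothetical per-source field mappings: source-specific column -> canonical column.
FIELD_MAPS = {
    "crm": {"cust_id": "customer_id", "full_nm": "customer_name"},
    "billing": {"CustomerNo": "customer_id", "Name": "customer_name"},
}

def consolidate(conn, source, rows):
    """Map each source's fields onto the canonical schema, apply a simple
    normalizing transformation, and load into the central integration table."""
    mapping = FIELD_MAPS[source]
    for row in rows:
        canonical = {mapping[k]: v for k, v in row.items() if k in mapping}
        conn.execute(
            "INSERT INTO customers_integrated (source, customer_id, customer_name) "
            "VALUES (?, ?, ?)",
            (source, canonical["customer_id"], canonical["customer_name"].strip().title()),
        )

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customers_integrated (source TEXT, customer_id TEXT, customer_name TEXT)"
)
consolidate(conn, "crm", [{"cust_id": "42", "full_nm": "ada lovelace"}])
consolidate(conn, "billing", [{"CustomerNo": "42", "Name": "ADA LOVELACE"}])
```

The point of the pattern is that downstream consumers query one table with one schema, regardless of which source system a row came from.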

Data Platform & Storage

  • Design and implement data pipelines using Azure data technologies (e.g., Azure Data Factory, Azure Databricks, Azure Event Hubs, SSIS) to ingest, process, and deliver data from sources such as APIs and other systems.
  • Build and maintain data warehousing capabilities (e.g., Azure Synapse Analytics) to support analytics and reporting workloads.
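The warehousing fundamentals behind these duties can be shown with a minimal star-schema load. This is a sketch under stated assumptions, not Synapse-specific code: SQLite stands in for the warehouse, and the `dim_client`/`fact_hours` tables are invented for illustration; the surrogate-key lookup is the core pattern.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_client (client_key INTEGER PRIMARY KEY, client_name TEXT UNIQUE);
    CREATE TABLE fact_hours (client_key INTEGER, hours REAL);
""")

def load_fact(conn, client_name, hours):
    """Resolve (or create) the dimension row, then insert the fact keyed by
    the surrogate key -- the basic dimensional-loading pattern."""
    conn.execute("INSERT OR IGNORE INTO dim_client (client_name) VALUES (?)", (client_name,))
    key = conn.execute(
        "SELECT client_key FROM dim_client WHERE client_name = ?", (client_name,)
    ).fetchone()[0]
    conn.execute("INSERT INTO fact_hours VALUES (?, ?)", (key, hours))

load_fact(conn, "Acme Corp", 7.5)
load_fact(conn, "Acme Corp", 3.0)
total = conn.execute(
    "SELECT d.client_name, SUM(f.hours) FROM fact_hours f "
    "JOIN dim_client d ON d.client_key = f.client_key GROUP BY d.client_name"
).fetchone()
```

Analytics and reporting workloads then aggregate over the fact table and join out to dimensions, which is the shape Synapse (or any warehouse) is optimized for.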

Data Quality, Reliability & Operations

  • Identify, troubleshoot, and resolve data issues including data quality, integrity, latency, and security concerns; apply monitoring and operational best practices to keep pipelines reliable and performant.
  • Contribute to data quality and governance practices, including profiling datasets, defining quality rules, and establishing monitoring/remediation approaches.
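Defining quality rules and profiling datasets, as described above, can be sketched as follows. The rule names, fields, and thresholds here are hypothetical examples, not the firm's actual quality framework; the output is the kind of failure-count signal that feeds monitoring and remediation.

```python
# Hypothetical quality rules for an integrated customer dataset:
# each rule returns True when a row passes.
RULES = {
    "customer_id_present": lambda r: bool(r.get("customer_id")),
    "email_has_at_sign": lambda r: "@" in r.get("email", ""),
}

def profile(rows):
    """Apply every quality rule to every row and report failure counts."""
    failures = {name: 0 for name in RULES}
    for row in rows:
        for name, rule in RULES.items():
            if not rule(row):
                failures[name] += 1
    return failures

sample = [
    {"customer_id": "1", "email": "a@example.com"},
    {"customer_id": "", "email": "no-at-sign"},
]
report = profile(sample)
```

In practice the counts would be emitted to a monitoring system and compared against agreed thresholds rather than printed.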

Collaboration & Delivery (Agile Pod Model)

  • Work cross-functionally with engineers, analysts, and stakeholders to understand requirements and deliver data solutions that support sprint-based delivery.
  • Support pod-level delivery by producing reusable data assets and integration components that can be leveraged across multiple initiatives.

General Expectations

  • Effectively communicate, verbally and in writing, with clients, lawyers, business professionals, and third parties;
  • Produce deliverables, answer phone calls, and reply to correspondence in an efficient and responsive manner;
  • Provide timely, accurate, and quality work product;
  • Successfully meet deadlines, expectations, and perform work duties as required;
  • Foster positive work relationships;
  • Comply with all firm policies and practices;
  • Engage in both physical and sedentary activity, such as (a) working at a computer for extended periods of time, including on-screen reading and typing; (b) participating in digital/virtual conference calls; (c) participating in meetings as needed;
  • Work under pressure and manage competing demands in a fast-paced environment;
  • Perform all other duties, tasks or projects as assigned.

Our employees are expected to embrace and uphold our firm values as a part of our DLA Piper culture. We are committed to excellence in how we represent our clients and develop our people.

Physical Demands

Sedentary work: Exerting up to 10 pounds of force occasionally and/or a negligible amount of force frequently or constantly to lift, carry, push, pull or otherwise move objects, including the human body. Sedentary work involves sitting most of the time. Jobs are sedentary if walking and standing are required only occasionally and all other sedentary criteria are met.

Work Environment

The individual selected for this position may have the opportunity for a hybrid work arrangement comprised of remote and in-office work, the requirement for which will be determined in coordination with the hiring manager or supervisor and may be modified in the firm's discretion in the future.

Disclaimer

The purpose of this job description is to provide a concise statement of the work elements and to organize and present the information in a standardized way. It is not intended to describe all the elements of the work that may be performed by every individual in this classification, nor should it serve as the sole criteria for personnel decisions and actions. The job duties, requirements, and expectations for this position may be modified at the Firm's discretion at any time. This job description does not change the at-will nature of employment.

Requirements

  • Proficiency in SQL and Python.
  • Data pipeline tooling and cloud data services experience (Azure Data Factory, Azure Databricks, Azure Event Hubs, SSIS).
  • Data warehousing experience (Azure Synapse Analytics) and strong fundamentals in data modeling, warehousing, and governance.
  • Scripting/automation skills (PowerShell and related tooling) for platform operations and troubleshooting.
  • Preferred: familiarity with additional programming languages such as Java, Scala, or Go.
  • Preferred: experience integrating data from multiple enterprise source systems into a central SQL-based integration layer.
  • Preferred: familiarity with DataOps concepts and with operating in cross-functional teams that include data engineering personas.
  • Measures of success for this role include delivering pipelines of trusted, quality data within agreed service levels; enabling faster onboarding of new data sources and more consistent analytics/AI consumption; reducing manual effort through reusable integrations and standardized transformations; and improving data reliability and operational readiness.

Minimum Education

  • High School or GED

Preferred Education

  • Bachelor's Degree in Computer Science, Engineering, or related field.

Minimum Years of Experience

  • 3 years of experience in data engineering and/or data platform engineering (pipelines, integration, and operational support).

Benefits & conditions

The firm's expected hiring range for this position is $100,787 - $160,255 depending on the candidate's geographic market location.

The compensation offered for employment will also be dependent on other factors including the candidate's experience, skills, educational and professional background, and overall qualifications. We offer a comprehensive package of benefits including medical/dental/vision insurance, and 401(k).

About the company

DLA Piper is, at its core, bold, exceptional, collaborative and supportive. Our people are the backbone, heart and soul of our firm. Wherever you are in your professional journey, DLA Piper is a place you can engage in meaningful work and grow your career. Let's see what we can achieve. Together.
