Analytics & BI Specialist 3 or Senior

MidAmerican Energy
Portland, United States of America
4 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
$31K

Job location

Portland, United States of America

Tech stack

API
Artificial Intelligence
Data analysis
Azure
Cloud Storage
Information Systems
Continuous Integration
Data Architecture
Data Validation
Information Engineering
Data Governance
Data Infrastructure
ETL
Data Security
Data Systems
IBM DB2
Data Intelligence
Meta-Data Management
Oracle Data Service Integrator
Oracle Applications
Performance Tuning
Role-Based Access Control
Software Tools
SQL Databases
Data Streaming
Unstructured Data
Management of Software Versions
Data Processing
Data Storage Management
Enterprise Software Applications
Data Ingestion
Informatica PowerCenter
Informatica Cloud
Data Lake
PySpark
Information Technology
Data Lineage
Data Management
Software Version Control
Data Pipelines
Databricks

Job description

This is a multi-level posting. Candidates may be considered for any of the posted levels, depending on their level of experience and depth of expertise.

You will design, build, and maintain scalable data pipelines and infrastructure to support analytics, reporting, and data science initiatives. You will work closely with cross-functional teams to ensure data is accessible, reliable, and secure across the organization.

Responsibilities:

  • Design and implement scalable data ingestion and transformation frameworks using one or more of the following:
      • Azure services enabling structured, semi-structured, and unstructured data to be efficiently processed and integrated into enterprise data platforms
      • Informatica PowerCenter & Informatica Cloud
      • Oracle Data Integrator
  • Build and maintain robust ETL/ELT pipelines.
  • Integrate data from diverse sources including on-premises systems, cloud storage, APIs, and streaming platforms.
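To illustrate the kind of work described above (this sketch is not part of the posting itself, and all source and field names in it are hypothetical), a minimal extract-transform-load pipeline in plain Python might look like:

```python
# Minimal ETL sketch: extract raw records, transform them, load into a target store.
# All sources, field names, and targets here are hypothetical illustrations.

def extract(source_records):
    """Pull raw rows from an upstream source (stand-in for an API, file, or stream)."""
    return list(source_records)

def transform(rows):
    """Normalize values and drop rows missing a primary key."""
    cleaned = []
    for row in rows:
        if row.get("id") is None:
            continue  # reject rows without a key
        cleaned.append({"id": row["id"], "value": str(row.get("value", "")).strip()})
    return cleaned

def load(rows, target):
    """Upsert rows into a dict-backed target keyed by id."""
    for row in rows:
        target[row["id"]] = row
    return target

raw = [{"id": 1, "value": " a "}, {"id": None, "value": "x"}, {"id": 2, "value": "b"}]
warehouse = load(transform(extract(raw)), {})
```

In a production pipeline the same extract/transform/load shape would be expressed in Azure Data Factory, Informatica, or ODI rather than hand-rolled Python.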

Informatica Development and Optimization

  • Design, develop, test, and maintain ETL pipelines using Informatica PowerCenter, including performance tuning, error handling, and integration with Control-M scheduling.
  • Participate in the migration from PowerCenter to Informatica Cloud (IICS) by redesigning mappings, optimizing transformations, and supporting secure agent configurations.

Oracle Data Integrator

  • Design, develop, test, and maintain ETL pipelines using Oracle Data Integrator, including performance tuning, error handling, and integration with Control-M scheduling.
  • Experience with the Fusion AI Data Platform is a plus (Fusion Data Intelligence, Fusion Analytics Warehouse).

Databricks Development and Optimization

  • Develop and optimize notebooks and workflows in Azure Databricks using PySpark and SQL.
  • Implement Delta Lake for efficient data storage, versioning, and ACID transactions.
  • Leverage Databricks features such as Unity Catalog and job orchestration.

Data Modeling and Architecture

  • Design and implement data models (star/snowflake schemas) for analytics and reporting.
  • Collaborate with architects to define data lakehouse architecture and best practices.
  • Hands-on experience implementing and optimizing data solutions using the Medallion Architecture (Bronze, Silver, Gold layers) for scalable and structured data processing.
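As an illustrative sketch of the Medallion Architecture mentioned above (this is plain Python standing in for what would normally be Delta Lake tables in Databricks; all names are hypothetical), raw events flow Bronze to Silver to Gold:

```python
# Medallion-architecture sketch: Bronze (raw) -> Silver (cleaned) -> Gold (aggregated).
# Plain-Python stand-in for Delta tables; all names are illustrative.

def bronze(raw_events):
    """Land raw events as-is, tagging each with its layer."""
    return [dict(e, _layer="bronze") for e in raw_events]

def silver(bronze_rows):
    """Clean and conform: drop malformed rows, cast amounts to float."""
    out = []
    for row in bronze_rows:
        try:
            out.append({"customer": row["customer"], "amount": float(row["amount"])})
        except (KeyError, TypeError, ValueError):
            continue  # quarantine rows that fail conformance
    return out

def gold(silver_rows):
    """Aggregate to a reporting-ready metric: total amount per customer."""
    totals = {}
    for row in silver_rows:
        totals[row["customer"]] = totals.get(row["customer"], 0.0) + row["amount"]
    return totals

events = [{"customer": "a", "amount": "10.5"}, {"customer": "a", "amount": "2"},
          {"customer": "b", "amount": "oops"}]
report = gold(silver(bronze(events)))
```

In Databricks each layer would typically be a Delta table, with Silver and Gold materialized by scheduled PySpark or SQL jobs rather than in-memory functions.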

Data Quality and Governance

  • Implement data validation, profiling, and cleansing routines.
  • Ensure compliance with data governance policies, including data lineage and metadata management.
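The validation and profiling routines mentioned above can be sketched as follows (an illustrative example only; the rule set and field names are hypothetical):

```python
# Data-validation sketch: profile a batch and split rows that fail simple rules.
# Required fields and sample data are hypothetical illustrations.

def validate(rows, required=("id", "email")):
    """Split rows into valid/invalid and return a small batch profile."""
    valid, invalid = [], []
    for row in rows:
        missing = [f for f in required if not row.get(f)]
        (invalid if missing else valid).append(row)
    profile = {"total": len(rows), "valid": len(valid), "invalid": len(invalid)}
    return valid, invalid, profile

rows = [{"id": 1, "email": "a@x"}, {"id": 2, "email": ""}, {"email": "c@x"}]
valid, invalid, profile = validate(rows)
```

Real pipelines would layer richer checks (type, range, referential) on the same split-and-profile pattern and feed the profile into governance reporting.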

Performance Tuning and Monitoring

  • Monitor and optimize the performance of various data processes.
  • Troubleshoot and resolve issues related to data latency, job failures, and resource utilization.

Collaboration and Stakeholder Engagement

  • Work closely with data scientists, analysts, and business units to understand data requirements.
  • Translate business needs into technical solutions that are scalable and maintainable.

Security and Compliance

  • Implement role-based access control (RBAC), encryption, and secure data handling practices.
  • Ensure compliance with industry regulations (e.g., NERC CIP, GDPR, HIPAA if applicable).
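The role-based access control mentioned above reduces to mapping roles onto permission sets and checking them before any data operation; a minimal sketch (role and permission names are hypothetical):

```python
# RBAC sketch: map roles to permission sets and check access before handling data.
# Role and permission names are hypothetical illustrations.

ROLE_PERMISSIONS = {
    "analyst": {"read"},
    "engineer": {"read", "write"},
    "admin": {"read", "write", "grant"},
}

def can(role, permission):
    """Return True if the role's permission set includes the requested action."""
    return permission in ROLE_PERMISSIONS.get(role, set())

assert can("engineer", "write")
assert not can("analyst", "write")
```

Platforms such as Databricks Unity Catalog or Azure provide this model natively; the sketch only shows the underlying check.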

Documentation and Best Practices

  • Maintain clear documentation of data flows, architecture, and operational procedures.
  • Promote best practices in code versioning, testing, and CI/CD for data engineering.

Requirements

Bachelor's degree in information systems, computer science or related technical field; or equivalent work experience. (Typically four years of related, progressive work experience would be needed for candidates applying for this position who do not possess a bachelor's degree.)

Six or more years of experience with advanced knowledge of data architecture, cloud platforms (especially Azure), and enterprise data solutions is required for the senior level.

Proficiency in data engineering tools and platforms, especially Azure Data Factory and Azure Databricks, Informatica PowerCenter and IICS, and Oracle Data Integrator.

Proficiency in Oracle DB, IBM DB2, and Azure.

Strong understanding of data modeling, ETL/ELT processes, and performance tuning of enterprise-level applications.

Expert-level knowledge of data-related technologies from architecture to administration, including design, development, optimization, and licensing.

Proven experience working in the utility industry is required.

Effective oral and written communication skills, with the ability to collaborate across teams and mentor junior engineers.

Strong analytical and problem-solving abilities.

Ability to prioritize and manage multiple tasks and projects concurrently.

Apply for this position