Renewables Data Analyst
Job description
We're hiring a Renewables Data Analyst II to own the monthly generation data validation and correction process across our entire renewable fleet. You'll partner directly with our Performance Engineering team to safeguard site-level data integrity and extend our Databricks automation roadmap, replacing manual workflows with scalable pipelines and AI-assisted tooling.
This role sits at the center of three things: reliable data, trusted relationships with Performance Engineers, and a growing automation platform. Every correction you make flows directly into performance and financial reporting, variance analysis, and contractual data requirements. If you enjoy being the person who connects field reality to data truth - and who builds the tooling that makes that connection scale - this is your role.
Main Responsibilities
Reporting - Generation Data Validation
- Run the monthly 5-business-day reporting cycle end-to-end, reconciling generation figures across PI Connect (SCADA), PowerOptix, and NetSuite at generating unit and plant granularity (a minimal reconciliation sketch follows this list).
- Process PI edits and validate downstream propagation to PowerBI and Tab Model refreshes.
- Validate KPI impact and support variance analysis at month-end.
- Author and maintain Databricks Lakeview dashboards and PySpark notebooks used to produce monthly metrics and reporting deliverables.
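To make the reconciliation work concrete, here is a minimal PySpark sketch of the kind of unit-level month-end comparison this cycle involves. The table names (ops.silver.pi_generation, ops.silver.poweroptix_generation), column names, and the 0.5% review threshold are illustrative assumptions for this sketch, not our actual schema.

```python
# Illustrative unit-level generation reconciliation for one reporting month.
# Table names, columns, and the 0.5% threshold are placeholders for this sketch;
# on Databricks, `spark` is provided by the notebook runtime.
from pyspark.sql import functions as F

month = "2024-06"  # reporting month under review

def monthly_totals(table: str, alias: str):
    # Sum energy per generating unit for the reporting month.
    return (
        spark.table(table)
        .where(F.date_format("ts_utc", "yyyy-MM") == month)
        .groupBy("unit_id")
        .agg(F.sum("energy_mwh").alias(alias))
    )

pi = monthly_totals("ops.silver.pi_generation", "pi_mwh")
pox = monthly_totals("ops.silver.poweroptix_generation", "poweroptix_mwh")

recon = (
    pi.join(pox, "unit_id", "full_outer")
    .withColumn("delta_mwh", F.col("pi_mwh") - F.col("poweroptix_mwh"))
    .withColumn(
        "pct_diff",
        F.when(F.col("poweroptix_mwh") != 0,
               100.0 * F.col("delta_mwh") / F.col("poweroptix_mwh")),
    )
    # Flag units whose monthly totals disagree by more than 0.5% (illustrative threshold).
    .withColumn("needs_review", F.abs(F.col("pct_diff")) > 0.5)
)

recon.orderBy(F.abs("delta_mwh").desc_nulls_last()).show(truncate=False)
```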
Data Quality Control - Partnership with Performance Engineers
- Partner directly with the Performance Engineering team (5-6 PSEs) to safeguard data integrity: site meter power validation, meteorological sensor review, automated data substitution investigation, outage categorization, and data pipeline validity monitoring.
- Serve as the liaison between the PE team and IT - translating field observations into structured data corrections and system fixes.
- Design and document QC measures within Databricks (Unity Catalog, Delta tables); test and approve system changes or enhancements affecting data quality and reporting (an illustrative QC sketch follows this list).
- Maintain the data-quality narrative for each month's reporting cycle - what changed, why, and what it means for KPIs.
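As one concrete example of what a documented QC measure can look like in Databricks, here is a minimal sketch combining a Delta CHECK constraint with a staleness query. The table name (ops.silver.site_meter_power), its columns, the rating bounds, and the two-hour staleness window are illustrative assumptions, not production standards.

```python
# Illustrative QC measures on a hypothetical silver table
# ops.silver.site_meter_power (site_id, ts_utc, power_mw).
from pyspark.sql import functions as F

# 1. Delta CHECK constraint: reject physically implausible meter readings at write time.
#    The -5 to 500 MW bounds are placeholders, not real plant ratings.
spark.sql("""
    ALTER TABLE ops.silver.site_meter_power
    ADD CONSTRAINT power_within_rating CHECK (power_mw BETWEEN -5 AND 500)
""")

# 2. Staleness check: surface sites whose latest reading is more than two hours old,
#    a typical symptom of a broken feed or a stuck sensor.
stale = (
    spark.table("ops.silver.site_meter_power")
    .groupBy("site_id")
    .agg(F.max("ts_utc").alias("last_reading"))
    .where(F.col("last_reading") < F.expr("current_timestamp() - INTERVAL 2 HOURS"))
)
stale.show(truncate=False)
```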
Process Improvement - AI-Assisted Automation
- Build and extend Databricks PySpark / SQL pipelines using medallion architecture and incremental processing to replace manual validation workflows.
- Contribute to an active automation roadmap: PI Tag Flatline Detector, Monthly Metrics Pipeline, Meteorological Sensor Monitoring, Bat Curtailment Compliance, Poseidon Status (a flatline-detection sketch follows this list).
- Adopt AI-assisted tooling (LLM-based anomaly triage, automated exception narratives, assisted root-cause investigation) where it reduces manual load without sacrificing data integrity.
- Identify and pilot new technologies that improve data collection, validation throughput, and analyst productivity.
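For a flavor of what a roadmap item like the PI Tag Flatline Detector can start from, here is a minimal PySpark sketch over 10-minute tag data. The table name (ops.bronze.pi_tag_values), its columns, and the one-hour zero-variance window are illustrative assumptions; flagged tags would still be confirmed against outage and curtailment records before any correction.

```python
# Illustrative flatline check over 10-minute PI tag readings in a hypothetical
# bronze table ops.bronze.pi_tag_values (tag, ts_utc, value).
from pyspark.sql import functions as F
from pyspark.sql.window import Window

# Rolling window of the current reading plus the five before it (one hour of 10-minute data).
w = Window.partitionBy("tag").orderBy("ts_utc").rowsBetween(-5, 0)

flatlines = (
    spark.table("ops.bronze.pi_tag_values")
    .withColumn("window_stddev", F.stddev("value").over(w))
    .withColumn("window_count", F.count("value").over(w))
    # A full hour of identical readings is a flatline candidate, not proof of one.
    .where((F.col("window_count") == 6) & (F.col("window_stddev") == 0))
    .select("tag", "ts_utc", "value")
)
flatlines.show(truncate=False)
```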
Business & Technical Knowledge
- Maintain fluency in all Renewables Operations applications: PI Connect / PI Vision / PI Asset Framework, Databricks (Unity Catalog, Lakeview), PowerBI, PowerOptix, NetSuite, OneNote Monthly Report, JIRA
BI Tool Administration & Training
- Administer existing BI and reporting tools, including Databricks dashboards and PowerBI reports, and train end users on them.
Other / Project Support
- Ad-hoc project work ensuring data integrity and compliance with business processes and procedures
Tech Stack
- Databricks (PySpark, SQL, Unity Catalog, Delta Lake, Lakeview Dashboards)
- OSIsoft / AVEVA PI (PI Connect, PI Vision, PI Asset Framework)
- Seeq
- PowerOptix
- NetSuite
- PowerBI
- JIRA
- OneNote
- Excel
Why This Role
- Every correction directly impacts financial reporting and contractual data requirements across a multi-GW fleet
- Active, early-stage automation roadmap - you're building the systems, not inheriting a mature platform
- Full-stack data ownership from raw SCADA ingestion through anomaly detection to downstream validation
- Domain depth that generic data roles never encounter - sensor integrity, environmental compliance, and asset health monitoring
- Direct, trusted relationships with the Performance Engineering team - you are the bridge between field reality and data truth
Working Conditions
- This role is intended to be located at the Deriva Headquarters office in Charlotte, NC.
- This is an in-person position with flexibility to work remotely once per week.
Requirements
- Bachelor's degree in Information Technology, Engineering, Business, or another technical field
- In addition to a bachelor's degree, one (1) year or more of related work experience in data analysis, data engineering, or a similar role
- In lieu of a bachelor's degree, HS/GED AND six (6) years or more of relevant work experience
- Master's degree in Information Technology, Engineering, Business, or another technical field
- Three (3) years or more of related work experience
- Demonstrated proficiency in SQL and Python (PySpark preferred) with the ability to write and debug data transformations fluently
- Experience working with time series data (10-minute intervals, hourly rollups, monthly aggregations)
- Comfort with messy, inconsistent data across multiple source systems and the judgment to investigate discrepancies
- Strong attention to detail when working at high volume
- Ability to work independently across data systems and operational processes.
- Demonstrated interpersonal skills with the ability and willingness to partner with multiple other groups throughout the organization
- Hands-on experience with Databricks / Delta Lake / PySpark and medallion architecture or incremental pipeline design
- SCADA / OSIsoft PI experience - PI tags, backfill, Event Frames
- Renewable energy operations exposure (turbine availability, curtailment, outage flags, scheduled downtime)
- Track record replacing manual workflows with automation - including willingness to adopt AI-assisted tooling (LLMs, anomaly detection, automated triage) for data quality workflows
- PowerBI (DAX measures, dashboard creation) and experience building internal tools or lightweight apps
- Demonstrated oral and written communication, organizational, presentation, and facilitation skills
- Two (2) years or more of renewable operations or utility experience
Benefits & conditions
- Health Insurance
- Dental Insurance
- Vision Insurance
- 401(k) with matching
- Employee assistance program
- Flexible spending account
- Life insurance
- Paid time off
- Parental leave
- Attractive Bonus Potential