Energy Efficiency
R. 13-11-005
November 17, 2022

Energy Division Issues Draft Evaluation of Regional Energy Networks

Energy Division's draft evaluation details its three-year assessment of Regional Energy Network ("REN") data tracking and reporting processes for program years 2019 through 2021. The draft evaluation also examines each REN's effort to align with the segmentation and metrics requirements of D.21-05-031, which required program administrators to assign each program within their energy efficiency program portfolios to a market segment based on the offering's primary purpose: resource acquisition, market support, or equity. Comments are due by November 29, 2022.

The following is a brief summary of the draft evaluation's findings:

Portfolio Segmentation and Metrics:

  1. Once the RENs finish collecting baseline data in 2022 and 2023, a full evaluability assessment of their baselines and ongoing data collection protocols will be feasible and should be conducted to determine whether metrics and targets are set appropriately and whether data collection and reporting are yielding meaningful data and insights.
  2. The evaluation team’s in-depth interviews revealed widespread implementer praise for the RENs’ new portfolio segmentation schemes and the associated metrics.
  3. Metrics should focus on a program’s most important activities and outcomes and tie performance to annual program administrator ("PA") goals and cumulative statewide targets across all PAs. Additionally, both statewide and the RENs’ unique value metrics and indicators should provide direction, demonstrate progress, and hold implementers and PAs accountable for performance. However, the desire for more metrics should be weighed against the effort required to gather the data.
  4. While the CPUC should allow the RENs and their implementers discretion in their selection of which metrics best apply to their programs, it should simultaneously maintain requirements regarding data collection for a minimum number of shared metrics to ensure adequate data to assess program performance and to ensure meaningful contributions to any aggregated statewide totals from all PA programs.
  5. Having both types of metrics (shared statewide metrics and REN unique value metrics) allows the RENs greater reporting flexibility and more opportunity to showcase their gap-filling activities that might otherwise be lost when individual program performance numbers are combined with overall PA portfolio metrics or aggregate statewide numbers that reflect contributions from all PAs.
  6. Although unique value metrics are a meaningful way to track and demonstrate the RENs’ unique contributions to PAs’ EE portfolios, in some cases they still do not demonstrate the full value of the RENs’ activities because there is currently no CPUC-approved way to quantify the non-energy benefits arising from the ways the RENs are filling gaps in the marketplace. Until such time as the CPUC approves a formal framework to determine non-energy benefits, the evaluation team suggests the RENs do their best to document and illustrate these additional benefits in their annual reports, such as quantifying the number of jobs or newly educated/credentialed workers added within their service territories.
  7. Some statewide metrics/indicators may require personally sensitive information from customers. The CPUC, PAs, and/or a CAEECC WG should draft best practices for how implementers can collect personally identifiable information ("PII") and other sensitive information, and for how to speak with program participants to encourage them to share the personal information necessary to report the new statewide metrics.
  8. Some statewide metrics call for information that could be better provided by third parties with better access to existing data or with the ability to gather primary data from multiple sources. The CPUC should consider delineating distinctions within the required statewide metric categories to differentiate between (1) those that can be readily collected from project data, customers, and participating trade allies or other professionals; and (2) those that require external data from third parties such as public agencies, partnerships with outside organizations like air quality management districts, and/or extensive data collection such as surveys of populations that span multiple PA service territories.
  9. The evaluation team found that in some cases segmentation metrics “fit,” but they did not “make sense” for certain programs.
  10. The new statewide market support metrics call for tracking participants in various activities. The CPUC should consider requiring PAs to include definitions for the units of measure (e.g., training participants, contractors, requests for information, etc.) and the qualifications for inclusion (e.g., 100% vs. more than 50% attendance) used within their reporting metrics. PAs should also be required to point out any heterogeneous counts (e.g., individuals + facilities + agencies) included in the final singular portfolio-wide number reported for a given metric.
  11. Because the statewide metrics have only recently been proposed, the CAEECC WGs have not had time to attend to the myriad details associated with the metrics they established, and open questions remain, such as how to avoid double counting of participants, activities, or outcomes due to overlapping PA programs or inconsistent definitions. Until such time as the CPUC approves specific rules regarding reporting and the avoidance of double counting, the CPUC should consider requiring each implementer and each PA to document in their filings how they have quantified their program performance and what they have done to prevent double counting, such as providing definitions of the units counted (e.g., contractors vs. employees) and the eligible groups from which those units were drawn (e.g., within a geographic boundary, customer class, etc.).

Data Management Assessment of BayREN, SoCalREN, and 3C-REN:

  1. Based on the data review and interviews with BayREN staff, the evaluation team found BayREN’s data to be of high quality, reasonably mergeable, and moderately complete, except for phone numbers, which could be entered more consistently.
  2. Based on in-depth interviews with the BayREN program implementation teams, the evaluation team found all BayREN programs have well-established methods for documenting and explaining data flows and data management.
  3. While all BayREN implementers appear to maintain adequate data security protocols, an implementer who supports BayREN’s multifamily program has less rigorous data security standards than the others. For instance, while this multifamily implementer requires unique logins for its technical assistants, it does not provide specific training regarding how to handle PII, despite its centrality to data handling. The evaluation team recommends that BayREN, as well as the other RENs, conduct a data security review across all programs to ensure all implementers and any others with access to PII or other customer records maintain industry best practice standards.
  4. The evaluation team found the majority of SoCalREN’s tracking data fields to be sufficiently populated and of good quality, except for phone numbers, which were inconsistently entered.
  5. SoCalREN provided ample program documentation regarding protocols, tools, and practices. SoCalREN’s resource data processes and procedures are well honed to minimize errors.
  6. SoCalREN demonstrates a commitment to protecting the data within its data systems, as well as the systems used by its program implementers. While data handling and data security protocols differ by program implementer, SoCalREN requires compliance with the California Consumer Privacy Act ("CCPA") wherever PII is involved.
  7. The robustness of 3C-REN's data tracking and reporting is significantly better than observed in past studies of RENs and LGPs. However, while the tracking datasets 3C-REN provided include the most common fields necessary to merge program data with CPUC data (e.g., customer name, full address, phone number, and email address), the evaluation team identified gaps in the data completeness of six key mergeable fields: name, address, city or zip code, phone, date of participation, and email.
  8. 3C-REN should continue to work with its program implementers to identify existing barriers and potential solutions to collecting better customer contact information for their activities, including requiring at least one way to contact the customer, and should implement a standardized approach to data entry and validation across its programs, particularly with respect to key mergeable fields.
  9. All 3C-REN’s programs use Salesforce for customer relationship management (CRM) and data tracking, and SharePoint for data storage and collaboration across agencies and implementers. While these systems have built-in data security, 3C-REN did not provide specific data security documentation. However, when asked during in-depth interviews, 3C-REN staff explained that they ensure the security of their data by limiting access, using two-factor authentication, and transferring files through secure file protocols. They also require implementers to pass IOU Third-Party Security Reviews.
  10. Based on interviews, the evaluation team finds that 3C-REN's data systems are secure, but 3C-REN’s documentation did not discuss its data security policy. 3C-REN should formally define its security policies and clearly articulate how participant data under its control is secured.
Assessment of Regional Energy Networks