15 October 2010

Mr. Jason Berry
Program Manager, Demand Side Management
Rocky Mountain Power
201 South Main Street, Suite 2300
Salt Lake City, Utah 84111

Dear Mr. Berry:

Comverge is pleased to submit its annual report assessing system impacts of the Cool Keeper Load Control Program for 2010. Our Measurement & Verification (M&V) plan calls for the delivery of this report summarizing the results of the curtailment impacts for each program year.

As you will see in your review of this report, Comverge has developed significant and innovative capabilities in support of its contracts to obtain timely and accurate measurements of the response of its assets in Utah to curtailment requests. The attached report provides a comprehensive description of the Measurement & Verification processes and systems used in Utah this past summer. The calculation of the available capacity is also detailed, as discussed in the M&V plan.

Any questions concerning this report may be directed to Ms. Laurie Sobczak or the undersigned. Comverge looks forward to our continuing relationship with Rocky Mountain Power on this exciting program. Thank you.

Sincerely,

Wendell Miyaji, PhD
Vice President, Energy Sciences
Comverge, Inc.

ID PAC-E-11-12 IIPA 12 Attachment IIPA 12 -2 Attach IIPA 12 -2.pdf Page 1 of 57

Rocky Mountain Power
Cool Keeper 2010 Load Control Impact Evaluation Report

Comverge, Inc.
120 Eagle Rock Ave., Suite 190
East Hanover, NJ 07936
www.comverge.com

© 2010 Comverge, Inc. This document contains proprietary information of Comverge, Inc. and is not to be disclosed or used except in accordance with applicable agreements.
Table of Contents

1 Introduction ... 1
  1.1 Executive Summary ... 1
2 Data Sources and Uses ... 4
3 Measurement and Valuation Requirements ... 7
4 Sample Design ... 8
  4.1 Sample Size Selection ... 8
  4.2 Residential Segment Sample Design ... 8
  4.3 Commercial Segment Sample Design ... 11
5 Impact Evaluation Methods ... 13
  5.1 Data Collection & Validation ... 13
    5.1.1 Data Collection ... 13
    5.1.2 Data Validation ... 14
  5.2 Impact Estimation Methods ... 14
    5.2.1 Load Shape Comparison ... 14
    5.2.2 Duty Cycle Method ... 15
6 Impact Evaluation Results ... 16
  6.1 Temperature Analysis ... 16
  6.2 Impact Evaluation Calculations ... 20
  6.3 Final kW Settlement Calculation ... 21
7 M&V Exception Handling ... 22
  7.1 Automated Status Flags ... 22
  7.2 Out of Range Values ... 23
  7.3 Sampling Bias ... 25
  7.4 Data Reports ... 30
  7.5 Missing Meter Data ... 31
  7.6 Non-Responding DCUs ... 31
  7.7 Miswirings ... 33
  7.8 Absence of Zero Load Data ... 33
  7.9 Tonnage Discrepancies ... 34
  7.10 Incorrect Classifications ... 34
  7.11 Incorrect Meter Times ... 34
8 Conclusions ... 35
Appendix ... 37

Table of Figures

Figure 1-1 Summary of 2010 Events ... 2
Figure 2-1 Sample Communications M&V Map ... 6
Figure 3-1 Sample Size Analysis Results Performed by Quantum ... 7
Figure 4-1 Enrolled Population by Stratifications (Percentage) ... 9
Figure 4-2 Enrolled Population by Stratifications (Count) ... 9
Figure 4-3 M&V Sample by Stratifications (Percentage) ... 10
Figure 4-4 M&V Sample and Enrolled Population Differences (M&V Minus Enrolled) ... 10
Figure 4-5 Enrolled Population by Stratifications (Percentage) ... 11
Figure 4-6 Enrolled Population by Stratifications (Count) ... 11
Figure 6-1 Event Hours and Temperatures ... 17
Figure 6-2 Temperature at KSLC on July 16, 2010 ... 18
Figure 6-3 Temperature Versus Average Load (kW) for Residential Sites ... 19
Figure 6-4 Temperature Versus Average Load (kW) for Commercial Sites ... 19
Figure 6-5 Event Results Summary ... 20
Figure 6-6 Temperature Bin Impact Results ... 21
Figure 7-1 Valid Non-Zero Status Values ... 22
Figure 7-2 Out of Range Boundaries ... 24
Figure 7-3 Sites Flagged for Load Values Below Lower Bounds ... 24
Figure 7-4 Sites Flagged for Load Values Above Upper Bounds ... 25
Figure 7-5 Distribution of Out of Range Devices ... 25
Figure 7-6 Group Tonnage Distribution ... 26
Figure 7-7 Average Residential Bias ... 26
Figure 7-8 Average Residential Non-MDU Bias ... 27
Figure 7-9 Average Residential MDU Bias ... 27
Figure 7-10 Average Residential Load for Event Day July 16 ... 28
Figure 7-11 Average Residential Non-MDU Load for Event Day July 16 ... 28
Figure 7-12 Average Residential MDU Load for Event Day July 16 ... 29
Figure 7-13 Average Commercial Bias ... 30
Figure 7-14 Average Commercial Load for Event Day July 16 ... 30
Figure 7-15 M&V Field Visits Associated with Data Reporting Issues ... 31
Figure 7-16 M&V Field Visits Associated with Paging Issues ... 32
Figure 7-17 Nonzero Control Season Minimum Loads ... 33

Document Revision History

Revision 1.0 (released 10/19/2010) — Initial release for review.

Revision 2.0 (released 12/03/2010) — Revisions after review:
  • Split Residential and MDU information, similar to Table 6.5
  • Updated kW factors based on revised MDU weights
  • Added a sentence explaining out-of-range boundaries in Table 7.2
  • Provided M&V Group A vs. Group B comparisons for event day(s)
  • Replaced customer names with premise IDs
  • Note the date of problems experienced with any gateways and their resolution, going forward
  • Referenced the contract section in the report
  • Spelled out acronyms
  • Data validation: provided rules and code string; added appendix

Revision 2.1 (released 12/03/2010) — Updated Table of Contents and Table of Figures.

Revision 3.0 — Revised Residential calculations per J. Bumgarner on 12/22/2010 per contractual M&V requirements. Used data from sampled sites without applying any weights for MDU and non-MDU contribution to the Residential population. Released
12/28/2010

1 Introduction

1.1 Executive Summary

This report details the operation and load reduction results of the Comverge Cool Keeper™ direct load control (DLC) program for the summer of 2010. In 2010, three curtailment hours qualified as M&V event hours under the 97-degree temperature threshold, all of which occurred within the same event. This year the Cool Keeper program effectively delivered 101.5 megawatts (MW) of load savings throughout the Salt Lake City area. The load reduction results are calculated from load measurements taken at randomly selected Measurement and Verification (M&V) sites that closely follow the geographic and tonnage distribution of all sites enrolled in the program. M&V efforts in 2010 were conducted according to an M&V plan described to Rocky Mountain Power in a presentation by Comverge prior to the 2010 curtailment season. The M&V efforts and load reduction calculations are consistent with the descriptions in the contract. These efforts also helped to identify opportunities for the continual improvement of the Cool Keeper program.
Event                                  Hour          Temp (F)   Res kW Reduction   Com kW Reduction
June 29, 2010, 14:00-14:15 (50% ADI++) 14:00-14:15      97          N/A+               N/A+
July 15, 2010, 14:00-18:00 (50% DI++)  14:00-15:00      90          0.48               1.02
                                       15:00-16:00      90          0.43               1.11
                                       16:00-17:00      92          0.73               1.55
                                       17:00-18:00      91          0.84               1.30
July 16, 2010, 14:00-18:00 (50% ADI)   14:00-15:00      97          0.76               1.96
                                       15:00-16:00      98          1.05               2.15
                                       16:00-17:00     100          0.99               2.21
                                       17:00-18:00     102          1.03               1.67
July 19, 2010, 14:00-18:00 (50% ADI)   14:00-15:00      91          0.79               1.20
                                       15:00-16:00      93          0.89               1.38
                                       16:00-17:00      93          0.91               1.66
                                       17:00-18:00      94          0.90               1.09
July 20, 2010, 14:30-17:00 (DI)        14:30-15:00      90          N/A*               N/A*
                                       15:00-16:00      91          0.65               1.57
                                       16:00-17:00      91          0.87               1.48
July 30, 2010, 15:30-18:00 (DI)        15:30-16:00      94          N/A*               N/A*
                                       16:00-17:00      94          0.95               1.44
                                       17:00-18:00      94          0.94               1.19
August 3, 2010, 14:00-16:30 (DI)       14:00-15:00      92          0.68               1.33
                                       15:00-16:00      94          0.76               1.51
                                       16:00-16:30      93          N/A*               N/A*
August 17, 2010, 15:30-18:00 (DI)      15:30-16:00      95          N/A*               N/A*
                                       16:00-17:00      95          0.98               1.57
                                       17:00-18:00      96          1.00               1.77

+ Event cancelled after 15 minutes.
* Impacts are calculated only for complete hours.
++ ADI stands for Adaptive Distributed Intelligence; DI stands for Distributed Intelligence.

Figure 1-1 Summary of 2010 Events

This year Comverge successfully executed 8 curtailment events called by Rocky Mountain Power. In total, there were three complete hours at 97 degrees or above, which are used to calculate the load reduction impact estimates. The reduction calculations yielded 1.02 kW for the Residential segment and 2.01 kW for the Commercial segment. Based on the total installed capacity, this leads to an average total estimated load reduction of 101.5 MW. These results are described in detail, and the calculations shown, in the Impact Evaluation Results section. Figure 1-1 summarizes the load reduction results for each event.
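As an arithmetic check, the segment estimates can be reproduced by averaging the hourly reductions from Figure 1-1 over the qualifying hours. Which three hours qualify is inferred here (the last three hours of the July 16 event) so that the averages match the reported 1.02 kW and 2.01 kW; treat that selection as an assumption for illustration, not a statement of the contractual rule.

```python
# Hourly reductions from the July 16, 2010 event (Figure 1-1).
# Treating the last three hours as the qualifying hours is an
# assumption made for illustration.
qualifying_hours = [
    # (temp_F, res_kw, com_kw)
    (98,  1.05, 2.15),  # 15:00-16:00
    (100, 0.99, 2.21),  # 16:00-17:00
    (102, 1.03, 1.67),  # 17:00-18:00
]

res_avg = sum(r for _t, r, _c in qualifying_hours) / len(qualifying_hours)
com_avg = sum(c for _t, _r, c in qualifying_hours) / len(qualifying_hours)
print(round(res_avg, 2), round(com_avg, 2))  # 1.02 2.01
```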
Both the Residential and Commercial results are calculated by averaging the results of the differencing method and the duty cycle method. This report follows the format used to detail M&V results in 2009 and contains sections detailing Data Sources and Uses, Sample Design, Impact Evaluation Methods, and Impact Evaluation Results. A section on M&V Exception Handling, added last year, details how field and data issues are handled to ensure the integrity of the kW reduction estimates. The report concludes with a summary of the 2010 M&V operations and recommendations for improvements to the program in 2011. The analysis follows Exhibit A, Work Scope Section II, Measurement & Valuation, of PacifiCorp Contract 46-00001018. Some interpretations and clarifications have been made, as described in this report.

2 Data Sources and Uses

Comverge's M&V program utilizes a number of data sources to monitor, analyze, and forecast load usage in the Cool Keeper program. Comverge uses PowerCAMP™ database software to store all incoming load and communication data and the Business Information System (BIS) database to store all customer-related data. PowerCAMP stores information on paging reception at the M&V sites as well as the 5-minute load profile data for each site. In the PowerCAMP database each 5-minute interval for each meter is assigned an automated status flag, designating whether there was an error in reading the kW measurement for that interval. A status of 0 indicates that no error, such as a power outage, occurred; all data of status 0 is used for analysis. Some data was left out of the settlement calculations this year due to power outages: two sites had missing data for the entire day of the July 16th event because of a power outage, as recorded by the meters.
The rule for dealing with missing data is simply to leave it out of the calculations when it is missing for a full hour. If data exists for some intervals within both half hours of a given hour, the actual data collected can be used in the duty cycle calculations. In no case is missing data assumed to have a kW value of zero or any other value. The practice of leaving missing data out of the analysis is consistent with what has been done in all previous years of the project. Automated status flags are discussed in more detail in the M&V Exception Handling section of this report.

In addition to this database, Comverge uses external data sources to monitor weather and system load. Specifically, Comverge receives hourly weather data for the Salt Lake City airport from a third-party vendor, DTN Meteorlogix. The Salt Lake City airport is a National Oceanic and Atmospheric Administration (NOAA) weather station, identified by its call sign, KSLC. The average number of active load switches is calculated based on data from the participant tracking system.

For the duty cycle calculations, manufacturer data is also used to determine the connected load of specific M&V units when it is available. If it is not available, the connected load is calculated as the 99th percentile of the load readings taken from the unit. If the 99th percentile of the operating load exceeds 1.5 times the known nominal unit capacity, the connected load is instead set to the 95th percentile of the operating load.

PowerCAMP is the database system used to store and access all M&V data related to this project. PowerCAMP was developed by Comverge to allow for data collection from M&V sites. The meters used at the M&V sites have been programmed to record interval kilowatt data every 5 minutes, and the PowerCAMP system requests updated meter data every 30 minutes. PowerCAMP also provides M&V log files from the M&V units.
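The connected-load rule described above (the 99th percentile of readings, falling back to the 95th percentile when the 99th exceeds 1.5 times a known nominal capacity) can be sketched as follows. This is a minimal illustration; the function names are hypothetical, not Comverge's actual code, and the manufacturer-data path is not modeled.

```python
def percentile(values, p):
    """Linear-interpolation percentile (matches numpy's default method)."""
    s = sorted(values)
    k = (len(s) - 1) * p / 100.0
    f = int(k)
    c = min(f + 1, len(s) - 1)
    return s[f] + (k - f) * (s[c] - s[f])

def connected_load(readings_kw, nominal_kw=None):
    """Connected load from interval kW readings, per the rule above.

    Manufacturer data, when available, supplies the nominal capacity;
    this sketch covers only the percentile-based estimate."""
    p99 = percentile(readings_kw, 99)
    if nominal_kw is not None and p99 > 1.5 * nominal_kw:
        # Implausibly high relative to the nameplate: use the 95th percentile.
        return percentile(readings_kw, 95)
    return p99

readings = [x / 10.0 for x in range(1, 101)]  # 0.1 .. 10.0 kW, illustrative
print(connected_load(readings))                   # 99th percentile
print(connected_load(readings, nominal_kw=5.0))   # falls back to the 95th
```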
A separate Load Management System contains listings of the participants who are available for dispatch. For the 2010 season, the Apollo™ Demand Response Management System was used for event scheduling. Apollo is an open-standards, web-based demand response platform that supports device management and configuration as well as event generation and control.

BIS is the database system used to store customer and equipment related data for every participant enrolled in the Cool Keeper program. BIS allows Comverge to monitor and analyze data related to inspections, field service requests, and the characteristics of program participants and their installed equipment. For example, BIS is used to determine the number of active customers enrolled in the Cool Keeper program on any given day. Comverge also uses BIS to determine whether the M&V sample is representative of the enrolled population by querying customer addresses and air conditioner tonnage information from the database.

Daily "nicking" tests are conducted to test the M&V sites' ability to receive pages. All communications data from these daily nicking tests are stored in the PowerCAMP database. An application was developed that queries the database and displays the data using Microsoft® MapPoint in order to visualize the communication status of the M&V sites. Figure 2-1 is a screenshot of this program. The M&V sites are labeled with yellow and blue squares, indicating M&V Groups A and B, respectively. Sites with a white square are thermostat M&V sites that are not part of the analysis for the 2010 season. Additionally, a colored dot or triangle is placed in these square symbols to represent the number of paging signals received by each site in a given day. This allows for a quick determination of paging reliability over a region.
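A minimal sketch of how per-site paging reliability could be summarized from the daily nicking-test data. The site IDs, log format, and expected page count are all hypothetical; the report does not specify them.

```python
from collections import Counter

# Hypothetical receipt log: one (site_id, date) entry per nicking page received.
receipts = [
    ("MV-001", "2010-07-16"), ("MV-001", "2010-07-16"), ("MV-001", "2010-07-16"),
    ("MV-002", "2010-07-16"),
]

EXPECTED_PAGES_PER_DAY = 3  # illustrative assumption

counts = Counter(receipts)
flagged = sorted(site for (site, day), n in counts.items()
                 if n < EXPECTED_PAGES_PER_DAY)
print(flagged)  # sites that missed pages on that day: ['MV-002']
```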
This map makes clear any paging problems that are geographic in nature, such as an inoperative transmitter station. The transmitter stations in the area are also displayed on the map.

Microsoft Access is used as the main user interface for viewing and processing load and communication data. Meter readings are sent to the database every 30 minutes. SAS software is also used to access data directly from PowerCAMP and allows for querying and analysis of the data. These software packages are also used to calculate the load impact estimates using the methodologies described in this document.

Figure 2-1 Sample Communications M&V Map

3 Measurement and Valuation Requirements

The Measurement and Valuation requirements for the Cool Keeper program call for a statistically valid sample of M&V sites to be used to calculate the load impact estimate for the entire enrolled population. The sample set is broken up into sites rated at less than 65,000 British Thermal Units (BTUs) per hour and sites rated at greater than 65,000 but less than 90,000 BTUs. For the remainder of this document, the class of sites rated at less than 65,000 BTUs is referred to as Residential and those above 65,000 but less than 90,000 BTUs as Commercial. The sample size was selected to achieve the load reduction estimates with 90% confidence and reasonable precision: the goal is to obtain Residential estimates with 13% precision and Commercial estimates with 20% precision. Analyses performed by Quantum in 2007 indicated that Comverge would achieve sufficient precision if multiple hours are included in the final reduction estimate, as indicated in Figure 3-1 below.
Achievable precision by sample size and number of hours used in analysis:

Sample Size      1        2        3        4        5        6
    5          82.0%    65.0%    58.0%    54.0%    52.0%    50.0%
   10          58.0%    46.0%    41.0%    38.0%    37.0%    36.0%
   30          33.0%    26.0%    24.0%    22.0%    21.0%    20.0%
   65          23.0%    18.0%    16.0%    15.0%    14.0%    14.0%
  100          18.0%    14.0%    13.0%    12.0%    12.0%    11.0%
  135          16.0%    12.0%    11.0%    10.0%    10.0%    10.0%

Figure 3-1 Sample Size Analysis Results Performed by Quantum

This year 20 units were added to the M&V population for a total of 175 units. The M&V sites were randomly assigned to Group A or Group B, and these groups alternated as curtailment and comparison groups. The impact estimates are calculated from curtailment events occurring at 97 degrees F or higher using an adaptive load control algorithm set for a 50% duty cycle, operating on a 30-minute time step. Events were held only on summer non-holiday weekdays.

The load estimates are calculated using two different methods: Method One is a load shape comparison approach and Method Two is the duty cycle approach. Method One calculates the impact estimates by taking the difference between the average load in the comparison and curtailment groups; the groups used for curtailment and comparison alternate to eliminate any bias that may exist between the groups over the course of the summer. Method Two uses a duty cycle simulation algorithm on uncontrolled M&V sites to calculate the load reduction.

4 Sample Design

This section describes the sampling methodologies used to obtain a statistically representative sample from the population of enrolled Cool Keeper sites. Specifically, the sample is designed to provide impact estimates with 90% confidence and 13% precision (90/13) for the Residential segment. The sample for the Commercial segment is designed to achieve estimates with 90% confidence and 20% precision (90/20).
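The 90/13 and 90/20 targets express relative precision: the half-width of the 90% confidence interval as a fraction of the estimated mean. A sketch using the standard normal-approximation formula (assumed here for illustration; Quantum's exact method is not specified in this report):

```python
import math
import statistics

def relative_precision(sample, z=1.645):
    """Half-width of the 90% confidence interval divided by the sample mean
    (normal approximation, with z = 1.645 for 90% confidence)."""
    mean = statistics.mean(sample)
    std_err = statistics.stdev(sample) / math.sqrt(len(sample))
    return z * std_err / mean

# Hypothetical per-site kW impact estimates, for illustration only.
impacts = [0.9, 1.1, 1.0, 1.2, 0.8, 1.0, 1.1, 0.9]
print(round(relative_precision(impacts), 3))  # about 0.076, i.e. 7.6%
```

Larger samples shrink the standard error, which is why the precision figures in Figure 3-1 improve as the sample size grows.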
The Residential segment consists of air conditioners rated at less than 65,000 BTUs while the Commercial segment contains those rated at greater than 65,000 and less than 90,000 BTUs. The capacities were determined using the manufacturer model numbers stored in a tracking system. The samples are selected randomly along the stratifications shown in this section.

4.1 Sample Size Selection

The sample size needed to meet the statistical criteria outlined in the contract was calculated to be 175 M&V units. Due to customers dropping out of the M&V program, Comverge rotated out 26 units and reinstalled those plus an additional 20 units to bring the M&V population up to 175 units. On June 1, 2010, the Residential segment contained 142 Digital Control Units (DCUs) and the Commercial segment contained 32 DCUs. On September 1, 2010, the Residential segment consisted of 141 DCUs and the Commercial segment of 29 DCUs.

4.2 Residential Segment Sample Design

The Residential segment is stratified in a number of ways so that the sample could be as representative as possible. The segment is first divided into building categories: single-family detached, single-family attached (apartments and condominiums), and small commercial sites. All sites use air conditioners rated at less than 65,000 BTUs. Within each building type, sites are then classified into small and large air-conditioning tonnage groups. All sites are also classified as being in 1 of 6 geographic regions.

Figure 4-1 below shows the distribution of the population of participating Residential sites across the strata identified above. In February 2010, at the time the M&V report was compiled, there were approximately 92,248 unique sites in the Residential segment overall. Single-family detached homes make up the largest part of the Residential segment at 84%. These sites are broken up by air-conditioning tonnage to provide a better representation across the population.
Please note that the population counts and percentages for non-apartments and apartments are not the sums of the individual cells due to missing tonnage data in BIS. Sites that do not have a tonnage record are included in the population counts and percentages in the TYPE Total row.

POPULATION PERCENTAGES
                    NON-APT              APT
COUNTY           1.0-2.9  3.0-5.4   1.0-2.9  3.0-5.4    TOTAL
Davis              5.1%     9.3%      1.3%     0.1%     15.7%
Salt Lake - 840    8.6%    15.2%      4.9%     0.1%     11.3%
Salt Lake - 841   13.8%    11.7%      3.7%     0.2%     13.0%
Tooele             0.7%     1.0%      0.0%     0.0%      0.9%
Utah               4.4%     8.2%      0.8%     0.0%      6.8%
Weber              3.9%     6.3%      0.6%     0.0%      5.9%
TON Total         36.6%    51.6%     11.3%     0.4%
TYPE Total            83.6%               16.4%

Figure 4-1 Enrolled Population by Stratifications (Percentage)

POPULATION COUNTS
                    NON-APT              APT
COUNTY           1.0-2.9  3.0-5.4   1.0-2.9  3.0-5.4    TOTAL
Davis             2,735    4,963       717       27      8,442
Salt Lake - 840   4,629    8,161     2,620       71     15,481
Salt Lake - 841   7,402    6,287     1,999      113     15,801
Tooele              384      541         1        1        927
Utah              2,380    4,380       413        9      7,182
Weber             2,110    3,358       301       15      5,784
TON Total        19,640   27,690     6,051      236     92,248
TYPE Total           77,077               15,171

Figure 4-2 Enrolled Population by Stratifications (Count)

The remainder of this section describes the analysis procedures, the breakdown of the residential strata, the final sample distribution, and any adjustments that had to be made to the sample frame. Before additional M&V samples are selected, a number of steps are taken to clean the residential customer data. The data used to design the sample includes the Cool Keeper participant database, a list of participant removals, a list of participant disconnects, and data from the Rocky Mountain Power Customer Support System (CSS). The Residential sample was chosen across a number of different strata to ensure a good representation of the participant population. These stratifications are done by geographic region and by the air-conditioning tonnage at the residence.
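A representativeness check of this kind — comparing the M&V sample's share of each stratum against the enrolled population's share, as in Figure 4-4 — can be sketched as follows. The stratum shares below are hypothetical round numbers, not the report's values.

```python
# Hypothetical per-stratum shares (fractions of each group).
population = {"Davis": 0.157, "Salt Lake - 840": 0.293, "Weber": 0.059}
mv_sample  = {"Davis": 0.162, "Salt Lake - 840": 0.289, "Weber": 0.063}

# Signed difference per stratum: M&V share minus population share.
diffs = {s: round(mv_sample[s] - population[s], 3) for s in population}
worst = max(abs(d) for d in diffs.values())
print(diffs)
print(worst <= 0.035)  # the report's observed worst-case cell gap was 3.5%
```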
The geographic stratifications are determined by county and zip code; there are six different regions used. As of February 2010, the enrolled population was distributed throughout the strata described above according to Figures 4-1 and 4-2. Efforts were made to recruit customers according to the distribution in the overall population; since customers must voluntarily take part in the program, it can be difficult to follow that distribution exactly. Figure 4-3 shows how the Residential sample was distributed on June 1, 2010. The M&V sample distribution compares favorably with the population distribution. Figure 4-4 shows the differences between the M&V and population distributions; all of the individual cells in the table show a difference of 3.5% or less.

M&V PERCENTAGES
                    NON-APT              APT
COUNTY           1.0-2.9  3.0-5.4   1.0-2.9  3.0-5.4    TOTAL
Davis              5.6%     9.2%      1.4%     0.0%     16.2%
Salt Lake - 840    9.9%    15.5%      3.5%     0.0%     28.9%
Salt Lake - 841   12.7%    12.0%      1.4%     1.4%     27.5%
Tooele             0.7%     0.7%      0.0%     0.0%      1.4%
Utah               4.9%     7.7%      0.0%     0.7%     13.4%
Weber              1.4%     6.3%      4.2%     0.7%     12.7%
TON Total         35.2%    51.4%     10.6%     2.8%
TYPE Total            86.6%               13.4%

Figure 4-3 M&V Sample by Stratifications (Percentage)

M&V MINUS POPULATION PERCENTAGES
                    NON-APT              APT
COUNTY           1.0-2.9  3.0-5.4   1.0-2.9  3.0-5.4    TOTAL
Davis              0.6%     0.1%      0.3%    -0.1%      1.0%
Salt Lake - 840    1.3%    -0.9%     -0.7%    -0.1%     -0.4%
Salt Lake - 841   -1.3%    -0.5%     -1.9%     1.2%     -2.5%
Tooele             0.0%    -0.3%      0.0%     0.0%     -0.3%
Utah               0.8%    -0.4%     -0.7%     0.7%      0.5%
Weber             -2.6%     0.1%      3.5%     0.7%      1.7%
TON Total         -1.1%    -1.8%      0.5%     2.3%
TYPE Total             2.6%               -2.6%

Figure 4-4 M&V Sample and Enrolled Population Differences (M&V Minus Enrolled)

4.3 Commercial Segment Sample Design

This year additional Commercial M&V units were enrolled to bring the total sample to 32 air conditioners prior to the start of the program
season. There were 394 air conditioners with capacity over 65,000 BTUs participating in the Cool Keeper program at the time the M&V report was compiled, on February 1, 2010.

The Commercial segment is stratified in a similar manner to the Residential segment. Sites are classified into small and large air-conditioning tonnage groups and as being in 1 of 6 geographic regions. All sites use air conditioners rated at greater than 65,000 and less than 90,000 BTUs. Figures 4-5 and 4-6 below show the distribution of the population of participating Commercial sites across the strata identified above.

POPULATION PERCENTAGES
COUNTY           5.5-6.4   6.5-7.5    Total
Davis              9.6%      5.6%     15.2%
Salt Lake - 840   10.4%     16.8%     27.2%
Salt Lake - 841   10.9%      9.9%     20.8%
Tooele             0.3%      0.3%      0.5%
Utah               6.1%      7.6%     13.7%
Weber              8.9%     13.7%     22.6%
Total             46.2%     53.8%

Figure 4-5 Enrolled Population by Stratifications (Percentage)

POPULATION COUNTS
COUNTY           5.5-6.4   6.5-7.5    Total
Davis                38        22        60
Salt Lake - 840      41        66       107
Salt Lake - 841      43        39        82
Tooele                1         1         2
Utah                 24        30        54
Weber                35        54        89
Total               182       212       394

Figure 4-6 Enrolled Population by Stratifications (Count)

Special efforts are made to determine the capacity of each air conditioner to ensure its proper classification into the Commercial or Residential segment. During installation the technician is required to record the air-conditioner (A/C) make and model number. This information is then transferred to an Excel spreadsheet along with all the other DCU installations for that month. The transfer is done by the technician's company, and the file is then sent to Comverge, where dedicated personnel translate the make and model numbers into accurate tonnage values.
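The make/model-to-tonnage translation feeds the segment classification. A sketch of the two underlying conversions, assuming the standard 12,000 BTU-per-hour-per-ton relationship and the BTU boundaries stated in this report (the function names are illustrative):

```python
def btu_to_tons(btu_per_hour):
    """Convert cooling capacity to tons (12,000 BTU/hr per ton)."""
    return round(btu_per_hour / 12000.0, 1)

def classify_segment(btu_per_hour):
    """Segment boundaries as stated in this report."""
    if btu_per_hour < 65000:
        return "Residential"
    if btu_per_hour < 90000:
        return "Commercial"
    return "Out of scope"

# e.g. a unit rated at 71,000 BTU/hr:
print(btu_to_tons(71000))       # 5.9 tons
print(classify_segment(71000))  # Commercial
```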
Comverge utilizes Preston's Guide, published by the Preston Marketing Group, as needed to verify model characteristics. The guide gives an accurate tonnage for each model. For example, a Carrier unit with model number 38AK075 has a BTU/hour value of 71,000, or a tonnage value of 5.9. In instances where Preston's Guide does not list the model number, Comverge personnel consult the manufacturer's web site for literature on that specific model.

5 Impact Evaluation Methods

This section describes the methods used to evaluate the impact of the load control events using the sample set of M&V sites. The following sub-sections describe in detail the data collection approach, the calculations used to estimate load reduction, an analysis of the summer participants, and a load control switch failure analysis.

5.1 Data Collection & Validation

5.1.1 Data Collection

This section outlines the steps taken to collect all M&V data. Meter data is collected by a Comverge gateway sending data over the Code Division Multiple Access (CDMA) network. The gateway communicates with the meter and sends load information back to the PowerCAMP server. Comverge uses Microsoft Access and SAS software to connect directly to the PowerCAMP database. These software packages allow for increased data visualization and analysis. Additionally, Comverge uses MapPoint and programs developed in Visual Basic to view geographic details in the database. These tools also aid in data validation, since their query and visualization capabilities make it possible to detect anomalies in the data.

The M&V meter data for this project was collected and monitored throughout the summer, from June 1 through August 31. The meter used at each site is a Landis+Gyr S4 meter, which measures the air conditioner's energy usage.
A Comverge dual-relay DCU is used to control the air conditioners. The Comverge MainGate Commercial & Industrial (C&I) Gateway is used to retrieve meter data, capture the DCU change-of-state events, and send data back to the PowerCAMP server. Comverge maintains a server hosting the PowerCAMP application; data is stored on this server in a SQL database. Communications software is used to access the Gateways, and scheduling software is used to retrieve metering and DCU state information. The gateway communicates with the server over the internet using a CDMA modem. Communications are sent over Verizon's CDMA network to the server's static Internet Protocol (IP) address. The gateway is programmed with this IP address upon deployment to allow direct communication with the server.

The meter and gateway contain clocks to track time. The meter's clock can be changed remotely if needed, and the gateway is configured to retrieve time from the network periodically.

To obtain metering data, the gateway is programmed through the server with a call-in schedule. When the gateway calls in, the server checks the meter database for the last load profile data stored and issues a request for the gateway to retrieve any additional interval data after the specified date. The gateway then interrogates the meter, packages the metering data, and sends it to the server, which stores the new data in the database. As part of this data, the meter flags the appropriate intervals with a status bit to indicate overflow, parity error, short interval due to power outage, or other special conditions.

The dual-relay DCU is simultaneously connected to the A/C unit for control purposes and to an external option board in the gateway for reporting purposes. Therefore, any action by the DCU on the A/C unit will also be sent to the external option board in the gateway.
The gateway reports any change of state of the external option board to the server.

5.1.2 Data Validation

The meter and weather data stored in Comverge's database are checked for consistency and accuracy in a number of ways. A visual inspection of meter data generally occurs daily throughout the summer. This inspection looks for meter data that seems out of place based on past experience. For example, an inspection may show more air conditioners not running than would be expected at a given temperature. This prompts further inspection of communication reliability for that day, since poor communication with M&V sites can result in no meter readings when there is actually load at the site. This inspection is done through database queries to see which units have been communicating. Other consistency checks include comparing individual unit loads by day to look for unusual load profiles and comparing the comparison and curtailment groups to detect potential biases. Comverge's data validation procedures are discussed in detail in the M&V Exception Handling section of this report.

5.2 Impact Estimation Methods

The impact estimation methods used to calculate the kW savings for the 2010 curtailment season follow the same procedures used for estimation in 2009. Only events where a 50% ADI curtailment strategy was used are considered for settlement calculations. Additionally, only full hours may be used for analysis, and the first half-hour of each event is excluded. Only event hours where the temperature was recorded as 97 degrees or above are included. In total, there were three hours over the course of the summer that fit these criteria.

5.2.1 Load Shape Comparison

The load shape comparison method calculates the impact estimate through the comparison of two M&V groups. The M&V sites are randomly grouped into a Group A and a Group B.
The group that is curtailed for each event changes based on the prior events and curtailment strategies. Group A is curtailed during the first event for both ADI and DI events. For subsequent events, the curtailment group is alternated based on which group was curtailed during the prior event with the same curtailment strategy. The load estimate is obtained by averaging the load from the Natural Duty Cycle (NDC) units (the comparison group) and the Imposed Duty Cycle (IDC) units (the curtailment group) in each temperature bin. The difference between the two averages is taken for each temperature bin. The final estimate using the load shape comparison technique is obtained by averaging the estimates from all temperature bins. This number is later averaged with the estimate obtained from the duty cycle method described in the next section.

5.2.2 Duty Cycle Method

The final load impact estimate averages the result obtained from the load shape comparison methodology with an estimate obtained using the Duty Cycle Method. In the Duty Cycle Method, the imposed duty cycle (IDC) is simulated based on knowledge of how the 50% ADI strategy alters A/C usage. The natural duty cycle (NDC) data for each unit is used to determine the usage during the hour prior to an event being called. The ADI algorithm uses this information to determine run time during an event. The demand savings are determined by subtracting the simulated IDC from the NDC data during the analysis hour. This difference is then multiplied by the unit's connected load to get the demand savings in kW. This is done for each half-hour interval during an event. The results are averaged separately for the Multiple Dwelling Unit (MDU) Residential, Non-MDU Residential, and Commercial segments.
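The simulation just described can be sketched for a single unit as follows. This is a simplified illustration with hypothetical function and variable names; the actual calculations, including special-case handling, are performed in Comverge's production tooling:

```python
def duty_cycle_impact(prior_hour_kw_avg: float,
                      event_half_hour_kw: list,
                      connected_load_kw: float,
                      adi_fraction: float = 0.5) -> float:
    """Estimate hourly kW savings for one unit under a 50% ADI event.

    prior_hour_kw_avg: average demand in the hour before the event
    event_half_hour_kw: demand in each half-hour of the analysis hour
    """
    # Natural duty cycle in the hour before the event
    ndc_prior = prior_hour_kw_avg / connected_load_kw
    # Maximum allowed duty cycle (AA) under the ADI strategy
    aa = adi_fraction * ndc_prior
    impacts = []
    for kw in event_half_hour_kw:
        ndc = kw / connected_load_kw                     # natural duty cycle
        idc = min(ndc, aa)                               # imposed duty cycle
        impacts.append((ndc - idc) * connected_load_kw)  # kW savings
    # Hourly impact = average of the half-hour impacts
    return sum(impacts) / len(impacts)

# Example: 3.0 kW connected load, unit running 80% of the prior hour
print(round(duty_cycle_impact(2.4, [2.4, 2.1], 3.0), 2))  # 1.05
```

The per-unit results from a sketch like this would then be averaged across sites, binned by temperature, and combined with the load shape comparison estimate.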
The final Residential estimate is determined by weighting the MDU Residential and Non-MDU Residential estimates based on the number of corresponding active participants in the population.

The first step in calculating the impact estimate is calculating the NDC for the hour prior to the event. This is done by calculating the average kW demand for the prior hour and dividing by the connected load for each unit. The maximum cycle (AA) during curtailment under a 50% ADI algorithm is then obtained by multiplying that natural duty cycle by 0.5. If the resulting prior-hour NDC is less than 0.1, the connected load value is used in the difference calculations. These calculations are done for every M&V unit in the control group for each control day.

The natural duty cycle during the event is calculated for each half hour of the event. The NDC of the M&V units in the comparison group is calculated by taking the kW value for each event half-hour and dividing by each unit's respective connected load. The IDC is then calculated as the minimum of the NDC and the maximum cycle calculation (AA) for each half-hour. The hourly impact for each site is obtained by averaging the impacts for the two half-hour intervals. All M&V comparison sites are then averaged for each analysis hour. These results are binned by temperature in the same manner described in the load shape comparison section. The average of all bins is taken, and the impact result from the Duty Cycle Method is averaged with the impact result from the load shape comparison method to get a final estimated impact.

6 Impact Evaluation Results

6.1 Temperature Analysis

There were a total of 8 separate events called in the Cool Keeper program this year. Three event hours met the full criteria to be considered in the impact analysis. All event hours are detailed in Figure 6-1.
Date                           Hour          Temp (F)
June 29, 2010, 14:00-14:15     6/29 14:00       97
July 15, 2010, 14:00-18:00     7/15 14:00       90
                               7/15 15:00       90
                               7/15 16:00       92
                               7/15 17:00       91
July 16, 2010, 14:00-18:00     7/16 14:00       97
                               7/16 15:00       98
                               7/16 16:00      100
                               7/16 17:00      102
July 19, 2010, 14:00-18:00     7/19 14:00       91
                               7/19 15:00       93
                               7/19 16:00       93
                               7/19 17:00       94
July 20, 2010, 14:30-17:00     7/20 14:00       90
                               7/20 15:00       91
                               7/20 16:00       91
July 30, 2010, 15:30-18:00     7/30 15:00       94
                               7/30 16:00       94
                               7/30 17:00       94
August 3, 2010, 14:00-16:30    8/3  14:00       92
                               8/3  15:00       94
                               8/3  16:00       93
August 17, 2010, 15:30-18:00   8/17 15:00       95
                               8/17 16:00       95
                               8/17 17:00       96

Figure 6-1 Event Hours and Temperatures

The event that met the full criteria to be considered in the impact estimate calculations occurred on July 16, 2010. Figure 6-2 shows the temperature for the event as recorded by the weather station in Salt Lake City, UT. The maximum temperature reached 102 degrees F at 17:00.

Figure 6-2 Temperature at KSLC on July 16, 2010

An analysis of temperature and load data indicates a strong positive correlation between these two variables. Figures 6-3 and 6-4 show scatter plots of temperature in Salt Lake City, UT versus load for the Residential and Commercial segments, respectively. The data is from June 1, 2010 through August 31, 2010 for non-holiday weekdays, for the hours of 13:00-19:00. As noted earlier, load data is collected in 5-minute intervals. The load displayed in each plot is calculated by averaging the loads for all M&V sites for each 5-minute interval and then taking the average for each hour. Regressing load on temperature, the data indicates an increase in load of approximately 0.053 kW and 0.091 kW for every one-degree increase in temperature for Residential and Commercial sites, respectively.
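A regression of this form can be sketched with NumPy as follows. The data points below are illustrative only, constructed to lie on a line with the reported Residential slope; the actual analysis uses the full summer of 5-minute M&V interval data:

```python
import numpy as np

# Hypothetical hourly averages: temperature (deg F) vs. Residential kW
temps = np.array([85, 88, 90, 93, 95, 97, 99, 101])
loads = np.array([0.988, 1.147, 1.253, 1.412, 1.518, 1.624, 1.730, 1.836])

# Fit load as a linear function of temperature (degree-1 polynomial)
slope, intercept = np.polyfit(temps, loads, 1)

print(round(slope, 3))                   # kW increase per degree F: 0.053
print(round(slope * 99 + intercept, 2))  # predicted load at 99 deg F: 1.73
```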
Based on the regression model, a temperature of 99 degrees would yield about 1.73 kW of load for Residential sites and 3.92 kW of load for Commercial sites.

Figure 6-3 Temperature versus Average Load (kW) for Residential Sites

Figure 6-4 Temperature versus Average Load (kW) for Commercial Sites

6.2 Impact Evaluation Calculations

Figure 6-5 shows the results for each of the eight events held this year, and Figure 6-6 shows the results averaged into the appropriate temperature bins. Averaging these temperature bins for the settlement date of July 16, 2010, yields a Residential result of 1.02 kW and a Commercial result of 2.01 kW.

                          Differencing Method      Duty Cycle Method          Average
                            (kW Reduction)           (kW Reduction)        (kW Reduction)
Hour          Temp (F)    Apt    Res~   Com        Apt    Res~   Com      Apt    Res~   Com
June 29, 2010, 14:00-14:15 (50% ADI)
14:00-14:15      97       N/A+   N/A+   N/A+       N/A+   N/A+   N/A+     N/A+   N/A+   N/A+
July 15, 2010, 14:00-18:00 (50% DI)
14:00-15:00      90       0.56   0.41   0.14       0.41   0.55   1.90     0.49   0.48   1.02
15:00-16:00      90       0.59   0.30   0.22       0.42   0.55   1.99     0.51   0.43   1.11
16:00-17:00      92       0.94   0.64   0.84       0.62   0.82   2.26     0.78   0.73   1.55
17:00-18:00      91       1.01   0.76   0.66       0.66   0.91   1.93     0.84   0.84   1.30
July 16, 2010, 14:00-18:00 (50% ADI)
14:00-15:00      97      -0.45   0.70   1.48       0.25   0.82   2.44    -0.10   0.76   1.96
15:00-16:00      98      -0.30   1.07   1.66       0.34   1.03   2.63     0.02   1.05   2.15
16:00-17:00     100      -0.44   0.96   1.62       0.29   1.03   2.80    -0.07   0.99   2.21
17:00-18:00     102      -0.50   1.01   0.96       0.22   1.05   2.38    -0.14   1.03   1.67
July 19, 2010, 14:00-18:00 (50% ADI)
14:00-15:00      91       0.74   0.70   0.45       0.45   0.89   1.96     0.59   0.79   1.20
15:00-16:00      93       0.31   0.82   0.80       0.26   0.95   1.96     0.29   0.89   1.38
16:00-17:00      93       0.27   0.82   1.20       0.36   1.00   2.12     0.32   0.91   1.66
17:00-18:00      94       0.50   0.83   0.48       0.42   0.97   1.69     0.46   0.90   1.09
July 20, 2010, 14:30-17:00 (50% DI)
14:30-15:00      90      -0.32   0.15   0.67       0.16   0.65   1.93    -0.08   0.40   1.30
15:00-16:00      91      -0.30   0.50   0.93       0.20   0.81   2.20    -0.05   0.65   1.57
16:00-17:00      91      -0.11   0.70   0.83       0.24   1.03   2.14     0.06   0.87   1.48
July 30, 2010, 15:30-18:00 (50% DI)
15:30-16:00      94       0.60   0.38   0.08       0.58   0.93   1.94     0.59   0.66   1.01
16:00-17:00      94       0.66   0.84   0.90       0.51   1.07   1.99     0.58   0.95   1.44
17:00-18:00      94       0.63   0.82   0.78       0.58   1.05   1.60     0.60   0.94   1.19
August 3, 2010, 14:00-16:30 (50% DI)
14:00-15:00      92      -0.07   0.52   0.63       0.19   0.85   2.03     0.06   0.68   1.33
15:00-16:00      94      -0.22   0.61   0.94       0.20   0.92   2.07    -0.01   0.76   1.51
16:00-16:30      93      -0.32   0.23   0.48       0.21   0.94   2.33    -0.06   0.59   1.40
August 17, 2010, 15:30-18:00 (50% DI)
15:30-16:00      95       0.53   0.48   0.28       0.51   0.91   2.11     0.52   0.69   1.20
16:00-17:00      95       0.72   0.92   0.90       0.71   1.05   2.24     0.72   0.98   1.57
17:00-18:00      96       0.82   0.87   1.39       0.82   1.13   2.14     0.82   1.00   1.77

+ Event cancelled after 15 minutes; events are only calculated on an hourly basis.
~ Residential includes both the MDU and non-MDU population without weighting.

Figure 6-5 Event Results Summary

In the M&V Exception Handling section of this report, the bias between the groups is analyzed in detail. Although a bias is present in the MDUs, no adjustments have been applied to account for it. The reduction results for the settlement day are presented in Figure 6-6.

Temp Bin (F)   Res kW Reduction   Com kW Reduction
     98              1.05               2.15
    100              0.99               2.21
    102              1.03               1.67
Average              1.02               2.01

Figure 6-6 Temperature Bin Impact Results

6.3 Final kW Settlement Calculation

The final kW reduction estimate was calculated to be 1.02 kW for the Residential segment and 2.01 kW for the Commercial segment. Multiplying these final reduction estimates by the number of weighted installs yields 100.4 MW (1.02 kW × 98,399 installs) for the Residential segment and 1.1 MW (2.01 kW × 551 installs) for the Commercial segment. The number of weighted installs is found by weighting each installed unit by the percentage of days the unit was active during the 2010 control season.
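The settlement arithmetic above can be reproduced directly. The per-unit estimates and weighted install counts are taken from the report; the script itself is only an illustration:

```python
# Final per-unit reduction estimates (kW) from Figure 6-6
res_kw, com_kw = 1.02, 2.01

# Weighted installs: each unit weighted by the fraction of the
# 2010 control season during which it was active
res_installs, com_installs = 98_399, 551

res_mw = res_kw * res_installs / 1000.0  # convert kW to MW
com_mw = com_kw * com_installs / 1000.0

print(round(res_mw, 1))  # 100.4
print(round(com_mw, 1))  # 1.1
```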
7 M&V Exception Handling

This section describes a number of circumstances involving either a technician visit to an M&V site or an adjustment of the load data used for settlement calculations. Each subsection describes how Comverge remedies the situation in a manner that best estimates the performance of the general population.

7.1 Automated Status Flags

As mentioned in the Data Sources and Uses section of this report, for each 5-minute interval, every meter in PowerCAMP is assigned an automated status flag indicating whether there was an error in reading the kW measurement for that interval. Most of the status flags recorded in PowerCAMP come back as 0, signaling that no problems were detected with the data. Figure 7-1 below details all of the non-zero status values and their meanings. Load data from any interval carrying one of the values in Figure 7-1 is not counted toward load reduction calculations.

Occasionally, PowerCAMP may report a status value that is neither 0 nor any of the values in Figure 7-1. This is an error in the PowerCAMP system in which a 0 value should have been recorded. In these cases, 0 is substituted for the non-valid status flag, and the load data is counted toward settlement like any other interval with a 0 status value. Automated parsing scripts are used to remove non-valid load data from settlement consideration. Only intervals with the status values shown in Figure 7-1 are removed from settlement calculations. Therefore, if a given meter shows 3 intervals with power outages in an hour, only those 3 intervals are discounted; data from all other intervals in that hour is used in settlement calculations.
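A minimal sketch of such a parsing step is shown below. The data layout and function name are hypothetical; the production scripts run against the PowerCAMP database:

```python
# Status values that invalidate an interval (per Figure 7-1)
INVALID_STATUS = {104, 100, 96, 72, 68, 64, 40, 36, 32, 2, 1}

def filter_intervals(intervals):
    """Keep only 5-minute intervals whose status flag is valid.

    Each interval is a (timestamp, kw, status) tuple. Unknown non-zero
    statuses are treated as 0 per the exception-handling rule above.
    """
    kept = []
    for ts, kw, status in intervals:
        if status in INVALID_STATUS:
            continue  # e.g. power outage: exclude from settlement
        kept.append((ts, kw, 0))  # substitute 0 for any non-valid flag
    return kept

sample = [("14:00", 1.8, 0), ("14:05", 0.0, 32), ("14:10", 1.7, 7)]
print(filter_intervals(sample))
# [('14:00', 1.8, 0), ('14:10', 1.7, 0)]
```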
Status Value   Description                                  Total Instances
104            Time Change, Power Outage, Long Interval                   0
100            Time Change, Power Outage, Short Interval                  0
96             Time Change, Power Outage                                  0
72             Time Change, Long Interval                                 0
68             Time Change, Short Interval                                1
64             Time Change                                                0
40             Power Outage, Long Interval                                0
36             Power Outage, Short Interval                             492
32             Power Outage                                          36,191
2              Overflow                                                   0
1              Parity                                                     0

Figure 7-1 Valid Non-Zero Status Values

An analysis of the PowerCAMP system shows a total of 36,684 instances of non-zero status values. One instance corresponds to one five-minute interval for one site; 36,684 instances therefore correspond to less than 1% of the control season. Figure 7-1 outlines the specific instances of these status values. The maximum number of instances of non-zero statuses for any one site is 10,596. This particular site had two major power outages that account for the majority of its non-zero statuses: the first from June 21 until June 28 and the second from July 15 until August 13. Field technicians visited this site in early July and noted that Heating, Ventilation and Air Conditioning (HVAC) personnel had removed the M&V equipment.

Eight sites had non-zero status values on July 16, 2010. Only two of these eight sites had non-zero status values for the duration of the event. Three of the eight sites had one non-zero status value at 14:10; the remaining three sites had non-zero status values outside of the event. Approximately 60% of sites had less than one total hour of load data over the course of the cooling season excluded from analysis. Only 13% of sites had over 8 hours of excluded load data.
7.2 Out of Range Values

An analysis of the load data from each meter is performed to obtain the kW value associated with the A/C running at peak. The analysis verifies that the load data is consistent with an A/C of the reported tonnage. If not, the site is investigated. During report generation in prior years, there were instances where the 99th percentile was either too high or too low for the associated tonnage of the A/C unit. The 99th percentile load is used in the duty cycle method calculations.

Comverge has set up load data boundaries to detect load data that is larger than would be expected. Load data from each meter is compared with observed load data for meters with known tonnages. Specifically, upper bounds have been calculated for load usage by tonnage based on the average value of the 99th percentile of load data for each meter in the 2008 PowerCAMP system. The upper bound for each tonnage bin is set equal to the average 99th percentile plus one standard deviation of the percentile. Figure 7-2 shows the resulting boundaries derived from the 2008 data by tonnage bin.

Comverge has set up a query to flag possible anomalies from meters that read a value above the upper bound shown in Figure 7-2. Similarly, if a meter has a 99th percentile value more than one standard deviation below the average, a flag signals a data anomaly. If two consecutive events are flagged, suggesting a trend rather than an anomaly, or if a single event above the upper bound is of a magnitude that indicates a significant issue, the local field team is instructed to investigate. A log of all investigations and their outcomes is maintained at the local office in concert with the Technical team. If the issue is a wiring problem or a problem with the meter or Gateway, the data collected prior to the fix is removed from settlement calculations.
If, however, the field team determines that the meter is reading the load properly, the tonnage is revised if necessary, but the meter data is left in the database for settlement calculations.

Tons      Lower Bound   Upper Bound
1.0-1.9       1.02          2.23
2.0-2.9       1.71          2.87
3.0-3.9       2.49          3.80
4.0-4.9       3.69          5.06
5.0-5.4       4.78          6.14

Figure 7-2 Out of Range Boundaries

Figures 7-3 and 7-4 show the devices that were flagged for having load values beyond the lower bounds and upper bounds, respectively. In each of these figures, the probability corresponds to the probability of observing a more extreme load value. Values that fall below the lower bounds are considered less of an anomaly than values beyond the upper bounds, because low values are most likely due to customers restricting their air-conditioning usage. Figure 7-5 shows where these out-of-range values fall on a normalized probability distribution. The gray vertical line indicates an upper 95% confidence interval. Load values beyond +1.645 standard deviations from their mean are investigated in detail.

Device   Tonnage   99th Percentile   Standard Deviations   Probability   Comments
356        3.0          2.46                1.04               15%       99th percentile is within 95% confidence interval.
99         4.0          3.62                1.08               14%       99th percentile is within 95% confidence interval.
107        3.0          2.42                1.09               14%       99th percentile is within 95% confidence interval.
16         2.0          1.62                1.16               12%       99th percentile is within 95% confidence interval.
133        3.0          2.37                1.17               12%       99th percentile is within 95% confidence interval.
284        3.0          2.16                1.48                7%       99th percentile is within 95% confidence interval.
359        3.0          2.10                1.57                6%       99th percentile is within 95% confidence interval.
228        3.0          2.09                1.59                6%       99th percentile is within 95% confidence interval.
447        4.0          3.10                1.84                3%       Verified by field visit.
367        5.0          4.00                2.15                2%       Verified by field visit.
96         1.5          0.04                2.64                0%       Verified by field visit.
97         3.0          0.96                3.30                0%       Found to be 1.0 tons.
185        2.0          0.00                3.95                0%       Verified by field visit.
55         2.5          0.00                3.95                0%       Verified by field visit.
374        3.0          0.00                4.76                0%       Found to be 2.0 tons.

Figure 7-3 Sites Flagged for Load Values Below Lower Bounds

Device   Tonnage   99th Percentile   Standard Deviations   Probability   Comments
563        3.0          3.86                1.09               14%       99th percentile is within 95% confidence interval.
338        2.5          2.92                1.09               14%       99th percentile is within 95% confidence interval.
205        3.5          3.88                1.12               13%       99th percentile is within 95% confidence interval.
207        3.5          3.90                1.14               13%       99th percentile is within 95% confidence interval.
122        2.5          3.04                1.29               10%       99th percentile is within 95% confidence interval.
48         3.5          4.01                1.32                9%       99th percentile is within 95% confidence interval.
254        2.5          3.07                1.35                9%       99th percentile is within 95% confidence interval.
304        5.0          6.66                1.76                4%       Verified by field visit.
425        5.0          6.80                1.98                2%       Verified by field visit.
71         3.5          4.92                2.69                0%       Verified by field visit.
568        5.0          7.62                3.18                0%       Verified by field visit.

Figure 7-4 Sites Flagged for Load Values Above Upper Bounds

Figure 7-5 Distribution of Out of Range Devices

7.3 Sampling Bias

Each Residential site in the M&V sample is randomly selected to be in either Group A or Group B, which alternately serve as the Comparison and Curtailment groups for each event occurring at 97 degrees or above. The groups also separately alternate for each event below 97 degrees. The goal in alternating the groups is to minimize the overall effect of any bias between the groups on the final load reduction calculation. The random selection is intended to minimize the potential for bias between the groups; however, there is always the possibility that the random selection results in one group having larger units than the other.
To guard against this, the tonnage data from the M&V sample is evaluated prior to the start of the season to ensure that the two groups have similar tonnage profiles. The tonnage distribution prior to the beginning of the cooling season is detailed in Figure 7-6. Differences in tonnage distribution between the two groups were found to be negligible, and sites therefore remained in the same group for the entire cooling season.

              Residential           Commercial
Tons       1.0-2.9   3.0-5.4    5.5-6.4   6.5-7.5
Group A       34        36          7         8
Group B       33        38          8        11

Figure 7-6 Group Tonnage Distribution

Figures 7-7 through 7-9 below show the bias for the Residential group over the course of the control season. These figures show the hourly non-curtailed load averaged over all non-holiday weekdays from June 1, 2010 through August 31, 2010. The difference in load between the groups does not exceed 0.1 kW for the Residential segment; however, a bias is observed in the MDU portion of the population.

Figure 7-7 Average Residential Bias

Figure 7-8 Average Residential Non-MDU Bias

Figure 7-9 Average Residential MDU Bias

Figures 7-10 through 7-12 present the hourly averages for the Residential segment for the event day.

Figure 7-10 Average Residential Load for Event Day July 16

Figure 7-11 Average Residential Non-MDU Load for Event Day July 16

Figure 7-12 Average Residential MDU Load for Event Day July 16

Figure 7-13 shows that for the Commercial segment, Group A had a substantially higher load than Group B, with the difference between the groups averaging about 0.82 kW.
However, on July 16, 2010, the difference between the groups was much lower for the hour leading up to the event; during the 13:00 hour, the difference was about 0.17 kW (as shown in Figure 7-14).

Figure 7-13 Average Commercial Bias

Figure 7-14 Average Commercial Load for Event Day July 16

7.4 Data Reports

Failures to receive data reports fall into two categories: system-wide communications issues and issues with specific sites. A system-wide communications issue could result from either of the following circumstances: (1) PowerCAMP is down, or (2) the CDMA network is down.

The Comverge Network Operating Center (NOC) monitors the servers to ensure the applications required to collect the data are running. Most problems can be resolved by restarting the PowerCAMP applications and verifying that communications have been reestablished. If it is determined that the PowerCAMP application is functioning properly, Comverge contacts the CDMA network provider to ensure the network is operational.

For specific-site communication issues, if no data reports are received from an M&V site over the course of a 3-day period, Comverge investigates the cause of the problem. It could result from one of two circumstances: (1) the Gateway is broken or miswired, or (2) the customer or an HVAC technician has disconnected the equipment. Either requires a field visit by a Comverge technician. If the technician finds the Gateway device to be dysfunctional or miswired, the technician replaces the Gateway or fixes the wiring to the Gateway. If the technician finds the equipment has been disconnected, he leaves the equipment as is, since the site is representative of a population site where the same type of disconnection might occur.
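The 3-day detection rule can be sketched as a simple query over last-report dates. The function and data below are hypothetical; in practice this check runs against the PowerCAMP database:

```python
from datetime import date, timedelta

def gateways_needing_visit(last_report, today, max_gap_days=3):
    """Flag gateways whose most recent data report is older than 3 days.

    last_report maps a gateway ID to the date of its last report.
    """
    cutoff = today - timedelta(days=max_gap_days)
    return sorted(gw for gw, seen in last_report.items() if seen < cutoff)

# Hypothetical example: one gateway silent since August 1
reports = {"GW5225": date(2010, 8, 1), "GW22649": date(2010, 8, 17)}
print(gateways_needing_visit(reports, today=date(2010, 8, 18)))
# ['GW5225']
```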
Specific instances when a technician replaced the Gateway device are detailed in Figure 7-15 below.

PREMISE ID   GWID    TESTS/MODIFICATIONS            PROBLEM DATE   RESOLUTION DATE
290178243    5225    GATEWAY REPLACED                 08/01/10        08/19/10
773295125    22649   GATEWAY REPLACED                 08/15/10        08/19/10
588890471    22675   GATEWAY REPLACED                 06/01/10        06/23/10
998201905    5251    GATEWAY MISWIRING CORRECTED      06/01/10        06/16/10
166007181    22744   GATEWAY MISWIRING CORRECTED      06/01/10        06/09/10

Figure 7-15 M&V Field Visits Associated with Data Reporting Issues

7.5 Missing Meter Data

Data may be missing because a Gateway device failed to report it. A meter will store data for at least 7 days before its memory capacity is reached. If Comverge detects that a Gateway has not reported over a period of 3 days, a technician is sent to the field to fix any Gateway problems. If needed, the technician can download any data stored in the meter to his laptop computer. Once the data is downloaded, procedures exist to incorporate it into the PowerCAMP system with all other load data. Specific M&V field visits associated with Gateway device issues are detailed in Figure 7-15 above.

7.6 Non-Responding DCUs

Comverge conducts daily paging tests of all M&V units, as described earlier in the Data Sources and Uses section. When DCU reports are not received, the underlying issue falls into two categories: a system-wide communications issue or an issue with specific sites.

For system-wide communications issues, the problem could result from the following circumstances: (1) there is a head-end software problem, or (2) there is a paging network problem. A head-end software problem involves the LMS software or Apollo system, from which the curtailment message is initiated. If a problem with this software causes the signal not to be sent, no DCU will receive the signal.
This problem is detected through the daily paging tests and is fixed by immediately restarting the software to bring it back to functional status. If the LMS and Apollo systems are functional and a large number of devices are not receiving the pages, a paging tower or the paging network in general may be down. Comverge detects a failure to receive the signal among many of the installed M&V units by reviewing paging reception on a map of the area on a daily basis to verify that all paging towers are functioning as intended. If a regional reception failure is detected, Comverge immediately contacts Utah Comm to troubleshoot the source of the paging problem.

If a DCU is found not to respond to pages, there could be a number of causes, including: (1) the customer has left the program and the device has been deactivated through Apollo, (2) there is a problem with the wiring between the DCU and the Gateway, or (3) the DCU has poor paging reception based on location and/or device.

The first step in investigating this issue is to verify whether the device has been reprogrammed in the Apollo system because the customer has left the program. If the customer has left the program, any data collected after the deactivation date is not counted toward settlement and must be removed from the load database used for settlement calculations. Comverge verifies the dropout dates of M&V customers on a weekly basis to ensure that only appropriate data is used in event reduction calculations. If the customer is found to still be an active participant in the program, a technician is dispatched to the site within two to three days to investigate further. If there is a problem with the wiring between the DCU and the Gateway, the technician fixes the connection. If, however, the problem results simply from poor paging reception at that unit, the technician does nothing and leaves the site as is.
In 2010, a handful of M&V sites had documented paging issues, which are outlined in the following figure.

PREMISE ID   GWID    COMMENTS
938180642    5252    DCU EQUIPMENT FOUND TO BE FAULTY. CANNOT AND DID NOT REPLACE DCU.
317856442    5167    SITE IS COMPLETELY ISOLATED AND NOT IN SIGHT OF THE TOWER.
12034094     5192    INTERMITTENT PAGING SITE.
131725412    15154   BAD LOCATION WITH WEAK SIGNAL STRENGTH.

Figure 7-16 M&V Field Visits Associated with Paging Issues

7.7 Miswirings

Miswiring between the air-conditioning condenser and the meter could result in readings of zero load when the air-conditioning unit is in fact running. Comverge monitors load data from all sites on a daily basis. If a site is found to be running at zero load over the course of three days when the weather is hot, Comverge sends a technician to the site to verify that the unit is wired correctly. If a miswiring is found, the technician fixes the wiring problem so that data is reported accurately. In this case, all data collected from the site for the period when only zero load was reported is removed from settlement calculations, because the data is known to have been reported incorrectly. If, however, the unit is found to be wired correctly, the zero load may simply mean that the customer is not using air conditioning; in this case nothing is done, and the zero-load readings count toward settlement. In the 2010 control season there were no instances in which a technician found that the equipment at an M&V site had been wired incorrectly.

7.8 Absence of Zero Load Data

Occasionally, Comverge finds that certain meters never report zero load in any interval. In this case, Comverge calculates the minimum load reading reported by those meters over the course of the control season. In most cases, the minimum load is essentially zero (0.01-0.30 kW).
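This check can be sketched as follows. The Python below is for illustration only (it is not Comverge's production tooling); the 0.30 kW cutoff is taken from the negligible band cited above.

```python
def nonzero_minimum_sites(season_loads_kw, negligible_kw=0.30):
    """Return {device_id: seasonal minimum load} for meters whose minimum
    reported load over the control season exceeds the negligible band.

    season_loads_kw maps each device ID to its interval load readings (kW).
    Sites returned by this check would warrant a technician visit to verify
    wiring and equipment operation.
    """
    return {dev: min(loads)
            for dev, loads in season_loads_kw.items()
            if min(loads) > negligible_kw}
```

A meter whose readings include any interval at or near zero passes the check silently; only meters that sustain substantial load through every interval of the season are flagged.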
If substantial load is always found at the site, Comverge dispatches a technician to verify that the equipment is wired and functioning properly and that the air-conditioning unit is, in fact, running throughout the visit. It is Comverge's policy to leave this data as is unless a miswiring problem involving the Gateway or meter is found; in that case, the technician fixes the miswiring and the prior data is removed from load reduction calculations. Figure 7-17 below shows all sites where the minimum load over the course of the control season was nonzero. The load is negligible for every device included in the figure.

Figure 7-17 Nonzero Control Season Minimum Loads

7.9 Tonnage Discrepancies

Upon each visit to an M&V location, technicians verify that the make, model, and tonnage listed in the BIS system are consistent with their observations. If a technician determines that the information listed in BIS does not match the field observations, they note the new make, model, and/or tonnage information and update the BIS system. This information is also shared with the appropriate M&V staff to ensure that the unit is classified properly for settlement purposes. Additionally, if a site is found to have a tonnage of 1.0 or less, or of more than 7.5, Comverge sends a technician to that site to verify the tonnage of the device. Sites with tonnage above 7.5 are not counted in the program. In 2010, there were two M&V sites for which BIS specified an incorrect tonnage value. The discrepancies did not affect type classification. For one of the sites, BIS indicated a tonnage of 3.0 tons, while a field visit found a tonnage of 2.0 tons; as a result, this M&V site was moved from the 3.0-5.4 tonnage bin to the 1.0-2.9 tonnage bin.

7.10 Incorrect Classifications

Sites may be incorrectly classified in a number of ways.
A unit may be wrongly classified by tonnage or by whether or not it is an apartment. Comverge verifies proper tonnage upon each M&V site visit. Comverge also matches load usage against the tonnage information to look for sites running at a greater capacity than their tonnage would suggest is possible. If Comverge determines a classification is incorrect, the site is reclassified into the appropriate category and counts toward settlement in the category verified to be correct. RMP is notified if any changes are required to a site's classification. There were no misclassifications of the M&V sites for the 2010 control season.

7.11 Incorrect Meter Times

PowerCAMP generates an alert if a particular meter is found to have a time that differs from the true time by more than 1 minute. Since all meters are synchronized at the start of each curtailment season, incorrect time occurrences are rare. However, if one does occur, Comverge sends out a signal that re-syncs the meter to the appropriate time. There was one instance in the 2010 control season in which a meter was found to have a time more than 1 minute different from the true time. The equipment for this M&V site was installed on June 10, 2010, and the meter time was corrected on June 11, 2010.

8 Conclusions

In 2010, Comverge continued its efforts to improve the reliability and functionality of the Cool Keeper program. These efforts included daily paging tests, field inspections, and the addition of M&V sites for the Residential and Commercial segments. The results of these actions provide insight into how well the program is functioning and aid in determining how further improvements in the program's operation can be made. The paging tests give a daily assessment of how well we are able to communicate to sites in the field.
By conducting these tests without curtailing, we are able to gain real-time knowledge of system performance without impacting the enrolled customers. This year the Residential segment averaged over 96% reception across all events, and the Commercial segment averaged over 93%. The inspections allow us to collect data on whether sites installed in the Cool Keeper program are operating as expected and may alert us to equipment problems or installation issues. The results from the inspections are laid out in more detail in a separate inspection report. Finally, the new M&V sample sites, stratified according to population characteristics, provide greater precision in our load reduction estimates and ensure that our sample is a good representation of the population at large.

This year the Cool Keeper program successfully executed 8 curtailment events, including one emergency shed event. Three event hours were executed at temperatures of 97 degrees or above and counted toward the final impact estimate. For each event, the Cool Keeper system responded at the time Rocky Mountain Power called for the event to initiate. In 2010, the load reduction was estimated to be 1.02 kW for the Residential segment and 2.01 kW for the Commercial segment.

Improvements for next year can be made by conducting a 20% sample rotation to reflect a potentially changing population distribution. The efforts to monitor paging performance should continue to ensure the Cool Keeper program maintains high reception levels.

Pursuant to the Contract Between Rocky Mountain Power and Comverge Inc. for the Cool Keeper Load Control Program ("Contract"), the undersigned hereby acknowledge and accept the M&V impact results for the Cool Keeper program to be 1.02 kW for the Residential segment and 2.01 kW for the Commercial segment.
The undersigned further acknowledge and agree that these calculations were made according to the methodologies and procedures prescribed in the Contract and accept the results to be valid for the Contract year ended August 31, 2010.

Signature: ____________________
Name: Wendell Miyaji, Ph.D.
Title: Vice President, Energy Sciences
Company: Comverge, Inc.

Signature: ____________________
Name: Jason Berry
Title: Program Manager, Demand Side Management
Company: Rocky Mountain Power

Appendix

/*Declaring the library*/
libname Utah 'G:\sasfiles\Development\Settlement 2010\RMP';

/*Global variables*/
%let EventDate = '16JUL10'D;         /*Date of event to be analyzed - 'DDMMMYY'D format*/
%let EventEightDigit = '07/16/2010'; /*Date of event - 'MM/DD/YYYY' format*/
%let TempToday = '07/19/10';         /*Two days after event*/
%let Today = 071610;                 /*Event day in this format*/
%let hourstart = 14;                 /*Start of event hour*/
%let hourend = 18;
%let JuneOne = '06/25/10'; /*Periodically change this to a later date; its real purpose is to speed up the program, but make sure it is at least 20 days before the event date.*/
%let JuneFirst = '06/01/10';

data zz;
  b=&eventdate;
  c=intnx('day',&EventDate,1);
  d=&eventdate;
  format b mmddyy8.;
  format c mmddyy6.;
  format d mmddyy6.;
run;

%let RMP_Curtailment = 'B'; /*For RMP only: specify which group is being curtailed*/

/**These are RMP-only variables, for start/end times of intervals**/
%let EventDay=&EventDate;
%let BefStart='13:00'T;
%let BefEnd='13:55'T;
%let Hr1Int1Start='14:00'T;
%let Hr1Int1End='14:25'T;
%let Hr1Int2Start='14:30'T;
%let Hr1Int2End='14:55'T;
%let Hr2Int1Start='15:00'T;
%let Hr2Int1End='15:25'T;
%let Hr2Int2Start='15:30'T;
%let Hr2Int2End='15:55'T;
%let Hr3Int1Start='16:00'T;
%let Hr3Int1End='16:25'T;
%let Hr3Int2Start='16:30'T;
%let Hr3Int2End='16:55'T;
%let Hr4Int1Start='17:00'T;
%let Hr4Int1End='17:25'T;
%let Hr4Int2Start='17:30'T;
%let Hr4Int2End='17:55'T;

/**This never changes: it is used for calculated connected load (currently commented out) for SDGE**/
************Input Timeshift in Hours (Converts to Local Time)************;
%let Timeshift=6;

*****************************Get WeatherData******************************;
data Temp;
  Z='09'x;
  infile 'G:\sasfiles\Development\Settlement 2010\RMP\SLC-H-071610.TXT' firstobs=2 dlm=Z;
  input CODE $ OF $ DATE yymmdd10. HOUR TMP DPT HUM HID WCL WDR WSP WET CC SSM ForecastDate yymmdd10. ForecastHour time5. Units;
  Time=HOUR;
  format DATE date7.;
  if Date ^= &EventDate then delete;
  drop CODE OF DATE HOUR DPT--Units;
  if Time < 12 then delete;
  if Time > 23 then delete;
run;

data temp;
  set temp;
  hour=put(time,18.);
run;

*********************Clean up invalid meter status codes*******************;
data LoadForActiveStatusCodes;
  set Utah.rmp_validload_2010; /*Load from meters active during the control season, extracted from PowerCAMP*/
  if Status>0 and Status<98 then delete;
  if LoadKW=.
  then delete;
run;

***************************Get Customer Data****************************;
proc import datafile='G:\sasfiles\Development\Settlement 2010\RMP\UtahCustomerData071610_UpdatedGrp_Removed_Apt_as_Res.csv'
  out=customerdata dbms=csv replace;
run;

data customerdata;
  set customerdata;
  drop MeterSN;
run;

********************Merge Meter, Status, and Customer Data******************;
proc sort data=CustomerData; by DeviceID; run;
proc sort data=LoadForActiveStatusCodes; by DeviceID; run;

data ValidMeterNStatusNCustomerData;
  merge CustomerData LoadForActiveStatusCodes;
  by DeviceID;
  if LoadKW=. then delete;
  if Type="" then delete;
run;

data ValidMeterNStatusNCustomerData;
  set ValidMeterNStatusNCustomerData;
  Date_Local = DATEPART(DateTime);
  format Date_Local date7.;
  Time_Local = TIMEPART(DateTime);
  format Time_Local time5.;
run;

data validmeternstatusncustomerdata;
  set validmeternstatusncustomerdata;
  where Date_Local = &EventDate;
  if Grp = 31 then delete;
run;

********************Calculate Averages************************************;
/*Begin duty-cycle code*/
data ValidMeterNStatusNCustomerDataE;
  set ValidMeterNStatusNCustomerData;
  if Group = &RMP_Curtailment then delete;
run;

data BefTimes;
  set ValidMeterNStatusNCustomerDataE;
  if Date_Local ^= &EventDate then delete;
  if Time_Local >= &BefStart and Time_Local <= &BefEnd;
run;

proc means data=BefTimes noprint;
  types Type*DeviceID;
  class Type DeviceID;
  var LoadKW;
  output out=AvgBef mean=AvgBef;
run;

data Hr1Int1Times;
  set ValidMeterNStatusNCustomerDataE;
  if Date_Local ^= &EventDate then delete;
  if Time_Local >= &Hr1Int1Start and Time_Local <= &Hr1Int1End;
run;

proc means data=Hr1Int1Times noprint;
  types Type*DeviceID;
  class Type DeviceID;
  var LoadKW;
  output out=AvgHr1Int1 mean=AvgHr1Int1;
run;

data Hr1Int2Times;
  set ValidMeterNStatusNCustomerDataE;
  if Date_Local ^= &EventDate then delete;
  if Time_Local >= &Hr1Int2Start and Time_Local <= &Hr1Int2End;
run;

proc means data=Hr1Int2Times noprint;
  types Type*DeviceID;
  class Type DeviceID;
  var LoadKW;
  output out=AvgHr1Int2 mean=AvgHr1Int2;
run;

data Hr2Int1Times;
  set ValidMeterNStatusNCustomerDataE;
  if Date_Local ^= &EventDate then delete;
  if Time_Local >= &Hr2Int1Start and Time_Local <= &Hr2Int1End;
run;

proc means data=Hr2Int1Times noprint;
  types Type*DeviceID;
  class Type DeviceID;
  var LoadKW;
  output out=AvgHr2Int1 mean=AvgHr2Int1;
run;

data Hr2Int2Times;
  set ValidMeterNStatusNCustomerDataE;
  if Date_Local ^= &EventDate then delete;
  if Time_Local >= &Hr2Int2Start and Time_Local <= &Hr2Int2End;
run;

proc means data=Hr2Int2Times noprint;
  types Type*DeviceID;
  class Type DeviceID;
  var LoadKW;
  output out=AvgHr2Int2 mean=AvgHr2Int2;
run;

data Hr3Int1Times;
  set ValidMeterNStatusNCustomerDataE;
  if Date_Local ^= &EventDate then delete;
  if Time_Local >= &Hr3Int1Start and Time_Local <= &Hr3Int1End;
run;

proc means data=Hr3Int1Times noprint;
  types Type*DeviceID;
  class Type DeviceID;
  var LoadKW;
  output out=AvgHr3Int1 mean=AvgHr3Int1;
run;

data Hr3Int2Times;
  set ValidMeterNStatusNCustomerDataE;
  if Date_Local ^= &EventDate then delete;
  if Time_Local >= &Hr3Int2Start and Time_Local <= &Hr3Int2End;
run;

proc means data=Hr3Int2Times noprint;
  types Type*DeviceID;
  class Type DeviceID;
  var LoadKW;
  output out=AvgHr3Int2 mean=AvgHr3Int2;
run;

data Hr4Int1Times;
  set ValidMeterNStatusNCustomerDataE;
  if Date_Local ^= &EventDate then delete;
  if Time_Local >= &Hr4Int1Start and Time_Local <= &Hr4Int1End;
run;

proc means data=Hr4Int1Times noprint;
  types Type*DeviceID;
  class Type DeviceID;
  var LoadKW;
  output out=AvgHr4Int1 mean=AvgHr4Int1;
run;

data Hr4Int2Times;
  set ValidMeterNStatusNCustomerDataE;
  if Date_Local ^= &EventDate then delete;
  if Time_Local >= &Hr4Int2Start and Time_Local <= &Hr4Int2End;
run;

proc means
  data=Hr4Int2Times noprint;
  types Type*DeviceID;
  class Type DeviceID;
  var LoadKW;
  output out=AvgHr4Int2 mean=AvgHr4Int2;
run;

*****************************Merge Averages********************************;
data AvgBef; set AvgBef; drop _TYPE_ _FREQ_; run;
data AvgHr1Int1; set AvgHr1Int1; drop _TYPE_ _FREQ_; run;
data AvgHr1Int2; set AvgHr1Int2; drop _TYPE_ _FREQ_; run;
data AvgHr2Int1; set AvgHr2Int1; drop _TYPE_ _FREQ_; run;
data AvgHr2Int2; set AvgHr2Int2; drop _TYPE_ _FREQ_; run;
data AvgHr3Int1; set AvgHr3Int1; drop _TYPE_ _FREQ_; run;
data AvgHr3Int2; set AvgHr3Int2; drop _TYPE_ _FREQ_; run;
data AvgHr4Int1; set AvgHr4Int1; drop _TYPE_ _FREQ_; run;
data AvgHr4Int2; set AvgHr4Int2; drop _TYPE_ _FREQ_; run;

proc sort data=AvgBef; by DeviceID; run;
proc sort data=AvgHr1Int1; by DeviceID; run;
proc sort data=AvgHr1Int2; by DeviceID; run;
proc sort data=AvgHr2Int1; by DeviceID; run;
proc sort data=AvgHr2Int2; by DeviceID; run;
proc sort data=AvgHr3Int1; by DeviceID; run;
proc sort data=AvgHr3Int2; by DeviceID; run;
proc sort data=AvgHr4Int1; by DeviceID; run;
proc sort data=AvgHr4Int2; by DeviceID; run;

data Averages;
  merge AvgBef AvgHr1Int1 AvgHr1Int2 AvgHr2Int1 AvgHr2Int2 AvgHr3Int1 AvgHr3Int2 AvgHr4Int1 AvgHr4Int2;
  by DeviceID;
run;

*******************Calculates Connected Load****************************;
/* Already calculated and stored in this csv file */
proc import datafile='G:\sasfiles\Development\Settlement 2010\RMP\CalcConLoad.csv'
  out=CalcConLoad dbms=csv replace;
run;

proc sort data=CustomerData; by DeviceID; run;

data CalcConLoadNCustomerData;
  merge CalcConLoad CustomerData;
  by DeviceID;
  if Type="" then delete;
run;

/*Find devices with no data, so they are removed from averaging of load*/
proc sort data=CalcConLoadNCustomerData; by deviceID; run;

data temp2;
  merge Avgbef CalcConLoadNCustomerData;
  by DeviceID;
run;

data temp2;
  set temp2;
  if AvgBef = . then delete;
  drop AvgBef;
run;

proc means data=temp2 noprint;
  class Type;
  types Type;
  var ConLoad;
  output out=ConLoadAverages mean=ConLoadAverage;
run;

data ConLoadAverages;
  set ConLoadAverages;
  drop _TYPE_ _FREQ_;
run;

proc sort data=ConLoadAverages; by Type; run;
proc sort data=CalcConLoadNCustomerData; by Type; run;

data CalcConLoadFinal;
  merge ConLoadAverages CalcConLoadNCustomerData;
  by Type;
  keep DeviceID Type ConLoadAverage P95 P99 NomCap NomCap15 ConLoad;
  if ConLoad=. then ConLoad=ConLoadAverage;
run;

*********************Merge Averages and Connected Load**********************;
proc sort data=Averages; by DeviceID; run;
proc sort data=CalcConLoadFinal; by DeviceID; run;

data AllRes;
  merge Averages CalcConLoadFinal;
  by DeviceID;
  if Type='Res';
  AvgBefDivConLoad=AvgBef/ConLoad;
  if AvgBefDivConLoad<0.1 then RedHr1Int1=AvgHr1Int1-min(of AvgHr1Int1, 0.5*ConLoad);
  else RedHr1Int1=AvgHr1Int1-min(of AvgHr1Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr1Int2=AvgHr1Int2-min(of AvgHr1Int2, 0.5*ConLoad);
  else RedHr1Int2=AvgHr1Int2-min(of AvgHr1Int2, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr2Int1=AvgHr2Int1-min(of AvgHr2Int1, 0.5*ConLoad);
  else RedHr2Int1=AvgHr2Int1-min(of AvgHr2Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr2Int2=AvgHr2Int2-min(of AvgHr2Int2, 0.5*ConLoad);
  else RedHr2Int2=AvgHr2Int2-min(of AvgHr2Int2, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr3Int1=AvgHr3Int1-min(of AvgHr3Int1, 0.5*ConLoad);
  else RedHr3Int1=AvgHr3Int1-min(of AvgHr3Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr3Int2=AvgHr3Int2-min(of AvgHr3Int2, 0.5*ConLoad);
  else RedHr3Int2=AvgHr3Int2-min(of AvgHr3Int2, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr4Int1=AvgHr4Int1-min(of AvgHr4Int1, 0.5*ConLoad);
  else RedHr4Int1=AvgHr4Int1-min(of AvgHr4Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then
  RedHr4Int2=AvgHr4Int2-min(of AvgHr4Int2, 0.5*ConLoad);
  else RedHr4Int2=AvgHr4Int2-min(of AvgHr4Int2, 0.5*AvgBef);
  NDCHr1Int1=AvgHr1Int1;
  NDCHr1Int2=AvgHr1Int2;
  NDCHr2Int1=AvgHr2Int1;
  NDCHr2Int2=AvgHr2Int2;
  NDCHr3Int1=AvgHr3Int1;
  NDCHr3Int2=AvgHr3Int2;
  NDCHr4Int1=AvgHr4Int1;
  NDCHr4Int2=AvgHr4Int2;
  if AvgBefDivConLoad<0.1 then IDCHr1Int1=min(of AvgHr1Int1, 0.5*ConLoad);
  else IDCHr1Int1=min(of AvgHr1Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr1Int2=min(of AvgHr1Int2, 0.5*ConLoad);
  else IDCHr1Int2=min(of AvgHr1Int2, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr2Int1=min(of AvgHr2Int1, 0.5*ConLoad);
  else IDCHr2Int1=min(of AvgHr2Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr2Int2=min(of AvgHr2Int2, 0.5*ConLoad);
  else IDCHr2Int2=min(of AvgHr2Int2, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr3Int1=min(of AvgHr3Int1, 0.5*ConLoad);
  else IDCHr3Int1=min(of AvgHr3Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr3Int2=min(of AvgHr3Int2, 0.5*ConLoad);
  else IDCHr3Int2=min(of AvgHr3Int2, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr4Int1=min(of AvgHr4Int1, 0.5*ConLoad);
  else IDCHr4Int1=min(of AvgHr4Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr4Int2=min(of AvgHr4Int2, 0.5*ConLoad);
  else IDCHr4Int2=min(of AvgHr4Int2, 0.5*AvgBef);
  AvgRedHr1=mean(of RedHr1Int1 RedHr1Int2);
  AvgRedHr2=mean(of RedHr2Int1 RedHr2Int2);
  AvgRedHr3=mean(of RedHr3Int1 RedHr3Int2);
  AvgRedHr4=mean(of RedHr4Int1 RedHr4Int2);
  NDCHr1=mean(of NDCHr1Int1 NDCHr1Int2);
  NDCHr2=mean(of NDCHr2Int1 NDCHr2Int2);
  NDCHr3=mean(of NDCHr3Int1 NDCHr3Int2);
  NDCHr4=mean(of NDCHr4Int1 NDCHr4Int2);
  IDCHr1=mean(of IDCHr1Int1 IDCHr1Int2);
  IDCHr2=mean(of IDCHr2Int1 IDCHr2Int2);
  IDCHr3=mean(of IDCHr3Int1 IDCHr3Int2);
  IDCHr4=mean(of IDCHr4Int1 IDCHr4Int2);
run;

data AllCom;
  merge Averages CalcConLoadFinal;
  by DeviceID;
  if Type='Com';
  AvgBefDivConLoad=AvgBef/ConLoad;
  if AvgBefDivConLoad<0.1 then RedHr1Int1=AvgHr1Int1-min(of AvgHr1Int1, 0.5*ConLoad);
  else RedHr1Int1=AvgHr1Int1-min(of AvgHr1Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr1Int2=AvgHr1Int2-min(of AvgHr1Int2, 0.5*ConLoad);
  else RedHr1Int2=AvgHr1Int2-min(of AvgHr1Int2, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr2Int1=AvgHr2Int1-min(of AvgHr2Int1, 0.5*ConLoad);
  else RedHr2Int1=AvgHr2Int1-min(of AvgHr2Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr2Int2=AvgHr2Int2-min(of AvgHr2Int2, 0.5*ConLoad);
  else RedHr2Int2=AvgHr2Int2-min(of AvgHr2Int2, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr3Int1=AvgHr3Int1-min(of AvgHr3Int1, 0.5*ConLoad);
  else RedHr3Int1=AvgHr3Int1-min(of AvgHr3Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr3Int2=AvgHr3Int2-min(of AvgHr3Int2, 0.5*ConLoad);
  else RedHr3Int2=AvgHr3Int2-min(of AvgHr3Int2, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr4Int1=AvgHr4Int1-min(of AvgHr4Int1, 0.5*ConLoad);
  else RedHr4Int1=AvgHr4Int1-min(of AvgHr4Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr4Int2=AvgHr4Int2-min(of AvgHr4Int2, 0.5*ConLoad);
  else RedHr4Int2=AvgHr4Int2-min(of AvgHr4Int2, 0.5*AvgBef);
  NDCHr1Int1=AvgHr1Int1;
  NDCHr1Int2=AvgHr1Int2;
  NDCHr2Int1=AvgHr2Int1;
  NDCHr2Int2=AvgHr2Int2;
  NDCHr3Int1=AvgHr3Int1;
  NDCHr3Int2=AvgHr3Int2;
  NDCHr4Int1=AvgHr4Int1;
  NDCHr4Int2=AvgHr4Int2;
  if AvgBefDivConLoad<0.1 then IDCHr1Int1=min(of AvgHr1Int1, 0.5*ConLoad);
  else IDCHr1Int1=min(of AvgHr1Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr1Int2=min(of AvgHr1Int2, 0.5*ConLoad);
  else IDCHr1Int2=min(of AvgHr1Int2, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr2Int1=min(of AvgHr2Int1, 0.5*ConLoad);
  else IDCHr2Int1=min(of AvgHr2Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr2Int2=min(of AvgHr2Int2, 0.5*ConLoad);
  else IDCHr2Int2=min(of AvgHr2Int2, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr3Int1=min(of AvgHr3Int1, 0.5*ConLoad);
  else IDCHr3Int1=min(of AvgHr3Int1, 0.5*AvgBef);
  if
  AvgBefDivConLoad<0.1 then IDCHr3Int2=min(of AvgHr3Int2, 0.5*ConLoad);
  else IDCHr3Int2=min(of AvgHr3Int2, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr4Int1=min(of AvgHr4Int1, 0.5*ConLoad);
  else IDCHr4Int1=min(of AvgHr4Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr4Int2=min(of AvgHr4Int2, 0.5*ConLoad);
  else IDCHr4Int2=min(of AvgHr4Int2, 0.5*AvgBef);
  AvgRedHr1=mean(of RedHr1Int1 RedHr1Int2);
  AvgRedHr2=mean(of RedHr2Int1 RedHr2Int2);
  AvgRedHr3=mean(of RedHr3Int1 RedHr3Int2);
  AvgRedHr4=mean(of RedHr4Int1 RedHr4Int2);
  NDCHr1=mean(of NDCHr1Int1 NDCHr1Int2);
  NDCHr2=mean(of NDCHr2Int1 NDCHr2Int2);
  NDCHr3=mean(of NDCHr3Int1 NDCHr3Int2);
  NDCHr4=mean(of NDCHr4Int1 NDCHr4Int2);
  IDCHr1=mean(of IDCHr1Int1 IDCHr1Int2);
  IDCHr2=mean(of IDCHr2Int1 IDCHr2Int2);
  IDCHr3=mean(of IDCHr3Int1 IDCHr3Int2);
  IDCHr4=mean(of IDCHr4Int1 IDCHr4Int2);
run;

data AllApt;
  merge Averages CalcConLoadFinal;
  by DeviceID;
  if Type='Apt';
  AvgBefDivConLoad=AvgBef/ConLoad;
  if AvgBefDivConLoad<0.1 then RedHr1Int1=AvgHr1Int1-min(of AvgHr1Int1, 0.5*ConLoad);
  else RedHr1Int1=AvgHr1Int1-min(of AvgHr1Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr1Int2=AvgHr1Int2-min(of AvgHr1Int2, 0.5*ConLoad);
  else RedHr1Int2=AvgHr1Int2-min(of AvgHr1Int2, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr2Int1=AvgHr2Int1-min(of AvgHr2Int1, 0.5*ConLoad);
  else RedHr2Int1=AvgHr2Int1-min(of AvgHr2Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr2Int2=AvgHr2Int2-min(of AvgHr2Int2, 0.5*ConLoad);
  else RedHr2Int2=AvgHr2Int2-min(of AvgHr2Int2, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr3Int1=AvgHr3Int1-min(of AvgHr3Int1, 0.5*ConLoad);
  else RedHr3Int1=AvgHr3Int1-min(of AvgHr3Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr3Int2=AvgHr3Int2-min(of AvgHr3Int2, 0.5*ConLoad);
  else RedHr3Int2=AvgHr3Int2-min(of AvgHr3Int2, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr4Int1=AvgHr4Int1-min(of AvgHr4Int1, 0.5*ConLoad);
  else RedHr4Int1=AvgHr4Int1-min(of AvgHr4Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then RedHr4Int2=AvgHr4Int2-min(of AvgHr4Int2, 0.5*ConLoad);
  else RedHr4Int2=AvgHr4Int2-min(of AvgHr4Int2, 0.5*AvgBef);
  NDCHr1Int1=AvgHr1Int1;
  NDCHr1Int2=AvgHr1Int2;
  NDCHr2Int1=AvgHr2Int1;
  NDCHr2Int2=AvgHr2Int2;
  NDCHr3Int1=AvgHr3Int1;
  NDCHr3Int2=AvgHr3Int2;
  NDCHr4Int1=AvgHr4Int1;
  NDCHr4Int2=AvgHr4Int2;
  if AvgBefDivConLoad<0.1 then IDCHr1Int1=min(of AvgHr1Int1, 0.5*ConLoad);
  else IDCHr1Int1=min(of AvgHr1Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr1Int2=min(of AvgHr1Int2, 0.5*ConLoad);
  else IDCHr1Int2=min(of AvgHr1Int2, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr2Int1=min(of AvgHr2Int1, 0.5*ConLoad);
  else IDCHr2Int1=min(of AvgHr2Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr2Int2=min(of AvgHr2Int2, 0.5*ConLoad);
  else IDCHr2Int2=min(of AvgHr2Int2, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr3Int1=min(of AvgHr3Int1, 0.5*ConLoad);
  else IDCHr3Int1=min(of AvgHr3Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr3Int2=min(of AvgHr3Int2, 0.5*ConLoad);
  else IDCHr3Int2=min(of AvgHr3Int2, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr4Int1=min(of AvgHr4Int1, 0.5*ConLoad);
  else IDCHr4Int1=min(of AvgHr4Int1, 0.5*AvgBef);
  if AvgBefDivConLoad<0.1 then IDCHr4Int2=min(of AvgHr4Int2, 0.5*ConLoad);
  else IDCHr4Int2=min(of AvgHr4Int2, 0.5*AvgBef);
  AvgRedHr1=mean(of RedHr1Int1 RedHr1Int2);
  AvgRedHr2=mean(of RedHr2Int1 RedHr2Int2);
  AvgRedHr3=mean(of RedHr3Int1 RedHr3Int2);
  AvgRedHr4=mean(of RedHr4Int1 RedHr4Int2);
  NDCHr1=mean(of NDCHr1Int1 NDCHr1Int2);
  NDCHr2=mean(of NDCHr2Int1 NDCHr2Int2);
  NDCHr3=mean(of NDCHr3Int1 NDCHr3Int2);
  NDCHr4=mean(of NDCHr4Int1 NDCHr4Int2);
  IDCHr1=mean(of IDCHr1Int1 IDCHr1Int2);
  IDCHr2=mean(of IDCHr2Int1 IDCHr2Int2);
  IDCHr3=mean(of IDCHr3Int1 IDCHr3Int2);
  IDCHr4=mean(of IDCHr4Int1 IDCHr4Int2);
run;

proc means data=AllRes mean noprint;
  var AvgRedHr1 AvgRedHr2 AvgRedHr3
    AvgRedHr4;
  output out=Res_RMP_Reduction;
run;

proc means data=AllCom mean noprint;
  var AvgRedHr1 AvgRedHr2 AvgRedHr3 AvgRedHr4;
  output out=Com_RMP_Reduction;
run;

proc means data=AllApt mean noprint;
  var AvgRedHr1 AvgRedHr2 AvgRedHr3 AvgRedHr4;
  output out=Apt_RMP_Reduction;
run;

proc means data=AllRes mean noprint;
  var NDCHr1 NDCHr2 NDCHr3 NDCHr4 IDCHr1 IDCHr2 IDCHr3 IDCHr4;
run;

proc means data=AllCom mean noprint;
  var NDCHr1 NDCHr2 NDCHr3 NDCHr4 IDCHr1 IDCHr2 IDCHr3 IDCHr4;
run;

proc means data=AllApt mean noprint;
  var NDCHr1 NDCHr2 NDCHr3 NDCHr4 IDCHr1 IDCHr2 IDCHr3 IDCHr4;
run;

data Res_RMP_Reduction;
  set Res_RMP_Reduction;
  where _STAT_ = "MEAN";
  drop _TYPE_ _STAT_ _FREQ_;
  Type='Residential';
run;

data Com_RMP_Reduction;
  set Com_RMP_Reduction;
  where _STAT_ = "MEAN";
  drop _TYPE_ _STAT_ _FREQ_;
  Type='Commercial';
run;

data Apt_RMP_Reduction;
  set Apt_RMP_Reduction;
  where _STAT_ = "MEAN";
  drop _TYPE_ _STAT_ _FREQ_;
  Type='Apartment';
run;

data RMP_Combined_Reductions;
  merge Com_Rmp_Reduction Res_Rmp_Reduction Apt_Rmp_Reduction;
  by type;
run;

proc transpose data=RMP_Combined_Reductions out=RMP_Combined_Reductions_Trans;
run;
quit;

data RMP_Combined_Reductions_Trans;
  set RMP_Combined_Reductions_Trans;
  Hour = put(substr(_NAME_,9,1)+13, 18.);
  /*Apartment = COL1; Commercial = COL2; Residential = COL3; drop _NAME_ COL1 COL2 COL3;*/
  Commercial = COL1;
  Residential = COL2;
  drop _NAME_ COL1 COL2;
run;

data RMP_Combined_Reductions_WEIGHTED;
  merge RMP_Combined_Reductions_Trans temp;
  by hour;
  if Commercial = . then delete;
  /*Residential = Residential*.843 + Apartment*.1569;*/ /*Apply weights for the MDU and entire residential population*/
  drop time;
run;

/*Begin differencing code*/
data validmeternstatusncustomerdata;
  set validmeternstatusncustomerdata;
  where Date_Local = &EventDate;
  hour=hour(Time_Local);
run;

proc sort data=validmeternstatusncustomerdata; by Type hour group; run;

proc means data=validmeternstatusncustomerdata noprint;
  by Date_Local Type hour group;
  var loadkw;
  output out=ABMETHOD mean=;
run;

data abmethod;
  set abmethod;
  drop Date_Local _type_ _freq_;
run;

proc transpose data=abmethod out=abmethodtrans;
  by type hour;
  id group;
run;

data abmethodtrans;
  set abmethodtrans;
  ABMETHOD = A-B;
  drop _NAME_ A B;
  hour2=put(hour, 18.);
  drop hour;
run;

proc sort data=abmethodtrans; by hour2; run;

proc transpose data=abmethodtrans out=abmethodfinal;
  by hour2;
  id type;
run;

data abmethodfinal;
  set abmethodfinal;
  if _NAME_ = "C" then delete;
run;

/*Residential reduction is weighted by Apt/Res*/
data abmethodfinal_WEIGHTED;
  set abmethodfinal;
  hour=hour2;
  /*Res=Apt*.1569+Res*.843;*/
  drop hour2 _NAME_;
run;

/*Combining the results of both methods in one table*/
data RMP_FINAL;
  merge abmethodfinal_WEIGHTED RMP_Combined_Reductions_WEIGHTED;
  by hour;
run;
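For readers who do not work in SAS, the per-device, per-interval reduction rule that the listing above repeats for the Residential, Commercial, and Apartment segments can be paraphrased as follows. This Python restatement is for illustration only; the function name is invented here, while the 0.1 and 0.5 constants are taken directly from the SAS code.

```python
def interval_reduction_kw(avg_event_kw, avg_before_kw, connected_load_kw):
    """Estimated load reduction for one half-hour curtailment interval.

    When the pre-event average (the SAS AvgBef) is under 10% of connected
    load (ConLoad), the in-event duty-cycled load is capped at 50% of
    connected load; otherwise it is capped at 50% of the pre-event average.
    The reduction is the interval average minus that capped load.
    """
    if avg_before_kw / connected_load_kw < 0.1:
        cap = 0.5 * connected_load_kw
    else:
        cap = 0.5 * avg_before_kw
    idc = min(avg_event_kw, cap)   # the SAS IDCHr*Int* term
    return avg_event_kw - idc      # the SAS RedHr*Int* term
```

The SAS then averages the two half-hour interval values within each event hour (the AvgRedHr* variables) and averages across devices by segment to produce the per-unit kW estimates reported above.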