Model validation and testing: the methodological foundation of ASHRAE Standard 140.

ABSTRACT

Ideally, whole-building energy simulation programs model all aspects of a building that influence energy use and thermal and visual comfort for the occupants. An essential component of the development of such computer simulation models is a rigorous program of validation and testing. This paper describes a methodology to evaluate the accuracy of whole-building energy simulation programs. The methodology is also used to identify and diagnose differences in simulation predictions that may be caused by algorithmic differences, modeling limitations, coding errors, or input errors. The methodology has been adopted by ANSI/ASHRAE Standard 140, Method of Test for the Evaluation of Building Energy Analysis Computer Programs (ASHRAE 2001a, 2004). A summary of the method is included in the 2005 ASHRAE Handbook--Fundamentals (ASHRAE 2005). This paper describes the ASHRAE Standard 140 method of test and its methodological basis. Also discussed are possible future enhancements to ASHRAE Standard 140 and related research recommendations.

INTRODUCTION

Ideally, whole-building energy simulation programs model all aspects of energy use and thermal and visual comfort in buildings. Such programs may contain tens of thousands to hundreds of thousands of lines of code. An error in even one character of one line of code can lead to seriously flawed results. Therefore, an essential component of the development of such computer simulation models is a rigorous program of validation, testing, and diagnostics. The United States Department of Energy's National Renewable Energy Laboratory (NREL) has developed an overall methodology to evaluate the accuracy of whole-building energy simulation programs and to identify and diagnose differences in simulation predictions that may be caused by algorithmic differences, modeling limitations, coding errors, or input errors. This method has been adopted by ANSI/ASHRAE Standard 140, Method of Test for the Evaluation of Building Energy Analysis Computer Programs (ASHRAE 2001a, 2004). The method has recently been summarized for inclusion in the 2005 ASHRAE Handbook -- Fundamentals (ASHRAE 2005).

ANSI/ASHRAE STANDARD 140

ASHRAE Standard 140 (ASHRAE 2001a, 2004) was the first codified method of test for building energy software in the world and has recently been referenced by ANSI/ASHRAE/IESNA Standard 90.1 (ASHRAE 2001b, 2006) for approval of software used to show performance path compliance. Standard 140 is structured to allow for the addition of all elements of a complete validation approach as they become available. This structure corresponds to the validation methodology introduced above, with subdivisions creating a matrix of six areas for testing (a minimal sketch of this matrix as a data structure follows the list):

1. Comparative tests -- building envelope

2. Comparative tests -- mechanical equipment and on-site energy generation equipment

3. Analytical verification -- building envelope

4. Analytical verification -- mechanical equipment and onsite energy generation equipment

5. Empirical validation -- building envelope

6. Empirical validation -- mechanical equipment and on-site energy generation equipment
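
For concreteness, the six areas form a matrix of validation technique crossed with building subsystem. Below is a minimal sketch, in Python, of that matrix as a simple data structure; the names and output format are illustrative assumptions, and the populated cells reflect the discussion that follows (the current tests fall under categories 1 and 4).

```python
# Hypothetical sketch of the Standard 140 test matrix: three validation
# techniques crossed with two building subsystems. Names and labels are
# illustrative; the populated cells reflect Standard 140-2004 as
# described in this paper (categories 1 and 4).
from itertools import product

TECHNIQUES = ("comparative test", "analytical verification", "empirical validation")
SUBSYSTEMS = ("building envelope",
              "mechanical equipment and on-site energy generation equipment")

POPULATED = {
    ("comparative test", "building envelope"),                          # category 1
    ("analytical verification",
     "mechanical equipment and on-site energy generation equipment"),   # category 4
}

for cell in product(TECHNIQUES, SUBSYSTEMS):
    status = "in Standard 140-2004" if cell in POPULATED else "open for future tests"
    print(f"{cell[0]} -- {cell[1]}: {status}")
```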

This is a highly abbreviated way of representing the overall parameter space in which building energy simulation programs operate, and each cell in the matrix represents a very large region in that space. The current set of tests is classified under categories 1 and 4. These tests are based on procedures developed by NREL and field-tested by the International Energy Agency (IEA) over three IEA research tasks (Judkoff and Neymark 1995a; Neymark and Judkoff 2002). The category 1 procedures (ASHRAE 2001a) were refined and converted to standards language and format by the ASHRAE Standards Project Committee (SPC) 140. The category 4 procedures were refined and converted to standards language and format by ASHRAE Standing Standard Project Committee (SSPC) 140 and are included, along with the category 1 procedures, in the most recent revision of Standard 140 (ASHRAE 2004). Continuing oversight of Standard 140 by SSPC 140 allows the validation procedures within Standard 140 to evolve as the state of the art of whole-building energy simulation evolves. Additional tests have been--or are being--developed under ASHRAE research projects (Spitler et al. 2001; Yuill and Haberl 2002) and under a joint IEA Solar Heating and Cooling Programme and Energy Conservation in Buildings and Community Systems Task 34/Annex 43 (Judkoff and Neymark 2004) that are intended to fill in other categories of the above validation matrix.

METHODOLOGICAL BASIS FOR ANSI/ASHRAE STANDARD 140

There are only a few ways to evaluate the accuracy of a whole-building energy simulation program (Judkoff et al. 1983b):

* Empirical Validation -- in which calculated results from a program, subroutine, algorithm, or software object are compared to monitored data from a real building, test cell, or laboratory experiment.

* Analytical Verification -- in which outputs from a program, subroutine, algorithm, or software object are compared to results from a known analytical solution or a generally accepted numerical method for isolated heat transfer under very simple, highly constrained boundary conditions.

* Comparative Testing -- in which a program is compared to itself or to other programs.

Table 1 compares these techniques (Judkoff 1988). In this table, the term model means the representation of reality for a given physical behavior; for example, heat transfer may be simulated with one-, two-, or three-dimensional thermal conduction models. The term solution process encompasses the mathematics and computer coding used to solve a given model. The solution process for a model can be perfect while the model remains inappropriate for a given physical situation, such as using a one-dimensional conduction model where two-dimensional conduction dominates. The term truth standard represents the standard of accuracy for predicting real behavior. An analytical solution is a mathematical truth standard, but it tests only the solution process for a model, not the appropriateness of the model. An approximate truth standard from an experiment tests both the solution process and the appropriateness of the model, within experimental uncertainty. The ultimate (or "absolute") validation truth standard would be comparison of simulation results with a perfectly performed empirical experiment, with all simulation inputs perfectly defined.
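
To make the distinction concrete, the sketch below compares a reported result against an exact analytical solution for steady-state one-dimensional conduction through a layered wall, a minimal example of a mathematical truth standard that tests only the solution process. The wall construction, the reported value, and the pass/fail tolerance are hypothetical assumptions of this sketch, not taken from any actual test suite.

```python
# A minimal analytical verification sketch under assumed, highly
# constrained boundary conditions: steady-state one-dimensional
# conduction through a layered wall. The "simulated" value stands in
# for the output of a whole-building program under test.

def analytical_heat_flux(layers, t_in, t_out):
    """Exact steady-state flux (W/m2) through layers of (thickness_m, k_W_mK)."""
    resistance = sum(thickness / k for thickness, k in layers)
    return (t_in - t_out) / resistance

# Hypothetical wall: 100 mm concrete (k=1.8) over 50 mm insulation (k=0.04).
layers = [(0.100, 1.8), (0.050, 0.04)]
exact = analytical_heat_flux(layers, t_in=20.0, t_out=0.0)

simulated = 15.4  # placeholder value reported by the program under test
relative_error = abs(simulated - exact) / abs(exact)
print(f"exact={exact:.2f} W/m2, simulated={simulated:.2f}, error={relative_error:.1%}")
assert relative_error < 0.01, "solution process disagrees with the truth standard"
```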

Establishing an absolute truth standard for evaluating a program's ability to analyze physical behavior requires empirical validation, but this is only possible within the range of measurement uncertainty, including that related to instruments, spatial and temporal discretization, and the overall experimental design. Test cells and buildings are large, relatively complex experimental objects. The exact design details, material properties, and construction in the field cannot be perfectly known, so there is some uncertainty about the simulation model inputs that accurately represent the experimental object. Meticulous care is required to describe the experimental apparatus as clearly as possible to modelers to minimize this uncertainty. This includes experimental determination of as many material properties and other simulation model inputs as possible, including overall building parameters such as overall steady-state heat transmission coefficient, infiltration rate, and thermal capacitance. Also required are detailed meteorological measurements. For example, many experiments measure global horizontal solar radiation, but very few experiments measure the splits between direct, diffuse, and ground-reflected radiation, all of which are inputs to many whole-building energy simulation programs.
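
As an illustration of the last point, modelers often must estimate the direct/diffuse split from measured global horizontal radiation. The sketch below uses the Erbs et al. diffuse-fraction correlation for this purpose; the choice of correlation and all input values are assumptions of this sketch and are not prescribed by the experiments or standards discussed here.

```python
# Hypothetical sketch: estimating direct-normal and diffuse-horizontal
# radiation from global horizontal irradiance (GHI) via the Erbs et al.
# diffuse-fraction correlation.
import math

SOLAR_CONSTANT = 1367.0  # W/m2

def erbs_split(ghi, zenith_deg, day_of_year):
    """Return (direct_normal, diffuse_horizontal) in W/m2, estimated from GHI."""
    cos_z = math.cos(math.radians(zenith_deg))
    if ghi <= 0.0 or cos_z <= 0.0:
        return 0.0, 0.0
    # Extraterrestrial horizontal irradiance for this day of the year.
    i0h = SOLAR_CONSTANT * (1.0 + 0.033 * math.cos(2.0 * math.pi * day_of_year / 365.0)) * cos_z
    kt = min(ghi / i0h, 1.0)  # clearness index
    if kt <= 0.22:
        fd = 1.0 - 0.09 * kt
    elif kt <= 0.80:
        fd = (0.9511 - 0.1604 * kt + 4.388 * kt**2
              - 16.638 * kt**3 + 12.336 * kt**4)
    else:
        fd = 0.165
    diffuse = fd * ghi
    direct_normal = (ghi - diffuse) / cos_z
    return direct_normal, diffuse

# Illustrative conditions: 600 W/m2 GHI, 45 degree zenith, near summer solstice.
print(erbs_split(ghi=600.0, zenith_deg=45.0, day_of_year=172))
```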

The NREL methodology divides empirical validation into different levels because many past validation studies produced inconclusive results. The levels of validation depend on the degree of control over possible sources of error in a simulation. These error sources consist of seven types, divided into two groups.

External Error Types

* Differences between the actual microclimate that affects the building versus weather input used by the program

* Differences between actual schedules, control strategies, effects of occupant behavior, and other effects from the real building versus those assumed by the program user

* User error in deriving building input files

* Differences between the actual physical properties of the building (including HVAC systems) versus those input by the user

Internal Error Types

* Differences between the actual thermal transfer mechanisms in the real building and its HVAC systems versus the simplified model of those processes in the simulation (all models, no matter how detailed, are simplifications of reality)

* Errors or inaccuracies in the mathematical solution of the models

* Coding errors

The simplest level of empirical validation compares a building's actual long-term energy use to that calculated by a computer program, with no attempt to eliminate sources of discrepancy. Because this is similar to how a simulation tool is used in practice, it is favored by many in the building industry. However, it is difficult to interpret the results because all possible error sources are acting simultaneously. Even if there is good agreement between measured and calculated performance, possible offsetting errors prevent a definitive conclusion about the model's accuracy. More informative levels of validation involve controlling or eliminating various combinations of error types and increasing the density of output-to-data comparisons (e.g., comparing temperature and energy results at various time scales, ranging from subhourly to annual). At the most detailed level, all known sources of error are controlled to identify and quantify unknown error sources and to reveal causal relationships associated with the error sources.
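
At each level and time scale, such comparisons ultimately reduce to summary statistics of the difference between measured and calculated outputs. The sketch below shows two commonly used statistics, normalized mean bias error and the coefficient of variation of the root-mean-square error (similar in spirit to metrics used in ASHRAE Guideline 14); the data values are illustrative placeholders, and the choice of metrics is an assumption of this sketch rather than part of the methodology.

```python
# Hypothetical sketch of two summary statistics for output-to-data
# comparisons at a given time scale; values are placeholders.
from statistics import mean

def nmbe(measured, simulated):
    """Mean of (simulated - measured), as a fraction of the measured mean."""
    m_bar = mean(measured)
    return mean([s - m for m, s in zip(measured, simulated)]) / m_bar

def cv_rmse(measured, simulated):
    """Root-mean-square error as a fraction of the measured mean."""
    m_bar = mean(measured)
    mse = mean([(s - m) ** 2 for m, s in zip(measured, simulated)])
    return mse ** 0.5 / m_bar

measured_kwh = [12.0, 15.5, 14.2, 18.9, 16.4]   # e.g., hourly measurements
simulated_kwh = [11.4, 16.1, 13.8, 19.7, 15.9]
print(f"NMBE = {nmbe(measured_kwh, simulated_kwh):+.1%}")
print(f"CV(RMSE) = {cv_rmse(measured_kwh, simulated_kwh):.1%}")
```

Note that a small bias error can result from offsetting errors, which is exactly why the methodology controls error sources individually rather than relying on aggregate agreement alone.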

This principle also applies to intermodel comparative testing and analytical verification. The more realistic the test case, the more difficult it is to establish causality and diagnose problems; the simpler and more controlled the test case, the easier it is to pinpoint sources of error or inaccuracy. Methodically building up to realistic cases is useful for testing interactions between algorithms modeling linked mechanisms.

A comparison between measured and calculated performance represents a small region in an immense N-dimensional parameter space. Investigators are constrained to exploring relatively few regions in this space yet would like to be assured that the results are not coincidental (e.g., not a result of offsetting errors) and do represent the validity of the simulation elsewhere in the parameter space. Analytical and comparative techniques minimize the uncertainty of the extrapolations around the limited number of sampled empirical domains. Table 2 classifies these extrapolations. Use of the term vice versa in Table 2 is intended to mean that the extrapolation can go both ways (e.g., from short-term to long-term data, and from long-term to short-term data). This does not mean that such extrapolations are correct, but only that researchers and practitioners have either explicitly or implicitly made such inferences in the past.

Figure 1 shows one process to combine the analytical, empirical, and comparative techniques. These three techniques may also be used together in other ways; for example, intermodel comparisons may be done before an empirical validation exercise to better define the experiment and to help estimate experimental uncertainty by propagating all known error sources through one or more whole-building energy simulation programs (Hunn et al. 1982; Lomas et al. 1994).

For the path shown in Figure 1, the first step is running the code against analytical verification test cases to check its mathematical solution. Discrepancies must be corrected before proceeding further.

Second, the code is run against high-quality empirical validation data and errors are corrected. Diagnosing error sources can be quite difficult and is an area of research in itself. Comparative techniques can be used to create diagnostic procedures (Achermann and Zweifel 2003; Judkoff 1988; Judkoff and Neymark 1995a, 1995b; Judkoff et al. 1983a; Judkoff et al. 1980; Morck 1986; Neymark and Judkoff 2002, 2004; Purdy and Beausoleil-Morrison 2003; Spitler et al. 2001; Yuill and Haberl 2002) and to better define empirical experiments.

The third step is to check agreement of several different programs with different thermal solution and modeling approaches (that have passed through steps 1 and 2) in a variety of representative cases. This uses the comparative technique as an extrapolation tool. Deviations in the program predictions indicate areas for further investigation.

When programs successfully complete these three stages, they are considered validated for cases where acceptable agreement was achieved (i.e., for the range of building, climate, and mechanical system types represented by the test cases). Once several detailed simulation programs have satisfactorily completed the procedure, other programs and simplified design tools can be tested against them. A validated code does not necessarily represent truth. It does represent a set of algorithms that have been shown, through a repeatable procedure, to perform according to the current state of the art.
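
The three-stage path of Figure 1 can be summarized in code form. The sketch below is entirely hypothetical (case names, results, and tolerances are placeholders) and simply mirrors the sequence described above: agreement with exact solutions within a tight tolerance, agreement with measured data within experimental uncertainty, and a comparative range check against reference programs.

```python
# Hypothetical sketch of the Figure 1 sequence. Program results are plain
# mappings from test-case name to an annual result; all names, values,
# and tolerances are placeholders.

def check_step(results, benchmarks, tolerance, label):
    """Raise if any result deviates from its benchmark by more than tolerance."""
    failures = [case for case, value in results.items()
                if abs(value - benchmarks[case]) > tolerance * abs(benchmarks[case])]
    if failures:
        raise RuntimeError(f"{label}: correct errors in {failures} before proceeding")

program = {"analytic_1": 100.4, "empirical_1": 57.0}  # program under test (kWh)
analytical_truth = {"analytic_1": 100.0}              # exact solutions (step 1)
empirical_data = {"empirical_1": 55.0}                # measured data (step 2)

# Step 1: solution process vs. mathematical truth standard (tight tolerance).
check_step({"analytic_1": program["analytic_1"]}, analytical_truth, 0.01, "step 1")
# Step 2: model appropriateness vs. experiment (tolerance ~ experimental uncertainty).
check_step({"empirical_1": program["empirical_1"]}, empirical_data, 0.05, "step 2")
# Step 3: comparative check against programs that passed steps 1 and 2.
references = [42.0, 43.5, 41.8]  # reference-program results for a comparative case
candidate = 47.0                 # this program's result for the same case
if not (min(references) <= candidate <= max(references)):
    print("comparative case: prediction outside reference range; investigate further")
```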

The NREL methodology for validating building energy simulation programs has been generally accepted by the International Energy Agency (Irving 1988), ASHRAE (ASHRAE 2001a, 2001b, 2004, 2006), and elsewhere with refinements suggested by other researchers (Bland 1992; Bloomfield 1988, 1999; Guyon and Palomo 1999; Irving 1988; Lomas 1991; Lomas and Bowman 1987; Lomas and Eppel 1992). Additionally, the Commission of European Communities has conducted considerable work under the PASSYS program (Jensen 1989; Jensen and van de Perre 1991).

EXAMPLES OF APPLYING THE NREL VALIDATION/TESTING METHODOLOGY

This section presents an example of applying a portion of the above methodology, showing how comparative tests can be used to extrapolate beyond analytical verification test cases. The process is illustrated by testing the ability of simulation programs to model a simple split-system air conditioner, using the steady-state analytical verification tests of HVAC BESTEST, Volume 1 (Neymark and Judkoff 2002), as the basis for the further, more realistic comparative tests of HVAC BESTEST, Volume 2 (Neymark and Judkoff 2004). Volume 1's analytical verification test cases were initially developed and distributed to IEA field-trial participants before the analytical solutions were derived. The initial results are illustrated in Figure 2; these are "blind" results, meaning that the field-trial participants had no prior knowledge of each other's results. Development of steady-state analytical solutions after this initial round of simulations gave the field-trial participants a mathematical truth standard against which to compare their results. Using this truth standard, the participants discovered and corrected a number of errors in their simulation programs. Such improvements to simulation programs or simulation inputs must have a mathematical and physical basis and must be applied consistently across tests, and all improvements were required to be documented in modeler reports. Arbitrary modification of a simulation program's inputs or internal code merely to match a given set of results more closely is not allowed. The participants' corrections resulted in improved agreement among the simulation programs, as shown in Figure 3.

[FIGURE 1 OMITTED]

[FIGURE 2 OMITTED]

[FIGURE 3 OMITTED]

The HVAC BESTEST, Volume 2, comparative test cases, also modeling a split-system air conditioner, were then developed. These cases include more realism (e.g., hourly varying internal gains and weather data, outside air mixing) such that obtaining analytical solutions outside the environment of a whole-building energy simulation program was not possible. Because these cases extend the HVAC BESTEST, Volume 1, analytical verification test cases, results from programs that had been run and improved in the Volume 1 test cases and then further tested with Volume 2 could be used as approximate benchmarks for comparison with programs that did not previously run the Volume 1 test cases. Figure 4 illustrates "blind" initial results for the Volume 2 comparative tests, while Figure 5 illustrates final Volume 2 comparative test results after technically justifiable corrections to the simulation programs were applied according to the rules stated above. These figures show that because good agreement among simulations was achieved in the Volume 1 tests, and because the Volume 2 tests are an extension of the Volume 1 tests, some of the level of agreement among the Volume 1 analytical verification simulations propagated through to the Volume 2 comparative test results. The Volume 2 comparative tests also produced some further improvement to the simulations.
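
One simple way to quantify the improved agreement shown in Figures 3 and 5 is to compute the spread of the programs' predictions relative to their mean, before and after corrections. The sketch below is purely illustrative; the numbers are placeholders and are not taken from the HVAC BESTEST results.

```python
# Hypothetical sketch: inter-program agreement measured as the spread of
# N programs' predictions relative to their mean, before and after
# technically justified corrections. Values are placeholders.
from statistics import mean

def spread(results):
    """(max - min) / mean, a simple disagreement measure."""
    return (max(results) - min(results)) / mean(results)

blind = [2.1, 2.9, 2.4, 3.3, 2.6]       # initial "blind" predictions
final = [2.55, 2.62, 2.58, 2.65, 2.60]  # after justified corrections
print(f"blind spread: {spread(blind):.0%}")
print(f"final spread: {spread(final):.0%}")
```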

SUMMARY OF PREVIOUS AND CURRENT TESTING AND VALIDATION WORK

A summary of work in the areas of analytical verification, empirical validation, and comparative testing is available that cites and briefly summarizes approximately 100 articles and research reports covering work published from 1980 through mid-2001 (ASHRAE 2005; Neymark and Judkoff 2002). The following test suites have been completed and are in various stages of adaptation for Standard 140:

* Building energy simulation test and diagnostic method for heating, ventilating, and air-conditioning equipment models (HVAC BESTEST): Fuel-fired furnace test suite (Purdy and Beausoleil-Morrison 2003)

* Home energy rating system building energy simulation test (Judkoff and Neymark 1995b)

* International Energy Agency building energy simulation test and diagnostic method for heating, ventilating, and air-conditioning equipment models (HVAC BESTEST), Volume 2: Cases E300-E545 (Neymark and Judkoff 2004)

There are also a number of simulation test suites in various stages of completion that could eventually be included in Standard 140. These include, among others:

* ASHRAE RP-1052, Development of an analytical verification test suite for whole building energy simulation programs--Building fabric (Spitler et al. 2001)

* ASHRAE RP-865, Development of accuracy tests for mechanical system simulation (Yuill and Haberl 2002)

* RADTEST radiant heating and cooling test cases (Achermann and Zweifel 2003)

* Proposed IEA BESTEST ground-coupled cases (Deru et al. 2002)

* ETNA BESTEST empirical validation test specification (Neymark et al. 2004)

* Daylighting--HVAC interaction tests for the empirical validation of building energy analysis tools (Maxwell et al. 2003)

* Economizer control tests for the empirical validation of building energy analysis tools (Maxwell et al. 2004)

* A number of test suites that are being developed by the National Renewable Energy Laboratory and by researchers in International Energy Agency member nations under the auspices of IEA Solar Heating and Cooling Task 34 and IEA Energy Conservation in Buildings and Community Systems Annex 43 (IEA SHC 34/ECBCS 43) (Judkoff and Neymark 2004).

CONCLUSION AND RECOMMENDATIONS

New material has been added to the 2005 ASHRAE Handbook--Fundamentals (ASHRAE 2005) that describes a method to evaluate the accuracy of whole-building energy simulation programs and to identify and diagnose differences in simulation predictions that may be caused by algorithmic differences, modeling limitations, coding errors, or input errors. This methodology, which was developed by the NREL beginning over 20 years ago (Judkoff et al. 1983b), has been adopted by ANSI/ASHRAE Standard 140, Method of Test for the Evaluation of Building Energy Analysis Computer Programs (ASHRAE 2001a, 2004). The current set of test cases included in Standard 140 occupy only a small area of the parameter space represented by the cells in the methodological test matrix described in this paper. Additional suites of test cases have been developed that can be considered for integration into Standard 140. We recommend more work to integrate such existing test suites into Standard 140.

Some high-priority areas for additional test cases are:

* Comparative tests -- Mechanical equipment (additional tests beyond those in HVAC BESTEST unitary cooling and heating equipment cases)

* Analytical verification -- Mechanical equipment (additional tests beyond those in RP-865 and HVAC BESTEST unitary cooling and heating equipment test cases)

* Empirical validation -- Mechanical equipment (additional tests beyond those in IEA SHC Task 22 and IEA SHC Task 34/ECBCS Annex 43)

More work to develop such test suites is recommended.

Standard 140 and/or the reports that comprise the test suites contained therein (commonly known as the NREL BESTEST reports) are being referenced and used by a growing number of code promulgation authorities throughout the world. ASHRAE Standard 90.1 (ASHRAE 2001b, 2006) requires that software used for demonstrating performance compliance with Standard 90.1 be tested using ASHRAE Standard 140. Standard 90.1 is ASHRAE's consensus energy code for commercial buildings and for non-low-rise residential buildings. IEA BESTEST is also being used for simulation certification tests in The Netherlands (ISSO 2003), Australia (SEDA 2003; Pears 1998), New Zealand (Donn 2004), and Portugal (Maldonado 2005). As part of their building energy performance assessments under the European Community's Energy Performance Directive (European Union 2002), Austria, Denmark, Greece, and The Netherlands are using a new software tool that includes algorithms that have been checked with BESTEST (Balaras et al. 2005). Also, CEN has utilized BESTEST to check the reference cooling load calculation general criteria of prEN ISO 13791 (CEN 2004a) and the simplified methods of prEN ISO 13792 (CEN 2004b; Millet 2003). In the United States, the National Association of State Energy Officials (NASEO) Residential Energy Services Network (RESNET) has adopted Home Energy Rating System (HERS) BESTEST (Judkoff and Neymark 1995b) as the basis for certifying software to be used for home energy rating systems under the NASEO/RESNET national accreditation standard (NASEO/RESNET 2002). This is indicative of the importance of validation methods for improving the state of the art in building energy software and for helping to certify such software for use with home energy rating standards, building energy codes, building energy tax credits, and other building energy incentive programs. The importance of validation and test methods is further supported by a recent report comparing 20 whole-building energy simulation tools (Crawley et al. 2005). The report indicates that 18 of the 20 tools reviewed have been tested with at least one of the two procedures currently included in ASHRAE Standard 140-2004, 7 of the tools have been tested with both procedures, and 9 of the tools have been tested with three additional procedures that are currently in various stages of adaptation for Standard 140.

[FIGURE 4 OMITTED]

[FIGURE 5 OMITTED]

Simulation tools make it possible to substantially reduce the energy intensity of buildings through better design. However, widespread use of building energy simulation software will not occur unless the design and engineering communities have confidence in these programs. The work described here represents a good start in the effort to develop rigorously validated building energy simulation tools.

ACKNOWLEDGMENTS

We appreciate the support and guidance of D. Crawley, DOE Program Manager for Simulation Tools and representative to the IEA Solar Heating and Cooling Programme Executive Committee. Also appreciated is the support and guidance of R. Karney, DOE Program Manager and representative to the IEA Energy Conservation in Buildings and Community Systems Executive Committee.

Adaptation of the pre-normative IEA research into an ASHRAE Standard Method of Test was assisted by the help and guidance of the members of the SSPC 140 standing committee and the preceding SPC 140 committee: C. Barnaby, I. Beausoleil-Morrison, D. Crawley, P. Fairey, K. Fraser, J. Haberl, D. Knebel, B. Maeda, S. Rees, R. Sonderegger, J. Spitler, G. Walton, B. Wilcox, F. Winkelmann, M. Witte, and G. Yuill. We also gratefully acknowledge the significant contributions to this work from the members of IEA SHC Tasks 8, 12, and 22, and IEA ECBCS Annex 21.

REFERENCES

Achermann, M., and G. Zweifel. 2003. RADTEST--Radiant Heating and Cooling Test Cases. Horw-Lucerne, Switzerland: University of Applied Sciences Central Switzerland, Lucerne School of Architecture. http://www.iea-shc.org/task22/reports/RADTEST_final.pdf.

ASHRAE. 2001a. ANSI/ASHRAE Standard 140-2001, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs. Atlanta: American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc.

ASHRAE. 2001b. ANSI/ASHRAE/IESNA Standard 90.1-2001, Energy Standard for Buildings Except Low-Rise Residential Buildings, Addendum p. Atlanta: American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc.

ASHRAE. 2004. ANSI/ASHRAE Standard 140-2004, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs. Atlanta: American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc.

ASHRAE. 2005. 2005 ASHRAE Handbook--Fundamentals. Chapter 32, Energy Estimating and Modeling Methods, section "Model Validation and Testing." Atlanta: American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc.

ASHRAE. 2006. ANSI/ASHRAE/IESNA Standard 90.1-2004, Energy Standard for Buildings Except Low-Rise Residential Buildings, Addendum l. Atlanta: American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc. (1)

Balaras, C.A., B. Poel, and G. van Crutchen. 2005. Software for energy performance assessment of existing dwellings. IBPSA News 15(1):24-31. International Building Performance Simulation Association, April.

Bland, B. 1992. Conduction in dynamic thermal models: Analytical tests for validation. BSER & T 13(4):197-208.

Bloomfield, D. 1988. An Investigation into Analytical and Empirical Validation Techniques for Dynamic Thermal Models of Buildings. Vol. 1, Executive Summary. SERC/BRE final report. Garston, Watford, UK: BRE.

Bloomfield, D. 1999. An overview of validation methods for energy and environmental software. ASHRAE Transactions 105(2):685-93.

CEN. 2004a. prEN ISO 13791, Thermal performance of buildings--Calculation of internal temperatures of a room in summer without mechanical cooling--General criteria and validation procedures. Final draft. Comite Europeen de la Normalisation, Brussels.

CEN. 2004b. prEN ISO 13792, Thermal performance of buildings--Calculation of internal temperatures of a room in summer without mechanical cooling--Simplified methods. Final draft. Comite Europeen de la Normalisation, Brussels.

Crawley, D., J. Hand, M. Kummert, and B. Griffith. 2005. Contrasting the Capabilities of Building Energy Performance Simulation Programs. Washington, DC: US Department of Energy; Glasgow, Scotland, UK: University of Strathclyde; Madison, WI: University of Wisconsin. http://gundog.lbl.gov/dirpubs/2005/05_compare.pdf.

Deru, M., R. Judkoff, and J. Neymark. 2002. Proposed IEA BESTEST Ground-coupled Cases, draft. National Renewable Energy Laboratory, Golden, CO. July.

Donn, M. 2004. May 2004 e-mail communication with R. Judkoff and J. Neymark. Victoria University, Victoria, New Zealand.

European Union. 2002. On the energy performance of buildings. Directive 2002/91/EC of the European Parliament and of the Council. Official Journal of the European Communities, December.

Guyon, G., and E. Palomo. 1999. Validation of two French building energy analysis programs, Part 1: Analytical verification. ASHRAE Transactions 105(2):694-708.

Hunn, B.D., W.V. Turk, and W.O. Wray. 1982. Validation of Passive Solar Analysis/Design Tools Using Class A Performance Evaluation Data. LA-UR-82-1732. Los Alamos, NM: LANL.

Irving, A. 1988. Validation of dynamic thermal models. Energy and Buildings, January. Lausanne, Switzerland: Elsevier Sequoia, S.A.

ISSO. 2003. Energie Diagnose Referentie Versie 3.0 (in Dutch). ISSO Publicatie 54. Rotterdam, The Netherlands: Instituut voor Studie en Stimulering van Onderzoek op het Gebied van Gebouwinstallaties.

Jensen, S., ed. 1989. The PASSYS Project Phase 1--Subgroup model validation and development, Final report--1986-1989. Commission of the European Communities, Directorate General XII.

Jensen, S., and R. van de Perre. 1991. Tools for whole model validation of building simulation programs, Experience from the CEC Concerted Action PASSYS. Proceedings of Building Simulation '91, August 20-22, Nice, France.

Judkoff, R. 1988. Validation of building energy analysis simulation programs at the Solar Energy Research Institute. Energy and Buildings 10(3):235. Lausanne, Switzerland: Elsevier Sequoia.

Judkoff, R., and J. Neymark. 1995a. International Energy Agency building energy simulation test (BESTEST) and diagnostic method, NREL/TP-472-6231. National Renewable Energy Laboratory, Golden, CO. http://www.nrel.gov/docs/legosti/old/6231.pdf.

Judkoff, R., and J. Neymark. 1995b. Home energy rating system building energy simulation test (HERS BESTEST), NREL/TP-472-7332. National Renewable Energy Laboratory, Golden, CO. http://www.nrel.gov/docs/legosti/fy96/7332a.pdf. http://www.nrel.gov/docs/legosti/fy96/7332b.pdf.

Judkoff, R. (operating agent), and J. Neymark. 2004. IEA SHC Task 34/ECBCS Annex 43, Testing and validation of building energy simulation tools. Annex document. International Energy Agency: Solar Heating and Cooling Programme, and Energy Conservation in Buildings and Community Systems, Paris, France.

Judkoff, R., D. Wortman, and J. Burch. 1983a. Measured versus predicted performance of the SERI test house: A validation study, SERI/TP-254-1953. Solar Energy Research Institute (now National Renewable Energy Laboratory), Golden, CO.

Judkoff, R., D. Wortman, C. Christensen, B. O'Doherty, D. Simms, and M. Hannifan. 1980. A comparative study of four passive building energy simulations: DOE-2.1, BLAST, SUNCAT-2.4, DEROB-III. SERI/TP-721-837. UC-59c. Solar Energy Research Institute (now National Renewable Energy Laboratory), Golden, CO.

Judkoff, R., D. Wortman, B. O'Doherty, and J. Burch. 1983b. A methodology for validating building energy analysis simulations, SERI/TR-254-1508. Solar Energy Research Institute (now National Renewable Energy Laboratory), Golden, CO.

Lomas, K. 1991. Dynamic thermal simulation models of buildings: New method of empirical validation. BSER & T 12(1):25-37.

Lomas, K., and N. Bowman. 1987. Developing and testing tools for empirical validation. Chapter 14, Vol. IV of SERC/BRE final report, An investigation in analytical and empirical validation techniques for dynamic thermal models of buildings. BRE, Garston, Watford, UK.

Lomas, K., and H. Eppel. 1992. Sensitivity analysis techniques for building thermal simulation programs. Energy and Building (19)1:21-44.

Lomas, K., H. Eppel, C. Martin, and D. Bloomfield. 1994. Empirical validation of thermal building simulation programs using test room data. Vol.1, Final report. International Energy Agency Report #IEA21RN399/94. Vol. 2, Empirical Validation Package (1993), IEA21RR5/93. Vol. 3, Working Reports (1993), IEA21RN375/93. De Montfort University, Leicester, UK.

Maldonado, E. 2005. Energy certification of buildings in Portugal. Presentation at Joint IEA ECBCS/SHC Executive Committee Meeting, Technical Day, June 15, 2005, Espinho, Portugal.

Maxwell, G., P. Loutzenhiser, and C. Klaassen. 2003. Day-lighting--HVAC interaction tests for the empirical validation of building energy analysis tools. Iowa State University, Department of Mechanical Engineering, Ames, IA.

Maxwell, G., P. Loutzenhiser, and C. Klaassen. 2004. Economizer control tests for the empirical validation of building energy analysis tools, Draft. Iowa State University, Department of Mechanical Engineering, Ames, IA.

Millet, J.-R. 2003. Personal communication at IEA SHC Task Definition Workshop, April 24-25, 2003, Delft, Netherlands. Centre Scientifique et Technique du Batiment, Paris, France.

Morck, O. 1986. Simulation model validation using test cell data. IEA SHC Task VIII, Report #176, Thermal Insulation Laboratory, Technical University of Denmark, Lyngby, Denmark.

NASEO/RESNET. 2002. Mortgage industry national home energy rating systems accreditation standards. Residential Energy Services Network, Oceanside, CA. June 15, 2002. www.natresnet.com.

Neymark, J., P. Girault, G. Guyon, R. Judkoff, R. LeBerre, J. Ojalvo, and P. Reimer. 2004. ETNA BESTEST empirical validation test specification. J. Neymark & Associates, Golden, Colo.; Electricite de France, Moret sur Loing, France. In collaboration with National Renewable Energy Laboratory, Golden, CO.

Neymark, J., and R. Judkoff. 2002. International Energy Agency Building Energy Simulation Test and Diagnostic Method for Heating, Ventilating, and Air-Conditioning Equipment Models (HVAC BESTEST), Volume 1: Cases E100-E200. NREL/TP-550-30152. Golden, CO: National Renewable Energy Laboratory. http://www.nrel.gov/docs/fy02osti/30152.pdf.

Neymark, J., and R. Judkoff. 2004. International Energy Agency Building Energy Simulation Test and Diagnostic Method for Heating, Ventilating, and Air-Conditioning Equipment Models (HVAC BESTEST), Volume 2: Cases E300-E545. NREL/TP-550-36754. Golden, CO: National Renewable Energy Laboratory. http://www.nrel.gov/docs/fy05osti/36754.pdf.

Pears, A. 1998. Rating energy efficiency of non-residential buildings: A path forward for New South Wales. Report for the Sustainable Energy Development Authority. Sustainable Solutions Pty Ltd., Brighton, Victoria, Australia. http://www.abgr.com.au.

Purdy, J., and I. Beausoleil-Morrison. 2003. Building energy simulation test and diagnostic method for heating, ventilating, and air-conditioning equipment models (HVAC BESTEST): Fuel-fired furnace test cases. Natural Resources Canada, CANMET Energy Technology Centre, Ottawa, Canada. http://www.iea-shc.org/task22/deliverables.htm.

SEDA. 2003. Guidelines for the use of simulation in commitment agreements. Sustainable Energy Development Authority, Grosvenor Place, New South Wales, Australia.

Spitler, J., S. Rees, and D. Xiao. 2001. Development of an analytical verification test suite for whole building energy simulation programs--Building fabric. Final report for ASHRAE 1052-RP. Oklahoma State University School of Mechanical and Aerospace Engineering, Stillwater, OK.

Yuill, G., and J. Haberl. 2002. Development of accuracy tests for mechanical system simulation. Final Report for ASHRAE 865-RP. University of Nebraska, Omaha, NE.

Ron Judkoff

Member ASHRAE

Joel Neymark, PE

Member ASHRAE

Ron Judkoff is director of the Buildings & Thermal Systems Center at the National Renewable Energy Laboratory, Golden, Colo., and is chair of ASHRAE Standing Standard Project Committee (SSPC) 140. Joel Neymark is director of J. Neymark & Associates, Lakewood, Colo., and is vice-chair of SSPC 140.

(1). References ANSI/ASHRAE Standard 140-2004. Addendum l has passed through the ASHRAE review process and is expected to be published during mid-2006.

Table 1. Validation Techniques

Empirical validation (tests the model and the solution process)
* Advantages: approximate truth standard within experimental accuracy; any level of complexity.
* Disadvantages: experimental uncertainties (instrument calibration, spatial/temporal discretization, imperfect knowledge/specification of the experimental object [building] being simulated); high-quality detailed measurements are expensive and time consuming; only a limited number of test conditions are practical.

Analytical verification (tests the solution process)
* Advantages: no input uncertainty; exact mathematical truth standard for the given model; inexpensive.
* Disadvantages: no test of model validity; limited to highly constrained cases for which analytical solutions can be derived.

Comparative testing (relative test of the model and the solution process)
* Advantages: no input uncertainty; any level of complexity; many diagnostic comparisons possible; inexpensive and quick.
* Disadvantages: no absolute truth standard (only statistically based acceptance ranges are possible).

Table 2. Types of Extrapolation

Obtainable data points -> Extrapolation
* A few climates -> many climates
* Short-term total energy usage -> long-term total energy usage, or vice versa
* Short-term (hourly) temperatures and/or fluxes -> long-term total energy usage, or vice versa
* A few equipment performance points -> many equipment performance points
* A few buildings representing a few sets of variable and parameter combinations -> many buildings representing many sets of variable and parameter combinations, or vice versa
* Small-scale: simple test cells, buildings, and mechanical systems; laboratory experiments -> large-scale complex buildings with complex HVAC systems, or vice versa