
Environmental uses of GIS, the scientific method and Daubert.

Geographic information systems, or GIS, are often underutilized, employed simply to show others where data is geographically located. However, GIS was originally designed for spatial analysis and can handle much more complex tasks, such as modeling the real world in order to understand how complex events affect our environment. Furthermore, because GIS is scientific in nature, it can support court testimony, an increasingly important consideration in light of the Daubert decision, which, in short, requires that expert testimony be grounded in the scientific method.

The use of GIS is analogous to the invention of the microscope, in that both allow the researcher to explore small areas that were previously unknown. Exploratory data analysis can then be used to hypothesize cause-and-effect relationships at the micro level, which can be further tested by scientific methods for their reliability. Appraisal today is not dissimilar to medical research of 100 years or so ago. GIS may well end up being the most important tool of the more highly qualified appraiser that the Appraiser Qualifications Board seeks to have in the future.


Models and their Simulation

The book Modeling Our World, which outlines using GIS for modeling, defines a formal model as an abstract and well-defined system of concepts, and it states that a geographic data model serves as the foundation on which all geographic information systems are built. A simple map is one such model of reality. It represents streets as lines, and we can use it to answer questions such as how far one location is from another. Maps can also tell us about terrain, for example, whether it is rough, with significantly varying elevations. GIS does the same, representing geographic information in a computer system instead of on paper.

Another type of modeling system, GoldSim, was demonstrated at a recent convention for those interested in the use of computer models as predictive tools. GoldSim simulates an existing or proposed system (e.g., a business, a mine, a watershed, a forest, the organs in your body, the atmosphere) by creating a model of it, that is, an abstract representation or facsimile, in order to predict (forecast) the system's future behavior. Almost any system that can be quantitatively described using equations and/or rules can be simulated.

The GoldSim Contaminant Transport Module, a program extension to GoldSim, allows one to probabilistically simulate the release, transport and fate of contaminants in an environmental system. In Fig. 1, the pathways by which contaminants (represented by the buried drums) can reach receptors (the human body at the top) are depicted.
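The probabilistic element of such a simulation can be illustrated with a minimal Monte Carlo sketch in Python. This is not GoldSim code; the path length, velocity range and decay half-life below are assumed values, chosen only to show how uncertainty in one input (groundwater velocity) propagates into the fraction of released contaminant mass that survives transit to a receptor.

```python
import random
import statistics

def simulate_arrival_mass(n_trials=10_000, seed=42):
    """Monte Carlo sketch of contaminant transport to a receptor.

    Illustrative only: the distance, velocity distribution and
    half-life are assumed values, not GoldSim defaults.
    """
    rng = random.Random(seed)
    distance_m = 150.0    # assumed source-to-receptor path length
    half_life_yr = 12.0   # assumed first-order decay half-life
    results = []
    for _ in range(n_trials):
        # Groundwater velocity is uncertain: log-uniform, ~1-20 m/yr
        velocity = 10 ** rng.uniform(0.0, 1.3)
        travel_time = distance_m / velocity
        # Fraction of the released mass surviving decay in transit
        surviving = 0.5 ** (travel_time / half_life_yr)
        results.append(surviving)
    return statistics.mean(results), statistics.quantiles(results, n=20)

mean_fraction, quantiles = simulate_arrival_mass()
print(f"mean surviving fraction: {mean_fraction:.3f}")
print(f"95th percentile: {quantiles[18]:.3f}")
```

Running many trials rather than one "best guess" is what makes the analysis probabilistic: the output is a distribution of outcomes, from which risk percentiles can be read directly.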

The financial risk of brownfield properties is driven by uncertainty about the extent of underground contamination and its ability to move through soil and underground water supplies into local bodies of water, to be absorbed by surface vegetation and animals, and subsequently to be ingested by humans. However, GIS has been used for years to model the location, density and movement of underground contamination, and an interface is in development that will merge the spatial capabilities of GIS with the simulation capabilities of GoldSim. This could be a very effective combination of tools for analyzing the effects of contamination on the value, and the resulting financial risk, of brownfield properties.

The Appraisal Process vs. the Scientific Method

The manual for GoldSim summarizes the "Six Iterative Steps to Scientific Model Building." Namely:

1. Define your objectives and measures of performance.

2. Develop the conceptual model.

3. Create the mathematical model.

4. Quantify the input parameters.

5. Implement and solve the mathematical model using a computational tool.

6. Document the model and evaluate, explain and present the results.

The steps are iterative because after the process is completed once, it is typically necessary to go back to the beginning and refine the model to make it more accurate.

It is interesting to compare the scientific method to the appraisal process: the first few steps are the same, and the use of GIS can make the appraisal process more like the scientific one.

As can be seen in Fig. 2, both processes begin with the definition of the problem. Step two of the scientific method, the formulation of a hypothesis, here the development of a conceptual model, corresponds to the marketability analysis portion of the appraisal process. It is accomplished by developing hypotheses about trends in the supply of and demand for a particular property type and their effect on the value of the property in question. This model-building can be enhanced by using GIS for situs analysis in order to better delineate the relevant market area.



GIS and Situs

Exploratory data analysis is inherent in the first steps of the scientific process. It is used to find independent variables that tend to explain observed effects. In a real estate context, examples include employment trends within a submarket, which can change the absorption of recently developed space and raise or lower the income streams from local properties. This can be an important model for analyzing the feasibility of new development. The spatial and other unique characteristics of contaminated land can require the use of a land development or residual analysis.



GIS is a very effective tool for analyzing subsurface contamination, as described above, but on-site contamination is only one of the factors that can affect the value of contaminated land. GIS can also be used to understand and delineate submarkets. OLAP (online analytical processing) is a database technology that can track local employment, as an independent variable, by groups of ZIP codes defining submarkets in order to forecast absorption of new homes and local price trends. Situs analysis is needed to define areas that can react very differently from nearby areas to the same economic conditions. How one area's reaction differs from another's is the concept of situs, an important element, but one that is not widely known or commonly understood by many appraisers.
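The OLAP-style roll-up of employment by ZIP-code-defined submarkets can be sketched in a few lines of Python. The employment records and submarket boundaries below are hypothetical, invented only to show the aggregation step; a production system would pull them from an employment database.

```python
from collections import defaultdict

# Hypothetical quarterly employment records: (zip_code, quarter, jobs).
records = [
    ("78701", "2002Q1", 41_200), ("78701", "2002Q2", 41_900),
    ("78702", "2002Q1", 12_400), ("78702", "2002Q2", 12_600),
    ("78745", "2002Q1", 18_100), ("78745", "2002Q2", 17_900),
]

# Submarkets defined as groups of ZIP codes (assumed boundaries).
submarkets = {"CBD": {"78701", "78702"}, "South": {"78745"}}

def rollup(records, submarkets):
    """OLAP-style roll-up: total jobs by (submarket, quarter)."""
    cube = defaultdict(int)
    for zip_code, quarter, jobs in records:
        for name, zips in submarkets.items():
            if zip_code in zips:
                cube[(name, quarter)] += jobs
    return dict(cube)

cube = rollup(records, submarkets)
print(cube[("CBD", "2002Q2")])  # 41,900 + 12,600 = 54,500
```

Comparing the same cell across quarters gives the submarket employment trend used as the independent variable in an absorption forecast.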

Market Analysis for Valuation Appraisals, published by the Appraisal Institute, defines situs as "the relationship between the total urban environment and a specific land use on an individual land parcel at a specific time." While that book then takes over 50 pages to explain situs and its relationship to the urban structure, a simple analogy can be found by considering your PC and its operating system. It seems reasonable that you could copy the operating system from your desktop to your laptop computer, saving the time of reinstalling all that software. You will probably find problems if you do: a laptop's LCD monitor needs different drivers than a desktop CRT, and sound cards also require different drivers. Drivers are to computers what linkages are to real estate.

The use of comparables without the benefit of situs analysis can be like taking RAM from one of your computers and plugging it into another. Comparison works in single-family appraisal when homes are located blocks from each other, but the distance one must travel to find a comparable contaminated site can render the comparable useless. Successful redevelopment of a contaminated site into a shopping center in City A does not necessarily mean the same thing can be done in City B. The parcels can look a lot like each other on the surface, but the similarities can end there.

Fig. 3, courtesy of UCLA, shows the range in cap rates for apartments in the Los Angeles basin. If scientific tools such as regression analysis are used without situs analysis, the results are likely to be erroneous, a problem that can arise from using the physical attributes of sales drawn from areas with widely differing cap rates. The International Association of Assessing Officers has struggled with this problem for over 20 years and has recently found that GIS can be used to identify "value influence centers" that result in the geographic distortion of values.

Brownfields are often located in older sections of a city, which can make it difficult to find nearby comparisons for use in highest and best use analysis. Facilities used for military weapons research over the years were developed in more remote locations, so when the time comes to redevelop these sites, nearby comparisons are likewise absent. Office buildings with soil contamination can be limited by regulators to industrial use in the future; if the closest industrial rent comparables are distant, the industrial utility of the subject is also uncertain.

A good understanding of situs analysis is required to deal with these problems. An appraiser who understands situs analysis can do a much better job by using GIS to find and understand the local linkages that are critically important for the redevelopment of environmentally impaired real estate. Analyzing the financial risk posed by submarket employment trends is an important complement to using GIS to understand the health risk, and subsequent financial risk, of the underground contamination itself.

Scientifically Modeling Residential Values and Demand

Continuing with the case study depicted in the UCLA graphic, almost all of the developable land in the Los Angeles Basin has been developed. As a result, brownfields will have to be redeveloped to provide new housing for the growing population. A recent case study of properties in Austin, Texas, shows how GIS can be used as part of a land residual/development analysis to quantify home values and the likely absorption for homes in a remote subdivision planned for brownfield redevelopment.

Fewer people are willing to buy homes in remote locations, since proximity to employment, supermarkets, a variety of restaurants and many other amenities is considered important by most homebuyers. GIS can be used to statistically model this decline in value.

One of the problematic steps in building a land development model was estimating the price at which homes would sell in a major subdivision planned for the property in question. The subject site was remote, as is the case with a number of sites used for military research that were contaminated in the past. So, first, it was necessary to estimate the effect on value of the subject's significant distance from the center of activity in the local area. This was done by creating a graph similar to Fig. 4, used for research on the effect of urban structure for forecasting purposes in Austin, Texas. The highest sales prices per unit are found in the central business district. Prices decline in all directions from point A (the intersection of 6th Street and Congress Avenue), the center of the CBD.

This method was used to find out whether this type of model would be applicable to the property being appraised. The first step was to download sales of over 1,000 single-family homes in order to create a map similar to the one above. The search parameters were that the properties:

* Sold since the beginning of the year

* Ranged in size from 1,000 to 2,000 square feet

* Were all-cash and/or confirmed sales

The downloaded properties were then geocoded, with latitude and longitude coordinates appended to each record so that the mapping system could locate the sales on an electronic map. After the data was imported into the GIS software, its frequency distribution was analyzed to find the predominant size that sold.
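The frequency-distribution step can be sketched as follows. The sale records and the 100 sq. ft. bin width are assumptions for illustration, not the study's actual data.

```python
from collections import Counter

# Hypothetical sale records: (square_feet, price).
sales = [(1720, 131_000), (1750, 128_500), (1080, 84_000),
         (1745, 125_000), (1910, 142_000), (1760, 127_500)]

def predominant_size_bin(sales, bin_width=100):
    """Bin home sizes to the nearest bin_width sq. ft. and return
    the lower bound of the most frequent bin."""
    bins = Counter((sqft // bin_width) * bin_width for sqft, _ in sales)
    return bins.most_common(1)[0][0]

print(predominant_size_bin(sales))  # 1700: four of the six sales
```

The most frequent size bin then becomes the filter criterion for the next step, so that size differences are largely removed from the dataset.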

The data was then filtered by a string of variables to include only sales within the predominant size group: say, 3-BR homes from 1,700 to 1,790 sq. ft., on sites ranging from 6,000 to 12,000 sq. ft., built from 1990 to 1995. The filtering produced a refined dataset in which almost all of the remaining variation in price was thought to result from differences in property location.
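That filtering step can be sketched with the same kind of criteria. The records below are hypothetical; the ranges mirror the example in the text.

```python
# Hypothetical sale records with the fields used in the filter.
sales = [
    {"beds": 3, "sqft": 1750, "lot": 7_800, "year": 1992, "price": 128_500},
    {"beds": 4, "sqft": 2100, "lot": 9_000, "year": 1998, "price": 164_000},
    {"beds": 3, "sqft": 1705, "lot": 6_500, "year": 1991, "price": 121_000},
    {"beds": 3, "sqft": 1780, "lot": 13_000, "year": 1993, "price": 133_000},
]

# Keep only sales inside the predominant group, so that location
# explains most of the remaining price variation.
filtered = [
    s for s in sales
    if s["beds"] == 3
    and 1700 <= s["sqft"] <= 1790
    and 6_000 <= s["lot"] <= 12_000
    and 1990 <= s["year"] <= 1995
]
print(len(filtered))  # 2 sales survive the filter
```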

The filtered data was subsequently mapped in order to show its spatial variation. This data was mapped by sale price, with red dots showing the locations of the homes with the highest prices. A red data cluster appeared a couple of miles east of the main north-south freeway that extended diagonally through the area. This analysis was considered to be the most objective way of finding the most desirable part of the residential market area. GIS was then used to find the travel times between this "100% location" and the location of the other homes in the dataset that sold.

Drive-time analysis

ArcView 3.x's Network Analyst extension was then used to create a second map, a drive-time analysis, which assigns travel times to the local road network. Freeways were labeled with 65-mph drive speeds, and residential streets were labeled with an effective speed of 20 mph, due to the need to stop at stop signs and intersections. The program was then instructed to analyze the bands of travel time outward from the center of the red cluster.

The GIS software was told to color the streets within a 10-minute drive of the 100 percent location red, the streets within a 10- to 20-minute drive yellow, the streets within a 20- to 30-minute drive green, and so on. The sales were then plotted on the map in order to see how home prices varied as a function of how long it took to get from each of them to the 100 percent location. The data showed that essentially the same home sold for $131,000 at the 100 percent location, $100,000 or so at a 20- to 30-minute drive, and $81,000 at about a 40-minute drive.
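The computation underlying such a drive-time analysis can be sketched with Dijkstra's shortest-path algorithm over travel-time edge weights. The tiny road network below is hypothetical; the speed assumptions (65 mph on freeways, 20 mph effective on residential streets) follow the text, and this is not Network Analyst's actual implementation.

```python
import heapq

# Hypothetical road network: edges as (node_a, node_b, miles, road_class).
EDGES = [
    ("hub", "A", 5.0, "freeway"),
    ("A", "B", 2.0, "residential"),
    ("hub", "C", 1.0, "residential"),
    ("C", "B", 1.5, "residential"),
]
SPEED_MPH = {"freeway": 65, "residential": 20}  # speeds from the text

def drive_times(start):
    """Dijkstra over travel-time weights, in minutes from start."""
    graph = {}
    for a, b, miles, cls in EDGES:
        minutes = miles / SPEED_MPH[cls] * 60
        graph.setdefault(a, []).append((b, minutes))
        graph.setdefault(b, []).append((a, minutes))
    best = {start: 0.0}
    queue = [(0.0, start)]
    while queue:
        t, node = heapq.heappop(queue)
        if t > best.get(node, float("inf")):
            continue  # stale queue entry
        for nxt, w in graph[node]:
            if t + w < best.get(nxt, float("inf")):
                best[nxt] = t + w
                heapq.heappush(queue, (t + w, nxt))
    return best

times = drive_times("hub")
# Band each node the way the map was colored: red/yellow/green contours.
band = {n: "red" if t <= 10 else "yellow" if t <= 20 else "green"
        for n, t in times.items()}
print(times["B"], band["B"])
```

Note that node B is reached faster through the short residential route via C (7.5 minutes) than via the freeway leg plus a longer residential leg, which is exactly the kind of network effect a straight-line distance measure would miss.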

In other words, the mapping done in conjunction with a drive-time analysis found that prices for similar homes declined by about $20 per sq. ft. for each additional 20 minutes it took to reach the 100 percent location, or the "urban, or regional, activity center" as it is often called. This concept is known as a "distance-decay" relationship. It is often used to quantify the loss in patronage at a shopping center that grows with the distance patrons live from it. Mathematical equations can then be created to quantify the effect of distance on value, a scientific method that replaces subjective opinion with objective analysis.
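A distance-decay equation of that kind can be fitted to the reported price points with ordinary least squares. The three observations below are the text's rounded figures (the middle drive time is taken as 25 minutes), so the fitted line is illustrative only.

```python
def fit_line(points):
    """Ordinary least squares for price = intercept + slope * minutes."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    slope = (sum((x - mx) * (y - my) for x, y in points)
             / sum((x - mx) ** 2 for x, _ in points))
    return my - slope * mx, slope

# Drive time (minutes) vs. sale price, from the text's rounded figures.
obs = [(0, 131_000), (25, 100_000), (40, 81_000)]
intercept, slope = fit_line(obs)
print(f"price = {intercept:,.0f} {slope:+,.0f} * minutes")
```

The negative slope (roughly $1,250 of price per minute of drive time on these figures) is the distance-decay rate; once estimated, it can be applied to the subject site's drive time to derive an indicated price.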

A separate analysis was made via GIS to find out where the strongest regional employment growth was occurring. A second drive-time map was created with this "epicenter" of employment growth as the starting point. Analysis of regional employment growth by ZIP code showed that it increased from west to east, driving up home prices along the way in a path that corresponded to increasing drive times from the epicenter, and then started to move to undeveloped areas to the north.

Further analysis of travel times shows that prices increase in the outer contours once prices in the inner contours reach about $200,000. At that price level, people have to drive progressively farther to find housing being built on less expensive land. The effect on the demand for homes has been analogous to rainwater filling the lowest areas first, then moving in other directions to find and fill the next lowest (cost) areas, spilling into the next travel-time contour after home prices rose above $200,000.

Scientifically Modeling the Growth in Demand

The example shows that situs and the urban structure can produce effects similar to acupuncture: a change in pressure on, say, part of the foot can have an effect at the other end of the body. In this example, the real estate analyst had to understand the importance of situs, and be aware of economic conditions in other areas, in order to correctly forecast the demand for homes to the north.

The analysis shows the benefits of building scientific models with GIS and OLAP programs in an attempt to understand how area-wide changes in employment and demographics of the population can shape real estate values in light of geographical variables like access and resultant travel time.


Scientific model building has been the norm for research presented at the American Real Estate Society and its sister societies around the globe for many years. Experience reviewing hundreds of land appraisals shows that the quantitative logic that links data to conclusions in the scientific method is typically missing from the appraisal process. This missing link also explains problems with subjective judgments.

The Appraiser Qualifications Board has stated that appraisers should be trained in scientific methods. This is becoming increasingly relevant as all federal courts and half of the state courts now require expert witnesses to couch their expertise in the scientific method, in order to conform to the ruling in the Daubert case.

Furthermore, GIS and modeling tools using database technology such as OLAP could get appraisal out of its time warp and dramatically improve conventional methods. The recent lack of confidence in accounting, stock analysis and other professions suggests that it is an opportune time to use scientific tools to instill greater confidence in appraisal methodologies.

RELATED ARTICLE: GIS in Real Estate Book Offers Well-Rounded Foundation

For practical information on using GIS in your real estate appraisal practice, order a copy of GIS in Real Estate: Integrating, Analyzing and Presenting Locational Information today. This multi-authored text takes readers through all the decisions and problems involved in setting up a geographic information system, acquiring hardware and software, making use of attribute data, custom programming, system integration, and staff training. Learn all about exciting GIS applications that will allow you to prepare in-depth analyses of geographical data and present them quickly, effectively, and persuasively in your appraisal reports. Through December 31, 2002, GIS in Real Estate is available for $15 by calling 888-570-4545 and requesting stock number 0635M, or by visiting our Web site. The regular price is $31.50 for members and $35 for nonmembers.


by Bruce R. Weber, MAI

BRUCE WEBER, MAI, is a director at Integra, in Rancho Santa Margarita, Calif., specializing in the use of GIS for market studies using empirically derived quantitative GIS models as opposed to "black boxes" that are sometimes used for market studies. He received his MBA from the University of Michigan and has specialized in the use of GIS for market analysis, with a special interest in forecasting submarket demand for real estate. You can contact him at
COPYRIGHT 2002 The Appraisal Institute
No portion of this article can be reproduced without the express written permission from the copyright holder.

Publication: Valuation Insights & Perspectives, June 22, 2002
