
Evaluating Growth Models: A Case Study Using PrognosisBC

This case study presents the evaluation of the PrognosisBC growth model, its validation techniques, and the testing of its outputs against data. The study demonstrates the importance of validation in model development and highlights the process of identifying and repairing coding errors. The model's performance against individual tree and stand-level data is assessed, and its ability to produce consistent results across various stand structures is discussed.


Presentation Transcript


  1. Evaluating Growth Models: A Case Study Using PrognosisBC. Peter Marshall, University of British Columbia, Vancouver, BC; Pablo Parysow, Northern Arizona University, Flagstaff, AZ; Shadrach Akindele, Federal University of Technology, Akure, Nigeria. Presented at the Third FVS Conference, Feb. 13-15, Fort Collins, CO.

  2. Outline • Background: PrognosisBC, Validation, Study area • Testing Against Data • Simulation Testing • Conclusions. Within a framework of general observations about validation techniques and processes.

  3. Validation Observation #1 Validation is important … but it tends to be much more of interest to the person doing it than it is to the person hearing about it.

  4. PrognosisBC • An adaptation of the northern Idaho (NI) version of the original Prognosis model • The architecture of the original model remains, but many of the internal equations have been reformulated and the remainder have been recalibrated • Habitat types have been replaced with appropriate units within the BC Biogeoclimatic Ecosystem Classification (BEC) system • All inputs and outputs are in metric units • Different versions have been developed for various BEC subzones • Additional information is available at the following URL: http://www.for.gov.bc.ca/hre/gymodels/progbc/

  5. Validation • There are various definitions of “validation” in use and a growing literature on different approaches to validation • For the purposes of this presentation, I will define validation as: “The process of evaluating model outputs for consistency and usefulness.” • Under this definition, validation is very much context dependent: which model outputs are being evaluated, in what location(s), and for what purposes

  6. Study Area [Map: location of the study site within the Interior Douglas-Fir zone of British Columbia, Canada; Vancouver and Victoria shown for reference]

  7. Validation Observation #2 Validation is most effective if several different approaches to validation are used – there are gains from added perspective.

  8. Testing Against Independent Data • Data were from two research installations established in the late 1980s in stands that were uneven-aged and primarily interior Douglas-fir • One installation, consisting of 6 plots measured on 4 occasions, was set up to follow stand dynamics under different structural conditions: (1) predominance of large older trees (dbh > 30 cm), (2) predominance of pole-sized trees (dbh 15-30 cm), and (3) predominance of saplings (dbh < 15 cm) • The second installation, consisting of 24 plots measured on 3 occasions, was set up as a precommercial thinning experiment in stands that were diameter-limit logged in the 1960s: 3 blocks, each consisting of three thinning treatments and a control, with two plots in each block/treatment combination • Projections were made for 11 years to match one of the possible remeasurement intervals (the closest match to the 10-year projections of PrognosisBC)

  9. Validation Observation #3 If at all possible, try to look at both individual tree projections as well as stand-level projections. Joint comparisons might well highlight issues that otherwise would not be apparent.

  10. Validation Observation #4 Even apparently well-tested models may well still contain hidden coding errors that have subtle impacts. They are worth looking for carefully. This process is known by some as “verification”. (We found a few such errors by running several of the component equations both within and outside the model environment, looking for “oddities”.)

  11. Validation Observation #5 It is best to look for coding errors early on in the validation process. Otherwise, you may have to re-do some of your previous work.

  12. Validation Observation #6 Individuals who are at arm’s length from the model development process are often more likely to spot errors, since they don’t usually assume that they “know” what is going on within the model.

  13. Validation Observation #7 Regression-based equivalence tests (Robinson et al. 2005) provide a convenient means of examining model predictions versus observations. The routine for equivalence testing in R produces nice pictures. Robinson, A.P., R.A. Duursma, and J.D. Marshall. 2005. A regression-based equivalence test for model validation: shifting the burden of proof. Tree Physiology 25: 903-913.
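
  For readers who want to experiment with this approach, the sketch below (in Python rather than R, and not the authors' routine) illustrates the underlying idea: regress the observations on the model predictions, then use two one-sided, TOST-style bootstrap intervals to ask whether the slope is close enough to 1 and the scaled intercept close enough to 0. The function name, the equivalence margins, and the intercept scaling are illustrative assumptions, not values taken from Robinson et al. (2005).

    # Minimal sketch (not the authors' R routine) of a TOST-style,
    # regression-based equivalence check of observations vs. predictions.
    # Margins, scaling, and the function name are illustrative assumptions.
    import numpy as np

    def equivalence_check(observed, predicted, slope_margin=0.25,
                          intercept_margin=0.25, n_boot=2000, alpha=0.05, seed=0):
        """Bootstrap (1 - 2*alpha) percentile intervals for the slope and the
        mean-scaled intercept of the regression observed ~ predicted.
        Equivalence is suggested when the slope interval lies inside
        1 +/- slope_margin and the intercept interval lies inside
        0 +/- intercept_margin."""
        rng = np.random.default_rng(seed)
        obs = np.asarray(observed, dtype=float)
        pred = np.asarray(predicted, dtype=float)
        n = len(obs)
        slopes, intercepts = [], []
        for _ in range(n_boot):
            idx = rng.integers(0, n, n)                # resample (obs, pred) pairs
            b, a = np.polyfit(pred[idx], obs[idx], 1)  # slope b, intercept a
            slopes.append(b)
            intercepts.append(a / obs.mean())          # intercept as a fraction of mean obs
        q = [100 * alpha, 100 * (1 - alpha)]           # (1 - 2*alpha) interval, TOST-style
        s_lo, s_hi = np.percentile(slopes, q)
        i_lo, i_hi = np.percentile(intercepts, q)
        return {
            "slope_CI": (s_lo, s_hi),
            "slope_equivalent_to_1": (1 - slope_margin) < s_lo and s_hi < (1 + slope_margin),
            "intercept_CI": (i_lo, i_hi),
            "intercept_equivalent_to_0": -intercept_margin < i_lo and i_hi < intercept_margin,
        }

  A call such as equivalence_check(observed_dbh, predicted_dbh) would report whether the bootstrap intervals fall inside the chosen regions of equivalence; the margins should be set to reflect what counts as "close enough" for the intended use of the model.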

  14. Overall equivalence test for the tree-level DBH predictions.

  15. Validation Observation #8 What you see depends on what you look at. For example, the relationship between observed and predicted DBH will appear considerably stronger than the relationship between observed and predicted DBH growth, which is actually what is estimated within PrognosisBC.
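
  A toy numerical illustration of this point (synthetic numbers only, not the study data): because the initial DBH appears in both the observed and the predicted end-of-period DBH, the correlation for DBH levels can look nearly perfect even when the growth predictions themselves are only moderately good.

    # Toy illustration (synthetic numbers, not the study data) of Observation #8:
    # agreement in end-of-period DBH looks far stronger than agreement in DBH growth,
    # because the shared initial DBH dominates both the observed and predicted totals.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 200
    dbh0 = rng.uniform(5, 60, n)                          # initial DBH (cm), common to both
    obs_growth = rng.gamma(shape=2.0, scale=1.0, size=n)  # "observed" periodic growth (cm)
    pred_growth = obs_growth + rng.normal(0, 1.0, n)      # imperfect growth predictions

    r_growth = np.corrcoef(obs_growth, pred_growth)[0, 1]
    r_dbh = np.corrcoef(dbh0 + obs_growth, dbh0 + pred_growth)[0, 1]
    print(f"correlation, growth:    {r_growth:.2f}")      # modest agreement
    print(f"correlation, final DBH: {r_dbh:.2f}")         # looks nearly perfect

  Comparing increments rather than totals therefore gives a much more honest picture of how well the growth equations themselves are performing.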

  16. Simulation

  17. Conclusions • The validation exercise allowed us to identify and repair minor errors in the coding. • Once these errors were fixed, the model performed well against data at both the single tree and stand level. • Under a wide variety of stand structures, the model produced results that were consistent with our understanding of stand dynamics.

  18. Validation Observation #9 When preparing validation reports, remember Observation #1: “Validation is important … but it tends to be much more of interest to the person doing it than it is to the person hearing about it.” It is easy to bury readers with an avalanche of results. However, syntheses and summaries are much more likely to be read and understood.

  19. This project was funded by the BC Ministry of Forests and Range, using funds provided for continuing work on PrognosisBC by the Forest Investment Account (FIA). We are grateful for this support.

  20. Thank you! Are there any questions?
