Feature-based (object-based) Verification
Nathan M. Hitchens, National Severe Storms Laboratory
Introduction
• Feature-based verification approaches identify “objects” within forecast and observed fields (a minimal sketch follows below)
• Attributes of the objects from each field are compared
  • e.g., size, location, intensity, orientation angle
• Precipitation is the most common variable
• Summaries of these approaches: Gilleland et al. 2009 and Gilleland et al. 2010
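The basic idea can be illustrated with a minimal sketch, assuming the precipitation field is a gridded NumPy array; the 6.0 mm threshold and the attribute set here are illustrative choices, not a specific published algorithm.

```python
# A minimal sketch (not any specific published algorithm): identify precipitation
# "objects" by thresholding a gridded field and labeling connected regions, then
# extract a few attributes (size, centroid location, mean/max intensity).
import numpy as np
from scipy import ndimage

def find_objects(precip, threshold=6.0):
    """Label contiguous grid points exceeding `threshold` (mm) and return attributes."""
    mask = precip >= threshold
    labels, n = ndimage.label(mask)                 # connected-component labeling
    objects = []
    for i in range(1, n + 1):
        pts = labels == i
        ys, xs = np.nonzero(pts)
        objects.append({
            "size": pts.sum(),                      # areal size in grid boxes
            "centroid": (xs.mean(), ys.mean()),     # (x, y) location
            "mean": precip[pts].mean(),             # mean intensity
            "max": precip[pts].max(),               # maximum intensity
        })
    return labels, objects
```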
Example
[Figure: 1-hr precipitation (Stage II) and the corresponding precipitation objects]
Approaches
• Contiguous Rain Areas (CRAs)
  • “The area of contiguous observed and/or forecast rainfall enclosed within a specified isohyet”
  • CRAs are the union of forecast and observed rain entities
(Ebert and McBride 2000)
Approaches
• Verification statistics (illustrated in the sketch below):
  • Mean horizontal displacement of the forecast
  • Error in forecast and observed rain area
  • Error in mean and maximum rain rates
  • Error in rain volume
  • Pattern correlation of the corrected forecast
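A rough sketch of how such statistics might be computed for a single CRA window follows; the brute-force shift search, the 6.0 mm threshold, and the search radius are assumptions made for illustration, not Ebert and McBride's implementation.

```python
# A rough sketch of CRA-style statistics (not Ebert and McBride's exact code).
# The best-fit displacement is found by brute-force shifting of the forecast
# within a small search window to maximize its correlation with observations.
# Assumes both fields contain rain at/above the threshold; np.roll wraps at edges.
import numpy as np

def cra_stats(fcst, obs, threshold=6.0, max_shift=10):
    best = (0, 0, -np.inf)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(fcst, dy, axis=0), dx, axis=1)
            r = np.corrcoef(shifted.ravel(), obs.ravel())[0, 1]
            if r > best[2]:
                best = (dy, dx, r)
    dy, dx, corr_corrected = best
    return {
        "displacement_gridpts": (dx, dy),                       # mean horizontal displacement
        "area_error": int((fcst >= threshold).sum() - (obs >= threshold).sum()),
        "mean_rate_error": fcst[fcst >= threshold].mean() - obs[obs >= threshold].mean(),
        "max_rate_error": fcst.max() - obs.max(),
        "volume_error": fcst.sum() - obs.sum(),
        "pattern_correlation_corrected": corr_corrected,         # after best-fit shift
    }
```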
Approaches
• Baldwin et al. 2005
  • Feature-based technique to classify rainfall systems
    • Non-convective subclass (stratiform)
    • Convective subclasses (linear and cellular)
  • Objects are first identified in a manner similar to Ebert and McBride 2000
Approaches
• Manual expert classification of system type applied to a “training” dataset
• Cluster analysis applied to the training dataset (see the sketch below)
• Gamma-scale parameter and object eccentricity found to have the greatest discriminating power
(Baldwin et al. 2005)
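A hedged sketch of that classification step, assuming the per-object attributes are a method-of-moments gamma scale parameter and a covariance-based eccentricity; the clustering choices (Ward linkage, three classes) are illustrative rather than Baldwin et al.'s exact procedure.

```python
# A hedged sketch of the classification idea (not Baldwin et al.'s actual code):
# for each object, estimate a gamma-distribution scale parameter from its
# precipitation values and measure its eccentricity, then cluster objects
# in that two-attribute space.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def object_attributes(precip, labels, obj_id):
    vals = precip[labels == obj_id]
    gamma_scale = vals.var() / vals.mean()              # method-of-moments scale parameter
    ys, xs = np.nonzero(labels == obj_id)
    cov = np.cov(np.vstack([xs, ys]))                   # spatial covariance of the object
    evals = np.sort(np.linalg.eigvalsh(cov))            # principal-axis variances
    eccentricity = np.sqrt(1.0 - evals[0] / evals[1])   # 0 = circular, 1 = linear
    return gamma_scale, eccentricity

def classify(attribute_table, n_classes=3):
    """Hierarchical clustering of rows of (gamma_scale, eccentricity)."""
    z = linkage(attribute_table, method="ward")
    return fcluster(z, t=n_classes, criterion="maxclust")
```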
Approaches
• Method for Object-based Diagnostic Evaluation (MODE)
  • Fields are smoothed to filter out small-scale variations
(Davis et al. 2006)
Approaches
• Smoothed fields are thresholded so that object boundaries can be detected (see the sketch below)
• Identified objects may also be “associated” into simple shapes for better evaluation of some attributes (aspect ratio, orientation angle, etc.)
(Davis et al. 2006)
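A minimal sketch of the convolution-threshold idea, with an illustrative filter radius and threshold; this is not the MODE code itself.

```python
# A minimal sketch of MODE-style object definition (roughly following Davis et al. 2006):
# convolve the raw field with a circular averaging filter, threshold the smoothed
# field to define object boundaries, then keep the raw values inside those boundaries.
# Filter radius and threshold here are illustrative.
import numpy as np
from scipy import ndimage

def mode_objects(precip, radius=5, threshold=5.0):
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    disk = (x**2 + y**2 <= radius**2).astype(float)
    disk /= disk.sum()                                   # circular averaging kernel
    smoothed = ndimage.convolve(precip, disk, mode="constant", cval=0.0)
    mask = smoothed >= threshold                         # object boundaries
    labels, n = ndimage.label(mask)
    restored = np.where(mask, precip, 0.0)               # raw values inside objects
    return labels, restored
```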
Approaches
• Observed and forecast objects can be “matched” based on the distance between them relative to their sizes (one possible matching rule is sketched below)
• Object attributes are compared (either with or without matching)
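One simple matching rule consistent with that description, given as a sketch: pair objects whose centroid separation is no larger than the sum of their effective radii. The rule and the object dictionary format (matching the earlier sketch) are assumptions for illustration; MODE itself uses a more elaborate interest function.

```python
# Sketch of a simple distance-relative-to-size matching rule (illustrative only).
import numpy as np

def match_objects(fcst_objs, obs_objs):
    """Each object is a dict with 'centroid' (x, y) and 'size' (grid boxes)."""
    matches = []
    for i, f in enumerate(fcst_objs):
        for j, o in enumerate(obs_objs):
            dist = np.hypot(f["centroid"][0] - o["centroid"][0],
                            f["centroid"][1] - o["centroid"][1])
            r_f = np.sqrt(f["size"] / np.pi)      # effective radius of forecast object
            r_o = np.sqrt(o["size"] / np.pi)      # effective radius of observed object
            if dist <= r_f + r_o:                 # "close" relative to the objects' sizes
                matches.append((i, j, dist))
    return matches
```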
My Research
• Used Baldwin’s approach (BOOIA) to identify objects
  • 6.0 mm threshold applied to 1-hr Stage II precipitation
• Defined the threshold for “extreme” as the 99th percentile of the maximum precipitation within objects (a small sketch follows below)
• Used WRF to simulate selected events
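The extreme-threshold step amounts to a one-line percentile calculation over the per-object maxima; the function name is hypothetical.

```python
# Sketch of the "extreme" threshold described above: collect the maximum 1-hr
# precipitation value from every identified object and take the 99th percentile.
import numpy as np

def extreme_threshold(object_maxima):
    """object_maxima: array of per-object maximum precipitation values (mm)."""
    return np.percentile(object_maxima, 99)
```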
28 August 1998
[Figure: precipitation for the 28 August 1998 case; panels labeled ST2, NARR, R1, and 60-km, 90-km, 120-km, 150-km, 180-km]
Methods
• BOOIA applied to the ST2 product and to precipitation from each simulation
• Simulated objects compared to observed objects using a Euclidean distance approach
• Object dissimilarity score formula (a reconstructed form is given below):
  • where s is areal size, me is the mean precipitation value, ma is the maximum precipitation value, x is the x-direction coordinate, y is the y-direction coordinate, and the subscripts O and F denote observed and forecast objects
  • Coefficients A through E are weighting factors
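The equation itself was a graphic and is not carried in this text. A plausible reconstruction, consistent with the listed attributes and the stated Euclidean distance approach, is the weighted form below (written with δ for the score to avoid clashing with the coefficient D); this is an assumed form, not necessarily the exact published expression.

```latex
% Assumed reconstruction: weighted Euclidean distance over the five attribute differences
\delta = \sqrt{\, A\,(s_O - s_F)^2 \;+\; B\,(me_O - me_F)^2 \;+\; C\,(ma_O - ma_F)^2
           \;+\; D\,(x_O - x_F)^2 \;+\; E\,(y_O - y_F)^2 \,}
```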
Methods
• Each attribute is scaled before the score is computed (a reconstructed form is given below):
  • where z is the scaled attribute, z0 is the non-scaled attribute, z10 is the attribute’s 10th-percentile value, and z90 is the attribute’s 90th-percentile value
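Again the formula itself was a graphic; given the variables listed, the natural reconstruction is a linear rescaling between the 10th and 90th percentiles (an assumption, not confirmed by the text).

```latex
% Assumed reconstruction: rescale each attribute by its 10th and 90th percentile values
z = \frac{z_0 - z_{10}}{z_{90} - z_{10}}
```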
28 August 1998
• BOOIA attributes for observed and forecast objects