1. Traffic Incident Management Focus State Initiative: Program-Level Performance Measurement Workshops
Talking Operations Webcast
February 22, 2006
2. 2 Overview
The Approach
Accomplishments
Program-level Performance Measures Selected for Implementation
Development of Action Plans
Next Steps
3. 3 The Approach
[Slide diagram: Eastern Regional Workshop, Western Regional Workshop, Dallas Workshop; 1. Identification of Candidate National TIM Performance Measures]
4. 4 Accomplishments to Date
More than 50 TIM practitioners from eleven leading States participated
Defined 10 program-level objectives across three “goal areas”
Defined 30 candidate program-level TIM performance measures
Developed common understandings of some core terms such as “incident clearance time”
All Eleven Focus States are developing Action Plans for two common “program-level” performance measures
5. 5 Regional Workshops: Candidate Program-Level TIM Vision
TIM Goals and Objectives fell into three primary goal areas or themes:
Reduce Incident Duration
Improve Safety of Motorists and Responders
Improve Communications to Reduce the Impact of Incidents on Customers
States choosing to implement program-level performance measures should be encouraged to develop Program-Level TIM Strategic Plans that identify the specific program-level TIM objectives the State is choosing to focus on. The plan should identify the specific strategies and implementation steps the program team believes are needed to achieve the objectives, and then identify the program-level performance measures that will enable objective understanding and evaluation of the progress being made toward each objective. Development of the Strategic Plan by the program team will foster the cooperation needed for an effective TIM program and buy-in by partners to the objectives. It also provides a basis for an annual dialogue among partners on how TIM can be improved from a program perspective.
Definition of the strategies and implementation steps in the Strategic Plan will enable States (and in some cases, specific agencies) to then make adjustments as needed to enhance particularly effective strategies or to modify or eliminate strategies that may be shown to be less effective.
A regular (annual or as-needed) program review will enable the identification of best practices and lessons learned at a program level that could then be shared with the broader national TIM community.
6. 6 10 Candidate Program-Level TIM Objectives
1. Reduce incident notification time (defined as the time between the first agency's awareness of an incident and the notification of needed response agencies).
2. Reduce roadway clearance time (defined as the time between awareness of an incident and restoration of lanes to full operational status).
3. Reduce incident clearance time (defined as the time between awareness of an incident and removal of all evidence of the incident, including debris or remaining assets, from shoulders).
4. Reduce "recovery" time (defined as the time between awareness of an incident and restoration of the impacted roadway(s) to "normal" conditions). (A sketch illustrating these four time intervals follows this list.)
5. Reduce time for needed responders to arrive on-scene after notification.
6. Reduce number of secondary incidents and severity of primary and secondary incidents.
7. Develop and ensure familiarity with regional, multi-disciplinary TIM goals and objectives and supporting procedures by all stakeholders.
8. Improve communication between responders and managers regarding the status of an incident throughout the incident.
9. Provide timely, accurate, and useful traveler information to the motoring public on a regular basis during an incident.
10. Regularly evaluate and use customer (road user) feedback to improve TIM program assets and practices.
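The four time-based objectives above are all defined as intervals measured from the moment of incident awareness. As a minimal illustration only, the sketch below computes those intervals for a single incident; the timestamp names (awareness, responders_notified, lanes_restored, scene_cleared, flow_normal) are hypothetical and are not data elements defined by the workshops, and actual sources (CAD logs, ATMS/ITS event records) will vary by State and agency.

```python
from datetime import datetime

# Hypothetical per-incident timestamps; field names and sources are illustrative only.
incident = {
    "awareness":           datetime(2006, 2, 22, 7, 14),  # first agency becomes aware of the incident
    "responders_notified": datetime(2006, 2, 22, 7, 18),  # needed response agencies notified
    "lanes_restored":      datetime(2006, 2, 22, 7, 55),  # all lanes returned to full operational status
    "scene_cleared":       datetime(2006, 2, 22, 8, 10),  # all evidence/debris removed, including from shoulders
    "flow_normal":         datetime(2006, 2, 22, 8, 40),  # impacted roadway back to "normal" conditions
}

def minutes_between(start, end):
    """Elapsed minutes between two timestamps."""
    return (end - start).total_seconds() / 60.0

# Each candidate measure is an interval starting at incident awareness.
measures = {
    "incident_notification_time": minutes_between(incident["awareness"], incident["responders_notified"]),
    "roadway_clearance_time":     minutes_between(incident["awareness"], incident["lanes_restored"]),
    "incident_clearance_time":    minutes_between(incident["awareness"], incident["scene_cleared"]),
    "recovery_time":              minutes_between(incident["awareness"], incident["flow_normal"]),
}

for name, value in measures.items():
    print(f"{name}: {value:.0f} min")
```

A program-level measure would then aggregate these per-incident values across all reported incidents, for example as monthly averages or percentiles.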
NOTE: Synthesis of objectives from regional workshops.
Participants identified 3 goal areas. They identified 10 objectives that together support the accomplishment of the goal areas. Each objective supports one or more of the goal areas. These are not in a priority order.
The 10 objectives were derived from more than 20 candidate objectives suggested by participants across the east and west coast workshops, and from hundreds of comments participants entered regarding these objectives. Participants in both workshops were asked, "What are some things that we are all trying to accomplish at the program level with respect to TIM?" They were instructed that objectives should be Specific, Measurable, Achievable, Realistic, and Tangible, and participants in both workshops challenged objective statements so that they met these criteria. Limited time did not permit a full vetting in the collaborative environment of the workshops, so participants were invited to enter comments on the objectives into the Group Systems tool for review and synthesis by the analyst team after the sessions.
The analyst team reviewed the originally proposed objectives, along with the participant comments on the objectives, to see which could potentially be combined into a larger or broader reaching objective statement.
In some cases, objective statements were combined because they were very similar and the distinctions could be differentiated at the performance measurement level. For example, analysts recommended that "Reduce the number and severity of incidents" and "Reduce secondary incidents" be combined into a single objective. At the performance measurement or tracking level, States may indeed choose (though some may not at this time) to track primary and secondary incidents separately.
In many cases, objective statements proposed in the workshops represented a combination of goals, objectives, implementation steps, and performance measures, so analysts parsed the objective statements and supporting comments to distinguish these. For example, one objective statement read, "No unexpected delays." This may indeed be a target for many States, but it represents a combination of an objective (relating to providing timely, accurate, and useful travel time estimates to road users), a scale (an inferred count of unexpected delays), and a target value (that "no" unexpected delays would be good). The objective could be more accurately characterized as "Provide timely, accurate, and useful traveler information to the motoring public on a regular basis during an incident." The degree to which States can do this, how they will measure it, and how they will do it will likely vary by State.
In another example, “Improve the use of traffic control devices” was considered by many participants to be more of an implementation step to a larger objective or set of objectives such as “Reduce incident clearance time” or “Reduce recovery time.” States may determine that improving the use of a specific asset or set of assets through certain strategies, or the development of an improved asset, may help to achieve these objectives.
Table X depicts the evolution of the more than 20 originally proposed objectives into the 10 candidate objectives shown here.
7. 7 Candidate Program-Level TIM Performance Measures
NOTE: Would not read through all; just have them for reference. Refer audience to page 13 to continue.
Neither the objectives nor the candidate performance measures are in a priority order; numbers are used for reference purposes only at this time. West coast participants prioritized the more than 20 originally proposed objectives to help provide a sense of which were most important. That prioritization provided a sequence by which performance measures were developed. It also shed some light on potential areas for consolidation.
Performance measures have not yet been prioritized. Participants were instructed simply to identify potential ways that each objective could be objectively evaluated, i.e., to suggest potential means of evaluating progress toward achievement of the objective.
Future forums could seek to narrow this list of performance measures down to a few "core" measures that would shed light on the accomplishment of a few "core" objectives.
8. 8 Candidate Program-Level TIM Performance Measures (Cont.)
These are not in a priority order. West coast participants prioritized the more than 20 originally proposed objectives to help provide a sense of which were most important. That prioritization provided a sequence by which performance measures were developed. It also shed some light on potential areas for consolidation.
9. 9 Candidate Program-Level TIM Performance Measures (Cont.)
These are not in a priority order. West coast participants prioritized the more than 20 originally proposed objectives to help provide a sense of which were most important. That prioritization provided a sequence by which performance measures were developed. It also shed some light on potential areas for consolidation.
10. 10 Candidate Program-Level TIM Performance Measures (Cont.)
These are not in a priority order. West coast participants prioritized the more than 20 originally proposed objectives to help provide a sense of which were most important. That prioritization provided a sequence by which performance measures were developed. It also shed some light on potential areas for consolidation.
11. 11 Consensus-Based Selection of Two Measures for Test Implementation
Many State representatives noted that they were already implementing these performance measures to varying degrees. While several other performance measures received significant support, they did not receive consensus from all participants.
Many States are already implementing these to some degree because:
Most States have an existing Intelligent Transportation Systems (ITS) infrastructure in place that is collecting, or could be modified to collect, the data needed to support Performance Measures 2a and 3b.
Several States indicated that some software modifications and/or legacy system interfaces would need to be developed to collect this data in a format that would support the performance measures, but that doing so was technically feasible.
The States indicated that most of the tasks necessary for implementing these two performance measures could be completed within a 6-month to 1-year timeframe (medium-term).
States indicated that some of the other performance measures did not meet State-specific needs, or would be technically difficult to implement.
12. 12 Other Measures Selected for Implementation
The States indicated that Objective #7 and the associated performance measures were more of a program support activity that could enhance all Objectives. FHWA will provide some support in this area.
13. 13 Other Measures Selected for Implementation (cont.)
Additional clarity and shared understanding is needed among States in how they define "secondary incidents" to consistently implement Objective #6.
Participants noted that “normal” conditions could be difficult to define. This will require further study and consideration.
Objectives 6 and 7 represent the most likely “next measures” for broad adoption at this time.
14. 14 Action Plan Framework
Related Program-level Objective
Performance measure
Implementation partners
Needed data and sources
Measurement tool or approach
Specific steps to implement and evaluate the performance measure
Lead coordinator
Implementation partners and roles
Timeframe for implementation
Needed resources/support FHWA can provide
The participants divided into small groups by State and worked in one of three rooms alongside their regional colleagues. The working room assignments were designed to allow participants to collaborate closely with colleagues from their State as well as neighboring States.
Participants first identified their implementation partners for their selected measures and related objectives. Next, participants identified the data they will need to collect to evaluate the performance measure, sources of data, and the measurement approach or tools they expect to use. The participants further identified the specific steps they will take to implement the performance measure. Additionally, participants were asked to identify the target date for accomplishment, specific partners and roles for that step, and the primary coordinator. They were also invited to specify programmatic support or other resources that could be helpful, as FHWA was interested in obtaining feedback on the types of support it could provide the Focus State participants to help them initiate program-level performance measurement. An illustrative Action Plan template sketch follows.
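As an illustration of how the framework elements above might be captured in a structured template, the sketch below defines a simple Action Plan record. The field names and example values are hypothetical and are not taken from the workshop materials; each State's Action Plan format will differ.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ActionPlanStep:
    """One implementation step in a State's Action Plan (illustrative fields only)."""
    description: str
    lead_coordinator: str
    partners_and_roles: List[str]
    target_date: str                # e.g., "within 6 months"
    fhwa_support_needed: str = ""   # resources/support FHWA could provide

@dataclass
class ActionPlan:
    """Program-level TIM performance-measure Action Plan (illustrative fields only)."""
    related_objective: str
    performance_measure: str
    implementation_partners: List[str]
    needed_data_and_sources: List[str]
    measurement_tool_or_approach: str
    steps: List[ActionPlanStep] = field(default_factory=list)

# Entirely hypothetical example entry:
plan = ActionPlan(
    related_objective="Reduce roadway clearance time",
    performance_measure="Average roadway clearance time per incident",
    implementation_partners=["State DOT TMC", "State Police", "Towing contractors"],
    needed_data_and_sources=["Awareness and lane-restoration timestamps from CAD and ATMS logs"],
    measurement_tool_or_approach="Monthly report generated from integrated CAD/ATMS event data",
    steps=[ActionPlanStep(
        description="Define common timestamp fields with partner agencies",
        lead_coordinator="State TIM program manager",
        partners_and_roles=["State Police: CAD data export", "DOT: ATMS data export"],
        target_date="within 6 months",
        fhwa_support_needed="Sample data-sharing MOU",
    )],
)
print(plan.performance_measure, "-", len(plan.steps), "step(s) planned")
```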
15. 15 Emerging Common Approaches
Start Small
Integrate CAD Systems
Develop/Expand MOUs with partners
Start Small
Many State plans reflected a “start small” approach by focusing on a single, or relatively limited, geographic area and expanding outward once program-level performance measurement is successfully in place for that area. Florida, Texas, and New York plans all used the term “test bed” to describe this approach.
Integration of CAD Systems
Nearly every State cited the need to define common data elements with their TIM partners and develop methods to share the needed data to implement program-level performance measurement.
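As a purely illustrative sketch of what an agreed set of common data elements might look like, the snippet below defines a minimal shared incident record and writes it to a CSV feed that a partner agency's system could ingest without field mapping. The element names and the CSV exchange format are assumptions for illustration, not a standard adopted by the Focus States.

```python
import csv
import io

# Hypothetical common data elements a CAD system and a TMC/ATMS might agree to exchange.
COMMON_FIELDS = [
    "incident_id", "reporting_agency", "incident_type", "severity",
    "latitude", "longitude", "awareness_time", "responders_notified_time",
    "lanes_restored_time", "scene_cleared_time", "is_secondary",
]

def export_incidents(incidents, stream):
    """Write incident records to a shared CSV feed using the agreed field names."""
    writer = csv.DictWriter(stream, fieldnames=COMMON_FIELDS, extrasaction="ignore")
    writer.writeheader()
    for record in incidents:
        writer.writerow(record)

# Example: a CAD export a partner agency could load directly.
buffer = io.StringIO()
export_incidents(
    [{
        "incident_id": "2006-0001", "reporting_agency": "State Police CAD",
        "incident_type": "crash", "severity": "major",
        "latitude": 32.7767, "longitude": -96.7970,
        "awareness_time": "2006-02-22T07:14:00",
        "responders_notified_time": "2006-02-22T07:18:00",
        "lanes_restored_time": "2006-02-22T07:55:00",
        "scene_cleared_time": "2006-02-22T08:10:00",
        "is_secondary": False,
    }],
    buffer,
)
print(buffer.getvalue())
```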
Development or Expansion of MOUs
At least 7 of the 11 focus States cited or mentioned implementation of MOUs or similar agreements as a component of their Action Plans, reflecting a desire to broaden or expand existing MOUs to include data sharing agreements.
It is anticipated that the experiences of the participating States in this Focus States Initiative with the initial program-level performance measures will yield additional insights on common approaches to implementing performance measurement that may have broad applicability to other States.
16. 16 Issues to Consider In Implementing Program-Level TIM Performance Measures
More clarity and consensus is still needed on a number of factors that influence the ability to evaluate program-level TIM performance, including:
How “normal traffic flow” should be defined (“recovery”)
Varying incident type classifications
Varying incident severity classifications
Criteria for determination that an incident is a “secondary” incident
Value of tracking queue length in understanding TIM performance
Non-DOT and non-law enforcement responders may not collect data that could be needed for program-level analysis at this time.
State agencies and systems may not collect data consistently at this time (may collect slightly different data in different formats).
Time-synchronization between systems within a State may be difficult.
Different means of identifying incident location – Use of Lat/Long and State Plane Coordinates
Incident managers may feel they don't have time to call TMC with updates or to input data into various systems in real-time because they are trying to manage the site.
Time-stamping may not be representative of actual time (validation procedures can correct this but could be time-consuming; see the validation sketch below).
These issues should be addressed further by the NTIMC or other TIM forums to promote consensus and/or ideas to overcome or eliminate the barriers.
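One simple way to flag (though not fix) the time-stamping and synchronization problems listed above is an automated plausibility check applied before records enter the performance-measure calculations. The sketch below uses the same hypothetical field names as the earlier examples; real validation procedures would be agency-specific.

```python
from datetime import datetime

# Expected chronological order of the hypothetical timestamp fields.
EXPECTED_ORDER = ["awareness_time", "responders_notified_time",
                  "lanes_restored_time", "scene_cleared_time"]

def validate_record(record, max_clearance_hours=24):
    """Return a list of data-quality warnings for one incident record."""
    warnings = []
    times = {}
    for field_name in EXPECTED_ORDER:
        value = record.get(field_name)
        if not value:
            warnings.append(f"missing {field_name}")
            continue
        times[field_name] = datetime.fromisoformat(value)

    # Out-of-order timestamps often indicate clock drift between source systems
    # or manual entry errors; flag them for review rather than using them silently.
    present = [f for f in EXPECTED_ORDER if f in times]
    for earlier, later in zip(present, present[1:]):
        if times[later] < times[earlier]:
            warnings.append(f"{later} precedes {earlier}")

    # Implausibly long durations are another common symptom of bad time-stamping.
    if "awareness_time" in times and "scene_cleared_time" in times:
        hours = (times["scene_cleared_time"] - times["awareness_time"]).total_seconds() / 3600
        if hours > max_clearance_hours:
            warnings.append(f"clearance time of {hours:.1f} h exceeds {max_clearance_hours} h threshold")
    return warnings

print(validate_record({
    "awareness_time": "2006-02-22T07:14:00",
    "responders_notified_time": "2006-02-22T07:10:00",  # earlier than awareness: flagged for review
    "lanes_restored_time": "2006-02-22T07:55:00",
    "scene_cleared_time": "2006-02-22T08:10:00",
}))
```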
17. 17 FHWA Support Areas
Technical Support
Funding
Information Exchange (website/webexes)
Follow-up Forums
Training
Case Studies
Technical Support
Most States indicated that implementing the two common performance measures would require inter-agency cooperation, or the establishment of MOUs with partner agencies that define roles and responsibilities for the development of a TIM Program Plan across agencies. Participants from both New York and Wisconsin indicated that obtaining sample or model MOUs would be helpful for formalizing the inter-agency relationships necessary for the implementation of program-level TIM performance measurement.
Funding
Florida, New York, and Wisconsin Action Plans, in particular, identified funding support as their primary need. Not surprisingly, all of the States indicated they also would like increased funding support. The funding support appears to be needed primarily for CAD integration, software modifications, systems engineers and programmers, etc. Common tools, documentation, and/or basic requirements checklists may also be helpful to States in this regard.
Information Exchange (Web Site)
Participants from Utah specifically suggested that having a formal process or mechanism for information exchange, such as a web site, would be helpful. The Utah representatives specifically suggested that such a web site could allow States to share model MOUs, best practices, lessons learned, requests for assistance, and progress reports. New York participants noted that they currently have a web site that could be expanded or leveraged for this purpose (at least to some degree or in the near term). The TIM Community of Practice may offer a means for this in the future as it is developed. Maryland participants also identified information exchange as a need, without identifying a particular format.
Follow-Up Forums
A number of States noted the value of the peer exchange afforded by the workshop and mentioned their desire to have a follow-up workshop in a year that would focus on sharing accomplishments, lessons learned, and next steps. The Action Plans could be used to track and report implementation progress; additional columns could be added to indicate when each particular task has been completed, challenges encountered, and how they were resolved. If delays have been encountered, States could note the reason and the revised implementation schedule.
States could participate in moderated conference calls as a group to review progress on Action Plans, share issues encountered, and offer lessons learned. Off-the-shelf collaborative technologies such as WebEx, Groove, or Microsoft NetMeeting could be used to supplement the calls with visual presentation materials if desired. These tools could also be used to provide virtual one-on-one follow-up or consultative support to the Focus States if desired by FHWA.
Training
Participants from Maryland and New York both identified training as a need, specifically inter-agency training on regional/integrated TIM programs. Opportunities to provide updated or expanded National Highway Institute (NHI) training, including identifying any training requirements that may go beyond the current NHI curriculum (which Florida uses and would like to provide more broadly), may be worth consideration. New York participants suggested that this may be an area where FHWA could provide support that would broadly benefit all States in the context of TIM Program Objective #7, and promote more common approaches to TIM programs and performance measurement.
Case Studies
Numerous States commented on the value of the introductory presentations on TIM multi-agency performance measurement initiatives in Florida, Maryland, and Utah.
18. 18 Future Steps
The Focus States will refine their Action Plans, act to implement program-level performance measures, and share their progress and experiences.
FHWA will coordinate with states for technical support needs.
FHWA will create a forum for the States to exchange their experiences and lessons-learned.
FHWA will work with states to further refine the types of incidents to be evaluated.
FHWA will support definition of safety-related performance measures.
FHWA will coordinate further with States on needed support and will provide telephonic support as needed.
FHWA will create a forum to enable participating focus States to share their experiences implementing program level measures:
What worked and why.
What didn’t work and why.
Lessons learned.
What next.
FHWA will work with participating Focus States to further define the types of incidents that will be measured and evaluated, e.g.:
Major (define criteria, thresholds):
Specific Types.
Minor (define criteria, thresholds):
Specific Types.
FHWA will work with participating Focus States to define program-level safety measures (see Objective #6) and other program-level measures that may be of importance to all States.