Outline for the Day • Building Blocks for Digital Curation Programs • Standards as Frameworks for Action • Lunch 12:00-1:00 • Use and Re-use Over Time • Sustainability, Advocacy, and Engagement • Wrap up and Evaluation. There will be a break in the morning and afternoon.
Goals for Part 3: Use & Re-Use • Become familiar with the concepts and objectives of workflows that enable long-term access • Explore options for managing expectations across the lifecycle • Review the fundamentals of user-based evaluation for assessing and improving services • Understand the benefits of developing a culture of assessment for repository sustainability
Building Blocks re: Use & Re-Use • Conceptual frameworks • Organizational infrastructure • Technological infrastructure • Resource framework • Policy framework • Roles & responsibilities • Stakeholders • Content characteristics • Standards • Holistic workflows • Strategy & planning • Outreach & advocacy • Ongoing evaluation
Workflow • “the sequence of processes through which a piece of work passes from initiation to completion” (Oxford English Dictionary, Second Edition, 1989)
Converging Developments • Tools • Perform functions • Populate workflows • Workflows • Integrated into repositories or standalone • Combine 1 or more tools with human action
Workflow Concepts • Definition of workflow: • Description of practice and procedures • Automation of repetitive tasks • Graphic representation of flow of work • Workflow engine concepts: • Orchestration: composition and execution of new services (definition) • Choreography: interaction/coordinated action between services (description)
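The distinction between orchestration and choreography is easier to see in code. Below is a minimal sketch, assuming a hypothetical ingest workflow; the service names (validate_format, extract_metadata, write_to_storage) are placeholders, not functions from any particular tool.

```python
# Orchestration vs. choreography, illustrated with a toy ingest workflow.
# All function names here are hypothetical placeholders.

def validate_format(item):
    """Check that the file conforms to an accepted format."""
    # ... format identification / validation logic would go here ...
    return item

def extract_metadata(item):
    """Pull descriptive and technical metadata from the item."""
    # ... metadata extraction logic would go here ...
    return item

def write_to_storage(item):
    """Deposit the item and its metadata into the repository."""
    # ... storage / deposit logic would go here ...
    return item

def orchestrate(item, steps):
    """Orchestration: a central controller defines the order and runs each service."""
    for step in steps:
        item = step(item)
    return item

ingest_workflow = [validate_format, extract_metadata, write_to_storage]
orchestrate({"path": "example.tif"}, ingest_workflow)

# Choreography, by contrast, has no central controller: each service reacts to
# events or messages emitted by the others (for example, via a message queue).
```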
Workflows vs. Functions • In practice, the functions have to be pieced together in specific ways that are appropriate to an organizational context or need • If the functions are the verbs, then the workflows are the sentences (or paragraphs...)
Workflows vs. Tasks • Tasks are discrete • Workflows link tasks in a logical fashion • Workflows depend upon interoperability
Workflow Influences • Critical path method (project management) • List all activities • Determine time (duration) for completion • Identify dependencies between activities • Process Improvement examples: • Six Sigma • Total Quality Management (TQM) • Business Process Reengineering
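The critical path method reduces to finding the longest chain of dependent activities. The sketch below uses a hypothetical digitization workflow; the activity names, durations, and dependencies are invented for illustration.

```python
from functools import lru_cache

# Hypothetical digitization activities: estimated days and dependencies.
durations = {"select": 2, "scan": 5, "metadata": 3, "qa": 2, "publish": 1}
depends_on = {
    "select": [],
    "scan": ["select"],
    "metadata": ["select"],
    "qa": ["scan", "metadata"],
    "publish": ["qa"],
}

@lru_cache(maxsize=None)
def earliest_finish(activity):
    """Longest path (in days) from project start through this activity."""
    start = max((earliest_finish(d) for d in depends_on[activity]), default=0)
    return start + durations[activity]

# Project duration = latest finish among all activities (10 days here);
# the critical path is select -> scan -> qa -> publish.
print(max(earliest_finish(a) for a in durations))
```

The same structure can be extended to compute slack for each activity, which identifies where a schedule can absorb delay.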
Considerations • Purpose • As is: document what is happening now • To be: document what should happen • Right-sized • Appropriate granularity for problem, setting • Extent and type of documentation • Maintenance • changes in staff, roles • New or changed functions • New tools and common workflows
Discussion 3: Archival Scenario -- What kinds of risks do archivists need to avoid/mitigate to preserve authentic digital records?
Evaluation • “A systematic process for ascertaining whether and why an object of study meets the goals envisioned for that object.” • Gary Marchionini & Gregory Crane, “Evaluating Hypermedia and Learning: Methods and Results from the Perseus Project,” Transactions on Information Systems (TOIS) , Vol. 12 Issue 1 (January 1994): 6.
Assessment vs. Evaluation Academics tend to use the term “evaluation” for this constellation of activities, while administrators in higher education and libraries increasingly use the term “assessment” when thinking about their programs. In both cases, we want to know how well we are doing and to ground that judgment in measurable data.
High-Level Rationale for Evaluation & Assessment Assessment is the basis of self-understanding and improvement. Sharing results through publication leads to profession-wide benchmarks and to understanding and improvement across the field. A culture of assessment can arise when it is fostered by administrators and managers.
Rationale • The ability to accurately compare and contrast program metrics with those of like institutions helps to • set standards for services • assist in planning for improvements to those services • The ultimate goal of these projects was to provide archivists with data collection and analysis tools to support decision-making.
Elements of Evaluation/Assessment • Goals and objectives of the object, system, or process under study • Evaluators, often other humans, i.e., staff and users of the system • Methodology • Data • Comparison of results with the goals for the object, event, or process under study • Conclusions
Evaluation Issues • Evaluation is a complex process and requires some level of training • It takes time and resources • No single golden method exists; multiple methods yield the best view of “reality” • Rigorous sampling is essential: a study is only as good as its sample • Quantitative / qualitative: an artificial dichotomy? • Privacy and Institutional Review Board (IRB) approval
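Where sampling is needed, even a few lines of scripting can make the draw reproducible and defensible. The sketch below assumes a hypothetical registration list; the file name and column layout are placeholders.

```python
import csv
import random

def sample_users(path, n, seed=42):
    """Read a registration list (one email per row) and return n users at random."""
    with open(path, newline="") as f:
        users = [row[0] for row in csv.reader(f) if row]
    random.seed(seed)               # fixed seed so the draw can be reproduced
    return random.sample(users, min(n, len(users)))

# Example: invite 50 randomly selected registrants to complete a survey.
invitees = sample_users("reading_room_registrations.csv", 50)
```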
What Is a Culture of Assessment? • “An organizational environment in which decisions are based on facts, research and analysis, and where services are planned and delivered in ways that maximize positive outcomes and impacts for customers and stakeholders.” • Amos Lakos & Shelley Phipps, “Creating a Culture of Assessment,” portal: Libraries and the Academy, Vol. 4, No. 3 (2004), pp. 345–361.
Why a Culture of Assessment? Assessment, based on measurable, reliable, and valid data, is a pathway to sustainability. Assessment metrics make the case for value and continuing funding. User-based assessment metrics are an advocacy tool, drawing in the user, the collections donor, and the resource allocator.
Creating a Culture of Assessment • User-based evaluation needs to be established on concepts that are specific to archives and special collections. • If archivists do not do this for themselves, someone else will. • Archival Metrics Toolkits developed by and for archivists • Collaboration between researchers and practitioners.
User-Based Evaluation • Tells us how users view our constellation of services, service delivery, and resources, such as collections • Is user-centric rather than collection-based • Can tell us about fulfillment of needs as well as user satisfaction • Helps information professionals better allocate their resources for improved performance and user satisfaction • Few user-based studies have been conducted and fewer published, especially from archives and museums, so there is much to learn • There is little comparability across the studies that have been conducted
Evaluation of Digital Projects • Designed to: • assess what has been achieved with large investments of resources • improve current provision (e.g., usability testing, interface design) • provide public accountability (often public funds are used) • fulfill requirements of funding bodies
Evaluation of Digital Projects • Helps to show who the users are and what they are doing/requesting • Measures the project’s/program’s impact (e.g., on teaching, research, lifelong learning) • Informs future developments and serves as a basis for future projects or grants
And…. • No good understanding yet of the use of digital objects in collections. • We don’t know what people want/need in terms of: • The collection, • Services, or • What constitutes good service.
Archival Metrics Toolkits • Developed to meet the needs of archivists in evaluating their services to specific user groups: • Researcher • Archival Website • Online Finding Aids • Student Researcher • Teaching Support
Discussion 4: Archival Scenario -- What types of evaluation have you conducted in your repository? What types would you like to conduct?