
Presentation Transcript


  1. Outline for the Day • Building Blocks for Digital Curation Programs • Standards as Frameworks for Action • Lunch 12:00-1:00 • Use and Re-use Over Time • Sustainability, Advocacy, and Engagement • Wrap up and Evaluation. There will be a break in the morning and afternoon.

  2. Goals for Part 3: Use & Re-Use • Become familiar with the concepts and objectives of workflows that enable long-term access • Explore options for managing expectations across the lifecycle • Review the fundamentals of user-based evaluation for assessing and improving services • Understand the benefits of developing a culture of assessment for repository sustainability

  3. Building Blocks re: Use & Re-Use • Conceptual frameworks • Organizational infrastructure • Technological infrastructure • Resource framework • Policy framework • Roles & responsibilities • Stakeholders • Content characteristics • Standards • Holistic workflows • Strategy & planning • Outreach & advocacy • Ongoing evaluation

  4. Effective Workflows

  5. DigCCurr: Transition Point in Life of Digital Object

  6. Work Flow “the sequence of processes through which a piece of work passes from initiation to completion” (Oxford English Dictionary, Second Edition, 1989)

  7. Converging Developments • Tools • Perform functions • Populate workflows • Workflows • Integrated into repositories or standalone • Combine 1 or more tools with human action

  8. Workflow Concepts • Definition of workflow: • Description of practice and procedures • Automation of repetitive tasks • Graphic representation of flow of work • Workflow engine concepts: • Orchestration: composition and execution of new services (definition) • Choreography: interaction/coordinated action between services (description)

  9. Workflows vs. Functions • In practice, the functions have to be pieced together in specific ways that are appropriate to an organizational context or need • If the functions are the verbs, then the workflows are the sentences (or paragraphs...)

  10. Workflows vs. Tasks • Tasks are discrete • Workflows link tasks in a logical fashion • Workflows depend upon interoperability
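A minimal sketch of this idea, in Python: three discrete tasks are linked into a simple ingest workflow, with the output of one task feeding the next. The file name, field names, and the format-identification shortcut are hypothetical placeholders for illustration, not an implementation from the presentation.

```python
import hashlib
from pathlib import Path

def fixity_task(path: Path) -> dict:
    """Discrete task: compute a SHA-256 checksum for one file."""
    return {"path": str(path), "sha256": hashlib.sha256(path.read_bytes()).hexdigest()}

def identify_task(record: dict) -> dict:
    """Discrete task: record a rough format clue; a real workflow would
    call a format-identification tool here instead of using the extension."""
    record["format_hint"] = Path(record["path"]).suffix.lower()
    return record

def describe_task(record: dict) -> dict:
    """Human-action task: a curator supplies a title interactively."""
    record["title"] = input(f"Title for {record['path']}: ")
    return record

def ingest_workflow(path: Path) -> dict:
    """Workflow: links the discrete tasks in a logical order; each task
    passes a shared record to the next, which is where interoperability matters."""
    record = fixity_task(path)
    record = identify_task(record)
    record = describe_task(record)
    return record

if __name__ == "__main__":
    print(ingest_workflow(Path("example.tif")))  # hypothetical file name
```

Any one task could be swapped for a different tool without changing the workflow, which mirrors the point above: tasks are discrete, and the workflow supplies the logical ordering.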

  11. Workflow Influences • Critical path method (project management) • List all activities • Determine time (duration) for completion • Identify dependencies between activities • Process Improvement examples: • Six Sigma • Total Quality Management (TQM) • Business Process Reengineering
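To make the critical path method concrete, here is a minimal Python sketch; the activity names, durations, and dependencies are invented for illustration. It lists the activities, assigns each a duration, records its dependencies, and computes the earliest possible finish time implied by the longest dependency chain.

```python
from functools import lru_cache

# Hypothetical activities: name -> (duration in days, prerequisite activities)
ACTIVITIES = {
    "appraise":   (2, ()),
    "transfer":   (1, ("appraise",)),
    "virus_scan": (1, ("transfer",)),
    "describe":   (3, ("transfer",)),
    "ingest":     (1, ("virus_scan", "describe")),
}

@lru_cache(maxsize=None)
def earliest_finish(activity: str) -> int:
    """Earliest finish = own duration + latest earliest finish among prerequisites."""
    duration, deps = ACTIVITIES[activity]
    return duration + max((earliest_finish(d) for d in deps), default=0)

# The activity with the largest earliest finish marks the end of the critical path.
end = max(ACTIVITIES, key=earliest_finish)
print(f"Minimum duration: {earliest_finish(end)} days (critical path ends at '{end}')")
```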

  12. Considerations • Purpose • As is: document what is happening now • To be: document what should happen • Right-sized • Appropriate granularity for problem, setting • Extent and type of documentation • Maintenance • Changes in staff or roles • New or changed functions • New tools and common workflows

  13. Examples

  14. PANIC, 2003-2004

  15. version 0.10 in 2013

  16. Discussion 3: Archival Scenario -- What kinds of risks do archivists need to avoid/mitigate to preserve authentic digital records?

  17. Evaluation & Assessment

  18. Evaluation • “A systematic process for ascertaining whether and why an object of study meets the goals envisioned for that object.” • Gary Marchionini & Gregory Crane, “Evaluating Hypermedia and Learning: Methods and Results from the Perseus Project,” ACM Transactions on Information Systems (TOIS), Vol. 12, Issue 1 (January 1994): 6.

  19. Assessment vs. Evaluation Academics tend to use the term “evaluation” for this constellation of activities. Administrators in higher education and libraries are coming to use the term “assessment” when thinking about their programs. In both cases, we want to know how well we are doing and base that assessment in measurable data.

  20. High-Level Rationale for Evaluation & Assessment Assessment is the basis of self-understanding and improvement. Sharing results through publication leads to profession-wide benchmarks and overall understanding and improvement. A culture of assessment can arise when fostered by administrators and managers.

  21. Rationale • The ability to accurately compare and contrast program metrics with those of like institutions helps to • set standards for services • assist in planning for improvements to those services. • The ultimate goal of projects such as Archival Metrics was to provide archivists with data collection and analysis tools to support decision-making.

  22. Elements of Evaluation/Assessment Goals and objectives of the object, system, process, etc. under study. Evaluators: often other humans, e.g., staff and users of the system. Methodology. Data. Comparison of findings against the goals for the object, event, or process under study. Conclusions.

  23. Evaluation Issues Complex process. Requires some level of training. Takes time and resources. No single golden method exists. Multiple methods yield best view of “reality.” Rigorous sampling is essential – study is only as good as the sample. Quantitative / qualitative: an artificial dichotomy? Privacy and Institutional Review Board (IRB) approval.
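As one small, hypothetical illustration of the sampling point (the file name, column name, and sample size below are assumptions, not part of any toolkit), a simple random sample of registered researchers for a survey might be drawn like this in Python:

```python
import csv
import random

SAMPLE_SIZE = 50
random.seed(2024)  # record the seed so the draw can be documented and repeated

# Hypothetical registration log with one row per reading-room visit
with open("reading_room_registrations.csv", newline="") as f:
    researchers = sorted({row["email"] for row in csv.DictReader(f)})  # de-duplicate repeat visitors

# Simple random sample: every registered researcher has an equal chance of selection
sample = random.sample(researchers, min(SAMPLE_SIZE, len(researchers)))
for email in sample:
    print(email)
```

Documenting the frame (the registration log), the de-duplication step, and the seed is part of what makes the sampling rigorous and the study repeatable.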

  24. Building a Culture of Assessment

  25. What Is a Culture of Assessment? • “An organizational environment in which decisions are based on facts, research and analysis, and where services are planned and delivered in ways that maximize positive outcomes and impacts for customers and stakeholders.” • Amos Lakos & Shelley Phipps, “Creating a Culture of Assessment,” portal: Libraries and the Academy, Vol. 4, No. 3 (2004), pp. 345–361.

  26. Why a Culture of Assessment? Assessment, based on measurable, reliable, and valid data, is a pathway to sustainability. Assessment metrics make the case for value and continuing funding. User-based assessment metrics are an advocacy tool, drawing in the user, the collections donor, and the resource allocator.

  27. Creating a Culture of Assessment • User-based evaluation needs to be established on concepts that are specific to archives and special collections. • If archivists do not do this for themselves, someone else will. • Archival Metrics Toolkits developed by and for archivists • Collaboration between researchers and practitioners.

  28. User-Based Evaluation

  29. User-Based Evaluation Tells us how users view our constellation of services, service delivery, and resources, such as collections. Is not collection-based but user-centric. Can tell us about fulfillment of needs as well as user satisfaction. Helps information professionals better allocate their resources for improved performance and user satisfaction. Few user-based studies have been conducted and fewer published, especially from archives and museums, so there is much to learn. There is little comparability across the studies that have been conducted.

  30. Evaluation of Digital Projects • Designed to: • assess what has been achieved with large investments of resources. • improve current provision (e.g. usability testing, interface design). • provide public accountability (often public funds are used). • fulfill requirements of funding bodies.

  31. Evaluation of Digital Projects Helps to show who the users are and what they are doing/requesting. Measures the project’s/program’s impact (e.g. on teaching, research, lifelong learning). Informs future developments and serves as a basis for future projects or grants.

  32. And…. • No good understanding yet of the use of digital objects in collections. • We don’t know what people want/need in terms of: • The collection, • Services, or • What constitutes good service.

  33. Archival Metrics Projects

  34. www.archivalmetrics.org

  35. Archival Metrics Toolkits • Developed to meet the needs of archivists in evaluating their services to specific user groups: • Researcher • Archival Website • Online Finding Aids • Student Researcher • Teaching Support

  36. Discussion 4: Archival Scenario -- What types of evaluation have you conducted in your repository? What types would you like to conduct?
