
CWI Amsterdam The Netherlands

Explore a new documentary video authoring process that includes annotation, automatic editing, and interactive distribution. Learn about the Semantic Graph generation, Automatic Linking Process, and the Rhetorical Argumentation model. Find out how the Thesaurus relations influence the creation of different arguments within video content. Discover the potential of this innovative approach for creating multiple video versions and supporting authors in documentary production.


Presentation Transcript


  1. CWI Amsterdam The Netherlands Supporting the Generation of Argument Structure within Video Sequences

  2. Talk Outline • The motivation and vision of the work • What is needed • Annotations • Editing Process • Editor Support • Conclusions

  3. Existing Documentaries • Traditional video authoring: the footage is selected and edited for the final cut • There is only one final version, and what is shown is the choice of the author/editor • The material can be very rich and controversial (e.g. Voices of Iraq)

  4. New Paradigm • Proposed video authoring: • Annotate the semantics of the video material • Edit it automatically, selecting what the user asks to see • Use the Web as an interactive distribution medium • More than a sequence of matching video fragments: • Argumentation/rhetoric • Narrative

  5. Video material • Interview with America: video footage with interviews and background material about the opinions of American people after 9/11 www.interviewwithamerica.com • Annotations: 1 hour annotated, 15 interviews, 60 interview segments, 120 statements

  6. What do you think of the war in Afghanistan?

  7. Example Explained [Diagram linking four annotated statements: "Two billion dollar bombs on tents" (Claim), "I cannot think of a more effective solution" (Claim), "I am not a fan of military actions" (Concession), "War has never solved anything" (Claim); the links between them are labelled contradict, weaken, and support.]

  8. The annotations • Rhetorical: argumentation following the Toulmin model; rhetorical statements are mostly verbal, but visual statements are also possible • Descriptive: the question asked, the interviewee (social information) • Filmic: e.g. location/time/framing/gaze
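
  To make the three annotation levels concrete, here is a minimal sketch of one annotated segment, assuming Python; the class and field names are illustrative and not taken from the original annotation schema.

    # Minimal sketch of one annotated video segment, covering the three
    # annotation levels named on the slide. Field names are assumptions.
    from dataclasses import dataclass

    @dataclass
    class SegmentAnnotation:
        # Rhetorical level (Toulmin model)
        toulmin_role: str      # e.g. "Claim", "Concession", "Warrant"
        # Descriptive level
        question: str          # the question asked to the interviewee
        interviewee: str       # social information about the speaker
        # Filmic level
        location: str
        time: str
        framing: str           # e.g. "close-up"
        gaze: str              # e.g. "towards camera"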

  9. Encode statements • Each statement is formally annotated as <subject> <modifier> <predicate>, e.g. "war best solution" • A thesaurus contains: terms for each part (155 in total) and relations between terms: similar (72), opposite (108), generalization (10), specialization (10), e.g. "war opposite diplomacy" • Relations in the thesaurus determine the link type between statements
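
  A minimal sketch of this encoding, assuming Python; the class name, the representation of the thesaurus as (term, relation, term) triples, and the second and third example entries are assumptions for illustration.

    # Statement as a <subject> <modifier> <predicate> triple, plus a small
    # thesaurus of term-to-term relations (sketch; not the original data format).
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Statement:
        subject: str    # e.g. "war"
        modifier: str   # e.g. "best"
        predicate: str  # e.g. "solution"

    # Relation types: "similar", "opposite", "generalization", "specialization".
    thesaurus = [
        ("war", "opposite", "diplomacy"),   # from the slide
        ("best", "opposite", "not"),        # assumed entry for illustration
        ("war", "similar", "bombing"),      # assumed entry for illustration
    ]

    war_best_solution = Statement("war", "best", "solution")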

  10. Automatic Linking Process • STEP 1: Using the thesaurus, generate related statements by iteratively replacing terms, e.g. from "war best solution" derive "diplomacy best solution" and "war not solution" • STEP 2: Query the repository to see whether the generated statement is present
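
  A sketch of the two steps, reusing the Statement class and thesaurus list from the previous sketch and assuming the repository is simply a collection of annotated statements; it shows a single replacement pass, whereas the slides mention running up to three iterations off-line.

    # Sketch of the Automatic Linking Process (not the original implementation).
    from dataclasses import replace

    def related_terms(term, thesaurus):
        """All terms the thesaurus relates to `term` (relations treated as symmetric)."""
        for a, rel, b in thesaurus:
            if a == term:
                yield rel, b
            elif b == term:
                yield rel, a

    def generate_related(stmt, thesaurus):
        """STEP 1: derive candidate statements by replacing one term at a time."""
        for field in ("subject", "modifier", "predicate"):
            for rel, new_term in related_terms(getattr(stmt, field), thesaurus):
                yield rel, replace(stmt, **{field: new_term})

    def link(stmt, repository, thesaurus):
        """STEP 2: keep only the candidates actually present in the repository."""
        return [(rel, cand) for rel, cand in generate_related(stmt, thesaurus)
                if cand in repository]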

  11. Connect statements • Create a graph of related statements • Nodes are the statements (video segments), edges are either support or contradict [Graph figure: ten nodes S0–S9; legend: support and contradict edges]
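
  A sketch of how the Semantic Graph could be assembled from the linking step above; the mapping from thesaurus relation to edge type (similar to support, opposite to contradict) is an assumption, since the slides only say that the relations determine the link type.

    # Sketch: build a list of typed edges between statements in the repository.
    # Reuses link() and the thesaurus from the previous sketches.
    EDGE_TYPE = {"similar": "support", "opposite": "contradict"}  # assumed mapping

    def build_graph(repository, thesaurus):
        edges = []
        for stmt in repository:
            for rel, other in link(stmt, repository, thesaurus):
                if rel in EDGE_TYPE:
                    edges.append((stmt, EDGE_TYPE[rel], other))
        return edges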

  12. Author/Annotator support • The capability of generating different arguments depends on the quality of the Semantic Graph • Statements (and corresponding video segments) that are not connected are lost for generation • In our case: out of 118 statements, 54 were not connected • Measure the performance of the automatic linking process
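
  Finding the statements that are lost for generation then amounts to checking which nodes have no incident edge; a sketch, reusing the edge list produced by build_graph above.

    # Sketch: statements that appear in no edge cannot be reached by the
    # argument generation and are effectively lost.
    def unconnected(repository, edges):
        linked = {s for s, _, t in edges} | {t for s, _, t in edges}
        return [stmt for stmt in repository if stmt not in linked]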

  13. Indices for statements • Measure how many statements are generated from a given one: depends on the quantity of the relations in the thesaurus • Measure how many generated statements are present in the repository: depends on the correctness of the relations in the thesaurus
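
  A sketch of the two per-statement indices, reusing generate_related from the linking sketch; the exact counting in the original system may differ.

    # Per-statement indices: how many candidate statements the thesaurus
    # generates (quantity of relations) and how many of them actually exist
    # in the repository (correctness of relations).
    def statement_indices(stmt, repository, thesaurus):
        generated = [cand for _, cand in generate_related(stmt, thesaurus)]
        found = [cand for cand in generated if cand in repository]
        return len(generated), len(found)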

  14. Index for relations • Measure how a particular relation in the thesaurus is performing: if a generated statement is present in the repository, the relations used to generate it get one point on their hit score, otherwise one point on their miss score • The hit/miss ratio gives an idea of the semantic accuracy of the relation with respect to the repository
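
  A sketch of the per-relation hit/miss bookkeeping, again reusing generate_related; the guard against division by zero is an assumption, as the slides do not say how relations with no misses are handled.

    # Per-relation index: a relation scores a hit when a statement it helped
    # generate is present in the repository, and a miss otherwise; the
    # hit/miss ratio approximates its semantic accuracy for this repository.
    from collections import Counter

    def relation_scores(repository, thesaurus):
        hits, misses = Counter(), Counter()
        for stmt in repository:
            for rel, cand in generate_related(stmt, thesaurus):
                (hits if cand in repository else misses)[rel] += 1
        return {rel: hits[rel] / max(misses[rel], 1)
                for rel in set(hits) | set(misses)}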

  15. Current/future Work • Automatic relation suggestion: start with a fully connected thesaurus and keep only the best relations; suggest the best relation to add to the existing ones • Linking Process tuning: currently 3 iterations for performance reasons, but the process runs off-line, so more iterations are possible • Different repositories (VJ project)

  16. Conclusions • New documentary production mechanism with multiple versions • Different authoring: the author no longer has full control • Authoring support is needed

  17. Questions? Thanks for your attention

  18. Pointers & Acknowledgments This presentation and Demo available at: http://www.cwi.nl/~media/demo/IWA/ This research was funded by the Dutch national ToKeN2000 I2RP and CHIME projects.

  19. Author/Annotator support • Provide means to measure the performance of the creation of the Semantic Graph • Reengineer the Semantic Graph generation: • Changing annotations • Changing relations in the Thesaurus

  20. What do you think of the war in Afghanistan? I am not a fan of military actions I cannot think of a more effective solution War has never solved anything Two billion dollar bombs on tents

  21. Toulmin model • Elements: Data, Claim, Qualifier, Warrant, Condition, Backing, Concession • In the annotated material: 57 Claims, 16 Data, 4 Concessions, 3 Warrants, 1 Condition
