Factors Affecting Website Reconstruction from the Web Infrastructure
Frank McCown, Norou Diawara, and Michael L. Nelson
Old Dominion University, Computer Science Department, Norfolk, Virginia, USA
JCDL 2007, Vancouver, BC, June 20, 2007
Outline
• Web-repository crawling with Warrick
• How successful is a reconstruction?
• Reconstruction experiment
• Significant findings
Image credits:
Black hat: http://img.webpronews.com/securitypronews/110705blackhat.jpg
Virus image: http://polarboing.com/images/topics/misc/story.computer.virus_1137794805.jpg
Hard drive: http://www.datarecoveryspecialist.com/images/head-crash-2.jpg
Cached PDF: http://www.fda.gov/cder/about/whatwedo/testtube.pdf
[Figure: the canonical PDF alongside the Google, Yahoo, and MSN cached versions]
• McCown et al., "Brass: A Queueing Manager for Warrick," IWAW 2007.
• McCown et al., "Factors Affecting Website Reconstruction from the Web Infrastructure," ACM/IEEE JCDL 2007.
• McCown and Nelson, "Evaluation of Crawling Policies for a Web-Repository Crawler," HYPERTEXT 2006.
• McCown et al., "Lazy Preservation: Reconstructing Websites by Crawling the Crawlers," ACM WIDM 2006.
Available at http://warrick.cs.odu.edu/
Measuring the Difference
• Apply a recovery vector (r_c, r_m, r_a) to each resource: changed, missing, added
• Compute a difference vector for the website by aggregating the per-resource recovery vectors
Some Difference Vectors
D = (changed, missing, added)
• (0, 0, 0) – perfect recovery
• (1, 0, 0) – all resources are recovered but changed
• (0, 1, 0) – all resources are lost
• (0, 0, 1) – all recovered resources are at new URIs
A short code sketch of these vectors follows.
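A minimal sketch of how the recovery and difference vectors could be computed, assuming each resource in a reconstruction has already been labeled identical, changed, missing, or added. The label names and the plain averaging are illustrative assumptions, not Warrick's actual bookkeeping, and the paper's exact normalization may differ.

```python
# Illustrative sketch only: label names and the simple averaging are
# assumptions, not the paper's exact normalization.
RECOVERY = {
    "identical": (0, 0, 0),  # recovered unchanged
    "changed":   (1, 0, 0),  # recovered, but content differs
    "missing":   (0, 1, 0),  # not recovered at all
    "added":     (0, 0, 1),  # recovered at a new URI
}

def difference_vector(labels):
    """Aggregate per-resource recovery vectors into D = (d_c, d_m, d_a)."""
    vectors = [RECOVERY[label] for label in labels]
    n = len(vectors)
    return tuple(sum(component) / n for component in zip(*vectors))

# 2 unchanged, 1 changed, 1 missing -> D = (0.25, 0.25, 0.0)
print(difference_vector(["identical", "identical", "changed", "missing"]))
```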
How Much Change is a Bad Thing?
[Figure: proportions of lost vs. recovered resources]
Assigning Penalties
• Penalty adjustment (P_c, P_m, P_a)
• Apply to each resource, or to the difference vector
Defining Success
• success = 1 − d_m, equivalent to the percentage of recovered resources
• Ranges from 0 (less successful) to 1 (more successful)
A penalty-weighted sketch follows.
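A sketch of the success measure, generalized with the penalty weights from the previous slide. With P = (0, 1, 0) it reduces to success = 1 − d_m as defined above; the weighted form is an assumption about how the penalties would be applied.

```python
def success(d, penalties=(0.0, 1.0, 0.0)):
    """success = 1 - (P_c*d_c + P_m*d_m + P_a*d_a); defaults give 1 - d_m."""
    d_c, d_m, d_a = d
    p_c, p_m, p_a = penalties
    return 1.0 - (p_c * d_c + p_m * d_m + p_a * d_a)

print(success((0.25, 0.25, 0.0)))                    # 0.75: 75% of resources came back
print(success((0.25, 0.25, 0.0), (0.5, 1.0, 0.0)))   # 0.625: changed copies also penalized
```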
Reconstruction Experiment
• 300 websites chosen randomly from the Open Directory Project (dmoz.org)
• Crawled and reconstructed each website every week for 14 weeks
• Examined change rates, age, decay, growth, recoverability
Success of website recovery each week
[Figure: weekly recovery success over the 14 weeks]
*On average, we recovered 61% of a website on any given week.
Which Factors Are Significant?
• External backlinks
• Internal backlinks
• Google's PageRank
• Hops from root page
• Path depth
• MIME type
• Query string params
• Age
• Resource birth rate
• TLD
• Website size
• Size of resources
Mild Correlations
• Hops and:
  • website size (0.428)
  • path depth (0.388)
• Age and # of query params (−0.318)
• External links and:
  • PageRank (0.339)
  • website size (0.301)
  • hops (0.320)
A sketch of how such correlations could be computed follows.
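These pairwise correlations could be reproduced along the following lines, assuming the measured factors had been exported to a CSV. The file name and column names are hypothetical; the talk's dataset is not reproduced here.

```python
import pandas as pd

# Hypothetical export of the per-website measurements; the file and
# column names are assumptions for illustration.
df = pd.read_csv("reconstruction_factors.csv")
factors = ["hops", "path_depth", "age", "query_params",
           "external_links", "pagerank", "website_size"]
corr = df[factors].corr()  # Pearson correlation matrix
print(corr.round(3))       # e.g. corr.loc["hops", "website_size"] ~ 0.428
```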
Regression Analysis
• No surprises: all variables are significant, but the overall model explains only about half of the observations
• Three most significant variables: PageRank, hops, and age (R² = 0.1496)
A regression sketch follows.
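In the same hypothetical setting, the regression on the three strongest predictors might look like this, using statsmodels OLS. The `success` column is assumed to hold each website's recovery success from the measure defined earlier.

```python
import statsmodels.api as sm

# df as in the previous sketch; "success" is an assumed column holding
# each website's recovery success (1 - d_m).
X = sm.add_constant(df[["pagerank", "hops", "age"]])
model = sm.OLS(df["success"], X).fit()
print(model.rsquared)  # the talk reports R^2 = 0.1496 for these three
print(model.params)
```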
Conclusions
• Most of the sampled websites were relatively stable
  • One third of the websites never lost a single resource
  • Half of the websites never added any new resources
• The typical website can expect to get back 61% of its resources if it were lost today (77% textual, 42% images, 32% other)
• How to improve recovery from the WI? Improve PageRank, decrease the number of hops to resources, create stable URLs
Thank You
"Sorry, Dad… You lost me in the first two minutes."
Frank McCown
fmccown@cs.odu.edu
http://www.cs.odu.edu/~fmccown/
Injecting Server Components into Crawlable Pages
• Encode server components with erasure codes and embed the blocks in the site's HTML pages
• Recovering at least m blocks from the WI is enough to reconstruct the component
A toy sketch follows.
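A toy illustration of the idea, assuming a single XOR parity block (n = m + 1, so any one lost block is tolerable). A real deployment would use a stronger code such as Reed-Solomon to survive more losses; everything here is a sketch, not the talk's actual scheme.

```python
from functools import reduce

def xor(blocks):
    """Bytewise XOR of equal-length byte blocks."""
    return bytes(reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks))

def encode(data: bytes, m: int):
    """Split data into m padded blocks plus one XOR parity block."""
    size = -(-len(data) // m)  # ceiling division
    blocks = [data[i*size:(i+1)*size].ljust(size, b"\0") for i in range(m)]
    return blocks + [xor(blocks)]

def decode(blocks, m):
    """Recover the original from any m of the m+1 blocks (None = lost)."""
    data = list(blocks[:m])
    if None in data:  # rebuild the single missing data block from the rest
        i = data.index(None)
        data[i] = xor([b for b in blocks if b is not None])
    return b"".join(data).rstrip(b"\0")

component = b"#!/usr/bin/perl ... (a server-side script to protect)"
blocks = encode(component, m=4)
blocks[2] = None  # the page holding this block was never cached
assert decode(blocks, m=4) == component
```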
Web Server
• Recoverable from the Web Infrastructure: static files (HTML files, PDFs, images, style sheets, JavaScript, etc.)
• Not recoverable: server configuration, Perl scripts, dynamic page source, the database
A small classification sketch follows.
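A hedged sketch of that split: classify a file on the server as cache-recoverable or not by extension. The extension sets are illustrative, not a complete rule; a real policy would also account for served representations of dynamic pages.

```python
import os

# Illustrative extension sets; these are assumptions, not an exhaustive rule.
CRAWLABLE = {".html", ".htm", ".pdf", ".jpg", ".png", ".gif", ".css", ".js"}

def recoverable_from_wi(path: str) -> bool:
    """True if the file's served form could end up in a search-engine cache."""
    ext = os.path.splitext(path)[1].lower()
    return ext in CRAWLABLE

for p in ["index.html", "logo.png", "search.cgi", "httpd.conf", "site.sql"]:
    print(p, "->", "recoverable" if recoverable_from_wi(p) else "not recoverable")
```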