Sociotechnical production systems for software in science
James Howison (School of Information, University of Texas at Austin) and Jim Herbsleb (Institute for Software Research, School of Computer Science, Carnegie Mellon University)
http://james.howison.name/pubs/HowisonHerbsleb2011SciSoftIncentives.pdf
First find some ice Image Credit: NASA
Build a big drill Image Credit: IceCube
and some Digital Optical Modules Image Credit: IceCube
Combine Image Credit: IceCube
Collect and filter data Image Credit: IceCube
Store and analyze it Image Credit: http://www.flickr.com/photos/theplanetdotcom
Simulate light in ice Image Credit: http://www.flickr.com/photos/rainman_yukky/
Simulate Atmosphere Image Credit: NASA
An appealing vision of software … • Enhancing reproducibility and correctness • Saving money • Driving innovation • Coalescing into widely used software platforms • All linked to software as information artifact: re-playable, re-usable, extendable
Yet software also has constraints • Maintenance (avoiding “bit rot”): software must be maintained (“synchronization work”), kept in sync with complements and dependencies • Coordination: rapid development and changes can lead to breakdown • Path dependencies: easy to start, hard to architect for widespread use
How to achieve the Software Vision? • Better technologies? • Better engineering methods? • Leadership/Norms/Ethics? • Policy? • Rewards?
A sociotechnical understanding • Understand software work in the existing institutions of science • Specific research questions: • What software is used? • Who creates and maintains it? • What incentives drive its creation? • Why is it trusted?
Method: Data • Route into complex practice: chose the paper as the unit of analysis (the “focal paper”) • Trace back from the paper to the work that produced it • Semi-structured interviews • Supported by artifacts (e.g., the paper’s methods and materials section) • Elicit the workflow, focusing on software work • Identify software authors/sources, and seek introductions • Qualitative analysis, continued to phenomenological exhaustion
Case 1: STAR Image Credit: RHIC
Software Production • Employed core software development: professional software developers, the ROOT4STAR framework, core simulation code • Scientists undertaking “service work”: analysis code “to get the plots”, locally written, frozen at publication
Case 3: Bioinformatic microbiology Image Credit: http://www.flickr.com/photos/grytr
Studying the nitrogen cycle Image Credit: Focal Paper
Personal software infrastructure • “Power user scripts” • Personal competitive advantage: “that is something that most biologists can’t do. period.” • Shares methods, but does not share personal infrastructure code or actively support others • The methods and materials section should provide enough information; if not, he’ll fix it • But he is not going “to do their homework for them”
“Publishing on” software • Tools potentially useful to others described in separate publications, “Software pubs” • Ambivalence: • Can you make a career out of this? “Definitely” • But: “he’s known for his software rather than his science … he’s known for facilitating science rather than … and some people have that reputation” • Advise a student to do this? • “Yes, but … if you happen to get a publication out of it and it becomes a tool that’s widely used, then great, that’s fantastic, better props for you … but there’s a danger … Tool developers are greatly under-appreciated”
Algorithm people • Self-described member of the “algorithm people,” as distinguished from biologists • Muscle: “biology == strcmp()” • Builds from scratch (“avoid tricky dependencies”) • “Obvious” that they don’t collaborate • Credit accrues to the “original publications” • Little credit for perceived incremental improvements • Politics of improvement acceptance: “at the mercy of” • Competition is seen as appropriate and productive
Software production systems • Practice that is similar across four aspects: • Incentives for the work • The type of artifacts produced • The way the work is organized • The logic of correctness
Systemic threats to the software vision • The type of software work needed to realize the cyberinfrastructure vision is poorly motivated • “Invisible work” (Star and Ruhleder) • Especially, little incentive to collaborate • Projects are “owned” by their initial creators • Initial publications receive the citations • Extension is dominated by fork-and-rename
Academic reputation and integration James Howison and Jim Herbsleb (2013) Sharing the spoils: incentives and integration in scientific software production. ACM CSCW
Where to for science policy? • Exhortations? • Training? • Forcing “open source” through funding lever? • Risk of substituting logics of correctness • “Kleenex” code as open source? • Risk of undermining appropriate competition • Turn scientists into open source community managers? • When there is little reward for this work?
Scientific Software Network Map • Imagine it as a live, dynamic data set!
Techniques for measuring use • Software that reports its own use • Instrumentation • Analysis of traces in papers • Mentions, citations • Characteristic artifacts • Analysis of collections of software • On supercomputing resources (TACC, NICS) • Through workflow systems (Galaxy, Pegasus, Taverna)
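As a rough illustration of the first technique (software that reports its own use), here is a minimal Python sketch of an opt-out usage ping sent when a tool runs. The endpoint URL, the SCISOFT_NO_PING environment variable, and the report_usage function are hypothetical placeholders for illustration, not part of any tool named in this talk.

```python
# Minimal sketch: a scientific tool that reports its own use.
# PING_URL and SCISOFT_NO_PING are hypothetical placeholders.
import json
import os
import platform
import urllib.request

PING_URL = "https://example.org/usage-ping"  # hypothetical collection endpoint


def report_usage(tool_name: str, version: str) -> None:
    """Send an anonymous usage record, unless the user has opted out."""
    if os.environ.get("SCISOFT_NO_PING"):
        return  # respect the opt-out
    record = {
        "tool": tool_name,
        "version": version,
        "python": platform.python_version(),
        "platform": platform.system(),
    }
    try:
        req = urllib.request.Request(
            PING_URL,
            data=json.dumps(record).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req, timeout=2)  # fire and forget
    except Exception:
        pass  # never let telemetry break the science


# Example: called once at tool startup.
report_usage("mytool", "1.2.0")
```

The opt-out check, short timeout, and blanket exception handler reflect a deliberate design choice: usage reporting should never block or break the scientific computation it is measuring.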
Contact James Howison http://james.howison.name jhowison@ischool.utexas.edu This material is based upon work supported by the US National Science Foundation under Grant No. 0943168.