Extreme Data, Hypercomplexity, Cognitive Dissonance: A Case For Transdisciplinary Internet Research
by W. Reid Cornwell, Ph.D.
The Center For Internet Research
http://www.tcfir.org
wrc@tcfir.org
Question? Are the data management tools available today capable of handling the data streams being produced by the current input devices?
In the beginning… On Christmas Day, 1990, the world was given a gift. Working on his own time, without a budget, and without official sanction, Tim Berners-Lee created what we now know as the "World Wide Web" and the first graphical browser to access it.
The Internet At 16, the Internet is like a pimply-faced adolescent: incomplete, little understood, and filled with incredible promise. The body of empirical research is limited and naive. "The ignorance of how to use new ideas stockpiles exponentially." – Marshall McLuhan (http://www.marshallmcluhan.com/)
The Internet Vinton Cerf on creating TCP/IP “Bob and I had no idea how robust and ubiquitous our work would become.” Vinton Cerf – A personal conversation with Reid Cornwell
The Problem At the most fundamental level, a computer is an input device for another, far more complex, computer. The Thinker, Auguste Rodin
The Problem • Data is not information • Information is meaningful data • Computer Science is about creating the tools to provide meaning to data. • It is easier to derive data than to derive meaning.
Hypercomplexity "The age of reason has ended, and now we must organize around chaos."Watts Wacker - CEO and Futurist
Extreme Data New types of data, generated by new types of devices, being used in new ways. Paul Gustafson, Director, CSC Leading Edge Forum
Extreme Data "The average house in 2010 will have 100 computers, embedded in all kinds of appliances and amenities, and mostly networked to each other, and to the Web." Glen Hiemstra - Futurist.Com
Extreme Data "All information, ever created, is still in existence." Thomas Frey Executive Director, The DaVinci Institute Eventually, it will all be online!
Extreme Data "The year is 2050, and you are standing in front of a vending machine. What form of payment will you put into it?" Thomas Frey - Executive Director, The DaVinci Institute
Hypercomplexity Hypercomplexity is complexity inscribed in complexity, e.g., second-order complexity. (Luhmann 1984, p. 637 [1995, p. 471])
Hypercomplexity Complex search algorithms, semantic webs, and metadata are, by definition, hypercomplexity. Hypercomplexity is directly proportional to the volume of the data store.
Hypercomplexity "The Semantic Web is an extension of the current web in which information is given well-defined meaning, better enabling computers and people to work in cooperation." Tim Berners-Lee, James Hendler, Ora Lassila, The Semantic Web, Scientific American, May 2001
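As an illustration (not from the slides, and with all names hypothetical): the "well-defined meaning" Berners-Lee describes is typically expressed as subject–predicate–object triples, which machines can query directly. A minimal sketch in plain Python:

```python
# Hypothetical sketch of Semantic Web-style triples: each fact is a
# (subject, predicate, object) statement a machine can query, which
# is what gives the data "well-defined meaning" rather than free text.
triples = [
    ("tcfir.org", "type", "ResearchCenter"),
    ("tcfir.org", "studies", "Internet"),
    ("W. Reid Cornwell", "directs", "tcfir.org"),
]

def query(predicate, obj):
    """Return all subjects related to `obj` by `predicate`."""
    return [s for (s, p, o) in triples if p == predicate and o == obj]

# A machine can now answer a question the raw text could not:
print(query("studies", "Internet"))  # ['tcfir.org']
```

Real systems use RDF and query languages such as SPARQL for this; the tuples above only sketch the idea.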
Hypercomplexity • What Is Metadata? Metadata is a component of data which describes the data. It is "data about data." Imagine trying to find a book in the library without the help of a card catalog or computerized search interface. The information contained in these types of systems is essentially metadata about the books housed at that library or other libraries. http://www.csc.noaa.gov/metadata/
Hypercomplexity Why Is Metadata Important? Metadata is critical to preserving the usefulness of data over time. For instance, metadata captures important information on how the data was collected and/or processed so that future users of that data understand these details. Another vital function metadata serves is as a record in search systems so that users can locate data sets of interest. http://www.csc.noaa.gov/metadata/
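A concrete sketch of both roles metadata plays (hypothetical field names and values, not drawn from any NOAA standard): the measurements are the data; the metadata records how they were collected, so the numbers stay interpretable over time and searchable in a catalog.

```python
# Hypothetical example of "data about data": the observations are the
# data; the metadata preserves collection and processing details for
# future users, and is what a search system actually indexes.
dataset = {
    "data": [12.1, 12.4, 11.9],  # the observations themselves
    "metadata": {
        "variable": "sea_surface_temperature",
        "units": "degrees Celsius",
        "collected_by": "buoy 42001",  # hypothetical identifier
        "collected_on": "2006-05-01",
        "processing": "hourly mean of raw sensor readings",
    },
}

# A catalog search matches against the metadata, not the raw numbers:
def matches(ds, variable):
    return ds["metadata"]["variable"] == variable

print(matches(dataset, "sea_surface_temperature"))  # True
```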
Hypercomplexity Statistical methods such as multivariate analysis of variance, factor analysis, etc. break down as a function of the number of variables. The results of statistical methods are, by definition, hypercomplex. Human perception of hypercomplex data is a third-order derivative and therefore subject to a greater coefficient of error. Statistical methods applied to hypercomplex data are equivalent to a "mean of percentages."
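One way to make the "break down as a function of the number of variables" claim concrete (an illustrative sketch, not from the slides): the number of covariance parameters a multivariate method must estimate grows quadratically with the number of variables, quickly outstripping any realistic sample size.

```python
# Illustrative sketch: a k-variable covariance matrix has k*(k+1)/2
# free parameters (variances plus pairwise covariances). Methods like
# factor analysis must estimate all of them, so the estimation burden
# grows quadratically with the number of variables.
def covariance_params(k: int) -> int:
    """Free parameters in a k x k symmetric covariance matrix."""
    return k * (k + 1) // 2

for k in (10, 100, 1000):
    print(k, covariance_params(k))
# 10 -> 55, 100 -> 5050, 1000 -> 500500
```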
Hypercomplexity The act of observing a phenomenon changes the phenomenon. – Heisenberg
Hypercomplexity Gödel's theorem: “Any system that is complex enough to be useful also encompasses unanswerable questions”.
Hypercomplexity Progress in digital space is exponential. "Not just the measure of power of computation, number of Internet nodes, and magnetic spots on a hard disk–the rate of paradigm shift is itself accelerating, doubling every decade." "Scientists look at a problem and they intuitively conclude that since we've solved 1 percent over the last year, it'll therefore be one hundred years until the problem is exhausted: but the rate of progress doubles every decade, and the power of the information tools (in price-performance, resolution, bandwidth, and so on) doubles every year." "People, even scientists, don't grasp exponential growth. During the first decade of the human genome project, we only solved 2 percent of the problem, but we solved the remaining 98 percent in five years." Ray Kurzweil – an interview with Cory Doctorow
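The genome arithmetic above can be checked directly (an illustration, not from the interview): if the solved fraction doubles every year, a problem that is 2 percent solved needs only log2(1/0.02) more years.

```python
import math

# Illustrative sketch of exponential progress: if the solved fraction
# doubles every year, how many years until it reaches 100%?
def years_to_complete(initial_fraction: float) -> float:
    """Years for a yearly-doubling process to go from
    `initial_fraction` solved to fully solved."""
    return math.log2(1.0 / initial_fraction)

# 2% solved today -> done in under six more years, matching the
# "remaining 98 percent in five years" intuition in the quote.
print(round(years_to_complete(0.02), 1))  # 5.6
```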
Hypercomplexity We are faced with an unending stream of new products, information, and data. In the past, a product that appealed to only one in 35,000 people would never have made it to the store shelves, but today the Internet creates marketing channels that make this type of product viable. On Amazon we can find 2 million books; on iTunes, over a million songs; on the Software Superstore, over a million software products. There are currently 19 million known chemical substances, a number that doubles every 13 years… reaching 80 million by 2025. Grocery store products are being created at the rate of one every 30 minutes. Now more than ever we can define who we are and what we care about with the millions of micro-defining choices we make. And people will become more and more complicated.
Cognitive Dissonance There is a tendency for individuals to seek consistency among their cognitions (i.e., ideas, beliefs, opinions). Festinger, L. (1957). A Theory of Cognitive Dissonance. Stanford University Press.
Cognitive Dissonance Hypotheses: • Too much information produces dissonance. • Contradictory information produces dissonance. • Voluminous contradictions produce rationalization with the aim of reducing dissonance. • Dissonance tends to produce inaction. • Dissonance tends to defeat learning. • Consonance is best achieved by maintaining the status quo.
Cyber Psychology In just a brief one-twentieth of a second -- less than half the time it takes to blink -- people make aesthetic judgments that influence the rest of their experience with an internet site. But the results did not show how to win a positive reaction from users, said Lindgaard, a psychology professor at Carleton University in Ottawa. "When we looked at the websites that we tested, there is really nothing there that tells us what leads to dislike or to like."
Cyber Psychology Cyber Psychology is the study of man's interaction with computing machines. Psychology, and its new sub-discipline Cyber Psychology, suffers from institutional "Physics Envy." Physics Envy is a science in search of a math, characterized by wild speculation or oversimplification; a science in search of a .01 level of confidence, characterized by no vestige of laws.
Errata Oxford University Announces Multidisciplinary Doctoral Programme Submitted on Tue, 2005-10-11 00:58. The Oxford Internet Institute is now accepting worldwide applications from candidates who want to study the Internet and its social impact. We are a department of Oxford University chartered to pioneer the multidisciplinary study of the Internet. The Institute is dedicated to engaging in fruitful collaboration with policy makers, technologists, businesspeople, teachers, scholars, and civil society entrepreneurs to inform and ground our research. We seek to understand the most difficult and relevant social puzzles, problems, and opportunities as the mainstream Internet enters the second decade of a multi-year buildout transforming the fundamentals of work, politics, education, entertainment, social interaction, and conflict. There are fewer than five similar programs in the world.
Questions? Are the data management tools available today capable of handling the data streams being produced by the current input devices?
Questions What scientific disciplines will be necessary to answer this question? What level of collaboration will it take?
A Case For Multi-disciplinary Internet Research
by W. Reid Cornwell, Ph.D.
The Center For Internet Research
http://www.tcfir.org
wrc@tcfir.org