Big Data
EUDAT 2012 – Training Day
Adam Carter, EPCC
EUDAT Training Task Leader
[Diagram: the data lifecycle – Collect → Organise → Process → Interpret → Share → Store – yielding Knowledge and a Paper]
[Diagram: the same lifecycle annotated with EUDAT services – PIDs, SimpleStore, Data Staging, iRODS, Replication]
[Diagram: the lifecycle again, with Workflows and Dataflows feeding the Collect and Process stages]
What makes data "big"? Three senses:
• Many bytes
• Distributed (kilometres apart)
• Data intensive (I/O ≫ Compute)
1. Many Bytes
• Data takes a lot of space to store
• Data takes a long time to move
• Data takes a long time to process
• Data takes a long time to organise
• Data is too large to be interpreted manually
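The movement cost above is easy to estimate with back-of-envelope arithmetic. A minimal sketch (the 1 PB dataset and 10 Gbit/s link are illustrative assumptions, not figures from the slides):

```python
def transfer_time_seconds(size_bytes: float, bandwidth_bits_per_s: float) -> float:
    """Idealised time to move a dataset over a link (ignores protocol overhead)."""
    return (size_bytes * 8) / bandwidth_bits_per_s

# Illustrative: 1 petabyte over a dedicated 10 Gbit/s link.
days = transfer_time_seconds(1e15, 10e9) / 86400
# Roughly nine days, and that is the best case before real-world overheads.
```

Even a saturated fast link takes days for petabyte-scale data, which is why the later slides argue for moving the compute to the data instead.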
2. Distributed Data • You have to bring (some of) the data together to process it • You have to bring (some of) the data together to interpret it • Often implies: • Different ownership • Different formats • Different quality
3. Data Intensive • Actually a feature of the application and the data size (and, arguably, the hardware) • Difficult to compute efficiently • Often implies big (in bytes) • Different definitions: • A lot of data on which to compute • A lot more I/O required than compute
Big Data – Rewards
Everyone can now do big science…
Datascopes
• Theory, Experiment, Simulation… and now the Datascope
• "Observe" a large amount of data and look for patterns
• The basis of data-intensive research
Interdisciplinary Research
• Using others' data brings more to your problem than you could have collected yourself
• It may already have been pre-processed, possibly with input from experts in another field
More Data
• Better statistics
• More likely that data already exists which can be used to test a theory
Of course, bigger doesn't mean better. As always, you should use the right tools for the job.
Data Infrastructure
• Share your data more easily
• Use others' data more easily
• Store more data
• Store data more reliably
• Move data more quickly
• Process data more efficiently
• Interpret data more easily
Move the Compute to the Data • Virtualisation • Queries to relational databases • Distributed query processing • If you’ve got a large amount of “baseline” data that is accessed repeatedly • put compute power close to the data • consider mechanisms to allow others to perform arbitrary compute against your data at your site
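The idea of shipping the query to the data site, rather than pulling the data to the query, can be sketched in a few lines. This is a hedged illustration: the in-memory list stands in for a remote store, and `run_at_data_site` is a hypothetical helper, not an EUDAT or database API.

```python
# The "data site": a million rows held where the data lives.
# In practice this would be a database server or a compute node beside the archive.
DATA_SITE = [{"id": i, "value": i * i} for i in range(1_000_000)]

def run_at_data_site(predicate):
    """Evaluate a filter at the data's location; only results cross the 'network'."""
    return [row for row in DATA_SITE if predicate(row)]

# Only the small answer set moves, not the million rows.
matches = run_at_data_site(lambda row: row["value"] > 999_000_000_000)
```

Distributed query processing and virtualisation, mentioned above, are production-grade versions of the same pattern: the predicate travels, the bulk data stays put.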
Data Intensive Computing • Build and use computers built for data-intensive research • Slower, low-power processors • High performance I/O systems • “Amdahl-Balanced” • In some cases these machines might (locally) move the compute to the data
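"Amdahl-balanced" refers to Amdahl's rule of thumb that a balanced system sustains about one bit of I/O per instruction per second. A minimal sketch of that ratio (the example bandwidths and instruction rates are illustrative assumptions):

```python
def amdahl_number(io_bits_per_s: float, instructions_per_s: float) -> float:
    """Amdahl's I/O balance ratio: bits of I/O per instruction (1.0 = balanced)."""
    return io_bits_per_s / instructions_per_s

# A fast CPU with modest disk I/O is badly unbalanced for data-intensive work:
hpc_node = amdahl_number(io_bits_per_s=8e9, instructions_per_s=1e11)   # 0.08
# Slower, low-power processors paired with high-performance I/O restore balance:
balanced = amdahl_number(io_bits_per_s=1e10, instructions_per_s=1e10)  # 1.0
```

This is why the slide pairs slower processors with fast I/O: raising the denominator without raising the numerator only makes the machine more starved.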
Organise Your Data • Metadata • PIDs • Catalogues, Indexing • Make your data addressable • Allows for caching, or using nearby replicas • Allows for retrieving only necessary data • Consider why you’re storing your data • Think big!
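Making data addressable, as the bullets above suggest, lets a client retrieve only the bytes it needs instead of a whole file. A sketch of the idea using a local file as a stand-in for an HTTP range request against a replica (the file layout and offsets are illustrative assumptions):

```python
def read_range(path: str, offset: int, length: int) -> bytes:
    """Return `length` bytes starting at `offset`, without reading the whole file."""
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(length)
```

The same access pattern is what allows caching and the use of nearby replicas: any copy that can serve the addressed range is as good as the original.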
Conclusions • We can do a lot more with our data than we currently do • A problem shared is a problem halved • Use infrastructure and services provided by others • Treat your data as being as important as the paper • Infrastructure is likely to help with this • Also needs a change of mindset regarding attribution, etc. • Big data offers opportunities • Think about what you could do with data or techniques from other disciplines • Think big!