SpARC – Supplementary Assistance for Rowing Coaching Simon Fothergill Ph.D. student, Digital Technology Group, Computer Laboratory DTG Monday Meeting, 9th November 2009
Overview • Automated Physical Performance Assessment • Basic real-time feedback • Performance similarity • Fault recognition • Data annotation • Questions
Automated Physical Performance Assessment
• Sense and Optimise
• Automatically provide feedback to athletes on their technique, including how good it is and how they could improve
• Combining human coaches and surrogate (automated) coaches
• Analysis of the kinetics of a physical performance
• Focusing on (indoor) rowing
• Other relevant domains
Overview • Automated Physical Performance Assessment • Basic real-time feedback • Data capture • User interface • Conclusions • Performance similarity • Fault recognition • Data annotation • Questions
Data Capture System
Aim:
• Record a data set
• Allow annotation by expert, professional coaches
• Provide basic local, real-time and distributed, post-session feedback
Data set requirements:
• Real (for a convincing + useful evaluation of algorithms, and to investigate the domain)
• Large
• Segmented
• Labelled
• Synchronised
• High fidelity
Data capture system requirements:
• Portable
• Cheap
• Physically robust
• Extensible platform
• Equipment augmentation
Data Capture System : Previous work
Systems:
• Bat system
• Inertial sensors
• PhaseSpace
• Coda
• Vicon
• Motion capture with Wii controllers
• StrideSense
• Concept 2 Performance Monitor
Datasets:
• Generic actions
• 2D videos of sports performances
Data Capture System : WMCS
3D motion capture with Wii controllers:
• Based on the work below, which showed that 3D motion capture with reasonable local precision is possible using 2 Wii controllers
Reference: Optical tracking using commodity hardware; Simon Hay, Joseph Newman and Robert Harle; 7th IEEE and ACM International Symposium on Mixed and Augmented Reality; 2008
• Standalone from Matlab, with improved performance
• Synchronisation of the Wii controllers
• Robust software
• Command-line tool/library
Data Capture System : WMCS (architecture diagram)
• PC (Ubuntu) running a C server: Wii library interface, LIFO buffer, and one thread per Wii controller
• Wii library → Bluetooth library → Bluetooth adapter (serial port)
• Two Nintendo Wii controllers connected over Bluetooth; each receives power + control from a Wii controller bridge
• Each controller carries an IR 1024x768 camera (100Hz)
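To make the buffering concrete, here is a minimal sketch (in Python, rather than the C of the actual server) of the per-controller reader threads feeding a shared bounded buffer. The read_ir_points() callable is a hypothetical stand-in for the Wii library call; only the structure mirrors the diagram above.

```python
import threading
import time
from collections import deque

# Sketch of the WMCS server's buffering: one reader thread per Wii
# controller pushes timestamped IR frames into a shared bounded buffer.
# read_ir_points() is a hypothetical stand-in for the Wii library call.

buffer = deque(maxlen=1024)      # shared bounded buffer (the C server uses a LIFO)
buffer_lock = threading.Lock()

def controller_thread(controller_id, read_ir_points):
    """Poll one controller's 100 Hz IR camera (up to 4 (x, y) blobs per frame)."""
    while True:
        points = read_ir_points(controller_id)
        with buffer_lock:
            buffer.append((time.monotonic(), controller_id, points))

# Frames from the two cameras are later paired by timestamp so that
# triangulation uses (approximately) simultaneous views.
```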
Data Capture System : C2PM3 (architecture diagram)
• PC (Ubuntu) running a C server: C2PM3 library interface, buffer, libUSB
• USB port → Concept 2 Performance Monitor v3, attached to the erg (with flywheel)
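For flavour, a minimal sketch of the libUSB-style polling, written with pyusb instead of the project's C server. The vendor ID and fixed-size reports are assumptions to verify against the device; the real monitor exchanges CSAFE frames, whose decoding is not shown.

```python
import usb.core  # pyusb, a Python wrapper over the same libusb used here

# Locate the monitor. 0x17A4 is assumed to be the Concept 2 vendor ID;
# verify against your device with `lsusb`.
dev = usb.core.find(idVendor=0x17A4)
if dev is None:
    raise RuntimeError("Performance Monitor not found")
dev.set_configuration()

# First endpoint of the first interface (assumed here to be the IN endpoint).
endpoint = dev[0][(0, 0)][0]

# Read one raw report; decoding the CSAFE framing is not shown.
raw = dev.read(endpoint.bEndpointAddress, endpoint.wMaxPacketSize, timeout=500)
print(f"{len(raw)} bytes read from the PM3")
```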
Data Capture System : StrideSense (http://sourceforge.net/projects/stridesense/) (architecture diagram)
• PC (Ubuntu) running a Java client, connected via the usb0 network interface (USB port: power + TCP/IP)
• Crossbow iMote2 running strideSenseADCServer.c: ADC + power board reading two FSRs (force-sensing resistors)
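A minimal sketch of a client consuming the StrideSense stream, assuming (hypothetically) that strideSenseADCServer.c writes fixed-size binary records over TCP; the host, port and record layout here are invented for illustration, and the real client is written in Java.

```python
import socket
import struct

# Hypothetical stream format: uint32 millisecond timestamp plus one int16
# ADC reading per FSR, little-endian. Host and port are placeholders.
HOST, PORT = "192.168.100.2", 5000
RECORD = struct.Struct("<Ihh")

with socket.create_connection((HOST, PORT)) as sock:
    buf = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break                      # server closed the connection
        buf += chunk
        while len(buf) >= RECORD.size:
            ts_ms, left, right = RECORD.unpack_from(buf)
            buf = buf[RECORD.size:]
            print(ts_ms, left, right)  # force on each footplate FSR
```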
Data Capture System : EMCS (1)
• EMCS needs to track the handle (1 LED), the seat (1 LED) and the erg's position + orientation (4 LEDs)
• WMCS is currently limited to 4 LEDs
• Therefore, use 1 LED as a stationary point on the erg and 2 LEDs on the seat, observed at different points in time
• Use PCA to extract the axes of the ECS (Erg Coordinate System); see the sketch below
• Two LEDs attached to the seat; erg clamped to the camera rig to minimise error
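The PCA step could look like the following sketch: fit the principal axes of the seat LEDs' track (gathered while the seat slides along the rail) and build an orthonormal frame from them. The "up" hint used to complete the frame is an assumption, not part of the described method.

```python
import numpy as np

def ecs_axes_from_seat_track(seat_points, up_hint=np.array([0.0, 1.0, 0.0])):
    """Fit the ECS axes by PCA over triangulated seat-LED positions
    (an Nx3 array gathered while the seat slides along the rail)."""
    centred = seat_points - seat_points.mean(axis=0)
    # First principal component = slide direction = the erg's long axis.
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    x_axis = vt[0] / np.linalg.norm(vt[0])
    # Complete an orthonormal right-handed frame using the (assumed) up hint.
    z_axis = np.cross(x_axis, up_hint)
    z_axis /= np.linalg.norm(z_axis)
    y_axis = np.cross(z_axis, x_axis)
    return np.stack([x_axis, y_axis, z_axis])  # rows: world -> ECS rotation
```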
Data Capture System : EMCS (2) (processing pipeline)
• Calibration: stereo-calibrate the WMCS (openCV); erg calibration via a labeller for the end, handle and seat LEDs
• Live operation: 4 x 2D coordinates per camera → label markers → triangulation → 4 x 3D coordinates → transform to the ECS (updating the ECS if necessary); see the triangulation sketch below
• Data from one Wii controller's IR camera is used in computing the correspondence of LEDs between cameras
• Server/client split, with results sent to storage
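The triangulation and ECS-transform steps, sketched with a standard linear (DLT) triangulation; P1 and P2 are the 3x4 projection matrices produced by the openCV stereo calibration. This shows the textbook method, not necessarily the exact code in the pipeline.

```python
import numpy as np

def triangulate(P1, P2, xy1, xy2):
    """Linear (DLT) triangulation of one LED from a matched pair of
    2D observations, given each camera's 3x4 projection matrix."""
    (x1, y1), (x2, y2) = xy1, xy2
    A = np.array([
        x1 * P1[2] - P1[0],
        y1 * P1[2] - P1[1],
        x2 * P2[2] - P2[0],
        y2 * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]                  # homogeneous -> Euclidean

def to_ecs(point_world, rotation, origin):
    """Express a world-frame point in the Erg Coordinate System, using the
    rotation from the PCA step and the stationary erg LED as the origin."""
    return rotation @ (point_world - origin)
```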
Data Capture System : SpARC (1) (deployment diagram)
• PC (Ubuntu) in the boathouse (monitor + keyboard): C server and Java client
• Sensor inputs: camera (rower) via the TCP/IP camera interface and libdc1394 (FireWire); C2PM3 (handle force); StrideSense (footplates force); WMCS (handle + seat motion)
• PC in the Computer Lab: clock-triggered batch processing; upload to the database over ssh; user-requested videos created and transferred by scp
• Athlete’s / coach’s PC: web browser
Data Capture System : SpARC (2) (processing pipeline)
• Live operation (server): detect strokes (see the sketch below); create directories; turn the camera on/off; transmit data; record the user code; log motion + force data and images; split the data into strokes
• Live operation (client): display handle + seat coordinates, handle force and stroke boundaries on the GUI
• Post-session: augment and select; encode videos; create metadata; send data, videos and video metadata to the file server (CL); create user videos; update the database
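One way the "detect strokes" step could work, sketched here as finding the catch at each local minimum of the handle's position along the erg's long axis; the minimum gap between catches is an assumed debouncing parameter, not the system's actual rule.

```python
import numpy as np

def stroke_boundaries(handle_x, min_gap=50):
    """Indices of assumed catches: local minima of the handle's position
    along the erg's long axis, at least min_gap samples apart
    (0.5 s at 100 Hz). Both choices are illustrative assumptions."""
    catches = []
    for i in range(1, len(handle_x) - 1):
        if handle_x[i] <= handle_x[i - 1] and handle_x[i] < handle_x[i + 1]:
            if not catches or i - catches[-1] >= min_gap:
                catches.append(i)
    return catches

def split_into_strokes(signal, boundaries):
    """Cut any synchronised signal (motion, force, ...) at the catches."""
    return [signal[a:b] for a, b in zip(boundaries, boundaries[1:])]
```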
User interface : SpARC (1) Real-time feedback
User interface : SpARC (2) http://www-dyn.cl.cam.ac.uk/~jsf29/
Conclusions (1)
General
• It works!
• It is being used, and collects on average 1 session of 5 strokes every 2 or 3 days
Users
• “Cool!”, “Works”, “Has potential”
• Some people are very apprehensive about using it (reassurance is required)
• Being able to see what you’re doing, as well as feel it, has been observed to facilitate communication and understanding of correct technique
• Although rowing technique is complex, the system has a steep but short learning curve
• Athletes require a very simple interface; they won’t even see half the screen, and definitely won’t read anything
• Elite athletes favour raw signal feedback; novices are aided by hints and tips
• Force is as important as motion in the feedback
Conclusions (2)
General
• Basic signals vs. sophisticated interpretation: manually constructing rules is hard and unhelpful
• Every sport has specific requirements
• The system's general fidelity does show up interesting shapes and artefacts in rowing
• Sensor signals BECOME the ontology
Technical
• At the limit of WMCS range (accuracy and precision)
• WMCS won’t work in bright sunlight
• The hand can cover the LED on the handle
• Correspondence: unnecessarily vigorous rowing upsets the algorithms, which could be improved (domain-specific, e.g. scan; generic, e.g. epipolar constraints)
• The ECS is updated infrequently
• More force sensors are needed at the heel of the feet
• openCV is buggy
Conclusions (3)
General
• Developed a novel and functional system, and gained experience of deploying it and of what it is possible to achieve
• It enables further useful and convincing work to be done:
• A useful dataset
• A platform for other systems to be built on
• Contributes to the age of useful data sets by setting a benchmark for dataset scale and detail
• The experience is applicable to other domains of physical performance, and a lot of the work could be reused
Further work • Use metrics to quantitatively measure the consistency of a performance with different levels (and modes) of real-time feedback
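As an illustration of such a metric (not necessarily the one the project will use), one could resample every stroke of a session to a common length and report the mean deviation from the session's average stroke:

```python
import numpy as np

def consistency(strokes, n=100):
    """Mean deviation of each stroke from the session's average stroke,
    after resampling every stroke (a 1-D signal, e.g. handle position)
    to n samples. Lower values mean a more consistent performance."""
    resampled = np.stack([
        np.interp(np.linspace(0, 1, n), np.linspace(0, 1, len(s)), s)
        for s in strokes
    ])
    mean_stroke = resampled.mean(axis=0)
    return float(np.mean(np.linalg.norm(resampled - mean_stroke, axis=1)))
```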
Overview • Automated Physical Performance Assessment • Basic real-time feedback • Performance similarity • Fault recognition • Data annotation • Questions
Performance similarity (1)
• Overall quality: quantitatively measure the difference between an ideal and a given performance
• Motivation: closed sports, muscle memory, practising consistently good strokes
• Cue and motivate the athlete
• Problems include the definition of “ideal” (currently the coach) and anthropometric invariance
• Similarity is VERY hard to define
Performance Similarity (2)
Approach
• Trajectories can differ in various ways
• Gather objective or subjective (from coaches) arguments for how similar pairs of trajectories that differ in these ways are
• Form hypotheses about likely algorithms (one candidate is sketched below)
• Test on a dataset labelled with similarity defined by a coach
• See how necessary various aspects of the algorithms are
• Artefacts exist in the real dataset
• There are correlations between shape and speed
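As one example of a candidate algorithm, dynamic time warping compares trajectories while tolerating differences in speed, which matters given the observed correlations between shape and speed. A minimal sketch, assuming each trajectory is an Nx3 array of ECS coordinates; this is one hypothesis, not the chosen metric.

```python
import numpy as np

def dtw_distance(ideal, observed):
    """Dynamic time warping distance between two trajectories, each an
    Nx3 array of ECS coordinates; insensitive to local speed changes."""
    n, m = len(ideal), len(observed)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(ideal[i - 1] - observed[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],
                                 cost[i, j - 1],
                                 cost[i - 1, j - 1])
    return cost[n, m]
```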
Overview • Automated Physical Performance Assessment • Basic real-time feedback • Performance similarity • Fault recognition • Data annotation • Questions
Fault recognition (on-going work)
• Preliminary results have been obtained using a dataset of 6 rowers and only the complete trajectory of the erg handle. Binary classification of stroke quality was performed using tempo-spatial features of the handle's trajectory and a neural network; two training methods were compared (a sketch of this setup follows below).
• Figure: classification accuracy across a given number of performers, for the quality of individual aspects of technique
• Progress: waiting for enough data to make it worth running more algorithms
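A minimal sketch of that classification setup, using scikit-learn's MLPClassifier as a stand-in for the neural network; the particular features and network size here are illustrative assumptions, not the ones used for the preliminary results.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def extract_features(stroke):
    """Illustrative tempo-spatial features of one handle trajectory
    (an Nx3 array): duration in samples, path length, peak speed."""
    speeds = np.linalg.norm(np.diff(stroke, axis=0), axis=1)
    return [len(stroke), speeds.sum(), speeds.max()]

def train_quality_classifier(strokes, labels):
    """Binary stroke-quality classifier; labels come from a coach."""
    X = np.array([extract_features(s) for s in strokes])
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)
    clf.fit(X, np.asarray(labels))
    return clf
```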
Overview • Automated Physical Performance Assessment • Basic real-time feedback • Performance similarity • Fault recognition • Data annotation • Questions
Annotating data http://www-dyn.cl.cam.ac.uk/~jsf29/
Concluding remarks
• Three generations of machine learning (taxonomy by Prof. Chris Bishop):
• Rules don’t work
• Statistics is limited, especially for recognition across different people
• Structure (time, segment selection)…
Acknowledgements • Rob Harle (advice) • Brian Jones (EMCS, driving the van) • Marcelo Pias & Salman Taherian (StrideSense) • SeSAME (context) • Simon Hay (Wii controllers, van driving) • Andrew Lewis (C2PM3 code advice) • Jesus College Boat Club, fellows and IT department (environment and support) • i-Teams (who are investigating any commercial potential of SpARC)
Questions Please come and use the system! Thank you! Any questions?