
Action Surveillance Using Sparse Wearable Inertial Sensors




Student: Shih-Yu Lin    Advisor: I-Chen Lin

Abstract: We present a framework that reconstructs full-body human motion from four to five inertial sensors attached to the user's four limbs and torso. Based on the gathered data, we construct an online k-dimensional tree (kd-tree) index structure over hundreds of thousands of frames and search it for the motion fragment that best matches the user's current full-body motion. However, the sparse and noisy sensor data cause high ambiguity in the motion estimation, which results in gaps between consecutive poses. Consequently, we include the concept of motion fields for more reasonable motion transitions. This run-time motion synthesis mechanism merges the candidate motion sequences by weighted combination and generates natural, smooth motions.

Pre-processing:
1. Our training data come from the CMU Mocap Lab.
2. We clip the training data into several short sequences to reduce processing time.

Clip-based OLNG:

Interpolated clip-based OLNG:

Figure 1: Overview of the system. The diagram covers the training motion data (motion sequences 1 to x, frames 1 to n), the pre-processing stage that builds the kd-tree and the clip-based OLNG, the incoming sensor data S_{t-3}, S_{t-2}, S_{t-1}, S_t, S_{t+1}, motion synthesis, post-processing, and the interpolated motion produced from the motion capture database.
Figure 2: We clip the data into several parts and let the first frame of each clip be its index; the indices are used to construct the OLNG. Each index represents n frames.
Figure 3: Implementation of the OLNG. We check whether the last frame of a node is adjacent to the new node and, if so, form a connection.
Figure 4: We calculate the similarity weights between two candidates and blend the motions using those weights.
Figure 5: Accuracy of our approach.
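The abstract describes indexing hundreds of thousands of database frames in a kd-tree and querying it with the current sensor readings. The sketch below shows one way such a lookup could be set up; the feature file name, channel layout, and use of SciPy's cKDTree are illustrative assumptions rather than details taken from the thesis.

```python
# Minimal sketch of the kd-tree candidate lookup (assumed implementation).
# Each database frame is summarized by a feature vector built from the same
# channels as the wearable sensors; the file name below is hypothetical.
import numpy as np
from scipy.spatial import cKDTree

frame_features = np.load("mocap_frame_features.npy")  # shape: (num_frames, num_channels)
kd_tree = cKDTree(frame_features)

def query_candidates(sensor_reading, k=8):
    """Return indices and distances of the k database frames closest
    to the current sparse sensor reading."""
    distances, frame_ids = kd_tree.query(sensor_reading, k=k)
    return frame_ids, distances
```

The candidates returned by such a query would then feed the graph traversal and the weighted blending described in Figures 3 and 4.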

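Figures 2 and 3 describe clipping the training data, indexing each clip by its first frame, and connecting a node when the last frame of an existing node is adjacent to the new node. The sketch below assembles such a clip graph offline; the clip length, pose-distance measure, and adjacency threshold are assumed values for illustration, not parameters from the thesis.

```python
# Sketch of clip-based graph construction (assumed parameters and names).
import numpy as np

CLIP_LENGTH = 16            # n frames per clip (assumed)
ADJACENCY_THRESHOLD = 0.5   # max pose distance for an edge (assumed)

def pose_distance(frame_a, frame_b):
    # Euclidean distance between two pose/feature vectors.
    return float(np.linalg.norm(frame_a - frame_b))

def build_clip_graph(frames):
    """Split a motion sequence into clips indexed by their first frame and
    connect clip i -> clip j when clip i's last frame is close to clip j's
    first frame."""
    clips = [frames[s:s + CLIP_LENGTH] for s in range(0, len(frames), CLIP_LENGTH)]
    edges = {i: [] for i in range(len(clips))}
    for i, clip_i in enumerate(clips):
        for j, clip_j in enumerate(clips):
            if i != j and pose_distance(clip_i[-1], clip_j[0]) < ADJACENCY_THRESHOLD:
                edges[i].append(j)
    return clips, edges
```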
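Figure 4 describes computing similarity weights between candidates and blending the motions with them. The sketch below uses normalized inverse-distance weights, which is an assumption about the similarity function; averaging joint parameters directly is also a simplification, since joint rotations would normally be blended with quaternion interpolation.

```python
# Sketch of similarity-weighted pose blending (assumed weighting scheme).
import numpy as np

def blend_candidates(candidate_poses, candidate_distances, eps=1e-6):
    """Blend candidate poses into one output pose using normalized
    1/distance similarity weights."""
    weights = 1.0 / (np.asarray(candidate_distances, dtype=float) + eps)
    weights /= weights.sum()
    return np.average(np.asarray(candidate_poses, dtype=float), axis=0, weights=weights)
```

For example, the poses and distances returned by the kd-tree query sketched earlier could be passed straight into this function to produce one blended output pose per time step.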