
So you want to run an MVPA experiment…


Presentation Transcript


  1. So you want to run an MVPA experiment… Lindsay Morgan April 9, 2012

  2. Overview • Study Design • Preprocessing • Pattern Estimation • Voxel Selection • Classifier

  3. Study Design Blocked design • Smaller # of conditions • Better estimate of the average response pattern Event-related design • Larger # of conditions • Similarity analyses • Better estimate of the response distribution across exemplars • Psychologically less predictable • Requires sequence optimization (e.g., OptSeq, de Bruijn)

  4. Study Design Suggestions • Multiple runs • Independent data sets for training & testing • Many short runs preferable to a few long runs (Coutanche & Thompson-Schill, NeuroImage, 2012) • Equal # of exemplars per stimulus class • Or subsample the more numerous class (see the sketch below)
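A minimal sketch of the subsampling suggestion, assuming the estimated patterns and their labels already sit in NumPy arrays (the function and variable names are illustrative, not from the slides):

import numpy as np

rng = np.random.default_rng(seed=0)

def balance_classes(patterns, labels):
    # Keep the same number of exemplars per stimulus class by randomly
    # subsampling the more numerous classes down to the smallest class.
    classes, counts = np.unique(labels, return_counts=True)
    n_keep = counts.min()
    keep = np.concatenate([
        rng.choice(np.flatnonzero(labels == c), size=n_keep, replace=False)
        for c in classes
    ])
    keep.sort()
    return patterns[keep], labels[keep]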

  5. Pre-processing • Pre-process each run separately • Slice time correction • Motion correction • Smoothing?

  6. To Smooth or Not to Smooth? Op de Beeck, NeuroImage, 2010

  7. Pattern Estimation Raw signal intensity values • Suitable for block or slow event-related designs Betas (parameter estimates) or t values • Suitable for all designs • Derived from a GLM • Accounts for overlap in the HRF • Can remove motion effects and linear trends
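Per voxel, the betas and t values fall out of an ordinary least-squares GLM. A minimal sketch, assuming a time-by-voxel data matrix and an HRF-convolved design matrix for one run are already in hand (all names and shapes are illustrative):

import numpy as np

def glm_betas_and_t(Y, X):
    # Y: (n_timepoints, n_voxels) BOLD time series for one run
    # X: (n_timepoints, n_regressors) design matrix: HRF-convolved condition
    #    regressors plus nuisance regressors (motion, linear trends)
    betas, _, _, _ = np.linalg.lstsq(X, Y, rcond=None)   # (n_regressors, n_voxels)
    resid = Y - X @ betas
    dof = X.shape[0] - np.linalg.matrix_rank(X)
    sigma2 = (resid ** 2).sum(axis=0) / dof               # residual variance per voxel
    se = np.sqrt(np.outer(np.diag(np.linalg.pinv(X.T @ X)), sigma2))
    t = betas / se                                        # one t value per regressor and voxel
    return betas, t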

  8. Data transformation so far… Mur et al., Soc Cog Affective Neurosci, 2009

  9. Ungrouped design • 96 images • Each image presented 1x/run • 3 comparisons • Inanimate vs. animate • Face vs. body • Natural vs. artificial Kriegeskorte et al., Frontiers Sys Neurosci, 2008
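For the similarity analyses this condition-rich design enables, the central object is a pairwise dissimilarity matrix over the 96 estimated patterns. A minimal sketch using correlation distance, a measure commonly used in representational similarity analysis (variable names are illustrative):

import numpy as np

def dissimilarity_matrix(patterns):
    # patterns: (n_images, n_voxels), one estimated pattern per stimulus.
    # Returns an (n_images, n_images) matrix of 1 - Pearson r.
    return 1.0 - np.corrcoef(patterns)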

  10. Betas or t values? Misaki et al., NeuroImage, 2010

  11. Pattern Normalization Misaki et al., NeuroImage, 2010

  12. Pattern Normalization Misaki et al., NeuroImage, 2010
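Two normalization schemes of the kind compared in that figure, sketched here only for illustration (which scheme helps is an empirical question, so treat these as examples rather than a recommendation):

import numpy as np

def zscore_each_pattern(patterns):
    # Z-score every pattern across voxels, removing pattern-wise
    # differences in mean level and scale.
    mu = patterns.mean(axis=1, keepdims=True)
    sd = patterns.std(axis=1, keepdims=True)
    return (patterns - mu) / sd

def zscore_each_voxel(train, test):
    # Z-score every voxel across the training patterns and apply the same
    # parameters to the test patterns, keeping the two sets independent.
    mu = train.mean(axis=0, keepdims=True)
    sd = train.std(axis=0, keepdims=True)
    return (train - mu) / sd, (test - mu) / sd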

  13. Data transformation so far… Mur et al., Soc Cog Affective Neurosci, 2009

  14. Voxel Selection • Typically, performance decreases as the # of voxels increases • The data used to select voxels must be independent of the data used to test the classifier • Anatomically-defined region • Functional localizer • Training set from your experimental data • E.g., ANOVA for all conditions at each voxel → select top N voxels (see the sketch below)
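A minimal sketch of the ANOVA-based option, applied to the training data only (scikit-learn's f_classif is used here purely as a convenient one-way F test; the names are illustrative):

import numpy as np
from sklearn.feature_selection import f_classif

def select_top_voxels(train_patterns, train_labels, n_voxels):
    # One-way ANOVA (F test) across conditions at each voxel, computed on
    # training data only; return the indices of the n_voxels highest-F voxels.
    f_vals, _ = f_classif(train_patterns, train_labels)
    return np.argsort(f_vals)[::-1][:n_voxels]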

  15. The Classifier Misaki et al., NeuroImage, 2010

  16. Which classifier should you use? Misaki et al., NeuroImage, 2010

  17. Data transformation complete! Mur et al., Soc Cog Affective Neurosci, 2009

  18. How to implement the classifier • AFNI 3dsvm • Princeton MVPA toolbox • PyMVPA toolbox • LIBSVM toolbox
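Outside those toolboxes, the same pipeline can be sketched with scikit-learn: a linear SVM evaluated with leave-one-run-out cross-validation, so training and test patterns never come from the same run (an illustrative sketch, not the exact workflow of any package listed above):

from sklearn.svm import LinearSVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

def decode(patterns, labels, runs):
    # patterns: (n_samples, n_voxels) t values, one row per condition per run
    # labels:   (n_samples,) stimulus class of each pattern
    # runs:     (n_samples,) run index, so each fold tests on a run that
    #           contributed nothing to training
    clf = LinearSVC(C=1.0)
    scores = cross_val_score(clf, patterns, labels,
                             groups=runs, cv=LeaveOneGroupOut())
    return scores.mean()   # mean cross-validated classification accuracy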

  19. General Conclusions • Design your experiment to yield as many independent patterns as possible • Estimate your patterns using t values (or z scores) • Use a linear classifier
