====== MKDA ======

The main steps of an MKDA (multilevel kernel density analysis, a coordinate-based meta analysis) include:
  * creation of a compound table containing all found coordinates (possibly matching further selection criteria)
  * performing the MKDA (or a similar algorithm, such as found in the [[http://brainmap.org/ale/|GingerALE]] package), which tests the coordinates in the table against a Monte-Carlo random sampling (empirical null)
  * interpreting the results of the MKDA (making statistical inferences based on areas where the null hypothesis of interest can be safely rejected, followed by placing those inferences in the context of the literature and existing models)
  
===== Motivation =====
For various reasons, many neuroimaging experiments (in which data is collected and spatial maps are created, allowing functional representations to be located across the brain) and their results as reported in journals (in this context, tables linking specific spatial locations, i.e. coordinates, to certain functions/phenomena) are, by themselves, not well suited to generating "factual knowledge":
  * without a clear model that underlies and fits the observed spatial representation (networks subserving the experimentally manipulated function), the results do not represent "accepted knowledge" (strong inference) but rather new "hypotheses" (potential explanations for observed patterns, often based on reverse inference)
  * the choice of subjects, stimuli, experimental design, etc. could have biased the results, making them less informative for the more general population case (false-positive identification and false-negative masking)
  * noise components in the data (at all levels) could have masked important aspects (locations, usually as false negatives)
  
One possible way to overcome these problems (at least to some extent) is to aggregate coordinates from several studies (as a rule of thumb at least 10 to 15, with most published meta analyses drawing on at least 40), or rather from contrasts within those studies, and then test whether certain spatial locations are implicated in the examined brain function more often than warranted by chance (a Monte-Carlo null distribution obtained by simulating data drawn from, say, a gray-matter mask).
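
As a toy illustration of that Monte-Carlo logic (this is not the actual MKDA implementation; the coordinates, the 10-mm radius, and the bounding box standing in for a gray-matter mask are all made up), one could ask how often randomly placed coordinates land as close to a tested location as the reported peaks do:

<code matlab>
% toy Monte-Carlo null: is a target location hit by reported peaks more
% often than by randomly placed coordinates? (illustration only)
peaks  = [36, 21, 15; 15, 33, 48; 38, 24, 12];   % made-up peak coordinates (mm)
target = [36, 22, 14];                           % location being tested
radius = 10;                                     % made-up kernel radius (mm)
obshits = sum(sqrt(sum(bsxfun(@minus, peaks, target) .^ 2, 2)) <= radius);

nsim = 10000;
nullhits = zeros(nsim, 1);
for s = 1:nsim
    % draw the same number of random coordinates from a crude bounding box
    % (a real analysis would sample from a gray-matter mask instead)
    rndpeaks = bsxfun(@plus, [-70, -104, -58], ...
        bsxfun(@times, rand(size(peaks, 1), 3), [140, 174, 134]));
    nullhits(s) = sum(sqrt(sum(bsxfun(@minus, rndpeaks, target) .^ 2, 2)) <= radius);
end
p = mean(nullhits >= obshits);   % empirical p-value at this location
</code>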
  
However, there are some additional problems that are only partially addressable with meta analyses of any kind, such as:
  
And it must be noted that even meta analyses cannot, per se, create "knowledge" (strong inferences) in the absence of a model that explains and fits the observed patterns. Still, by summarizing several independent datasets into a single spatial map (e.g. via MKDA), the likelihood of making certain types of mistakes is greatly reduced.

===== Practical outline =====
In detail, the following steps have to be performed to run an MKDA in NeuroElf:
  * creation of a database (tabular format, one row per coordinate, with identifying columns/fields for study, contrast, x/y/z coordinate, as well as any other fields)
  * possibly saving the database in a text-based format (e.g. when using Microsoft Excel for the first step, you should save the database as a CSV file)
  * importing the database into NeuroElf (either using the command line or the MKDA UI)
  * deciding on settings for the MKDA (e.g. smoothness of the underlying indicator maps representing each statistical unit)
  * if necessary, configuring one or several contrasts of interest
  * running the analysis/analyses
  * thresholding the resulting maps
  * drawing inferences
  
===== Requirements =====

==== Creation of database ====
Following the introduction, the first step is to look through the literature and select the articles you wish to include in the MKDA. Next, you need to create a tabular representation of all coordinates found in the tables (or text) of those articles, as the following example demonstrates:
  
<code CSV MKDA_sample.txt>
Ochsner_et_al_2008;15;33;48;MNI;21;LookNeg>LookNeu;
Lieberman_et_al_2010;36;21;15;T88;16;Negative>Neutral;</code>
  
If you wish to use this table in Tor Wager's MKDA tool as well, the first row should be a single line containing the number of fields in the first column:
  
<code CSV MKDA_sample_with_fields.txt>
Ochsner_et_al_2008;15;33;48;MNI;21;LookNeg>LookNeu;
Lieberman_et_al_2010;36;21;15;T88;16;Negative>Neutral;</code>
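
One possible way to derive this variant from the plain database file in MATLAB (a sketch only; the file names are simply those of the examples above, and the field count assumes that every field, including the last, is terminated by a semicolon, as in the sample rows):

<code matlab>
% prepend the number of fields (counted from the first row) to the database
intxt = fileread('MKDA_sample.txt');             % plain database file
rows  = regexp(intxt, '[^\r\n]+', 'match');      % split into non-empty rows
nf    = sum(rows{1} == ';');                     % fields, one ';' per field
fid   = fopen('MKDA_sample_with_fields.txt', 'w');
fprintf(fid, '%d\n', nf);                        % field count as first row
fprintf(fid, '%s\n', rows{:});                   % followed by the original rows
fclose(fid);
</code>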

This first step can be performed in a variety of programs, with Microsoft Excel being very suitable for the task. Usually it is easiest to first set up the columns (field names), then copy and paste the coordinates into the table and set all desired columns to their appropriate values. In the end, the table must be available as a text-based (ASCII) file with a row of field names at the top, followed by the actual data, one coordinate per row.
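
If you prefer to stay within MATLAB, such a file can also be written directly; in the following sketch the column names (Study, x, y, z, CoordSys, N, Contrast) are only illustrative choices matching the sample rows above, not names prescribed by NeuroElf:

<code matlab>
% write a small coordinate database as a semicolon-separated text file
fields = {'Study', 'x', 'y', 'z', 'CoordSys', 'N', 'Contrast'};   % illustrative names
coords = { ...
    'Ochsner_et_al_2008',   15, 33, 48, 'MNI', 21, 'LookNeg>LookNeu'; ...
    'Lieberman_et_al_2010', 36, 21, 15, 'T88', 16, 'Negative>Neutral'};
fid = fopen('MKDA_sample.txt', 'w');
fprintf(fid, '%s;', fields{:});                  % row of field names
fprintf(fid, '\n');
for c = 1:size(coords, 1)
    fprintf(fid, '%s;%d;%d;%d;%s;%d;%s;\n', coords{c, :});   % one coordinate per row
end
fclose(fid);
</code>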

==== Importing the database into NeuroElf ====
If you wish to perform this step on the command line (which can be particularly helpful for pinpointing the problem if an error occurs), you can use the following syntax:

<code matlab importplp.m>
plp = importplpfrommkdadb('MKDA_sample.txt');</code>

This will create a [[xff - PLP format|PLP object]] containing the coordinates as well as all other columns in a numeric representation. **Each non-numeric string will be converted to a unique number** such that, for instance, each unique study label will be stored by its numeric index into the ''Labels'' property of the PLP object.
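
To check which strings were mapped to which numbers, you can inspect that property directly (a minimal check, assuming ''Labels'' simply holds the unique strings in index order, as the description above suggests):

<code matlab>
% list the stored text labels; a value of, say, 3 in a converted column
% then refers to the third entry of this list
plp.Labels
</code>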

To then save the PLP object, please use the following syntax:

<code matlab saveplp.m>
plp.SaveAs('MKDA_sample.plp');
% or simply plp.SaveAs;</code>
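
To reload the saved database in a later session, the generic xff loader can be used (assuming, as with other NeuroElf file formats, that it accepts a file name):

<code matlab>
% reopen the previously saved PLP file
plp = xff('MKDA_sample.plp');
</code>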

Alternatively, you can use the [[neuroelf_gui - MKDA UI|MKDA dialog]] to import the database.

===== Running the analysis =====
For the actual procedure of running the MKDA, please refer to the [[neuroelf_gui - MKDA UI|MKDA UI]] article.
  