MUSE Analysis with MATLAB and Brain Vision Analyzer
To improve the transparency of what we do in our laboratory, and to help others interpret our results, we provide here a step-by-step description of our analysis pipeline. There are two key components to our analysis: first, the raw data analysis using Brain Vision Analyzer 2.1, and next, statistical analysis. What you see below can also act as a tutorial for those looking to develop their skills and familiarity with EEG outputs and analysis, and a sample data set can be downloaded for those seeking to practice their use of Analyzer. The experiment below consists of an oddball task, in which participants were shown either a blue 'target' circle or a green 'control' circle located on either side of a central fixation point. As the target and control stimuli were presented, participants were instructed to mentally count the number of target circles that appeared among the series of circles. At the end of each testing block, participants recorded the number of target circles they had counted on the electronic device housing the oddball task.
Converting MUSE.csv Files for Analyzer
When exporting MUSE data from your device, you will be left with a comma-separated value (.csv) file containing raw MUSE EEG data. Analyzer cannot read .csv files, so one of our PhD students, Chad Williams, has created a MATLAB script to convert this data. Posted below is a zip file containing this script, as well as a link to his GitHub repository.
https://github.com/chadcwilliams/Convert_Muse_Files/blob/master/Convert%20Muse%20PEER/convert_Peer_to_BV.m
muse_conversion_zip_file.zip (zip file, 15 KB)
Creating a Storage Location
Create four folders labeled Export, History, Raw, and Workspace on your desktop. Inside the Export and History folders, create folders labeled Export_Lastname and History_Lastname respectively.
Open Analyzer
Open BrainVision Analyzer by clicking on the red icon.
Create a Workspace
In the File tab, select New. A new workspace window will appear asking you to select locations for Raw, History, and Export files. Using the browse function, select the folders you have created as destinations for each file type. Click OK and save the workspace to your Workspace folder.
Open Your Data
Go to the File tab and click Open. Find your data file and open it. Your raw data will now be in the Analyzer window.
Starting Analysis
Step 1: Data Filtering
Data Filtering
Under the Transformations tab, click Data Filtering and select IIR Filters. Enable the Low Cutoff and insert 0.1 as the frequency. Enable the High Cutoff and insert 15 as the frequency. Enable the Notch and select 60 as the notch frequency. Hit OK.
Step 2: Segmentation
The purpose of segmentation is to extract the 'Target' and 'Control' stimuli of the experiment from the remainder of the data collected. Under the Transformations tab, click Segmentation. In the first window, select Create new Segments based on a marker position. Select Cache data to a permanent file. Hit Next. In the next window, select the markers of interest (S 6 and S 5 in this oddball paradigm), add them to the right-hand column, then click Next. In the third window, select Based on Time and insert -200 in the start box and 600 in the end box. Click Finish.
Step 3: Baseline Correction
Under the Transformations tab, click Baseline Correction. In the pop-up window, insert -200 in the Begin space and 0 in the End space. Click OK.
Step 4: Artifact Rejection
I. Under the Transformations tab, click Artifact Rejection. In the pop-up window, select Automatic Segment Selection under the Inspection Method tab. Uncheck all other options under this tab.
II. Move to the Channels tab. Here, select Enable All.
III. Move to the Criteria tab and in the Gradient sub-tab select Check Gradient and insert 10 into the Maximal Allowed Voltage text box. In the Before Event space insert 200. Do the same in the After Event text box.
IV. Now move to the Min-Max sub-tab and select Maximal Allowed Absolute Difference. In the following text box insert 100. In the Interval Length text box insert the length of your segment (e.g. 800ms: see above). In the Before Event space insert 200. Do the same in the After Event text box. Finally, finish up artifact rejection by clicking OK.
Step 5. Condition Segmentation
Now you must segment your data into the different conditions of your study (e.g., oddball and control). Under the Transformations tab, click Segmentation. In the first window, select Create new Segments based on a marker position. Select Cache data to a permanent file. Hit Next. In the next window, select the marker of interest (S 6 for the oddball target or S 5 for control), add it to the right-hand column, then click Next. In the third window, select Based on Time and insert -200 in the start box and 600 in the end box. Note: The segment length here is dependent on the ERP component of interest. For the P1/N1, a segment length of 600 ms is all you need (-200 to 400 ms); for the FRN/RP, 800 ms (-200 to 600 ms); for the P300, you may need a segment length of 1000 ms (-200 to 800 ms). Click Finish. This step should be repeated for each marker of interest (oddball marker S 6 and control marker S 5).
Step 6. New Reference
Re-Referencing allows you to form a new reference from the average of any grouping of channels in your data you select. As part of our analysis we re-reference to the mastoids (TP9 & TP10).
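Conceptually, re-referencing subtracts the average of the chosen reference channels from every channel. As a sketch of the arithmetic (not the Analyzer procedure itself), in MATLAB, assuming a hypothetical 4 x n channels-by-time matrix MUSEDATA with rows ordered TP9, AF7, AF8, TP10:

```matlab
% Sketch only: assumes rows of MUSEDATA are TP9, AF7, AF8, TP10
mastoid_ref = mean(MUSEDATA([1 4],:),1);        % average the two mastoid channels
MUSEDATA = MUSEDATA - repmat(mastoid_ref,4,1);  % subtract the new reference from every channel
```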
Step 7. Averaging
The ERP waveform is the average of all the segments for a condition. As such, once artifacts are removed one simply needs to generate the mean across the segments to get the ERP waveforms for each channel and condition. It is worth noting here that in our laboratory we typically collapse across the two frontal channels (AF7, AF8) and the two ear channels (TP9, TP10) to create pooled or average channels (AF, TP).
It is important to realize at the outset this webpage is not intended to be an introductory tutorial on how to analyze ERP data. Indeed, that particular topic is far too broad to be covered in a single webpage - or many webpages for that matter. If you are reading this and have no ERP experience, then I strongly recommend you read Steve Luck's book - An Introduction to the Event Related Potential Technique - before proceeding any further. Better, yet - a graduate level course or experience in an ERP lab would be preferable. We want to stress here that it is quite easy to record ERP data with the MUSE - however, interpretation of said data does require some expertise.
Further, we assume that the reader of this webpage is familiar with MATLAB. Again, a one page MATLAB tutorial is not possible.
Data Format
If you are using our code to record data directly from MUSE, then the saved ERP data is recorded as a cell array with 2 cells, each with a 4 x n matrix where 4 is the number of MUSE channels (TP9, AF7, AF8, TP10) and n is the number of recorded time points. Note, with our code and research preset AD this will be a multiple of 500 as we record 1 second of data (500 samples) for each trial. The cells reflect each of the two experimental conditions. On this website, the main task we present is a visual oddball paradigm. For the sample data that is attached below, the first cell reflects the data for the oddball trials and the second cell reflects the data for the control trials. Note, the trial data is concatenated during recording. As such, one needs to reshape the matrix to get data that is channel x time x trial (e.g., epoched or segmented data).
If you load the recorded MUSE data into MATLAB you will see a variable EEG with two cells as outlined above. For the purposes of the pre-processing examples below, we will just work with the data for the oddball condition.
In MATLAB:
MUSEDATA = EEG{1};
Data Processing
We use a series of steps to pre-process the data for ERP analysis. Full code is not provided here - simply examples of each step.
1. Demean
MUSE data is recorded with a large DC offset (also, it is not in uV units but "MUSE" units). As such, the first step needed to process the MUSE data is to remove the mean (typically about 800 MUSE units).
In MATLAB:
mean(MUSEDATA(1,:)) % display the mean of the first channel prior to demeaning
MUSEDATA(1,:) = MUSEDATA(1,:) - mean(MUSEDATA(1,:)); % remove the mean from the first channel
mean(MUSEDATA(1,:)) % display the mean after the original mean has been removed
This step removes the mean of the data for channel 1 (TP9). You should see that the mean before the demean is in the 800's and after is close to zero. Note, you would have to repeat this step for the other three channels. Also note, for the purposes of this tutorial we will only work with the first channel. Of course, you would need to tweak your code to repeat the analysis for each channel.
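The per-channel demean described above can be wrapped in a simple loop over all four channels; a minimal sketch:

```matlab
% Demean all four MUSE channels (TP9, AF7, AF8, TP10)
for channel = 1:4
    MUSEDATA(channel,:) = MUSEDATA(channel,:) - mean(MUSEDATA(channel,:));
end
```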
2. Filter
The next step in pre-processing is to filter the data. We will filter the data using a 2nd order bandpass Butterworth filter. Again, this solution is not ideal as this is not "continuous" data, but it does work and, in our experience, does not leave large artifacts.
In MATLAB:
plot(MUSEDATA(1,1:1000)); % plot a segment of the MUSE data prior to the filter
[b,a] = butter(2,[0.5 15]/(500/2)); % construct the filter
MUSEDATA(1,:) = filtfilt(b,a,MUSEDATA(1,:)); % apply the filter
hold on; % ensure the original plot stays on screen
plot(MUSEDATA(1,1:1000)); % overlay the filtered data
3. Segmentation
In the next analysis step we reshape our "continuous" data into segmented (epoched) data. In a typical ERP experiment this would be done with event markers; here, however, we know that we have exactly 500 samples of data for each trial, so we will just use this information to reshape the data. Also note, we do not have a baseline prior to the onset of the event of interest; as such, our segments at 500 Hz run from 0 ms to 1000 ms.
In MATLAB:
size(MUSEDATA) % show that the data matrix is currently 4 x n in size
number_of_trials = size(MUSEDATA,2)/500; % determine the number of trials represented in the data
MUSEDATA = reshape(MUSEDATA,4,500,number_of_trials); % use reshape to create segments
size(MUSEDATA) % show the new size of the data matrix
4. Trim Data
We will state at the outset that this step is not necessary, but a general rule of thumb in ERP research is that the longer each segment is, the more artifacts you will detect and the more trials will be "lost". Through observation of the data for this experiment, we found that we only needed the first 300 data points. Thus, to reduce the number of artifacts we encountered, we "trimmed" the last 200 data points from the data matrix.
In MATLAB:
size(MUSEDATA) % show the current size of the data matrix (4 x 500 x trials)
MUSEDATA(:,301:end,:) = []; % remove the last 200 data points
size(MUSEDATA) % show the new size of the data matrix
5. Baseline Correction
Typically in ERP studies a baseline correction is used to "zero" the waveforms for each segment. To do this, researchers often use the period from 200 ms prior to stimulus onset to stimulus onset as the baseline. The mean of the baseline is computed and then subtracted from each time point of data. This is done on a segment-by-segment basis.
With the MUSE, we do not have baseline data from prior to stimulus onset, so instead we use the first 50 ms of data as the baseline. The MATLAB code below shows how the baseline would be computed and subtracted from a single trial of data. Note, this code would have to be put in a loop to repeat this step for each segment for each channel.
In MATLAB:
baseline = mean(MUSEDATA(1,1:25,1));
MUSEDATA(1,:,1) = MUSEDATA(1,:,1) - baseline;
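Wrapped in the loop mentioned above, the baseline correction for every channel and every segment might look like this (a sketch; the first 25 samples at 500 Hz span the first 50 ms):

```matlab
% Subtract the mean of the first 50 ms (25 samples) from each channel and trial
for channel = 1:4
    for trial = 1:size(MUSEDATA,3)
        baseline = mean(MUSEDATA(channel,1:25,trial));
        MUSEDATA(channel,:,trial) = MUSEDATA(channel,:,trial) - baseline;
    end
end
```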
6. Artifact Detection
In ERP studies one wants to compute averages that do not contain artifacts: blinks, saccades, or other non-ERP noise. There are many different approaches to doing this; the one we use here is simple. We compute the variance of each segment of data and discard any segment whose variance exceeds 200 uV^2/ms. Alternatively, one could use a max-min approach in which segments are discarded if the absolute difference between the maximum and minimum values of the segment exceeds some threshold, for instance 100 uV. Note, this code would have to be put in a loop to repeat this step for each segment for each channel.
In MATLAB:
check_value = var(MUSEDATA(1,:,1));
if check_value > 200
    MUSEDATA(1,:,1) = NaN;
end
In a later step you could then remove any trials with NaN values. There are other ways to do this of course.
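One way to remove the flagged trials afterwards (a sketch; any trial containing a NaN in any channel is dropped):

```matlab
% Logical index of trials that contain any NaN values, then delete those trials
bad_trials = squeeze(any(any(isnan(MUSEDATA),1),2));
MUSEDATA(:,:,bad_trials) = [];
```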
7. Averaging
The ERP waveform is the average of all the segments for a condition. As such, once artifacts are removed one simply needs to generate the mean across the segments to get the ERP waveforms for each channel and condition. It is worth noting here that in our laboratory we typically collapse across the two frontal channels (AF7, AF8) and the two ear channels (TP9, TP10) to create pooled or average channels (AF, TP).
In MATLAB:
ERPDATA = mean(MUSEDATA,3);
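The pooled channels mentioned above can be created by averaging pairs of rows; a sketch, assuming the row order TP9, AF7, AF8, TP10:

```matlab
% Pool the frontal pair (AF7, AF8) and the ear pair (TP9, TP10)
AF = mean(ERPDATA([2 3],:),1);  % pooled frontal channel
TP = mean(ERPDATA([1 4],:),1);  % pooled ear channel
```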
8. Quantify ERP Components
Typically, one wants to quantify ERP components for statistical analysis. For example, with the oddball data here one might want to get the score for the N200 and P300 for each participant. This process is quite involved, but a simple algorithm for doing so is:
1. Create average ERP waveforms for each subject for the oddball and control conditions.
2. Create an average difference waveform for each subject by subtracting the average control waveform from the average oddball waveform.
3. Create grand average ERP waveforms by averaging ERP conditional waveforms (oddball, control) across all subjects.
4. Create grand average difference ERP waveforms by averaging the difference waveforms across all subjects.
5. Plot the grand average conditional and difference waveforms.
6. Establish the time point at which the N200 and P300 are maximal on the grand average difference waveforms.
7. Take the mean using a window of +/- 25 ms centered on these two times of the average difference waveforms for each subject - these are your "scores" for the N200 and P300. The logic here is simple - these scores should statistically differ from zero if the N200 and P300 components exist - i.e., there is a statistical difference in the subject waveforms meaning the effect is real. You could then compare these scores between groups (young versus old, happy versus sad) or within groups (time 1 versus time 2).
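Step 7 above might look like the following in MATLAB for a single subject, assuming a hypothetical variable DIFFWAVE (a channel x time average difference waveform for that subject), the 500 Hz sampling rate used here, and a hypothetical peak latency taken from the grand average:

```matlab
% Score a component as the mean of a +/- 25 ms window around the grand-average peak
srate = 500;                             % sampling rate in Hz (2 ms per sample)
peak_sample = 150;                       % hypothetical sample index of the component peak
half_window = round(25 * srate / 1000);  % +/- 25 ms expressed in samples
score = mean(DIFFWAVE(1, peak_sample-half_window : peak_sample+half_window));
```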
Other Options
The code in the ZIP files provided here brings .muse data into EEGLAB. The code also saves the EEG structure that is created in Brain Vision Analyzer format. In this manner, one can use either EEGLAB/ERPLAB or Analyzer to preprocess the EEG data.
Note, this works both for continuous data generated via MUSELAB and for data from the sample code we have provided here. Because this data is assumed to reflect trials (the oddball task, the learning task), the above MATLAB scripts add markers to the data for segmentation purposes. Markers are added at the start of each trial (i.e., 0 ms) and by condition: a "1" for one condition and a "2" for the other. This code would need to be adapted for experiments with more than two conditions.