
Wavelet Analysis of VEP

I will next give a short update about wavelet analysis along with VEPs. What is wavelet analysis about, and what are VEPs? I will cover a small background in these slides, followed by how to do the wavelet analysis. First, a background on the VEP. VEP stands for visual evoked potential. As I told you before, an ERP is an event-related potential: how the brain responds to a particular event. In a visual evoked potential there will obviously be a visual stimulus. For example, in the P300 experiment I mentioned before, a 4 centimetre or 6 centimetre ball was used; for visual evoked potentials we flash a light, use a reversing checkerboard, or give some other visual stimulus. How does the brain perceive this visual input? It is processed in the visual cortex, which is mainly in the occipital region of the brain, under the O1, Oz, O2 electrodes. The visual evoked potential is much like the auditory evoked potential: it has the N1-P1-P2 complex. In this experiment the stimulus is a reversing checkerboard presented at about 2 hertz, which means the checkerboard reverses every 500 milliseconds. As in the other experiments, we acquired the data using the 64-channel Neuroscan system.

So the stimulus, the reversing checkerboard, is presented accordingly, and artifact rejection and all the filters, the notch filter and the bandpass filter, are performed as before. Basically, when a high-contrast pattern is observed, the brain responds to that visual stimulus in the visual cortex region. This is how the experiment is done: the reversing checkerboard is presented, the subject is seated so that the viewing angle is kept constant and the line of sight is properly measured, and then the recording is made. What happens is that the signal travels from the retina through the optic nerves to the visual cortex, which sits at the very back of the brain. The visual evoked potential that is obtained looks like this. As with any ERP, amplitude and latency are the two parameters we have to check. For example, in MMN we check the P1-N1-P2 response; for the P300 we check the response at around 300 milliseconds or later; for MMN it should be in the latency range of 150 to 300 milliseconds. So for any ERP analysis, amplitude and latency are what we have to examine in detail; that is what we focus on in every ERP. Next, I will explain what wavelets are and how they change EEG analysis.

Wavelets give a simultaneous time and frequency analysis of the EEG. The EEG is a huge dataset with many waveforms, and there are various methods for analyzing it: the Fourier transform, the short-time Fourier transform, and so on. The problem with those approaches, the FFT or the STFT, is that there is either a loss of information or the method cannot work simultaneously in the time and frequency domains: if the time-domain resolution is good, the frequency resolution suffers, and vice versa. The analysis should be done simultaneously in both domains, and only wavelet analysis makes that possible, so the analysis and the correlation studies can be done together. So what are wavelets about? The whole EEG record is decomposed into small wavelet packets. This step always depends on the sampling rate: only with the right sampling rate can we do the wavelet analysis properly, and based on it the different wavelet coefficients are created. In this analysis we have been working with the wavelet denoising method by Quiroga. Based on Quiroga's paper there is a piece of software called EP_Den, provided by the University of Leicester.

If you search for "EP_Den wavelet denoising Quiroga" in Google, you will find their page with instructions on how to download it. They provide the Matlab code itself, so we just have to run it and supply our data in the right format. Another important thing about this software is that it always takes the number of samples in powers of 2; that is a standard requirement, the sample size must always be a power of 2. The sampling rate used in their tutorial is 256 hertz; we have to set the sampling rate according to our own data, our CNT files. For example, in all the Neuroscan data files the sampling rate is 1000 hertz, so we resample the data to 1024 hertz, which is a power of 2. That is the important step to remember. In the case of, say, OpenBCI data, the sampling rate is 500 hertz, so we resample to the closest power of 2, which is 512 hertz. Based on this sampling rate, the wavelet coefficients are decomposed.
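The resampling rule above can be sketched in Python. This is only an illustration, not the EEGLAB or Neuroscan resampling routine; the helper names `nearest_pow2_rate` and `resample_linear` are my own, and the linear-interpolation resampler is a simplification of what a real toolbox does.

```python
import math
import numpy as np

def nearest_pow2_rate(fs):
    """Nearest power-of-two sampling rate: 1000 -> 1024, 500 -> 512."""
    return 2 ** round(math.log2(fs))

def resample_linear(x, fs_in, fs_out):
    """Resample a 1-D signal by linear interpolation (illustrative only)."""
    t_in = np.arange(len(x)) / fs_in
    n_out = int(round(len(x) * fs_out / fs_in))
    t_out = np.arange(n_out) / fs_out
    return np.interp(t_out, t_in, x)

fs = 1000
x = np.sin(2 * np.pi * 10 * np.arange(fs) / fs)  # 1 s of a 10 Hz sine
fs2 = nearest_pow2_rate(fs)                      # 1024
y = resample_linear(x, fs, fs2)
print(fs2, len(y))                               # 1024 1024
```

The same call with a 500 Hz recording gives a 512 Hz target, matching the OpenBCI case above.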

Basically, the whole dataset is first decomposed into wavelet coefficients; then the data is reconstructed, and after denoising with various algorithms we get the wavelet-analysed signal. In the Quiroga paper they used B-spline wavelets, because their shape is somewhat similar to how ERPs look; that is why those B-splines were chosen. The wavelet coefficients are divided into levels D1, D2, D3, D4, D5, corresponding roughly to the gamma, beta, alpha, theta and delta ranges; the signal is first decomposed into these levels and then reconstructed for the analysis. So what happens first is that after the EEG and ERP analysis, the data is exported; I was showing how we export the data. After exporting, we have to convert the data into ASCII files, the number of samples must again be a power of 2 (I will go through the wavelet code and show where the changes have to be made), and a single-column ASCII file has to be made. These are the three additional steps on top of the usual ERP analysis, and I will demonstrate them: how to do the wavelet analysis and where the changes have to be made in the code.
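To make the level structure concrete, here is a minimal NumPy sketch of a multilevel discrete wavelet transform. I use the simple Haar wavelet rather than the B-splines that EP_den uses, and the function name is my own; the point is only to show how a 256-sample epoch splits into detail levels D1..D5 plus a final approximation, each level covering half the frequency band of the one before, which is how the levels line up with the classical EEG bands at a suitable sampling rate.

```python
import numpy as np

def haar_dwt_levels(x, levels):
    """Multilevel Haar DWT: returns detail coefficients D1..Dn
    and the final low-pass approximation."""
    details = []
    a = np.asarray(x, dtype=float)
    for _ in range(levels):
        even, odd = a[0::2], a[1::2]
        details.append((even - odd) / np.sqrt(2))  # detail (high-pass half)
        a = (even + odd) / np.sqrt(2)              # approximation (low-pass half)
    return details, a

x = np.random.randn(256)            # one 256-sample epoch
details, approx = haar_dwt_levels(x, 5)
print([len(d) for d in details], len(approx))  # [128, 64, 32, 16, 8] 8
```

Because the Haar transform here is orthonormal, the total signal energy is preserved across the levels, which is one reason the decomposition can be inverted exactly for reconstruction.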

This is the sequence of steps in the demonstration. Since we are working on VEP data, first we do the VEP analysis. One important extra step is the resampling, because we want to export the samples in powers of 2. Then the rest is similar: epoching, creating the event list, the VEP triggers, all of the VEP analysis, and finally the wavelet analysis using the software. Those are the demonstration steps I am going to go through. Before that, these are the two Quiroga papers about the wavelet analysis; their website also has various tutorials showing how the output looks as the data is processed. One important point is that wavelet analysis also helps us understand how the brain gets used to the stimulus. An experiment takes about 15 to 20 minutes; initially we concentrate on the triggers and the stimulus, but later on the brain gets used to them, so habituation takes place. That habituation can also be analysed with the wavelet analysis: how it develops as the experiment proceeds. Based on that, we can also estimate how much recording time is required for obtaining the P300, AEP or VEP response.
The first two papers here are VEP-related papers that show how VEPs were first used and how the technique was developed. Next I will show you the demonstration. Thank you.

Hello everyone. As part of the presentation on wavelet analysis using VEPs, I will now give a small demonstration of how to do the analysis. First, as usual, we do the standard EEG analysis: we open EEGLAB and import the CNT file; we have been using CNT files for this VEP, so I take a VEP dataset. I open this VEP demo; it is a CNT file with 69 channels, there are 120 events because we gave 120 triggers, and the sampling rate is 1000 hertz. This is the point where we have to do the resampling; if we follow the same procedure for other data, we change the sampling rate accordingly. Since I am showing the example with VEP, I change the sampling rate from 1000 to 1024, because that is a power of 2 and we also want to obtain the epochs in that range. So the data gets resampled at 1024 hertz.

Here is the resampled data. Before any other step we have to add the channel locations, so we do the usual step of loading the channels; other systems would have the channel locations inside the continuous data itself, but here we have to add the default channel locations externally. This is how that is done. After this comes a pre-filtering step, the notch filter; once the notch filter is applied, I will show you how the triggers look in the VEP. We scroll through the data; the initial responses have a lot of artifacts, so I change the display to 30 channels. These red marks are the VEP triggers. There is only one trigger type here, so we have to create the event list for this one event alone.

First we create the EEG event list; we use the advanced option and load it accordingly. For the VEP, the first column indicates the trigger name, which is 1; 1 is the event code, we label the event type as VEP, set the bin number to 1, and give the description "checkerboard". We enter this in the event list, update the line so it gets added, save it for future use if we want, and apply it. All the events that were coded 1 will now be labelled bin 1, or checkerboard. I will show you how it has changed: all the 1s are now labelled VEP. This is why we do the event list step, to make the program understand that these 1s are the triggers of our interest. Next is the bin-based epoching. Now we have to be very precise, because for the wavelet analysis the frames per epoch must also be a power of 2. Since we resampled to 1024 hertz, if we epoch from minus 1000 to 1000 milliseconds we get the whole 2-second window; but we only want the part of interest, up to about 200 milliseconds at most, so we take roughly the first quarter, the first 25 percent of the samples.

We take from minus 50 to 200 milliseconds, so that we get 25 percent of 1024, which is 256 frames per epoch; that is exactly why we epoch in this range. For a P300 we would choose the window similarly, for example minus 50 to 700 milliseconds, so that the response falls inside the window while the sample count stays convenient. If we run it, we have 120 epochs and each epoch has 256 data points. That is why we chose that epoch range. Suppose we use a different range: I go back, create the event list again, and do the bin-based epoching, but this time I take the window out to 300 milliseconds, so about 350 milliseconds of data. If I run it like that, the frames per epoch will not be a power of 2: it becomes 358. We do not want that; we always want the frames per epoch to be a power of 2.
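The frames-per-epoch arithmetic above can be checked with a couple of lines. These helper functions are my own names, not EEGLAB functions; they just reproduce the calculation the toolbox does when you set the epoch window.

```python
def frames_per_epoch(t_min_ms, t_max_ms, fs):
    """Number of samples in an epoch window at sampling rate fs (Hz)."""
    return round((t_max_ms - t_min_ms) / 1000.0 * fs)

def is_pow2(n):
    """True when n is a power of two."""
    return n > 0 and (n & (n - 1)) == 0

n1 = frames_per_epoch(-50, 200, 1024)
n2 = frames_per_epoch(-50, 300, 1024)
print(n1, is_pow2(n1))   # 256 True  -- the window used in the demo
print(n2, is_pow2(n2))   # 358 False -- the counter-example from the demo
```

This is why the minus 50 to 200 millisecond window at 1024 Hz is chosen: 250 ms of data at 1024 samples per second is exactly 256 samples.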

That is the reason we epoch the way I mentioned; this was just a demo of why we take minus 50 to 200 milliseconds, so that the number of data points comes out as 2 to the power n. So much for the epoching. Next comes the usual step of artifact detection with a voltage threshold. Even though the experiment duration is short, we still have to remove the artifacts. This is a 64-channel dataset, so I exclude the trigger, EMG, HEO and VEO channels, and then accept. If we check, many of the epochs have been rejected and 77 of them have been accepted; even in a short recording there are a lot of artifacts. We update the marks so the artifact epochs are flagged, then reject them so all the flagged epochs are removed, the usual step. If you scroll further there are various other epochs also marked as rejected, and they are removed on that basis.
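A voltage-threshold rejection like the one applied here can be sketched in a few lines. This is a simplified illustration with a made-up threshold and synthetic data, not EEGLAB's rejection routine; the function name and the 100 µV peak-to-peak criterion are my own choices for the example.

```python
import numpy as np

def reject_epochs(epochs, threshold_uv=100.0):
    """Drop any epoch whose peak-to-peak amplitude exceeds the threshold.
    Returns the kept epochs plus accepted/rejected counts."""
    epochs = np.asarray(epochs, dtype=float)
    ptp = epochs.max(axis=1) - epochs.min(axis=1)   # per-epoch peak-to-peak
    keep = ptp <= threshold_uv
    return epochs[keep], int(keep.sum()), int((~keep).sum())

np.random.seed(0)
epochs = np.random.randn(120, 256) * 5.0   # 120 epochs of low-amplitude noise (µV)
epochs[:10, 100] += 500.0                  # contaminate 10 epochs with a large spike
clean, accepted, rejected = reject_epochs(epochs)
print(accepted, rejected)                  # 110 10
```

In the actual recording the same idea left 77 of the 120 epochs accepted and 43 rejected.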

Out of the total 120 epochs, 77 were accepted and 43 were rejected, and we removed those 43 trials from the dataset, so the new dataset has just the 77 epochs. What does the 256 actually mean? For one epoch there are 256 data points; so for a total of 77 epochs there are 256 times 77 data points in all. That is the meaning of the frames per epoch. After artifact rejection, the next step is filtering the data; again we use a Butterworth bandpass filter, from 1 to 30 hertz, and run the filtering. Now we have the filtered dataset. The step after that is averaging. As I mentioned before, averaging is done because we are working with just a few microvolts of signal, so a single epoch is not at all enough for analysis; we average all 77 epochs to get a proper ERP. We name it VEP and create the ERP set.

The ERP set is created, and now we visualize it. There is no bin operation or channel operation here because there is only one bin; if you want, you can do a channel operation, for example averaging only the occipital channels together. But I am just going to show the individual responses. Here also we plot negative up; positive up is also fine. I take a topographic view so that you get a proper picture, and we run the ERP plot. Here we see the reversal taking place; it is a bit weak, but there is a reversal: it starts from the top, goes down and comes back up. This occipital area is the main place where we get the visual evoked potential. We can plot it positive up or negative up, either is fine. This is how the VEP looks; it is a bit different, but as I told you, the amplitude and the latencies are our point of interest. With positive up the peak comes in this region; this is actually quite a different dataset. We can see a proper N1 peak and P1 peak; the first thing we check is always the N1 peak. I think it is better to do a negative up.

We plotted positive up in the previous view; now I will do negative up and check how it looks. Negative up is a bit better because we mostly check the negative peaks. Anyway, this is how the VEP looks. Next I have to show the wavelet analysis. In this case, as I told you, there are 77 epochs with 256 data points each, so we obtain 77 times 256 data points in total. Here the trace dips, goes negative, then positive again: this should be the N1 response and this the P1 response. It is not a textbook VEP, but there is a proper N1-P1 response being obtained.

Anyway, I will do the exporting now: we go into File and export the data, and when we export we can change the extension. We save it as a spreadsheet; you can choose how many decimal places you want, do the transpose, and so on. I save it, and the VEP is exported. If you open it, you have the data for each electrode from minus 50 to 200 milliseconds, so 256 samples per epoch; across the 77 epochs the whole set comes to 77 times 256, which is 19,712 data points for the whole ERP set. From this we can copy one channel's data, paste it into a notepad, and save it, for example as FP1 with an ASCII extension, because the wavelet software always needs its input saved in ASCII format. If you want a negative-up version: these exported data points are all positive up, so we just multiply the whole sheet by minus 1 to get the negative-up version.
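The manual copy-paste into a single-column ASCII file can also be scripted. This is a sketch of the conversion step, assuming one channel's samples are already in memory; the function name is mine, and the exact text format EP_den accepts should be checked against its documentation.

```python
import os
import tempfile
import numpy as np

def save_channel_ascii(values, path, negative_up=False):
    """Write one channel's samples as a single-column ASCII file.
    negative_up=True flips the polarity by multiplying by -1."""
    v = np.asarray(values, dtype=float)
    if negative_up:
        v = -v
    np.savetxt(path, v, fmt="%.6f")   # one value per line

# e.g. 77 accepted epochs x 256 samples, flattened into one column:
data = np.arange(77 * 256, dtype=float)
path = os.path.join(tempfile.gettempdir(), "FP1.asc")
save_channel_ascii(data, path, negative_up=True)
back = np.loadtxt(path)
print(back.shape)                     # (19712,)
```

Reading the file back confirms the 77 x 256 = 19,712 values and the flipped polarity.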

Next, this is the Matlab code for the wavelet analysis. There are three main parameters to check. First the sample size: the number of samples per epoch, which for us is 256, so we enter 256 there. Then the sampling rate, which in our case is 1024, so we enter that. Then the "stim" parameter is where the stimulus onset sits on the axis: we took the window from minus 50 to 200 milliseconds, and that 50 milliseconds of baseline works out to 256 divided by 5, that is 51.2 samples, so we set it to 51.2 and the axis runs from minus 50 to 200. Then there are the contour plots and the trial numbers, which are just for display. To run this we also have to set the Matlab path: we can download the software from their website, add that folder with Set Path, open the script, make the changes accordingly and run it, because some parameters have to be initialized this way.
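The "stim" arithmetic above is simple enough to verify directly. The helper name here is my own; it just converts the pre-stimulus baseline into a sample offset at the given sampling rate.

```python
def stim_onset_sample(pre_ms, fs):
    """Offset of the stimulus onset into the epoch:
    pre_ms of pre-stimulus baseline at fs Hz."""
    return pre_ms / 1000.0 * fs

# 50 ms of baseline at 1024 Hz:
print(stim_onset_sample(50, 1024))   # 51.2  (= 256 samples / 5)
```

So for a minus 50 to 200 ms window at 1024 Hz, the onset lands 51.2 samples into the 256-sample epoch, which is the value entered in the code.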

Now I load one of the occipital-region electrodes I have already prepared; I just showed how the ASCII file is made for those. I open Oz. This and this are almost the same. This is the EP_den wavelet software, and these are the different wavelet coefficients I mentioned before; each coefficient level is applied across the ERP average and produces the waveform in this format. These are the single trials: for each trial it looks like this, and there are nearly 77 trials. This is the contour plot obtained from them. Here, this is actually the negative peak; even though the plot shows positive up and negative down, I imported data that was negative up, so this is actually the positive peak shown as the blue response, and this is the red response. Then there are the automatic denoising options, the neighbouring denoising as well as the MZT, the two advanced denoising algorithms described in detail in the Quiroga paper.

When we apply it, the denoised waveform almost completely overlaps the original. The waveform that was there before is the original dataset we supplied, and after denoising we obtain the red waveform, the denoised dataset. Here we can clearly see the positive peak in blue and the negative peak in red; basically this will be the N1 response, this the P1 response, and then a P2 response follows. As you move up through the trials, the colouring sometimes fades: it is dark here and much lighter there, which means habituation is occurring. That is the wavelet output. Similarly we can process other datasets; I will show Pz as well. This is another ASCII file I made, and here also we create the contour plot. It is not so prominent here, but after denoising you may get a proper response.

See, here this is the negative peak and this the positive peak. That is how the wavelet analysis works according to the Quiroga paper, using this EP_den software. There are various other parameters we can change: the maximum and minimum time, the contours, the trials, and even the sampling rate and sample size. That is the wavelet analysis using VEPs. Here also we can take a similar scripted approach: copy-paste the code, add loops and so on, so that the same analysis can be done for every subject; we can even write Matlab code so that the EEG spreadsheets are converted into column-based ASCII files automatically. People who are proficient in Matlab can do this kind of playing with the numbers. So that is about the wavelet analysis using the VEP.