Video 1
In this lecture we are going to discuss multispectral scanners and imaging devices. So far we have discussed different kinds of platforms, and briefly we have also touched on the different kinds of multispectral scanners on board different satellites. Now we will discuss these multispectral scanners further.

Someone may ask why we are discussing old scanners which are no longer in space, for example Landsat MSS, which was launched in 1972 and worked up to 1976, or the IRS LISS-I and LISS-II sensors. It is a very genuine question. But the data that has been recorded since then is what is most important: we now have more than 40 years of archives holding data from different sensors on board different kinds of satellites. This archival data is becoming very useful, especially for change detection studies and any studies associated with climate change. It means we have 40 years of satellite-based, unbiased recordings of the surface of the earth, whether it is forest, water bodies, bare soil, or land; everything has been recorded from 1972 onward.

A very useful related product: if one would like to see an area between 1972 and the present, Google Earth also provides that archive. Without downloading anything, just go into the history or archive menu and move the slider to the end, and you get the oldest image of that area; if you are lucky, you may get an image from as early as 1972. Since then, images of all parts of the world, including India, are available. That is why it is important even today to discuss the oldest sensors. Those satellites are no longer in service and those sensors are not working now, but their data is available, and over time it is becoming more useful for studies of climate change, land use change, geological change, landslide movements, and other topics. In various types of studies such old data is becoming useful, and therefore these sensors also have to be discussed and understood, both their strengths and their limitations.

Now, coming to this particular discussion: different types of sensors are there, or have been there, on different kinds of satellites. We start with a general classification. There are two basic categories into which we can divide these sensors: one is passive, which is very common, and the other is active, which includes the microwave sensors. Passive sensors can be divided into two main categories, non-scanning and scanning. Non-scanning sensors acquire data in a slightly different way, especially imaging cameras, which provide data in the form of a snapshot, like when we take a digital photograph using a DSLR, a handheld camera, or even a mobile camera. So we take a snapshot rather than scanned images, which are more useful for scientific studies. Using these cameras we can acquire monochrome images or natural color images; these were common when we had film systems.
Now we have digital systems, so we can acquire images not only in the visible part but in the infrared also; our mobile cameras are sensitive to infrared light as well, which is very useful for night vision or nighttime photography. Even in very dark conditions these mobile cameras can acquire images or photographs.

So, continuing: passive remote sensing can have two types, non-scanning and scanning. Among non-scanning sensors you can again have non-imaging and imaging. Non-imaging sensors include the microwave radiometer, magnetometer, gravimeter, Fourier spectrometer, and others. Those who work in the field of geophysics know about the magnetometer, the gravimeter, and the microwave radiometer. The microwave radiometer here is in the passive category; there used to be sensors such as SSM/I and SMMR on the Nimbus series of satellites, which worked in the passive microwave part of the EM spectrum, which is different from active microwave.

When we go for active sensors, again you can have non-scanning and scanning types, and within scanning you have imaging sensors where either the object plane is scanned or the image plane is scanned. These are the real aperture radar and the synthetic aperture radar. The most common active microwave remote sensing on different platforms, for example Sentinel or Envisat, uses synthetic aperture radar. Synthetic aperture radar means that the antenna on board the satellite is not physically that large; a large aperture is created synthetically, and that is why it is called synthetic rather than real aperture radar. Then you also have non-imaging, non-scanning active sensors like the microwave radiometer again; we discussed the microwave radiometer in the case of passive sensors, but there are microwave radiometers which also work in active remote sensing. Then you have altimeters, laser water depth meters, laser distance meters, and many other sensors. So various sensors, depending on the requirements, are there on the satellites.

When we go for passive scanning imaging sensors, either the image plane is scanned or the object plane is scanned. Image plane scanning includes the TV camera and the solid-state scanner, and object plane scanning includes the optical mechanical scanner and the microwave radiometer. The optical mechanical scanner is similar in concept to your common desktop scanner; they are a little different from what we have in space, but concept-wise or technology-wise they are in some ways similar to the space-based scanners. So these are the different types of scanners which are, or have been, on board different satellites.

As we have just discussed, remote sensing can be divided into two broad categories: one is active remote sensing and the other is passive remote sensing. Active remote sensing means that the instrument itself sends pulses or signals towards the earth, and the backscatter which reaches back, after the energy from the satellite has interacted with the different objects present on the earth, is recorded by the same platform. When this is done we get radar images, very common radar images; I will be showing an example.
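The classification described above can be summarized as a small lookup structure. Below is a minimal sketch in Python; the grouping and example instruments are taken from the lecture's own classification, not from any library, and the key names are of course just illustrative choices.

```python
# A sketch of the sensor taxonomy described above:
# passive vs active, scanning vs non-scanning, imaging vs non-imaging.
SENSOR_TAXONOMY = {
    "passive": {
        "non_scanning": {
            "non_imaging": ["microwave radiometer", "magnetometer",
                            "gravimeter", "Fourier spectrometer"],
            "imaging": ["camera (monochrome / natural color snapshot)"],
        },
        "scanning": {
            "imaging": {
                "image_plane_scanning": ["TV camera", "solid-state scanner"],
                "object_plane_scanning": ["optical mechanical scanner",
                                          "microwave radiometer"],
            },
        },
    },
    "active": {
        "non_scanning": {
            "non_imaging": ["microwave radiometer", "altimeter",
                            "laser water depth meter", "laser distance meter"],
        },
        "scanning": {
            "imaging": ["real aperture radar (SLAR)",
                        "synthetic aperture radar (SAR)"],
        },
    },
}

# Example: list the active scanning imaging sensors mentioned in the lecture.
print(SENSOR_TAXONOMY["active"]["scanning"]["imaging"])
```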
So, when the pulse is sent, its strength is maximum at the moment it is dispatched towards the earth; it takes some time to travel, and then there is the backscatter, and then maybe backscatter again. These peaks are recorded by the sensor, as also shown here: the first pulse, the second, the third, the fourth, and so on. In the returns, vegetation, which is at a different distance from the sensor or platform, will have a different arrival time than, say, a house or building. If I take everything in units of seconds just for example (of course it is really milliseconds or less), then the return from the building arrives at, say, 18 seconds, whereas the backscatter from a tree, which is a little farther away than the building, arrives at 26 seconds. This is how these peaks are recorded by active microwave sensors. When you have a series of such data sets, an image from the microwave sensor can be created: pulse by pulse the signal is sent and the return recorded, sent and recorded, and in this way the images are built up.

The important point to mention here is the need for power to operate, because each satellite carrying an active microwave sensor has to have power to send those signals or pulses towards the earth. Therefore the life of such active remote sensing platforms is generally shorter than that of passive remote sensing platforms. Passive remote sensing does not have to send any energy towards the earth; the energy comes from the sun, which illuminates part of the earth, or from the emission coming out of the surface of the earth, and that is what is recorded by the satellite. So in passive microwave, or passive remote sensing in general, the source may be surface emission, the cosmic background, or sometimes atmospheric emission, but in the daytime the main source is the sun and its reflection. In the case of surface emission, what we record is brightness temperature.

Now, where are these sensors located in the EM spectrum? This is very important, whether for active sensors or passive sensors. On this chart, at the top, the active sensors are shown in red; at the bottom, the passive sensors are shown in blue; and the relevant parts of the EM spectrum are shown in black and green. Going one by one, starting with the active sensors: we have laser fluorosensors, which are not very common; emission measurements are made between 430 and 750 nanometres, and a typical excitation wavelength is 255 nanometres, in this part of the EM spectrum. When we go to the microwave region of the EM spectrum we find a lot of sensors, unlike the case of laser fluorosensors, where the field is still coming up.
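To make the pulse-timing idea concrete: the distance (slant range) to a target can be recovered from the two-way travel time of the pulse, since the signal travels to the target and back at the speed of light. A minimal sketch, assuming the standard relation R = c·t/2 and using made-up travel times purely for illustration:

```python
# Slant range from two-way travel time of a radar pulse: R = c * t / 2.
# The factor of 2 accounts for the pulse travelling to the target and back.
C = 299_792_458.0  # speed of light in m/s

def slant_range(two_way_time_s: float) -> float:
    """Return the one-way distance to the target in metres."""
    return C * two_way_time_s / 2.0

# Illustrative (made-up) return times: a nearer building echoes back
# before a farther tree, which is how returns are separated in time.
for name, t in [("building", 18e-6), ("tree", 26e-6)]:  # times in seconds
    print(f"{name}: {slant_range(t) / 1000:.1f} km slant range")
```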
We do not have regular satellites carrying these sensors orbiting the earth; they have been operated or tested through shuttle missions or from airborne platforms. But when we come to the radar range, starting from about 1 centimetre to 10 centimetres and more, we have a lot of bands within this range: the K band, X band, S band, L band, and of course the very common C band. For all of these you have SAR, that is synthetic aperture radar, or SLAR, that is side-looking airborne radar. So these are the types of radars or microwave radiometers that exist.

They generally work on individual bands, usually either the C band or the X band, which are very common. Of course there are further divisions of the K band, Ka and Ku; certain missions use these and other bands, but the most common bands currently are C, X, or L. The L band, which goes more towards radio waves, is also being used, including for some transmissions related to navigation satellites. So microwave missions like Sentinel or Envisat were using the C band; C band means this much of the microwave region of the EM spectrum is being used by the Envisat or Sentinel satellites.

When we come to passive sensors, we do have ultraviolet imagers, but spaceborne operation is not really feasible; drone-borne or airborne operation is possible. The most common passive sensors work in the visible or near infrared, which is quite a broad window compared to the other windows available, and a lot of satellites of different countries, with different types of sensors, have been working or are working in the visible and near infrared; spectrometers are there, scanners are there, with typical wavelengths from 0.4 micrometre to 1.1 micrometre, or 400 nanometres to 1100 nanometres. In between, you do not have an atmospheric window. Then you come to the thermal infrared, where the region is 8 micrometres to 14 micrometres; this is the typical wavelength range for thermal infrared sensors. Landsat MSS did not have a thermal channel; it was mainly focused on the visible and near infrared, whereas Landsat TM, the Thematic Mapper, had a thermal channel as well, just a single channel. NOAA AVHRR has 2 channels in this thermal part of the EM spectrum, which we discussed in an earlier lecture.

So different satellites work in different parts of the EM spectrum, and within those parts they differ: in the thermal, some have a single, quite broad channel, while some have 2 relatively narrower channels. Likewise, we have active sensors operating mainly in the radar part of the EM spectrum, whereas passive sensors work mainly in the visible and thermal infrared parts. Passive microwave is also there; as I said, the SSM/I and SMMR sensors on the Nimbus satellites were there, but they had a relatively very poor resolution of about 30 kilometres, yes, 30 kilometres. Still, the data is very valuable; we used this data to derive snow depth over parts of the Himalaya very successfully, but at 30 kilometre resolution. NOAA provides data at 1 kilometre resolution, and now we are talking about centimetre resolution as well. So remote sensing data of all kinds of spatial resolution, either active or passive, is available.
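The band letters mentioned here correspond to frequency ranges, and wavelength follows directly from λ = c / f. A small sketch, using commonly quoted approximate IEEE band boundaries (the exact limits vary a little between sources, so treat the numbers as indicative):

```python
# Convert radar band frequency limits (GHz) to wavelength (cm): lambda = c / f.
C_M_PER_S = 3.0e8  # speed of light, approximate

# Approximate IEEE band boundaries in GHz (values vary slightly by source).
BANDS_GHZ = {"L": (1, 2), "S": (2, 4), "C": (4, 8),
             "X": (8, 12), "Ku": (12, 18), "K": (18, 27), "Ka": (27, 40)}

def wavelength_cm(freq_ghz: float) -> float:
    return C_M_PER_S / (freq_ghz * 1e9) * 100.0  # metres -> centimetres

for band, (f_lo, f_hi) in BANDS_GHZ.items():
    # Higher frequency means shorter wavelength, so the limits swap order.
    print(f"{band}-band: {wavelength_cm(f_hi):.2f}-{wavelength_cm(f_lo):.1f} cm")
```

For example, this puts the C band at roughly 3.75 to 7.5 cm, which is consistent with the lecture's "1 centimetre to 10 centimetres and more" radar range.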
Video 2 
We continue with the passive sensors. Passive sensors do not have their own energy source, so they depend on an external source of energy, either the sun or emitted thermal energy (thermal infrared). These passive remote sensing systems measure energy that is naturally available; that is why they are called passive sensors. They can only be used to detect energy when naturally occurring energy is available; for reflected energy they need the sun to illuminate the scene, and only then will they work. There is no reflected solar energy available at night. That is why, when I was discussing NOAA AVHRR data, I said that all 5 channels work in the daytime, while the thermal infrared channels work both day and night; at night only 3 channels of NOAA AVHRR work, which are the infrared and thermal infrared channels, including the 2 thermal infrared channels. The 2 channels devoted to the visible part of the EM spectrum will not work, or will not record anything, because the natural light source, the sun, is not available. So at night you have recordings only in 3 channels.

It is also possible to acquire nighttime data from the newer Landsat series, like OLI, and from Landsat TM, because they have a channel devoted to the thermal infrared part of the EM spectrum. But it is not normally acquired; it has to be requested. So if you start searching the archives, you may not find nighttime thermal infrared data from the Landsat TM or ETM+ sensors.

Now, energy that is naturally emitted by the earth can be detected day and night by thermal sensors such as thermal infrared sensors. As I said, for NOAA AVHRR the 2 thermal channels work in daytime and nighttime; in the case of Landsat TM and ETM+ the thermal channels also work, and the same is true for ASTER. This emitted energy can be detected as long as the amount of energy is large enough to be recorded, and that is why the spatial resolution of the thermal channels on the same satellite is different. Of course, in the case of NOAA AVHRR the visible and infrared channel resolution is relatively coarse anyway, 1 kilometre, so the thermal channels also have a resolution of 1 kilometre. But in the case of Landsat TM, the normal channels have 30 metre resolution while the thermal channel has 120 metre resolution, and in the case of the OLI series the normal channels have a resolution of 15 metres while the thermal channel has 60 metres. Recall the four rules of remote sensing: when you move towards the thermal part of the EM spectrum, the energy available to the sensors is very little. In order to have sufficient energy for recording you have to cover a larger area, and that means compromising on spatial resolution. That is why thermal infrared sensors generally have coarser resolution than visible channels. The slide shows the different bands of the latest Landsat example: the reflective bands are shown here, and the thermal infrared channels are shown separately.
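The "less energy in the thermal, hence coarser pixels" argument can be illustrated with Planck's law: the radiance a roughly 300 K surface emits around 10 µm is modest, typically an order of magnitude below reflected-sunlight radiances in the visible. A rough sketch, assuming a blackbody surface and an approximate solar spectral irradiance value for the comparison (both numbers are illustrative, not sensor specifications):

```python
import math

# Planck spectral radiance B(lambda, T) in W m^-2 sr^-1 um^-1.
H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck_radiance(wavelength_um: float, temp_k: float) -> float:
    lam = wavelength_um * 1e-6  # micrometres -> metres
    b = (2 * H * C**2 / lam**5) / (math.exp(H * C / (lam * KB * temp_k)) - 1)
    return b * 1e-6  # per metre -> per micrometre

# Thermal emission of a ~300 K surface near 10 um (thermal IR window).
emitted_tir = planck_radiance(10.0, 300.0)      # roughly 10 W m^-2 sr^-1 um^-1

# Very rough reflected-sunlight radiance in the visible (0.5 um), assuming
# ~1900 W m^-2 um^-1 solar irradiance and a 0.3 Lambertian albedo.
reflected_vis = 1900.0 * 0.3 / math.pi          # roughly 180 W m^-2 sr^-1 um^-1

print(f"Emitted thermal radiance   ~{emitted_tir:.1f} W/m^2/sr/um")
print(f"Reflected visible radiance ~{reflected_vis:.0f} W/m^2/sr/um")
```

The weaker thermal signal is why a thermal detector is given a larger instantaneous field of view, i.e. a coarser pixel, to collect enough energy.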
And, of course, we can assign different colors to different bands as per our requirements and create a false color composite, or color combination, as shown here. We have been using the term false color composite and showing images in false color composites, so this is the right time to discuss what a false color composite basically is. As you can see in the example, I have 3 channels: one is green, another is red, and the third is near infrared. The reason for always including the near infrared channel is that, as you know from spectral response curves, vegetation has its maximum reflection in the infrared or near infrared, and therefore we keep that channel so that the color composite image also carries good information about vegetation.

Now, this is an infrared channel, so if we assign it the green or blue color, it is going to be a slightly different scenario. So generally what is done, as you can see here, is that the red channel of the visible part is assigned the green color, the near infrared channel is assigned the red color, and the green channel is assigned the blue color; the only available colors in the additive color scheme are just 3, blue, green, and red, BGR or RGB. As you know from the additive color scheme, which we learned at a very early stage of our schooling, the green part of the spectrum, which initially was represented in grey shades, or black and white panchromatic, is now assigned the blue color, so it appears in different shades of blue. Similarly, the red channel of the visible part of the EM spectrum is assigned the green color, so we see it in different shades of green, and the near infrared channel is assigned the red color, so we see it in red. When we combine these 3 bands, now blue, green, and red, through the additive color scheme as shown here, we get a false color composite.

Why false color? Because the channel in which vegetation has the highest reflectance, the infrared channel, has been assigned the red color; therefore vegetation appears red, whereas naturally vegetation appears in shades of green. Since this combination is not a true color, we use the term false color. In a standard false color composite image, the infrared channel is always assigned the red color, and that is what makes it a false color composite. The advantage of a false color composite, as indicated earlier, is that vegetation comes out very clearly, whereas in the visible channels vegetation does not come out very clearly. Because of the high reflectance of vegetation in the infrared, to which we have assigned the red color, vegetation appears in different shades of red, water bodies appear in blue, and bare ground and things like rooftops of buildings appear in white. So the discrimination of different objects in a color composite becomes easier if we go for a false color composition; that is why we make false color composites. If we have multiple channels, as in the case of Landsat ETM+ or ETM, then it is also possible to see near true color, although truly true color images are not really available.
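The band-to-color assignment described here (near infrared shown as red, red as green, green as blue) is straightforward to express in code. A minimal sketch using NumPy, assuming the three bands are already available as equally sized 2-D arrays; the band names and the simple percentile stretch are illustrative choices, not a specific product recipe:

```python
import numpy as np

def false_color_composite(nir: np.ndarray, red: np.ndarray,
                          green: np.ndarray) -> np.ndarray:
    """Standard FCC: NIR -> red gun, red -> green gun, green -> blue gun."""

    def stretch(band: np.ndarray) -> np.ndarray:
        # Simple 2-98 percentile contrast stretch to 0..1 for display.
        lo, hi = np.percentile(band, (2, 98))
        return np.clip((band - lo) / (hi - lo + 1e-12), 0.0, 1.0)

    # Stack as an RGB image: vegetation (high NIR) will appear red.
    return np.dstack([stretch(nir), stretch(red), stretch(green)])

# Illustrative use with random data standing in for real bands.
rng = np.random.default_rng(0)
nir, red, green = (rng.random((100, 100)) for _ in range(3))
rgb = false_color_composite(nir, red, green)   # shape (100, 100, 3), floats 0..1
```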
So what you see in Google Earth are not false color composites; you see near true color images. If you fly in an aircraft at 10 kilometres altitude and look towards the ground, everything looks grey or brownish; it does not look very green like vegetation, and water does not appear blue. But when we see Google Earth images, we see vegetation as green and water bodies as blue. So these are not really true color images, but near true color images. The main purpose of combining these multispectral channels into a color composite through the additive color scheme is to discriminate different objects; our interpretation or analysis of these images becomes much easier, and that is why these color combinations are used. One example here is from NOAA AVHRR; in previous lectures also, with different references, we have seen these 2 images, and I am showing them again. This is a color composite, and this is a thermal infrared image of the same part, but taken at nighttime, whereas you can see the other one is from the daytime. Of course, on the same day it also acquired daytime thermal infrared images, and those would look quite similar to what we are seeing in this 23:00 local time image of the same area.

Now about active sensors. Active sensors, as you know, have their own energy source for illumination, and they emit or send pulses or signals towards the earth target on which they are focused. One important thing about active and passive sensors which I should not miss here: generally passive sensors are nadir-looking, looking vertically downward, which we call nadir viewing. Active sensors, on the other hand, have to collect the backscattered signals, and therefore they are off-nadir viewing, as is also shown here: they look obliquely towards the part of the earth which they are scanning or whose return they are recording. That is the backscattering geometry, and it is very important to remember that passive sensors generally are nadir viewing whereas active sensors are off-nadir, or oblique, in direction. The design is like this because they send the signals and the same sensor or spacecraft has to collect the backscattered signal later on; with nadir viewing it is not possible for an active sensor to collect those backscattered signals properly.

The advantages of active sensors include the ability to obtain measurements anytime, because they do not depend on sunlight or reflected light at all; the sensor has its own generator of light or energy, so it does not have to depend on the sun. So in principle the data can be acquired anytime, although normal spaceborne active sensors like Sentinel, Envisat, RADARSAT, ERS, or RISAT have a fixed orbit and acquire data accordingly; if the sensor is airborne, it is possible to acquire data at any time. Data can be acquired regardless of the time of day or the season, whether it is the monsoon season or the winter season, because these sensors work at long wavelengths, and these long wavelengths can penetrate through clouds. Scattering and other effects do not affect the microwave part of the EM spectrum.
Therefore they can acquire data at any time of the day or in any season; that is the biggest advantage. So if somebody would like to use active remote sensing data for flood studies, it is possible, because when you have floods you generally also have clouds; the clouds will not hinder the signals and you will have clean images, whereas in the visible, infrared, and thermal infrared the clouds will create a problem and the images may not be very useful. In such situations people resort to active sensors. Active sensors can also be used to examine wavelengths that are not sufficiently provided by the sun, such as microwaves, and they give better control over the way the target is illuminated; there is, therefore, control over which wavelengths of the microwave region are used. These active microwave systems require the generation of a fairly large amount of energy. That is why I said that, since they run on batteries and solar panels, and you cannot carry too many batteries or the load will increase, there is always a compromise, and the compromise is with lifetime: active microwave satellites do not last as long as normal passive satellites. The best example of active sensors is the synthetic aperture radar, or in short SAR; ASAR and others are also there.

For one of the examples, I will first go to the next slide and then come back. Just to give you a comparison, this is a radar image taken by our own Indian RISAT-1 satellite (RISAT-2 will be launched soon, maybe at the end of this year or next year), and this is a Google Earth image of the same area. See the difference: all the fine details about the morphology, the terrain, and the undulations can be seen, because vegetation is not creating much of a problem and clouds and other things are not there. Therefore geological structures like folds and others can be seen very clearly here, compared to what we see in the visible channels or near true color composites, where it is not so easy relative to what we are seeing here. It is also very easy to see the water bodies in the microwave region, because very little microwave energy returns to the sensor from smooth water (the signal is reflected away or absorbed), and therefore a water body will generally appear black in microwave images, as in the example shown.

So when somebody is working in an area for certain applications, like geological or structural mapping, or maybe for some civil engineering purposes where one would like to have a fair idea about the terrain conditions, and the terrain is not extreme like Himalayan conditions, with undulations, hills, and valleys but not very rugged, then radar can work wonderfully well, as the example shows. Landforms, geomorphology, and topography can be seen very clearly in radar images. This example is from Northeast India, where you generally have a lot of clouds throughout the year; but as I mentioned earlier, with radar images there is no problem with clouds, and therefore you get very clear pictures. In the case of visible channels, maybe after 20 or 30 orbits you may get one day when the image is completely cloud free; but because of the presence of vegetation, or the presence of clouds, you do not get a very clear picture, as in the example I am showing here.
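Because open water returns very little energy to the radar and appears dark, a common first-pass way to map flooded or water-covered pixels is a simple threshold on the backscatter values. A minimal sketch, assuming a calibrated backscatter image in decibels and an illustrative threshold (real flood studies tune this per scene, band, and polarisation):

```python
import numpy as np

def water_mask_from_sar(sigma0_db: np.ndarray, threshold_db: float = -18.0) -> np.ndarray:
    """Return a boolean mask where backscatter is low enough to suggest open water.

    sigma0_db: 2-D array of calibrated backscatter in dB.
    threshold_db: illustrative cut-off; in practice chosen per scene and band.
    """
    return sigma0_db < threshold_db

# Illustrative use with synthetic data: land around -8 dB, a dark "lake" at -22 dB.
scene = np.full((200, 200), -8.0) + np.random.default_rng(1).normal(0, 1.5, (200, 200))
scene[80:120, 60:140] = -22.0
mask = water_mask_from_sar(scene)
print(f"Water fraction: {mask.mean():.2%}")   # roughly the area of the dark patch
```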
Though the other image is of relatively high resolution compared to what you are seeing here, the radar image is clearer because it works in the microwave region: different rocks located at different places behave a little differently with these microwaves, and therefore you get a complete, detailed image of that area. That is the biggest advantage of radar images.
Video 3