Video 1

Hello everyone, and welcome to a new topic of this course, Remote Sensing Essentials: different techniques of image acquisition. In this discussion we will consider how images are acquired and look a little inside, not really at the electronic part, but at the different types of sensors, which we will be discussing.
The most familiar digital images are the snapshots taken by a digital camera or a handheld mobile camera, which generate a digital image that can be stored on your computer, displayed on a screen, and printed. This is the most familiar way we know digital images. There are some satellites that work the same way. To give an Indian example, in the INSAT series of satellites, INSAT-3D is currently in space. These are geostationary satellites; we have already discussed the different types of orbits, so I am not going to go into detail about the orbit. These geostationary INSAT satellites carry digital cameras which can provide an image of India and the surrounding countries every half an hour; it is called the India disc, because it covers a circular area.
There are various such satellites. The purpose of these satellites is communication, telecommunication and also TV broadcasting, but at the same time, since they are geostationary, they are looking towards India all the time. The digital cameras on them are not the handheld SLR or DSLR type we use; more sophisticated cameras have been installed and programmed so that they can send images every half an hour. This frequency can also be increased if required, especially for monitoring cyclones and similar events, but normally they provide the data every half an hour. These are snapshots: they are not taken line by line, and there is no scanning as in a flatbed scanner or a photocopy machine. Those are scanning devices, whereas these on-board digital sensors take a snapshot in just one go; they are clearly not scanning devices. Images from orbiting satellites, on the other hand, are also digital images like your snapshots, but they are taken line by line. As the satellite moves in its orbit, the data is recorded line by line and at the same time may be sent towards the earth through a direct broadcasting method, which we will touch on later in this discussion. So these orbiting sensing devices on board satellites scan line by line and build up an image. In contrast, in your digital camera it is a snapshot: in just one go, in a split second, an image is captured. So the sensitivity and other things are different, and the whole mechanism is also different.

Now, the sensors we discuss in remote sensing, imaging sensors, are nothing but electronic devices which produce an electrical output proportional to light intensity. These are light-sensitive sensors: the higher the light, reflection or emission, the more output they generate and the higher the value that is recorded as the pixel value. There are mainly three types of sensors. One is the single imaging sensor. Another is the line sensor, which is very common. And then there are array sensors, which you see in most snapshot cameras; since they are also on many satellites, they have been included under this category. So: a single imaging sensor, a line sensor, and an array sensor.

In the case of the single imaging sensor, just one single cell is there, a CCD (charge-coupled device) you can say, and whatever energy comes over it, reflection or emission or whatever, it converts into an electrical current: higher energy means higher values, and so on. A schematic is shown on the left side, while on the right is the front side of an illuminated sensor with a pixel size of 5.3 micrometres; a single sensor is a very tiny electronic device.

In the case of the line sensor, you have an array of single sensors, CCDs, arranged in a line. Like here, this is a real photograph: it is just 36 millimetres long in total and has 2048 CCDs arranged linearly. So if this is put on a satellite as a sensor, the image width would be 2048 pixels, and it would cover quite a large area from that distance of roughly 900 kilometres. It is a solid-state linear sensor, nowadays the most common sensor on board different satellites.

For snapshotting devices, the digital cameras on board satellites, a 2D (two-dimensional) array sensor is used, in which the CCDs are arranged in a two-dimensional matrix of rows and columns. A real example is shown here: a matrix array sensor of 16 by 10 rows and columns, with an active sensing area of about 80 by 50 millimetres. Compared with the linear array we saw, which had a continuous run of 2048 CCDs, here we have just a few CCDs arranged in a matrix; this is just an example of that.

When a single sensor is used with film or some other light-sensitive material, the sensor rotates and keeps recording as it goes: one image line is output per rotation increment, with a full linear displacement of the sensor from left to right. Such sensors are not on satellites anymore; the most common now are linear sensors. For image acquisition with single sensors, the most common sensor of this type is the photodiode, constructed of silicon material, whose output voltage waveform is proportional to light. They are used for different purposes, but not really for serious remote sensing; single sensors are not used there. With linear sensors a different scenario applies. Even with one single sensor a 2D image can be generated by this kind of arrangement with a rotating drum: line by line, the single sensor records the energy on the film or whatever material you have. The single sensor is mounted on a lead screw, as shown on the right, which provides motion in the perpendicular direction. This mechanical arrangement has its own difficulties, but the trade-off is that it is inexpensive, and one can acquire high-resolution images even with a single sensor, so low-cost devices are possible.

Now, the most common one is the linear sensor array. You have just one line, as I already told you, with 2048 CCDs arranged in it, and it moves; this is the direction of motion, so the sensor strip can acquire data for the entire area as the strip moves with the satellite. This linear array provides imaging elements in only one direction; motion perpendicular to the array provides imaging in the other direction. This type of arrangement is used in most flatbed scanners and in satellite sensors also. Sensing devices with 4000 or more in-line sensors are possible; I have shown you a real 2048 example. These one-dimensional imaging sensor strips, which respond to various bands of the electromagnetic spectrum, are mounted perpendicular to the direction of flight. They are fixed devices with no mechanical parts, so their life is long, and performance-wise they are also very good.

When we go for data acquisition, this is how a single sensor works; when you have multiple sensors, the same thing is repeated 2048 or 4000 times, depending on the array. With just one single sensor: there is the illumination source, the energy source, and whatever energy is reflected from the scene goes through the imaging sensor system, is projected, and is recorded, and you get the output digital image like this. Likewise these images are created. This type of arrangement is generally found in digital cameras or satellite sensors; an example is INSAT-3D, because it takes just a snapshot. But when we go for a linear one, the linear array is arranged across the motion of the sensor or satellite, and therefore it is possible to acquire images line by line.

The only problem with such a linear array is the calibration of each CCD. These have to be calibrated perfectly, meaning their individual performance should match the others. Otherwise, in the output image you might see a striping effect, which is very difficult to remove. That is one of the drawbacks: if a linear array has 4000 CCDs, then all 4000 CCDs should give the same response to the same type of object. The response of each sensor is proportional to the integral of the light energy projected onto its surface, and the response functions of all of them should be very well matched. So the calibration part is most important for a linear array or any such sensing devices put on satellites.
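As a toy illustration of why per-CCD calibration matters, here is a minimal sketch in Python with NumPy (the detector count follows the 2048-CCD example above, but the gain spread and scene values are invented for illustration, not taken from any real instrument). A pushbroom acquisition with slightly mismatched detector gains produces vertical striping, and a simple relative calibration removes it:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

n_detectors = 2048        # CCDs across-track, as in the lecture's example
n_lines = 500             # lines built up as the satellite moves along-track

# Hypothetical per-detector gains: ideally all 1.0, but each CCD
# responds a little differently when calibration is imperfect.
gains = 1.0 + rng.normal(0.0, 0.05, size=n_detectors)

# A perfectly uniform scene (same object everywhere) should give a flat image.
scene = np.full((n_lines, n_detectors), 100.0)

# Pushbroom acquisition: every image line is the scene row scaled by the
# same fixed detector gains, so any gain error repeats down every column.
raw = scene * gains[np.newaxis, :]

# The result shows striping: column means differ from detector to detector.
print("column spread before calibration:", raw.mean(axis=0).std())

# Relative calibration: normalise each detector so its mean response over
# an (assumed homogeneous) reference matches the overall mean response.
correction = raw.mean() / raw.mean(axis=0)
calibrated = raw * correction[np.newaxis, :]

print("column spread after calibration: ", calibrated.mean(axis=0).std())
```

For a uniform scene the correction cancels the gains exactly; on real imagery the same idea works only as well as the homogeneity of the reference, which is why on-board calibration is so important.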

Video 2

So, as you know, an image is a two-dimensional function: we define a function f(x, y), where the value or amplitude of f at spatial coordinates (x, y), row and column we can say here, is a positive scalar quantity, a positive integer value. This f basically becomes the pixel value, which is nothing but the amplitude of the recorded energy, reflected or emitted. When an image is generated from a physical process, its values are proportional to the energy radiated by the physical source; this is exactly what happens in satellite-based sensing. As a consequence, f(x, y) must be nonzero and finite: 0 < f(x, y) < ∞. It is not an infinite number; remember, it is finite. The radiometric resolution, the number of bits, is fixed before the launch of such sensors: whether it is 6 bits, 8 bits, 11 bits or more is as per the design and requirement.

This function f(x, y) may be characterized by two components: the first is the amount of source illumination incident on the scene being viewed, and the second is the amount of illumination reflected by the objects in the scene. These are called the illumination and reflectance components, denoted i(x, y) and r(x, y) respectively; they exist for every individual pixel (x, y). The two functions combine as a product to form f(x, y), so we can write f(x, y) = i(x, y) × r(x, y). For this simple image-formation model, 0 < i(x, y) < ∞ and 0 < r(x, y) < 1, where r(x, y) = 0 means total absorption and r(x, y) = 1 means total reflection. So the reflectance values are bounded between 0 and 1.

We call the intensity of a monochrome image, a black-and-white image, at coordinates (x, y) the gray level, l, of the image at that point. Now we can move beyond just the binary image: l = f(x, y) lies in the interval [0, L − 1], where l = 0 indicates black and l = L − 1 indicates white; in the binary example, 0 is black and 1 is white. All the intermediate values are shades of gray varying from black to white: two extremes, 0 and
1, with the remaining values in between. But since we talk in terms of integers, these values depend on the radiometric resolution, the number of bits: for 8 bits, for example, L − 1 equals 255 and the minimum equals 0, and the rest of the values vary in between. So this is how the image is constructed.

Sampling and quantization are the two important processes used to convert a continuous scene into a digital image. Quantization means assigning the number of bits per pixel, while sampling, indirectly, controls your spatial resolution. Let me take a slightly different example. Here a continuous image is shown; if I take a cross-section and plot it, this is the line between points A and B. The white part has, of course, very high values of reflection; then the profile suddenly drops, you have a curve like this, then the white part comes again, and finally you reach B. Image sampling basically refers to the discretization of the spatial coordinates: instead of a continuous profile, we take the cross-section at discrete coordinates, rows and columns, along the x-axis. Quantization refers to the discretization of the gray-level values, the amplitude, along the y-axis. So as we move along, we get the spatial coordinates starting from the first column to the last column for this particular example, and the y-axis refers to the amplitude, the pixel values, which are plotted there.

Given a continuous image f(x, y), digitizing the coordinate values is called sampling, which is exactly what we have done here, and digitizing the amplitude, the intensity values plotted along the y-axis, is called quantization. This is what we see in the figure: at different locations the values are collected and shown like this, a typical digitized scan line with its different pixel values. Both processes are shown: the data has been collected not continuously but at certain intervals, and that is sampling. So, going back to that line once more: given a continuous image f(x, y), digitizing the coordinate values is called sampling, and digitizing the amplitude, the intensity values, is called quantization. This is how an image is basically constructed and understood.
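The formation model and the two digitization steps above can be sketched in a few lines of Python (a toy profile: the illumination and reflectance functions below are invented for illustration, not real sensor data):

```python
import numpy as np

# Continuous scene modelled as illumination times reflectance, f = i * r,
# with 0 < i < infinity and 0 <= r <= 1 (values here are illustrative).
def f(x):
    illumination = 100.0                    # i(x): a uniform source
    reflectance = 0.5 + 0.5 * np.sin(x)     # r(x): varies between 0 and 1
    return illumination * reflectance       # f(x) = i(x) * r(x)

# Sampling: evaluate the continuous profile only at discrete positions
# (the discretization of the spatial coordinate).
n_samples = 16
xs = np.linspace(0.0, 2.0 * np.pi, n_samples)
samples = f(xs)

# Quantization: map each sampled amplitude onto one of L = 2**bits
# integer gray levels (the discretization of the amplitude).
bits = 8
L = 2 ** bits                               # 256 levels for an 8-bit image
levels = np.round(samples / samples.max() * (L - 1)).astype(int)

print("gray levels along the scan line:", levels)
print("range:", levels.min(), "to", levels.max())
```

Increasing `n_samples` corresponds to finer spatial resolution, and increasing `bits` corresponds to finer radiometric resolution, exactly the two knobs discussed above.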

Video 3

Now, we have discussed different sensors, and we have also discussed quantization and sampling, because if a sensor on board a satellite has, say, 4000 CCDs in a linear array, it is collecting data in a continuous fashion and transmitting it. Once the data has been acquired by the satellite, or while it is being acquired, what happens next? How do we get the data from the satellite down to the earth? That is also important to discuss. This part is the least discussed in the remote sensing literature or elsewhere, because we usually just say the data has been provided by an agency: the agency might be NASA, it might be ESA, or in our case in India it might be NRSA, the National Remote Sensing Agency. They provide the data, but how they really acquire it from the satellite after the image has been acquired by the sensor is also important to understand. So in the next few slides I am going to cover image acquisition from satellite to earth, for which you require a satellite earth station. This is one example: the satellite earth station for the NOAA series of satellites which we have at IIT Roorkee, running since October 2002. It has two major components: the external component, the antenna you are seeing here, and the internal component, which is inside the lab and is just a PC-based system, the main part being the receiver. The antenna, in this NOAA case, is a 1.2-metre-diameter parabolic dish antenna, and there is a motor here, a two-axis rotation motor, which means it can rotate the antenna through almost 360 degrees in the horizontal plane and 180 degrees in the vertical plane. So this dish is capable of tracking any of the NOAA satellites in space.
The internal component includes the receiver, with many cables connected to it, a USB connection into the system, and then software of the kind generally called a grabber. One more important thing is another antenna which you can see here: the GPS antenna. So there are two antennas outside, and the rest of the equipment is inside. This GPS, or GNSS, antenna is there for two purposes: one is to determine the location of the receiving antenna on the globe, and the second is to provide accurate timing to the receiving PC. As you know, these computers keep a backup battery for the clock, which may deteriorate with time, so the system clock may drift by a few seconds or minutes. For data acquisition from a satellite, the system clock has to be very accurate, as accurate as an atomic clock, and that is only possible if we get the time from GNSS satellites through a GNSS receiver. That is the purpose of installing it. The GNSS antenna keeps watch on the computer's clock, and whenever there is a delay or any change in the system clock, it corrects it. This exercise is done almost every minute, so that the system remains completely coordinated with the satellite overpasses and we do not miss any. Otherwise, if there were a delay in our system clock, the antenna would be pointing at where the prediction program says the satellite should be, some 850 kilometres up in space, whereas the satellite might actually be at a different location. If the timing is perfect, it works. So this is a PC-based, fully automatic system: it is not necessary for anyone to be present whenever there is an overpass of a NOAA series satellite. The system is always on, the power supply is always on, and whenever there is an overpass, based on the prediction program the antenna turns towards the direction from which the satellite will come, aligns with the movement of the satellite, and starts getting the data line by line. The AVHRR sensor on a NOAA satellite is a scanning device: it scans line by line, the data is transmitted directly towards the earth, and through these earth stations we can record it. Since the spatial resolution is coarser in the case of NOAA AVHRR, the swath is quite wide. I will be showing an image also.
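To get a feel for why even a few seconds of clock error matter for tracking, here is a back-of-the-envelope sketch in Python. The 850 km altitude is from the lecture; the gravitational parameter and Earth radius are standard constants, and the overhead-pass geometry is a deliberate simplification:

```python
import math

MU = 3.986004418e14          # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0        # mean Earth radius, m
ALTITUDE = 850_000.0         # NOAA orbital altitude from the lecture, m

# Ground speed of a satellite in a circular orbit at this altitude.
v = math.sqrt(MU / (R_EARTH + ALTITUDE))     # roughly 7.4 km/s

def pointing_error_deg(clock_error_s: float) -> float:
    """Angular pointing error (degrees) for an overhead pass if the
    station clock is off by clock_error_s seconds: the satellite has
    really moved v * dt metres along track from where the prediction
    program tells the antenna to point."""
    along_track = v * clock_error_s
    return math.degrees(math.atan2(along_track, ALTITUDE))

for dt in (0.1, 1.0, 10.0):
    print(f"clock error {dt:5.1f} s -> pointing error "
          f"{pointing_error_deg(dt):5.2f} deg")
```

Even a one-second clock offset moves the satellite several kilometres along track, about half a degree of pointing error overhead, which is why the system clock is disciplined to GNSS time roughly every minute.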
The purpose of showing this map is the location: the red dot is the IIT Roorkee satellite earth station, and around it is a circle of 3000 kilometres radius. This is nothing but the footprint, the base, of a cone whose apex is at the satellite, at roughly 850 kilometres altitude. If you project this 3000-kilometre-radius circle into space at about 850 kilometres, then whenever any NOAA series satellite comes within that circle, our earth station can acquire the data. So one has to imagine the base of this cone in space at about 850 kilometres, with the apex of the cone at the IIT Roorkee location; anywhere within this circle a passing satellite can be received. If I project everything in 2D and a satellite is here, our earth station can acquire data by tilting the antenna towards the direction from which the satellite is coming, and it will keep tracking until the satellite disappears over the other side of the horizon; this is all done automatically.

That is why you may sometimes get an image like this one, acquired by our own satellite earth station, covering the entire Himalaya: because the spatial resolution is coarse, the swath width is very wide, 2800 kilometres, and in one go almost the entire Himalaya has been acquired. If instead it had been a 1-metre-resolution sensor rather than 1100-metre resolution, only a very tiny part of the earth would have been covered, a very small strip, though at higher spatial resolution. So there is always a trade-off between spatial resolution and swath.

Now, since the NOAA AVHRR sensor has two thermal channels, it is possible to acquire data even at nighttime. This is not common to many satellites: if they do not have a thermal channel, data cannot be acquired at night. But NOAA has thermal channels, so at night three channels of data, the near infrared and the two thermal channels, can be acquired, whereas in the daytime five channels can be acquired: two visible, one near infrared and two thermal. This is one example: the daytime image at 1432 hours and the nighttime image at 2300 hours, 11 pm. The thermal image you see looks just like an X-ray, except that it is a thermal image of the same part of the earth as seen in the daytime. That is a big advantage of a sensor having thermal channels. NOAA satellites have them, and your Landsat OLI series also has them, but Landsat does not normally acquire data at night, only daytime thermal data; in the case of NOAA AVHRR, nighttime data is acquired because it covers a wide swath, and it is sometimes very, very useful. We have been using this thermal data for various purposes, especially the detection of pre-earthquake thermal anomalies.

So this brings us to the end of the discussion about the different sensors, seen from a little bit of an electronics point of view, not in much detail, and about how the data comes from the satellite to the satellite earth station. Thank you very much.
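As an aside, the 3000-kilometre reception circle quoted for an 850-kilometre orbit can be checked with simple spherical geometry. The sketch below uses the standard mean Earth radius and a hypothetical 5-degree minimum antenna elevation for comparison; the zero-elevation horizon radius comes out a little over 3000 km, close to the figure used in the lecture:

```python
import math

R_EARTH = 6371.0   # mean Earth radius, km
ALTITUDE = 850.0   # NOAA orbital altitude quoted in the lecture, km

def reception_radius_km(min_elevation_deg: float = 0.0) -> float:
    """Ground distance from the earth station to the farthest point
    from which a satellite at ALTITUDE appears above the given
    elevation angle (spherical-Earth visibility geometry)."""
    el = math.radians(min_elevation_deg)
    # Earth-central angle between the station and the sub-satellite point.
    psi = math.acos(R_EARTH * math.cos(el) / (R_EARTH + ALTITUDE)) - el
    return R_EARTH * psi

print(f"radius at 0 deg elevation: {reception_radius_km(0.0):6.0f} km")
print(f"radius at 5 deg elevation: {reception_radius_km(5.0):6.0f} km")
```

The slightly smaller figure at a realistic minimum elevation angle is consistent with the ~3000 km circle drawn around the IIT Roorkee station.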