Peter is a Senior Software Engineer at Exelis VIS, where he has been with the company for 2+ years. He holds a B.A. in Physics from the University of Colorado, with a concentration in Astrophysical, Planetary & Atmospheric Sciences. After 5 years working in the exploration geophysics business, Peter has spent the last 15 years as a geospatial software engineer, with experience in the petroleum, utility and remote sensing industries. His focus has typically been on enterprise GIS technologies, and he has a keen interest in the evolution of tools for geospatial analysis and geovisualization.
Author: Peter DeCurtins
In looking at the nature of light, we have seen how light interacting with matter illuminates our world. When light rays fall upon a piece of matter, they can be absorbed, reflected, or transmitted. We've seen that the reflection of light is key to how we remotely perceive the universe around us. Light photons that are absorbed by matter add to the energy of the quantum system, and matter may radiate light back out to space. Just with this information, we can say a great deal about the way light behaves. To complete the bigger picture, however, we will need to take a look at how the wavelike nature of light emerges when light flows around and through matter.
The Golden Gate Bridge refracted in rain drops acting as lenses on glass by Brocken Inaglory - Licensed under CC BY-SA 3.0 via Commons
If you've ever been to the ocean and seen water waves bending around the end of a sea wall, or watched as waves spread out from a channel, you've observed the wave phenomenon known as diffraction. Basically, wave energy bends around physical objects that it encounters. While this is easy to observe with mechanical waves in a medium such as water, diffraction of light is very subtle and not easily noticed. Light waves have such small wavelengths (~5000 Angstroms) that the amount of bending that occurs is always very small. However, there is in fact a simple way to observe diffraction of light. When light from a distant source is passed through a small pinhole or slit, what is called a diffraction pattern can be observed. If you've ever seen a distant street light through a fine window screen, you may have noticed that the point light source of the street lamp appears to your eyes to form a cross. This is diffraction: the four sides of each tiny hole in the screen act like the sides of a diffraction slit, bending the light four ways to form a cross shape. The fact that diffracted light rays form patterns is due to another feature of general wave phenomena: interference.
Thomas Young's sketch of two-slit diffraction, presented to the Royal Society in 1803 - Public Domain via Commons
Interference is another effect that is easily observed in the realm of mechanical waves. Drop two rocks into slightly different spots of a pond and watch as the resulting waves interact with each other. Waves of the same frequency whose crests and troughs line up with each other are said to be in phase. Such waves will interfere with each other constructively, meaning that they will add to each other to produce a wave that, while still of the same frequency, has twice the amplitude (wave height) of the two individual constituent waves. Waves that are completely out of phase with each other (peak lining up with trough and vice versa) will combine with what is termed destructive interference. The two waves will cancel each other out, resulting in no wave. Interference does not create or destroy wave energy, but rather redistributes it. Thus when light waves are passed through a diffraction slit, the resulting pattern is formed by the diffracted light rays constructively and destructively interfering with each other.
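The arithmetic of superposition is simple enough to sketch directly. The snippet below (a minimal illustration with made-up unit amplitudes, not tied to any particular physical wave) adds two equal sine waves and shows that an in-phase pair doubles the peak height while a completely out-of-phase pair cancels:

```python
import math

def superpose(amplitude, phase_shift, t):
    """Sum two waves of equal amplitude and frequency, the second
    shifted in phase by phase_shift radians."""
    return amplitude * math.sin(t) + amplitude * math.sin(t + phase_shift)

# Sample one full cycle (0 to 2*pi in steps of 0.01).
cycle = [t / 100 for t in range(629)]

# In phase (shift = 0): constructive interference doubles the amplitude.
peak_in_phase = max(superpose(1.0, 0.0, t) for t in cycle)

# Completely out of phase (shift = pi): destructive interference cancels.
peak_out_of_phase = max(abs(superpose(1.0, math.pi, t)) for t in cycle)

print(round(peak_in_phase, 2))      # 2.0
print(round(peak_out_of_phase, 2))  # 0.0
```

Note that the total energy is not lost in the out-of-phase case; in a real diffraction pattern it is redistributed to the locations where the waves arrive in phase.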
Constructive and destructive interference of two waves by Haade - Licensed under CC BY-SA 3.0 via Commons
When light is able to pass through matter, perhaps its most useful behavior emerges. Refraction is the bending of light rays as they travel across the interface between two different materials, such as air and water. The cause of this behavior is that light travels at different speeds through different materials. It is well known that the speed of light is constant in a vacuum, and it doesn't slow down much when traveling through air. But light travels only 3/4 as fast through water and only 2/3 as fast through glass. The slower light travels through a substance, the more it will be bent through refraction.
Snell's Law which describes refraction by Cristan - Public Domain via Commons
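Snell's law makes this quantitative. Writing each material's index of refraction as n = c/v, the bending at an interface satisfies n1·sin(θ1) = n2·sin(θ2). A small sketch, using indices derived from the speeds quoted above (water at 3/4 c gives n = 4/3, glass at 2/3 c gives n = 3/2):

```python
import math

def refraction_angle(theta_incident_deg, n1, n2):
    """Snell's law: n1 * sin(theta1) = n2 * sin(theta2).
    Returns the refracted angle in degrees, or None when the ray
    undergoes total internal reflection."""
    s = n1 * math.sin(math.radians(theta_incident_deg)) / n2
    if abs(s) > 1.0:
        return None
    return math.degrees(math.asin(s))

# A ray arriving from air (n ~ 1) at 45 degrees from the normal:
print(round(refraction_angle(45.0, 1.0, 4 / 3), 1))  # air -> water: 32.0
print(round(refraction_angle(45.0, 1.0, 3 / 2), 1))  # air -> glass: 28.1
```

Glass, where light travels more slowly, bends the ray further from its original 45-degree path than water does, matching the rule that slower propagation means stronger refraction.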
Large portions of the science of optics proceed from the very simple rules of refraction. Because translucent materials such as glass bend light rays, we can create lenses that manipulate and focus the light that passes through them to serve various purposes. The lenses in eyeglasses, microscopes and telescopes all harness the refractive behavior of light, which is only possible because light observably behaves like all waves do. Of course, light also behaves in some ways as a particle (see the photoelectric effect, for example), but that is a subject for an entirely different article.
Two soap bubbles, illustrating iridescent colors due to thin film interference by Tagishsimon - Licensed under CC BY-SA 3.0 via Commons
Categories: ENVI Blog | Imagery Speaks
Tags: light wavelength, refraction, optics
When considering the nature of light and color, it makes sense to ask in what ways does light behave? An inclusive list of the basic categories of behavior for light would most likely include transmission, absorption, reflection, refraction, diffraction, scattering, polarization and interference. Of these, only the first three - transmission, absorption and reflection - are needed to account for all of the electromagnetic energy, or light, which falls upon an object.
"Tso Kiagar Lake Ladakh" by Prabhu B - Licensed under CC BY 2.0 via Commons
Light rays that are reflected or transmitted may become polarized, and transmitted light may be scattered or refracted. Any light that is neither reflected nor transmitted is absorbed, and adds to the internal heat energy of the material at an atomic or molecular level. Taken together, the changes that light goes through as a result of all of these processes are responsible for all that may be seen.
Original work by author based on illustration in "Light And Color" published by Golden Press, 1971
The most fundamental behavior of all of these is probably reflection. For being such a critical phenomenon of nature, the laws that describe the reflection of light are remarkably simple.
"Reflection angles". Licensed under CC BY-SA 3.0 via Commons
At the microscopic level, all reflection is regular, which is to say that the reflected light conforms to these laws. In the everyday macroscopic world, however, regular reflection is only perceived off of mirror-like surfaces. With regular reflection, one doesn't see the object that the light reflects off of, such as the surface of a polished mirror, but rather the objects that are reflected by it. A plane mirror reverses the scene it reflects, with objects on the left appearing to the right in the mirror image, and everything appearing to be as far behind the mirror's surface as it in fact is in front of it.
"Mirror" by Cgs - English Wikipedia - Licensed under CC BY-SA 3.0 via Commons
Reflection is almost completely regular from a mirror because of the mirror's smooth, highly polished surface. A rough surface will present many different very small angles of incidence for light to reflect off of, so macroscopically the overall reflection will be diffuse. The wavelength of visible light averages something like 500 nm, so by comparison most surfaces are rough and therefore diffusely reflect light. A fairly smooth surface, such as that of a road when it is wet, can exhibit both regular and diffuse reflection, in proportions which will vary with the angle of the incident light.
"Diffuse reflection" by Theresa Knott at the English language Wikipedia. - Licensed under CC BY-SA 3.0 via Commons
What will happen when light is shone on an object really depends on the material that object is made of. Polished metal reflects almost all of the light that hits it, absorbing only a little and transmitting effectively none. An object such as a black tarp appears dark because it is absorbing much of the light that is falling on it. Most materials exhibit some selectivity in the different wavelengths of light that they will reflect, transmit and absorb. Gold reflects red and yellow wavelengths more strongly than it does blue. Silver tends to reflect all wavelengths equally, making it appear almost white. A purple cloth will reflect blue and red wavelengths, but absorb green light. The intrinsic color of any object is almost always due to the selective reflection and absorption characteristics of its constituent materials.
"Gold-crystals" by Alchemist-hp - Licensed under CC BY-SA 3.0 via Commons
Remote sensing isn't only done with high-end sensors from an orbiting platform. It's what happens when we see. In order to perceive a non-luminous object with our eyes, at least some light has to reflect off of it. As basic a concept as that is, it's not a bad starting point for pondering how light behaves.
The natural phenomena of light and color are intimately involved with our daily existence, as well as being at the center of much of the science and technology that many of us dedicate our professional careers to. Yet a detailed understanding of these phenomena is not common. In this and perhaps subsequent articles, we will survey these topics at what hopefully will be a clear and approachable level for the general reader.
One fascinating aspect of this subject is that the sensation of vision - like each of our senses - is a psychological interpretation of our physiological response to physical stimuli. We are used to dealing with the physics of the interaction of light and matter, but the sensation of sight occurs in the brain. What someone 'sees' is effectively subjective - wholly contained within their mind, and usually not measurable.
"Cloud in the sunlight" by Ibrahim Iujaz from Rep. Of Maldives. Licensed under CC BY 2.0 via Commons
Basic scientific literacy includes an understanding that light is a category of electromagnetic radiation, alongside radio waves, gamma rays, x-rays, etc. Electromagnetic waves propagate energy in all directions through the universe, interacting with matter, which both absorbs and radiates these waves. Visible light occupies the portion of the electromagnetic spectrum that extends between wavelengths of around 350 nm and 750 nm. Just outside of this range lie infrared radiation, with longer wavelengths, and shorter-wavelength ultraviolet.
All electromagnetic waves basically behave exactly the same, which is not surprising considering that they are all manifestations of the same phenomenon. For example, all electromagnetic waves travel through space at the same speed - the speed of light - and have the property that as the wavelength decreases along the spectrum, the frequency increases. The differences that exist between different electromagnetic waves are really just a matter of degree, and exist due to the differences in frequency (or inversely, wavelength). Light waves, which we usually take to mean visible light waves, are unique only due to their visual qualities. The concept of color as we usually think of it really only has meaning in the context of visible light.
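Because all electromagnetic waves share the same speed c, wavelength and frequency are locked together by f = c/λ. A quick check at the two ends of the visible band:

```python
C = 299_792_458  # speed of light in a vacuum, m/s

def frequency_hz(wavelength_nm):
    """f = c / wavelength: as wavelength decreases, frequency increases."""
    return C / (wavelength_nm * 1e-9)

red = frequency_hz(700)     # long-wavelength end of the visible band
violet = frequency_hz(400)  # short-wavelength end
print(f"{red:.2e}")     # ~4.28e+14 Hz
print(f"{violet:.2e}")  # ~7.49e+14 Hz
```

The entire visible band spans less than a factor of two in frequency, a tiny slice of the full electromagnetic spectrum.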
"EM spectrum". Licensed under CC BY-SA 3.0 via Commons
The term visible indicates frequencies that can be detected by that most versatile radiation sensor in existence - the eye. Its sensitive retinal inner surface photo-chemically responds to incoming light rays, turning them into weak electrical impulses that are sent to the brain for interpretation. What we perceive - the sense of vision - occurs entirely within the brain of the observer, meaning that it is both subjective and for the most part impossible to measure.
Brightness, for example, is a completely psychological concept. It is a sensation that is experienced in the mind of the observer, and is not measurable by instruments. The eye is so adaptable to varying levels of illumination that it is an extremely poor detector of absolute brightness levels. When considering illumination, we talk about the intensity of the light source, a quantity that depends on the total amount of light emitted by the source in a given direction. Brightness is associated with the amount of light stimulus experienced by an observer, and corresponds to the perception of light-source intensity per unit area. That measurable quantity, luminance, is an example of a psychophysical variable, linking a psychological response - brightness, in this case - to a related physical quantity.
For most of human history, people thought that light was colorless and only revealed an object's intrinsic property of color through illumination. It wasn't until 1630 that Descartes attributed the color of an object to some change in the light that is reflected off of the object. All objects reflect, absorb and potentially transmit incident light in some combination due to their physical structure, and the variance of that response in terms of wavelength is interpreted as the sensation of color by an object's observer. It happens so naturally that it is easy to overlook how intricate an experience color vision is. When you look at an object, the color that you see depends on the frequencies and intensities of the light illuminating it, the frequencies of the light that is reflected or transmitted by it, the color of objects that are adjacent to it, and on any absorption, reflection and transmission by materials in the ray path between you and the object.
"Goethe, Farbenkreis zur Symbolisierung des menschlichen Geistes- und Seelenlebens, 1809" by Luestling - Public Domain via Commons
The psychological aspects of the sensation of color vision are hue, saturation, and brightness, none of which can be directly measured. Since color cannot be accurately specified by physical aspects, we rely on the related psychophysical variables of dominant wavelength, purity and luminance.
Hue is the sensation through which we distinguish the variability of the spectrum, from red to yellow to green to blue, and so on. Most color samples can be matched by adding some monochromatic (pure) light to white light. The wavelength of that light is called the dominant wavelength.
"Eyesensitivity" by Skatebiker, vector by Adam Rędzikowski - File:Evesensitivity.svg, vectorised. Licensed under CC BY-SA 3.0 via Commons
Saturation is the sensation that derives from the relative amount of hue in a color. Purity is the psychophysical variable that relates to saturation. It is the ratio of the amount of monochromatic light to the amount of white light in a color sample. Monochromatic light has a purity of 100%, white light 0%.
Although hue, saturation and brightness (the sensation dependent on the overall amount of luminance) are the three components of color perception, they are not independent of each other. Changing one of the qualities often results in changes to the other two.
Without having to get into all the complexities of the subject, it is clear that light and color are essential to our everyday lives and a central part of human technology. Compositions of alloys, the molecular identification of various materials, the temperature of a distant star, as well as its chemical constitution and velocity - all of these can be determined by the analysis of light and color. Light and color are in many ways the key means by which humans interface with the world and derive meaning from it. This is a topic that we may find interesting when we contemplate the interpretation of image analysis.
"Light dispersion of a mercury-vapor lamp with a flint glass prism IPNr°0125" by D-Kuru - Own work. Licensed under CC BY-SA 3.0 via Commons
In 1986 NASA defined a set of processing "levels" to classify standard data products that were to be produced from remotely sensed data from their Earth Observing System. The idea was that the given level of any output product would indicate the type of data processing that had been applied in creating it, allowing the consumer of that product to know what the appropriate uses for it are. NASA set forth brief definitions of each level.
NASA Earth Science Division Operating Missions as of February 2, 2015 - NASA, Public Domain
One key aspect to this system is that each level is cumulative, deriving from the level below it and representing the prerequisite input to the processing needed to reach the level above it. Level 0 data is basically the raw, unprocessed instrument and sensor data. Although at that fundamental level it may be of use to someone who is interested in the calibration and sensitivity of the sensors that collected it, the main utility of level 0 data is as the raw source that is fed to the data processing chain to produce higher level output. Level 1 data is level 0 data that has been time-referenced and annotated with ancillary information such as radiometric and geometric calibration coefficients; because these annotations are applied reversibly, level 1 data can be reversed back to its level 0 state, and it serves as the foundation for all higher level data sets that may be produced.
At level 2, data sets become directly usable for most scientific applications. These data sets may be smaller than the level 1 data that they were derived from as they may have been reduced in some aspect such as spatial extent or spectral range. Level 3 products tend to be smaller still, making them easier to handle, and the regular spatial and temporal organization of these data sets make them appropriate for combining with data from differing sources. Basically, as you go up in processing level, the data sets grow smaller, but their value and utility to scientific applications gets larger.
The advantages of adopting a common set of processing levels to describe the types and degree of processing that an image has had applied to it quickly became clear. The practice seems to have grown to be universally adopted, though many variations do exist. In general, the following definitions are taken to be a standard of sorts:
Level 0: Raw instrument data, as collected by the sensor. Data in this state is not terribly useful, unless the focus of interest is the sensing instrument itself rather than the features recorded in the data.
Level 1A: This data has been corrected for variations among the detectors across the sensor by applying equalization functions, leveling the measurements made by the sensor. This radiometric correction includes absolute calibration coefficients which can then be used to convert the digital numbers into radiance values.
Level 1B: The next step is to apply measurable corrections to the image to address systematic geometric distortions inherent in the acquisition of the image by some sensors. This level is not necessary for other sensors that don't suffer from such systematic geometric error. Also, note that level 0 data cannot be recovered from 1B data.
Level 2A: These images have been systematically mapped into a standard cartographic map reference system. Such products are nominally referred to as being geo-referenced, but do not have a high level of accuracy.
Level 2B: To improve the spatial accuracy of an image, a more rigorous process involving considerable user input is required. Through the process of image rectification, an image analyst geo-registers the image by identifying specific points in the image that correspond to very well-defined geographic locations known as ground control points. With this processing completed, the image is geo-referenced accurately to the spatial resolution of the original data - in other words, limited only by the spatial resolution of the sensor - except in areas of high local topographic relief.
Level 3: In areas with a great deal of elevation relief, such as in mountainous areas, further corrections are required to obtain a more accurate spatial image. Level 3 products have gone through the process of orthorectification, which adjusts the image for distortions due to topographic relief, lens effects, and camera tilt. Level 3 data is uniform in scale and appropriate for use over large grid scales, such as in mosaics.
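As a concrete illustration of the radiometric step described under level 1A, absolute calibration is commonly expressed as a linear gain-and-offset per band. The coefficients below are made up for the sketch; real values come from a sensor's calibration metadata:

```python
def dn_to_radiance(dn, gain, offset):
    """Convert a raw digital number (DN) to at-sensor radiance with a
    linear calibration: L = gain * DN + offset."""
    return gain * dn + offset

# Hypothetical per-band calibration coefficients (illustrative only).
gain, offset = 0.05, -1.2
print(dn_to_radiance(200, gain, offset))  # 8.8
```

Because this mapping is linear and invertible, the original digital numbers can be recovered from the calibrated values, consistent with level 1 data being reversible to level 0.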
It is important to note that different systems are in place with different missions and data providers, for various reasons. Landsat 7 designated level 1G as products that are geo-rectified with pixel values in sensor units, for example. DigitalGlobe has an extensive system for categorizing product levels, going from level 1B 'Basic' through level 2A 'Standard' up to various entities such as level 3F (orthorectified imagery with a map accuracy representative fraction of 1:5000) and level Stereo OR2A 'Ortho-ready Standard' - a stereo pair that is ready for the consumer to perform orthorectification on according to their own processes and specifications.
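Whatever the naming scheme, the ground-control-point registration described under level 2B amounts to fitting a transform from pixel coordinates to map coordinates. A minimal sketch, assuming hypothetical GCPs and a simple 6-parameter affine model (real workflows may use higher-order polynomial or rational function models):

```python
import numpy as np

# Hypothetical ground control points: pixel (col, row) -> map (easting, northing).
pixel = np.array([[10, 20], [500, 40], [250, 480], [30, 450]], dtype=float)
ground = np.array([[300010.0, 4899980.0], [300500.0, 4899960.0],
                   [300250.0, 4899520.0], [300030.0, 4899550.0]])

# Fit the 6-parameter affine transform [E, N] = [col, row, 1] @ coeffs
# by least squares over all the GCPs.
design = np.hstack([pixel, np.ones((len(pixel), 1))])
coeffs, *_ = np.linalg.lstsq(design, ground, rcond=None)

def pixel_to_map(col, row):
    """Map a pixel location to map coordinates with the fitted transform."""
    return np.array([col, row, 1.0]) @ coeffs

easting, northing = pixel_to_map(10, 20)  # lands on the first GCP
```

With more GCPs than parameters, the least-squares fit also yields residuals at each control point, which an analyst can use to judge the spatial accuracy of the registration.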
Geodesy: The scientific discipline that deals with the measurement and representation of the Earth.
Cartography: The study and practice of making maps.
Geoid: C.F. Gauss, who first described it, called it the “mathematical figure of the Earth”. It is a smooth but highly irregular surface that can be derived only through extensive gravitational measurements and calculations. This gravity-defined equipotential surface theoretically would coincide with the mean ocean surface of the Earth, if the oceans and atmosphere were in equilibrium, at rest relative to the rotating Earth, and the seas extended through the continents (such as with a series of very narrow canals). The geoid is a much closer approximation of the true shape of the Earth than any reference ellipsoid could provide.
Reference Ellipsoid: A mathematically-defined surface that approximates the geoid. Because it is an idealized model of relative simplicity, reference ellipsoids are used as a preferred surface on which geodetic network computations are performed and point coordinates such as latitude, longitude and elevation are defined.
Datum: A datum is needed to be able to match coordinates on the reference surface to points on the physical surface of the Earth. It contains the specific definition of the reference surface as well as the point of origin and directions from that origin in order to specify the orientation of the surface.
Map Projection: A systematic transformation of a coordinate system defined on a three-dimensional reference surface such as a sphere or ellipsoid into coordinate locations on a two-dimensional plane. A projection provides the transformation between a geographic coordinate system and a flat, planar projected coordinate system, the kind found on maps.
WGS 84: The latest revision of the World Geodetic System, a standard for use in cartography, geodesy and navigation. It is made up of a standard coordinate system for the Earth and a standard reference ellipsoid with its associated datum; the geoid serves to define nominal sea level. The coordinates used are latitude and longitude on the surface of the ellipsoid, and a height or Z value which defines the vertical displacement above or below the geoid.
UTM: A standard coordinate system which divides the Earth between 80 degrees South and 84 degrees North latitude into sixty zones, each six-degrees of longitude wide. It uses a transverse Mercator projection and, unlike WGS 84, is a 2-dimensional Cartesian system that specifies locations on the earth in terms of East and North coordinates independent of any vertical position.
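The zone for any longitude follows directly from that six-degree spacing. A minimal sketch (ignoring the official grid's exceptions around Norway and Svalbard):

```python
def utm_zone(longitude_deg):
    """Zone 1 starts at 180 degrees West; each zone spans 6 degrees of
    longitude, giving sixty zones around the globe."""
    return int((longitude_deg + 180) // 6) % 60 + 1

print(utm_zone(-105.27))  # Boulder, Colorado: zone 13
print(utm_zone(0.0))      # Greenwich: zone 31
```

The modulo wraps longitude 180 back around to zone 1, so the formula is well defined across the antimeridian.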