
Perspective

Perspective has several definitions, but most involve how we observe something. We often become less observant when we view things the same way over and over again. In a sense, changing our perspective is like viewing something with “new eyes.”  Get closer to something, and we notice finer detail.  Go farther away, and we see how components are spatially related.  The video Powers of 10 provides excellent examples of what is observable at different distances from an object.

Have you watched a black-and-white movie or looked at black-and-white photos?  We often focus on the texture of objects in the image, texture we tend to overlook when we view the full range of color.  Color provides a great deal of information, but we often become so reliant on it that we stop looking for additional information.

We use color filters to view only selected colors of a scene.  Foresters use plant stress detection glasses, which are purple filters, to quickly assess plant health.  Healthy plants absorb red and blue light during photosynthesis, so they look dark when viewed through the glasses.  Stressed plants reflect more red and blue light, so they appear brighter.  It turns out most of us detect green light most acutely (see if you do by using the color change and color difference web apps at Our Senses and Brain), so the purple filter blocks green light from reaching our eyes, improving how well we detect the colors that reveal plant health.

Technologies allow us to view more than visible light.  Satellite sensors have been viewing Earth and the universe across a range of wavelengths, or types of light, for over 50 years, and now digital cameras allow us to view the world differently.  Sensors in digital cameras detect ultraviolet, visible, and near-infrared light, and they are being used to explore our world in new ways.  Compare the photos at right of the same landscape photographed in visible light and in the near-infrared.

Technologies also let us merge colors in a digital image.  Visible and near-infrared bands are combined in satellite images to monitor vegetation health across our planet.  But we can also modify colors to see new patterns.  One technique is to subtract one color band from another; to improve sensitivity to small differences in low-light conditions, the difference is divided by the sum of the intensities.  This manipulation is called a normalized difference and is useful for analyzing plant health from aerial photography and satellite images.
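As a rough sketch of that arithmetic (assuming the two bands are already loaded as NumPy arrays named nir and red, which are hypothetical names used only for this illustration), a normalized difference can be computed pixel by pixel like this:

import numpy as np

def normalized_difference(band_a, band_b):
    # Compute (band_a - band_b) / (band_a + band_b) for each pixel.
    # Values range from -1 to 1; with near-infrared and red bands this
    # is the kind of index used to gauge vegetation health.
    band_a = band_a.astype(float)
    band_b = band_b.astype(float)
    total = band_a + band_b
    # Leave pixels where both bands are zero at 0 to avoid dividing by zero.
    return np.divide(band_a - band_b, total,
                     out=np.zeros_like(total), where=total != 0)

# Example with made-up pixel intensities (0-255):
nir = np.array([[200, 180], [50, 60]])
red = np.array([[40, 50], [45, 55]])
print(normalized_difference(nir, red))

Dividing by the sum rather than using the raw difference keeps the result on a fixed scale, so dim and bright parts of the same scene can be compared directly.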

So if you want to enhance your observations, analyze your current perspective and consider how it can be changed.

Change your perspective by changing your vantage point.  Looking at the Earth’s surface from above helps us see how land features are spatially organized.

Another way to change perspective is to look at a scene in a different type of light – in this case, the same landscape photographed above is viewed in the near-infrared.  The shadows cast by the clouds are much more pronounced, but most notably, lakes and ponds stand out as the darkest features in the image.
