11 May 2016

An XPonential(ly) Great Time with ENVI Analytics in the Big Easy!

Author: Rebecca Lasica

Thanks to everyone who was able to stop by the Harris booth last week at XPonential in “The Big Easy”! It was my first trip to this amazing city, and in addition to the great river views and beignets around every corner, I had the exciting privilege of debuting our ENVI OneButton technology in conjunction with our “We’re so sure you’ll love it, please try it out at no cost…” campaign (details).

In case I missed you at the show – imagine having the ability to download an entire collection of multispectral UAV data, perform whatever preprocessing is required, generate a multi-band orthomosaic, and run spectral analysis to derive an output, all with a single solution – ENVI! Here’s a rundown of the workflow, end to end:

Perform data preprocessing (if required), such as this band-to-band alignment and exposure adjustment (both available in the ENVI UAVToolkit):

Before:

After:

Generate an orthomosaic using ENVI OneButton:

Perform spectral analysis, classification, temporal analysis, or other processing using ENVI. Below: OSAVI spectral index with a blue-green-red-yellow color table (yellow areas are the healthiest vegetation; red areas represent stressed vegetation):
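If you are curious what an index like OSAVI boils down to numerically, here is a minimal sketch in plain numpy, outside of ENVI; the band arrays are hypothetical stand-ins for calibrated reflectance from the orthomosaic:

```python
import numpy as np

def osavi(nir, red, soil_factor=0.16):
    """Optimized Soil-Adjusted Vegetation Index:
    (NIR - Red) / (NIR + Red + 0.16). The 0.16 term damps soil-background
    effects without needing a per-scene soil-line estimate."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + soil_factor)

# Hypothetical reflectance bands (values in 0..1) from the orthomosaic
nir = np.random.rand(512, 512)
red = np.random.rand(512, 512)
index = osavi(nir, red)  # map through a color table for display
```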

I’d love to hear about your needs for end-to-end UAV processing workflows!


Categories: ENVI Blog | Imagery Speaks


15 Mar 2016

A Fresh Perspective is Everything!

Author: Rebecca Lasica

I remember the first time I ever saw a satellite image. It was a low-resolution, top-down view of a suburban neighborhood. I recall thinking, “Well, this top-down view is something to get used to. It’s a little disorienting, but OK, I’ll go with it.”

Here I am several years later, completely comfortable contemplating perfectly nadir imagery for much of the day. So when I was fortunate enough to be asked to speak to a few local high school geography classes last week, I jumped at the opportunity. The students’ insights and enthusiasm reminded me that a fresh perspective on the world ensures our future is in great hands! I am still processing the energy I gained from the experience, but I thought I’d share some of their great insights.

For example, check out the movie here and note the change in population over time.

Were you looking at the buildings and urban change? So was I! But the teacher (Mr. Hickory – a fabulous teacher, by the way) pointed out the changing shoreline of Lake Mead in the southeast portion of the image. Do you know how many times I’ve seen this movie without ever noticing that incredibly significant detail? (That’s a rhetorical question.)

Also worth mentioning was the sheer wonder at loading different spectral bands into the display to exploit information in wavelengths our eyes can’t see. That’s so significant, and something I take for granted all the time! This link is for the student who was asking about a place to go online for some first-hand experience viewing and manipulating satellite images.


Interactive demo using Landsat imagery

I also really appreciated it when a student asked whether we can see satellites from Earth. My first reaction was: of course not! Imaging satellites exploit reflected sunlight as input – how could you see something in space from here on Earth during the day? It didn’t occur to me to consider looking skyward at night. But when I went online later to get my brain around the question, not only did I discover my “um – duh” moment, but I also came across this really cool concept:

DigitalGlobe’s WorldView-3 satellite turned around and “looked through its legs” to capture images looking across the Earth’s surface as opposed to top-down. This might be the most outside-the-box idea I never would have imagined. It is a revolutionary approach to Earth imaging, and a perfect example that looking at something familiar from a fresh perspective offers insights that would never be possible using traditional approaches.

I am grateful for the opportunity I had to share my experience, but even more so – the opportunity to be reminded that a fresh perspective is invaluable. That’s a lesson I hope to take with me going forward as I contemplate this wonderful industry of ours and what amazing things are to come when some of these young scholars (hopefully) join us in the next several years. Thanks for the experience!


Categories: ENVI Blog | Imagery Speaks


8 Jan 2016

A Look Ahead at 2016

Author: Rebecca Lasica

As much as I appreciate a good reflection on the past, after a week in the fresh (and cold!) mountain air I am excited to look ahead! 2016 is here, and one thing is certain: as far as trends go, I predict the theme this year will be end-to-end integrated solutions. What that means is that we (software companies) need to continue to provide excellent analytics (ENVI) that work well in an environment with many moving parts (the enterprise) to solve a problem from start to finish, in a seamless manner (using both our and others’ solution suites). Yes, that’s a mouthful. But I do believe there is a clear need to be nimble, flexible, and deliberate when it comes to proposing and delivering tools to solve problems.

For example, if you haven’t already seen my colleague Barrett’s blog about a new web interface for precision agriculture, you should take a look! It is a perfect example of a scalable solution: take any data (satellite, airborne, UAS), perform some analysis, and export results. Not only that, customers are adding custom algorithms for inventory, prescriptive maps, and more within the same interface, which illustrates exactly that scalability!

Agriculture Toolkit for ENVI, displayed with an example of a custom plant-count inventory algorithm.

Speaking of algorithms, many applications for precision agriculture have been taking advantage of ENVI’s spectral indices (66 now!) for various analyses in some exciting new ways! For example, take a look at this Pleiades image courtesy of Airbus. The image has been calibrated, corrected for atmosphere, and run through several of ENVI’s spectral indices. It is easier than ever to not only apply a color table (top left) to an index to bring out features of interest, but also to combine indices with one another to generate false color composites. As you can see, the false color composite of three indices differentiates some features (red) that are not well defined in any one index alone.

Pleiades image courtesy of Airbus Defense and Space. Left: Difference Vegetation Index with color table applied. Right: false color composite of Visible Atmospherically Resistant Index (R), Soil Adjusted Vegetation Index (G), and Sum Green Index (B).
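To make the combination idea concrete, here is a rough numpy sketch of building such a three-index composite. The VARI and SAVI formulas are the standard published ones; the band arrays are hypothetical, and the single green band stands in for ENVI’s Sum Green Index, which integrates reflectance across roughly 500-600 nm:

```python
import numpy as np

def vari(red, green, blue):
    """Visible Atmospherically Resistant Index: (G - R) / (G + R - B)."""
    return (green - red) / (green + red - blue)

def savi(nir, red, L=0.5):
    """Soil Adjusted Vegetation Index with canopy background factor L."""
    return (1 + L) * (nir - red) / (nir + red + L)

def stretch(band, lo=2, hi=98):
    """Percentile stretch to 0..1 so each index fills its display channel."""
    a, b = np.percentile(band, (lo, hi))
    return np.clip((band - a) / (b - a), 0.0, 1.0)

# Hypothetical calibrated reflectance bands (values in 0..1)
blue, green, red, nir = (np.random.rand(512, 512) for _ in range(4))

rgb = np.dstack([stretch(vari(red, green, blue)),   # R
                 stretch(savi(nir, red)),           # G
                 stretch(green)])                   # B (Sum Green stand-in)
```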

Also worth mentioning, if you’re planning to work with LiDAR this year, are some great additions to your ability to interact directly with the point cloud. Not only can you now color your points by classification code, but it is also easy to interact directly with the points by turning entire classes on or off for more immediate insight, as well as controlling the density of points displayed (see the sketch after the image below). The latter is useful with new datasets such as Geiger-mode LiDAR that capture very high point density from very high altitudes. Additionally, users anywhere can access tools like building extraction directly from within a web page, which means there is no need for every user to install and maintain software!


Geiger Mode LiDAR point cloud colored by classification code.
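As a back-of-the-envelope illustration of what toggling classes and thinning display density amounts to, here is a hedged numpy sketch; the point arrays are synthetic, and the class codes follow standard ASPRS LAS conventions:

```python
import numpy as np

# Hypothetical point cloud: N x 3 coordinates plus ASPRS classification codes
# (2 = ground, 5 = high vegetation, 6 = building)
rng = np.random.default_rng(0)
xyz = rng.random((1_000_000, 3))
codes = rng.choice([2, 5, 6], size=len(xyz))

# Toggle classes "on" for display -- e.g., hide vegetation, keep ground + buildings
visible = np.isin(codes, [2, 6])

# Thin the displayed density (useful for very dense Geiger-mode collects)
density = 0.25
shown = visible & (rng.random(len(xyz)) < density)
display_points = xyz[shown]
```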

Complete end-to-end workflows are a thing of the “now,” not the future. I look forward to working with you all this year!


6 Oct 2015

The Calm Before the Storm

Author: Rebecca Lasica

What do quad-copters, software, and slot machines have in common? That’s right, it’s the Commercial UAS Show in Las Vegas, and I am so excited to be here! The energy in the room is palpable, and at the one-hour mixer on the showroom floor last night I had the fortunate opportunity to learn about projects, both here in the US and abroad, that will certainly change the world. Some companies are planning to help growers achieve higher crop yields while saving water. Others are helping with infrastructure inspection to increase public safety and maintenance efficiency. Some are using UAS technology to collect terrain information in areas where it is dangerous to send ground crews.

These and many other applications are utilizing multispectral imagery and LiDAR point clouds to extract actionable information from UAS data. There are even some early adopters of lightweight hyperspectral and SAR sensors. The possibilities seem endless when I think about applications that could be improved or enhanced when these new data are introduced to the analysis stack. Why? Because there are some major differences between these collection platforms and satellite or airborne imagery, differences that fill the data with information one cannot get from traditional platforms.

For example, the very high spatial resolution enables pure-pixel analyses as well as very accurate edge detection – both are keys to success when working with only a few wavelengths. Also, UAS can fly low to the ground and collect information on cloudy days, which avoids some of the atmospheric issues one must address when working with satellite imagery. In addition, the unique ability to interchange payloads and collect different data modalities, like multispectral and LiDAR, over the same area of interest enables data fusion techniques for richer analyses.
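As a generic illustration of the edge-detection point (not an ENVI workflow), a simple Sobel gradient on one high-resolution band is often enough to pull out crisp boundaries; the input band here is a hypothetical placeholder:

```python
import numpy as np
from scipy import ndimage

def edge_magnitude(band):
    """Sobel gradient magnitude: strong, well-localized edge responses are
    easier to get at UAV ground sample distances than at satellite scales."""
    band = band.astype(np.float64)
    gx = ndimage.sobel(band, axis=1)  # horizontal gradient
    gy = ndimage.sobel(band, axis=0)  # vertical gradient
    return np.hypot(gx, gy)

edges = edge_magnitude(np.random.rand(512, 512))
```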

If you’re here at the show stop by booth 503, I’d love to hear about your projects! Now off to the show!


Categories: ENVI Blog | Imagery Speaks


20 Aug 2015

Taking a Deep Dive into SAR Data

Author: Rebecca Lasica

I feel lucky this week to be immersed in a SARscape class where we’ve been diving into the weeds of Synthetic Aperture Radar (SAR) analysis. Alessio Cantone is here, all the way from Italy, just to teach this class, and he has brought with him a level of knowledge and experience about SAR that we don’t get to see very often. So today I’d like to share with you a couple of the things I’ve learned so far this week.

First, I have often wanted to take a deep dive into the question “What is coherence as it relates to SAR data?”, and yesterday we spent several hours on just that topic. It turns out that coherence is a measure of stability over time in both the amplitude and phase components of the signal. In other words, a coherence image shows us what has changed, and one of the greatest things about the “how” behind it is that we can see change in coherence images that we could never distinguish with our eyes in an optical image.
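For reference, the standard interferometric coherence estimator from the SAR literature, computed over a small local window $W$ of two coregistered complex images $s_1$ and $s_2$, is:

$$\gamma = \frac{\left|\sum_{W} s_1 s_2^{*}\right|}{\sqrt{\sum_{W} \lvert s_1 \rvert^{2} \, \sum_{W} \lvert s_2 \rvert^{2}}}, \qquad 0 \le \gamma \le 1$$

A value near 1 means the scene stayed stable between acquisitions; a value near 0 means it decorrelated.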


Courtesy of Sarmap

Here’s how it works. First, let’s think about phase. Say you have a pulse with a 3 cm wavelength and you are imaging a forest. The wavelength is very similar in size to the leaves, so the signal will interact with the canopy. Due to wind and motion of the canopy, it is unlikely that images from time 1 and time 2 will be similar, and therefore coherence is low. Conversely, think about a manmade structure that does not move. The signal will likely be highly similar from time 1 to time 2, so coherence will be high.

So far we have considered only the phase component of the signal. The amplitude portion of the signal is a measure of how much signal is returned. Amplitude is very high for manmade objects and very low for water, while vegetation gives a middling and inconsistent return due to the signal bouncing around in the canopy.

A popular way to visualize the phase and amplitude components together is to load the coherence and amplitude products into different channels of the display. For example, by loading the coherence image into the red channel, the amplitude average into the green channel, and the amplitude variation into the blue channel, we come up with a false color image like the one above.
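Here is a minimal numpy/scipy sketch of that recipe, assuming a small stack of coregistered single-look complex (SLC) images; the array names, stack, and window size are hypothetical, not SARscape’s API:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def coherence(s1, s2, win=5):
    """Windowed coherence between two coregistered complex SLC images
    (the estimator above); values fall in [0, 1]."""
    cross = s1 * np.conj(s2)
    num = uniform_filter(cross.real, win) + 1j * uniform_filter(cross.imag, win)
    den = np.sqrt(uniform_filter(np.abs(s1) ** 2, win) *
                  uniform_filter(np.abs(s2) ** 2, win))
    return np.abs(num) / np.maximum(den, 1e-12)

def stretch(band, lo=2, hi=98):
    """Percentile stretch to 0..1 for display."""
    a, b = np.percentile(band, (lo, hi))
    return np.clip((band - a) / (b - a), 0.0, 1.0)

# Hypothetical coregistered SLC stack: shape (time, rows, cols), complex-valued
stack = np.random.randn(4, 256, 256) + 1j * np.random.randn(4, 256, 256)
amp = np.abs(stack)

rgb = np.dstack([stretch(coherence(stack[0], stack[-1])),  # R: coherence
                 stretch(amp.mean(axis=0)),                # G: average amplitude
                 stretch(amp.std(axis=0))])                # B: amplitude variation
```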

High coherence with high amplitude represents urban features, so urban areas appear yellow. Vegetation generally has low coherence and a consistent average amplitude without much amplitude variation, so it appears green. Features with low coherence, some amplitude variation, and very low average amplitude represent water, which appears very dark or even blue.

Overall, SAR data are highly complex but fascinating to work with, and they have relevant applications across industries, including vegetation analysis, change detection, feature extraction, and highly accurate terrain calculation, among many other use cases. I look forward to learning more – and now I had better get back to class!


Categories: ENVI Blog | Imagery Speaks

