Amanda O’Connor holds an M.S. in geology from the University of Colorado, where her thesis examined cross-correlation of AVIRIS and Landsat data for High Plains vegetation analysis. Following graduation, she worked at Stennis Space Center on calibration projects, sensor noise simulations, and the commercialization of remote sensing. After her stint in Mississippi, she headed west to the Carnegie Institution Department of Global Ecology, based at Stanford University, working for Greg Asner on tropical ecology projects using AVIRIS, Landsat, and Hyperion. She has been with Exelis Visual Information Solutions for almost 10 years, supporting government, educational, and commercial customers in the use of ENVI and IDL software.
Author: Amanda O'Connor
In the last few weeks, I’ve been working with a lot of precision agriculture data of ALL kinds. No two projects are the same, and if I revealed anything about them, well, I’d have to kill you. Except for one: our Airbus DS co-marketing activities, those I can discuss. I recently delivered a webinar on using Pleiades imagery with our ENVI Toolkit for Precision Agriculture. The webinar focused on plant counting and on using USDA Cropland Data Layers to find fields of interest. Sky Rubin of Airbus presented as well, and we’ll be showing demos of this at the Esri UC, so be sure to find us (or we’ll find you).
Anyways, when I first got my hands on the plant counter, it was an earlier version, and I, being new to it, was perhaps not keying in the best parameters, as seen below. But I was still getting 95% accuracy on a grape vineyard. The plant centers aren’t quite perfect, but still decent for getting a count. Enter the new and improved 1.0 version: I dare you to find a missed plant. Austin Coates, one of our consultants and Plant Finder auteur, tightened up a few things to get these results.
Then Zach Norman, sales engineer extraordinaire, asked if I could pull an average vegetation index value for each plant. Drops mic.
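The idea is simple to sketch: once the plant finder has given you a footprint for each vine, you average an index like NDVI inside each footprint. Here is a minimal Python/NumPy sketch of that idea (the toolkit itself runs in ENVI+IDL; the function name, array names, and toy values here are my own inventions for illustration, not the toolkit’s API):

```python
import numpy as np

def mean_ndvi_per_plant(red, nir, plant_labels):
    """Mean NDVI within each labeled plant footprint.

    red, nir     -- 2D float arrays of reflectance
    plant_labels -- 2D int array; 0 = background, 1..N = plant IDs
    Returns {plant_id: mean NDVI}.
    """
    ndvi = (nir - red) / (nir + red + 1e-9)  # small epsilon guards divide-by-zero
    return {pid: float(ndvi[plant_labels == pid].mean())
            for pid in np.unique(plant_labels) if pid != 0}

# Toy 4x4 scene with two "plants"
red = np.array([[0.1, 0.1, 0.3, 0.3],
                [0.1, 0.1, 0.3, 0.3],
                [0.2, 0.2, 0.2, 0.2],
                [0.2, 0.2, 0.2, 0.2]])
nir = np.array([[0.5, 0.5, 0.4, 0.4],
                [0.5, 0.5, 0.4, 0.4],
                [0.3, 0.3, 0.3, 0.3],
                [0.3, 0.3, 0.3, 0.3]])
labels = np.array([[1, 1, 2, 2],
                   [1, 1, 2, 2],
                   [0, 0, 0, 0],
                   [0, 0, 0, 0]])
print(mean_ndvi_per_plant(red, nir, labels))
```

In a real workflow the label image would come from the plant counter’s output, and the per-plant means are what flag the vine that needs water or fertilizer right now.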
When you do things like this and people see your work, you realize the value of being in a collaborative environment: you can feed off each other, motivate each other to ask questions, and find different ways of doing things. Here we go from OK counts, to better counts, to water/fertilize/do something to that grape plant right now! In boutique crops like viticulture, where the value of each plant is so high, a look like this can be incredibly useful for plant monitoring. For a lower-value crop like corn, more regional information is fine and useful, because management occurs more at the field level than at the plant level.
And the ENVI Toolkit for Precision Agriculture is available. It’s a library that runs with ENVI+IDL or ENVI Services Engine. Cost is $3,500. You can use it yourself, or our services team can integrate it into your existing cloud or ArcGIS for Server environment. Drop me a line with questions.
Follow me @asoconnor
Categories: ENVI Blog | Imagery Speaks
Lately the topic of vegetation analysis, in particular species identification, has been coming up like gangbusters. I spent a lot of time during my years at the Carnegie Institution Department of Global Ecology looking at invasive species in tropical environments (think dense tropical forests). Even with hyperspectral data it was a huge challenge. Vegetation isn’t like minerals. It’s carbon, hydrogen, nitrogen, and oxygen. Absorption features are driven more by leaf composition and health than by a species having a unique feature, whereas a mineral has absorption features unique to it, or to a family of minerals, caused by electron bonds or the presence/absence of other compounds. Vegetation of the same species can look vastly different. Even leaves on the same plant can look completely different from one another. In other words, intra-species variability is often greater than inter-species variability. Or, as I say too often, vegetation is hard.
Take my chlorotic plant here. According to Wikipedia, “chlorosis is a condition in which leaves produce insufficient chlorophyll. As chlorophyll is responsible for the green color of leaves, chlorotic leaves are pale, yellow, or yellow-white.” One leaf looks vastly different from the rest of the plant, but it’s the same plant. Species ID with spectral data is challenging because there are so many manifestations of how a single plant can appear. Creating a spectral library for one species of plant is extremely difficult; there are literally endless permutations of how that species can appear. You really can’t use one spectrum to go find others. If you did, a classifier would find others like that spectrum, but whatever state that spectrum is in (dry, wet, healthy) does not truly give you a species ID. That’s not to say you can’t do some discrimination. Conifers and deciduous trees can be separated, and the same goes for grasses and certain kinds of shrubs with certain data. But separating subalpine fir from Douglas fir with only spectra is really challenging.
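To make the “a classifier would find others like that spectrum” point concrete, here is a tiny Spectral Angle Mapper (SAM) sketch in Python. The band values below are invented for illustration, not from any real spectral library; the point is that SAM shrugs off brightness differences (a shaded leaf of the same plant still matches), but a chlorotic leaf of the very same species comes back as a poor match:

```python
import numpy as np

def spectral_angle(s, r):
    """Angle in radians between pixel spectrum s and reference r.
    Smaller angle = more similar spectral shape; the metric is
    insensitive to overall brightness (illumination)."""
    cos = np.dot(s, r) / (np.linalg.norm(s) * np.linalg.norm(r))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# Hypothetical 5-band spectra (made-up numbers)
healthy     = np.array([0.05, 0.08, 0.04, 0.45, 0.40])  # strong NIR plateau
chlorotic   = np.array([0.10, 0.20, 0.15, 0.25, 0.22])  # yellowed leaf
dim_healthy = 0.5 * healthy                              # same leaf, shaded

print(spectral_angle(dim_healthy, healthy))  # ~0: brightness doesn't matter
print(spectral_angle(chlorotic, healthy))    # large: same species, big angle
```

So a library built from healthy spectra happily finds more healthy-looking pixels and misses the stressed individuals of the same species, which is exactly the intra-species variability problem described above.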
Ray Kokaly of the USGS Spectroscopy Lab has a nice paper on vegetation biochemistry observable from imaging spectroscopy. My key takeaway is that you can extract a ton of biophysical information from imagery and use it to study ecosystem health and productivity, but again, plants have similar biochemistries, so it’s not as if that can be applied to species identification. These chemistries are not unique to a species.
The four things that can be most helpful toward getting to species ID are the following:
1. Temporal data. Certain species emerge before others. Plants bloom at different times, have different responses to water stress, die at different times, etc. Even just two images from different times can help and more can provide a higher degree of accuracy.
2. Contextual information, such as: you aren’t going to find palm trees growing in the Mountain West, so don’t look for them there; look for lodgepole pine, Douglas fir, and blue spruce. Narrow your options and you’re more likely to have some success with a library and target detection. Again, I would recommend HSI data; MSI would probably still be confounded, depending on the spatial resolution.
3. Object-based analysis. This method works well with high-resolution data, so maybe the spectral signature isn’t enough information, but crown shape, clustering, and texture can all indicate certain species. ENVI’s Feature Extraction Tool can provide this type of analysis.
4. Deep learning. We’re just starting to scratch the surface of what deep learning can do with vegetation. We have a working tool in house, and if you have a hard vegetation problem you need solved, give me a shout. This Yelp article on identifying food pictures gives a good overview of how deep learning works.
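To illustrate the temporal idea in item 1 with a toy sketch: two species that end the season at the same NDVI can separate cleanly once you add a spring-to-summer NDVI change feature, because an early emerger is already green in spring. All numbers below are invented; the feature-stacking pattern is the point:

```python
import numpy as np

def phenology_feature(ndvi_spring, ndvi_summer):
    """Stack the per-pixel NDVI change onto the summer NDVI value.

    A classifier fed this two-band feature can separate species by
    green-up timing even when their summer spectra look identical."""
    return np.dstack([ndvi_summer, ndvi_summer - ndvi_spring])

# Hypothetical single-pixel examples: both species reach NDVI 0.7
# by summer, but the early emerger was already green in spring.
early = phenology_feature(np.array([[0.6]]), np.array([[0.7]]))
late  = phenology_feature(np.array([[0.2]]), np.array([[0.7]]))
print(early[0, 0], late[0, 0])  # same summer NDVI, very different change
```

Even just two dates gives a discriminating axis a single image can’t; more dates sharpen the phenology curve further, as noted above.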
As I said in the beginning, vegetation is hard. Not impossible, but hard. But there are a number of things you can try to improve your results. Invasive species detection, weed infestation, and finding endangered plants are all areas of research in vital need of solutions. If you have a hard vegetation problem and want to talk it through, please reach out to me. It’s one of my favorite topics, in case you couldn't tell.
Follow me @asoconnor.
Tags: NDVI, vegetation analysis, deep learning
Well, as another year winds down, I start thinking about the things that caught my attention, remote sensing-wise, in 2015.
1) Cloud deployments are off and running! With remotely sensed data being so large, there seemed to be collective industry doubt about moving applications to the cloud. Will my massive data really be analyzable over an internet connection? Will it break the bank to work with my data on AWS? Will I still have control? The barriers I saw broken down in cloud deployments were:
a. Back-end processing and preprocessing
b. Automatic creation of a lightweight product, like a report
c. Interaction with existing customer databases or data repositories
Ultimately, the cloud is ideal for remote sensing data. It will enable the really big problems to be answered with data in one place, especially free public data, letting people ask questions about our world, and others, that we never thought possible. For a while it was really difficult for organizations to come to terms with variable monthly cloud computing costs; they’re hard to allocate and predict. But it’s like a utility: you don’t know exactly how much your heat costs each month, but you have an idea, and over time you get a better sense of your usage. Governments, businesses, and institutions are starting to look at cloud computing in this model and feeling much safer with it as a robust option for data analytics.
2) I love working with Airbus Defense and Space. I know what you’re saying: “Amanda, you’re shamelessly plugging your business partner.” And what do I say? “So what! They sent me a giant chocolate bar.” And that little hammer is going to be used henceforth for all my large chocolate breakages. But in all seriousness, I see a lot more people using their data because of their flexibility on tasking, great archive, and affordability. The tri-stereo Pleiades data makes a lovely image-generated point cloud. They have a really great imagery offering, and it’s only going to get better this year.
3) Precision agriculture: a paradox. Precision agriculture has long been touted as the ideal field for remote sensing, and there’s use here and there, but there is also a lot of concern over the expense of data versus the margins it can help eke out. This year I saw huge excitement from the precision agriculture community over Planet Labs’ promise of daily coverage of land masses with their constellation. Coverage within a growing season has always been tough: too spotty and you miss something; too much and, well, you have too much data. A drone seems like the ideal platform for precision agriculture, but it depends on your market. You might not need leaf scale in a cornfield, and the price of corn doesn’t support that level of resolution. And bandwidth is a HUGE issue both domestically and internationally: rural areas (i.e., where there’s a lot of agriculture) don’t have great coverage here, and internationally there’s none in places. This community is poised like no other to drive automation, cloud scaling, and more data, more data, more data. With the diversity of crops, landscapes, fertilizer regimes, water use, and land management, there are a gazillion ways remote sensing can help turn a higher profit for growers, sellers, processors, and suppliers. We are working on a precision agriculture toolkit that will be available in January. Check out this story from precisionag.com on all the cool things happening and how many involve imagery.
4) Drones. Yep, they are everywhere, and guess what? They create really big data that’s really great! But like all wondrous things, there are a couple of issues. Working with these massive datasets is not for the faint of RAM. Luckily, ENVI handles these big datasets for mosaicking, color balancing, deriving meaningful information, and a myriad of other tasks. I see a lot of datasets that look beautiful but then have alignment issues, either to maps or band to band. Removing artifacts can be complex and time consuming, and the bigness of the data can bring some applications to their knees. But the right software (ENVI, naturally), a good cloud computing platform, a powerful workstation, and some image processing background can get you around these issues. Our services group has been getting a lot of experience working with data that would be great if only “x, y, and z” were corrected. Whether it’s the photogrammetry piece, the product piece, or preprocessing, we can get your drone data singing.
5) The sharing economy. One of our other partners is CloudEO. They have a great product that lets you use software and data for just the time you need them, as opposed to making a large long-term investment. You can get ENVI for a low monthly cost of $400 and have access to Airbus DS data like SPOT and Pleiades, plus Landsat and Sentinel, AND you get compute power. You rent the data, tools, and space, which makes one-time imagery projects easy to fund, keeps ongoing projects going because of the lower infrastructure investment, and lets you collaborate with the CloudEO community of users to accelerate your discoveries. This and DigitalGlobe’s Geospatial Big Data platform are pushing pixel rental to new heights, making it easier to ask big questions of big data. Why keep an image forever on a hard drive if you only need to ask it one thing?
So it’s 4:30 on New Year’s Eve and I’m the only person left here, so it's probably time to call it a wrap. It’s time for 2016 and new fun discoveries. I’ll be at AMS in January and ILMF in February. Stop by the Harris booth and tell me what’s getting you excited in remote sensing this year.
The Extensions Library, formerly Code Contribution, is a treasure trove of tools for ENVI and IDL. Some are written by Exelis VIS staff (tech support, sales engineers, product managers, and software engineers), others by customers. Most tools are written to meet a specific customer need or an internal need, or to support a specific data type. ENVI tools can be dropped directly into the extensions folder for ENVI 5 (e.g., C:\Program Files\Exelis\ENVI53\extensions) or the save_add directory for ENVI Classic applications; IDL-specific tools just need to be in your IDL path. Some code is distributed as source, some as binary .sav files. Users who want to publish their code on the Extensions Library can contact firstname.lastname@example.org and send their code in .sav or .pro form (there is a short form to fill out). You will need your Exelis VIS login to get to the Extensions Library. Here is a list of some of my favorite tools:
Airbus Defense and Space Catalog Query Tool
This tool drops into the ENVI 5 interface and allows users to directly browse the Airbus Defense and Space GeoStore for SPOT and Pleiades data. The nice thing here is that you can browse for data in the context of other imagery or vector layers you might have inside ENVI, to make sure you’re getting data in the right area with the correct parameters, such as look angle, to meet your needs exactly. There’s also the option to send an email to Airbus Defense and Space to get a quote for the imagery you’re interested in. It’s one-stop shopping directly from ENVI.
ENVI Program Generator
Written by Eduardo Iturrate. Eduardo’s goal for this and the next bit of code I mention is bringing programming to the masses, making it easy to script as well as to learn by example. The ENVI Program Generator allows you to open data, do something to it (classify, segment, band math, etc.), and create an output. It then generates the code to do this, and et voilà, you have a batch script with a few button clicks. And while this batch script is a great utility, the source code it generates is also a great way to teach yourself ENVI programming.
IDL Source Code Generator
IDL’s key value has always been data access, analysis, and visualization. This powerful tool guides you to open a certain data type, do some sort of analysis or processing, and create a 2D or 3D visualization or output. It is incredibly powerful for the non-programmer who needs to access scientific data, doesn’t know the tricks of manipulating and analyzing those data, and would like a sandbox for creating output visualizations. The resulting program can be run again and again, or used as an example to learn from. The bottom line: if you’re a beginner with IDL, this code will save you a ton of time and take a lot of the anxiety out of learning a new language.
ENVI Report Generator
This tool allows you to create professional-looking reports directly from ENVI, including adding logos, annotation, legends, and backgrounds. It’s a great tool when you’re creating actionable image products in ENVI that need to be shared with others on your team for use in decision making.
Landsat Gapfill tool for ETM SLC-off data
This is a tool written by Mari Minari from our tech support team to fill in the Scan Line Corrector-off data from Landsat 7. While infill is ultimately not ideal, given the failure of this component on Landsat 7, this tool allows one to still make use of these data. It follows the USGS methodology, so it’s the best option if one must work with SLC-off data. At the moment this utility is for the ENVI Classic interface, but it works just great there.
Devin White’s ENVI Plug-ins
Dr. Devin White, one of my favorite ex-VIS employees, has found great success with supercomputing at Oak Ridge National Laboratory. Devin hosts his ENVI library on GitHub and has some great tools for MODIS, VIIRS, ortho processing, and more. He writes fast, effective code; he is truly a “get ’er done” resource.
ENVI Generate Anaglyph
An oldie but a goodie. Developed by our ENVI product manager Adam O’Connor, the Anaglyph Generation tool for DEMs and imagery creates red/blue results for viewing with 3D glasses. It runs in the ENVI 5 interface or in Classic.
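The underlying trick is simple: put one view in the red channel and the other in the green and blue (cyan) channels, then view through red/cyan glasses. A minimal NumPy sketch of that idea, with made-up toy arrays standing in for the two rendered views (this is my own illustration, not the tool’s implementation):

```python
import numpy as np

def make_anaglyph(left_gray, right_gray):
    """Red/cyan anaglyph from two grayscale views, e.g. a stereo image
    pair or a DEM hillshaded from two slightly different angles.
    Left view feeds the red channel; right view feeds green and blue."""
    rgb = np.dstack([left_gray, right_gray, right_gray])
    return np.clip(rgb, 0, 255).astype(np.uint8)

# Toy 2x2 "views" (real inputs would be coregistered images or renders)
left  = np.array([[100, 150], [200, 250]], dtype=float)
right = np.array([[110, 140], [210, 240]], dtype=float)
ana = make_anaglyph(left, right)
print(ana.shape)  # (2, 2, 3)
```

The apparent depth comes entirely from the horizontal offset (parallax) between the two views, which is why the inputs need to be properly coregistered first.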
There are many more utilities and tools in the Extensions Library on the Exelis VIS website, but these are the ones I use most frequently. And remember, IDL .sav files can be run freely with the IDL Virtual Machine.
So we had our ENVI Analytics Symposium last week in Boulder. Nine months of committee meetings, planning, and designing paid off beautifully with a well-organized and engaged conference crowd. As I digested all the presentations, a couple of themes and topics stood out, so here is my top 6 list (because top fives are so last year :))
That pivot to the cloud is happening. Numerous presenters have deployed their ENVI applications in a cloud modality: Datamapper.com, CloudEO, DigitalGlobe’s Geospatial Big Data, Esri, HySpeedgeo.com, and I’m sure I’ve left someone out. Consumers are looking for answers, and not everyone wants or needs to be a geospatial ninja. Let us ninjas figure out how to solve the problem and deploy to a cloud platform that can be accessed on demand and serve up actionable answers from a large pile of data.
Data Mapper’s Algorithm Market Powered by ENVI Services Engine
UAS, UAVs, drones, whatever you call them, they are hot. Each presentation on these devices was followed by the question, “How are you legally flying them?” Precision Hawk, Agribotix, and Delair-Tech all have exemptions or proper certifications to fly. And, see #1 above, both Agribotix and Precision Hawk are deploying their solutions in the cloud. A new-ish method of data collection is pushing imagery onto a more consumable platform where end users can get information.
Jason San Souci of Precision Hawk
Data volume. Yes, Big Data is having its day and then some. DigitalGlobe is taking on geospatial big data with its “Platform as a Service” approach, providing data and compute power to “show me where” I can find features in a sea of data. Dr. Devin White at Oak Ridge National Laboratory is using a supercomputer all tricked out with GPUs to perform settlement mapping worldwide. Airbus DS has its WorldDEM at 12 m resolution, enabling terrain modeling in areas that previously had only 90 m SRTM data to work with; this dataset is huge. There are numerous scientists asking questions of big data, and the compute power is there in force to do it, and not just expensive supercomputers: a GPU can turn a laptop into a supercomputer if put to work the right way.
Dr. Devin White talking remote sensing and supercomputing, photo from Michael Prentice
Saving the world. It’s a dirty job, but someone’s got to do it. Enter Dr. Andrew Marx from Claremont Graduate University. He’s using Landsat data to tip and cue for human rights violations: think looking at a pixel for change over time to find where villages are burned or areas bombed, then using that ‘cue’ to get higher-resolution data. A great use of free data to control the costs of acquiring higher-resolution imagery. Dr. White, in addition to being a computer geek extraordinaire, is also putting his talents to use with the Gates Foundation by finding settlements for polio vaccination campaigns and disease eradication. Vice Admiral Robert Murrett (Ret.) was the perfect proxy to talk to these gentlemen in the panel discussion, owing to his military experience and understanding of how remote sensing can provide information to keep people safe and mitigate suffering.
Vice Admiral Robert Murrett (Ret) addressing the crowd, courtesy Trajectory Magazine
Spectral analysis is still uncovering new discoveries. Bill Baugh presented on WV-3 and SWIR data (you can see crazy boat wakes in SWIR data because of the bubbles in the water). Ray Kokaly’s PRISM tool is amazing: 1) it’s free, because it was developed by the USGS; 2) it works with ENVI or the IDL Virtual Machine; and 3) it makes wickedly good mineral and vegetation maps. Ray also presented on a new vegetation senescence absorption feature in the 1500-1600 nm range (I don’t recall the exact bands). I think it’s fair to say most people probably thought this feature was noise or an artifact of atmospheric correction. Fred Kruse showed how thermal data can enhance HSI imagery for certain mineral mapping. So, lots of great info on “imaging spectroscopy,” if you subscribe to the Roger Clark worldview, or “imaging spectrometry,” if you subscribe to the Alex Goetz worldview.
USGS Mineral Map of Afghanistan created with the PRISM Software
In the end it comes down to basics. Atmospherically correct your data; it does make a difference if you are going after a numeric result. If you fly under clouds with a UAV, you will have shadows in the imagery and your data will be harder to interpret. The word “data” is plural, as in “These data are” or “The data were collected over a series of years, and not without the PI losing her mind” (you’re welcome, Fred Kruse). For pure pixels, georegister after analysis. Close your parentheses, end your for loops, and in the meantime think about what you would do with unlimited compute and data resources and tell us about it at the next EAS!
Follow me on Twitter @aoconnor and check out #EAS2015 for all the tweets and pictures from the conference.