
Matthew Bower

Matthew Bower works at Exelis VIS as the principal scientist for the Jagwire product line. He has served as an engineer and scientist in the photogrammetry, computer vision, and video processing fields for the past twelve years. Matthew was an early leader in the field of Unmanned Aerial Systems (UAS) video exploitation, developing some of the earliest analysis and exploitation products for UAS data. For the past five years, he has served as a technical advisor in these fields for major UAS programs of record.

4 May 2016

Informing Real-Time Operations with SmartCam Augmented Reality

Author: Matthew Bower

Entry 3 in Blog Series: Getting to Know Jagwire

The virtue of flying full motion video (FMV) sensors is typically found in the real-time nature of the returned sensor data. The ability to watch a collection live while in flight means that you can observe conditions on the ground in real time and make adjustments. That is a powerful tool. For typical operations, this real-time aspect is key, breaking the paradigm of plan, fly, collect, quality check, and repeat if the collection was in some way incorrect.

Unfortunately, real-time operations are still hampered by the typical narrow field of view, or “soda straw” nature of many FMV sensor systems, and also by the difficulty of orienting oneself to relevant spatial features in a moving image frame. For many sensor operators, no tools are available to assist in the task. A few more fortunate operators have, at best, a separate “slaved” map in which the sensor footprint is viewed, in a different window, over a separate set of map data. A map-to-video correlation step must then take place in the mind of the sensor operator, leading to an increased workload, and the constant potential for misunderstanding and miscommunication.

Jagwire’s new augmented reality capability, powered by SmartCam3D, provides an elegant, powerful new solution to this problem of informing and communicating with the sensor operator. This capability provides real-time overlays of a variety of reference data sets, including road networks, cultural and geographic points of interest, and custom KML data sets, all presented over live FMV data. By synchronizing operationally relevant reference data right onto the FMV picture, the operator is able to quickly and transparently understand the ground picture, vastly increasing operational effectiveness.
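SmartCam3D performs this geo-registration internally; purely as an illustration of the underlying idea, the sketch below projects a geographic point of interest into a video frame using the sensor pose reported alongside the FMV. The function name, the flat-earth approximation, and the assumption that a camera rotation matrix is available from platform and gimbal telemetry are all illustrative choices for this example, not Jagwire or SmartCam3D code.

```python
# Minimal sketch (not product code): project a geographic point of interest
# into a video frame, given the sensor pose from the FMV metadata.
import numpy as np

M_PER_DEG_LAT = 110_540.0   # rough meters per degree of latitude
M_PER_DEG_LON = 111_320.0   # rough meters per degree of longitude at the equator

def project_poi(poi_lla, cam_lla, R_cam_from_enu, fx, fy, cx, cy):
    """Return (u, v) pixel coordinates of a point of interest, or None if it
    falls behind the camera.  poi_lla / cam_lla are (lat, lon, alt) in degrees
    and meters; R_cam_from_enu is a 3x3 rotation from the local East-North-Up
    frame to the camera frame, assumed to come from telemetry."""
    lat_p, lon_p, alt_p = poi_lla
    lat_c, lon_c, alt_c = cam_lla

    # Flat-earth local tangent plane approximation (adequate over short ranges).
    east = (lon_p - lon_c) * M_PER_DEG_LON * np.cos(np.radians(lat_c))
    north = (lat_p - lat_c) * M_PER_DEG_LAT
    up = alt_p - alt_c
    v_enu = np.array([east, north, up])

    # Rotate into the camera frame, then apply a pinhole projection.
    x, y, z = R_cam_from_enu @ v_enu
    if z <= 0:                       # point is behind the image plane
        return None
    return (cx + fx * x / z, cy + fy * y / z)
```

With a projection like this, a road vertex or KML placemark can be drawn directly on the live frame it corresponds to, rather than on a separate slaved map.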

Over the next few blog entries, we will be looking at a variety of scenarios to illustrate the operational improvements possible when combining Jagwire web-services, ENVI analytics, and SmartCam3D augmented reality.

Entry 2 in Blog Series: Disseminate your Geospatial Data with Jagwire

Entry 1 in Blog Series: Save Time and Effort by Organizing Your Data with Jagwire


19 April 2016

Disseminate your Geospatial Data with Jagwire

Author: Matthew Bower

Entry 2 in Blog Series: Getting to Know Jagwire

The platform is fueled, the sensor is operational, and the mission is beginning. Operation of airborne or space-borne sensors is a complicated business. From optics to airframes, from mission planning to data links, there are a dozen difficult problems that must be solved any time a new sensor first flies. With all the planning and forethought that goes into getting the sensor off the ground, is it any wonder that the problem of disseminating the data after it is captured is so often overlooked?

Most systems collect the sensor data at a single designated point (often referred to as a ground station). So it is usually just as everyone starts to congratulate themselves on the big job of gathering the data that another headache begins: how to disseminate it beyond the ground station.

This problem of geospatial data dissemination is one that Jagwire was designed to solve. Once your sensor data makes it to the ground station, Jagwire can ensure that the data gets to the users who need it most, with minimal latency and the best available quality per bitrate.

Any sensor or geospatial data loaded into Jagwire can be retrieved via the easy-to-access web user interface or via the easy-to-use API. The data can be previewed, metadata can be reviewed, and the original, raw data is always available for download. For users not in possession of professional-grade analysis tools, Jagwire provides lightweight, web-based viewing and includes downloadable tools for previewing and performing light exploitation on a variety of data types. This focus on choose-your-method dissemination lets users select the specific dissemination mechanism they require and get access to the data they need, faster and easier.
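As a concrete illustration only, the snippet below shows what pulling data from a Jagwire node over its web services might look like. The base URL, endpoint paths, parameter names, and response fields here are assumptions made up for the example, not the documented Jagwire API.

```python
# Hypothetical sketch of searching a Jagwire node and downloading raw data.
import requests

JAGWIRE_URL = "https://jagwire.example.com/api"   # assumed base URL

# Search the archive for imagery over an area, then fetch each original file.
resp = requests.get(
    f"{JAGWIRE_URL}/search",
    params={"type": "image", "bbox": "-105.1,39.6,-104.9,39.8"},  # illustrative params
    timeout=30,
)
resp.raise_for_status()

for item in resp.json().get("results", []):
    raw = requests.get(f"{JAGWIRE_URL}/items/{item['id']}/original", timeout=300)
    with open(f"{item['id']}.raw", "wb") as f:
        f.write(raw.content)
```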

For remote users with limited bandwidth connections, such as those using mobile devices, Jagwire supports extremely bit-efficient sensor product dissemination. Still images can be exposed via streaming JPIP protocols, moving only those pixels that the end user needs. Video is transcoded in real-time to a variety of bit rates, so your high-bandwidth users can enjoy full-rate HD, while remote users can still get great video over very small network pipes. LiDAR can be streamed in a fashion similar to JPIP, giving the consumer an immediate overview, followed by a refined picture developing over time.
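Jagwire performs this transcoding on the server side; simply to illustrate the multi-bitrate idea in general terms, the sketch below uses ffmpeg (a separate, widely available tool) to produce renditions sized for different network pipes. It is not how Jagwire itself is implemented.

```python
# Illustration of multi-bitrate video renditions using ffmpeg (assumed installed).
import subprocess

RENDITIONS = [
    ("hd",     "1920x1080", "4000k"),   # full-rate HD for high-bandwidth users
    ("medium", "1280x720",  "1500k"),
    ("low",    "640x360",   "400k"),    # small enough for constrained links
]

for name, size, bitrate in RENDITIONS:
    subprocess.run(
        ["ffmpeg", "-y", "-i", "input.ts",
         "-c:v", "libx264", "-b:v", bitrate, "-s", size,
         f"output_{name}.mp4"],
        check=True,
    )
```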

All this adds up to a versatile dissemination system that can help get your sensor data out the door faster. If your sensor data is stuck in place, take a look at Jagwire for getting that data to the users who need it most.

Entry 1 in Blog Series: Save Time and Effort by Organizing Your Data with Jagwire


24 February 2016

Save Time and Effort by Organizing Your Data with Jagwire

Author: Matthew Bower

Entry 1 in Blog Series: Getting to Know Jagwire

 

Performing detailed geospatial analysis is a complex and time-consuming activity. Applicable data can vary greatly in size and complexity, from multispectral satellite imagery covering many spectral bands and weighing in at gigabytes per image, to high-definition video collections covering hours of flight time, to compact vector and feature-based data, small in size but powerful in content.

For the scientist or analyst, organizing these vast collections of data can quickly become a time-consuming and unrewarding chore. Jagwire provides a central repository for all of your geospatial data, storing and organizing the data and removing this burden from the user. Once equipped with Jagwire, users are able to formulate coherent questions and queries about their data (a sketch of how such a query might be expressed in code follows the list), such as:

- Do I have any LiDAR or Digital Elevation Model (DEM) coverage of this area?

- Do I have any images that cover this area before and after a significant event took place?

- Find me all the vector data that occurs within the footprint of this image.

- Find me IR band video covering this location between the hours of 4 and 5 pm.
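For illustration, here is one way queries like these might be assembled programmatically. The parameter names ("bbox", "start", "band", and so on) are assumptions for the sketch, not the actual Jagwire search interface.

```python
# Hypothetical sketch of building a spatio-temporal query against an archive.
from datetime import datetime

def build_query(bbox, start=None, end=None, data_type=None, band=None):
    """Assemble a search such as 'IR video over this box between 4 and 5 pm'."""
    query = {"bbox": ",".join(str(c) for c in bbox)}  # min_lon,min_lat,max_lon,max_lat
    if start:
        query["start"] = start.isoformat()
    if end:
        query["end"] = end.isoformat()
    if data_type:
        query["type"] = data_type
    if band:
        query["band"] = band
    return query

# "Find me IR band video covering this location between the hours of 4 and 5 pm."
q = build_query(bbox=(-104.99, 39.74, -104.97, 39.76),
                start=datetime(2016, 2, 20, 16, 0),
                end=datetime(2016, 2, 20, 17, 0),
                data_type="video", band="IR")
```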

The ability to ask these sorts of questions about a datastore serves to jumpstart the work of the scientist or analyst, allowing them to skip over painful data organization tasks and move directly to the more important work that demands their time and focus.

Jagwire offers several differentiating features as a data archive system. One powerful feature is the ability to federate searches between different Jagwire nodes. If you have different Jagwire nodes set up at different sites, a single search can look across all connected archives, giving users a seamless way to share their data stores.

Further, Jagwire offers considerably enhanced search precision according to temporal and spatial criteria. Consider the following case, wherein the data product is a full motion video collection, which moves in and out of the desired query area over time.

[Figure: an FMV collection whose ground footprint passes in and out of the spatial query area over time, segmented into clips 1 through 5.]

Most video archive systems would segment the incoming video stream into several clips of a given duration (clips 1 through 5 in the figure above). Then, when a spatial query is issued, the system would return the entirety of clips 1 through 5, as each clip has some segment that falls within the given spatial criteria.

Jagwire’s more precise spatial search would provide the user with exactly and only that data which matches their request. The response would come as two consolidated clips: the first beginning approximately halfway through clip 1 and ending a quarter of the way into clip 3, the second beginning partway through clip 4 and ending partway through clip 5. In this way, Jagwire provides the user with a set of consistent video streams and limits the data returned to exactly and only what the user requested.
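To make the idea concrete, here is a minimal sketch of the kind of logic involved: walk the per-frame footprints of a video and merge consecutive frames whose footprint intersects the query area into consolidated clip intervals. This is an illustration of the concept, not Jagwire's implementation; the geometry test is assumed to come from a library such as shapely.

```python
# Sketch: consolidate the frames whose footprint overlaps the query area
# into (start, end) clip intervals.

def consolidate_clips(frames, query_area, intersects):
    """frames: iterable of (timestamp, footprint); intersects(footprint, area)
    is an assumed geometry predicate (e.g. shapely's intersects)."""
    clips, start, last_hit = [], None, None
    for t, footprint in frames:
        if intersects(footprint, query_area):
            if start is None:
                start = t                      # entering the query area: open a clip
            last_hit = t
        elif start is not None:
            clips.append((start, last_hit))    # leaving the area: close the clip
            start = None
    if start is not None:
        clips.append((start, last_hit))
    return clips   # e.g. the two consolidated spans described above
```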


14 May 2015

AUVSI "Making the Most of Big Data" Panel

Author: Matthew Bower

Last week I was fortunate to participate in the “Making the Most of Big Data” Panel, at the Association for Unmanned Vehicle Systems International (AUVSI) Conference at the World Congress Center in Atlanta, Georgia. The panel consisted of industry experts on the technical execution, scientific, legal, and policy aspects of Geospatial Big Data, and was held before an audience of approximately 120 attendees.

I was able to speak about the future of Geospatial Big Data, including forward-looking technological and cultural considerations. One topic of interest to the audience was the “Passive Analytics” concept under research by Exelis VIS and its applications to geospatial big data. The concept behind Passive Analytics is that, often, geospatial data is collected with a specific purpose in mind, and that this purpose is reflected in the “telemetry” or “metadata” associated with the collect. By carefully examining the details and trends present in this metadata, it is possible to determine (to some reasonable degree of accuracy) the ways in which a user may want to exploit that data.

For instance, consider a UAS platform carrying an FMV sensor and orbiting an area of interest. The fact that the platform has performed an orbit, and kept the sensor steady on a specific area throughout the orbit, can be detected within the metadata. This can serve to identify to a system that this area is important and cue the system to take some action, such as computing a 3D model from the source FMV. This 3D model may be a more effective product for exploitation by the end user.
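As a rough, hypothetical sketch of that idea (not a product implementation), the snippet below decides from FMV metadata whether the platform has orbited a steady point of interest; a system could use such a test as the entry criterion that cues further processing. The metadata field names are assumptions for the example.

```python
# Sketch: detect an orbit around a steady look point from FMV metadata.
import math

def looks_like_orbit(samples, max_drift_m=150.0, min_sweep_deg=300.0):
    """samples: list of dicts with the sensor's ground look point ('look_east',
    'look_north', in meters; field names assumed) and platform 'heading' in degrees."""
    if len(samples) < 2:
        return False

    # The sensor should stay on roughly the same spot...
    e0, n0 = samples[0]["look_east"], samples[0]["look_north"]
    drift = max(math.hypot(s["look_east"] - e0, s["look_north"] - n0) for s in samples)

    # ...while the platform heading sweeps most of a full circle.
    sweep = 0.0
    for prev, cur in zip(samples, samples[1:]):
        sweep += (cur["heading"] - prev["heading"] + 180.0) % 360.0 - 180.0

    return drift <= max_drift_m and abs(sweep) >= min_sweep_deg
```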

The monitoring of the incoming data, and the execution of processing when entry criteria are met, can be handled by the big data processing center without user input (passively, thus Passive Analytics). The result is a refined or enhanced product, available on demand when the user searches the big data system. The Passive Analytics framework and related products such as ENVI Services Engine (ESE) and Jagwire are just some of the technologies Exelis VIS brings to the evolving world of geospatial big data.

Speaking at the panel was a great experience; it’s exciting to see the new realm of possibilities opening up as Geospatial Big Data makes its way to the forefront of the Geospatial and Analytics communities. My sincere thanks go to my fellow panelists and the great team at AUVSI for the wonderful experience. I look forward to participating again next year and seeing just how far Geospatial Big Data has come.

 

