The ecological sciences face the challenge of detecting subtle changes, sometimes over large areas and across varied temporal scales. The task is thus to measure patterns of slow, subtle change occurring along multiple spatial and temporal scales, and then to visualize those changes in a way that makes important variations visceral to the observer. Imaging plays an important role in ecological measurement, but existing techniques are often limited in spatial resolution, view angle, and/or temporal resolution. Furthermore, integrating imagery acquired through different modalities is often difficult, if not impossible. This research envisions a community-based, participatory approach built around augmented rephotography of ecosystems, demonstrated through a case study in monitoring the urban tree canopy. The goal is to integrate ground-level rephotography with available LiDAR data for a set of urban locations, creating a dynamic view of the urban forest and its changes across spatial and temporal scales. The case study offers the opportunity to explore augmentations that improve the ground-level image capture process, protocols that support 3D inference from the contributed photography, and both in-situ and web-based visualizations of temporal change.
Classical stereo algorithms attempt to reconstruct 3D models of a scene by matching points between two images. Finding points that match is an important part of this process, and point matches are most commonly chosen as the minimum of an error function based on color or local texture. Here we motivate a probabilistic approach to this point-matching problem, and provide an experimental design for the empirical measurement of the color-matching error for corresponding points. We use this empirically derived distribution in a Bayesian scene reconstruction example, and show that better 3D reconstructions result from not committing to a specific pixel match early in visual processing. This allows a calibrated stereo camera to be treated as a probabilistic volume sensor, which can be more easily integrated with scene-structure measurements from other kinds of sensors.
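The core idea above can be sketched in code: rather than committing to the argmin of a color-matching error, convert the errors for all candidate matches into a normalized distribution that downstream Bayesian inference can consume. This is a minimal illustration, not the paper's method; the Gaussian error model, the `sigma` noise scale, and the function name `match_likelihood` are all assumptions for the sketch.

```python
import numpy as np

def match_likelihood(left_patch, right_patches, sigma=10.0):
    """Convert color-matching errors for candidate correspondences into a
    probability distribution over matches, instead of committing to the
    single best (argmin) match.

    left_patch    : reference image patch (H x W x 3 array)
    right_patches : list of candidate patches along the epipolar line
    sigma         : assumed color-noise scale (an illustrative choice)
    """
    # Sum-of-squared-differences color error for each candidate match
    errors = np.array([np.sum((left_patch - rp) ** 2) for rp in right_patches])
    # Gaussian likelihood on the color error
    log_lik = -errors / (2.0 * sigma ** 2)
    log_lik -= log_lik.max()          # subtract max for numerical stability
    lik = np.exp(log_lik)
    return lik / lik.sum()            # normalized distribution over candidates
```

Because the output is a distribution over candidate depths rather than a single depth, it can be multiplied by a scene prior or fused with likelihoods from other sensors before any decision is made, which is what lets the stereo pair act as a probabilistic volume sensor.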
Autonomous systems that navigate through unknown and unstructured environments must solve the ego-motion estimation problem. Fusing information from many different sensors makes this motion estimation more stable, but requires that the relative position and orientation of the sensors be known. Self-calibration algorithms are the most useful for this calibration problem because they do not require any known feature in the environment and can be used during system operation. Here we give geometric constraints, the coherent motion constraints, that provide a framework for developing self-calibration algorithms for a heterogeneous sensor system (such as cameras, laser range finders, and odometry). If, for all sensors, a conditional probability density function can be defined that relates sensor measurements to sensor motion, then the coherent motion constraints allow a maximum-likelihood formulation of the sensor calibration problem. We present complete algorithms for the case of a camera and a laser range finder, for both discrete and differential motions.
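The discrete-motion case can be illustrated with a simplified planar example: two rigidly mounted sensors each estimate their own motion, and the coherent motion constraint A_i X = X B_i (where X is the unknown relative pose) becomes linear in the calibration unknowns. The planar restriction, the least-squares formulation, and the function names below are assumptions made for this sketch; the paper's algorithms for a camera and laser range finder are more general.

```python
import numpy as np

def rot2d(theta):
    """2D rotation matrix for angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def calibrate_planar(motions_a, motions_b):
    """Least-squares estimate of the planar relative pose (phi, p) between
    two rigidly mounted sensors, from matched per-sensor motion estimates.

    Each motion is (theta_i, t_i): the sensor's own rotation and translation.
    The coherent motion constraint A_i X = X B_i yields, per motion pair,
        (R_a - I) p - R(phi) t_b = -t_a,
    which is linear in p and in (cos phi, sin phi).
    """
    rows, rhs = [], []
    for (th_a, t_a), (_, t_b) in zip(motions_a, motions_b):
        Ra = rot2d(th_a)
        M = np.zeros((2, 4))            # unknowns: [p_x, p_y, cos phi, sin phi]
        M[:, :2] = Ra - np.eye(2)
        M[:, 2] = [-t_b[0], -t_b[1]]    # coefficients of cos(phi)
        M[:, 3] = [t_b[1], -t_b[0]]     # coefficients of sin(phi)
        rows.append(M)
        rhs.append(-np.asarray(t_a))
    A = np.vstack(rows)
    b = np.concatenate(rhs)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    p = x[:2]
    c, s = x[2:] / np.hypot(x[2], x[3])  # renormalize to a valid rotation
    return np.arctan2(s, c), p
```

At least two motion pairs with distinct rotations are needed for the system to be full rank; in practice many pairs are stacked, which is what gives the maximum-likelihood formulation its statistical strength when each sensor's motion estimate is noisy.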
Conference Committee Involvement (2)
Geospatial Informatics, Fusion, and Motion Video Analytics VI
19 April 2016 | Baltimore, MD, United States
Geospatial Informatics, Fusion, and Motion Video Analytics V