Coordinate measuring systems based on the theory of binocular stereo vision are widely used in many areas, but their effective measuring ranges are usually no larger than ten meters. With the development of camera and computer technology, large-scale surveillance is increasingly used in both civil and military fields. Motivated by this requirement, this paper develops a new measuring model for a binocular stereo vision measuring system suited to outdoor surveillance, which obtains the 3D coordinates of a moving object. Even when the distance between the two cameras is hundreds of meters, installation and camera calibration remain simple and convenient: no expensive calibration apparatus, elaborate setup, planar pattern shown at several different orientations, or complicated camera imaging model is required, and the parameters of the mathematical model are easy to obtain. After the measuring system model is built, an error analysis is performed to show the influence of each parameter on the measuring error. Both computer simulation and real data have been used to verify the validity of the new, simple measuring system model.
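The geometry underlying such a binocular rig can be illustrated with the standard depth-from-disparity relation for a rectified stereo pair. The sketch below is not the paper's model; all parameter values (focal length, baseline, principal point) are illustrative assumptions.

```python
import numpy as np

def point_from_disparity(u, v, disparity_px, f_px, baseline_m, cx, cy):
    """Recover a 3D point (in the left-camera frame) from one pixel
    correspondence on a rectified stereo pair, using Z = f * B / d."""
    Z = f_px * baseline_m / disparity_px   # depth from disparity
    X = (u - cx) * Z / f_px                # back-project pixel to metric X
    Y = (v - cy) * Z / f_px                # back-project pixel to metric Y
    return np.array([X, Y, Z])

# Illustrative numbers: 1000 px focal length, 200 m baseline, 40 px disparity
p = point_from_disparity(u=720, v=540, disparity_px=40.0,
                         f_px=1000.0, baseline_m=200.0, cx=640.0, cy=512.0)
# depth Z = 1000 * 200 / 40 = 5000 m
```

The formula makes the paper's motivation concrete: with a baseline of hundreds of meters, useful depth resolution extends to kilometer ranges, far beyond the ten-meter limit of conventional rigs.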
Far-range photogrammetry is widely used to solve location determination problems in dangerous situations. In this paper we discuss a camera calibration method suitable for outdoor use. Location determination based on stereo vision sensors requires high-precision knowledge of the camera parameters, such as camera position, orientation, lens distortion, and focal length. Most existing calibration methods place many landmarks whose positions are known accurately, but over large distances and under other practical constraints such landmarks cannot be placed with high precision. This paper shows that even when the landmark positions are unknown, the extrinsic camera parameters can still be recovered from the essential matrix. The difference between the true and the computed camera parameters gives rise to a geometric error; we develop and present a theoretical analysis of this error and show how to obtain the extrinsic camera parameters with high precision in large-scale measurement. Experimental results from a project measuring the drop point of a high-speed object confirm that the proposed method achieves high precision compared with traditional methods.
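The epipolar constraint that makes landmark-free extrinsic calibration possible can be checked with synthetic numbers. The relative pose below is hypothetical, not taken from the paper; the sketch only verifies that the essential matrix E = [t]× R annihilates corresponding normalized image points.

```python
import numpy as np

def skew(v):
    """3x3 skew-symmetric matrix so that skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

# Hypothetical relative pose: camera-2 frame = R @ (camera-1 frame) + t
theta = np.deg2rad(5.0)
R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0, 1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t = np.array([150.0, 0.0, 2.0])   # baseline of ~150 m (assumed)

E = skew(t) @ R                   # essential matrix

# A world point seen by both cameras satisfies x2^T E x1 = 0
X1 = np.array([40.0, -8.0, 500.0])   # point in camera-1 coordinates
X2 = R @ X1 + t                      # same point in camera-2 coordinates
x1 = X1 / X1[2]                      # normalized image coordinates
x2 = X2 / X2[2]
residual = x2 @ E @ x1               # should vanish up to round-off
```

In practice the direction is reversed: E is estimated from point correspondences alone, then decomposed into R and (scale-ambiguous) t, which is why no precisely surveyed landmarks are needed.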
In position measurement by far-range photogrammetry, the scale between object and image has to be calibrated, that is, the parameters of the perspective projection matrix must be determined. Because the image sensor of the high-speed camera is a CMOS device, there are many hard-to-model distortion factors, and the object-to-image scale is difficult to describe with traditional calibration based on a mathematical model. In this paper, a new method for calibrating stereo vision systems with neural networks is described: a linear method is used for 3D position estimation, and its error is corrected by a neural network. Compared with DLT (Direct Linear Transformation) and direct mapping by neural networks, the accuracy is improved. We have successfully applied this method to measuring the drop point of a high-speed object.
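The linear estimation stage referred to here can be sketched as a DLT-style triangulation: each view contributes two homogeneous equations, and the 3D point is the SVD null vector of the stacked system. This is a minimal sketch with hypothetical normalized cameras; the neural-network error-correction stage of the paper is not reproduced.

```python
import numpy as np

def triangulate_dlt(P1, P2, uv1, uv2):
    """DLT-style linear triangulation: stack two homogeneous equations per
    view and take the SVD null vector as the homogeneous 3D point."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    Xh = Vt[-1]                 # right singular vector of smallest value
    return Xh[:3] / Xh[3]       # dehomogenize

# Hypothetical setup: identity intrinsics, 1 m baseline along x
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([2.0, 1.0, 10.0])
uv1 = X_true[:2] / X_true[2]                              # view 1 projection
uv2 = (X_true[:2] + np.array([-1.0, 0.0])) / X_true[2]    # view 2 projection

X_est = triangulate_dlt(P1, P2, uv1, uv2)
```

With noise-free projections the linear estimate is exact; the residual error that appears with real CMOS distortion is what the paper's neural network is trained to correct.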