Visual Servoing: Real-time Control Of Robot Manipulators Based On Visual Sensory Feedback
Edited by Koichi Hashimoto and Tom Husband
- Publisher
- World Scientific Publishing Co Pte Ltd
- Year
- 1993
- Language
- English
- Pages
- 373
- Series
- World Scientific Series In Robotics And Intelligent Systems; 7
- Edition
- Illustrated
- Category
- Library
Synopsis
This book treats visual feedback control of mechanical systems, mostly robot manipulators. It not only deals with image processing techniques and robot control schemes but also covers the latest investigations into the design of visual servo mechanisms based on modern linear and nonlinear control theory, adaptive control, fuzzy logic, and neural networks. New concepts for utilizing visual sensory information for real-time manipulator control are derived, and their performance is evaluated through simulations and/or experiments. The contributors to this book are robotics specialists from all over the world. The book gives a practical perspective on visual servoing to researchers, engineers, and students working in this area.
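The classical image-based scheme surveyed in the book can be illustrated with a short sketch. This is an illustrative reconstruction of the standard point-feature interaction matrix and the control law v = -λ L⁺ (s - s*), not code from the book; the function names, gain value, and depth handling are my own assumptions.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction (image Jacobian) matrix for one point feature at
    normalized image coordinates (x, y) and depth Z. It relates the
    feature's image velocity to the camera's 6-DOF spatial velocity
    [vx, vy, vz, wx, wy, wz]."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x ** 2), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y ** 2, -x * y, -x],
    ])

def ibvs_velocity(features, desired, depths, gain=0.5):
    """One step of image-based visual servoing: stack the per-feature
    interaction matrices and compute v = -gain * pinv(L) @ (s - s*)."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    error = (np.asarray(features) - np.asarray(desired)).ravel()
    return -gain * np.linalg.pinv(L) @ error
```

With four point features the stacked matrix is 8x6, so the pseudo-inverse yields a unique least-squares camera velocity; when the current and desired features coincide, the commanded velocity is zero.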
Table of Contents
Contents
Preface
VISUAL CONTROL OF ROBOT MANIPULATORS - A REVIEW
Abstract
1 Introduction
2 Concepts of visual control
2.1 Definitions
2.2 Position versus image based servoing
3 Summary
4 Position-based visual servoing
4.1 Photogrammetric techniques
4.2 Stereo vision
4.3 Depth from motion
4.4 Depth from dynamics
5 Image based servoing
5.1 Approaches to image-based visual servoing
6 Implementational issues
6.1 Video standards
6.2 Cameras
6.3 Lenses
6.3.1 Camera location
6.4 Image processing
6.5 Feature extraction
6.6 Robot control/communications
7 Conclusion
8 References
Hand-Eye Coordination for Robotic Tracking and Grasping
Abstract
1 INTRODUCTION
2 PREVIOUS WORK
3 VISION SYSTEM
3.1 COMPUTING NORMAL OPTIC-FLOW IN REAL TIME
4 ROBOTIC ARM CONTROL
4.1 The Model of the Motion
4.2 Estimating Arc Length S0 and Bending Parameter ΗΎ0
4.3 Smoothing of the Control Inputs
4.4 Prediction and Synchronization
5 MOTOR COORDINATION FOR GRASPING
6 EXPERIMENTAL RESULTS
7 SUMMARY AND FUTURE WORK
8 ACKNOWLEDGEMENTS
A Trajectory Curvature
B Velocity Expectation and Variance
C Verification of formulas 16, 17 and 18
D The linear filtering option
References
VISUAL SERVO CONTROL OF ROBOTS USING KALMAN FILTER ESTIMATES OF ROBOT POSE RELATIVE TO WORK-PIECES
ABSTRACT
1 Introduction
2 3D POSE Estimation Using the Extended Kalman Filter
2.1 Geometric Relationships
2.2 Kalman Filter Algorithm
3 End-Point Trajectory Control Design
3.1 General Control Strategy
3.2 The 2D Planar Motion Case
4 Experimental System and Implementation Considerations
4.1 Experimental System
4.2 Transputer Network Software Design
4.3 The Vision System and Image Pre-Processor
5 Experimental and Simulation Results
5.1 Relative Pose Estimation Experimental Results
5.2 Experimental Tracking Results for 2D Plane Motion
5.3 Experimental Results for 3D Tracking
6 Summary
7 References
FEATURE-BASED VISUAL SERVOING OF ROBOTIC SYSTEMS
ABSTRACT
1. Introduction
2. Resolved Motion Rate Visual Feedback Control Structure
3. Transformations From Feature Space To Joint Space
4. Differential Relationship between Part's Pose and Image Feature Points
5. Feature Selection for Control
6. Feature-Based Trajectory Generation
7. Simulation and Experimental Results
8. Summary
9. References
VISUAL SERVOING FOR ROBOTIC ASSEMBLY
ABSTRACT
1. Introduction
2. Static Camera Servoing in 3-D
2.1 Previous Work
2.2 Modeling and Control
2.3 Feature Tracking
2.4 Implementation
2.5 Experimental Results
3. Dynamic Sensor Placement
3.1 Introduction
3.2 Previous Work
3.3 Sensor Placement/Visual Tracking Hybrid Control
3.4 Visual Tracking Model and Control
3.5 Depth-of-Field, Field-of-View, and Spatial Resolution Constraints
3.6 Singularity Avoidance
3.7 Experimental Results
4. Automatic Calibration of the Camera-Object Transformation
4.1 Introduction
4.2 Modeling and Control
4.3 Recursive Least-Squares Formulation
4.4 Experimental Results
5. Summary
6. Acknowledgments
7. References
LQ OPTIMAL AND NONLINEAR APPROACHES TO VISUAL SERVOING
ABSTRACT
1. Introduction
2. Model of Robot and Camera
2.1. Model of the Robot
2.2. Model of the Camera
2.3. Robot and Image Jacobians
3. LTI Model and Optimal Control Law
4. Model-Based Visual Servoing
5. Example, Simulation and Experiment
5.1. Example of Two Link Robot
5.2. 6 Degrees of Freedom LQ Control with PUMA 560
5.2.1. Simulations
5.2.2. Experiments
5.3. Nonlinear Model-based Control with Two Link DD Arm
5.3.1. Set up for Simulation and Experiments
5.3.2. Multi Sampling-Rate and Prediction Scheme
5.3.3. Feature-Based Nonlinear Controller
6. Conclusions
References
CLASSIFICATION AND REALIZATION OF THE DIFFERENT VISION-BASED TASKS
ABSTRACT
1. Introduction
2. Modeling Interactions with the Environment
2.1 General Framework
2.2 Case of an Image Sensor
2.2.1. Points
2.2.2. Segments
2.2.3. Straight lines
2.2.4. Circles
2.2.5. Spheres
2.2.6. Cylinders
3. Vision-based Tasks
3.1 The Notion of Virtual Linkage
3.2 Classification of the vision-based tasks
3.2.1. Rigid Linkage
3.2.2. Prismatic Linkage
3.2.3. Revolute Linkage
3.2.4. Prismatic/Revolute Linkage
3.2.5. Plane/plane contact
3.2.6. Ball-and-socket linkage
3.2.7. Other linkages
3.3 Performance and Robustness of the Vision Based Tasks
4. Expression of the Task Function and Control
5. Results
5.1. Positioning with Respect to a Cylinder
5.2. Positioning with Respect to a Sphere
5.3. Vision Based Control Applied to Mobile Robot
References
A Dynamical Sensor for Robot Juggling
1 Introduction
2 System Overview
2.1 The Geometry and Dynamics of Juggling
2.1.1 Trajectory Generation via Ball Dynamics
2.1.2 Geometric Programming I: The Juggling Algorithm
2.1.3 Geometric Programming II: The Two-Juggle
2.2 The Geometry and Dynamics of a Motion Sensor
2.2.1 Previous Experience with a Planar Juggler
2.2.2 Triangulation Model
2.2.3 Sensory Management
2.2.4 Signal Processing
2.2.5 Real-Time Constraints
3 Sensing Issues Arising from Actuator Constraints
3.1 The Constraints
3.2 Modifications to the Sensing System
3.2.1 Occlusion Detection
3.2.2 Observer Based Window Placement
3.2.3 Impact Detection and Estimation
3.2.4 Window Size Adjustment
3.2.5 Window Overlap/Prioritization
3.3 Effect of the Modifications
3.3.1 Recovery from Out-of-Frame
3.3.2 Recovery from Ball-Ball Occlusions
4 Toward the Control of Attention
4.1 The Window Management Variables as a "State of Attention"
4.2 Observer Errors from a Noisy Model
4.3 Certainty Estimates from a Parallel Observer
4.4 Window Radius Dynamics for Bounded Estimator Errors
4.5 Boundedness of the State of Attention
5 Conclusion
References
VIDEO-RATE ROBOT VISUAL SERVOING
Abstract
1 Introduction
2 Architecture for high-bandwidth control
2.1 Camera and lens
2.2 Image preprocessing and segmentation
2.3 Feature extraction
2.4 Robot control
2.5 Software
3 Image-based visual servoing
3.1 Planar positioning
3.2 3D position and orientation
4 System modeling
4.1 Experimental setup
4.2 Time delay
4.3 Perspective gain
4.4 Mechanical resonance
5 Conclusion
6 Acknowledgements
7 References
Visual Servoing of Robot Manipulators by Fuzzy Membership Function Based Neural Networks
ABSTRACT
1. Introduction
2. A Nonlinear Transformation from Differential Image Space to Differential Cartesian Space
3. Nonlinear Function Approximation by the Fuzzy Membership Function Based Neural Networks
4. Design of the Visual Servo Using the FMF Networks
5. Simulation Results
6. Conclusions
REFERENCES
Characterization and Use of Feature-Jacobian Matrix for Visual Servoing
Abstract
1. INTRODUCTION
2. DEFINITION OF FEATURE AND ROBOTIC TASK DESCRIBED IN THE FEATURE SPACE
2.1 Feature and Feature Space
2.2 Robotic Tasks in the Feature Space
3. CONTROL OF ROBOT MOTION BASED ON FEATURES
3.1 Differential Relationship between Robot Motion and Feature Vector
3.2 Estimating the Feature Jacobian Matrix
3.2.1 Quadruple Camera Set Method
3.2.2 One Camera Method
3.3 Application Examples
4. FUZZY SELF-ORGANIZING VISUAL TRACKING CONTROLLER
4.1 Horizontal Tracking
4.2 Vertical Tracking
4.3 Fuzzy Threshold
4.4 Experiments
5. DISCUSSION AND CONCLUDING REMARKS
REFERENCES
SIMILAR VOLUMES
This book raises the question of what visuality really is and how it is possible to explain it. Virtual reality is connected to our current environment with multiple ties. It affects the everyday operation of the media and hence all of our lives. The authors connect the concepts of pictorial turn an
"This book describes visual perception and control methods for robotic systems that need to interact with the environment. Multiple view geometry is utilized to extract low-dimensional geometric information from abundant and high-dimensional image information, making it convenient to develop general
Multi-view Geometry Based Visual Perception and Control of Robotic Systems describes visual perception and control methods for robotic systems that need to interact with the environment. Multiple view geometry is utilized to extract low-dimensional geometric information from abundant and high-dimens
<p>This is the fourth book from the series "Scientific Fundamentals of Robotics". The first two volumes have established a background for studying the dynamics and control of robots. While the first book was exclusively devoted to the dynamics of active spatial mechanisms, the second treated the
<p>The development of humanoid robots is one of the most challenging research fields within robotics. One of the crucial capabilities of such a humanoid is the ability to visually perceive its environment. The present monograph deals with visual perception for the intended applications manipul