Introduction to Intelligent Robot System Design: Application Development with ROS
✍ By Gang Peng, Tin Lun Lam, Chunxu Hu, Yu Yao, Jintao Liu, Fan Yang
- Publisher: Springer
- Year: 2023
- Language: English
- Pages: 580
- Category: Library
No payment or registration required. For personal study only.
✦ Synopsis
This book introduces readers to the principles and practical applications of intelligent robot systems built on the Robot Operating System (ROS), pursuing a task-oriented, hands-on approach. Treating the conception, design, implementation, and operation of a robot application system as a typical project, and following a “learning-by-doing, practicing-while-learning” approach, it walks readers step by step through ROS-based intelligent robot system design and development.
The topics covered include ROS principles, mobile robot control, LiDAR, simultaneous localization and mapping (SLAM), navigation, manipulator control, image recognition, vision calibration, object grasping, and visual SLAM. Typical practical application tasks run throughout the book and are essential to mastering development methods for intelligent robot systems.
Easy to follow and rich in content, the book can be used at colleges and universities as learning material and as a teaching reference for courses such as “intelligent robots,” “autonomous intelligent systems,” “robotics principles,” and “robot system application development with ROS” in automation, robotics engineering, artificial intelligence (AI), mechatronics, and other related majors. It helps readers master the development and design of robot systems and provides the theoretical and practical references needed to cultivate robot system development capabilities. It can also serve as teaching material for engineering training and competitions, or as a reference for self-study and training by engineering and technical personnel, teachers, and anyone who wants to engage in intelligent robot system development and design.
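As a small taste of the hands-on style described above, the kinematic odometry model for a differential-wheeled mobile robot (the subject of Sect. 6.3 in the table of contents) can be sketched in a few lines of plain Python. This is an illustrative sketch, not code from the book; the function name, the 0.3 m track width, and the wheel speeds are assumed values chosen for the example.

```python
import math

def diff_drive_odometry(x, y, theta, v_left, v_right, dt, track_width):
    """One Euler-integration step of differential-drive odometry.

    v_left, v_right: wheel linear velocities (m/s)
    track_width: distance between the two wheels (m)
    """
    v = (v_right + v_left) / 2.0              # forward velocity of chassis center
    omega = (v_right - v_left) / track_width  # yaw rate from wheel speed difference
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Driving straight: both wheels at 1 m/s for 100 steps of 10 ms
# advances the robot 1 m along x with no rotation.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = diff_drive_odometry(*pose, 1.0, 1.0, 0.01, 0.3)
```

In a real ROS system this update would consume encoder-derived wheel velocities and publish the result on the `/odom` topic; the sketch only shows the dead-reckoning arithmetic itself.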
✦ Table of Contents
The Book Structure
Preface
Acknowledgments
Contents
About the Authors
Chapter 1: Composition of Robot System
1.1 Mobile Chassis and Manipulator
1.1.1 Mobile Chassis
1.1.2 Manipulator
1.2 Hardware Components of Robot System
1.2.1 Computing Platform
1.2.2 Motion Controller
1.2.3 Driver and Actuator
1.2.4 Sensor
1.2.5 Display and Touch Panel
1.2.6 Speaker and Microphone
1.2.7 Power Supply System
1.3 Sensor Description and Function Introduction
1.3.1 Encoder
1.3.2 Inertial Measurement Unit
1.3.3 LiDAR
1.3.4 Camera
1.3.5 Infrared Sensor
1.3.6 Ultrasonic Sensor
1.3.7 Millimeter Wave Radar
1.3.8 Collision Sensor
1.3.9 Multi-Sensor Fusion
1.4 Software Composition of Robot System
1.4.1 Operating System
1.4.2 Application Software
1.5 Summary
Further Reading
Exercises
Chapter 2: Connecting the Robot to ROS
2.1 Getting Started with ROS
2.1.1 Origin of ROS
2.1.2 Architecture of ROS
2.1.3 Features of ROS
2.2 ROS Installation
2.2.1 Operating Systems and ROS Versions
2.2.2 Introduction to Linux Foundation
2.2.3 ROS Installation
2.2.4 Set the Environment Variables
2.2.5 Verify Installation
2.3 File System and Communication Mechanisms in ROS
2.3.1 File System
2.3.2 ROS Communication and Its Working Mechanism
2.3.2.1 ROS Computation Graph Level
2.3.2.2 ROS Communication Mechanism
2.4 Writing the First ROS Program
2.4.1 ROS Package Dependency Management
2.4.2 ROS Workspace
2.4.3 Package Creation and Compilation
2.4.3.1 Package Creation
2.4.3.2 Package.xml File
2.4.3.3 Write the CMakeLists.txt File
2.4.3.4 Compile and Build Packages
2.4.4 Rules for Writing ROS Nodes
2.4.5 Two Ways to Run Nodes
2.4.6 Launch File
2.4.6.1 Basic Element
2.4.6.2 Parameter Settings
2.4.6.3 Remapping Mechanism
2.4.6.4 Nested Reuse
2.4.7 Fundamentals of Coordinate Transform
2.4.7.1 Introduction to Coordinate Transform
2.4.7.2 TF Principle
2.4.7.3 TF Message
2.4.7.4 TF Interface in roscpp and rospy
2.4.7.5 Use of Related Command in TF Package
2.5 ROS Common Components
2.5.1 Visualization Tools
2.5.1.1 Rviz
2.5.1.2 Gazebo
2.5.2 Rosbag: Data Logging and Replay
2.5.2.1 Use rosbag to Record Data
2.5.2.2 Play Back Bag Files
2.5.2.3 Check the Topic and Message of the Bag
2.5.3 Debugging Toolkit of ROS
2.5.3.1 Rqt_console
2.5.3.2 Rqt_graph
2.5.3.3 Rqt_plot
2.5.3.4 Rqt_reconfigure
2.5.3.5 Roswtf
2.6 Motion Control of Spark Chassis
2.7 Introduction to ROS External Devices
2.7.1 Remote Control Handle
2.7.1.1 Configuration and Use
2.7.1.2 Sending a Remote Handle Action Message
2.7.1.3 Controlling the turtlesim with the Remote Control Handle
2.7.2 LiDAR
2.7.2.1 Configuration and Use
2.7.2.2 LiDAR Releases Data in ROS
2.7.3 Vision Sensor
2.7.4 Inertial Measurement Unit and Positioning Module
2.7.4.1 Inertial Measurement Unit
2.7.4.2 Positioning Module
2.7.5 Servo Motor
2.7.6 Embedded Controller
2.7.6.1 Basic Configuration
2.7.6.2 Usage Example
2.8 Summary
Further Reading
Description of a Position
Description of an Orientation
Description of a Frame
Transformation from One Frame to Another
Exercises
Chapter 3: Robot System Modeling
3.1 Mobile Chassis Motion Model and Control
3.1.1 Mobile Robot Motion Model and Position Representation
3.1.2 URDF Modeling
3.1.2.1 Gazebo Simulation
3.1.2.2 URDF Model
3.1.3 Robot Status Publishing
3.1.4 Mobile Chassis Motion Control
3.2 LiDAR-Based Environment Perception
3.2.1 Rplidar Package
3.2.2 Introduction to hector_mapping
3.2.3 Usage of hector_mapping
3.3 Summary
Further Reading
Exercises
Reference
Chapter 4: LiDAR SLAM for Mobile Robot
4.1 Foundation of SLAM
4.1.1 SLAM Overview
4.1.2 Mobile Robot Coordinate System
4.1.3 ROS Navigation Framework
4.1.4 Environment Mapping and Pose Estimation
4.2 Gmapping Algorithm
4.2.1 Principle Analysis
4.2.2 Implementation Process
4.3 Hector SLAM Algorithm
4.3.1 Principle Analysis
4.3.2 Mapping Results
4.4 Summary
Further Reading
Exercises
References
Chapter 5: Autonomous Navigation for Mobile Robot
5.1 Map-Based Localization
5.1.1 Monte Carlo Localization
5.1.2 Adaptive Monte Carlo Localization
5.1.2.1 Topics and Services of the AMCL Package
5.1.2.2 Parameters of AMCL Package
5.2 Map-Based Autonomous Navigation
5.2.1 Navigation Framework
5.2.2 Global Path Planning
5.2.2.1 Dijkstra’s Algorithm
5.2.2.2 A* Algorithm
5.2.3 Local Path Planning
5.2.3.1 Artificial Potential Field Method
5.2.3.2 Dynamic Window Approach
5.2.3.3 Timed Elastic Band Method
5.2.4 Navigation Package
5.2.4.1 Move_base Package Interface
5.2.4.2 Costmap Configuration
5.2.4.3 Local Planner Configuration
5.3 Summary
Further Reading
Exercises
References
Chapter 6: SLAM Based on Multi-Sensor
6.1 Inertial Measurement Unit and Calibration
6.1.1 IMU Measurement Model
6.1.1.1 Accelerometer Measurement Model
6.1.1.2 Gyroscope Measurement Model
6.1.2 Pre-Calibration of the System Error
6.1.3 Calibration of Random Error
6.2 LiDAR and IMU Extrinsic Parameter Calibration
6.3 Kinematic Odometer Model for Differential Wheeled Mobile Robot
6.4 Kalman Filter-Based Multi-sensor Fusion
6.5 Cartographer Algorithm
6.5.1 Principle Analysis
6.5.2 Mapping Results
6.6 Summary
Further Reading
Exercises
References
Chapter 7: Manipulator Motion Control
7.1 Manipulator Modeling
7.1.1 Common Manipulators in ROS
7.1.1.1 Universal Robots
7.1.1.2 Franka Panda
7.1.2 URDF Manipulator Model
7.1.3 URDF Modeling
7.2 Manipulator Control: MoveIt
7.2.1 Introduction to MoveIt
7.2.2 Setup Assistant to Configure Manipulator
7.2.3 MoveIt Visualization Control
7.2.3.1 Drag Planning
7.2.3.2 Random Planning
7.2.4 Manipulator Kinematics
7.2.4.1 D-H Modeling of the Manipulator
7.2.4.2 Forward Kinematics
7.2.4.3 Inverse Kinematics
7.3 MoveIt Programming: Manipulator Motion Planning
7.3.1 Motion Planning in Joint Space
7.3.2 Motion Planning in the Workspace
7.3.3 Motion Planning in Cartesian Space
7.3.4 Manipulator Collision Detection
7.4 Summary
Further Reading
Exercises
References
Chapter 8: Computer Vision
8.1 Getting to Know OpenCV
8.1.1 Installing OpenCV
8.1.2 Using OpenCV
8.2 Use of Monocular Vision Sensors
8.3 Camera Calibration
8.3.1 Pinhole Camera Model
8.3.2 Distortion
8.3.3 Camera Calibration Principle and Procedure
8.3.4 Camera Calibration Kit
8.4 Image Transformation and Processing
8.4.1 Perspective Transformation
8.4.2 Image Matching
8.4.3 Image Stitching
8.5 Image Feature Detection Algorithms
8.5.1 SIFT Algorithm
8.5.2 SURF Algorithm
8.5.3 FAST Algorithm
8.5.4 ORB Algorithm
8.6 Object Recognition
8.7 Summary
Further Reading
Exercises
References
Chapter 9: Vision-Based Manipulator Grasping
9.1 Depth Camera
9.1.1 Binocular and RGB-D Cameras
9.1.1.1 Binocular Camera
9.1.1.2 RGB-D Depth Camera
9.1.2 Binocular Camera Model and RGB-D Camera Model
9.1.2.1 Binocular Camera Model
9.1.2.2 RGB-D Camera Model
9.2 Deep Learning-Based Object Recognition
9.2.1 Convolutional Neural Network-Based Object Recognition
9.2.1.1 Mask R-CNN
9.2.1.2 SSD
9.2.1.3 YOLO
9.2.2 Common Deep Learning Frameworks
9.2.2.1 TensorFlow Framework
9.2.2.2 PyTorch Framework
9.3 Hand-Eye Calibration Principle and Procedure
9.4 Vision-Based Robot Grasping
9.4.1 Object Recognition
9.4.2 Object Grasping Pose Estimation
9.4.3 Grasping Pose Detection
9.4.4 Manipulator Grasping Motion Planning
9.5 Summary
Further Reading
Exercises
References
Chapter 10: Visual SLAM for Mobile Robot
10.1 Visual SLAM Framework
10.1.1 Visual Odometry
10.1.2 Back End Optimization
10.1.3 Loop Closure Detection
10.1.4 Mapping
10.1.4.1 Metric Map
10.1.4.2 Topological Map
10.1.4.3 Semantic Map
10.2 Introduction to the ORB-SLAM Algorithm
10.2.1 Tracking Thread
10.2.2 Local Mapping Thread
10.2.3 Loop Closing Thread
10.3 Dense Mapping
10.3.1 Representation of the Space Map
10.3.2 Binocular Camera Geometry Model and Calibration
10.3.2.1 2D-2D Mapping: Epipolar Geometry
10.3.2.2 Binocular Camera Extrinsic Parameter Calibration
10.3.2.3 3D-2D Mapping: PnP
10.3.3 Dense Mapping
10.4 Introduction to Other Visual SLAM Algorithms
10.4.1 LSD-SLAM
10.4.2 SVO
10.4.3 OpenVSLAM
10.4.4 VINS-Fusion
10.5 Summary
Further Reading
Exercises
References
Chapter 11: Introduction to ROS 2 and Programming Foundation
11.1 ROS 2 Design Concept
11.1.1 Problems with ROS 1
11.1.2 ROS 2 Development Status
11.1.3 ROS 2 Design Framework
11.1.3.1 Communication Model for ROS 2
11.2 ROS 2 Installation and Use
11.2.1 ROS 2 Installation
11.2.2 Running the Turtlesim
11.2.3 ROS 2 Command Line
11.3 ROS 2 Programming Foundation
11.3.1 ROS 2 Programming Method
11.3.1.1 Create an ROS 2 Workspace
11.3.1.2 Create ROS 2 Packages
11.3.1.3 Create a Simple Subscriber and Publisher (C++)
11.3.2 Differences Between ROS 2 and ROS 1 Programming
11.4 Summary
Further Reading
Exercises
Appendix A: Abbreviations
Appendix B: Index of Tasks in Each Chapter
📜 SIMILAR VOLUMES
Want to develop novel robot applications, but don’t know how to write a mapping or object-recognition system? You’re not alone, but you’re certainly not without help. By combining real-world examples with valuable knowledge from the Robot Operating System (ROS) community, this practical book provides…
Since the late 1960s, there has been a revolution in robots and industrial automation, from the design of robots with no computing or sensory capabilities (first-generation), to the design of robots with limited computational power and feedback capabilities (second-generation), and the design of…
A Concise Introduction to Robot Programming with ROS2 provides the reader with the concepts and tools necessary to bring a robot to life through programming. It will equip the reader with the skills necessary to undertake projects with ROS2, the new version of ROS. It is not necessary to have previous…