Diffstat (limited to 'nodes/ti_estop/README.md')
-rw-r--r--  nodes/ti_estop/README.md  115
1 file changed, 59 insertions, 56 deletions
diff --git a/nodes/ti_estop/README.md b/nodes/ti_estop/README.md
index 96f3589..14b016b 100644
--- a/nodes/ti_estop/README.md
+++ b/nodes/ti_estop/README.md
@@ -46,79 +46,82 @@ roslaunch ti_estop rviz_ogmap.launch
```

## Launch File Parameters
The `estop.launch` file specifies the following:

Parameter                 | Description                                                                   | Value
--------------------------|-------------------------------------------------------------------------------|-------------------
rosparam file             | Algorithm configuration parameters (see the "ROSPARAM Parameters" section below) | config/params.yaml
left_input_topic_name     | Left input topic name to read left images from a stereo camera               | camera/left/image_raw
right_input_topic_name    | Right input topic name to read right images from a stereo camera             | camera/right/image_raw
camera_info_topic         | Right camera_info topic name to read the camera parameters (width, height, distortion center and focal length) | camera/right/camera_info
semseg_cnn_out_image      | Publish topic name for the semantic segmentation output image                | semseg_cnn/out_image
semseg_cnn_tensor_topic   | Publish topic name for the semantic segmentation output tensor               | semseg_cnn/tensor
rectified_image_topic     | Publish topic name for the rectified right image                             | camera/right/image_rect_mono
bounding_box_topic        | Publish topic name for the 3D bounding box coordinates of detected obstacles | detection3D/3dBB
raw_disparity_topic_name  | Publish topic name for the raw disparity map                                 | camera/disparity/raw
ogmap_topic_name          | Publish topic name for the ego-centric occupancy grid map                    | detection3D/ogmap
estop_topic_name          | Publish topic name for the binary emergency stop flag, which is set when an obstacle is too close to the robot | detection3D/estop
ssmap_output_rgb          | Flag indicating whether the output semantic segmentation map is published in RGB format; if false, it is published in YUV420 | true, false

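As a usage sketch, the listener below subscribes to two of the node's outputs. The message types used here (`std_msgs/Bool` for the e-Stop flag and `nav_msgs/OccupancyGrid` for the occupancy grid) are assumptions made for illustration only; check the types the node actually advertises (e.g., with `rostopic info`) before relying on this.

```python
#!/usr/bin/env python
# Minimal listener for two ti_estop outputs (illustrative sketch).
# Assumed message types: std_msgs/Bool for the e-Stop flag and
# nav_msgs/OccupancyGrid for the grid map -- verify on a running system.
import rospy
from std_msgs.msg import Bool
from nav_msgs.msg import OccupancyGrid

def on_estop(msg):
    if msg.data:
        rospy.logwarn("e-Stop raised: obstacle inside the e-Stop area")

def on_ogmap(msg):
    rospy.loginfo("Occupancy grid received: %d x %d cells",
                  msg.info.width, msg.info.height)

if __name__ == "__main__":
    rospy.init_node("estop_listener")
    rospy.Subscriber("detection3D/estop", Bool, on_estop)
    rospy.Subscriber("detection3D/ogmap", OccupancyGrid, on_ogmap)
    rospy.spin()
```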

## ROSPARAM Parameters

### Basic input, LDC and SDE Parameters

Parameter                | Description                                                    | Value
-------------------------|----------------------------------------------------------------|----------
left_lut_file_path       | LDC rectification table path for left image                    | String
right_lut_file_path      | LDC rectification table path for right image                   | String
input_format             | Input image format, 0: U8, 1: YUV422                           | 0, 1
sde_algo_type            | SDE algorithm type, 0: single-layer SDE, 1: multi-layer SDE    | 0, 1
num_layers               | Number of layers in multi-layer SDE                            | 2, 3
sde_confidence_threshold | Disparity with confidence less than this value is invalidated  | 0 ~ 7
disparity_min            | Minimum disparity to search, 0: 0, 1: -3                       | 0, 1
disparity_max            | Maximum disparity to search, 0: 63, 1: 127, 2: 191             | 0 ~ 2

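The disparity search range is configured through enumerated codes rather than raw values. The small helper below (an illustration, not part of the package) makes the mapping from the table explicit:

```python
# Illustrative decoding of the enumerated SDE disparity parameters above.
DISPARITY_MIN = {0: 0, 1: -3}            # disparity_min code -> minimum disparity in pixels
DISPARITY_MAX = {0: 63, 1: 127, 2: 191}  # disparity_max code -> maximum disparity in pixels

def disparity_search_range(disparity_min_code, disparity_max_code):
    """Return the (min, max) disparity in pixels for the given codes."""
    return DISPARITY_MIN[disparity_min_code], DISPARITY_MAX[disparity_max_code]

print(disparity_search_range(1, 2))  # (-3, 191)
```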

### Camera Parameters

Parameter     | Description                   | Value
--------------|-------------------------------|----------
camera_height | Camera mounting height        | Float32
camera_pitch  | Camera pitch angle in radians | Float32

### Occupancy Grid Map Parameters

Parameter   | Description                                                         | Value
------------|---------------------------------------------------------------------|----------
grid_x_size | Horizontal width of a grid of an OG map in millimeters              | Integer
grid_y_size | Vertical length of a grid of an OG map in millimeters               | Integer
min_x_range | Minimum horizontal range in millimeters to be covered by an OG map  | Integer
max_x_range | Maximum horizontal range in millimeters to be covered by an OG map  | Integer
min_y_range | Minimum vertical range in millimeters to be covered by an OG map    | Integer
max_y_range | Maximum vertical range in millimeters to be covered by an OG map    | Integer

The number of grids in one row is defined by (max_x_range - min_x_range) / grid_x_size. Likewise, the number of grids in one column is defined by (max_y_range - min_y_range) / grid_y_size.

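For example, with hypothetical values of min_x_range = -10000, max_x_range = 10000, min_y_range = 0, max_y_range = 10000 and 50 mm grids, the map is 400 grids wide and 200 grids long:

```python
# Worked example of the grid-count formulas above (parameter values are hypothetical).
min_x_range, max_x_range, grid_x_size = -10000, 10000, 50  # millimeters
min_y_range, max_y_range, grid_y_size = 0, 10000, 50       # millimeters

grids_per_row    = (max_x_range - min_x_range) // grid_x_size  # 400
grids_per_column = (max_y_range - min_y_range) // grid_y_size  # 200
print(grids_per_row, grids_per_column)  # 400 200
```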

### Obstacle Detection Parameters

Parameter                     | Description                                                            | Value
------------------------------|------------------------------------------------------------------------|----------
min_pixel_count_grid          | Minimum number of pixels for a grid to be occupied                     | Integer
min_pixel_count_object        | Minimum number of pixels for connected grids to be an object           | Integer
max_object_to_detect          | Maximum number of objects to detect in a frame                         | Integer
num_neighbor_grid             | Number of neighboring grids to check for connected component analysis  | 8, 24
enable_spatial_obj_merge      | Flag to enable merging of spatially close objects                      | 0, 1
enable_temporal_obj_merge     | Flag to enable use of temporal information                             | 0, 1
enable_temporal_obj_smoothing | Flag to enable use of the corresponding object in the previous frame when computing an object's position | 0, 1
object_distance_mode          | Method to compute distance between objects (0: distance between centers, 1: distance between corners) | 0, 1

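As a sketch of what `num_neighbor_grid` means, the helper below generates the neighbor offsets checked during connected component analysis: 8 corresponds to the surrounding 3x3 window and 24 to the surrounding 5x5 window. This is an illustration of the setting, not the package's implementation.

```python
# Neighbor offsets for connected component analysis on the occupancy grid
# (illustrative): num_neighbor_grid = 8 -> 3x3 window, 24 -> 5x5 window.
def neighbor_offsets(num_neighbor_grid):
    radius = {8: 1, 24: 2}[num_neighbor_grid]
    return [(dx, dy)
            for dx in range(-radius, radius + 1)
            for dy in range(-radius, radius + 1)
            if (dx, dy) != (0, 0)]

assert len(neighbor_offsets(8)) == 8
assert len(neighbor_offsets(24)) == 24
```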

### e-Stop Parameters

Parameter          | Description                                                                                              | Value
-------------------|----------------------------------------------------------------------------------------------------------|----------
min_estop_distance | Minimum distance of the e-Stop area; should be 0                                                         | 0
max_estop_distance | Maximum distance of the e-Stop area in millimeters                                                       | Integer
min_estop_width    | Width of the e-Stop area in millimeters at min_estop_distance                                            | Integer
max_estop_width    | Width of the e-Stop area in millimeters at max_estop_distance                                            | Integer
min_free_frame_run | Minimum number of consecutive frames without any obstacle in the e-Stop area for it to be declared free  | Integer
min_obs_frame_run  | Minimum number of consecutive frames with an obstacle in the e-Stop area for it to be declared infringed | Integer

The e-Stop area forms a trapezoid defined by the first four parameters. When obstacles are detected in the e-Stop area, the `detection3D/estop` topic is set to `1` so that the robot can be forced to stop.

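The sketch below is one way to read these parameters together: a point-in-trapezoid test (assuming the area widens linearly from min_estop_width at min_estop_distance to max_estop_width at max_estop_distance) plus the frame-run debouncing of the flag. It is a simplified illustration with made-up example values, not the node's actual logic.

```python
# Simplified illustration of the e-Stop area and flag debouncing (not the node's code).
def in_estop_area(x_mm, y_mm,
                  min_estop_distance=0, max_estop_distance=2000,  # example values
                  min_estop_width=900, max_estop_width=1500):     # example values
    """x_mm: lateral offset from the robot center line, y_mm: distance ahead of the robot."""
    if not (min_estop_distance <= y_mm <= max_estop_distance):
        return False
    # Trapezoid: the area's width grows linearly with distance.
    t = (y_mm - min_estop_distance) / float(max_estop_distance - min_estop_distance)
    width = min_estop_width + t * (max_estop_width - min_estop_width)
    return abs(x_mm) <= width / 2.0

class EStopDebouncer:
    """Raise the flag after min_obs_frame_run obstacle frames; clear it after min_free_frame_run free frames."""
    def __init__(self, min_free_frame_run=5, min_obs_frame_run=2):  # example values
        self.min_free, self.min_obs = min_free_frame_run, min_obs_frame_run
        self.free_run = self.obs_run = 0
        self.estop = False

    def update(self, obstacle_in_area):
        if obstacle_in_area:
            self.obs_run, self.free_run = self.obs_run + 1, 0
            if self.obs_run >= self.min_obs:
                self.estop = True
        else:
            self.free_run, self.obs_run = self.free_run + 1, 0
            if self.free_run >= self.min_free:
                self.estop = False
        return self.estop
```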

## Camera Setup
@@ -127,4 +130,4 @@ e-Stop area forms a trapezoid defined by first four values. When obstacles are d
To create the LDC-format LUT for the ZED camera, please refer to [zed_capture/README.md](../../drivers/zed_capture/README.md).

### Camera Mounting
For accurate obstacle detection, it is important to mount the camera properly and to provide correct values of `camera_height` and `camera_pitch`. For example, an incorrect camera pitch angle results in 3D object boxes being overlaid in front of or behind obstacles in the images. It is recommended to install the stereo camera parallel to the ground plane or slightly tilted downward, e.g., between 0° and 10°. In general, the camera pitch angle should be close to 0 when the camera is mounted low, while it can be somewhat larger when the camera is mounted higher.

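Since `camera_pitch` is given in radians while the mounting guidance above is in degrees, a quick conversion for a camera tilted, say, 5° downward (a value chosen purely as an example within the recommended 0° to 10° range):

```python
import math

camera_pitch_deg = 5.0                         # example tilt within the 0-10 degree guidance
camera_pitch = math.radians(camera_pitch_deg)  # value for the camera_pitch parameter
                                               # (check the sign convention for downward tilt)
print(round(camera_pitch, 4))                  # 0.0873 rad
```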