Commit bc458ae

[skip ci] publish latest
Signed-off-by: fmrico <fmrico@gmail.com>
1 parent 729235d commit bc458ae

18 files changed

Lines changed: 2471 additions & 27 deletions


_sources/index.rst.txt

Lines changed: 1 addition & 0 deletions
@@ -44,4 +44,5 @@ If you use this software in your work, please consider citing our next paper.
    getting_started/index.rst
    design/index.rst
    software/index.rst
+   tutorials/index.rst
    about/index.rst
Lines changed: 137 additions & 0 deletions
@@ -0,0 +1,137 @@
.. _blackboard:

How to use the NavState BlackBoard
##################################

The `NavState` class is a central component in the EasyNav architecture. It implements a **shared blackboard**, which allows all modules (localizers, planners, controllers, etc.) to **read and write shared data without using ROS 2 communication mechanisms** internally. This design improves **determinism**, simplifies debugging, and avoids unnecessary overhead.

Overview
--------

All information needed or produced by EasyNav modules flows through the `NavState`. Examples include:

- the robot's estimated pose (`robot_pose`),
- a list of navigation goals (`goals`),
- a path to follow (`path`),
- environment maps (e.g., `map`, `map.dynamic`, `map.static`),
- control commands (`cmd_vel`),
- perception data (`points`, `image`, etc.).

The `NavState` provides a simple API to check for, retrieve, and update entries using keys.
Basic API
---------

To **check whether a value is available**, use:

.. code-block:: cpp

   if (nav_state.has("robot_pose")) {
     // do something
   }

To **read a value**, use the templated `get` method:

.. code-block:: cpp

   const auto odom = nav_state.get<nav_msgs::msg::Odometry>("robot_pose");

To **write a value**, use:

.. code-block:: cpp

   nav_state.set("cmd_vel", computed_twist);

Values are stored under a string key and must be copyable. You can store standard ROS messages, custom types, or even nested structures. Keys can use dot notation (e.g., `"map.static"`, `"map.dynamic"`).
Examples from Plugins
---------------------

**Planner Example**

A planner typically requires the robot's pose, a goal, and a map. It writes back a path:

.. code-block:: cpp

   if (!nav_state.has("robot_pose") || !nav_state.has("goals") || !nav_state.has("map")) return;

   auto pose = nav_state.get<nav_msgs::msg::Odometry>("robot_pose");
   auto goal = nav_state.get<nav_msgs::msg::Goals>("goals").goals.front().pose;
   auto map = nav_state.get<grid_map::GridMap>("map");

   auto path = compute_path(map, pose.pose.pose, goal);

   nav_state.set("path", path);
**Controller Example**

A controller reads the current pose and planned path, and outputs a velocity command:

.. code-block:: cpp

   if (!nav_state.has("path") || !nav_state.has("robot_pose")) return;

   const auto path = nav_state.get<nav_msgs::msg::Path>("path");
   const auto pose = nav_state.get<nav_msgs::msg::Odometry>("robot_pose").pose.pose;

   geometry_msgs::msg::TwistStamped cmd_vel = compute_control(path, pose);

   nav_state.set("cmd_vel", cmd_vel);
**Localization Example**

A localizer typically updates the estimated pose:

.. code-block:: cpp

   nav_state.set("robot_pose", latest_odom);
Advanced Features
-----------------

**Namespacing**

You can organize data hierarchically by using dots in keys:

- `"map.static"` vs `"map.dynamic"`
- `"perception.lidar"` vs `"perception.camera"`

**Debugging**

You can inspect the contents of the blackboard as a string:

.. code-block:: cpp

   std::cout << nav_state.debug_string();

**Custom Printers**

You can register pretty-printers for your own types using:

.. code-block:: cpp

   NavState::register_printer<MyType>(
     [](const MyType & value) {
       std::ostringstream out;
       out << "MyType: " << value.to_string();
       return out.str();
     });

This improves the output of `debug_string()` for non-standard types.
Best Practices
--------------

- Always check `has(key)` before `get<T>(key)` to avoid exceptions.
- Use descriptive keys, preferably documented.
- When designing a plugin, clearly define which `NavState` keys it consumes and which ones it produces.
- Avoid storing large or non-copyable data unless necessary.

Conclusion
----------

The `NavState` blackboard is a powerful and lightweight alternative to traditional ROS 2 message passing. It simplifies integration between modules and improves runtime performance by avoiding queues and callback latencies.

It is the recommended mechanism for **all internal data exchange** in EasyNav.

_sources/software/index.rst.txt

Lines changed: 20 additions & 0 deletions
@@ -5,6 +5,26 @@ EasyNav Software

 The EasyNav software is developed under the `EasyNavigation GitHub organization <https://github.com/EasyNavigation>`_, and it follows a **highly modular structure**. This modularity is reflected in the separation of functionality across multiple repositories, each dedicated to a specific role or typology:

+C++ API
+*******
+
+- `EasyNav Core <https://easynavigation.github.io/EasyNavigation/>`_
+- `EasyNav Simple Stack <https://easynavigation.github.io/easynav_simple_stack/>`_
+- `EasyNav GridMap Stack <https://easynavigation.github.io/easynav_gridmap_stack/>`_
+
+**HowTo's**:
+
+.. toctree::
+   :maxdepth: 1
+
+   ./blackboard.rst
+   ./perceptions.rst
+
+Repositories
+************
+
 Core
 ----

Lines changed: 158 additions & 0 deletions
@@ -0,0 +1,158 @@
.. _perceptions:

Sensor Input and Perception Handling
####################################

In EasyNav, **all sensor input flows through a single component**: the ``SensorsNode``. This node is responsible for:

- subscribing to sensor topics defined in the parameter file,
- converting sensor data into unified internal formats (e.g., point clouds, images),
- organizing and storing perceptions into groups (e.g., `"points"`, `"image"`),
- publishing a fused view (optional),
- writing the current valid perceptions into the `NavState` under their corresponding group key.
Sensor Configuration
--------------------

Sensors are defined via ROS 2 parameters under the `sensors_node` configuration. For example:

.. code-block:: yaml

   sensors_node:
     ros__parameters:
       use_sim_time: true
       forget_time: 0.5
       sensors: [laser1, camera1]
       perception_default_frame: odom
       laser1:
         topic: /scan_raw
         type: sensor_msgs/msg/LaserScan
         group: points
       camera1:
         topic: /rgbd_camera/points
         type: sensor_msgs/msg/PointCloud2
         group: points
All distance sensors (e.g., `LaserScan`, `PointCloud2`) must be grouped under the `"points"` group. These are converted internally into **`PointPerception`** instances and aggregated into a single structure called **`PointPerceptions`**, which is written into the `NavState` under the key `"points"`.

If image data is included:

.. code-block:: yaml

   image1:
     topic: /rgbd_camera/image_raw
     type: sensor_msgs/msg/Image
     group: image

Then the group `"image"` will appear in the `NavState`, using the `ImagePerception` type.
Processing Point Perceptions
----------------------------

To work with fused or filtered 3D points, EasyNav provides the utility class **`PointPerceptionsOpsView`**.

You typically retrieve the `PointPerceptions` from the `NavState`:

.. code-block:: cpp

   if (!nav_state.has("points")) return;
   const auto perceptions = nav_state.get<PointPerceptions>("points");

Then create an operations view:

.. code-block:: cpp

   PointPerceptionsOpsView view(perceptions);

The view provides a fluent interface to manipulate the point cloud. There are two kinds of operations:

- **Lightweight operations**: these return a reference (`PointPerceptionsOpsView &`) and operate on views without copying data.
- **Heavyweight operations**: these return a `std::shared_ptr<PointPerceptionsOpsView>` and typically involve transformations or filtering that produce new point sets.

Common operations include:

.. code-block:: cpp

   auto downsampled = PointPerceptionsOpsView(perceptions)
     .downsample(0.2);  // reduce density (heavy)

   auto fused = downsampled
     ->fuse("base_link");  // transform all points to base_link (heavy)

   auto filtered = fused
     ->filter({-1.0, -1.0, 0.0}, {1.0, 1.0, 2.0});  // spatial crop (heavy)

   auto points = filtered->as_points();  // retrieve std::vector<Point3D>
88+
Operation Summary
89+
-----------------
90+
91+
+--------------------+----------------------------+------------------------------------------+
92+
| Operation | Return Type | Description |
93+
+====================+============================+==========================================+
94+
| `filter(...)` | `std::shared_ptr<...>` | Filters points inside a bounding box |
95+
+--------------------+----------------------------+------------------------------------------+
96+
| `downsample(res)` | `std::shared_ptr<...>` | Voxel downsampling |
97+
+--------------------+----------------------------+------------------------------------------+
98+
| `fuse(frame)` | `std::shared_ptr<...>` | Transforms all perceptions to a frame |
99+
+--------------------+----------------------------+------------------------------------------+
100+
| `collapse()` | `std::shared_ptr<...>` | Merge similar points into one |
101+
+--------------------+----------------------------+------------------------------------------+
102+
| `as_points()` | `std::vector<Point3D>` | Exports data as raw 3D point list |
103+
+--------------------+----------------------------+------------------------------------------+
104+
105+
Example: Updating a Map
-----------------------

Many components use fused and filtered points to update occupancy or elevation maps. Note that every call after the first heavyweight operation chains with `->`, since those operations return a `std::shared_ptr`:

.. code-block:: cpp

   const auto perceptions = nav_state.get<PointPerceptions>("points");

   auto fused = PointPerceptionsOpsView(perceptions)
     .downsample(dynamic_map_.resolution())      // reduce point density
     ->fuse("map")                               // transform to map frame
     ->filter({NAN, NAN, 0.1}, {NAN, NAN, NAN})  // ignore ground clutter
     ->as_points();

   for (const auto & p : fused) {
     if (dynamic_map_.check_bounds_metric(p.x, p.y)) {
       auto [cx, cy] = dynamic_map_.metric_to_cell(p.x, p.y);
       dynamic_map_.at(cx, cy) = 1;
     }
   }
Fused Visualization
-------------------

If the `SensorsNode` has subscribers on its output topic, it will publish the fused perception result after processing:

.. code-block:: cpp

   if (percept_pub_->get_subscription_count() > 0) {
     auto fused = PointPerceptionsOpsView(perceptions)
       .fuse(perception_default_frame_);

     auto fused_points = fused->as_points();
     auto msg = points_to_rosmsg(fused_points);

     msg.header.frame_id = perception_default_frame_;
     msg.header.stamp = fused->get_perceptions()[0]->stamp;
     percept_pub_->publish(msg);
   }
Extending to Other Modalities
-----------------------------

In addition to `"points"` and `"image"`, developers can add new groups and corresponding `PerceptionBase`-derived classes. All perceptions:

- inherit from `PerceptionBase`,
- have a `stamp`, `frame_id`, and `valid` flag,
- are grouped by semantic label (e.g., `"points"`),
- are automatically managed by the `SensorsNode`.
----

This unified and extensible perception handling design allows plugins to focus on **what** data they need, not **how** it was acquired, filtered, or transformed.
