# 14<sup>th</sup> Workshop on Planning and Robotics (PlanRob)
Co-located workshop at ICAPS'26 \
Dublin, Ireland \
June 28, 2026
## Aim and Scope of the Workshop
AI Planning & Scheduling (P&S) methods are crucial to enabling intelligent robots to perform autonomous, flexible, and interactive behaviors, but they must be tightly integrated into the overall robot architecture to be effective. This requires strong collaboration between researchers from the AI and Robotics communities. To foster this, the workshop aims to provide a stable, long-term forum where researchers from both the P&S and Robotics communities can openly discuss relevant issues such as research and development progress, future directions, and open challenges related to applying P&S to Robotics.

Recent advances in large-scale learning models, multimodal perception, and whole-body robotic systems are reshaping the landscape of planning and execution. The 2026 edition of PlanRob explicitly aims to address these emerging challenges, with a focus on integrating symbolic, geometric, and learning-based approaches for robust, scalable, and adaptive robot autonomy.
## Topics of Interest
Topics of interest include but are not limited to:
* Planning representations and models for robotics, including domain modeling, abstraction, and formal representations
* Robot planning at multiple levels, including mission, task, path, motion, and integrated task-and-motion planning
* Learning-augmented planning for robotics, including:
  - learning-based methods for planning and control
  - generative models for planning
  - continual learning and planning architectures and algorithms
* Foundation-model-based approaches for robot planning, including:
  - LLM-, VLM-, and action-model-based methods for task as well as task-and-motion planning
  - perception-grounded planning using vision-language representations
* Planning, execution, and control integration, including robot architectures supporting tight coupling between deliberation and execution
* Planning for complex robotic systems, including:
  - high-dimensional and whole-body robotic planning and execution
  - planning under real-world sensing, actuation, and computational constraints
* Multi-agent and interactive planning, including:
  - coordination and cooperation in multi-robot systems
  - human-aware planning and execution for human–robot interaction
  - adversarial and competitive planning in robotic domains
* Formal and algorithmic foundations of robot planning, including formal methods for verification, correctness, and safety
* Large-scale and real-world robotic applications, including:
  - optimization of behavior in large-scale automated or semi-automated systems
  - deployment and evaluation of planning methods on autonomous and intelligent robots in real-world settings
Important goals of the workshop include the discussion of solutions, results, open issues, and real-world challenges.
## Important Dates
- Submission Deadline: April 20, 2026 (AoE)
- Author Notification: May 5, 2026
- Camera-Ready Deadline: June 5, 2026
- ICAPS 2026 Workshops: June 28-29, 2026
Note that all papers must be registered by the submission deadline, with all relevant information provided, such as title, abstract, authors, and paper type (long, short, etc.). **You will still be able to upload/update your paper until April 3 (AoE).**
## Submission Instructions
There are two types of submissions:
- short position papers (4 pages)
- regular papers (up to 10 pages)
Papers may have an additional page containing references. Regular papers may be scheduled with more time in the final program. A poster session may be considered to provide a further presentation opportunity.
The formatting guidelines are the same as those for ICAPS 2026 papers, as described at [http://www.aaai.org/Publications/Author/author.php](http://www.aaai.org/Publications/Author/author.php), but with the AAAI copyright removed. Papers must be submitted in PDF format via the EasyChair system ([https://easychair.org/conferences/?conf=icapswsplanrob26](https://easychair.org/conferences/?conf=icapswsplanrob26)).
Please note that papers under review elsewhere (e.g., submitted to IJCAI 2026) are also welcome; however, to avoid potential conflicts, these manuscripts should be prepared as anonymous submissions for a double-blind reviewing process.
### Workshop Proceedings
Accepted papers will be published on the workshop's website.
The organisers are investigating the availability of journal editors in order to invite a selection of accepted papers from the workshop to a special issue or post-proceedings volume.
Institute of Cognitive Sciences and Technologies (ISTC-CNR), Italy
## Organizing Committee
This workshop is partially supported by TRIFFID, which aims to revolutionize emergency response by seamlessly weaving advanced robotics into the crucial work of first responders, ensuring they are equipped to tackle disasters with unparalleled efficiency and precision. Its innovative hybrid robotic platforms blend autonomous legged and aerial capabilities, enabling real-time reconnaissance even in the most challenging terrains, while a sophisticated ground-station interface empowers operators to visualize and analyze disaster scenarios like never before. TRIFFID has received funding from the European Union’s Horizon Europe research and innovation programme under grant agreement No. 101168042.