Constrained Belief-Dependent POMDP Planning
Online decision making under uncertainty in partially observable domains, also known as Belief Space Planning, is a fundamental problem in robotics and Artificial Intelligence. Due to the abundance of plausible future unfoldings, calculating an optimal course of action inflicts an enormous computational burden on the agent. Moreover, many scenarios, e.g., information gathering and safety-critical tasks, require introducing a belief-dependent constraint. In this research project we present a novel formulation of a risk-averse, belief-dependent, probabilistically constrained continuous POMDP. We investigate different aspects of this framework and, in particular, introduce adaptive simplification in a probabilistically constrained setting.
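To make the formulation concrete, a common way to write a belief-dependent, probabilistically constrained objective is sketched below. This is an illustrative general form, not necessarily the exact formulation of this project: $b_k$ denotes the belief at time $k$, $\pi$ a policy, $\rho$ a (possibly belief-dependent) reward, $c$ a belief-dependent constraint function with threshold $\delta$, and $\epsilon$ the allowed probability of violation — all symbols here are assumptions for illustration.

```latex
\pi^{*} = \operatorname*{arg\,max}_{\pi}\;
  \mathbb{E}\!\left[\sum_{k=0}^{L-1} \rho\big(b_k, \pi(b_k)\big)\right]
\quad \text{s.t.} \quad
\mathbb{P}\!\left(\bigwedge_{k=1}^{L} c(b_k) \geq \delta \right) \geq 1 - \epsilon
```

The chance constraint requires that, under the stochastic future beliefs induced by $\pi$, the belief-dependent condition $c(b_k) \geq \delta$ holds jointly along the planning horizon with probability at least $1-\epsilon$, which captures the risk-averse flavor of the formulation.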