no code implementations • 29 Jan 2023 • Zahra Zahedi, Sarath Sreedharan, Subbarao Kambhampati
Handling trust is one of the core requirements for facilitating effective interaction between humans and AI agents.
no code implementations • 18 Feb 2022 • Zahra Zahedi, Sarath Sreedharan, Subbarao Kambhampati
In this paper, we show how this new framework captures the various works in the space of human-AI interaction and identifies the fundamental behavioral patterns these works support.
no code implementations • 3 May 2021 • Zahra Zahedi, Mudit Verma, Sarath Sreedharan, Subbarao Kambhampati
Trust management is particularly challenging in mixed human-robot teams, where the human and the robot may have different models of the task at hand and thus different expectations about the current course of action, forcing the robot to rely on costly explicable behavior.
no code implementations • 18 Mar 2021 • Zahra Zahedi, Subbarao Kambhampati
In this paper, we aim to provide a comprehensive outline of the different threads of work in human-AI collaboration.
no code implementations • 5 Feb 2020 • Zahra Zahedi, Sailik Sengupta, Subbarao Kambhampati
Task allocation is an important problem in multi-agent systems.
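As a minimal illustration of the task-allocation problem mentioned here (not the paper's method), the following sketch brute-forces the assignment of tasks to agents that minimizes total cost; the cost matrix and its values are hypothetical.

```python
from itertools import permutations

def optimal_allocation(cost):
    """Brute-force the one-to-one assignment of tasks to agents
    that minimizes total cost.

    cost[i][j] is the (hypothetical) cost of agent i doing task j.
    Feasible only for small teams, since it enumerates all n! assignments.
    """
    n = len(cost)
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(n)):  # perm[i] = task assigned to agent i
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best_cost:
            best_perm, best_cost = perm, total
    return best_perm, best_cost

# Example: 3 agents, 3 tasks, illustrative costs.
costs = [[4, 1, 3],
         [2, 0, 5],
         [3, 2, 2]]
assignment, total = optimal_allocation(costs)
# assignment == (1, 0, 2): agent 0 does task 1, agent 1 task 0, agent 2 task 2
```

For larger teams one would replace the brute-force search with a polynomial-time method such as the Hungarian algorithm; the sketch above only shows the problem structure.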
no code implementations • 1 Mar 2019 • Zahra Zahedi, Sailik Sengupta, Subbarao Kambhampati
Thus, we define the concept of a trust boundary over the human's mixed-strategy space and show that it helps discover optimal monitoring strategies.
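To give intuition for a boundary over a mixed-strategy space, here is a toy sketch, not the paper's model: suppose a human supervisor monitors with probability p, and the robot gains `deviation_gain` by deviating but pays `deviation_penalty` if caught. The threshold probability at which a rational robot prefers compliance plays the role of a boundary over the human's mixed strategies; all payoff names and values are hypothetical.

```python
def trust_boundary(deviation_gain, deviation_penalty):
    """Toy threshold p* = gain / penalty: for monitoring probability
    p >= p*, the expected penalty p * deviation_penalty outweighs the
    gain from deviating.  Hypothetical payoffs, for illustration only."""
    return deviation_gain / deviation_penalty

def robot_complies(p_monitor, deviation_gain, deviation_penalty):
    """True if monitoring probability p_monitor is at or above the boundary."""
    return p_monitor >= trust_boundary(deviation_gain, deviation_penalty)

# Example: gain 2, penalty 10 -> boundary at p* = 0.2, so monitoring
# 25% of the time suffices while 10% does not.
```

In this toy setting, mixed strategies above the boundary make compliance rational, so the supervisor need not monitor constantly.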
no code implementations • 6 May 2018 • Fatemeh Zahedi, Zahra Zahedi
A system's ability to adapt and to self-organize are two key factors in how well it can survive changes to the environment and the plant it works within.