
MARCER: A Multimodal Augmented Reality System for Composing and Executing Robot Action Plans (2024)

Undergraduates: LillyAnn Nekervis, Bryce Ikeda, Maitrey Gramopadhye


Faculty Advisor: Daniel Szafir
Department: Computer Science


Humans possess an innate ability to communicate their thoughts through natural language. In contrast, robots excel at simple and repetitive tasks but often struggle to convey their goals to users. To bridge this gap, we combine the strengths of humans and robots by developing MARCER, a novel interactive and multimodal end-user robot programming system. MARCER uses a Large Language Model to translate users' natural language task descriptions and environmental surroundings into Action Plans for robot execution, based on a trigger-action programming paradigm that facilitates authoring reactive robot behaviors. The system also affords interaction via augmented reality to help users parameterize and validate robot programs, and provides real-time, visual previews and feedback directly in the context of the robot's operating environment. We present the design and implementation of MARCER and show its application in authoring robot programs across a breadth of domestic assistance scenarios, demonstrating how trigger-action programming, Large Language Models, and augmented reality hold deep-seated synergies that, when combined, may empower users to program general-purpose robots to perform everyday tasks.
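To give a sense of the trigger-action programming paradigm the abstract refers to, the following is a minimal illustrative sketch of a reactive rule of the kind an LLM might produce from a natural language request. All names here (Trigger, Action, ActionPlan, and the example predicates) are hypothetical illustrations, not MARCER's actual representation or API.

```python
# A minimal sketch of trigger-action programming: a reactive rule that runs
# a sequence of actions whenever a sensed condition holds. Hypothetical names,
# not MARCER's API.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Trigger:
    """A named condition evaluated against the robot's sensed world state."""
    name: str
    condition: Callable[[dict], bool]


@dataclass
class Action:
    """A named robot behavior, e.g. a pick or place primitive."""
    name: str
    execute: Callable[[dict], None]


@dataclass
class ActionPlan:
    """A reactive rule: when the trigger holds, run the actions in order."""
    trigger: Trigger
    actions: List[Action]

    def step(self, world_state: dict) -> bool:
        if self.trigger.condition(world_state):
            for action in self.actions:
                action.execute(world_state)
            return True
        return False


# Example rule for the request "put the cup in the sink when it is empty"
# (labels and world-state keys are illustrative).
plan = ActionPlan(
    trigger=Trigger("cup_is_empty", lambda ws: ws.get("cup_level", 1.0) == 0.0),
    actions=[
        Action("pick_cup", lambda ws: print("picking up cup")),
        Action("place_in_sink", lambda ws: print("placing cup in sink")),
    ],
)
plan.step({"cup_level": 0.0})  # trigger fires; both actions run
```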
