In industrial settings, robotic tasks often require interaction with
various objects, necessitating compliant manipulation to prevent damage
while accurately tracking reference forces.
To this end, interaction controllers are typically employed, but they
require either manual parameter tuning or precise environmental
modeling.
Both aspects can be problematic: the former is a time-consuming
procedure, while the latter is unavoidably affected by approximations,
making it prone to failure in the actual application.
Addressing these challenges, current research focuses on devising
high-performance force controllers.
Along this line, this work introduces ORACLE
(Optimized Residual Action for Interaction
Control with Learned Environments), a novel
force control approach.
Utilizing neural networks, ORACLE predicts robot-environment interaction
forces, which are then used by an optimal residual action controller to
locally correct the actions of a base force controller, minimizing the
expected force-tracking error.
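The residual-correction idea can be illustrated with a minimal sketch. The snippet below is not the paper's implementation: the names (`predict_force`, `residual_action`) and the tiny random-weight network standing in for the trained force predictor are hypothetical, and the local optimization is reduced to a grid search over candidate residuals for clarity.

```python
import numpy as np

# Hypothetical learned environment model: predicts the interaction force
# produced by applying action a in state x. A tiny fixed-weight MLP stands
# in for the trained neural network described in the text.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 2)) * 0.1, np.zeros(8)
W2, b2 = rng.normal(size=(1, 8)) * 0.1, np.zeros(1)

def predict_force(x, a):
    h = np.tanh(W1 @ np.array([x, a]) + b1)
    return float(W2 @ h + b2)

def residual_action(x, a_base, f_ref, deltas):
    """Choose the residual minimizing the predicted force-tracking error."""
    errs = [(predict_force(x, a_base + d) - f_ref) ** 2 for d in deltas]
    return deltas[int(np.argmin(errs))]

# Correct the base controller's action with the optimal local residual.
x, a_base, f_ref = 0.0, 0.5, 0.2
deltas = np.linspace(-0.1, 0.1, 21)
a = a_base + residual_action(x, a_base, f_ref, deltas)
```

By construction, the corrected action can never have a larger predicted tracking error than the base action, since the zero residual is among the candidates.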
Tested on a real Franka Emika Panda robot, ORACLE demonstrates superior
force-tracking performance compared to state-of-the-art controllers,
while requiring only a short setup time.