Poster in Workshop: Machine Learning for Autonomous Driving
TITRATED: Learned Human Driving Behavior without Infractions via Amortized Inference
Vasileios Lioutas · Adam Scibior · Frank Wood
Learned models of human driving behavior have long been used for prediction in autonomous vehicles, but recently they have also begun to be used to create non-playable characters for driving simulations. While such models are in many respects realistic, they tend to suffer from unacceptably high rates of driving infractions, such as collisions or off-road driving, particularly when deployed in locations not covered by the training dataset. In this paper we present a novel method for fine-tuning a model of human driving behavior that reduces the incidence of such infractions in novel locations where human demonstrations are not available. The method relies on conditional sampling from the learned model given the absence of infractions, together with extra infraction penalties applied at training time, and can be regarded as a form of amortized inference. We evaluate it using the ITRA model trained on the INTERACTION dataset and transferred to CARLA. We demonstrate an approximately 70% reduction in the infraction rate at a modest computational cost and provide evidence that further gains are possible with more computation or better inference algorithms.
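To illustrate the core idea of amortizing conditional sampling, the following is a minimal, self-contained sketch (not the paper's actual ITRA-based implementation): a toy Gaussian "driving" model whose sampled trajectories sometimes leave the drivable band (an off-road infraction). We repeatedly draw trajectories, keep only the infraction-free ones (rejection sampling from the conditional distribution), and take gradient steps that raise their likelihood, so the fine-tuned model itself produces fewer infractions. All names, the dynamics, and the infraction check here are hypothetical simplifications.

```python
import random

random.seed(0)

# Toy "driving model": each step adds a Gaussian lateral increment with
# mean mu (a stand-in for the learned model's parameters).
def sample_trajectory(mu, sigma=0.5, steps=10):
    return [random.gauss(mu, sigma) for _ in range(steps)]

def has_infraction(traj, bound=3.0):
    # Off-road infraction: cumulative lateral offset leaves the drivable band.
    x = 0.0
    for dx in traj:
        x += dx
        if abs(x) > bound:
            return True
    return False

def log_likelihood_grad(traj, mu, sigma=0.5):
    # d/dmu of the Gaussian log-likelihood of the trajectory.
    return sum((dx - mu) / sigma ** 2 for dx in traj)

def amortized_finetune(mu, iters=200, batch=32, lr=0.001):
    # Each iteration: sample a batch, keep only infraction-free trajectories
    # (conditional sampling via rejection), then take a gradient step that
    # raises their likelihood -- baking the conditional back into the model.
    for _ in range(iters):
        samples = [sample_trajectory(mu) for _ in range(batch)]
        accepted = [t for t in samples if not has_infraction(t)]
        if not accepted:
            continue
        grad = sum(log_likelihood_grad(t, mu) for t in accepted) / len(accepted)
        mu += lr * grad
    return mu

def infraction_rate(mu, n=500):
    return sum(has_infraction(sample_trajectory(mu)) for _ in range(n)) / n

mu0 = 0.4  # biased initial model that frequently drifts off-road
before = infraction_rate(mu0)
after = infraction_rate(amortized_finetune(mu0))
print(f"infraction rate before: {before:.2f}, after: {after:.2f}")
```

In this sketch the infraction-free trajectories have a lower average drift than the unconditional model, so maximizing their likelihood pulls the parameter toward the conditional distribution; the paper's method applies the same principle to a full trajectory model with additional infraction penalties at training time.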