Invited Talk
Learning for Interaction and Interaction for Learning
Danica Kragic
West Exhibition Hall C, B3
Humans learn through interaction and interact to learn. Automating highly dexterous tasks such as food handling, garment sorting, or assistive dressing relies on advances in mathematical modeling, perception, planning, and control, to name a few. Advances in data-driven approaches, together with the development of better simulation tools, allow these challenges to be addressed through systematic benchmarking of relevant methods. This can provide a better understanding of what theoretical developments need to be made and how practical systems can be implemented and evaluated to provide flexible, scalable, and robust solutions. But are we solving the appropriate scientific problems and making the necessary steps toward general solutions? This talk will showcase some of the challenges in developing physical interaction capabilities in robots, overview our ongoing work on multimodal representation learning, latent space planning, learning physically-consistent reduced-order dynamics, and visuomotor skill learning, and peek into our recent work on olfaction encoding.