Title: Towards Embodied Visual Intelligence
Speaker: Dinesh Jayaraman (UC Berkeley)
Details: Mon, 17 Dec 2018, 11:00 AM @ AM Turing Hall
Abstract: What would it mean for a machine to see the world? Computer
vision has recently made great progress on problems such as finding
categories of objects and poses of people in images. However, I have
argued through my work in the last several years that studying such
tasks in isolated disembodied contexts, divorced from the physical
source of their images, is insufficient to build intelligent visual
agents. My research focuses on remarrying vision to action, by asking:
how might vision benefit from the ability to act in the world, and
vice versa? Could embodied visual agents teach themselves through
interaction and experimentation? Are there actions they might perform
to improve their visual perception? Could they exploit vision to
perform complex control tasks? In my talk, I will set up the context
for these questions, cover some strands of my work addressing them,
and discuss my long-term vision and directions that I hope to work on
in the future.
Bio: Dinesh Jayaraman is a postdoctoral scholar in EECS at UC Berkeley. He received his PhD from UT Austin (2017) and his B.Tech from IIT Madras (2011). His research interests are broadly in computer vision, robotics, and machine learning. In the last few years, he has worked on active perception and visual learning in embodied agents, visual prediction, visuo-tactile robotic manipulation, semantic visual attributes, and zero-shot categorization. His work has been recognized with the ACCV Best Application Paper Award (2016), a Samsung PhD Fellowship Award (2016), a UT Austin Graduate Dean's Prestigious Fellowship Supplement (2016), and a UT Austin Microelectronics and Computer Development Fellowship Award (2011). He has reviewed for top conferences and journals across computer vision, machine learning, and robotics, won a CVPR Outstanding Reviewer Award (2016), and served as an Area Chair for NIPS 2018.