I am a PhD student at Berkeley AI. I am broadly interested in developing algorithms that can learn to construct rich, interpretable models of the world as quickly and robustly as humans do.
I did my undergraduate and Master's degrees at MIT, where I was part of the Computational Cognitive Science group advised by Prof. Josh Tenenbaum. During my time as an undergrad, I interned at DeepMind and Waymo, and worked on research projects under Prof. Pattie Maes in the Fluid Interfaces Group, where our work was featured on 60 Minutes. I also helped run HackMIT.
AlterEgo: A Personalized Wearable Silent Speech Interface
Arnav Kapur, Shreyas Kapur, Pattie Maes
23rd International Conference on Intelligent User Interfaces. ACM, 2018.
Video, Homepage, Press: [1, 2, 3]
Drishti: An Ultra-Low Cost Visual-Aural Assistive Technology for the Visually Impaired
Shreyas Kapur, Arnav Kapur
Proceedings of the International Convention on Rehabilitation Engineering & Assistive Technology. Singapore Therapeutic, Assistive & Rehabilitative Technologies (START) Centre, 2014.
(Best Paper Award)
First Place, MIT's Autonomous Robotics Competition MASLAB, 2017.
Worked on the vision and planning systems for the robot.
First Place, MIT's Web Programming Competition web.lab, 2018.
Built presentation design software that lets users create animated presentations directly in the browser.
First Place, Intel National Science Fair (IRIS, India).
Google's Thinking Big Award, Intel International Science Fair.
Built a system to compute eyeglass prescriptions using a mobile app.
Experiment Design of the Year, Fast Company.