UW Allen School Colloquium: UW Reality Lab
The UW Reality Lab, an industry-funded effort to advance the state of the art in augmented and virtual reality, launched in January of this year. In March, the Lab funded a suite of 15 projects across research groups in the Allen School and beyond. In this colloquium, we will give a brief overview of the Lab, followed by six research presentations ranging from foundational projects that helped launch the lab to projects that were funded this year and already have exciting results to show:
“Metasurfaces for augmented reality visors,” Elyas Bayati (EE grad). We will introduce sub-wavelength diffractive optics, also known as metasurfaces, for creating ultra-compact visors for augmented reality.
“Two-player Driving Game for Real-time Natural Human-Robot Communication,” Junha Roh (CSE grad). Our goal is to achieve real-time, natural human-robot communication; to that end, we built a two-player driving game for collecting speech and driving data.
“Emptying, Refurnishing, and Relighting Indoor Spaces,” Edward Zhang (CSE grad). In this work, we show how you can scan a room and then virtually replace the existing furniture with new furniture, all in a visually realistic way.
“Surface Light Field Fusion,” JJ Park (CSE grad). We present an approach for interactively scanning highly reflective objects with a commodity RGBD sensor by modeling the scene appearance with a surface light field.
“Photo Wake-Up: 3D Character Animation from a Single Photo,” Chung-Yi Weng (CSE grad). We present a technique to “wake up” a still photo by animating the human subject, and demonstrate bringing the central figure into the real world with the help of augmented reality.
“Watching sports in AR,” Kostas Rematas (CSE postdoc). Starting from a single YouTube soccer video, we generate dynamic AR “holograms” of the players, as well as the ball.
October 25, 2018
This video is closed captioned.