How do you get real-world footage onto next-gen displays? It takes more than just a lot of cameras (though that helps). This session looks at live-action content for multi-perspective displays, including screens and HMDs, from capture to encode to render. How does it differ from 2D? What are the hard problems and the enabling technologies? And most importantly, how are we going to produce the first 5,000 hours of content needed to jumpstart the ecosystem? This session will survey the current state of the art, weigh the tradeoffs, and attempt to predict the next five years of content production technology.
Ryan Damm, Co-Founder, Visby
Ryan is a camera geek, cinematographer, hacker, and frequent speaker on light fields. After 20 years of shooting video and building camera systems, he co-founded Visby in 2015 to make real-world capture of 'holographic' images a reality. Visby is pioneering new image formats and new capture systems for holographic displays.