September 30, 2015
Rotoscoping is a demanding and labor-intensive task that requires an artist to carefully separate out each element in a shot, often with sub-pixel accuracy. Roto++, a collaboration between Foundry, The University of Bath and University College London exploring the idea of putting machine intelligence into artist tools, aims to accelerate this process with a tool that learns a shape as the artist works. The collaboration produced a proof-of-concept tool that reduces the number of keyframes needed to create roto curves and offers more intuitive control when interacting with shapes. The full paper and results are available on UCL’s website.
The Roto++ project provides a data set for non-commercial use, consisting of a professionally produced short movie called MODE with rotoscoped elements for every shot. The MODE short movie is provided courtesy of James Courtenay Media and the rotoscoping work was kindly performed for The Foundry by PixStone Images.
This data set—with over 10,000 frames of footage broken down into 158 shots—represents the real-world challenges in movie production. The shots range from 13 frames to over 300 frames in length, with an average shot length of 67 frames, which is typical for live-action movies. The shots have been categorized by professional artists on a scale of 1 to 5 to define the complexity level of the rotoscoping required. The data set is split into the following categories, with a typical artist throughput for each, representing a total of 734 days of rotoscoping effort:
Isolated characters, trackable objects (38 shots, 28 frames/day)
Limited motion blur, limited articulation, multiple characters (94 shots, 14 frames/day)
Lengthy camera moves, fast action, motion blur (14 shots, 11 frames/day)
Many characters, complex articulation, elements such as hair, dust and water (12 shots, 6 frames/day)
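The quoted totals can be roughly cross-checked from the category figures. Per-category frame counts are not published, so the sketch below assumes the overall average shot length (67 frames) applies uniformly to every category—an approximation, which is why it lands near but not exactly on the quoted 734 days:

```python
# Rough cross-check of the data set's quoted totals (158 shots, 734 days).
# Assumption: every category has the overall average shot length of 67 frames,
# since per-category frame counts are not published.
categories = [
    # (shots, frames rotoscoped per day)
    (38, 28),  # isolated characters, trackable objects
    (94, 14),  # limited motion blur, limited articulation, multiple characters
    (14, 11),  # lengthy camera moves, fast action, motion blur
    (12, 6),   # many characters, complex articulation, hair/dust/water
]
AVG_SHOT_LENGTH = 67  # frames, overall average stated for the data set

total_shots = sum(shots for shots, _ in categories)
total_days = sum(shots * AVG_SHOT_LENGTH / rate for shots, rate in categories)

print(total_shots)        # 158, matching the stated shot count
print(round(total_days))  # ~760 days, close to the quoted 734
```

The small gap between the ~760-day estimate and the quoted 734 days simply reflects the uniform-shot-length assumption; longer shots evidently cluster in the easier categories.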
This data set is made available by Foundry to the research community with thanks to James Courtenay Media and PixStone Images, in collaboration with The University of Bath and University College London.

Download the data set