The evolution of stereoscopic 3D continued in 2009, from its niche market of feature animation and special venues toward a more mainstream medium, including live-action productions and more diversified distribution channels. It now seems safe to state publicly that 3D is here to stay and that it is the next big step in motion-picture technology. Technological progress was demonstrated at every level, from high-end camera rigs to consumer displays, and the audience has continued to show a strong interest in 3D content. The debate has shifted from 3D cinema toward 3D alternative content in theaters and 3DTV at home.
This article will first comment on the events in 2009 that paved the way for that journey, and then forecast the engineering tasks that lie ahead.
MARKET SHIFT: FROM VENUES TO HOME THEATER
CINEMA IS GOING 3D
Even if a large part of the industry is still in observation mode, I would quote Ray Zone: "The 3D train has left the station." 2009 will be a turning point, with the release of more than 12 3D feature movies. That is one per month; combined with their longer stay on screen than their 2D counterparts and releases condensed on key dates, the fight for 3D screens has already begun. Nearly 24 3D movies are slated for release in 2010, announcing a fierce battle. In addition, alternative content is also being converted to 3D. In Europe, following the positive outcome of the test run on Mozart's Don Giovanni, beamed in 3D from Rennes to Paris, the French Tennis Open final was broadcast live in 3D to more than 20 theaters in France and Spain. Music shows and football matches have already been announced for showing in 3D theaters in 2010. In the U.S., it is obvious that the public will have access to National Football League (NFL) and National Basketball Association (NBA) programs in their 3D multiplexes. Sports events such as Formula One, NASCAR, and the 2010 Soccer World Cup may also claim their share of 3D screens in 2010.
On the distribution side, Sony announced a 10,000 digital projector deal with AMC and Regal.[1,2] Three thousand sites are expected to be in 3D by 2012. The CineAlta 4K SXRD projectors will be fitted with a periscopic dual-lens system from RealD, providing passive polarization of a 2K 3D image. Barco announced similar figures, with 10,000 screens to be equipped and a new-generation 4K-resolution DLP. These new projection systems will help reach the critical mass of screens needed for 3D to thrive: when a single 3D movie needs 3,000 screens for a large release, 5,000 to 10,000 screens are needed to cope with multiple 3D releases in the same week, or to accommodate the "long tail" box-office effect of previously released titles.
With big industry names betting on one technology in such large numbers, the 3D cinema chicken-and-egg deadlock can be considered definitively broken. The new frontier is now 3D at home.
3D IS NOW GOING TO THE HOME
A listing and analysis of the 2009 media, industry, trade association, and standardization group events focusing on the arrival of 3D in the home would fill an article by itself. Let us just name a few in random order: Panasonic announced a 3D Blu-ray proposal for 2010; Nvidia sold more 3D glasses than expected in the Europe, Middle East, and Africa (EMEA) market. Every consumer TV display technology (LCD, plasma, DLP, RPTV) is moving toward a 3D-capable 120 Hz refresh rate, with prototypes shown at 240 Hz. 3D video-on-demand (VoD) services were introduced for video game consoles and triple-play boxes. They will leverage the new HDMI 1.3 interconnect that is required to handle the bandwidth and synchronization requirements for 3D. British satellite operator BSkyB is planning to open a 3D channel in 2010.
On the user-generated 3D content front, we saw the releases of a stereoscopic webcam, Fujifilm's digital 3D camera, and a 3D player for the online video service YouTube. By the end of the year, a 3D laptop will be available to edit 3D footage on the go.
The push is so strong that the SMPTE 3D Task Force report on a production standard was mistakenly relayed in the media as the announcement of a 3DTV distribution standard in the works. Eventually, the International Telecommunication Union (ITU) decided to poll its members about the timeliness of opening talks on actual 3DTV standardization.
In the meantime, the gaming market will be the niche for the first wave of 3D displays, as it was in pushing sales for HDTV. Every major booth at the Consumer Electronics Show (CES) and the National Association of Broadcasters (NAB) Convention is expected to show some sort of 3D gear.
PRODUCTION SHIFT: FROM ANIMATION TO LIVE ACTION 3D AND 3DTV
LIVE 3D IS THE NEW FRONTIER
Only five years after the release of The Polar Express, stereoscopic 3D is now the norm in animation features. More than 40 3D CGI movies are in production, and a compelling story will be needed for a 2D-only animated movie to reach the screens. For instance, on DreamWorks Animation's Monsters vs. Aliens, DWA chief technologist Jim Mainard estimates that the additional cost of producing in 3D was 8.5%, while from 2007 to 2009, 3D releases have shown average earnings per screen three times those of 2D releases.
The objective is live 3D production, at the cost, pace, and reliability of 2D production. Directors of photography (DPs) want a 3D camera that will allow them to shoot as fast and efficiently as they are able to shoot in 2D. Producers and directors want 3D post-production tools and pipelines that allow them to meet budgets and deadlines, with no drawbacks in quality and storytelling. This requires cameras and production systems designed natively for 3D. To quote Tim Sassoon, "we are experiencing a repeat of the delay between introduction of colored cartoons and color live action movies, to the point that multiple cameras and a consultant are needed." The difference is that the delay will likely not be 10 to 20 years. Many new production tools were introduced at NAB 2009 that show the move toward live 3D.
NEW 3D CAMERA RIGS AND IMAGE PROCESSING
So far, almost all 3D production crews are working with side-by-side cameras in an easy-to-assemble parallel rig, or with a more complex beam splitter rig (BSR), in which a half mirror splits the light between two cameras placed at 90°. Designing and building such a BSR is no easy task, and many specialized shops are now producing them in batches for sale and rental, offering motorization as an option.
The development of pocket-sized 2K cameras helps in making smaller rigs. Silicon Imaging goes as far as integrating the control of a camera pair under a single unified user interface, with stereo-specific commands and monitoring. These functions include 3D display in many formats, from anaglyph to active stereo, along with any image flipping needed to handle raw BSR images. They are also available for any camera rig through the use of a dedicated 3D field monitor from Transvideo, or an image-processing "StereoBrain" from Inition, among others.
The most advanced BSRs include a three-axis motion control linked to the zoom command. Since no two zoom lenses are identical, this complex system, based on self-built lookup tables (LUTs), compensates for optical-axis and magnification discrepancies. Even with this high-end technology, the stereo pairs are not matched well enough to be shown on 40-ft screens without some vertical parallax that degrades 3D quality and viewing comfort. Controlling this requires a visual effects (VFX) pass, sometimes called "stereo prepping," that consists mostly of vertical and rotational corrections. It should be applied to any 3D footage before entering the edit room, for it will most likely include a slight zoom-in and depth repositioning.
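The core of such a prepping pass can be illustrated in a few lines. The following is a minimal sketch assuming OpenCV and NumPy; ORB feature matching and a RANSAC-fitted partial affine are illustrative choices, not any vendor's actual algorithm:

```python
# Minimal "stereo prepping" sketch: estimate the vertical and rotational
# misalignment between the two eyes from feature matches, then warp the
# right eye to cancel it. Assumes BGR frames as NumPy arrays.
import cv2
import numpy as np

def prep_stereo_pair(left, right):
    """Warp the right eye so vertical and rotational parallax cancel out."""
    orb = cv2.ORB_create(1000)
    gray_l = cv2.cvtColor(left, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(right, cv2.COLOR_BGR2GRAY)
    kp_l, des_l = orb.detectAndCompute(gray_l, None)
    kp_r, des_r = orb.detectAndCompute(gray_r, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_l, des_r)
    pts_l = np.float32([kp_l[m.queryIdx].pt for m in matches])
    pts_r = np.float32([kp_r[m.trainIdx].pt for m in matches])
    # Rotation + scale + translation covers the usual BSR misalignments.
    M, _ = cv2.estimateAffinePartial2D(pts_r, pts_l, method=cv2.RANSAC)
    # Horizontal disparity is the depth signal and must survive the pass,
    # so the estimated x translation is zeroed out (a crude approximation).
    M[0, 2] = 0.0
    h, w = right.shape[:2]
    return left, cv2.warpAffine(right, M, (w, h))
```

In a production tool, the estimated correction would of course be filtered over time and keyed to the zoom LUTs rather than recomputed independently per frame.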
The state of the art is to process the images in realtime while they are shot, and to present the final corrected image to the DP and the director, as it will be used in the edit. Otherwise, the monitored image does not match what will actually be edited, both in framing and in depth positioning. At NAB 2009, 3Ality demonstrated the ASIC-based SIP2100 in "learning mode," with the zoom's LUT computed in realtime and the corrected picture shown with a time delay of a few seconds.
Binocle presented another approach, based on graphics processing unit (GPU) computing, that delivers computation at 5 frames/sec. The system, called "disparity tagger," shows the amount of parallax on screen and color-codes it according to the selected screen size. Just as zebras in a viewfinder warn of overexposed areas, this system flags the over-parallax zones. A "disparity killer" software module re-aligns the images to perfect the parallax. A feedback loop to the camera rig can automatically reduce the interaxial distance, in the same way the iris is closed in auto-exposure mode. Both systems were used in live productions this year and will soon be a must in any 3DTV or 3D show. Eventually, these devices' metadata will need to be integrated with post-production suites, especially when computer graphics elements are involved.
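To make the zebra analogy concrete, here is a hedged sketch of the tagging idea, with OpenCV's StereoSGBM standing in for whatever disparity estimator the actual products use; the comfort budget, derived from a roughly 65 mm divergence limit scaled to the screen width, is an assumption for illustration:

```python
# Sketch of a "disparity tagger": flag pixels whose on-screen parallax
# would exceed a comfort budget for the selected screen size.
import cv2
import numpy as np

def tag_disparity(left_gray, right_gray, screen_width_m=12.0,
                  divergence_limit_m=0.065):
    """Color-code pixels whose parallax would break the comfort budget."""
    h, w = left_gray.shape
    # Convert the physical budget into pixels for this screen size:
    # parallax_px / image_width_px = parallax_m / screen_width_m.
    max_px = divergence_limit_m / screen_width_m * w
    sgbm = cv2.StereoSGBM_create(minDisparity=-64, numDisparities=128,
                                 blockSize=9)
    disp = sgbm.compute(left_gray, right_gray).astype(np.float32) / 16.0
    tagged = cv2.cvtColor(left_gray, cv2.COLOR_GRAY2BGR)
    tagged[disp > max_px] = (0, 0, 255)    # over-budget parallax: red
    tagged[disp < -max_px] = (255, 0, 0)   # over-budget negative parallax: blue
    return tagged
```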
STEREOSCOPIC POST-PRODUCTION TOOLS
Both Avid and Autodesk now offer 3D capabilities in their flagship products, with Media Composer and Maya, respectively, including stereoscopic editing and modeling functions. Combined with the release of 3D-capable Toxik, Lustre, and DS Nitris, this demonstrates how 3D is now among the core new technologies implemented in the most-used movie production tools. Many smaller vendors are now doing the same, following Iridas, The Foundry, and EON, which have been present in the 3D field for a few years. Each newly introduced stereoscopic product deserves meticulous scrutiny to distinguish simply 3D-patched products from deeply 3D-engineered ones. Depending on what you find beyond the "open 3D footage" and "select 3D display format" dialog boxes, your mileage along the 3D post road will vary by orders of magnitude.
Among the tools deserving a "product of the year" mention is Neo3D from CineForm, winner of both the NAB Show "2009 TV Technology STAR Award" and the "2009 Vidy Award." This video codec can be described as a 3D-retrofit add-on for virtually any desktop video tool. It handles 3D assets in a way that enables a 3D-unaware host application to be 3D-capable. The second eye of a 3D asset is embedded in the metadata and presented to the host application only if and how you choose. The codec's settings window allows you to select a 2D-compatible format (side-by-side, interleaved…) and to process 3D-specific effects, such as stereo prepping or depth placement. Such 3D-specific settings are kept as metadata in a database and GPU-rendered at run time. That database can be remotely addressed and updated from a networked workstation, allowing an assistant stereographer to groom the 3D shots while the director works on the edit or effects. We are waiting for desktop video applications that interface with the codec's application programming interface (API) and provide stereoscopic image manipulation, such as 3D transitions and effects.
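The mechanism can be illustrated with a toy data structure. This is a conceptual sketch of the retrofit idea, not CineForm's actual API: a clip object that presents plain 2D frames to an unaware host while carrying the second eye and the stereo settings as metadata:

```python
# Conceptual sketch of a 3D-retrofit clip wrapper (illustrative only).
import numpy as np

class Stereo3DClip:
    """A 2D-only host sees ordinary left-eye frames; the right eye and the
    stereo settings travel along as metadata and are rendered on request."""

    def __init__(self, left, right, settings=None):
        self.left, self.right = left, right          # lists of HxWx3 arrays
        self.settings = settings or {"mode": "2d"}   # e.g. depth placement

    def frame(self, i):
        mode = self.settings["mode"]
        if mode == "2d":               # what an unaware host application gets
            return self.left[i]
        if mode == "side_by_side":     # 2D-compatible anamorphic packing
            half_l = self.left[i][:, ::2]   # horizontally squeezed halves
            half_r = self.right[i][:, ::2]
            return np.concatenate([half_l, half_r], axis=1)
        raise ValueError(f"unknown mode: {mode}")
```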
WHAT LIES AHEAD
Stereoscopic 3D is going to be on the roadmap of SMPTE members. We are at the very beginning of modern (i.e., digital) 3D technology. As the technical community charged with engineering the production and distribution tools, it is up to us to figure out what lies ahead.
ENGINEERING A 3D CAMERA
The 3D camera is likely the stereoscopic tool that needs and deserves most of the engineering effort. DPs and news crews want to use a camera, not a rig. There should be only one box, one "record" button, and one single red tally. 3D-compatible 2D cameras with master/slave operation will be needed in the future: one camera would receive all the commands and process all the telemetry and automated functions, and the second camera would simply obey and send acknowledgments or warnings. For proper shooting of fast-moving sequences, the orientation of the rolling shutter, or imager scanning, should be reversible to match the mirrored image in a BSR; a global shutter would also be a solution. Field recorders are usually able to ingest left and right images at once; they should also record the 3D-specific metadata coming from the rig's motion control and post-processing stages. This requires some metadata standardization, which is being done by the ASC.
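A sketch of the master/slave idea follows; the command names, the hostname, and the JSON-over-TCP transport are assumptions chosen for illustration, not an existing camera protocol:

```python
# Hedged master/slave sketch: the master mirrors every command to the slave
# and waits for an acknowledgment, so the pair behaves as a single camera.
import json
import socket

def send_command(slave_addr, command, params):
    """Mirror one command to the slave camera; return its ack or warning."""
    with socket.create_connection(slave_addr, timeout=0.5) as sock:
        sock.sendall(json.dumps({"cmd": command, "params": params}).encode())
        return json.loads(sock.recv(4096).decode())

def master_set(slave_addr, command, params, apply_locally):
    """Apply a setting on both eyes, slave first, so they never diverge."""
    ack = send_command(slave_addr, command, params)
    if ack.get("status") != "ok":
        raise RuntimeError(f"slave refused {command}: {ack}")
    apply_locally(command, params)

# Hypothetical usage:
# master_set(("slave-cam.local", 9000), "set_shutter",
#            {"angle_deg": 180}, my_camera.apply)
```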
Lens makers will have to consider the production of 3D lens adapters. Many 3D movies from the 1970s and 1980s were shot on a single strip using such devices; in the digital age, a 4K camera fitted with one makes a perfect 2K 3D camera. Despite the limitations of a fixed or range-limited interaxial, they will find their sweet spots in production, especially in shoots such as electronic newsgathering, where the dimensions of the set are known and setup time is a crucial parameter. For multiple-camera systems, zooms paired by the manufacturers are needed. Even if perfectly matched zooms do not exist, they can be delivered with progressivity and telecentricity LUTs on a smart card, or directly downloaded to the camera through a digital link in the lens attachment. Such specialized lenses would be sold or rented in pairs that show the most overlapping LUT data.
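Pair selection by LUT overlap lends itself to a simple sketch. Assuming each lens ships with a LUT sampled at the same zoom positions, a greedy matcher over an inventory might look like this (an illustrative approach, not a manufacturer's method):

```python
# Sketch: pair zoom lenses by how closely their calibration LUTs agree.
import itertools
import numpy as np

def pairing_cost(lut_a, lut_b):
    """RMS disagreement between two LUTs sampled at the same zoom positions."""
    return float(np.sqrt(np.mean((np.asarray(lut_a) - np.asarray(lut_b)) ** 2)))

def best_pairs(luts):
    """Greedily match a lens inventory (serial -> LUT) into stereo pairs."""
    remaining, pairs = set(luts), []
    while len(remaining) > 1:
        a, b = min(itertools.combinations(remaining, 2),
                   key=lambda ab: pairing_cost(luts[ab[0]], luts[ab[1]]))
        pairs.append((a, b))
        remaining -= {a, b}
    return pairs
```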
There is ongoing work on integral imaging cameras, also called plenoptic cameras, an example of the 3D-specific imaging tools that will come to fruition in the coming years. These will bring a second generation of 3D images that are not based on pairs of points of view, but rather on CG-like 3D models described as geometries and textures.
CONVERTING PRODUCTION FACILITIES TO 3D
Years, if not months, after converting to HD, there is no economic case for replacing the equipment again. Existing 2D gear will need to be retrofitted into 3D tools. The easy route is to use 3D-in-2D anamorphic formats, and eventually regenerate the lost resolution; SD-to-HD upconverters will find a new life as half-HD to full-HD converters. The system efficiency can be improved with optimized formats such as Sensio's. The ultimate goal is to produce full-resolution 3D, reusing any available HD overcapacity. For example, Sony SRW decks were used to record the left and right streams of Journey to the Center of the Earth in 4:2:2 on a single tape. Quantel Pablo's simultaneous SD and HD output was based on downconverting a secondary HD channel; it took only days to make both channels output synchronized HD, and the Pablo was ready for 3D. Similarly, 1080@60p video mixers can be retrofitted as 3D@30p mixers, and using the same approach, H.264 level 5.1, offering 1920 × 1080 at 120 frames/sec, can be used as a 3D 1080p distribution format.
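As a minimal sketch of that last retrofit, assuming frames arrive as arrays, time-multiplexing two 30p eyes into one 60p stream and back is trivial, which is exactly why the reuse of 2D infrastructure is attractive:

```python
# Sketch: run a 3D@30p signal through 2D 60p gear by frame interleaving.
def mux_3d_into_60p(left_30p, right_30p):
    """Interleave left/right 30p frames into one 60p stream: L0,R0,L1,R1,..."""
    stream = []
    for l, r in zip(left_30p, right_30p):
        stream.extend([l, r])
    return stream

def demux_60p_into_3d(stream_60p):
    """Recover the two eyes: even frames are left, odd frames are right."""
    return stream_60p[0::2], stream_60p[1::2]
```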
One very important point to understand in the technical evolution toward 3D is that we are not going to handle any new esoteric data type or format. We will simply handle twice the amount of the very same information, along with a few new metadata items describing the relative positions of the images, with the notable exception of depth maps, as will be shown later.
PLACING PIXELS IN THE 3D SPACE
This technological update would not be complete without mentioning depth maps and correspondence map computation. The depth map is a gray-level picture that describes each pixel's distance to the camera. The correspondence maps describe the X, Y, Z vector linking every pixel of one point of view (PoV) to its homologous pixel in the second PoV. They will be to 3D post-production what the alpha channel is to 2D compositing. Their applications include 3D effects automation, depth warping in production, 2D+depth distribution formats, and eventually depth compression or expansion to match the display size and the consumers' taste. The accuracy of automatic computation is still not good enough for most applications, and hand generation is extremely labor-intensive. So far, realtime stereoscopic prepping is the only actual implementation, along with limited but promising results in PoV interpolation. This domain is seen as the most important research field in 3D imaging.
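For a rectified pair, the underlying relation is the classic pinhole formula: depth = focal length × baseline / disparity. Here is a minimal sketch using OpenCV's StereoSGBM as a stand-in disparity estimator; as noted above, automatic results of this kind are not yet accurate enough for most applications:

```python
# Sketch: gray-level depth map from a rectified stereo pair.
import cv2
import numpy as np

def depth_map(left_gray, right_gray, focal_px, baseline_m):
    """Depth per pixel from the pinhole relation depth = f * B / disparity."""
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128,
                                 blockSize=7)
    disp = sgbm.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disp[disp <= 0] = np.nan                  # occluded or unmatched pixels
    depth_m = focal_px * baseline_m / disp    # metric depth per pixel
    # Normalize to the gray-level picture described in the text.
    gray = cv2.normalize(np.nan_to_num(depth_m, nan=0.0), None,
                         0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    return depth_m, gray
```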
THE DISTRIBUTION CHALLENGE
With increasing content production and home equipment reaching the shelves, the demand for pirated 3D is around the corner. Some publications have stated that 3D cannot be copied, but there is not a single technical reason why 3D could not be pirated. It is possible to record it with a 3D camera at the theater; the quality will be poor, but the 2D piracy experience has already demonstrated that quality issues come second to availability. 3D movies are already available on illegal P2P networks. They are HD reconstructions based on the 2D and anaglyphic 3D Blu-ray Disc editions. Despite the imperfections of the color reconstruction, they offer a better 3D experience than anaglyphic discs to the happy owners of 3D displays. We are facing an interesting case in which illegal content is of superior quality to the official products.
New distribution channels also need to be invented. They should match the fluidity and reactivity of the bootleggers, and beat them on quality. Panasonic's announcement of a 3D extension of the Blu-ray Disc standard, to be submitted to the Blu-ray Disc Association, and the recent addition of a 3D player on YouTube are signs that the race between optical media and electronic delivery has started.
CONCLUSION
The 3D production tools and the use of stereoscopy to further storytelling are progressing. They are both signs and consequences of the maturation of the art. Nonetheless, 3D still has to prove its value as a full-fledged live action medium that conveys and expands the scope of the emotions inherent in a sports event, comedy, or drama. This will come with better equipment and easier access to production tools.
This winter, consumer-priced 3D projectors, laptops, and camera rigs will be introduced. By NAB 2010 one will be able to fit a stereoscopic production unit in a backpack. The industry will be many pixels and megabits short of the high-end quality of a 4K studio production, but will have all the tools needed to learn and extend the art of 3D storytelling. Film school students and independent moviemakers are eager to enter the new realm of digital stereoscopy.
ABOUT THE AUTHOR
Bernard Mendiburu is a stereographer and digital cinema consultant working with feature animation studios in Los Angeles, CA, where his credits include Meet the Robinsons and Monsters vs. Aliens. He recently published 3D Movie Making: Stereoscopic Digital Cinema from Script to Screen with Focal Press. Mendiburu's lectures and workshops on 3D cinema have been selected by Laika and CalArts' Experimental Animation department. In 2009, he presented a paper on 3D workflows at the SPIE Stereoscopic Displays and Applications conference and at NAB's Digital Cinema Summit, and he presented Paris' Dimension3 two-day workshop on 3D post-production. Mendiburu recently joined the 3D@Home Consortium's Advisory Committee on 3D Quality and was an active member of the SMPTE 3D Task Force.
REFERENCES
1. "AMC Entertainment to Convert Entire Circuit to Digital Cinema Projection with Sony 4K Systems," http://pro.sony.com/bbsccms/assets/files/mkt/digicinema/pressreleases/FINAL_AMC_Sony_4K_Joint_National_News_Release_3-28-09.pdf.
2. "Raising the Stakes," http://www.digitalcinemareport.com/Regal-Sony-Barco-Christie-NEC-TI-DLP-LCOS-4K-2K.