Peter Jackson Responds to ‘Hobbit’ Footage Critics, Explains 48-Frames Strategy

Shrugging off the mixed reaction that greeted ten minutes of unfinished footage from The Hobbit, which screened earlier this week at the CinemaCon exhibitors convention in Las Vegas, director Peter Jackson told The Hollywood Reporter, “It wasn’t particularly surprising because it is something new.”


The Oscar-winning director is filming The Hobbit, his two-part 3D prequel to The Lord of the Rings, at the higher frame rate of 48 frames per second. Movies have been shot and projected at a standard rate of 24 frames per second since the arrival of talkies, and the new technique results in a dramatically different look.

“A lot of the critical response I was reading was people saying it’s different. Well, yes, it certainly is,” Jackson, speaking by phone from New Zealand, said. “But I think, ultimately, it is different in a positive way, especially for 3D, especially for epic films and films that are trying to immerse the viewer in the experience of a story.”

Jackson’s epic, which Warners will release Dec. 14, will be the first major motion picture to be made at 48 fps, a format Jackson believes can result in smoother, more lifelike pictures.


While many at the crowded showroom at Caesars Palace applauded the depth and detail in the large-scale battle sequences, some found more intimate daylight sequences too crisp and bright, complaining that they looked more like HD video and that the new process sacrificed a traditional “cinematic” feeling.

A number of bloggers, whose opinions quickly ricocheted across the web, found fault with the 48 fps footage. On the popular fanboy website Ain’t It Cool News, Moisés Chiullan, writing under the name “Monty Cristo,” offered a mixed assessment but provided the anti-48 fps forces with a rallying cry when he reported that some “people said that all the brief clips ‘felt’ like the 1970s I, Claudius in HD,” a reference to the 1976 British TV series. Devin Faraci of BadAssDigest.com tweeted, “Oh no. Not a fan of 48fps. Oh no no no,” adding that “THE HOBBIT, frankly, did not look cinematic.”

Jackson acknowledged that the ten-minute clip package – ranging from action sequences to quieter moments between Martin Freeman’s Bilbo Baggins and Andy Serkis‘ Gollum – may have been too brief for viewers to acclimate to the new process.

“It does take you a while to get used to,” he said. “Ten minutes is sort of marginal; it probably needed a little bit more. Another thing that I think is a factor is it’s different to look at a bunch of clips, and some were fast-cutting, montage-style clips. This is a different experience than watching a character and story unfold.”

Because of that, he isn’t planning to release a 48 fps trailer for the movie. “I personally wouldn’t advocate a 48-frame trailer because the 48 frames is something you should experience with the entire film. A 2 1/2 minute trailer isn’t enough time to adjust to the immersive quality.”

STORY: Peter Jackson Debuts ‘The Hobbit’ Footage

Jackson himself has grown accustomed to watching 48 fps imagery. He watches dailies at 48 frames every day, sometimes two hours’ worth.

“You get used to it reasonably quickly,” he said, commenting that now when he views traditional 24 fps footage, “I’m very aware of the strobing, the flicker and the artifacts.”

“We have obviously seen cuts of our movie at 48 and in a relatively short amount of time you have forgotten (the frame rate change). It is a more immersive and, in 3D, a gentler way to see the film.”

Jackson also explained that the footage presented at CinemaCon will look different once it goes through the postproduction process.

Because production is not scheduled to wrap until July, the customary postproduction work that shapes the overall look of a film has not yet been done, so the clips were unfinished. They were not yet color corrected, nor had the visual effects been completed. (In various scenes, the actors were shown performing in front of a greenscreen.)

Jackson explained that his original The Lord of the Rings used various postproduction techniques to create a certain look for the movies, including “extensive” digital color grading, “added texture, and we took out highlights.”

“We’ll do the same with The Hobbit, to make it consistent and give it the feeling of otherworldliness – to get the mood, the tone, the feel of the different scenes,” he said. “We are certainly going to experiment with different finishing techniques to give the 48 frames a look that is more organic. But that work isn’t due to start until we wrap photography in July (both Hobbit films are being shot simultaneously).”

Jackson is also lensing the movie – which is being shot in 3D, a first for the franchise – using Red Epic cameras with 3Ality Technica 3D rigs.

The Red Epic, Jackson explained, allowed him to shoot in 5K resolution. (5K refers to the number of horizontal pixels that compose a frame.) Today, movies are generally lensed and projected at 2K, though the industry is moving in the direction of 4K.
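
To put those numbers in perspective, here is a quick pixel-count comparison. The frame heights below are assumptions for illustration (roughly DCI-shaped frames; exact dimensions vary by camera and aspect ratio):

```python
# Rough per-frame pixel counts for common cinema formats.
# Frame heights are assumed, illustrative values; actual
# dimensions vary by camera and aspect ratio.
formats = {
    "2K": (2048, 1080),
    "4K": (4096, 2160),
    "5K": (5120, 2700),
}
base = formats["2K"][0] * formats["2K"][1]
for name, (w, h) in formats.items():
    px = w * h
    print(f"{name}: {w} x {h} = {px:,} pixels ({px / base:.1f}x 2K)")
```

The jump from 2K to 5K is roughly a sixfold increase in pixels per frame, which is why the camera originals look so crisp before any grading is applied.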

“It is very clean. On a 5K camera you are seeing very crisp pictures,” he said. “Part of the digital grading will give those incredibly sharp pictures a texture and a feeling that we want the film to have. We haven’t done that yet. What you saw [at CinemaCon, in terms of “crispness”] is partly due to the lack of motion blur (from the high frame rate) and partly due to the camera (in terms of resolution).”
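
Jackson’s “lack of motion blur” point follows directly from exposure time. A minimal sketch, assuming a conventional 180-degree shutter (the production’s actual shutter settings aren’t stated here): doubling the frame rate halves how long each frame is exposed, and with it the blur recorded in each frame.

```python
def exposure_time(fps, shutter_angle=180.0):
    """Per-frame exposure time in seconds for a given shutter angle.
    A 180-degree shutter is the conventional default, assumed here."""
    return (shutter_angle / 360.0) / fps

for fps in (24, 48):
    print(f"{fps} fps: {exposure_time(fps) * 1000:.1f} ms exposure per frame")
# 24 fps: 20.8 ms exposure per frame
# 48 fps: 10.4 ms exposure per frame
```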

While the CinemaCon footage may have suggested the direction in which Jackson is heading, he added, “People haven’t experienced it yet in the way it should be experienced.”

In contrast to the first wave of skeptical tweets, a sampling of reaction from exhibitors, studio executives and producers at CinemaCon found many saying that 48 fps represents the wave of the future.

At one panel, Regal CEO Amy Miles said her circuit is committed to 48 fps, and that technological advances are key to a thriving exhibition business. “We have to evolve. We want to make sure we can deliver options. At Regal, we will be ready and able to provide that experience,” Miles said.

Producer Neal Moritz (Fast Five, 21 Jump Street) likened the reaction to the 48 fps footage to the first reactions to digital cameras, which weren’t initially embraced by many filmmakers. “Now, every filmmaker wants digital. It just takes getting used to, and this is no different,” he said.

IMAX Filmed Entertainment chairman Greg Foster said it amounts to a generational issue, and that all the kids who have grown up watching digital television find it easy to accept movies projected at 48 fps.

“It’s older people like me who will have a reaction, but there is no doubt it is here. Every filmmaker likes it,” Foster said. “And what a great way to start with The Hobbit.”

Sony Pictures Entertainment vice chairman Jeff Blake agreed that a technological gain such as the move to 48 fps “revitalizes” the business.

A year ago at the 2011 edition of CinemaCon, James Cameron argued that higher frame rates offer an “enhanced sense of detail” and “enhanced clarity,” and he demonstrated the potential with test footage that was generally well received. Cameron showed those clips in comparison with the same scenes in 24fps where motion blur and other artifacts were visible.

But Jackson doesn’t believe 48fps is right for every movie, and he even proposed that different frame rates be mixed into a single film. “As another creative tool, I think (high frame rates) is a really important thing,” he said.

He also believes such options are important for exhibition. “As an industry there is a certain amount of trouble that we are in; kids seem to think watching a movie on an iPad is an okay thing to do,” he said. “Advocating that we have to stick with what we know [24 fps] is, I think, a slightly narrow-minded way of looking at things when as an industry we are facing declining audiences. We have to find ways to make it more vibrant, more immersive – something that will encourage people to come back to the theaters for that experience.”

Baz Luhrmann previews footage of DiCaprio-starring “Great Gatsby”

PanARMENIAN.Net – Director Baz Luhrmann insisted that F. Scott Fitzgerald’s classic tale The Great Gatsby, first published in 1925, is “possibly even more relevant [today] than when it was written” as he introduced scenes from his new 3D film version at the CinemaCon exhibitors’ convention in Las Vegas, The Hollywood Reporter said.

Presenting the footage during the studio’s slate presentation Tuesday, April 24, Warner Bros. Pictures president Jeff Robinov said that Luhrmann uses 3D in the movie to increase the “emotion, intensity, power” and the “immersive” feel of the drama.

In the debut clips, viewers saw Luhrmann’s use of 3D in everything from the flamboyant style of a party at Jay Gatsby’s mansion to close-ups, including intimate scenes between Gatsby (Leonardo DiCaprio) and Daisy (Carey Mulligan).

In a videotaped message, Luhrmann said he wanted to show the “spectacle” of what he is aiming to create, introducing a series of “untouched” 3D clips from the film that had not yet been color corrected and whose VFX had not been completed.

The Great Gatsby was shot in 3D using Red Epic cameras and 3ality Technica rigs.

The film opens Christmas Day.

Douglas Trumbull Fuels New Filmmaking Paradigm with NVIDIA Quadro GPUs


Industry visionary Douglas Trumbull has been pioneering new technologies and filmmaking techniques since 1968, when he served as special effects supervisor on 2001: A Space Odyssey. Recipient of this year’s prestigious Gordon E. Sawyer Award from the Academy of Motion Picture Arts and Sciences, Trumbull has also been nominated for three Oscars (for his work on Close Encounters of the Third Kind, Star Trek: The Motion Picture and Blade Runner) and won an Academy Scientific and Engineering Award in 1993 for the development of the Showscan Camera System – a revolutionary 65mm film format shot at 60 frames per second (fps). He is also credited with developing several technologies for IMAX, including IMAX Ridefilm, the company’s immersive simulation rides installed at entertainment venues and theme parks around the world.

Ever since the development of Showscan in the early 1980s, Trumbull has harbored a dream of high-resolution, high-frame-rate production and distribution.

Back then, it was an idea that was a bit before its time, but today, with the advent of digital cinema and technologies like NVIDIA Quadro professional graphics processing units (GPUs), that vision is now becoming a reality.

“The power that’s available through GPU computing and the current generation processors is having a radical impact on overall production costs,” said Trumbull.

Trumbull envisions a production pipeline where directors can shoot in a greenscreen virtual-set environment and almost immediately review the shot with all of the VFX elements rendered in real time, in stereoscopic 3D running at 120 fps.

He has been pioneering this new film production paradigm – a concept he has loosely dubbed “Hyper Cinema” – and experimenting with a combination of virtual-set technologies and real-time, high-frame-rate stereoscopic display technologies.

Trumbull turned to long-time collaborator Paul Lacombe, CEO of UNREEL – an augmented-reality and virtual-set systems integrator and developer – to help realize the Hyper Cinema concept and design technologies for the new studio. UNREEL has deployed virtual sets at numerous broadcast facilities, ranging from top broadcast and cable networks like CBS, ESPN and CNBC to local TV broadcasters across the country. While the challenges of 4K, stereoscopic 3D, 120 fps and photorealism are unique to the film industry, the concept is analogous. It all boils down to computational horsepower.

Trumbull has recently built a state-of-the-art production studio on his property in Berkshire Hills, MA. The studio features an 80-foot greenscreen stage and an adjacent screening room equipped with a special hemispherical high-gain Torus film screen, where he can review high-resolution, 3D footage rendered out in real time or near real time. The screen is curved to reflect more light back to the audience and offers a much wider field of view than traditional screens, delivering a more immersive experience.

Trumbull’s facility combines an impressive array of cutting-edge technologies ranging from virtual-set, motion-tracking and motion-analysis systems to 3D stereolithographic printers for creating physical miniature models.

“As I swing back into directing and producing films, it’s a place to do a lot of frontier-pushing experiments, where we can shoot and then screen material immediately in this new format,” explained Trumbull. “It also allows us to experiment on stage and see the results immediately on the screen to evaluate if the camera angle is correct, the movement of the camera is right and the actors are framed properly.”

He stressed that “the mantra around here is that, if it’s not real time, then it’s got to be near real time, which means we see the live-action foreground and computer-generated background, so that we can make aesthetic and editorial judgments and proceed with production very rapidly. I really like working with real actors and actresses, real makeup and real wardrobe, but since you can superimpose anything in real time, my rule is: no real sets, no real locations, just real people. It allows me to shoot everything on one stage, in one place.”


Trumbull’s “Hyper Cinema” workflow raises some immense challenges. Keying and compositing live-action footage shot on a virtual set with a mixture of CG and miniature plates requires intense processing horsepower throughout the pipeline. But when you increase the frame rate to 120 fps and resolution to 4K, and then double the amount of data for stereoscopic 3D, the challenges are even bigger. UNREEL has been working closely with Trumbull to tackle this problem.
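
To make the throughput problem concrete, here is a back-of-the-envelope estimate. The 10-bit RGB working depth is an assumption for illustration; the article specifies only 4K, 120 fps and stereo:

```python
# Uncompressed throughput for 4K stereo at 120 fps.
# 10 bits per RGB channel is an assumed working depth.
width, height = 4096, 2160
bits_per_pixel = 3 * 10        # RGB, 10 bits per channel
fps, eyes = 120, 2             # high frame rate, stereoscopic 3D

bytes_per_second = width * height * bits_per_pixel * fps * eyes / 8
print(f"~{bytes_per_second / 1e9:.1f} GB/s uncompressed")  # ~8.0 GB/s
```

Sustaining roughly 8 GB/s of image data – before any keying or compositing – is why Lacombe talks about GPU horsepower at every step of the pipeline.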

“Shooting live action at 120 frames and getting the throughput to project at 120 frames is actually harder to do than real-time CG, because in CG, you can limit your polygon count and limit your texture maps to get the throughput that you want,” explained Lacombe. “I try to take the horsepower that NVIDIA can provide and use it to dramatically cut costs. The net impact is that every step in the production process is accelerated.”

With live-action footage, one of the key computationally intensive tasks is debayering the raw image data from cameras like the RED EPIC, ARRI ALEXA or Phantom 65. There are also plans to add color correction to the pipeline soon.
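
For the curious, debayering (demosaicing) reconstructs a full RGB image from the sensor’s Bayer mosaic, in which each photosite records only one color. The sketch below is a minimal bilinear version for an assumed RGGB layout – illustrative only; the real pipelines for RED, ARRI or Phantom raw use far more sophisticated, GPU-accelerated algorithms:

```python
import numpy as np

def debayer_bilinear(raw):
    """Minimal bilinear demosaic of an RGGB Bayer mosaic.

    `raw` is a 2-D array of sensor values laid out as:
        R G R G ...
        G B G B ...
    Returns an (H, W, 3) RGB image. Simplifications: dimensions are
    assumed even, edges wrap around (np.roll), and zero-valued
    samples are treated as missing.
    """
    h, w = raw.shape
    rgb = np.zeros((h, w, 3), dtype=np.float64)

    # Scatter each photosite's value into its color plane.
    rgb[0::2, 0::2, 0] = raw[0::2, 0::2]  # red
    rgb[0::2, 1::2, 1] = raw[0::2, 1::2]  # green on red rows
    rgb[1::2, 0::2, 1] = raw[1::2, 0::2]  # green on blue rows
    rgb[1::2, 1::2, 2] = raw[1::2, 1::2]  # blue

    # Fill the holes in each plane with the mean of known 3x3 neighbors.
    for c in range(3):
        plane = rgb[:, :, c]
        known = plane > 0
        total = np.zeros((h, w))
        count = np.zeros((h, w))
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                total += np.roll(plane, (dy, dx), axis=(0, 1))
                count += np.roll(known, (dy, dx), axis=(0, 1))
        rgb[:, :, c] = np.where(known, plane, total / np.maximum(count, 1))
    return rgb
```

Doing this for every pixel of every frame is exactly the kind of embarrassingly parallel work that maps well onto GPUs.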

Trumbull is preparing to direct a science-fiction film to demonstrate the full potential of high-frame-rate production and his “Hyper Cinema” concept – his first feature film since 1983, when he directed Brainstorm (which was originally supposed to be shot in the Showscan format as a proof of concept).

“It’s a new cinematic language, which calls for different kinds of camera angles and movements, different selection of lenses, different kinds of action and framing, and a different editorial pace, because the result of 120 frames in 3D on a very bright screen is like being inside a movie rather than looking at a movie,” said Trumbull. “It’s a very intense, participatory experience. What I’m trying to do is show the industry how we could substantially improve the movie-going experience for the public.”

While he’s not prepared to reveal the name of the new film yet, he admits that he has tailored his script to make the most of this new medium and workflow paradigm.

“When you want to introduce a movie process like I’m trying to do, I feel that it’s incumbent on me to direct it myself. I’m trying to explore this new cinematic language where the audience feels as though they’re actually there with the characters and part of the adventure,” he said.

In fact, according to Trumbull, one of the key challenges the film industry faces today is developing a compelling in-theatre experience to reinvigorate slumping box-office numbers, which were at a 16-year low last year.

With younger audiences more likely to watch a film on their laptops than in a theater, Trumbull predicts that “we’re going to see a lot of turmoil in the exhibition business over the next few years as the industry transitions to a movie experience that’s so spectacular that you can’t get it on your laptop or smartphone, so that people come to a theater expecting something amazing, enveloping and powerful – all the things you can’t get on a small screen.”

Another key challenge is delivering that level of quality on a budget that is significantly smaller than the $200-million blockbusters that the major studios seem to bet on these days.

NVIDIA’s Quadro GPU technology has become a key enabler throughout the pipeline, impacting almost every stage of the production process.

“NVIDIA technology really comes into play as one of the big accelerators of this process,” said Lacombe. “We have NVIDIA Quadro cards in almost every computer on site, doing all kinds of graphics and parallel computing acceleration for high-throughput, high-bandwidth, 120 fps 3D material.”

That includes traditional VFX workstations running Autodesk Maya, 3ds Max and Rhino 3D, as well as a unique new adaptation of virtual set technology that encompasses camera tracking, keying and real-time rendering.

“We’ve got 16 motion analysis cameras up on the ceiling of the stage – the same type of cameras used on Avatar, or any other motion-capture system,” said Lacombe. “We use them to track our camera motion, so we can actually go into a virtual set with a hand-held camera and have an absolute lock between the foreground and background images with no slipping or sliding.”

“The rendering is all based on NVIDIA Quadro hardware,” explained Lacombe. “That can bring you to a certain level of real-time quality (1920 x 1080 HD), but now Doug wants to go to 4K and do 120 fps and do stereo, so he’s really pushing the limits. To do that, we’re capturing real-time metadata (timecode, camera tracking and the environment) and taking that into a ‘near time’ process where we can turn around results in a matter of minutes, so that Doug can say whether he’s going to buy a shot and move on, or re-shoot. He can quickly render out a shot at 4K stereo 120 fps using the GPU to accelerate the process.”

“The key now is to try to up-res the real time output and to take it to the next level,” Lacombe added. “That’s where GPU acceleration using Maximus is going to play a key role.” NVIDIA Maximus technology enables a single, general-purpose workstation to not only render complex scenes immediately – thanks to the NVIDIA Quadro GPU – but also to harness the parallel-computing power of the integrated NVIDIA Tesla C2075 companion processor to perform multiple compute-intensive processes simultaneously.

Trumbull explained that wherever possible, he prefers to shoot miniature models and composite them into a shot rather than relying on a purely CG environment. “For a synthetic environment, it’s much less expensive to build miniatures than do full photorealistic CGI,” he explained.

But Trumbull’s on-site model-making shop is a virtual one. Models are designed in modeling software like Autodesk 3ds Max, Maya or Rhino 3D and printed to 3D stereolithographic printers, which build a physical model with special polymers – a process that is also powered by an NVIDIA Quadro pro graphics card.

“We’re actually using NVIDIA products in the way we make the models, which is quite an interesting new way to go about it,” said Trumbull. “When we paint them and light them with real lights, they look like real things rather than synthetic computer graphics, so we can get rid of a lot of manual work in building miniatures.”

Trumbull explained that none of this would have been possible without a GPU-accelerated pipeline. In fact, he reported that: “The first time I was even able to see 120 frame 3D was with a Quadro-powered server.”

Doug Trumbull detailing the Moon Bus for 2001.

Trumbull explained that he plans to prototype and debug his film in the pre-vis stage, borrowing a technique from the animation industry where films go through several major revisions based on storyboards and animatics, long before the actors are even signed, or the animators start to work.

“Being able to work at a higher resolution with more realism is what drives down overall production costs,” said Trumbull. “We’re going to rehearse live actors in virtual environments – not our principal cast, but good actors – just to debug the whole movie, so we can make whatever iterative changes need to be made long before we build our real virtual sets or even hire a real cast.”

But even if the sets are rough and the cast is a placeholder, during these rehearsals, Trumbull will be recording metadata of all sorts, including DMX data from the lighting systems and motion control camera moves. He explained that at this stage, “We need to be very frank with one another. The internal creative process is that no one should feel intimidated to say, ‘that’s a crummy line of dialog’ or ‘we should delete that sequence.’ That leads to a very high quality in the finished product. I’m trying to bring that same method or point-of-view to live-action rehearsals.”

With this unique “Hyper Cinema” approach, Trumbull expects that, “when we come back to actually shoot the main production, we hope to get 50 or 100 setups per day and shoot the whole movie in a couple of weeks.” He anticipates that the efficiency of this approach will save as much as 75% of production costs compared with traditional workflows.

“We wouldn’t be anywhere near where we are now if it weren’t for the NVIDIA Quadro card,” said Trumbull. “It’s been a huge enabler for this whole philosophy, and fortunately for us, it just gets better and better. With these dramatic increases in performance, every stage of production is accelerated and the quality and cost of production continues to go down.”

Adobe CS6 review: Premiere Pro

Premiere Pro CS6 has a new interface that gives the top part of the screen over to the Source and Program monitors. The number of buttons has been reduced drastically – though you can add some or all of them back. The Project panel has 16:9 scrubbable thumbnails, to which you can add trim marks.

The improved Mercury Playback Engine delivers even more performance from multi-core processors, according to Adobe. It remains largely tied to Nvidia’s CUDA platform – where it now supports Tesla add-in boards for even greater performance – but OpenCL acceleration has been added, allowing the AMD graphics chips in the latest Apple MacBook Pros to be tapped as well. Other AMD cards won’t deliver the same performance benefits.

For input there’s ingesting and logging through Adobe Prelude, as well as augmented hardware I/O support, including for multicam editing. Previously this was limited to four cameras, but now it’s limited only by the speed of your computer. There’s new native support for direct editing of RED EPIC, RED SCARLET-X and Canon formats, so Premiere can deal with footage from all major high-end cameras at resolutions up to 5K.

There’s also tighter integration with other Adobe packages – projects can be sent to After Effects and SpeedGrade directly. This extends to output as well with the Dynamic Link Send to Encore CS6, allowing easy creation of Blu-ray, DVD and web video from the same project.

When it comes to editing, there are new trimming tools that support three trim modes: regular, which cuts the edges and edit points of clips without affecting anything else; roll trim, which edits the end of one clip and automatically moves the start of the next; and ripple trim, which moves everything up or down the timeline. Other new features include the Warp Stabilizer, which evens out wobbly handheld footage; adjustment layers that can be applied to any footage track; and the enhanced 64-bit Mercury Playback Engine (GPU-accelerated with a supported video card, with more cards supported in this release).
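
As a mental model of how the trim modes differ, think of clips as (start, duration) pairs on a timeline. The sketch below is a hypothetical toy model for illustration, not Premiere Pro’s actual API:

```python
from dataclasses import dataclass

@dataclass
class Clip:
    name: str
    start: int      # position on the timeline, in frames
    duration: int   # length, in frames

def roll_trim(timeline, i, delta):
    """Move the edit point between clip i and clip i+1: one clip
    lengthens, the other shortens; total program length is unchanged."""
    timeline[i].duration += delta
    timeline[i + 1].start += delta
    timeline[i + 1].duration -= delta

def ripple_trim(timeline, i, delta):
    """Lengthen or shorten clip i, then shift every later clip so
    no gap opens - the rest of the timeline 'ripples'."""
    timeline[i].duration += delta
    for clip in timeline[i + 1:]:
        clip.start += delta

timeline = [Clip("A", 0, 100), Clip("B", 100, 50), Clip("C", 150, 80)]
roll_trim(timeline, 0, 10)    # A ends 10 frames later, B starts 10 later
ripple_trim(timeline, 1, -5)  # B loses 5 frames; C slides back by 5
```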

There are new colour workflow options, too. The highlight is the improved Three-Way Color Corrector. It now allows for visual colour correction across the board for shadows, midtones and highlights, and you can switch workspaces to maximise the amount of space available to it.

Audio is another area that has been updated, with a redesigned Audio Mixer panel and an enhanced Meters panel. The Mixer layout has been improved to go with the colour-coded visual indication of peaks and troughs: double-clicking a fader resets it to zero, and there are options to show dynamic peak indicators that update regularly, or static peaks. Individual audio tracks can contain clips with an unlimited number of audio channels.

Premiere Pro CS6 gains two new – well, new to Adobe anyway – companion products: Prelude and SpeedGrade.

Prelude CS6 doesn’t ship with Premiere Pro CS6, but you do get it with Creative Suite 6 Production Premium or a Creative Cloud subscription. It’s a standalone ingest and logging application for working with file-based media. Large groups of captured media can be quickly brought in and catalogued, and then only the clips you need sent to Premiere Pro.

SpeedGrade CS6 is also only available as part of Creative Suite 6 Production Premium or a Creative Cloud subscription. Acquired as part of Adobe’s purchase of Iridas last year, it’s a professional-level colour grading tool similar to Apple’s Color, which used to ship alongside Final Cut Pro in Final Cut Studio. It looks to be essentially the same tool as Iridas’s – it doesn’t follow the same interface conventions as the rest of CS6, though considering its focussed approach, this could be a good thing.

SpeedGrade CS6 has been integrated with Premiere Pro CS6 to some degree, though, as the editor gains a Send to SpeedGrade command.

Premiere Pro CS6, Prelude CS6 and SpeedGrade CS6 are part of CS6 Production Premium and the Master Collection – plus Creative Cloud.

CS6 Production Premium costs £1,509, or from £298 as an upgrade. CS6 Master Collection costs £2,223, or from £397 as an upgrade.

Creative Cloud costs £38.11 per month with an annual contract. Current owners of CS3 or later suites or products can get this for £22.23 per month. On a month-by-month basis, Creative Cloud costs £57.17.

All prices exclude VAT. Adobe says CS6 will be out before May 22.

Follow our step-by-step Macworld Masterclass on Premiere Pro CS6’s new Three-Way Color Corrector tool.

The Hobbit Will Fundamentally Change Your Movie-Going Experience

According to an article over at the Hollywood Reporter, the upcoming release of The Hobbit will be the catalyst that finally drives movie theaters to upgrade their projection hardware to 48 frames per second (fps) or better. Current theater playback technology supports 24fps, a standard that dates back to the 1920s; this, in turn, was an upgrade from the 16fps that was common for earlier silent films. As video-game enthusiasts are keenly aware, a higher frame rate reduces jitter and provides much smoother video playback, and that translates to a more immersive and realistic experience overall. (And for the trivia buffs among you, the move from 16fps to 24fps was necessitated by the advent of movie audio — the original 16fps film speed moved too slowly past the sound head to produce quality sound playback.)

In recent years, theaters have been steadily converting from celluloid-film projectors to digital projection technology, although not without serious objection from such respected industry pundits as Roger Ebert. For those locations that have already gone digital, the upgrade to 48fps will simply involve a software upgrade and modest hardware investment.

Of course, improvements to playback capabilities also require changes to the original filming. As we saw in a previous posting here at GeekDad, Peter Jackson is using Red EPIC digital cameras to film The Hobbit. The move to high-resolution, high-frame-rate capture in turn creates challenges for the crew that the technically inclined GeekDad audience will appreciate:

A huge challenge across the board is the volume of data that is required for HFRs. Oatley reported that The Hobbit production shoots 6-12TB of camera data per day. And the shooting schedule (for both parts of the two-part film) involves 265 days of principal photography.

That amount of data means that modern film-making is a serious IT operation requiring massive data centers and an army of dwarves, er, geeks to run them.
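
For scale, the quoted figures imply petabytes over the full shoot. A quick back-of-the-envelope calculation using the reported range (illustrative totals, not production numbers):

```python
# Totals implied by the reported 6-12 TB/day over 265 shooting days.
days = 265
for tb_per_day in (6, 12):
    total_tb = tb_per_day * days
    print(f"{tb_per_day} TB/day x {days} days = {total_tb:,} TB "
          f"(~{total_tb / 1000:.1f} PB)")
# 6 TB/day  -> 1,590 TB (~1.6 PB)
# 12 TB/day -> 3,180 TB (~3.2 PB)
```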

Oh — and as if Peter Jackson and The Hobbit don’t provide incentive enough for this projector upgrade, some guy named James Cameron is also pushing for the same thing in preparation for a little project of his called Avatar 2.

Head over to the Hollywood Reporter for the full story.

Just a Little More Trivia:
At the moment, video playback in most theaters is 2048×1080 at 24fps, which is pretty close to Blu-ray’s 24fps, 1920×1080 progressive-scan video. Of course, frame rate is distinctly different from refresh rate, so that “120Hz” number the salesman quoted you has nothing to do with what we’re talking about here. Don’t worry though — The Hobbit will look great on your home theater in 2013. And if you need to obsess about frame rates, refresh rates, and digital cinema, the Wikipedia articles serve as a good starting point.
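
And if you want to check that “pretty close” claim yourself, the arithmetic is short:

```python
dci_2k = 2048 * 1080   # 2,211,840 pixels per frame
full_hd = 1920 * 1080  # 2,073,600 pixels per frame
print(f"DCI 2K carries {dci_2k / full_hd - 1:.1%} more pixels than 1080p")
# -> about 6.7% more
```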

NAB Report: 4K Production Comes of Age

The Canon C500 with an AJA Ki Pro

I recently read an opinion – by a member of the ASC, no less – that while 4K images on 4K displays looked stunning, that degree of sharpness wouldn’t necessarily be popular among actors, especially actors who are, shall we say, not in their early twenties anymore. Personally, I think the increased availability of 4K acquisition is great, if only because big sensors allow oversampling, and oversampling allows problems like noise to be greatly minimized, regardless of how much of that sharpness you actually allow to emerge in the final image.

Positive as it is, though, the release of more 4K cameras isn’t something that would have been hard to predict. We knew JVC had their 4K GY-HMQ10 before IBC, and Canon tell me that the C500 was actually the first design they came up with, and the C300’s extra downscaling and compression options were an afterthought.

I don’t want to dwell too much on what Canon is doing, as it’ll have been covered massively elsewhere, but it’s worth considering the essential illogic of the C300/C500 pricing structure. The C500 simply has the option to do less to the sensor data, but the price is actually higher. This is nothing particularly new; cheap cameras notoriously suffer from permanently active automatic systems for things like audio gain and exposure, although that isn’t a direct parallel given the amount of data involved in sampling a 4K sensor. Either way, it’s hard to shake the impression that you’re paying more for less – or at least less, technologically speaking. Paying someone money not to do something feels like extortion, but I suppose it’s just a pricing model based on performance rather than the actual cost of manufacture.

Sony announced a 4K high frame rate camera – the NEX-FS700.

Perhaps this impression is provoked by the price point of the C500, which could be taken as high for a DSLR replacement or low for an Alexa replacement, but which is ultimately difficult to interpret because we haven’t really seen any objective tests of the thing. The C300, which uses the same sensor, is an excellent camera, but it’s impossible to tell how much of that is due to the 2:1 downsampling of the sensor, and how much extra noise will be visible in the unprocessed sensor output. More, unavoidably, but how much is tricky, and it’s this factor that will determine whether the C500 can compete with other 4K devices. In particular, it’ll be interesting to see how it compares to Red’s more expensive offerings, which were, historically, not known for their particularly stellar noise characteristics.

One interesting aside with respect to Canon and higher-end cameras is the corporate restructuring provoked by the reaction to the venerable 5D Mark II. As I once wrote for this publication, Canon seemed genuinely surprised at the success of that camera and their stills division personnel assumed a touchingly rabbit-in-the-headlights posture during at least one NAB shortly after the camera was released. Recognizing the problems created by a frankly shocking lack of communication between the stills and video people, Canon tell me they’ve now reorganized such that the professional imaging division – including both still and motion picture products – is now a single department. The intention is that this will avoid embarrassing repeats of the “what do you mean, 24P?” situation provoked by the 5D2’s great but somehow inexplicably patchy launch spec.

Still, there’s always the 1DC to fall back on if you must have 4K and don’t have $30,000. I’m a little cautious about the MJPEG codec, which is primitive by modern standards, even compared to other simple DCT codecs like ProRes and DNxHD. The 1DC throws a lot of bitrate at it, and the engineering decision is simple – there’s no way to put a high-quality 4K H.264 encoder in a package with the size, weight, cost and power consumption of a DSLR. Still, I’m unconvinced by this engineering choice, and I suspect serious 1DC users will prefer offboard recording.

Speaking of offboard recording, AJA announced a 4K version of their Ki Pro recorder which they call the Quad, to be released in a few months. While their push at launch was for a pairing with the C500, I suspect that it might soon grow to work with other 4K devices – even things like JVC’s 4K offering, although that has four HDMI outputs and I’m not sure what the connectivity options would be. I certainly like the C500 plus Ki Pro Quad option, as a vastly more usable 4K camera system than anything Red have ever dreamed of. Again, it really comes down to the performance of C500. The competitive factor with Red is one really good reason to hope both Canon and AJA are successful, anyway, because that C500 configuration is going to be considerably cheaper than an Epic outfit and should offer somewhat competitive performance.

The wider effect of all these flash-based recorders is that things like the Sony SR-R1, designed for use on the F-series cameras, are starting to look very expensive. Fine, the Sony recorder has specialist integration with speed ramping and other features on the F cameras, and its reliability and performance will doubtless be beyond question, but at some point you’re paying a huge amount of money for a logo. It’s not even as if the leading manufacturers are really offering increased image quality. It’s a bugbear of mine that Panasonic’s AVC-series codecs, Sony’s SR tape and flash formats, and many independent manufacturers all use variations on MPEG-4, but variations which are mutually incompatible. Panasonic announced AVC Ultra at the show, a 12-bit codec with 4:4:4 RGB capability to be used in conjunction with its upcoming 4K Varicam. AVC Ultra is clearly intended to compete directly with codecs used by Sony on the SR Memory cards, so that’s yet another variation on MPEG-4 that isn’t compatible with anything else. Perhaps if people realized that just tweaking the encoder settings on an MPEG-4 codec isn’t something you could reasonably describe as innovative or clever or difficult to do, they wouldn’t be so impressed, and people wouldn’t keep reinventing the same codec.

Of course, politics and vendor lock-in are all huge factors here, but it seems terribly unsatisfactory that we can throw desktop computers together out of a collection of hugely complex parts from various manufacturers, but we can’t edit video without camera-specific modifications to the edit software. Ultimately, there are advantages for everyone to a unified workflow, because users of a particular NLE that doesn’t like one particular camera won’t overlook that camera. I hope, but don’t for a second expect, to see this situation improve.

Blackmagic Design launched a new camera at NAB 2012.

And to wrap up the most obvious, show-stopping stuff: perhaps one day Grant Petty will decide what Blackmagic Design, as a company, is actually for. Right now, I’d hate to be one of their competitors, because it’s impossible to tell what they’re going to do next. They’re certainly fast. It’s not directly camera-related, but they bought Teranex at the end of last year and have completely re-engineered their flagship standards converter device to use more modern silicon. This is more or less what they did with Da Vinci: purchase an ailing company, re-engineer its products, and sell them for vastly less money – it’s just that with Teranex they did this to a complex hardware device in thirteen weeks. Whatever Blackmagic are, they’re certainly embedded systems engineers.

This is an example of something that’s bothered me for years. Da Vinci, for instance, was a company almost entirely dependent on a very particular type of hardware engineering, and chose to ignore the warp-speed advances in computer graphics that would make that hardware obsolete and ultimately lead to their downfall. Like Teranex, what they were doing was not complex, it simply had to be done quickly, and once the silicon gets faster, the whole thing becomes trivial. There is a horrible tendency for companies that see themselves as high-end to do this; Avid and Pro Tools were and to some extent still are strange and exotic pieces of software with odd user interfaces and features stuck on with duct tape and crazy glue. More recent development tends to be easier to learn and use and cheaper to boot. To some degree, any application that’s used by people who are, for instance, editors first and computer users second, will suffer from a corporate reticence to change the things and risk alienating the user, but as Da Vinci found, that sort of desperate elitism is not indefinitely sustainable.

Or, to put it another way, independent filmmakers really can’t claim any more that it’s the lack of access to good, affordable camera and postproduction equipment, often with high-end performance, that’s making them suck. But I digress.

To touch on the wider issues surrounding Blackmagic and their string of acquisitions, they claim it’s not particularly expansionist corporate policy; more simply happenstance that attractive situations arose. Still, it’s difficult to consider the company without wondering what they’re going to buy next. I have visions of the future in which it’s possible to eat a quarter-pounder with cheese at BMD Burger.

Anyway, yeah, the camera. The numbers look good, especially compared to the price, but I’m slightly cautious. It has a rolling shutter – a really, really rolling shutter, perhaps as bad as some DSLRs although it’s difficult to be objective. It’s 2.5K pixels wide on a fairly small sensor, which is not incomparable to the density of photosites on the original Red sensor, which was noisy. They claim 13 stops of dynamic range and the sensitivity options in the menu go up to 1600, but until someone has the opportunity to shoot objective tests, it’s impossible to really evaluate. Blackmagic don’t seem to object to the idea that the camera is basically a Hyperdeck Shuttle recorder with a sensor slapped on the front, so presumably much of the system software is common and will be reliable.

Other than the currently unknowable imaging-performance questions, there are three problems: the power connector is a nasty, feeble, domestic DC jack; the case is shiny rather than black; and it reverses out in the middle of bright highlights. The highlight-reversal issue is very likely to be solvable in firmware, the shiny-case problem is a forehead-slapping criticism that was received with “Of course! We’ll get right on that,” and the DC connector… well, yes, I know Lemos are expensive, but this really isn’t good enough, guys. It’s a camera, not an editor’s toy in a nice cosy post suite.

Otherwise, the only concern is about lens mounts. The mount on the sample I handled seemed to be a separate chunk of metal bolted on the front, presumably on the basis that they could then make others. The first mount available is EF, which is probably reasonable given the huge user base of those lenses. I would strongly suggest that the second mount should be Micro 4/3 – not because it’s particularly useful in itself, but because it can be readily adapted to more or less anything. They could even sell it with an approved adapter for the customer’s choice of mount.

The final big-camera news is Sony-related. There’s the usual yearly crop of new handycams and small, individual-reporter stuff which is really indistinguishable from what Canon and JVC do in the same market segment. What’s interesting is their announcement of a 4K, high frame rate camera, the NEX-FS700, capable of hundreds of frames a second at various resolutions. Really something like this boils down to a price-to-noise ratio; it will be expensive or it will be noisy, and the balancing act of that and your requirements really determine how useful it is.

Memories of Sony at NAB 2012 are dominated by something different, though, and not something that’s new. The F65 camera is not new and neither are the Trimaster series OLED displays. Both are individually excellent, and not horrendously expensive, at least for Sony products. In combination, though, I think these technologies do finally put the lie to the idea that film looks alive and video looks dead. The new ACES-related color handling helps; grading helps, lighting is critical. But ultimately, we’ve finally got to a point where every frame of this stuff can, with care, end up looking like high-end studio fashion photography, and it is beautiful, and beyond all the products and the numbers, that’s what it’s all about.

NAB: RED Ups Ante with 6K Dragon as Canon, Sony and Panasonic Join 4K Camera Wars

LAS VEGAS — As the NAB (National Association of Broadcasters) Show officially starts Monday in Las Vegas, the major camera manufacturers have already kicked the show into full gear, and this NAB is all about 4K (4096 x 3072) – the next standard for displaying high-definition content, with roughly four times the resolution of 1080p (1920 x 1080). At their respective press conferences today, Canon demoed its recently announced Cinema EOS C500 and 1D C cameras and showed two short films shot on the new cameras, while Sony showed off its NEX-FS700 for the first time – a 1080p camera that will be 4K-ready via a future firmware upgrade and an external recorder. Panasonic showed off a 4K Varicam prototype camera and announced its AVC Ultra codec for supporting 4K workflows in its camera ecosystem. Meanwhile, RED, the pioneer in 4K cameras, kept a couple of steps ahead of the competition: CEO Jim Jannard posted an announcement late Sunday night on its RedUser.Net forums giving specs, availability and pricing for the new 6K (6000 x 4000) Dragon sensor, which will be a $6,000 upgrade for the EPIC camera, with an as-yet-undisclosed upgrade price for the Scarlet.

The 4K wars started at NAB 2006, six years ago, when RED came out of nowhere and introduced its RED One 4K camera at a then-unbelievable $17,000 entry price. At the time, Sony’s top-end digital cinema cameras were selling for more than $200,000. While many industry skeptics doubted that RED could deliver, in 2007 it shipped the first batch of RED One 4K cameras. Since then, RED has come out with the 5K EPIC and Scarlet cameras, which have been or are being used for movies such as Peter Jackson’s upcoming Hobbit films and the 11-time Oscar-nominated Martin Scorsese film Hugo.

RED had hinted at the Dragon sensor before, but Jannard’s post revealed specific information for the first time. The Dragon sensor will offer 6K resolution with 15+ stops of native dynamic range and will do full 5K resolution at 120 frames per second; 6K recording will run at a minimum of 85fps, according to Jannard. He says the Dragon sensor itself “is slightly larger than the Mysterium-X sensor and the pixel size is slightly smaller. The pixel design and sensor design makes all the difference in the world … Most all of your current lenses will work.” Light performance is also better than the Mysterium-X’s, he says, with a 30 x 15.8 mm sensor and 5 micron pixel size. Availability will be some time in 2012 for EPIC upgrades and 2013 for Scarlet upgrades.
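
As a sanity check, the quoted sensor dimensions and pixel size let you derive the photosite grid directly (treating both figures as exact, which is a rounding assumption):

```python
# Photosite counts implied by the quoted 30 x 15.8 mm sensor
# and 5 micron pixel size (both figures taken as exact).
sensor_w_mm, sensor_h_mm = 30.0, 15.8
pitch_um = 5.0

cols = sensor_w_mm * 1000 / pitch_um  # 6000 -> the "6K" width
rows = sensor_h_mm * 1000 / pitch_um  # 3160
print(f"~{cols:.0f} x {rows:.0f} photosites")
```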


Meanwhile, Canon showed off its two new 4K Cinema EOS cameras at an exclusive screening at the Palms Hotel, running two short films shot with them. The first camera, the EOS 1D C, is equipped with an 18.1-megapixel full-frame 24mm x 36mm Canon CMOS sensor that records 8-bit 4:2:2 Motion JPEG 4K video to the camera’s CF memory card at 24 frames per second (fps), or Full HD 1920 x 1080 video at selectable frame rates from 24p to 60p. It will be priced at around $15,000. The higher-end C500 can record up to 4096 x 2960 resolution for cinema or 3840 x 2160 Quad 4K (4 x 1080p) for TV; both of these modes offer 10-bit 4:4:4 at 60 frames per second. There’s also a half-RAW option, at 4096 x 1080 or 3840 x 1080 resolution at 120 fps. The C500, priced at around $30,000, captures 4K via an external recorder through its 3G-SDI outputs, or 1080p to its dual internal CF slots. Both cameras support Canon Log gamma, giving colorists more versatility in postproduction by capturing additional information and a higher dynamic range. No availability date was given for either camera, though both are expected some time in 2012.
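
Those C500 figures also explain why 4K capture goes to an external recorder: even the smaller quoted mode far exceeds what a single ~3 Gb/s 3G-SDI link carries. A rough estimate, ignoring blanking and container overhead:

```python
# Uncompressed payload for the quoted Quad 4K (3840 x 2160)
# 10-bit 4:4:4 mode at 60 fps; blanking/overhead ignored.
width, height = 3840, 2160
bits_per_pixel = 3 * 10   # 4:4:4, 10 bits per channel
fps = 60

gbps = width * height * bits_per_pixel * fps / 1e9
print(f"~{gbps:.1f} Gb/s uncompressed")  # ~14.9 Gb/s vs ~3 Gb/s per link
```

Hence the plural “3G-SDI outputs” – moving that much data takes multiple links, compression, or both.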

Jiiva joins Vikram

Vikram, who is presently shooting for Thaandavam, will also be shooting for Bejoy Nambiar’s David. We also hear that Jiiva has signed up for the film, which is said to be a gangster story with comedy as its USP. A few scenes have already been shot in Mangalore and Alleppey.

The film is being produced by Reliance and will be made in Hindi and Tamil. It features Vikram, Jiiva, Isha Sharvani and Tabu in important roles. There are also rumours that Bollywood superstar Shahrukh Khan might be roped in for the project. Rathnavelu and Madhi are handling the Red Epic cameras.