Laser Projection at 55K Lumens, HFR, 48FPS - The Future of Cinema?

April 25, 2012


This week is CinemaCon in Las Vegas, the yearly tradeshow for the movie theater industry - the exhibitors on one side, and the movie studios on the other. While we do get presentations from Hollywood showcasing their upcoming movies, this is also a major convention for technology and for looking toward the future. Ever since the controversial, divisive showcase of The Hobbit footage in 48FPS earlier this week, everyone has been buzzing about how awful it looked and whether 48FPS really is the future. But today I saw a laser projection demo and whoa - this is the future I cannot wait for.

This morning's laser projection demo was hosted by Barco, one of the big manufacturers of cinema projectors. As this was one of the first times they've ever demoed prototype laser technology for the "public" (and not just industry execs), I was very eager to get a glimpse of what it would look like, and whether it would be the industry gamechanger I've been hoping for. Last year, when James Cameron personally demoed both 48FPS and 60FPS, I was convinced that might be the gamechanger (though now I'm a bit unsure after The Hobbit footage). This laser demo today, however, makes me feel like this is definitely the future of digital cinema, even though we won't really see laser systems being installed en masse until late 2013, at the earliest.

So why lasers (frickin' lazerz, man)? Using a laser light source lets projector makers keep improving screen brightness and image quality, and deliver perfectly uniform brightness on huge screens upwards of 70 to 100 feet wide. Barco explained that the prototype equipment in today's demo was putting out 55,000 lumens (a lumen is a measure of the total "amount" of visible light emitted by a source), which is more than some current world records. Many have complained that brightness is an issue with 3D, but this takes brightness and clarity to an entirely new level, as was evident in the footage we saw.
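As a rough back-of-the-envelope sketch of what 55,000 lumens means on a big screen: luminance in foot-lamberts is approximately lumens divided by screen area in square feet, times screen gain. The screen size and gain below are illustrative assumptions, not Barco's demo specs:

```python
# Rough screen-luminance estimate. SMPTE's commonly cited target for 2D
# cinema is about 14 foot-lamberts (fL); the screen size and gain here
# are illustrative assumptions, not figures from Barco's demo.
def foot_lamberts(lumens, width_ft, height_ft, screen_gain=1.0):
    """Approximate luminance in fL: lumens spread over the screen area."""
    return lumens * screen_gain / (width_ft * height_ft)

# A 100-foot-wide 2.39:1 "scope" screen is roughly 42 feet tall:
print(round(foot_lamberts(55_000, 100, 100 / 2.39), 1))  # ~13.1 fL
```

On a matte (gain 1.0) screen that size, 55K lumens lands right around the usual 2D target brightness - which is why a single unit at this output is such a big deal for giant screens.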

Barco demoed a couple of different clips; one of them included 4K footage from the documentary Samsara, which was spectacular. I could see every last little detail as far back as the image went, and it looked stunningly bright, but not too bright. We also saw a demo of 48FPS 3D footage of a helicopter flying above Los Angeles, shot on a RED camera - it literally looked like a window onto the sky where it was flying. Finally, they showed the Burj Khalifa wall-clinging scene from Mission: Impossible - Ghost Protocol, and honestly I think it looked even better than when I saw it in IMAX at its release (IMAX is also working on its own laser projector).

I learned today that laser and 48FPS can be used together; it's not one or the other. In fact, based on the info they were providing, it sounds like laser projection systems will simply be the upgraded projectors that cinemas begin installing in late 2013 and on into 2014. So rather than proclaiming that laser projection alone is the future, the two can be combined - and by the time we actually start seeing 48FPS (HFR) movies in 3D on laser projectors, we'll truly be entering a new era of cinematic image quality.

Barco Laser Projector
Barco's laser projector from January, image via Commercial Integrator

The future of cinema, right? Well, that seems to be the big question. What is the future of cinema? Is it going to be high frame rates like 48 and 60FPS? Or is it going to be lasers? Or is it going to be both, or neither? Will audiences accept HFR, or will they revolt like a lot of people here at CinemaCon? Will lasers become a consumer choice, or will they eventually be the industry-standard light source in all projectors? Only time will tell, but I can definitely say that what I saw of laser projection was fantastic. I want to see every film projected with lasers, on a screen that big and that beautiful. That's how I love to watch movies.




I'm glad people didn't like the Hobbit video in 48fps. 35mm and 24fps is still what looks best and works best.

Ryderup on Apr 25, 2012


"35mm and 24fps is still what looks best and works best." So that's it? There should be no attempts at innovation because of a standard set over 100 years ago? Maybe The Hobbit presentation was a misstep, but at least people are trying something new.

axalon on Apr 25, 2012


 Why mess with something that works?

Xerxexx on Apr 25, 2012


Human nature? Otherwise we'd still be carving movies into a wall..

quack on Apr 26, 2012


At one point mailing letters was a perfectly fine way for two people to communicate - it worked, so why bother changing it? Then phones came along; now we could speak to each other in real time, so surely we didn't need anything after that. Then the internet, cell phones, texting, etc. The point is, people are always "messing" with something that works; it's how new discoveries are made.

axalon on Apr 26, 2012


If theaters have to drop and re-drop $75,000 per screen year after year, movie prices will continue to skyrocket and mom-and-pop theaters will fold. These technical improvements hardly improve the films - indeed, movies suck MORE THAN THEY EVER HAVE. Rather than theaters being forced to invest in new projectors, studios should invest in writers.

clapper on Jun 5, 2013


I'm happy to see new things being tried. Sometimes the new things work out to be better. Buggy whips got replaced with the accelerator pedal. Matte painting done with a paintbrush was great, but you may not want to try a dolly or crane shot that way. There was a time when people swore that solid-state digital cameras could never replace tube-based analog cameras. Turned out to be a lie. Just a couple years back some people freaked whenever someone claimed to shoot or process a film on electronic devices. I responded that if that was going to be the rule, then anything shot or modified or displayed with an electronic device would have to be called a video. They didn't like that very much. 24fps was chosen for only one reason: it was the slowest speed that could still give a sense of smooth motion, and even then you had to be very careful with pans or lateral motion within the scene. Plus, 24fps has horrible flicker. They had to split the shutter on the projector into at least 2 blades to get the flicker rate up to 48Hz. Even that was too slow, so using 3 blades got you up to 72, which was tolerable to most.

David on Apr 26, 2012
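The shutter-blade arithmetic in the comment above is simple to sketch - each frame is flashed once per blade, so flashes per second is just fps times blade count (the function name is mine, for illustration):

```python
# Flicker arithmetic for a film projector: a multi-blade shutter flashes
# each frame multiple times, raising the perceived flicker rate without
# changing the 24 fps frame rate.
def flashes_per_second(fps, shutter_blades):
    """Each frame is shown once per blade per frame period."""
    return fps * shutter_blades

print(flashes_per_second(24, 1))  # 24 - visibly flickery
print(flashes_per_second(24, 2))  # 48
print(flashes_per_second(24, 3))  # 72 - tolerable to most viewers
```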


Change != innovation. That's like saying "if a car is great with 4 wheels then 16 would be 4x as great!!!" According to you, suggesting otherwise is rejecting innovation.

Geoffrey Shauger on Apr 25, 2012


Right, but there *are* vehicles that benefit from having 16 wheels and they serve a very specific purpose. Just like 48fps potentially can. I'm not saying we should up and leave 24fps, far from it. 24fps has been the standard for over a hundred years, it's nice to see something new.

axalon on Apr 26, 2012


Show me a car with 16 wheels...not all vehicles are cars

Geoffrey Shauger on Apr 26, 2012


How about a semi-truck? Which can actually have up to 18 wheels. It's a vehicle with a very specific purpose.

axalon on Apr 26, 2012


A semitruck isn't a car.  But using your logic if a semitruck is great with 18 wheels then innovation means we should double it to 36 and it would be even better...right?

Geoffrey Shauger on Apr 26, 2012


No, it means in the gamut of vehicle options having 16 (or 18) wheels serves a very specific purpose. (e.g. towing large amounts of cargo) The same could be said for FPS. Read Underscore's post below, he puts it perfectly - 24fps and 35mm were compromises.

axalon on Apr 27, 2012


Framerate: 24 fps was standardized when sound was added to the picture. It was the minimum (i.e. cheapest) framerate where the sound still appeared to sync with the action on screen. I believe motion smoothing in TVs has unfairly demonized high framerates. The problem is not the frame rate but what the TV does to create the smoothing effect. For example, a 240 Hz TV tries to make 24 frames become 240 frames by comparing each frame to the next and inventing in-between frames. That's right: it's a $200 chip in the $1500 TV creating 9 out of every 10 frames you're watching of that $200 million movie. If the filmmakers didn't want the motion blur, they would have shot it differently. The motion blur is there because they want it to be, not because of some limitation of 24 fps. The problem is, your 240 Hz TV tries to undo that specific choice made by the cinematographer by sharpening the blurred areas and trying to remove the blur. This results in a "shot on video" look, and it also makes CG look really bad. Your TV is applying a filter and CHANGING the image. If it was shot at 240 fps and played back at 240 fps (with no cheap TV hardware altering the source image), the motion blur would be completely intact; only the stutter would be missing.

35mm: It was only the standard because 16mm looked too cheap and 70mm was too expensive. If money hadn't been an issue, the last 50 years would have seen (at least) 70mm as the standard. If you get a chance to see a screening of something like Patton on an original 70mm print... WOW.

Bottom line: 35mm and 24fps aren't magic; they're compromises. 24fps smoothed to 240Hz looks terrible. That doesn't say anything about higher-fps source material when it's set up properly.

Underscore on Apr 27, 2012


Well said, I couldn't agree more.

axalon on Apr 27, 2012


Well said, but I do have a couple of questions and observations. Isn't the 240 Hz refresh rate a multiple of 60 Hz, the de facto refresh rate of first-generation LCD screens? You know, 60, 120, 240, doubling every time. The thing is, you can't divide 60 evenly by 24, so the processor has to do uneven division, which creates stutter, right? A 60 Hz interlaced display provides 30 FPS, which doesn't line up with a 24-frame source, so some averaging has to be done: manufacturing transitions between full frames, across the period when a projector's shutter would black out the image to advance to the next frame. What 120 Hz did was provide even division by 24 (5 refreshes per frame) - though that was incidental to doubling the line-current-related 60 Hz rate - and it did away with smoothing through interpolation. What 240 Hz does is, again, provide even division and thus no interpolation stutter: smooth motion, including SMOOTH BLUR, with no stutter. In other words, it presents the intended blur smoothly, instead of in jerky motion. And as far as high-rate TV goes, commercial TV cameras are capable of capturing motion in excess of 480 fps; there is no reason to adapt that to a 24-frame rate - that would be several steps backward. That is why motion on a hi-def plasma screen looks so much better than on a hi-def LCD screen.

DutchTreat on Aug 27, 2012
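The divisibility point in the comment above can be sketched: a refresh rate that divides evenly by 24 repeats every film frame the same number of times, while 60 Hz forces an uneven repeat pattern (pulldown judder). A minimal illustration, with a made-up helper name:

```python
# How many panel refreshes each 24 fps source frame occupies, for a few
# refresh rates. Uneven patterns mean pulldown judder; even patterns mean
# each frame is simply held for the same number of refreshes.
def repeat_pattern(fps, refresh_hz, n_frames=4):
    """Refreshes allotted to each of the first n_frames source frames."""
    pattern, shown = [], 0
    for i in range(1, n_frames + 1):
        target = (i * refresh_hz) // fps  # cumulative refreshes after frame i
        pattern.append(target - shown)
        shown = target
    return pattern

print(repeat_pattern(24, 60))   # [2, 3, 2, 3] - uneven: judder
print(repeat_pattern(24, 120))  # [5, 5, 5, 5] - even, no interpolation needed
print(repeat_pattern(24, 240))  # [10, 10, 10, 10]
```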


70mm vs. 35mm: a slightly faster shutter rate for 70mm (30 fps), but the frame is twice as wide. Modern cinema projectors double the flicker rate from 24 fps to 48 flashes per second mechanically anyway.

DutchTreat on Aug 27, 2012


I do agree 120 or 240 is better than 60, except that some higher-quality 60Hz TVs can operate at 24 or 48 Hz specifically for 24p playback; 60Hz is just the maximum. Also, some 120 Hz TVs don't properly divide and just do a doubled 3:2 pulldown. But that's not what I was really talking about in my post.

If a filmmaker shoots a film at 24fps, he has created 24 unique images each second, yes? Now, when he plays that content back on a 240Hz TV (regardless of type), he should still only see 24 unique images per second. Those 24 unique images are refreshed 10 times each to get to the 240Hz refresh rate the TV can accomplish. This is great. However, you'll still see some stutter, because that's just the way the film was shot and 24 frames per second really isn't that high a rate.

Now, turn on the special awesome "smooth motion" effect for that TV - each manufacturer has their own name for this. When activated, the TV will use an algorithm to read the current frame and the next frame of content (frames 1 and 2 for this example) and create new unique images on each refresh. So, in the first example above, frame 1 was displayed for refreshes 1-10, and frame 2 for refreshes 11-20. With the smooth motion effect, frame 1 is only displayed on refresh 1; unique in-between frames (created by the TV and not found anywhere on the Blu-ray) are displayed for refreshes 2-10. Frame 2 displays on refresh 11, with more unique TV-invented images for refreshes 12-20. The algorithms do a good job of looking at an object and making its motion look smoother, but they can also make a scene with complex motion really fall apart. Sometimes objects that are out of focus become clearer, and objects with motion blur have some of that blur removed. Or the direction and amount of blur in the interpolated frames is slightly different than it would have been if it were actually shot at 240fps.

The result is that in complex sequences (especially for visual effects), objects in the foreground can "wiggle" or "float" back and forth compared to the background elements. Even though it's subtle, your brain knows things aren't right. A good example of this is Avatar, when Jake Sully first awakes in his avatar in the lab. If you just sit there and turn the smooth motion effect on and off while watching, you see the difference. With it off, everything stutters for a while until your eyes become accustomed to it (note: the frame rate is 24, but the refresh rate is still 240Hz). With smooth motion on, Jake Sully becomes disconnected from the room and seems to kind of float, as if it was a terrible compositing job - now the effective frame rate is 240, which is NOT what was in the movie.

As for cameras, high-fps cameras can capture in excess of 1000fps (there's even a device that will do 1 million fps, fast enough to see light itself moving through the frame!). However, the actual footage is played back at a more standard rate, like 24 or 30fps for higher-resolution content, so it appears in slow motion. Of course, I'm sure Jackson's team had something very new or custom to shoot at 48 and record at 48 and so on, and this will trickle down into the rest of the pro market just like 4K and 3D have.

Underscore on Aug 28, 2012
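Underscore's refresh schedules can be sketched directly: with smoothing off, a 240 Hz panel holds each 24 fps frame for 10 refreshes; with smoothing on, only 1 refresh in 10 shows an image actually found on the disc. A minimal illustration (names are mine):

```python
# Label each refresh in one source-frame interval of a 240 Hz panel
# showing 24 fps content (240 / 24 = 10 refreshes per source frame).
def refresh_schedule(refreshes_per_frame=10, smoothing=False):
    if not smoothing:
        # Smoothing off: the real frame is simply held for every refresh.
        return ["source"] * refreshes_per_frame
    # Smoothing on: one real frame, then TV-invented in-between frames.
    return ["source"] + ["interpolated"] * (refreshes_per_frame - 1)

off, on = refresh_schedule(), refresh_schedule(smoothing=True)
print(off.count("interpolated") / len(off))  # 0.0 - every refresh is real
print(on.count("interpolated") / len(on))    # 0.9 - 9 of 10 are invented
```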


I'm sorry Underscore, I know this is an old thread, but you have things wrong on so many levels. 24 fps was chosen as an advance on the silent speed of 16fps for reasons of optical sound quality, not synchronisation. It is possible to synchronise at any speed, but the optical sound of the day would have had very poor high-frequency response at 16fps. It is true, though, that silent speeds were often hand-cranked (but not always), and so for sync a constant motorized speed was settled upon. 35mm was not chosen over 16mm or 70mm for reasons of cost; neither of the latter properly existed when film was invented. 16mm was introduced in the teens of the 20th century as an amateur format, cut from wider 35mm stock, and it did not see professional use until at least the late thirties and into the war, mainly for newsreel purposes. 70mm came much later, as a 65mm camera negative and a 70mm release print with room for additional sound tracks, in the fifties (though experimental versions had existed much earlier). CinemaScope made 35mm much more viable for exhibitors, who would have had to bear the brunt of 70mm projection costs. 48fps and 60fps are nothing new either; Mike Todd in the fifties and Douglas Trumbull with Showscan in the seventies were well-known examples. Hardly any ideas in cinema are new, including of course 3D! There is no high-speed scientific video camera that I know of that can capture a photon of light moving across the frame. At a million fps, a photon would have moved almost 2/10ths of a mile by the time each frame was exposed, since the speed of light is 186,000 miles per second.

mercer41 on Nov 30, 2012


Hi Mercer. About 35mm and costs, I was not trying to make that claim about early film, but rather the late 20th century. A few films were shot in 70mm, but it is more expensive and ultimately wasn't worth it because, as you said yourself, the exhibitors would have to be able to project it and it wasn't worth the cost. In my comments, I'm not referring to the overall sync of the picture. I'm referring to how well a discrete sound effect appears to sync up with, or match, a single frame of visual action. You can play images at 1 image per second with perfectly synced sound, but the sound will not look like it matches the image - sounds falling between frames may not have an associated visual cue on the previous or following frame. As framerate increases, this is less noticeable. As far as frequency response goes, I'd really like to hear more! If I'm not mistaken, there were many sound formats around, and the on-film sound tech and framerate were both standardized near each other, around 1930, with the sound standardization first developed for 35mm. Prior optical sound technologies weren't as good as disc-based sound formats. What I'm not understanding is why framerate alone affects frequency response. However, regardless of frequency response or my original statement, the idea remains that the reason they went with 24 instead of 30 or 42 or 500 was that it was the lowest rate with acceptable results. Lastly, on the high-speed camera: my bad - it's a trillion fps... not a traditional camera, mind you, but it achieves the intended effect. If you want to take it out of the lab and put it into the field, though, the Phantom v711 shoots up to 1.4 million fps (at pretty low res).

Underscore on Nov 30, 2012


Well said, sir. I think the laser projection technology is the key piece here. I predict, mark my words right here, at this time, if successful, laser projection technology will bring back drive-ins.

zuniper on Nov 6, 2012


Was 48FPS on The Hobbit REALLY that bad? That's making me nervous, man... :/

LosZombies on Apr 25, 2012


I heard people cried after the demo, but not out of joy....

David Banner on Apr 25, 2012


 fuck... >.

LosZombies on Apr 26, 2012


Are they releasing the Hobbit in regular 2D 24 FPS?

Chris Amaya on Apr 25, 2012



Ryderup on Apr 25, 2012


There is a reason 24 FPS looks the way it does, and why we find it natural. Put your hand in front of a CRT display and wave it around - it looks a little strange: no motion blur, just a cut from one frame to the next. 24fps works because you get to use 1/24th of a second of exposure, which is similar to what the eye/brain combination perceives - just enough motion blur to make it more realistic than having all the frames in the world.

Steelski on Apr 25, 2012


I didn't like BluRay until I watched about 20 hours of media. It might not be that the audience doesn't like it; it may be that they haven't had enough exposure to it.

redtie on Apr 25, 2012


It took you 20 hours to like BluRay?  That's literally the dumbest thing I've read all day

Geoffrey Shauger on Apr 25, 2012


From the buzzing link Alex posted: "The 48fps footage I saw looked terrible. It looked completely non-cinematic. The sets looked like sets." I'd say that's what I thought about BluRay, until I got used to it. I also think CGI looks pretty terrible on BluRay, unless the whole movie is CGI. And yes, 20 hours - I timed it down to the nanosecond. Then I was like, "Holy shit! It took me 20 hours to like BluRay. That's literally the dumbest experiment I've done all day!"

redtie on Apr 25, 2012


Does your TV have a high refresh rate? If so, have you disabled it?

Underscore on Apr 27, 2012


The settings were fine. It was my visual cortex that needed calibrating.

redtie on Apr 29, 2012


At the rate we're going the 'future of cinema' will be a revolving door of new technology updated every three years. As long as there is some sucker filmmaker to push it, we'll keep getting new stuff that takes us further and further away from the magic of film.

germss on Apr 25, 2012


I have a laser pocket projector that kicks ass. There's no doubt laser is the future.

redtie on Apr 25, 2012


Reaction to James Cameron's demo of 48 and 60 fps footage last year was universally enthusiastic.  I'm having a hard time believing everyone suddenly changed their mind about higher frame rates...

Michael Stat on Apr 25, 2012


All you Luddites need to fuck off; you're ruining a brighter future for those of us who aren't scared of change.

crumb on Apr 25, 2012


 I doubt our opinions ever matter, they will do what they want to do.

Xerxexx on Apr 25, 2012


It'll matter if audiences start to reject the changes.

Max Renn on Apr 27, 2012


There's a time and a place for higher fps. It can give a project a completely different look and feel. In my opinion I don't want all movies to be high fps and have crystal clarity because that can potentially work against a cinematic feel.

Garrett on Apr 26, 2012


However, what if it only feels that way to you? Others may feel differently. Also, so many things that made a movie theater feel like a movie theater are not even used now. New things from ten years ago that did not make you feel like you were at the movie theater now do. It is always changing.

tonhogg on May 19, 2012


It took you 20 hours to get used to seeing movies at a resolution closest to how they were filmed? Did it take you 20 hours to get used to IMAX? Or 20 hours to get used to movie theaters when you were a kid? "This BluRay looks too much like a movie theater... this is weird." Oh, I get it - it took you 20 hours to get used to watching movies at theater quality without the smell of popcorn, or your shoes stuck to the spilled coke, or the annoying fat guy who sits next to you who is literally spilling into your seat. Now I totally get it.

Geoffrey Shauger on Apr 26, 2012


It took you 7 hours to get it? That's literally the dumbest thing I've read all day.

redtie on Apr 26, 2012


The clarity of Blu-ray and video-like 48 fps is not a good comparison.

Max Renn on Apr 26, 2012


Clarity was not the comparison. It's a comparison of adjusting from DVD to Blu-ray as being similar to adjusting from 24 fps to 48 fps. While not all people take a while to adjust, I am guessing some people will take a while to adjust.

redtie on Apr 29, 2012


Let's go back to the stone age, with one frame rate every 10,000 years...

Taz on Apr 29, 2012


Wait a year, maybe two... laser projectors are the future.

moover on May 1, 2012


Wonder if laser projectors will be significantly cheaper than the ole mercury lamp projectors? Anyone know?

Lloyd Stewart on Jun 27, 2012
