Sunday, December 30, 2007

The Redrock M2 Cine-lens adapter report Part 1 - Background

As previously stated, Nance and I recently purchased the Redrock M2 lens adapter for our Sony Z1 HDV camcorder. I thought I would break our report up into two parts. This first part focuses on a bit of optical background. If you already understand the reasons for purchasing the Redrock M2, you can probably skip this posting and move on to the product review post.

The idea here is to answer the question:

What is the Redrock M2 and why would you want one?

The simple answer is:

To allow the use of other lenses on the Z1 and to give fine control over depth of field.

However, I thought some of my readers might not have a solid grasp of the principles involved in depth of field, or of how a lens treats an image. I am not going to delve into optical theory in depth here. Rather, I will present these principles as cause and effect. If you want more detail on these topics, I encourage you to search the web.

Depth of field.

What exactly is depth of field? Simply stated, it is the region in an image that is in focus. That's it. Nothing more. The depth of the field in focus in an image. So, why do we care about this? In cinema, it is typically used to force the audience to pay attention to something specific in the frame. As a general rule, we have two methods for doing this: color/brightness and depth of field.

Consider the following image that I used in my previous posting:



In this image, Eeyore is in focus, and Tweety, behind Eeyore, is not in focus. Now in the following image, the reverse is true:






Using this effect, we can draw the viewer's eye to a particular item in the frame. This is used to great effect in shots involving people in a conversation. The focus can be shifted from one character to another at will depending on whom the director wants the audience to pay attention to. As you can imagine, there are a wide variety of applications of this effect in the telling of a cinematic story.

Controlling Depth of Field.


We have 2 variables that we can adjust when using depth of field. They are:

  • What is in focus.
  • What is the depth of the area in focus.

To control the first variable, we merely adjust the focus control on the lens. This moves elements within the lens to shift the point of focus from the lens's minimum focus distance (determined by the type of lens in use) all the way out to infinity.

The second variable is controlled by the speed of the lens, the setting of its aperture, and the size of the film plane. Let's look at this in some detail.

The speed of a lens is expressed using what is known as an F number (cinema lenses are rated with a T instead of an F - I will explain the difference below). This number indicates how much light a given lens will allow through its glass elements. A lens with a FAST speed rating allows more light through, and a SLOW rating allows less. This is important in situations where the amount of light you have to work with is a consideration.

A fast rating is 1.0, and there are almost no lenses rated at 1.0. A more typical rating for a VERY fast lens is 1.4. Medium speeds are typically 3.0 - 4.5. A slow lens might be rated at 8.0.
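To put a little arithmetic behind those numbers, here is a minimal sketch in Python (my own illustration, not anything from a lens maker's documentation): the light a lens passes is roughly proportional to 1 divided by the square of the f-number, which is why each standard full stop halves the light.

    # Relative light transmission across the standard full-stop series.
    # Each step down the list passes roughly half the light of the one before.
    for n in [1.0, 1.4, 2.0, 2.8, 4.0, 5.6, 8.0]:
        print(f"f/{n:<3}: {1 / n ** 2:.3f}x the light of an f/1.0 lens")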

Another side effect of this lens speed rating is that it impacts the minimum depth of field. The faster a lens is, the narrower your depth of field can be.

So, how can we control the DEPTH of the area in focus in a shot? We do this with the aperture.

The Aperture.

The aperture, also called the iris, is a mechanical device inside of a lens that controls the amount of light that a lens will allow to pass through it. It consists of a set of overlapping slats of spring steel that create a variable sized hole. Consider the following diagram:


The smaller the hole is, the less light is passed through and the WIDER the depth of field is. Another way to think about this phenomenon is with your own eyes. Often, when we are straining to see something, we squint. This has the effect of sharpening the image that we see. It's the same effect that the aperture has.

Each setting of the aperture has an F number associated with it, as you can see. And that relates directly back to the lens speed. When the aperture is all the way open, the F number for the iris setting is the same as the F number for the lens itself.

F number versus T number.

Typically, still camera lenses are rated with this F number (or F-stop) that we have been talking about. But cinema lenses, such as those found on a Panavision or Arri film camera, are rated in T-stops. T-stops are considered more accurate. As I understand it, F-stops are calculated from the lens's optical design, while T-stops are determined by measuring the actual amount of light transmitted at the back of the lens.
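As a rough illustration of that relationship, the figure usually quoted is T-stop = F-stop / sqrt(transmittance). The little Python sketch below assumes a lens that passes 75% of its light; that number is an invented example, not a spec for any real lens.

    import math

    # T-stop is roughly the F-stop divided by the square root of the
    # lens's transmittance (the fraction of light it actually passes).
    f_stop = 2.0
    transmittance = 0.75  # assumed example: the glass passes 75% of the light
    t_stop = f_stop / math.sqrt(transmittance)
    print(f"An f/{f_stop} lens passing {transmittance:.0%} of its light rates about T{t_stop:.1f}")

So a nominal f/2 lens that loses a quarter of its light along the way behaves like roughly a T2.3 lens in terms of actual exposure.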

The Film Plane.

This is our final stop on our journey to depth of field nirvana. The size of the film plane in a camera (the place where the lens focuses its image for exposure or recording) affects how much control over depth of field the lens gives us. The larger the film plane is, the narrower the minimum depth of field will be. When we have a narrow minimum depth of field, we have more flexibility in determining what will be in focus for our shot!

With a 35mm film plane, the depth of field can be a fraction of an inch (with a fast lens). That's great if you have a 35mm film plane. Not so great if you have an HDV video camera whose sensor is 1/3 inch in size. When that is the case, your minimum depth of field is several feet.
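If you want to see where those numbers come from, the classic depth of field formulas make the point nicely. The Python sketch below is my own illustration: the focal lengths, the f/2.8 aperture, the 2 metre subject distance and the circle-of-confusion values are assumed round numbers chosen to give roughly comparable fields of view, not measurements of the Z1 or any specific lens.

    # Classic depth-of-field approximation: larger film plane -> shallower focus.
    def depth_of_field(focal_mm, f_number, subject_mm, coc_mm):
        """Return (near, far, total) limits of acceptable focus in millimetres."""
        hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
        near = subject_mm * (hyperfocal - focal_mm) / (hyperfocal + subject_mm - 2 * focal_mm)
        if subject_mm >= hyperfocal:
            return near, float("inf"), float("inf")
        far = subject_mm * (hyperfocal - focal_mm) / (hyperfocal - subject_mm)
        return near, far, far - near

    subject = 2000.0  # subject 2 metres from the camera

    # A 50mm lens on a 35mm frame vs. roughly a 7mm lens on a 1/3" sensor
    # (similar field of view); circle-of-confusion values are typical assumptions.
    for label, focal, coc in [("35mm film plane", 50.0, 0.030),
                              ("1/3 inch HDV sensor", 7.0, 0.005)]:
        near, far, total = depth_of_field(focal, 2.8, subject, coc)
        print(f"{label}: sharp from {near / 1000:.2f}m to {far / 1000:.2f}m "
              f"(total {total / 1000:.2f}m)")

With these assumptions the 35mm frame holds only about a quarter of a metre in focus, while the 1/3 inch sensor holds over three metres, which is exactly the difference the M2 is designed to get around.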

That film look.

A number of things differentiate the look of something shot on film from High Definition video. The principal ones are:

  • 24 frames a second in film vs. 30 in video
  • The latitude of film vs. video
  • Control over a wide range of depth of field

There are others, but these are three of the main ones. The first can be overcome with cameras that provide 24 frame progressive image recording, as well as with some tricks done in post processing. Latitude has yet to be overcome. The depth of field limitation can be overcome with devices such as the Redrock M2.

The Redrock M2 simulates a 35mm film plane and lets your video camera photograph that simulated film plane, gaining the advantages of 35mm film lenses and their associated control over depth of field. It also allows you to use other lenses on a camera that has a fixed, non-interchangeable lens.

It does this by placing a film lens in front of a translucent material that mimics a 35mm film plane. The video camera then focuses on this translucent material and sees the image captured by the 35mm lens. Once this system is in place, all exposure and focus operations are controlled by the 35mm lens. The lens on the video camera remains at a fixed focus and zoom position.

An inverted image.

When a lens captures an image and projects it onto the film plane in the camera, it inverts the image. The physics of the optics that govern this effect are far beyond the scope of this article. Suffice it to say that this phenomenon occurs in ALL optical systems where a device (be it a glass lens or a pinhole) projects an image onto a plane.

Consider this diagram:


The image on the left is the subject being photographed. Once that image passes through the lens in the middle, the image is inverted and recorded onto the film or video sensor as you can see by the image on the right.

Now you might be asking "Why don't I see that inversion when I look through the viewfinder of my digital SLR?" And the answer to this is that the optics in the SLR invert the image again before it is displayed in the viewfinder (and, for that matter, on the LCD display).

Now, most devices such as the Redrock M2 do not provide those extra optics to invert the image. As a result, the image must be inverted in post. Additionally, when you use your video camera with a device such as the Redrock M2, your view on the LCD and viewfinder is inverted.
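For what it's worth, the fix in post really is just a 180 degree rotation. Here is a minimal sketch using Python and the Pillow imaging library, assuming you had exported a frame as a still image; the file names are hypothetical, and in practice your NLE or a plugin would flip the whole clip rather than individual frames.

    from PIL import Image

    # Un-flipping a ground-glass adapter image is a simple 180-degree rotation.
    frame = Image.open("frame_0001.png")            # hypothetical exported frame
    frame.rotate(180).save("frame_0001_upright.png")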

Redrock does supply a free Mac software application that lets you feed the firewire output of your camera into a Mac, display the live video image, and invert it on the Mac. Very handy and very nice. Redrock is also about to ship an optical inverter that will go between the video camera and the Redrock M2 and invert the image on the fly.

*WHEW*

OK, that was a lot of background. I hope I was able to clarify some of the typical questions that I get regarding these types of products. If you have more questions, feel free to post a comment and I will do my best to answer. And if I screwed anything up in this post, feel free to post a correction in the comments as well!

Check out part 2 of the review HERE.

Friday, December 28, 2007

Hitchcock on editing and a Redrock M2 tease

Been pretty busy here with Nance setting up the Redrock M2 on the Z1. We have some OTS tests to run tonight with a friend. Tomorrow I will assemble some of the test footage and I will blog about the setup experience, our experience with Redrock's customer service and our experience with the unit attached to the Z1 using a variety of Nikon lenses.

Here is a nice little video of Hitchcock talking about film editing:






And here is a still from one of our test shots using the M2. This is using a Nikon 55mm 2.8 micro lens. The subject is about 2.5" tall and the field of focus is about 1/2". Thanks to Brian Valente for his CC on this:


Tuesday, December 25, 2007

Merry Christmas everyone and a little HDV tidbit for you all.

Merry Christmas and a happy new year to all of my readers here. Thanks to everyone who has commented and e-mailed me about the blog here. It's always nice to know you are having a positive impact with your efforts.

May the new year bring you lots of prosperity and opportunity. I wish all of you the very best this new year!

In other news...

FCP will now allow you to transcode HDV to ProRes 4:2:2! This is a great update for FCP. I shoot all HDV footage and this takes an additional step out of my workflow. I hope it does for you too!

Read about it HERE.

Thanks to Shane over at Little Frog in HD for pointing this out!

In other, other news...

Redrock Micro, the makers of the M2 Cine Adapter are having a holiday sale. You can save over $600 on their HD Indie pack that includes the Cine adapter as well as their follow focus pack. You can check out the pricing HERE. It's good until 01/05/08 so if you were on the fence with this, now is the time to jump in and do it.

Speaking of doing it, Nance and I ordered this and I will post pics of the unboxing, installation of it on the Z1 and some test footage this weekend using our Nikon still lenses. So stay tuned!

Sunday, December 16, 2007

The digital storm: An editorial

In the late 70s, I went to work for a company called Micropolis. At that time, Micropolis made floppy disk drives for Tandy, Commodore and some other smaller companies as well as selling to the general public.

At this time, microcomputers were brand new. Apple, Commodore and Tandy were the main producers of computers for the home, and they were generally considered a hobbyist product. I remember making a bet with someone at that time that all storage would be solid state within 5 years. No more electro-mechanical devices. No more moving parts prone to breaking.

I lost that bet.

Today, the principal device for mass data storage is the hard disk. An electro-mechanical device. But moving to a solid state device is inevitable. Look at P2 devices from Panasonic as a good example.

The point is that technology has a huge impact on all aspects of our lives. For better or worse, it drives forward and changes the way that we perceive our capabilities and limitations. This article will focus primarily on how technology impacts the film business. I will use the term film here to encompass both traditional film as well as digital technologies.

The path to digital

For a very long time, technology had a minimal impact on the motion picture industry. The way film was used in the silent days remained largely unchanged in the mainstream until the advent of a practical sound system in 1927, as demonstrated in the film “The Jazz Singer” starring Al Jolson. The film was not entirely sync sound, but it had sections with sync dialog.

There were those at the time who felt that sound was just a gimmick, that it would never succeed in the mainstream. Charlie Chaplin was one of these critics. But “The Jazz Singer” was a huge hit and forced all of the majors to re-assess the importance of sound in film. If audiences were going to flock to films with sound, their existing silent endeavors would lose audience share to the new “talkies”.

The immediate impact of “The Jazz Singer” forced all of the studios to act and act quickly. They hired sound consultants and engineers. They retrofitted stages on the lots to become “Sound stages”. They placed the camera inside of a sound proof booth. They hired dialog coaches to work with the stars. And many actors who had previously enjoyed a great deal of success in the movies now found themselves unwanted since their voices did not live up to the expectations of the studios.

The impact of sound on film cannot be overstated. Short of the introduction of film itself, sound created the biggest frenzy of change that the film industry has ever seen. No subsequent technology has forced the studios to adopt change so quickly or so completely. Not color, not widescreen, not digital.

The subsequent technologies that I mentioned were all adopted slowly. Color was implemented sporadically, as was widescreen. Digital technologies have also been very slow to make inroads into the film industry.

In the late 80s, I got involved in a development project called “Polyphonic FX”. This system was intended to use computers, optical storage and digital timecode to automate much of the manual process associated with adding sound effects to films. The effects editor would be able to call up samples of sound effects from a library stored on a rack of optical disks, audition a sound to determine its worthiness for a given shot, and then assign the sound to that part of the film through its SMPTE timecode.

Using this system, a single sound editor could do a complete feature film in the time it would take 7 editors to carry out the task. The system was shown at NAB and was subsequently purchased, lock, stock and copyright, by a large sound house in Hollywood. We were happy, as we got our investment back and a tidy profit. The buyer dismantled the system and threw it away.

They did this because it represented a threat to the sound editors. The perception was that it would put editors out of work. Today, we have products that effectively do the same thing. The sound editors use digital technologies to great effect. But it took a number of years before these types of products made inroads into the business.

Sometimes, the technology appears and is not practical because of the immaturity of the product. Non-linear computer based editing is a good example here. Early systems used a very low resolution image to edit with and were slow and cumbersome to use. However, these technologies have matured and evolved to the point that they are generally accepted.

Digital filmmaking also falls into this category. And by digital filmmaking I mean the production of movies without the traditional use of film. As we all know, digital technologies increase in capability while going down in price as a general rule. For the major motion picture industry, this is a very bad thing.

Back when I was working on Polyphonic FX, I was exposed to a lot of the post production process as we were partnering with a local sound post facility. This was my first exposure to filmmaking at the professional level and I got it into my head that this might be an interesting activity to be involved in.

So I outlined a little script and sent it off to a friend of mine that was working as a story consultant on episodic TV. She made some notes and advised me as to the correct format for the script and I proceeded to finish the script.

Next up, acquire the equipment needed to make a short film. This would consist of sound equipment, lighting, a camera and its associated support gear, and various grip items. I wanted to buy the gear so that I could learn to use it as well as have it available for subsequent projects.

After pricing it all out and looking at the costs associated with film stock, processing and post production, I concluded that there was a good reason this was not a common hobby for the general public: cost.

The cost for doing this was exorbitant. There was NO way I was going to be able to afford to do this. Even renting gear, it was out of the question. So I put the project on the back burner and forgot about it. So what happened here?

The studios win.

The cost of playing in this game is very, very high. And this is very much to the studios’ advantage. After all, if every yahoo out there could make films, the studios’ raison d’être would no longer exist. Can’t have that, now can we?

There was a time when the studios were run as dictatorships with a single person at the helm - Harry Cohn at Columbia, Lew Wasserman at Universal, Jack Warner at Warner Bros. and so on. Today they are all owned and operated by multi-national corporations. I think that if they were still run by a single individual, they might stand a better chance of surviving the coming storm.

The digital storm

As we have all witnessed in recent years (the last 5 especially), the tools required to make a film have gone the way of digital and are very cheap. At the low end, we have 300 dollar digital video cameras, low cost PCs with firewire built in and Moviemaker (or iMovie) included with these machines, gratis.

In the middle ground, we have cameras like the Panasonic HVX200, Sony Z1 etc. We have the Adobe production suite and Apple’s Final Cut Studio. And there is a plethora of low cost equipment options available now to support the burgeoning independent filmmaker’s market.

At the high end, we have cameras like the Viper and the Genesis. And now we also have the RED 1, which is poised to make a significant shift in the cost of making films for theatrical distribution. Other tools such as Apple’s Color and 3D applications such as Lightwave and Maya remain very cost effective solutions available to the hobbyist or aspiring filmmaker.

Message to the studios: the cost of entry to your exclusive world has been smashed!

There have been films shot on standard definition video that have achieved the elusive status of “theatrically distributed”. The film November with Courteney Cox was shot on a Panasonic DVX 100 SD camera. There are other films shot on the low cost HDV format that have seen distribution as well. But, alas, these are the exceptions.

Check out the web site http://www.withoutabox.com to get an idea of the explosion of film festivals that have cropped up in the last 5 years. The number is mind boggling. You can thank digital technology for this. Amateur filmmakers all vying for distribution of their films by gaining exposure through any of the hundreds of film festivals that are active in the US and throughout the world.

The Studios: We still own this business.

So if the cost of production has diminished significantly over the past few years, why are the studios still in power? Why has the democratization of film production not brought the studios to their knees?

Well, in some ways it has caused them to react. And react slowly. Take, for example, Universal’s Focus Features or Fox’s Searchlight divisions. These are just a few of the areas the majors have created to provide distribution for the explosion of independent features, over 1000 of which get produced annually. And that number will just continue to climb.

So, if we reduce the cost of creating blockbuster entertainment to the point that it no longer requires the financial wherewithal of a major studio to produce it, what then are the studios bringing to the table in order to remain viable in the future?

  • Financing
  • Production resources (stages, backlots, post, studio facilities, etc.)
  • Expertise in production
  • Distribution

So let’s look at these in order:

Financing. I think that, in the future, if a producer or director goes to a bank with a project in hand that appears to be an excellent bet in terms of making money at the box office, banks and venture capitalists will finance these projects, regardless of any studio affiliation.

Production resources. More and more studio facilities are cropping up for use by indie filmmakers. Additionally, the studios themselves rent these facilities out to filmmakers regardless of their affiliation with the studio. This *might* end up being the bread and butter of the studios in the future.

Expertise in production. More and more, skilled artisans and technicians are finding viable work in the independent market. A recent film that I was involved with, which had a $50K budget, used an all-union crew. The production had nothing to do with any studio. Hollywood is a small town. If you are good at what you do, you get work. It’s that simple.

And, finally, distribution. This is the biggie. Today, if you want your film to have a wide release in the US and Europe, the only players in that game are the majors. But... Again, technology to the rescue...

Digital distribution reduces the cost of getting the film out into the theaters by a significant margin. Gone are the days when prints have to be struck for each theater. Gone are the days of broken film and splicing in the projection booth. The cost savings and the positive impact overall is significant.

Here is where the studios have to maintain a stranglehold, and in order to do that they must remain competitive. It won’t take much for some start-up to undermine them with a superior approach and lower costs that get passed along to the exhibitors. There is significant risk here for the majors.

Add to this the impending ability to deliver downloadable content to the home. And, no, I do not mean on your computer. I am talking about content sent to a box that is part of the home theater, in high definition. Consider that many of the theaters that have implemented digital projectors are using 2K resolution devices. Trust me when I tell you that you cannot tell the difference between a 2K projected image and an HD image at 1920x1080.

So now, with low cost HDTV sets and low cost surround sound systems, you have a theatrical experience in your home. No cell phones. No one kicking the back of your chair or chattering during the film. The ability to pause the TV anytime you like, or run it back to hear some muffled dialog again.

But you don’t get the shared experience of a film in a theater. You also do not get a 100’ screen. Is the draw of these attributes strong enough to maintain the existing theater experience? Only time will tell. I suspect it will survive, but in a much reduced form. God knows, the internet has already created an almost shut-in society in some regards, with the ability to order anything online and have it delivered.

What I, as a consumer, want: digital delivery of HD content directly to my living room. I want the model to be a subscription model: a flat fee for a specific number of monthly downloads, much the same way we have with services like Netflix. I do not want to have to buy and store media like DVDs. I just want to be able to watch what I want, when I want.

I hope we can arrive at this at some point. The Chinese curse says “May you live in interesting times.”. I consider the present to be very interesting times. I don’t think it’s a curse.

OT: iPhone Ringtones

I am not a huge fan of ringtones in general, but on the iPhone I wanted to have a ringtone for when Nance calls me. Originally I set up a ringtone using one of the simple file re-name hacks, but then Apple disabled that, and I was unwilling to do any of the other, more radical hacks to get it working.

Last night I went to a Christmas party that a friend of mine was having. He works for Apple, so I thought I would give him some grief over the ringtone issue. He said "No problem. Just do the GarageBand update that went out Friday. Drag and drop your original AIF file into a new project, click the cycle button to set the area that you want to use as a ringtone, and select Share -> Send Ringtone to iTunes."

So now, I have my audio clip of Meg Ryan's orgasm in "When Harry Met Sally" restored as the ringtone for when Nance calls my iPhone.

Thanks, Apple :)

Saturday, December 15, 2007

Pending editorial, Lynda.com and some legal stuff.

I have an editorial in the works that I plan to get posted up here this weekend. It covers technology and its impact on the film industry.

Nance has been on a gig doing some legal videos in support of a lawsuit. She rented a handheld rig from EVS to shoot the footage and it seems to have worked out well. She has all of the footage captured now (about 2 hours worth) and will be cutting it down to a 10 minute video. There will be stills intercut with it and she will layer some sappy tear-jerker music over it. Should be nice when it's done.

I start a new 9-5 gig out in Ventura at Lynda.com. If you are not aware of them, Lynda.com is one of the premiere sources for training materials on all things media. Check them out HERE.

We have also finally bitten the bullet and ordered a Redrock M2 Cine-lens adapter with follow focus. Redrock has an image inverter that they will be shipping in Feb. and we will pick one of those up as well.

Nance also picked up some new lighting gear at Filmtools out in Burbank. When the M2 gets here, we will post pics of the unboxing, assembly and some subsequent footage shot on the Z1 with it using Nikon lenses (I have a TON of high end Nikon still lenses that I have collected over the years).

So stay tuned for some interesting updates here!

Friday, December 07, 2007

Film in Focus

Focus Features is the "independent" arm of GE's Universal Studios. They have recently put up a new web site that I think is a real winner. A little bit of hype and a lot of great content for film lovers. Interviews with filmmakers, retrospectives - all kinds of great stuff.

Check it out HERE.

Thursday, December 06, 2007

Report from DV Expo

Had a pleasant day at the DV Expo yesterday. Nancy and I arrived around noon and parking was a breeze. The usual suspects were all in attendance: Panasonic (largest booth), Sony, JVC. Avid was there with a VERY small booth. I guess their new direction of spending resources on products and customer needs is evidenced by this.

None of the vendors announced anything at the show, which is not surprising. Most save that for the bigger shows like NAB. Nance checked out some cranes and monitor mounts over at the Calumet booth, as well as a nice little platform dolly system by Cambo that is very slick - VERY fast to set up. Segmented/hinged track for fast layout, straight, curved or a combination of both. And not too expensive at about $9000 for the track and dolly. They also offer a lower cost 3-point dolly as an option.

Next up was a look at the Lowel lighting systems. Softboxes that use the fluorescent "bulb" type fixtures. These softboxes are very nice. They do not use a speed ring, but instead a custom mount that is MUCH easier to deal with and has zero light leakage from the rear, unlike the typical speed ring mount. These rigs were running about $800 with a stand, case and tungsten fixture. The fluorescent fixture is optional ($150) and will accommodate up to three 65 watt bulbs for a total effective output of about 600 watts. And, of course, the fluorescent rigs run VERY cool and draw very little power.

Following this, we went to the Redrock booth to check out the Redrock M2 35mm lens adapter. Nance and I had been tracking the development of this product since before it was shipping. At the show, Redrock was displaying a new addition to their adapter: a prism image inverter that goes between the M2 and your camera.

They had an HVX200 set up with the Redrock and this new adapter. They also had their follow focus unit installed. I have to say, we were both VERY impressed with this rig. One of the complaints that we have had in the past with these types of products was the image inversion issue. Not all of these products have this problem, but the M2 did.

The image inverter will sell for between $300 and $400 and is supposed to ship in February. Redrock also has some very attractive pricing on their indie bundles that is good through Jan 15th (e.g. rails / M2 / HD lens adapter / follow focus with a full set of whips for $1800). If you have been thinking about getting one of these, now is a good time to take advantage of this great pricing.

We ran into Larry Jordan (FCP instructor, author and editor par excellence) and had a nice chat with him. Larry has been co-hosting the Digital Production Buzz podcast recently and has made a GREAT addition to the show, so be sure to check him out on iTunes.

Wednesday, December 05, 2007

DV Expo and how crappy movies get made

Nance and I are headed to the DV Expo today. I'll post pix and coverage of the event tomorrow on the blog here, so stay tuned.

On another topic, there is a blog post by a writer over at Trigger Street. It's a good read. I will be posting an editorial followup to this article tomorrow as well.

Check out the posting HERE.