Monday 5 October 2015

Aspect Ratios, Frame rates, Formats and File Compression

In this blog post, I will be discussing what aspect ratios, frame rates, formats and file compression are and how they're used.

What are aspect ratios?
The aspect ratio is the measurement of the width and the height of the screen. Many different aspect ratios have been used on TV and especially in film. TV was always 4:3, because old TVs were nearly square in shape. Nowadays, the modern standard aspect ratio for TV is 16:9, or widescreen.
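To show what I mean, here's a tiny Python sketch (the resolutions are just common examples I've picked) that reduces a pixel resolution down to its simplest width:height ratio:

    from math import gcd

    def aspect_ratio(width, height):
        """Reduce a pixel resolution to its simplest width:height ratio."""
        divisor = gcd(width, height)
        return f"{width // divisor}:{height // divisor}"

    # Common examples: an old SD frame and a modern Full HD frame
    print(aspect_ratio(640, 480))    # 4:3
    print(aspect_ratio(1920, 1080))  # 16:9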

What is the difference between 16:9 and 4:3?

As you can see, 16:9 displays a lot more information on the screen due to its larger frame. The black bars on the 4:3 image appear because some older TV shows that haven't been reformatted still have that shape, and because TVs are now wider, something has to fill the empty space at the sides. Despite its old-fashioned feel, 4:3 is still commonly considered in terms of framing for things such as motion graphics. This is because older TVs won't convert the larger 16:9 frame down, so cropping happens. For motion graphics, artists keep safe-area bars on their screens to help them frame all the information so that, when the cropping happens, the viewer still sees what they need to. Eventually, the 4:3 ratio will become a thing of the past, as 16:9 feels more natural because our own eyesight covers a wide field of view.
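As a rough illustration of why that safe-area framing matters, this little sketch (my own example, not any broadcaster's actual spec) works out how much of a 16:9 frame survives a centre-cut down to 4:3:

    def center_cut_4x3(width, height):
        """Width of the 4:3 region that survives a centre-cut from a 16:9 frame."""
        safe_width = round(height * 4 / 3)        # the 4:3 area keeps the full height
        cropped_each_side = (width - safe_width) // 2
        return safe_width, cropped_each_side

    safe, lost = center_cut_4x3(1920, 1080)
    print(safe, lost)  # 1440 pixels remain, 240 pixels are cropped from each side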

In films, it's a slightly different story. Hollywood films have completely switched over to widescreen, but they use different aspect ratios within that widescreen frame; this is where letterboxing comes in. Modern film cameras and external monitors have crop lines built in to help with this letterboxing. But why? If you've seen a film that's been shot with what's called an anamorphic lens, which you probably have, then you'll have noticed there are black bars on the screen. Unlike 4:3, though, the black bars appear on the top and bottom of the screen.

If you look at the picture to the right, you will see an example of this letterboxing. The reason for it is that an anamorphic lens distorts the image before it hits the sensor: the image is squeezed horizontally, so when you view it on a monitor that can't desqueeze the image for preview, the frame looks stretched vertically while the horizontal framing is fine. In post, you desqueeze (resize) the footage and end up with an aspect ratio of 2.35:1, which has been a standard for film since the 1950s, although originally you could only see this ratio in cinemas equipped for anamorphic projection. Nowadays, lower-budget films are typically shown in 1.85:1.
The image on the left shows the difference between the two aspect ratios. You will more commonly see 1.85:1 being used, especially for TV; even shows like Game of Thrones (2011-) use this ratio. This is mainly down to visual preference, but the anamorphic ratio is typically seen as more cinematic.
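The desqueeze itself is just a horizontal stretch by the lens's squeeze factor. Here's a minimal sketch of the maths; the 1.33x factor is only an example value for a 16:9 sensor:

    def desqueeze(width, height, squeeze_factor):
        """Stretch the stored width back out to undo the anamorphic squeeze."""
        display_width = round(width * squeeze_factor)
        return display_width, height, display_width / height

    w, h, ratio = desqueeze(1920, 1080, 1.33)
    print(w, h, round(ratio, 2))  # 2554 x 1080, roughly 2.36:1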


Of the last three idents I analysed, two used the 2.35:1 aspect ratio, while the Paramount Pictures one was 1.85:1. There are two likely reasons why these aspect ratios were chosen. The most likely is that the sources aren't consistent: the two with the anamorphic aspect ratio came from the openings of films, and the Paramount Pictures ident was just a standalone video. The anamorphic ratio would be used instead of the typical 1.85:1 because it blends better with the film that opens directly after it (sudden changes in aspect ratio are off-putting and considered bad editing). But from a more creative point of view, this might not be the whole story. Aspect ratio is largely subjective; different people like different things, so it depends on the film's theme and the visual style the director prefers. Some directors love 2.35:1 because of its tradition and strong conventions, whereas others prefer the newer 1.85:1 because it feels less intense and doesn't suffocate the audience with information.


File compression
There are two different types of file compression: lossless and lossy. They are two different approaches with the same overall goal. The idea of file compression is to make a file smaller, which is great for filmmakers, as storage is always a concern. The difference between the two is that lossless compression shrinks the file while keeping exactly the same quality, whereas lossy compression reduces both the size and the quality of the image. Both have their pros and cons.

Lossless, though it doesn't lose quality (which is great), doesn't compress as much as lossy does, meaning you still end up with hefty file sizes. Lossy, on the other hand, produces much smaller files than lossless, but loses some of the overall quality.
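A quick way to get a feel for the trade-off is with a toy example in Python. This sketch uses zlib for genuine lossless compression and, as a crude stand-in for a lossy codec, simply throws away the least significant bits of each sample before compressing (real codecs are far cleverer, this is only to show the idea):

    import zlib

    # Some fake 8-bit "image" data with gentle variation
    data = bytes((x * 7 + y * 3) % 256 for y in range(200) for x in range(200))

    # Lossless: the compressed output decompresses back to the exact original
    lossless = zlib.compress(data, level=9)
    assert zlib.decompress(lossless) == data

    # "Lossy" stand-in: discard the 4 least significant bits, then compress.
    # The result is smaller, but the original values can't be recovered exactly.
    degraded = bytes(b & 0b11110000 for b in data)
    lossy = zlib.compress(degraded, level=9)

    print(len(data), len(lossless), len(lossy))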

So you might be thinking: why does lossy exist if we can just make files smaller without losing quality? Well, the two techniques are really for two types of user. In filmmaking, most of us have to use lossy formats, not because we want to but because storage is still too expensive to do what the studios do. Most of us can comfortably shoot compressed 1080p files and store them without needing any further compression before editing, but 1080p RAW still isn't a consumer option because its file sizes are huge.

A studio, on the other hand, can use lossless compression on its RAW files. For example, Peter Jackson used the Red EPIC Dragon 6K camera for The Hobbit trilogy (2012-2014). They shot the films at an unconventional 48 frames per second in 4K RAW. Only a studio really has the technology to hold that amount of footage and then run it through compression, but I wouldn't be surprised if they didn't compress it at all and just edited in RAW, since Red's RAW codecs are widely compatible with editing software.

There are four factors that contribute towards the file size of a movie file: duration, codec, resolution and frame rate. I will speak about resolution and frame rates later, but duration is the time the movie file takes from start to finish, and the codec is the format the camera uses to encode the footage.
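Putting those factors together, you can roughly estimate the size of an uncompressed clip before any codec gets involved. This is only a back-of-the-envelope sketch; real RAW formats have their own bit depths, headers and compression:

    def uncompressed_size_gb(width, height, fps, duration_s, bytes_per_pixel=3):
        """Rough size of raw video: pixels per frame x bytes per pixel x frames."""
        total_bytes = width * height * bytes_per_pixel * fps * duration_s
        return total_bytes / (1024 ** 3)

    # A 30-second 1080p ident at 24fps, assuming 8 bits per colour channel
    print(round(uncompressed_size_gb(1920, 1080, 24, 30), 2))  # roughly 4.17 GB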

Back to lossless and lossy: there are a wide variety of different formats in use. These include:

Lossless:

  • RAW codecs
  • H.264 Lossless
  • Blackmagic codecs (exclusive to Blackmagic cameras)
Lossy:
  • MP4/H.264
  • Quicktime
  • Windows Media Video
Of course, there are dozens more of each, but these are just common examples.
Here is an example of a lossy format against a lossless format.



Knowing that studios have immense computing power compared to the consumer market, I should imagine file size for something 30 seconds long isn't much of an issue. That said, the ident is also a representation of the business, so quality is a priority. When rendering out of Premiere, lossy files are all you have, as exporting RAW isn't available. Therefore, my guess would be that they render the ident at a higher resolution and import it directly into the film's edit, scaling its dimensions down to match the composition so that no detail is lost. That's how I'm doing it, anyway.

Formats
What is meant by the term "formats"? Some people, like myself, will also call this resolution. We have briefly mentioned formats when talking about aspect ratios and file compression, and they play a key role in both. When shooting a film, the format is one of the most visually important decisions you make. Currently, the standard for most video is compressed Full HD, but it hasn't always been that way. In the past there have been many resolutions, and as technology improved, so did the resolution; increasing the resolution meant the camera could capture more pixels.

Back in the 1990s, TV content such as programmes, advertisements and music videos was all shot at around 480p with a 4:3 aspect ratio (some used the anamorphic ratio of 2.35:1, but the format stayed the same). As time went on, 480p held its place as the standard until the mid 2000s, when High Definition (HD) came out. This format was great for its time but was quickly pushed aside by Sony's new BluRay format, which brought Full HD to our screens. So what's the difference between 480p (Standard Definition or SD), HD and Full HD? HD is 720p and Full HD is 1080p; these are bigger resolutions with more pixels (greater definition), allowing clearer pictures.

For a time 1080p was the thing to have, but now it's pretty much a consumer standard; most cameras (including smartphones) can shoot compressed Full HD, and although 1080p RAW still isn't a consumer format, it's been so long that nobody really minds any more. This is because a new format, known as 4K, has started being used. Now, don't be tricked into thinking 4K is the highest format available; it's not. There are cameras out there capable of shooting 6K internally, such as the Red EPIC Dragon. It's also possible to upscale towards 8K using methods such as a 1.3x anamorphic lens, or shooting portrait and manually blending frames together (which isn't convenient even for industry leaders).
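To put some numbers on those formats, here's a short sketch listing the resolutions mentioned above and how many pixels each one captures per frame:

    formats = {
        "SD (480p)":       (640, 480),
        "HD (720p)":       (1280, 720),
        "Full HD (1080p)": (1920, 1080),
        "Ultra HD":        (3840, 2160),
        "DCI 4K":          (4096, 2160),
    }

    for name, (w, h) in formats.items():
        # More pixels per frame means more definition (and bigger files)
        print(f"{name:16} {w}x{h}  {w * h / 1e6:.1f} megapixels")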

So what formats do production companies use for their idents? Well, it depends on the film. For The Hobbit trilogy, they would have made several idents to match the formats and frame rates of each version of the film. Seeing as each of The Hobbit films was shot at 4K 48fps, they would have made several renders for each version. These include:

  • 4K resolution with 48 fps (3D)
  • 4K resolution with 24 fps (2D)
  • Full HD resolution with 24 fps (2D)
  • Full HD resolution with 48fps (3D)
There is a common misconception that 4K TVs are commonplace to buy. This is wrong: what tend to be sold as 4K TVs actually use the Ultra HD (UHD) format, which is similar to 4K but with a smaller width. That's not to say genuine 4K TVs don't exist; they do, but they are about as common as 2K TVs are compared to Full HD ones. Companies exploit this confusion; even cameras such as the GH4 shoot a high bit rate of UHD, not true 4K. Why does UHD exist, then? Because it is a direct 4x upscale of Full HD (double the width and double the height), and since Full HD is the consumer standard, it would be illogical to start changing aspect ratios again when everyone is used to 16:9. Basically, neither DCI 4K nor 2K is exactly 16:9; they are slightly wider, at roughly 1.9:1.
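The difference is easy to show with a bit of arithmetic (DCI 4K here means the 4096x2160 cinema standard; the figures below are just that maths written out):

    uhd = (3840, 2160)
    dci_4k = (4096, 2160)
    full_hd = (1920, 1080)

    # UHD is exactly double Full HD in each dimension, so four times the pixels
    print(uhd[0] / full_hd[0], uhd[1] / full_hd[1])     # 2.0 2.0
    print(uhd[0] * uhd[1] / (full_hd[0] * full_hd[1]))  # 4.0

    # Aspect ratios: UHD keeps 16:9, DCI 4K is slightly wider
    print(round(uhd[0] / uhd[1], 2), round(dci_4k[0] / dci_4k[1], 2))  # 1.78 1.9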

Frame Rates
Previously, I have mentioned frame rates throughout the explanations above, because frame rates are key in film and TV. Frame rates are both a measurement and a format, and the type of production you are watching decides which frame rate is used. For a cinematic experience, the industry standard is 24fps; for live TV it's 60fps; and for 3D cinematic films it's 48fps. Our eyes can perceive far higher frame rates, but only in short bursts. Frame rates are measured in frames per second (fps). For example, in the cinema, for every second of film you watch, your eye processes 24 frames.
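The arithmetic behind that is as simple as it sounds; for example (the two-hour runtime is just a figure I've picked):

    def total_frames(fps, duration_seconds):
        """Number of frames shown over a clip's duration."""
        return fps * duration_seconds

    print(total_frames(24, 1))            # 24 frames in one second of cinema
    print(total_frames(24, 2 * 60 * 60))  # 172800 frames in a two-hour film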

So why use 60fps for live TV if 24fps is known for its cinematic look? Well, everything up to 60fps is noticeably different: you can see the difference between 48fps and 60fps, but once you go over the 60fps mark it gets much harder. The difference you notice as frame rates increase is how smooth the motion feels, and 60fps is used for live TV to give a realistic feel. There are also other reasons why higher and lower frame rates are used, not just for smoothness or feel but for their technical advantages. When you see slow motion in films, have you ever wondered how they managed it? It's really quite simple: you raise the frame rate when shooting. Your film plays back at 24fps, but you shot your slow-motion footage at 48fps (or more), which means that in post you can conform those frames down to 24fps and the video plays back slower without any "fps lag" (the jumpy motion you get when frames are missing and information is lost).
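Here's a tiny sketch of that conform step, assuming a 24fps timeline (the capture rates are example values):

    def slow_motion_factor(capture_fps, timeline_fps=24):
        """How many times slower footage plays when conformed to the timeline rate."""
        return capture_fps / timeline_fps

    print(slow_motion_factor(48))   # 2.0x slower
    print(slow_motion_factor(120))  # 5.0x slower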

Another use for frame rates is in gaming. In fact, frame rates have become a big thing for gamers, with benchmarks set at 60fps. This is done for the same reasons as live TV: the extra smoothness the frames add means there's far less motion blur, which is great for fast-moving events such as Nascar, where quick objects become very hard to see when blurred. Some gamers also prefer to play with their frame rates higher than 60fps, because some events in games need more frames than others, such as large explosions compared to smaller muzzle flashes. When an explosion goes off, the workload briefly overruns the processor and frames get dropped. So gamers with weaker machines that only just manage 60fps will quite often fall to 50, sometimes 40, which can make the game feel weird.
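One way to see why those drops feel so noticeable is to look at the time budget each frame gets (a simple calculation, not measured from any particular game):

    def frame_budget_ms(fps):
        """Milliseconds available to render each frame at a target frame rate."""
        return 1000 / fps

    for fps in (60, 50, 40, 24):
        print(fps, round(frame_budget_ms(fps), 1), "ms per frame")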

Back on the topic of film, 3D films have begun to ditch the traditional 24fps in favour of the new 48fps look, and many people are angry about it. The idea behind the switch is that 3D essentially layers two slightly offset versions of the same composition, which, when corrected, give the illusion of the image popping out. The two compositions mean that double the frames are needed for the full experience, and for films such as The Hobbit trilogy this technique was needed. The only problem is that there's a theory suggesting why 24fps looks cinematic and why anything other than 24fps doesn't. So even though 48fps looks nice and smooth, and you might naively think it would still feel cinematic because it's simply a multiple of 24, it doesn't: watching a film at 12fps isn't cinematic, it's horrible, and 48fps just feels like a weird combination of Nascar and cinema. Some people liked it, though, because the doubled frames mean more information is fed into the retina, allowing more detail in the film to be taken in before the fast cuts.




