Your first assumption might be that ProRes is all about retaining the maximum amount of detail and dynamic range in footage – it does that, but its most notable purpose is something else.
Like Apple ProRAW, this is a format made for editing. Video editors might convert captured clips to ProRes (or its rival Avid DNxHD) when importing them into a program like Apple Final Cut Pro, Adobe Premiere Pro or Blackmagic Design’s DaVinci Resolve. Alternatively, you can record straight to ProRes with many cameras, using an outboard recorder.
But why use ProRes? One reason is that it's much easier on your computer’s CPU, resulting in a far smoother editing process than you’d see using the H.264 or H.265 formats your camera probably captures natively.
What's interesting about its introduction on the iPhone 13 Pro and iPhone 13 Pro Max is that this is the first time you've been able to shoot with the format on a smartphone. Here's what that means and why it could be an important feature for some shooters.
What is Apple ProRes?
Apple introduced ProRes in 2007, as part of its Final Cut Pro 6 editing software. The pitch back then was that it “produces stunning HD quality at SD file sizes”.
This tells us one of its key characteristics. ProRes is a compressed format, but one that aims to be “visually lossless”.
ProRes is not a singular thing, though. It’s a family of six quality standards, from the space-saving 'ProRes 422 Proxy' right up to the ultra-high-quality 'ProRes 4444 XQ'.
It's not yet clear which of these will be coming to the iPhone 13 Pro series, but Apple says you'll be able to shoot in 4K/30p in the format (unless you choose the 128GB model, in which case you'll annoyingly be restricted to 1080p at 30fps).
Which of those six formats filmmakers choose depends on their priorities of image quality versus file size/portability. However, if you use hardware that can record ProRes natively, you’re also limited to what it supports. An Atomos Ninja V external recorder, for example, supports ProRes 422 LT, 422 and 422 HQ.
Below is a table of the six ProRes standards, for the curious, along with how much space they save versus an uncompressed file. We used figures from Apple’s ProRes white paper, published in 2020, based on recording at 1080p resolution, 29.97fps.
What’s the difference between 4:4:4 and 4:2:2? They’re chroma subsampling schemes – different ways to encode color. 4:2:2 ProRes stores color at half the horizontal resolution of 4:4:4, but is still 10-bit, offering enough fidelity to avoid obvious color banding in most cases. And as you can see from these figures, even the highest-quality ProRes files are still far smaller than uncompressed ones.
| Chroma | Format | Bit rate (1080p, 29.97fps) |
| --- | --- | --- |
| 4:2:2 | Uncompressed 4:2:2 video | 1,326Mbps |
| 4:2:2 | Apple ProRes 422 HQ | 220Mbps |
| 4:2:2 | Apple ProRes 422 | 147Mbps |
| 4:2:2 | Apple ProRes 422 LT | 102Mbps |
| 4:2:2 | Apple ProRes 422 Proxy | 45Mbps |
| 4:4:4 | Uncompressed 4:4:4 video | 2,237Mbps |
| 4:4:4 | Apple ProRes 4444 XQ | 500Mbps |
| 4:4:4 | Apple ProRes 4444 | 330Mbps |
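You can sanity-check the uncompressed figures from first principles. Here’s a minimal Python sketch, assuming 10-bit samples (20 bits per pixel on average for 4:2:2, since the two chroma channels are sampled at half horizontal resolution; 30 bits per pixel for 4:4:4):

```python
# Theoretical uncompressed bit rates for 1080p at 29.97fps.
# 4:2:2 10-bit: 10 bits luma + 10 bits chroma on average = 20 bits/pixel.
# 4:4:4 10-bit: three full-resolution channels = 30 bits/pixel.
WIDTH, HEIGHT, FPS = 1920, 1080, 29.97

def uncompressed_mbps(bits_per_pixel):
    """Theoretical bit rate in Mbps (1Mbps = 1,000,000 bits/s)."""
    return WIDTH * HEIGHT * FPS * bits_per_pixel / 1e6

print(f"4:2:2 10-bit: {uncompressed_mbps(20):.0f}Mbps")  # ~1243Mbps
print(f"4:4:4 10-bit: {uncompressed_mbps(30):.0f}Mbps")  # ~1864Mbps
```

Apple’s white-paper figures come out a little higher, most likely because real uncompressed storage formats pad 10-bit samples into word-aligned layouts rather than packing them tightly.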
Why wouldn’t you just use ProRes all the time? An H.264- or H.265-encoded clip is still going to be much smaller, as those formats are designed to provide decent image quality at low bitrates. And it is precisely this style of encoding that makes them a bit of a pain, and a processor drain, to work with in an editing suite. If you have an older laptop, it may not even be able to handle 4K H.265 playback smoothly.
You’d use ProRes in the editing/production part of the process, before converting the final clip into something people can play on their phones, or on their TV via a USB stick – most likely an H.264 or H.265-encoded file. That said, YouTube does actually support ProRes 422.
Content creators can upload a ProRes file directly to YouTube, although it’s going to end up in the H.264 format once it has been through the platform’s compression wringer anyway.
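If you want to try this workflow without a camera that records ProRes natively, the free ffmpeg tool can transcode camera footage using its prores_ks encoder, which maps the 422 family to numbered profiles. Here’s a hedged sketch that just builds and prints the command – the file names are placeholders:

```python
import shlex
import subprocess

# ffmpeg's prores_ks encoder numbers the ProRes variants as profiles:
# 0 = Proxy, 1 = LT, 2 = 422, 3 = 422 HQ (4 and 5 cover the 4444 tiers).
PROFILES = {"proxy": 0, "lt": 1, "422": 2, "hq": 3}

def prores_command(src, dst, profile="hq"):
    """Build an ffmpeg command that transcodes src into a ProRes .mov."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "prores_ks", "-profile:v", str(PROFILES[profile]),
        "-c:a", "pcm_s16le",  # uncompressed audio, standard for editing
        dst,
    ]

cmd = prores_command("clip.mp4", "clip_prores.mov")
print(shlex.join(cmd))
# Uncomment to actually run it (requires ffmpeg on your PATH):
# subprocess.run(cmd, check=True)
```

The resulting .mov drops straight into Final Cut Pro, Premiere Pro or DaVinci Resolve without taxing the CPU the way the original H.265 file would.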
Who uses Apple ProRes?
Anyone who edits video in one of the industry standard suites, like Final Cut Pro, Premiere Pro or DaVinci Resolve, has probably considered using ProRes at the very least.
It’s a practical, high-quality format to use when working on a video project. If you have a camera that can shoot uncompressed raw video, you might think you want to use that. However, if you’re dealing with high-resolution capture and any real length, it’ll chew through SSD space at an intimidating rate.
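To put that “intimidating rate” in rough numbers: a 4K frame has four times the pixels of a 1080p frame, so scaling the 1080p bit rates from the table above gives a back-of-envelope estimate of storage consumed per minute. This is a sketch, not a quoted specification – real bit rates vary with frame rate and content:

```python
# Rough 4K storage math: scale Apple's published 1080p bit rates by 4x
# (3840x2160 has four times the pixels of 1920x1080).
BITRATE_1080P_MBPS = {
    "Apple ProRes 422 HQ": 220,
    "Apple ProRes 422": 147,
    "Uncompressed 4:2:2": 1326,
}

def gb_per_minute(mbps_1080p, pixel_scale=4):
    """Approximate GB written per minute of footage (1GB = 1e9 bytes)."""
    return mbps_1080p * pixel_scale * 1e6 * 60 / 8 / 1e9

for name, mbps in BITRATE_1080P_MBPS.items():
    print(f"{name} at ~4K: ~{gb_per_minute(mbps):.1f}GB per minute")
```

Even ProRes 422 HQ works out at roughly 6.6GB per minute at 4K – hefty, but far more manageable than the nearly 40GB per minute that uncompressed capture would demand.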
Using 10-bit ProRes is a good idea even if your camera only shoots 8-bit video. First, your computer will handle it better than the files your camera spits out – and even if the original footage only has 8-bit color, the edits you make can be processed at a higher color fidelity. That means smoother gradients, and less obvious color banding.
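The fidelity gap is easy to quantify: 8-bit color allows 256 levels per channel, while 10-bit allows 1,024, so a subtle gradient gets four times as many steps before banding becomes visible. A quick illustration:

```python
# Levels per channel at a given bit depth, and the smallest brightness
# step available across a full 0.0-1.0 gradient.
def levels(bits):
    return 2 ** bits

for bits in (8, 10):
    n = levels(bits)
    print(f"{bits}-bit: {n} levels, smallest step = {1 / (n - 1):.5f}")
```

That smaller step size is why 10-bit intermediate files hold up better under heavy color grading, even when the source was 8-bit.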
Why is ProRes on the iPhone 13 Pro a big deal?
The introduction of Apple ProRes on the iPhone 13 Pro and Pro Max, which Apple has said is coming "later this year", will make this video format more mainstream than it has ever been.
ProRes may be common parlance among people who make videos all the time, but getting your head around this stuff can be tricky for someone who has a phone and wants to start making funny, well-produced social media videos, for example.
You don’t need expensive video editing software to use ProRes, either. iMovie has supported it since 2014 – as long as it’s not in an MXF (Material Exchange Format) container, anyway. The container determines the file extension you’ll see at the end of the file name, and ProRes video also comes in the more familiar 'MOV' format.
Apple isn't really doing anything technologically groundbreaking here. You can already use tools to transcode the iPhone 12’s H.264 and HEVC (H.265) footage into ProRes. But by removing steps in the process, Apple is making it much easier for newcomers to adopt these best practices from day one. Skipping an extra compression step also avoids potential image degradation – transcoding already-compressed footage is a bit like recording a copy of a copy.
Adoption of ProRes is also a sign of Apple’s confidence in the iPhone 13 Pro’s video output. While its phones’ video quality isn’t quite as far ahead of the competition as their stills – the need to crank out 30 or 60 frames per second restricts how much computational work can be done to mitigate the limitations of small sensor size – iPhones do still offer best-in-class video.
What about Apple ProRes Raw?
We can’t talk about ProRes without also mentioning the step-up format, ProRes Raw. This was introduced in 2018, and provides ProRes-style compression for raw video capture. It's the same concept: smaller files, relatively low CPU/GPU strain, and in this case, greater ease of use than the alternative format Cinema DNG. There are others too, but these are made by camera manufacturers, to serve their own cameras.
ProRes Raw is the kind of format you might use with a high-end mirrorless camera made for video, or a dedicated video camera, alongside an outboard recorder. Several Atomos recorders support ProRes Raw, including the Ninja V. DJI’s Zenmuse X7 drone camera can record in ProRes Raw natively, too. And the resulting file sizes are actually comparable with the higher quality bands of “non raw” ProRes.
You don’t have to use Final Cut Pro to work with ProRes Raw either, as Adobe Premiere Pro supports it. However, DaVinci Resolve does not, at the time of writing.
The reason is pretty obvious when you realize DaVinci Resolve is made by Blackmagic Design, which created its own compressed raw video format, Blackmagic Raw. That’s a license-free standard, unlike ProRes Raw. But from our perspective, as camera users whose choices are directed by the cameras we end up buying, this little war isn’t in our interest.
DaVinci Resolve does support all versions of non-Raw ProRes, however. Blackmagic may choose to support ProRes Raw when demand from the users of this $299 suite outweighs its desire to promote its own compressed raw format.
Andrew is a freelance journalist and has been writing and editing for some of the UK's top tech and lifestyle publications including TrustedReviews, Stuff, T3, TechRadar, Lifehacker and others.