If you've ever edited digital video before, you've probably seen a long list of different media formats available to you. But have you ever stopped to wonder what all those names mean, which one you should be using, and why? To answer that question, we have to understand digital video codecs.
The word codec is short for coder-decoder. In the context of digital video, an encoder is a piece of software that converts a digital video into a stream of ones and zeros that can be written to a file or sent over the internet. A decoder is a program that takes that stream and converts it back into a full-color digital video that can be sent to a display device.
While this might seem fairly straightforward, it's quite a bit more complicated than you might think. A digital image is comprised of a grid of RGB color values, and many of these images played back in sequence create a video. But in order to store a video in a file, we have to express all of that information as one long string of ones and zeros, so we have to be organized about how we store it. We could, for example, just store each RGB value left to right, top to bottom. That would work, but what if someone else encodes their color values starting at the center and moving outward?
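To make the ordering problem concrete, here's a minimal sketch (entirely hypothetical, nothing like a real codec) that serializes a tiny RGB image left to right, top to bottom, and decodes it back. The point is simply that the decoder only works because it assumes the same pixel order the encoder used:

```python
# Hypothetical sketch: serialize a grid of (R, G, B) tuples
# row-major (left to right, top to bottom), then rebuild it.
# A real codec is vastly more sophisticated; this only shows
# that encoder and decoder must agree on the pixel order.

def encode(pixels):
    """Flatten the grid into one long list of channel values."""
    return [channel for row in pixels for pixel in row for channel in pixel]

def decode(data, width, height):
    """Rebuild the grid, assuming the same row-major order."""
    pixels = []
    for y in range(height):
        row = []
        for x in range(width):
            i = (y * width + x) * 3  # start of this pixel's 3 channels
            row.append(tuple(data[i:i + 3]))
        pixels.append(row)
    return pixels

image = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (255, 255, 255)]]
assert decode(encode(image), 2, 2) == image  # round trip is exact
```

If the decoder assumed a different order (say, column-major), the same stream of numbers would come back as a scrambled image.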
This matters because, in the real world, the encoding phase and the decoding phase are often done by very different pieces of software running on very different computers. In order for our image to be transferred correctly, both the source and the destination have to agree on exactly how the image is encoded, or else it's going to end up a garbled mess. The solution to this problem is to publish a set of standards that describe how images should be encoded and decoded.
If I'm writing a program that encodes an image into a file, I refer to one of the published codec standards and build my program to encode images according to those specifications. My friend on the playback side can then write a decoding program that assumes the encoded file adheres to those same specifications. As long as both of us have followed the codec specification correctly, we should be able to encode and decode images together without issue. In fact, every device that's been programmed to be compatible with that codec will be able to play back the files I create flawlessly. If it can't, then one of us hasn't adhered to the codec specification correctly.
Although devices like cameras and computers are built very differently in terms of both hardware and software, codecs make it possible to transfer digital video from one device to another without any confusion. Compatibility isn't the only thing codecs bring to the table, though: they also allow for image compression. Storing every single RGB value for every single pixel of every single frame of a video would require an enormous amount of data, so much that it would be completely impractical to work with. So, as part of the encoding process, we can compress the information in order to reduce the bitrate of our video and therefore reduce the file size.
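A quick back-of-the-envelope calculation shows why uncompressed video is impractical. The figures below are assumptions for illustration (1080p at 30 frames per second, 3 bytes of RGB per pixel), not from any particular spec:

```python
# Raw, uncompressed video: how much data is that, really?
width, height, fps = 1920, 1080, 30   # assumed: 1080p at 30 fps
bytes_per_pixel = 3                   # one byte each for R, G, B

bytes_per_frame = width * height * bytes_per_pixel      # 6,220,800 bytes
bits_per_second = bytes_per_frame * 8 * fps             # ~1.49 Gbit/s
gigabytes_per_hour = bytes_per_frame * fps * 3600 / 1e9 # ~672 GB per hour

print(f"{bits_per_second / 1e9:.2f} Gbit/s")
print(f"{gigabytes_per_hour:.0f} GB per hour")
```

Roughly 1.5 gigabits per second, hundreds of gigabytes per hour — far beyond what a typical internet connection or storage budget can handle, which is exactly why codecs compress.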
Lossless vs. lossy compression
There are two main kinds of compression out there: lossless and lossy.

Lossless compression compresses the image without losing any fine detail whatsoever. You could put an image through a lossless codec infinitely many times and it would still come out the other end looking exactly the same as it started. As you can imagine, though, the requirement that no information can be lost means there isn't much you can do to reduce the bitrate. Sure, simple images can be compressed very effectively, but complex ones, like the ones we see in the real world, can't be compressed much without losing information. As such, lossless codecs typically have a very high bitrate and are not very convenient to work with. To combat this, we have lossy compression.
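You can see both properties of lossless compression with Python's built-in zlib module (a general-purpose lossless compressor, standing in here for a lossless video codec): round trips are bit-for-bit exact no matter how many times you repeat them, and complex, noise-like data barely shrinks at all. The "images" below are just byte strings I made up for illustration:

```python
import os
import zlib

# A simple, repetitive "image": 10,000 pixels of one flat color.
flat_color = bytes([200, 120, 40]) * 10_000   # 30,000 bytes

# Five full compress/decompress cycles: the data never degrades.
data = flat_color
for _ in range(5):
    data = zlib.decompress(zlib.compress(data))
assert data == flat_color                     # bit-for-bit identical

# A "complex" image: 30,000 bytes of random noise.
noisy = os.urandom(30_000)

print(len(zlib.compress(flat_color)))  # tiny: the pattern repeats
print(len(zlib.compress(noisy)))       # barely below (or above) 30,000
```

The flat color compresses to a tiny fraction of its size, while the noise is essentially incompressible, which is why real-world footage keeps lossless bitrates high.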
Lossy compression can reduce file sizes very significantly by throwing away little bits of detail that the eye isn't likely to notice. There's a wide variety of techniques that lossy codecs employ to do this, so many, in fact, that they'd need to be a topic for another post. What's important to understand is that more compression means smaller file sizes but reduced visual quality. Many lossy codecs will offer the user some choice about how heavily they want the image compressed: a lossy codec with a high bitrate can still produce a very good-looking image, while a lossy codec with a low bitrate will be very compact but will look noticeably worse than the original.

So how much compression should you use, then? Well, it depends on your needs. If, for example, you're shooting video that's going to have heavy color correction or visual effects applied, then you should probably use the highest-quality codec you can and take a light touch when it comes to compression. If, on the other hand, you're running a video streaming service like YouTube or Netflix, then you're going to need to keep the bitrate as low as possible so it doesn't overwhelm your users' internet connections. It all depends on what you need.
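Here's a toy example of the quality/compression trade-off: quantization, one of the building blocks real lossy codecs use, though this bare version is purely illustrative. Each 0-255 sample gets snapped to a coarser step; a bigger step means fewer distinct values to store (more compression) but more detail destroyed:

```python
# Toy lossy scheme (for illustration only): quantize each 0-255
# sample to multiples of `step`, then reconstruct. Detail finer
# than `step` is gone forever, which is what makes it lossy.

def lossy_round_trip(samples, step):
    """Quantize then reconstruct; returns the degraded samples."""
    return [min(255, round(s / step) * step) for s in samples]

original = [12, 13, 14, 200, 201, 202]   # subtle gradients
print(lossy_round_trip(original, 4))     # small step: close to original
print(lossy_round_trip(original, 64))    # big step: gradients wiped out
```

With a small step the result is nearly indistinguishable from the input; with a large step, the subtle variation between neighboring values collapses entirely. That's the same knob, in miniature, that a codec's bitrate setting turns.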
Now, with all that in mind, let's go over some common codecs you'll see when working in the world of digital video.

H.264 and H.265

Starting off, we have the one and only H.264. This codec is designed to be as efficient as possible, delivering the lowest possible file sizes without losing too much visual fidelity. It's very widely adopted, meaning that H.264 files are compatible with pretty much everything. It's particularly popular among web streaming services, since it allows them to deliver HD content to their users without putting undue strain on the average home internet connection.
However, since its compression is fairly lossy and not optimized for playback performance, it's not ideal for work like color correction and visual effects. And while the image might look fine on your typical smartphone, it's going to leave a lot to be desired projected on a movie theatre screen.
There's also H.264's younger brother, H.265, which has the same goals as H.264 but employs newer compression techniques to deliver even better images at the same bitrates. It's becoming increasingly popular, but its compatibility isn't as universal as its predecessor's, so that's something to be aware of.
Apple ProRes, DNxHD, and GoPro Cineform
Moving up, we have the high-quality lossy codecs such as Apple ProRes, DNxHD, and GoPro Cineform. These codecs are designed to maintain a very high degree of image quality while also delivering a high degree of playback performance. While their file sizes aren't as huge as lossless codecs', they're still too burdensome to stream over the internet, so these codecs are mostly used for high-quality image capture on the camera side or high-performance editing on the post-production side.
In the realm of cinema, quality is the top priority. Because of this, most high-budget films use raw codecs for capture, which retain every bit of detail captured by the image sensor, and they maintain visually lossless compression all the way to the theatre screen. The standard codec for digital theatres is JPEG 2000, which is very, very different from normal JPEG: at the high bitrates used for cinema it's visually lossless, allowing images to be blown up to huge scales without any visible loss of detail.
So that's a very basic introduction to the world of video codecs.