Deep-dive: Intel Arc's AV1 video encoding may be the future of GPU streaming
Imagine a world where video streaming could have significantly higher quality without using up more bandwidth, or the same quality with half the impact on your data caps. A world where Twitch streams aren't a blocky mess and YouTube videos can actually resemble the visual experience of playing the game yourself.

Adam Taylor/IDG
That world is the future promised by AV1: a new, open-source video codec that aims to dethrone H.264 as the primary video standard after its nearly 20-year reign. Though typically very difficult to encode, real-time AV1 encoding is now available to consumers via Intel's debut Arc Alchemist graphics cards (although the launch has been limited to China so far, with a U.S. release planned for later this summer).
I put Intel's GPU encoder to the test using a custom Gunnir Arc A380 desktop card to see if it delivers on the promise we've seen from AV1 so far, and how it competes against the existing H.264 encoders typically used for live streaming. I also want to explain why all of this matters so much. There's a lot to cover, so let's dig in.
[ Further reading: The best graphics cards for PC gaming ]
Before H.264 became the predominant video codec everywhere, online video was chaos. I have many fond memories of my Windows 98 and Windows XP computers being plagued with all manner of video player apps, from QuickTime to RealPlayer to the DivX/Xvid players of dubious legality, all to play AMVs or game trailers downloaded from eMule or LimeWire. Then, once YouTube started gaining popularity, we all had to deal with .FLV Flash Video files. It was hard to keep up. H.264 taking over and becoming accepted by nearly every application, website, and device felt like a promise fulfilled. But as the years (and years) have passed and video standards aim for higher resolutions and higher framerates, demand has grown for video codecs that operate with higher efficiency.
While H.264 was made effectively royalty-free, H.265 still has plenty of patents and licensing costs tangled up in it, which is why you don't see many consumer applications supporting it, and virtually no live streaming platforms accept it.
YouTube and Netflix switched to almost exclusively using VP9 (Google's own open-source video codec), but again, adoption in the consumer application space has been virtually nonexistent, and it seems the video streaming giants still want even higher efficiency.

Adam Taylor/IDG
That's where the Alliance for Open Media comes in. AOM is a collaborative effort to develop open-source, royalty-free, and flexible solutions for media streaming. Backed by basically every big corporation involved in web media, including Google, Adobe, Nvidia, Netflix, Microsoft, Intel, Meta, Samsung, Amazon, Mozilla, and even Apple, AOM's focus is on creating (and protecting, via a patent review process and a legal defense fund to keep the tech open) AV1, an ecosystem of open-source video and image codecs. Tools for metadata and even image formats have been developed, but what I'm focusing on here is the AV1 bitstream video codec.
It may seem strange for so many big corporations (and competing ones at that) to work together on one project, but ultimately this is an effort that benefits all of them. Lower bandwidth costs, a higher-quality product, and easier interoperability for whatever the future of streaming media might hold seem to beat out the benefits of the prior philosophy of everyone developing their own walled-off solutions.
My only personal concern here relates to several of these corporations' histories of pushing anti-consumer DRM for video both online and offline, and how those past actions might influence AV1 implementations.
AV1 in the open

Adam Taylor/IDG
All of this is great, but how do you actually get AV1 videos? Keep in mind that new codec adoption is normally very slow, but AV1 is moving rapidly; with that said, you can watch a good bit of AV1 online right now.
Starting with YouTube, you should head to your YouTube Playback Settings (while logged in) and choose "Always Prefer AV1" to improve your chances of actually being served AV1 transcodes of videos. From there, any video at 5K resolution or higher should already have AV1 transcodes ready to play. YouTube also created an AV1 Beta Launch Playlist back in 2018 with some sample videos that were given AV1 copies for some guaranteed testing. Anecdotally, I've been seeing more and more of the high-traffic videos I watch regularly playing back in AV1.

Adam Taylor/IDG
The video won't look any different to you in the video player, but if you right-click the video while it's playing and click "Stats for nerds," you should see "av01" next to "Codecs" if all is working properly.

Adam Taylor/IDG
Actually being able to decode AV1 can be a mixed bag. Modern quad-core PCs from 2017 or later shouldn't have any issue decoding 1080p AV1 footage on the CPU. But once you go beyond 1080p, you'll want hardware-accelerated decoding. Nvidia's GeForce RTX 30-series graphics cards support AV1 decode (including the RTX 3050), as do AMD's Radeon RX 6000-series GPUs and the iGPUs on Intel 11th Gen and newer CPUs. If you have this hardware, make sure your drivers are up to date, download the free AV1 Video Extension from the Microsoft Store, then refresh your browser. Depending on your graphics hardware, you'll even be able to see the "Video Decode" section of your system doing the work in Windows Task Manager. (Note: All of this applies to Windows 10 and 11 only; Windows 7 isn't supported.)
Netflix is already streaming AV1 for some films to compatible devices, as plenty of TVs, game consoles (including the older PlayStation 4 Pro), and some mobile devices already support it. Netflix's streams are proving to be a wonderful showcase for AV1's Film Grain Synthesis feature: the ability for the encoder to analyze a video file's film grain, remove it to compress the footage cleanly, and provide instructions for the decoder to re-create it faithfully, without wasting unnecessary bits on the grain.
Twitch's former Principal Research Engineer Yueshi Shen also shared an AV1 demo with me back in 2020 showcasing what AV1 could bring to Twitch streaming. Available here, you can see mostly blocking-free gameplay in 1440p 120FPS delivered with only 8mbps of bandwidth, 1080p 60FPS using only 4.5mbps, and 720p 60FPS using only 2.2mbps. While it's not a real-world live-streamed test, it's still seriously impressive given the terribly low quality normal H.264 encoders would produce at those bitrates in normal Twitch streams.
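To put those numbers in perspective, a rough bits-per-pixel calculation (bitrate divided by pixels delivered per second) shows just how lean those AV1 demo streams are compared with a typical 6mbps 1080p60 H.264 Twitch stream. A quick sketch of the arithmetic:

```python
# Bits-per-pixel for the Twitch AV1 demo streams quoted above, versus a
# typical 6 Mbps 1080p60 H.264 Twitch stream for comparison.
def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: int) -> float:
    # bitrate divided by the number of pixels delivered per second
    return bitrate_bps / (width * height * fps)

streams = {
    "AV1 1440p120 @ 8 Mbps": bits_per_pixel(8_000_000, 2560, 1440, 120),
    "AV1 1080p60 @ 4.5 Mbps": bits_per_pixel(4_500_000, 1920, 1080, 60),
    "AV1 720p60 @ 2.2 Mbps": bits_per_pixel(2_200_000, 1280, 720, 60),
    "H.264 1080p60 @ 6 Mbps": bits_per_pixel(6_000_000, 1920, 1080, 60),
}

for name, bpp in streams.items():
    print(f"{name}: {bpp:.4f} bits/pixel")
```

The 1440p120 demo stream is getting by on well under half the per-pixel budget of a standard 6mbps 1080p60 stream, which is exactly the kind of efficiency gain AV1 promises.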

Adam Taylor/IDG
Shen originally projected Twitch to have full AV1 adoption by 2025, with hopes of big-name content streaming in it as soon as this year or 2023. Hopefully, with consumer-accessible encoders now available, they can start turning on these new features soon.
Did Intel deliver?
AV1 encoders that run on the CPU have been available for a while now, but they've been very difficult to run, taking several hours to process a sample even on high-core-count machines. Performance has been improving steadily over the last couple of years, and two encoder options (SVT-AV1 and AOM AV1) are now available for use in OBS Studio version 27.2. As I covered when the OBS update released, these are still tough to run in real-time, but they are usable and a first step toward consumer AV1 video.
Most of the streaming and content creator scene has been waiting on hardware-accelerated encoders to show up on next-generation graphics cards instead. As mentioned, Intel, AMD, and Nvidia all added hardware AV1 decoders to their previous generation of hardware, and it's assumed to be a certainty that at least Nvidia will have an AV1 hardware encoder on RTX 4000 hardware, if not AMD with RX 7000 as well. However, and despite many delays of their own, Intel is first to market with GPU AV1 encoders on the new Arc Alchemist line of graphics cards.

Adam Taylor/IDG
The first thing I wanted to investigate with the new A380 GPU was how well the AV1 encoding actually holds up versus currently available options. While AV1 as an overall codec is incredibly promising, the results you get from any codec depend on the encoder implementation and the sacrifices necessary to run in real-time. GPU encoders run on fixed-function hardware within the GPU, enabling video encoding and decoding with minimal impact on normal 3D workloads, such as game performance, but they don't always produce ideal results.

Right-click and open in a new tab to view at full resolution.
Adam Taylor/IDG
I've previously covered how different hardware H.264 encoders have improved through iterative GPU generations, where it can be seen that early implementations of encoding with Nvidia's NVENC, for example, didn't produce anywhere near as high a quality as their newer cards can. I also recently examined how encoder quality can be somewhat improved through software updates (such as AMD's recent updates to their AMF encoder). All of this means that while there's plenty of room to be hyped for the first accessible AV1 encoder, there's also room to be disappointed by the first iteration of it, with much more to come.
The most reliable way I've found to quantitatively measure video quality is with Netflix's VMAF: Video Multi-Method Assessment Fusion. This is an algorithm that assesses video quality in a way that very closely matches how actual human viewers perceive video quality at a given distance and screen size, instead of relying on pure noise measurements such as PSNR. Netflix has been building this technology (and blogging about it) for quite a few years now, and it's gotten to a point where it very reliably measures the differences I would otherwise try to show you with repetitive side-by-side screenshots.
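For reference, VMAF is typically run via FFmpeg's libvmaf filter, which can write per-frame and pooled scores to a JSON log. Below is a minimal, hypothetical sketch of pulling the pooled mean score out of such a log; the scores here are made-up sample data, and the exact field names should be checked against what your libvmaf version actually writes:

```python
import json

# Minimal sketch of reading a libvmaf JSON log (as produced by something
# like: ffmpeg -i distorted.mp4 -i reference.mp4
#   -lavfi libvmaf=log_fmt=json:log_path=vmaf.json -f null -).
# The field names below follow libvmaf's v2 JSON output; treat them as an
# assumption and verify against your own log file. Scores are fake samples.
sample_log = json.loads("""
{
  "frames": [
    {"frameNum": 0, "metrics": {"vmaf": 93.2}},
    {"frameNum": 1, "metrics": {"vmaf": 91.8}}
  ],
  "pooled_metrics": {
    "vmaf": {"min": 91.8, "max": 93.2, "mean": 92.5, "harmonic_mean": 92.49}
  }
}
""")

mean_vmaf = sample_log["pooled_metrics"]["vmaf"]["mean"]
print(f"Mean VMAF: {mean_vmaf}")  # 0-100, higher is closer to the source
```

In practice you would point the filter at your distorted encode and the lossless reference, then read the pooled mean out of the log like this.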
I'm constantly testing a huge library of lossless gameplay and live-action video samples, but for simple presentation, I want to focus on a few games in the FPS genre. The fast-paced camera movements, highly detailed graphics with plenty of particle effects, and numerous HUD elements combine to provide a sort of worst-case scenario for video encoding, one that almost every Twitch streamer has struggled with.
My focus is on 1080p 60FPS video at three primary bitrates: 3500kbps, 6000kbps, and 8000kbps. 3500kbps (or 3.5mbps) is the lowest you would typically ever be advised to use at 1080p (at least, prior to AV1), 6000kbps is the soft cap for Twitch streams, and 8000kbps is an unofficial bandwidth cap that many are able to send to Twitch without issue. Twitch is the focus here because the site provides "Source" quality streams that don't go through the secondary stage of compression that YouTube streams do, which removes some of the benefits of higher-quality encoders at lower bitrates.

Right-click and open in a new tab to view at full resolution.
Adam Taylor/IDG
Here, I compare the usual hardware H.264 encoders from Intel, Nvidia, and AMD against the CPU x264 encoder running at the VerySlow CPU usage preset (something that can't be run in real-time, but is often considered the quality benchmark to shoot for), along with Intel's AV1 GPU encoder and two SVT-AV1 encode presets that can reasonably be encoded in real-time (on a high-end CPU like a Threadripper).
These results are, well, fascinating.

Right-click and open in a new tab to view at full resolution.
Adam Taylor/IDG
At 6 and 8mbps, Intel Arc's AV1 encoder goes back and forth between scoring higher than x264 VerySlow and scoring slightly lower (VMAF scores operate on a 0-100 scale, with 100 being a perfect match for the lossless/uncompressed source material), but it still scores impressively higher than even the best GPU H.264 encoders. Alone, that is already impressive enough. If you're a game streamer and you use a dual-PC streaming setup, or have the PCIe lanes and slots to add another GPU to your machine, once AV1 is enabled on Twitch you would be able to stream significantly higher-quality streams than most of what's on the entire website, at the same bitrate.
But if we look at the lower 3.5mbps bitrate, Intel's AV1 encoder soars above the H.264 encoders, including x264 VerySlow. And in some of the game tests, AV1 encoded on the Arc A380 at 3.5mbps scores higher than most of the H.264 options do at 6mbps (nearly double the bandwidth).

Right-click and open in a new tab to view at full resolution.
Adam Taylor/IDG
Theoretically, if Twitch enabled AV1 streaming right now, a streamer using an Arc A380 to encode their broadcast would enable everyone involved in the process (the streamer themselves, the viewer, and Twitch/Amazon) to cut their bandwidth in half without taking a hit in quality. It also means you could immediately get a jump in quality without changing any network requirements on the streamer's end.
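The aggregate savings are easy to sketch. Assuming a hypothetical channel with 1,000 concurrent viewers over a four-hour stream (both numbers are illustrative, not from my testing), halving the per-viewer bitrate halves the total egress:

```python
# Back-of-envelope sketch of what halving the per-viewer bitrate means in
# aggregate. Viewer count and stream length are hypothetical examples.
def egress_gb(bitrate_mbps: float, viewers: int, hours: float) -> float:
    # Mbps -> bits over the stream -> bytes -> gigabytes (decimal GB)
    return bitrate_mbps * 1e6 * hours * 3600 * viewers / 8 / 1e9

viewers, hours = 1000, 4.0
h264 = egress_gb(6.0, viewers, hours)   # 6 Mbps H.264 stream
av1 = egress_gb(3.5, viewers, hours)    # 3.5 Mbps AV1 stream

print(f"H.264 @ 6 Mbps: {h264:,.0f} GB")
print(f"AV1 @ 3.5 Mbps: {av1:,.0f} GB")
```

That's several terabytes of delivery saved per stream for one mid-sized channel, which is why the platforms themselves are backing AOM.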
Moving from data to actual real-world visuals, the results are still just as impressive.
At 6 and 8mbps, Intel's AV1 encoder presents more of the same at first glance compared to H.264 encoders. There's a small amount of added sharpness, though not enough to stand out, but a very noticeable lack of blocking or artifacting in areas with big light changes or shadows.

Right-click and open in a new tab to view at full resolution.
Adam Taylor/IDG
Again, you won't be blown away comparing results at these higher bitrates, but it is an improvement. AV1 seems to do a very good job of more smoothly blending areas where detail needs to be sacrificed, instead of creating the pixelated-looking blocking you're used to seeing. Sometimes, at a certain distance, you may even feel a portion of the H.264 footage looks sharper than the AV1 footage due to the extra "crunch" giving the illusion of added detail, but zoom in and that detail isn't actually there.

Right-click and open in a new tab to view at full resolution.
Adam Taylor/IDG
At just 3.5mbps, Intel's AV1 encoder does begin to exhibit a small amount of macro-blocking in gradients, but it avoids it in detailed areas of the screen compared to H.264, and really delivers a picture you wouldn't think would be possible at such a low bitrate.
Looming competition
My VMAF graphs also included a couple of additional data points that scored significantly higher than Intel's AV1 encoder. These are two encodes using the SVT-AV1 CPU encoder. This encoder uses presets (similar to x264) numbered such that smaller numbers are harder to encode but yield higher quality (comparable to going slower with x264), and higher numbers are easier with worse quality. In my testing, even on a 32-core Threadripper CPU, 8 and 9 were the only realistic presets to encode with in real-time. So, keeping with my theme of slightly-unobtainable benchmark quality, I included SVT presets 7 and 8 on my graphs.
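For readers who want to try these presets themselves, here's a hedged sketch of how the preset knob maps onto an FFmpeg command line using the libsvtav1 encoder (the filenames and bitrate are placeholders, and you should check `ffmpeg -h encoder=libsvtav1` for the exact preset range your build supports):

```python
# Hypothetical SVT-AV1 encode commands built for FFmpeg's libsvtav1 encoder.
# Lower -preset numbers are slower but higher quality, much like slower x264
# presets; higher numbers are faster with worse quality.
def svtav1_cmd(preset: int, bitrate_kbps: int, src: str, dst: str) -> list[str]:
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libsvtav1",
        "-preset", str(preset),
        "-b:v", f"{bitrate_kbps}k",
        dst,
    ]

# Presets 8 and 9 were the only realistic real-time options on a 32-core
# Threadripper in my testing; preset 7 is the slightly-unobtainable
# quality benchmark from the graphs.
realtime = svtav1_cmd(8, 6000, "gameplay.mkv", "stream.webm")
archive = svtav1_cmd(7, 6000, "gameplay.mkv", "archive.webm")
print(" ".join(realtime))
```

For offline encodes where speed doesn't matter, you would simply keep lowering the preset number until the encode time becomes unbearable.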

Right-click and open in a new tab to view at full resolution.
Adam Taylor/IDG
And as you can see, while there are marginal differences between these two specific presets, they both out-score Intel's AV1 encoder by a long shot. Much like how x264 VerySlow far outperforms the GPU H.264 encoders, this is to be expected.
I thought it was important to include these for two reasons. First, if you're just encoding videos for upload to YouTube, archives, and so on, you can use these slower encoder profiles with AV1 and dramatically boost your bitrate efficiency (saving on space and upload times, or increasing quality for the same file size). Secondly, I'm hoping it's a preview of what we should expect from GPU AV1 encoders over the next couple of years.
While Intel's QuickSync Video H.264 encoder (on both Arc GPUs and 12th Gen iGPUs) currently leads Nvidia and AMD in quality, previous generations lagged behind Nvidia (and even AMD, if we go back far enough), which means if Nvidia launches RTX 4000 with an AV1 encoder, it could perform at least somewhat better than Intel's offering. Plus, as mentioned, this is just the first iteration. As hardware improves, so will the encoder. I really want Nvidia and AMD to compete on the AV1 encoder front, delivering users even higher quality, but I must say I'm pretty stoked with where our starting line is here with Intel's Arc graphics cards.
The future of video streaming is very bright, and considerably less blocky. I'm uploading all of my YouTube videos in AV1 going forward, and I look forward to streaming in the new format as soon as platforms allow.
[Disclosure: My Gunnir Photon A380 graphics card unit was sampled to me by Intel for inclusion in my usual encoder quality analysis content on my YouTube channel. I was not paid by Intel for any coverage, am under no obligation to say anything specific, and Intel has not seen anything I post about the GPU prior to publishing. I've been sent GPU samples by Nvidia and AMD for this same purpose.]