YouTube Starts Testing the New Cutting-Edge Codec AV1

LAwLz
2 hours ago, Dan Castellaneta said:

The Apple TV probably follows the iOS layout of using AVC video and MP4 (AAC) audio.

It’s due to Apple not supporting VP9. 


9 minutes ago, Dan Castellaneta said:

Hmm, strange that Opus can do video, although to be fair, the way Opus was made to begin with is pretty damn strange.

You probably watch a ton of videos that are in VP9; usually, older videos, or newer videos with relatively low view counts, tend to be in the AVC/AAC setup.

ok this is very strange... I decided I had best make sure I've got this right so I went looking and everything I see only talks about opus and audio... I'm going to keep looking until I find what I very clearly remember watching though.

 

edit: apparently I'm just stupid, it wasn't Opus at all... they call it "Daala". Cool stuff tho...

Edited by Ryan_Vickers

Solve your own audio issues  |  First Steps with RPi 3  |  Humidity & Condensation  |  Sleep & Hibernation  |  Overclocking RAM  |  Making Backups  |  Displays  |  4K / 8K / 16K / etc.  |  Do I need 80+ Platinum?

If you can read this you're using the wrong theme.  You can change it at the bottom.


H.264 and HEVC are standardized by MPEG. That is why those two formats, as well as MPEG-2, are widely used in the broadcast industry. At the same time, they are full of patents which you have to pay for (for commercial usage). Also, those codecs can be used for live encoding/decoding.

AV1 sounds promising for offline encoding, but I am unsure how it would perform in a broadcast situation. That is what I am interested in, along with how it compares to HEVC in system resource usage.


54 minutes ago, Niksa said:

H.264 and HEVC are standardized by MPEG. That is why those two formats, as well as MPEG-2, are widely used in the broadcast industry. At the same time, they are full of patents which you have to pay for (for commercial usage). Also, those codecs can be used for live encoding/decoding.

AV1 sounds promising for offline encoding, but I am unsure how it would perform in a broadcast situation. That is what I am interested in, along with how it compares to HEVC in system resource usage.

I'll be shocked if AV1 takes off in the broadcasting industry; it's always been about MPEG as far as the digital broadcasting realm is concerned.

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.


Is there a way to toggle this feature on/off after activating it?


1 minute ago, tonyeezy said:

Is there a way to toggle this feature on/off after activating it?

Gotta go to the TestTube page (OP linked it in the topic post) and enable and disable it from there.

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.


they should do something to lower the size of videos while keeping decent quality

ASUS X470-PRO • R7 1700 4GHz • Corsair H110i GT P/P • 2x MSI RX 480 8G • Corsair DP 2x8 @3466 • EVGA 750 G2 • Corsair 730T • Crucial MX500 250GB • WD 4TB


Just now, aezakmi said:

they should do something to lower the size of videos while keeping decent quality

As it is, the VP9 videos are quite a bit smaller than the AVC videos and compress better.

YouTube will probably end up aiming for something slightly higher quality than the current VP9 setup but significantly smaller.
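
(As an aside, if anyone wants to check what a specific video actually serves, here's a rough sketch using youtube-dl's Python API to list the available format codes with their codecs and bitrates. The URL is a placeholder and the exact fields depend on what the extractor reports.)

```python
# Rough sketch: list the formats YouTube exposes for a video, with their
# codecs and bitrates, via youtube-dl's Python API.
# Assumes youtube-dl is installed; the URL below is a placeholder.
import youtube_dl

url = "https://www.youtube.com/watch?v=PLACEHOLDER"

with youtube_dl.YoutubeDL({"quiet": True}) as ydl:
    info = ydl.extract_info(url, download=False)

for f in info["formats"]:
    print(
        f["format_id"],                  # the "format code" people refer to
        f.get("ext", "?"),
        "video:", f.get("vcodec", "?"),  # e.g. vp9, avc1.xxxx, or "none"
        "audio:", f.get("acodec", "?"),  # e.g. opus, mp4a.40.2, or "none"
        "bitrate:", f.get("tbr", "?"),   # total bitrate in kbit/s, if reported
    )
```

Audio-only formats show a vcodec of "none", so this also reveals the Opus/AAC audio bitrate attached to each format code.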

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.


3 hours ago, WereCat said:

It's not the resolution, it's the bitrate.

You can stream at 4K if you want, but if it's anything other than static images it will look a lot worse at the same bitrate that would typically be used for 720p streaming, because the picture fidelity would get significantly worse as the compression would have to be a lot greater.

Yeah, but the more heavily you compress the video, the harder it is to decode, and all AV1 decoding is done in software right now, and it's a particularly hard codec to decode. That's my point.

 

It doesn't help you to have a much tighter bitrate if none of your viewers can actually decode it, at least not without setting their laptops on fire (exaggeration, obviously).
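
(To put rough numbers on that bitrate-versus-resolution point, here's a quick back-of-the-envelope bits-per-pixel calculation; the 3000 kbit/s figure is just an assumed 720p30-ish streaming bitrate, not anything official.)

```python
# Back-of-the-envelope: the same bitrate spread over a 4K frame leaves far
# fewer bits per pixel than over a 720p frame, so anything with motion has to
# be compressed much harder and fidelity drops. Numbers are illustrative only.
def bits_per_pixel(bitrate_kbps, width, height, fps):
    return bitrate_kbps * 1000 / (width * height * fps)

bitrate = 3000  # kbit/s, assumed 720p30-ish streaming bitrate
print("720p30:", round(bits_per_pixel(bitrate, 1280, 720, 30), 3), "bits/pixel")
print("4K30:  ", round(bits_per_pixel(bitrate, 3840, 2160, 30), 3), "bits/pixel")
# ~0.109 vs ~0.012 bits/pixel: the 4K encode has to squeeze roughly 9x harder.
```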

 

3 hours ago, MoonSpot said:

My Core2Duo didn't, and VP9 was a considerable factor in my upgrade. Would have much preferred the ability to offload to the dGPU, but nope.

Went from a Q9000 with 4 cores at a solid 2.00 gigahertz to a 7700HQ with 4 cores at 2.8 (when Intel feels like it) jigahurts. Wouldn't feel as conned if it were an upgrade that needed to happen because of CPU bicep flex, rather than hipster CPU 'finesse' energy-saving landfill-filling blather being the principal reason.

Yeah, but the Core 2 Duo also doesn't support hardware decoding of H.265, the other primary high-compression format, so what's your point?

 

Does the Core 2 Duo even have a GPU with H.264 hardware decoding?

 

Sorry to tell you, but fixed-function hardware is just as important as "CPU bicep flex" for doing any kind of serious task.

 

And 2.0 GHz on a C2D is not the same as 2.0 GHz on a modern CPU architecture so I'm not sure why you even mentioned clocks.

 

If you've got a dGPU that can handle VP9 decode, you can absolutely do it on the dGPU.


6 minutes ago, LucidMew said:

What's wrong with VP9? I thought that was supposed to go up against HEVC/h.265 (and it won imo). What am I missing?

Probably the fact that it doesn't compress well enough.

AV1 looks unbelievably better at similar bitrates compared to VP9.

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.


5 hours ago, Dan Castellaneta said:

All I wish is for AV1 to have proper hardware acceleration support across a decent amount of hardware, unlike VP9.

Also, I've been looking for fucking ages for what the audio bitrate is for the various format codes.

You can't just "add" hardware features to existing hardware, so older-gen hardware will never have hardware acceleration for cutting-edge stuff.

 

However, once a codec has taken off, all new hardware will support it and older hardware will gradually be phased out (it will take a long, long time). For example, all new GPUs released in the last two years or so have H.265/VP9 decoding capabilities.
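
(If you're wondering what your own machine exposes, here's a minimal sketch that lists the hardware-backed decoders a local ffmpeg build was compiled with; it assumes ffmpeg is on PATH, and names like hevc_cuvid/vp9_cuvid are NVIDIA's NVDEC wrappers while qsv entries are Intel Quick Sync.)

```python
# Minimal check of which hardware-backed decoders the local ffmpeg build lists.
# Assumes ffmpeg is on PATH. "cuvid" entries are NVIDIA NVDEC wrappers,
# "qsv" entries are Intel Quick Sync; whether they work still depends on the GPU.
import subprocess

decoders = subprocess.run(
    ["ffmpeg", "-hide_banner", "-decoders"],
    capture_output=True, text=True, check=True,
).stdout

for line in decoders.splitlines():
    if "cuvid" in line or "qsv" in line:
        print(line.strip())  # e.g. h264_cuvid, hevc_cuvid, vp9_cuvid
```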


Hopefully this means snow and small particles aren't going to make Youtube melt the video quality any longer ...

If you want to reply back to me or someone else USE THE QUOTE BUTTON!                                                      
Pascal laptops guide


4 hours ago, Ryan_Vickers said:

For some reason when I read the title I assumed it was being sarcastic xD well, I guess this is good.  It leaves me wondering where some other codecs fit into the "family" though, namely AVC, VP9, and Opus.  Are those just older versions in the same "line", or are they totally different?  Which if any of those were/are proprietary or charge some sort of fee to use?  I know Youtube has used (or can still if you set it to) AVC, and currently the "main" one is VP9... I wonder why Opus never caught on.  They were already using it for audio I believe, why not video too?  From what little I know, it too was supposed to be very high quality (better visuals per byte), and also totally free and open.

From what I understand having done wiki-walking a few weeks ago: VP9 was meant to be replaced by VP10, but VP10 was canned in favour of developing and publishing AV1 as the start of a new 'family' to replace the older VP* 'family'.


32 minutes ago, Castdeath97 said:

Hopefully this means snow and small particles aren't going to make Youtube melt the video quality any longer ...

I feel like that's an unavoidable problem (well, aside from just throwing vastly more bitrate at it).  The new codec may look better "per MB" and so this might not be as much of an issue practically speaking, but will it handle things like this proportionally better?  I kinda doubt it.

Solve your own audio issues  |  First Steps with RPi 3  |  Humidity & Condensation  |  Sleep & Hibernation  |  Overclocking RAM  |  Making Backups  |  Displays  |  4K / 8K / 16K / etc.  |  Do I need 80+ Platinum?

If you can read this you're using the wrong theme.  You can change it at the bottom.


6 hours ago, Ryan_Vickers said:

ok this is very strange... I decided I had best make sure I've got this right so I went looking and everything I see only talks about opus and audio... I'm going to keep looking until I find what I very clearly remember watching though.

 

edit: apparently I'm just stupid, it wasn't Opus at all... they call it "Daala". Cool stuff tho...

Daala, along with a bunch of other projects, was folded into one, aptly named AV1.


If it is more complex than HEVC, it'll be a while before this becomes standard. Even high-end phone SoCs struggle with HEVC encodes, and it'll be years before everyone has a hardware codec in their pocket.


HEVC has barely had a chance to take hold and it's already been replaced?

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


9 hours ago, WereCat said:

Hopefully... but it seems like it is quite a bit more demanding to encode than x265 as of right now.

"quite a bit" is an understatement.

Both encoding and decoding are extremely slow right now. But that's to be expected. The bitstream specification was frozen just a few months ago and there has been next to no work on optimizing the encoders and decoders yet, but they already have several ideas on how to do it.

We will probably see quite a bit regarding AV1 encoding and decoding at IBC 2018 (the International Broadcasting Convention), which is taking place right now.

 

8 hours ago, Sakkura said:

Don't get your hopes up for that kind of consolidation.

 

https://xkcd.com/927/

All the major players have agreed to adopt it.

Like I said in the OP, here are some of the ones that have agreed to adopt and start using it:

10 hours ago, LAwLz said:
  • Google
  • Apple
  • Microsoft
  • Mozilla
  • Cisco
  • IBM
  • Nvidia
  • AMD
  • ARM
  • Intel
  • Realtek
  • Amazon
  • Facebook
  • Netflix
  • Adobe
  • BBC
  • Hulu
  • VLC
  • Vimeo

 

So I have no worries that it will get adopted.


8 hours ago, Ryan_Vickers said:

For some reason when I read the title I assumed it was being sarcastic xD well, I guess this is good.  It leaves me wondering where some other codecs fit into the "family" though, namely AVC, VP9, and Opus.  Are those just older versions in the same "line", or are they totally different?  Which if any of those were/are proprietary or charge some sort of fee to use?  I know Youtube has used (or can still if you set it to) AVC, and currently the "main" one is VP9...

AVC (H.264) and HEVC (H.265) are developed by MPEG.

VP8, VP9 and VP10 are developed by Google.

Theora and Daala are developed by Mozilla/Xiph.

Thor is developed by Cisco.

 

The story of how we ended up here and how everything relates to each other is this:

 

The past:

When HTML5 (or maybe it was WebRTC) was being drafted, browser makers had to decide which video codec would be required by the W3C specifications for a browser to be compliant. A lot of companies, including Microsoft, lobbied for H.264. The problem with H.264 is that it requires a license from MPEG (the group that developed it, which Microsoft just so happens to be a member of). It costs money to implement.

 

Many companies, including Google and Mozilla, were very resentful of making H.264 a web standard because they believe the web should be free and open. Requiring someone to pay licensing fees to develop a W3C-compliant browser goes against the foundation the Internet was built upon.

So what Google did was buy a company called On2 Technologies, which had developed a codec called VP8. After buying the company, Google open sourced the codec and made it free to use. The codec was pretty good. Hardware support was lacking and the quality of the encoders was nowhere near as good as, say, x264 (the most popular H.264 encoder), but the video quality was pretty competitive with H.264.

 

However, the MPEG group did not like this, so they started threatening to sue anyone who used VP8, claiming it infringed on some of their patents. This led to VP8 never really being adopted, because H.264 was already established and people and companies were scared of using it.

 

Then Cisco came to the rescue. Cisco had hit the yearly licensing fee cap for H.264, which meant that no matter how many devices used their decoder, they could not be forced to pay more than what they already did. So what did they do? They open sourced their H.264 decoder, and made it free for anyone to use. They would shoulder the burden of paying the licensing fee, and anyone could technically implement H.264 without having to pay anything.

 

In the end, the W3C specification ended up allowing either H.264 or VP8 to be used.

 

The present:

Fast forward to HEVC (H.265) being released.

HEVC is a licensing nightmare. It's still developed by MPEG, but the licensing is not only much more expensive (something like 100 times more expensive, if I recall correctly), it's also no longer a single license you buy. In order to not open yourself up to lawsuits, you have to negotiate separate licensing agreements with something like 10 different companies and patent pools, because the developers and patent owners behind HEVC have not agreed on what the licensing structure should look like. On top of that, HEVC does not have a licensing fee cap like AVC had (to prevent Cisco from just doing what they did before).

 

So now companies were starting to get really worried. Something had to be done because the HEVC licensing situation was terrible.

As a result, Google developed VP9, which has gained some traction and is comparable to HEVC in terms of compression efficiency (that is to say, how small the file is without losing quality, or how good the quality looks without making the file size bigger). But because of a lack of support from some of the big players (mainly Apple, Microsoft and Adobe, if you ask me), it hasn't gained as much traction as it could have, possibly also because of MPEG's previous threats of legal action against people who used VP8.

 

This is the state we are in right now. You're either using the previous generation of codecs (H.264 or VP8), or you have to decide whether you want to use HEVC or VP9. HEVC is expensive and a legal clusterfuck, and VP9 is not widely adopted, so you may run into compatibility issues. On top of that, the encoder tools aren't that great either.
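
(As an aside, a quick way to get a feel for the compression-efficiency trade-off yourself is to encode the same clip with libvpx-vp9 and libx265 and compare file sizes. The sketch below assumes an ffmpeg build with both encoders; the clip name and CRF values are placeholders, not a calibrated quality match.)

```python
# Minimal sketch: encode one clip with VP9 and HEVC and compare file sizes.
# Assumes ffmpeg was built with libvpx-vp9 and libx265; "clip.mp4" and the
# CRF values are placeholders, not an official quality equivalence.
import os
import subprocess

src = "clip.mp4"

subprocess.run(["ffmpeg", "-y", "-i", src, "-c:v", "libvpx-vp9",
                "-crf", "32", "-b:v", "0", "-an", "out_vp9.webm"], check=True)
subprocess.run(["ffmpeg", "-y", "-i", src, "-c:v", "libx265",
                "-crf", "28", "-an", "out_hevc.mp4"], check=True)

for path in ("out_vp9.webm", "out_hevc.mp4"):
    print(path, round(os.path.getsize(path) / 1e6, 2), "MB")
```

A serious comparison would also match quality with a metric like PSNR or VMAF instead of eyeballing CRF values, but this shows the mechanics.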

 

The future:

Because this situation isn't great for anyone, the Internet Engineering Task Force (IETF) decided that they had to do something. The IETF is an open standards organization known for things such as IPv4, IPv6, IMAP, and several other protocols and standards. Needless to say, they are a pretty big deal.

The IETF has announced that they want a standardized video codec for use online. They are calling the project NETVC and have laid out some key features that they deem important, such as a competitive compression rate, no overly complex licensing agreements, and so on.

 

Three companies were working on what I'd call "next generation" video codecs and decided to start aligning their goals with the goals of the IETF. I believe all three were also submitted as NETVC candidates. These companies were:

  • Google with their VP10 codec, the successor to VP9.
  • Mozilla/Xiph with their Daala codec. You might have heard of Xiph before, because they are the developers behind Opus as well as some other open and royalty free codecs.
  • Cisco with their Thor codec. Cisco owns some of the patents for HEVC, so they have reused those in Thor.

Then, after working on their own separate codecs for a while, they all agreed that they should combine their efforts and focus on one single codec, which resulted in AV1 being born. AV1 is primarily based on VP10, but with several features taken from both Daala and Thor, as well as new techniques and features developed from scratch.

 

But in order to avoid the previous pitfalls, such as the lack of support and the risk of lawsuits that VP8 ran into, these three companies decided to recruit other companies, and that resulted in AOMedia being born.

 

So the status of AOMedia and AV1 today is this:

Almost 40 companies have joined AOMedia and agreed to not only endorse and push the standard, but also pool their patents and make them free to use for the purposes of AV1.

They have a fantastic license, as well as legal teams which have gone over their patents and how AV1 works with a fine-toothed comb, and created frameworks which make sure it is completely legal and risk-free to use AV1.

The AV1 bitstream specification has been finalized and seems to have achieved the goals set out by the IETF for the NETVC standard.

 

I am not sure when NETVC will be decided, but I believe the last milestone laid out in their roadmap is the storage format specification, which needs to be submitted this December.

After that, who knows. I don't think there is any doubt that AV1 will become the NETVC standard though.


2 minutes ago, LAwLz said:

All the major players have agreed to adopt it.

Like I said in the OP, here are some of the ones that have agreed to adopt and start using it:

 

So I have no worries that it will get adopted.

 

Sure, it'll get adopted. But will it be the only thing everyone uses all the time? That's far more questionable.


9 minutes ago, Sakkura said:

Sure, it'll get adopted. But will it be the only thing everyone uses all the time? That's far more questionable.

For new stuff produced? I think so. I wouldn't be surprised if it gets more widely adopted than H.264 even.

There will still be other codecs on the market though. For example, during editing, companies will most likely still use ProRes or DNxHD. But hopefully it will become so widely used that it might be the only thing end users encounter (until AV2 is released; work on that is already scheduled to begin).

 

How long that will take is up in the air though. Hardware support is scheduled for late 2019. Then support needs time to get into the hands of consumers before it can be widely adopted. So maybe ~5 years before we really see it taking off?


45 minutes ago, LAwLz said:

"quite a bit" is an understatement.

Both encoding and decoding are extremely slow right now. But that's to be expected. The bitstream specification was frozen just a few months ago and there has been next to no work on optimizing the encoders and decoders yet, but they already have several ideas on how to do it.

We will probably see quite a bit regarding AV1 encoding and decoding at IBC 2018 (the International Broadcasting Convention), which is taking place right now.

 

[Image: encoder speed and compression comparison chart]

 

So with the reference encoder, AV1 is about 10,000x slower than x264 (1080p Main), but produces roughly 60% smaller files at the same quality (around 20% to 30% smaller vs x265) :D

 

 

So yes, "quite a bit" is indeed an understatement.

https://code.fb.com/video-engineering/av1-beats-x264-and-libvpx-vp9-in-practical-use-case/
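
(To make those percentages concrete, here's the arithmetic on an assumed example; only the percentages come from the chart and article above, and the 150 MB starting size is made up for illustration.)

```python
# Illustrative arithmetic on the figures above; the 150 MB H.264 size is assumed.
h264_mb = 150
av1_mb = h264_mb * (1 - 0.60)   # ~60% smaller than x264 at the same quality
hevc_a = av1_mb / (1 - 0.20)    # if AV1 is ~20% smaller than x265...
hevc_b = av1_mb / (1 - 0.30)    # ...or ~30% smaller
print(f"AV1 ~{av1_mb:.0f} MB, HEVC roughly {hevc_a:.0f}-{hevc_b:.0f} MB")  # 60, 75-86

# And the speed side: at ~10,000x slower than x264, an encode x264 finishes in
# one minute would take the unoptimized AV1 reference encoder about a week.
print(round(10_000 / (60 * 24), 1), "days")  # ~6.9 days
```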

 

 


Good, can't hurt to have more free-as-in-freedom standards. Too bad we'll probably need new hardware to use it.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


Hope they use it to make videos look better while taking up the same space, rather than looking the same while using less space, in the future.

“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking

