
YouTube Starts Testing the New Cutting-Edge Codec AV1

LAwLz


36 minutes ago, Mihle said:

I hope they use it to make videos look better while taking the same space, rather than looking the same while using less space.

My guess is that their primary goal is to reduce the space used, rather than to increase quality.

That's probably the primary reason why Google has pushed so hard for higher-efficiency codecs. Imagine being able to cut storage costs as well as bandwidth usage by ~30% (maybe even more, since not everything on YouTube is VP9 yet).

We're talking millions upon millions of dollars saved, as well as a much improved user experience.

 

 

3 minutes ago, asus killer said:

From what I've read, you get around a 20% improvement over HEVC but with a 3x to 5x increase in computational complexity. That's a lot. I don't know if it makes sense for every situation.

It's still very early in development. Encode times will drop by a lot as the software gets more optimized. Also, they are betting big on hardware solutions: they have all the major hardware makers (except Qualcomm) involved.

Nvidia, AMD, Intel and ARM have all been involved and had input into how the codec should be structured for high efficiency in hardware.

One of the goals for NETVC is low-latency video conferencing, so real-time encoding will be possible.

Amazon is probably interested in adopting it for Twitch too.


7 minutes ago, LAwLz said:

My guess is that their primary goal is to reduce the space used, rather than to increase quality.

That's probably the primary reason why Google has pushed so hard for higher-efficiency codecs. Imagine being able to cut storage costs as well as bandwidth usage by ~30% (maybe even more, since not everything on YouTube is VP9 yet).

We're talking millions upon millions of dollars saved, as well as a much improved user experience.

 

 

It's still very early in development. Encode times will drop by a lot as the software gets more optimized. Also, they are betting big on hardware solutions: they have all the major hardware makers (except Qualcomm) involved.

Nvidia, AMD, Intel and ARM have all been involved and had input into how the codec should be structured for high efficiency in hardware.

One of the goals for NETVC is low-latency video conferencing, so real-time encoding will be possible.

Amazon is probably interested in adopting it for Twitch too.

I get that it's good for them: save space, save money. The problem is we have to get better hardware on our end. Let's hope for some optimization.


Sadly, m8, none of our phones or GPUs have hardware decoding for AV1, and neither will the new RTX 2080. The Intel HD graphics in the new 9000-series lineup probably won't have it either, and no phone until 2019+ will have it.

I think this issue is way too big and annoying. We need a generic decoding platform using compute or something like that, so when new codecs arrive they can use the GPU to decode them efficiently. It might be slower than dedicated hardware instructions, but using the CPU to decode video is extremely inefficient.


50 minutes ago, LAwLz said:

My guess is that their primary goal is to reduce the space used, rather than to increase quality.

That's probably the primary reason why Google has pushed so hard for higher-efficiency codecs. Imagine being able to cut storage costs as well as bandwidth usage by ~30% (maybe even more, since not everything on YouTube is VP9 yet).

We're talking millions upon millions of dollars saved, as well as a much improved user experience.

 

That's not what I want :(. I have more than enough internet bandwidth at home. I want better quality than YouTube currently offers, especially because many uploaders don't upscale their 1080p footage to 4K before uploading. (I wouldn't care as much if everyone did upscale, at least until I get a 4K screen.)

 

Do you think Netflix would do the same if they also switched?

(If not, I will probably continue using Blu-rays for the stuff I most want to watch in good quality.)


21 minutes ago, yian88 said:

Sadly, m8, none of our phones or GPUs have hardware decoding for AV1, and neither will the new RTX 2080. The Intel HD graphics in the new 9000-series lineup probably won't have it either, and no phone until 2019+ will have it.

I think this issue is way too big and annoying. We need a generic decoding platform using compute or something like that, so when new codecs arrive they can use the GPU to decode them efficiently. It might be slower than dedicated hardware instructions, but using the CPU to decode video is extremely inefficient.

We already have a "generic decoding platform". It's called GPU shaders.

There were shader decoders which ran on the GPU for VP9 (I believe AMD still uses this approach on their graphics cards rather than fixed-function hardware), and the same will happen for AV1.

But it will take time to get it working, and even more time to get it working well.

 

With a bit of luck we will hear about fully fixed-function hardware solutions around spring 2019, and then they will hopefully show up in devices ~6 months later.

I would not be surprised if the Samsung Galaxy S11 has full AV1 support in hardware, although that might be a bit too optimistic.

 

2 minutes ago, Mihle said:

Do you think Netflix would do the same if they also switched?

(If not, I will probably continue using Blu-rays for the stuff I most want to watch in good quality.)

Netflix is on board and will adopt AV1, but my guess is that they will also use it to reduce bandwidth rather than to increase quality.


3 hours ago, LAwLz said:

One of the goals for NETVC is low-latency video conferencing, so real-time encoding will be possible.

Amazon is probably interested in adopting it for Twitch too.

Oh wow, I didn't realize that they were looking at it from that perspective too. I'd assume to replace H.264 for WebRTC video?

 

1 hour ago, LAwLz said:

We already have a "generic decoding platform". It's called GPU shaders.

There were shader decoders which ran on the GPU for VP9 (I believe AMD still uses this approach on their graphics cards rather than fixed-function hardware), and the same will happen for AV1.

But it will take time to get it working, and even more time to get it working well.

Desktop/laptop Vega used hybrid decoding, but Raven Ridge APUs have a fixed-function VP9 encoder/decoder.


VLC doesn't support it.


We will see how it goes, but their goal of compressing more than H.264 while giving better image quality seems pie in the sky to me. By definition, if you are compressing something you are losing quality, no?

For the record, I am not a programmer.


1 hour ago, DrMacintosh said:

By definition, if you are compressing something you are losing quality, no?

Not necessarily. Lossy compression schemes can average out or discard detail to shrink the data, which might introduce artifacts or miscolour the image. Zip, by contrast, is lossless compression: the data has to come back out exactly as it went in.
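
To make the distinction concrete, here's a minimal Python sketch of the lossless case (illustrative only; zlib implements DEFLATE, the same algorithm used inside zip files):

import zlib

data = b"the same bytes over and over " * 100   # highly redundant input

compressed = zlib.compress(data)          # lossless DEFLATE, as used in zip files
restored = zlib.decompress(compressed)

assert restored == data                   # every byte comes back exactly as it went in
print(len(data), "->", len(compressed))   # prints something like: 2900 -> 71

A lossy video codec, by contrast, is allowed to hand back something slightly different from the original, as long as it looks close enough to the eye.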


9 hours ago, LAwLz said:

For new stuff produced? I think so. I wouldn't be surprised if it gets more widely adopted than H.264 even.

There will still be other codecs on the market, though. For example, during editing, companies will most likely still use ProRes or DNxHD. But hopefully it will become so widely used that it might be the only thing end users encounter (until AV2 is released, and work on that is already scheduled to begin).

 

How long that will take is up in the air though. Hardware support is scheduled for late 2019. Then support needs time to get into the hands of consumers before it can be widely adopted. So maybe ~5 years before we really see it taking off?

That's always the issue with these new codecs, isn't it? It creates a divide in the market: devices that have hardware support, and devices that don't. Often, when it comes to a powerful desktop, it doesn't matter so much, since it can just "brute force" the playback without issues anyway, but for laptops, phones, etc., that efficiency is a make-or-break sort of thing. To that end, it would be nice if there were a not-so-hardware way of implementing the decoding.

I had a thread recently (well, at first complaining about, but then looking into, how VP9 on YouTube is accelerated or not by different hardware), and one interesting takeaway was how my N3350 seems to do it in Edge. It doesn't technically have hardware support, but it's absolutely hardware accelerated (something I wish other browsers would also do...). My understanding is that they've implemented some sort of software solution that uses the iGPU for compute to run it. If this technique were more widely in use, perhaps the transition between codecs would be more continuous and less of a quantum leap, with all the pain that comes with it.


1 hour ago, DrMacintosh said:

We will see how it goes, but their goal of compressing more than H.264 while giving better image quality seems pie in the sky to me. By definition, if you are compressing something you are losing quality, no?

For the record, I am not a programmer.

H.264 is far from the holy grail of compression. There may well be some theoretical, mathematical limit to how good compression can get, but we aren't anywhere near it yet. It's just a matter of discovering/inventing new, more efficient methods. To see how compression has improved over time, just look back over the history of video codecs to this point. Or, look forward and compare something like WebP to JPEG.


2 hours ago, DrMacintosh said:

We will see how it goes, but their goal of compressing more than H.264 while giving better image quality seems pie in the sky to me. By definition, if you are compressing something you are losing quality, no?

For the record, I am not a programmer.

HEVC already provides better image quality at the same or smaller file sizes compared to H.264, and AV1 is one step above HEVC. 

 

Compression does not necessarily mean you lose any quality. PNG is lossless compression. So are the zip files you might have on your computer. FLAC is another lossless format.

 

But H.264, HEVC and AV1 are primarily lossy codecs. They remove some information in order to make the files smaller. However, it's more complicated than just "let's remove some color info and the file will be smaller".

AV1 simply has more clever ways of representing the same image with fewer bytes than, say, H.264 or HEVC.

 

Let's take this as an example: eleven 1s in a row.

11111111111 - This is the raw, uncompressed data.

1111111111 - This is how H.264 might represent it: as ten 1s in a row. Some data was lost, but we're pretty damn close to the original.

11111x2 - This is how HEVC might represent it. A computer might interpret that as "five 1s in a row, two times". The result is exactly the same as in the H.264 example, but it uses fewer bytes to store.

1x10 - This is how AV1 might store it, in just 4 characters.

 

Very oversimplified, but you get the point.
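
For the curious, here's that idea as a toy run-length encoder in Python (a sketch of the concept only; real codecs work on pixel blocks, motion vectors and frequency transforms, not character runs):

def rle_encode(s: str) -> str:
    # Collapse each run of a repeated character into "<char>x<count>".
    out = []
    i = 0
    while i < len(s):
        j = i
        while j < len(s) and s[j] == s[i]:
            j += 1                        # extend the run
        run = j - i
        # Only use the marker when it's shorter than the run itself.
        out.append(f"{s[i]}x{run}" if run > 3 else s[i] * run)
        i = j
    return "".join(out)

print(rle_encode("11111111111"))          # "1x11" - 4 characters instead of 11

Note that this toy encoder is lossless; the lossy part in the example above is the codec deciding that ten 1s are "close enough" to eleven before it ever encodes anything.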


6 hours ago, matrix07012 said:

VLC doesn't support it.

23 hours ago, LAwLz said:

 


Like I said earlier, about 40 companies are in the alliance and have pledged to implement support for it in their products. Some of the most noteworthy companies in the alliance are:

  • Google
  • Apple
  • Microsoft
  • Mozilla
  • Cisco
  • IBM
  • Nvidia
  • AMD
  • ARM
  • Intel
  • Realtek
  • Amazon
  • Facebook
  • Netflix
  • Adobe
  • BBC
  • Hulu
  • VLC
  • Vimeo

 

VideoLAN (VLC) is one of the companies & groups WORKING on AV1.

The codec is not ready right-this-second but will be supported in a future update to VLC once the codec is live and in the wild.


12 hours ago, Technous285 said:

VideoLAN (VLC) is one of the companies & groups WORKING on AV1.

The codec is not ready right-this-second but will be supported in a future update to VLC once the codec is live and in the wild.

I am 99% sure that VLC includes an AV1 decoder.

I tried playing the file in VLC but couldn't get it working. My guess is that the demuxer doesn't support it yet.

I'm going to try it without the container later when I'm back at my computer.


On 14/09/2018 at 4:32 PM, matrix07012 said:

VLC doesn't support it.

It's had 'experimental' support since 3.0 


1 minute ago, Camoxide said:

It's had 'experimental' support since 3.0 

Do I have to enable it somewhere?


23 minutes ago, matrix07012 said:

Do I have to enable it somewhere?

You shouldn't have to, but I wouldn't expect decent playback at this stage. I've had issues with VLC updating before, though, so it might be worth uninstalling and getting a fresh version off their website.


On 9/14/2018 at 4:32 PM, matrix07012 said:

VLC doesn't support it.

VLC is notorious for doing things right and being the best player. They probably prefer not to have it at all rather than have it not working properly.

If it becomes a thing, they will add support for sure.

