
Adobe flips on GPU-accelerated encoding for Premiere Pro

https://www-pcworld-com.cdn.ampproject.org/v/s/www.pcworld.com/article/3544027/adobe-flips-on-gpu-accelerated-encoding-for-premiere-pro-and-wow-its-fast.amp.html?amp_js_v=a3&amp_gsa=1&usqp=mq331AQFKAGwASA%3D#referrer=https%3A%2F%2Fwww.google.com&amp_tf=From %1%24s&ampshare=https%3A%2F%2Fwww.pcworld.com%2Farticle%2F3544027%2Fadobe-flips-on-gpu-accelerated-encoding-for-premiere-pro-and-wow-its-fast.html

 

"Today, Adobe officially rolled out hardware encoding support for Nvidia and AMD GPUs in Premiere Pro, After Effects, and Adobe Media Encoder, letting you lean on the power of your graphics card to speed up H.264 and HEVC video exporting"

 

It's definitely some interesting news. Also, we need a drag race of a 3990X vs. a Titan RTX.

Adobe flips on GPU-accelerated encoding for Premiere Pro, and wow it's fast | PCWorld


How is it different to current GPU acceleration in those applications? Weren't they already GPU-accelerated? O.o

ʕ•ᴥ•ʔ

MacBook Pro 13" (2018) | ThinkPad x230 | iPad Air 2     

~(˘▾˘~)   (~˘▾˘)~


5 hours ago, mikeike951 said:

Dude, you don't need to add a .html file of the news article; the Wayback Machine exists.

*Insert Witty Signature here*

System Config: https://au.pcpartpicker.com/list/Tncs9N

 


5 hours ago, Soppro said:

How is it different to current GPU acceleration in those applications? Weren't they already GPU-accelerated? O.o

I think the previous GPU acceleration was for things like effects. This is about accelerating the encoding process using dedicated hardware that exists on the GPU.

It's even more efficient than using the regular GPU cores, and it's for encoding, which used to be done on the CPU.

 

It's the reason people have said Final Cut on macOS is so fast and "optimized".
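If you want a rough feel for what that dedicated encoder block buys you outside of Premiere, here's a minimal sketch that times a software H.264 export against an NVENC export of the same clip via ffmpeg. It assumes an NVENC-capable ffmpeg build on the PATH and a file called input.mp4; both are placeholders, not anything Adobe ships.

```python
# Rough sketch: time a software (libx264) export against a hardware
# (h264_nvenc) export of the same clip. Assumes an ffmpeg build with NVENC
# support is on PATH and that "input.mp4" exists; both are assumptions.
import subprocess
import time

def encode(codec: str, outfile: str) -> float:
    """Run one export with the given video codec and return the wall time."""
    start = time.time()
    subprocess.run(
        ["ffmpeg", "-y", "-i", "input.mp4",
         "-c:v", codec, "-b:v", "16M",   # same target bitrate for both runs
         "-c:a", "copy", outfile],
        check=True,
    )
    return time.time() - start

cpu_time = encode("libx264", "out_cpu.mp4")     # software encode on the CPU
gpu_time = encode("h264_nvenc", "out_gpu.mp4")  # dedicated NVENC block on the GPU
print(f"libx264: {cpu_time:.1f}s   h264_nvenc: {gpu_time:.1f}s")
```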


5 hours ago, Soppro said:

How is it different to current GPU acceleration in those applications? Weren't they already GPU-accelerated? O.o

To a point. When it came to the final render, it always used the CPU. You could force GPU rendering, but it could fail partway through and you'd have to render the whole thing again, and since a final render can take a very long time, the wasted time was a real problem for anyone on a tight schedule. Ad companies are one example, since making an ad may take a week or two from inception to release.

A failed render would taint their reputation: the customer, who has already bought both the ad and the ad space for it, now has to wait longer. That could be millions of dollars down the drain, and the company takes the blame for failing to deliver the product on time.

Just now, LAwLz said:

I think the previous GPU acceleration was for things like effects. This is about accelerating the encoding process using dedicated hardware that exists on the GPU.

It's even more efficient than using the regular GPU cores, and it's for encoding, which used to be done on the CPU.

This is right, to a point. Decoding of MPEG-family codecs like H.264 (the most common video codec) can already be accelerated by a GPU. For a consumer that doesn't really matter much, but to edit video with lots of effects and other processing, the footage has to be decoded first and THEN re-encoded before it can be shown on the user's screen. That can already be demanding for 1080p footage on regular hardware, and 4K (hell, even 8K), HDR and other media-enhancing standards are becoming more common, and they take more and more horsepower to handle at acceptable levels.

Combine that with the fact that the editor is playing MULTIPLE video streams at the same time, and this can really bog down a CPU, which is why GPU encoding has been popular for a very long time. The reason it wasn't used in the final rendering process is because it wasn't stable enough for the uses many people would put it to (e.g. LTT's render server would benefit from it). For LTT it's fine: if the video gets delayed a day due to a failed render, so what, do a livestream and make up for it. But production companies like the ad companies I talked about can't afford a failed render, since it could do real damage to their brand and reputation for failing to deliver the promised product.
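To make that decode-then-re-encode round trip concrete, here's a hedged sketch that keeps the whole trip on the GPU with ffmpeg: NVDEC decodes the source, a CUDA scaler resizes it, and NVENC re-encodes it. The file names and the CUDA-enabled ffmpeg build are assumptions, and this is not how Premiere does it internally.

```python
# Sketch of the decode -> re-encode round trip described above, kept on the
# GPU: NVDEC decodes the source, a CUDA scaler downsizes it, and NVENC
# re-encodes it. Assumes an ffmpeg build with CUDA/NVENC support and a file
# named "source_4k.mp4"; both are assumptions.
import subprocess

cmd = [
    "ffmpeg", "-y",
    "-hwaccel", "cuda", "-hwaccel_output_format", "cuda",  # decode on the GPU
    "-i", "source_4k.mp4",
    "-vf", "scale_cuda=1920:1080",        # resize without leaving video memory
    "-c:v", "h264_nvenc", "-b:v", "20M",  # re-encode on the NVENC block
    "-c:a", "copy",
    "reencoded_1080p.mp4",
]
subprocess.run(cmd, check=True)
```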

*Insert Witty Signature here*

System Config: https://au.pcpartpicker.com/list/Tncs9N

 


2 hours ago, Salv8 (sam) said:

The reason it wasn't used in the final rendering process is because it wasn't stable enough for the uses many people would put it to (e.g. LTT's render server would benefit from it).

Your post is all over the place and not really correct; this is the most egregious of the errors.  Adobe has dragged their feet on putting hardware encoding into Premiere ever since NVidia made the NVENC APIs public (circa 2012).  We Premiere users have been hammering on them ever since to add it, and they just stuck their heads in the sand and ignored us.  They slipped in iGPU hardware encoding a couple of years ago, but continued to ignore discrete GPUs.

 

Two plugins were written that could do hardware encoding in Premiere.  One of them died off because it wasn't as well written or maintained.  But the second one has been available for ... a couple of years now and has been rock solid and stable.  So your statement of "because it wasn't stable enough" is just... not correct.  It's perfectly stable and has been for years.

 

The delay has been Adobe's refusal to add it.  Nothing more.

 

 

Editing Rig: Mac Pro 7,1

System Specs: 3.2GHz 16-core Xeon | 96GB ECC DDR4 | AMD Radeon Pro W6800X Duo | Lots of SSD and NVMe storage |

Audio: Universal Audio Apollo Thunderbolt-3 Interface |

Displays: 3 x LG 32UL950-W displays |

 

Gaming Rig: PC

System Specs:  Asus ROG Crosshair X670E Extreme | AMD 7800X3D | 64GB G.Skill Trident Z5 NEO 6000MHz RAM | NVidia 4090 FE card (OC'd) | Corsair AX1500i power supply | CaseLabs Magnum THW10 case (RIP CaseLabs ) |

Audio:  Sound Blaster AE-9 card | Mackie DL32R Mixer | Sennheiser HDV820 amp | Sennheiser HD820 phones | Rode Broadcaster mic |

Display: Asus PG32UQX 4K/144Hz display | BenQ EW3280U display |

Cooling:  2 x EK 140 Revo D5 Pump/Res | EK Quantum Magnitude CPU block | EK 4090FE waterblock | AlphaCool 480mm x 60mm rad | AlphaCool 560mm x 60mm rad | 13 x Noctua 120mm fans | 8 x Noctua 140mm fans | 2 x Aquaero 6XT fan controllers |


10 hours ago, Soppro said:

How is it different to current GPU acceleration in those applications? Weren't they already GPU-accelerated? O.o

Previous versions only accelerated playback. When you encoded things, it still went through AME (Adobe Media Encoder).

 

What this update does is put GPU encoding into AME.

https://helpx.adobe.com/media-encoder/using/whats-new/2020.html#hardware

 

Quote

Hardware-accelerated H.264 and HEVC encoding

May 2020 release (version 14.2)

With new support for NVIDIA and AMD GPUs on Windows, hardware encoding for H.264 and H.265 (HEVC) is now available across all platforms. This means consistently faster exports for these widely used formats. For more information, see Hardware acceleration system requirements.

The linked article also only talks about final encodes, which is what AME does.


43 minutes ago, Kisai said:

Previous versions only accelerated playback. When you encoded things, it still went through AME (Adobe Media Encoder).

Sort of.  There's apparently a lot of misunderstanding about how Premiere does what it does.  Backwards first: Premiere can do the final encode internally or punt it over to AME.  That's entirely up to the end user.  They use identical engines, but they're separate apps.  Again, if you don't want to punt the final encode off to AME, you don't have to and Premiere will just do the deed for you.

 

The GPU acceleration for playback is technically for rendering.  Folks often mistake "rendering" for "encoding" and they're not the same thing; hell, even some NLEs (e.g. Final Cut Pro, Resolve) call the final product a "render".  It's more accurately an encode after the render.  The render is what's required, frame by frame, to composite everything you want the video to have: appearance, color, effects, framing, etc.  And what Premiere does with the GPU is accelerate some of that rendering; specifically certain effects.  That's it.  But that acceleration aids in real-time playback (e.g. your "accelerated for playback"), because before Premiere can play a frame, it has to render it.
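A toy way to picture that render-versus-encode split: the loop below "renders" each frame (a moving gradient standing in for effects, color and titles) and only then hands the finished pixels to an encoder. It assumes ffmpeg and numpy are installed; the frame contents are obviously made up.

```python
# Toy illustration of render vs. encode: "rendering" composites each frame's
# pixels; "encoding" only compresses the finished frames. Assumes ffmpeg is
# on PATH and numpy is installed.
import subprocess
import numpy as np

W, H, FPS, FRAMES = 1280, 720, 30, 90

encoder = subprocess.Popen(
    ["ffmpeg", "-y",
     "-f", "rawvideo", "-pix_fmt", "rgb24", "-s", f"{W}x{H}", "-r", str(FPS),
     "-i", "-",                            # raw rendered frames arrive on stdin
     "-c:v", "libx264", "-pix_fmt", "yuv420p",
     "render_then_encode.mp4"],
    stdin=subprocess.PIPE,
)

for n in range(FRAMES):
    # "Render": composite this frame (a moving gradient stands in for
    # effects, color work, titles, etc.).
    frame = np.zeros((H, W, 3), dtype=np.uint8)
    frame[:, :, 0] = (np.arange(W) * 255 // W + n * 3) % 256
    frame[:, :, 2] = 128
    # "Encode": hand the finished pixels to the encoder.
    encoder.stdin.write(frame.tobytes())

encoder.stdin.close()
encoder.wait()
```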


Editing Rig: Mac Pro 7,1

System Specs: 3.2GHz 16-core Xeon | 96GB ECC DDR4 | AMD Radeon Pro W6800X Duo | Lots of SSD and NVMe storage |

Audio: Universal Audio Apollo Thunderbolt-3 Interface |

Displays: 3 x LG 32UL950-W displays |

 

Gaming Rig: PC

System Specs:  Asus ROG Crosshair X670E Extreme | AMD 7800X3D | 64GB G.Skill Trident Z5 NEO 6000MHz RAM | NVidia 4090 FE card (OC'd) | Corsair AX1500i power supply | CaseLabs Magnum THW10 case (RIP CaseLabs ) |

Audio:  Sound Blaster AE-9 card | Mackie DL32R Mixer | Sennheiser HDV820 amp | Sennheiser HD820 phones | Rode Broadcaster mic |

Display: Asus PG32UQX 4K/144Hz display | BenQ EW3280U display |

Cooling:  2 x EK 140 Revo D5 Pump/Res | EK Quantum Magnitude CPU block | EK 4090FE waterblock | AlphaCool 480mm x 60mm rad | AlphaCool 560mm x 60mm rad | 13 x Noctua 120mm fans | 8 x Noctua 140mm fans | 2 x Aquaero 6XT fan controllers |


34 minutes ago, jasonvp said:

And what Premiere does with the GPU is accelerate some of that rendering; specifically certain effects.  That's it.

And now it can do the encoding on the GPU as well.


Oh wow, Adobe actually leveraging existing hardware. Also, I wonder how this is going to work for Radeon cards or even APUs, hmmm.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lancool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver) | Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


Was this more to do with quality control of the final encoded deliverable? It's my limited understanding that while a hardware codec is superior in terms of both performance and resource utilization, everything is baked in... including subtle flaws. With software, there's 100% control of the process; a software codec can be patched.


4 minutes ago, StDragon said:

Was this more to do with quality control of the final encoded deliverable? It's my limited understanding that while a hardware codec is superior in terms of both performance and resource utilization, everything is baked in... including subtle flaws. With software, there's 100% control of the process; a software codec can be patched.

Historically it's just been a bitrate thing.  If you didn't understand that you need to crank the bitrate on a hardware-encoded output before starting, and assumed you could use the same bitrate you were using for software-encoded H.264, you'd end up with an identically sized file but with abysmally worse quality.  NVidia, for instance, has always suggested 1440p/60 be done at 50(!) Mbit/sec.  That's a whole lot higher than the software-encoded output of the same resolution/frame rate, and it results in a much larger file; "much larger" here is relative to other H.264 files.  Still small overall, just larger than the software output.

 

Teaching folks this was one of the challenges, and I suspect that may be why Adobe punted on it for so long.  Maybe.  I don't know that for certain.
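As a back-of-the-envelope illustration of that bitrate point, the sketch below exports the same (hypothetical) 1440p/60 source twice: once with libx264 at a typical software bitrate and once with NVENC cranked to the 50 Mbit/sec figure mentioned above, then prints the file sizes. The bitrates, file names and NVENC-capable ffmpeg build are all assumptions.

```python
# Sketch of the bitrate trade-off: same source, software encode at a typical
# bitrate vs. hardware encode at a much higher one, then compare file sizes.
# The bitrates, file names and NVENC-capable ffmpeg build are assumptions.
import os
import subprocess

jobs = [
    ("libx264",    "16M", "export_sw.mp4"),  # typical software-encode target
    ("h264_nvenc", "50M", "export_hw.mp4"),  # cranked rate for the hardware block
]

for codec, bitrate, outfile in jobs:
    subprocess.run(
        ["ffmpeg", "-y", "-i", "timeline_1440p60.mov",
         "-c:v", codec, "-b:v", bitrate, "-c:a", "copy", outfile],
        check=True,
    )
    size_mb = os.path.getsize(outfile) / 1_000_000
    print(f"{codec} @ {bitrate}: {size_mb:.0f} MB")
```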

Editing Rig: Mac Pro 7,1

System Specs: 3.2GHz 16-core Xeon | 96GB ECC DDR4 | AMD Radeon Pro W6800X Duo | Lots of SSD and NVMe storage |

Audio: Universal Audio Apollo Thunderbolt-3 Interface |

Displays: 3 x LG 32UL950-W displays |

 

Gaming Rig: PC

System Specs:  Asus ROG Crosshair X670E Extreme | AMD 7800X3D | 64GB G.Skill Trident Z5 NEO 6000MHz RAM | NVidia 4090 FE card (OC'd) | Corsair AX1500i power supply | CaseLabs Magnum THW10 case (RIP CaseLabs ) |

Audio:  Sound Blaster AE-9 card | Mackie DL32R Mixer | Sennheiser HDV820 amp | Sennheiser HD820 phones | Rode Broadcaster mic |

Display: Asus PG32UQX 4K/144Hz display | BenQ EW3280U display |

Cooling:  2 x EK 140 Revo D5 Pump/Res | EK Quantum Magnitude CPU block | EK 4090FE waterblock | AlphaCool 480mm x 60mm rad | AlphaCool 560mm x 60mm rad | 13 x Noctua 120mm fans | 8 x Noctua 140mm fans | 2 x Aquaero 6XT fan controllers |


ABOUT DAMN TIME

~New~  BoomBerryPi project !  ~New~


New build log: http://linustechtips.com/main/topic/533392-build-log-the-scrap-simulator-x/?p=7078757 (5-screen flight sim for $620 CAD)

LTT Web Challenge is back! Go here: http://linustechtips.com/main/topic/448184-ltt-web-challenge-3-v21/#entry601004


1 hour ago, Doobeedoo said:

Oh wow, Adobe actually leveraging existing hardware. Also, I wonder how this is going to work for Radeon cards or even APUs, hmmm.

The rumoured Ryzen 4700G is going to be every video editor's favorite CPU, huh.

Main Rig :

Ryzen 7 2700X | Powercolor Red Devil RX 580 8 GB | Gigabyte AB350M Gaming 3 | 16 GB TeamGroup Elite 2400MHz | Samsung 750 EVO 240 GB | HGST 7200 RPM 1 TB | Seasonic M12II EVO | CoolerMaster Q300L | Dell U2518D | Dell P2217H | 

 

Laptop :

Thinkpad X230 | i5 3320M | 8 GB DDR3 | V-Gen 128 GB SSD |


11 hours ago, StDragon said:

Was this more to do with quality control of the final encoded deliverable? It's my limited understanding that while a hardware codec is superior in terms of both performance and resource utilization, everything is baked in... including subtle flaws. With software, there's 100% control of the process; a software codec can be patched.

Historically, you used the GPU, if available, for the intermediate encoding, i.e. when generating previews and scrubbing through a video, so you didn't have to wait for the CPU to do that render. A low-quality instant preview was better than a high-quality preview that took a minute between scrubs.

 

When you use NLE software, it not only has to generate a preview of the final composition, it has to generate a preview of everything in the timeline on its own layer. So Premiere Pro itself may have some specific GPU features, but they're not shared with Adobe Animate, Adobe Photoshop or Adobe After Effects. I'm actually fairly sure After Effects and Animate have to do everything in software, since they're not "video" editing programs but primarily animation tools that work with objects, though it's possible they still use the GPU for playback scrubbing.
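That "low-quality instant preview" trade-off is basically what proxy workflows do. Here's a rough sketch that batch-builds small, ultrafast-encoded proxies of every clip in a folder so scrubbing never waits on a full-quality render; the folder names and settings are assumptions, not Premiere's actual proxy pipeline.

```python
# Rough sketch of the "low-quality instant preview" idea: build 540p,
# ultrafast-encoded proxies of every clip so scrubbing stays responsive.
# Folder names and settings are assumptions, not Premiere's proxy workflow.
import pathlib
import subprocess

SRC = pathlib.Path("footage")
OUT = pathlib.Path("proxies")
OUT.mkdir(exist_ok=True)

for clip in SRC.glob("*.mp4"):
    proxy = OUT / f"{clip.stem}_proxy.mp4"
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(clip),
         "-vf", "scale=-2:540",                      # shrink to 540p
         "-c:v", "libx264", "-preset", "ultrafast",  # favor speed over quality
         "-crf", "30",
         "-c:a", "aac", "-b:a", "96k",
         str(proxy)],
        check=True,
    )
    print("proxy written:", proxy)
```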

 

Yes, NVENC, Quick Sync and AMD VCE are all fixed-function hardware encoders designed for streaming, not final renders (that's why they suck at VBR). That doesn't mean you can't tweak the input settings to get a decent encode for Netflix or YouTube. One reason Adobe might have been hesitant to allow this is that people who work in video might not realize that a GPU encode is potentially inferior.

 

https://developer.nvidia.com/video-encode-decode-gpu-support-matrix

 

Take note that the best available encoding features are only on the Turing parts (other than the GeForce GTX 1650), with no support whatsoever on the 1030. So if you have a Pascal part, that's FINE, but it doesn't support the necessary features (H.265 B-frames) for a proper final master. So if you set things up in AME/PPro for a master, it may just switch back to software mode if the features aren't available.
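That fall-back-to-software behaviour can be imitated outside of AME as well. The sketch below probes whether the local ffmpeg build and GPU can actually run hevc_nvenc by attempting a one-second throwaway encode, and drops back to libx265 if the probe fails. The probe method, file names and fallback choice are assumptions, not how AME/Premiere decides internally.

```python
# Sketch of "switch back to software mode if the features aren't available":
# probe hevc_nvenc with a throwaway one-second encode and fall back to
# libx265 if the GPU or build can't do it. The probe, file names and
# fallback are assumptions, not AME's internal logic.
import subprocess

def pick_hevc_encoder() -> str:
    """Return "hevc_nvenc" if a test encode succeeds, else "libx265"."""
    probe = subprocess.run(
        ["ffmpeg", "-v", "error",
         "-f", "lavfi", "-i", "testsrc=duration=1:size=1280x720:rate=30",
         "-pix_fmt", "yuv420p", "-c:v", "hevc_nvenc", "-f", "null", "-"],
        capture_output=True,
    )
    return "hevc_nvenc" if probe.returncode == 0 else "libx265"

encoder = pick_hevc_encoder()
print("using encoder:", encoder)
subprocess.run(
    ["ffmpeg", "-y", "-i", "master_source.mov",
     "-c:v", encoder, "-b:v", "40M", "-c:a", "copy", "master_hevc.mp4"],
    check=True,
)
```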

 

It's been my experience that the GPU encoder is fine if you use YouTube and you're slightly aggressive with the settings, be it for streaming or uploading. YouTube is just going to re-encode it anyway, so upload something at 2x the quality (e.g. bitrate, resolution, framerate) that you need YouTube to produce and you're usually good, particularly as more people get 4K screens.
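In practice that "2x the quality" rule of thumb could look something like this: pick a baseline bitrate for the resolution/frame rate, double it, and let NVENC do the upload encode. The baseline numbers below are placeholders, not YouTube's published recommendations.

```python
# Sketch of the "upload at roughly 2x" rule of thumb: double a baseline
# bitrate before handing the file to YouTube's re-encode. The baseline
# figures are placeholders, not YouTube's published recommendations, and an
# NVENC-capable ffmpeg build is assumed.
import subprocess

BASELINE_MBPS = {"1080p60": 12, "1440p60": 24, "2160p60": 45}  # assumed baselines

def youtube_upload_encode(src: str, dst: str, tier: str) -> None:
    target = BASELINE_MBPS[tier] * 2  # the 2x headroom from the post
    subprocess.run(
        ["ffmpeg", "-y", "-i", src,
         "-c:v", "h264_nvenc", "-b:v", f"{target}M",
         "-maxrate", f"{target}M", "-bufsize", f"{target * 2}M",
         "-c:a", "aac", "-b:a", "320k", dst],
        check=True,
    )

youtube_upload_encode("final_cut_1440p60.mov", "upload_1440p60.mp4", "1440p60")
```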

