
This $7000 Card Does WHAT?? – Holy $H!T

Can you explain why it is such a big deal to have to lower the preview quality to get adequate timeline performance??

 

I don't understand what the advantage would be. 


1 hour ago, Razor512 said:

I wonder how hard it would be for them to make a phi module where it can just show up as 64 additional cores?

 

Xeon Phi is designed somewhat like a network device: the cores aren't visible to the host, and a special Linux OS running on the card handles the communication and computation. For Phi to be used most efficiently, the code must be written and optimized for it.
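To make that concrete, here's a rough sketch of the offload model, assuming the Intel C++ compiler's Language Extensions for Offload (LEO) pragmas on a Knights Corner card; the buffers and the doubling "kernel" are made up purely for illustration, not from any real workload.

```cpp
// Sketch of Xeon Phi offload, assuming the Intel C++ compiler's LEO pragmas
// (Knights Corner generation). Buffers and the trivial loop are illustrative.
#include <cstdio>
#include <cstdlib>

int main() {
    const int n = 1 << 20;
    float* a = (float*)malloc(n * sizeof(float));
    float* c = (float*)malloc(n * sizeof(float));
    for (int i = 0; i < n; ++i) a[i] = (float)i;

    // Copy 'a' over PCIe to MIC device 0, run the loop under the card's own
    // Linux runtime using OpenMP across its cores, then copy 'c' back.
    #pragma offload target(mic:0) in(a : length(n)) out(c : length(n))
    {
        #pragma omp parallel for
        for (int i = 0; i < n; ++i)
            c[i] = a[i] * 2.0f;
    }

    printf("c[42] = %f\n", c[42]);
    free(a);
    free(c);
    return 0;
}
```

The point is that nothing inside that loop is scheduled by the host OS; the card's cores never show up in Task Manager, which is why "just show up as 64 additional cores" isn't how Phi works.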


1 minute ago, Fiat-Libertas said:

Can you explain why it is such a big deal to have to lower the preview quality to get adequate timeline performance??

 

I don't understand what the advantage would be. 

Sometimes a lower preview quality will cause issues when doing rotoscoping or other detail work, as some edits can look fine at a low resolution but look bad at a high resolution, when you see that the masks no longer properly cover the subject. If you watched the True Blood TV series, especially the episode where there was some kind of energy explosion outside of that giant tent, you would notice the mask going crazy, indicating that they likely rushed through and used really low-res proxy footage to do the masking.


maybe quadro sli? :P

Ryzen 5 3600 stock | 2x16GB C13 3200MHz (AFR) | GTX 760 (Sold the VII)| ASUS Prime X570-P | 6TB WD Gold (128MB Cache, 2017)

Samsung 850 EVO 240 GB 

138 is a good number.

 


So let me get this right - Red wants its customers to spend $7000 on a card to solve a (hardware) problem caused by a (software) problem that shouldn't even exist in the first place?

 

Of course, finger-pointing is easy: blame Adobe for lackluster hardware support, or GPU driver vendors for not shipping timely updates. Yet the solution seems quite simple and has been around for years - multi-GPU support. We've already got CUDA, so why not tweak the software being used (Adobe) to spread that workload across four Quadro cards? Hell, for $35K you could probably get programmers lining up to rewrite the software itself.
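For what it's worth, the CUDA side of "spread the work across several cards" is not exotic. Here's a minimal sketch of farming independent chunks out to every GPU in the system; the kernel and chunk size are hypothetical stand-ins, not anything from Adobe's actual pipeline.

```cpp
// CUDA C++ sketch (compiled with nvcc): split independent chunks across GPUs.
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

__global__ void processChunk(float* data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] = data[i] * 0.5f + 1.0f;   // stand-in per-element work
}

int main() {
    int deviceCount = 0;
    cudaGetDeviceCount(&deviceCount);
    if (deviceCount == 0) { printf("No CUDA devices found\n"); return 1; }

    const int chunkN = 1 << 22;                   // elements per GPU (arbitrary)
    std::vector<float*> buffers(deviceCount, nullptr);

    // Launch on every device; kernel launches return immediately, so all GPUs
    // start working without waiting on each other.
    for (int dev = 0; dev < deviceCount; ++dev) {
        cudaSetDevice(dev);                       // route the next calls to this GPU
        cudaMalloc(&buffers[dev], chunkN * sizeof(float));
        cudaMemset(buffers[dev], 0, chunkN * sizeof(float));
        processChunk<<<(chunkN + 255) / 256, 256>>>(buffers[dev], chunkN);
    }

    // Wait for every GPU to finish, then clean up. Real code would copy the
    // processed chunks back to host memory before freeing.
    for (int dev = 0; dev < deviceCount; ++dev) {
        cudaSetDevice(dev);
        cudaDeviceSynchronize();
        cudaFree(buffers[dev]);
    }
    printf("Processed %d elements on each of %d GPU(s)\n", chunkN, deviceCount);
    return 0;
}
```

The hard part for an NLE isn't this API; it's carving the effects pipeline into chunks that really are independent across frames.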

 

I don't see this as a hardware problem at all - four Pascal cards can push PLENTY of performance. It's greedy software companies overcharging for poorly-optimized proprietary code. And it's made worse by the fact that Red is capitalizing on this, making users spend thousands more for only a marginal boost in productivity.


My only question on this card is, does it get hot enough to cook on?

"The only thing that matters right now is that you're here, and you're safe."


@LinusTech 

Can GeForce cards now output 10-bit color in OpenGL (Photoshop, Premiere, etc.)? As was said in the video, this feature was unique to the Quadro line.


how did nobody give you grief for checking drivers as a sound device?

 

Anyway, this is why I applied when you were hiring: I knew you'd need someone who actually knew about FPGAs at some point. This is a perfect scenario to use a few of them to provide the benefits you're looking for in timeline scrubbing and rendering performance. I can't speak to the performance of RED footage specifically, but I can say I've seen them implemented in Adobe workflows in the past.

 

Maybe have an intern look into it.

Spoiler

CPU: TR3960x enermax 360 AIO Mobo: Aorus Master RAM: 128gb ddr4 trident z royal PSU: Seasonic Prime 1300w GPU: 5700xt, 5500xt, rx590 Case: c700p black edition Display: Asus MG279Q ETC: Living the VM life many accessories as needed Storage: My personal cluster is now over 100tb!


@LinusTech

 

I was wondering if you considered an SSG card from AMD. They demoed 8K scrubbing with no issues. I mean, it was basically designed for what you wanted. I know it doesn't use CUDA, but a comparison video would be nice anyway.

/hoping someone will actually see this post.


Why not use Final Cut Pro? Most of your RED-using industry affiliates use it.

Please quote me so that I know that you have replied unless it is my own topic.


On 7/23/2017 at 2:47 PM, Razor512 said:

The Quadro cards are just cards that likely cost less to make than the gaming cards, but because they do a little extra QA on the drivers, as well as more testing with professional applications, they use that as justification to price-gouge. From what I have seen, it is often difficult to see any difference at all in reliability between the Quadro and GeForce cards (as long as they are using a type of processing that both cards have enabled and not crippled in the BIOS).

I agree, reliability-wise it's hard to tell. Software-optimization-wise, it can be noticeable. Some professional programs have features that are disabled unless you have a workstation GPU (not just 10-bit color either). Also, tessellation optimization matters when dealing with a high number of polygons in 3D modeling. My FirePro V7900, while getting lower fps overall than my 980 Ti or 1070 in 3ds Max, remains more stable and usable with high-poly scenes (the 1070 / 980 Ti would just lag where the FirePro would not).

 

I'd agree that most users would be fine with GeForce / Radeon cards though. It really would be cool if the Xeon Phi could be used for acceleration of encoding too.


9 hours ago, VanayadGaming said:

@LinusTech

 

I was wondering if you considered an SSG card from AMD. They demoed 8K scrubbing with no issues. I mean, it was basically designed for what you wanted. I know it doesn't use CUDA, but a comparison video would be nice anyway.

/hoping someone will actually see this post.

Except it's not. Besides the SSD glued to it, the SSG is literally just a regular FirePro.


On 7/23/2017 at 6:49 PM, NvidiaIntelAMDLoveTriangle said:

Nothing.

It sucks up your entire life savings though. :P

 

Intel Xeon E5 1650 v3 @ 3.5GHz 6C:12T / CM212 Evo / Asus X99 Deluxe / 16GB (4x4GB) DDR4 3000 Trident-Z / Samsung 850 Pro 256GB / Intel 335 240GB / WD Red 2 & 3TB / Antec 850w / RTX 2070 / Win10 Pro x64

HP Envy X360 15: Intel Core i5 8250U @ 1.6GHz 4C:8T / 8GB DDR4 / Intel UHD620 + Nvidia GeForce MX150 4GB / Intel 120GB SSD / Win10 Pro x64

 

HP Envy x360 BP series Intel 8th gen

AMD ThreadRipper 2!

5820K & 6800K 3-way SLI mobo support list

 


Other than the Ryzen 5 one, this is the only title I consider to be click-baity.  I'm not throwing a fit over it, but I'm surprised that there aren't a lot of complaints about it.   People got angry over the SLI one even though there were tons of indicators that it was about USB drives.

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List  |  The Real Reason Delidding Improves Temperatures  |  "2K" does not mean 2560×1440


7 hours ago, BachChain said:

Except it's not. Besides the SSD glued to it, the SSG is literally just a regular FirePro.

Well, not really. They built a custom controller for that card so that it can do exactly what they did in the video: scrub the timeline of an 8K video with no stuttering. If it were just what you said, then a Quadro with an SSD could do the same, which it can't.


On 7/24/2017 at 7:02 AM, Azlaem said:

@LinusTech 

Can GeForce cards now output 10-bit color in OpenGL (Photoshop, Premiere, etc.)? As was said in the video, this feature was unique to the Quadro line.

Pascal cards can output 10-bit color through DirectX and fullscreen-exclusive OpenGL. They cannot output 10-bit color over windowed OpenGL (which Photoshop and Premiere use).
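For context, the request side is the same on both product lines; here's a minimal sketch using GLFW that asks for a 10-bit-per-channel framebuffer in a normal window and then checks what the driver actually granted. The silent downgrade to 8 bits per channel in windowed OpenGL was the Quadro/GeForce split at the time.

```cpp
// Sketch (assuming GLFW + a desktop OpenGL driver): request a 10 bpc
// ("30-bit") framebuffer for a windowed context and verify what came back.
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) return 1;

    // Ask for 10 bits per color channel (plus 2 alpha bits = 32-bit pixels).
    glfwWindowHint(GLFW_RED_BITS,   10);
    glfwWindowHint(GLFW_GREEN_BITS, 10);
    glfwWindowHint(GLFW_BLUE_BITS,  10);
    glfwWindowHint(GLFW_ALPHA_BITS, 2);

    GLFWwindow* window = glfwCreateWindow(640, 480, "10-bit test", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(window);

    // Check what the driver actually handed back; 8 here means the request
    // was silently downgraded to a standard 8 bpc pixel format.
    GLint redBits = 0;
    glGetIntegerv(GL_RED_BITS, &redBits);
    printf("Red bits granted: %d\n", redBits);

    glfwDestroyWindow(window);
    glfwTerminate();
    return 0;
}
```

Even when the driver grants 10 bpc, getting it end-to-end also depends on the display link and the OS compositor, so querying the context like this is the honest check rather than trusting the hint.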


This card can do what :D

 

CPU: Intel i7 3970X @ 4.7 GHz  (custom loop)   RAM: Kingston 1866 MHz 32GB DDR3   GPU(s): 2x Gigabyte R9 290OC (custom loop)   Motherboard: Asus P9X79   

Case: Fractal Design R3    Cooling loop:  360 mm + 480 mm + 1080 mm,  triple D5 Vario pump   Storage: 500 GB + 240 GB + 120 GB SSD,  Seagate 4 TB HDD

PSU: Corsair AX860i   Display(s): Asus PB278Q,  Asus VE247H   Input: QPad 5K,  Logitech G710+    Sound: uDAC3 + Philips Fidelio x2

HWBot: http://hwbot.org/user/tame/


  • 3 months later...

Which program was used to measure frames per second in the Premiere Pro 6K playback?
 

