This $7000 Card Does WHAT?? – Holy $H!T

weird audio click (like someone bumping the mic) in the privacy.com ad spot at the end

Seems like the equivalent of a glorified and insanely overpriced PhysX card. Knowing them, that card is probably just a GPU with special drivers, or at most an FPGA optimized for their video format. And in their usual price-gouging fashion, where the company is willing to charge something like $400 for a wire that costs less than $1 to make, they decided to price gouge in this area as well.

 

Their greed will eventually be their downfall when another company steps into this field and offers similar hardware for a fraction of the price.

 

RED should have at least sold those cards for $100-200, especially if they are only useful for RED's formats; that would encourage people to adopt their technologies.

 

Imagine if they sold those cards in PCI Express and Thunderbolt configurations and designed them to add RED acceleration to as many devices as possible. That would make people more likely to buy their cameras, since it would mean an easier time working with the footage in the field, where you may only have a laptop.

 

I couldn't understand the conclusion. OK, the Rocket-X is cool, but what about the Quadro? If the Titan provides the same performance and 10-bit output, why are you guys using the Quadro? Hell, why not use the 1080 Ti, which has similar performance and 10-bit support to the Titan but costs even less? Can you guys explain it in more detail?

NO! It's art, it's colonialism and you'll never get it!

First thing I noticed (OK, second thing, but this is a family-friendly platform): that thing is huge. No, really, how long is it, and will you have issues installing it?

17 minutes ago, Progressor said:

I couldn't understand the conclusion. OK, the Rocket-X is cool, but what about the Quadro? If the Titan provides the same performance and 10-bit output, why are you guys using the Quadro? Hell, why not use the 1080 Ti, which has similar performance and 10-bit support to the Titan but costs even less? Can you guys explain it in more detail?

The Quadro cards likely cost less to make than the gaming cards, but because Nvidia does a little extra QA on the drivers, as well as more testing with professional applications, they use that as justification to price gouge. From what I have seen, it is often difficult to see any difference at all in reliability between the Quadro and GeForce cards (as long as they are using a type of processing that both cards have enabled and not crippled in the BIOS).

 

What Nvidia is banking on is for a professional to think: even if the Quadro card is only 1% more reliable, are you willing to risk a 1% higher chance of something going wrong in order to save money?

 

Beyond that, Adobe Premiere CC is really getting outdated in terms of its core functionality. Ever since Adobe moved to the Creative Cloud business model, where they get a continuous stream of money whether they release something new or not, they have largely stopped focusing on the core functions of the program. In the past, just about every new version added a small number of new features, but always made a big deal about performance improvements in the core functions of the timeline, exporting, effects, and additional multithreading. That was because in those days people would not purchase a new version unless it improved their workflow, so every update had to include universal improvements that impacted all users. For example, regardless of what you do in the program, you will always end up loading content onto the timeline and eventually exporting the content you have worked on, so they had to keep improving that process in some way.

 

 

With Creative Cloud, you see occasional updates with new features, but pretty much nothing that has to do with performance.

 

It is at the point now where Final Cut on a Core i5-based Hackintosh handles 4K video more smoothly than a desktop PC with an overclocked Core i7-7700K running Adobe Premiere Pro.

 

At the moment Adobe Premiere CC is more functional and has a more intuitive UI, but it is poorly optimized to take advantage of the true capabilities of modern CPUs and GPUs. Sure, it can use them, but it uses them poorly.

11 hours ago, Razor512 said:

Seems like the equivalent of a glorified and insanely overpriced PhysX card. Knowing them, that card is probably just a GPU with special drivers, or at most an FPGA optimized for their video format. And in their usual price-gouging fashion, where the company is willing to charge something like $400 for a wire that costs less than $1 to make, they decided to price gouge in this area as well.

 

Their greed will eventually be their downfall when another company steps into this field and offers similar hardware for a fraction of the price.

 

RED should have at least sold those cards for $100-200, especially if they are only useful for RED's formats; that would encourage people to adopt their technologies.

 

Imagine if they sold those cards in PCI Express and Thunderbolt configurations and designed them to add RED acceleration to as many devices as possible. That would make people more likely to buy their cameras, since it would mean an easier time working with the footage in the field, where you may only have a laptop.

 

Well, for one, RED doesn't cater to consumers, or even to most YouTubers. This is the type of gear used for making TV shows, commercials, and even full-length movies. Most people won't be working with RED footage to begin with, which drastically limits the number of users who can take advantage of a Red Rocket card. Not being able to benefit from economies of scale probably drives up the price substantially, and R&D isn't cheap either. Most likely, this is an ASIC.

 

For two, the companies that can afford a $50k camera plus accessories are likely to see such a return on investment that the cost is inconsequential.


My eyes see the past…

My camera lens sees the present…

5 minutes ago, Razor512 said:

The Quadro cards likely cost less to make than the gaming cards, but because Nvidia does a little extra QA on the drivers, as well as more testing with professional applications, they use that as justification to price gouge. From what I have seen, it is often difficult to see any difference at all in reliability between the Quadro and GeForce cards (as long as they are using a type of processing that both cards have enabled and not crippled in the BIOS).

 

What Nvidia is banking on is for a professional to think: even if the Quadro card is only 1% more reliable, are you willing to risk a 1% higher chance of something going wrong in order to save money?

 

Beyond that, Adobe Premiere CC is really getting outdated in terms of its core functionality. Ever since Adobe moved to the Creative Cloud business model, where they get a continuous stream of money whether they release something new or not, they have largely stopped focusing on the core functions of the program. In the past, just about every new version added a small number of new features, but always made a big deal about performance improvements in the core functions of the timeline, exporting, effects, and additional multithreading. That was because in those days people would not purchase a new version unless it improved their workflow, so every update had to include universal improvements that impacted all users. For example, regardless of what you do in the program, you will always end up loading content onto the timeline and eventually exporting the content you have worked on, so they had to keep improving that process in some way.

 

 

With Creative Cloud, you see occasional updates with new features, but pretty much nothing that has to do with performance.

 

It is at the point now where Final Cut on a Core i5-based Hackintosh handles 4K video more smoothly than a desktop PC with an overclocked Core i7-7700K running Adobe Premiere Pro.

 

At the moment Adobe Premiere CC is more functional and has a more intuitive UI, but it is poorly optimized to take advantage of the true capabilities of modern CPUs and GPUs. Sure, it can use them, but it uses them poorly.

Btw, it's not just driver optimization; Quadro cards do much faster double-precision floating-point calculations. But as far as I'm aware, double precision is only utilized by specific CAD and scientific software (it's what makes professional cards so much faster than gaming ones in those workloads), which is why I find Linus's choice so bizarre. So is it just reliability and compatibility with Premiere that made Linus spend so much more money on a Quadro? I really wish he had talked more about "Quadro vs Titan vs 1080 Ti" for their workflow and needs.


2 minutes ago, Progressor said:

Btw, it's not just driver optimization; Quadro cards do much faster double-precision floating-point calculations. But as far as I'm aware, double precision is only utilized by specific CAD and scientific software (it's what makes professional cards so much faster than gaming ones in those workloads), which is why I find Linus's choice so bizarre. So is it just reliability and compatibility with Premiere that made Linus spend so much more money on a Quadro? I really wish he had talked more about "Quadro vs Titan vs 1080 Ti" for their workflow and needs.

I thought he said that he was going to use Titan XPs because they performed better?

No music in this video? 

As for the double-precision workloads, those are pretty much disabled in software, since there is no practical way to laser-cut such intricate modifications into the hardware itself.

 

Double precision doesn't seem to be used in any video editor, though one thing I never tested is whether it has an impact on complex 3D objects in a video editor.

 

It seems like it would be better to just stick with 1080 Ti cards and leave them at their factory clock speeds, or keep any overclock an additional 50 MHz below the clock speed that survives a 24-hour-stable Prime95 test.

Holy shit. While he was doing the infotainment bit, the screen was acting all fuzzy...

Scared the shit outta me. I thought it was my GPU acting up at first.

Currently focusing on my video game collection.

It doesn't matter what you play games on, just play good games you enjoy.

 

His videos are sometimes confusing.

All I heard was: RED Rocket-X blah blah blah, Quadro blah blah blah, Titan Xp blah blah blah.

Conclusion: because enabled = disabled, Adobe Premiere sucks, so it is not worth it.

 

I have no idea who wrote the script for LTT, and those comparison charts are equally confusing to me. 

If it is not broken, let's fix till it is. 

OK, let me ask a ridiculously stupid yet important question: why not try an Intel Xeon Phi? With its 60 cores it could be an ideal co-processor, it occupies a double-width PCIe slot, and it probably costs half as much as this Red Rocket stuff. Any thoughts on that? I'd like @LinusTech 's opinion about it. I know it's a bit dated, but it's worth a try. You can find relatively good models on Amazon.

16 minutes ago, thewipyk said:

OK, let me ask a ridiculously stupid yet important question: why not try an Intel Xeon Phi? With its 60 cores it could be an ideal co-processor, it occupies a double-width PCIe slot, and it probably costs half as much as this Red Rocket stuff. Any thoughts on that? I'd like @LinusTech 's opinion about it. I know it's a bit dated, but it's worth a try. You can find relatively good models on Amazon.

If it could support standard x86 code without much trouble, e.g., if an application could see it as a normal CPU core, then it would be awesome for computing tasks; applications would just need to become aware of the bandwidth limitations of communicating with it and push tasks that need a lot of computation but little bandwidth over to those cores.

 

From this video, it seems like a system primarily using that product can run x86 code, but it is hard to find info on how it integrates into a standard PC.

What is needed is for it to be designed so that, when a user installs the product and the needed drivers, Task Manager all of a sudden reports 64 additional cores.

 

Then, to take things further, just like how Windows can be made aware of a big.LITTLE config (as with the Snapdragon 835 running Windows 10), if you could expose those extra cores as low-performance cores that are only used as a last resort, it could be quite a nice overall performance bump.
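To illustrate the idea of confining background work to designated "slow" cores, here is a minimal Python sketch. It is an assumption-laden toy: the set of "offload" core IDs is made up, and it relies on Linux-only `os.sched_setaffinity` (other OSes are left untouched), not on any real co-processor driver.

```python
import os

def pin_to_offload_cores(core_ids):
    """Confine the current process to a set of designated cores.

    core_ids is a hypothetical set of "slow" co-processor core numbers;
    on Linux this uses sched_setaffinity, elsewhere it is a no-op.
    Returns the affinity mask in effect, or None if unsupported.
    """
    if not hasattr(os, "sched_setaffinity"):
        return None  # e.g. Windows/macOS: no sched_setaffinity here
    available = os.sched_getaffinity(0)
    target = set(core_ids) & available  # only pin to cores that exist
    if target:
        os.sched_setaffinity(0, target)
    return os.sched_getaffinity(0)

# Example: confine a background worker to core 0 as a stand-in for a
# "last resort" core, leaving the fast cores to the foreground app.
mask = pin_to_offload_cores({0})
```

A real scheduler-level version of this would live in the OS, as with the big.LITTLE awareness mentioned above; per-process affinity is just the closest thing user code can do today.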

 

I still don't get what that card does. 

2 hours ago, Progressor said:

I couldn't understand the conclusion. Ok, the Rocket-X is cool but what about the Quadro? If the Titan provides the same performance and 10bit output why are you guys using the Quadro? Hell, why not use the 1080ti which has similar performance and 10bit support as the Titan but costs even less? Can you guys explain it in more details? 

Neither the 1080 Ti nor the Titans support 10-bit color in an OpenGL environment (Adobe). Quadro drivers are the only ones that currently work in that arena.

Great video, exactly what I had observed, although you should try running a dual-CPU configuration with two or three Titan Xps. We have seen around 34 fps for an 8K full debayer on a decked-out HP Z840 with two GTX 1080s.

19 minutes ago, LyondellBasell said:

Neither the 1080 Ti nor the Titans support 10-bit color in an OpenGL environment (Adobe). Quadro drivers are the only ones that currently work in that arena.

I see, this explains it then.

1 hour ago, thewipyk said:

OK, let me ask a ridiculously stupid yet important question: why not try an Intel Xeon Phi? With its 60 cores it could be an ideal co-processor, it occupies a double-width PCIe slot, and it probably costs half as much as this Red Rocket stuff. Any thoughts on that? I'd like @LinusTech 's opinion about it. I know it's a bit dated, but it's worth a try. You can find relatively good models on Amazon.

From what I've read about the Phi, it appears that it's not exactly a co-CPU in the sense that you stick the card in and there are an extra 60 cores available to Windows (even though the CPU architecture on the Phi is x86/x64). The Phi card is kind of a computer in itself; it has its own Linux-based operating system that you boot into through Windows, or something like that. For you to take advantage of its computing power, the software you're going to use has to specifically support it; in addition, the Phi doesn't support some of the instruction sets that your CPU does, which complicates things further. From what I've looked up, there seems to be no support for the Phi in any content creation/editing/CGI/3D/etc. software, and no one has managed to make it work with one.


I wonder how hard it would be for them to make a Phi module that just shows up as 64 additional cores?

 

If they could do that, it would be a pretty groundbreaking change to desktop computing. Those modules would clearly be bad for anything that relies on a large volume of data being processed (they would not work for a game, for example), but for things where the PCI Express bus will not be much of a bottleneck, those extra low-powered cores could really improve performance. For example, imagine having a card like that working in the background to do camera tracking on your imported footage, so that when you are ready to do stabilization, you already have the results of 64 additional cores generating the tracking data.

 

Or, when encoding, you could have your GPU handle the GPU-accelerated adjustments, your main CPU cores handle the more intensive CPU-bound effects, and the 64 cores from the Phi module handle the encoding.

 

If the cores can be used natively, then many developers will find ways to take advantage of them, and Windows would probably get a scheduler update so that those threads are only given to apps that don't need full-speed cores, or used as overflow when the main CPU cores are under heavy load.

Hi Linus,

 

Red Rocket cards are typically used in production for debayering and transcoding rather than playback for editing. As you mentioned in the video, the Rocket-X is pretty outdated at this point, since it was made for the Epic Dragon 6K files. I recommend switching your team to a traditional offline/online workflow. For example, you could debayer your 8K R3Ds to 8K ProRes or DNxHD (ProRes is CPU-bound as well). Once you have locked your edit, you can conform your timeline back to the R3Ds. This would allow you to have one hero workstation that supports R3D playback without the expense of a Rocket-X in every workstation. In our facility, we cut all projects at 1080p ProRes LT and conform back to the native camera files for the final color grade and finish at 8K, 4K, etc.

 

~ Alex

3 minutes ago, NvidiaIntelAMDLoveTriangle said:

Nothing.

I wonder, could they get a current clamp and stick it on the 12 V connector just to see how much current the card pulls during different tasks in Premiere Pro? While that will not capture the full power draw, it will at least let them see whether the card is pulling a lot of power, and thus actually doing something. If it barely pulls anything even under heavy load, then the card may be a glorified DRM key for a more efficient code path for the codec.
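For reference, converting a clamp reading into watts is just P = V x I. A trivial sketch (the 8 A reading is a made-up example, not a measured value):

```python
def rail_power_watts(rail_volts, clamp_amps):
    """Power delivered through one power connector: P = V * I."""
    return rail_volts * clamp_amps

# A hypothetical clamp reading of 8 A on the 12 V aux connector would
# mean about 96 W through that cable. PCIe slot power (up to 75 W) is
# not included, which is why this method undercounts total draw.
aux_watts = rail_power_watts(12.0, 8.0)
```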
