
November 9th Ultimate Video Editing PC Buyer's Guide - where to buy

LinusTech

@LinusTech

 


 

yeah ... please don't.

There is absolutely no need for you to shoot or edit 4K right now. It is just stupid and slows down your editing process significantly.

Heck, even a lot of mega-high-budget, multi-million-dollar Hollywood movies aren't edited in 4K yet ...

And no, this is not overkill for editing 1080p footage at all.

 

You want to make higher quality videos? Get some better lenses, get a high-quality colorimeter, get a really good 1080p monitor ...

Mini-Desktop: NCASE M1 Build Log
Mini-Server: M350 Build Log


Noob question: How does it work having a Quadro and a GTX 780 in there? How would the computer know to use the Quadro if you were opening Maya and the 780 if you were opening Crysis?

Just seconding what BenvolioZF asked...

Which drivers did you use for the GPUs - Quadro / GeForce / both...? (And in what order did you install them?) Did you plug the monitor into the Quadro because it is in slot 1, or is it a driver-related thing...?

Apple, Piss Off! ~ Linus 2014

No, you're not hallucinating, or maybe you are... either way, I'm back. ~ Linus 2015


This build cost almost $6,000 ($5,936.49 to be precise, as of 10/11/2013).

 

Hey Linus, I'd like to see some Geekbench and Cinebench scores for this beast.

 

Amazing build guide, by the way. Very detailed and helpful :)


Just seconding what BenvolioZF asked...

Which drivers did you use for the GPUs - Quadro / GeForce / both...? (And in what order did you install them?) Did you plug the monitor into the Quadro because it is in slot 1, or is it a driver-related thing...?

 

I want to know as well; hope to see an answer soon.


Nice video :)

Rig CPU Intel i5 3570K at 4.2 GHz - MB MSI Z77A-GD55 - RAM Kingston 8GB 1600 mhz - GPU XFX 7870 Double D - Keyboard Logitech G710+

Case Corsair 600T - Storage Intel 330 120GB, WD Blue 1TB - CPU Cooler Noctua NH-D14 - Displays Dell U2312HM, Asus VS228, Acer AL1715

 


Didn't realize LCD monitors lost their calibration so quickly. I learned something new, if your statement is true.

 

'factory calibrated' is more like a marketing term.

To get accurate color reproduction you need to calibrate your monitor at your actual workplace with the lighting at your workplace and recalibrate every now and then.
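A rough way to put numbers on that drift (just an illustrative Python sketch with made-up readings, not measurements from any particular monitor): colour error is usually expressed as a CIELAB delta E, and anything much above roughly 2 starts to be visible to a trained eye.

import math

def delta_e_cie76(lab1, lab2):
    # CIE76 colour difference: Euclidean distance in L*a*b* space
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

target  = (50.0, 0.0, 0.0)   # the neutral grey the profile expects
fresh   = (50.2, 0.3, -0.2)  # hypothetical reading right after calibration
drifted = (51.5, 1.6, 2.5)   # hypothetical reading months later, backlight aged

print(round(delta_e_cie76(target, fresh), 2))    # ~0.41 -> effectively invisible
print(round(delta_e_cie76(target, drifted), 2))  # ~3.33 -> visible, time to recalibrate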

Mini-Desktop: NCASE M1 Build Log
Mini-Server: M350 Build Log


'factory calibrated' is more like a marketing term.

To get accurate color reproduction you need to calibrate your monitor at your actual workplace with the lighting at your workplace and recalibrate every now and then.

It's better than no calibration. I don't edit movies and pictures professionally, but I appreciate that my LG had some sort of calibration done to it (with the included paper for verification).


Noob question: How does it work having a Quadro and a GTX 780 in there? How would the computer know to use the Quadro if you were opening Maya and the 780 if you were opening Crysis?

 

My question as well. I know you can set up different cards to have one run the PhysX calculations and one do... well, everything else GPU-related, so I am wondering if it is along those lines. I think he did say which card the displays would plug into, but I can't recall when, which, or why.

Intel Core i5 2400 - Corsair Vengeance 16.0GB Dual-Channel DDR3 @ 668MHz (9-9-9-24 - four strips) - ASUS P8Z68-V LE - GeForce GTX 660 Ti (MSI Power Edition) - 60GB Corsair Force 3 Boot drive with 2.5TB in a random nightmare of a configuration


My question as well. I know you can set up different cards to have one run the PhysX calculations and one do... well, everything else GPU-related, so I am wondering if it is along those lines. I think he did say which card the displays would plug into, but I can't recall when, which, or why.

Basically it's a poor man's Nvidia Maximus setup. You have your Quadro for basic work (GUI and monitor, as stated almost at the end), then your second card, a Tesla or in this case a GeForce, does all the heavy lifting. It's how you get around editing software that doesn't support things like SLI or Crossfire.
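For anyone wondering how the work actually gets split: the editing apps pick their CUDA device in their own renderer settings, but a rough sketch of the same idea at the OS level looks like the Python below (the device names and index here are assumptions for illustration, not the exact cards from the video):

import os
import subprocess

# `nvidia-smi -L` lists the installed NVIDIA GPUs, one per line, with their indices,
# e.g. "GPU 0: Quadro ..." (driving the display) and "GPU 1: GeForce GTX 780 ...".
gpus = subprocess.check_output(["nvidia-smi", "-L"], text=True).splitlines()

# Pick the first non-Quadro card as the compute device.
compute_idx = next(i for i, line in enumerate(gpus) if "Quadro" not in line)

# A CUDA-aware app launched from this environment only sees that GPU, so the
# heavy lifting lands on the GeForce while the Quadro keeps handling the GUI.
os.environ["CUDA_VISIBLE_DEVICES"] = str(compute_idx)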


Basically it's a poor man's Nvidia Maximus setup. You have your Quadro for basic work (GUI and monitor, as stated almost at the end), then your second card, a Tesla or in this case a GeForce, does all the heavy lifting. It's how you get around editing software that doesn't support things like SLI or Crossfire.

So, would that mean you can SLI GeForce cards to run with the Quadro leading...? ("paired with one or more Tesla K20s")

Apple, Piss Off! ~ Linus 2014

No, you're not hallucinating, or maybe you are... either way, I'm back. ~ Linus 2015


Didn't realize LCD monitors lost their calibration so quickly. I learned something new, if your statement is true.

Well, it is true to an extent. I have worked in commercial places that did photo edits for top magazines like Vogue and National Geographic, and I also worked at Datacolor, a company that does color calibration for paint. All of these places calibrated their monitors constantly. The reason you do this is that calibration depends on the ambient light around you and on what the screen delivers over time. It is a constantly changing environment for a pro, not for a normal YouTuber who uploads videos. If you're a pro, constant calibration matters. If you wonder why he didn't do it, it's because this isn't a pro build; it's for above-average video editors. If you were a pro you wouldn't have any of this stuff and you wouldn't have watched this video; it's for normal people who want exceptional videos. I have been inside commercial places that made pro commercials and there is none of this equipment there. It's all proprietary hardware, not $5K worth but $105K worth.


Basically it's a poor man's Nvidia Maximus setup. You have your Quadro for basic work (GUI and monitor, as stated almost at the end), then your second card, a Tesla or in this case a GeForce, does all the heavy lifting. It's how you get around editing software that doesn't support things like SLI or Crossfire.

 

So the $800 card does the "basic" stuff and the $500 card does the heavy stuff? Why is an $800 card needed for this... is it only a Quadro that can be set up this way? I feel like I am missing a piece of the puzzle still.

Intel Core i5 2400 - Corsair Vengeance 16.0GB Dual-Channel DDR3 @ 668MHz (9-9-9-24 - four strips) - ASUS P8Z68-V LE - GeForce GTX 660 Ti (MSI Power Edition) - 60GB Corsair Force 3 Boot drive with 2.5TB in a random nightmare of a configuration


So the $800 card does the "basic" stuff and the $500 card does the heavy stuff? Why is an $800 card needed for this... is it only a Quadro that can be set up this way? I feel like I am missing a piece of the puzzle still.

Um, I think they want to use the Asus PA-something monitor => 10-bit => GTX cards aren't able to 'do' 10-bit.

So the Quadro, which is slower because it does much more work per pixel, handles the monitor output, and the GTX gets the work that needs raw speed.
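Quick back-of-the-envelope on why that 10-bit output is the point of the Quadro here (just the generic per-channel arithmetic, nothing specific to this monitor):

for bits in (8, 10):
    levels = 2 ** bits               # distinct values per colour channel
    step = 100.0 / (levels - 1)      # brightness jump between adjacent codes, in %
    print(f"{bits}-bit: {levels} levels, ~{step:.3f}% per step")

# 8-bit:  256 levels, ~0.392% per step -> smooth gradients can show visible banding
# 10-bit: 1024 levels, ~0.098% per step -> roughly 4x finer steps, much less banding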


So, would that mean you can SLI GeForce cards to run with the Quadro leading...? ("paired with one or more Tesla K20s")

 

So the $800 card does the "basic" stuff and the $500 card does the heavy stuff? Why is an $800 card needed for this... is it only a Quadro that can be set up this way? I feel like I am missing a piece of the puzzle still.

 

Think of it this way: the Quadro has official support for a lot of professional features, whereas the GeForce series can be shoehorned in if you want to tinker. The Tesla is designed to do nothing but crunch numbers; in a sense it's a second, dedicated computer within another computer. While you can mix and match the three series, you're really not supposed to because it can cause stability issues. I assume the GeForce was selected because it's sitting in place of a $2,000 to $5,000 card.


So, would that mean you can SLI GeForce cards to run with the Quadro leading...? ("paired with one or more Tesla K20s")

 

The Quadro has the monitor attached and the GeForce is on its own for its CUDA core power, not in SLI - similar to having a standalone PhysX card. But you could run an SLI setup next to the Quadro, I'd think.


Why Woosh, not sploosh? Slap an H320 or a custom loop in there and the workstation is even cooler.

Well, yes, I'm completely okay with that. This sort of thing isn't just limited to Nvidia; it's been happening for years with all sorts of components.

Their Quadro workstation cards are actually weaker than their GeForce cards, but most of their value comes from the drivers and software side of things rather than the hardware.

Intel's CPUs - right from the Pentiums up to the i7 4770K - are all actually the same die. Like, they're actually the same CPU, except with cores disabled and different clock speeds.

So the whole idea of Nvidia limiting double-precision floating-point operations to the Titan isn't something new, and what it does is allow Nvidia to lower the price of the 780 and 780 Ti whilst still charging a premium for a premium feature on the Titan, so that it can make money and invest in R&D.

Pentium to 4770K is not necessarily the same die. Only sometimes they are, when impurities disable a portion of the silicon. There are different die sizes for some. Even i7s and i5s, which are both quad cores, do not necessarily have the same die size. Even for the Xeon E5 / LGA 2011 i7 parts, there are three types of die:

The first is 6-core with HT, 15 MB cache.

The second is 10-core, with (I forget - 20-25 MB maybe?) cache.

The third is 12-core, 30 MB cache.

The first one powers the low-end Xeons (4-6 cores) and all the i7s.

The second powers the 6-, 8- and 10-core parts. The 6-core ones either need more cache, or the silicon decided to be naughty.

The third is exclusive to the 12-core parts unless, again, the silicon has problems.

Therefore the die is not really the same. Also, for example, how can some Sandy Bridge i5s use HD 2000 while the K parts use HD 3000? Strange, right?

Basically it's a poor man's Nvidia Maximus setup. You have your Quadro for basic work (GUI and monitor, as stated almost at the end), then your second card, a Tesla or in this case a GeForce, does all the heavy lifting. It's how you get around editing software that doesn't support things like SLI or Crossfire.

At that rate we can call everyone's rig low-end...

Pentium to 4770K is not necessarily the same die. Only sometimes they are, when impurities disable a portion of the silicon. There are different die sizes for some. Even i7s and i5s, which are both quad cores, do not necessarily have the same die size. Even for the Xeon E5 / LGA 2011 i7 parts, there are three types of die:

The first is 6-core with HT, 15 MB cache.

The second is 10-core, with (I forget - 20-25 MB maybe?) cache.

The third is 12-core, 30 MB cache.

The first one powers the low-end Xeons (4-6 cores) and all the i7s.

The second powers the 6-, 8- and 10-core parts. The 6-core ones either need more cache, or the silicon decided to be naughty.

The third is exclusive to the 12-core parts unless, again, the silicon has problems.

Therefore the die is not really the same. Also, for example, how can some Sandy Bridge i5s use HD 2000 while the K parts use HD 3000? Strange, right?

 

I think you've misunderstood what I've said. 

 

What I meant to say was that all CPUs on LGA 1150 are the same die, from the Pentium up to the 4770K. Of course CPUs on LGA 2011, etc., are a different die. In fact, this only started recently with the Ivy Bridge-E series; before that, Sandy Bridge-E and -EP were actually all on the same die too. The 3930K is just an 8-core with two of its cores disabled.

 

HD 2000 is just a cut-down version of HD 3000.

My Personal Rig - AMD 3970X | ASUS sTRX4-Pro | RTX 2080 Super | 64GB Corsair Vengeance Pro RGB DDR4 | CoolerMaster H500P Mesh

My Wife's Rig - AMD 3900X | MSI B450I Gaming | 5500 XT 4GB | 32GB Corsair Vengeance LPX DDR4-3200 | Silverstone SG13 White


I think you've misunderstood what I've said. 

 

What I meant to say was that all CPUs on LGA 1150 are the same die, from the Pentium up to the 4770K. Of course CPUs on LGA 2011, etc., are a different die. In fact, this only started recently with the Ivy Bridge-E series; before that, Sandy Bridge-E and -EP were actually all on the same die too. The 3930K is just an 8-core with two of its cores disabled.

 

HD 2000 is just a cut-down version of HD 3000.

Actually, some Sandy Bridge i5s have a different die size than the SB i7s (LGA 1155), according to Wikipedia.

Actually, some Sandy Bridge i5s have a different die size than the SB i7s (LGA 1155), according to Wikipedia.

Um, paulsterio specifically said LGA 1150 in his/her post.


Um, paulsterio specifically said LGA 1150 in his/her post.

Haswell chips also have different die sizes.


The Quadro has the monitor attached and the GeForce is on its own for its CUDA core power, not in SLI - similar to having a standalone PhysX card. But you could run an SLI setup next to the Quadro, I'd think.

Yeah, Linus said that the Quadro had the monitor attached. I meant multiple GeForce cards in SLI - as the Nvidia website quoted multiple Tesla cards, and Izam had already said it was like a poor man's version of Maximus.

 

Would multiple monitors all have to come from the Quadro?

 

With you mentioning PhysX... now I have a dumb idea that I may have to try on my new build... ;)

 

 

*Added in... Linus, have you tried utilising the GeForce for gaming...? - or is it a total no-go...?

Apple, Piss Off! ~ Linus 2014

No, you're not hallucinating, or maybe you are... either way, I'm back. ~ Linus 2015


It is a constantly changing environment for a pro, not for a normal YouTuber who uploads videos. If you're a pro, constant calibration matters. If you wonder why he didn't do it, it's because this isn't a pro build; it's for above-average video editors. If you were a pro you wouldn't have any of this stuff and you wouldn't have watched this video; it's for normal people who want exceptional videos.

 

Well, why then the need for 10-bit output? ;)

This build is IMHO not very well thought through. There is no real need to perfectly fine-tune the color output of videos that will be watched on uncalibrated and probably crappy monitors anyway. (Not saying you shouldn't do it for reference.)

Mini-Desktop: NCASE M1 Build Log
Mini-Server: M350 Build Log


Well, why then the need for 10-bit output? ;)

This build is IMHO not very well thought through. There is no real need to perfectly fine-tune the color output of videos that will be watched on uncalibrated and probably crappy monitors anyway. (Not saying you shouldn't do it for reference.)

 

Um, what if they are people like me, whose color vision is above average and who shudder while working on an average monitor? To enjoy their work? To save precious time? To get better results?

 

Even little adjustments might result in a more pro-looking clip. Those little details are far simpler and faster to spot and adjust on that kind of monitor. My impression is that they do not have that much time, so they choose tools that help with the workflow.

Don't forget well-being: the less your work surroundings and your tools exhaust your concentration, the longer you are able to stay concentrated and productive. In a way it's the same as looking for a good mouse, clean air, natural light, a good chair, avoiding noise, ...

It is IMHO a well-thought-through decision.

 

And they are thinking about 4K ...

 

Perfectly fine-tune => if that were the goal they wouldn't pick that monitor. It has 99% Adobe RGB coverage, but not pro-grade 'black' all around (light bleeding into black, brightness and colors varying across different areas of the screen...). The 'real' ones cost way more; see e.g. the NEC SpectraView Reference or even more expensive pro monitors.

 

There is always a middle ground; IMHO they chose the better-than-8-bit option, with solid but not 'best' capabilities - i.e. still affordable.

