We THOUGHT this $40,000 PC would break records...

nicklmg

Last time I was this early... I don't remember being this early before.

COMPUTER: Mobile Battlestation | CPU: Intel i7-8700K | Motherboard: ASUS Z370-I Strix Gaming | GPU: EVGA GTX 1080 FTW ACX 3.0 | Cooler: Scythe Big Shuriken 2 Rev. B | PSU: Corsair SF600 | SSD: Samsung 860 EVO 1TB

 

You didn't trace enough rays...

14 minutes ago, nicklmg said:

-snip-

More like $75k: $10k each for the Platinum 8180s, $9k each for the GV100s, plus all the RAM, the case, etc.

WHAT?! Quadros no good for gaming... aww... I really thought they would be great...

:P

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |

What active DP -> HDMI adapter was used? I just tried the Plugable one (https://smile.amazon.com/Plugable-DisplayPort-Supports-displays-3840x2160/dp/B00S0C7QO8/ref=sr_1_3?ie=UTF8&qid=1537911002&sr=8-3&keywords=plugable+displayport+to+hdmi&dpID=41%2B2oAPV91L&preST=_SX300_QL70_&dpSrc=srch) with my LG 43MU79 (same monitor, just with a 3-year warranty) and it would not work at 4K @ 60 Hz. I know in the video it's used at 1080p, but I thought if it was a different brand I would give it a try (the Plugable is the only one I've found so far that does 4K @ 60 Hz).
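For what it's worth, the bandwidth math shows why a 4K @ 60 Hz adapter has to be HDMI 2.0 class. A quick sketch (the timing numbers are the standard CTA-861 values for 4K60, not anything from the video):

```python
# Why a DP -> HDMI adapter must be HDMI 2.0 class (600 MHz TMDS clock)
# to carry 4K @ 60 Hz. CTA-861 4K60 timing uses a 594 MHz pixel clock
# once blanking intervals are included.

ACTIVE_W, ACTIVE_H, REFRESH = 3840, 2160, 60
TOTAL_W, TOTAL_H = 4400, 2250          # 4K60 timing incl. blanking (CTA-861)

pixel_clock_hz = TOTAL_W * TOTAL_H * REFRESH          # 594 MHz
raw_payload_bps = ACTIVE_W * ACTIVE_H * REFRESH * 24  # 24-bit colour

HDMI_1_4_MAX_CLOCK = 340e6   # Hz -> tops out around 4K30
HDMI_2_0_MAX_CLOCK = 600e6   # Hz -> needed for 4K60

print(f"pixel clock: {pixel_clock_hz / 1e6:.0f} MHz")      # 594 MHz
print(f"payload: {raw_payload_bps / 1e9:.1f} Gbit/s")      # ~11.9 Gbit/s
print("needs HDMI 2.0:", pixel_clock_hz > HDMI_1_4_MAX_CLOCK)
```

So an adapter whose TMDS clock tops out at 340 MHz (HDMI 1.4 class) physically can't do 4K60, which would explain the failure regardless of brand.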

What if you NVLinked the cards in pairs, and then used the Quadro Sync board with the two pairs?

If you have a bottleneck to the video cards, wouldn't Threadripper provide an answer?

6 minutes ago, Eryk Bartzson said:

If you have a bottleneck to the video cards, wouldn't Threadripper provide an answer?

Threadripper is too slow to bench with.

 

It's almost never used in 3DMark submissions.

Our Grace. The Feathered One. He shows us the way. His bob is majestic and shows us the path. Follow unto his guidance and His example. He knows the one true path. Our Saviour. Our Grace. Our Father Birb has taught us with His humble heart and gentle wing the way of the bob. Let us show Him our reverence and follow in His example. The True Path of the Feathered One. ~ Dimboble-dubabob III

Nvidia needs to seriously do something about their "control panel." When a graphics card costs more than an entire PC, and there's no software, let alone an OS, to handle the horsepower, graphics card manufacturers need to build software to direct the resources.

3 hours ago, Akkeio said:

Nvidia needs to seriously do something about their "control panel." When a graphics card costs more than an entire PC, and there's no software, let alone an OS, to handle the horsepower, graphics card manufacturers need to build software to direct the resources.

Not really Nvidia's fault here; keep in mind these cards are specialized for large data calculations, not gaming. And Linus states that the cards won't run in SLI, so it's more like each 1920x1080 space on the monitor is just getting the performance of one card, not one 4K monitor getting the pushing power of all four cards behind it; much different [could be wrong here; not fully aware of the potential of that add-in card he is using to connect all the GPUs]. Plus, just because something is super fast doesn't mean the game is optimized to use all the power.

 

Even if the PCIe add-in card is somehow "connecting" GPU power together, I am sure it's nowhere near as efficient as SLI or NVLink.
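To illustrate the "one card per quadrant" idea (my own sketch of the concept, not how the actual setup works internally): a 4K surface split into four 1080p tiles, each tile driven by its own GPU, so no single tile ever sees more than one card's power.

```python
# Hypothetical sketch: a 3840x2160 surface split into four 1920x1080
# tiles, each assigned to its own GPU (no SLI), as described above.

def gpu_for_pixel(x: int, y: int, tile_w: int = 1920, tile_h: int = 1080) -> int:
    """Map a pixel on a 3840x2160 surface to the GPU driving its tile."""
    col = x // tile_w      # 0 = left half, 1 = right half
    row = y // tile_h      # 0 = top half, 1 = bottom half
    return row * 2 + col   # GPU index 0..3

# The four corners of the screen land on four different GPUs:
print(gpu_for_pixel(0, 0))        # 0 (top-left)
print(gpu_for_pixel(3839, 0))     # 1 (top-right)
print(gpu_for_pixel(0, 2159))     # 2 (bottom-left)
print(gpu_for_pixel(3839, 2159))  # 3 (bottom-right)
```

Under that model, a game rendering into one quadrant only ever benefits from the single card behind it, which matches the "no SLI scaling" behavior in the video.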

Can someone tell me what case he's using in this video?

 

Thanks

2 hours ago, Ericarthurc said:

Not really Nvidia's fault here; keep in mind these cards are specialized for large data calculations, not gaming. And Linus states that the cards won't run in SLI, so it's more like each 1920x1080 space on the monitor is just getting the performance of one card, not one 4K monitor getting the pushing power of all four cards behind it; much different [could be wrong here; not fully aware of the potential of that add-in card he is using to connect all the GPUs]. Plus, just because something is super fast doesn't mean the game is optimized to use all the power.

 

Even if the PCIe add-in card is somehow "connecting" GPU power together, I am sure it's nowhere near as efficient as SLI or NVLink.

I'm not saying it's Nvidia's fault, but there is no operating system or software to control the usage of multiple GPU resources. It's easy to download third-party overclocking software to direct the system resources, but that's at the user's risk of blowing a $9000 card, you know. Windows does not have a system to manage GPU resources like hard drives in RAID 0. If you wanted to use your GPU to watch a 40GB 4K movie in VLC player, for example, your onboard graphics would not be able to handle that amount of video memory; video players can't get GPU resources through Windows. Point is: why is the GPU locked to specific programs?

2 hours ago, Akkeio said:

I'm not saying it's Nvidia's fault, but there is no operating system or software to control the usage of multiple GPU resources. It's easy to download third-party overclocking software to direct the system resources, but that's at the user's risk of blowing a $9000 card, you know. Windows does not have a system to manage GPU resources like hard drives in RAID 0. If you wanted to use your GPU to watch a 40GB 4K movie in VLC player, for example, your onboard graphics would not be able to handle that amount of video memory; video players can't get GPU resources through Windows. Point is: why is the GPU locked to specific programs?

Wait what? That is not true... in the slightest.

thx Linus! I just saved myself from spending $40k on my new workstation 9_9

Is that laptop a Xiaomi Mi Notebook Air? Is a review dropping anytime soon?

You guys left in the part where Linus says NVLink isn't intended for SLI. I assume it was written (and probably filmed?) before Nvidia announced native SLI over NVLink on the 2000 series, lul

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 

Adding my request to jls825's: what is the case used in this build, please?

 

On 9/26/2018 at 6:12 AM, TechyBen said:

Wait what? That is not true... in the slightest.

Which part, my good Sir...?

7 minutes ago, Akkeio said:

Which part, my good Sir...?

All the part.

 

Quote

"I'm not saying it's Nvidia's fault, but there is no operating system or software to control the usage of multiple GPU resources."

Windows 10 can choose which GPU to use. Rendering software can choose which GPU to use. Servers can pool/split resources...

Quote

"It's easy to download third-party overclocking software to direct the system resources, but that's at the user's risk of blowing a $9000 card, you know."

You can safely overclock if you wish. But these are not consumer-grade cards; you don't overclock them, you overclock 1080 Tis.

Quote

"Windows does not have a system to manage GPU system resources"

See above.

Quote

"...like hard drives in RAID 0. If you wanted to use your GPU to watch a 40GB 4K movie in VLC player, for example, your onboard graphics would not be able to handle that amount of video memory..."

A 7th-gen Intel chip with "onboard graphics" does Netflix 4K. You don't load the entire video into graphics memory. You don't even load the entire movie into RAM! You load the few frames being displayed.
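A toy sketch of that point (nothing here comes from any real player; the chunk size and the fake file are made up): streaming reads keep memory bounded by the chunk size, not the file size.

```python
# Illustrative only: a player streams a video in small chunks, decoding
# only the frames it is about to display, so peak memory stays roughly
# one chunk no matter how big the file is. Real players use hardware
# decoders; this just demonstrates the bounded-memory idea.
import io

def stream_chunks(f, chunk_size=1 << 20):
    """Yield fixed-size chunks; peak memory ~= one chunk, not the file."""
    while True:
        chunk = f.read(chunk_size)
        if not chunk:
            break
        yield chunk

# A stand-in for a huge movie file, read 1 MiB at a time:
fake_movie = io.BytesIO(b"\x00" * (4 << 20))      # 4 MiB stand-in
sizes = [len(c) for c in stream_chunks(fake_movie)]
print(max(sizes))   # never exceeds the 1 MiB chunk size
```

Scale the stand-in up to 40 GB and the per-chunk memory footprint doesn't change, which is why file size and VRAM size are unrelated for playback.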

 

Quote

"...video players can't get GPU resources through Windows. Point is: why is the GPU locked to specific programs?"

See above again: https://www.howtogeek.com/351522/how-to-choose-which-gpu-a-game-uses-on-windows-10/

Or https://www.engadget.com/2018/08/15/nvidia-turing-quadro-8k-video-playback/

1 hour ago, TechyBen said:

All the part.

 

Windows 10 can choose which GPU to use. Rendering software can choose which GPU to use. Servers can pool/split resources...

You can safely overclock if you wish. But these are not consumer-grade cards; you don't overclock them, you overclock 1080 Tis.

See above.

A 7th-gen Intel chip with "onboard graphics" does Netflix 4K. You don't load the entire video into graphics memory. You don't even load the entire movie into RAM! You load the few frames being displayed.

 

See above again: https://www.howtogeek.com/351522/how-to-choose-which-gpu-a-game-uses-on-windows-10/

Or https://www.engadget.com/2018/08/15/nvidia-turing-quadro-8k-video-playback/

OK, I understand where you are coming from. Point is: if I am paying ~$800 for a 1080 Ti and I am playing GTA V on my computer and my frame rate is dropping, how do I know specifically what is causing the problem? "But bro, I can tell you that the processor in your computer is not powerful enough to run the game." No, I should not need to source programs from third-party companies to tell me how to control my NVIDIA graphics card.

 

Nvidia gave Linus 4 commercial-grade video rendering cards for jumbotron-like screens; why is it so difficult for him to connect all 4 together to run on 1 display? Is there no system or software to connect them to one output? "Nvidia, why you no help Linus?"

 

GPUs have nothing to do with gaming. Professionally they are used for things like movie animation, special FX, video rendering, etc. Gaming GPUs are low-end cards. How is it so difficult to get a video card integrated into a motherboard?

14 hours ago, Akkeio said:

OK, I understand where you are coming from. Point is: if I am paying ~$800 for a 1080 Ti and I am playing GTA V on my computer and my frame rate is dropping, how do I know specifically what is causing the problem? "But bro, I can tell you that the processor in your computer is not powerful enough to run the game." No, I should not need to source programs from third-party companies to tell me how to control my NVIDIA graphics card.

Windows Resource Monitor. HDD/SSD vs RAM vs CPU... then GPU.

I can tell you from your hardware, without even testing, if you like. People know the hardware that well (others have tested, so you don't need to).

Your example has *nothing* to do with the GPU. If your CPU cannot play GTA, it is the CPU. If your GPU cannot play GTA, it is your GPU. What do you mean "I need to control my NVIDIA graphics card"? Control it to do what?
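A toy way to picture that distinction (the millisecond numbers below are invented for illustration): per frame, the CPU prepares work and the GPU renders it; since the stages overlap, whichever side takes longer gates your FPS, and that side is the bottleneck.

```python
# Toy illustration of CPU-bound vs GPU-bound frame pacing. The timings
# are made up; real numbers come from profilers, not from this sketch.

def bottleneck(cpu_ms: float, gpu_ms: float) -> str:
    """Name the slower stage and the frame rate it caps you at."""
    frame_ms = max(cpu_ms, gpu_ms)   # stages overlap; slower one gates FPS
    fps = 1000 / frame_ms
    side = "CPU" if cpu_ms >= gpu_ms else "GPU"
    return f"{side}-bound at ~{fps:.0f} FPS"

print(bottleneck(cpu_ms=8.0, gpu_ms=16.7))   # GPU-bound at ~60 FPS
print(bottleneck(cpu_ms=22.0, gpu_ms=10.0))  # CPU-bound at ~45 FPS
```

This is why swapping in a faster GPU does nothing for the second case: the CPU's 22 ms per frame still caps you at ~45 FPS.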

 

To do REAL resource checking, you need extremely specialized software, the kind developers use:

https://docs.unrealengine.com/en-us/Engine/Performance/GPU

That's not for you or I, it's for the OS developers and game engine developers/game designers.

 

Quote

Nvidia gave Linus 4 commercial-grade video rendering cards for jumbotron-like screens; why is it so difficult for him to connect all 4 together to run on 1 display? Is there no system or software to connect them to one output? "Nvidia, why you no help Linus?"

It's not difficult. It is expensive and needs custom hardware (like the monitor he chose, or a combining card/adapter of some sort). These cards are for video display and/or CAD and custom code, sometimes for expensive broadcast or entertainment industries. See Disney and its *custom* NVIDIA setup for the video/VR rides! Cards that are not for "gaming".

 

Quote

GPUs have nothing to do with gaming. Professionally they are used for things like movie animation, special FX, video rendering, etc. Gaming GPUs are low-end cards. How is it so difficult to get a video card integrated into a motherboard?

What? GPUs are both. NVIDIA and AMD consumer cards are nearly identical to their commercial products, except for more RAM/shaders/colour bits. The difference is in driver support: NVIDIA/AMD lock gaming support to consumer cards and industry support to commercial ones. If you can trick the BIOS/ID reg, you can do either, with no change in performance/features (minus the locked-down shader pipelines/cut-off registers/lack of memory).

 

Gaming GPUs are not, then, "low end"; they are nearly identical. "Integrated into a motherboard"... why? Huh? What? Again, all levels of wrong here. xD
