nicklmg

We THOUGHT this $40,000 PC would break records...


Last time I was this early... I don't remember being this early before.


COMPUTER: Mobile Battlestation  |  CPU: INTEL I7-8700k |  Motherboard: Asus z370-i Strix Gaming  | GPU: EVGA GTX 1080 FTW ACX 3.0 | Cooler: Scythe Big Shuriken 2 Rev. b |  PSU: Corsair SF600 | HDD: Samsung 860 evo 1tb

 


You didn't trace enough rays...


Build Logs: Cotton Candy Threads | ZEN CLARITY + | Just NCASE mITX | Noc Noc | NUC | Storage Log

 

Cotton Candy Threads - CPU AMD Threadripper 2950X | GPU EVGA FTW3 RTX 2080 Ti | MOBO Asus ROG Zenith Extreme | MEMORY 128GB (8x 16GB) Corsair Vengeance RGB 3200 | STORAGE 3x Samsung 960 Evo SSD + 4x Crucial P1 1TB + 2x Seagate Ironwolf 8TB 7.2k HDDs | PSU Corsair HX1200i w/ Cablemod Pro Extensions | COOLING Cooler Master TR4 ML360 | CASE Lian Li O11 Dynamic Black | LIGHTING 2x Corsair HD120 Fans, 4x Corsair Addressable RGB Strips, 2x Corsair Commander Pro | PCPP
 
ZEN CLARITY + - CPU AMD Ryzen 2700X | GPU Radeon VII | MOBO Crosshair VII Hero | MEMORY 32GB (4x 8GB) Corsair Vengeance RGB Pro @ 3200 | STORAGE Samsung 960 Pro SSD + 2x SanDisk Ultra II SSDs | PSU Corsair RM1000i | COOLING Corsair H150i Pro | CASE Crystal 570X | LIGHTING 6x Corsair SP120 Fans, Cablemod Addressable RGB Strip, Corsair Commander Pro | PCPP
 
Just NCASE mITX - CPU Intel Core i7 8700K @ 5.2GHz | GPU EVGA RTX 2080 Ti XC | MOBO Asus Z370-I Gaming | MEMORY 16GB (2x 8GB) G.Skill Triden-Z RGB 3000 | STORAGE Samsung 960 Evo 500GB SSD + Corsair MX500 1TB M.2 SSD | PSU Corsair SF600 | COOLING Noctua NH-U9S w/ Redux Push/Pull Fans | CASE NCase M1v5 | LIGHTING 2x Cablemod Addressable RGB Strips | PCPP
 
Noc Noc, Who's There? - CPU AMD Threadripper 1950X | GPU ASUS RTX 2080 Ti OC | MOBO ASRock X399M Taichi | MEMORY 32GB (4x 8GB) GSkill Trident-Z 3200 | STORAGE Samsung 970 Evo SSD | PSU Corsair HX1000i w/ Cablemod Pro B&W Kit | COOLING Noctua U9 TR4 w/ 2x Redux 92mm | CASE Corsair 280X White | FANS 6x Noctua 140mm Redux | PCPP

I love GN. LTT for entertainment, GN for more in-depth stuff.


Beta!  X570 VRM + Features list (PM me if you want to help) 

 

Will prob be making an X570 "tier" list when that is completed.

Some Custom Loop Questions, please help answer them.

 

PC (Main)

 

CPU: i5-8400 CPU Cooler: Cryorig M9 Plus   Motherboard: Gigabyte B360M DS3H | RAM: Crucial Ballistix Sport 2x8 DDR4-2400

 Boot/OS SSD: Inland 480GB SSD | Video Card: RX 570 4GB Strix OC | Case: Fractal Design Meshify C White TG (11/10) PSU: EVGA SuperNOVA G3 750

Monitor: Sceptre 24" 1080p 75hz Webcam: Logitech C920s

 

NAS:

Synology DS418J w/ 4x WD Red Pro 6TB RAID 10 (used 7.3/12TB)

 

Phone/Tablet:

iPhone XR 64GB iPad Mini 4 128GB

 

Laptops:

Dell XPS 15 9570 i7-8750H + 1050 ti + 20GB ram (16+4) + 1TB EX920 SSD

 

My old computers:

Athlon XP + 2GB DDR1 (!) + R9600 Pro 256MB Core 2 Quad Q8400 @ 3.4ghz + GTX 275 | i5-3570k @ 4.4ghz + GTX 

14 minutes ago, nicklmg said:

-snip-

More like 75k: 10k each for the plat 8180s, 9k each for the GV100s, plus all the RAM, the case, etc.




What?! Quadros are no good for gaming? Aww... I really thought they would be great...

 

 

 

 

:P


CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w | VDU: Panasonic 42" Plasma |

GPU: Gigabyte 1080ti Gaming OC w/OC & Barrow Block | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + Samsung 850 Evo 256GB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P |


What active DP -> HDMI adapter was used? I just tried the Plugable one (https://smile.amazon.com/Plugable-DisplayPort-Supports-displays-3840x2160/dp/B00S0C7QO8/ref=sr_1_3?ie=UTF8&qid=1537911002&sr=8-3&keywords=plugable+displayport+to+hdmi&dpID=41%2B2oAPV91L&preST=_SX300_QL70_&dpSrc=srch) with my LG 43MU79 (same monitor, just with a 3-year warranty) and it would not work at 4K @ 60 Hz. I know in the video it's used at 1080p, but I thought if it was a different brand I would give it a try (the Plugable is the only one I've found so far that does 4K @ 60 Hz).

6 minutes ago, Eryk Bartzson said:

If you have a bottleneck to the video cards, wouldn't Threadripper provide an answer?

Threadripper is too slow to bench with.

 

It's almost never used in 3DMark submissions.


Intel Core i7 5820K | BeQuiet! Dark Rock Pro 4 | ASUS RVE | Trident Z 3200MHz 4x4GB | GTX 980 K|NGP|N 2-Way SLi

Samsung Galaxy S7 Edge Black 32GB | Exynos 8890 Octa | SanDisk Ultra 200GB SDXC

1 | 2 | 3 | 4 | Valley | Superposition


Nvidia seriously needs to do something about their "control panel." When a graphics card costs more than an entire PC, and there's no software, let alone an OS, to handle the horsepower, graphics card manufacturers need to build software to direct those resources.

3 hours ago, Akkeio said:

Nvidia seriously needs to do something about their "control panel." When a graphics card costs more than an entire PC, and there's no software, let alone an OS, to handle the horsepower, graphics card manufacturers need to build software to direct those resources.

Not really Nvidia's fault here; keep in mind these cards are specialized for large data calculations, not gaming. And Linus states that the cards won't run in SLI, so it's more like each 1920x1080 region of the monitor just gets the performance of one card, not one 4K monitor getting the pushing power of all 4 cards behind it; much different [could be wrong here; not fully aware of the potential of that add-in card he is using to connect all the GPUs]. Plus, just because something is super fast doesn't mean the game is optimized to use all the power.

 

Even if the PCIe add-in card is somehow "connecting" GPU power together, I am sure it's nowhere near as efficient as SLI or NVLink.
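The "each 1920x1080 region gets one card" idea can be pictured with a short sketch. This is purely a hypothetical illustration of independent per-quadrant rendering (not how NVIDIA's actual mosaic/spanning hardware schedules work): each pixel of a 3840x2160 surface is assigned to one of four GPUs by quadrant, so no card ever contributes to another card's region.

```python
# Hypothetical sketch: assign each pixel of a 3840x2160 surface to one of
# four GPUs, one per 1920x1080 quadrant (no pooling, as with SLI disabled).

def gpu_for_pixel(x: int, y: int) -> int:
    """Return the index (0-3) of the GPU responsible for pixel (x, y)."""
    col = x // 1920  # 0 = left half, 1 = right half
    row = y // 1080  # 0 = top half, 1 = bottom half
    return row * 2 + col

# Sample a grid of pixels: each GPU ends up with exactly a quarter of them.
counts = [0, 0, 0, 0]
for y in range(0, 2160, 40):
    for x in range(0, 3840, 40):
        counts[gpu_for_pixel(x, y)] += 1
```

Under this model a game that only draws to one quadrant would only ever see one card's worth of performance, which matches the "no SLI pooling" point above.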

2 hours ago, Ericarthurc said:

Not really Nvidia's fault here; keep in mind these cards are specialized for large data calculations, not gaming. And Linus states that the cards won't run in SLI, so it's more like each 1920x1080 region of the monitor just gets the performance of one card, not one 4K monitor getting the pushing power of all 4 cards behind it; much different [could be wrong here; not fully aware of the potential of that add-in card he is using to connect all the GPUs]. Plus, just because something is super fast doesn't mean the game is optimized to use all the power.

 

Even if the PCIe add-in card is somehow "connecting" GPU power together, I am sure it's nowhere near as efficient as SLI or NVLink.

I'm not saying it's Nvidia's fault, but there is no operating system or software to control the usage of multiple GPUs' resources. It's easy to download third-party overclocking software to direct system resources, but that's at the user's risk of blowing a $9,000 card, you know. Windows does not have a system to manage GPU resources the way it manages hard drives in RAID 0. If you wanted to use your GPU to watch a 40GB 4K movie in VLC player, for example, your onboard graphics would not be able to handle that amount of video memory; video players can't get GPU resources through Windows. Point is: why is the GPU locked to specific programs?

2 hours ago, Akkeio said:

I'm not saying it's Nvidia's fault, but there is no operating system or software to control the usage of multiple GPUs' resources. It's easy to download third-party overclocking software to direct system resources, but that's at the user's risk of blowing a $9,000 card, you know. Windows does not have a system to manage GPU resources the way it manages hard drives in RAID 0. If you wanted to use your GPU to watch a 40GB 4K movie in VLC player, for example, your onboard graphics would not be able to handle that amount of video memory; video players can't get GPU resources through Windows. Point is: why is the GPU locked to specific programs?

Wait what? That is not true... in the slightest.


You guys left in the part where Linus says NVLink isn't intended for SLI. I assume it was written (and probably filmed?) before Nvidia announced native SLI over NVLink on the 2000 series, lul


Delidded 3770k 4.4GHz | Sapphire Nitro+ Special Edition RX 580 1550MHz/2250MHz  | #2 FireStrike Extreme & #2 Superposition 1080p Xtreme | 32GB DDR3 1600MHz

7 minutes ago, Akkeio said:

Which part, my good Sir...?

All of it.

 

Quote

"I'm not saying it's Nvidia's fault, but there is no operating system or software to control the usage of multiple GPUs' resources."

Windows 10 can choose which GPU to use. Rendering software can choose which GPU to use. Servers can pool/split resources...

Quote

"It's easy to download third-party overclocking software to direct system resources, but that's at the user's risk of blowing a $9,000 card, you know."

You can safely overclock if you wish. But these are not consumer-grade cards; you don't overclock them, you overclock 1080 Tis.

Quote

"Windows does not have a system to manage GPU resources"

See above.

Quote

"...the way it manages hard drives in RAID 0. If you wanted to use your GPU to watch a 40GB 4K movie in VLC player, for example, your onboard graphics would not be able to handle that amount of video memory..."

A 7th-gen Intel chip with "onboard graphics" does Netflix 4K. You don't load the entire video into graphics memory. You don't even load the entire movie into RAM! You load the few frames being displayed.
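The arithmetic backs this up. A back-of-the-envelope sketch (illustrative numbers only; real players work on compressed streams with hardware decoders, so actual memory use is even lower): even fully uncompressed 4K frames in a generous decode buffer are a tiny fraction of a 40 GB file.

```python
# Back-of-the-envelope: memory needed to buffer decoded 4K frames
# versus the size of the whole 40 GB movie file.

WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 3                      # 24-bit RGB, uncompressed

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
buffer_frames = 30                       # a generous decode queue (~1 second)
buffer_bytes = frame_bytes * buffer_frames

file_bytes = 40 * 1024**3                # the 40 GB movie file

print(f"one frame:       {frame_bytes / 1024**2:.1f} MiB")
print(f"30-frame buffer: {buffer_bytes / 1024**2:.0f} MiB")
print(f"buffer / file:   {buffer_bytes / file_bytes:.2%}")
```

Even this worst-case uncompressed buffer is under 2% of the file, which is why "load the whole movie into VRAM" was never how playback worked.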

 

Quote

"...video players can't get GPU resources through Windows. Point is: why is the GPU locked to specific programs?"

See above again: https://www.howtogeek.com/351522/how-to-choose-which-gpu-a-game-uses-on-windows-10/

Or https://www.engadget.com/2018/08/15/nvidia-turing-quadro-8k-video-playback/

1 hour ago, TechyBen said:

All of it.

 

Windows 10 can choose which GPU to use. Rendering software can choose which GPU to use. Servers can pool/split resources...

You can safely overclock if you wish. But these are not consumer-grade cards; you don't overclock them, you overclock 1080 Tis.

See above.

A 7th-gen Intel chip with "onboard graphics" does Netflix 4K. You don't load the entire video into graphics memory. You don't even load the entire movie into RAM! You load the few frames being displayed.

 

See above again: https://www.howtogeek.com/351522/how-to-choose-which-gpu-a-game-uses-on-windows-10/

Or https://www.engadget.com/2018/08/15/nvidia-turing-quadro-8k-video-playback/

Ok, I understand where you are coming from. Point is: if I am paying ~$800 for a 1080 Ti and I am playing GTA V on my computer and my frame rate is dropping, how do I know specifically what is causing the problem? "But bro, I can tell you that the processor in your computer is not powerful enough to run the game." No, I should not need to outsource programs from third-party companies to tell me how to control my NVIDIA graphics card.

 

Nvidia gave Linus 4 commercial-grade video rendering cards for jumbotron-like screens; why is it so difficult for him to connect all 4 together to run on 1 display? Is there no system or software to connect them to one output? "Nvidia, why you no help Linus?"

 

GPUs have nothing to do with gaming. Professionally they are used for things like movie animation, special-FX video rendering, etc. Gaming GPUs are low-end cards. How is it so difficult to get a video card integrated into a motherboard?

14 hours ago, Akkeio said:

Ok, I understand where you are coming from. Point is: if I am paying ~$800 for a 1080 Ti and I am playing GTA V on my computer and my frame rate is dropping, how do I know specifically what is causing the problem? "But bro, I can tell you that the processor in your computer is not powerful enough to run the game." No, I should not need to outsource programs from third-party companies to tell me how to control my NVIDIA graphics card.

Windows Resource Monitor. HDD/SSD vs RAM vs CPU... then GPU.

I can tell you from your hardware, without even testing, if you like. People know the hardware that well (others have tested, so you don't need to).

Your example has *nothing* to do with the GPU. If your CPU cannot play GTA, it is the CPU. If your GPU cannot play GTA, it is your GPU. What do you mean, "I need to control my NVIDIA graphics card"? Control it to do what?
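The "which component is the bottleneck" check above can be made concrete with a crude heuristic. This is a simplistic illustrative sketch (the thresholds are made up; the utilization numbers would come from tools like Task Manager, Resource Monitor, or nvidia-smi, and real diagnosis is more nuanced):

```python
# Simplistic heuristic: given sampled utilization percentages while the
# game is running, guess which component is holding the frame rate back.
# Thresholds are illustrative, not authoritative.

def likely_bottleneck(cpu_pct: float, gpu_pct: float, disk_pct: float) -> str:
    if gpu_pct > 95:
        return "GPU"           # GPU fully busy: lower settings or faster card
    if cpu_pct > 90:
        return "CPU"           # GPU starved: CPU can't feed draw calls fast enough
    if disk_pct > 90:
        return "storage"       # asset streaming is stalling frames
    return "none obvious"      # possibly a frame cap, drivers, or thermals

print(likely_bottleneck(cpu_pct=45, gpu_pct=99, disk_pct=10))  # GPU-bound case
print(likely_bottleneck(cpu_pct=97, gpu_pct=60, disk_pct=5))   # CPU-bound case
```

The point being made above is exactly this: the numbers already exist in the OS's own monitoring tools; no vendor "control" software is needed to read them.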

 

To do REAL resource checking, you need extremely specialized software, the kind developers use:

https://docs.unrealengine.com/en-us/Engine/Performance/GPU

That's not for you or me; it's for the OS developers and game engine developers/game designers.

 

Quote

Nvidia gave Linus 4 commercial-grade video rendering cards for jumbotron-like screens; why is it so difficult for him to connect all 4 together to run on 1 display? Is there no system or software to connect them to one output? "Nvidia, why you no help Linus?"

It's not difficult. It is expensive and needs custom hardware (like the monitor he chose, or a combining card/adapter of some sort). These are for video display and/or CAD and custom code, sometimes for expensive broadcast or entertainment industries. See Disney and its *custom* NVIDIA setup for the video/VR rides! Cards that are not for "gaming".

 

Quote

GPUs have nothing to do with gaming. Professionally they are used for things like movie animation, special-FX video rendering, etc. Gaming GPUs are low-end cards. How is it so difficult to get a video card integrated into a motherboard?

What? GPUs are both. NVidia and AMD consumer cards are nearly identical to their commercial products, except for more RAM/shaders/colour bits. The difference is in driver support: NVidia/AMD lock gaming support to consumer cards and industry support to commercial ones. If you can trick the BIOS/ID register, you can do either, with no change in performance/features (minus the locked-down shader pipelines/cut-off registers/lack of memory).

 

Gaming GPUs are not, then, "low end"; they are nearly identical. "Integrated into a motherboard"... why? Huh? What? Again, all levels of wrong here. xD

