GabenJr

Nvidia, you PROMISED!

Recommended Posts

@LinusTech You have impressed me: an entire video with a new product and you didn't drop it yet.


PC - NZXT 340 Black, Intel i7 6700k, Noctua NH-U9S, 16GB Corsair DDR4 2133MHz, Asus H170 Pro Gaming, Gigabyte 1080 OC Windforce, Samsung 860 250GB (OS), Samsung 850 Evo 250GB (games), Samsung 840 Evo 500GB (games)

 

Mac - 1.4GHz i5, 4GB DDR3 1600MHz, Intel HD 5000. x2

 

Endlessly wishing for a BBQ in space.

5 hours ago, Spotty said:

Also RIP AMD who decided to call in sick on RTX3080 Benchmark Day.

Linus hurt it by either dropping it (my guess) or by forcing it to compete against two units and their older sibling, which it had previously lost most challenges against, so it refused to work for him unless he apologized.


My 980 Ti is looking forward to a well-earned retirement at this point; just waiting for the "Buy Now" button to light up to get my hands on one.


My PC: i7 4770 (3.4GHz)

32GB Corsair XMS3 DDR3

Gigabyte G1 Gaming 980 Ti

Asus Z87-A Motherboard

Samsung 120GB SSD

Seagate 2TB HDD

Aerocool Xpredator Evil Black

 

http://www.3dmark.com/fs/8502026


In Nvidia's defense, they never said "twice as fast".

They said "twice as fast per watt", which doesn't even seem to be the case.

So maybe not a great defense.
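The gap between "twice as fast per watt" and "twice as fast" comes down to where on the power curve the claim is measured. A quick sketch with made-up numbers (these are illustrative placeholders, not Nvidia's published figures):

```python
def perf_per_watt(fps, watts):
    """Performance efficiency: frames per second per watt drawn."""
    return fps / watts

# Hypothetical numbers for illustration only.
# Perf/watt marketing claims are typically measured at iso-performance:
# the new card capped to match the old card's frame rate at lower power.
old_full   = perf_per_watt(60, 250)   # old card at full power
new_capped = perf_per_watt(60, 125)   # new card capped to the same fps
new_full   = perf_per_watt(90, 320)   # new card running flat out

print(new_capped / old_full)  # 2.0  -> the marketing number
print(new_full / old_full)    # ~1.17 -> efficiency gain at full power
```

So a card can honestly claim 2x perf/watt while delivering nowhere near 2x the frames when both cards run at their power limits.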


I will never succumb to the New Cult and I reject the leadership of @Aelar_Nailo and his wicked parrot armies led by @FakeCIA and @DildorTheDecent. I will keep my eyes pure and remain dedicated to the path of the One True; IlLinusNati


I know this is an extreme example, perhaps the most extreme example I've found so far, but both 1080p AND 1440p as a bottleneck in the example below gets me thinking that 1080p is only a few generations from dying out like 720p. Might be worth addressing in future 3080/3090 coverage.

 

[Attached image: hard_limit.png]

 

If 1080p and 1440p are the same frame rate, then who in their right mind would opt for worse image quality at 1080p? And this is all without factoring in AI upscaling in future DLSS iterations.


Any YouTubers covering VR performance? The only thing I found was four lines in this article speculating on performance in two games :/
https://arstechnica.com/gaming/2020/09/nvidia-rtx-3080-review-4k-greatness-at-699-and-good-news-for-cheaper-gpus/


2020 AMD Build:

Ryzen 3800x - Asus TUF x570 - Crucial Ballistix 16GB 3600cl16 - ROG Strix GTX1070 OC 8G - EVGA SuperNOVA G2 550W - Sabrent Rocket 1TB

 

2012 Intel Build:

Intel i5-3570k @4.0Ghz - Asus Maximus V Formula - Corsair Vengeance 8GB 1866 - XFX HD7970 GHz - Enermax Revolution87+ 650w - Crucial MX500 500GB

37 minutes ago, Zenith_X1 said:

I know this is an extreme example, perhaps the most extreme example I've found so far, but both 1080p AND 1440p as a bottleneck in the example below gets me thinking that 1080p is only a few generations from dying out like 720p. Might be worth addressing in future 3080/3090 coverage.

If 1080p and 1440p are the same frame rate, then who in their right mind would opt for worse image quality at 1080p? And this is all without factoring in AI upscaling in future DLSS iterations.

A few things need to happen before 1080p dies out like 720p.

 

Low-end 'gaming' cards need to be capable of minimum frame rates above 60 at High settings in the average AAA game at 1440p.

So a $250 GPU needs to be able to pull that off.

 

Monitors need to be of moderate quality and fully tuned for 1440p 120Hz+ (so fast enough pixel response) at a cheap price tag.

So you need to be able to find a sub-$300 monitor that can achieve 120Hz+ at 1440p without significant image quality issues.

 

If both of these happen, then you'll see the 'average' non-enthusiast PC gamer with a 1440p display and a 1440p-capable GPU.

 

Neither of these milestones has been hit yet, but we are most certainly past the halfway point toward 1080p being phased out.


CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w | VDU: Panasonic 42" Plasma |

GPU: Gigabyte 1080ti Gaming OC w/OC & Barrow Block | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + Samsung 850 Evo 256GB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P |


Performance vs the 1080 Ti is impressive; may be time to upgrade for people who skipped the 20-series.


It will be interesting to see how well this performs with F@H, and whether the rumors are true that miners will grab these up the minute they go on sale. My pick is the 3090, though I'm guessing a water-cooled version of that card will still be north of $1400. You really had to be a fool to buy an RTX Titan at $2500 and then drop $200 more for a waterblock and backplate.

 

For anybody at Nvidia reading this, I will gladly take those unsold RTX Titans off your hands at a "reduced price" :D

37 minutes ago, Luscious said:

It will be interesting to see how well this performs with F@H and if the rumors are true that MINERS will go grabbing these up the minute they go on sale. My pick is the 3090 though, but I'm guessing a water-cooled version of that card will still be north of $1400. You really had to be a fool to buy a RTX Titan at $2500 and then drop $200 more for a waterblock+backplate.

 

For anybody at nVidia reading this, I will gladly take those unsold RTX Titans off your hands at a "reduced price" :D

Apparently, since efficiency is so important to mining, the Radeon VII is still on par with the 3080 when comparing Ethereum hash rates. So while I'm sure some miners will buy up 3080s, I don't think all of them will.

 

From what I've seen, the Radeon VII can roughly match the 3080 at slightly lower power; the 3080 can go higher, but that requires an OC and higher power still.

 

So I guess it depends on the individual miner's situation. If power usage doesn't matter, sure, they'll likely buy up 3080s, but if they are already running Radeon VIIs... I don't see the point. Their cards have already paid for themselves.
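The argument above hinges on hash rate per watt rather than raw hash rate. A quick sketch (the figures are placeholders for illustration, not measured values; real numbers depend heavily on memory clocks and tuning):

```python
def mining_efficiency(mh_per_s, watts):
    """Ethereum mining efficiency in MH/s per watt of power draw."""
    return mh_per_s / watts

# Placeholder figures for illustration only.
radeon_vii = mining_efficiency(90, 200)   # tuned Radeon VII
rtx_3080 = mining_efficiency(95, 230)     # 3080 at roughly stock power

# A card with a slightly lower raw hash rate can still win on efficiency,
# which is what matters once the hardware has already paid for itself.
print(f"Radeon VII: {radeon_vii:.3f} MH/s per watt")
print(f"RTX 3080:   {rtx_3080:.3f} MH/s per watt")
```

With numbers like these, the older card edges out the newer one per watt, which is why an existing Radeon VII farm has little reason to upgrade.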



17 hours ago, alextulu said:

What happened with the 5700 XT? Are AMD drivers that bad?

If you are referring to the tests where AMD had N/A rather than a number, most of those were CUDA tests; CUDA is Nvidia's proprietary compute platform. AMD uses OpenCL, and very few programs support both. Typically a program supports either CUDA or OpenCL, so AMD literally cannot compete in a CUDA test.

 

Two takeaways I had from the video:

1. Holy shit, I did not realise Radeon was that much better at CATIA. My workplace is heavily CATIA-focused and all the workstations have Nvidia cards in them, so that came as a bit of a shock.

2. It seems wrong for Linus to primarily compare the 3080 to the 2080. With the 3090 taking the place of the Titan, it seems to me the 3080 should be compared against the 2080 Ti. Or do people think I am missing something here?

 

EDIT: ignore the second part, I re-checked the prices and I can see why you would do it that way.


"Reflex On + Boost" likely means the GPU is ramped up to its boost clock, increasing the framerate while creating more load (and thus less headroom) as a trade-off.

On 9/16/2020 at 8:05 PM, Kierax said:

@LinusTech You have impressed me: an entire video with a new product and you didn't drop it yet.

But the RTX 3080 is much cheaper than the Asus PQ22UC! How can you not be impressed that he didn't drop the OLED monitor??


For the SR-IOV feature, I have now opened a ticket on the NVIDIA forums:

 

https://forums.developer.nvidia.com/t/please-suport-a-limited-version-of-sr-iov-for-geforce-on-linux/154735?u=dreamcat4

On 9/17/2020 at 1:15 PM, Koeshi said:

If you are referring to the tests where AMD had N/A rather than a number, most of those were CUDA tests

But what about the games?

 

I know there are some games with Nvidia proprietary tech, but if you're using Radeon, those games should still run with those effects either disabled or falling back to the CPU (lower performance, but they should still run).

