
Why Linus almost always uses NVIDIA GPUs

5 hours ago, Nuklearfire said:

Most of the time the money doesn't come from Nvidia or AMD; it comes from manufacturers like ASUS or Gigabyte, and they sell both AMD and Nvidia cards, so that can't be the reason.

 

But guys and girls (don't let me forget them), my point from the beginning of this thread was a comparison at the same budget and the performance you get for it.

There are various benchmark sites, but they sometimes show really big differences.

 

On the site I mostly look at, there is an

 

RTX 2060 Super with 13,830 3DMark points for $409, and a

Radeon RX 5700 XT with 13,819 3DMark points for $389.99.

 

They are so tight that I don't see a reason to spend an extra $20 for 11 benchmark points, especially when I see differences of up to 20 points just from running the same benchmark three times in a row on the same system.

 

Who cares about 3DMark?

 

I want to see real-world performance, not a benchmark.

 


33 minutes ago, Voluspa said:

@Nuklearfire

Price per point?

$409 / 13,830 points ≈ $0.0296 per point.

$389.99 / 13,819 points ≈ $0.0282 per point.

Almost identical. Factor that in with how many NVIDIA cards there are, people being comfortable with them (keep in mind people will pay more to stick with something they know), typically better driver support, better optimization, and the power efficiency. I'd be willing to bet that if someone here who's better at math than me sat down and did it, the 2060 Super would end up cheaper by EOL based on power savings over the life of the cards.

Can't forget RTX. That roughly $19 premium (about $0.0014 per point) is also buying a feature that Radeon doesn't have.

 

You're also saying you don't care about the efficiency of the machine because power is cheap. The 2060 Super is 175 W compared to the 5700 XT at 225 W; that's about 22% less power. That's a huge difference.

 

Eh. TBH you won't notice much of a difference in your electric bill over the life of the PC.

 

Sure, you might save a few bucks over a year or so, but does that really matter? If you're worried about a couple of bucks a year, you probably aren't spending $400 on a video card.
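For anyone who wants to check these numbers, here is a minimal Python sketch of the cost-per-point and electricity math. The prices, 3DMark scores, and board power figures are the ones quoted in this thread; the hours per day, electricity price, and card lifetime are placeholder assumptions, not numbers anyone here has posted.

```python
# Rough cost-per-point and running-cost comparison.
# Prices, 3DMark scores, and board power come from the posts above;
# gaming hours, electricity price, and lifetime are assumptions.

CARDS = {
    "RTX 2060 Super": {"price": 409.00, "score": 13830, "watts": 175},
    "RX 5700 XT":     {"price": 389.99, "score": 13819, "watts": 225},
}

HOURS_PER_DAY = 2      # assumed full-load gaming time per day
PRICE_PER_KWH = 0.13   # assumed electricity price in USD
YEARS = 3              # assumed useful life of the card

for name, c in CARDS.items():
    dollars_per_point = c["price"] / c["score"]
    kwh_per_year = c["watts"] * HOURS_PER_DAY * 365 / 1000
    lifetime_power_cost = kwh_per_year * PRICE_PER_KWH * YEARS
    print(f"{name}: ${dollars_per_point:.4f}/point, "
          f"~${lifetime_power_cost:.0f} in electricity over {YEARS} years")
```

With those assumptions the electricity gap works out to roughly $5 per year (about $14 over three years), so most of the ~$19 price difference does come back by end of life, but it stays within the "few bucks over a year" range described above.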


4 hours ago, Voluspa said:

You're also saying you don't care about the efficiency of the machine because power is cheap. The 2060 Super is 175 W compared to the 5700 XT at 225 W; that's about 22% less power. That's a huge difference.

Out of how many kWh?

And how much of that 25% do you actually see, given the 2060 isn't running at max power the whole time you're gaming?

 

Monthly household usage:

  • Colorado homes use 687 kWh
  • North Dakota homes use 1,240 kWh
  • Wyoming homes use 863 kWh

So say you game all-out for 2 hours a day:

175 W x 2 h = 350 Wh/day; x 30 days = 10,500 Wh/month

225 W x 2 h = 450 Wh/day; x 30 days = 13,500 Wh/month

Out of, say, 900,000 Wh (900 kWh) for the whole house,

your GPU's usage compared to the entire house is roughly 1.5%.

Actually:

10,500 / 900,000 ≈ 1.17%

13,500 / 900,000 = 1.5%

A delta of about 0.33 percentage points, so very minuscule.

When you're dropping a grand on a PC and $500+ on a GPU, and you're hooked up to grid (hydro) power, then the 50 W difference between GPUs is a nothing burger, as Kevin O'Leary says all the time.

 

Now, let's get into RVing, boating, off-grid living, and solar.

That's when power factor and efficiency matter, and watt-hours used become very important, because every watt-hour used takes a lot more time to recoup and put back into the battery system.
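Here is the same household-share calculation as a small Python sketch, assuming the 2 h/day full-load figure and the 900 kWh/month house figure used above.

```python
# GPU energy as a share of monthly household consumption, using the
# 2 h/day full-load assumption and the 900 kWh/month household figure
# from the post above.

HOUSE_KWH_PER_MONTH = 900
HOURS_PER_DAY = 2
DAYS_PER_MONTH = 30

for name, watts in (("RTX 2060 Super", 175), ("RX 5700 XT", 225)):
    gpu_kwh = watts * HOURS_PER_DAY * DAYS_PER_MONTH / 1000
    share = gpu_kwh / HOUSE_KWH_PER_MONTH * 100
    print(f"{name}: {gpu_kwh:.1f} kWh/month = {share:.2f}% of the house")
```

This reproduces the ~1.17% vs 1.5% figures and the ~0.33-percentage-point delta quoted above.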

 

 

 

 


7 hours ago, RonnieOP said:

Who cares about 3DMark?

 

I want to see real-world performance, not a benchmark.

That's my point.

My question in this thread was why they don't do a video taking GPUs at a few price points and comparing them in real life.


3 hours ago, amdorintel said:

Out of how many kWh?

And how much of that 25% do you actually see, given the 2060 isn't running at max power the whole time you're gaming?

 

Monthly household usage:

  • Colorado homes use 687 kWh
  • North Dakota homes use 1,240 kWh
  • Wyoming homes use 863 kWh

So say you game all-out for 2 hours a day:

175 W x 2 h = 350 Wh/day; x 30 days = 10,500 Wh/month

225 W x 2 h = 450 Wh/day; x 30 days = 13,500 Wh/month

Out of, say, 900,000 Wh (900 kWh) for the whole house,

your GPU's usage compared to the entire house is roughly 1.5%.

Actually:

10,500 / 900,000 ≈ 1.17%

13,500 / 900,000 = 1.5%

A delta of about 0.33 percentage points, so very minuscule.

When you're dropping a grand on a PC and $500+ on a GPU, and you're hooked up to grid (hydro) power, then the 50 W difference between GPUs is a nothing burger, as Kevin O'Leary says all the time.

 

Now, let's get into RVing, boating, off-grid living, and solar.

That's when power factor and efficiency matter, and watt-hours used become very important, because every watt-hour used takes a lot more time to recoup and put back into the battery system.

 

 

 

 

You're making the same mistake everyone makes when they say "it's nothing"... 

 

You have to calculate this for the scenario where *all* PCs on the planet use power-hungry parts versus the one where *all* PCs use power-efficient parts instead.

What's the difference then?  Still "doesn't matter, nothing burger"? 
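To put a rough number on that "all PCs" framing, here is a back-of-the-envelope sketch. The 50 W delta is the figure from this thread; the installed-base size and usage hours are placeholder assumptions, not data anyone has posted.

```python
# Aggregate version of the per-card power delta: a difference that is
# negligible for one household can still be large across an installed
# base. Installed-base size and usage hours are assumptions.

WATT_DELTA = 50            # W difference between the two cards (from the thread)
GAMING_PCS = 100_000_000   # assumed number of gaming PCs worldwide
HOURS_PER_DAY = 2          # assumed full-load hours per PC per day

extra_kwh_per_year = WATT_DELTA * HOURS_PER_DAY * 365 * GAMING_PCS / 1000
print(f"~{extra_kwh_per_year / 1e9:.1f} TWh of extra electricity per year")
```

With those placeholder numbers, the per-card "nothing burger" adds up to a few TWh per year in aggregate, which is the direction-of-change point being made here.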

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


These threads always go nowhere incredibly fast. It's now on page 4, but I see none of the issues from page one have been settled.

 

I don't know why LTT chooses what they choose, because I don't watch the videos; however, you can't compare a 1080 to a 5700 when the first posts asked why a 2070 was used.

You need to compare the card that was used in each specific video and break down what alternatives there were at the time.

 

 

All you are doing by comparing the 1080 to the 5700 is highlighting that if it's still keeping up with the 5700, that's a reflection on AMD, not Nvidia. Price is moot when the card is ancient and not even sold anymore, which makes the next few pages of the thread a waste of time debating an issue that doesn't exist.

 

So after 4 pages we are now debating power efficiency, which either is or isn't a problem for the individual; it's not an issue that applies to everyone. Just don't fall into the inconsistency trap of claiming that PSUs are cheap, so an extra few dollars for a bigger PSU isn't a problem, while at the same time arguing that people should buy the cheaper GPU to save a few dollars.

 

 

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


On 12/4/2019 at 4:13 AM, Nuklearfire said:

snip

Yeah, one of the biggest things that has bugged me is his episode on the EXP GDC, where he says '...in order to get anything more powerful than what is already in it (a tenth-gen Nvidia card)...'.
Nvidia is notoriously awful when it comes to eGPUs (less so over Thunderbolt 3), while AMD works great with eGPUs: no modded driver or anything, it just works.

why no dark mode?
Current:

Watercooled Eluktronics THICC-17 (Clevo X170SM-G):
CPU: i9-10900k @ 4.9GHz all core
GPU: RTX 2080 Super (Max P 200W)
RAM: 32GB (4x8GB) @ 3200MTs

Storage: 512GB HP EX NVMe SSD, 2TB Silicon Power NVMe SSD
Displays: Asus ROG XG-17 1080p@240Hz (G-Sync), IPS 1080p@240Hz (G-Sync), Gigabyte M32U 4k@144Hz (G-Sync), External Laptop panel (LTN173HT02) 1080p@120Hz

Asus ROG Flow Z13 (GZ301ZE) W/ Increased Power Limit:
CPU: i9-12900H @ Up to 5.0GHz all core
- dGPU: RTX 3050 Ti 4GB

- eGPU: RTX 3080 (mobile) XGm 16GB
RAM: 16GB (8x2GB) @ 5200MTs

Storage: 1TB NVMe SSD, 1TB MicroSD
Display: 1200p@120Hz

Asus Zenbook Duo (UX481FLY):

CPU: i7-10510U @ Up to 4.3 GHz all core
- GPU: MX 250
RAM: 16GB (8x2GB) @ 2133MTs

Storage: 128GB SATA M.2 (NVMe no worky)
Display: Main 1080p@60Hz + Screnpad Plus 1920x515@60Hz

Custom Game Server:

CPUs: Ryzen 7 7700X @ 5.1GHz all core

RAM: 128GB (4x32GB) DDR5 @ whatever it'll boot at xD (I think it's 3600MTs)

Storage: 2x 1TB WD Blue NVMe SSD in RAID 1, 4x 10TB HGST Enterprise HDD in RAID Z1


8 hours ago, mr moose said:

These threads always go nowhere incredibly fast. It's now on page 4, but I see none of the issues from page one have been settled.

It's because @LinusTech hasn't given the answer, and nobody will be satisfied until he does.

 

Though I'm sure he has his reasons for not saying anything publicly.


To answer the OP and title: Nvidia's cards are better. 

They consistently perform on par or better for the given tier, and usually have extra features to make up for the price gap. Especially with Turing vs the 5700 line, people forget that RTX, NVENC, and CUDA are things people use. Especially NVENC and CUDA if you record or do anything other than gaming.

In the high end, the best AMD can compete with is last gen, or one tier lower on the current gen.

Intel HEDT and Server platform enthusiasts: Intel HEDT Xeon/i7 Megathread 

 

Main PC 

CPU: i9 7980XE @4.5GHz/1.22v/-2 AVX offset 

Cooler: EKWB Supremacy Block - custom loop w/360mm +280mm rads 

Motherboard: EVGA X299 Dark 

RAM:4x8GB HyperX Predator DDR4 @3200Mhz CL16 

GPU: Nvidia FE 2060 Super/Corsair HydroX 2070 FE block 

Storage:  1TB MP34 + 1TB 970 Evo + 500GB Atom30 + 250GB 960 Evo 

Optical Drives: LG WH14NS40 

PSU: EVGA 1600W T2 

Case & Fans: Corsair 750D Airflow - 3x Noctua iPPC NF-F12 + 4x Noctua iPPC NF-A14 PWM 

OS: Windows 11

 

Display: LG 27UK650-W (4K 60Hz IPS panel)

Mouse: EVGA X17

Keyboard: Corsair K55 RGB

 

Mobile/Work Devices: 2020 M1 MacBook Air (work computer) - iPhone 13 Pro Max - Apple Watch S3

 

Other Misc Devices: iPod Video (Gen 5.5E, 128GB SD card swap, running Rockbox), Nintendo Switch


9 hours ago, Zando Bob said:

To answer the OP and title: Nvidia's cards are better. 

They consistently perform on par or better for the given tier, and usually have extra features to make up for the price gap. Especially with Turing vs the 5700 line, people forget that RTX, NVENC, and CUDA are things people use. Especially NVENC and CUDA if you record or do anything other than gaming.

In the high end, the best AMD can compete with is last gen, or one tier lower on the current gen.

basically.

 

NVIDIA's features (especially NVENC, which is awesome) have become part of our workflows for lots of things.

 

It would take a significant performance advantage from AMD for me to consider forgoing the features NVIDIA offers for a personal build. YMMV.


8 hours ago, LinusTech said:

NVENC

The biggest reason that will keep me on Nvidia. I was so close to going with a 5700 for my next upgrade, but NVENC has become a core part of how my computer is used now; I can't give it up.

🌲🌲🌲

 

 

 

◒ ◒ 


On 12/4/2019 at 10:41 AM, pizapower said:

AMD can't beat NVIDIA. When NVIDIA goes 7 nm, it'll be game over for AMD, like always.

Which would be horrible for the consumer; just take a look at the CPU market.

 

If AMD weren't kicking Intel's ass right now, you would still be stuck with Intel's yearly quad-core releases with minimal IPC improvements, because why do more if people have no choice?


Linus doesn't use AMD because he is being manipulated by interdimensional aliens and Hillary Clinton.


1 minute ago, Pixel5 said:

Which would be horrible for the consumer; just take a look at the CPU market.

 

If AMD weren't kicking Intel's ass right now, you would still be stuck with Intel's yearly quad-core releases with minimal IPC improvements, because why do more if people have no choice?

Which is why I'm hoping Intel's GPUs are going to compete with Nvidia; AMD doesn't have the money or resources to go up against both Intel and Nvidia at the same time.

🌲🌲🌲

 

 

 

◒ ◒ 


21 hours ago, Arika S said:

Which is why I'm hoping Intel's GPUs are going to compete with Nvidia; AMD doesn't have the money or resources to go up against both Intel and Nvidia at the same time.

Yeah, sadly they don't. We can thank Intel for that, as they used illegal means to keep AMD down the last time AMD couldn't compete on CPU performance.

Sadly, at the time AMD didn't know the full extent of this, which is why they still bought ATI, which turned out to eat up too much of their money later on.

