Nuklearfire

Why Linus almost always uses NVIDIA GPUs

Recommended Posts

Posted · Original Poster (OP)
19 hours ago, amdorintel said:

Also keep in mind who gives Linus money for promotional air time.

He could be heavily biased toward one company because they have paid him more money over the years.

Something to keep in mind.

Most of the time the money doesn't come from Nvidia or AMD; it comes from manufacturers like ASUS or Gigabyte, and they sell both AMD and Nvidia cards, so that won't be the reason.

But guys and girls (don't let me forget them), my point from the beginning of this thread was a comparison within the same budget, and the performance you get for it.

There are various benchmark sites, but they sometimes show some really big differences.

On the site I mostly look at, there is an

RTX 2060 Super with 13,830 3DMark points for $409, and a

Radeon RX 5700 XT with 13,819 3DMark points for $389.99,

and they are so tight that I don't see a reason to spend $20 more for 11 benchmark points, when the same benchmark run three times in a row on the same system can vary by up to 20 points.

 


@Nuklearfire

Price per point?

$409 / 13,830 points ≈ $29.57 per 1,000 points.

$389.99 / 13,819 points ≈ $28.22 per 1,000 points.

Almost identical. Factor in how many NVIDIA cards are out there, people being comfortable with them (keep in mind people will pay more to stick with something they know), typically better driver support, better optimization, and the power efficiency. I'd be willing to bet that if someone here who is better at math than I am sat down and did it, the power savings over the life of the cards would make the 2060 Super cheaper by EOL.

Can't forget RTX either. That extra $1.35 per 1,000 points also buys you a feature that Radeon doesn't have.

You're also pointing to not caring about the machine's efficiency because power is cheap. The 2060 Super is 175 W compared to the 5700 XT at 225 W; it draws about 22% less power. That's a huge difference.
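For anyone who wants to plug in their own numbers, here is a minimal Python sketch of that cost-per-point and power-cost math. The prices, scores, and board-power figures come from this thread; the electricity rate, daily gaming hours, and card lifetime are illustrative assumptions only.

```python
# Cost per 1,000 3DMark points and a rough lifetime electricity estimate.
# Prices, scores, and wattages are from the thread; RATE_PER_KWH,
# HOURS_PER_DAY, and YEARS are assumed values for illustration.

cards = {
    "RTX 2060 Super": {"price": 409.00, "score": 13_830, "watts": 175},
    "RX 5700 XT":     {"price": 389.99, "score": 13_819, "watts": 225},
}

RATE_PER_KWH = 0.13   # assumed average residential rate, $/kWh
HOURS_PER_DAY = 2     # assumed full-load gaming hours per day
YEARS = 3             # assumed useful life of the card

for name, c in cards.items():
    per_1000_points = c["price"] / (c["score"] / 1000)
    lifetime_kwh = c["watts"] / 1000 * HOURS_PER_DAY * 365 * YEARS
    print(f"{name}: ${per_1000_points:.2f} per 1,000 points, "
          f"~${lifetime_kwh * RATE_PER_KWH:.2f} in electricity over {YEARS} years")
```

Under those particular assumptions the electricity gap works out to roughly $14 over three years, which is in the same ballpark as the $19 sticker difference; a different rate or more hours per day shifts it either way.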

 

5 hours ago, Nuklearfire said:

Most of the time the money doesn't come from Nvidia or AMD; it comes from manufacturers like ASUS or Gigabyte, and they sell both AMD and Nvidia cards, so that won't be the reason.

But guys and girls (don't let me forget them), my point from the beginning of this thread was a comparison within the same budget, and the performance you get for it.

There are various benchmark sites, but they sometimes show some really big differences.

On the site I mostly look at, there is an

RTX 2060 Super with 13,830 3DMark points for $409, and a

Radeon RX 5700 XT with 13,819 3DMark points for $389.99,

and they are so tight that I don't see a reason to spend $20 more for 11 benchmark points, when the same benchmark run three times in a row on the same system can vary by up to 20 points.

 

Who cares about 3DMark?

 

I want to see real-world performance, not a benchmark.

 

33 minutes ago, Voluspa said:

@Nuklearfire

Price per point?

$409 / 13,830 points ≈ $29.57 per 1,000 points.

$389.99 / 13,819 points ≈ $28.22 per 1,000 points.

Almost identical. Factor in how many NVIDIA cards are out there, people being comfortable with them (keep in mind people will pay more to stick with something they know), typically better driver support, better optimization, and the power efficiency. I'd be willing to bet that if someone here who is better at math than I am sat down and did it, the power savings over the life of the cards would make the 2060 Super cheaper by EOL.

Can't forget RTX either. That extra $1.35 per 1,000 points also buys you a feature that Radeon doesn't have.

You're also pointing to not caring about the machine's efficiency because power is cheap. The 2060 Super is 175 W compared to the 5700 XT at 225 W; it draws about 22% less power. That's a huge difference.

 

Eh, tbh you won't notice much of a difference in your power bill over the life of the PC.

 

Sure, you might save a few bucks over a year or so, but does that really matter? If you're worried about a couple of bucks a year, you probably aren't spending $400 on a video card.

4 hours ago, Voluspa said:

You're also pointing to not caring about the machine's efficiency because power is cheap. The 2060 Super is 175 W compared to the 5700 XT at 225 W; it draws about 22% less power. That's a huge difference.

Out of how many kWh?

And how much of the time is that 25% actually in play, i.e. how often is a 2060 really running at max power while gaming?

 

Monthly household usage:

  • Colorado homes use 687 kWh
  • North Dakota homes use 1,240 kWh
  • Wyoming homes use 863 kWh

So say you game all-out for 2 hours a day on your GPU:

175 W × 2 hrs = 350 Wh/day × 30 days = 10,500 Wh/month

225 W × 2 hrs = 450 Wh/day × 30 days = 13,500 Wh/month

Out of, say, 900,000 Wh for the whole house,

your GPU's energy use compared to the entire house is roughly 1.5%.

More precisely:

10,500 / 900,000 ≈ 1.17%

13,500 / 900,000 = 1.5%

A delta of about 0.33%, so very minuscule.

When you're dropping a grand on a PC and $500+ on a GPU, and you're on grid power ("hydro"), the 50 W difference between GPUs is a nothing burger, as Kevin O'Leary says all the time.
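The same arithmetic in a few lines of Python, for anyone who wants to swap in their own household numbers; every figure below is taken straight from the post above.

```python
# Household-share arithmetic from the post: 175 W vs 225 W GPUs, 2 hours of
# gaming per day, 30 days per month, against a 900,000 Wh (900 kWh) household.

HOUSEHOLD_WH_PER_MONTH = 900_000
HOURS_PER_DAY = 2
DAYS_PER_MONTH = 30

for name, watts in [("RTX 2060 Super", 175), ("RX 5700 XT", 225)]:
    wh_per_month = watts * HOURS_PER_DAY * DAYS_PER_MONTH
    share = wh_per_month / HOUSEHOLD_WH_PER_MONTH
    print(f"{name}: {wh_per_month:,} Wh/month = {share:.2%} of the household total")

# Prints 10,500 Wh (1.17%) and 13,500 Wh (1.50%): a gap of ~0.33 percentage points.
```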

 

Now, let's get into RVing, boating, off-grid living, and solar:

that's when power factor matters, efficiency matters, and every Wh used matters, because every Wh drawn takes a lot more time to recoup and put back into the battery system.

 

 

 

 

Posted · Original Poster (OP)
7 hours ago, RonnieOP said:

Who cares about 3DMark?

 

I want to see real-world performance, not a benchmark.

That's my point.

My question in this thread was why they don't do a video taking GPUs at a few price points and comparing them in real life.

3 hours ago, amdorintel said:

Out of how many kWh?

And how much of the time is that 25% actually in play, i.e. how often is a 2060 really running at max power while gaming?

 

Monthly household usage:

  • Colorado homes use 687 kWh
  • North Dakota homes use 1,240 kWh
  • Wyoming homes use 863 kWh

So say you game all-out for 2 hours a day on your GPU:

175 W × 2 hrs = 350 Wh/day × 30 days = 10,500 Wh/month

225 W × 2 hrs = 450 Wh/day × 30 days = 13,500 Wh/month

Out of, say, 900,000 Wh for the whole house,

your GPU's energy use compared to the entire house is roughly 1.5%.

More precisely:

10,500 / 900,000 ≈ 1.17%

13,500 / 900,000 = 1.5%

A delta of about 0.33%, so very minuscule.

When you're dropping a grand on a PC and $500+ on a GPU, and you're on grid power ("hydro"), the 50 W difference between GPUs is a nothing burger, as Kevin O'Leary says all the time.

 

Now, let's get into RVing, boating, off-grid living, and solar:

that's when power factor matters, efficiency matters, and every Wh used matters, because every Wh drawn takes a lot more time to recoup and put back into the battery system.

 

 

 

 

You're making the same mistake everyone makes when they say "it's nothing"... 

 

You have to calculate this for the scenario where *all* PCs on the planet use power-hungry parts versus the scenario where *all* PCs use power-efficient parts instead.

What's the difference then?  Still "doesn't matter, nothing burger"? 
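A rough way to put numbers on that aggregate framing: only the 50 W per-card delta comes from this thread; the installed base and usage hours below are purely illustrative assumptions.

```python
# Fleet-wide version of the same 50 W delta. GAMING_PCS and HOURS_PER_DAY are
# made-up illustrative values, not data; only WATT_DELTA comes from the thread.

WATT_DELTA = 50                 # 225 W - 175 W, from the thread
GAMING_PCS = 100_000_000        # assumed number of gaming PCs worldwide
HOURS_PER_DAY = 2               # assumed daily full-load hours

twh_per_year = WATT_DELTA * GAMING_PCS * HOURS_PER_DAY * 365 / 1e12  # Wh -> TWh
print(f"Fleet-wide difference: ~{twh_per_year:.2f} TWh per year")
```

Whether a figure on that scale "matters" is exactly what's being argued, but the per-household framing and the fleet-wide framing clearly paint very different pictures.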


RYZEN 5 3600 | MSI GTX 1060 6GB GAMING X | 16GB CORSAIR VENGEANCE LPX 3200 DDR4 | MSI B350M MORTAR | 250GB SAMSUNG EVO 860 | 2TB SEAGATE BARRACUDA | 1TB TOSHIBA SSHD | 120GB KINGSTON SSD | WINDOWS 10 PRO | INWIN 301| BEQUIET PURE POWER 10 500W 80+ SILVER | ASUS 279H | LOGITECH Z906 | DELL KB216T | LOGITECH M185 | SONY DUALSHOCK 4

 

LENOVO IDEAPAD 510 | i5 7200U | 8GB DDR4 | NVIDIA GEFORCE 940MX | 1TB WD | WINDOWS 10 GO HOME 


These threads always go nowhere incredibly fast. It's now on page 4, but I see none of the issues from page one have been settled.

 

I don't know why LTT choose what they choose, because I don't watch the videos; however, you can't compare a 1080 to a 5700 when the first posts asked why a 2070 was used.

You need to compare the card that was used in each specific video and break down what alternatives there were at the time.

 

 

All you are doing by comparing the 1080 to the 5700 is highlighting that, if it's still keeping up with the 5700, that is a reflection on AMD, not Nvidia. Price is moot when the card is ancient and not even sold anymore, which makes the next few pages of the thread a waste of time debating an issue that doesn't exist.

 

So after 4 pages we are now debating power efficiency, which either is or isn't a problem for the individual; it's not an issue that applies to everyone. Just don't fall into the inconsistency trap of claiming that PSUs are cheap, so an extra few dollars for a bigger PSU isn't a problem, while at the same time arguing that people should buy cheaper GPUs because of a few dollars.

 

 


QuicK and DirtY. Read the CoC it's like a guide on how not to be moron.  Also I don't have an issue with the VS series.

Sometimes I miss contractions like n't on the end of words like wouldn't, couldn't and shouldn't.    Please don't be a dick,  make allowances when reading my posts.


I don't really think comparing cards so far apart in release dates is a valid comparison.

The 1080 Ti does not sell for its original MSRP in 2019. Of course AMD's 2019 lineup is going to give more compute power for the money than a bleeding-edge card released nearly four years ago, if you go by MSRP.


 

 

On 12/4/2019 at 4:13 AM, Nuklearfire said:

snip

Yeah, one of the biggest things that has bugged me is his episode on the EXP GDC, where he says '...in order to get anything more powerful than what is already in it (a tenth-gen Nvidia card)...'.
Nvidia is notoriously awful when it comes to eGPUs (to a lesser extent over Thunderbolt 3), whereas AMD works great with eGPUs: no modded driver or anything, it just works.


Current:
Custom SFF AMD Sleeper:
- CPU: Ryzen 5 2600 @ 3.8Ghz all cores

- GPU: Reference RX Vega 64
- RAM: 16GB (2x8GB) @ 3200Mhz
- Storage: 256GB Samsung PM951 NVMe SSD, 2TB WD Black HDD
- Displays: Asus VG248QE 1080p @144hz, 2x Dell SE2417HGR 1080p @60Hz

Dell Precision M6700:

- CPU: i7-3720qm @ 3.8Ghz all cores Currently @ 3.4GHz trying to fix weird issues
- dGPU: K3000M "Gigahertz Edition" 1006Mhz Core, 2004Mhz MEM

- eGPU: MSI Armour RX 480 4GB
- RAM: 16GB (2x8GB) HyperX Impact @ 1600Mhz
- Storage: 256GB mSATA w/ 2x 1TB WD Black HDDs in RAID 0
- Display: Samsung LTN173HT02 120Hz 1080p internal display With 120Hz Input Mod!

8 hours ago, mr moose said:

These threads always go nowhere incredible fast,  It's now on page 4 but I see none of the issues on page one have been settled.

It's because @LinusTech hasn't given the answer and nobody will be satisfied until he does. 😛

 

Though I'm sure he has his reasons to not publicly say anything.


To answer the OP and title: Nvidia's cards are better. 

They consistently perform on par or better for a given tier, and usually have extra features to make up for the price gap. Especially with Turing vs. the 5700 line, people forget that RTX, NVENC, and CUDA are things people use. Especially NVENC and CUDA if you record or do anything other than gaming.

In the high end, the best AMD can compete with is Nvidia's last gen or one tier lower on the current gen.


X58 Madlads: X58 Xeon/i7 discussion     X99 bois: X99 Xeon/i7 discussion

 

Big Rig (Completed) - (Current) - i7 5960X - 4.7Ghz/3.7Ghz ~ 1.3v/1.1v core/uncore - 76-78C under RealBench load- Custom Loop: 2x 360GTS with EK-ZMT/Stubbies and EK D5 pump/res combo - EVGA X99 Classified - 32GB (4x8GB) HyperX Predator DDR4 - 3200MHz CL16 - AMD Radeon VII (best TimeSpy so far: here) - 1TB 970 Evo - Corsair RM1000i - Phanteks Enthoo Evolv ATX TG - 6x iPPC NF-F12 2000 - LG 25UM56-P - 25" 2560x1080 at 75Hz

 

Planned Rig (big rig moving to F@H and benching) - i7 5820K - whatever cooler I decide on - EVGA X99 Micro 2 - 16GB (4x4GB) EVGA SSC DDR4 - EVGA XC Ultra 1660 Ti - 250GB 960 Evo - Whatever other drives I end up running - Corsair CX550 - Fractal Design Meshify C Mini - LG 25UM56-P - 25" 2560x1080 at 75Hz

Planned X58 rig - i7 950 - NH-D15S - EVGA X58 Classified SLI 3-Way - 24GB (3x8GB) HyperX Savage Red DDR3 - 2x EVGA Classified 780s - probably a basic SSD - EVGA 1600W T2 - Fractal Design Define S - 3x NF-P12 Redux + whatever other fans I end up using

Delayed Linux Box - X5670 - Intel i7 920 stock cooler - EVGA X58 Micro - 6GB (6x1GB) Corsair DDR3 most likely - AMD Radeon WX2100 - probably a basic SSD - 600W Enhance Flex ATX PSU - Old Dell slim mATX chassis - 2-3x NF-A8 fans

 

I lowkey enjoy HEDT

 

9 hours ago, Zando Bob said:

To answer the OP and title: Nvidia's cards are better. 

They consistently perform on par or better for a given tier, and usually have extra features to make up for the price gap. Especially with Turing vs. the 5700 line, people forget that RTX, NVENC, and CUDA are things people use. Especially NVENC and CUDA if you record or do anything other than gaming.

In the high end, the best AMD can compete with is Nvidia's last gen or one tier lower on the current gen.

basically.

 

NVIDIA's features (especially NVENC which is awesome) have become part of our workflows for lots of things. 

 

It would take a significant performance advantage for AMD for me to consider foregoing the features NVIDIA offers for a personal build. YMMV.

On 12/4/2019 at 10:41 AM, pizapower said:

AMD can't beat NVIDIA. When NVIDIA goes 7 nm, it's GAME OVER for AMD, like always.

Which would be horrible for the consumer; just take a look at the CPU market.

 

If AMD weren't kicking Intel's ass right now, you would still be stuck with Intel's yearly quad-core releases with minimal IPC improvements, because why do more when people have no choice?

1 minute ago, Pixel5 said:

Which would be horrible for the consumer; just take a look at the CPU market.

 

If AMD weren't kicking Intel's ass right now, you would still be stuck with Intel's yearly quad-core releases with minimal IPC improvements, because why do more when people have no choice?

Which is why I'm hoping Intel's GPUs are going to compete with NVIDIA; AMD does not have the money or resources to go up against both Intel and NVIDIA at the same time.

21 hours ago, Arika S said:

Which is why I'm hoping Intel's GPUs are going to compete with NVIDIA; AMD does not have the money or resources to go up against both Intel and NVIDIA at the same time.

Yeah, sadly they don't. We can thank Intel for that, as they used illegal means to keep AMD down the last time they couldn't compete with AMD's CPU performance.

Sadly, at the time AMD didn't know the full extent of this, which is why they still bought ATI, which turned out to eat up too much of their money later on.

