I am never buying consumer GPUs ever again

XZDX
Go to solution: Solved by jaslion

You can't pay me.  You can give me an A2000 or die, but I will never use a 4090 as long as I live.

 

I recently got a set of Quadro K4200's.  Not only is framesync ACTUALLY IN SYNC, the entire pipeline I have set up desktop-wise is running better than with the 1660 Ti I recently got.  There's no comparison.  Even if they are Kepler, KEPLER, they are still performing v a s t l y  b e t t e r.

 

I am baffled, confused, and angry I didn't think of this before, and since all of these are 20 dollars, I'm selling my shitass gaming cards.

 

 


IN GAMES AND RENDER

WHAT IS HAPPENING


What is the meaning of this post?

Apprentice Software Developer


Shit posting... add more details of what you mean.

Oh, he did, in another post... then why make this one?

 

 

Maximums - Asus Z97-K w/ i5 4690, BCLK @ 106.9MHz × 39 = 4.17GHz, 8GB of 2600MHz DDR3, Gigabyte GTX970 G1-Gaming @ 1550MHz

 


I am getting better frame timings in games and general render times on a significantly older card and I am extremely confused


Just now, XZDX said:

I am getting better frame timings in games and general render times on a significantly older card and I am extremely confused

Technology isn't perfect and never will be 100% perfect, but I'm glad you have a better experience, which is great.

 


 


7 minutes ago, SkilledRebuilds said:

Shit posting... add more details of what you mean.

Oh he did, in another post.. then why make this one?

 

 

Yeah, I figured out what it was doing, and it's doing exactly what I want, i.e. SLI'd K4200's, but it's not actually telling me that it's working.  It's just...... doing it.


And a K4200 by itself is better quality in desktop and game than the 1660 Ti I have.  The 1660 Ti may get a few more frames, but they don't land consistently, and my 144Hz monitor will end up looking awful....somehow.

 

IDK what's going on but I am digging it.  I am also potentially selling a 1660 Ti if I don't get completely engulfed by Bend, because that's the only use I can think of for it.


I also get perfect frametimes, and not on a Quadro, GeForce, or Tesla.

☣️MANHATTEN PROJECT☣️:  5800X3D. 32GB. 7900 XT.


1 hour ago, XZDX said:

Yeah, I figured out what it was doing, and it's doing exactly what I want, i.e. SLI'd K4200's, but it's not actually telling me that it's working.  It's just...... doing it.


And a K4200 by itself is better quality in desktop and game than the 1660 Ti I have.  The 1660 Ti may get a few more frames, but they don't land consistently, and my 144Hz monitor will end up looking awful....somehow.

 

IDK what's going on but I am digging it.  I am also potentially selling a 1660 Ti if I don't get completely engulfed by Bend, because that's the only use I can think of for it.

So it's working better than a 1660 Ti; what the heck does that have to do with a 4090, which would completely blow them out of the water?

 

I mean, they're 4GB DirectX 11 cards; you must be playing very specific games.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160Mhz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80Mhz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7


52 minutes ago, Alex Atkin UK said:

So it's working better than a 1660 Ti; what the heck does that have to do with a 4090, which would completely blow them out of the water?

 

I mean, they're 4GB DirectX 11 cards; you must be playing very specific games.

Workstation quality with frames actually landing vs exploding fireballs?  Hello?  Drop the A2000 price and I'll have one tomorrow.


Just because one card works better than another for a certain thing on one system doesn't mean workstation cards are suddenly the solution for everything...

F@H
Desktop: i9-13900K, ASUS Z790-E, 64GB DDR5-6000 CL36, RTX3080, 2TB MP600 Pro XT, 2TB SX8200Pro, 2x16TB Ironwolf RAID0, Corsair HX1200, Antec Vortex 360 AIO, Thermaltake Versa H25 TG, Samsung 4K curved 49" TV, 23" secondary, Mountain Everest Max

Mobile SFF rig: i9-9900K, Noctua NH-L9i, Asrock Z390 Phantom ITX-AC, 32GB, GTX1070, 2x1TB SX8200Pro RAID0, 2x5TB 2.5" HDD RAID0, Athena 500W Flex (Noctua fan), Custom 4.7l 3D printed case

 

Asus Zenbook UM325UA, Ryzen 7 5700u, 16GB, 1TB, OLED

 

GPD Win 2


All your problems come explicitly from using SLI, I guess, not from the cards being consumer-class GPUs.


4 hours ago, Kilrah said:

Just because one card works better than another for a certain thing on one system doesn't mean workstation cards are suddenly the solution for everything...

Tarkov, OBS, rendering in KdenLive... honestly, more games run than I even expected, and they are running pretty well.  Not 400 FPS..... but I don't need that lol.

 

For everything going on in my life right now, setting up a huge broadcast production stack on the cheap, and accidentally ending up with stuff like GPU docks and CNC mills the last few months......  like are you shitting me I am testing everything I can possibly think of.

 

The last time I had a card that covered this much shit, the 9800GT was still usable and not absolute landfill, as much as I adore them.

3 hours ago, Salted Spinach said:

So if it beats a 1660Ti it must beat a 4090 as well?

 

Right...

 

You're hilarious.  No, I will not be buying consumer GPUs anymore, because of the quality of the output of the card and the quality of the cards themselves.  Instead, I'll be buying used Quadros, which personally I thought would never happen (AMD fan till the Intel GPU release; now I don't care).  It's not that a 4GB card from 2013 is better, but that it's working better than my 1660 is rather shocking.  I would have thought that by Turing, frame timing would be like this on most cards, but instead we get necked-down refreshes, or cards that literally explode.  And I mean, hell, if I am getting this from OLD cards, what the hell will happen when I go newer than my 1660 Ti?

 

 

TLDR:  I thought I was going to have to buy another pile of RX 580's at 120 a pop, and I got a 105-dolla discount on badass performance.

 

I was basically asleep last night after I finished testing and being scared

 

4 hours ago, starsmine said:

All your problems come explicitly from using SLI, I guess, not from the cards being consumer-class GPUs.

Ah ah ah, the K5000 and K6000 are supposed to do SLI with just a bridge.  Lower cards like these need a UEFI tag to load both.  And if it was explicitly an SLI issue, then I wouldn't be able to change an INF file and go from ass performance on two cards to buff performance on one.  I think if the driver has complete control of SLI, like the GUI suite, it works less well than manually marking SLI apps.  I'm not sure yet.


Sounds like a bunch of subjective placebo to me.

 

SLI is never better than a single GPU for frame times no matter how you cut it.  Plus you are running a pretty rubbish CPU on X79, a Xeon E5-2690 V2 from 2013: low IPC paired with low clock speed.

 

As I see it, there is much more to this than simply "oh, a non-consumer-grade card (which it is consumer grade; you mean gaming grade) runs better than your old 1660 Ti, a low-end GPU."

 

You went from 6GB of VRAM to 4GB; that's the most illogical thing I have seen anyone do in 2024, when games are now starting to push beyond 8GB even. 🤣



The reason your frame times are supposedly better is likely due to the K4200 being far far far slower than a 1660 Ti so you no longer have a CPU bottleneck.

 

https://www.techpowerup.com/gpu-specs/quadro-k4200.c2602

 

The 1660 Ti is 340% faster than one K4200; in SLI they don't even make up half of a 1660 Ti.

 

Any new low-end chip will decimate that Xeon.

 

https://www.cpubenchmark.net/compare/2057vs3735vs3859vs5853/Intel-Xeon-E5-2690-v2-vs-Intel-i5-10600K-vs-AMD-Ryzen-5-5600X-vs-Intel-i3-14100F



Like for real...

 

PCPartPicker Part List: https://pcpartpicker.com/list/XPGph3

CPU: AMD Ryzen 5 5500 3.6 GHz 6-Core Processor  ($87.83 @ Amazon)
Motherboard: MSI B450M-A PRO MAX II Micro ATX AM4 Motherboard  ($69.98 @ Amazon)
Memory: TEAMGROUP T-Force Vulcan Z 32 GB (2 x 16 GB) DDR4-3600 CL18 Memory  ($57.99 @ Amazon)
Total: $215.80
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2024-05-20 14:34 EDT-0400

 

 

 

Even the lowly Ryzen 5 5500 makes your chip look like a Pentium D with just an RX 6600, and these games are light on the CPU.

The 1% lows are the stuttering, and the lows are a lot worse on the Xeon. See the end of the video.
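For anyone unsure what "1% lows" measure here, this is a minimal sketch of how tools like PresentMon-based overlays derive them: take every frame's render time, isolate the slowest 1%, and convert both the overall and worst-case averages to FPS. The sample frame times are invented for illustration.

```python
# Sketch: average FPS vs "1% low" FPS from per-frame times in milliseconds.
def fps_stats(frame_times_ms):
    n = len(frame_times_ms)
    avg_ms = sum(frame_times_ms) / n
    # "1% lows": average over the slowest 1% of frames (at least one frame).
    worst = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
    worst_avg_ms = sum(worst) / len(worst)
    return 1000.0 / avg_ms, 1000.0 / worst_avg_ms

# 99 smooth ~7 ms frames plus one 40 ms stutter frame (made-up data):
avg_fps, low_fps = fps_stats([7.0] * 99 + [40.0])
```

With this data the average looks healthy (~136 FPS) while the 1% low collapses to 25 FPS, which is exactly the "good average, bad stutter" pattern being argued about in this thread.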

 

 

 

EDIT: this is the V3 chip, so a Haswell-based 2690; it's a much faster CPU in games than yours, and even that lost.

 

 



2 hours ago, FireyFoxiconnRageEX said:

Sounds like a bunch of subjective placebo to me.

 

SLI is never better than a single GPU for frame times no matter how you cut it.  Plus you are running a pretty rubbish CPU on X79, a Xeon E5-2690 V2 from 2013: low IPC paired with low clock speed.

 

As I see it, there is much more to this than simply "oh, a non-consumer-grade card (which it is consumer grade; you mean gaming grade) runs better than your old 1660 Ti, a low-end GPU."

 

You went from 6GB of VRAM to 4GB; that's the most illogical thing I have seen anyone do in 2024, when games are now starting to push beyond 8GB even. 🤣

Sure, but I'm not playing Apex.  If anything, I'm playing games with a homebrew audience.  Now I will say, the moment an A2000 or similar pops up and I can go to 12GB, I'm snatching that shit.  What I gained was frame timing and more consistency with broadcasts, and that's a MASSIVE plus for me right now.

 

As well, I can deshroud this 1660 and stick it in my R510 and it can actually be of use

 

2 hours ago, FireyFoxiconnRageEX said:

The reason your frame times are supposedly better is likely due to the K4200 being far far far slower than a 1660 Ti so you no longer have a CPU bottleneck.

 

https://www.techpowerup.com/gpu-specs/quadro-k4200.c2602

 

The 1660 Ti is 340% faster than one K4200; in SLI they don't even make up half of a 1660 Ti.

 

Any new low-end chip will decimate that Xeon.

 

https://www.cpubenchmark.net/compare/2057vs3735vs3859vs5853/Intel-Xeon-E5-2690-v2-vs-Intel-i5-10600K-vs-AMD-Ryzen-5-5600X-vs-Intel-i3-14100F

I feel it's more to do with the PCIe gen than anything else.  Equally though, I am getting good performance in things like Dirt Rally.

 

As well, I never said SLI fixes everything; I'm just amazed that it's even working, and TBH for render stuff I think I want that ability.

 

Also, yeah, any new chip can do whatever, but I'm sure as shit not paying 300 bucks for any of that.  Besides, if I did anything new it'd be dual-socket and POWER9.

 

2 hours ago, FireyFoxiconnRageEX said:

Like for real...

 

PCPartPicker Part List: https://pcpartpicker.com/list/XPGph3

CPU: AMD Ryzen 5 5500 3.6 GHz 6-Core Processor  ($87.83 @ Amazon)
Motherboard: MSI B450M-A PRO MAX II Micro ATX AM4 Motherboard  ($69.98 @ Amazon)
Memory: TEAMGROUP T-Force Vulcan Z 32 GB (2 x 16 GB) DDR4-3600 CL18 Memory  ($57.99 @ Amazon)
Total: $215.80
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2024-05-20 14:34 EDT-0400

 

 

 

Even the lowly Ryzen 5 5500 makes your chip look like a Pentium D with just an RX 6600, and these games are light on the CPU.

The 1% lows are the stuttering, and the lows are a lot worse on the Xeon. See the end of the video.

 

 

 

EDIT: this is the V3 chip, so a Haswell-based 2690; it's a much faster CPU in games than yours, and even that lost.

 

 

I'm glad you spend money and watch LTT.  You are a very good consumer 🙂

 

Enjoy your grenades I'm gunna go vacuum up workstation parts


3 hours ago, FireyFoxiconnRageEX said:

As I see it, there is much more to this than simply "oh, a non-consumer-grade card (which it is consumer grade; you mean gaming grade) runs better than your old 1660 Ti, a low-end GPU."

What even is "gaming grade"? Can it handle high workloads for extended periods of time? That's not gaming grade. That's hardware properly built for its purpose, whether it's playing video games, rendering some scene, etc. Basically, "gaming grade" is pure marketing speak.

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch


12 hours ago, XZDX said:

Workstation quality with frames actually landing vs exploding fireballs?  Hello?  Drop the A2000 price I'll have one tomorrow

Your issues likely have nothing to do with the physical hardware, but purely down to software.

 

As an anecdote, my 3070 works just fine, and every other card I've had has worked fine too, unless it was an issue driven by software.

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch


2 hours ago, Godlygamer23 said:

Your issues likely have nothing to do with the physical hardware, but purely down to software.

 

As an anecdote, my 3070 works just fine, and every other card I've had has worked fine too, unless it was an issue driven by software.

I know too much to blame software alone.  It's a factor, but not _the_ factor.


3 hours ago, Godlygamer23 said:

What even is "gaming grade"? Can it handle high workloads for extended periods of time? That's not gaming grade. That's hardware properly built for its purpose, whether it's playing video games, rendering some scene, etc. Basically, "gaming grade" is pure marketing speak.

I'm comparing, say, a 770 to a 4200 or even a 5200.  Nowadays especially, binning tiers matter.  Quality of on-die circuits matters, and Nvidia has a shotgun effect on GPU releases.

 

A Quadro could have a 16-series die on it, but it will be a bit better or a bit worse depending on which fuses are blown on the chip.  The 4200 vs the 4000 vs a 770 vs a 750 Ti: there are a lot of factors that matter, on top of clocking, etc.

 

The earlier comment on PCIe bandwidth is a good one, and makes me wonder if a GPU shouldn't be one PCIe gen behind your processor to be able to fully flood the card and use all of it.
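To put rough numbers on the PCIe speculation above, here is a back-of-envelope sketch of transfer time over a x16 link per generation. The per-lane effective throughput figures are approximations (after 8b/10b or 128b/130b encoding overhead), and the 1080p RGBA frame is just a convenient example payload, not anything measured from this system.

```python
# Approximate effective per-lane throughput in GB/s, by PCIe generation
# (encoding overhead already accounted for; figures are approximate).
LANE_GBPS = {1: 0.25, 2: 0.5, 3: 0.985, 4: 1.969}

def transfer_ms(size_bytes, gen, lanes=16):
    """Milliseconds to move size_bytes over a PCIe link of gen/lanes."""
    bandwidth = LANE_GBPS[gen] * lanes * 1e9  # bytes per second
    return size_bytes / bandwidth * 1000.0

frame = 1920 * 1080 * 4  # one 1080p RGBA frame, ~8.3 MB
for gen in (2, 3):
    print(f"Gen{gen} x16: {transfer_ms(frame, gen):.2f} ms per frame")
```

Even a Gen2 x16 link (what a K4200 in an X79 board would typically run at) moves a full 1080p frame in about a millisecond, which is why link generation alone rarely explains large frame-pacing differences.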


What's the rest of your specs?

 

A K4200 is flat-out weaker than a 1660 Ti by a good bit.

 

SLI is dead, so you are quite lucky some ongoing games still seem to have good scaling.  Still, even with perfect scaling it shouldn't be beating a 1660 Ti.

 

So I assume another issue was going on, and you just switching cards resolved it magically.


5 hours ago, jaslion said:

What's the rest of your specs?

 

A K4200 is flat-out weaker than a 1660 Ti by a good bit.

 

SLI is dead, so you are quite lucky some ongoing games still seem to have good scaling.  Still, even with perfect scaling it shouldn't be beating a 1660 Ti.

 

So I assume another issue was going on, and you just switching cards resolved it magically.

2690 V2, 40GB of RAM, dual NICs, SSDs, and a NAS.

I know it is weaker; T H A T  W A S  N O T  T H E  P O I N T  O F  T H E  P O S T.  The engineering is flat-out better, and the fact that every frame is on time is vomitous.  I don't exactly know what's making the experience better, whether it's clock rate, driver differences, games, or maybe the lower bandwidth helping.  I really don't know.
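"Every frame is on time" on a 144 Hz panel has a concrete meaning: each frame has a ~6.94 ms window, and any frame that overruns its window misses a refresh and reads as a stutter. A minimal sketch of that check, with invented sample data, shows how a faster-on-average card can still feel worse than a slower, steadier one:

```python
# Sketch: at 144 Hz each frame has a ~6.94 ms budget; any frame taking
# longer misses a refresh window and shows up as judder/stutter.
REFRESH_HZ = 144
BUDGET_MS = 1000.0 / REFRESH_HZ  # ~6.94 ms

def missed_refreshes(frame_times_ms, budget_ms=BUDGET_MS):
    """Count frames that overran the refresh window."""
    return sum(1 for t in frame_times_ms if t > budget_ms)

# Made-up data: higher average FPS but uneven pacing...
fast_but_uneven = [4.0, 4.0, 12.0, 4.0, 4.0, 12.0]  # avg ~6.67 ms
# ...vs slightly slower frames that all land inside the window.
slow_but_steady = [6.8] * 6                          # avg 6.8 ms
```

Here the uneven card averages more FPS yet misses two refresh windows, while the steadier card misses none, which matches the "a few more frames but they don't land" complaint above.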

 

All I know is, K4200's are technically 670/770-ish in capability, and they are slightly higher binned.  The modern equivalent of the most powerful card from back then, a Titan Black, is exploding and causing fires.  And before the 4090 it was fake power supplies; before that, the NZXT H1 catching fire.

 

I enjoy the major step up in engineering, and that a 15-dollar card is doing as well, if not better in some cases, than the newer, worse-designed and worse-manufactured card.  And y'all should probably think long and hard about this before buying a 5090.


51 minutes ago, XZDX said:

The modern equivalent of the most powerful card from back then, a Titan Black, is exploding and causing fires.  And before the 4090 it was fake power supplies; before that, the NZXT H1 catching fire.

I mean, these are all loose things?  And for the PSU, don't buy a cruddy one?

 

A 2690 v2 in games is behind a 2600K, so it would not at all surprise me that this is a case where the GPU pulls ahead so much the CPU can't keep up and you get weirdness.

 

I've seen that plenty of times.  Probably if you, let's say, locked the 1660 Ti to 40fps in Apex or whatever, it would be just fine.

 

Also, 40GB of RAM?  This may also not help, since the only way to get that is to break normal dual-channel or quad-channel configurations.  Which, again, will have an impact on frame times.

 

Would defo not be surprised that if you put the 1660 in a somewhat up-to-spec system for 2020s games, it will be doing fine.

 

55 minutes ago, XZDX said:

, T H A T  W  A S N O T T H E P O I N T O F T H E P O S T. 

So whilst it's not the point of my answer either, it WILL have a major influence.

 

Keep in mind the BEST GPUs of the time of the 2690 v2 were the HD 7970 and GTX 680.  The HD 7970 was able to keep a 2600K near 100% usage throughout its decade of relevance.  The 1660 Ti is over double that fast.  With the single-core perf of a 2600K being basically bare minimum spec these days for a nice gaming experience, I would absolutely not be surprised that this is going to be the major factor.

 

Feel free to put, like, an A2000 or A4000 in there; I would not be too surprised if the issues come back.

