
Was this a bad time to buy a 3090 off the EVGA queue?

Emertxe
43 minutes ago, Helpful Tech Wiard said:

Like FF14 at 60fps and apex at 144fps

FF14 is a weird case; as far as I can tell there's no framerate limit, but they actually tried to impose one several times (truly the worst developers lol)

 

when I played this game half a year ago there also was none, and I definitely got over 120 fps... but overall it was sluggish, especially in Tonberry at the plaza or whatever it's called... barely 60 fps...

 


 

 

likewise, MHW goes at least to 180, but I believe it probably goes much higher; I've been following this closely, others have studied it extensively, and a "framerate cap" never came up IIRC.

 


 

 

this is mostly important for bowgunners, but if you aren't playing bowgun you're probably doing something wrong anyway : D

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


2 hours ago, Emertxe said:

A few days ago I randomly got a product notice for an EVGA RTX 3090 Kingpin I signed up for last June. I decided to jump on it to upgrade from my GTX 1080, which ended up being $2,300 after tax/shipping.

 

However, with stuff like the 3090 Ti and the 4000 series around the corner, I'm really second-guessing actually using it as opposed to returning/selling it. Sure, it would be a lot harder to get any other GPU, but I had already given up and was going to wait another year to upgrade. However, I got a 1440p 270 Hz monitor, so I want to start pushing more frames in the games I play. I currently only hit ~200 fps in the esports titles I play on lowest settings, which feels like a really stupid reason to upgrade now (especially to an over-valued 3090), but it's certainly a big part of the reason.

 

Just wanted to get some opinions: should I just flip it and wait, or give up on new GPUs and just use it? If I decide to use it, I'll probably be locked in for at least the next 5 years. No idea if I'll be kicking myself or not, especially since I already made it this long waiting.

 

What CPU are you running?

S.K.Y.N.E.T. v4.3

AMD Ryzen 7 5800X3D | 64GB DDR4 3200 | 12GB RX 6700XT | Twin 24" Pixio PX248 Prime 1080p 144Hz Displays | 256GB Sabrent NVMe (OS) | 500GB Samsung 840 Pro #1 | 500GB Samsung 840 Pro #2 | 2TB Samsung 860 Evo | 1TB Western Digital NVMe | 2TB Sabrent NVMe | Intel Wireless-AC 9260


8 minutes ago, Imglidinhere said:

 

What CPU are you running?

8700K. I've been thinking about the jump to a 12900K, but I might wait one more generation.


2 hours ago, Emertxe said:

8700K. I've been thinking about the jump to a 12900K, but I might wait one more generation.

I'd honestly just wait. There's still skepticism about whether Raptor Lake is going to be supported on LGA1700 at all, so... I might not even bother, if you ask me.


8 hours ago, Mark Kaine said:

FF14 is a weird case; as far as I can tell there's no framerate limit, but they actually tried to impose one several times (truly the worst developers lol)

 

when I played this game half a year ago there also was none, and I definitely got over 120 fps... but overall it was sluggish, especially in Tonberry at the plaza or whatever it's called... barely 60 fps...

There is something wrong with how that game utilizes the CPU (bottlenecking both the CPU and GPU), hence the performance issues.

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

1 hour ago, Vishera said:

There is something wrong with how that game utilizes the CPU (bottlenecking both the CPU and GPU), hence the performance issues.

Tbh a lot of even newer games have that issue, mainly CPU bottlenecking: they'll use 3 or 4 cores, or fewer, and the rest of the cores (if there are any) lie dormant.

 

Something must be difficult about utilizing multi-core architectures, given that devs have so much trouble implementing it; ideally it would just be a switch in the API or something (speculating here, of course).

 

Also, there seems to be a correlation between network lag and framerate, and it doesn't have to be that way... ironically, Monster Hunter World doesn't do that, at least not extensively; there may be network lag, but the framerate stays solid (at 60, for example).


17 minutes ago, Mark Kaine said:

Tbh a lot of even newer games have that issue, mainly CPU bottlenecking: they'll use 3 or 4 cores, or fewer, and the rest of the cores (if there are any) lie dormant.

I was talking about too many draw calls, not the number of cores that are being utilized.

When you have too many draw calls, you can't utilize CPU cores to their full potential, and you get weird performance issues.

The source of the issue is bad code.

17 minutes ago, Mark Kaine said:

Something must be difficult about utilizing multi-core architectures, given that devs have so much trouble implementing it; ideally it would just be a switch in the API or something (speculating here, of course).

It all depends on the engine; most game devs don't have the skill or the tools to fix such things in the engines they are using.

Every engine handles cores differently: some are hard-coded to use just 4 threads, some are hard-coded to use 12 or 16.

But the best ones are those that utilize the CPU dynamically, meaning they can use all of the cores if they need to.
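Purely as an illustration of that "dynamic" approach: instead of hard-coding 4 or 12 threads, size the worker pool to whatever the machine actually reports. This is a toy sketch, not code from any real engine, and `update_entity` is a made-up stand-in for per-entity game work:

```python
import os
from concurrent.futures import ThreadPoolExecutor

def update_entity(entity_id: int) -> int:
    # Stand-in for per-entity game logic (AI, physics, animation).
    return entity_id * 2

def run_frame(entity_ids):
    # Size the pool to the machine instead of hard-coding a thread count.
    workers = os.cpu_count() or 1
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Fan the per-entity work out across however many cores exist.
        return list(pool.map(update_entity, entity_ids))

results = run_frame(range(8))
```

The point is only the pool sizing: the same code uses 4 threads on a quad-core and 16 on a 16-core machine, which is what "dynamically utilizing the CPU" amounts to.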


11 hours ago, Emertxe said:

Apex is actually technically limited to 190 fps, since apparently there's some terrible micro-stuttering past that point, even though the 3090 can push the mid-200s.

From what I found it was 144 at some point, so idk.

but it's semantics anyway
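For what it's worth, a frame cap like that 190 fps limit is typically just the render loop sleeping off whatever is left of a fixed per-frame time budget. A minimal sketch of the idea (the numbers and names are illustrative, not Apex's actual implementation):

```python
import time

TARGET_FPS = 190
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~5.26 ms per frame

def limited_loop(frames: int) -> float:
    """Run `frames` iterations, sleeping off the unused part of each frame's budget."""
    start = time.perf_counter()
    for _ in range(frames):
        frame_start = time.perf_counter()
        # ... the actual render/game work would happen here ...
        elapsed = time.perf_counter() - frame_start
        remaining = FRAME_BUDGET - elapsed
        if remaining > 0:
            time.sleep(remaining)  # hold the cap by burning the leftover budget
    return time.perf_counter() - start

# 19 capped frames should take at least 19/190 = 0.1 s in total.
total = limited_loop(19)
```

Real limiters often busy-wait for the last fraction of a millisecond because `sleep` granularity is coarse, but the budget-and-sleep structure is the core of it.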

I could use some help with this!

please, pm me if you would like to contribute to my gpu bios database (includes overclocking bios, stock bios, and upgrades to gpus via modding)

Bios database



Btw, did this thread go from 

"guys i think i made a mistake?"

to

"guys i think I'll just buy a kingpin card instead yolo!"

 

or am I imagining things? ; D

 

 


3 hours ago, Mark Kaine said:

Btw, did this thread go from 

"guys i think i made a mistake?"

to

"guys i think I'll just buy a kingpin card instead yolo!"

 

or am I imagining things? ; D

 

 

Well, it was a Kingpin card from the beginning. Not like I have a choice of anything else for a 3090 lol


15 hours ago, Mark Kaine said:

Something must be difficult about utilizing multi-core architectures, given that devs have so much trouble implementing it; ideally it would just be a switch in the API or something (speculating here, of course).

I don't have direct experience working with it, but parallelization can be pretty tricky. Don't forget, anything that runs in parallel still needs a way to delegate what to process where, and it needs to combine the results at the end, which itself takes processing time. On top of that, you start to run risks of race conditions and pointer conflicts if it's not handled perfectly. I don't know what popular engines like Unity and Unreal provide, but implementing asynchronous processing may not always be beneficial, especially when everything has to run in real time in a video game.
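The delegate-then-combine overhead described above can be sketched in miniature: split the input, let each worker process only its own chunk (which sidesteps the race-condition risk, since no state is shared), then pay a final merge step to recombine the partial results. A toy sketch, with `partial_sum` as a hypothetical stand-in for real per-chunk work:

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # Each worker only touches its own chunk, so no locks are needed.
    return sum(chunk)

def parallel_sum(data, workers=4):
    # 1) Delegate: split the input into roughly one chunk per worker.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # 2) Process the chunks in parallel.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(partial_sum, chunks))
    # 3) Combine: this merge step is the overhead that serial code never pays.
    return sum(partials)

result = parallel_sum(list(range(1000)))
```

Steps 1 and 3 are pure overhead, which is why parallelizing tiny per-frame tasks can come out slower than just running them on one core.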


On 1/12/2022 at 5:57 PM, Emertxe said:

 

I realize the 3090 was always going to take the biggest hit from generational leaps, which is partly why I'm second-guessing getting one this far into the generation.

If you bought a Kingpin, flip it. If not, up to you. How badly do you want that upgrade?

 

On 1/12/2022 at 5:59 PM, Helpful Tech Wiard said:

not for some people.

Most can tell the difference from like 144 to 170 and to 240.

I've never seen an engine limit fps that low.

I've seen like a limit of over 300 on most.

Any games you can prove do that?

bUt ThE hUmAn eYe OnLy SeEs 24 FpS!!!!!!!!11

I enjoy buying junk and sinking more money than it's worth into it to make it less junk.


1 hour ago, aisle9 said:

If you bought a Kingpin, flip it. If not, up to you. How badly do you want that upgrade?

What makes you say that about the Kingpin? While I wouldn't normally buy one, I'd appreciate the overclocking headroom and the binning. Although the biggest reason is that the other high-end cards aren't available.


2 hours ago, Emertxe said:

Well, it was a Kingpin card from the beginning. Not like I have a choice of anything else for a 3090 lol

I was going for a Kingpin but it never came up.

An FTW3 Ultra 3080 Ti did, and the hybrid version too. I bought the air-cooled FTW3 but not the hybrid, since I already had three 3080 Tis.

I did get an MSI Gaming X Trio 3090, and it was $2,249, so a bit more money than the Kingpin for a bit less card. It's in a productivity build, so not a big deal.

 

As for selling: I sold off my 10 series just before the holidays, and the GTX 1080s were the best sellers. I thought it would be the 1080 Tis, but they got far fewer bids. It's an option if you don't want to keep the 1080 as a spare.

RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

