
Is the fastest GPU ALWAYS the best?

New CPUs:

High power draw, but you can limit them to ~100 W without much loss in performance.

 

New RTX GPUs:

High power draw, but you can limit them to ~350 W without much loss in performance.

 

I wonder what the results would be if you compared a full-stock vs. a fully power-limited system in terms of performance and total power draw. If you can save ~200 W without losing anything noticeable, we should at least question the default settings 😄

And if the power draw can be reduced this much, the whole new-plug, new-power-supply situation plus the oversized cooling becomes even more questionable.
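Rough numbers make the point. Here is a quick back-of-envelope sketch; the wattages are hypothetical round figures taken from the limits mentioned above, not measurements:

```python
# Back-of-envelope: energy saved by power-limiting a CPU + GPU.
# Hypothetical round figures: stock ~250 W CPU + ~450 W GPU under load,
# limited to ~100 W + ~350 W, with 4 hours of load per day.
stock_w = 250 + 450        # stock draw under load (W)
limited_w = 100 + 350      # power-limited draw (W)
hours_per_day = 4

saved_w = stock_w - limited_w
kwh_per_year = saved_w * hours_per_day * 365 / 1000

print(f"Saved ~{saved_w} W, about {kwh_per_year:.0f} kWh per year")
```

At typical electricity prices that is a noticeable amount per year, on top of the smaller PSU and cooler you could get away with.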

 

About DLSS 3:

Not sure if this is really a huge benefit. Of course the fps bars are longer, but if an AI-generated frame is inserted between two frames and the resulting fps isn't at least twice as high, the delay between two "real" frames has to increase or there will be micro-stuttering - and in any case, there will be some added input lag.

Of course not every game is fast-paced, so you won't notice the lag in every game - but lower fps is also less noticeable in slower games. In Flight Simulator 2020, for example, probably no one will notice the delay, but probably no one will notice the increased fps either 😄
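The latency argument can be made concrete with a little arithmetic. This is a simplified model - real pipelines add other buffering on top:

```python
# Simplified model of frame-interpolation latency.
# To show a generated frame BETWEEN two rendered frames, the newer
# rendered frame must be held back until the in-between frame is built,
# so input-to-photon latency grows by roughly one native frame time.
native_fps = 60
native_frame_ms = 1000 / native_fps      # ~16.7 ms between real frames

displayed_fps = 2 * native_fps           # one generated frame per real one
displayed_frame_ms = 1000 / displayed_fps

added_latency_ms = native_frame_ms       # rough lower bound from buffering

print(f"{displayed_fps} fps on screen, but ~{added_latency_ms:.1f} ms extra lag")
```

So the fps counter doubles while the game still *responds* at the native rate, plus roughly one native frame of added delay - which matches the "longer bars, same feel" complaint.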


2 hours ago, Dogzilla07 said:

Maybe not, from the perspective of the company/shareholders. The lack of DP 2.0 means buyers have to get a 5000-series card for 4K >120 Hz. It mitigates the severity of another 1080 Ti situation.

Not really, they kind of messed up this whole section. DP 1.4 can run 4K 240 Hz (10-bit 4:4:4), and that is how the Samsung G8 (which they actually used as an example) runs. They have so many employees; someone should really fact-check these big launch reviews, because now this misinformation is being peddled all over Reddit.


@riba2233 sure it does, a couple of youtubers said "ew, chroma, ew". That alone will make people not keep the 4090 when the 5000 series with DP 2.0 launches xD and they buy their 240 Hz 4K monitors.

 

It's just the right amount of technically possible/viable - from a legal perspective, to minimize the chance of litigation, and from the perspective of utilitarian, engineering-minded buyers who don't mind settling for chroma subsampling if it isn't too noticeable, covering that demographic well - but imperfect enough to invoke anxiety/remorse in the other demographic (those who will never settle for subpar chroma, on principle/emotion/perfectionism). It's textbook psychology and advertising/marketing.


13 minutes ago, Dogzilla07 said:

@riba2233 sure it does, a couple of youtubers said "ew, chroma, ew". That alone will make people not keep the 4090 when the 5000 series with DP 2.0 launches xD and they buy their 240 Hz 4K monitors.

 

It's just the right amount of technically possible/viable - from a legal perspective, to minimize the chance of litigation, and from the perspective of utilitarian, engineering-minded buyers who don't mind settling for chroma subsampling if it isn't too noticeable, covering that demographic well - but imperfect enough to invoke anxiety/remorse in the other demographic (those who will never settle for subpar chroma, on principle/emotion/perfectionism). It's textbook psychology and advertising/marketing.

You didn't read carefully, I wrote 4:4:4, which is full chroma, without any subsampling. So 4K, 240 Hz, 4:4:4, 10-bit, HDR. All on DP 1.4.

 

Those monitors don't even support DP 2.0!


Which Raptor Lake Intel processor would be preferred for use with the 4090 if a balance between good gaming performance AND productivity like Blender and Resolve is required?

 

I work in both areas, so I need something that covers both, preferably without bottlenecking either. Productivity usually gains from lots of cores, while gaming gains from higher clock speeds and fewer cores - but what if I need the best of both sides of that coin?

 

It's a company investment, so cost isn't really a factor; I just need the best Intel chip in this regard.


39 minutes ago, Glazarus said:

Which Raptor Lake Intel processor would be preferred for use with the 4090 if a balance between good gaming performance AND productivity like Blender and Resolve is required?

 

I work in both areas, so I need something that covers both, preferably without bottlenecking either. Productivity usually gains from lots of cores, while gaming gains from higher clock speeds and fewer cores - but what if I need the best of both sides of that coin?

 

It's a company investment, so cost isn't really a factor; I just need the best Intel chip in this regard.

As productivity is a factor and cost isn't a barrier, you want as many cores as possible, so the 13700K or 13900K would be the best choice.

You would also want the iGPU on the CPU for Quick Sync, so avoid the KF variants.

CPU: Ryzen 5900x | GPU: RTX 3090 FE | MB: MSI X470 Gaming Pro Carbon | RAM: 32GB Ballistix | PSU: Corsair RM750 | Cooler: Scythe Fuma 2 | Case: Phanteks P600s | Storage: 2TB WD Black SN750 & 1TB Sabrent Rocket | OS: Windows 11 Pro & Linux Mint


9 minutes ago, solado said:

As productivity is a factor and cost isn't a barrier, you want as many cores as possible, so the 13700K or 13900K would be the best choice.

You would also want the iGPU on the CPU for Quick Sync, so avoid the KF variants.

This was my strategy on my previous (current) system with the Intel Core i9 7960X, but while it worked well with productivity, it started to bottleneck a few games with its low clock speed. That's why I'm wondering if just "more cores" is the way to go, or if it's better to strike a balance with a processor that gets improved clock speeds at the cost of losing a few cores. I can't choose based on only gaming or productivity as I need something that works for both and won't bottleneck my 4090 in either. 


1 hour ago, riba2233 said:

You didn't read carefully, I wrote 4:4:4, which is full chroma, without any subsampling. So 4K, 240 Hz, 4:4:4, 10-bit, HDR. All on DP 1.4.

Hm, my bad then, it can? Why did every youtuber say it can't then xD I admit I'm hazy on the exact details and didn't double-check what DP 1.4 fully supports and what it doesn't. I was just going off youtubers, reviewers, and what others said about DP 1.4 vs DP 2.0.


7 minutes ago, Glazarus said:

This was my strategy on my previous (current) system with the Intel Core i9 7960X, but while it worked well with productivity, it started to bottleneck a few games with its low clock speed. That's why I'm wondering if just "more cores" is the way to go, or if it's better to strike a balance with a processor that gets improved clock speeds at the cost of losing a few cores. I can't choose based on only gaming or productivity as I need something that works for both and won't bottleneck my 4090 in either. 

Cores win over frequency for productivity, and a 13700K will also slightly beat an i5 in gaming because of its higher frequency. I don't understand the thinking behind a lower i5 not bottlenecking a 4090 while an i7 would.

 

You can go with the i5 if you want, but you would be leaving performance in Blender and Resolve on the table. As this is for a company and actual work rather than pleasure or a hobby, the i7 or i9 would be a worthwhile investment regardless of future products. I am a video editor myself, and in this environment time is money, so investing in more cores is always a wise choice.


5 minutes ago, Dogzilla07 said:

Hm, my bad then, it can? Why did every youtuber say it can't then xD I admit I'm hazy on the exact details and didn't double-check what DP 1.4 fully supports and what it doesn't. Just going off youtubers, reviewers, and what others said about DP 1.4 vs DP 2.0.

Yeah, that is the exact problem, like I wrote: they put wrong information in the video and now it is spreading around the internet like a plague. They are an authority, and they should have better QC for their content.


12 minutes ago, solado said:

You can go with the i5 if you want, but you would be leaving performance in Blender and Resolve on the table. As this is for a company and actual work rather than pleasure or a hobby, the i7 or i9 would be a worthwhile investment regardless of future products. I am a video editor myself, and in this environment time is money, so investing in more cores is always a wise choice.

Yeah, I looked up the i7s you mentioned and the Intel Core i7-13700K (3.4 GHz, 54 MB cache) seems like a pretty good choice overall. For the balance I'm aiming for, would the Intel Core i9-13900K (3.0 GHz, 68 MB cache) be a worse choice? It will definitely improve productivity, but would it bottleneck the 4090 in gaming more than the i7-13700K? My productivity work usually makes the most of the 4090, so the processor is primarily there to support the load. If the difference between the i9-13900K and the i7-13700K in productivity is negligible but gaming performance is better with the i7, that should be the road to go?


2 minutes ago, Glazarus said:

Yeah, I looked up the i7s you mentioned and the Intel Core i7-13700K (3.4 GHz, 54 MB cache) seems like a pretty good choice overall. For the balance I'm aiming for, would the Intel Core i9-13900K (3.0 GHz, 68 MB cache) be a worse choice? It will definitely improve productivity, but would it bottleneck the 4090 in gaming more than the i7-13700K? My productivity work usually makes the most of the 4090, so the processor is primarily there to support the load. If the difference between the i9-13900K and the i7-13700K in productivity is negligible but gaming performance is better with the i7, that should be the road to go?

Difficult to say until we get the benchmarks, really. Going by history, the 13700K would be a smart choice and should perform incredibly well in both gaming and productivity; the i9 would be more of a nice-to-have.

Also, what resolution do you game at?


10 minutes ago, solado said:

Difficult to say until we get the benchmarks, really. Going by history, the 13700K would be a smart choice and should perform incredibly well in both gaming and productivity; the i9 would be more of a nice-to-have.

Also, what resolution do you game at?

I'm gonna plan for the i7 and wait for some benchmarks. It would be nice if LTT did a combo test with different processors from AMD and Intel in these newest releases to figure out which works best with the 4090. I'm generally Intel-biased because I've had a lot of stability problems with AMD previously, whereas Intel has generally been more stable. So, while I know AMD has been killing it, I'd rather use Intel anyway, especially now that Raptor Lake seems to be a good choice.

 

As for resolution, 4K 60 fps is usually where I'm at, so I don't need the super-high framerates current games reach. I'm generally not buying graphics cards each generation; instead I build a system that will survive work and games for at least 5 years, usually past a generation or two. So the next card for me will probably be a 60- or 70-class card from Nvidia whenever they arrive.


1 minute ago, Glazarus said:

I'm generally Intel-biased because I've had a lot of stability problems with AMD previously, whereas Intel has generally been more stable. So, while I know AMD has been killing it, I'd rather use Intel anyway, especially now that Raptor Lake seems to be a good choice.

While I run an AMD system myself, I'm 100% with you on that, especially for productivity and video editing. I have had a lot of problems with AMD stability - USB devices dropping out and audio crackling/popping. Also, Quick Sync is too good not to have for things like Premiere Pro, Resolve, etc.

 

Quote

As for resolution, 4K 60 fps is usually where I'm at, so I don't need the super-high framerates current games reach. I'm generally not buying graphics cards each generation; instead I build a system that will survive work and games for at least 5 years, usually past a generation or two. So the next card for me will probably be a 60- or 70-class card from Nvidia whenever they arrive.

 

CPU bottlenecks aren't going to be a problem at 4K 60. Most CPUs can push modern games to 200+ fps at 1080p, so at 4K the limiting factor will be the GPU, even with a 4090.

 


23 minutes ago, solado said:

While I run an AMD system myself, I'm 100% with you on that, especially for productivity and video editing. I have had a lot of problems with AMD stability - USB devices dropping out and audio crackling/popping. Also, Quick Sync is too good not to have for things like Premiere Pro, Resolve, etc.

 

 

CPU bottlenecks aren't going to be a problem at 4K 60. Most CPUs can push modern games to 200+ fps at 1080p, so at 4K the limiting factor will be the GPU, even with a 4090.

 

 

Feels like I'm leaning towards the i7. If the i9 proves in benchmarks to be similar to the i7 in gaming performance, I might jump up to that.

 

But then there's the issue of motherboards supporting all of this - I have no clue. Stability is the main concern here, as I've had motherboards fail connectivity before, which rendered drives faulty. Also, most peripherals these days use Type-C, and for some reason there's always a lack of Type-C ports on PCs. So: a stable motherboard that supports the Raptor Lake processors and the 4090 without covering all the PCIe lanes I need (I have a video output card for Resolve to attach). I also have the Fractal Design Define 7 XL for my current rig, and it seems there aren't many other options for large, soundproofed chassis?


2 hours ago, riba2233 said:

You didn't read carefully, I wrote 4:4:4, which is full chroma, without any subsampling. So 4K, 240 Hz, 4:4:4, 10-bit, HDR. All on DP 1.4.

 

Those monitors don't even support DP 2.0!

So CableMatters and other cable makers are spreading lies too? Along with outlets that have stated this since last year?

https://www.cablematters.com/Blog/DisplayPort/does-displayport-1-4-support-240hz

https://www.reddit.com/r/buildapc/comments/iq7f3u/can_displayport_14a_do_4k_at_240hz/

https://www.tweaktown.com/news/77368/displayport-2-0-monitors-coming-in-late-2021-4k-240hz-and-8k-120hz/index.html


11 hours ago, Uttamattamakin said:

I have played it at 4K 30 FPS with RTX on a 3080, granted there are BIG drops. At 1440p with the same settings it is fine. I really don't know what is wrong with other people's computers.

Max settings in Cyberpunk is 120-130+ fps without RTX; with RTX enabled it's pretty shit on any 3000-series RTX graphics card, even combined with a 5900X.

 

 


Useful threads: PSU Tier List | Motherboard Tier List | Graphics Card Cooling Tier List ❤️

Baby: MPG X570 GAMING PLUS | AMD Ryzen 9 5900X w/ PBO | Corsair H150i Pro RGB | ASRock RX 7900 XTX Phantom Gaming OC (3020 MHz core & 2650 memory) | Corsair Vengeance RGB PRO 32GB DDR4 (4x8GB) 3600 MHz | Corsair RM1000x | WD_BLACK SN850 | WD_BLACK SN750 | Samsung EVO 850 | Kingston A400 | PNY CS900 | Lian Li O11 Dynamic White | Display(s): Samsung Odyssey G7, ASUS TUF GAMING VG27AQZ 27" & MSI G274F

 

I also drive a volvo as one does being norwegian haha, a volvo v70 d3 from 2016.

Reliability was a key thing and its my second car, working pretty well for its 6 years age xD


1 hour ago, Lurick said:

Did you even read the first link you posted? They also say that DP 1.4 can run 4K 240 Hz.

The second link is just some clueless anons on Reddit.

The third link didn't say anything about DSC on 1.4, so it's a bit misleading.

But whatever your links say, the fact is that the Samsung G8 runs at 4K 240 Hz, 10-bit 4:4:4, on DP 1.4, so draw your own conclusions about who is lying. DSC on DP 1.4 can boost effective bandwidth by about three times, which is more than enough for these resolutions.
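The arithmetic behind that claim, as a rough sketch (active pixels only - blanking adds a little more on top; 25.92 Gbit/s is DP 1.4 HBR3's effective payload rate after 8b/10b coding):

```python
# Rough check: 4K 240 Hz, 10-bit 4:4:4 over DP 1.4 with DSC.
h, v, hz = 3840, 2160, 240
bits_per_pixel = 3 * 10                  # 10 bits per channel, full chroma

uncompressed_gbps = h * v * hz * bits_per_pixel / 1e9   # ~59.7 Gbit/s
hbr3_effective_gbps = 25.92              # DP 1.4 HBR3 payload bandwidth

dsc_ratio_needed = uncompressed_gbps / hbr3_effective_gbps
print(f"Needs {uncompressed_gbps:.1f} Gbit/s -> ~{dsc_ratio_needed:.1f}:1 DSC")
# DSC compresses up to ~3:1 "visually lossless", so the signal fits.
```

So the raw signal needs roughly 2.3:1 compression, comfortably inside what DSC can do - which is why the G8 works on DP 1.4 despite the headline bandwidth numbers.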


23 hours ago, silentdragon95 said:

Okay so I get why we need to have "clickbait" titles.

Why do they NEED clickbait titles? To make more money to produce content you can't even find because of the clickbait titles. LTT actively chooses to spread dishonest and misleading information with their video titles simply for more money. LTT made better content when they had less money, so destroying your own brand for money seems absolutely idiotic - a mid-life Linus crisis.


3 hours ago, Dean0919 said:

Agreed with you 100%, but what's worse is that if Nvidia continues this, "gamers" will soon mean only rich gamers, because lower- and even middle-class people can't afford such expensive cards, and there are a lot of them around the world. That means the gaming world will keep losing those gamers because they simply can't afford the cards. It's just sad. We need to do something to make Nvidia stop their greedy approach, or they will keep cranking up prices and taking more and more money from buyers.

100% right. The entry level to lower mid-range has completely stagnated gen-on-gen since Pascal, yet costs the same or more.

Zen 3 Daily Rig (2022 - Present): AMD Ryzen 9 5900X + Optimus Foundations AM4 | Nvidia RTX 3080 Ti FE + Alphacool Eisblock 3080 FE | G.Skill Trident Z Neo 32GB DDR4-3600 (@3733 c14) | ASUS Crosshair VIII Dark Hero | 2x Samsung 970 Evo Plus 2TB | Crucial MX500 1TB | Corsair RM1000x | Lian Li O11 Dynamic | LG 48" C1 | EK Quantum Kinetic TBE 200 w/ D5 | HWLabs GTX360 and GTS360 | Bitspower True Brass 14mm | Corsair 14mm White PMMA | ModMyMods Mod Water Clear | 9x BeQuiet Silent Wings 3 120mm PWM High Speed | Aquacomputer Highflow NEXT | Aquacomputer Octo

 

Test Bench: 

CPUs: Intel Core 2 Duo E8400, Core i5-2400, Core i7-4790K, Core i9-10900K, Core i3-13100, Core i9-13900KS

Motherboards: ASUS Z97-Deluxe, EVGA Z490 Dark, EVGA Z790 Dark Kingpin

GPUs: GTX 275 (RIP), 2x GTX 560, GTX 570, 2x GTX 650 Ti Boost, GTX 980, Titan X (Maxwell), x2 HD 6850

Bench: Cooler Master Masterframe 700 (bench mode)

Cooling: Heatkiller IV Pro Pure Copper | Koolance GPU-210 | HWLabs L-Series 360 | XSPC EX360 | Aquacomputer D5 | Bitspower Water Tank Z-Multi 250 | Monsoon Free Center Compressions | Mayhems UltraClear | 9x Arctic P12 120mm PWM PST


4 hours ago, Dean0919 said:

Ray tracing did the same thing in Hellblade: Senua's Sacrifice on my system. Without ray tracing I'm well over 75 fps at 1440p, but as soon as I turn on ray tracing, fps drops much lower (to 45 or below) on an RTX 3080.

Think I'm gonna say the one thing I didn't think I would say... fuck RTX, I guess.

 

It's not in a playable state with RT enabled, even with DLSS on Balanced or Quality - fuck it.

 

I even have the option to buy the 4090 for $2000, but honestly that would be suicide after the money I spent to get the 3080 Ti.

 

Nah... let's wait one generation and see how it goes.


On 10/11/2022 at 4:45 PM, LpoolTech said:

I've watched a few reviews so far, and I find it interesting that the numbers are high everywhere but differ a lot too: Jayz's Cyberpunk result was average but SOTR was high, the opposite of LTT's scores.

Watched Jayz's video first and had the same impression later when watching LTT's video. Just now watching both side by side for comparison and... I can't do a direct comparison with the others, since either the listed preset is different or the benchmarked games don't match.


Okay, it seems LTT F'ed it up. Don't they run these on automated scripts now? Guess they did not properly test their automation scripts before running the benchmarks.


56 minutes ago, Just that Mario said:

Watched Jayz's video first and had the same impression later when watching LTT's video. Just now watching both side by side for comparison and... I can't do a direct comparison with the others, since either the listed preset is different or the benchmarked games don't match.


Okay, it seems LTT F'ed it up. Don't they run these on automated scripts now? Guess they did not properly test their automation scripts before running the benchmarks.

Well, Anthony said that he repeated the test a few times because it was so weird, if my memory is correct, so maybe it's the CPU difference.


2 hours ago, Just that Mario said:

Watched Jayz's video first and had the same impression later when watching LTT's video. Just now watching both side by side for comparison and... I can't do a direct comparison with the others, since either the listed preset is different or the benchmarked games don't match.


Okay, it seems LTT F'ed it up. Don't they run these on automated scripts now? Guess they did not properly test their automation scripts before running the benchmarks.

Basically, Anthony has stated that yeah, their results were wrong: FidelityFX forced itself to be enabled, which led to their anomalous results.


So what's the best performing GPU money can buy today that peaks at under 100W? Because efficiency matters a whole lot more to me.
