Raja Koduri's goal for 2017 is to have a $1000 4K-ready PC

3DOSH
2 hours ago, deXxterlab97 said:

Holy hell so many nines


Spoiler

I hope at least one of you gets it

 


25 minutes ago, 3DOSH said:

I don't have anything against the parts, just the idea that a 1080 can do 4K @ 60fps.

Depends on the game.

 

It's like how AMD claims that the 480 can do VR. It's true for some games, and false for other games.


12 minutes ago, LAwLz said:

Depends on the game.

 

It's like how AMD claims that the 480 can do VR. It's true for some games, and false for other games.

Yeah, I'm sure Half-Life 2 can do 4K even on a 1050.

 

It mostly doesn't depend on the game; it depends on whether you find 30 to 40 FPS minimums acceptable, or a noticeable downgrade in quality to boot. It's more than reasonable to assume we're talking about modern games, in which case fucking no: a single 1080 ain't enough for 4K.

 

The bare minimum for PC gamers should be SLI 1070s or a single Titan XP.


2 hours ago, vorticalbox said:

the problem is it's not 4K AMD gaming.

It's not even 4K Nvidia gaming.

 

i5s won't push enough draw calls to play AAA titles at 4K60 smoothly. They will have dips. Heck, DF showed a 6600K at 4.2GHz dipping with a 1070 at 1440p, and it was a CPU issue, not a GPU one.

 

i5s are not enough for high-end gaming. But people refuse to acknowledge it, because they bought an i5 and hate the idea that they were screwed over.


4 minutes ago, Prysin said:

It's not even 4K Nvidia gaming.

 

i5s won't push enough draw calls to play AAA titles at 4K60 smoothly. They will have dips. Heck, DF showed a 6600K at 4.2GHz dipping with a 1070 at 1440p, and it was a CPU issue, not a GPU one.

 

i5s are not enough for high-end gaming. But people refuse to acknowledge it, because they bought an i5 and hate the idea that they were screwed over.

I was never clear if that was due to hyperthreading or having more L3 cache, or possibly both.

 


2 hours ago, deXxterlab97 said:

Even a $400 1070 can do somewhat decent enough

Hell, an RX480 is enough for 4K gaming...

There's a giant difference between 4K capable and 4K capable at fucking ultra settings with 8X MSAA :P 

 

The RX480 (and the 1060, 980, 970, RX470, 290/290X/390/390X, etc. etc.) can all quite easily run triple-A titles at 4K, even at 60fps, at lower settings such as low and medium. If you're willing to play at 30fps, then ultra is doable in some games and high is doable in basically every single fucking game. So for fuck's sake, learn the difference between 4K "ready" and 4K "capable" at higher settings.

(Sorry dexxter for making you read this rant xD)


12 minutes ago, Misanthrope said:

I was never clear if that was due to hyperthreading or having more L3 cache, or possibly both.

 

It's both. Hyperthreading alleviates memory constraints: as you can understand, getting textures and other assets from the hard drive to the GPU (a path the CPU is part of) is demanding, and hyperthreading helps alleviate that.

 

Hyperthreading also helps with getting more draw calls, as it alleviates the "bubbles" you get in a non-hyperthreaded CPU. Every bubble may allow for another draw call. This is why i7s for the most part have a flat-out 10-30% performance improvement over i5s. The same goes for i3s vs Pentiums.

 

L3 simply helps hold more info on-die, which further cuts down on the latency penalties brought on by fetching data from system memory or, worse still, HDDs/SSDs.


29 minutes ago, Prysin said:

i5s won't push enough draw calls to play AAA titles at 4K60 smoothly. They will have dips. Heck, DF showed a 6600K at 4.2GHz dipping with a 1070 at 1440p, and it was a CPU issue, not a GPU one.

 

i5s are not enough for high-end gaming. But people refuse to acknowledge it, because they bought an i5 and hate the idea that they were screwed over.

You are making quite the leap there. Just because the i5 will have some dips (the i7 will most likely have that too) doesn't mean the i5 will be unsuitable for 1440p gaming.

I don't usually read CPU and GPU threads (because they are the most cancerous shit I've ever seen, thanks to all the Nvidia and AMD fanboys), but my guess is that it's less about "people refuse to acknowledge it" and more about "people don't like generalizations based on edge cases, and are OK with dips below 60 fps".


6 minutes ago, LAwLz said:

You are making quite the leap there. Just because the i5 will have some dips (the i7 will most likely have that too) doesn't mean the i5 will be unsuitable for 1440p gaming.

I don't usually read CPU and GPU threads (because they are the most cancerous shit I've ever seen, thanks to all the Nvidia and AMD fanboys), but my guess is that it's less about "people refuse to acknowledge it" and more about "people don't like generalizations based on edge cases, and are OK with dips below 60 fps".

Thing is, Digital Foundry proved it's not just an issue tomorrow, but today. Their tests of an i5 6600K at 4.2GHz showed 70-100% utilization in AAA titles. That is disturbing considering it is a contemporary product, wildly popular with the gaming mainstream, yet the evidence shows that the purchase will not be another Sandy Bridge story; it will be a G3258 story. Great then and there, not so much later on.


13 minutes ago, Prysin said:

Thing is, Digital Foundry proved it's not just an issue tomorrow, but today. Their tests of an i5 6600K at 4.2GHz showed 70-100% utilization in AAA titles. That is disturbing considering it is a contemporary product, wildly popular with the gaming mainstream, yet the evidence shows that the purchase will not be another Sandy Bridge story; it will be an FX 8350 story. Great then and there, not so much later on.

CPU % utilization is a terrible indicator of how bottlenecked something is. Just because you use 100% CPU doesn't mean getting a better CPU would increase performance. It might, but there might be other bottlenecks in your system too.

 

Also, in that test what FPS did they get, with what games, at what settings, and with what GPU?

If you're getting 200 FPS and your CPU is pegged at 100% utilization then you can hardly say it's a "problem".

 

 

You're using extremely vague "facts" here, and that is setting off my bullshit alarm like crazy. I don't know if you're doing it on purpose (I totally understand that you might not want to go through the effort of finding all the videos you're referring to), but since you can basically come up with any conclusion you want by looking at different parts of a test, my bullshit alarm always goes off when I see claims like yours.

If you're not posting the full story, with all the facts, then it's very easy to make misleading generalizations.

 

 

 

17 minutes ago, Prysin said:

This is why i7s for the most part have a flat-out 10-30% performance improvement over i5s. The same goes for i3s vs Pentiums.

Source?

"Flat out 30% faster" implies that it's across the board and not just in some situations. I have no problem believing that the i7 is "up to 30% faster" (at the same clock) in a best case scenario, but for the average use case I really doubt you will see that big of a difference.

 

 

Generalizations are bad, and they can be extremely misleading if you are basing your generalization on things other than averages.


21 minutes ago, Prysin said:

Thing is, Digital Foundry proved it's not just an issue tomorrow, but today. Their tests of an i5 6600K at 4.2GHz showed 70-100% utilization in AAA titles. That is disturbing considering it is a contemporary product, wildly popular with the gaming mainstream, yet the evidence shows that the purchase will not be another Sandy Bridge story; it will be a G3258 story. Great then and there, not so much later on.

It's there, but I wouldn't call the i5 performance disturbing. We can recognize the difference without the hyperbole.


6 minutes ago, LAwLz said:

CPU % utilization is a terrible indicator of how bottlenecked something is. Just because you use 100% CPU doesn't mean getting a better CPU would increase performance. It might, but there might be other bottlenecks in your system too.

 

Also, in that test what FPS did they get, with what games, at what settings, and with what GPU?

If you're getting 200 FPS and your CPU is pegged at 100% utilization then you can hardly say it's a "problem".

 

 

You're using extremely vague "facts" here, and that is setting off my bullshit alarm like crazy. I don't know if you're doing it on purpose (I totally understand that you might not want to go through the effort of finding all the videos you're referring to), but since you can basically come up with any conclusion you want by looking at different parts of a test, my bullshit alarm always goes off when I see claims like yours.

If you're not posting the full story, with all the facts, then it's very easy to make misleading generalizations.

 

 

 

Source?

"Flat out 30% faster" implies that it's across the board and not just in some situations. I have no problem believing that the i7 is "up to 30% faster" (at the same clock) in a best case scenario, but for the average use case I really doubt you will see that big of a difference.

I see you enjoy using hyperbole. Please, if you are going to quote me, at least argue against everything I said, not just the specific numbers suiting your agenda. 10-30% is common. Sometimes it is the clock speeds (in the 10%-ish area) doing most of the benefit, but more often it is the hyperthreading doing the heavy lifting. We see this with Pentiums vs i3s too. However, I am on a tablet at the moment, so it is a bitch to dig up all the videos (fuck you, YouTube app/website, and your abysmal mobile setup).

 

CPU % utilization would be fine if it WAS 200 FPS; I remember it being more like 55-70 FPS in the cases I am talking about. Still above or around 60 FPS, but not "enough" to make it a "non-issue".

 

% utilization is also an issue when you know that the resolution you are playing at, 1440p in this case, isn't always only GPU dependent. We know this by comparing CPUs like the i5 2500K, the i5 3570K, the i5 4690K and the i5 6600K. We know by now that IPC and clock speeds definitely matter even at 1440p, meaning this resolution is still not so taxing that it's only a GPU issue.

 

I would usually agree that it is a bit preemptive to say "i5s are dying". But fact comes before emotion and opinion. If we see that the CPU is holding things back, we know that it's no longer a matter of IF, but of why and how long these products will stay relevant.

 

We know from tests done by various reviewers that an i7 2600K or i7 3770K sits around or above an i5 6600K, despite notably lower IPC and lower or equal clock speeds. This is disturbing, because it goes to show HOW much hyperthreading does help. 4 cores/4 threads isn't enough for modern AAA titles. It just isn't. You may claim it is, but you'll be hurting the PC gaming scene by doing so. Sure, an i5 is still enough, just barely; it's not a major bottleneck today, but it is "tomorrow". It isn't enough for high-end gaming a couple of years from now.

Thing is, people keep talking this bullshit of "just buy an i5 now and sell it later and buy an i7"... Well, 2-3 years from now there will be few 6700Ks sold in stores, and the few that are will be marked up by retailers trying to fleece the ones desperate for an old-gen product fitting their socket. The i5s of today will not be worth that much on the used market, due to Zen and Kaby/Cannon Lake products flooding it. The i7s of today will still be relevant in 2-3 years, and as such their used prices, especially K SKUs, will not drop that much. So it will be a bigger net loss buying an i5 6600K and upgrading in 2 years than it would be buying an i7 6700K right away.

This is just an example; it can apply to basically any current-gen vs next-gen platform, as I am sure you will understand.

 

I could go on, but I am sure your personal opinion is too far entrenched in one camp or the other on this subject.


19 minutes ago, Misanthrope said:

It's there, but I wouldn't call the i5 performance disturbing. We can recognize the difference without the hyperbole.

Can you? Can you recognize the difference when it is not as clear as day?

Remember, most consumers NEVER check reviews; they compare marketing materials and whatever the donkey at Best Buy or whichever other chain they purchase from says. Even among tech-savvy people, they check reviewers like LTT, HWC, Paul's Hardware, JayzTwoCents, TechofTomorrow or Salazar Studio: channels with little or no technical knowledge behind them. So they simply won't know. I've seen videos from the mentioned channels blame only the games themselves before the hardware, because they simply fail to see past the faults in the coding to discover that it is also a fault in the hardware used.


56 minutes ago, Prysin said:

Hyperthreading also helps with getting more draw calls, as it alleviates the "bubbles" you get in a non-hyperthreaded CPU. Every bubble may allow for another draw call. This is why i7s for the most part have a flat-out 10-30% performance improvement over i5s. The same goes for i3s vs Pentiums.

We are talking gaming here? If so, the difference can be huge, but sometimes it's a few FPS at most, even in newer AAA titles.

The ability to google properly is a skill of its own. 


17 minutes ago, Prysin said:

can you?

Yes. Even if I couldn't, your sanctimonious lecturing won't help anything.


4 hours ago, Energycore said:

-snip-

Where I live, that's nearly 1900€...


1 hour ago, LAwLz said:

CPU % utilization is a terrible indicator of how bottlenecked something is. Just because you use 100% CPU doesn't mean getting a better CPU would increase performance. It might, but there might be other bottlenecks in your system too.

CPU utilization as shown in Windows is a poor indicator of anything. All it says is that the CPU is actively doing something for a certain percentage of the sample period. For example, if you made a piece of code that is an infinite loop that executes nothing, the core it runs on will be shown as 100% utilized in Windows; with hyper-threading, that core is at least not wasted unless you waste both threads in the same way.
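
To illustrate the point, here is a minimal, hypothetical sketch (not from the Digital Foundry tests or anyone's post): a loop that does literally nothing useful will still show a core pegged at roughly 100% in Task Manager or top.

# busy_loop.py -- does no useful work, yet the core it runs on reads as
# ~100% utilized in Task Manager / top, which is exactly why "100% CPU"
# on its own says nothing about whether useful work is being done.
import time

def busy_wait(seconds):
    """Spin for `seconds` doing nothing; return how many empty iterations ran."""
    iterations = 0
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:  # pure busy-wait: no I/O, no sleep
        iterations += 1
    return iterations

if __name__ == "__main__":
    print(busy_wait(5.0), "empty iterations in 5 seconds")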

 

Everything has to be kept in balance; if not, game engines have to compensate, which could lead to CPU threads showing as quite active but executing very little.

 

So in an extreme example, which would never happen in real life, a 4c/8t CPU might be showing 100% utilization, but if you were to swap in an 8c/16t CPU it might also show 100% utilization without any actual performance increase. To be clear, this would not happen; game engine optimization is not that bad.

 

Monitoring the power draw of the CPU might actually give you a better indication of its actual utilization.
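
For what it's worth, here is a rough sketch of how you could sample that on Linux with an Intel CPU, using the RAPL energy counter under /sys/class/powercap (an assumption about the setup, not something from the post; it needs the intel_rapl driver and usually root to read the counter):

# package_power.py -- estimate CPU package power draw from the RAPL energy
# counter exposed by the Linux powercap interface. Assumes an Intel CPU,
# the intel_rapl driver, and read permission on the counter (often root).
import time

RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package domain

def read_energy_uj():
    with open(RAPL_ENERGY) as f:
        return int(f.read().strip())  # cumulative energy in microjoules

def package_watts(interval_s=1.0):
    before = read_energy_uj()
    time.sleep(interval_s)
    after = read_energy_uj()
    # Counter wraparound (see max_energy_range_uj) is ignored in this sketch.
    return (after - before) / 1e6 / interval_s

if __name__ == "__main__":
    print("~%.1f W package power over the last second" % package_watts())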


That would be great, because I'm planning to upgrade this year, but I also see it as hardly achievable.

Because what I consider 4K gaming is AAA titles maxed out, with no compromise in lowering video settings to get 60fps at 4K, since it wouldn't make sense to lower anything.


You can have 4k for $300 now.


4 hours ago, Misanthrope said:

 

 

Spoiler

I hope at least one of you gets it

 

Oh shit, is this Nines Rodriguez from Masquerade Bloodlines? I liked playing that game last year.


They should say "4k ultra" ready pc, because frankly 4k has been accessible for a couple of years now as long as you're willing to lower some settings.


2 hours ago, Sauron said:

They should say "4k ultra" ready pc, because frankly 4k has been accessible for a couple of years now as long as you're willing to lower some settings.

Nice try, you're late to the party :D 

/JK but to be fair, I did rant about it before you showed up so...


12 minutes ago, Mr.Meerkat said:

Nice try, you're late to the party :D 

/JK but to be fair, I did rant about it before you showed up so...

it's worth ranting AGAIN


2 minutes ago, Sauron said:

it's worth ranting AGAIN

True, true. Considering that an RX460 is enough to run older, easier-to-run triple-A titles at 30FPS 4K (at low and even medium settings)... can we all please stop saying we need a fucking 1070 minimum for 4K gaming?


2 hours ago, Doobeedoo said:

That would be great, because I'm planning to upgrade this year, but I also see it as hardly achievable.

Because what I consider 4K gaming is AAA titles maxed out, with no compromise in lowering video settings to get 60fps at 4K, since it wouldn't make sense to lower anything.

You'd be surprised; some settings barely make any difference in visual quality, whereas a higher resolution is a night-and-day difference.

