A new type of shilling - game developer locks content to specific Intel i7 CPUs [updated]

1 minute ago, AlwaysFSX said:

I don't mean something like a Sempron but there's no reason an older i3 can't have enough power to drive modern titles without bottlenecking 90% of the graphics market.

In VR?! You're out of your mind. Maybe if vectorized optimization were as good here as it is in HPC land, but with current coding standards? No way in Hell.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


40 minutes ago, Bouzoo said:

Might be a bit complicated because of AMD. Also, many people use Gxxxx chips, and they get new revisions (almost) every year. 

Not really; proper multi-threading would make even FX 8350s totally usable in most AAA games. But we are still limping around on 4-core engines that have shitty multithreading algorithms, which make any sort of multithreading mean fuck-all because their fork-and-merge overhead is far too great to offset Amdahl's Law.
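
To put rough numbers on that (illustrative figures only, not measurements from any particular engine): Amdahl's Law caps the speedup S from N cores when only a fraction p of the frame is parallelizable, and fork/merge overhead effectively shrinks p even further.

S(N) = \frac{1}{(1 - p) + p/N}

% assuming p = 0.6, i.e. 60% of the frame work is parallel
S(4) = \frac{1}{0.4 + 0.6/4} \approx 1.8\times \qquad S(8) = \frac{1}{0.4 + 0.6/8} \approx 2.1\times

So doubling from 4 to 8 cores buys roughly 15% more throughput before a single cycle of fork/merge cost is even counted, which is why those extra FX cores sit idle.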

 

@patrickjp93 is right when he says game programmers are idiots. Perhaps not right for the right reasons, but he is right. Games aren't remotely close to being "efficiently demanding". The closest we have to a game that is as demanding of the hardware as it can be, yet still efficient, is Ashes of the Singularity. No other game comes close in terms of performance vs hardware utilization. But that game was built from the ground up, from the engine to the last UI element, to use multiple cores/threads and GPGPU effectively.

 

If you want to know how powerful FX CPUs are, look at Cinebench. An FX 6300 scores about the same as an i3 6100. An FX 8350 at stock is around a 6600K... IPC aside, if games were coded effectively, the FX 8350 would be around an i5 in performance, instead of struggling to beat an i3 6100. But physics, AI and other things aren't vectorized at all, and I suspect they have horrid library optimization issues to deal with too; or rather, they rely so heavily on 3rd-party plugins and libraries that they don't use the most optimal code. They just grab the extensions they need from GitHub and never question the function as long as the answer is correct.
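
For anyone wondering what "vectorized" actually means here, a minimal sketch (hypothetical code, not from any shipping engine or library): lay the physics state out as structure-of-arrays and GCC/Clang will turn the hot loop into SIMD (SSE/AVX) instructions on their own.

// Hypothetical physics integration step, structure-of-arrays layout so the
// compiler can auto-vectorize the loop (build with: g++ -O3 -march=native integrate.cpp)
#include <cstddef>
#include <vector>

struct Particles {                  // SoA: each field is contiguous in memory
    std::vector<float> x, y, z;
    std::vector<float> vx, vy, vz;
};

void integrate(Particles& p, float dt) {
    const std::size_t n = p.x.size();
    for (std::size_t i = 0; i < n; ++i) {   // independent iterations -> SIMD-friendly
        p.x[i] += p.vx[i] * dt;
        p.y[i] += p.vy[i] * dt;
        p.z[i] += p.vz[i] * dt;
    }
}

Do the same thing with the usual array-of-structs Particle class full of virtual calls and the compiler largely gives up, which is much closer to what a lot of engine and middleware code actually looks like.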


8 minutes ago, Prysin said:

-snip-

I know games are far from being efficient. Far, far from it. My point was not really about the FX 6- and 8-series; there are the FX 4-series and others as well. The problem is where to draw the line on what should be locked out. And then there are Intel's G-series. Far Cry 4 got a huge backlash for locking out everything except the i-series (and quad cores, IIRC?). 

The ability to google properly is a skill of its own. 


Oh well... I feel better now about spending the extra buck on the processor. I was genuinely starting to feel bad about not going with an i5 6500, which would have saved more money for a new GPU next month (now I have to wait until late February to buy the 1070).

 

I mean, it's funny how a new high-end GPU still has the biggest impact on games; 2nd-gen i7s can still do well enough if paired with a powerful enough GPU. But seeing that processors may now play a bigger and bigger role in games sure made me feel better about giving up having a new GPU sooner in order to have a nicer CPU.

Personal Desktop:

CPU: Intel Core i7 10700K @ 5GHz |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Z490UD ATX |~| RAM: 16GB DDR4 3333MHz CL16 G.Skill Trident Z |~| GPU: RX 6900XT Sapphire Nitro+ |~| PSU: Corsair TX650M 80Plus Gold |~| Boot: SSD WD Green M.2 2280 240GB |~| Storage: 1x 3TB HDD 7200RPM Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60Hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: AMD R9 7950XT  |~| Cooling: bq! Dark Rock 4 Pro |~| MOBO: Gigabyte Aorus Master |~| RAM: 32G Kingston HyperX |~| GPU: AMD Radeon RX 7900XTX (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz|~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)

22 minutes ago, Bouzoo said:

I know games are far from being efficient. Far, far from it. My point was not really about the FX 6- and 8-series; there are the FX 4-series and others as well. The problem is where to draw the line on what should be locked out. And then there are Intel's G-series. Far Cry 4 got a huge backlash for locking out everything except the i-series (and quad cores, IIRC?). 

Thing is, AMD quad cores, even the Bulldozer derivatives AND the Phenom IIs, are actually good enough at multi-threading to hit i3 levels. The Bulldozer-based ones also have AVX, which some older Intel CPUs don't have. Thus they are perfectly able to run newer titles if those titles were optimized properly.


Played 35 minutes of Horde Mode on my 4.6GHz FX-8350. No frame drops, even when grenades take out 7 zombies and send body parts everywhere. I also still have reprojection as a back-up, and Oculus users have Asynchronous Spacewarp, so it would probably run fine on an i3 or FX-6350.

 

i7's are not necessary to play Horde mode.


Wait, so my i7 4930K (Ivy Bridge-E) was originally locked out? Is there actually any evidence that this game uses instruction sets exclusive to the newer-generation CPUs, i.e. AVX2 and FMA3 (anything else? I only looked quickly)?

 

I don't see that stacking up, for two reasons:

  1. Haswell-E looks to also have been locked out and supports AVX2/FMA3
  2. The main part of the game runs on much more general hardware requirements

I understand that there is genuine concern over performance in these more hardware-demanding game modes, which would lead to these higher recommendations, but that in itself is not evidence of a critical dependency on AVX2/FMA3. I particularly believe this to be the case since the modes were planned to be unlocked 3 months later on hardware without these instructions, enough time to do final optimization and quality control to ensure acceptable gameplay on lower-performance hardware.

 

So that leaves only two possibilities: a requirement for Hyper-Threading (double check, I have that and two more cores), or per-core performance too low on Ivy Bridge-E and Haswell-E to achieve a consistent 90 FPS (ha, not likely).

 

I don't actually care, since I was never going to buy the game, but I think the Ivy Bridge-E and Haswell-E CPUs were unnecessarily locked out. Kind of interesting, though not surprising for Ivy Bridge-E; these are getting rather old now.

 

I can only conclude that the reason behind this, if it was technical at all, was performance concerns, but to that I would say the game was released 3 months too early.

 

P.S. You have to go back to the 1st-generation i7 to not have any form of AVX.
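
And even if specific instruction sets really were the issue, the sane fix is runtime detection, not a CPU-model whitelist. A minimal sketch of what that looks like with the GCC/Clang builtins (the function names are made up for illustration; this is obviously not the game's actual code):

// Hypothetical runtime dispatch: take an AVX2/FMA3 code path when the CPU
// supports it, fall back otherwise -- no model whitelist required.
// __builtin_cpu_supports() is a GCC/Clang builtin for x86.
#include <cstdio>

static void simulate_avx2()    { std::puts("running AVX2/FMA3 path"); }   // stand-in for the optimized build
static void simulate_generic() { std::puts("running baseline path"); }    // stand-in for the fallback

int main() {
    if (__builtin_cpu_supports("avx2") && __builtin_cpu_supports("fma"))
        simulate_avx2();
    else
        simulate_generic();
    return 0;
}

That check costs a handful of cycles at startup, so "needs AVX2" never has to translate into "needs a 5th/6th/7th-gen i7".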


52 minutes ago, leadeater said:

-snip-

As far as we understand it, it was arbitrarily locked to 5th-, 6th- and 7th-gen i7 CPUs, with no other stipulation.

 

I'm curious to know how my 9590 would fare here, but I don't have a VR set yet. 

CPU: Amd 7800X3D | GPU: AMD 7900XTX


56 minutes ago, leadeater said:

-snip-

You're thinking about this from a logical and technical point of view. The truth is that this is what they thought:

"Hey, how do we make people buy our CPUs? People haven't had any reason to upgrade recently."

"How about we pay developers to arbitrarily lock down parts of a game, and make it only available to those who has bought our high end SKUs recently?"

"Brilliant!"

 

 

3 hours ago, Prysin said:

It is, because if you just make the game too hard to run on craptops, people claim your game is unoptimized and poorly made. Which means three weeks after release they magically improve performance in general by simply nerfing the fuck out of the AI, physics and particle effects.

Or they could just have a slider, like with every other setting in games.

There is no excuse.
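
Something as crude as this would already do it (a hypothetical sketch with made-up numbers, purely to show how little is being asked for): one "simulation detail" slider that scales the CPU-heavy work the same way the graphics sliders already scale the GPU-heavy work.

// Hypothetical mapping from a 0-100 "simulation detail" slider to CPU-side workload.
struct SimSettings {
    int ai_updates_per_second;
    int max_active_particles;
    int physics_substeps;
};

SimSettings from_slider(int detail) {          // detail in 0..100
    const float t = detail / 100.0f;
    return {
        10 + static_cast<int>(t * 50.0f),      // 10..60 AI ticks per second
        500 + static_cast<int>(t * 9500.0f),   // 500..10000 particles
        1 + static_cast<int>(t * 3.0f)         // 1..4 physics substeps
    };
}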


9 minutes ago, LAwLz said:

You're thinking about this from a logical and technical point of view. The truth is that this is what they thought:

"Hey, how do we make people buy our CPUs? People haven't had any reason to upgrade recently."

"How about we pay developers to arbitrarily lock down parts of a game, and make it only available to those who has bought our high end SKUs recently?"

"Brilliant!"

Oh, I totally think this was nothing more than a naive and poorly executed exclusivity deal between the studio and Intel, maybe with some technical assistance, but even that has not been proven, and the deal could have been nothing more than financing. "Working with [Company Name] allowed us to" is a very generic phrase often used in public statements to mean a huge variety of things; it doesn't actually imply direct assistance from Intel engineers at all.

 

If I didn't present logical reasoning with technical justification, that would just invite other people to discount my opinion or try to disprove it, potentially in a very lengthy, convoluted manner that utterly ignores the fundamental basis of the issue and the actual evidence at hand, in favour of inferred evidence backed by nothing more than conjecture and inference built on incomplete or unproven evidence. But hey, that never happens, right? ;)


Practices from Intel that should never have happened. Glad people stood up to this.

gtfo hardware locking, on any game 


4 hours ago, patrickjp93 said:

-snip

This isn't to add to the argument, but I can't help but notice your horrid attitude. This entire debate, you've been riding on your massive ego and acting like everyone else is a clueless idiot. It's rather offensive, especially since you claim that I am a terrible software developer with no standards because I condemn this company's actions. 

 

Any way you can tone it down a few notches and back up your claims? 

Wishing leads to ambition and ambition leads to motivation and motivation leads to me building an illegal rocket ship in my backyard.

 


7 hours ago, patrickjp93 said:

And if you watch his video, he rants about his own fellow developers, and the techniques he demonstrates are rudimentary.

 

He doesn't demonstrate anything about victim vs. pure cache hierarchy. He knows enough to get by, and he's the best they have.

 

Are you joking?

If he doesn't demonstrate anything related to cache, why do you keep bringing his name up as evidence for your claims? Again, calling him the best they have is yet another opinion. I would say wake me up when you have substantial evidence, but then I'd never see you again, and that would hurt.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


3 minutes ago, christianled59 said:

Any way you can tone it down a few notches and back up your claims? 

Believe me, he will not. If anything, he has been getting worse over time.


1 hour ago, LAwLz said:

Believe me, he will not. If anything, he has been getting worse over time.

Even more so, he won't quote users that prove him dead wrong, and keeps going. 

CPU: Amd 7800X3D | GPU: AMD 7900XTX


9 hours ago, patrickjp93 said:

Now, I'm going to cook a gourmet dinner and eat it. Take some time to install a Linux VM (I prefer Fedora, but Ubuntu will be fine) or a dual boot, update GCC using sudo apt-get update or sudo dnf update, copy the code into TextEdit, save the file as a .cpp file, compile it like I showed above, run it like I showed above, and come back to me with a result.

*tips fedora*

How ya doin m'lady

Stuff:  i7 7700k @ (dat nibba succ) | ASRock Z170M OC Formula | G.Skill TridentZ 3600 c16 | EKWB 1080 @ 2100 mhz  |  Acer X34 Predator | R4 | EVGA 1000 P2 | 1080mm Radiator Custom Loop | HD800 + Audio-GD NFB-11 | 850 Evo 1TB | 840 Pro 256GB | 3TB WD Blue | 2TB Barracuda

Hwbot: http://hwbot.org/user/lays/ 

FireStrike 980 ti @ 1800 Mhz http://hwbot.org/submission/3183338 http://www.3dmark.com/3dm/11574089


45 minutes ago, Lays said:

*tips fedora*

How ya doin m'lady

More like, M'agey

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


6 hours ago, christianled59 said:

This isn't to add to the argument, but I can't help but notice your horrid attitude. This entire debate, you've been riding on your massive ego and acting like everyone else is a clueless idiot. It's rather offensive, especially since you claim that I am a terrible software developer with no standards because I condemn this company's actions. 

 

Any way you can tone it down a few notches and back up your claims? 

I've been riding on merit of fact and knowledge of this industry. This is exactly what Nvidia does to get GameWorks into games and what AMD does to get Gaming Evolved in. The only difference is which bottlenecks you have to address.

 

The claims have been backed up. You have done nothing to convince me you have a clue about optimization, so I stand by my claim that you're clueless.

5 hours ago, goodtofufriday said:

Even more so, he won't quote users that prove him dead wrong, and keeps going. 

If users proved me wrong, I'd quote them. Still waiting.

 

4 hours ago, Lays said:

*tips fedora*

How ya doin m'lady

See, it's crap like this which falls under harassment because it's repeated, but the moderators do nothing because I'm somehow the black sheep for having standards, experience, and expertise.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


6 minutes ago, patrickjp93 said:

I've been riding on merit .....

You can shove your "merit". No one cares. 

 

 

6 minutes ago, patrickjp93 said:

If users proved me wrong, I'd quote them. Still waiting.

I'm still waiting as well. It's been almost two weeks.

 

PSU Tier List | CoC

Gaming Build | FreeNAS Server

Spoiler

i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core

Spoiler

FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.


11 hours ago, patrickjp93 said:

In VR?! You're out of your mind. Maybe if vectorized optimization were as good here as it is in HPC land, but with current coding standards? No way in Hell.

What changes when the game runs in VR versus not? It's a head-tracking system, which isn't resource-intensive, and two viewpoints in-game running on what's essentially a giant monitor.

.


2 minutes ago, djdwosk97 said:

You can shove your "merit". No one cares. 

 

Please, show me where you quoted me here: 

 

I didn't have to because I showed how it did look the same, die, PCB, and all.

 

1 minute ago, AlwaysFSX said:

What changes when the game runs in VR versus not? It's a head-tracking system, which isn't resource-intensive, and two viewpoints in-game running on what's essentially a giant monitor.

The 90FPS requirement for two separate displays (not one giant monitor), and yes, all the polling from the headset does eat up CPU cycles.
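
The budget math is rough but illustrative (the only inputs are the display refresh rates, nothing game-specific):

t_{frame} = \frac{1000\,\text{ms}}{90\,\text{Hz}} \approx 11.1\,\text{ms} \qquad \text{vs.} \qquad \frac{1000\,\text{ms}}{60\,\text{Hz}} \approx 16.7\,\text{ms}

So the CPU gets roughly a third less time per frame than in a normal 60 Hz game, and the scene still has to be submitted once per eye, which roughly doubles the draw-call and culling work that has to fit inside that smaller budget.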

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


1 minute ago, patrickjp93 said:

I didn't have to because I showed how it did look the same, die, PCB, and all.

It's not even the right fucking shape. 

PSU Tier List | CoC

Gaming Build | FreeNAS Server

Spoiler

i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core

Spoiler

FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.


Just now, djdwosk97 said:

It's not even the right fucking shape. 

We agree to disagree. But you're yanking this off-topic. Take this to PM or to off-topic discussion.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


2 minutes ago, patrickjp93 said:

We agree to disagree. But you're yanking this off-topic. Take this to PM or to off-topic discussion.

"We agree to disagree"...... Just admit that you were wrong and that you were pulling shit out of your ass. 

 

They're a different shape: the 2830 has notches, the 2830 has a much smaller heat spreader relative to the size of the PCB, and the 2830 has components outside of the heat spreader. You were 100% wrong and you know it. [image: delidded E7 2830]

 

PSU Tier List | CoC

Gaming Build | FreeNAS Server

Spoiler

i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core

Spoiler

FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.


1 minute ago, patrickjp93 said:

The 90FPS requirement for two separate displays (not one giant monitor), and yes, all the polling from the headset does eat up CPU cycles.

It uses CPU cycles, but it should never be enough to negatively impact performance unless you're pinned at 100%, in which case something is wrong with the engine. Vectorization or not, even a couple-of-generations-old i3 has enough power to run modern games without hitching.

.


This topic is now closed to further replies.

