Intel Caught Cheating, Gets a Slap on the Wrist 14 Years Later

sTizzl

Cinebench is a benchmark.

Duhh... So you don't know the Cinebench scandal ??

  ﷲ   Muslim Member  ﷲ

KennyS and ScreaM are my role models in CSGO.

CPU: i3-4130 Motherboard: Gigabyte H81M-S2PH RAM: 8GB Kingston hyperx fury HDD: WD caviar black 1TB GPU: MSI 750TI twin frozr II Case: Aerocool Xpredator X3 PSU: Corsair RM650


The damn thing saved me more money on heating bills, so I can't be bothered.

 

Edit: Wouldn't be eligible anyway, Prescott rolled out in 2004.

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling


Duhh... So you don't know the Cinebench scandal ??

It was hardly a scandal. Intel has no reason to trust the microcode of its competition. If it attempts to compile C/C++ for AMD chips, there will be instructions missing, and some will execute in different counts of clock cycles, leading to a skewed result, assuming it doesn't crash. Cinebench is not a scandal. It's just the reality that Intel can't account for AMD CPUs, because that would require full IP disclosure on AMD's part to get it done.

If AMD and Intel came to a mutual agreement on developing a new Cinebench released with each company's C/C++ compiler, you'd still see AMD falling behind about as much as the ICC-compiled benches say. Disassemble a benchmark, run the assembly through a decompiler, and run that through Clang or GCC. AMD chips don't have 256-bit AVX or SSE units available, not to mention fewer ALUs per core. It's impossible for AMD to beat Intel at Cinebench with its current architectures if you use anything Westmere or newer.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


It was hardly a scandal. Intel has no reason to trust the microcode of its competition. If it attempts to compile C/C++ for AMD chips, there will be instructions missing, and some will execute in different counts of clock cycles, leading to a skewed result, assuming it doesn't crash. Cinebench is not a scandal. It's just the reality that Intel can't account for AMD CPUs, because that would require full IP disclosure on AMD's part to get it done.

If AMD and Intel came to a mutual agreement on developing a new Cinebench released with each company's C/C++ compiler, you'd still see AMD falling behind about as much as the ICC-compiled benches say. Disassemble a benchmark, run the assembly through a decompiler, and run that through Clang or GCC. AMD chips don't have 256-bit AVX or SSE units available, not to mention fewer ALUs per core. It's impossible for AMD to beat Intel at Cinebench with its current architectures if you use anything Westmere or newer.

I didn't say they could beat them, just that the program preferred Intel over AMD CPUs.



Duhh... So you don't know the Cinebench scandal ??

No, I don't, actually. But it wasn't necessary for Intel to fake them, really. Big companies are really stupid, though, so it probably did happen.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


No, I don't, actually. But it wasn't necessary for Intel to fake them, really. Big companies are really stupid, though, so it probably did happen.

That's why I was surprised; they already had the lead, so why, Intel, why?



That's why I was surprised; they already had the lead, so why, Intel, why?

They are stupid. Bad communication, or Intel used a PR firm that thought it would be a good idea to fake results. I bet Intel doesn't use them anymore.



They are stupid. Bad communication, or Intel used a PR firm that thought it would be a good idea to fake results. I bet Intel doesn't use them anymore.

I hope so.



I think Linus bashed it once and now everyone does. Pretty much whatever Linus says, people will blindly follow here.

I suppose the same can be said for Ubisoft bashing with the "keeps on digging" GIF that gets posted around ... 

Interested in Linux, SteamOS and Open-source applications? Go here

Gaming Rig - CPU: i5 3570k @ Stock | GPU: EVGA Geforce 560Ti 448 Core Classified Ultra | RAM: Mushkin Enhanced Blackline 8GB DDR3 1600 | SSD: Crucial M4 128GB | HDD: 3TB Seagate Barracuda, 1TB WD Caviar Black, 1TB Seagate Barracuda | Case: Antec Lanboy Air | KB: Corsair Vengeance K70 Cherry MX Blue | Mouse: Corsair Vengeance M95 | Headset: Steelseries Siberia V2

 

 


I suppose the same can be said for Ubisoft bashing with the "keeps on digging" GIF that gets posted around ... 

 

 

I think everyone's on the same page with that one, though. They're kinda fucked.


Please stop posting WCCF crap.

 

The only issues I've seen with WCCF were rumors they tagged with clickbait, and seemingly just flat-out making up their own rumors for the hell of it.

 

This article has a link to a Gizmodo article, as well as a link to the official lawsuit site with all the details... you have an issue with this?


I think everyone's on the same page with that one, though. They're kinda fucked.

Over what? Resolution and framerates? Since when did gamers become so narrow-minded? 

 

Regardless of what reasons Ubisoft gives, the real reason for the sub-1080p resolution and lackluster framerate is the underpowered hardware on Consoles limiting development, something which shouldn't and wouldn't affect the PC version of the game. Yes, some parts aren't optimized, but you can still get much higher levels of graphic fidelity at 1080p resolutions with minimal tweaking of settings on the PC version. Ubisoft is hardly the only developer forced to deliver sub-1080p resolutions and sub-60 framerates to maintain higher levels of graphic fidelity on console versions, regardless of the "excuses" they come up with. 

 

I'm usually the last person to say this: gameplay over graphics. Ubisoft does an outstanding job with attention to detail in building their in-game environments. They do an outstanding job with a believable and authentic story. They've created some of the most breath-taking environments in their games (Assassin's Creed, Far Cry to name a couple). It makes up for the resolution and framerates, which frankly, aren't a problem on the PC version (and if it is, only affecting a small portion of players - the vocal minority). 

 

And before you say something like "well, framerate is important for first-person shooter games", bear in mind: Ubisoft doesn't develop many First-person shooters, and when they do, it's largely single-player where framerates will not give you an advantage; Ubisoft's multi-player modes are nothing to write home about, and are largely played for the single-player. Framerate advantages are only useful for multiplayer to get an edge over other players, but that's hardly an issue for Ubisoft games. 

 

The hate is unwarranted and blind bandwagoning. 

 

Back on topic: Seeing as this news is from something that happened 14 years ago, it's hardly relevant now. Intel doesn't need faked results to beat out AMD in raw performance. 



Can we ban the WCCF reaction posts? They add absolutely nothing to the conversation. You don't like the source? Take a second to find a better one and post it, rather than trying to look cool by bashing and derailing a thread...


The only issues I've seen with WCCF were rumors they tagged with clickbait, and seemingly just flat-out making up their own rumors for the hell of it.

 

This article has a link to a Gizmodo article, as well as a link to the official lawsuit site with all the details... you have an issue with this?

OP posted the link to WCCF, which gives them more clicks. This is just like advertising their POS site.

NEW PC build: Blank Heaven   minimalist white and black PC     Old S340 build log "White Heaven"        The "LIGHTCANON" flashlight build log        Project AntiRoll (prototype)        Custom speaker project

Spoiler

Ryzen 3950X | AMD Vega Frontier Edition | ASUS X570 Pro WS | Corsair Vengeance LPX 64GB | NZXT H500 | Seasonic Prime Fanless TX-700 | Custom loop | Coolermaster SK630 White | Logitech MX Master 2S | Samsung 980 Pro 1TB + 970 Pro 512GB | Samsung 58" 4k TV | Scarlett 2i4 | 2x AT2020

 


OP posted the link to WCCF, which gives them more clicks. This is just like advertising their POS site.

The difference in this case is that this is legitimate news almost no one else is covering currently.



The difference in this case is that this is legitimate news almost no one else is covering currently.

You just said they linked to a Gizmodo article...



You just said they linked to a Gizmodo article...

Gizmodo is the only outlet covering this right now. WCCF is perfectly within its rights to cover it, despite the fact that I think it's old news.



Over what? Resolution and framerates? Since when did gamers become so narrow-minded? 

 

Regardless of what reasons Ubisoft gives, the real reason for the sub-1080p resolution and lackluster framerate is the underpowered hardware on Consoles limiting development, something which shouldn't and wouldn't affect the PC version of the game. Yes, some parts aren't optimized, but you can still get much higher levels of graphic fidelity at 1080p resolutions with minimal tweaking of settings on the PC version. Ubisoft is hardly the only developer forced to deliver sub-1080p resolutions and sub-60 framerates to maintain higher levels of graphic fidelity on console versions, regardless of the "excuses" they come up with. 

 

I'm usually the last person to say this: gameplay over graphics. Ubisoft does an outstanding job with attention to detail in building their in-game environments. They do an outstanding job with a believable and authentic story. They've created some of the most breath-taking environments in their games (Assassin's Creed, Far Cry to name a couple). It makes up for the resolution and framerates, which frankly, aren't a problem on the PC version (and if it is, only affecting a small portion of players - the vocal minority). 

 

And before you say something like "well, framerate is important for first-person shooter games", bear in mind: Ubisoft doesn't develop many First-person shooters, and when they do, it's largely single-player where framerates will not give you an advantage; Ubisoft's multi-player modes are nothing to write home about, and are largely played for the single-player. Framerate advantages are only useful for multiplayer to get an edge over other players, but that's hardly an issue for Ubisoft games. 

 

The hate is unwarranted and blind bandwagoning. 

 

Back on topic: Seeing as this news is from something that happened 14 years ago, it's hardly relevant now. Intel doesn't need faked results to beat out AMD in raw performance. 

 

Sorry if this is off topic, but I had to say something about it...

 

The real issue is not about "gameplay over graphics"... sure, they do an outstanding job at gameplay, but nice graphics are nice too, especially when I've already spent thousands to play games at 1080p and 60fps... it's the excuses they give, shoving those words into our mouths... if you can't do it, say you can't do it... I don't know about others, but I most certainly can understand that... but coming out and saying 30fps is more cinematic, that's what disgusts me (and most probably the whole community).

 

I know all they talk about is consoles, but as you also mentioned, it shouldn't affect the PC. Now, here's the problem... Ubisoft's recent history is "console parity". They push consoles' interests on PC. The Watch Dogs parity scandal, and announcing a GTX 680 as the bare minimum for AC: Unity? And not to mention, there are games on PC locked at 30fps; probably none from Ubisoft yet, but if they keep on at this rate I'm pretty sure they have plans to do it... call me paranoid, but it has happened with other publishers/developers before, so I'd say it's totally plausible Ubisoft would do it (errrr... it already happened in The Crew?)

 

"Ubisoft keeps on digging" is really correct impression.

 

Oh, btw... I play the pixelated Terraria at 1080p and make sure I get 60fps gameplay. So, yeah... even in pixelated games I like that resolution and framerate...

Gadgets: Lenovo Y580 (Nostromo, Naga Epic, Hydra, TrackIr5), Surface Pro 3 (i3), Lumia 930, PSVita

Rig: i7-4770K, 8GB Kingston HyperX, Asus Strix GTX970, MSI Z87-GD65, Asus Xonar DGX, CMStorm Scout II, CM Seidon 240M, BlackWidow Ultimate, Naga Epic, Goliathus Extended Control, TrackIr5, Sennheiser HD205, Audio-Technica ATR2500, Edifier speaker, Logitech G940, Logitech G27, Logitech F710, Dell S2340L, Philips 200VL, Samsung 830 128GB SSD, DXRacer FA01N


http://wccftech.com/intel-settles-15-year-class-action-lawsuit-faking-benchmarks/

 

 

 

 

Seriously? And I thought Intel was reputable back then. I sure hope they aren't still attempting to pull these kinds of shenanigans!

 

And this really makes me wonder what companies like Intel, Nvidia, and AMD can really do to get inflated benchmark numbers.

 

Don't know why this is shocking, and this is why being a fanboy of a company, instead of a specific product, is stupid. These companies all do shady stuff, and in the past AMD and Intel were a lot closer; AMD was even ahead at one time.

 

Take benchmarks of the GTX 970: sites will put it up against a throttling reference R9 290 or 290X (and those reference cards were bad), or skip 1440p and 4K benchmarks, or skip the in-game downsampling/supersampling options in the game settings themselves. You don't think Nvidia has anything to do with that? A good-brand R9 290 is still a very good card and can beat a GTX 980 at higher resolutions, yet the 970 is marketed on every website as the greatest thing to ever hit GPUs. This is all marketing BS.

 

 http://forums.anandtech.com/showthread.php?t=2403455

 

Shadow of Mordor saw the same thing. The GTX 970 is one hell of a card, and very impressive on temps and power usage on reference models, but it isn't a wonder card that blows everything away.

 

Add to this? The same companies you think hate each other, like AMD and Nvidia, have been caught fixing prices on GPUs, while idiots and fanboys of one brand say the other is trash, lol.

 

http://www.tomshardware.com/news/nvidia-amd-ati-graphics,6311.html

 

Be a fan of a product, not a company, and make sure the benchmarks and tests validate that fandom. If I had to rank them by shadiness? Microsoft takes the cake over any hardware maker as far as paying off bloggers, "journalists", and astroturfing. There is a reason MS spends so much money lobbying: they are shady as hell and they want to avoid fines. Nvidia would be next, then Intel, and probably AMD last. AMD is only less shady because they don't have the money to slant benchmarks or hire astroturfers of their own. If AMD had more money they would be just as bad.

 

http://richg42.blogspot.com/2014/05/the-truth-on-opengl-driver-quality.html

 

^ A former Valve guy who worked on the OpenGL/SteamOS implementation with the vendors. Vendor A = Nvidia, B = AMD, C = Intel. He says Nvidia is referred to as the video card mafia. He likes Nvidia as a product, just like most people do; they usually make quality products. But the BS they will stoop to? I don't think anyone is really a fan of that.

 

I have no doubt AMD would be just as bad if they were ahead and had the money to do the same shady crap. Hell, the only reason I think we even got Mantle is because AMD was behind on single-core CPU performance and needed a low-level API to compete in games. If they weren't, I don't know if we ever would have gotten a low-level API; they would have just kept selling us expensive CPUs that were overkill for what games required.

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


Duhh... So you don't know the Cinebench scandal ??

What scandal? 

[image: TDLx2vT.png]

All CPUs at 2.8GHz there, in R15.

[image: jYnes7s.png]

Seems like the Phenom/Bulldozer performed better in Cinebench than in that other rendering benchmark.

[image: SC2HotS-pcgh.png]

Didn't you know about the StarCraft 2 scandal? I remember you saying every benchmark out there is fabricated/biased except the ones from Tek Syndicate. Also, you pulled that from Wendell's mouth, who read it off this document: http://www.ftc.gov/sites/default/files/documents/cases/091216intelcmpt.pdf

That's a document from the government, I assume? There are only accusations in there; there was no Cinebench scandal, as AMD performed worse in every other benchmark. Cinebench is one of the few benchmarks where AMD keeps up; in all the others it just falls massively behind.



@Faa there was a time Cinebench didn't compile code optimized for AMD chips. This was a problem because same-gen chips on near-equal process nodes were getting 5-point differences. Nowadays Intel tries, based on some guarantees and specs from AMD, but it's still far from perfect, and there is no AMD quad core that can match a Sandy Bridge quad, much less a Haswell. The FX-9590 keeping up with the i7-3820 is not an accomplishment...



@Faa there was a time Cinebench didn't compile code optimized for AMD chips. This was a problem because same-gen chips on near-equal process nodes were getting 5-point differences. Nowadays Intel tries, based on some guarantees and specs from AMD, but it's still far from perfect, and there is no AMD quad core that can match a Sandy Bridge quad, much less a Haswell. The FX-9590 keeping up with the i7-3820 is not an accomplishment...

It's AMD's own fault for not offering their own compiler for Windows; they do have one for Linux, however. Let's not blame the Intel compiler for nerfing AMD's performance and pretend the 8350 has a bunch more potential left. AMD's CPUs actually benefit more from Intel's compiler than from the one from MS; it's just a flawed Bulldozer CPU with 2006-era IPC at 4GHz.

 


It's AMD's own fault for not offering their own compiler for Windows; they do have one for Linux, however. Let's not blame the Intel compiler for nerfing AMD's performance and pretend the 8350 has a bunch more potential left. AMD's CPUs actually benefit more from Intel's compiler than from the one from MS; it's just a flawed Bulldozer CPU with 2006-era IPC at 4GHz.

You have to remember Intel has a huge vested interest in making x86 highly effective and easy to use on any platform. AMD has to lean on the instructions Intel creates since Intel will never buy a license to AMD instructions beyond x64. Plus AMD is trying to get out from under x86 and move completely to ARM.


