AMD's quarterly earnings report - operating loss of $49 million and net loss of $102 million

zMeul

buying AMD would be like playing the lottery with your entire savings account

What happens to our brand-new Zen and Polaris if AMD goes under? You're screwed! Never mind Microsoft phasing you out, AMD will do it for them ^_^

A scary thought indeed :S Scary stuff. Maybe the console makers will decide to make an (unlikely) new console and give a nice big lump sum? That'd be nice too.

Bleigh! Ever hear of the AC series?


As an investor this is great news. We can see the trend: the operating loss is getting smaller and smaller. Anyone who has invested in a company doesn't care whether this particular quarter is a loss or not; they care about the trend and the ROI over a longer period. This shows that the company will slowly but steadily work its way to a profit over the next few years. And that is perfect.

 

There are only like three of us in this comment thread who see this as a positive. Like you said, in a situation like the one AMD is in, you look at it quarter to quarter, and a year is basically overnight in the business world. It's like some of the people here expect AMD to go from operating at a major loss to making millions of dollars in profit in a single quarter.


Cpu: Ryzen 9 3900X – Motherboard: Gigabyte X570 Aorus Pro Wifi  – RAM: 4 x 16 GB G. Skill Trident Z @ 3200mhz- GPU: ASUS  Strix Geforce GTX 1080ti– Case: Phankteks Enthoo Pro M – Storage: 500GB Samsung 960 Evo, 1TB Intel 800p, Samsung 850 Evo 500GB & WD Blue 1 TB PSU: EVGA 1000P2– Display(s): ASUS PB238Q, AOC 4k, Korean 1440p 144hz Monitor - Cooling: NH-U12S, 2 gentle typhoons and 3 noiseblocker eloops – Keyboard: Corsair K95 Platinum RGB Mouse: G502 Rgb & G Pro Wireless– Sound: Logitech z623 & AKG K240


Not too surprising. With Nvidia owning the high end and Intel tearing up the low end, with most Iris Pro models being roughly equal to a 750 Ti in gaming performance, AMD really has nowhere left to go.

A 750 Ti costs $100 or so. The Iris Pro found in the 5775C costs upwards of $400. I don't see your logic.

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


AMD is the reason you're not paying 1000 dollars for a 970.

No, demand is why. There is very little room for Nvidia to increase prices to attain higher profit margins. Even monopolies are not immune to basic microeconomics. In a market with elastic demand (GPUs), increasing prices will decrease the number of units sold, period. Losing AMD only shifts the demand curve. It does not turn that curve on its head. Would you people educate yourselves before speculating? It's truly annoying picking up after you.
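To make that concrete, here is a toy example with completely made-up numbers (a hypothetical linear demand curve and a hypothetical unit cost, nothing from Nvidia's actual books). It just shows that profit peaks at some price and then falls off, even with zero competition:

```python
# Toy model with made-up numbers: a hypothetical linear demand curve and a
# hypothetical unit cost, not real Nvidia figures. The point is simply that
# profit peaks at some price and then collapses, competitor or no competitor.

def units_sold(price, intercept=2000, slope=2.0):
    """Hypothetical elastic demand: the higher the price, the fewer units sold."""
    return max(intercept - slope * price, 0)

def profit(price, unit_cost=250):
    """Profit = margin per unit times units sold at that price."""
    return (price - unit_cost) * units_sold(price)

for price in (400, 625, 800, 1000):
    print(f"price ${price}: units sold {units_sold(price):.0f}, profit ${profit(price):,.0f}")

# With these made-up numbers profit peaks around $625 and is zero at $1000,
# which is the sense in which "even monopolies are not immune to basic microeconomics".
```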

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


No, demand is why. There is very little room for Nvidia to increase prices to attain higher profit margins. Even monopolies are not immune to basic microeconomics. In a market with elastic demand (GPUs), increasing prices will decrease the number of units sold, period. Losing AMD only shifts the demand curve. It does not turn that curve on its head. Would you people educate yourselves before speculating? It's truly annoying picking up after you.

A truly hilarious read.

- CPU: Intel i7 3770 - GPU: MSI R9 390 - RAM: 16GB of DDR3 - SSD: Crucial BX100 - HDD: Seagate Barracuda 1TB -

 


Not really, as the cards they rebadged beat their Nvidia counterparts.

 

Not too sure about that tbh but okay.


Chernobyl

AMD FX8350 @ 5GHz | Asus Sabretooth 990FX R2 | 16GB HyperX Savage @1950mhz CL9 | 120GB Kingston SSDNow

EK AMD LTX CSQ | XSPC D5 Dual Bay | Alphacool NexXxoS XT45 240mm & Coolgate Triple HD360

 


Kraken

Intel i5 4670K Bare Die 4.9GHz | ASUS Maximus VII Ranger Z97 | 16GB HyperX Savage 2400MHz | Samsung EVO 250GB

EK Supremecy EVO & EK-MOSFET M7G  | Dual 360mm Rads | Primochill CTR Phase II w/D5 | MSI GTX970 1670MHz/8000MHz

 

Graphic Design Student & Overall Nerd

 


I can, but I as a customer pay for this crap to be implemented in the game. Remember that this replaces what would otherwise be high end settings made by the devs themselves. So I can either have a worse looking game than I paid for, or get shit performance because NVidia wants to lock in all their consumers and thus undermine competition (vendor lock in and unfair business strategies).

Devs get GW for free and Nvidia helps them implement it. If it didn't exist, no dev would even bother.

You know why? Because it's a terrible investment for the game studio. Whether we like it or not, AAA gaming is still bigger on consoles, which don't benefit from features as high-tech as GW. It also requires hiring additional people, as the math behind some of that fancy smoke and dust is ridiculously complex. That all adds up to a very expensive effect, just for some smoke. Add to that the fact that it'll run fairly heavy, so 99% of people won't even use it, and I think it's rather easy to see why a company won't bother.

You know who does benefit from high-tech eye candy though? Companies that make high-tech GPUs. So they make a suite of eye candy and provide it to the devs for free, including assistance with implementation. Boom, marketing.

Rest assured that you didn't pay for GameWorks; the devs spent next to no money on it. What you did pay for was the lowering of the general visual fidelity that needs to happen in order for the game to run on consoles ("optimization", aka people lowering texture resolutions and mesh density). You also paid for that stupid commercial you hate, and so on...

You know who did pay for GameWorks? I did, along with everybody else who owns an Nvidia card. You'd better start blaming us for the piss-poor performance of GW games.

Do I think you're being unreasonable? No, not at all. Do I think Nvidia should optimize GW for AMD cards? Nope, but they should allow AMD to do so. Personally, I find GW to be largely useless. I could use it, I guess, but it still has a lot of impact on a high-end Nvidia setup. It's a niche thing, IMHO.

"It's a taxi, it has a FARE METER."


A 750 Ti costs $100 or so. The Iris Pro found in the 5775C costs upwards of $400. I don't see your logic.

Iris Pro costs $70. The CPU costs $330. Is that really too much to ask?

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Iris Pro costs $70. The CPU costs $330. Is that really too much to ask?

Why is the Iris Pro not on Pentiums then?

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


A truly hilarious read.

What exactly is hilarious about what I said? It's one hundred percent true. Nvidia would lose profits if it drove prices upward too much. They're already very close to the max they can price at (so is Intel, and every other non-commodity seller).

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Why is the Iris Pro not on Pentiums then?

The GPU has to be integrated into the die, so it's probably a basic cost-benefit analysis: Intel figures more enthusiasts will buy it at a decent premium than entry-level buyers would at a lower premium.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


I'm so glad I dumped my AMD stocks a long time ago so I didn't lose all that much dosh. Hopefully they can bounce back midyear with the new GPUs and whatnot.

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling


Devs get GW for free and Nvidia helps them implement it. If it didn't exist, no dev would even bother.

You know why? Because it's a terrible investment for the game studio. Whether we like it or not, AAA gaming is still bigger on consoles, which don't benefit from features as high-tech as GW. It also requires hiring additional people, as the math behind some of that fancy smoke and dust is ridiculously complex. That all adds up to a very expensive effect, just for some smoke. Add to that the fact that it'll run fairly heavy, so 99% of people won't even use it, and I think it's rather easy to see why a company won't bother.

You know who does benefit from high-tech eye candy though? Companies that make high-tech GPUs. So they make a suite of eye candy and provide it to the devs for free, including assistance with implementation. Boom, marketing.

Rest assured that you didn't pay for GameWorks; the devs spent next to no money on it. What you did pay for was the lowering of the general visual fidelity that needs to happen in order for the game to run on consoles ("optimization", aka people lowering texture resolutions and mesh density). You also paid for that stupid commercial you hate, and so on...

You know who did pay for GameWorks? I did, along with everybody else who owns an Nvidia card. You'd better start blaming us for the piss-poor performance of GW games.

Do I think you're being unreasonable? No, not at all. Do I think Nvidia should optimize GW for AMD cards? Nope, but they should allow AMD to do so. Personally, I find GW to be largely useless. I could use it, I guess, but it still has a lot of impact on a high-end Nvidia setup. It's a niche thing, IMHO.

 

Actually, devs have to pay for source code access to GW: either a little for access, or more for editing rights. That said, of course they get "incentives" to implement GW in a game. Don't be naive. Maybe it's not outright money, but instead hardware, direct access to NVidia engineers/programmers, maybe even having NVidia do some of the programming for them, saving the dev a lot of money.

I completely disagree. Graphics is very much a selling point; it always has been, and still is. No one would whine about the downgrade from teasers to the final product if it didn't matter. Also, you are incorrect: Tomb Raider, Rise of the Tomb Raider and Deus Ex: Mankind Divided all have TressFX hair in the PlayStation 4 versions. It's about scalability and optimization, two things GW sucks at hard.

I don't mind graphics APIs making high-end effects available, but they need to be open, scalable and optimized. As for companies like Ubisoft, they should just go the route of EA and have a dedicated engine group. That would make development a lot cheaper, while making the engine much more optimized and able to contain far more advanced graphics features. For mid-tier devs, something like AMD's OpenFX (or whatever they call it) is much better due to source code access and vendor agnosticism.

Battlefront had really nice graphical effects, but the game could still look nice on a low-end card. EA benefited from that, not the GPU vendors. NVidia, however, benefits from making GW overly complex and overly resource-demanding. That is true, and it is also yet another reason why GW needs to die.

I still pay for the time and development cost of getting GW into the game, and I still get an inferior game that runs worse, especially since the GW contract seems to require the game to be highly optimized for NVidia, even at the cost of AMD performance.

That I completely agree on. I don't expect NVidia to optimize GW for AMD, only to give the dev and/or AMD the possibility to do so. However, OpenFX is the best of both worlds.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Actually, devs have to pay for source code access to GW: either a little for access, or more for editing rights. That said, of course they get "incentives" to implement GW in a game. Don't be naive. Maybe it's not outright money, but instead hardware, direct access to NVidia engineers/programmers, maybe even having NVidia do some of the programming for them, saving the dev a lot of money.

I completely disagree. Graphics is very much a selling point; it always has been, and still is. No one would whine about the downgrade from teasers to the final product if it didn't matter. Also, you are incorrect: Tomb Raider, Rise of the Tomb Raider and Deus Ex: Mankind Divided all have TressFX hair in the PlayStation 4 versions. It's about scalability and optimization, two things GW sucks at hard.

I don't mind graphics APIs making high-end effects available, but they need to be open, scalable and optimized. As for companies like Ubisoft, they should just go the route of EA and have a dedicated engine group. That would make development a lot cheaper, while making the engine much more optimized and able to contain far more advanced graphics features. For mid-tier devs, something like AMD's OpenFX (or whatever they call it) is much better due to source code access and vendor agnosticism.

Battlefront had really nice graphical effects, but the game could still look nice on a low-end card. EA benefited from that, not the GPU vendors. NVidia, however, benefits from making GW overly complex and overly resource-demanding. That is true, and it is also yet another reason why GW needs to die.

I still pay for the time and development cost of getting GW into the game, and I still get an inferior game that runs worse, especially since the GW contract seems to require the game to be highly optimized for NVidia, even at the cost of AMD performance.

That I completely agree on. I don't expect NVidia to optimize GW for AMD, only to give the dev and/or AMD the possibility to do so. However, OpenFX is the best of both worlds.

 

Indeed, game studios do benefit from GW/OFX, as they can slap "in-engine footage" on their trailers and still make them look great (a big part of that is also AE and SpeedGrade, but I digress). It's still not practical in use though, not even "OpenFX" (which is actually not a real thing; OpenFX is an API for writing compositing plugins. IIRC it's called the Radeon SDK). I still think (likewise for GW) those effects only look nice in trailers, never in-game. I've never played with TressFX/HairWorks on, as the effects just look hideous outside a controlled environment (like a "gameplay" trailer).

I never said graphics aren't a selling point, but I did say that spending ludicrous amounts of money reinventing the wheel just isn't very good business practice. TressFX on consoles is the first time any of these effects is actually kind of worth it, yet it'll still look as bad as it did in Tomb Raider, because simulated hair just doesn't work in games and because the PS4's GPU is worse than my phone's.

Let's talk about engines for a second.

Ubisoft is a bit special - their games are arguably very different in their core features. Their engines (Dunia, AnvilNext and Disrupt) already share a lot of graphics and open-world code, but having a different engine for different genres actually allows for better optimization of game-specific features. Take the climbing code in AC, for example: I can imagine AnvilNext contains some very specific tools just for that. Dunia, on the other hand, has its own specialities, like better vehicle support and FPS stuff.

EA's approach is not necessarily better, as the studios still need to implement a lot of gameplay features themselves, something Frostbite (which originally was an FPS engine) might not always handle equally well.

In both cases, graphics code is shared, so neither solution is really better than the other.

I'd strongly prefer both VisualFX and the Radeon SDK to die in a fire. They're both useless and unnecessary. They still cover a minority of the market - with the exception of the TressFX implementation on console - and are essentially useless outside of trailer making and marketing. In-engine features would arguably be better, though I still don't feel hair simulation is at a stage where we want it in games.

I'm still bothered by the fact that you insist on complaining about having paid for GW. You paid for so many things in the game that you never use. You paid for the game to be translated into 50 languages, none of which you speak. You paid for the game to be voice-acted in some of those languages. You paid for the game to have low, medium, high and ultra settings, not all of which you use. You paid the same amount for a digital version, but didn't get a box. That's the way this works.

"It's a taxi, it has a FARE METER."


Actually, they don't perform "terribly" in The Witcher 3.

I played the game from beginning to end three times: once with my old GTX 660, a second time with my GTX 970 and a third time with my R9 290X, and I can tell you there's little to no difference, slightly in favor of the 290X. I couldn't run HairWorks on my GTX 970 and I can't run it on my 290X either, due to framerates dropping too low. I mean, I can run it, but this overtessellated piece of garbage from NVIDIA is not worth the hassle. The HBAO+ that's included is decent and costs like 3-4 frames per second on my 290X, so it's cool.

So, TL;DR: The Witcher 3 runs well on AMD cards. My flatmate has an R9 380 and it runs very well (better than a GTX 960), and my R9 290X runs it just as well as, or a little better than, my 970. Can't say much for Fallout 4, but on recent drivers it seems that if you tweak one or two settings it's more than fine on R9 390 cards, for example.

 

Do not mind Majestic; he is a blatant fanboy. He cannot take any form of criticism and will most likely fling insults and cherry-picked benchmarks at you because you said something that "proves" him wrong.


Do not mind Majestic; he is a blatant fanboy. He cannot take any form of criticism and will most likely fling insults and cherry-picked benchmarks at you because you said something that "proves" him wrong.

Oh, I didn't read the whole conversation. I just happen to play The Witcher 3 a lot, on different GPUs from Nvidia and AMD, and thought I should correct some misconceptions. If what you said is true, then he probably won't believe me, so I won't try convincing him then ;-;

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


Oh, I didn't read the whole conversation. I just happen to play The Witcher 3 a lot, on different GPUs from Nvidia and AMD, and thought I should correct some misconceptions. If what you said is true, then he probably won't believe me, so I won't try convincing him then ;-;

He tried to get a mod to censor me yesterday because, if you ask him, a Phenom X4 9000 series (same as a Q6600) and an R9 280 is a bad combo xD

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


Oh, I didn't read the whole conversation. I just happen to play The Witcher 3 a lot, on different GPUs from Nvidia and AMD, and thought I should correct some misconceptions. If what you said is true, then he probably won't believe me, so I won't try convincing him then ;-;

Know your fanboys.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


I hope Zen and Polaris deliver so AMD can get back in the race as a top dog without operating at a loss.

I'll happily get a Polaris GPU if it really does meet their performance expectations and can match or best my current setup; same for the Zen line.

Please be good!

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


Indeed, game studios do benefit from GW/OFX, as they can slap "in-engine footage" on their trailers and still make them look great (a big part of that is also AE and SpeedGrade, but I digress). It's still not practical in use though, not even "OpenFX" (which is actually not a real thing; OpenFX is an API for writing compositing plugins. IIRC it's called the Radeon SDK). I still think (likewise for GW) those effects only look nice in trailers, never in-game. I've never played with TressFX/HairWorks on, as the effects just look hideous outside a controlled environment (like a "gameplay" trailer).

I never said graphics aren't a selling point, but I did say that spending ludicrous amounts of money reinventing the wheel just isn't very good business practice. TressFX on consoles is the first time any of these effects is actually kind of worth it, yet it'll still look as bad as it did in Tomb Raider, because simulated hair just doesn't work in games and because the PS4's GPU is worse than my phone's.

Let's talk about engines for a second.

Ubisoft is a bit special - their games are arguably very different in their core features. Their engines (Dunia, AnvilNext and Disrupt) already share a lot of graphics and open-world code, but having a different engine for different genres actually allows for better optimization of game-specific features. Take the climbing code in AC, for example: I can imagine AnvilNext contains some very specific tools just for that. Dunia, on the other hand, has its own specialities, like better vehicle support and FPS stuff.

EA's approach is not necessarily better, as the studios still need to implement a lot of gameplay features themselves, something Frostbite (which originally was an FPS engine) might not always handle equally well.

In both cases, graphics code is shared, so neither solution is really better than the other.

I'd strongly prefer both VisualFX and the Radeon SDK to die in a fire. They're both useless and unnecessary. They still cover a minority of the market - with the exception of the TressFX implementation on console - and are essentially useless outside of trailer making and marketing. In-engine features would arguably be better, though I still don't feel hair simulation is at a stage where we want it in games.

I'm still bothered by the fact that you insist on complaining about having paid for GW. You paid for so many things in the game that you never use. You paid for the game to be translated into 50 languages, none of which you speak. You paid for the game to be voice-acted in some of those languages. You paid for the game to have low, medium, high and ultra settings, not all of which you use. You paid the same amount for a digital version, but didn't get a box. That's the way this works.

 

Idk about that. I really liked Lara Croft's hair in Tomb Raider, and looking at the tech demos from Rise of the Tomb Raider, it looks even better now, as it can get wet, have snowflakes in it, etc. I also really like WaveWorks in Watch Dogs, as that looks really nice (ironically, the low WW setting looked more natural than the high-end setting). I think these effects look very good. We've also seen how it is implemented in Deus Ex: MD, both on Adam Jensen and on NPCs/enemies. And those demos are on a PS4.

 

I mean just look at Adam's hair in the beginning here:

 

 

I think it looks a thousand times better than mesh hair. In the first gameplay trailer we ever saw, Adam takes down an enemy with neck-long hair, and that looked really good too. The problem, as I said, is not how advanced these effects are, but how resource-wasteful they are. GW tends to be extremely wasteful even on a 980 Ti, whereas the new version of TressFX, with its master/slave strand principle, seems to be much easier to run. It also scales with distance, etc.

 

Of course what to prefer is subjective, but I think it's objectively more natural-looking.

Oh, btw, RotTR also had TressFX hair on the XBone. See here:

 

https://youtu.be/IlUxKO5zYOA?t=162

 

Look from 2:42, especially when snow and water get on the hair. It might not look like real life (what does?), but it does look better than any other hair tech on the market (and this is on the low setting for the Xbotato).

 

Well, investing a lot of money in high-end effects can be a marketing ploy in and of itself. GameWorks is a testament to that, despite its horrible performance, but other games have used their graphics quite well as a selling point too.

 

As for Frostbite, idk. Afaik it works equally well on everything from Battlefront to Command & Conquer to Plants vs. Zombies. Again, you have a dedicated team who know everything about their engine and can thus easily add new features and optimizations for the game devs creating the gameplay/graphics. I would definitely recommend Ubisoft go this way, maybe basing it on the Snowdrop engine from The Division, which looks insanely good.

 

Not sure about that example. Localizations are paid for by the regions they are sold in: if, let's say, the Danish GUI/subtitle translations that Ubisoft is so good at doing would not pay off in Denmark, they wouldn't do them. Translations also don't make my gaming experience visually worse, or my gameplay for that matter, and neither do they come with strings attached to a specific hardware vendor.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Not too surprising. With Nvidia owning the high end and Intel tearing up the low end, with most Iris Pro models being roughly equal to a 750 Ti in gaming performance, AMD really has nowhere left to go.

Most? There is not an Iris Pro SKU out for consumers at the moment that is faster than a GTX 750, let alone a GTX 750 Ti. Now, the GT4e Skylake part is rumored to be faster, but we will not know that until it is released, and even if it is, that does not make "most" of the Iris Pro SKUs faster. Only one.

 

 

Iris Pro costs $70. The CPU costs $330. Is that really too much to ask?

 
Technically speaking, the box price for the 5775C was supposed to be $377, $27 more than the launch price of the 4790K, making the Iris Pro even cheaper. The problem is that a lack of supply has pushed the price above the intended MSRP.

 

 

Why is the Iris Pro not on Pentiums then?

 
They would most likely lose money putting it on a Pentium. The point of a stronger iGPU is to appeal to budget gamers and mobile partners, but the desktop Pentium is not exactly a "budget gamer" product. It is intended to be a business-grade SKU, and no business is going to want to pay a premium for a Pentium with an iGPU that adds no benefit to their Word documents or Excel worksheets.
 
However, I believe putting these iGPUs on locked CPUs (such as i3s and locked i5s) would be a genius idea. Think about it: people buying -K CPUs are more than likely overclocking, meaning they are prone to buying expensive hardware for more performance, meaning they should have a dGPU in the first place. Non-K CPUs mean people either don't want to OC or can't afford the Z boards, K SKUs and aftermarket cooling (budget-limited), and are most likely going to buy a cheap dGPU to go along with them. Pairing these SKUs with a stronger iGPU would increase their demand in the budget gaming market, allowing Intel to add a small premium to them. I would pay $50 more for an i3 if it would save me $100 on a GTX 750 Ti, not to mention I could build a smaller PC, have fewer fans, etc.
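To put rough numbers on that (every price here is hypothetical, purely to illustrate the trade-off):

```python
# Back-of-the-envelope version of the trade-off above. Every price is hypothetical,
# just to illustrate why a stronger iGPU on a locked i3 could make sense for budget builds.

i3_price = 130          # assumed price of a plain locked i3
igpu_premium = 50       # the premium I'd happily pay for a stronger iGPU
entry_dgpu_price = 100  # rough price of an entry-level card like a GTX 750 Ti

conventional_build = i3_price + entry_dgpu_price  # i3 plus a cheap dGPU
igpu_only_build = i3_price + igpu_premium         # i3 with the beefier iGPU, no dGPU

print(f"i3 + entry dGPU:    ${conventional_build}")
print(f"i3 + stronger iGPU: ${igpu_only_build}")
print(f"saved:              ${conventional_build - igpu_only_build}")
# And that is before counting the smaller case, fewer fans and lower-wattage PSU.
```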

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


They would most likely lose money putting it on a Pentium. The point of a stronger iGPU is to appeal to budget gamers and mobile partners, but the desktop Pentium is not exactly a "budget gamer" product. It is intended to be a business-grade SKU, and no business is going to want to pay a premium for a Pentium with an iGPU that adds no benefit to their Word documents or Excel worksheets.

 

Pentium is definitely a budget gamer CPU. Especially the G3258. No business overclocks (except LMG).


Pentium is definitely a budget gamer CPU. Especially the G3258. No business overclocks (except LMG).

A lot of people find it hard to recommend dual cores anymore after the Far Cry 4 and DA Inquisition fiasco.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


-snip-

Keep in mind these are non-GAAP results, which, in every country but America, are not worth the paper they are written on; you know, America not following the IASB and all.

 

(GAAP = generally accepted accounting principles).
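For anyone unfamiliar with the term: non-GAAP figures start from the GAAP result and add back items the company itself deems one-off or non-cash (typically stock-based compensation, restructuring charges, amortization of acquired intangibles). A toy reconciliation with entirely made-up numbers:

```python
# Toy non-GAAP reconciliation with entirely made-up numbers. The company starts from
# the GAAP result and adds back items it deems one-off or non-cash; which items get
# added back is the company's own choice, which is exactly why people distrust it.

gaap_net_loss = -120_000_000  # hypothetical GAAP quarterly net loss

addbacks = {
    "stock-based compensation": 20_000_000,             # hypothetical
    "restructuring charges": 30_000_000,                # hypothetical
    "amortization of acquired intangibles": 10_000_000, # hypothetical
}

non_gaap_net_loss = gaap_net_loss + sum(addbacks.values())
print(f"GAAP net loss:     ${gaap_net_loss:,}")
print(f"non-GAAP net loss: ${non_gaap_net_loss:,}")  # looks $60M better on paper
```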

