
Can Intel's Iris Pro inside Broadwell be a low-end discrete video card killer?

ignore

 

I refuse to ignore this post. I fully acknowledge it.

CPU: Intel i5-4690K w/ Noctua NH-D15 GPU: Gigabyte G1 980 Ti MOBO: MSI Z97 Gaming 5 RAM: 16GB Corsair Vengeance Boot drive: 500GB Samsung Evo Storage: 2x 500GB WD Blue, 1x 2TB WD Black, 1x 4TB WD Red

 

 

 

 

"Whatever AMD is losing in suddenly becomes the most important thing ever." - Glenwing, 1/13/2015

 


no, that card is beside the point; the air-cooled GTX 780 Ti cannot use that much power

that card is most likely built for LN2

 

yes:

[Image: GTX 780 Ti Classified at 1900 MHz]

 

Point is, we've only seen the water-cooled version of Fiji, which should be able to handle 2x 8-pin just fine. We don't know how much power it actually needs, but we've heard Fiji will be a 300 W TDP part, just like the 290X, which runs just fine on air. So I don't understand the worry. But of course, we will all be much wiser when the card finally gets reviewed.
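As a quick sanity check on the connector math (a minimal sketch; the 75 W slot and 150 W 8-pin figures are the PCIe spec limits, and the 300 W TDP is just the rumor above):

```python
# Rough power-budget check for a 2x 8-pin card.
PCIE_SLOT_W = 75    # PCIe x16 slot limit per the PCIe spec
EIGHT_PIN_W = 150   # limit per 8-pin PCIe power connector

available_w = PCIE_SLOT_W + 2 * EIGHT_PIN_W  # 375 W total
rumored_tdp_w = 300                          # rumored Fiji TDP (see above)

print(f"available: {available_w} W, rumored TDP: {rumored_tdp_w} W, "
      f"headroom: {available_w - rumored_tdp_w} W")
# -> available: 375 W, rumored TDP: 300 W, headroom: 75 W
```

So even at spec limits, slot plus 2x 8-pin leaves about 75 W of headroom over the rumored TDP.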

Watching Intel have competition is like watching a headless chicken trying to get out of a minefield

CPU: Intel i7-4790K @ 4.6 GHz with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810 / MX Master; OS: Windows 10 Pro


Point is, we've only seen the water-cooled version of Fiji, which should be able to handle 2x 8-pin just fine. We don't know how much power it actually needs, but we've heard Fiji will be a 300 W TDP part, just like the 290X, which runs just fine on air. So I don't understand the worry. But of course, we will all be much wiser when the card finally gets reviewed.

GDDR5 operates OK at high temperatures; my, let's say, "fear" is that HBM cannot withstand those temperatures, and an actual air-cooled Fiji might be technically impossible without putting a serious limit on its clock

 

example: Micron's GDDR5

[Image: Micron GDDR5 datasheet excerpt]


GDDR5 operates OK at high temperatures; my, let's say, "fear" is that HBM cannot withstand those temperatures, and an actual air-cooled Fiji might be technically impossible without putting a serious limit on its clock

 

example: Micron's GDDR5

 

 

I seriously doubt Fiji will go above 95°C. AFAIK, only Fermi has ever done such a thing, and the 290X was only close due to a really bad stock cooler; custom coolers had no problem. The VRAM on some Titan Xs went over 100°C, so I really doubt we will see a general issue here.

 

[Image: graphics card PCB]

Watching Intel have competition is like watching a headless chicken trying to get out of a minefield

CPU: Intel i7-4790K @ 4.6 GHz with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810 / MX Master; OS: Windows 10 Pro


I seriously doubt Fiji will go above 95°C. AFAIK, only Fermi has ever done such a thing, and the 290X was only close due to a really bad stock cooler; custom coolers had no problem. The VRAM on some Titan Xs went over 100°C, so I really doubt we will see a general issue here.

yes, but we are talking about current GDDR5 temps, not HBM - I still can't find HBM data from SK Hynix


With the investment in HBM, I think AMD will implement it wherever it makes economic sense. I definitely think 2016 will be a very interesting year for AMD's portfolio of products.

Agreed.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


yes, but we are talking about current GDDR5 temps, not HBM - I still can't find HBM data from SK Hynix

 

HBM has thermal dummy bumps for dissipating heat. The silicon interposer takes most of the heat, but it looks like the HBM module can handle over 100°C on the silicon:

 

http://www.setphaserstostun.org/hc26/HC26-11-day1-epub/HC26.11-3-Technology-epub/HC26.11.310-HBM-Bandwidth-Kim-Hynix-Hot%20Chips%20HBM%202014%20v7.pdf

 

Page 24

Watching Intel have competition is like watching a headless chicken trying to get out of a minefield

CPU: Intel i7-4790K @ 4.6 GHz with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810 / MX Master; OS: Windows 10 Pro


thanks!

so my assumption was incorrect 

 

We won't know for sure until reviews are in. I just don't think it's an issue.

Watching Intel have competition is like watching a headless chicken trying to get out of a minefield

CPU: Intel i7-4790K @ 4.6 GHz with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810 / MX Master; OS: Windows 10 Pro


Aren't Zen APUs due in 2017? That's a long time if Intel is already starting to pull ahead.


Aren't Zen APUs due in 2017? That's a long time if Intel is already starting to pull ahead.

IMHO, it would be wise for AMD to make a surprise Carrizo desktop launch at the 65 W mark with some eDRAM. They're just letting Intel walk all over them.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


IMHO, it would be wise for AMD to make a surprise Carrizo desktop launch at the 65 W mark with some eDRAM. They're just letting Intel walk all over them.

I have my doubts about Carrizo on the desktop. It doesn't seem to scale well past 35 W.

They would have to do a separate die.


I have my doubts about Carrizo on the desktop. It doesn't seem to scale well past 35 W.

They would have to do a separate die.

Where's the proof of scaling issues?

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Where's the proof of scaling issues?

The high-density library they implemented will limit the clock speed compared to a high-performance library.

 

Also, if Carrizo could scale properly, why wouldn't AMD have considered it?

If it didn't have scaling issues, it surely would compete against high-TDP Kaveri.


The high-density library they implemented will limit the clock speed compared to a high-performance library.

Also, if Carrizo could scale properly, why wouldn't AMD have considered it?

If it didn't have scaling issues, it surely would compete against high-TDP Kaveri.

The desktop space is drying up. I'm sure AMD saw it as a revenue vs. investment sort of deal.

That's also an assumption, not proof.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Funny how most of the games benchmarked were CPU-bound. And how everyone disses AMD for doing this, but when Intel does it, it's all fine and dandy.

Really now.

[witty signature]


Aren't Zen APUs due in 2017? That's a long time if Intel is already starting to pull ahead.

Zen will ship sometime in 2016. The rumor is 2H 2016, although AMD pushed K12 back to 2017 so they can ramp Zen development and get it to market sooner.


Funny how most of the games benchmarked were CPU-bound. And how everyone disses AMD for doing this, but when Intel does it, it's all fine and dandy.

Really now.

I think I can explain it. Everyone knows AMD has the graphics lead. Seeing all these APUs without decent CPU performance gains, an area AMD lacks in, when we're still not seeing heterogeneous acceleration in consumer software, just leads to a dislike of APUs. They represent something not helpful to AMD's fanbase.

Intel, on the other hand, sucks at graphics. I'll admit it. They're behind on core count and in architecture in general where gaming is concerned. For Intel to make the big leap in graphics and take the lead position from AMD is a great feat, no matter how small the lead may be and no matter how much it's AMD's fault for not keeping up in memory bandwidth.

Given that the 128MB of L4 cache in the 5775C augments its performance to the point it keeps pace with the 4790K in CPU benches, while also letting its graphics cores breathe to the point they can beat AMD's offerings, I can only imagine how much AMD's Carrizo would benefit from having that same 128MB of cache onboard. It would cream Intel's iGPU and really piss on Nvidia's parade. Not to mention Carrizo's CPU performance would be augmented too, to the point it might not be so disappointing to the fans.

People are disappointed in AMD's lack of innovation and are proud of Intel for making big strides, going from a no-name in graphics to a worthwhile iGPU competitor since it started making iGPUs with Sandy Bridge (or did in-house designs begin with IVB?).

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


that's exactly the problem
Fiji dissipates a lot of heat, and the HBM chips are very close to the GPU die itself, making thermal contact via the same heatsink - that's a very big problem if HBM operating temperatures must not reach high values

Well, you cannot forget that HBM has its own thermal management and dummy bumps for thermal dissipation. HBM has built-in sensors, and as the stack gets hotter, the more often it will cycle its heat out to the PCB. Temperatures around 85°C are normal for HBM stacks, according to Hynix.
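To illustrate the idea (a purely conceptual sketch, not Hynix's published HBM behavior; the 64 ms / 32 ms values follow the usual DRAM convention of halving the refresh interval above 85°C):

```python
# Conceptual sketch: temperature-compensated DRAM refresh.
# A hotter stack refreshes more often, trading a little bandwidth for
# data integrity. Values follow the common DDR convention (refresh
# interval halves above 85 C), not Hynix-published HBM numbers.
def refresh_interval_ms(temp_c: float, base_ms: float = 64.0,
                        threshold_c: float = 85.0) -> float:
    """Return the full-array refresh interval for a given stack temperature."""
    return base_ms / 2 if temp_c > threshold_c else base_ms

for t in (60, 85, 95):
    print(f"{t} C -> refresh every {refresh_interval_ms(t):.0f} ms")
# 60 C -> 64 ms, 85 C -> 64 ms, 95 C -> 32 ms
```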

 

There are a lot of pages here; are there any comments mentioning that AnandTech has the i5-5675C at a little under 300 bucks?

-snip-

 

As impressive as the performance is, I'd hardly call a system with a $300 CPU a "budget build", and the 7870K may be behind performance-wise, but the i5 is costing you double for those 12 FPS.

 

I'm sure somebody has pointed this out somewhere in here and been appropriately smashed, though...

Higher-frequency memory will close that gap even more. They ran their benchmarks on 2133 MHz DDR3; the move to 2400 MHz will yield another 3-5 FPS. A little overclock to match up frequencies and you can easily get equivalent or better performance out of a chip half the price. The platform is cheaper as well.
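For the bandwidth side of that claim, a minimal sketch (assuming dual-channel DDR3 with a 64-bit bus per channel; the 3-5 FPS figure is the estimate above, not something computed here):

```python
# Peak theoretical bandwidth for dual-channel DDR3 (64 bits = 8 bytes per channel).
def peak_bw_gb_s(mt_per_s: int, channels: int = 2, bytes_per_chan: int = 8) -> float:
    return mt_per_s * channels * bytes_per_chan / 1000

bw_2133 = peak_bw_gb_s(2133)  # ~34.1 GB/s
bw_2400 = peak_bw_gb_s(2400)  # ~38.4 GB/s
print(f"2133 MT/s: {bw_2133:.1f} GB/s, 2400 MT/s: {bw_2400:.1f} GB/s, "
      f"gain: {(bw_2400 / bw_2133 - 1) * 100:.1f}%")
# -> 2133 MT/s: 34.1 GB/s, 2400 MT/s: 38.4 GB/s, gain: 12.5%
```

Since the iGPU shares that memory bus, a ~12.5% bandwidth bump is a plausible source of a few extra frames.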

 

Yeah, cherry-pick a Kingpin card that is specifically designed (over-designed) to take additional power for water/LN2 cooling.

Who's to say AMD didn't purposely throw more power delivery at the card so enthusiasts can overclock to their liking? It is water-cooled out of the box, so it's not a possibility that can be ruled out.


People are disappointed in AMD's lack of innovation and are proud of Intel for making big strides, going from a no-name in graphics to a worthwhile iGPU competitor since it started making iGPUs with Sandy Bridge (or did in-house designs begin with IVB?).

 

Intel's been making (and designing) iGPUs for a long time. Ivy Bridge's HD Graphics was mostly just a scaled-up version of Sandy Bridge, which is where they moved it on-die and added Quick Sync. Clarkdale is when they moved it on-package (in a separate die next to the CPU itself) and it first stopped being complete and utter ass, and where they changed the name from Intel GMA (yuck) to HD Graphics.


Intel's been making (and designing) iGPUs for a long time. Ivy Bridge's HD Graphics was mostly just a scaled-up version of Sandy Bridge, which is where they moved it on-die and added Quick Sync. Clarkdale is when they moved it on-package (in a separate die next to the CPU itself) and it first stopped being complete and utter ass, and where they changed the name from Intel GMA (yuck) to HD Graphics.

But Intel GMA and the first iterations of HD Graphics were all Imagination Technologies designs/chips that Intel just copied verbatim, and the primary reason for having them was of course office users. I'm trying to pinpoint when Intel actually dove into doing fully in-house designs.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


But Intel GMA and the first iterations of HD Graphics were all Imagination Technologies designs/chips that Intel just copied verbatim, and the primary reason for having them was of course office users. I'm trying to pinpoint when Intel actually dove into doing fully in-house designs.

 

That went into their Atom processors; they developed their own GPU pretty early on, like in 2006 when they changed from a fixed-function pipeline to a programmable one. That's 100% their tech.

 

Edit - behold, the future of graphics:

 

[Image]


That went into their Atom processors; they developed their own GPU pretty early on, like in 2006 when they changed from a fixed-function pipeline to a programmable one. That's 100% their tech.

Edit - behold, the future of graphics:

 

I'm pretty sure even at that point Intel just did a hack job on a licensed 3dfx or Imagination Technologies design. I'll start digging back through the history. I don't think Intel's been doing in-house architectures for 9 years, but maybe I'm wrong; of course, lord knows they weren't trying to do anything big with it, and I wouldn't count anything until around Nehalem against their 15-year victory clock (referencing the original Intel-IBM war, where Intel went from ground zero to market domination).

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Keep in mind that these are at low settings and resolutions. We still have no idea how they'll scale with more complex graphics and higher resolutions. But still, it looks impressive.

2017 MacBook Pro 15-inch

