Can Intel's Iris Pro inside Broadwell be a low-end discrete video card killer?

Unless your frame rate is consistently flitting between 70+ and below 50, there's no point in having a FreeSync/G-Sync product, whether it's based on Adaptive Sync or not.

 

It's really unfortunate that your skull is so thick and that you're unable to absorb information that contradicts whatever you heard first.

 

It's also a shame that you think the only games people play are demanding garbage like Crysis and The Witcher... as if people who play CS, Dota, or LoL for the gameplay rather than the graphics, or who play older games, aren't allowed to enjoy smooth gameplay.


*cries in a corner with an Intel HD Graphics 380m*

 

:mellow:


No. But what it does let you do is build a gaming PC without a discrete GPU, buying you time to save up for a better card than you initially had the budget for. You can game at low settings very competently in the meantime.

 

Hurr durr, I missed the "low end" part of your question. If you're referring to something like the GT 740, then absolutely. Maybe even something like the R7 260X.

Link to comment
Share on other sites

Link to post
Share on other sites

I never thought integrated graphics would get this good!

 

This will make for some pretty awesome mobile devices.


Hopefully this causes a major increase in GPU power efficiency, because the power usage of discrete GPUs is quite ludicrous right now.


It's really unfortunate that your skull is so thick and that you're unable to absorb information that contradicts whatever you heard first.

It's also a shame that you think the only games people play are demanding garbage like Crysis and The Witcher... as if people who play CS, Dota, or LoL for the gameplay rather than the graphics, or who play older games, aren't allowed to enjoy smooth gameplay.

*Facepalm* Frame rates in the games you mention tend to be so consistent that they don't need the extra VRR support.


That's pretty amazing if it's true. I don't believe the hyped marketing charts, but if they're within reason, this is pretty cool. People playing MMOs are going to love this; there's not much need to get a card if you really don't want to, especially for older MMOs.


That's pretty amazing if it's true. I don't believe the hyped marketing charts, but if they're within reason, this is pretty cool. People playing MMOs are going to love this; there's not much need to get a card if you really don't want to, especially for older MMOs.

 

Yeah, I used to game on Intel HD 2000 graphics, and the difference between that and now is astounding. Thousands of games will run just fine on these Iris Pros.


*Facepalm* Frame rates in the games you mention tend to be so consistent that they don't need the extra VRR support.

 

It's astonishing how hard it is for you to admit you were proven wrong.

 

Yeah, you just keep facepalming, buddy.


It's astonishing how hard it is for you to admit you were proven wrong.

Yeah, you just keep facepalming, buddy.

I wasn't proven wrong. You disagree with me. You've proven nothing.


I wasn't proven wrong. You disagree with me. You've proven nothing.

 

So for more demanding games, Intel isn't strong enough to use Adaptive Sync.

But for less demanding games, Intel doesn't need to use Adaptive Sync.

That's what you said. Or did you forget already?

Pathetic, really... me, I'm pathetic, for wasting my time with people like you.


So for more demanding games, Intel isn't strong enough to use Adaptive Sync.

But for less demanding games, Intel doesn't need to use Adaptive Sync.

That's what you said. Or did you forget already?

Pathetic, really... me, I'm pathetic, for wasting my time with people like you.

Who uses a 720p monitor these days? No one's getting above 60 FPS at 1080p with even Iris Pro 6200. Therefore, Intel's graphics are still not powerful enough to require VRR support. Now, to be clear, Intel's graphics going back to Ivy Bridge support eDP and DP. They more than likely support Adaptive Sync, but whether or not they will build a standard around that, open or otherwise, is unclear.


Who uses a 720p monitor these days? No one's getting above 60 FPS at 1080p with even Iris Pro 6200. Therefore, Intel's graphics are still not powerful enough to require VRR support. Now, to be clear, Intel's graphics going back to Ivy Bridge support eDP and DP. They more than likely support Adaptive Sync, but whether or not they will build a standard around that, open or otherwise, is unclear.

 

Uh... VRR is beneficial below 60 FPS too. In fact, the 45 FPS range is where I think VRR has the greatest impact on the experience.
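A quick sketch of the arithmetic behind that claim, a minimal Python example assuming idealized timings and a perfectly steady 45 FPS render rate: on a VRR display every frame stays up for exactly one frame time, while on a fixed 60 Hz display with VSync each frame must wait for a refresh boundary, so on-screen times alternate between one and two refreshes.

```python
import math

FRAME_MS = 1000 / 45  # ~22.2 ms per rendered frame at a steady 45 FPS
SCAN_MS = 1000 / 60   # ~16.7 ms per refresh on a fixed 60 Hz display

# VRR: the display refreshes the moment each frame is ready,
# so every frame is shown for exactly one frame time.
vrr = [FRAME_MS] * 6

# Fixed 60 Hz + VSync: a new frame can only go up on a refresh
# boundary, so each frame persists for a whole number of refreshes.
fixed, shown = [], 0.0
for n in range(1, 7):
    ready = n * FRAME_MS
    # next refresh boundary at or after the frame's ready time
    # (the 1e-9 guards against float rounding at exact boundaries)
    boundary = math.ceil(ready / SCAN_MS - 1e-9) * SCAN_MS
    fixed.append(boundary - shown)
    shown = boundary

print([f"{t:.1f}" for t in vrr])    # ['22.2', '22.2', '22.2', ...]
print([f"{t:.1f}" for t in fixed])  # ['33.3', '16.7', '16.7', ...] - judder
```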


Uh... VRR is beneficial below 60 FPS too. In fact, the 45 FPS range is where I think VRR has the greatest impact on the experience.

That's a head-scratcher for me, considering frame pacing down that low tends to be pretty consistent on its own. If Intel finds there's sufficient demand for it, Intel will build it.


That's a head-scratcher for me, considering frame pacing down that low tends to be pretty consistent on its own. If Intel finds there's sufficient demand for it, Intel will build it.

VSync off gives you tearing no matter what FPS you're running, VSync on gives you input lag, and variable refresh rate solves both problems. I don't know why you're scratching your head.


That's a head-scratcher for me, considering frame pacing down that low tends to be pretty consistent on its own. If Intel finds there's sufficient demand for it, Intel will build it.

 

No, it doesn't. At ~45 FPS you either get lag/stuttering with VSync or tearing without it.


I don't think low-end cards will disappear with these better iGPUs, not for now anyway. They're already very poor value for brand-new builds, but they can be a viable option for upgrading older PCs, like the Q6600 that Linus had in his Scrapyard Wars machine, which came out eight years ago. By the time Skylake is eight years old, low-end GPUs will be quite a bit more powerful than even this Iris iGPU. So in short: for new builds, low-end GPUs are already on life support, but as an upgrade for an older system they can still be viable.


VSync off gives you tearing no matter what FPS you're running, VSync on gives you input lag, and variable refresh rate solves both problems. I don't know why you're scratching your head.

I've never in my life seen tearing below a monitor's rated refresh rate. I'll be honest, I never have, and that includes Tomb Raider, emulated PS2 games, Medieval II: Total War live battles... I honestly don't see the point, because I've never bothered with VSync and never seen the tearing.


No, it doesn't. At ~45 FPS you either get lag/stuttering with VSync or tearing without it.

I speak from personal experience, and I assume you do too. I've never seen tearing happen without VSync, and I play enough GW2 that I think I'd notice, considering frame rates tend to sit in the 45 FPS range, sometimes dropping all the way to 11 in big world events.


I speak from personal experience, and I assume you do too. I've never seen tearing happen without VSync, and I play enough GW2 that I think I'd notice, considering frame rates tend to sit in the 45 FPS range, sometimes dropping all the way to 11 in big world events.

 

Well, that's nice for you, but some people do notice the tearing.


Well, that's nice for you, but some people do notice the tearing.

Instead of anecdotes, it would be nice to see real evidence of tearing below the refresh rate. I could almost believe it if your frame rates were bouncing all over hell and creation, but with usually at least 4 ms of leeway from grey-to-grey changes and frame pacing, even without VRR support, it seems theoretically difficult to produce on newer monitors.


I've never in my life seen tearing below a monitor's rated refresh rate. I'll be honest, I never have, and that includes Tomb Raider, emulated PS2 games, Medieval II: Total War live battles... I honestly don't see the point, because I've never bothered with VSync and never seen the tearing.

I see it all the time with my poor 750 Ti. Strangely, exactly 45 FPS is the only point where I don't notice it, so I lock my FPS there for the summer to save on heat. But yes, it does tear below the refresh rate.


Instead of anecdotes, it would be nice to see real evidence of tearing below the refresh rate. I could almost believe it if your frame rates were bouncing all over hell and creation, but with usually at least 4 ms of leeway from grey-to-grey changes and frame pacing, even without VRR support, it seems theoretically difficult to produce on newer monitors.

 

What the heck are you talking about? It's a mathematical fact that you will get tearing if you're running 45 FPS on a 60 Hz monitor without VSync.
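For what it's worth, the math is easy to sketch out. A minimal, idealized Python simulation, assuming the buffer swap happens the instant a frame finishes, scanout fills the whole refresh interval, and a swap landing essentially on a refresh boundary falls into the blanking interval: at a steady 45 FPS on 60 Hz, two of every three buffer swaps land mid-scanout, i.e. a tear line at alternating screen positions.

```python
FRAME_MS = 1000 / 45  # ~22.2 ms between buffer swaps (45 FPS, no VSync)
SCAN_MS = 1000 / 60   # ~16.7 ms per refresh; scanout fills the interval

swaps = [n * FRAME_MS for n in range(1, 10)]  # buffer-swap timestamps
for k in range(12):                           # twelve refresh cycles
    start = k * SCAN_MS
    for s in swaps:
        pos = (s - start) / SCAN_MS  # fraction of the screen scanned out
        # a swap mid-scanout splits the refresh between two frames: a tear
        # (swaps within ~2% of a boundary land in the blanking interval)
        if 0.02 < pos < 0.98:
            print(f"refresh {k}: tear at {pos:.0%} of screen height")
```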


What the heck are you talking about? It's a mathematical fact that you will get tearing if you're running 45 FPS on a 60 Hz monitor without VSync.

Actually, no. All monitors have onboard RAM where the image information is stored, and there are plenty of monitors with dual frame buffers that will just re-display the previous frame until the new one is complete. No need for VSync at all.


Actually, no. All monitors have onboard RAM where the image information is stored, and there are plenty of monitors with dual frame buffers that will just re-display the previous frame until the new one is complete. No need for VSync at all.

 

If it just re-displays the previous frame, then you get lag/stutter equivalent to VSync. That's the point: without VRR you always get a downside.
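To put a number on the "lag" half of that, here's a minimal sketch in Python under the same idealized steady-45-FPS-on-60-Hz assumption: it measures how long each finished frame sits waiting for the next refresh slot while the display repeats the old frame. A VRR display would show each frame immediately, with zero wait.

```python
import math

FRAME_MS = 1000 / 45  # ~22.2 ms per rendered frame at a steady 45 FPS
SCAN_MS = 1000 / 60   # ~16.7 ms per refresh on the fixed 60 Hz display

for n in range(1, 7):
    ready = n * FRAME_MS  # moment the frame finishes rendering
    # it can only replace the repeated old frame on a refresh boundary
    # (the 1e-9 guards against float rounding at exact boundaries)
    boundary = math.ceil(ready / SCAN_MS - 1e-9) * SCAN_MS
    print(f"frame {n}: waits {boundary - ready:4.1f} ms for a refresh slot")
# prints a repeating 11.1 / 5.6 / 0.0 ms pattern: that wait is added
# input lag, on top of the uneven 33.3/16.7 ms on-screen cadence.
```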

