Will Apple switch to Intel graphics for MacBook Pro?

fade2green514

To paraphrase your argument: "Intel makes good drivers for their iGPUs."

The entire IT industry (excluding Intel) begs to differ. No one says they make bad drivers, just not good ones; they have always been 'meh'. There is no need for a 1.5-page debate on this topic.

 

Also, abductive reasoning is superior /s

 

EDIT:

After looking at some of your other posts... you do actually claim to know everything....

 

Love the arrogance, keep it up, and remember: you don't need to provide facts and supporting arguments, that's for plebs.

Read it again. I don't claim to know everything. I claim to have never said anything incorrect, which has mostly panned out. Being wrong is fun, because I learn when I am. This is no such instance, and your reasoning is so weak my freshman students could have ripped it to shreds. If you intend to step into the ring with me, you better punch so hard I don't remain conscious.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

People who think Intel's iGPUs are unacceptable for productivity...

R7 M370X: 640 shaders at 800 MHz = 640 * 2 * 0.8*10^9 = 1024 GFlops

4950HQ Iris Pro: 384 shaders at 1.15 GHz = 384 * 2 * 1.15*10^9 = 883.2 GFlops

A downgrade, but not a bad tradeoff for a cooler-running machine

GT4e Iris Pro: 576 shaders at 1.0 GHz = 576 * 2 * 1*10^9 = 1.152*10^12 = 1.152 TFlops

576 shaders at 1.15 GHz = 576 * 2 * 1.15*10^9 = 1.3248*10^12 = 1.3248 TFlops

An upgrade, and you get the cooler running machine!
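For reference, the arithmetic above is just the usual peak-FP32 estimate: shaders × 2 FLOPs per cycle (assuming FMA) × clock speed. A quick illustrative sketch of that calculation in Python, using the figures quoted above (the helper name is mine, nothing official):

```python
# Theoretical peak FP32 throughput: shaders * 2 FLOPs/cycle (FMA) * clock in GHz.
# Illustrative only -- it ignores memory bandwidth, ROPs, drivers and thermals.
def peak_gflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz

for name, shaders, clock in [
    ("R7 M370X",            640, 0.80),
    ("4950HQ Iris Pro",     384, 1.15),
    ("GT4e Iris Pro @1.00", 576, 1.00),
    ("GT4e Iris Pro @1.15", 576, 1.15),
]:
    print(name, peak_gflops(shaders, clock), "GFLOPS")
# Prints roughly 1024.0, 883.2, 1152.0 and 1324.8 GFLOPS for the four entries.
```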

If only it were this simple. By that logic AMD's Fury X should just completely destroy the Titan X... but as Nvidia has proven time and time again, pure processing power isn't the only thing they're selling you, and pure processing power doesn't always translate to better performance.

If only it were this simple. By that logic AMD's Fury X should just completely destroy the Titan X... but as Nvidia has proven time and time again, pure processing power isn't the only thing they're selling you, and pure processing power doesn't always translate to better performance.

Sigh, no it doesn't. You're using a straw man argument. The Fury X is VRAM-limited in getting the flops it's rated for, and ROP-limited in getting the gaming performance potential the chip has. For a ~1 TFLOP chip, system RAM is easily adequate.
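To put rough numbers on the ROP side of that argument (approximate public specs, purely illustrative): pixel fill rate scales roughly with ROPs × clock, and there the Fury X (64 ROPs at ~1.05 GHz) actually trails the Titan X (96 ROPs at ~1.0 GHz) despite its large lead in raw FLOPS.

```python
# Back-of-envelope comparison only; figures are approximate public specs.
def peak_tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000.0  # FP32, assuming FMA

def fill_rate_gpix(rops: int, clock_ghz: float) -> float:
    return rops * clock_ghz  # billions of pixels per second

for name, shaders, rops, clock in [
    ("Fury X",  4096, 64, 1.05),
    ("Titan X", 3072, 96, 1.00),
]:
    print(name, round(peak_tflops(shaders, clock), 1), "TFLOPS,",
          round(fill_rate_gpix(rops, clock), 1), "GPix/s")
# Fury X: ~8.6 TFLOPS but only ~67.2 GPix/s; Titan X: ~6.1 TFLOPS but ~96 GPix/s.
```

Which is one concrete way a chip can lead on paper FLOPS and still lose in games.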

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

Their MacBook Pros haven't been "pro" for a very long time. It's sad that so many design students buy 15" MacBook Pros when a Dell Precision M3800 costs the same and yet has a 4K screen, weighs less, and has a Quadro :wacko:.

 

I don't understand why they can't put a K1100M in it either. It's not going to mess with temps any more than the M370X would.

Why would they put in a slow K1100M? It performs worse than a GT 740M.

Because of Quadro driver support? That doesn't matter under OS X; Apple makes its own drivers anyway...

Mini-Desktop: NCASE M1 Build Log
Mini-Server: M350 Build Log

People who think Intel's iGPUs are unacceptable for productivity...

 

R7 M370X: 640 shaders at 800 MHz = 640 * 2 * 0.8*10^9 = 1024 GFlops

4950HQ Iris Pro: 384 shaders at 1.15 GHz = 384 * 2 * 1.15*10^9 = 883.2 GFlops

A downgrade, but not a bad tradeoff for a cooler-running machine

GT4e Iris Pro: 576 shaders at 1.0 GHz = 576 * 2 * 1*10^9 = 1.152*10^12 = 1.152 TFlops

576 shaders at 1.15 GHz = 576 * 2 * 1.15*10^9 = 1.3248*10^12 = 1.3248 TFlops

An upgrade, and you get the cooler running machine!

 

Those numbers are interesting.

Read it again. I don't claim to know everything. I claim to have never said anything incorrect, which has mostly panned out. Being wrong is fun, because I learn when I am. This is no such instance, and your reasoning is so weak my freshman students could have ripped it to shreds. If you intend to step into the ring with me, you better punch so hard I don't remain conscious.

Maybe you should look at the definition of arrogance. Then re-read what you have typed.

 

Also, you might want to read up on Graham's hierarchy of disagreement. Your entire counter-argument is ad hominem with name-calling mixed in.

 

With regards to my other point, about what's-his-face, Jim Keller: you stated, based on your 'investor conference call', that he was heading to Intel, and to watch for a media statement today or tomorrow. All indications so far are that this is incorrect (http://www.thespynews.com/hot-news/2015/09/21/cpu-wiz-jim-keller-leaves-amd-to-join-apple/767 - not a very reliable source, in fact it is shit, but currently the only source reporting his next whereabouts, and from what I have seen a lot more reliable than your predictions), and the Intel corporate page shows no events until the 11th of October or some such date, where they'd disclose this information. See how I just refuted your central point? It may well be that you're correct in this case, but a good counter-argument is not "I am right, I said so, I am doing a masters therefore I claim the higher ground".

 

My original post was simply me agreeing with goodbyte, by stating that university students in general, while having a lot of theoretical knowledge, don't yet understand how to transfer that knowledge into practical real-world scenarios, which is why when you do join the workforce, in whatever field you have chosen, it will be as a jr. dev / jr. research assistant / jr. sysadmin. (Yes, I know you interned at IBM; having worked directly for IBM in the past, I can safely say that unless you're in the R&D tower, it doesn't actually mean anything. I did ask you which tower you were going to work in before your internship a while back, but I never actually got a response, maybe shed some light? That said, as a masters degree requires no original thought to obtain (assuming you are getting your masters by completing a thesis), it would be unlikely you would be doing R&D; you'd think they'd leave that all to the PhD students.) This being the IT industry, experience is king; failing that (teaching does not equal IT experience), certification or a degree comes second (i.e. I know which buttons to press, or I can make an educated guess on which buttons to press). I could also pull out my fairly impressive education and use that as a debate point, but this is the internet, no one cares.

Logic:

 

PC Users: Oh, that'll give me good performance for my money, I'll spend $500 on that PC.

 

Apple Users: Ooh! It's shiny! I'll buy it for $3000!

There are people who bought an Audi A3 instead of a Skoda Fabia. Imagine that. Why aren't we all driving Lada Nivas?

 

I don't really play many games for gameplay anymore honestly. I play most games just for the graphics.

 

Maybe you should look at the definition of arrogance. Then re-read what you have typed.

 

Also, you might want to read up on Graham's hierarchy of disagreement. Your entire counter-argument is ad hominem with name-calling mixed in.

 

With regards to my other point, about what's-his-face, Jim Keller: you stated, based on your 'investor conference call', that he was heading to Intel, and to watch for a media statement today or tomorrow. All indications so far are that this is incorrect (http://www.thespynews.com/hot-news/2015/09/21/cpu-wiz-jim-keller-leaves-amd-to-join-apple/767 - not a very reliable source, in fact it is shit, but currently the only source reporting his next whereabouts, and from what I have seen a lot more reliable than your predictions), and the Intel corporate page shows no events until the 11th of October or some such date, where they'd disclose this information. See how I just refuted your central point? It may well be that you're correct in this case, but a good counter-argument is not "I am right, I said so, I am doing a masters therefore I claim the higher ground".

 

My original post was simply me agreeing with goodbyte, by stating that university students in general, while having a lot of theoretical knowledge, don't yet understand how to transfer that knowledge into practical real-world scenarios, which is why when you do join the workforce, in whatever field you have chosen, it will be as a jr. dev / jr. research assistant / jr. sysadmin. (Yes, I know you interned at IBM; having worked directly for IBM in the past, I can safely say that unless you're in the R&D tower, it doesn't actually mean anything. I did ask you which tower you were going to work in before your internship a while back, but I never actually got a response, maybe shed some light? That said, as a masters degree requires no original thought to obtain (assuming you are getting your masters by completing a thesis), it would be unlikely you would be doing R&D; you'd think they'd leave that all to the PhD students.) This being the IT industry, experience is king; failing that (teaching does not equal IT experience), certification or a degree comes second (i.e. I know which buttons to press, or I can make an educated guess on which buttons to press). I could also pull out my fairly impressive education and use that as a debate point, but this is the internet, no one cares.

No, I disagree, using facts as my basis, and then I insult you. I'm allowed to accomplish multiple goals in a single response, am I not?

Also, I'm offered senior or managerial positions for development and software engineering. It helps having a GitHub page as a portfolio, with more than 400K lines of bug-free code to show off. Application isn't nearly as difficult as theory. Once the theory is settled, applying it in code is merely brute force, a translation from one logical language (math) to another (code).

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

They already have.

Only the most expensive MacBook Pro comes with both an iGPU and a dedicated GPU. The rest of the MacBook Pro lineup uses only an iGPU. The 13" model runs on Intel Iris 6100 and the 15" runs on Intel Iris Pro.

http://www.apple.com/macbook-pro/specs-retina/

Intel Xeon E5 1650 v3 @ 3.5GHz 6C:12T / CM212 Evo / Asus X99 Deluxe / 16GB (4x4GB) DDR4 3000 Trident-Z / Samsung 850 Pro 256GB / Intel 335 240GB / WD Red 2 & 3TB / Antec 850w / RTX 2070 / Win10 Pro x64

HP Envy X360 15: Intel Core i5 8250U @ 1.6GHz 4C:8T / 8GB DDR4 / Intel UHD620 + Nvidia GeForce MX150 4GB / Intel 120GB SSD / Win10 Pro x64

 

HP Envy x360 BP series Intel 8th gen

AMD ThreadRipper 2!

5820K & 6800K 3-way SLI mobo support list

 

Don't you love it when university kids think they know everything?

I used to know everything. But then I got a job and reality hit me on the head with an adamantium hammer...

There are people who bought an Audi A3 instead of a Skoda Fabia. Imagine that. Why aren't we all driving Lada Nivas?

Valuable life lesson: You don't know how much of a piece of shit the product you're using is until you've used something better. 

I've driven an A4, S4, C300, Cayman S, Macan S, Honda Insight, RAV4, Hyundai Sonata, Cadillac ATS, and probably a few others I can't remember... and holy crap, the difference between any of the German cars and the rest is well worth the higher price points (although I personally hated the C300, as I found it drove very meh for me -- I prefer at least a bit of road feel and not insane levels of turbo lag).

 

So perspective is everything.

PSU Tier List | CoC

Gaming Build | FreeNAS Server

Spoiler

i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core

Spoiler

FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.

Valuable life lesson: You don't know how much of a piece of shit the product you're using is until you've used something better. 

I've driven an A4, S4, C300, Cayman S, Macan S, Honda Insight, RAV4, Hyundai Sonata, Cadillac ATS, and probably a few others I can't remember... and holy crap, the difference between any of the German cars and the rest is well worth the higher price points.

VW's electrical issues have been way too prevalent for years for me to trust a German car as my next one. Audi was just as bad for a while.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

VW's electrical issues have been way too prevalent for years for me to trust a German car as my next one. Audi was just as bad for a while.

Pre-2004 I would agree, post 2004 Audi has been doing really well. 

 

And of all the cars I've driven, I do really enjoy the experience of the A4 (S4 better, and Cayman S even better -- but that's mainly due to outright performance more than the driving experience, although yes, the two are very related). 

PSU Tier List | CoC

Gaming Build | FreeNAS Server

Spoiler

i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core

Spoiler

FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.

Pre-2004 I would agree, post 2004 Audi has been doing really well. 

 

And of all the cars I've driven, I do really enjoy the experience of the A4 (S4 better, and Cayman S even better -- but that's mainly due to outright performance more than the driving experience, although yes, the two are very related). 

The 2008 VW Jetta was still potentially a death trap if the electronics failed on the highway and shut down the whole car.

I'm happy with my 2012 Hyundai Elantra and probably will be until 2025. Hopefully by then we'll have solid hydrogen fuel-cell cars that can go 2000+ miles on a single fill. We'll break 1000 miles before the end of the decade on a 15-gallon tank.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

Intel iGPUs on the pro and ARM processors for the regular MacBooks? Crazy to see how powerful these once "weak platforms" are becoming.

Intel iGPUs on the pro and ARM processors for the regular MacBooks? Crazy to see how powerful these once "weak platforms" are becoming.

I thought the MacBook was based on Broadwell-Y.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

I thought the MacBook was based on Broadwell-Y.

It is. 

PSU Tier List | CoC

Gaming Build | FreeNAS Server

Spoiler

i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core

Spoiler

FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.

I thought the MacBook was based on Broadwell-Y.

 

I'm more relating it to the rumors that the Pros would go Iris only and the regular MacBooks would switch to ARM

I only see them doing this for lower-priced models; Final Cut Pro X uses GPU acceleration.

2017 Macbook Pro 15 inch

I'm more relating it to the rumors that the Pros would go Iris only and the regular MacBooks would switch to ARM

Ah, ok :)

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

-snip snip-

Always like seeing your posts; good to see points that are well thought out.

Higher frame rate over higher resolution.

CPU-i5 4690k -GPU-MSI 970 sli -Mobo-MSI g45 gaming -Memory-16gb crucial ballistix -PSU- EVGA 80+ gold g2 850w -Case- corsair 200r

Monitors- Acer XB240H, Asus ROG Swift, Dell P2815Q 2160p  -Keyboard- Corsair k70 RGB -Mouse- Corsair M65 -Mouse Pad- Glorious Extended Pad -Headphone- BeyerDynamic DT990 250ohm, Senheiser HD 518, Fiio E10k

Not that I disagree with the point... but you can get a 15" MacBook Pro for $1999 (education pricing), and not many people buy the fully specced-out one, since the bottom-end 15" (which is very well specced) doesn't perform much worse than one specced at $2999. For ~$2400 you can get one with a dedicated GPU.

The difference in price is big enough that we should try to be a little more accurate.

I have a 2019 macbook pro with 64gb of ram and my gaming pc has been in the closet since 2018
