
So apparently, the Intel/Radeon marriage is a thing that's happening

jasonc_01
Solved by captain cactus

sauce: https://newsroom.intel.com/editorials/new-intel-core-processor-combine-high-performance-cpu-discrete-graphics-sleek-thin-devices/

 

Quote

The new product, which will be part of our 8th Gen Intel Core family, brings together our high-performing Intel Core H-series processor, second generation High Bandwidth Memory (HBM2) and a custom-to-Intel third-party discrete graphics chip from AMD’s Radeon Technologies Group* – all in a single processor package.

It’s a prime example of hardware and software innovations intersecting to create something amazing that fills a unique market gap. Helping to deliver on our vision for this new class of product, we worked with the team at AMD’s Radeon Technologies Group. In close collaboration, we designed a new semi-custom graphics chip, which means this is also a great example of how we can compete and work together, ultimately delivering innovation that is good for consumers.

So this was the semi-custom design AMD talked about a while back. It has actually happened. Here's Intel's video:

 

 

So yeah. We now have an Intel CPU with an AMD RTG GPU in the 35-55W TDP range.

 

Here's how they did that:

 

[Image: Intel-8th-Gen-CPU-discrete-graphics.jpg]

 

That's a single HBM2 stack, so we're likely limited to 4GB of video RAM. The details of the AMD GPU are unknown at this point; it's likely a Vega-based GPU, but details such as SP count and clock speeds are not known yet.
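For a rough sense of scale, here's a quick back-of-the-envelope on what one HBM2 stack gives you. These are my assumptions, not confirmed specs for this part: a 1024-bit interface per stack, somewhere between 1.6 and 2.0 Gbps per pin, and a 4-Hi stack at 4GB.

```python
# Back-of-the-envelope numbers for a single HBM2 stack.
# Assumptions (not confirmed for this Intel/AMD part): 1024-bit interface
# per stack, 1.6-2.0 Gbps per pin, and a 4-Hi stack giving 4GB of capacity.

BUS_WIDTH_BITS = 1024  # one HBM2 stack exposes a 1024-bit interface

def stack_bandwidth_gbs(pin_rate_gbps: float) -> float:
    """Peak bandwidth of one stack in GB/s at a given per-pin data rate."""
    return BUS_WIDTH_BITS * pin_rate_gbps / 8  # bits -> bytes

for rate in (1.6, 2.0):
    print(f"{rate} Gbps/pin -> {stack_bandwidth_gbs(rate):.0f} GB/s per stack")
# 1.6 Gbps/pin -> 205 GB/s per stack
# 2.0 Gbps/pin -> 256 GB/s per stack
```

So even one stack lands in roughly the 200-256 GB/s range, which is plenty for a 35-55W mobile GPU.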

47 minutes ago, RagnarokDel said:

Actually it makes perfect sense; they make money from Ryzen-based laptops and from Intel-based laptops.

Making less money and making more money isn't the same.


This is str8 killer dope

PC Specs: i7 6800K (4.4GHz), Asus X99-Deluxe II, Zotac GTX 1080 AMP Extreme, 16GB DDR4 Dominator Platinum, EVGA G2 750W, Phanteks Enthoo Primo, 3TB WD Black, 500GB 850 Evo, H100i GTX, Windows 10, K70 RGB, G502, HyperX Cloud 2s, Asus MX34, Samsung 960 EVO

Just keeping this here as a backup: 980ti Zotac Stock BIOS.zip


This is crazy. Intel has finally thrown in the towel. And who did they decide to partner with? The obvious choice...? Nope. It's AMD! xD Both companies have some to gain and some to lose by doing this, but overall, I see this as a favorable outcome. We (at SA) are speculating the chip is destined for Apple, and FreeSync on these chips will be a huuuge plus.

 

Really, the biggest loser I see here is Nvidia. Even Intel had enough of their shit. Their cross-licensing agreement expired recently, and Intel not only didn't bother renewing it, they didn't even consider (the objectively better) Nvidia GPUs for their new series of processors. Apple similarly ditched Nvidia years ago as well, and I wonder why... 9_9

Quote

The problem is that this is an nVidia product and scoring any nVidia product a "zero" is also highly predictive of the number of nVidia products the reviewer will receive for review in the future.

On 2015-01-28 at 5:24 PM, Victorious Secret said:

Only yours, you don't shitpost on the same level that we can, mainly because this thread is finally dead and should be locked.

On 2016-06-07 at 11:25 PM, patrickjp93 said:

I wasn't wrong. It's extremely rare that I am. I provided sources as well. Different devs can disagree. Further, we now have confirmed discrepancy from Twitter about the use of the pre-release 1080 driver in AMD's demo despite the release 1080 driver having been out a week prior.

On 2016-09-10 at 4:32 PM, Hikaru12 said:

You apparently haven't seen his responses to questions on YouTube. He is very condescending and aggressive in his comments with which there is little justification. He acts totally different in his videos. I don't necessarily care for this content style and there is nothing really unique about him or his channel. His endless dick jokes and toilet humor are annoying as well.

 

 


4 minutes ago, leadeater said:

Holy shit... sorry but my jaw just fell off. Totally speechless, Intel working with AMD, where are the flying pigs?

In a US government lab. 

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List  |  The Real Reason Delidding Improves Temperatures  |  "2K" does not mean 2560×1440


1 hour ago, MyName13 said:

Making less money and making more money isn't the same.

AMD doesn't currently have the chops to compete in the niche this product is going for. Meanwhile, this gets their graphics name out there on a top-end consumer product without impacting the sales of all their lower-end stuff.


27 minutes ago, Shahnewaz said:

This is crazy. Intel has finally thrown in the towel. And who did they decide to partner with? The obvious choice...? Nope. It's AMD! xD Both companies have some to gain and some to lose by doing this, but overall, I see this as a favorable outcome. We (at SA) are speculating the chip is destined for Apple, and FreeSync on these chips will be a huuuge plus.

 

Really, the biggest loser I see here is Nvidia. Even Intel had enough of their shit. Their cross-licensing agreement expired recently, and Intel not only didn't bother renewing it, they didn't even consider (the objectively better) Nvidia GPUs for their new series of processors. Apple similarly ditched Nvidia years ago as well, and I wonder why... 9_9

Thing is, there is absolutely nothing stopping Intel from having an Nvidia offering as well, other than possibly an exclusivity contract with AMD.


3 minutes ago, Dylanc1500 said:

Thing is, there is absolutely nothing stopping Intel from having an Nvidia offering as well other than maybe an exclusivity contract.

That is true, and you'd imagine they would have gone to Nvidia in the first place, since AMD is their direct competitor and Nvidia GPUs are more efficient.

But nope. Which means either AMD has something Nvidia just can't provide, or Intel just doesn't want to do business with Nvidia anymore... like Apple.


This may be the rumoured Vega 11 GPU with 2048 cores (so basically half a Vega 10). I wouldn't be surprised, since a theoretical cut-down Vega 10 would support one HBM stack, so this Intel+Radeon combination kind of lines up with that.

hello!

is it me you're looking for?

ᴾC SᴾeCS ᴰoWᴺ ᴮEᴸoW

Spoiler

Desktop: X99-PC

CPU: i7 5820k

Mobo: X99 Deluxe

Cooler: Dark Rock Pro 3

RAM: 32GB DDR4
GPU: GTX 1080

Storage: 1TB 850 Evo, 1TB HDD, bunch of external hard drives
PSU: EVGA G2 750w

Peripherals: Logitech G502, Ducky One 711

Audio: Xonar U7, O2 amplifier (RIP), HD6XX

Monitors: 4k 24" Dell monitor, 1080p 24" Asus monitor

 

Laptop:

-Overkill Dell XPS

Fully maxed out early 2017 Dell XPS 15, GTX 1050 4GB, 7700HQ, 1TB nvme SSD, 32GB RAM, 4k display. 97Whr battery :x 
Dell was having a $600 off sale for the fully specced out model, so I decided to get it :P

 

-Crapbook

Fully specced out early 2013 Macbook "pro" with GT 650M and a constant 105°C temperature on the CPU (GPU is 80-90°C) when doing anything intensive...

A 2013 laptop with a regular-sized battery still has better battery life than a 2017 laptop with a massive battery! I think this is a testament to Apple's ability at making laptops, or maybe to how little CPU technology has improved even 4+ years later (at least until the recent introduction of 15W 4-core CPUs). Anyway, I'm never going to get a 35W-CPU laptop again unless battery technology becomes ~5x better than it is in 2018.

Apple knows how to make proper consumer-grade laptops (they don't know how to make pro laptops though). I guess this is mostly software power efficiency related, but getting a Mac makes perfect sense if you want a portable/powerful laptop that can do anything you want it to with great battery life.
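For what it's worth, the battery-life maths is just capacity divided by average draw; a tiny sketch with hypothetical draw figures (not measurements from either machine):

```python
# Runtime estimate: battery capacity (Wh) divided by average system draw (W).
# The draw figures below are hypothetical, purely for illustration.

def runtime_hours(battery_wh: float, avg_draw_w: float) -> float:
    return battery_wh / avg_draw_w

print(f"{runtime_hours(97, 12):.1f} h")  # 97 Wh pack at ~12 W light load -> ~8 h
print(f"{runtime_hours(97, 35):.1f} h")  # same pack with the CPU pinned near 35 W -> ~2.8 h
```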

 

 


1 minute ago, Shahnewaz said:

That is true, and you'd imagine they would have gone to Nvidia in the first place, since AMD is their direct competitor and Nvidia GPUs are more efficient.

But nope. Which means either AMD has something Nvidia just can't provide, or Intel just doesn't want to do business with Nvidia anymore... like Apple.

All I'm going to say is, honestly, the most likely answer is the lowest bidder. It's all business; they don't care what AMD or Nvidia do until it starts to affect their bottom line.


20 minutes ago, rattacko123 said:

This may be the rumoured Vega 11 GPU with 2048 cores (so basically half a Vega 10). I wouldn't be surprised, since a theoretical cut-down Vega 10 would support one HBM stack, so this Intel+Radeon combination kind of lines up with that.

Apparently it's not Vega, and there are 24 CUs, not 10. Die size measurements confirm this number is in the right ballpark too.
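To put the two figures side by side: on GCN/Vega each CU carries 64 stream processors, so the numbers work out as below. Just the arithmetic, nothing here is confirmed silicon.

```python
# GCN/Vega arithmetic: 64 stream processors (SPs) per compute unit (CU).
SP_PER_CU = 64

def sp_count(cus: int) -> int:
    """Stream processors for a given CU count."""
    return cus * SP_PER_CU

print(sp_count(24))        # 1536 SPs for the 24-CU figure above
print(2048 // SP_PER_CU)   # 32 CUs for the rumoured 2048-SP "Vega 11"
print(sp_count(64))        # 4096 SPs for a full Vega 10, for comparison
```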

Solve your own audio issues  |  First Steps with RPi 3  |  Humidity & Condensation  |  Sleep & Hibernation  |  Overclocking RAM  |  Making Backups  |  Displays  |  4K / 8K / 16K / etc.  |  Do I need 80+ Platinum?

If you can read this you're using the wrong theme.  You can change it at the bottom.


Something not being considered here is the Apple effect. Now, before you completely dismiss it offhand, hear me out.

 

TL;DR I'm speculating a fallout between Apple and Nvidia over form factor or profit margins.

 

Apple indicated they're planning a new Mac Pro.  The people who are most interested in getting a Mac Pro are the enthusiasts, who are also the same folks that would consider building a hackintosh.

 

Apple has been very quiet about their stance on hackintosh, but let's not forget Apple doesn't sell its OS. They make money selling marked-up hardware that comes bundled with the OS. Instead of announcing a position against hackintosh and risking fan-base resentment, Apple can sidestep the issue by creating a Mac Pro that's not compatible with off-the-shelf GPU cards, and in some way make it impossible to build a hackintosh. Price-wise, Apple has no way of competing with hackintosh.

 

To do this, Apple needs a proprietary form factor with a non-standard PCI connector. One part Apple does engineer is the motherboard. In this way Apple could license GPUs on boards that have their connector and an authentication chip for the OS, with the Apple tax included.

 

What does this have to do with Intel, you ask? Apple no doubt made offers at some point to Nvidia and AMD to compete for exclusivity with regard to GPU support going forward. AMD obviously won out, as supported by the fact that the entire iMac/iMac Pro line-up uses Vega. Likely that deal includes graphics support only for AMD, not Nvidia or Intel integrated graphics, in macOS 10.14 in 2018/19.

 

So when it came time to do a deal with Intel and AMD for the CPUs that will be in the Mac Pro and MacBooks announced when Apple releases macOS 10.14, the condition was that the CPUs with integrated graphics had to use AMD, as a result of that previous agreement.

 

No idea whether Apple ships 10k, 100k, or 1M combined desktops and laptops, but it was obviously worth it for Intel to win the CPU deal despite having to swallow the bitter pill of including AMD graphics in the Intel chip.

Again, this is speculation; however, I have yet to hear any other reason why it would be in Intel's interest to support their direct competitor (AMD) inside their chip, except in light of this situation.

 

Thoughts?


6 hours ago, tsk said:

Would be good if the OP was actually informative. Then again, this is the LTT forum.

There was no other info this morning; I think the OC3D article went up just after I posted this. Just getting back to it now, TBH :D

CPU | Intel i9-10850K | GPU | EVGA 3080ti FTW3 HYBRID  | CASE | Phanteks Enthoo Evolv ATX | PSU | Corsair HX850i | RAM | 2x8GB G.skill Trident RGB 3000MHz | MOTHERBOARD | Asus Z490E Strix | STORAGE | Adata XPG 256GB NVME + Adata XPG 1T + WD Blue 1TB + Adata 480GB SSD | COOLING | Evga CLC280 | MONITOR | Acer Predator XB271HU | OS | Windows 10 |

                                   

                                   


1 hour ago, Dylanc1500 said:

All I'm going to say is, honestly, the most likely answer is the lowest bidder. It's all business; they don't care what AMD or Nvidia do until it starts to affect their bottom line.

The cheapest and most effective option is to not do anything like this. It's not like Intel is losing market share in the laptop segment at the moment. There is no reason for them to do this on their own.

The most likely reason is that Apple demanded something like this from them. And Apple is not someone who looks for the lowest bidder, and they definitely do not leave performance on the table. By that logic an Intel-Nvidia combo would be exactly what Apple wants, but that's not happening in this case.

 

Even if Apple didn't ask for this, it would make plain business sense to partner with Nvidia and not your direct competitor.


"Update: We have been informed that AMD is producing the chips and selling them directly to Intel for integration into these new SKUs. There are no royalties or licensing, but the Semi-Custom division should still receive the revenue for these specialized products made only for Intel."

from PCPER


22 minutes ago, cj09beira said:

This will also increase the sales volume of HBM2, which will help its adoption long term.

Part of me thinks AMD did the deal to help that along. Though AMD already needs HBM3 for Vega. (Clocks barely do much for the RX Vega cards, it's all about the HBM overclocking.)
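Rough numbers on why the HBM clock matters so much on RX Vega. Assumed stock figures here: a Vega 64 with two stacks for a 2048-bit bus and a 945 MHz memory clock; peak bandwidth just scales linearly with that clock.

```python
# HBM2 bandwidth scales linearly with the memory clock, which is why memory
# overclocks move RX Vega performance more than core clocks do.
# Assumed stock figures: Vega 64, two stacks = 2048-bit bus, DDR signalling.

BUS_WIDTH_BITS = 2048

def bandwidth_gbs(mem_clock_mhz: float) -> float:
    """Peak bandwidth in GB/s (HBM2 transfers twice per clock)."""
    return BUS_WIDTH_BITS * mem_clock_mhz * 2 / 8 / 1000

print(f"{bandwidth_gbs(945):.0f} GB/s at the stock 945 MHz")   # ~484 GB/s
print(f"{bandwidth_gbs(1100):.0f} GB/s at a 1100 MHz OC")      # ~563 GB/s
```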


I feel like AMD's graphics division isn't much of a threat to anyone. Ryzen was and is great, but Vega is just another disappointment. It's not top of the line, and it's basically unavailable because of HBM2 yields combined with miners buying up all the cards.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


If AMD can make their GPUs not run so freaking hot, then I see this going quite well. Though past experience makes me worry about the temperatures their GPUs will hit in laptops.

🌲🌲🌲

 

 

 

◒ ◒ 


13 minutes ago, Sierra Fox said:

If AMD can make their GPUs not run so freaking hot, then I see this going quite well. Though past experience makes me worry about the temperatures their GPUs will hit in laptops.

Lower-clocked Polaris and even Vega are actually very power efficient. AMD just has to push the desktop GPU range to breaking point to be competitive with Nvidia. Intel is smart enough, and they would never pair their technology with anything that would make them look bad.
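That efficiency point comes down to the usual dynamic-power relationship, roughly P ≈ C·V²·f: backing the clock off lets you drop voltage too, and the savings compound. A small illustration with made-up scaling factors (not measured Vega figures):

```python
# Dynamic power scales roughly with C * V^2 * f: easing off the clock lets
# you also drop voltage, so power falls much faster than performance.
# The scaling factors below are illustrative, not measured Vega data.

def relative_dynamic_power(freq_scale: float, voltage_scale: float) -> float:
    """Dynamic power relative to stock for scaled frequency and voltage."""
    return freq_scale * voltage_scale ** 2

eco = relative_dynamic_power(0.85, 0.90)  # -15% clock, -10% voltage
print(f"~{eco:.0%} of stock dynamic power for ~85% of the clock speed")
# ~69% of stock dynamic power for ~85% of the clock speed
```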


Not the first time competing companies have collaborated to create technology.

I am not surprised by this move, and I am sure in time we will see this tech shared by AMD, Nvidia and Intel.

COMMUNITY STANDARDS   |   TECH NEWS POSTING GUIDELINES   |   FORUM STAFF

LTT Folding Users Tips, Tricks and FAQ   |   F@H & BOINC Badge Request   |   F@H Contribution    My Rig   |   Project Steamroller

I am a Moderator, but I am fallible. Discuss or debate with me as you will but please do not argue with me as that will get us nowhere.

 

Spoiler

  

 

Character is like a Tree and Reputation like its Shadow. The Shadow is what we think of it; The Tree is the Real thing.  ~ Abraham Lincoln

Reputation is a Lifetime to create but seconds to destroy.

You have enemies? Good. That means you've stood up for something, sometime in your life.  ~ Winston Churchill

Docendo discimus - "to teach is to learn"

 

 CHRISTIAN MEMBER 

 

 
 
 
 
 
 

 


2 hours ago, Carlos R said:

Already looking forward to Friday’s WAN Show 

Hope they actually talk about it and don't get derailed by non-tech news; sure, very interesting stuff, but almost nothing got covered last week lol. I blame the technical issues more for that though.

