
Intel GPUs are coming

tsk

I wonder which market Intel will be able to compete in. I highly doubt they could seriously enter the consumer side (am I right in thinking most of the useful technologies are the IP of Nvidia and AMD?)


This will only bury AMD deeper, but if Intel actually makes a better product, then I'm down with it.


Counterplay by Nvidia: Nvidia enters CPU market. 

 

Imagine that?

However, I would prefer they figure out how to use the GPU chip to perform CPU tasks.

Connection200mbps / 12mbps 5Ghz wifi

My baby: CPU - i7-4790, MB - Z97-A, RAM - Corsair Veng. LP 16gb, GPU - MSI GTX 1060, PSU - CXM 600, Storage - Evo 840 120gb, MX100 256gb, WD Blue 1TB, Cooler - Hyper Evo 212, Case - Corsair Carbide 200R, Monitor - Benq  XL2430T 144Hz, Mouse - FinalMouse, Keyboard -K70 RGB, OS - Win 10, Audio - DT990 Pro, Phone - iPhone SE


14 hours ago, mr moose said:

EDIT: and this of course is assuming Intel is even interested in the gaming side of discrete GPUs and not just after the compute side of the market.

My feeling is they want to get deeper into the enterprise compute market before they lose it. It was not that long ago that the Top500 supercomputer list was dominated by Intel-driven CPU clusters; now those CPUs are being relegated to secondary tasks and orchestration, with GPUs/ASICs doing the actual computation. There are CPU-only tasks too, but it won't stay that way.

 

With AMD now having CPUs on the market with 128 PCIe lanes and more RAM per socket than Intel offers, Intel stands to lose this market segment altogether, and it is a very influential segment. Doing nothing isn't an option, and Xeon Phi currently isn't scaling its performance as fast as GPUs are; its saving grace is its ability to run tasks a GPU simply cannot, for now.


23 hours ago, Armakar said:

Maybe this will push even more high-end GPUs, seeing as we are still missing a GPU capable of 4K 60 fps (no, the 1080 Ti is not capable of this; it hits 60 in some games, not all).

Only if you need to max out the game, which is bloody pointless:

 

A single 1080 Ti is all the power anyone needs.

 

 

Personal Desktop":

CPU: Intel Core i7 10700K @5ghz |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Z490UD ATX|~| RAM: 16gb DDR4 3333mhzCL16 G.Skill Trident Z |~| GPU: RX 6900XT Sapphire Nitro+ |~| PSU: Corsair TX650M 80Plus Gold |~| Boot:  SSD WD Green M.2 2280 240GB |~| Storage: 1x3TB HDD 7200rpm Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: AMD R9 7950XT  |~| Cooling: bq! Dark Rock 4 Pro |~| MOBO: Gigabyte Aorus Master |~| RAM: 32G Kingston HyperX |~| GPU: AMD Radeon RX 7900XTX (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz|~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)

12 hours ago, TheOriginalHero said:

I wonder which market Intel will be able to compete in. I highly doubt they could seriously enter the consumer side (am I right in thinking most of the useful technologies are the IP of Nvidia and AMD?)

The GPU companies use the same chips/architectures across all products and markets. If they are making a graphics-capable architecture, they just have to put it on a board, write drivers, and profit. I doubt Intel would even try if they couldn't get around the patents.

 

I'm just saying they state graphics specifically; if they are making a graphics-capable architecture, they will create a consumer version. Probably not first (like Nvidia does), but eventually.

  


16 hours ago, TheOriginalHero said:

I wonder which market Intel will be able to compete in. I highly doubt they could seriously enter the consumer side (am I right in thinking most of the useful technologies are the IP of Nvidia and AMD?)

It's hard to guess, but right now Nvidia's most lucrative market is by far gaming graphics cards. That is likely to shift to HPC in the coming years, though.


This will be interesting for Raja: no real budget constraints, the sky is the limit. I wonder what he can do.

 

 

(obviously there is a budget, but it's not as constraining)

GPU drivers giving you a hard time? Try this! (DDU)


I'm guessing gamers aren't Intel's intended target group, at least initially, with the dedicated graphics cards. It will be interesting to see, nonetheless, if this turns out to be true.


18 hours ago, leadeater said:

My feeling is they want to get deeper into the enterprise compute market before they lose it. It was not that long ago that the Top500 supercomputer list was dominated by Intel-driven CPU clusters; now those CPUs are being relegated to secondary tasks and orchestration, with GPUs/ASICs doing the actual computation. There are CPU-only tasks too, but it won't stay that way.

I believe this is the correct answer.

 

The Nvidia CEO indirectly jabbed at the inherent inefficiencies and high cost of general-purpose hardware microprocessors, which is exactly what x86 is. GPUs rely heavily on software-based complexity and abstraction but use simple and numerous cores, while ASICs carry no unnecessary circuitry or hardware that does nothing.

 

Intel trying to enter the GPU market is a 180-degree turn from their entire strategy since their inception.


21 hours ago, leadeater said:

My feeling is they want to get deeper into the enterprise compute market before they lose it. It was not that long ago that the Top500 supercomputer list was dominated by Intel-driven CPU clusters; now those CPUs are being relegated to secondary tasks and orchestration, with GPUs/ASICs doing the actual computation. There are CPU-only tasks too, but it won't stay that way.

 

With AMD now having CPUs on the market with 128 PCIe lanes and more RAM per socket than Intel offers, Intel stands to lose this market segment altogether, and it is a very influential segment. Doing nothing isn't an option, and Xeon Phi currently isn't scaling its performance as fast as GPUs are; its saving grace is its ability to run tasks a GPU simply cannot, for now.

I agree. When it comes to productivity workloads, the Xeon Phi doesn't do well single-threaded or multi-threaded, and even in Blender it doesn't do well; considering that's a multi-core benchmark you'd hope it would, yet the cheaper HEDT CPUs such as Threadripper and the i9 beat it out. I think this is because of clock speeds: since it has more cores, all of them run at lower speeds. But in its server market it's great for handling more workloads and spreading them across many different cores.


21 hours ago, leadeater said:

My feeling is they want to get deeper into the enterprise compute market before they lose it. It was not that long ago that the Top500 supercomputer list was dominated by Intel-driven CPU clusters; now those CPUs are being relegated to secondary tasks and orchestration, with GPUs/ASICs doing the actual computation. There are CPU-only tasks too, but it won't stay that way.

I assume that Nvidia would need the x86 instruction set to make CPUs, or they would simply have to create their own instruction set.


I doubt we would see consumer cards from Intel any time soon, if ever. It seems they are more focused on the low-to-mid end for OEMs. It might be a good 6-8 years before Intel forays into consumer cards, assuming RTG is bought by Intel (which I doubt it will be).

On 11/9/2017 at 5:43 AM, Bouzoo said:

Now with Intel in the game, we will literally have teams Red, Green and Blue (RGB). What a time to be alive. 

Mind blown!


1 hour ago, TheBeastPC said:

I assume that Nvidia would need the x86 instruction set to make CPUs, or they would simply have to create their own instruction set.

Nvidia is already the biggest player in this market, ASIC/GPU compute-wise. x86 is not necessary in the scientific community; they want to run very specific types of workloads and are prepared to write specific code to achieve that, on custom hardware or not. Whatever does it the fastest and most reliably wins, though it still has to be fairly usable, which is why CUDA is much more popular than OpenCL.
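
To illustrate the usability point, here is a minimal, hypothetical CUDA sketch of the kind of code researchers write for these workloads: one kernel where each thread handles a single array element, launched across many lightweight GPU cores. The kernel name, sizes, and use of unified memory are illustrative assumptions, not anything from this thread; it assumes the CUDA toolkit and a CUDA-capable GPU. The point is simply how little ceremony CUDA requires compared to OpenCL's platform/device/queue setup.

#include <cstdio>
#include <cuda_runtime.h>

// Each thread computes one element: the "many simple cores" model of GPU compute.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    // Unified memory keeps the sketch short; real HPC code usually manages host/device copies explicitly.
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);  // expect 4.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}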


1 hour ago, TheBeastPC said:

I agree. When it comes to productivity workloads, the Xeon Phi doesn't do well single-threaded or multi-threaded, and even in Blender it doesn't do well; considering that's a multi-core benchmark you'd hope it would, yet the cheaper HEDT CPUs such as Threadripper and the i9 beat it out. I think this is because of clock speeds: since it has more cores, all of them run at lower speeds. But in its server market it's great for handling more workloads and spreading them across many different cores.

Not really a big issue for large research clusters with hundreds or thousands of GPUs; you pick the hardware that will run the tasks the fastest. For a lot of things that's a GPU, so it's pretty much always Nvidia; where a task can't run on a GPU, it will run on another type of accelerator like Xeon Phi, or just on CPUs.

 

Have a watch of the video where Linus went to a Canadian university to look at their new HPC compute clusters; that video will explain all the basic stuff you need to know and will answer most of the questions you're likely to ask.


On 11/9/2017 at 5:49 AM, Shreyas1 said:

Since Intel will be able to spend more on research and development than AMD, we might have more competition.

In AI, ML, and automotive, yes, but on the consumer side, I doubt it.

On 11/9/2017 at 6:21 AM, Princess Cadence said:

It means we're yet to see even shittier stock cooling on reference cards once Intel enters the GPU market.

No cooling: get your own aftermarket solution, or, considering how light the Intel stock cooler is, you could throw that on the PCB. I mean, I recently took apart my 1070 AMP and that heatsink had some heft to it.

On 11/9/2017 at 6:23 AM, Inkz said:

I can just see the "INTEL EXTREME EDITION GPUs!!!!" So extreme they don't even come with a cooler and expect you to bring your own.

Intel HD 3 - non-overclockable + stock heatsink included

Intel HD 5 - non-overclockable + stock heatsink included

Intel HD 7 - non-overclockable + stock heatsink included

Intel HD 3K/Ti - overclockable + no heatsink included; overclocks only if paired with an Intel K + X/Z combo

Intel HD 5K/Ti - overclockable + no heatsink included; overclocks only if paired with an Intel K + X/Z combo

Intel HD 7K/Ti - overclockable + no heatsink included; overclocks only if paired with an Intel K + X/Z combo

Intel HD 9 - overclockable, significantly more expensive, RGB

 

On 11/9/2017 at 10:22 AM, Sniperfox47 said:

AMD also had *very* limited resources to work with. You can have all the IP you want, but without money to pay for the patents you don't have, to pay the engineers making the product, and to pay for marketing and integration, it's a very challenging task.

 

Intel on the other hand has the opposite issue though, you're right. They can smack Raja with a wad of cash, but that doesn't give them the IP to build a GPU.

 

But AMD doesn't have to cross-license to Intel. Unlike with x86, there are other players in the field, especially if you're worried about compute rather than rendering. Between AMD, ARM, Imagination, Qualcomm, and Broadcom, Intel has options.

 

The Intel/RTG EMIB package is cool, but it really doesn't compete in the sector where Nvidia is a concern for Intel. Intel needs more options for scalable and parallel compute in HPC or they're going to fall behind as CPU performance becomes less important.

Intel does have a problem of plenty, but I guess they got themselves into this situation by being slack, making bad decisions, and betting on the wrong markets - IoT and security.

On 11/9/2017 at 6:37 PM, Prysin said:

 

To quote Bits and Chips:

https://mobile.twitter.com/BitsAndChipsEng/status/928552779945308161

 

yeah.... dGPUs in 2017, great plan

 

 

No, Intel isn't aiming for gaming. They are aiming for AI, automotive, and other similar applications which benefit hugely from massive parallelism.

Intel would require at least 24 months, so let's assume we see the first wave of cards in 2019; but by then Nvidia would have released Volta and hopefully Ampere, and would be working on whatever comes next.

Honestly, I don't think they would be able to compete with Volta, let alone Ampere, and the bureaucracy at Intel would only slow down this new 'team' under Raja.

On 11/9/2017 at 6:44 PM, linustouchtips said:

first Intel architecture of their new graphics line

 

codenamed Raja

Codename Koduri, given Intel's fixation on naming its lineups after places.

PS: Koduri is a town famous for marble production.

