
Jim Keller joins Intel

NumLock21

If you work at Pixar you don't even need CUDA cores, I don't think; last I heard they use CPU render farms. Their CUDA support in RenderMan wouldn't really be useful for the people working there either, I don't think. Pretty sure it's just for licensing it out to others.


1 hour ago, Kamjam21xx said:

The 1950X and WX 9100 are pretty cheap compared to their counterparts, and an OpenCL workflow pairs well with them. Unless you need CUDA or something else, that's a lot of bang for your buck for a personal workstation. Unless you need CUDA, that is, or it makes a big difference in the time it takes you to finish working on something.

 

I think beyond that, you're talking render farms or really specific workloads.

 

Edit: Sorry, digital media professionals; still, it's a per-workflow thing.

1 hour ago, Kamjam21xx said:

If you work at Pixar you don't even need CUDA cores, I don't think; last I heard they use CPU render farms. Their CUDA support in RenderMan wouldn't really be useful for the people working there either, I don't think. Pretty sure it's just for licensing it out to others.

You are completely missing the point I was making about support, knowledge bases, and other things that aren't just dollars and performance. Those things matter a lot.

 

Also, got any examples of where the wx9100 is better than Nvidia's counterpart?

Your post contains A LOT of uncertainty, like "I don't think" and "pretty sure", yet you seem very confident in the conclusions you're drawing. I would not be that confident if I were you.

 

Believe me, companies are not buying things because they are Intel or Nvidia fanboys. When you're talking about investing hundreds of thousands, if not millions, of dollars in computers, which stickers are on the box is not exactly important. Hell, a lot of the time you don't even know which processor is in the servers. You can often figure it out with a bit of digging, but in general the sales talks focus on the number of cores and how many GB of RAM it has, not which architecture, what frequency, or anything like that.

Ever tried renting servers from let's say Amazon or Microsoft? You can't even see if it's an Intel or AMD processor you're renting, much less which architecture it is.


3 hours ago, mr moose said:

 

It's not just about IT people. I meant more the researchers and so on who request only Intel-based systems for their work. They then develop prototypes only for Intel/Nvidia-based systems and increase mindshare for both, even though they weren't obligated to do that.

As far as data centers go, yeah, I know: they stick with inefficient cooling out of fear that doing anything else will be too hard maintenance-wise. Which is sad sometimes.

3 hours ago, Kamjam21xx said:

 

Yeah, I know the feeling. The issue is that in many cases this leads to suboptimal code.

3 hours ago, LAwLz said:

 

Some do, but most don't. Personally I'm guilty of this myself: I understand where most of the issues around memory layout are, but I don't know how to solve them efficiently. For instance, I understand why Monte Carlo path tracing is a pain on GPUs: the incoherence of the data it needs often leads to cache misses and slows computation down a lot, because interleaving other work might not be enough to hide the latency.
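As a rough illustration of the kind of access pattern that hurts (a toy CUDA sketch with made-up names, not code from any real path tracer):

```cuda
// Adjacent threads fetch whatever node their ray happened to hit next, so the
// second read is scattered across memory (poorly coalesced, cache-unfriendly),
// unlike the contiguous first read that GPUs handle well.
__global__ void shade_hits(const int* hit_node, const float* node_data,
                           float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    float coherent   = node_data[i];            // thread i reads element i
    float incoherent = node_data[hit_node[i]];  // thread i reads wherever its ray went

    out[i] = coherent + incoherent;
}
```

When hit_node is essentially random, each warp touches many different cache lines for that one read, and no amount of scheduling other warps fully hides the resulting memory latency.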

 

Well, if you don't know the competition, it can. That's why there is next to nothing written in OpenCL compared to CUDA in deep learning: CUDA already hides most of the difficulties, so for programmers who aren't well versed in hardware it's way easier.
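To give a sense of what "hides the difficulties" means in practice, here is a complete CUDA program (a minimal sketch; the vector add is just a placeholder, not a deep learning workload):

```cuda
#include <cstdio>

__global__ void vec_add(const float* a, const float* b, float* c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;
    float *a, *b, *c;
    // Unified memory: one allocation visible to both CPU and GPU.
    cudaMallocManaged(&a, n * sizeof(float));
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&c, n * sizeof(float));
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    vec_add<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);  // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

The equivalent OpenCL host code has to enumerate platforms and devices, create a context and command queue, create buffers, compile the kernel source at runtime, and set each kernel argument by hand before it can do the same thing. None of it is hard exactly, but it is the kind of hardware-facing boilerplate that puts off people who just want to train a model.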

 

Yes, I know; my point was that even computer science people mostly don't care much about which laptop they'll buy or what's inside, which makes assuming that computer science people know about hardware somewhat of a stretch.


3 hours ago, LAwLz said:

Chances are a lot of them know far more than you about how hardware actually works. How memory mapping works, how instructions are executed, and so on.

Being able to put together some PC parts you bought and have it work is not the same as knowing about computer hardware.

 

And no, it is not ignorance that "allows Nvidia to be almost untouchable on deep learning", nor is it that which makes Intel the "go to CPU manufacturer".

 

Nvidia are not "almost untouchable" for deep learning. For consumers and hobbyists they might be, but that has a lot to do with the hardware available (AMD are behind on the GPU side, no dancing around it) and the investment made on the software side. Nvidia has a much larger and more accessible repertoire of deep learning information as well as partnerships with other companies.

And that is not to say that you can't do the same things with AMD hardware, but would you rather go flailing in the dark trying to figure something out, or go to the manufacturer that provides easy-to-understand guides on how things work and how you can use them?

 

 

And people don't default to Intel for processors because they believe Intel processors are better than AMD's. They default to Intel processors because they do not care which processor is in the machine they are buying. They buy something that seems good, and Intel has far stronger relationships with manufacturers, as well as having consistently been better than AMD for a very long time.

If you're building a laptop that you're going to sell to hundreds of thousands of people, you don't pick a processor based on some ideology like "we need to help the underdog". If someone gives you incentives, for example help with engineering, software development, or some other task where you might be struggling, you are more likely to pick them as your vendor. If companies like Lenovo and Dell switched the majority of their computers to AMD processors, then AMD's market share would skyrocket. The vast majority of people would not go out of their way to seek out Intel processors, because they most likely don't even know there is a difference.

I'd like to clarify that the reason for that "stronger relationship" is that Intel spent 20 years doing everything short of murdering people (that we know of) to keep AMD out of the OEM and server markets. If AMD hadn't overpaid for ATI and released Bulldozer, it would be easier to remember why AMD is out of those markets.


3 hours ago, LAwLz said:

Chances are a lot of them know far more than you about how hardware actually works. How memory mapping works, how instructions are executed, and so on.

Being able to put together some PC parts you bought and have it work is not the same as knowing about computer hardware.

 

And no, it is not ignorance that "allows Nvidia to be almost untouchable on deep learning", nor is it that which makes Intel the "go to CPU manufacturer".

 

Nvidia are not "almost untouchable" for deep learning. For consumers and hobbyists they might be, but that has a lot to do with the hardware available (AMD are behind on the GPU side, no dancing around it) and the investment made on the software side. Nvidia has a much larger and more accessible repertoire of deep learning information as well as partnerships with other companies.

And that is not to say that you can't do the same things with AMD hardware, but would you rather go flailing in the dark trying to figure something out, or go to the manufacturer that provides easy-to-understand guides on how things work and how you can use them?

 

 

And people don't default to Intel for processors because they believe Intel processors are better than AMD's. They default to Intel processors because they do not care which processor is in the machine they are buying. They buy something that seems good, and Intel has far stronger relationships with manufacturers, as well as having consistently been better than AMD for a very long time.

If you're building a laptop that you're going to sell to hundreds of thousands of people, you don't pick a processor based on some ideology like "we need to help the underdog". If someone gives you incentives, for example help with engineering, software development, or some other task where you might be struggling, you are more likely to pick them as your vendor. If companies like Lenovo and Dell switched the majority of their computers to AMD processors, then AMD's market share would skyrocket. The vast majority of people would not go out of their way to seek out Intel processors, because they most likely don't even know there is a difference.

Honestly, I knew a lot of CS majors in college and it was not really the case at all. There were some who definitely did, but those were the ones who went into CS already interested in and knowledgeable about hardware. There are some classes they take that go over some of the stuff you refer to, but it's a lot like some of the more complicated classes in engineering: people just do enough to get by, and most don't even understand what they are doing or the concepts behind it. Anyway, the bottom line is that CS has little to do with hardware, especially hardware design.


11 minutes ago, Brooksie359 said:

Honestly, I knew a lot of CS majors in college and it was not really the case at all. There were some who definitely did, but those were the ones who went into CS already interested in and knowledgeable about hardware. There are some classes they take that go over some of the stuff you refer to, but it's a lot like some of the more complicated classes in engineering: people just do enough to get by, and most don't even understand what they are doing or the concepts behind it. Anyway, the bottom line is that CS has little to do with hardware, especially hardware design.

People in compiler design, OS design, or HPC in general do, because they need to; but general software engineers, theoretical computer scientists, and a significant chunk of machine learning people usually don't. People in security often don't either, depending on their research area. Cryptography, for instance, is not that reliant on hardware knowledge.


16 minutes ago, Taf the Ghost said:

I'd like to clarify that the reason for that "stronger relationship" is that Intel spent 20 years doing everything short of murdering people (that we know of) to keep AMD out of the OEM and server markets. If AMD hadn't overpaid for ATI and released Bulldozer, it would be easier to remember why AMD is out of those markets.

20 years? Gonna need a citation on that. Preferably not AdoredTV because I do not want to give that person a view, but feel free to post his sources here.

 

From what I know Intel were offering sketchy rebates to some OEMs for a few years and that unfairly disadvantaged AMD. I am not aware of other things Intel might have done.

 

 

19 minutes ago, Brooksie359 said:

Honestly, I knew a lot of CS majors in college and it was not really the case at all. There were some who definitely did, but those were the ones who went into CS already interested in and knowledgeable about hardware. There are some classes they take that go over some of the stuff you refer to, but it's a lot like some of the more complicated classes in engineering: people just do enough to get by, and most don't even understand what they are doing or the concepts behind it. Anyway, the bottom line is that CS has little to do with hardware, especially hardware design.

You're quoting a pretty large post and I have no idea which part you're referring to. Are you referring to the part where I said most people in CS classes probably know more about hardware than you?

What you have to remember is that knowing where a product fits in on some benchmark graph is not being knowledgeable about hardware. You're judging a fish by its ability to climb a tree.

 

The people who designed the Bugatti Veyron's transmission probably can't cite the horsepower and torque of the base model Honda Civic from 2011, right? But using that as an argument for why they don't know about cars is kind of silly.


37 minutes ago, LAwLz said:

 

By the way, the computer science people I'm talking about are people I studied with, so assuming they're just better than me is a bit insulting. That aside, my point was just that people like Koduri or Keller were not ordinary computer science students (if they were ever CS students at all; I don't know their backgrounds), simply because people in computer science mostly study software, not hardware, and are unsuited for hardware design (that's why computer engineers and electrical engineers exist). I was merely pointing out that hardware design is not as closely related to computer science as people believe.


8 hours ago, Humbug said:

Interesting. Any good sources to read more on this? 

I have only seen the rumours.

I have friends and relatives who work at AMD. Lisa Su was not happy about what Raja did; the way she looked at it, Raja went behind her back to the board and got them to side with him.


1 hour ago, LAwLz said:

20 years? Gonna need a citation on that. Preferably not AdoredTV because I do not want to give that person a view, but feel free to post his sources here.

 

From what I know Intel were offering sketchy rebates to some OEMs for a few years and that unfairly disadvantaged AMD. I am not aware of other things Intel might have done.

https://www.networkworld.com/article/2239461/data-center/intel-and-antitrust--a-brief-history.html

 

Some of us were buying and advising others on computers during that period; most people are a decade late to disliking Intel's business practices. Somehow I've known a bunch of people who worked for Intel in various capacities, but I have a long-standing dislike of their management.


5 hours ago, LAwLz said:

It's like the old saying "Nobody ever got fired for buying IBM".

Well I do have a few stories about IBM and people getting fired, EMC too ;).

 

5 hours ago, LAwLz said:

Can you get a comparable product from Fortinet for cheaper

Why did you pick Fortinet? It seems like an odd choice when talking about Cisco; I would have expected HPE/Aruba or some other more networking-focused company rather than a firewall/security one.

 

Without getting into a really long discussion: as someone who prefers Cisco switches, I'd buy a FortiGate over Cisco when it comes to firewalls, even if they cost the same.


https://www.forbes.com/sites/jasonevangelho/2018/04/26/the-engineering-duo-that-saved-apple-and-amd-are-teaming-up-at-intel/#15807f0c241f

 

Looks like Keller is doing more than just SoCs, though. He looks like he is going to be the lead system architect, not just a chip designer.

 

Quote

Basically, Koduri is designing the IP cores, and Keller will transform them into products.

But specific to GPUs and AI.


57 minutes ago, Taf the Ghost said:

https://www.networkworld.com/article/2239461/data-center/intel-and-antitrust--a-brief-history.html

 

Some of us were buying and advising others on computers during that period; most people are a decade late to disliking Intel's business practices. Somehow I've known a bunch of people who worked for Intel in various capacities, but I have a long-standing dislike of their management.

Oh, you were not talking specifically about Intel being anti-competitive against AMD, but rather about all the antitrust lawsuits they have been involved in (even those that the courts ruled in Intel's favor)? Then I can see where you got the 20 years from. I think it is a bit dishonest to present it the way you did, though. You made it sound like Intel has been pulling lots and lots of anti-competitive moves against AMD and that this is the reason they are so disadvantaged now, but I think that's a very oversimplified way of looking at things which leaves out a lot of other factors, such as Intel always having been larger, with more resources, than AMD.

 

 

21 minutes ago, leadeater said:

Well I do have a few stories about IBM and people getting fired, EMC too ;).

Well, that was not my point.

 

21 minutes ago, leadeater said:

Why did you pick Fortinet? It seems like an odd choice when talking about Cisco; I would have expected HPE/Aruba or some other more networking-focused company rather than a firewall/security one.

Don't let Fortinet hear you say that. I was at an event earlier this week with several security vendors and Fortinet were constantly telling everyone that they do more than just firewalls. They sell pretty much everything now, including switches and access points.

 

Anyway, FortiGate might have been a bad example to use with someone involved in networking. It's just that they were on my mind, and I was hoping I could get the point across by bringing up a vendor that is probably fairly unknown to a lot of people.

My point was that being equal to your competitor is usually not enough because switching vendor is a risk, and you also have to factor in things like support and possible retraining. The benefit has to be fairly large before the switch is worth it.


It looks like Francois Piednoel has some interesting opinions on this.

 

 



6 hours ago, LAwLz said:

snipperoo

I wasn't being literal. It would be nice to hear more about others who make waves, and in what way, though.

I think I need to start using a sarcastic font when saying clearly idiotic things that are meant to shine a light in the opposite direction. As you've said:

Quote

How many people can name even one person who worked on Sandy Bridge or which companies they are currently working for? I certainly can't.

Nor can I.  I bet one of them is named Bob though.



2 minutes ago, LAwLz said:

Oh, you were not talking specifically about Intel being anti-competitive against AMD, but rather about all the antitrust lawsuits they have been involved in (even those that the courts ruled in Intel's favor)? Then I can see where you got the 20 years from. I think it is a bit dishonest to present it the way you did, though. You made it sound like Intel has been pulling lots and lots of anti-competitive moves against AMD and that this is the reason they are so disadvantaged now, but I think that's a very oversimplified way of looking at things which leaves out a lot of other factors, such as Intel always having been larger, with more resources, than AMD.

 

Intel's major anti-competitive activities against AMD have tended to come both as petty little things and as big pushes to prevent AMD from establishing a space within expanding markets. The most well known is preventing AMD from reaching a stable market share in the server space, but they also worked the OEMs a lot to keep AMD out of the laptop space during the big swing in sales in the mid-2000s. Even when AMD had better mobile products, you basically couldn't find them. Bulldozer made the laptop question moot, as it would have run them out of that space anyway, though it's possible the team that approved Bulldozer wouldn't have done so if AMD still had a presence in mobile.

 

Intel has made some good and some bad products, but I shouldn't be forced into a position where I have to purchase the bad ones because they illegally kept the competitor out of the market.


6 minutes ago, Carclis said:

It looks like Francois Piednoel has some interesting opinions on this.

 

 

He's always ranting about something.

Anytime he's in the news it usually sounds like something from r/iamverysmart 


26 minutes ago, LAwLz said:

Well, that was not my point.

I was just being an ass :).

 

26 minutes ago, LAwLz said:

They sell pretty much everything now, including switches and access points.

I'd never buy a Fortinet switch or AP though; I wouldn't even put them on the to-be-considered list myself. I've seen their AP management stuff and it's garbage compared to Aruba, Ruckus, etc.


New Intel architecture coming in 3,2,1...



5 hours ago, Agost said:

New Intel architecture coming in 3,2,1...

Xen architecture featuring Half-Life 3

I mean, Valve is apparently interested in making games again, after all, so why not.


5 minutes ago, Aprime said:

Xen architecture featuring Half-Life 3

I mean, Valve is apparently interested in making games again, after all, so why not.

Yeah, what the heck happened to Half-Life? I'm down with that.


9 hours ago, LAwLz said:

20 years? Gonna need a citation on that. Preferably not AdoredTV because I do not want to give that person a view, but feel free to post his sources here.

 

From what I know Intel were offering sketchy rebates to some OEMs for a few years and that unfairly disadvantaged AMD. I am not aware of other things Intel might have done.

 

 

You're quoting a pretty large post and I have no idea which part you're referring to. Are you referring to the part where I said most people in CS classes probably know more about hardware than you?

What you have to remember is that knowing where a product fits in on some benchmark graph is not being knowledgeable about hardware. You're judging a fish by its ability to climb a tree.

 

The people who designed the Bugatti Veyron's transmission probably can't cite the horsepower and torque of the base model Honda Civic from 2011, right? But using that as an argument for why they don't know about cars is kind of silly.

Completely different. It would be more like a mechanical engineer who didn't work on cars versus a mechanic: the mechanic probably knows more just because they have the background. I mean, I am a mechanical engineer, and there are a lot of people who are super into cars and engines who know more about them than I do, to be honest. The main reason is that I don't need to know about car engines for what I do. Again, I am not saying that CS majors across the board don't understand hardware, but a lot of them don't. I have had many conversations with CS majors about hardware and the inner workings of CPUs, but a lot of them just had their eyes glaze over, like I was speaking French. There are some who actually know what I am talking about and are far more knowledgeable in the area, but that is a very small percentage.


9 minutes ago, Brooksie359 said:

Completely different. It would be more like a mechanical engineer who didn't work on cars versus a mechanic: the mechanic probably knows more just because they have the background. I mean, I am a mechanical engineer, and there are a lot of people who are super into cars and engines who know more about them than I do, to be honest. The main reason is that I don't need to know about car engines for what I do. Again, I am not saying that CS majors across the board don't understand hardware, but a lot of them don't. I have had many conversations with CS majors about hardware and the inner workings of CPUs, but a lot of them just had their eyes glaze over, like I was speaking French. There are some who actually know what I am talking about and are far more knowledgeable in the area, but that is a very small percentage.

It depends on what those CS guys are doing with their degrees. A good engine programmer or database programmer MUST know the inner workings of the silicon they are targeting.

 

Also, this is becoming a prerequisite for good shader programmers too. To properly optimize compute shaders they must know the inner workings of GPUs. The same goes for any large-scale application developer.
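For example, even something as simple as a matrix transpose is memory-bound, and making it fast means knowing how the hardware coalesces memory accesses and how shared memory banks work (a toy CUDA sketch under those assumptions, not production code):

```cuda
#define TILE 32   // thread blocks are assumed to be TILE x TILE

// Naive version: the read is coalesced but the write is strided across rows,
// so the hardware issues many partial memory transactions.
__global__ void transpose_naive(const float* in, float* out, int n)
{
    int x = blockIdx.x * TILE + threadIdx.x;
    int y = blockIdx.y * TILE + threadIdx.y;
    if (x < n && y < n)
        out[x * n + y] = in[y * n + x];
}

// Tiled version: stage the tile in on-chip shared memory so both the global
// read and the global write are coalesced; the +1 padding avoids shared
// memory bank conflicts. Knowing why either trick helps is pure hardware knowledge.
__global__ void transpose_tiled(const float* in, float* out, int n)
{
    __shared__ float tile[TILE][TILE + 1];

    int x = blockIdx.x * TILE + threadIdx.x;
    int y = blockIdx.y * TILE + threadIdx.y;
    if (x < n && y < n)
        tile[threadIdx.y][threadIdx.x] = in[y * n + x];
    __syncthreads();

    x = blockIdx.y * TILE + threadIdx.x;   // swap block offsets for the write
    y = blockIdx.x * TILE + threadIdx.y;
    if (x < n && y < n)
        out[y * n + x] = tile[threadIdx.x][threadIdx.y];
}
```

Nothing in a typical CS curriculum forces you to learn why the second kernel is so much faster; you pick it up when the job demands it.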

 

If you are talking to guys with CS degrees who are only doing scripting or basic high-level programming in their jobs, then yeah, people like that aren't going to know much about the underlying hardware, since they don't need to know those things.

 

I have a cousin who did CS and MIS and now does scripting work (various languages) for financial companies. He doesn't know anything about the hardware the base application runs on, and that's because he doesn't need to; it's not his job to know it.


25 minutes ago, Razor01 said:

It depends on what those CS guys are doing with their degrees. A good engine programmer or database programmer MUST know the inner workings of the silicon they are targeting.

 

Also, this is becoming a prerequisite for good shader programmers too. To properly optimize compute shaders they must know the inner workings of GPUs. The same goes for any large-scale application developer.

 

If you are talking to guys with CS degrees who are only doing scripting or basic high-level programming in their jobs, then yeah, people like that aren't going to know much about the underlying hardware, since they don't need to know those things.

 

I have a cousin who did CS and MIS and now does scripting work (various languages) for financial companies. He doesn't know anything about the hardware the base application runs on, and that's because he doesn't need to; it's not his job to know it.

Exactly my point. Just like I don't need to know how a car works to do HVAC. I mean, why would someone know something like that if they didn't need to and aren't interested in it?


6 hours ago, Razor01 said:

If you are talking to guys with CS degrees who are only doing scripting or basic high-level programming in their jobs, then yeah, people like that aren't going to know much about the underlying hardware, since they don't need to know those things.

That is basically every CS bachelor's degree, though: you come out of it qualified to actually learn something useful or to move into a more specialized postgraduate course. It's something I see a lot of: people having unrealistic expectations of what they will actually learn during the degree and of what it amounts to after they graduate. A full IT career path has a high bar of entry, higher than most realize, so people get disappointed with where they are in their first one to three years, if they even get a job in IT (not counting PC repair type jobs).

