Jim Keller joins Intel

1 minute ago, leadeater said:

That is basically every CS bachelor's degree though; you come out of it qualified to actually learn something useful or move into a more specialized postgrad course. It's something I see a lot: people having unrealistic expectations of what they will actually learn during the degree and what it amounts to after they graduate. A full IT career path has a high bar of entry, higher than most realize, so people get disappointed with where they are at in their first 1 to 3 years, if they even get a job in IT (not counting PC repair type jobs).

Well, it depends on the person. Most people hired to work on large applications were superstars in college and have extensive application development experience from college or even before, lol. But those are few and far between; most job opportunities for kids coming out of college right now are contractual and small projects, a test-before-you-hire type of thing. You are correct that a college's base graduation requirements give a good, wide base of what the industry needs, but not what the bachelor's degree holder needs to really go up the ladder.


15 minutes ago, Razor01 said:

Well, it depends on the person. Most people hired to work on large applications were superstars in college and have extensive application development experience from college or even before, lol. But those are few and far between; most job opportunities for kids coming out of college right now are contractual and small projects, a test-before-you-hire type of thing. You are correct that a college's base graduation requirements give a good, wide base of what the industry needs, but not what the bachelor's degree holder needs to really go up the ladder.

One of the issues with college is that it's supposed to prepare you for anything you might want to do with your degree, so it really prepares you for nothing in particular. Most degrees are very broad by nature and end up teaching the basics of what you need to know for most of the jobs that come with that degree. It's your job to figure out what it is you want to do, learn more about it, and gain experience wherever you can. There is a reason the degrees are this way, though. I have a friend whose dad works in the medical device industry as an engineer. He told him he would be much better off going for a mechanical engineering degree and then getting a job at a medical device company than going for a biomedical engineering degree. The reason being that a mechanical engineer can get a medical device job, but a biomedical engineering degree can't get you a mechanical engineering job.


7 minutes ago, Razor01 said:

Well, it depends on the person. Most people hired to work on large applications were superstars in college and have extensive application development experience from college or even before, lol. But those are few and far between; most job opportunities for kids coming out of college right now are contractual and small projects, a test-before-you-hire type of thing. You are correct that a college's base graduation requirements give a good, wide base of what the industry needs, but not what the bachelor's degree holder needs to really go up the ladder.

On the developer career track it's not so bad; even if it's not a great job it'll be something like a unit/code tester or something along those lines. The technical career track starts off with 1 to 3 soul-crushing years of help desk or desktop support before you can even get a look in, or an interview, for something like a system admin role, and during those first 1 to 3 years you have to spend your own personal time studying and getting industry certifications.

 

You have very little ability or opportunity to show your worth and your skills in those roles, and any past work experience isn't of much use to an employer; it's unlikely you got to do any meaningful projects on a large enough network. That's one of the big reasons a lot of employers still look at industry certs for entry-level sysadmin and systems engineer positions: a lot of trust is put in the applicant that they can actually do the job without breaking things.

 

The shift to DevOps is good, though, for people looking to start out now: if you learn a lot about infrastructure as code and can demonstrate it well, you will stand out.
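(For anyone wondering what "infrastructure as code" means in practice, here's a minimal, hypothetical sketch in Python: the desired state of some machines is declared as data and a reconcile step works out what actions are needed. Real tools like Ansible or Terraform do this against actual infrastructure; the hosts and states here are made up for illustration.)

```python
# Minimal illustration of the "infrastructure as code" idea:
# desired state is declared as data, and an idempotent reconcile step
# works out the actions needed to make reality match it.

desired_state = {
    "web-01": {"packages": ["nginx"], "services": ["nginx"]},
    "web-02": {"packages": ["nginx"], "services": ["nginx"]},
    "db-01":  {"packages": ["postgresql"], "services": ["postgresql"]},
}

# Pretend inventory of what is currently installed/running (normally discovered live).
current_state = {
    "web-01": {"packages": ["nginx"], "services": []},
    "web-02": {"packages": [], "services": []},
    "db-01":  {"packages": ["postgresql"], "services": ["postgresql"]},
}

def reconcile(desired, current):
    """Return only the actions needed to reach the desired state (idempotent)."""
    actions = []
    for pkg in desired["packages"]:
        if pkg not in current.get("packages", []):
            actions.append(f"install {pkg}")
    for svc in desired["services"]:
        if svc not in current.get("services", []):
            actions.append(f"start {svc}")
    return actions

for host, desired in desired_state.items():
    for action in reconcile(desired, current_state.get(host, {})):
        print(f"{host}: {action}")
# In a real tool the actions would then be applied, so re-running the same
# declaration produces no further changes.
```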

 

I'd say the developer career path is easier to get into but harder to advance through, and it's easy to get stuck at a certain level; the technical path is harder to get into but easier to advance through, with clearer progression paths and requirements.


17 hours ago, laminutederire said:

As far as data centers go, yeah, I know; they use inefficient cooling just out of fear that anything else will be too hard maintenance-wise. Which is sad sometimes.

This has more to do with how old the buildings are than with not wanting to use better cooling solutions; in fact it's the most wanted thing. One of our main data centers is over 25 years old. It was converted for that purpose and is the classic old raised floor with cold air forced through floor grates in front of the server racks so it can be drawn through them. Basically the "cool the room, not the equipment" approach, very old and inefficient.

 

Buildings usually have a 20-year minimum life span, which as you can see doesn't fit with the development cycle of technology. This is why shipping container data centers became popular: cheap, easily configurable, movable, stackable, and best of all resealable.

 

Do we want heat exchangers on the back of our racks and evaporators on the roof? Hell yes. Can it actually be done? Usually not after the fact, or not cost effectively.

 

You can burn 50%-60% of power on cooling, which puts a larger load on your UPS, generators, and electrical distribution, power that cannot be used for more useful things like running actual equipment. Cooling efficiency can directly affect server room density and introduce very large costs; every data center facilities manager wants more efficient cooling and lower power usage, but sometimes it's just not a possibility.
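(To put a rough number on that, here's the kind of back-of-the-envelope maths facilities people do. The figures are made up for illustration, and the cooling fraction is just one reading of the 50-60% figure above.)

```python
# Back-of-the-envelope: how cooling overhead eats into usable capacity.
# Assumed figures, not from a real facility.

it_load_kw = 500.0        # power actually going into servers/network/storage
cooling_fraction = 0.55   # cooling drawing ~55% as much again as the IT load

cooling_kw = it_load_kw * cooling_fraction
total_kw = it_load_kw + cooling_kw

# PUE (Power Usage Effectiveness) = total facility power / IT power.
pue = total_kw / it_load_kw
print(f"Cooling: {cooling_kw:.0f} kW, total draw: {total_kw:.0f} kW, PUE ~ {pue:.2f}")

# If UPS/generators/distribution are sized for, say, 1 MW total,
# the cooling overhead decides how much of that can be actual servers.
facility_capacity_kw = 1000.0
usable_it_kw = facility_capacity_kw / pue
print(f"Of {facility_capacity_kw:.0f} kW capacity, only ~{usable_it_kw:.0f} kW can be IT load")
```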

 

17 hours ago, laminutederire said:

It's not just about IT people. I meant more about researchers and so on, who request only Intel-based systems for their use. Then they only develop prototypes for Intel/Nvidia based systems and increase mindshare for both when they weren't obligated to.

Yep, we get that all the time from researchers: it must be an Nvidia GPU because we use CUDA.


4 hours ago, leadeater said:

This has more to do with how old the buildings are than with not wanting to use better cooling solutions; in fact it's the most wanted thing. One of our main data centers is over 25 years old. It was converted for that purpose and is the classic old raised floor with cold air forced through floor grates in front of the server racks so it can be drawn through them. Basically the "cool the room, not the equipment" approach, very old and inefficient.

 

Buildings usually have a 20-year minimum life span, which as you can see doesn't fit with the development cycle of technology. This is why shipping container data centers became popular: cheap, easily configurable, movable, stackable, and best of all resealable.

 

Do we want heat exchangers on the back of our racks and evaporators on the roof? Hell yes. Can it actually be done? Usually not after the fact, or not cost effectively.

 

You can burn 50%-60% of power on cooling, which puts a larger load on your UPS, generators, and electrical distribution, power that cannot be used for more useful things like running actual equipment. Cooling efficiency can directly affect server room density and introduce very large costs; every data center facilities manager wants more efficient cooling and lower power usage, but sometimes it's just not a possibility.

 

Yep, we get that all the time from researchers: it must be an Nvidia GPU because we use CUDA.

What I meant was that it's very hard to push a novel method, proven only in theory and not in practice, to data centers. So it's all about power efficiency, but it's also about safety. I should know: I tried to push a new method into their pipeline as a student. It was something complicated that still lacked some details. The reason they gave for not helping to figure out those details was that even if it worked it was too risky for them to try, so they wouldn't even try.

That reliance on CUDA is so wrong on so many levels, but they keep entrenching themselves into an intellectual trap, which for researchers is kinda ethically questionable. But that's another story.


2 hours ago, laminutederire said:

I should know: I tried to push a new method into their pipeline as a student. It was something complicated that still lacked some details.

Do tell, I'd like to know more :)


1 hour ago, leadeater said:

Do tell, I'd like to know more :)

A chemical-based energy recovery system. The idea was to recover energy, but in a different way than a Carnot engine (the Carnot efficiency for that temperature range is only a few percent, so it wouldn't be worth it). I was using a kind of chemical coupling to get an efficiency over 1 in the best-case scenario, but I wasn't sure about the kinetics of the chemical reactions, seeing as I'm a math/computer science guy more than a physicist/chemist.

I was in contact with HPE about it, but they didn't seem to want to dedicate their own resources to it if I wasn't going to make a startup out of it (which I didn't want to do because, as I said, I'm no chemist).
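(For context on why a plain heat engine isn't worth it at server temperatures, the Carnot limit is easy to check. A quick sketch with assumed coolant and ambient temperatures:)

```python
# Carnot limit on converting low-grade server heat back into work:
#   eta_max = 1 - T_cold / T_hot   (temperatures in kelvin)
# Assumed temperatures: ~45 C coolant return, ~25 C ambient heat sink.

t_hot_c, t_cold_c = 45.0, 25.0
t_hot_k, t_cold_k = t_hot_c + 273.15, t_cold_c + 273.15

eta_max = 1.0 - t_cold_k / t_hot_k
print(f"Carnot limit: {eta_max:.1%}")  # ~6%, and a real machine gets only a fraction of that
```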


29 minutes ago, laminutederire said:

A chemical-based energy recovery system. The idea was to recover energy, but in a different way than a Carnot engine (the Carnot efficiency for that temperature range is only a few percent, so it wouldn't be worth it). I was using a kind of chemical coupling to get an efficiency over 1 in the best-case scenario, but I wasn't sure about the kinetics of the chemical reactions, seeing as I'm a math/computer science guy more than a physicist/chemist.

I was in contact with HPE about it, but they didn't seem to want to dedicate their own resources to it if I wasn't going to make a startup out of it (which I didn't want to do because, as I said, I'm no chemist).

What was the thermal transfer source, direct die or exhaust heat? If it was something like direct die I can see why there were reservations; direct die water cooling is still mad scientist territory for most server guys lol.

 

Unless power draw goes down, I can see thermal conduction starting to be a problem with extremely small process node CPUs and GPUs; that spot heat might start to fight back. Not sure if you've seen that carbon thermal pad Linus tried out. It had wickedly high surface conductivity. That, combined with more efficient ways of getting heat out of the server and out of the room, would be nice. I'm longing for the day server rooms aren't screaming loud with fan noise.


1 hour ago, leadeater said:

What was the thermal transfer source, direct die or exhaust heat? If it was something like direct die I can see why there were reservations; direct die water cooling is still mad scientist territory for most server guys lol.

 

Unless power draw goes down, I can see thermal conduction starting to be a problem with extremely small process node CPUs and GPUs; that spot heat might start to fight back. Not sure if you've seen that carbon thermal pad Linus tried out. It had wickedly high surface conductivity. That, combined with more efficient ways of getting heat out of the server and out of the room, would be nice. I'm longing for the day server rooms aren't screaming loud with fan noise.

Either or. It was modular enough to do both in principle. Direct die is always better since you can heat the fluid for the recovery cycle more. But since the reaction was pretty serious (toxicity-wise, for maintenance), it made sense not to do it directly. I thought about publishing it, but since it wasn't complete I don't know what journal would be interested.

Well, the thing is that we'll soon reach post-silicon processes where the thermal properties aren't the same anymore.

That being said, my approach was less about reducing noise than about reducing the power draw footprint. Because in a way, a computer is basically a grill/heater: you put electric power in, and nearly all of it comes out as heat. Only a few watts actually go into switching transistor gate states and so on. That means the power doing useful work is significantly less than the power consumed, so it makes sense to try to recover the energy that ends up as heat and never had a purpose in the first place (or to move to carbon-based transistors and the like, to reduce the amount of power dissipated).
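(To give a feel for the magnitude, here's a quick estimate with assumed numbers of how much low-grade heat a single server dumps per day, expressed as the amount of water it could in principle pre-heat.)

```python
# Rough estimate: heat dumped by one server vs. hot-water heating demand.
# Assumed figures: a 400 W server running 24 h, warming water by 40 K.

server_power_w = 400.0
seconds_per_day = 24 * 3600
heat_joules = server_power_w * seconds_per_day   # essentially all input power becomes heat

c_water = 4186.0   # J/(kg*K), specific heat of water
delta_t = 40.0     # K, e.g. 15 C mains water pre-heated towards 55 C

water_kg = heat_joules / (c_water * delta_t)
print(f"Heat per day: {heat_joules/3.6e6:.1f} kWh -> "
      f"could warm ~{water_kg:.0f} kg of water by {delta_t:.0f} K")
```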


3 hours ago, laminutederire said:

Snip

Using excess heat to provide heating to the surrounding area is a decent way to make use of some of the waste from our infrastructure (in this case a data center). It can be difficult to implement, though, due to regulation and other barriers.

 

I'm not sure if that's what you had in mind when talking about recovering heat energy.


5 minutes ago, Trixanity said:

Using excess heat to provide heating to the surrounding area is a decent way to make use of some of the waste from our infrastructure (in this case a data center). It can be difficult to implement, though, due to regulation and other barriers.

 

I'm not sure if that's what you had in mind when talking about recovering heat energy.

Well, there's that and swimming pool heating, yes, but that doesn't cover a constant need. You're not going to heat a swimming pool as much during the night or the summer, and the same goes for houses. I meant recovery more as in converting the heat back into electric power, to be reused in the data center or fed onto the power grid.


3 hours ago, laminutederire said:

Either or. It was modular enough to do both in principle. Direct die is always better since you can heat the fluid for the recovery cycle more. But since the reaction was pretty serious (toxicity-wise, for maintenance), it made sense not to do it directly. I thought about publishing it, but since it wasn't complete I don't know what journal would be interested.

Well, the thing is that we'll soon reach post-silicon processes where the thermal properties aren't the same anymore.

That being said, my approach was less about reducing noise than about reducing the power draw footprint. Because in a way, a computer is basically a grill/heater: you put electric power in, and nearly all of it comes out as heat. Only a few watts actually go into switching transistor gate states and so on. That means the power doing useful work is significantly less than the power consumed, so it makes sense to try to recover the energy that ends up as heat and never had a purpose in the first place (or to move to carbon-based transistors and the like, to reduce the amount of power dissipated).

I mean, most cooling systems already move heat, but nothing is really done with it for the most part. You use a refrigeration cycle to cool something and transfer the heat absorbed into something like water. Sometimes they do something with it, but a lot of the time nothing is done with it. The problem is that it's not often that you want heating and cooling at the same time, so having extra heat energy doesn't do much, and most don't try to run a heat engine off the extra heat.


1 minute ago, laminutederire said:

Well, there's that and swimming pool heating, yes, but that doesn't cover a constant need. You're not going to heat a swimming pool as much during the night or the summer, and the same goes for houses. I meant recovery more as in converting the heat back into electric power, to be reused in the data center or fed onto the power grid.

Yeah, I forgot to add that feeding power back into the grid is also a thing. Again, that can be a bitch to get up and running, and not purely from a technical standpoint. I think one of the data centers being built by Apple will feature energy recovery in some form, which was a mutual desire of both Apple and the local community. Can't remember the details though.


5 minutes ago, Trixanity said:

Yeah, I forgot to add that feeding power back into the grid is also a thing. Again, that can be a bitch to get up and running, and not purely from a technical standpoint. I think one of the data centers being built by Apple will feature energy recovery in some form, which was a mutual desire of both Apple and the local community. Can't remember the details though.

In Europe there are a lot of heat networks fed by data centers as well. Yeah, it's a bitch :P but I hope humanity will understand that ecology doesn't hurt! (And economically speaking it makes sense.) You can amortize the cost in less than 5 years if I recall correctly.


9 minutes ago, Brooksie359 said:

I mean, most cooling systems already move heat, but nothing is really done with it for the most part. You use a refrigeration cycle to cool something and transfer the heat absorbed into something like water. Sometimes they do something with it, but a lot of the time nothing is done with it. The problem is that it's not often that you want heating and cooling at the same time, so having extra heat energy doesn't do much, and most don't try to run a heat engine off the extra heat.

Yeah, that's why I researched converting it into something else reusable!


3 hours ago, laminutederire said:

Yeah, that's why I researched converting it into something else reusable!

Maybe you could try introducing the heated water into the hot water supply? If you could make the water heater not have to work as hard, it would definitely save power overall.


1 hour ago, Brooksie359 said:

Maybe you could try introducing the heated water into the hot water supply? If you could make the water heater not have to work as hard, it would definitely save power overall.

The biggest barrier there is regulation. You'd have to have your equipment sanitized fairly often to comply with health standards and so on. That's on top of finding a location that's well situated power- and internet-access-wise and that also lets you connect to the public water pipes. In a way it's necessary for health reasons, but it keeps many companies from jumping on that kind of train.


3 hours ago, laminutederire said:

The biggest barrier there is regulation. You'd have to have your equipment sanitized fairly often to comply with health standards and so on. That's on top of finding a location that's well situated power- and internet-access-wise and that also lets you connect to the public water pipes. In a way it's necessary for health reasons, but it keeps many companies from jumping on that kind of train.

If the water you are using to cool the refrigerant comes from the city anyway, why does it matter? Generally the same city water is used everywhere in a building.


3 hours ago, Brooksie359 said:

If the water you are using to cool the refrigerant comes from the city anyway, why does it matter? Generally the same city water is used everywhere in a building.

Water held warm but below boiling temperature is generally a bad thing (growth), which is why the hot and cold water in such systems are kept isolated and heat is transferred to the cold water through a heat exchanger rather than by direct water flow.

 

[image attachment]


14 hours ago, leadeater said:

Water held warm but below boiling temperature is generally a bad thing (growth), which is why the hot and cold water in such systems are kept isolated and heat is transferred to the cold water through a heat exchanger rather than by direct water flow.

 

[image attachment]

I mean, you would be feeding the warmed water back into the water heater if you were to try to create such a system.


57 minutes ago, Brooksie359 said:

I mean, you would be feeding the warmed water back into the water heater if you were to try to create such a system.

You might still get growth in the pre-heating water path; it's safer to use a heat exchanger than to hope the final boiling stage kills anything that might now be in it. That Asetek system I posted can be fed into building hot water or a central heating system.


54 minutes ago, leadeater said:

You might still get growth in the pre-heating water path; it's safer to use a heat exchanger than to hope the final boiling stage kills anything that might now be in it. That Asetek system I posted can be fed into building hot water or a central heating system.

Yeah, I guess that would make sense. If it exchanges heat with the water before the heater, it would be a good idea.


2 minutes ago, Brooksie359 said:

Yeah, I guess that would make sense. If it exchanges heat with the water before the heater, it would be a good idea.

The issue might then be that you can't heat the water up enough.


46 minutes ago, laminutederire said:

The issue might then be that you can't heat the water up enough.

You aren't trying to heat the water up enough to make it hot water on its own. You are just pre-heating it so the water heater doesn't have to heat it up as much.
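(A quick sketch of the pre-heating saving, with assumed temperatures, shows why this is worthwhile even without making the water fully hot.)

```python
# Energy saved by pre-heating domestic hot water with server waste heat.
# Assumed temperatures: 15 C mains, 35 C after pre-heat, 60 C delivery.

t_mains, t_preheated, t_target = 15.0, 35.0, 60.0

full_lift = t_target - t_mains           # 45 K the heater must supply without pre-heat
remaining_lift = t_target - t_preheated  # 25 K still needed with pre-heat

saving = 1.0 - remaining_lift / full_lift
print(f"Water heater energy saved: {saving:.0%}")  # ~44% for these assumed temperatures
```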

