
Electronics of the future could be self-cooling.

Guest Jagobeth

I wouldn't be surprised if they started to create their own energy to power themselves (possibly using the energy from the heat).


Electronics of the future will use superconducting materials, and with zero resistance there's no heat! (Assuming a room-temperature superconductor is ever made/discovered.)

CPU: Intel 3570 GPUs: Nvidia GTX 660Ti Case: Fractal design Define R4  Storage: 1TB WD Caviar Black & 240GB Hyper X 3k SSD Sound: Custom One Pros Keyboard: Ducky Shine 4 Mouse: Logitech G500

 


Electronics of the future will build themselves out of the box.


The problem, though, is that as soon as this tech becomes a reality we will simply squeeze in more cores and more transistors, so there will always be heat to be taken away. And that is a good thing, because it means we will keep getting more powerful hardware every year.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


What does this have to do with Noctua?

Well, making components that cool themselves eliminates the need for heatsinks or fans, and since that's all Noctua makes, it kinda has everything to do with them...


Well, making components that cool themselves eliminates the need for heatsinks or fans, and since that's all Noctua makes, it kinda has everything to do with them...

 

Oh, you meant goodbye! For some reason I thought you were under the impression that Noctua had bought this technology :D


I could be very wrong on this, but I think the article is wrong... the fact that the more graphene you have, the better heat transfer you get does not mean the end of heatsinks and fans. It would mean you could transfer heat away quicker, but the heat still needs to go somewhere. The better takeaway from this article is the potential for heatsinks and fans: if graphene is, for example, 100x more efficient, then you need a 100th of the surface area... or, by keeping the heatsinks the same size, you could get cooling that will really heat up your room :P (by pushing your CPU's output even higher).
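
To put rough numbers on that surface-area point, here's a minimal Fourier's-law sketch (Q = kAΔT/L). All values are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope Fourier conduction: Q = k * A * dT / L.
# All numbers below are illustrative assumptions, not measured values.

def area_for_heat(q_watts: float, k: float, dT: float, L: float) -> float:
    """Contact area (m^2) needed to conduct q_watts through a slab."""
    return q_watts * L / (k * dT)

Q = 100.0        # heat to move away from the chip, W
dT = 40.0        # temperature drop across the slab, K
L = 0.002        # slab thickness, m

k_aluminum = 205.0      # W/(m*K), typical textbook value for aluminum
k_hundred_x = 20500.0   # hypothetical material 100x more conductive

a_al = area_for_heat(Q, k_aluminum, dT, L)
a_hx = area_for_heat(Q, k_hundred_x, dT, L)
print(f"aluminum slab: {a_al * 1e4:.3f} cm^2")
print(f"100x material: {a_hx * 1e4:.5f} cm^2 ({a_al / a_hx:.0f}x less area)")
```

Same heat moved, same temperature drop, 100x the conductivity: exactly 1/100th the area, just as described above.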

0b10111010 10101101 11110000 00001101


I wouldn't be surprised if they started to create their own energy to power themselves (possibly using the energy from the heat).

Awesome idea!

 

  1. GLaDOS: i5 6600 EVGA GTX 1070 FE EVGA Z170 Stinger Cooler Master GeminS524 V2 With LTT Noctua NFF12 Corsair Vengeance LPX 2x8 GB 3200 MHz Corsair SF450 850 EVO 500 Gb CableMod Widebeam White LED 60cm 2x Asus VN248H-P, Dell 12" G502 Proteus Core Logitech G610 Orion Cherry Brown Logitech Z506 Sennheiser HD 518 MSX
  2. Lenovo Z40 i5-4200U GT 820M 6 GB RAM 840 EVO 120 GB
  3. Moto X4 G.Skill 32 GB Micro SD Spigen Case Project Fi

 


This sounds really cool. Can't quite imagine what it would look like, though.


I wouldn't be surprised if they started to create their own energy to power themselves (possibly using the energy from the heat).

Except that you have just "invented" a perpetual motion machine of sorts. Those are impossible, especially for a computer. There is simply no way a device can power itself, because it cannot recapture the energy it outputs with 100% efficiency; and even if it could, nothing would be gained, because part of that energy goes into the actual work of computing.

I will be interested to see where this goes, but the way they explained it in the article didn't quite add up: as @WanderingFool said, this wouldn't eliminate the need for cooling, just make it possible for cooling to be more efficient and effective.
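
For intuition on why the loop can't close, here's a quick sketch of the Carnot limit, with assumed temperatures (not from the article). Even a perfect heat engine between a hot chip and room air recovers only a small fraction of the dissipated power:

```python
# Why a chip can't run on its own waste heat: the Carnot limit caps
# how much of that heat ANY engine can turn back into work.
# Temperatures are assumed for illustration.

T_hot = 353.15    # chip hot spot, K (~80 C)
T_cold = 298.15   # room ambient, K (~25 C)

carnot_eff = 1.0 - T_cold / T_hot   # upper bound for any heat engine
waste_heat = 100.0                  # W the chip dissipates

recoverable = carnot_eff * waste_heat
print(f"Carnot limit:       {carnot_eff:.1%}")   # ~15.6%
print(f"Best-case recovery: {recoverable:.1f} W of {waste_heat:.0f} W")
```

Even at the theoretical ceiling, only about 16 W of 100 W come back, and real devices manage a small fraction of that, so a computer can never run on its own dissipation.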

HTTP/2 203


I remember thinking about technology like this and how damn awesome it'd be, but I just made myself sad since I love watercooling and all that it entails.

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling


I could be very wrong on this, but I think the article is wrong... the fact that the more graphene you have, the better heat transfer you get does not mean the end of heatsinks and fans. It would mean you could transfer heat away quicker, but the heat still needs to go somewhere. The better takeaway from this article is the potential for heatsinks and fans: if graphene is, for example, 100x more efficient, then you need a 100th of the surface area... or, by keeping the heatsinks the same size, you could get cooling that will really heat up your room :P (by pushing your CPU's output even higher).

 

You are not wrong. In fact, Thermene's gen-2 graphene-based thermal paste is proving pretty impressive in my tests. The batch I tested has better thermal conductivity than AS5 and MX4 and is on par with PK-3. This particular batch is not even the best they can make, so I am very optimistic about this product. Check out their website if anyone is interested (I obviously have no vested interest in the product): http://thermene.com/


The problem, though, is that as soon as this tech becomes a reality we will simply squeeze in more cores and more transistors, so there will always be heat to be taken away. And that is a good thing, because it means we will keep getting more powerful hardware every year.

Umm, did you read the article or just the title?? The only reason heat is generated is the inefficiency of silicon. Graphene is 100% efficient, meaning there will be no heat.

Finally my Santa hat doesn't look out of place


The title is over-sensationalized, to be honest.

The article says that graphene would have little to no resistance to heat transfer. Unlike other metals and alloys, graphene also increases its heat transmittance with an increase in area (or volume).

 

Meaning that heat could be conducted away and dispelled better, not that chips become self-cooling. In CPUs this could mean the integrated heat spreader could be made out of graphene, allowing more heat to be transferred effectively, without the limiting heat capacity of aluminum or other metals.

▶ Learn from yesterday, live for today, hope for tomorrow. The important thing is not to stop questioning. - Einstein◀

Please remember to mark a thread as solved if your issue has been fixed; it helps others who may stumble across the thread at a later point in time.


I wouldn't be surprised if they started to create their own energy to power themselves (possibly using the energy from the heat).

 

I think this is too good to be true: what this article is implying is basically perpetual free energy. Even on a smaller scale, I'll stick to "I'll believe it when I see it."

-------

Current Rig

-------


Umm, did you read the article or just the title?? The only reason heat is generated is the inefficiency of silicon. Graphene is 100% efficient, meaning there will be no heat.

Actually, I think you are thinking of stanene (which is still a theoretical substance, as it hasn't been created and tested in real life). Graphene still has inefficiencies in electrical transfer, and the article was talking about heat transfer. As I have said, I think the article came to the wrong conclusion and misinterpreted the findings. Having the ability to transfer away more heat does not equate to making the overall system cooler. An analogy to what I am saying: put out two blocks of ice, one with a regular aluminum heatsink on it and the other with a graphene heatsink of the same size. Which melts first at room temp? The graphene one, because it is heating the ice faster... so concluding that graphene can cool something down still depends on heat output vs. cooling efficiency (rough numbers sketched at the end of this post).

 

I think this is too good to be true: what this article is implying is basically perpetual free energy. Even on a smaller scale, I'll stick to "I'll believe it when I see it."

I agree; the article is taking a minor discovery and turning it into the next big thing by throwing all logic out the window.
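
To put toy numbers on the ice-block analogy above, a minimal sketch; the conductivities are rough literature ballparks assumed for illustration, not anything measured in the article:

```python
# Toy version of the ice-block analogy: heat flowing INTO the ice
# through equal-sized slabs (Fourier's law, Q = k * A * dT / L).
# Conductivities are rough ballpark values, assumed for illustration.

def heat_flow(k: float, area: float, dT: float, thickness: float) -> float:
    """Conductive heat flow through a slab, in watts."""
    return k * area * dT / thickness

A = 0.0004    # 2 cm x 2 cm contact patch, m^2
dT = 25.0     # room (~25 C) vs ice (0 C), K
L = 0.005     # slab thickness, m

q_al = heat_flow(205.0, A, dT, L)     # aluminum, ~205 W/(m*K)
q_gr = heat_flow(3000.0, A, dT, L)    # graphene, in-plane, order 10^3 W/(m*K)

print(f"into ice via aluminum: {q_al:7.0f} W")
print(f"into ice via graphene: {q_gr:7.0f} W  -> that block melts first")
```

Better conduction just moves heat to the ice faster; whether the system as a whole runs cooler still depends on where that heat ultimately goes.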

0b10111010 10101101 11110000 00001101


Except that you have just "invented" a perpetual motion machine of sorts. Those are impossible, especially for a computer. There is simply no way a device can power itself, because it cannot recapture the energy it outputs with 100% efficiency; and even if it could, nothing would be gained, because part of that energy goes into the actual work of computing.

I will be interested to see where this goes, but the way they explained it in the article didn't quite add up: as @WanderingFool said, this wouldn't eliminate the need for cooling, just make it possible for cooling to be more efficient and effective.

Never said it was going to happen now, but I can imagine that at some point in the future, some of the energy from the heat could instead be converted into another form of energy (possibly electrical). Who knows, maybe something like that would even act as a form of cooling. It probably won't provide enough power to run the entire system, but it would beat wasting the heat outright and then spending even more power to cool it. Technology will surprise us. I'm sure of it.
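
As a sketch of that bookkeeping, assuming a thermoelectric generator (TEG) on the hot side; the 5% conversion figure and the fan power draw are assumptions, not numbers from the article:

```python
# Net-power bookkeeping for harvesting waste heat with a thermoelectric
# generator (TEG). Efficiency and fan power are assumed figures;
# real TEG modules typically convert only a few percent.

waste_heat = 100.0   # W dissipated by the chip
teg_eff = 0.05       # assumed 5% heat-to-electricity conversion
fan_power = 3.0      # W an active cooling fan might draw

recovered = waste_heat * teg_eff
net = recovered - fan_power
print(f"recovered {recovered:.1f} W, fan costs {fan_power:.1f} W, net {net:+.1f} W")
```

A few watts back won't run the system, but it illustrates the point: partial recovery can beat pure waste plus the cost of active cooling.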


Actually, I think you are thinking of stanene (which is still a theoretical substance, as it hasn't been created and tested in real life). Graphene still has inefficiencies in electrical transfer, and the article was talking about heat transfer. As I have said, I think the article came to the wrong conclusion and misinterpreted the findings. Having the ability to transfer away more heat does not equate to making the overall system cooler. An analogy to what I am saying: put out two blocks of ice, one with a regular aluminum heatsink on it and the other with a graphene heatsink of the same size. Which melts first at room temp? The graphene one, because it is heating the ice faster... so concluding that graphene can cool something down still depends on heat output vs. cooling efficiency.

 

I agree; the article is taking a minor discovery and turning it into the next big thing by throwing all logic out the window.

Yeah, I was totally off base and now feel stupid :P

Finally my Santa hat doesn't look out of place


We are still quite a way away from that. Graphene is SO interesting in tons of ways for future technology, but there's still a long way to go.

 

The discovery of graphene (outside of theory) earned Geim and Novoselov a Nobel Prize in 2010, so we are very much in the early stages.

The Mistress: Case: Corsair 760t   CPU:  Intel Core i7-4790K 4GHz(stock speed at the moment) - GPU: MSI 970 - MOBO: MSI Z97 Gaming 5 - RAM: Crucial Ballistic Sport 1600MHZ CL9 - PSU: Corsair AX760  - STORAGE: 128Gb Samsung EVO SSD/ 1TB WD Blue/Several older WD blacks.

                                                                                        


I think this is too good to be true: what this article is implying is basically perpetual free energy. Even on a smaller scale, I'll stick to "I'll believe it when I see it."

 

He reveals the trick behind a popular "free energy" hoax that came before his video, then performs a trick of his own at the end. Any idea what's going on at the end of this video? I'm thinking he cut a hole in the carpet, but I'm not sure...

 


Umm, did you read the article or just the title?? The only reason heat is generated is the inefficiency of silicon. Graphene is 100% efficient, meaning there will be no heat.

There is no such thing as 100% efficiency. It might be close to 100%, but it won't be 100%; otherwise we could power the world with perpetual motion machines and have infinite electricity. That isn't possible, so this article is false.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard

