A future with only passively cooled ARM chips

kasdashd

I can't see how we'll solve climate change in time without drastically cutting down on personal PC and server power consumption. 

 

Increasing cooling and/or power for a GPU or CPU is just an excuse not to innovate and make chips efficient enough that they can get by with passive cooling alone, unlike the RTX 4090, which uses up to 500W of power!

 

ARM chips are also way more efficient than x86 chips in terms of performance per watt, so it's no wonder that's where everyone is headed right now, even if Windows, Intel, and AMD still have a long way to go, with Apple taking the lead with their M-series chips.

 

Performance at any cost is not impressive. Efficiency is impressive and interesting, because it shows real technological advancement, and that benefits science, not just personal and enterprise computers.


25 minutes ago, kasdashd said:

I can't see how we'll solve climate change in time without drastically cutting down on personal PC and server power consumption. 

Nuclear power

 

25 minutes ago, kasdashd said:

Increasing cooling and/or power for a GPU or CPU is just an excuse not to innovate and make chips efficient enough that they can get by with passive cooling alone, unlike the RTX 4090, which uses up to 500W of power!

Moore's law may or may not be dead, but GPUs have gotten dramatically more efficient. In the race for more performance, though, we've hit a limitation of physics.

 

The RTX 4090 is an extreme example, and it can go above 600W, not 500W. It's a 450W TDP card with a factory vBIOS that allows a 133% power limit (~600W), and with the PCIe slot's 75W on top, the card has 675W available to it, which it can reach in my experience.

 

An argument against your narrative is the Steam Deck, which can comfortably play AAA games at decent settings within a total 15W TDP.

 

If you're expecting RTX 4090 performance for <400W, then you'll have to wait for a dramatic improvement in computer engineering, since it's simply a limitation of physics at the moment. Sure, we might incrementally get closer, something that DLSS does assist with. In a game like Warframe, I get lower power draw when enabling DLSS with little to no change in visual quality, demonstrating that running 1440p with AI upscaling to 4K is more efficient than raw rasterization at 4K.
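To put rough numbers on that (a back-of-the-envelope sketch; the power and frame rate figures are hypothetical placeholders, not measurements from Warframe):

```python
# Energy per rendered frame (J) = board power (W) / frame rate (1/s).
# All figures below are assumed for illustration only.

def joules_per_frame(board_power_w: float, fps: float) -> float:
    return board_power_w / fps

native_4k = joules_per_frame(board_power_w=420, fps=100)  # assumed native 4K
dlss      = joules_per_frame(board_power_w=300, fps=120)  # assumed 1440p + upscale

print(f"native 4K: {native_4k:.2f} J/frame")
print(f"DLSS:      {dlss:.2f} J/frame")
print(f"energy saved per frame: {1 - dlss / native_4k:.0%}")
```

Same (or better) delivered frames with fewer joules per frame is the efficiency win, whatever the exact numbers are on a given system.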

 

25 minutes ago, kasdashd said:

Performance at any cost is not impressive. Efficiency is impressive and interesting, because it shows real technological advancement, and that benefits science, not just personal and enterprise computers.

Daily drive a Steam Deck or ROG Ally; it's a very acceptable experience and is environmentally friendly as far as power draw goes. I did it for over three weeks while living in a hotel between house closing dates, with only my phone and a Steam Deck OLED.

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


22 minutes ago, kasdashd said:

Increasing cooling and/or power for a GPU or CPU is just an excuse not to innovate and make chips efficient enough that they can get by with passive cooling alone, unlike the RTX 4090, which uses up to 500W of power!

The 4090 is one of the most efficient GPUs for heavy workloads (which is what it's designed for). Using Folding@Home as an example, it's 6th for PPD/kWh: https://folding.lar.systems/gpu_ppd/overall_ranks_power_to_ppd. The only GPUs beating it are other Ada Lovelace (4000 series) cards that are running further inside their efficiency curve than the 4090. It's more efficient than every GPU before it. The gap between the Ada cards and even the previous generation is pretty wide, and that's one single generation; if you compared it to Maxwell or Kepler (decade-ish old architectures), the results would be comical. And they are: the 4090 does ~2.4M points per kWh, while the Maxwell 980 Ti does... 0.184M points for the same kWh of power draw.
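A quick sanity check on the size of that gap, using the two PPD/kWh figures quoted above:

```python
# Folding@Home points per kWh, as cited above.
rtx_4090_ppd_per_kwh  = 2.4e6    # Ada Lovelace (2022)
gtx_980ti_ppd_per_kwh = 0.184e6  # Maxwell (2015)

ratio = rtx_4090_ppd_per_kwh / gtx_980ti_ppd_per_kwh
print(f"The 4090 does ~{ratio:.0f}x the work per kWh of a 980 Ti")  # ~13x
```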

Intel HEDT and Server platform enthusiasts: Intel HEDT Xeon/i7 Megathread 

 

Main PC 

CPU: i9 7980XE @4.5GHz/1.22v/-2 AVX offset 

Cooler: EKWB Supremacy Block - custom loop w/360mm +280mm rads 

Motherboard: EVGA X299 Dark 

RAM:4x8GB HyperX Predator DDR4 @3200Mhz CL16 

GPU: Nvidia FE 2060 Super/Corsair HydroX 2070 FE block 

Storage:  1TB MP34 + 1TB 970 Evo + 500GB Atom30 + 250GB 960 Evo 

Optical Drives: LG WH14NS40 

PSU: EVGA 1600W T2 

Case & Fans: Corsair 750D Airflow - 3x Noctua iPPC NF-F12 + 4x Noctua iPPC NF-A14 PWM 

OS: Windows 11

 

Display: LG 27UK650-W (4K 60Hz IPS panel)

Mouse: EVGA X17

Keyboard: Corsair K55 RGB

 

Mobile/Work Devices: 2020 M1 MacBook Air (work computer) - iPhone 13 Pro Max - Apple Watch S3

 

Other Misc Devices: iPod Video (Gen 5.5E, 128GB SD card swap, running Rockbox), Nintendo Switch


35 minutes ago, kasdashd said:

ARM chips are also way more efficient than x86 chips in terms of performance per watt

They are not. Just because your low-power phones (which are MEANT to be low power) use less power, and because Apple made a pretty good chip (good luck finding another ARM chip that's as efficient), that doesn't mean that any ARM chip is that good.

 

Since you mentioned servers: the best ARM CPUs at the moment (Ampere Altra) use 250~350W each. Their efficiency wasn't much better than the Epycs released at the same time, and now they get heavily beaten by both Intel's and AMD's current offerings on both power consumption and performance.

 

Apart from that, others have already explained that we did improve on efficiency; we just demand even more performance and push it to extremes nowadays.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


1 hour ago, kasdashd said:

I can't see how we'll solve climate change in time without drastically cutting down on personal PC and server power consumption. 

How much of the world's total power consumption is accounted for by personal computers, though?

 

1 hour ago, kasdashd said:

ARM chips are also way more efficient than x86 chips in terms of performance per watt, so it's no wonder that's where everyone is headed right now, even if Windows, Intel, and AMD still have a long way to go, with Apple taking the lead with their M-series chips.

People who make (a lot of) money from computers and pay the corresponding power bill care much more than consumers do about efficiency and density, and they aren't exactly rushing away from Epycs and Quadros yet. There are reasons for that, as @igormp points out. That may change, but make no mistake: ARM isn't some form of magic when it comes to transforming power into useful output. Driving a pick-up truck for your daily suburban commute is stupidly inefficient, yet you can't efficiently replace it with a fleet of Beetle-like cars when it comes to actual heavy lifting.


ARM is a RISC architecture. What's RISC? Reduced Instruction Set Computing. Reduced. That's why it's more efficient at doing very specific things. x86-64 is the jack of all trades, master of none of computing: it'll process pretty much anything you throw at it, at the price of being less efficient. That's how things are nowadays.

And not everyone uses a 4090 or one of the latest meme Intel CPUs that draw like 500 watts. Normal desktop computers are already efficient: you have C-states, auto sleep, and low-power SSDs, and a normal LED monitor draws like 15 watts. Laptops are more efficient still if you don't play games at all.

 

For the master plan to work, you'd need all computers to be the same, with minor variations perhaps in memory size, clocks, etc., but they would all need to share the same chipset, CPU, GPU, and memory, and the OS would ONLY be coded to work with that specific set of components.

 

And once again, you can't "fix climate change"; that's not how it works. If you want to cut down on power usage, you'd better stop looking at computers and start looking at air conditioning and shitty building design. For every A(mpere) used by a computer you have 50,000 more used by air conditioners everywhere.

Isn't it weird how "activists" and the media never talk about that elephant in the room? It's always the cow farts, or cars, or incandescent light bulbs, or desktop computers, or fishing, but never the A/C that keeps them living comfortably. Weird.

Caroline doesn't need to hear all this, she's a highly trained professional.


3 minutes ago, Caroline said:

but never the A/C that keeps them living comfortably.

You'd have to pry my A/C from my cold (from A/C) dead hands.


32 minutes ago, Caroline said:

but never the A/C that keeps them living comfortably.

Or even uncomfortable. Like every time I go to a place heated up to 25ºC in winter, when we're all coming in from the street in warm clothes, and then find it cooled down to 17ºC in summer, when people arrive in shorts...


If anything, we need more air conditioning systems, in the form of reversible heat pumps. A cold-climate air-source heat pump is 3 to 5 times more efficient than resistive electric heating. Even high-efficiency condensing gas furnaces are less efficient than a heat pump running off electricity generated from that same fuel in a power plant.
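The arithmetic behind that last claim, assuming round numbers (a ~55%-efficient combined-cycle plant, ~5% grid losses, and a seasonal COP of 3; none of these are from a specific installation):

$$\underbrace{0.55}_{\text{power plant}} \times \underbrace{0.95}_{\text{grid}} \times \underbrace{3.0}_{\text{heat pump COP}} \approx 1.57 \;>\; \underbrace{0.95}_{\text{condensing furnace}}$$

Per unit of fuel burned, the plant-plus-heat-pump path delivers roughly 1.6 units of heat into the house, versus about 0.95 for burning the same gas in a high-efficiency furnace.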

 

Those roll-around portable air conditioners and window units at the big-box store are Bronze Age tech in comparison.

I sold my soul for ProSupport.


2 hours ago, Caroline said:

ARM is a RISC architecture. What's RISC? Reduced Instruction Set Computing. Reduced. That's why it's more efficient at doing very specific things. x86-64 is the jack of all trades, master of none of computing: it'll process pretty much anything you throw at it, at the price of being less efficient. That's how things are nowadays.

FWIW, that's a pretty outdated definition given what modern µarches look like. An Mx chip from Apple is hella complex, and the front-end is just a minor part of it.

x86 also decodes many of its instructions into µops that are pretty RISC-like.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


I think the personal & office PC angle is already getting covered (our gaming PCs aside). The processing power most people need has grown far more slowly than the power available. For personal use, most people would be fine with a tablet.

 

Servers, with the rise of AI, are another matter. Probably the best solution available now is to use the waste heat for something useful. District heating, or heating for industrial greenhouses, for example.  


On 4/12/2024 at 1:25 PM, kasdashd said:

I can't see how we'll solve climate change in time without drastically cutting down on personal PC and server power consumption. 

 

Increasing cooling and/or power for a GPU or CPU is just an excuse not to innovate and make chips efficient enough that they can get by with passive cooling alone, unlike the RTX 4090, which uses up to 500W of power!

 

ARM chips are also way more efficient than x86 chips in terms of performance per watt, so it's no wonder that's where everyone is headed right now, even if Windows, Intel, and AMD still have a long way to go, with Apple taking the lead with their M-series chips.

 

Performance at any cost is not impressive. Efficiency is impressive and interesting, because it shows real technological advancement, and that benefits science, not just personal and enterprise computers.

You are confusing energy usage with efficiency.


Efficiency is the watts needed to solve a problem.
ARM is NOT more efficient than x86 overall; it only is at sub-5W, and even saying that muddies the waters.

If you have a server pulling 1500W to do a petaflop of math, a 1.5W microcomputer has to do more than a teraflop of math to be considered more efficient. If it's only doing sub-1 teraflop, the 1500W server is MORE efficient.

So yes, you see Nvidia keep upping the wattage every generation on their server parts, but they're doing MORE math; a 1.5x increase in power for a 2.25x increase in math performance is worth it. And cooling has gotten more efficient and better at using water with new data center practices.
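That break-even point falls straight out of the arithmetic (a toy comparison using the numbers above):

```python
# Efficiency = useful work per watt. Compare a 1500W server doing
# 1 PFLOP/s against a 1.5W microcomputer.
server_flops, server_watts = 1e15, 1500.0
micro_watts = 1.5

# The small machine matches the server's FLOP/s-per-watt at exactly
# 1/1000th of its throughput, i.e. 1 TFLOP/s.
breakeven = server_flops * (micro_watts / server_watts)
print(f"break-even for the 1.5W part: {breakeven:.0e} FLOP/s")

# Below that (say 0.8 TFLOP/s), the big server is the MORE efficient machine:
micro_flops = 0.8e12
print(f"server: {server_flops / server_watts:.2e} FLOP/s per watt")
print(f"micro:  {micro_flops / micro_watts:.2e} FLOP/s per watt")
```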

 

  

On 4/12/2024 at 5:07 PM, igormp said:

FWIW, that's a pretty outdated definition given what modern µarches look like. An Mx chip from Apple is hella complex, and the front-end is just a minor part of it.

x86 also decodes many of its instructions into µops that are pretty RISC-like.

All RISC chips do µops now as well.
CISC and RISC chips (other than like one oddball ARM core that uses a microwatt and sits inside HDMI repeaters) are all out-of-order execution, and are superscalar.

CISC just has, well... more instructions, aka the more complicated front end, and that is the main reason it struggles to compete at sub-1 watt: you can't simplify that away. x86 has always done µops, even back with the 8086.

ADD AL, [BX]

is two µops, and was an ASM instruction on the 8086.

It's just that ARM makes me go

LDR R0, [R2]
ADD R1, R1, R0

But on ARM I can also do

LDMIA R2!, {R0, R3}

and that's like 8 µops if I had to guess.


People put too much stock in the RISC/CISC debate.

Every new version of ARM adds more instructions, "complicating" the front end more; it's just far more selective than x86. But that also means it can't accelerate as many tasks: there are media decode instructions for x86 that make it ASIC-like for those kinds of tasks, aka more efficient than ARM.

 

  

On 4/12/2024 at 2:44 PM, Caroline said:

ARM is a RISC architecture. What's RISC? Reduced Instruction Set Computing. Reduced. That's why it's more efficient at doing very specific things. x86-64 is the jack of all trades, master of none of computing: it'll process pretty much anything you throw at it, at the price of being less efficient. That's how things are nowadays.

I would argue you have that backwards: x86 is a master at many things, just not at being generalized. The MORE specific the task, the better x86 is than ARM, until you get to the point where you want an ASIC.

ARM generally only has generalized instructions that you have to chain together to do your specific task, which is less efficient.


On 4/12/2024 at 2:40 PM, SpaceGhostC2C said:

People who make (a lot of) money from computers and pay the corresponding power bill care much more than consumers do about efficiency and density, and they aren't exactly rushing away from Epycs and Quadros yet. There are reasons for that, as @igormp points out. That may change, but make no mistake: ARM isn't some form of magic when it comes to transforming power into useful output. Driving a pick-up truck for your daily suburban commute is stupidly inefficient, yet you can't efficiently replace it with a fleet of Beetle-like cars when it comes to actual heavy lifting.

Bingo. I often think of Andrei Alexandrescu's remark from (I believe) at least one iteration of his presentation "Fastware" (https://www.youtube.com/results?search_query=alexandrescu+fastware). Something like:

 

"When I was at Facebook, power efficiency was job one. At the scales and workloads we ran, if you could improve the power efficiency of a process by 1%, you could save your annual salary in the reduction of one month of one center's electricity bill. We constantly A/B tested all these approaches, different server architectures, programming languages, ARM chips, more exotic hardware... Time after time, we almost always found that the way to save the most power was to go fast - use whatever hardware and software would complete the task in the shortest possible real linear time - and then sleep. Or, in practice, pack that task into many fewer servers."

 

Which is not to say that most consumer computer use (web browsing, comms, light gaming) is the same as general-purpose computing. Efficiency per se isn't the same problem as accomplishing a given task at low "cost". You may well be able to get around happily on one or two orders of magnitude less power by putting ARM Linux on a docked tablet or phone or an Android box, or by using an x86 Chromebook or netbook. I daily drove an X3 Steam machine for a while and was fine.
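"Go fast, then sleep" is easy to see in energy terms. A toy race-to-idle model (all wattages and times assumed for illustration):

```python
# Energy over a fixed window: run the task at full tilt, then idle out
# the remainder. A faster, hungrier machine can still win on total
# energy because it spends most of the window asleep.
def window_energy_j(active_w: float, task_s: float,
                    idle_w: float, window_s: float) -> float:
    return active_w * task_s + idle_w * (window_s - task_s)

WINDOW = 60.0  # both machines observed over the same 60 s

fast = window_energy_j(active_w=300, task_s=10, idle_w=5, window_s=WINDOW)
slow = window_energy_j(active_w=100, task_s=45, idle_w=5, window_s=WINDOW)

print(f"fast-then-sleep: {fast:.0f} J")  # 3000 + 250 = 3250 J
print(f"slow-and-steady: {slow:.0f} J")  # 4500 + 75  = 4575 J
```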

 

Also, though I personally make a hobby out of following the frontiers of green tech and reducing my power use and other forms of waste, individualized climate guilt is corpo propaganda. Seek and destroy incandescent lightbulbs, get efficient home heating/cooling if you're in control of it, use sleep mode when you're away from your machine, drive ICEs as little as you can. Congratulations, you're done. Anything else, done for the rest of your life, will be wiped out by a single empty flight across the Pacific to maintain the paperwork on an airline's landing slots, or the smelting of aluminum to make this year's Wal-Mart branded Christmas decoration. Your entertainment isn't killing the planet.


On 4/12/2024 at 2:40 PM, SpaceGhostC2C said:

How much of the world's total power consumption is accounted for by personal computers, though?

From https://frontiergroup.org/resources/fact-file-computing-is-using-more-energy-than-ever/

 

"In 2020, the information and communication technology sector as a whole, including data centers, networks and user devices, consumed about 915 TWh of electricity, or 4-6% of all electricity used in the world."

 

"Data centers consumed 240-340 terawatt-hours of electricity in 2022..."

 

"Globally, cryptocurrency mining consumed 110 TWh of electricity in 2022."

 

That's interesting, but we still don't really know how much of that total is from home computing. Let's see what the US Energy Information Administration has to say about it: https://www.eia.gov/energyexplained/electricity/use-of-electricity.php

 

[EIA charts: U.S. residential electricity consumption by end use]

In the USA, home computers and their related equipment account for only 2.3% of residential electricity usage. That's only about 34.5 TWh. Heating and cooling use by far the most energy of any category. So home computers, at least in the USA, consume a lot less electricity than you might imagine. This makes sense: the average user doesn't have a high-power i7 or an RTX 4090; they're more likely to have an i3 or i5 and a basic graphics card, if not just integrated graphics.
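The two figures are also self-consistent (a trivial cross-check):

```python
# If home computing is 2.3% of US residential electricity and equals
# ~34.5 TWh, the implied residential total is ~1500 TWh, which is the
# right ballpark for the EIA's published figures.
share, computing_twh = 0.023, 34.5
print(f"implied residential total: {computing_twh / share:.0f} TWh")  # ~1500
```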

 

The people who run datacenters don't take energy efficiency lightly either. They have massive electricity bills to manage, so getting the most computing power per watt is going to be a priority.

Computer engineering grad student, cybersecurity researcher, and hobbyist embedded systems developer

 

Daily Driver:

CPU: Ryzen 7 4800H | GPU: RTX 2060 | RAM: 16GB DDR4 3200MHz C16

 

Gaming PC:

CPU: Ryzen 5 5600X | GPU: EVGA RTX 2080Ti | RAM: 32GB DDR4 3200MHz C16


They've been saying that for years: if you buy this LED you'll save money for a bit, but your power bill will just keep going up. Yeah, we save a lot of power elsewhere, but we also keep creating new things that use power.

There are a lot of things that just suck power.

If you really want to save power, build an off-grid house.

I have dyslexia plz be kind to me. dont like my post dont read it or respond thx

also i edit post alot because you no why...

Thrasher_565 hub links build logs

Corsair Lian Li Bykski Barrow thermaltake nzxt aquacomputer 5v argb pin out guide + argb info

5v device to 12v mb header

Odds and Sods Argb Rgb Links

 


20 minutes ago, thrasher_565 said:

They've been saying that for years: if you buy this LED you'll save money for a bit, but your power bill will just keep going up. Yeah, we save a lot of power elsewhere, but we also keep creating new things that use power.

There are a lot of things that just suck power.

If you really want to save power, build an off-grid house.

huh?


In the future, sapient AI and robots should perform all the labor, even the intellectual and artistic kind, leaving the rest of humanity to be hedonists living on universal basic income, with every single need catered to. In short, it shall be a utopian society that has achieved a post-scarcity economy by allowing AI to take all of our jobs. Who agrees with me?

Sudo make me a sandwich 


37 minutes ago, wasab said:

In the future, sapient AI and robots should perform all the labor, even the intellectual and artistic kind, leaving the rest of humanity to be hedonists living on universal basic income, with every single need catered to. In short, it shall be a utopian society that has achieved a post-scarcity economy by allowing AI to take all of our jobs. Who agrees with me?

Reminds me of Fallout 3... yeah, we discovered unlimited power but we destroyed the world...

A little Judge Dredd in there too, where we just build over things. Yeah, I don't think you're going to be jobless and get to do nothing...

 

I have dyslexia plz be kind to me. dont like my post dont read it or respond thx

also i edit post alot because you no why...

Thrasher_565 hub links build logs

Corsair Lian Li Bykski Barrow thermaltake nzxt aquacomputer 5v argb pin out guide + argb info

5v device to 12v mb header

Odds and Sods Argb Rgb Links

 


Things that would help much more toward preventing climate change, most of which we already have the technology to do far more of than we currently are:

  • Replace all gas and resistive heating with heat pumps.
  • Increase the insulation of buildings.
  • Build and use more public transport.
  • More electric cars and buses.
  • Build more clean energy production, including nuclear power.
  • Make the items people buy last longer.
  • Make people and companies keep what they have for longer.
  • Make datacenters use waste heat, for example to heat nearby buildings.
  • Repurpose materials more than we currently do.

New server hardware already becomes more and more efficient over time, and it would be much harder to accelerate that.

“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking


16 hours ago, BrandonLatzig said:

This thread makes me wonder how heat pumps work

Just like refrigerators, freezers, and air conditioners, they rely on the near-miracle that is the refrigeration cycle. ACs don't "make cold", they move heat.

 

The compressor squashes refrigerant into the condenser coil under high pressure, which raises its boiling point above ambient temperature. That causes the refrigerant to condense into liquid and shed heat into the coil, which has a fan blowing over it so that heat is dissipated into the space. The now cooler refrigerant passes through a constriction (sometimes a capillary tube, sometimes a metering valve), then sprays into the evaporator coil. Because it's not under pressure anymore, the refrigerant's boiling point plummets well below ambient, so it "boils". That means it suddenly needs to take on a lot of energy, which it does by sucking heat out of the evaporator coil and the air around it. Once the temperature and pressure stabilize, it's back into the compressor to go through the whole process all over again.

 

This moves heat energy away from the evaporator coil and concentrates it around the condenser coil.

 

This is very efficient in terms of energy consumed by the machine vs energy moved from place to place, because the refrigerant spontaneously does the job. All the machine has to do is create the necessary conditions for that to happen (run a compressor and a couple fans.)

 

Heat pumps are just air conditioners that can run "backwards" by reversing the roles of the evaporator and condenser coils.

 

Put the "condenser" indoors and use refrigerant with a boiling point well under -20 F, and you've got a system that can heat a house by scavenging heat from the outside air in the middle of winter.  It doesn't have to feel warm outside for that to work, as long as it's warmer than the refrigerant's boiling point.

 


I sold my soul for ProSupport.


4 hours ago, thrasher_565 said:

Reminds me of Fallout 3... yeah, we discovered unlimited power but we destroyed the world...

A little Judge Dredd in there too, where we just build over things. Yeah, I don't think you're going to be jobless and get to do nothing...

 

You think nuclear weapons mean unlimited power? Cute.

 

I am talking about Type III civilization technology here, and an economy in which average per capita wealth is literally an entire planet orbiting within a Dyson sphere. Average per-person energy expenditure could be the entire daily output of a star, and all goods could be assembled and fabricated at the molecular level, like the fabricators you see in Star Trek, while services are produced entirely by AI robots at our whims.

 

When we reach that level, we will be like literal gods.

Sudo make me a sandwich 


On 4/12/2024 at 7:25 PM, kasdashd said:

I can't see how we'll solve climate change in time without drastically cutting down on personal PC and server power consumption.

Performance at any cost is not impressive. Efficiency is impressive and interesting, because it shows real technological advancement, and that benefits science, not just personal and enterprise computers.


Unfortunately, there is no silver bullet. Feel free to reduce your footprint, but don't expect it to have a meaningful impact. The aging of the population also means that, more and more, the people who will suffer the worst impacts are in the minority. The next USA presidents have already exceeded the average life expectancy; they hardly have the best incentives to make policy for the people who are young now.


 

Also, China and other nations are developing, and energy use and CO2 emissions are directly linked to quality of life. And CO2 emissions are cumulative; it takes a long time for CO2 to be sealed back into the ground as rock.

 

Game theory makes a scenario where all nations agree to cut down on quality of life to mitigate the worst effects of climate change unlikely. Solving climate change will require a severe, thorough rearchitecting of human civilization.

 

I'll take some flak for this, but I think the bet with the highest chance of making a difference on climate change is rushing an artificial general intelligence, giving it the biggest supercomputer we can build, and having it solve two challenges:

  1. Design a scalable, cheap, easy-to-manufacture fusion power plant, and have everybody who can lift a hammer build them. No patents, no royalties, no nothing. Release the designs open source.
  2. Design batteries with 100X endurance and 20X energy density that can be manufactured cheaply from common materials, and electrify everything that can be electrified.

That would still leave out ships and planes, which could be powered with hydrogen or ammonia IF energy is cheap, green, and plentiful.


4 hours ago, Needfuldoer said:

Just like refrigerators, freezers, and air conditioners, they rely on the near-miracle that is the refrigeration cycle.

Ah, why thank you!
I've got to look into whether they work with normal ventilation systems or need their own (idk why they would, but I'd like to know before I make this suggestion to my parents).


20 hours ago, BrandonLatzig said:

Ah, why thank you!
I've got to look into whether they work with normal ventilation systems or need their own (idk why they would, but I'd like to know before I make this suggestion to my parents).

Either/or.

 

If you already have a forced-air system, the indoor coil of a heat pump can replace the indoor half of an air conditioning system. Then the furnace just runs its blower when the heat pump calls for it. (You can also leave the furnace in place as a backup heat source.)

 

If you have steam, hydronic (hot water) baseboards, or resistive electric heat, you'll need to install wall or ceiling cassettes. They commonly get installed on outside walls, but they can go anywhere the HVAC company can reach with refrigerant lines.

 

If you have electric heat, a heat pump is a no-brainer. At worst it will draw half as much electricity, and the vast majority of the time it will draw a third or less. I went from electric baseboard to a heat pump, and my only regret is that I didn't do it sooner.

I sold my soul for ProSupport.

