
Cooling a TV?

13 hours ago, Troika said:

Insulating what exactly?

Insulating the exit point to the window. It can be done somewhat cheaply (foam with holes for tubes) but it's an extra cost. Also if you're particularly concerned then insulation for the tubing.

 

 

13 hours ago, Troika said:

That would be accurate if the TV was outside and the ambient temp was 90~100F but its not, the ambient temp is between 72~74F so the delta is much more significant.

I had assumed your goal was to dump the heat outside, not to transfer it to your room more quickly. If your rad is outside, then the "ambient" is the outside temperature. 

The general point remains: plasma TVs run MUCH cooler (temperature-wise) than CPUs. If your TV is 100F in a 72F room and it's 100F outside... there's not going to be any heat transfer. If it's 110F outside, you'd actually be warming the screen.

 

If your plan is to dump the heat from the TV into the room, this will not make your room cooler; the net impact will be a rounding error. Coolers make the thing being cooled cooler, but they don't make heat cease to exist. They move it somewhere else. The whole "energy is neither created nor destroyed" thing.

https://en.wikipedia.org/wiki/Conservation_of_energy

 

Converting energy into mass is possible, but it's wildly impractical: it takes billions of dollars' worth of gear, plus enormous amounts of energy, to convert tiny amounts.

 

13 hours ago, Troika said:

That would be accurate if the TV was outside and the ambient temp was 90~100F but its not, the ambient temp is between 72~74F so the delta is much more significant. Also, the room temp isn't as important for a waterloop because the main limitation is the thermal mass of the water and the size of the radiator.

Let's assume you have a "perfectly" insulated water cooler that only transfers heat between the TV and the location of the radiator.

The TV generates heat. It might be 20C higher than the rest of your room, assuming the room otherwise can handle the heat dissipation. 

If the air being blown over the radiator is the same temperature as the radiator itself, no heat transfer will occur. With a 10C delta, a certain amount of heat transfers; with a 20C delta, each bit of air moving over the radiator carries away ~2x the heat energy. If it's hotter outside, the reverse happens (imagine blowing a hair dryer over a radiator in a cold room with the plate touching you... it'd transfer heat onto your body).

 

https://en.wikipedia.org/wiki/Newton's_law_of_cooling

 

Rate of Heat Transfer = Heat Transfer Coefficient * Surface Area * (Temp of Object - Temp of Environment)
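A rough sketch of that formula in code (the coefficient h and area A below are made-up illustrative numbers, not measurements of any real water-cooling kit):

```python
# Newton's law of cooling: Q = h * A * (T_object - T_env)
def heat_transfer_rate(h, area_m2, t_object_c, t_env_c):
    """Watts of heat leaving the object; negative means heat flows INTO it."""
    return h * area_m2 * (t_object_c - t_env_c)

H = 50.0    # W/(m^2*K) -- assumed combined coefficient for a fan-blown radiator
AREA = 0.5  # m^2 of effective radiator surface -- assumed

rad = 38.0  # ~100F radiator/coolant temperature, in C
print(heat_transfer_rate(H, AREA, rad, 28.0))  # 10C delta -> 250.0 W
print(heat_transfer_rate(H, AREA, rad, 18.0))  # 20C delta -> 500.0 W (exactly 2x)
print(heat_transfer_rate(H, AREA, rad, 43.0))  # hotter env -> -125.0 W (heat flows in)
```

Doubling the delta doubles the rate, and a negative delta means the "cooler" is warming the thing it touches.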

 

The heat transfer coefficient of a water cooler is OK. Mounting the plate to the back of a plastic TV hurts it, and so does weak mounting pressure (i.e. using thermal tape instead of bolting something down).

The ~500W from the TV is spread out over around 600 in^2. If your "cold" plate is only 6 in^2, that limits the ability to transfer heat.
It does seem you're going for six 9.45" x 1.57" blocks, or ~89 in^2 of cooling area. That helps; it's ~15x the area I'd imagined. You'd want them all spaced out from each other, and likely on a sheet of thermally conductive material (a heatspreader), again with decent mounting pressure, since a worse mount means a lower heat transfer coefficient.
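Back-of-envelope on those areas (the 500W and 600 in^2 figures are the rough numbers from this post, taken as given):

```python
panel_watts = 500.0
panel_area_in2 = 600.0            # rough panel area from the post
six_blocks_in2 = 6 * 9.45 * 1.57  # six 9.45" x 1.57" cold plates

flux = panel_watts / panel_area_in2        # average W per in^2 across the panel
coverage = six_blocks_in2 / panel_area_in2 # fraction of the panel the plates span

print(round(six_blocks_in2, 1))  # ~89.0 in^2 of total cold-plate area
print(round(flux, 2))            # ~0.83 W/in^2 average heat flux
print(round(coverage * 100, 1))  # plates cover ~14.8% of the panel
```

The heat is diffuse rather than concentrated like a CPU die, which is why plate coverage matters so much here.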


 

Quote

The question I asked has yet to be answered. To reiterate, will cooling the back of a plasma panel negatively affect the panel? I don't know exactly how that technology works and I don't know if lowering the operating temperature will negatively affected it. LCDs, when cooled, does negatively affect them because it slows down the LCD response time.

Plasma TVs work by electrically exciting gases, which then emit light (via phosphors excited by the discharge). If the gas is cooler, less light comes out and the display is less responsive.

 

I'm not sure whether temperature gradients across the screen (some cool spots, some hot spots) would cause issues. It'll likely be hard to find examples of this being done, since the value is so questionable. I also don't think you'd see substantial temperature deltas during the hotter months of the year, and during the colder months you'd need the cooling less.

The closest thing I've seen is this - https://www.google.com/search?q=water+cooling+OLED+tv&rlz=1C1GCEB_enUS1019US1019&oq=water+cooling+OLED+tv&aqs=chrome..69i57j33i160i579.3895j0j4&sourceid=chrome&ie=UTF-8

which is a more sensible project because OLEDs (and microLED backlit LCDs if you're cooling the backlights) work better at lower temperatures. 

 


As stated WAY up in the thread, if you want to do this as a mad science project, go for it. The utility is questionable outside of "fun" otherwise.

 

 

 


 
