
Goodbye, 3x8: new PCIe power standard delivers 600 watts over a single connector

BachChain
8 hours ago, Mel0nMan said:

How long until single-SATA-power-to-one-of-these adapters start showing up on Wish, huh?

SATA? That won't do, no, I need Molex to run this off my power supply I found on AliExpress that says it can do 2k watts.


3 hours ago, leadeater said:

The problem is more PSU support. Sure, this is a PCIe spec, but current "desktop" PSUs don't have any of these sideband communications at all, so where do those 4-pin sideband connections actually go? To new PSUs, or to a connector on the motherboard?

Probably the motherboard, should the 12VO standard catch on.

"The most important step a man can take. It’s not the first one, is it?
It’s the next one. Always the next step, Dalinar."
–Chapter 118, Oathbringer, Stormlight Archive #3 by Brandon Sanderson

 

 

Older stuff:

Spoiler

"A high ideal missed by a little, is far better than low ideal that is achievable, yet far less effective"

 

If you think I'm wrong, correct me. If I've offended you in some way tell me what it is and how I can correct it. I want to learn, and along the way one can make mistakes; Being wrong helps you learn what's right.

 


16 minutes ago, J-from-Nucleon said:

Probably the motherboard, should the 12VO standard catch on.

Hopefully not. That would make for weird cable runs where you need Y-style cables. It could be a good chance to include those side-channel signals in the 12VO standard for the PSU. After all, PCIe power is 12V and the PSU manages its creation.


10 hours ago, BachChain said:

[image]

Hot 🥵

 

So, to be clear, this is different from the 12-pin plug that Nvidia used on the 30-series Founders Edition cards?

"Put as much effort into your question as you'd expect someone to give in an answer"- @Princess Luna

Make sure to Quote posts or tag the person with @[username] so they know you responded to them!

 RGB Build Post 2019 --- Rainbow 🦆 2020 --- Velka 5 V2.0 Build 2021

Purple Build Post ---  Blue Build Post --- Blue Build Post 2018 --- Project ITNOS

CPU i7-4790k    Motherboard Gigabyte Z97N-WIFI    RAM G.Skill Sniper DDR3 1866mhz    GPU EVGA GTX1080Ti FTW3    Case Corsair 380T   

Storage Samsung EVO 250GB, Samsung EVO 1TB, WD Black 3TB, WD Black 5TB    PSU Corsair CX750M    Cooling Cryorig H7 with NF-A12x25


I'd like to see a GPU generation without a leap in power draw. Throwing more watts at your problem sounds too cheap to me... (and too expensive for my power bill)

GTX 1080 -> 180W (250W for the 1080Ti)
RTX 2080 -> 215W (250W for the 2080 Ti)
RTX 3080 -> 320W (350W for the 3080 Ti)

 

The RTX 3060 Ti pulls 200W, which is more power than the 1080. The only 30-series GPU drawing less than a 1080 is the non-Ti 3060, at 170W. Is the 3060 THAT much faster that it shows the difference of two generations? Not in my opinion... It has ray tracing, but is the 3060 even powerful enough to really leverage it?

Adding a 600 Watt Connector sends the wrong signals.


Found some speculation about the side channel in the comments of Igor's Lab's YT video that makes a lot of sense IMHO.

 

Two pins could be used for remote sensing of the voltage that actually reaches the GPU, which would then allow the PSU to slightly raise the voltage on its side, i.e. regulate it. Cable loss and connector quality could be measured as well. The other two could simply be I2C, allowing the two ends to exchange, e.g., wattage capabilities and the like, as I suggested before. The commenter suggested a resistor method as a fallback, which the GPU measures if no I2C transactions are seen after a certain time.

 

Giving the PSU a USB connector could make all this information user-accessible. A nice treat IMHO (I think certain models already allow power draw monitoring this way).
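For illustration, here's a rough sketch of how that speculated negotiation could look from the card's side. Everything here is hypothetical: the helper functions, the timeout, and the resistor-to-wattage table are invented for the example, not taken from any spec.

import time

I2C_TIMEOUT_S = 0.5  # assumption: give the PSU half a second to answer

def read_i2c_psu_capability():
    # Stand-in for a real I2C/SMBus read: return the PSU's advertised wattage, or None if it stays silent.
    return None

def read_sense_resistor_ohms():
    # Stand-in for an ADC measurement of a pull resistor on the sideband pins.
    return 4700

def negotiate_power_budget():
    # Preferred path: ask the PSU over I2C for its capability.
    deadline = time.time() + I2C_TIMEOUT_S
    while time.time() < deadline:
        watts = read_i2c_psu_capability()
        if watts is not None:
            return watts
        time.sleep(0.05)
    # Fallback path: infer capability from a fixed resistor (table values are made up).
    resistor_table = {10_000: 150, 4_700: 300, 1_000: 450, 0: 600}
    return resistor_table.get(read_sense_resistor_ohms(), 150)  # default to the safest budget

print(f"Card limits itself to {negotiate_power_budget()} W")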


So will PSU sideband communication run through the motherboard with PCIe 5.0 too, or is it just for GPU power connectors?

PCIe 5.0 really will be a whole new platform with new CPUs, MBs, RAM, and now possibly PSUs too if the MB requires sideband. It really is a hard cutoff at that point.


1 minute ago, StDragon said:

PCIe 5.0 really will be a whole new platform with new CPUs, MBs, RAM, and now possibly PSUs too if the MB requires sideband. It really is a hard cutoff at that point.

Right. All that's left is RGB.

New RGB standard will improve lighting and colours by 2500% and improve fps by 200%

"The most important step a man can take. It’s not the first one, is it?
It’s the next one. Always the next step, Dalinar."
–Chapter 118, Oathbringer, Stormlight Archive #3 by Brandon Sanderson

 

 

Older stuff:

Spoiler

"A high ideal missed by a little, is far better than low ideal that is achievable, yet far less effective"

 

If you think I'm wrong, correct me. If I've offended you in some way tell me what it is and how I can correct it. I want to learn, and along the way one can make mistakes; Being wrong helps you learn what's right.

 


2 hours ago, Laborant said:

I'd like to see a GPU generation without a leap in power draw. Throwing more watts at your problem sounds too cheap to me... (and too expensive for my power bill)

GTX 1080 -> 180W (250W for the 1080Ti)
RTX 2080 -> 215W (250W for the 2080 Ti)
RTX 3080 -> 320W (350W for the 3080 Ti)

Your trend of "throwing more power at the problem for more performance" only exists because you're looking at GPUs in recent history. When you go back further you find we've been hanging around the 250W range for an 80-series card for a long time.

 

GTX 280 - 236W (204W for the node-shrunk GTX 285 and 289W for the DGPU 295)

GTX 480 - 250W

GTX 580 - 244W (365W for the DGPU 590)

GTX 680 - 195W (300W for the DGPU 690)

GTX 780 - 250W (Also 250W for the 780Ti)

GTX 980 - 165W (250W for the 980 Ti)

 

Ampere cards are indeed very power hungry versus previous generations, for many reasons, including a poor node choice, architectural inefficiencies, and the need for power-hungry GDDR6X. But Turing really wasn't all that power hungry on a historical scale; it only looked that way because Maxwell and Pascal were insanely efficient.

 

To me, Ampere looks more like an outlier when it comes to GPU power consumption, not the continuation of a trend. I would not expect power consumption to increase with the 4000 series and would instead expect Nvidia to try and bring the cards back down to the 250W range.

2 hours ago, Laborant said:

The RTX 3060 Ti pulls 200W, which is more power than the 1080. The only 30-series GPU drawing less than a 1080 is the non-Ti 3060, at 170W. Is the 3060 THAT much faster that it shows the difference of two generations? Not in my opinion... It has ray tracing, but is the 3060 even powerful enough to really leverage it?

The 3060 Ti is ~15% faster than the 1080 Ti, but consumes 20% less power. The 3060 is 15-20% faster than the 1080 while consuming 6% less power.

 

What more were you expecting? Those differences sound perfectly reasonable to me for a two-generational improvement - they're equivalent to ~15-20% per generation - especially considering that so much of the transistor budget was spent on the introduction of RT and tensor cores.
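For a back-of-the-envelope sense of scale, here is the perf-per-watt arithmetic using only the figures quoted above; the performance ratios are this post's estimates, not benchmark data.

pairs = {
    "3060 Ti vs 1080 Ti": {"perf": 1.15, "power": 200 / 250},   # ~15% faster, 200W vs 250W
    "3060 vs 1080":       {"perf": 1.175, "power": 170 / 180},  # ~17.5% faster, 170W vs 180W
}

for name, p in pairs.items():
    perf_per_watt = p["perf"] / p["power"]  # improvement across the two generations
    per_gen = perf_per_watt ** 0.5          # geometric mean per generation
    print(f"{name}: {perf_per_watt:.2f}x perf/W, ~{(per_gen - 1) * 100:.0f}% per generation")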

 

With regards to RT, the 3060 performs better than the 2070 when running RT titles due to its second-gen RT cores. RT on that card is perfectly usable on lower RT settings.



5 hours ago, Mihle said:

The article says it's 3mm instead of 4.2mm pin spacing, so no, it's not backwards compatible in the way you are thinking of (the whole plug is shorter than what 2x6 connectors would be).

The pin shapes also aren't the same pattern, I think, but that wouldn't matter anyway because of the different size.

 

It can also take more wattage than 2x6 or 2x8 can.

I wrote the post before I found the picture. The 4-pin part actually looks like a more typical header standoff, and not a 3.5mm floppy connector.

 

What I intended by "backwards compatibility consideration" was that the 4 pins were probably meant to allow a configuration where they are absent on the PSU side (e.g. an adapter), with a 2x6 or 2x8 to 12VHPWR adapter that just slots directly into the back of the GPU. Or maybe the adapter works the other way around, allowing 12VHPWR to 2x8, and shorts one of the signal pins to indicate that an adapter is in use.

 

As for what the signal pins might be used for: probably a "capability" switch, e.g. "open" = 2 x 75W, "closed" = 2 x 150W, "heartbeat" = variable as needed. That way passive adapters can exist, since people aren't going to throw their GPUs out just for a power connector upgrade.
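A tiny sketch of that "capability switch" idea, just to make the mapping concrete. The states and wattages are this post's speculation, not anything from the PCIe or ATX specs.

from enum import Enum
from typing import Optional

class SensePin(Enum):
    OPEN = "open"            # nothing connected / simple legacy adapter
    CLOSED = "closed"        # pin tied to ground by the cable
    HEARTBEAT = "heartbeat"  # active signalling from the PSU

def allowed_connector_power(state: SensePin, negotiated_watts: Optional[int] = None) -> int:
    if state is SensePin.OPEN:
        return 2 * 75    # speculated: treat like two bare 6-pins
    if state is SensePin.CLOSED:
        return 2 * 150   # speculated: treat like two 8-pins
    # HEARTBEAT: power is whatever the PSU actively advertises
    return negotiated_watts if negotiated_watts is not None else 2 * 75

print(allowed_connector_power(SensePin.CLOSED))          # 300
print(allowed_connector_power(SensePin.HEARTBEAT, 600))  # 600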

 


It's a bit weird to see such massive cables when USB-C can deliver up to 240W through that tiny, uninspiring connector and thin cable. Why do we need such massive cables and connectors for graphics cards, then? Imagine having 3x USB-C routed to a graphics card to deliver, let's say, 600W total with some reserve per cable, instead of the 2x8/3x8-pin monsters we have now...

 

Sure, there can be electrical differences, but watts are watts, and they are derived from voltage and amps. It's not millions of amps or volts for which you'd need such massive connectors. So, what gives!?


5 hours ago, TVwazhere said:

Hot 🥵

 

So, to be clear, this is different from the 12-pin plug that Nvidia used on the 30-series Founders Edition cards?

 

 

From another thread

20 hours ago, jonnyGURU said:

I don't know why Igor does this.  He's so disruptive sometimes.  Like when he said there was a "united front against ATX12VO" (bullshit).

 

This connector isn't "entirely new" and the 12-pin connector will not end up "just like Virtual Link".

 

FACT:  The 12-pin portion of the 12+4 pin is EXACTLY the same as the current 12-pin used on FE cards.

 

FACT:  While the new Nvidia cards will use this new 12+4-pin, they will not require the +4 portion.  The current 12-pin used on the FE card will work as well.

 

And while the new spec "allows for" up to 600W cards, the existence of it does not imply that the next gen cards WILL BE 600W.  IF the card is 600W, the card manufacturer may or may not use one of the +4 pins as a sense wire to make sure a "correct connector" is in play.

 

It's funny because Igor published the original Astron 12-pin drawings when those "leaked", but didn't bother to put the two drawings (the 12-pin and the 12+4-pin) side by side to see that the shape of the connector, the terminal size, the power rating, etc. are all the same.

 

 

:)


7 minutes ago, Caroline said:

The thing with regulations is that they always end up blaming the end user for literally anything, while the massive corps can still do whatever they want. Sure, things should become more efficient, but that contradicts the entire way the "big players" manufacture things, which is churning out massive amounts of shitty products that break in 1-2 years or become obsolete in less than 5. Think about smartphones and all kinds of "smart" crap: most are used for a while, then end up in a landfill when something apparently better comes out. Look at phones, they just add more RAM to keep the hellish spaghetti code that's Android running *somewhat* smoothly, and random bullshit like 8 cameras to kinda justify the price tag and show off an "innovation".

 

Meme laws that put all the blame on us in an attempt to make us "feel bad" for having, I dunno, a high-end PC, a 3000W space heater from the 80s that's still seemingly better than all the plastic crap manufactured nowadays (but hey, they're "eco friendly" because they use half the power...), or even eating a burger. That's applying a band-aid over a shotgun wound to the lungs and expecting the patient to stay alive.

 

If you drop 2k on a graphics card, the least you can expect is for it to last the next 10 years. I mean, it'll physically last, but in 10 years, at this rate, a 3090 won't be enough to run new games at decent quality at all.

I agree it is not easy. Here in Europe we have managed to see vacuum cleaners using less than half the power compared to a few years ago, yet still performing as well. We outlawed incandescent bulbs, halogen bulbs and more recently, tubes. This has all had a very measurable positive effect on our energy usage. People were up in arms, newspapers were waging war on Brussels, but it has had a positive result and just a few years later nobody wants to return to the old ways. It is not about making users feel guilty, it is about gentle persuasion of the big manufacturers. We know it can be done, and should be done. We as users should be demanding more. Considering the price of electricity here in Europe at the moment because of gas prices etc, we may have to wind our usage in. It is only going to get worse.

 

I totally agree on bloated software too.


4 hours ago, Laborant said:

I'd like to see a GPU generation without a leap in power draw. Throwing more watts at your problem sounds too cheap to me... (and too expensive for my power bill)

GTX 1080 -> 180W (250W for the 1080Ti)
RTX 2080 -> 215W (250W for the 2080 Ti)
RTX 3080 -> 320W (350W for the 3080 Ti)

 

The RTX 3060 Ti pulls 200W, which is more power than the 1080. The only 30-series GPU drawing less than a 1080 is the non-Ti 3060, at 170W. Is the 3060 THAT much faster that it shows the difference of two generations? Not in my opinion... It has ray tracing, but is the 3060 even powerful enough to really leverage it?

Adding a 600 Watt Connector sends the wrong signals.

The 30-series cards are being pushed way outside their efficiency peak on power; just look at the RTX Axxx Quadros (not really Quadros, but screw that, I'm not calling it RTX A4000).

That's not even getting into the A100, which already has 500W+ versions, and the water-cooled ones, as many are, pull even more.



So instead of working to reduce the power draw, they prefer making a new connector that lets them draw even more power... Nice.



21 minutes ago, RejZoR said:

It's a bit weird to see such massive cables when USB-C can deliver up to 240W through that tiny, uninspiring connector and thin cable. Why do we need such massive cables and connectors for graphics cards, then? Imagine having 3x USB-C routed to a graphics card to deliver, let's say, 600W total with some reserve per cable, instead of the 2x8/3x8-pin monsters we have now...

 

Sure, there can be electrical differences, but watts are watts, and they are derived from voltage and amps. It's not millions of amps or volts for which you'd need such massive connectors. So, what gives!?

I had to look it up: the 240W standard is 48V at 5A. It also requires cables to self-identify as being designed for that use case; not just any USB-C cable will work at that full power.

 

Generally speaking, the higher the current (amps), the thicker the total conductors need to be. Higher voltage implies better insulation. You can trade those off against each other when simply transferring power.

 

Many parts in a PC have stuck to a max of 12V for a long time. Delivering 240W at 12V implies 20A flowing, and that's going to need a fat cable. The way to move the power at a more sane current is to increase the voltage. I think that would help, but it does mean the industry would have to go through a painful stage to raise the voltage used throughout the ecosystem.
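To put numbers on that, here is the P = V x I arithmetic; the six-conductor split at the end is my assumption about the 12+4-pin layout rather than something stated in this thread.

def amps(watts: float, volts: float) -> float:
    # Current draw for a given power at a given voltage (P = V * I).
    return watts / volts

print(amps(240, 12))      # 20.0 A -> why 240W at 12V needs a fat cable
print(amps(240, 48))      # 5.0 A  -> the same power at USB PD EPR's 48V
print(amps(600, 12))      # 50.0 A -> a hypothetical 600W GPU fed at 12V
print(amps(600, 12) / 6)  # ~8.3 A per pin if spread across six 12V conductors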



16 minutes ago, porina said:

I had to look it up: the 240W standard is 48V at 5A. It also requires cables to self-identify as being designed for that use case; not just any USB-C cable will work at that full power.

 

Generally speaking, the higher the current (amps), the thicker the total conductors need to be. Higher voltage implies better insulation. You can trade those off against each other when simply transferring power.

 

Many parts in a PC have stuck to a max of 12V for a long time. Delivering 240W at 12V implies 20A flowing, and that's going to need a fat cable. The way to move the power at a more sane current is to increase the voltage. I think that would help, but it does mean the industry would have to go through a painful stage to raise the voltage used throughout the ecosystem.

This is 100% true, as anyone who's looked at home wiring knows quite well.

 

It does raise the question, though, of whether stepping down from a higher voltage would be worthwhile at the component level, seeing as we have seen trends toward that in most of the rest of the industry.

 

Anywho... this connector and the clear desire for something like it is starting to make nVidia's effort look pretty decent lol.



3 hours ago, Kisai said:

I wrote the post before I found the picture. The 4-pin part actually looks like a more typical header standoff, and not a 3.5mm floppy connector.

 

What I intended by "backwards compatibility consideration" was that the 4 pins were probably meant to allow a configuration where they are absent on the PSU side (e.g. an adapter), with a 2x6 or 2x8 to 12VHPWR adapter that just slots directly into the back of the GPU. Or maybe the adapter works the other way around, allowing 12VHPWR to 2x8, and shorts one of the signal pins to indicate that an adapter is in use.

 

As for what the signal pins might be used for: probably a "capability" switch, e.g. "open" = 2 x 75W, "closed" = 2 x 150W, "heartbeat" = variable as needed. That way passive adapters can exist, since people aren't going to throw their GPUs out just for a power connector upgrade.

 

The +4 are sense wires.  It's essentially a "data wart" attached to the 12-pin FE connector.  S1 is "card_pwr_stable", S2 is "Card_Cbl_Pres#", S3 is "Sense0" and S4 is reserved.  I'm not going to pretend to know what these actually mean/do.  Nor do I know which of these pins the 600W cards will use and if it will be just a jump to ground like the current 8-pin PCIe vs. 6-pin PCIe. 


1 hour ago, Distinctly Average said:

I agree it is not easy. Here in Europe we have managed to see vacuum cleaners using less than half the power compared to a few years ago, yet still performing as well. We outlawed incandescent bulbs, halogen bulbs and more recently, tubes. This has all had a very measurable positive effect on our energy usage. People were up in arms, newspapers were waging war on Brussels, but it has had a positive result and just a few years later nobody wants to return to the old ways. It is not about making users feel guilty, it is about gentle persuasion of the big manufacturers. We know it can be done, and should be done. We as users should be demanding more. Considering the price of electricity here in Europe at the moment because of gas prices etc, we may have to wind our usage in. It is only going to get worse.

These are all the result of public policies in the EU to stop offshore drilling and prospecting and a move towards renewables. So yes, I would expect this to get worse over time.

Rather than upgrading PC hardware, maybe it's time to augment it with personal solar panels and a bank of batteries. Just sayin' 😁


40 minutes ago, StDragon said:

These are all the result of public policies in the EU to stop offshore drilling and prospecting and a move towards renewables. So yes, I would expect this to get worse over time.

Rather than upgrading PC hardware, maybe it's time to augment it with personal solar panels and a bank of batteries. Just sayin' 😁

Well, given there is an obesity problem, maybe they should be provided with pedal-powered generators. Pedal fast enough in Forza and you get an extra turbo boost.


17 hours ago, estiar said:

Sounds like a recipe for a melted connector

USB to ~380V anyone?

 

This was a listing on a Polish eBay-like site: "You can connect a concrete mixer to a PC. Can be used with micro USB. I don't have drivers for the concrete mixer, but I heard Microsoft is already working on them. Yes, I have a Tesla output, but only from HDMI."

 

[Image: przejsciowka-usb-na-380v.jpg - the USB-to-380V adapter listing]


2 hours ago, Distinctly Average said:

I agree it is not easy. Here in Europe we have managed to see vacuum cleaners using less than half the power compared to a few years ago, yet still performing as well. We outlawed incandescent bulbs, halogen bulbs and more recently, tubes. This has all had a very measurable positive effect on our energy usage. People were up in arms, newspapers were waging war on Brussels, but it has had a positive result and just a few years later nobody wants to return to the old ways. It is not about making users feel guilty, it is about gentle persuasion of the big manufacturers. We know it can be done, and should be done. We as users should be demanding more. Considering the price of electricity here in Europe at the moment because of gas prices etc, we may have to wind our usage in. It is only going to get worse.

... And the cost of light bulbs has increased something like 20-30 times; who pays for it? Consumers. The energy needed to produce these efficient light bulbs is also much higher. It is also untrue that nobody wants to return to the old ways; I know many people who still buy the old light bulbs because it's just more cost efficient. And the people saying Brussels was wrong were right at the time: before we got LEDs, the noble-gas ones weren't fit to be used in many places, like hallways or bathrooms, since they need a lot of energy to start up. Similar story with vacuum cleaners: yes, they use less energy, but you use them comparatively rarely, and again the cost has increased. If the EU wants to reduce energy usage, they should start by banning wireless charging; it is much less energy efficient, with losses around 30%, and people charge their phones daily.

 

The fact that energy prices are increasing now due to gas is also the fault of the EU, whose politics resulted in many countries relying on gas and renewables. Renewables are expensive in themselves and they require gas to stabilise the system, which Europe doesn't have. This winter was quite cold and it wasn't windy, and you got the result. No, more renewables are not the answer, because that will have the opposite effect: we will shut down coal plants and build gas ones to stabilise the grid (nothing else is really flexible enough). Specifically building new gas pipes to the same country (starts with R) doesn't help either. Who pays for this? End users, again.

 

If normal people pay for such changes, demanding more of them means you're either blind, don't care about people poorer than you, or are just being lobbied by conglomerates making money on this.


4 hours ago, Ydfhlx said:

... And the cost of light bulbs has increased something like 20-30 times; who pays for it? Consumers. The energy needed to produce these efficient light bulbs is also much higher. It is also untrue that nobody wants to return to the old ways; I know many people who still buy the old light bulbs because it's just more cost efficient. And the people saying Brussels was wrong were right at the time: before we got LEDs, the noble-gas ones weren't fit to be used in many places, like hallways or bathrooms, since they need a lot of energy to start up. Similar story with vacuum cleaners: yes, they use less energy, but you use them comparatively rarely, and again the cost has increased. If the EU wants to reduce energy usage, they should start by banning wireless charging; it is much less energy efficient, with losses around 30%, and people charge their phones daily.

 

The fact that energy prices are increasing now due to gas is also the fault of the EU, whose politics resulted in many countries relying on gas and renewables. Renewables are expensive in themselves and they require gas to stabilise the system, which Europe doesn't have. This winter was quite cold and it wasn't windy, and you got the result. No, more renewables are not the answer, because that will have the opposite effect: we will shut down coal plants and build gas ones to stabilise the grid (nothing else is really flexible enough). Specifically building new gas pipes to the same country (starts with R) doesn't help either. Who pays for this? End users, again.

 

If normal people pay for such changes, demanding more of them means you're either blind, don't care about people poorer than you, or are just being lobbied by conglomerates making money on this.

That reminds me of the story about the vacuum cleaner. As vacuum cleaners became popular, more coal power plants were built, and since they were still very sooty at the time, it made the air dirtier, and thus the carpets dirtier. 

 

Sometimes an optimization somewhere requires an optimization further up the chain of processes in order for it not to become an overall detriment.

 

So forcing power and cable standards is a good thing, but it requires not only that industry play ball, but that industry sees a way to sell a bunch of new kit. LED bulbs were never going to take off without a ban on the manufacture of incandescent bulbs, but the ban was intended to promote CFLs. CFLs, however, are worse for the environment: where incandescent bulbs threw out trivial amounts of metal and glass (and could be recycled if anyone really wanted to), CFLs contain hazardous materials and more solid waste overall, for not that much of an improvement. LED bulbs luckily came along and solved all those problems; heck, now the bulbs can be entirely plastic. If one were so inclined, they could reuse the base... if industry ever decides to play ball. Nobody wants to retrofit low-voltage lighting, but if you could just leave an Edison-base-to-LV-LED-base adapter in the socket, you could then replace the LEDs if they fail.

 

Which, BTW: of all the LED bulbs I've bought, all 10 years old or more, I've only had one fail, and it wasn't the oldest one. It was the Philips bulb, which I'd say was probably the worst designed, because it had to go inside a glass light fixture, so it eventually reached a thermal point where the PCB traces melted.

 

A lot of people hate change, because they are ultimately selfish, and see any change as ridiculous if it doesn't personally benefit them immediately.


10 hours ago, J-from-Nucleon said:

Right. All that's left is RGB.

New RGB standard will improve lighting and colours by 2500% and improve fps by 200%

They're going to add more colors to add 50% more performance. Get ready for RGBYW


This connector is weird, and I can't really see how it'll work when it comes to backwards compatibility for current desktop PCs.

 

For servers, I guess it makes sense if you're planning on having a really dense unit with multiple dual GPUs on a single PCIe slot, or a really hefty custom PCIe device. But OTOH, that SXM competitor some big tech companies are working on (OAM) is supposed to handle up to 700W and allows for a much denser design anyway, so idk.


