
Game Streaming's Dirty Secret

Recommended Posts

17 minutes ago, RejZoR said:

gaming and streaming

From what I can tell, they are comparing local gameplay vs. streaming gameplay remotely (Stadia, GeForce Now, Project xCloud, etc.).
 

Someone please correct me if I have misunderstood.

7 hours ago, Bombastinator said:

Phrases like "how?" come to mind. Server farms are 50% less efficient electrically than home PCs? The places that measure electricity so closely they buy new chips just because they're more electrically efficient somehow care so little that they use 50% more juice? This smells more like a measurement error than anything else.

Running something on one PC is more efficient than running it on a server and sending the generated data, via a bunch of routing servers, to a PC at the user's end.

4 hours ago, Bombastinator said:

Varying the source of energy could be a measurement error. People like to make money with businesses. Server farms are businesses; server farms use a lot of electricity and create a lot of heat. Therefore they tend to be placed (a) where electricity is cheap and (b) where cooling is cheap. This is why you see server farms in weird places, like next to thermal electric stations in Greenland.

Greenland does not have server farms. It generally has very bad internet connectivity.

 

Greenland also does not rely on thermal electric stations; most of its electricity comes from hydro.

 

Maybe you're thinking of Iceland.

2 hours ago, Sakkura said:

Running something on one PC is more efficient than running it on a server and sending the generated data, via a bunch of routing servers, to a PC at the user's end.

Greenland does not have server farms. It generally has very bad internet connectivity.

 

Greenland also does not rely on thermal electric stations; most of its electricity comes from hydro.

 

Maybe you're thinking of Iceland.

I might be.  They’re easy to transpose. 


Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

23 hours ago, RejZoR said:

I love how they quickly jump on the carbon footprint bandwagon for gaming and streaming entertainment, but when half of the world was literally running graphics cards to generate money with GPU mining farms, no fucks were given. And those ate gigawatts if not terawatts of power globally. Where were the environmentalists then?

The energy usage was actually one of the main reasons why I was never super on board with cryptocurrencies. It just seems insanely wasteful.

28 minutes ago, Beskamir said:

The energy usage was actually one of the main reasons why I was never super on board with cryptocurrencies. It just seems insanely wasteful.

People didn't care as long as the outcome was positive. They'd spend, say, 300€ a month on electricity just to run this nonsense, but because they got, let's say, 400€ from mining, it was worth it.
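That incentive is a one-line calculation. A quick sketch, using the illustrative figures from this post (not real market data):

```python
# Monthly mining profitability, using the post's illustrative figures.
# Miners kept going as long as the coins mined were worth more than the bill.
def mining_profit(electricity_cost_eur: float, mined_value_eur: float) -> float:
    """Profit in EUR; mining only 'made sense' while this stayed positive."""
    return mined_value_eur - electricity_cost_eur

print(mining_profit(electricity_cost_eur=300, mined_value_eur=400))  # 100
```

The environmental cost never appears in that equation, which is the whole problem.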


Back when I first heard about mining, I looked at the way it worked and determined that it was a not-very-efficient way of turning electricity into money, that the winners of the battle would be the places with the cheapest electricity, and that, since stolen electricity is the cheapest of all, this was going to wind up a criminal enterprise. Later some Colombian drug lord said he would no longer take payment in anything but Bitcoin, and those together just left a hatful of "nope".


2 hours ago, Bombastinator said:

Back when I first heard about mining, I looked at the way it worked and determined that it was a not-very-efficient way of turning electricity into money, that the winners of the battle would be the places with the cheapest electricity, and that, since stolen electricity is the cheapest of all, this was going to wind up a criminal enterprise. Later some Colombian drug lord said he would no longer take payment in anything but Bitcoin, and those together just left a hatful of "nope".

Oh absolutely.

 

Cryptocurrencies were only a boon to dirty-power generation plants because they saved those plants from having to shut down entirely, by letting them offer industrial energy discounts to waste energy on cryptocoin boondoggles.

 

https://www.coindesk.com/a-new-york-power-plant-is-mining-50k-worth-of-bitcoin-a-day

Quote

Greenidge Generation, a natural gas power plant near the town of Dresden in the Finger Lakes region, announced it had successfully installed a mining farm in its facility. Comprised of nearly 7,000 mining rigs and powered by electricity generated on-site, the facility can mine an average of 5.5 bitcoins (BTC) every day, roughly $50,000, according to CoinDesk’s Bitcoin Price Index.

Greenidge uses its own “behind the meter” power, the generated electricity it uses itself at the basic cost of production. Kevin Zhang, director of Greenidge’s blockchain strategies, said in a statement the initiative would provide potential investors with unique exposure to both the cryptocurrency and energy markets.

 

https://www.coindesk.com/miner-reopens-coal-fired-power-plant-in-cheap-energy-quest

Quote

IOT Group, an Australian technology firm, has partnered with Hunter Energy to recommission the Redbank power plant – which was shuttered in 2014 – to reduce its energy costs, a local media outlet reported. The companies expect that IOT’s new energy bill will be reduced by 20 percent through using the facility, compared to its current costs.

 

IOT’s decision to go “behind the grid” is part of a larger trend within the cryptocurrency sector, according to a company statement. Proof-of-work blockchains, such as bitcoin and ethereum, consume massive amounts of energy through mining due to the mathematical problems the network must solve.

 

https://www.crowdfundinsider.com/2019/12/154727-wyoming-cryptocurrency-advocates-propose-reviving-coal-plants-to-power-cryptomining/

Quote

Parties working to make Wyoming friendly to cryptocurrency businesses have offered to revive aging coal-fired power plants in the region.

 

Wyoming “utility giant” PacifiCorp has come to regard the plants as a liability, but members of the crypto are desperate to bring down costs.

 

Cryptocurrency mining is an energy-intensive proposition. On average, it costs about $5000 USD to produce one bitcoin. Most of that price represents energy costs because cryptocurrency “mining” can be accomplished with very few employees.

 

Literally, the cryptocoin fad is driving carbon emissions upward instead of decommissioning "cheap-and-dirty" energy sources. Likewise, these power stations are essentially spending their customers' and shareholders' money on speculative investments, and the customers are not getting any savings from any coins mined.

 

10 minutes ago, Kisai said:

Oh absolutely.

 

Cryptocurrencies were only a boon to dirty-power generation plants because they saved those plants from having to shut down entirely, by letting them offer industrial energy discounts to waste energy on cryptocoin boondoggles.

 

https://www.coindesk.com/a-new-york-power-plant-is-mining-50k-worth-of-bitcoin-a-day

 

https://www.coindesk.com/miner-reopens-coal-fired-power-plant-in-cheap-energy-quest

 

https://www.crowdfundinsider.com/2019/12/154727-wyoming-cryptocurrency-advocates-propose-reviving-coal-plants-to-power-cryptomining/

 

Literally, the cryptocoin fad is driving carbon emissions upward instead of decommissioning "cheap-and-dirty" energy sources. Likewise, these power stations are essentially spending their customers' and shareholders' money on speculative investments, and the customers are not getting any savings from any coins mined.

 

Yep. Crypto puts an ever-rising floor on the price of electricity, and the people who make out like bandits are, well, bandits.



Long term, streaming could be better anyway.
1. It's much easier to move a server farm than all of the homes it supports, so it can be sited for much better efficiency in terms of power delivery.
2. The high cost of AC makes it popular to undervolt cards and get more perf/watt than typical home usage.
3. I believe cooling and data transmission are the two factors making the difference, since the servers themselves are already more efficient than home PCs. But the Internet infrastructure should get upgraded to very efficient fibre anyway, game streaming or no game streaming. What's more, if the user has AC at home, it'll probably suck more additional juice cooling the heat from their PC than the server farm would.


But it's a wholly different experience: think of ARM-based stream receivers that use next to no energy until the stream is started, with no full OS running in the background. The farms would be purpose-built for this, and the biggest advantage, imo: you need less hardware than users currently own, because it's never 100% of users who play at one time (and they'd probably just lower the quality in rush hours).
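The "never 100% of users at once" point is just an oversubscription ratio. A minimal sketch; the subscriber count and peak-concurrency fraction below are made-up assumptions, not figures from any provider:

```python
import math

# Oversubscription: the farm only needs hardware for the peak number of
# concurrent players, not one machine per subscriber. Numbers are assumed.
def machines_needed(subscribers: int, peak_concurrency: float) -> int:
    """Machines to provision for the busiest hour.

    peak_concurrency: fraction of subscribers playing simultaneously at peak.
    """
    return math.ceil(subscribers * peak_concurrency)

# If at most 25% of a million subscribers ever play at the same time:
print(machines_needed(1_000_000, 0.25))  # 250000 machines, not 1,000,000
```

That hardware saving is the main lever streaming has against the per-stream transmission overhead.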

I will still prefer to own my machine, but I can see how streaming can improve and in many ways beat local gaming.

9 hours ago, RejZoR said:

Bullshit. If we're bitching about gaming and streaming, then we should also bitch about EVERY shape or form of entertainment that uses any kind of electricity to work. Oh, racing cars? Pollution and global warming. TV? Muh electricity. Radio? Muh electricity. Newsflash: our world runs on electricity. Hell, the forums and "social media" used to bitch about global warming run on electricity and produce a fuck ton of heat. Pick all of them, or none. Picking on games and gamers exclusively is the most fucking boomer thing possible, and to this date I've NEVER used "boomer" to bitch about anything.

 

Did you even read my post? That was my entire point. My issue is when people only bring up all the other things in response to one specific point.

10 hours ago, RejZoR said:

Bullshit. If we're bitching about gaming and streaming, then we should also bitch about EVERY shape or form of entertainment that uses any kind of electricity to work. Oh, racing cars? Pollution and global warming. TV? Muh electricity. Radio? Muh electricity. Newsflash: our world runs on electricity. Hell, the forums and "social media" used to bitch about global warming run on electricity and produce a fuck ton of heat. Pick all of them, or none. Picking on games and gamers exclusively is the most fucking boomer thing possible, and to this date I've NEVER used "boomer" to bitch about anything.

No.
This is specifically about cloud vs. home. The claim that cloud is 50% less efficient strikes me as massively implausible, so I'd like to see how it was derived. The statement in the article is statistical, and there's an old saying: "there are lies, damned lies, and statistics." That's because statistics are massively manipulable.

I don't know that it's wrong. It might be right. But it seems unlikely to me.



Contrary to many people here, apparently, I think it's absolutely key and critical to look at whole-use-stream, dirt-to-dirt efficiency for any comparative activity like this.

 

We can have problems with the specific methodology and the assignment of costs, but it's still better to look at it holistically than not to.

 

Interesting stuff.


LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DIY FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 

2 minutes ago, Bombastinator said:

No.
This is specifically about cloud vs. home. The claim that cloud is 50% less efficient strikes me as massively implausible, so I'd like to see how it was derived. The statement in the article is statistical, and there's an old saying: "there are lies, damned lies, and statistics." That's because statistics are massively manipulable.

I don't know that it's wrong. It might be right. But it seems unlikely to me.

I don't think it does, honestly. Server hardware is explicitly designed for uptime, not idle power efficiency, and the software is optimized the same way. Eventually you'd expect game streaming's centralization of resources and economies of scale to outweigh that, but I wouldn't be in any way surprised to find it isn't true right now.

 

The economics of space efficiency also sometimes run counter to energy efficiency for many server installations.


5 minutes ago, Curufinwe_wins said:

I don't think it does, honestly. Server hardware is explicitly designed for uptime, not idle power efficiency, and the software is optimized the same way. Eventually you'd expect game streaming's centralization of resources and economies of scale to outweigh that, but I wouldn't be in any way surprised to find it isn't true right now.

 

The economics of space efficiency also sometimes run counter to energy efficiency for many server installations.

It's not impossible that it's true. But a set of statistics with no description of what they were derived from is effectively a push poll. And 50% is a big, big difference; so much that the economics stop being reasonable. If it were the case, server farms shouldn't exist, period. They'd be too expensive. The ONLY thing they would make sense for is off-site emergency backup, and that would just be expensive and people would live with it. They're used for all kinds of stuff, though. I want to see numbers, not pictures, and I want to know where the numbers came from. I don't trust this thing.


5 hours ago, Bombastinator said:

stolen electricity is the cheapest

Like in Iran, where miners put their equipment in schools and religious buildings because those get free energy.


Hi

 

4 hours ago, Bombastinator said:

No.
This is specifically about cloud vs. home. The claim that cloud is 50% less efficient strikes me as massively implausible, so I'd like to see how it was derived. The statement in the article is statistical, and there's an old saying: "there are lies, damned lies, and statistics." That's because statistics are massively manipulable.

I don't know that it's wrong. It might be right. But it seems unlikely to me.

Why is it massively implausible? You're using multiple systems for things that don't actually contribute to generating the data, just shuffling it around the internet. And you still have a system running at the user end just to display the visuals. So the server running the game, using a similar amount of energy as your local system would, is only partially balanced out by reduced energy usage at the user end.

So you probably have somewhere above 100% of the original energy usage¹ from the systems at either end, plus the extra energy usage of the network servers in between. I couldn't tell you if that comes out to 130% or 180% or whatever, but it sure seems plausible that it's above 100%.

¹ That being the energy you'd use running it locally.
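That accounting can be written down directly. A toy model; every wattage below is an illustrative assumption of mine, nothing comes from the article:

```python
# Streamed play burns server + thin-client + network power; local play
# burns only the local rig. All wattages are illustrative assumptions.
def streaming_overhead(server_w: float, client_w: float, network_w: float,
                       local_w: float) -> float:
    """Streamed-play draw as a fraction of local-play draw (1.0 = break even)."""
    return (server_w + client_w + network_w) / local_w

# Assume the server draws about what the local rig would (250 W), the thin
# client still needs ~50 W to decode and display, and routing adds ~25 W.
ratio = streaming_overhead(server_w=250, client_w=50, network_w=25, local_w=250)
print(f"{ratio:.0%}")  # 130% -- 'somewhere above 100%', as argued
```

The ratio only drops below 100% if the server-side draw per player falls well below what the local rig would use, which is exactly the efficiency-of-scale question being debated.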


But that can't be true: Teslas only use electricity. Computers only use electricity. Therefore computers are like Teslas, and Teslas are carbon neutral. They contribute zero carbon emissions! This research must be funded by big oil. REEEEE /s


AMD Phenom™ II X6 1100T @ 4.0GHz | MSI 890FXA-GD65 | MSI GTX 550Ti | 16GB Kingston DDR3 | Samsung 850 EVO 250GB | WD 750GB | Antec 300 | ASUS Xonar DG | Corsair A50 | OCZ 600W | Windows 10 Pro

Sony MDR-V250 | Logitech G610 Orion Brown | Logitech G402 | Samsung C27JG5 

Intel Core™ i5-8520U | WD Blue M.2 250GB | 1TB Seagate FireCuda | 8GB DDR4 | Windows 10 Home | ASUS Vivobook 15 

Intel Core™ i7-3520M | GT 630M | 16 GB Corsair Vengeance DDR3 | Samsung 850 EVO 250GB | macOS Catalina  Lenovo IdeaPad P580

AMD Phenom™ II X2 550 @ 3.10GHz | Gigabyte GA-MA785GM-US2H | XFX Radeon HD 4870 | 4GB Corsair XMS2 | WD 250GB | Thermaltake TR2 500W | Windows 10 Pro

iPhone 6s (iOS 13.6.1) | iPad Mini (iOS 9.3.5) | Samsung Galaxy S5e

On 8/29/2020 at 1:17 PM, RejZoR said:

I love how they quickly jump on the carbon footprint bandwagon for gaming and streaming entertainment, but when half of the world was literally running graphics cards to generate money with GPU mining farms, no fucks were given. And those ate gigawatts if not terawatts of power globally. Where were the environmentalists then?

I remember seeing posts talking about these kinds of issues. Just because you didn't see the articles back then doesn't mean they weren't there. A few Google searches will turn up the relevant articles from various news sites. And I love seeing some people in here not seeming to care about the environmental problems that e-waste still creates to this day. It's still a huge deal and needs further attention.


Desktops

 

- The specifications of my almighty machine:

MB: MSI Z370-A Pro || CPU: Intel Core i3 8350K 4.00 GHz || RAM: 20GB DDR4  || GPU: Nvidia GeForce GTX1070 || Storage: 1TB HDD & 250GB HDD  & 128GB x2 SSD || OS: Windows 10 Pro & Ubuntu 19.10

 

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

 

Laptops

 

- Main laptop specs:

Model: ASUS X302UA || CPU: Intel Core i3 6006U || RAM: 8GB DDR3L || GPU: Integrated Intel HD 520 || Storage: 128GB SSD || OS: Windows 10 Home

47 minutes ago, Master Delta Chief said:

I remember seeing posts talking about these kinds of issues. Just because you didn't see the articles back then doesn't mean they weren't there. A few Google searches will turn up the relevant articles from various news sites. And I love seeing some people in here not seeming to care about the environmental problems that e-waste still creates to this day. It's still a huge deal and needs further attention.

Of course it's a big deal, but unfortunately, until everyone is inside the oven, nobody outside of it cares how hot it is.

 

To put things in perspective: the easiest way to cool hardware would be to jettison heat directly into space, so why aren't we just launching "servers" on satellites? Because of the launch costs, never mind the environmental cost of rockets and the long-term maintenance of satellites. Hence satellites remain a communication medium, easily replaced if one is damaged. Before satellites, you had to operate super-massive radio stations to transmit across a continent, and forget different continents. Ironically, the Internet has made transmission much cheaper than satellite, with only a small latency penalty, often better than satellite anyway.

Still, the internet relies heavily on point-to-point, not point-to-multipoint, communication, so live broadcasts are super inefficient energy-wise: you have dozens if not thousands of people initiating streams to the endpoint without any multicasting of the same live event. The original idea of "channels" on a time schedule was actually more energy-efficient than VOD (Video on Demand, e.g. Netflix) / IPPV (Impulse Pay-Per-View), but the latter wins out because you can watch your show at any day or time after the broadcast, or even go direct to VOD so that the ideal compression and resolution is used, letting people not watching in 4K watch a 1080p stream, and so on.

 

It's a no-win scenario in many cases because people can't see past their own experience.

 

For example, let's use something entirely different:

A 100-story condo tower with 10 units per floor. So there are 1,000 units, and each unit can house a minimum of 1 and a maximum of 2 people. However, this tower has been built with the bare minimum of features to make it cheap: there are no kitchens, everyone shares the one bathroom on each floor, and there are only two stairwells and two elevators. If everyone had to rely on food couriers, the building could not support a thousand food couriers all arriving at once. Yet this building is super-efficient energy-wise, because there aren't kitchens and bathrooms in every unit, and all the long-term maintenance issues (e.g. things that involve water) are constrained to the parts outside the units.

 

This is the kind of argument that comes up when people say "why don't you save energy": the decision to lower your costs is weighed against the environmental impact of that decision. Most people, when presented with a decision, will pick the option that benefits them the most and costs them the least money or time.

 

If people had absolutely no regard for the environment, everyone would live on a hectare of land, with their house directly in the center of it and a 50 m driveway to leave the property. They would have to drive simply to visit the neighbor. This is the reality for people who live in actual rural areas where there are no stores within a 20-minute drive. These houses are usually super-sized and poorly insulated, and people only ever live out of one room in them anyway, so their energy costs are far larger than those of someone living in that condo in the city.

 

The whole city-vs-rural comparison comes up when people wonder why everyone doesn't just live somewhere else. A city is more expensive, but all the services are available, and it's a lot more energy- and money-efficient to live in a city during your productive life (e.g. ages 20-50), where you can get away without spending half your day on unproductive activities (which come with maintaining a house).

 

So to come back to "build your own PC, or stream": it's very much this argument. Streaming is putting a thousand people into one condo tower, with services designed to maximize the energy efficiency/cheapness of being packed like sardines. As long as the infrastructure supports that, you may get away with it. However, as I said, one bathroom per floor and only two elevators puts a severe crunch on the common property (the hallways, elevators, bathrooms). In a data center context, that means all the routing and switching equipment must be provisioned on the assumption that the connection will be saturated at all times. You can't simply run a 1Gb line to each Stadia machine, compress the output to 25Mbit, and then put 40 machines in a rack with only 1Gb of routing bandwidth to the rack.

 

On the other side of the equation: is it not cheaper to have a single PS4 with a 230W power supply (similar to a laptop) that draws 150W than to run a 1U server with a 1000W PSU that draws 800 watts, PLUS the power of the TV/cast device? In terms of energy use, the PS4 is probably still using less. Heck, a fully kitted-out PC with 1000W of parts is still more efficient than the server, because the player isn't going to be playing 24/7, while a server will not be turned off when the player goes away; it will be re-provisioned to the next player. Sure, it's possible to spin down servers, but that adds minutes of boot time, so you're just not going to do that. Anyone who has operated web servers knows you can't just assume capacity; you have to over-provision for load spikes, which means you may as well run the server the entire time at 5% load, just in case you get slashdotted or linked by a celebrity.
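The console-vs-server comparison above is simple arithmetic. A sketch using the post's rough wattages; the four hours of play per day is my own assumption:

```python
# Daily energy: a console that only runs while someone plays, vs. a 1U
# server that stays up around the clock to be re-provisioned. Wattages
# are the post's rough figures; 4 hours of play per day is an assumption.
def daily_kwh(watts: float, hours: float) -> float:
    """Energy in kWh for a device drawing `watts` for `hours` per day."""
    return watts * hours / 1000

console_kwh = daily_kwh(150, 4)   # PS4 drawing ~150 W while playing
server_kwh = daily_kwh(800, 24)   # server drawing ~800 W, never spun down
print(console_kwh, server_kwh)    # 0.6 19.2
```

Even if that one server hosts half a dozen player sessions a day, six consoles at 0.6 kWh each still come to 3.6 kWh against the server's 19.2, which is exactly the "you can't spin them down" point.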

 


Don't complain about gaming, or even computing in general; complain about people still burning fossil fuels while we have new zero-emission technologies ready.

I swear people will take any excuse to bash any relatively new form of media and blame all their problems on it.


Main Computer - Tempest

Manjaro Linux  ||||  Intel Xeon E3-1231 v3  ||||  GeForce GTX 1660 SUPER   ||||  16GB DDR3-1866   ||||  Asus H97-Plus   ||||  EVGA Supernova G2 650w 80+ Gold 

500GB Samung 840EVO |  240GB Sandisk SSD Plus | 1TB WD Blue 7200RPM

 

HTPC - Kralkatorrik

Linux Mint   ||||   Intel Core i3-4170   ||||   GeForce GTX 750 Ti   ||||   8GB DDR3-1333   ||||   Asus B85M-E   ||||   Antec Signature 650W

120GB SSD

 

Laptop - Dell Latitude E4130

Linux Mint   |||   Intel Core i5-520m   |||   8GB DDR3L-1333   |||   152000mAh Custom Battery

120GB SSD

 

Windows NVIDIA GameStream Server - Dell Optiplex 790

Windows 10 Pro  |||   Intel Core i5-2400   |||   8GB DDR3-1600  |||   GeForce GTX 1650 (Half Height)

120GB SSD | 1TB WD Blue 5400RPM

 

Minecraft Server

Ubuntu 18.04 LTS   |||  AMD Phenom II X4 945   |||   8GB DDR3-1333   |||   Asus M5A99FX-Pro   |||   EVGA 600BQ

8GB SATA SSD (for OS) | 250GB WD Black 7200RPM (for server files) | 1TB WD Blue 5400RPM (Incremental Backups)

1 hour ago, Master Delta Chief said:

And I love seeing some people in here not seeming to care about the environmental problems that e-waste still creates to this day. It's still a huge deal and needs further attention.

Tell that to the manufacturers. Some phone manufacturers, for example, go as far as gluing the battery to the screen to make sure you buy a new phone instead of fixing the old one. Companies go to extreme lengths to make their stuff very hard, borderline impossible, to repair.


https://www.google.com/about/datacenters/renewable/

 

At least in the case of Google, they use renewables.

There's an environmental cost to making solar panels, but let's just say it's very probable that sharing hardware, heavy optimization, etc. will in the long run be more efficient.


R9 3900x; 64GB RAM | RTX 2080 | 1.5TB Optane P4800x

1TB ADATA XPG Pro 8200 SSD | 2TB Micron 1100 SSD
HD800 + SCHIIT VALI | Topre Realforce Keyboard

