Prediction: RTX Cards Are Barely Faster Than Pascal

Max_Settings

Yeah, the lack of benchmarks has me wondering what sort of boost they'll actually give over the 10 series.


2 hours ago, KarathKasun said:

MS has a game-focused RT API.  RT is where things are going to be heading as far as lighting and effects.

 

We are moving back to specialized hardware for specific types of graphical effects.  Actually, this is what any person who pays attention to tech was expecting.  We can't make general-purpose compute units any faster if transistors don't shrink, and specialized hardware is many times faster than GP hardware.

Though since this eats into the GP shader budget, I will rue the day when we have hardware that is amazeballs in then-current games but performs worse than hardware that is, by then, generations old.

 

Hopefully they'll figure out how to leverage the specialized hardware for applications developed back then through some driver magic.


18 hours ago, Alex Atkin UK said:

What do you mean, ray tracing itself?  Seems unlikely, seeing as it's part of DirectX rather than something Nvidia came up with on their own.

 

Screen-space reflections are REALLY jarring when you are moving the camera around and see all the reflections on large bodies of water vanish. I can see why this is an area of focus, as reflections/lighting can make the biggest difference to how a game feels.

Microsoft was likely forced to implement it after Nvidia told them that they were going ahead with shipping it. Nvidia has OptiX to backport ray tracing to OpenGL and Direct3D 10/11, and the VK_NV_ray_tracing vendor extension to bring ray tracing to Vulkan. Without adding it to Direct3D 12, the API would have lacked feature parity, and developers might have found OpenGL and Vulkan compelling alternatives, since ray tracing should integrate more easily there given that Microsoft does not have the same level of control over those APIs.
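
On the Vulkan side, checking for that vendor extension is straightforward. A minimal C++ sketch, assuming a Vulkan SDK recent enough to define VK_NV_RAY_TRACING_EXTENSION_NAME:

```cpp
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

// Returns true if the GPU advertises NVIDIA's vendor ray tracing
// extension ("VK_NV_ray_tracing").
bool supportsNvRayTracing(VkPhysicalDevice gpu) {
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());
    for (const auto& e : exts)
        if (std::strcmp(e.extensionName, VK_NV_RAY_TRACING_EXTENSION_NAME) == 0)
            return true;
    return false;
}
```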

 

I am not a graphics developer though, so I could be misunderstanding something. OptiX might work with Direct3D 12. However, I could not find anything to suggest that it does, and I am not going to go study the API documentation to figure it out. In any case, Microsoft wants people using their APIs and not someone else's. It would be harder to lock people into their platform if developers were to start using cross-platform APIs.

 

edit: I was missing something:

 

https://en.m.wikipedia.org/wiki/OptiX#Ray_tracing_with_OptiX

 

OptiX is CUDA-based and likely will not benefit from the new hardware's ray tracing acceleration units. That would mean that unless Microsoft added support for ray tracing to Direct3D, the only way to use the new hardware properly would be through Vulkan. There is no way that Microsoft could allow such a thing.
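
For comparison, once Microsoft's DXR support is in the SDK, the equivalent Direct3D 12 capability check is a single call. A hedged sketch, assuming headers from a DXR-era Windows 10 SDK:

```cpp
#include <windows.h>
#include <d3d12.h>

// Asks an existing D3D12 device whether the driver exposes
// DirectX Raytracing (DXR) at tier 1.0 or better.
bool SupportsDxr(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```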


18 minutes ago, M.Yurizaki said:

Though since this eats into the GP shader budget, I will rue the day when we have hardware that is amazeballs in then-current games but performs worse than hardware that is, by then, generations old.

 

Hopefully they'll figure out how to leverage the specialized hardware for applications developed back then through some driver magic.

Nah, your old games will be stuck in a time capsule of "never going to be any faster".

 

This happened back when we started seeing multicore CPUs as well.  Old single-threaded games still run like crap today.


2 minutes ago, KarathKasun said:

Nah, your old games will be stuck in a time capsule of "never going to be any faster".

 

This happened back when we started seeing multicore CPUs as well.  Old single-threaded games still run like crap today.

Well, I mean, if we get, say, a 3080 with fewer than 2,000 CUDA cores and the speed is more or less the same because more die space was given to RT cores or something, then theoretically the performance should be worse than the GeForce 10 series when running in pure CUDA core mode.

 

I'm assuming no other improvements.
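
Back-of-the-envelope, using the usual peak-FP32 estimate of 2 FLOPs per CUDA core per clock (the 2,000-core part is purely hypothetical):

\[ \text{TFLOPS}_{\text{peak}} \approx 2 \times N_{\text{cores}} \times f_{\text{boost}} \]

A 1080 Ti gives \(2 \times 3584 \times 1.58\,\text{GHz} \approx 11.3\) TFLOPS, while the hypothetical 2,000-core card at the same clock gives \(2 \times 2000 \times 1.58 \approx 6.3\) TFLOPS, so in pure CUDA-core workloads it really would be a step backwards.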


2 minutes ago, M.Yurizaki said:

Well, I mean, if we get, say, a 3080 with fewer than 2,000 CUDA cores and the speed is more or less the same because more die space was given to RT cores or something, then theoretically the performance should be worse than the GeForce 10 series when running in pure CUDA core mode.

 

I'm assuming no other improvements.

It's going to happen like that.  Like I said, look at what happened with CPUs... more cores = lower clocks = older games will theoretically get to a point where they run slower on modern hardware.


3 hours ago, xg32 said:

My minimum target for the 2080 Ti is 4K/98fps with RT enabled at high settings, and 4K/120fps without RT at medium settings (an OC'd 1080 Ti does 90-100fps in most AAA games). If RTX falls short of that, it's gonna be a very tough decision (no alternatives); not going SLI.

Sorry to burst your bubble, but there is already a hands-on review from TechRadar; one of the guys played Tomb Raider on a 2080 Ti at 4K ultra details and recorded 54-57 FPS with RTX on...

 

Honestly, look at those presentations again, especially Tomb Raider, and you can literally notice the FPS drop with RTX. As of now it looks like another HairWorks.


16 minutes ago, Danoniero said:

Sorry to burst your bubble, but there is already a hands-on review from TechRadar; one of the guys played Tomb Raider on a 2080 Ti at 4K ultra details and recorded 54-57 FPS with RTX on...

 

Honestly, look at those presentations again, especially Tomb Raider, and you can literally notice the FPS drop with RTX. As of now it looks like another HairWorks.

I will not discount RTX and what it does improve, since lighting does look more real, but after looking at a HairWorks demo again, man, are you spot on.  Yeah, HairWorks also looked great, but you literally had to slow the game down to pay attention to someone's head.  I am not sure we are going to stop playing a game to admire the reflections and lighting like Nvidia featured.  With this being said, yeah, RTX should have been just a feature, not a new moniker.  $1,200 is a ton of money, but if it can do 30%+ better than a 1080 Ti in traditional benchmarks, then maybe for some psychopaths like myself it can be somewhat justified.


2 minutes ago, Jrasero said:

I will not discount RTX and what it does improve, since lighting does look more real, but after looking at a HairWorks demo again, man, are you spot on.  Yeah, HairWorks also looked great, but you literally had to slow the game down to pay attention to someone's head.  I am not sure we are going to stop playing a game to admire the reflections and lighting like Nvidia featured.  With this being said, yeah, RTX should have been just a feature, not a new moniker.  $1,200 is a ton of money, but if it can do 30%+ better than a 1080 Ti in traditional benchmarks, then maybe for some psychopaths like myself it can be somewhat justified.

If the 2080 Ti is 30% faster than a 1080 Ti across the board AND you can turn on RTX effects without a performance penalty, I will concede that the price is "reasonable" (albeit very expensive).

 

How much do you want to bet that turning on the RTX effects hurts performance quite a bit?


2 minutes ago, Jrasero said:

I will not discount RTX and what it does improve, since lighting does look more real, but after looking at a HairWorks demo again, man, are you spot on.  Yeah, HairWorks also looked great, but you literally had to slow the game down to pay attention to someone's head.  I am not sure we are going to stop playing a game to admire the reflections and lighting like Nvidia featured.  With this being said, yeah, RTX should have been just a feature, not a new moniker.  $1,200 is a ton of money, but if it can do 30%+ better than a 1080 Ti in traditional benchmarks, then maybe for some psychopaths like myself it can be somewhat justified.

Lighting is really the last frontier in computer-generated imagery.  We can already push more polygons than pixels in real time.
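
Rough numbers behind that claim: a 4K frame is \(3840 \times 2160 \approx 8.3\) million pixels, so averaging one triangle per pixel at 60 fps only takes

\[ 8.3 \times 10^6 \times 60 \approx 5 \times 10^8 \ \text{triangles/s}, \]

which is comfortably within the setup rates of current GPUs.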


1 minute ago, MadPistol said:

If the 2080 Ti is 30% faster than a 1080 Ti across the board AND you can turn on RTX effects without a performance penalty, I will concede that the price is "reasonable" (albeit very expensive).

 

How much do you want to bet that turning on the RTX effects hurts performance quite a bit?

RTX on probably does negatively affect performance.  Yeah, I guess I wonder: is RTX a thing you can turn on and off?  Was that mentioned?  I just figured it is something games are developed for, so either your GPU can or can't handle it.


1 hour ago, Jrasero said:

RTX on probably does negatively affect performance.  Yeah, I guess I wonder: is RTX a thing you can turn on and off?  Was that mentioned?  I just figured it is something games are developed for, so either your GPU can or can't handle it.

You will probably have an option to turn it on or off in the settings, like so many other things. If you turn it on and your card does not support it, I imagine that things will look weird, crash, or just not work. All this is speculation, of course.


You will DEFINITELY be able to turn it off, as new games still have to include lighting methods for non-RTX cards anyway, so it would be dumb to exclude the option.

 

I still say it's not "another HairWorks" because, honestly, hair moving a bit more realistically is meaningless compared to lighting and shadows looking more realistic.  As shown in the demos, being able to see what direction explosions are happening outside the viewport is going to be BIG for a lot of games.

 

Any game taking place at night is going to be absolutely stunning.

 

This is something that will affect gameplay, unlike HairWorks, which was just a "nice to have".
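
Something like this, conceptually; a toy sketch of the toggle logic (all names hypothetical, not any real engine's API):

```cpp
// Even with the user's RT toggle on, a non-RTX card should silently
// get the rasterized fallback rather than artifacts or a crash.
enum class ReflectionMode { ScreenSpace, RayTraced };

ReflectionMode chooseReflectionMode(bool userWantsRt, bool hwSupportsRt) {
    if (userWantsRt && hwSupportsRt)
        return ReflectionMode::RayTraced;
    return ReflectionMode::ScreenSpace;  // the path every game ships anyway
}
```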


23 hours ago, Max_Settings said:

 

I think you'll basically just move everything down a peg, like we did when Kepler was refreshed with the 700 series.

2080 Ti = 20% faster than 1080 Ti

1080 Ti = 2080

1080 = 2070

1070 = 2060



23 minutes ago, Alex Atkin UK said:

You will DEFINITELY be able to turn it off, as new games still have to include lighting methods for non-RTX cards anyway, so it would be dumb to exclude the option.

 

I still say it's not "another HairWorks" because, honestly, hair moving a bit more realistically is meaningless compared to lighting and shadows looking more realistic.  As shown in the demos, being able to see what direction explosions are happening outside the viewport is going to be BIG for a lot of games.

 

Any game taking place at night is going to be absolutely stunning.

 

This is something that will affect gameplay, unlike HairWorks, which was just a "nice to have".

Yeah, but if you turn off ray tracing then you're defeating the entire point of the new cards. The RTX 2080 Ti (let alone the much slower 2070) can't maintain 60fps at 1080p in Shadow of the Tomb Raider. Even if they manage to use driver enhancements to bring it up to 60fps (which in and of itself would be a miracle), there's no way the 2070 will be able to do it. So they're selling a feature the cards can't really run well, just because they're better at it than anything else.



7 hours ago, xg32 said:

My minimum target for the 2080 Ti is 4K/98fps with RT enabled at high settings,

Bad news: the 2080 Ti can't even do 1080p/60fps with RT enabled in SOTR.



3 minutes ago, Zeitec said:

Yeah, but if you turn off ray tracing then you're defeating the entire point of the new cards. The RTX 2080 Ti (let alone the much slower 2070) can't maintain 60fps at 1080p in Shadow of the Tomb Raider. Even if they manage to use driver enhancements to bring it up to 60fps (which in and of itself would be a miracle), there's no way the 2070 will be able to do it. So they're selling a feature the cards can't really run well, just because they're better at it than anything else.

I guess it depends on whether the bottleneck is entirely within RTX or actually some other option that can be turned off to help.  Demonstrations are naturally going to crank everything to Ultra, and they're probably not fully optimised yet either.

 

Also, surely there will be RTX profiles that scale things back a bit to maintain frame rate?  Just an on/off would make the 2070 rather useless above 720p, I'd think.
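
Such a profile could be as simple as trading rays per pixel against frame time. A hypothetical sketch, with made-up numbers and names:

```cpp
// Nudges ray-tracing quality up or down to chase a frame-time budget.
// Purely illustrative; a real engine would smooth over many frames.
int adjustRaysPerPixel(int current, double frameMs, double targetMs = 16.7) {
    if (frameMs > targetMs * 1.1 && current > 1)
        return current - 1;  // over budget: shed RT work first
    if (frameMs < targetMs * 0.8 && current < 4)
        return current + 1;  // headroom: spend it on more rays
    return current;
}
```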


Am I the only one who's not impressed at all with ray tracing's "visual enhancements"? Pretty much, if you don't care for it and leave it off in these few upcoming titles that will support it, then there will be no real difference between Pascal and Turing.


24 minutes ago, Princess Cadence said:

Am I the only one who's not impressed at all with ray tracing's "visual enhancements"? Pretty much, if you don't care for it and leave it off in these few upcoming titles that will support it, then there will be no real difference between Pascal and Turing.

I mean, it looks really good in some cases, but most of the time it just looks pretty good. Like, that Battlefield V demo was pretty impressive, and so was the lit-room Metro demo. However, the Tomb Raider one and the Metro one from a while ago just look meh.



31 minutes ago, Zeitec said:

I mean, it looks really good in some cases, but most of the time it just looks pretty good. Like, that Battlefield V demo was pretty impressive, and so was the lit-room Metro demo. However, the Tomb Raider one and the Metro one from a while ago just look meh.

Which kinda reaffirms to me that only a handful of devs out of the gate are going to be using ray tracing, and using it well.

It could be seen as an investment if new APIs in the future really mature with the tech.  But at the end of the day, investing is gambling, and people gambled on Vulkan/DX12.

 

(Keep in mind this is from a gaming perspective only)


RTX is new tech that hasn't been fully tested yet. I think we have to wait a generation or two until they've perfected it. You don't want to be an early adopter of technology that hasn't been proven yet.


[Screenshot: retailer listing at £1,340]

 

Wow, £1,340! That's about $1,700... insane.


No, it will be better, but not a huge change.

 

All the gains were in the GDDR6. When it came to clock speeds and CUDA core counts, they didn't change much. Instead, they spent all that valuable die space on RT and Tensor cores.
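
The GDDR6 gain is real, at least on paper. Assuming the reported 352-bit bus on both cards:

\[ \text{bandwidth} = \frac{\text{bus width} \times \text{data rate}}{8}, \quad \text{2080 Ti: } \frac{352 \times 14}{8} = 616\ \text{GB/s}, \quad \text{1080 Ti: } \frac{352 \times 11}{8} = 484\ \text{GB/s}. \]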

 

Or just release the damn card without the RT and Tensor cores, and price things the same as the last generation but with GDDR6... ray tracing is a gimmick.


1 hour ago, Princess Cadence said:

Am I the only one who's not impressed at all with Ray Tracing 'visual enhancements"? Pretty much if you don't care to it and leave it off on these few upcoming titles that will support it then there will be no real difference between Pascal and Turing.

Not impressed at all, just... darker shadows. I'd have to see it in person though.

 

"Nvidia ShadowWorks"


5 minutes ago, PrinnyExplodes said:

Or just release the damn card without the RT and Tensor cores, and price things the same as the last generation but with GDDR6... ray tracing is a gimmick.

It'd have been funny to see a GTX 2080 Ti with no Tensor cores, no RT cores, just a whole bunch of CUDA cores, like 6,000 or more... it would probably outsell RTX though, making it useless, and Nvidia really wants to milk some money back from all the pointless investments they made in ray tracing technology.

