
NVIDIA finally officially supports Adaptive Sync (with a small catch)

D13H4RD

Double Doh! I just bought a 1070 (339 CAD) and an S2716DG (449.99 CAD). Pulled the trigger a bit too quickly; I didn't realize the 2060 would be quite that competitive. Good deals nonetheless, just a little bummed. I nearly returned the G-Sync monitor thinking: I turn off G-Sync and can hardly tell the difference, so why not get a 144 Hz Samsung 1440p 32" at the same price? I went back to check, and the Samsung wasn't adaptive; only 60, 120 and 144 Hz were selectable. I have until tomorrow to return this monitor under Best Buy's 14-day return policy. Don't think I will, though. After all the Boxing Day deals are done, there are very few similarly priced FreeSync monitors that come close to the specs of the Dell.

 

Thinking out loud. I'm pretty sure a couple of you did the same on Black Friday and Boxing Day.


53 minutes ago, empyr3al said:

Double Doh! I just bought a 1070 (339 CAD) and an S2716DG (449.99 CAD). Pulled the trigger a bit too quickly; I didn't realize the 2060 would be quite that competitive. Good deals nonetheless, just a little bummed. I nearly returned the G-Sync monitor thinking: I turn off G-Sync and can hardly tell the difference, so why not get a 144 Hz Samsung 1440p 32" at the same price? I went back to check, and the Samsung wasn't adaptive; only 60, 120 and 144 Hz were selectable. I have until tomorrow to return this monitor under Best Buy's 14-day return policy. Don't think I will, though. After all the Boxing Day deals are done, there are very few similarly priced FreeSync monitors that come close to the specs of the Dell.

 

Thinking out loud. I'm pretty sure a couple of you did the same on Black Friday and Boxing Day.

 

You bought a good monitor, that's all really. The bits in bold reflect what I said earlier about G-Sync being higher quality and not always necessary with today's high-performance GPUs.

Grammar and spelling are not indicative of intelligence or knowledge. Not having the same opinion does not always mean a lack of understanding.


18 minutes ago, mr moose said:

 

You bought a good monitor, that's all really. The bits in bold reflect what I said earlier about G-Sync being higher quality and not always necessary with today's high-performance GPUs.

Good, yes; perfect, it's not. I'll be swapping it for another unit (same S2716DG G-Sync), as it has one dead pixel (constantly white) about 2x2" from the bottom left, and one stuck pixel, though that one is extremely difficult to spot outside of a dead-pixel-testing website. At 1440p they really don't stick out that badly; I just want something perfect. Dell's policy is that it's covered, so I'll roll the dice and hope I don't get another panel with a dead pixel.


8 hours ago, mr moose said:

Actually, I really think the thing that saved AMD was getting Lisa Su in as CEO in 2014.

That, alongside AMD getting some breaks and some good releases.

 

Notably the Polaris RX line, supplying the processing units for the eighth-generation home consoles, and especially the successful launch of the Zen microarchitecture.

 

It wasn't too long ago that the general sentiment around AMD was very bleak.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


So NVIDIA finally decided to let the cat out of the bag and admit to the lie that everyone already knew was bullshit.

G-Sync has always been a proprietary cash grab and completely unnecessary. At the end of the day, customer-friendly open standards always pull through, and NVIDIA could no longer justify the inflated price differential of gimmick-based video cards paired with a G-Sync monitor at a several-hundred-dollar premium.

Now we'll just have to give it another five years before they admit that the "G-SYNC certification" is also a scam.

What do Windows 10 and E.T. have in common?

They are both constantly trying to phone home.


25 minutes ago, Hellion said:

So NVIDIA finally decided to let the cat out of the bag and admit to the lie that everyone already knew was bullshit.

G-Sync has always been a proprietary cash grab and completely unnecessary. At the end of the day, customer-friendly open standards always pull through, and NVIDIA could no longer justify the inflated price differential of gimmick-based video cards paired with a G-Sync monitor at a several-hundred-dollar premium.

Now we'll just have to give it another five years before they admit that the "G-SYNC certification" is also a scam.

While G-Sync is a premium over a FreeSync display, the feature requirements are higher, such as the better variable refresh rate range, so you get a higher-quality monitor. There are a lot of crappy FreeSync monitors with the compatibility slapped on for marketing, so an open standard isn't always better. But Nvidia opening up support to some high-end FreeSync displays is awesome no matter the reason; of course, some always have to make it seem negative because it's Nvidia.


2 hours ago, Blademaster91 said:

While G-Sync is a premium over a FreeSync display, the display quality is still higher, and the feature requirements are higher, such as the better variable refresh rate range, so you get a higher-quality monitor. There are tons of crappy FreeSync monitors with the compatibility slapped on for marketing, so an open standard isn't always better. But Nvidia opening up support to some high-end FreeSync displays is awesome no matter the reason; of course, a few people always have to make it seem negative because it's Nvidia.

The display quality is higher under what standards, exactly? I'm willing to bet that in a Pepsi-challenge scenario you, like nearly everyone else, would not be able to tell the difference. Regardless, most of the FreeSync monitors worth buying, once you take into consideration panel type, refresh rate, response time and color-gamut coverage, are essentially the same as the G-Sync variant. Of course an entry-level monitor is going to offer an inferior user experience versus a premium one with a completely different spec sheet... Your argument has zero merit without context.

 

Regarding the available range, read up on FreeSync 2, and beyond that, understand that coverage below 60 FPS is going to be a poor experience simply because of the number of dropped frames. You may not see tearing, but at that point it won't matter; it's still going to feel choppy, slow and constrained.

 

At the end of the day, this has nothing to do with Nvidia's name; that's a desperate reach. As I've already explained here before, I make purchases based exclusively on value, which boils down to price-to-performance ratio. Nvidia has been a poor customer experience for so many reasons that I can't even keep track anymore, and it's gone on for years.



10 hours ago, Syntaxvgm said:

The problem is, I guarantee it was an intentional limit that will be worked around with driver hacks as usual. It's not a hardware limitation. It doesn't matter if they are five years old; the fact is they still play games well, and a lot of people still use them. I wouldn't expect Nvidia to make it work if the cards were so old they no longer got driver updates, but they're not. I'm 90% sure it's an artificial limitation, and if they didn't have so much 10-series stock to get rid of, I bet it would be 20-series only.

And no, they are a little over four years old, by the way, not five. The 980 Ti is about three and a half.


I said they "will be" five years old this year, not that they already are. They probably don't have many 10-series cards in stock anymore, and they're not excluding the 10-series simply because they can't: those cards are still too new, and Nvidia is still actively optimizing for them. The 9-series is supported by current drivers, but it's not like Nvidia is doing game-specific optimizations for it; they haven't for a while now. It's still supported because it's not quite old enough to hit legacy status. Officially supporting old hardware means spending resources testing on it to make sure everything works, and that's extra time and money for what would probably bring little to no financial benefit, with no real downside to skipping it.


21 hours ago, Cyberspirit said:

Oh, damn! What a great way to start 2019. I don't have a FreeSync panel either, but I'll be curious how the "not worthy" monitors play along with this.

It's actually a second Christmas for me on the 15th; I'm soooo excited to try it with my Asus MG248 and 1070 Ti. Since my monitor "isn't worthy," I'm all the more curious to see what'll happen. I don't feel like my monitor is one of the lower-end ones, but I guess I'll see in a week.

8086k

aorus pro z390

noctua nh-d15s chromax w black cover

evga 3070 ultra

samsung 128gb, adata swordfish 1tb, wd blue 1tb

seasonic 620w dogballs psu

 

 


25 minutes ago, Hellion said:

The display quality is higher under what standards exactly? I'm willing to bet that during a pepsi challenge scenario you like nearly everyone else would not be able to tell the difference.

Mostly to do with binning. G-SYNC monitors tend to use highly binned panels, while FreeSync monitors can vary from highly binned to lower-binned units. Some FreeSync models also use panels originally meant for G-SYNC models that did not meet standards.



16 minutes ago, mxk. said:

It's actually a second Christmas for me on the 15th; I'm soooo excited to try it with my Asus MG248 and 1070 Ti. Since my monitor "isn't worthy," I'm all the more curious to see what'll happen. I don't feel like my monitor is one of the lower-end ones, but I guess I'll see in a week.

Make sure to update us on that. I'm very curious.

Make sure to quote or tag people so they get notified.


1 hour ago, D13H4RD said:

Mostly to do with binning. G-SYNC monitors tend to use highly binned panels, while FreeSync monitors can vary from highly binned to lower-binned units. Some FreeSync models also use panels originally meant for G-SYNC models that did not meet standards.

The binning is done arbitrarily by a single company, Nvidia. So again, in real-world use you still wouldn't be able to tell the difference between the same panel; both still meet the minimum spec.

 

That sure as hell isn't worth a $200+ premium, especially to those who care about performance per dollar rather than chasing heavily diminishing returns.

 

Even on the lower models, monitor performance below 60 FPS is a crapshoot, and having G-Sync available below 30 FPS is still a terrible experience.



4 minutes ago, Hellion said:

That sure as hell isn't worth a $200+ premium, especially to those who care about performance per dollar rather than chasing heavily diminishing returns.

Well, no one (no one with a reasonable mindset, anyway) ever said G-SYNC was worth $200+ over an equivalent FreeSync model.

 

Is G-SYNC objectively better? Yes. But is it $200+ better? I'd say no.

 

Regardless, it's still good that they (finally) realized it was better to let customers choose based on their demands and budget, rather than forcing one option on anyone who wants variable refresh rate, especially when an open standard exists that works well enough for many.



On 1/8/2019 at 1:37 AM, D13H4RD said:

Well, no one ever said G-SYNC was worth $200+ over an equivalent FreeSync model.

 

Is G-SYNC objectively better? Yes. But is it $200+ better? I'd say no.

 

Regardless, it's still good that they (finally) realized it was better to just let customers choose based on their demands and budget.

Which is how it should have been since day one. Instead, Nvidia made a proprietary, overpriced "alternative" and touted nonsense about why FreeSync couldn't be supported, to fleece fanboys without the ability to rationalize. Then, five years later, after realizing that this strategy wasn't working to force G-Sync sales over the open standard, they came clean and enabled adaptive sync via nothing more than a driver update.

 

So even if we conclude that the same panel is slightly better below 30 FPS, where it doesn't actually matter, that doesn't change the fact that this is just another shitty business practice tacked onto a never-ending list from Nvidia.

 

I personally don't praise corporations for doing the right thing after the fact. It's like praising a game developer for not including loot boxes: games never had that 30 years ago, so why should anyone be praised for not having them in the modern age?



1 hour ago, D13H4RD said:

Mostly to do with binning. G-SYNC monitors tend to use highly binned panels, while FreeSync monitors can vary from highly binned to lower-binned units. Some FreeSync models also use panels originally meant for G-SYNC models that did not meet standards.

First time I am hearing that the G-Sync panels are better binned ones. Got a source on that?

I don't find it that hard to believe considering G-Sync monitors are usually more expensive, but I'd like some evidence before believing it.

 

 

16 minutes ago, Hellion said:

The binning is done arbitrarily by a single company, Nvidia. So again, in real-world use you still wouldn't be able to tell the difference between the same panel; both still meet the minimum spec.

If the panels for FreeSync monitors are lower-binned ones, then it's not done by Nvidia. The binning happens at the panel manufacturer (like Samsung or Japan Display), and the display manufacturers (Asus, BenQ, etc.) then decide which panel goes into which monitor model. Nvidia can only say, "if you want to be called a G-Sync monitor and be compatible, you need to fulfill these specifications." Demanding that your competitor only get B-grade panels would be illegal.

 

 

19 minutes ago, Hellion said:

That sure as hell isn't worth a $200+ premium, especially to those who care about performance per dollar rather than chasing heavily diminishing returns.

Got any source for that $200+ price premium? I see it thrown around all the time, but I've never seen it actually backed up by posting comparable displays. And no, a FreeSync monitor without things like frame duplication or a strobed backlight is not comparable.
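For context, the "frame duplication" mentioned above is the trick AMD brands as Low Framerate Compensation (LFC): when the game's frame rate falls below the monitor's minimum VRR refresh, each frame is shown two or more times so the panel stays inside its range. Here is a minimal sketch of the idea, assuming a made-up 48-144 Hz range (not the spec of any particular monitor):

```python
VRR_MIN_HZ = 48   # assumed lower bound of the monitor's VRR range
VRR_MAX_HZ = 144  # assumed upper bound

def effective_refresh(fps: float) -> float:
    """Refresh rate the panel would actually run at for a given frame rate."""
    if fps >= VRR_MIN_HZ:
        return min(fps, VRR_MAX_HZ)      # inside the range: refresh tracks FPS
    multiple = 2
    while fps * multiple < VRR_MIN_HZ:   # duplicate frames until in range
        multiple += 1
    return fps * multiple

print(effective_refresh(100))  # 100: tracked directly
print(effective_refresh(30))   # 60: each frame shown twice
print(effective_refresh(20))   # 60: each frame shown three times
```

Note that this only works when the range's maximum is at least roughly twice its minimum, which is one reason a wide VRR range matters when comparing monitors.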

 

20 minutes ago, Hellion said:

Even on the lower models, monitor performance below 60 FPS is a crapshoot, and having G-Sync available below 30 FPS is still a terrible experience.

What are you on about? It's below 60 FPS that you get the biggest benefit from adaptive sync.

One of the main reasons sub-60 FPS feels bad is very uneven frame timing. When some frames are displayed for one refresh cycle and others for three or four, the animation looks and feels choppy. Adaptive sync like FreeSync and G-Sync solves that.

Otherwise, FreeSync and G-Sync would just be glorified alternatives to v-sync, and tearing isn't even a big problem at such high frame rates.
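The uneven frame timing described above can be sketched numerically. The snippet below is purely illustrative (the render times are made-up numbers, not measurements): it compares how long each frame stays on screen with fixed 60 Hz v-sync versus adaptive sync.

```python
import math

REFRESH_HZ = 60
SCAN_MS = 1000 / REFRESH_HZ  # one refresh cycle = ~16.67 ms

# Hypothetical render times (ms) for five consecutive frames of a game
# running at an uneven 40-ish FPS.
render_ms = [14.0, 25.0, 17.0, 35.0, 20.0]

# Fixed refresh + v-sync: each frame is held for a whole number of refresh
# cycles, so on-screen time jumps between 16.7, 33.3 and 50 ms steps.
fixed = [math.ceil(t / SCAN_MS) * SCAN_MS for t in render_ms]

# Adaptive sync: the monitor refreshes when the frame is ready, so
# on-screen time simply tracks the render time.
adaptive = list(render_ms)

print([round(t, 1) for t in fixed])     # uneven multiples of the scan time
print([round(t, 1) for t in adaptive])  # follows the render times directly
```

With fixed refresh, two frames rendered 14 ms and 35 ms apart end up displayed for 16.7 ms and 50 ms, a 3x swing; with adaptive sync the swing is only what the game itself produced.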


10 minutes ago, Hellion said:

Instead, Nvidia made a proprietary, overpriced "alternative" and touted nonsense about why FreeSync couldn't be supported, to fleece fanboys without the ability to rationalize.

Wasn't Nvidia first out with variable refresh technology? FreeSync was a catch-up move by AMD, and the only way they could get traction on it was to give it away by standardising it. That's the strategy AMD is forced into by their weak position; they're not doing this out of generosity, it's all they can do business-wise. We end up with some good FreeSync panels, but there are many not-so-good ones too.

 

7 minutes ago, LAwLz said:

It's below 60 FPS that you get the biggest benefit from adaptive sync.

I think I'd disagree there, at least based on my personal preferences. Comparing a 144 Hz G-Sync monitor and a 60 Hz FreeSync one, I think the benefit is in a zone around 50-90 FPS. At very high frame rates, 100+, I'm not sure I feel it. Below 50 FPS, even if pacing is helped, it isn't quite smooth enough for me. Variable refresh in the 60+ FPS area does feel more responsive to me than, say, fixed 60 FPS v-sync; maybe it's the latency reduction more than the absolute frame rate. To save on heat and noise, I'm actually playing with an in-game limit of 72 FPS.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


21 minutes ago, LAwLz said:

First time I am hearing that the G-Sync panels are better binned ones. Got a source on that?

I don't find it that hard to believe considering G-Sync monitors are usually more expensive, but I'd like some evidence before believing it.

I couldn't find the source I originally had, since it's been a while since I last read it, but this is the closest one.

 

It's Reddit, so hardly the most informative or reliable source, but it's honestly hard to find a more recent one that backs up that theory.

 

Speaking of that, I have to issue an update and a potential correction: G-SYNC monitors tend to be upper-end gaming units that could have highly binned panels, but reports of defective units in Amazon reviews suggest these are also subject to varying levels of QC.

 

Another update: FreeSync monitors can also be highly binned, although that depends on the type of monitor featuring it.



On 1/8/2019 at 4:27 AM, D13H4RD said:

* snip *

https://www.extremetech.com/gaming/283261-nvidia-admits-defeat-will-support-g-sync-on-freesync-displays

 

I found this article interesting.  However, it also doesn't mention anything about panel binning...



On 1/7/2019 at 8:42 AM, D13H4RD said:

One of the biggest bombshells of NVIDIA's CES 2019 press conference was the announcement that NVIDIA GPU users will no longer have to buy a G-SYNC monitor to use adaptive sync, as the company now extends official support to a number of FreeSync-capable monitors, dubbed "G-SYNC Compatible".

There is a small catch, however. Jensen Huang claims that after testing approximately 400 monitors on the market, only 12 were deemed worthy of the "G-SYNC Compatible" moniker. These 12 are:

  • Acer XFA240
  • Acer XZ321Q
  • Acer XG270HU
  • Acer XV273K
  • Agon AG241QG4
  • AOC G2590FX
  • Asus MG278Q
  • Asus XG248
  • Asus VG258Q
  • Asus XG258
  • Asus VG278Q
  • BenQ Xl2740

With all that said, if your monitor is on the "we're not wooooorthyyyyy" list, the setting can still be enabled manually from the NVIDIA Control Panel as part of a driver update releasing on the 15th of January, whereas supported monitors have it enabled automatically.

 

D13H4RD's opinion 


This is the best news out of the entire press conference by far. While G-SYNC is neat, these monitors can be significantly pricier than their FreeSync counterparts. NVIDIA's announcement basically means that while they can't guarantee the best performance, it should work.

Source: VentureBeat

Too bad I spent $600 on a G-SYNC monitor two months ago... Well, at least I got it on sale, so I didn't have to spend $780 on it :)

Corsair iCUE 4000X RGB

ASUS ROG STRIX B550-E GAMING

Ryzen 5900X

Corsair Hydro H150i Pro 360mm AIO

Ballistix 32GB (4x8GB) 3600MHz CL16 RGB

Samsung 980 PRO 1TB

Samsung 970 EVO 1TB

Gigabyte RTX 3060 Ti GAMING OC

Corsair RM850X

Predator XB273UGS QHD IPS 165 Hz

 

iPhone 13 Pro 128GB Graphite


2 hours ago, LAwLz said:

First time I am hearing that the G-Sync panels are better binned ones. Got a source on that?

I don't find it that hard to believe considering G-Sync monitors are usually more expensive, but I'd like some evidence before believing it.

Ever so slightly better binned: they run at 165 Hz instead of the usual 144 Hz. So, meh.


I'm looking forward to testing my AOC G2460PF

 

It was one of the better FreeSync displays when it was new, with a FreeSync range of 35-144 Hz.

 

It can also do adaptive sync over HDMI at up to 120 Hz.

System specs:

4790k

GTX 1050

16GB DDR3

Samsung evo SSD

a few HDD's


This happening was just a matter of time. There was no way NVIDIA could keep countering an open standard that's as good as, and cheaper than, their G-Sync. Plus, not supporting it meant people were essentially stuck with AMD if they wanted tear-free gaming but didn't want to invest in a hugely expensive G-Sync monitor (or replace a FreeSync-capable monitor they already had). With NVIDIA now supporting FreeSync too, GeForce is a viable option again, and NVIDIA profits the most because you can be their customer again.


7 hours ago, Derangel said:

I said they "will be" five years old this year, not that they already are. They probably don't have many 10-series cards in stock anymore, and they're not excluding the 10-series simply because they can't: those cards are still too new, and Nvidia is still actively optimizing for them. The 9-series is supported by current drivers, but it's not like Nvidia is doing game-specific optimizations for it; they haven't for a while now. It's still supported because it's not quite old enough to hit legacy status. Officially supporting old hardware means spending resources testing on it to make sure everything works, and that's extra time and money for what would probably bring little to no financial benefit, with no real downside to skipping it.

The 5xx series isn't legacy yet and still gets new feature updates...

muh specs 

Gaming and HTPC (reparations)- ASUS 1080, MSI X99A SLI Plus, 5820k- 4.5GHz @ 1.25v, asetek based 360mm AIO, RM 1000x, 16GB memory, 750D with front USB 2.0 replaced with 3.0  ports, 2 250GB 850 EVOs in Raid 0 (why not, only has games on it), some hard drives

Screens- Acer Predator XB241H (1080p, 144Hz G-Sync), LG 1080p ultrawide (all mounted), directly wired to TV in other room

Stuff- k70 with reds, steel series rival, g13, full desk covering mouse mat

All parts black

Workstation(desk)- 3770k, 970 reference, 16GB of some crucial memory, a motherboard of some kind I don't remember, Micomsoft SC-512N1-L/DVI, CM Storm Trooper (It's got a handle, can you handle that?), 240mm Asetek based AIO, Crucial M550 256GB (upgrade soon), some hard drives, disc drives, and hot swap bays

Screens- 3  ASUS VN248H-P IPS 1080p screens mounted on a stand, some old tv on the wall above it. 

Stuff- Epicgear Defiant (solderless swappable switches), G600, mounted mic and other stuff.

Laptop docking area- 2 1440p Korean monitors mounted, one AHVA matte, one Samsung PLS gloss (very annoying, yes). Trashy Razer BlackWidow Chroma... I mean, the J key doesn't click anymore. I got a Model M I use on it too, but it's time for a new keyboard. Some edgy UtechSmart mouse similar to the G600. Hooked to a laptop dock for both of my Dell Precision laptops. (not the only docking area)

Shelf- i7-2600 non-k (has vt-d), 380t, some ASUS sandy itx board, intel quad nic. Currently hosts shared files, setting up as pfsense box in VM. Also acts as spare gaming PC with a 580 or whatever someone brings. Hooked into laptop dock area via usb switch


58 minutes ago, Shay12Mennink said:

Who can help me choose a laptop for 3D graphics? I currently work on a MacBook Pro, but I need a more powerful laptop.

Start another thread if you haven't already, and then you will get a lot of suggestions :)


