Mamonos

Member
  • Content Count

    299
  • Joined

  • Last visited

Awards


This user doesn't have any awards

1 Follower

About Mamonos

  • Title
    Member

Profile Information

  • Gender
    Male

System

  • CPU
    Intel Core i7-8700K
  • Motherboard
    ASUS ROG Maximus X Hero
  • RAM
    4x8GB G.SKILL Trident Z RGB 3000MHz CL14
  • GPU
    nVidia RTX 2080 Ti
  • Case
    Fractal Design R6 TG
  • Storage
    Samsung 970 Evo Plus 1TB
  • PSU
    Corsair RM750i
  • Display(s)
    Acer Predator X27P
  • Cooling
    Custom Loop (EKWB)
  • Keyboard
    Logitech G810
  • Mouse
    Logitech G402
  • Sound
    Logitech G432
    Focal Spirit Professional
  • Operating System
    Windows 10

Recent Profile Visitors

The recent visitors block is disabled and is not being shown to other users.

  1. Developing and training such networks so that they produce reliable results is complex and expensive. On top of that, you need specific hardware (not necessarily expensive) on the end-user side to achieve this kind of real-time AI upscaling. This is why you don't see it advertised by (say) Netflix. Moreover, they would rather invest their money in content creation than in upscaling technologies, because for most users the amount of entertainment content matters more than the resolution - and anyway, ISPs in most countries already provide enough bandwidth to deliver high-res content (given the compression they use). Finally, they can rely on hardware manufacturers to upscale the content for them. Hardware manufacturers, on the contrary, are heavily invested in this. nVidia's latest Shield device uses AI upscaling and can upscale 720p/1080p to 4K at 30fps in real time. Samsung developed a comparable technology (MLSR) to upscale content for their new 8K TV lineup, and LG and Sony are on board as well. And this is just a small fraction of the applications based on deep neural networks. Most of those companies have large resource libraries you can browse to understand their research, if you are interested. A sample AI-based upscaling software (which therefore takes ages to render) that you can investigate and try is Gigapixel AI by Topaz - people have upscaled old movies with it with good results.
  2. "Moving 3D graphics that are far more complex and even have 3rd axis" that you see while gaming are a series of rendered images (-> frames) With DLSS 2.0 the shaders in RTX cards are free to render those images at a (aliased) lower resolution (i.e. 720p to target 1080p, therefore more frames produced in the same amount of time). Meanwhile the card's tensor cores are then used to upscale the image using an AI convolutional encoder in real time. They don't make "details out of thin air", they have an exhaustively trained deep neural network that learns by comparing the results of it's processing to high quality 16K reference images. I suggest to not underestimate AI and address it like it is something that "it is impossible and will never exist" because a lot of companies (including nVidia) have already understood AI potential and are shifting their focus and resources towards that.
  3. The videos from Roman are always original and super interesting; he is one of my favourite content creators at the moment. That said, I think in the end it's actually a pretty sketchy solution. If the goal is to have a second system for video capturing and streaming, then I think it would be better to get a two-system case. In this configuration the NUC does not have a PCIe slot for capture cards, so you are forced to use either USB or Thunderbolt (unless using an M.2 adapter, not sure if that is possible tho), there is no chance to upgrade or swap parts since the CPU is soldered to the mainboard, and I am also pretty sure that you can build a (way?) better and cheaper ITX-based system.
  4. I have the X27 (and tested the PG27UQ) and I find that the difference compared to HDR400 and even HDR600 monitors is huge. I bought this monitor because I play 4K HDR games at maxed-out settings at a 98Hz refresh rate with G-SYNC. In my opinion this is the use case that makes these monitors "worthy" (to be clear, it is still a crazy, hard-to-justify purchase), because at the moment you can only accomplish this with one of these two monitors. That being said, there is not an incredible amount of (gaming) HDR content yet, and if you are looking for 4K at high refresh rates it's probably better to sacrifice the HDR certifications and get something with a lower response time.
  5. (Photos: desk, right side, left side.) And yes, the headphone cable is not long enough.
  6. You could use that, but you would need to cut away the unused pin on each header. There are addressable RGB splitters with 3-pin headers on sale: EKWB makes one, but it is for 6 devices, which may result in a cable mess; this one from Cooler Master is for 3 devices https://www.amazon.co.uk/dp/B07HQBCX9L/ref=cm_sw_em_r_mt_dp_U_3J86Eb4Z2SZ8H EDIT: BTW I believe you should check whether you can daisy-chain the ARGB headers of the fans (I am not sure)
  7. Have a look at Rosewill, the RSV-R4000 or RSV-L4000.
  8. Yes, it is supposed to work that way.
  9. This board looks astonishing, to be honest: Gigabyte B550 Vision D
  10. To my knowledge: -Your CPU supports only dual-channel memory. -There are no Z370/Z390 mATX motherboards with Thunderbolt 3 included. Some boards in this form factor (mainly from Gigabyte) support Thunderbolt 3 via an add-in PCIe card, but then lack either SLI support or the required PCIe slots. -The Gigabyte Z390 Designare is the only board that comes with built-in Thunderbolt 3 and 2-way SLI support. In your case it would be super convenient because it also has WiFi and Bluetooth, as well as a front panel USB 3.1 Gen 2 Type-C header in case your case is equipped with that. -Thunderbolt 3 PCIe cards are compatible only with specific motherboards, since the board needs to have a TB header. -You should look into the different Thunderbolt 3 PCIe cards from the motherboard manufacturers (ASUS ThunderboltEX 3, ASRock Thunderbolt 3 AIC, Gigabyte GC-Alpine Ridge or Titan Ridge) and compare the listed compatible motherboard models to see if you can find a board you prefer.
  11. (I have the same PSU.) The continuous output rated temperature is 50C, so there is basically nothing to worry about, and looking at my measurements I found that the default Zero RPM Fan Mode (beware, it's possible to change the fan configuration using Corsair Link) is well calibrated to keep the PSU around that value. According to the specification/manual, the fan will start to spin only once the power output is greater than 300W. According to my measurements this is true only to some extent, and this is where it gets confusing, because: -I am not sure whether the power threshold that triggers the fan is based on the +12V rail power draw alone or on the total PSU draw. -The value seems to be a bit higher (more towards 350-400W).
  12. The SSD will work on your motherboard. M2_1 is connected directly to the CPU while M2_2 is connected to the X570 chipset, but I have never seen any performance difference between the two slots (there was significant degradation on X470 because the chipset only provided PCIe 2.0 x4 lanes). PCIe is backward compatible: a PCIe 3.0 NVMe SSD will work in a PCIe 4.0 slot (the opposite is also true, meaning you could use a PCIe 4.0 NVMe SSD in a PCIe 3.0 M.2 slot - of course that would make little sense, since you would be limiting its bandwidth). In my opinion it makes no sense to partition a solid state drive except purely for organizational purposes - in which case partitioning will not hurt anything anyway, so do as you wish.
  13. Put the socket cover on and then throw it in the dishwasher (without adding soap/detergent)
  14. Yeah, that would be ideal, but I am not sure about it; for the moment it looks like "X570 is one year old and B550 is the new thing". B550 will hopefully drop in price after some time.
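
Following up on post 2: below is a minimal toy sketch, in PyTorch, of the training idea behind that kind of super-resolution network - a convolutional upscaler penalised on the difference between its output and high-resolution reference frames. It is purely illustrative (the ToyUpscaler model, layer sizes, and random stand-in frames are my own assumptions), not nVidia's DLSS implementation.

```python
# Toy sketch of DLSS-style super-resolution training: a convolutional network
# learns to upscale low-resolution frames by being penalised on the difference
# to high-resolution reference frames. Illustrative only, not nVidia's code.
import torch
import torch.nn as nn


class ToyUpscaler(nn.Module):
    """Tiny convolutional upscaler: 2x upsampling via sub-pixel convolution."""

    def __init__(self, channels: int = 3, scale: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, channels * scale * scale, kernel_size=3, padding=1),
            nn.PixelShuffle(scale),  # moves channel data into a 2x larger image
        )

    def forward(self, x):
        return self.net(x)


def train_step(model, optimizer, low_res, high_res):
    """One optimisation step: compare the upscaled frame to the reference."""
    optimizer.zero_grad()
    upscaled = model(low_res)
    loss = nn.functional.l1_loss(upscaled, high_res)  # pixel-wise difference
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    model = ToyUpscaler()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    # Stand-in data: random tensors instead of real frame pairs.
    low_res = torch.rand(4, 3, 64, 64)     # batch of 4 low-resolution crops
    high_res = torch.rand(4, 3, 128, 128)  # matching "reference" crops at 2x
    for step in range(3):
        loss = train_step(model, optimizer, low_res, high_res)
        print(f"step {step}: L1 loss = {loss:.4f}")
```

A real pipeline would of course feed pairs of rendered low-resolution frames and their high-resolution references (plus motion vectors and previous frames, in DLSS 2.0's case) instead of random tensors.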