Everything posted by KaitouX

  1. This is probably correct. It wouldn't make much sense for Micron to try to find GPUs with the exact memory configuration when the objective is to give an example of where that memory was/can be used, especially since there probably isn't always a GPU that maxes out the possible memory configuration in every way.
  2. Because the 10900K is a lot more expensive. And if you need high core count, AMD is way better for the money.
  3. By the way, the leaked PCB is supposedly from the Colorful Vulcan model, which also had 3x 8-pin connectors on the 2080 Ti, so it's not that weird for it to have those, and it doesn't mean that other models will need them.
  4. 3900 seems way too low for a 3700X; it should be around 4800~4900 at stock, ~100 points higher with PBO, and a bit higher still with a strong all-core OC. For it to score this low, it seems to be throttling or locked at a lower clock than normal.
  5. Review of the Gigabyte 1650 Super Windforce OC on TPU: https://www.techpowerup.com/review/gigabyte-geforce-gtx-1650-super-windforce-oc/ The GPU itself is the same as any other 1650 Super, so the performance is basically the same too; the difference is mostly noise/temperature. If you want to upgrade from the Vega 56, you will need to look for a 5700XT or higher to get a performance increase that justifies the upgrade, in my opinion. If you want something similar to the Vega, then the 1660 Super is probably where you should look.
  6. Both AMD and Nvidia are probably already going to do it with RDNA2 and Ampere.
  7. That just means that TSMC won't build extra capacity for Intel, not that Intel can't pay for the capacity they already have. Also, it's not like there aren't alternatives to using the latest TSMC process, particularly for lower-end products.
  8. I think the plan is to use (probably) TSMC for the GPUs, so Intel's 10nm and 7nm manufacturing issues aren't going to apply, even if other issues Intel might have internally could still apply.
  9. I hope this time around they don't do something like "our new GPUs are 40% faster than last generation, but 70% more expensive". I really want a decent GPU for 4K for around 300€, so I hope the 3060 or whatever it ends up being called will be slightly cheaper than the 2060 at launch, while being at least 30% faster and hopefully not using more than 150W. That shouldn't be impossible, as they've done it with the 900 series, but if they repeat the 900 series I hope they won't repeat the 3.5GB memory issue in some similar way.
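As a rough illustration of that kind of value math, here is a sketch with made-up numbers (just the hypothetical 40%/70% figures above, not real prices):

```python
# Quick value check with hypothetical numbers: a GPU that is 40% faster
# but 70% more expensive than its predecessor is a worse deal per euro.
def perf_per_euro(relative_perf: float, relative_price: float) -> float:
    """Relative performance divided by relative price (baseline card = 1.0)."""
    return relative_perf / relative_price

baseline = perf_per_euro(1.0, 1.0)  # last-gen card as the reference point
upgrade = perf_per_euro(1.4, 1.7)   # "+40% performance, +70% price"
print(f"baseline: {baseline:.2f}, upgrade: {upgrade:.2f}")  # upgrade ≈ 0.82
```

So a "40% faster, 70% more expensive" card delivers roughly 18% less performance per euro than its predecessor.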
  10. Strix GPUs are usually overpriced and not worth it; make sure there aren't better options before buying. It depends on how much you care about noise: some high-end coolers are a lot quieter than the cheaper ones, but I do agree on avoiding premium models for the most part, as in general they are too expensive to be worth it.
  11. It took 1 or 2 weeks until they were available on the market. Some were available earlier, others took more time. The exact dates probably vary depending on the region, but for the most part it took less than 2 weeks for the first few models.
  12. After Turing, everyone that sold their 1080Tis for $500 or less will probably never do that again. Those people literally had to pay the same price as the launch 1080Ti to get basically the same performance they already had, so they would have had to take the money from selling the 1080Ti and add another $200~300 to get the same performance back, unless they just bought a used 1080Ti, which would likely cost more than what they got from selling theirs lol.
  13. Software - No; the exception would probably be RGB or something stupid that isn't worth using anyway.
     Support - Yes, as the warranty and hardware support are honored by the company that made the specific model (EVGA, Zotac, Asus, PNY, etc).
     Performance - Maybe; it depends on the cooling solution, but the difference is likely too small to be noticeable. The biggest difference between models is noise: Zotac coolers are often really loud and not that cool considering the noise, with the exception of their more expensive coolers. I wasn't able to find much about the PNY cooler, but it's probably better than the Zotac one.
     It seems that isn't quite right when it comes to the 1650 Super, going by the latest TPU 1650 Super review: the best cooler at stock out of the "cheap" coolers reviewed ($160~170 MSRP) is the Palit StormX, a single-fan design that probably would have been better with fan stop, and the worst is the Zotac by a mile, being both the hottest by 8°C and the loudest by 3dB at load. If the PNY uses the same cooling system as the StormX, which it looks like it does from the photos, it might be a decent option. The safe choice based on PCPP US prices would be the Gigabyte Windforce OC for $170, as the good models are really overpriced currently, with the second-best option being the Strix for $20 more than the Gigabyte one.
  14. I would just use the G2 until the warranty ends. 750W is more than enough; the G2 is a good PSU, and the PSUs you mentioned would be downgrades from it.
  15. Even without sharpening filters, when playing at 4K, upscaling from 1800p and similar resolutions is a great way to get a performance boost without much image quality loss. The sharpening filters help particularly when going from lower resolutions, and Nvidia does have an alternative that is in my opinion better than Radeon Image Sharpening due to the finer control you have over it; I'm not sure about it currently, but the AMD one didn't work with some APIs when I last checked. Both AMD and Nvidia can use ReShade to get a similar experience, with the disadvantage being a slight performance loss compared to the others mentioned.
     While that is similar to what DLSS does, because in the end DLSS is upscaling the image, the idea of DLSS 2.0 is to get both better image quality and performance. The drawbacks (currently) are some rare issues that can appear due to the way DLSS does its upscaling, for example in Death Stranding, where in one specific scene DLSS caused dark trails behind some dark objects flying into the sky, which can be seen in the Digital Foundry analysis, and the need for per-game implementation.
     So as a simple way to think about it:
     DLSS (2.0) = AI upscaling to achieve both better performance and image quality; the current drawbacks are possible rare visual bugs and the need for per-game implementation.
     Normal upscale = traditional upscaling; achieves better performance while sacrificing image quality, though when going from ~1800p to 4K the quality loss is often really small. Often paired with sharpening filters (RIS, ReShade, Freestyle) to get a sharper image (the same can be done with DLSS if wished).
     Again, the reason DLSS was compared to traditional upscaling is that DLSS 1.0 was worse than a normal upscale, offering similar performance while looking worse. DLSS 2.0 actually looks as good as or better than native, with the drawbacks mentioned before; those drawbacks also applied to the older DLSS.
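As a back-of-the-envelope sketch of why ~1800p upscaling is attractive (my own numbers; it assumes GPU load scales roughly with pixel count, which is only an approximation):

```python
# Pixel-count comparison between 1800p and native 4K. The assumption that
# performance scales linearly with pixel count is a rough approximation.
def pixels(width: int, height: int) -> int:
    return width * height

native_4k = pixels(3840, 2160)  # 8,294,400 pixels
res_1800p = pixels(3200, 1800)  # 5,760,000 pixels

ratio = res_1800p / native_4k
print(f"1800p shades {ratio:.0%} of the 4K pixels")            # ~69%
print(f"naive performance estimate: ~{1 / ratio:.2f}x of 4K")  # ~1.44x
```

Shading only ~69% of the pixels is where the "performance boost without much image quality loss" comes from.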
  16. DLSS is the opposite of DSR/VSR: DLSS upscales to get better performance, while DSR and VSR downscale to get better image quality.
  17. It isn't a DLSS equivalent; AMD has no equivalent currently.
  18. Personally, even for that price I would just save $150 and get the 5700XT Pulse for $400, or a similarly priced 2060S/2070. The 2070S is faster than the 5700XT and the 2060S/2070, but it's nowhere near fast enough to warrant the $150 extra it currently costs (for a decent cooler); even the $100 extra it normally costs was already hard to justify, and with the current prices it's even harder.
  19. This is probably the cheapest 2070S with a decent cooler that is in stock: https://pcpartpicker.com/product/L3tKHx/gigabyte-geforce-rtx-2070-super-8-gb-gaming-oc-3x-video-card-gv-n207sgaming-oc-8gd Edit: This one is more expensive, but it should be similar to the Strix: https://pcpartpicker.com/product/rqsnTW/evga-geforce-rtx-2070-super-8-gb-ftw3-ultra-gaming-video-card-08g-p4-3277-kr
  20. Depends on the game, but your average game should be able to reach 60FPS as long as you're okay with medium or high settings, maybe with the resolution set to something like 1800p or lower in heavier games. AMD has no DLSS equivalent; the initial DLSS was just worse than a simple upscale plus AMD's sharpening filter. All you need to do is create a custom resolution at 1800p or whatever you want (1440p probably doesn't need it) and choose it in the game; the GPU/TV will do the upscale to fit the screen depending on your settings, and if you want the sharpening filter, just look for it in the driver settings. The TPU numbers are probably at the highest settings (without HairWorks, RTX and similar).
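To pick a custom resolution for a target frame rate, a rough model helps (my own sketch; it assumes a GPU-bound game where FPS scales inversely with pixel count):

```python
import math

# Estimate the render resolution needed to hit a target FPS, assuming a
# GPU-bound game where FPS scales inversely with pixel count (rough model).
def resolution_for_target(fps_now: float, fps_target: float,
                          width: int = 3840, height: int = 2160):
    scale = math.sqrt(fps_now / fps_target)  # scale applied to each axis
    return round(width * scale), round(height * scale)

# A game running at 45 FPS at native 4K, aiming for 60 FPS:
print(resolution_for_target(45, 60))  # ~(3326, 1871), close to 1800p
```

In practice you would round to a standard resolution like 3200x1800 and fine-tune in game.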
  21. It's kinda weird that they would add this to the post, considering it's unrelated to the topic and mostly things with no proof of being true or important; having a rant in a notice about supply issues looks extremely unprofessional in my opinion.
     The first point, while true at the high end on desktop, seems to make no difference when it comes to notebooks; the lower performance of the GPUs compared to desktop and the lower performance of Intel's mobile CPUs seem to remove any visible difference in performance. The second point, while it's a fact, has no proof of any significant impact in most cases; at least on desktops, only some extreme cases cause a visible impact on performance. The third I'm not sure about, so no comments there. The things after that (left out of the quote) are even more unrelated, being about some issues in Death Stranding when using the Ryzen 3900X, which in the linked video is said to be likely related to latency or a bug, and which likely wouldn't affect Renoir or even some desktop CPUs. Even if the point is about possible issues with Ryzen CPUs, there's no guarantee that Intel wouldn't have issues in some games for some random reason, and the point about "platform stability" seems to have zero reflection in reality, as there seem to be no issues particular to Renoir CPUs in notebooks.
     On-topic: I wouldn't be surprised if the ODMs underestimated demand, which could have made the issue bigger, but AMD having supply issues is something that will probably be a thing for quite some time.
  22. Check the resolution in the Windows settings; you probably have scaling on, but the true resolution should be correct. About the site: mine, for example, says "Your screen resolution: 1536x864 pixels", but I use a 4K TV with scaling at 250%, so that isn't the true resolution. If you check the "Note" section of the Resolution Inspector on that site, it will probably show the correct resolution, which should match the resolution shown in the Windows settings.
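The numbers line up, as a quick sketch shows (assuming the site simply reports the logical, scaled resolution):

```python
# With Windows display scaling enabled, web pages see the logical (scaled)
# resolution: the physical resolution divided by the scaling factor.
def logical_resolution(phys_w: int, phys_h: int, scale_percent: int):
    factor = scale_percent / 100
    return int(phys_w / factor), int(phys_h / factor)

# A 4K TV at 250% scaling reports exactly what the site showed:
print(logical_resolution(3840, 2160, 250))  # (1536, 864)
```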
  23. I would avoid the 1660Ti, as its performance is too close to the 1660 Super for it to be worth it in my opinion. I would either go for a decent cheap cooler on the 1660 Super to save some money compared to the Strix/Gaming X, or get some basic 5600XT/2060. Also, prices are dropping like crazy where I'm located, likely due to the imminent release of the new GPUs; both the 1660S and 2060 dropped close to 15% in price in the last week. It might be worth checking for deals in the next few days/weeks.
  24. You can try running Linux from a USB drive to rule out Windows-related issues, but first look for the diagnostic LEDs on your motherboard and see if the GPU one lights up when you turn the PC on with the GPU installed (judging by what was said, it should). You can also check the GPU for physical damage, and try other PCIe slots; that might make the debug LED easier to see, as it seems to be on the left side (the side of the external ports) of the first PCIe slot.
  25. Depends on the price, but usually none of those are worth buying because they're too expensive, often getting way too close to the next class of GPUs. If they use the same coolers as the other 16XX GPUs, the MSI one is probably the quietest.