Briggsy

Member
  • Content Count

    4,758
  • Joined

  • Last visited

Awards

7 Followers

About Briggsy

  • Title
    I'm Just Here for the Coffee. Help is on the way.
  • Birthday January 1

Profile Information

  • Location
    Ontario, Canada
  • Gender
    Male

System

  • CPU
    2700x
  • Motherboard
    Asus Prime x370
  • RAM
    32GB DDR4 3.2 Jizzlehertz
  • GPU
    EVGA RTX 2080
  • Case
    CM Scout 2
  • Storage
    Several TBs of SSD space and a 12TB HDD RAID
  • PSU
    1000i
  • Display(s)
    4K 50'' AZIO Gaming TV
  • Cooling
    A dozen fans
  • Keyboard
    Logitech G something wireless
  • Mouse
    Logitech G604
  • Sound
    Wireless
  • Operating System
    Windows 10

Recent Profile Visitors

3,449 profile views
  1. Nvidia are selling mining companies the GPU chips, and the mining companies have their own PCB design, components and drivers. Not only that, but Nvidia are selling them chips that have too many defects to be used in a consumer product (but still have working SMs), giving Nvidia a way to make up for poor Samsung yields. To be honest, I've never seen a more misinformed and entitled forum thread in my life. Outrage culture needs some Valium.
  2. You ran the OC Scanner in Afterburner? That's the first thing anyone with an Ampere card should do before thinking they got a dud.
  3. Nvidia aren't selling cards to miners, only the wafers. Mining companies have their own drivers, PCB layout, etc. To think AMD are not selling wafers to mining companies as well is ludicrous. Mining companies don't want the consumer cards; the reference PCB design is likely too inefficient for mining.
  4. From what I've read and heard, Nvidia aren't selling cards to miners, only the GPU chips. The mining companies have their own PCB design, components and in-house drivers. In theory, Samsung might be able to produce more wafers than AIBs need, so the excess being sold to miners doesn't impact how many cards get manufactured and sold to gamers. For all we know, the exact same thing was happening with Pascal and Turing, but the demand from gamers wasn't nearly as high as it is now, hence the outrage.
  5. Totally agreed. If there's any silver lining, the rumor does suggest that it's only the chips being sold to mining companies and not the assembled graphics cards, with the mining companies having their own PCB design, components and in-house drivers. It may be that Samsung and TSMC can produce the GPU wafers faster than AIBs can put cards together, allowing Nvidia and AMD to saturate the production pipeline better. But now I'm just speculating on a rumor, and the demand for graphics cards is so high right now that I can see this being AMD and Nvidia biting off more than they can chew.
  6. Rumor is that both AMD and Nvidia are shipping large quantities of chips to crypto mining companies, so there's that.
  7. From a bird's-eye view it's very easy to see where the problems are in the supply chain, but there isn't a single entity that's in control of everything. From the manufacturers of components to the retailers and everywhere in between, everyone involved is only responsible for a tiny slice of the whole pie, even Sony. I personally think retailers hold the lion's share of the blame, but it would be up to Sony to provide some kind of incentive for retailers to combat scalpers, because retailers are made up of individuals who are only interested in their own sales numbers, or how well they are moving
  8. Probably because AMD have had something for a couple of years now that works just fine without any overhead, and without the need for die-space-hogging tensor cores and AI training. If my 2080 had anything like Radeon Image Sharpening, I'd play everything at 1440p and upscale to 4K for the added performance. For the RX 6000 series, AMD users will use Radeon Image Sharpening to get more performance with ray tracing, without the artifacting that exists with DLSS. My biggest gripe with tech forums is that a lot of comments contain misinformed opinions. Even techtubers lack the knowledge you'd
  9. For moderate upscaling it's totally doable; see the video below starting at around the 15:20 mark. It's a video from a couple of years ago where Wendell and crew talk about Radeon Image Sharpening, which basically allows you to game at 1440p and upscale to 4K with almost no discernible difference (a rough pixel-count sketch of that upscale is after this list). IIRC it's the same tech the game consoles have been using for years. Right now Nvidia have nothing like this that works across all games, but I assume a more ubiquitous version of DLSS would do something similar to what AMD has. It's not going to do what the true Deep Learning DLSS can
  10. Unless someone hands them a big sack of cash to make it happen
  11. The problem I see with DLSS right now is that we don't know whether DLSS is going to be a bait-and-switch or not. Most scaling algorithms in use (e.g. bicubic, bilinear, etc.) can be thought of as dumb algorithms (figuratively speaking); a minimal sketch of one is after this list. DLSS in its current form has to be trained through machine learning using high-resolution reference images supplied by the game developer to Nvidia. It's theoretically possible that at some point an AI can be trained on enough reference images to upscale all games without per-game training and provide better fidelity than dumb upscaling algorithms,
  12. Not bad considering we're looking at the 6800, which has 60 CUs vs the 80 CUs of the 6900 XT and 72 CUs of the 6800 XT, and these cards have 1 Ray Accelerator per CU (the quick ratio math is sketched after this list).
  13. I've recently tested this myself as part of deciding whether to go Ampere or RDNA2 for my next daily driver for gaming, so I couldn't care less what someone else thinks vs. my own first-hand experience. First, AMD have Radeon Image Sharpening, which makes what you see on screen look better, like having SweetFX or ReShade with one click. I wouldn't confuse it with Nvidia's sharpening feature in the control panel. The other thing I noticed right away was the color quality. Nvidia colors seem washed out in comparison, but you wouldn't notice unless you did a side-by-side.
  14. What are you even talking about? Why respond if you're going to BS for no reason?
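
A quick back-of-envelope look at the 1440p-to-4K upscale mentioned in post 9 above. This is just standard resolution arithmetic in plain Python; nothing here is specific to Radeon Image Sharpening itself:

```python
# Rough pixel-count math behind a 1440p -> 4K upscale (illustrative only).
def pixels(width, height):
    """Total pixel count for a resolution."""
    return width * height

qhd = pixels(2560, 1440)   # 1440p (QHD)
uhd = pixels(3840, 2160)   # 4K (UHD)

print(f"1440p: {qhd:,} pixels")            # 3,686,400
print(f"4K:    {uhd:,} pixels")            # 8,294,400
print(f"Scale factor: {uhd / qhd:.2f}x")   # 2.25x overall, i.e. 1.5x per axis
```

In other words, the upscale only has to cover a 1.5x stretch per axis, which is part of why the sharpened result in the video stays hard to tell apart from native 4K.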
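To make the "dumb algorithm" point in post 11 concrete, here is a minimal sketch of a bilinear upscaler in NumPy. It is illustrative only: it blends the four nearest source pixels with no learned model or reference images, which is exactly the contrast the post draws with DLSS. The function name and the tiny test image are made up for the example.

```python
import numpy as np

def bilinear_upscale(img, scale):
    """Upscale a 2D grayscale image by `scale` using bilinear interpolation."""
    h, w = img.shape
    new_h, new_w = int(h * scale), int(w * scale)
    # Map each output pixel back to a fractional source coordinate.
    ys = np.linspace(0, h - 1, new_h)
    xs = np.linspace(0, w - 1, new_w)
    y0 = np.floor(ys).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]   # vertical blend weights
    wx = (xs - x0)[None, :]   # horizontal blend weights
    # Weighted blend of the four nearest source pixels -- no outside knowledge,
    # which is what makes this a "dumb" upscaler compared to a trained one.
    top    = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bottom = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bottom * wy

src = np.arange(16, dtype=float).reshape(4, 4)   # tiny 4x4 stand-in "image"
print(bilinear_upscale(src, 1.5).shape)          # (6, 6)
```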
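And the quick ratio math behind the CU comparison in post 12, assuming one Ray Accelerator per CU on RDNA2 as stated there (the card names and CU counts come from the post; the rest is plain arithmetic):

```python
# RDNA2 compute unit counts from the post above; 1 Ray Accelerator per CU.
compute_units = {"RX 6800": 60, "RX 6800 XT": 72, "RX 6900 XT": 80}

top = compute_units["RX 6900 XT"]
for card, cus in compute_units.items():
    print(f"{card}: {cus} CUs / {cus} Ray Accelerators "
          f"({cus / top:.0%} of the 6900 XT)")
# RX 6800:    60 CUs / 60 Ray Accelerators (75% of the 6900 XT)
# RX 6800 XT: 72 CUs / 72 Ray Accelerators (90% of the 6900 XT)
# RX 6900 XT: 80 CUs / 80 Ray Accelerators (100% of the 6900 XT)
```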