Briggsy

Member
  • Posts

    4,764
  • Joined

  • Last visited

Awards

7 Followers

About Briggsy

  • Birthday January 1

Profile Information

  • Gender
    Male
  • Location
    Ontario, Canada
  • Member title
    I'm Just Here for the Coffee. Help is on the way.

System

  • CPU
    2700x
  • Motherboard
    Asus Prime x370
  • RAM
    32GB DDR4 3.2 Jizzlehertz
  • GPU
    EVGA RTX 2080
  • Case
    CM Scout 2
  • Storage
Several TBs of SSD space and a 12TB HDD RAID
  • PSU
    1000i
  • Display(s)
    4K 50'' AZIO Gaming TV
  • Cooling
A dozen fans
  • Keyboard
    Logitech G something wireless
  • Mouse
    Logitech G604
  • Sound
    Wireless
  • Operating System
    Windows 10

Recent Profile Visitors

3,916 profile views
  1. 1nm+++++++++++++. Joking aside, they might just move the goalposts and advertise transistor density in a different way. Transistors per mm² seems like a good metric for Intel to market, since their 10nm density isn't far off from TSMC's 7nm (rough published figures in the sketch below the post list). At this point the whole nm designation is pure marketing bullsnot.
  2. It's a curious notion for sure. Back when AMD were overbuilding their hardware, it took months and sometimes years for the drivers to catch up, so the fanboys fell back on the idea that AMD performance improved over time like fine wine, instead of acknowledging that AMD couldn't optimize their drivers for AAA games in a timely fashion. The fine wine argument only exists because AMD's launch drivers left performance on the table; if the drivers had been solid from day one, there would be no "fine wine" gains later. You can't have it both ways. The only other aspect is the amount of VRAM AMD ships compared to Nvidia. Go all the way back to GCN 1.1 with the R9 290 and there were 4GB and 8GB variants, while Nvidia was playing around with 3GB 780s and 6GB Titans. As far back as I can remember, AMD have always had more VRAM. I think VRAM size might be the only meaningful way AMD cards could age better, but at some point all the VRAM in the world isn't going to give you more performance, and in my own testing Nvidia manages VRAM usage better than AMD does, so AMD's larger VRAM pools might simply be compensating for less aggressive memory management.
  3. While I like rooting for the underdog, AMD is hardly the underdog anymore. They own the console market and they've got their boot on Intel's throat. Big Navi is AMD's Zen moment for sure, but I owned a Zen 1 processor at launch and it was hot garbage, so that's not really a good selling point for Team Red imo. Based on reviews and benchmarks, Ampere is the safer option this time around. Plus, the 6900 XT is going to be as rare as hen's teeth for a long time. If you're concerned about power draw, Ampere undervolts extremely well.
  4. It wouldn't hurt to try setting your RAM to default anyway and see if that's the problem.
  5. Don't return the card before checking your RAM. I was having problems with my 3090 at first, the exact same thing: I'd be fine gaming, but as soon as I started watching a YouTube video at the same time, the game would crash, with occasional bluescreens. It turned out my system memory was being heated by the 3090 and becoming just unstable enough that YouTube plus a game would push it over the edge. I have excellent airflow in my case; the graphics card usually sits in the high 60s or low 70s under load and the fans are completely silent. From what I can tell it's not the air temperature in the case, but heat being conducted through the motherboard across to the RAM. I've since set my RAM to tighter timings, lower clocks and lower voltage, and everything is 100% stable now. Set your memory in the BIOS to default/auto and see if the problem still happens.
  6. Nvidia are selling mining companies the GPU chips, and the mining companies have their own PCB designs, components and drivers. Not only that, Nvidia are selling them the chips that have too many defects to be used in a consumer product (but still have working SMs), which lets Nvidia make up for poor Samsung yields. To be honest, I've never seen a more misinformed and entitled forum thread in my life. Outrage culture needs some valium.
  7. You ran the OC Scanner in Afterburner? That's the first thing anyone with an Ampere card should do before assuming they got a dud.
  8. Nvidia aren't selling cards to miners, only the chips. Mining companies have their own drivers, PCB layouts, etc. To think AMD aren't selling chips to mining companies as well is ludicrous. Mining companies don't want the consumer cards; the reference PCB design is likely too inefficient for mining.
  9. From what I've read and heard, Nvidia aren't selling cards to miners, only the GPU chips. The mining companies have their own PCB designs, components and in-house drivers. In theory Samsung might be able to produce more chips than AIBs need, so the excess being sold to miners doesn't affect how many cards get manufactured and sold to gamers. For all we know the exact same thing was happening with Pascal and Turing, but demand from gamers wasn't nearly as high as it is now, hence the outrage.
  10. Totally agreed. If there's any silver lining, the rumor does suggest that it's only the chips being sold to mining companies, not assembled graphics cards, with the mining companies using their own PCB designs, components and in-house drivers. It may be that Samsung and TSMC can produce the GPU dies faster than AIBs can put cards together, which would let Nvidia and AMD keep the production pipeline saturated. But now I'm just speculating on a rumor, and demand for graphics cards is so high right now that I can see this being AMD and Nvidia biting off more than they can chew. And then there's Intel with their Xe graphics chips; I wonder where they sit with the large mining farms.
  11. Rumor is that both AMD and Nvidia are shipping large quantities of chips to crypto mining companies, so there's that.
  12. From a bird's-eye view it's very easy to see where the problems are in the supply chain, but there isn't a single entity in control of everything. From the component manufacturers to the retailers and everyone in between, everyone involved is only responsible for a tiny slice of the whole pie, even Sony. I personally think retailers hold the lion's share of the blame, but it would be up to Sony to give retailers some kind of incentive to combat scalpers, because retailers are made up of individuals who only care about their own sales numbers, i.e. how well they're moving product. A product manager who tries to be conscientious about getting PS5s to real customers is risking their own job, so they won't do anything unless their boss tells them to, and their boss doesn't want to start deciding who is and isn't a legitimate customer; that's bad for business. My point is that nobody in the supply chain has any incentive to change how they do things, and they even risk reprimand from their superiors if they try to fight scalpers. AMD, Nvidia, Intel, Sony, Microsoft, Nintendo, etc. will continue to pay lip service to consumers and feign interest in fighting scalpers, but it won't change anything. As long as there's someone willing to pay scalper prices, there will be scalpers.
  13. Probably because AMD have had something for a couple of years now that works just fine without any overhead, and without needing die-space-hogging tensor cores and AI training. If my 2080 had anything like Radeon Image Sharpening, I'd play everything at 1440p and upscale to 4K for the added performance. For the RX 6000 series, AMD users will use Radeon Image Sharpening to claw back performance with raytracing, without the artifacting that exists with DLSS. My biggest gripe with tech forums is that a lot of comments are misinformed opinions. Even techtubers lack the knowledge you'd think they'd have working in this space, which only creates an echo chamber of misinformation. I know my own knowledge base is limited, but the number of times I've seen people asking what AMD's answer to DLSS is makes me wonder where people get their information. How about Nvidia Reflex? AMD have had their own Anti-Lag for over a year now, and it works insanely well. DLSS? AMD already have solid upscaling that works with every game and requires no training. Fast Sync? Ditto.
  14. For moderate upscaling it's totally doable; see the video below starting at around the 15:20 mark. It's a video from a couple of years ago where Wendel and crew talk about Radeon Image Sharpening, which basically lets you game at 1440p and upscale to 4K with almost no discernible difference. IIRC it's the same tech the game consoles have been using for years. Right now Nvidia have nothing like this that works across all games, but I assume a more ubiquitous version of DLSS would do something similar to what AMD has. It's not going to do what true deep-learning DLSS can do with its 240p-upscaled-to-1080p kind of sorcery, but 1440p to 4K with minimal differences is pretty good considering it's slightly more than a 2x upscale in pixel count (quick math in the sketch below the post list). If that's all DLSS ends up being for most games, I'd be fine with it.
  15. Unless someone hands them a big sack of cash to make it happen
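
On the transistors-per-mm² point from post 1, here is a minimal back-of-the-envelope sketch. The density values are approximate, publicly cited peak figures that vary by cell library and by who is reporting them, not official spec-sheet numbers:

```python
# Rough comparison of publicly cited peak logic densities (million transistors per mm^2).
# Figures are approximate estimates and vary by cell library and source.
densities_mtr_per_mm2 = {
    "Intel 14nm": 37.5,
    "Intel 10nm": 100.8,
    "TSMC N7 (7nm)": 91.2,
}

baseline = densities_mtr_per_mm2["TSMC N7 (7nm)"]
for node, density in densities_mtr_per_mm2.items():
    print(f"{node:>14}: {density:6.1f} MTr/mm^2 ({density / baseline:.2f}x TSMC N7)")
```

By these rough numbers Intel's 10nm lands in the same ballpark as TSMC's N7, which is exactly why the node names alone don't tell you much.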
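
And for the 1440p-to-4K upscaling discussed in posts 13 and 14, the "slightly more than 2x" figure is plain pixel arithmetic; a minimal sketch, with no vendor API assumed:

```python
# Pixel-count ratios for common render -> output resolution pairs.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

def pixel_count(name: str) -> int:
    width, height = RESOLUTIONS[name]
    return width * height

for render, output in [("1440p", "4K"), ("1080p", "4K"), ("1080p", "1440p")]:
    ratio = pixel_count(output) / pixel_count(render)
    print(f"{render} -> {output}: {ratio:.2f}x the pixels, "
          f"so only ~{100 / ratio:.0f}% of the output pixels are actually rendered")
```

1440p to 4K works out to 2.25x, so the GPU only shades roughly 44% of the output pixels and the upscaler/sharpener fills in the rest, which is where the performance headroom comes from.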