
Briggsy

Member
  • Content Count: 4,764
  • Joined

  • Last visited

Reputation Activity

  1. Like
    Briggsy got a reaction from soldier_ph in Tech that refuses to die   
    Still have an 80GB iPod classic kicking around from 2007-ish, I believe. I'm genuinely surprised the hard drive inside still functions.
     

  2. Informative
    Briggsy got a reaction from Longshot in Radeon 6900 xt vs RTX 3080?   
    It's a curious notion, for sure.

    Back when AMD were overbuilding their hardware, it took months and sometimes years for the drivers to catch up, so the fanboys fell back on the idea that AMD performance improved over time like fine wine, instead of acknowledging that AMD couldn't get their drivers optimized for AAA games in a timely fashion.

    The "fine wine" argument is a reaction to AMD's slow driver releases: if the launch drivers weren't lacking, there wouldn't be any "fine wine" improvements later. You can't have both; they're mutually exclusive.

    The only other aspect is the amount of VRAM AMD ships compared to Nvidia. Go all the way back to GCN 1.1 with the R9 290 and there were 4GB and 8GB variants, while Nvidia was playing around with 3GB 780s and 6GB Titans. As far back as I can remember, AMD have always had more VRAM. I think VRAM size might be the only meaningful way AMD cards could age better, but at some point all the VRAM in the world isn't going to give you more performance. In my own testing, Nvidia manages VRAM usage better than AMD does, so AMD shipping more VRAM might simply compensate for less aggressive memory management.
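    For anyone who wants to sanity-check VRAM behaviour themselves, here's a rough sketch that polls usage while a game runs; the nvidia-smi query fields are standard, but the script itself is just an illustration and assumes a single GPU:

```python
import subprocess
import time

# Poll dedicated VRAM usage once per second via nvidia-smi's CSV query
# interface. memory.used / memory.total are standard query fields.
# Assumes a single GPU: multi-GPU systems print one line per card.
def poll_vram(samples: int = 60, interval: float = 1.0) -> None:
    for _ in range(samples):
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        used, total = (int(x) for x in out.strip().split(", "))
        print(f"{used} MiB / {total} MiB")
        time.sleep(interval)

if __name__ == "__main__":
    poll_vram()
```

    Run it in the background while gaming and compare the peaks between cards; that's roughly the kind of comparison I mean by "manages VRAM usage better".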
  3. Informative
    Briggsy got a reaction from dickjack in Radeon 6900 xt vs RTX 3080?   
  4. Informative
    Briggsy got a reaction from Mark Kaine in Radeon 6900 xt vs RTX 3080?   
  5. Agree
    Briggsy reacted to dickjack in Radeon 6900 xt vs RTX 3080?   
    Why do people say AMD cards age better than Nvidia's? I would assume the card with better performance would age better(?)
  6. Agree
    Briggsy reacted to MageTank in Nvidia Sold $175 Million Worth of GeForce RTX 30 GPUs To Crypto Miners   
    That is the beauty of the subjective nature of this issue. Neither side has to agree on anything, because neither side is doing anything wrong. Miners are simply consumers using the product differently from how you believe it should be used. No different from the kids who lower their Honda Civics, cut their exhausts by several inches, and pretend they're characters from Fast & Furious. Honda didn't intend for them to make a poor attempt at turning their Civics into a street racing phenomenon, but the will of the consumer was strong enough to endeavor to persevere, in spite of what most of us would consider a "stupid" and "pointless" endeavor.

    Now scalpers... that is an entirely different issue, though I am bound to make some enemies here with yet another unpopular opinion. What most of you call "scalping", I call capitalism. They are taking advantage of an opportunity. One can spend all day arguing the morality of it, but at the end of the day, scalpers would not exist if there were no market for them; otherwise they would be sitting on hardware nobody is buying. You can't blame the miners for this either, because they care so much about their ROI that they refuse to overpay for GPUs upfront, so someone else is clearly making the scalping worth it for the scalpers.

    With that said, I should probably add a disclaimer before people rush to assume that I myself am a scalper. I am not; I'm still waiting to get my hands on a 3080 for personal use that doesn't belong to my testing facility. Much like the rest of you, I have to wait, refresh websites for availability, and do the hokey pokey every time I go online. I've made several physical visits to my local Micro Center in the hope that they'd randomly have one in stock, but it has yet to happen.

    My point is, I don't take what is going on personally, nor do I cast blame at any party in this shortage. In an ideal world, the scalping issue would be addressed by both retailers (in-store purchase limits, online order limits per household / billing information) and manufacturers (producing enough product to meet demand in the first place). The mining "issue", as far as I'm concerned, isn't one. Miners, as consumers, are no less entitled to this hardware than we are as gamers or overclocking enthusiasts. That said, this perceived issue can only ever be addressed by manufacturers meeting the supply needs of all parties involved, or by the crypto bubble bursting. You know, that thing the "experts" have been claiming was going to happen for several years now?

    To those of you who will inevitably say "the more GPUs they make, the more the miners will buy": remember, there is only a finite amount of power and space available. They'll hit that wall eventually, or society will evolve to handle crypto mining in space, lol.
  7. Like
    Briggsy got a reaction from GreatnessRD in Nvidia Sold $175 Million Worth of GeForce RTX 30 GPUs To Crypto Miners   
    Nvidia are selling mining companies the GPU chips, and the mining companies have their own PCB designs, components, and drivers. Not only that, but Nvidia are selling them chips with too many defects to be used in a consumer product (but which still have working SMs), which lets Nvidia make up for poor Samsung yields.

    To be honest, I've never seen a more misinformed and entitled forum thread in my life. Outrage culture needs some valium.
  8. Like
    Briggsy got a reaction from Arika S in Nvidia Sold $175 Million Worth of GeForce RTX 30 GPUs To Crypto Miners   
  9. Agree
    Briggsy got a reaction from BiG StroOnZ in Nvidia Sold $175 Million Worth of GeForce RTX 30 GPUs To Crypto Miners   
  10. Agree
    Briggsy reacted to Imglidinhere in RX 6800 Ray Tracing Performance Leaked   
    Likely due to how irrelevant RT is in the grand scheme of things. Until it becomes a more prominent addition and less of a tech demo, I think it's completely fair to say that people shouldn't be buying these cards for that purpose. Though it's nice to see Nvidia no longer has zero competition in that field.
  11. Agree
    Briggsy reacted to Ash_Kechummm in RX 6800 Ray Tracing Performance Leaked   
    RIS is a very "dumb" algorithm (as in, it's just a sharpening filter, not an AI trained on a supercomputer like Nvidia's), but as you said:

    If the end result looks the same to human eyes, the approach with the least effort wins, dumb or not. The only problem is that since Nvidia's approach involves AI, a lot of people who otherwise can't see a difference with relatively modest upscaling (like 1440p to 4K, not 720p to 4K) DO see a difference, thanks to the placebo effect ("work done by an AI must be better, right?").

    All in all, I like both approaches, with Radeon's being remarkably simple compared to Nvidia's OP solution, and both able to achieve modest upscaling with minimal difference at best (in both image quality and performance).
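    To make "just a sharpening filter" concrete, here's a minimal sketch of the classic 3x3 sharpen kernel (my own illustration; AMD's actual RIS/CAS shader is adaptive and more involved):

```python
import numpy as np

# Classic 3x3 sharpening kernel: boost the centre pixel and subtract
# the four neighbours. This is the family of "dumb" filters RIS belongs
# to; no training data involved, just local contrast enhancement.
KERNEL = np.array([[ 0, -1,  0],
                   [-1,  5, -1],
                   [ 0, -1,  0]], dtype=np.float32)

def sharpen(img: np.ndarray) -> np.ndarray:
    """Apply the kernel to a 2D grayscale image (float32, values in [0, 1])."""
    h, w = img.shape
    out = img.copy()  # border pixels are left untouched for simplicity
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            region = img[y - 1:y + 2, x - 1:x + 2]
            out[y, x] = np.clip((region * KERNEL).sum(), 0.0, 1.0)
    return out
```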

    EDIT: Also, is it just me, or does Wendel look like Gabe Newell?
  12. Like
    Briggsy got a reaction from Haro in Post your Cinebench R20+15+R11.5+2003 Scores **Read The OP PLZ**   
    3900XT, 1.325 V in BIOS (LLC 3)
    Tomahawk B550
    Noctua NH-D15 w/ fans @ 100%
    CCD1 @ 4.6GHz
    CCD2 @ 4.475GHz
     
    Overclock is stable through various stress testing and regular use (knock on wood).
  13. Agree
    Briggsy got a reaction from leadeater in RX 6800 Ray Tracing Performance Leaked   
    Unless someone hands them a big sack of cash to make it happen 💰
  14. Agree
    Briggsy got a reaction from Parideboy in RX 6800 Ray Tracing Performance Leaked   
    The problem I see with DLSS right now is that we don't know whether it's going to be a bait and switch.

    Most common scaling algorithms (e.g. bicubic, bilinear) can be thought of as dumb algorithms, figuratively speaking. DLSS in its current form has to be trained through machine learning on high-resolution reference images supplied to Nvidia by the game developer. It's theoretically possible that an AI could eventually be trained on enough reference images to upscale all games without per-game training and still provide better fidelity than the dumb upscaling algorithms, but I would argue it will never be as effective as a smart upscaler trained for a specific game the way Nvidia are doing it now, and there aren't that many games with DLSS to begin with. I would also argue that the compute budget a general-purpose AI upscaler needs might be better spent by removing the tensor cores and adding more float/integer units instead.

    I'm not saying DLSS is snake oil, merely that unless Nvidia train the upscaler for a specific game, it's never going to be that much better than dumb upscaling, and even if it is, the horsepower required and the extra die space used make it a wash. I hope to be proven wrong and that Nvidia will eventually release a ubiquitous form of DLSS that beats a dumb algorithm without sacrificing die space on the GPU.
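    For contrast, "dumb" upscaling really is this simple. A minimal bilinear sketch (my own illustration, assuming a grayscale float image; real GPUs do this in fixed-function hardware):

```python
import numpy as np

# Bilinear upscaling: each output pixel is a weighted average of the
# four nearest input pixels. No training and no game-specific data.
def upscale_bilinear(img: np.ndarray, scale: int) -> np.ndarray:
    h, w = img.shape
    out_h, out_w = h * scale, w * scale
    # Map each output coordinate back into the source image.
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]  # vertical blend weights
    wx = (xs - x0)[None, :]  # horizontal blend weights
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy
```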
     
    For now, the extra $80 gets you 16GB of VRAM vs 8GB, and better rasterization performance. Nvidia manage VRAM usage better than AMD in their drivers, but that only gets you so far.
  15. Agree
    Briggsy reacted to aliasdred in RX 6800 Ray Tracing Performance Leaked   
    But the 6800 has more VRAM though... even if it just breaks even on performance, I can justify the extra $80 for 8GB more VRAM.
  16. Agree
    Briggsy got a reaction from wall03 in RTX 3080 Ti Teased   
    3080 is GA102-200
    3090 is GA102-300
     
    It wouldn't be hard to guess that a theoretical in-between chip would be called GA102-250.
     
    It could be an OEM SKU, or a mobile SKU for all we know at this point.
  17. Informative
    Briggsy got a reaction from BTGbullseye in AMD RX 6000 Ray tracing   
    In the footnotes of AMD's recent presentation, they mention that with the hardware Ray Accelerators in use they measured 13.8x the FPS (i.e. 1380% of the baseline frame rate) of software DXR alone.

    I can't recall who, but someone on YouTube extrapolated that this would put the 6800 XT on par with the 3070 in ray tracing capability. If it's true that some AIB 6800 XT cards boost past 2.5GHz, then maybe a little faster in a best-case scenario.

    The performance hit, iirc (again, I can't remember the source), was said to be similar to Turing's.
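    Worth spelling out how the multiplier and the percentage relate, since "13.8x" and a "1380% increase" are not the same thing (the baseline number below is made up purely for illustration):

```python
# 13.8x means 1380% OF the baseline frame rate, i.e. a 1280% *increase*.
software_dxr_fps = 5.0                    # illustrative baseline, not AMD's figure
hardware_fps = 13.8 * software_dxr_fps
print(hardware_fps)                       # 69.0 fps
print((13.8 - 1.0) * 100)                 # 1280.0 -> percent increase over baseline
```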
  18. Informative
    Briggsy got a reaction from Stahlmann in AMD RX 6000 Ray tracing   
  19. Funny
    Briggsy got a reaction from Mark Kaine in GPU consumes much more power on desktop after UV'ing   
    What are you even talking about? Why respond if you're just going to BS for no reason?
  20. Agree
    Briggsy got a reaction from RTX 3069 in GPU consumes much more power on desktop after UV'ing   
    I'm curious about this as well. In Afterburner, even if you save the profile, it won't load properly later on. In my case I lock my 2080 to 1.000 V and 1950MHz using the curve editor and save that to a profile. When I'm done gaming I restore default settings, but when I load the profile again later, the curve editor doesn't revert to what I had locked in; it loads whatever curve it feels like instead.

    I mean, it only takes me 10 seconds to redo the voltage lock every time I game, but it would be nice to have an implementation that doesn't ruin the custom curve you saved. AMD have this built into their in-game overlay, automatically loading a custom voltage curve on a per-game basis; it's little things like this that I wish Nvidia would add to their suite.
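    As a stopgap, on recent drivers the core clock (though not the voltage curve) can also be pinned from the command line with nvidia-smi's clock-lock flags. A rough sketch, assuming admin rights and a Turing-or-newer card:

```python
import subprocess

# Pin the GPU core clock to 1950MHz. This locks frequency only; it does
# NOT apply the undervolt the way the Afterburner curve editor does.
subprocess.run(["nvidia-smi", "--lock-gpu-clocks=1950,1950"], check=True)

# ... game session ...

# Restore the default clock behaviour afterwards.
subprocess.run(["nvidia-smi", "--reset-gpu-clocks"], check=True)
```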
  21. Agree
    Briggsy reacted to SpiderMan in Is this normal for 10900k?   
    Have you messed around with the fan curves of the CPU cooler and intake fans to ramp RPM as temperature rises? For example, I have my Noctua NH-D15 and the case fans go to 100% at 70°C, and during Cinebench runs they reach about 75°C and nothing higher, unless I use a crappier thermal paste.
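    For anyone unfamiliar, a fan curve is just a piecewise-linear map from temperature to fan duty. A minimal sketch of the idea (the breakpoints below are made up for illustration, not anyone's actual settings):

```python
# Piecewise-linear fan curve: (temperature °C, fan duty %) breakpoints.
CURVE = [(30, 20), (50, 40), (70, 100)]

def fan_duty(temp_c: float) -> float:
    """Interpolate fan duty (%) for a given temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # unreachable; keeps the return type total

print(fan_duty(60))  # -> 70.0, halfway between the 40% and 100% points
```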
  22. Like
    Briggsy got a reaction from Hymenopus_Coronatus in AMD RX 6000 Ray tracing   
  23. Agree
    Briggsy reacted to Stahlmann in AMD RX 6000 Ray tracing   
    The RX 6000 series also has dedicated ray tracing accelerators: one per CU, to be exact.
  24. Agree
    Briggsy got a reaction from Tostr4 in Industrial espionage   
    As the old philosophical question goes: Why is a soap bubble round?
     
    With the exception of the past few years, AMD and Nvidia have always been in lock-step with each other on performance. At the end of the day, both companies are working toward the same goal, moving electrons from point A to point B, and unless one of them finds a way to circumvent classical and/or quantum physics, they're both bound by the same limits. Most of the components on graphics cards are the same: the memory chips, the power delivery, the silicon, the fabrication processes are the same or similar. Both companies face the same hardware constraints, so in theory they should progress at the same pace every year or two. Their engineers may know each other, have gone to the same schools, and learned the same methods and concepts. Over the last few years Radeon went on vacation while AMD focused resources on the CPU side, so people new to computer hardware might be surprised that AMD are back in lock-step with Nvidia, but it shouldn't be surprising.
     
    One big advantage Nvidia have over AMD is intangible features like PhysX, NVENC, and RTX Voice. The vast majority of graphics card buyers don't need, or don't even know about, many of the value-added features AMD and Nvidia offer. Nvidia market their features better and push them into the industry more aggressively, as well as keeping their implementations more proprietary, and these intangible features tick boxes for marketing when it comes time to differentiate from the competition. If you can see past all the marketing BS, it's just another Android vs. Apple, Xbox vs. PlayStation kind of debate, where the software and branding sell the hardware.
  25. Funny
    Briggsy reacted to GoodEnough in You'd be a fool to buy an RTX 3070   
    You don't know the definition of running like a turd until you've tried running Continuum RT on an RX 580, lmao.