
xFluing

Member
  • Content count: 234
  • Joined

  • Last visited

Awards


This user doesn't have any awards

About xFluing

  • Title
    Member
  1. "Multi-Core" graphics cards

    Yep, not at all how it works.
  2. "Multi-Core" graphics cards

    Cost is lower overall; otherwise Ryzen prices would have been through the roof.
  3. "Multi-Core" graphics cards

    Inflation has nothing to do with the costs I'm talking about. Wafers cost a lot, so fitting as many dies onto one as possible greatly improves the yield and in turn reduces the per-die cost. The RTX 2080's price doesn't follow any inflation curve either, so still no inflation.
  4. "Multi-Core" graphics cards

    But they work through CrossFire/SLI, which has its faults and needs to be supported by the game devs; what I mean is something that circumvents that.
  5. "Multi-Core" graphics cards

    Well, Intel got the idea from AMD's CCXes, but it's different enough: they're looking to make the cores on 7nm and the rest, less important stuff like the memory controller, on 14nm or 28nm. I have to say that's pretty smart, because those two processes are already mature enough for high yields and low prices, and because the cores will be made individually it allows for much smaller dies, once again reducing the cost.
  6. "Multi-Core" graphics cards

    The 295X2 also had two GPUs, but they were running in CrossFire. I'm talking about something that would have, say, 4 GPU modules, each of which would act like a CPU core, all working together as one big GPU unit made of many "cores", removing the issues of CrossFire/SLI.
  7. "Multi-Core" graphics cards

    Nah, I'm talking about some kind of tech that's not shit, something that would make the GPU modules work as one singular GPU unit instead of taking turns rendering frames or each one rendering half a frame.
  8. "Multi-Core" graphics cards

    So, now that we're getting closer and closer to the end of manufacturing process shrinks, how will GPUs evolve from here on out? The only solution with current technology would be bigger dies, sure, but that just drives the cost up, because the bigger the die, the fewer of them fit on a wafer (rough dies-per-wafer sketch at the bottom of this page). Unless we somehow find a way for multiple GPUs to work together like Ryzen CCXes, I'd say GPU speed will just hit a plateau and stay there until GPUs become powered by quantum processors or some shit.
  9. EQ & Recording

    No matter what I use to record video (OBS, ShadowPlay, ReLive), it always seems to pick up the equalized audio as well. Is there a way to capture the raw audio instead?
  10. I have this Athlon X4 860K system on an MSI A68HM-E33 (V1) motherboard. For the longest time I ran a dual-channel 4GB setup with some cheap 2GB modules (running at 1.5V, pretty standard). Fast forward to a few months ago, when a friend gave me two 4GB modules (8GB total), a Corsair XMS3 set rated at 1.6V. After putting it in I started having issues booting, even with XMP on (which made the RAM run at the rated 1.6V), and I then realized I couldn't run dual channel anymore, not even with the old 4GB kit. The Corsair kit would just refuse to POST (and on the rare occasion it did POST it only showed 4GB, though the board explorer still saw both slots as populated), while the old kit would make the monitor show artifacts that went on for about 3 rows and then cut off in the middle of the next one. So now I'm stuck with a single 4GB stick in single channel. Is there a way I can fix this? Did I burn out my second DIMM slot with this RAM?
  11. Show off your old and retro computer parts

    This old Quadro FX 3450 (equivalent to a 6800 GT, I think; it's the same form factor, so I assume it is). I used to game on this until 2015, so yeah, you can tell it was a nightmare, with games like Path of Exile being one of my favourites back then; even Half-Life 2 would have issues with it. I couldn't use it in a retro system even if I wanted to, sadly, because it suddenly started artifacting by itself after 2 years in storage. Full-sized pics in the spoiler below.
  12. That's not how graphs work, Linus

    That's the exact problem though: BECAUSE Linus is more entertaining to watch, ofc he's gonna have a much bigger audience than GN, and he should ESPECIALLY pay attention to stuff like this to avoid confusion.
  13. That's not how graphs work, Linus

    Then why make the takeaway point look like some rigged graph? Again, either have the bars at different lengths with the non-rounded numbers, or with no numbers at all, or have them at the same length if the numbers are rounded (toy example at the bottom of this page). It's not about the point they're making, it's about the implications of such a small error; this is how misinformation gets spread, through small errors everyone copies.
  14. That's not how graphs work, Linus

    Thought it might be, but why not just give the non-rounded numbers? Otherwise it can cause uproars, like this one already has. This still doesn't absolve them of the goof: either round the numbers and make both bars the same length, or DON'T round the numbers and keep the bars at different lengths.
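
A rough back-of-the-envelope sketch of the die-size and cost argument from the "Multi-Core" graphics cards posts above. All the numbers here (wafer price, die areas, defect density) are made-up examples, and the formulas are just the standard textbook approximations for dies per wafer and Poisson yield, not real foundry data.

import math

# All numbers below are illustrative assumptions, not real foundry figures.
WAFER_DIAMETER_MM = 300      # standard wafer size
WAFER_COST = 10_000          # assumed price per wafer, in dollars
DEFECT_DENSITY = 0.1         # assumed defects per cm^2

def dies_per_wafer(die_area_mm2):
    # Common approximation: gross dies on a round wafer minus edge loss.
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2):
    # Simple Poisson yield model: a bigger die catches more defects.
    return math.exp(-DEFECT_DENSITY * die_area_mm2 / 100)

def cost_per_good_die(die_area_mm2):
    good = dies_per_wafer(die_area_mm2) * poisson_yield(die_area_mm2)
    return WAFER_COST / good

# One big monolithic die vs. a chiplet-sized die a quarter of the area.
for area in (600, 150):
    print(f"{area} mm^2: {dies_per_wafer(area)} dies/wafer, "
          f"yield {poisson_yield(area):.0%}, "
          f"${cost_per_good_die(area):.2f} per good die")

With these made-up numbers, the 600 mm^2 die comes out around $200 per good die while the 150 mm^2 die comes out under $30, so even four of the small ones are cheaper than one big one, which is essentially the CCX/chiplet argument in those posts.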
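
And a toy example for the graph posts ("That's not how graphs work, Linus"). The scores are made up; the only point is that the printed numbers and the bar lengths should come from the same values, either both unrounded or both rounded.

# Made-up scores; the point is that labels and bar lengths stay consistent.
scores = {"Card A": 59.6, "Card B": 60.4}
MAX_BAR = 40  # width of the longest bar, in characters

def draw(values, labels):
    longest = max(values.values())
    for name, value in values.items():
        width = round(MAX_BAR * value / longest)
        print(f"{name:8} {labels[name]:>5} |{'#' * width}")

# Option 1: keep the exact numbers and scale the bars from them.
draw(scores, {k: f"{v:.1f}" for k, v in scores.items()})
print()
# Option 2: round the numbers and scale the bars from the *rounded* values,
# so two scores that round to the same number also get identical bars.
rounded = {k: round(v) for k, v in scores.items()}
draw(rounded, {k: str(v) for k, v in rounded.items()})

Option 1 prints 59.6 and 60.4 over slightly different bars; option 2 prints 60 for both over identical bars. The mixed case, the same label over visibly different bars, is the one that looks rigged.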