Everything posted by xFluing

  1. Inflation has nothing to do with the costs I'm talking about. Wafers cost a lot, so fitting as many dies onto one as possible greatly increases the effective yield and in turn reduces the per-die cost. The RTX 2080's price doesn't follow any inflation curve either, so no, it's still not inflation.
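    Rough back-of-the-envelope sketch of that cost argument in Python; the wafer price, defect density, and die areas below are made-up illustrative numbers, and the yield model is a simple Poisson zero-defect approximation, not real foundry data:

      import math

      WAFER_DIAMETER_MM = 300.0
      WAFER_COST = 9000.0   # assumed price per wafer, in dollars
      DEFECT_DENSITY = 0.1  # assumed defects per cm^2

      def dies_per_wafer(die_area_mm2):
          """Classic approximation: gross dies minus edge losses."""
          d = WAFER_DIAMETER_MM
          return int(math.pi * (d / 2) ** 2 / die_area_mm2
                     - math.pi * d / math.sqrt(2 * die_area_mm2))

      def poisson_yield(die_area_mm2):
          """Fraction of dies that catch zero defects."""
          return math.exp(-DEFECT_DENSITY * die_area_mm2 / 100.0)

      for area in (100, 200, 400, 600):
          good = dies_per_wafer(area) * poisson_yield(area)
          print(f"{area:>3} mm^2 die: {dies_per_wafer(area):>3}/wafer, "
                f"yield {poisson_yield(area):5.1%}, "
                f"${WAFER_COST / good:,.0f} per good die")

    Under these assumptions, going from 100 mm^2 to 600 mm^2 makes each good die roughly an order of magnitude more expensive: die area hurts twice, through fewer candidates per wafer and through lower yield.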
  2. But those work through CrossFire/SLI, which has its faults and needs to be supported by the game devs. What I mean is something that circumvents that.
  3. Well, Intel got the idea from AMD's CCXs, but it's different enough: they're looking to make the cores on 7nm and the rest, less important stuff like the memory controller, on 14nm or 28nm. I have to say that's pretty smart, because those two processes are already mature enough for high yields and low prices, and because the cores will be made individually, it'll allow for much smaller dies, once again reducing the cost.
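    Same toy Poisson model as above, applied to the chiplet idea; the 600 mm^2 and 150 mm^2 areas and the defect density are assumptions for illustration only:

      import math

      D0 = 0.1  # assumed defects per cm^2

      def zero_defect_yield(area_mm2):
          return math.exp(-D0 * area_mm2 / 100.0)

      big = zero_defect_yield(600)    # one monolithic 600 mm^2 die
      small = zero_defect_yield(150)  # one 150 mm^2 chiplet

      print(f"monolithic 600 mm^2 yield: {big:.1%}")    # ~54.9%
      print(f"chiplet    150 mm^2 yield: {small:.1%}")  # ~86.1%
      # A defect scraps only the small die it lands on. Good chiplets
      # are binned from the whole wafer and any four can be combined,
      # so ~86% of the silicon stays usable instead of ~55%.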
  4. The 295X2 also had two GPUs, but they were running in CrossFire. I'm talking about something that would have, say, 4 GPU modules, each acting like a CPU core and working together as one big GPU unit made of many "cores", removing the issues of CrossFire/SLI.
  5. Nah, I'm talking about some kind of tech that's not shit: something that would make the GPU modules work as one singular GPU unit instead of taking turns rendering frames or each rendering half a frame.
  6. So, now that we're getting closer and closer to the end of manufacturing process shrinks, how will GPUs evolve from here on out? The only solution with current technology would be bigger dies, sure, but that just drives the costs higher and higher: the bigger the die, the fewer of them fit on a wafer. Unless we somehow find a way for multiple GPUs to work together like Ryzen CCXs, I'd say GPU speed will just hit a plateau and stay there until GPUs become powered by quantum processors or some shit.
  7. EQ & Recording

    No matter what I use to record video (OBS, ShadowPlay, ReLive), it always seems to pick up the equalized audio as well. Is there a way to capture the raw audio instead?
  8. I have this Athlon X4 860K system on an MSI A68HM-E33 (V1) motherboard. For the longest time, I ran a dual-channel 4GB setup with some cheap 2GB modules (running at 1.5V, pretty standard). Fast forward to a few months ago, when a friend gave me two 4GB modules (8GB total), a Corsair XMS3 set rated at 1.6V. After putting it in, I started having issues booting, even with XMP on (which made the RAM run at the rated 1.6V). I then realized I couldn't run dual channel anymore, not even with the old 4GB kit: the Corsair kit would just refuse to POST (and on the rare occasion it did POST, it only showed 4GB, though Board Explorer still saw both slots populated), while the old 4GB kit would make the monitor show artifacts for about 3 rows and then cut off in the middle of the next one. So now I'm stuck on a single-channel 4GB stick. Is there a way I can fix this? Did I burn out my second DIMM slot with this RAM?
  9. This old Quadro FX 3450 (equivalent to a 6800 GT, I think; it's the same form factor, so I assume it is). I used to game on this until 2015, so yeah, you can tell it was a nightmare, with games like Path of Exile being one of my favourites back then; even Half-Life 2 would have issues with it. Sadly, I couldn't use it in a retro system even if I wanted to, because it suddenly started artifacting by itself after 2 years in storage. Full-sized pics in spoiler below.
  10. That's the exact problem though: BECAUSE Linus is more entertaining to watch, of course he's going to have a much bigger audience than GN, and he should ESPECIALLY pay attention to stuff like this to avoid confusion.
  11. Then why make the takeaway point seem like some rigged graph? Again, either make the bars variable length with non-rounded numbers, or with no numbers at all, or the same length if rounded. It's not about the point they're making, it's about the implications of such a small error; this is how misinformation gets spread, through small errors everyone copies.
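    To make the complaint concrete, a trivial sketch of how bar length should be computed (the fps numbers and the 400px chart width are made up for illustration): length is a pure function of the value, so equal results always get equal bars, regardless of rounding.

      MAX_PX = 400  # assumed chart width in pixels

      def bar_px(value, max_value):
          # bar length depends only on the value, nothing else
          return round(value / max_value * MAX_PX)

      scores = {"Ryzen 3": 144, "Intel i3": 144}  # identical results
      top = max(scores.values())
      for name, fps in scores.items():
          print(f"{name:9s} {fps:>4} fps  " + "#" * (bar_px(fps, top) // 10))
      # Identical values -> identical bar lengths, every time.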
  12. Thought it might be, but why not just give the non-rounded numbers? Otherwise it can cause uproars like this one already has. This still doesn't absolve them of the goof: either round the numbers and make both bars the same length, or DON'T round the numbers and keep the bars at different lengths.
  13. Whoever did this needs to be fired STAT. Is his editor an Intel shill or something? What happened here? Why do the Ryzen 3 results have smaller bars than the Intel ones, even though the numbers are exactly the same? This isn't just a fluke, it's in all of the graphs in the video, see for yourself:
  14. Everything is in order; you can't have a 1060 before a 1050. Edit: to clarify, @Hugs12343, "GTX Doorstop" is my name for what they call the "GTX 1050 Ti", which is trash.
  15. Recently I've come up with a system to tier Nvidia's cards to clear up the waters on which video card is what. This list calls the video cards by their real names; you can figure out by yourself which is which, it's pretty easy.

    GT 1020
    GT 1030
    GTX 1050
    GTX 1055
    GTX Doorstop
    GTX 1050 Ti
    GTX 1060
    GTX 1070
    GTX Doorstop 2
    GTX we-don't-want-to-give-you-the-full-GP-100-chip-even-though-we-should-have-because-this-is-the-80-class-which-should-be-the-full-die
    GTX we-still-don't-want-to-give-you-the-full-GP-100-chip

    I hope this clears stuff up and makes it a bit easier to differentiate between each of them.
  16. Those CPUs you have in your sig should not bottleneck the game, unless there's something odd about that Xeon.
  17. PhysX is disabled, so yes, I have that off right now. Yeah, I thought my CPU would bottleneck. Also, I play at 1440p since I saw no difference between it and 1080p, so I thought I might as well run in 2K.
  18. I don't know, it may very well be a possibility. Maybe I can set RivaTuner to tell me if it starts writing to the page file. 4GB of RAM is starting to be limiting; however, I'll soon change the whole platform, so buying RAM right now would just be a waste.
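    If RivaTuner doesn't expose that, a crude alternative is to poll page-file usage yourself while the game runs; a minimal sketch, assuming the third-party psutil package is installed (pip install psutil):

      import time
      import psutil

      last = psutil.swap_memory().used
      while True:
          swap = psutil.swap_memory()
          grew = swap.used - last
          if grew > 0:
              print(f"page file grew {grew / 2**20:.1f} MiB "
                    f"(now {swap.used / 2**20:.0f} MiB, {swap.percent:.0f}% used)")
          last = swap.used
          time.sleep(5)  # poll every 5 seconds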
  19. Specs are in sig. Yes, yes, "welcome to the BL2-runs-poorly club", so here's the deal: it runs very well most of the time. I'm only having issues in large areas like Thousand Cuts and Tundra Express (though The Highlands, also large, runs smoothly) (and oddly enough in Sanctuary, even though the city is small). So I have another theory: I'm having trouble in large areas with lots of stuff happening. Before, I'd have PhysX on, and in Thousand Cuts it would drop into the low 20s; I've since disabled it and performance has improved, it drops to 45 now, but that's still not perfect. I've also disabled the cel shading, which boosted performance by a lot, but I'm still getting these drops, which makes me suspect a CPU bottleneck. I tried fiddling around in the config files, tweaking every single setting I could think of, but still didn't have much success. So I've come to the conclusion that this may very well be a CPU bottleneck at this point, even though my CPU is way beyond the recommended specs.
  20. That's wrong: the 1060 3GB is a cut-down version of the 6GB, so the 6GB version is the 1060 and the 3GB is a "1055 Ti" or something (1055 as in something between the 1050 Ti and the 1060 6GB).
  21. Exactly, @Enderman, these are the pads I'm talking about.
  22. With the recent Intel trends, I would really like to see direct-die cooling make a comeback. There's just something about seeing a bare CPU; granted, you won't see it once the cooler is on anyway, it's just an aesthetic thing I like. Now, there's the risk of cracking the die with uneven mounting pressure, like GN pointed out, but I think that could be fixed with pads to take some of the pressure off the die, like you'd see on old Athlon CPUs.
  23. Dude, with a card like that and games like those, you can run them no problem at 1080p; even an RX 550 is made for 1080p in those games (except maybe Fallout).
  24. No offence, but I personally think naming your computer is kind of going overboard, to the point of being, dare I say, "cringy". Then again, that's probably because most of the names I've seen were edgy 12-year-old names like "PROJECT: X" and stuff like that. There's nothing wrong with just calling it "my PC" or "my rig".