PianoPlayer88Key

Everything posted by PianoPlayer88Key

  1. My wish:
     GT 2010 @ $20, 2W, no cooler, PCIe x1, single slot, low profile >> 4x Titan Xp or 4x Quadro P6000
     GTX 2080 @ $360, 70W, based on GV100 (not 102 or 104): 240fps minimums @ max settings @ 8K * 4 monitors, in AAA games released 4-5 years later (2 of these would run Ghost Recon Wildlands at max settings on Linus's 16K setup, even on a single-thread LGA775 CPU, at the same FPS as a 1080 Ti can do CS:GO at 640x480, lowest settings, assuming no CPU bottleneck)

     My more reasonable(?) approximate expectation:
     GTX 2090 Ti @ $1600, 400W ~ 4x GTX 2080 Ti
     GTX 2090 @ $850, 275W ~ 2x GTX 2080
     GTX 2080 Ti @ $650, 200W ~ 2x GTX 1080 Ti
     GTX 2080 @ $500, 160W ~ 2x GTX 1080
     GTX 2070 Ti @ $400, 145W > GTX 1080 Ti
     GTX 2070 @ $320, 130W ~ GTX 1080 Ti
     GTX 2060 Ti @ $250, 110W >~ GTX 1080
     GTX 2060 @ $200, 90W > GTX 1070, < GTX 1080
     GTX 2050 Ti @ $150, 65W > GTX 1060 6GB, < GTX 1070
     GTX 2050 @ $100, 50W ~ GTX 1060 3GB (if it had 6GB VRAM)
     GT 1040 @ $75, 40W > GTX 1050 Ti, < GTX 1060 3GB
     GT 1030 @ $55, 30W ~ GTX 1050 Ti
     GT 1020 @ $40, 22.5W ~ GTX 1050
     GT 1010 @ $25, 15W ~ GTX 750 Ti (or > 5x then-current AMD APU or Intel Iris Pro "flagships")

     My planned upgrade, at least as of now, is when at least 1 of the following is true:
     • My GTX 1060 3GB can't get 10-12 max fps @ 640x480, lowest settings, in then-10-year-old casual games
     • I could get a 4K, 60Hz, 10-bit, 100% sRGB, IPS 32" monitor AND a GPU capable of 120fps 4K max settings in then-newest AAA games, for $750 total
     • I could get the 8K version of the above, plus an 8K equivalent of an updated Panasonic FZ2000 or GH5 + lens or A7S II + lens, for $1500-2000 or so
     • My 3GB 1060 dies
     • My current GPU refuses to launch (at all) a future game I want to play
  2. What about a demo / trial version with part of it playable? Or was that only a thing in the 1990s or so, and now you gotta buy the game/program even to just try it out?
  3. I was reading this thread & saw quite a few posts I thought were interesting, and I wanted to comment on some points as I was clicking to +quote posts. So, @Damascus, @Ryan_Vickers, @GamingDevilsCC, @H0R53, @M.Yurizaki, @Moress, @SlaughterSmurf, @JoostinOnline, @Fardin & @Glenwing: I've grouped your quoted posts in a spoiler at the bottom of my post. (Btw, are in-spoiler quotes alerted, or are things in spoilers exempt from notifications? I don't recall....)

Where I live, electricity is not cheap, especially when I start using more than a certain amount in a month. Also, we don't have an HVAC system here at my parents' house, so temps can climb into the mid 80s °F (or 30°C) indoors in summer quite frequently. Approaching 90°F / 32°C does happen now and then as well. (Outdoor temperatures often reach the upper 90s to mid 100s °F, or 36-42°C.) I remember one summer about 15-20 years ago when it reached 117°F / 47°C outside! I don't exactly remember the indoor temperature, but considering how I remember it felt, it had to have been at least 100-104°F / 38-40°C - inside the house! So not only would a higher-TDP part cost more to run over time (something that a few in this thread apparently neglect), it would also contribute to making the room even hotter.

I've been one that has a fairly long upgrade cycle - for example I'd be willing to turn down settings in casual games to 320x240, low, 6-8 fps before I decide it's time to upgrade to a new card. (Also I like huge jumps when I do upgrade.) I probably won't keep my 3GB GTX 1060 quite that long though. I might replace it either when it dies (like my brother recently did, going from a 780 to a 1080 Ti), or when my next criteria for upgrade is met (like getting 4K, max settings, 60+ fps (or 2x Hz, like 120) in AAA titles, AND a 27-32" 4K 60Hz 10-bit 100% sRGB IPS monitor for a combined total of $700-800). 
Another replacement reason I've also recently thought I should start putting more consideration into -- is newer parts being much more efficient than older parts, so I could have similar performance with much less power draw, or better performance AND less power draw. For example, if I had SLI GTX 285s, they might still perform well enough for me (I only have a 1080p 60Hz monitor (and could turn down to medium/high or 720p/900p & settle for 30fps due to card age), not the 4K I want someday), but if I switched them for a GTX 1050 Ti or 1060, I'd save a ton on power, AND get better performance (especially with the 1060). Considering my cost of electricity, I might also make the argument to go for a higher 80+ rating on the PSU, as opposed to a lower one. Now maybe going from 80+ Platinum to 80+ Titanium wouldn't quite be worth it, but I'd think 80+ Gold would definitely be better for me than 80+ Bronze. (As it happens, I have an 80+ Platinum Corsair AX760, and I'm finding I'm lacking SATA power plugs.) Also, on buying a top tier part up front then not caring about other parts ... I actually would still care. For example, a bit ago I was looking at building a NAS as a backup server. But, it turned out my budget was low enough so I could barely afford the hard drives needed to copy the data I wanted to back up. (If I'd bought more, I might have been in danger of overdrafting at the time.) I had been looking at having the platform (everything except the hard drives) cost ~$100-150 total, but I ended up with just three 8TB HGST NAS drives for now. (Yes I have a lot to back up, and no it isn't anything like what @Slick taught us how to hide. )
  4. On some of my older hard drives, I may still have things in download folders (or the equivalent for older OSes, copied from even older media) that are anywhere from 10 to 25 years old, possibly older. Some may also be on 5¼" floppies, but I haven't yet come across a reasonably-priced 5.25" drive (priced like the 3.5" USB floppy drives) that hooks up via USB or SATA, nor a 5.25" controller that slots in via PCI Express. Same goes for the recycle bin from older Windows installs. (I'd have to mount them in, say, Linux, to get stuff; otherwise Windows would either freak out with hardware it didn't know about, or empty the recycle bin.)
  5. The bolded part reminds me of another thing. When I do see IPC jumps, I want them to benefit already-existing software. Sometimes there are things that older software does well, or user interfaces I like, or other factors, and sometimes newer software is a bit of a step back for me. (I understand some businesses, etc, are even more hardcore with the "if it ain't broke, don't fix it" mentality - aren't there some places that were recently still using Windows 3.1?) For example, the same pre-existing build of Blender or Handbrake, when run on a newer CPU with 100% better IPC, should complete the same project (at the same settings) in half the time as it took on the older CPU running at the same clock speed and core/thread count. I didn't mean gains in clock speeds, I meant gains in performance at the SAME clock speed. I screenshotted parts of a few wikipedia articles, highlighted a few things, and sized the windows as necessary. Also included is an abridged table (which I didn't save) from the instructions per second page, timeline of instructions per second section. On the table, focus more on the "instructions per clock cycle per core" column instead of the one to its left, at least at the bottom of the chart where there's a few multi-core CPUs. I removed some CPUs with SMT/HT before I made the screenshot (as I think the chart wasn't taking that into account properly), as well as everything non-x86. Also, take a look at the 286 in the table on the right, and compare it to the Intel DX4. Then, look at the 2 photos below. There's a dramatic increase in price to performance over that time. (Unfortunately I don't have the price for the 286-10 CPU by itself - I'm guessing around $300-400(?), but I DO have the price for the 486 motherboard and RAM & CPU Cooler, and a 1.44MB floppy drive purchased a little earlier than the 486 parts.) In the time since Sandy Bridge launch, I have seen nothing remotely close to improvements like that. 
In modern terms, if the i7-7700K had a 100% IPC gain over the i7-6700K, it would do about 1,686 or so in Cinebench R15 multi-thread, and about 334 in Single Core mode, at 4 GHz. (Not at 8 GHz, if so, that would be no IPC improvement. Not that any 7700Ks hit 8 GHz even under LN2, anyway.)
  6. I too have had the same question in the bolded part of the quote, and am disappointed to see no answers in this already-15-hour-old thread. I'd like my next motherboard to last even longer. Also, I'd want large performance/$ jumps with CPU upgrades over the years on the same board. For example, when a high-end PSU (like a SeaSonic Prime with 12-year warranty) dies of old age after having been babied all its life, THEN it might be time to replace the motherboard. Or, when long-time connector standards are replaced - past examples being AT -> ATX 24-pin power, IDE -> SATA, PCI -> PCIe, COM/LPT -> USB. For now, I'm not planning to upgrade my desktop's LGA1150 i7-4790K platform until around 2020-2022, or when PCIe 5.0 & DDR5 are out. If I was getting one of the CPUs now that the OP mentioned, though, I'd likely go for the 1950X, budget permitting.
  7. What happened to the days of bigger IPC boosts? Sandy Bridge was a fairly good jump over Westmere, wasn't it? I think the 286 was 100% better (or more) than the 8086. Wikipedia also says the original Pentium was about 100% faster than the 486, which in turn had a similar gain over the 386. I understand IPC gain to basically mean improved performance at a given clock speed and thread count. For example, making up theoretical Cinebench R15 benchmarks: CPU A: 160 @ 4 GHz, 1 thread. CPU B: 320 @ 4 GHz, 1 thread. IPC boost = 100%. CPU C: 800 @ 3.6 GHz, 8 threads. CPU D: 1200 @ 3.6 GHz, 12 threads. IPC boost = 0%. And there's also the big improvements in price to performance (over a given time period) seen 20-30 years ago.
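A quick sketch of the IPC-boost definition used in the made-up Cinebench numbers above: normalize each score by thread count and clock speed, then compare the per-thread, per-GHz throughput. (This is just my restatement of the post's definition, not an official benchmark formula.)

```python
def ipc_gain(score_a, clock_a, threads_a, score_b, clock_b, threads_b):
    """Fractional IPC gain of CPU B over CPU A: compare score per thread per GHz."""
    per_thread_per_ghz_a = score_a / threads_a / clock_a
    per_thread_per_ghz_b = score_b / threads_b / clock_b
    return per_thread_per_ghz_b / per_thread_per_ghz_a - 1

# CPU A: 160 @ 4 GHz, 1 thread vs CPU B: 320 @ 4 GHz, 1 thread
print(ipc_gain(160, 4.0, 1, 320, 4.0, 1))    # 1.0, i.e. +100% IPC
# CPU C: 800 @ 3.6 GHz, 8 threads vs CPU D: 1200 @ 3.6 GHz, 12 threads
print(ipc_gain(800, 3.6, 8, 1200, 3.6, 12))  # 0.0, i.e. no IPC gain
```

CPU D scores 50% higher, but only because it has 50% more threads at the same clock, so per-thread throughput (and therefore IPC) is unchanged.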
  8. RX Vega's gluttonous power appetite looks bad to me. :( Good thing I'm not buying one (keeping my 3GB 1060 & 6GB 970M instead), but if I was, I'd just go for a 1070 or 1080 instead.


    Pascal would be MUCH cheaper than Vega in power consumption for me over the time frame I'd likely keep a card.  (Probably at least a few years or so.)  And I don't expect my electricity to get any cheaper, either.
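    To put a rough number on "cheaper in power consumption over the time frame I'd keep a card" - here's a sketch of the math. All the inputs (wattage gap, hours per day, years, electricity rate) are hypothetical placeholders, not measured figures from either card.

```python
def extra_cost(watts_a, watts_b, hours_per_day, years, dollars_per_kwh):
    """Extra electricity cost of card A over card B across the ownership period."""
    kwh_diff = (watts_a - watts_b) / 1000 * hours_per_day * 365 * years
    return kwh_diff * dollars_per_kwh

# e.g. a ~295 W card vs a ~180 W card, 4 h/day for 3 years at $0.20/kWh
print(round(extra_cost(295, 180, 4, 3, 0.20), 2))  # roughly $100
```

    The gap scales linearly with usage hours and electricity rate, so heavy gaming in a region with expensive (or tiered) electricity makes the difference much larger.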

     

    For now, it appears to me that Vega's performance per watt may be somewhere between Kepler and Maxwell, but nowhere near Pascal.

     

    Early on, I was hoping Vega would blow Pascal out of the galaxy in efficiency AND price to performance.

    I wanted Vega to cause Nvidia to lower prices on their next generation, like the Nvidia GTX 285 (launched $369) vs GTX 280 (launched $649).

     

    My wishes got tempered based on rumors, though, and it turns out my fears were right.  It looks like there's no incentive now for Nvidia to make that giant leap I was hoping for.

     

     

    Oh well ... At least Ryzen & Threadripper seem to be putting pressure on Intel.  Now if Team Blue would just slash prices across the stack by the same percentage the Q6600 was cut over its life, or better yet, by the same percentage that LGA 771 & 1366 Xeons have fallen on eBay (compared to original launch MSRP)...

     

     

    P.S. Should this have been a forum topic, or was I correct to assume it's better as a status update?

  9. I'd be really happy if, in mining, Vega took a big step backwards in price/hashes, so that, using a gaming example, it'd be like getting 2-3 fps peak at 144p, lowest settings, in Hovertank 3D (from 1991), on a Tesla V100. I think miners should need to get flagship Teslas or Radeon Pro SSGs (or whatever is the AMD-equivalent of Tesla) to get RX 550 or GT 1030 level mining performance. (Maybe a Quadro V6000 could be in-between and do 0.1 MH/s.)
  10. One thing that has come to mind a few times over the past few months or so ... is the idea of re-purposing that build in the OP as a storage/backup/NAS, and using my laptop (i7-6700K, 40GB RAM, 256GB+1.05TB SSDs, GTX 970M) as my "daily driver". (Except the GTX 1060 is considerably faster, although Steam streaming / lag would more than negate that advantage...)

Yes, but for more $/TB. For example, an 8TB HGST NAS is ~$259, or $32.38/TB. A 10TB Seagate IronWolf is $339.99, or $34/TB. It does add up. Also, for now I prefer HGST drives. I used to use WD, as evidenced in the OP photos. I'd consider switching brands as long as it's highly reliable. The 2TB Seagate 2.5" drive that was in my laptop has been accumulating bad sectors, so I pulled it out of service. The 1TB MX300 SSD has taken its place.
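The $/TB comparison above is simple division - sketched out (using the prices as quoted in the post; current prices will differ):

```python
def dollars_per_tb(price, capacity_tb):
    """Cost-efficiency metric: purchase price divided by raw capacity in TB."""
    return price / capacity_tb

print(round(dollars_per_tb(259.00, 8), 2))   # ~32.38 for the 8TB HGST NAS
print(round(dollars_per_tb(339.99, 10), 2))  # ~34.0 for the 10TB IronWolf
```

So the bigger drive condenses more data per bay but costs a couple dollars more per TB, which adds up across an array.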
  11. I already got a few 8TB HGST NAS drives to back up stuff from the smaller ones. I'll probably need to get another one or two in the next few months. The smaller ones were going to be retired, but it turns out I needed some of them for Windows backup clones. (I couldn't boot off a clone from an 8TB drive due to MBR vs GPT issues. There are more complications related to that, but I won't post about it here.) For condensing to fewer drives, I'm waiting for 12TB, 16TB, etc. to be available at a reasonable price. They're still too expensive per TB.

No, I don't have RAID (yet). Current raw storage capacity (not including my dad's 2 drives in a picture, taking out the dying 80GB IDE, but adding the M.2 SSD in my laptop) is 60.75 TB. (That doesn't include other storage media like SDXC cards, etc.)

I had been thinking about it a while ago, but it would be for backup purposes - basically a duplicate of my main computer's storage, NOT for offloading things from my computer to be stored exclusively on the NAS. And for now, I really don't have the physical space to put a second large system, especially not a horizontal form factor like a rackmount NAS or server. (Also, I wasn't wanting a server since I didn't think my use case warranted it. I'm not hosting a popular website, I'm not running a 7-figures-a-month-profit business out of my home, etc.) That would then mean I'd need THREE systems: one to be my main use system, one to be my storage system, and one to be my backup system.

One thing I was thinking re: RAID: I'd be split between wanting ZFS, so I could have deduplication, checksums, and the other stuff that FreeNAS implements for data integrity (but requires X GB/TB ECC RAM), or UnRAID, which I hear doesn't stripe data across disks, so too many drive failures don't take out the entire array. (If UnRAID supported the ZFS features, AND could do it on like an LGA771-era Xeon with 1 or 2 GB RAM, I'd strongly consider that.) I like the Define R5, and the Arc Midi R2. 
(I almost wonder if it'd be possible to mount a 2nd stack of bays where the bottom fan mount normally goes...) I also am aware of the Rosewill B2 Spirit; that case has 13x 3.5" bays. I was about to retire that drive because of this happening, but one day when I plugged it in several months ago it spontaneously resurrected itself. I copied everything I could off, then a day or 2 later it started clicking again. But, if I could revive it on command, maybe it could be used for this? I wonder how much longer the 250GB IDE next to it (in the pic in the OP) will last...
  12. But where am I going to put my eventual hundreds of TB of storage? (Or will $/GB come down quickly on SSDs by then, while still maintaining quality, performance & endurance of SLC and MLC SSDs?)
  13. So I've been keeping an eye on what's been going on with cases over the last several months or year or more, and as the title says, I'm kinda disappointed in the trend toward having fewer drive bays in modern cases. A few examples from well-known brands are pictured below. I'm fearful as to whether I'll be able to get a case in the next few years that'll house all my drives. (The two 2.5" HDDs on the upper right are my dad's, though.) I'm preliminarily considering upgrading (from my 4790K platform) when DDR5 and PCI-Express 5.0 are out in the early 2020s or so, depending on other factors as well. Even if I were to get a new case sooner (that would actually fit where I want to put it) and move my existing system to it, I would definitely NOT want a server / rackmount style case. Also I don't like frequently replacing motherboards - not so much due to purchase price, but because of the labor involved. Same with cases and PSUs. Of course I'd want the drives properly mounted in the case. They're not in the example below; I had long since run out of drive bays. I just now realized the SATA power cable is covering up the GTX 1060. Moreover, neither my PSU nor my motherboard have nearly enough SATA connectors. And it'd be nice to still have room for dual triple-fan GPUs, triple-fan thick push/pull rads, dual CPU sockets, etc, if at all possible. And my current case is WAY too loud. I can't record piano music and encode (2160p video) and stream live without fan noise blasting the microphone mounted under the piano keyboard (audio mike would be about where the USB mouse dongle is). If I was getting a case now (I'm not), I like the aesthetics of the Fractal Design Define R5 or Arc Midi R2. If it was possible to have 2 stacks of 3.5" bays in that case, it'd be nice. Also a huge plus, or almost a requirement, would be toolless HDD/SSD trays, and the ability to swap drives without having to remove the cable-management-side panel. 
Also, three of the 3.5" drives (HGST 8TB, specifically) don't have the proper screw holes to line up with the sleds in my current case. (I had a topic on that here.) I have to use 5.25" bay adapters to mount those in my current case.

Cloud storage isn't an option for me. But, if my internet was faster, WITHOUT any data caps/throttling, for the same $75/month my parents are paying now… But then, there'd still be privacy issues. (No, I'm not referring to Luke's video posted to YouTube on Oct 11, 2015.) Yes, I know some of the drives are getting up there in age, but I'm still able to find a use for almost all of them.

- - - - - - - - - - - - - - - - - - - -

TL;DR: Any chance in the next 3-5 years of standard ATX tower cases (without flashy designs, but more like Fractal Design style) being available to house lots of 3.5" drives, WITHOUT needing a server / rackmount case? Many modern cases are trending toward too few drive bays for my use case.
  14. Popped into this thread, noticed a few posters here have a "0 posts" count. Is that because posts in this thread (or in Off Topic - which?) don't count toward your post total? I can understand why that'd be the case, so people don't spam in here just to get their post count up. (I got banned from a forum once, like 16 years ago, for that; they required you to have 750 posts or something before you could have a custom avatar or title, IIRC.) /me wonders what would happen if Linus "ripped" his shirt with a delidded FX-9590 (with no cooler attached). The problem would be putting the CPU under Prime95 load when it's not in a motherboard.
  15. I'm not on the AM4 platform, but if I was, I'd want the same or better price-to-performance improvements in PCI-E video cards we've seen over the years since PCI-E first came out, WITHOUT having to swap out the motherboard. (For example, compare the GeForce PCX 5950 to the GTX 1080 11Gbps, both of which, I believe, launched at $499.) I'm currently on Z97, LGA1150, with an i7-4790K, but on my next motherboard swap, I definitely want the longevity in CPU sockets (and DIMM slots) that PCI-E, SATA, and USB connectors have had. If I could replace a died-of-old-age-when-paired-with-35W-CPU-+-iGPU-in-5°C-environment SeaSonic Prime Titanium 1200W PSU twice before I replace the motherboard, that'd be nice. (No I wouldn't pair the parts that way, but if the mobo could last and its parts be upgradeable that long...)
  16. That's good. I think it'd be nice if there was a way to also limit sales based on games, monitor setup, etc. For example, Someone playing CS:GO, LoL, etc. at 1080p 60Hz 1 monitor might only be allowed to buy a single RX 560 or GTX 1050 or lower. Someone playing ARK:Survival Evolved or Ghost Recon Wildlands at 2160p 120-144Hz or 4320p 60Hz with 16 or 24 monitors(*1) could be permitted to buy 4 RX Vega 64s or equivalent NVidia. (*1) There was a card - FirePro W9100 - that had 6 mini DP ports, times 4-way CF = 24 displays. Don't think it supported 8K 60Hz per port, but if a newer one used all miniDPs.....hey, I think you could fit like 7 or 8 per PCI slot, times the 11 slots in the Corsair 900D ... 88 Monitors, anyone? Ultrawide 122880x21600, or 76800x34560? Also, limit how often you're allowed to buy. You could only buy extras if the natural-length warranty expired, or you're RMA'ing a deceased(*2) card, for example. (*2) Mining would immediately void all warranty on a consumer card (would still have to wait the year or whatever to buy a new one, although new architecture release would be allowed to buy), and somehow make sure the miner's CC info (used to purchase the card) doesn't get purged. (Of course this would be disclosed up front even before adding the card to cart.) Then, when the miner sells the card used, the buyer, upon installing it or checking warranty status, would be notified, and get a voucher/coupon for a free upgrade to SLI Quadro V6000s or CrossFireX Radeon Pro SSG (or equivalent gaming cards), paid for by the miner. These limits wouldn't apply to cards like the Radeon Pro SSG or Tesla V100. This!! Nerf the mining performance in gaming cards. Price/performance for mining after the nerf should be the equivalent of getting 3-4 fps at 144p, lowest settings, in (1992) Wolfenstein 3D on a Quadro P6000. Require a Tesla or Radeon Pro SSG for full mining performance. 
A Quadro (or AMD equivalent) would be in-between - using the nerfed-gaming-on-pro-cards analogy: $/perf like 24-30fps at 720p medium in CS:GO or LoL on a Quadro P6000.
  17. I've noticed the same thing on my laptop's i7-6700K recently. Since the 1703 update, if not before, the CPU has been pinned at 4 GHz or 2.3 GHz, depending on whether I'm in performance or power saver mode. This happens even with the system at idle, or with <20% CPU usage. Sometimes it will drop to 3.5-3.6 GHz or so under load in performance mode, but then it's flirting with 100°C under those loads. (That's not running Prime95 either.) I'm thinking maybe I should reapply the NT-H1, as it's also idling around 65-70°C in power saver mode.
  18. Screw that, I want to compare the 286-10 in my dad's 1st PC to the CPU I'll be putting in my ~2021-2022 build (when PCI-E 5.0 & DDR5 are out.)
  19. @Evolution90 Speaking of 3DMark benchmarks, you reminded me of something. How would you compare performance of cards that are several generations / years apart? For example, say I want to compare a GTX 1080 to a GeForce PCX 5950 (or FX 5950 Ultra). I think one of the older 3DMark benchmarks had the 2 cards compared, but then there could be the issue of the 1080 saturating the older benchmark. Is it maybe better to do a few steps, like FX 5950 Ultra vs GTX 280 in like 3DMark Vantage, then GTX 280 vs GTX 1080 in Fire Strike? Or would a site like PassMark be better for something like that? Or how about if I wanted to go crazy and compare the first video card ever made (was it the ATI CGA Wonder in the mid 1980s, which PassMark does NOT have listed? I'm sure there's something older though) vs like a Titan Xp, or a future high-end video card.

I'm thinking of sometime making another topic or 2, one about comparing video cards (and CPUs, etc) released a long time apart, and another about upgrade cycles on the same platform. The 2nd one would reference something like going from PCX 5950 to 1080, and lamenting that I can't get the same price/performance upgrade on a CPU over time cause sockets change so often. (I don't like frequently replacing motherboards, not because of the purchase cost, but because of the labor involved in swapping them out. If only it was as easy as swapping headsets or similar.)

Ahh, I see @SageOfSpice. I guess I had gotten the impression that the cards struggled to run it at all, based on "but can it run Crysis?" Maybe that's because I'm willing to put up with such low settings/performance (compared to everyone else's tolerance) before I decide a game can't run? (And maybe that stems from my days when I was playing newer games on then-several-year-old GPUs that were low end when they were new…) And by "prototype" I'm guessing you and @JuztBe mean like a pre-release beta of Crysis? Or was it a game called "Prototype"?
  20. @Shimejii Ahh, yeah, true. Looks like I forgot to add the requirement to the first part that the game be already well optimized. Oh well. My bro plays ARK, and recently upgraded from a dead GTX 780 to a GTX 1080 Ti, using Intel HD 4600 briefly in the meantime. I think he went from ~45fps medium 1080p, to 25fps 800x600 low, to ~50-60fps 4K high, IIRC. What I'm defining as "demanding" is low fps even at low resolution & settings on a high-end GPU, even with a well optimized game. (Maybe it's just one that has insanely advanced graphics for its time.)

@SlaughterSmurf I could see the logic of your 1st part answer. For the 2nd part, I don't even have a monitor that can run 1440p yet, and until I got my laptop with its 970M, or the 3GB 1060 in my desktop, I was running the 4790K's iGPU in games. Interestingly, I was able to play the first few scenes of Witcher 3 at 1080p ultra on the iGPU. Sure it was only 3-5 fps, but the gameplay itself was slowed way down. (Riding a certain distance that normally took 30 seconds or so was taking more like 4-5 minutes, and not because I was having trouble controlling the horse.) I hadn't thought of testing League. I have it but haven't played it. (I'm not really planning to test much more right now.)

Yes, I realize 10 years is a long time. But sometimes I think I'd have upgrade cycles that are pretty long. (I'm not one to upgrade as soon as something better comes out - for example I didn't go OMG the 7700K is out, gotta replace my 6700K in my laptop! Side note: I was hoping to put the 8700K in eventually, but based on recent rumors I likely won't be able to, so the 6700K will probably be the CPU my laptop eventually dies with, or has when it's replaced in the early/mid 2020s or so.)
  21. Hi guys! Was just curious about a few things …

What do you think was the most graphically demanding game ever released, relative to the GPU technology then available? I've heard people talk about the original Crysis being quite demanding for its time. However, as I read up more about it, it seems that high-end cards, like the GeForce 8800 GTX, seemed to be able to run it at 30fps, medium settings, 1600x1200 or 1920x1080 or something like that. But… What if you, like me, would be willing to turn settings way down to be able to play something? For example, see the below simulation of a brief CS:GO match vs bots on Dust, encoded at 240p, 6fps, H.265 q42. csgo 2017-08-06 18;24 - b - 240p, 6fps, h265, q42.mp4 (it's 744 kb & 47 seconds. Is there a way to make it a playable-in-browser video without putting it on youtube, or is that where I'd need to put it?) Has there ever been a game that struggled badly like that at very LOW settings, even on extremely high-end hardware like US$>2,000 worth of GPUs? If so, what was it, and what was it like on hardware of the day?

Most demanding game relative to GPU hardware then available? Also, lowest settings+fps you've had to put up with?

And, second part of the question... What's the lowest settings you've ever had to deal with in a game you were playing? For me, it would be somewhat similar to the example above, although the "quality" wasn't quite as low (mainly because there was no setting to go that low), and I think I could get up to 8 or 10 fps looking at a blank wall. The game was Team Fortress Classic (same engine as the original Half-Life) back in 1999 or 2000. The GPU at the time I believe was the original ATI All-In-Wonder (based on 3D Rage II, I think), and a Pentium 166 MMX. I don't remember what the RAM or HDD (no SSD obviously) was at the time, but it was probably on Windows 98.

Third part. 
How long does it typically take before various price tiers of single GPUs (like $700, $500, $350, $200) get to where they can't peak at more than 12-15fps at lowest resolution and settings in a then-10-year-old esports / lighter-duty title? Or does deprecated / lack of driver / API support for games on older cards make them not run at all before the performance gets that low? For one example, I'm guessing a game like CS:GO would either struggle to get more than a few FPS at lowest settings on like an Nvidia Riva 128 or ATI Wonder/Mach/Rage, or not run at all due to lack of support on those cards for certain features.

Lastly, for those of you who like me don't stay on the cutting edge, but hold onto your cards for a while, how often do you upgrade, and what's your preferred criteria for how low the performance gets before you do so? For me, while I'd grudgingly accept the scenario in the preceding paragraph if financially necessary, I'd probably prefer to upgrade when few-year-old games are almost always dipping below ~15-20fps at low settings and 480p or something like that.
  22. I thought you were supposed to clean and repaste every time you took the heatsink/waterblock off? That's why, when I applied mine (both with my laptop CPU + GPU and with my desktop i7-4790K CPU), once I put the heatsink down I immediately clamped/screwed it. Although I wanted to take it back off to check my application, I didn't want to have to clean that paste off and do yet another application.
  23. Interesting. So, then, if paste makes little difference in temps when applied properly when new ... I'd like to know how they hold up over time. Like, maybe you have a Pentium II and a Riva TNT running a personal web server / NAS, and are still on the original paste application. Or for an intermediate age, a system with an E8400 and 7800 GTX. How would we find out which paste lasts longest (assuming set-&-forget), or at least predict/project that? My minimum would be long enough so that if it was on my dad's Dell D830 (bought Aug 2008), it wouldn't have degraded more than a couple °C under Prime95 Small FFT or Aida64 FPU test. And, hopefully it'd still be good enough to keep a 4790K from thermal throttling with P95 28.10 SmallFFT under the stock heatsink in a 50°C ambient environment a few minutes before the CPU dies of old age (not "dies of heatstroke"). I think I saw somewhere in the article that Arctic MX-2 & MX-4 didn't hold up well over time. Makes me glad I didn't get it when I was upgrading my laptop CPU last December. Although, that 6700K is now idling at 75-80+°C with Chrome in the background, and hits 100°C & throttles at 35-50% (according to Aida64) when opening Chrome and restoring previous session, while it's loading everything. Shutting off Chrome only brings temps down to ~65°C. Maybe it's time to reapply the NT-H1...and maybe use a bit less next time? (Was trying to do the "X" method for max/even coverage, maybe I need a different application method as well.) The GPU does keep cooler than the CPU, though. That 970M only barely hit 77°C at the end of SuperPosition 8K Optimized. (Started at around 69-70°C at the beginning and crept up over the course of the benchmark.) What do you mean by that? I sometimes do.
  24. Coffee Lake not being compatible with Z170 does upset me at least a little bit. I was hoping to be able to upgrade the i7-6700K in my laptop (Clevo P750DM-G) to a 6-core in the future. I wouldn't have bought it at launch, though, but would have waited (pending an available bios update) for: the next generation AFTER it to be released (one that wasn't compatible), & the 8700K (or whatever) to go on sale for ~$250 on Black Friday at Micro Center.

I actually started with the i3-6100 in it, but upgraded to the 6700K last year (2016) around Black Friday when it went on sale ($260), and rumors were flying pretty high that the 7700K wasn't going to be all that big of an improvement. Also I was hitting the limit of 2 cores, 4 threads, prompting me to expedite the upgrade. I'm glad I did. (Also, Kaby Lake would have needed a new bios update anyway, something I wasn't sure if I'd be able to get.)

As for my desktop, I have an i7-4790K in there. I have no upgrade path to speak of - the i7-5775C only has a better iGPU, but then I now have a (3GB) GTX 1060. I'm running at stock, though, since my max OC would be 4.7 GHz, or maybe 4.8 if I push it a bit, which is too small an OC to be worth it for me. (I'm considering trying undervolting instead to save on power consumption.) I can't get the same % OC the Celeron 300A could get, or some other older Intel CPUs that I've heard could OC to 200% over stock on air.

My preferred next $300-400 upgrade would be able to live-encode-and-stream 4K 30fps H.265 lossless video while multitasking at minimum, and preferably keep up with 8K 120fps if decent ~$800-1200 cameras that do it are available (or whatever cameras can do a few years later.) For my next upgrade, though, I'd really like the motherboard to last a long time, and have a long upgrade path for the CPU. Barring financial hardship, I don't mind spending like $150-200 to replace a motherboard every 4-5 years or so. 
Although, if I were to drop $400-500+ on one, I wouldn't want to replace it until a SeaSonic Prime 1000W, EVGA T2 1600W, or Corsair AX1500i bought at the same time (hooked up to a different mobo with a G3930T underclocked/undervolted to 0.8 GHz at 0.7V, a low-end SATA SSD, no GPU, used lightly by office standards) dies of old age. That's with my current budget; if I had a better job, I might be willing to spend ~$200-250 every 3 years, or $500 every 5.

At any rate, I'm skipping Ryzen and Threadripper, and pretty much all Intel through Tiger Lake, and waiting for PCI Express 5.0 and DDR5. (Last I checked, rumors pointed to around 2020 or 2021 for general availability, iirc.)

Actually, the labor involved in replacing a motherboard, and unplugging and replugging all the other components, is, I think, the main reason I don't like to upgrade very often. When major things change, like going from the old AT power connector to ATX, or from ISA to PCI and then to PCI Express, sure, it's time to upgrade. Likewise when there's a gigantic performance improvement, or when we've hit a wall that was set a long way off early on. (For example, hitting the ~9.4-zettabyte (8 ZiB) GPT disk limit, or the 16-exbibyte RAM limit for 64-bit CPUs.)

Like someone said earlier in the thread, if I want to add a feature my motherboard didn't come with, isn't that why there are PCI Express slots? They're not just for video cards, you know.

Back to that long upgrade path I mentioned earlier. I'll briefly touch on RAM. I'd prefer different generations of DDR to be compatible, the way PCI Express generations are. If I get a board with, say, 512 MB of SDR or DDR, I want to eventually be able to upgrade to 4 TB of DDR5 Reg ECC or whatever, as the need arises. (Or maybe DDR7 or DDR8 by the time I need that much.) Look at PCI Express for an example of an upgrade path.
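As a back-of-envelope check on those two addressing limits, here's a quick sketch in Python (assuming GPT's 64-bit LBA fields with the standard 512-byte logical sectors; 4K-sector disks would raise the GPT ceiling further):

```python
# Back-of-envelope arithmetic for the addressing limits mentioned above.
# Assumes GPT's 64-bit LBA field and 512-byte logical sectors.
SECTOR_BYTES = 512

gpt_limit_bytes = 2**64 * SECTOR_BYTES  # max addressable GPT disk size
ram_limit_bytes = 2**64                 # flat 64-bit physical address space

print(f"GPT disk limit: {gpt_limit_bytes / 2**70:.0f} ZiB "
      f"(~{gpt_limit_bytes / 1e21:.1f} ZB)")        # 8 ZiB (~9.4 ZB)
print(f"64-bit RAM limit: {ram_limit_bytes // 2**60} EiB")  # 16 EiB
```

Either way, both walls are far enough out that a board hitting them would be a good problem to have.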
I looked up a few of the first PCI-E video cards and compared their launch prices to current ones, and came up with:

GeForce GTX 1080 (11Gbps) vs. GeForce PCX 5950 (both launch MSRP ~$500)
GeForce GTX 1070 vs. GeForce PCX 5900 (~$400)
GeForce GTX 1060 (3GB) vs. GeForce PCX 5750 (~$200)
GeForce GTX 1050 Ti vs. GeForce PCX 5300 (~$140)

Look up the scores in 3DMark03/05/06 and PassMark (UserBenchmark doesn't have FX/PCX cards other than Quadros) to see how much of a difference there is between them. (You may need to substitute the FX 5950 Ultra, FX 5900, FX 5700 Ultra, FX 5200/Ultra, or similar, for some of the PCX cards.) Also, I went with Nvidia instead of AMD because Nvidia's old cards were the slower ones in the benchmarks I was finding, and I wanted to show a larger performance improvement.

You could pull a PCX 5950 or PCX 5300 out of an old PCI-E 1.0 x16 slot and jump straight to a GTX 1080 or GTX 1050 Ti, respectively. Although, how much would that interface bottleneck the newer cards? Would the 1060 or 1070 be the best you could get on PCI-E 1.0 x16?

I'd REALLY love to be able to upgrade CPUs and get as big a price/performance and performance-per-watt improvement as in the GPU examples above, WITHOUT having to replace my motherboard. And of course, the same idea goes for the other parts. Generally, the more parts I have to unplug or uninstall to replace one part, the longer I prefer to keep it. (Motherboard, PSU and case stand out as examples.)

Oh, and speaking of benchmarks ... does anyone know of a site that accurately lets me compare really old parts against current ones? (For example, an MFM hard drive vs. NVMe, an 8086 CPU vs. Threadripper, an ATI CGA Wonder vs. a Vega 64.) I'd prefer an all-uses-covered benchmark, not one specializing in just high-refresh gaming, or content creation, or office work, or any one thing.

Glancing through the thread, someone mentioned future-proofing, and people upgrading every few generations.
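To make the same-price comparison concrete, here's a tiny sketch of the math. The scores below are hypothetical placeholders (NOT real 3DMark/PassMark results); plug in actual benchmark numbers for whichever pair of cards you're comparing:

```python
# Sketch of the price/performance comparison described above.
# The two scores are ILLUSTRATIVE placeholders, not measured data --
# substitute real 3DMark/PassMark scores before drawing conclusions.
def perf_per_dollar(score: float, launch_msrp: float) -> float:
    """Benchmark score per launch-MSRP dollar."""
    return score / launch_msrp

old = perf_per_dollar(100, 500)      # hypothetical PCX 5950-class score
new = perf_per_dollar(12_000, 500)   # hypothetical GTX 1080-class score
print(f"Improvement at the same price point: {new / old:.0f}x")  # 120x
```

Since both cards launched at roughly the same MSRP, the perf-per-dollar ratio collapses to a straight score ratio; the dollar term only matters when you compare across price tiers.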
I already touched on the upgrade-labor aspect of future-proofing above. I would upgrade every few generations of CPU, provided I'm getting a several-times (not several-percent) performance-per-dollar and performance-per-watt improvement, but I'd prefer to wait several UPGRADES before replacing my motherboard.

That reminds me of another thing: I also hate the segmenting of different types of CPUs. I want to be able to start with, say, a $30 hyper-threaded dual-core, then upgrade to a ~$200 8-core/16-thread CPU (or whatever is the norm then) once I've saved up the money. Then, a few years after buying the last $300-400 CPU to be compatible, I'd want to pop in a used $10K many-core server CPU bought off eBay or wherever for ~$200. And when THAT gets too slow (like a Core 2 Duo, Pentium D, or Athlon 64 X2 would be today), it's maybe time to replace the board.