Everything posted by bmichaels556

  1. Oh definitely, there are brands that I'm loyal to, but there sure are things they could do that would make me question that loyalty and ditch them outright. It seems like with Apple especially, it's become this weird cult. Like, there are a LOT of people who could be presented with a superior Windows-based product. Let's say same price, better specs, battery life, display, the list could go on. Many Apple cult members would still go with the Mac, but for impractical reasons. "Well, I just think macOS is better." "Okay sure, fair enough, but what about the fact that macOS is much more restricted in available programs, and that you're way more locked into Apple's control, not to mention their nonsense repair situation?" "Well... Okay, but Macs are better for productivity! You know, things like Final Cut!" "Okay, sure, it's great software. How often do you edit video?" "Three times a year. Family vacation videos/photos and another nonsense thing once in a while." And then their heads kind of explode. I think it's totally fine to like a product for a "second type of cool", where it's based on style and a preference you maybe can't even describe or quantify. It's another thing to lie to yourself and join a religion based on that lie. Wow, big rant, sorry for that lmao
  2. It's tough to explain, and I'm sure someone could do it better. But it's this sort of idea that one brand or line of products is "better" because of the name/reputation, rather than for practical reasons. For example, think about the RX 480 and GTX 1060. Both nice cards, and really neck and neck with each other in just about all situations, certainly in gaming. Yet, despite the two products being extremely close, the 1060 vastly outsold the RX 480 because "it's Nvidia", rather than "it's the better product", if that makes sense. People perceive Apple as, let's say, higher quality than other manufacturers, but this is debatable. Yet because "It's Apple!", 9-year-old laptops with half the performance still go for many hundreds of dollars, when a newer-production laptop would of course perform much better, have better battery life, and the list goes on. Point being, a product is perceived as "better" based not on practicality, but on brand recognition and so on.
  3. So I was really lusting after the 1950X but of course didn't have the budget. But I got to really looking, and when you look at something like an E5-2650 v2, it looks to have very similar single-core performance to a first-gen Threadripper, and if you go dual socket you're basically just under a 1950X from what I can tell. And they're pretty darn cheap. Comparing that to the price of a Threadripper 1950X and TR4 motherboards and so on, I feel like for my uses, the older Xeon option would be sufficient. I do game, and I'm using an RX 480 right now on a dual Xeon X5675 rig. The bottlenecking is minimal, other than in games that are heavy on a single core, or of course at high framerates. Other than that, it's been great. I'm thinking of maybe upgrading to an RX 5700, maybe an XT. I feel like at the 1440p I'd be using it for, even what I have now would be sufficient. But then to add 8 more threads with two generations of IPC increase at similar clock speeds... It seems like a no-brainer for what it would cost to get it all set up. The napkin math I'm going off is sketched below. What are your guys' thoughts on this? Thanks!
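     A rough way to sanity-check the "just under a 1950X" hunch: per-core score times core count, with a haircut for dual-socket/NUMA overhead. Every number below is a placeholder guess for illustration, not a benchmark I've run, and rig_score is just a made-up helper.

         # Rough multi-core throughput guess: per-core score x cores x a
         # dual-socket scaling penalty. All numbers are placeholder guesses.
         def rig_score(per_core, cores, scaling=1.0):
             return per_core * cores * scaling

         dual_e5_2650_v2 = rig_score(per_core=1750, cores=16, scaling=0.95)  # 2x 8C, NUMA haircut
         tr_1950x        = rig_score(per_core=1850, cores=16)                # 16C, one socket
         print(f"2x E5-2650 v2 ~{dual_e5_2650_v2:.0f} vs 1950X ~{tr_1950x:.0f}")
         # -> ~26600 vs ~29600: "just under", per these guesses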
  4. I think it has a lot to do with "mindshare". Same situation with old Macs. Excellent construction is nice, and they sure look great. But there's no reason a 2011 laptop should be going for $500+. And yet they are. It's insane.
  5. Hey gun gang, I've really been thinking about what more modern full-size handgun to get into. I've really been leaning towards an M&P 2.0 9mm. I've had the opportunity to shoot the gen 1 and it felt good. I've shot Glocks, and while I enjoyed them, I found them blocky and maybe not for me. For those of you who have maybe several of the major offerings (Glock, M&P, XD series and so on), which have you ultimately gravitated towards most?
  6. So the thing is... When you look at something like a 2080 Ti for example, even an i9 9900K can't fully use it at 1080p with high framerates. So it's all relative and depends so much on how you play games. But stepping down to an RX 5700 / XT with a 2700X, I think that's a good pairing and you'll be just fine. I expect SOME bottleneck if going for 1080p absolute max FPS. Stepping up to 1440p, for example, there would be virtually no difference. What really matters is whether you're going for absolute maximum frames per second at lower resolution; a crude model of that is sketched below. Also, I agree with Stormseeker. This will be a REALLY nice setup for 1440p 120Hz+ gaming.
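     The way I think about CPU/GPU bottlenecks is basically: delivered fps is the lower of what the CPU can feed and what the GPU can render. The fps caps below are invented to show the shape of it, not measurements of any real game.

         # Crude bottleneck model: delivered fps ~ min(CPU cap, GPU cap).
         # The caps below are invented numbers for illustration only.
         def delivered_fps(cpu_fps_cap, gpu_fps_cap):
             return min(cpu_fps_cap, gpu_fps_cap)

         # Hypothetical 2700X + RX 5700 XT:
         print(delivered_fps(cpu_fps_cap=150, gpu_fps_cap=220))  # 1080p: CPU-bound -> 150
         print(delivered_fps(cpu_fps_cap=150, gpu_fps_cap=130))  # 1440p: GPU-bound -> 130

     At 1440p the GPU cap drops below the CPU cap, which is why the CPU choice stops mattering much there.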
  7. Really? I mean if that's the case, I could up my budget by like $150 for a refurbished Swift 3 w/ i5-8250U and MX150. I figure a GT 1030 should have no issue running something like GTA 5 at 1080p low settings at 60fps, give or take? Can the MX150 be turned off, or are you stuck using it all the time instead of the iGPU?
  8. For a while now, I've been using an Asus E403S, and honestly, it's not that great. I got a good deal on it and it's been just about adequate for the basic things I need to do, but bad single-core performance and only 4GB of RAM really make it chug. The screen, despite being 1080p, just isn't very nice to look at. Still, it's fairly light and easy to carry around. I also have a cheap Dell laptop with an i3-4010U. MUCH snappier and a better overall experience, but the thing is pretty bulky and heavy. I'm thinking I'll probably just sell them both off and finally look towards a more adequate laptop that really does what I need it to do without any major sacrifices. I'm totally fine with going used. No issues at all with that. For a long time, I've been eyeing an Acer Swift 3 w/ an 8th-gen i5 (the 8250U I believe?) and MX150 for just a bit of basic gaming capability if/when I'm away from home. Still, that bumps up the price significantly. I'm seeing refurbs w/ the i5-8250U without the MX150 for $500 direct from Acer on eBay. I mean, that seems like a hell of a deal, but I still feel like I'd be sacrificing that basic gaming performance. Not that I expect to be gaming much anyway, but it's a nice bonus, you know? So what is it that I do? Well, most of the time, basic office-type tasks; I run a couple of websites, nothing too heavy. But I do also make videos. Not terribly often, but enough that I want a half-decent editing experience when I need to do it on the go. And I know the 8th-gen i5 is going to be MUCH better overall. It's basic stuff, 1080p60, no color correction or anything like that, or even any editing that's too crazy extensive. Keep in mind, I edited 1080p video on an old laptop that you wouldn't believe... It wasn't pretty, but I made it work back then. So I'm not picky. It just needs to be good enough. And I think even a 7th-gen i5 would be MORE than fine. I know, I probably want a bit too much for $500, even used. Maybe I could go for something older, maybe with a 940MX? I'm just not sure. Overall, I need:
     - Good battery life (8 hours typical?)
     - Adequate video editing capability for basic 1080p60
     - Adequate GPU for... maybe 1080p 60fps minimum settings? Looking to play older games, as well as Fallout 4, GTA 5, things like that.
  9. Hmm... You could probably be looking at something like a few Dell OptiPlex compact models with i3s, for example. For a basic media PC for YouTube and watching some movies in 1080p (can't attest to 4K video performance), they'll be very adequate. I'm currently rocking one as a simple file server and it works just fine! Check out OptiPlex 3020s on eBay, especially the small form factor "SFF" ones. 4GB of RAM on a lot of those... Hmm. I think if you're literally just looking to open a handful of tabs in Chrome, do some basic office-type tasks, watch movies or whatever the case may be, it should be fairly adequate, even with 4GB of RAM and the integrated graphics. Would love to hear what others think. And yeah, these can be had for WELL under $100 each. In fact, you can often find them with a hard drive, Windows already installed, the 4GB of RAM in place, and you're basically off to the races for $75, maybe even less. MAYBE upgrade to some cheap 128GB SSDs for added speed? Trust me, it'll help A TON if you're hitting that 4GB RAM limit and spilling into the page file. On an HDD, that will slow your system to a CRAWL. On an SSD, it'll still be generally adequate.
  10. For quick backstory, I got my One S about a year and a half ago and generally really like it. I also went with a PS4 Slim during that time and enjoy it quite a bit as well. What I've noticed, however, is that a used or refurbished One X can basically be had for pennies on the dollar compared to its value, in my opinion. In fact, if I sell off my Xbox One S with one of the controllers, it would basically cost me just $100 to upgrade to the X. It would look tons better on my 4K TV as well. I actually already have a "gaming" PC, and in fact one that is very comparable to a One X: a dual Xeon X5675 w/ an OC'd RX 480. Hell, for all intents and purposes, it practically is a One X. But I'm also someone who just can't shake console gaming and still enjoys it quite a bit. Considering what the upgrade would cost, about $100, do you see it as being worth it? To me, it seems like a no-brainer. But the next-gen consoles aren't THAT far away. Still, I have a feeling they won't perform radically better in the real world than a One X. And if they did, I could always sell off the X and buy the next-gen console, also cutting my costs... What are your guys' thoughts? Like I said, it seems like a no-brainer, but I'd love to hear your suggestions.
  11. So this is my 10,000-foot understanding of what ray tracing is, how it works, and how Nvidia cards are driving it now. One thing that seems to be happening is that RT cores merely ACCELERATE ray tracing, and aren't responsible for a card being able to do it per se, correct? My evidence for this is that even when ray tracing, the cards still lose performance, which sounds like it's so demanding that it cuts sharply into the rest of the card. Or something like that? Now, I could again be totally off, but aren't other Nvidia cards now able to ray trace without RT cores, using some kind of generic method in DX12? So if that's the case, why can't AMD cards do the same thing right here, right now? Has it been specifically adapted to Nvidia's hardware and so isn't quite ready to run on other cards? Regardless, it all just sounds like RTX was a brilliant scam that was all marketing and no substance. I guess I'm preaching to the choir on that one, though. But it seems like the equivalent of selling 8K TVs when 4K content was hardly available a number of years ago. The latest and greatest thing that does absolutely nothing for the user. Not well, anyway... Pretty much snake oil until it reaches its potential. This is sort of a rant, but I'm also looking for a little more info on how other cards are able to ray trace without RTX, and also why AMD cards are not able to do it currently, or maybe ever on what's currently out there. A toy example of what I mean by "ray tracing is just math" is below.
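     To illustrate why I say it's "just math": here's a single ray-sphere intersection test in Python, the core operation a ray tracer does over and over. Nothing about it needs special hardware; RT cores just run enormous batches of tests like this (plus the BVH traversal that decides which tests to run) in fixed-function units, while a regular shader core, or even a CPU, grinds through the same arithmetic more slowly. This is a toy sketch, not how any driver actually implements DXR.

         import math

         def ray_sphere_hit(origin, direction, center, radius):
             # Solve |origin + t*direction - center|^2 = radius^2 for t.
             # Assumes direction is normalized, so the quadratic's a = 1.
             ox, oy, oz = (origin[i] - center[i] for i in range(3))
             b = 2.0 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
             c = ox*ox + oy*oy + oz*oz - radius*radius
             disc = b*b - 4.0*c
             if disc < 0.0:
                 return None                    # ray misses the sphere
             t = (-b - math.sqrt(disc)) / 2.0
             return t if t > 0.0 else None      # nearest hit in front of the origin

         # One ray, one sphere. A GPU runs billions of tests like this per second.
         print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0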
  12. Maybe it's just... Well, let's be real. The "average" consumer probably doesn't know their ass from their elbow about what they're shopping for when it comes to CPUs. They go into Walmart or Best Buy, see a "gaming" or "office" PC, and just get what sounds like it'll fit their needs. And so, while I guess it's technically true that this clocks up to 2.4 GHz, it's sort of fucked up to even market it that way, because it doesn't perform how one would expect, if that makes sense? Like... "Oh, this i3 is 1.6 GHz. Wow, but that other one is half the price, has four cores, and goes up to 2.4 GHz! It's GOTTA be faster and better!" I dunno, I'm just being a pain in the ass I guess.
  13. "Real-world usage" as in I guess.. Regardless of what benchmarks and specs say in theory, how does it ACTUALLY perform when using it? And despite how great the N3700 SHOULD be for a low-cost low power consumption laptop in the equation, it still just feels sluggish. Like even opening a new tab and quickly going to Google feels slower on this thing than my old laptop from 2007. Granted, it's totally fine even on that old thing (so long as you don't cap out the RAM). And like I said, I figure at a point, your CPU isn't going to speed up Windows anymore and it's going to start to come down to storage speed, things like that. But it just happens to be noticeable with this laptop, which is why the whole thing perplexes me to be honest. I'll probably be upgrading to something better that doesn't sacrifice battery life sooner rather than later, though so I guess it's not the most relevant thing ever. But I was still curious about the whole thing.
  14. So... I have a few computers that I use. For when I'm just out and about and need some very basic computing power, I'm generally running an Asus E403S (Pentium N3700 w/ 4GB RAM). I guess it's like the ultra cut-down version of a Zenbook, or whatever lol. For when I'm on vacation or whatever, I usually bring along my Dell laptop, which has an i3-4010U. It's not great by any means, but it's much snappier and does what I need it to do overall. Some basic GIMP work, basic 1080p video editing. Ya' know, doing a bunch of cuts where I need them, things like that. Nothing too heavy, so it works just fine. Plus, I edited 1080p video for WAY too long on an old AMD Turion laptop from 2007, so I'm not too picky when it comes to doing these things on a laptop lmao. And finally, my main rig right now is a dual Xeon X5675 w/ 24GB of RAM that I got for way cheap and souped up myself.
     But I can't help but ask this: why does it seem like clock speeds on low-powered mobile CPUs are a lie? For example... the i3-4010U runs at 1.6 GHz as far as I remember. Everything is snappy enough, given the horsepower it has. On Cinebench, it scores around 60 single-thread. It still feels just fine w/ an SSD, so I'm SURE a lot of perceived performance and snappiness doesn't come from the CPU alone. At least in my experience. But then look at this Asus laptop w/ the Pentium N3700. It scores ~40 on Cinebench single-core, but feels way more than 33% "slower". Not that it's an easy thing to quantify; it's sort of a "I know it when I feel it" type of thing.
     And another weird thing I notice: the Pentium N3700 runs at 1.6 GHz and boosts to 2.4 GHz. So why does it feel so damn slow in real-world usage? Yes, I'm aware it's a low-power CPU, so if it's hitting a power limit, I get that it's going to slow down. BUT, it's consistently hitting the advertised 2.4 GHz on all cores. I'm SURE that while the RAM in a budget laptop like this isn't the best, it's "good enough" not to be the reason the whole system feels like it's crawling. I also get that the storage on these things is often that eMMC stuff, which (I believe) is usually about as fast as a mechanical hard drive despite being solid-state.
     If I assume it's true that this thing is hitting 2.4 GHz, and that it scores 33% worse than a 1.6 GHz i3-series CPU... that could ONLY mean that, for whatever reason, the IPC on these particular lines of CPUs is ATROCIOUS, only half as good as the standard Core series (rough math below). My question is, WHY? Are these architectures (Airmont for the N3700, I believe) specifically designed to hit super low power usage, but for some reason have inherently terrible IPC? Because that seems to be the case here. And then why do they even exist in the first place? Is it really cheaper for Intel to have a whole dedicated line for this junk, as opposed to the Core M series? Because THAT is more indicative of what you'd expect out of these kinds of low-powered CPUs that are clocked really low but still offer respectable performance for what they are. That doesn't happen with these. Why the hell do they exist, then, as opposed to just releasing low-clocked Core Ms to the OEMs? Even the lowest-performing Core M I can find (the 5Y10a) has similar Passmark scores, but I GUARANTEE it absolutely wrecks an N3700 in real-world performance in all aspects... I guess this is just a huge rant, but also me trying to wrap my head around all this.
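     Putting rough numbers on that IPC hunch, using the Cinebench figures quoted above (informal scores from memory, not careful measurements):

         # Per-GHz single-thread score as a crude IPC proxy, using the
         # rough Cinebench numbers quoted above (from memory, not lab data).
         chips = {
             "i3-4010U (Haswell)":      (60, 1.6),  # (single-thread score, GHz)
             "Pentium N3700 (Airmont)": (40, 2.4),
         }
         for name, (score, ghz) in chips.items():
             print(f"{name}: {score / ghz:.1f} points per GHz")
         # -> ~37.5 vs ~16.7: the Airmont core does well under half the
         #    work per clock, which matches the "feels half as fast" hunch.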
  15. I've heard really good things about the Zenbook, but I'm currently running a cheapo version of Asus' lineup, a VivoBook. I think you probably can't go wrong with either line, as long as you keep in mind the specs you need. You should be getting a decent 1080p display and decent battery life out of any of them, I reckon. You might consider looking at something like an Acer Swift 3 w/ an i5-8250U (Intel REALLY ramped up their mobile CPUs with 8th gen) and an Nvidia MX150. I think that thing will give pretty much anyone all the horsepower they'd need from a laptop. Past that, you start looking into higher-power bulky laptops, or just at desktops. Only thing is, I'm not sure if the Swift 3 has an option for 16GB RAM, and I believe the RAM is soldered on, so you'd need to make sure you get what you need from the start. Hope that helped!
  16. So my current main rig is a dual X5675 workstation, paired with an RX 480 for some gaming and whatnot. It does an okay job in most games, but of course some are horrendous due to CPU bottlenecking. And while it's pretty damn good for the $250 I spent on the whole thing, it definitely has its limitations and quirks. My previous machine was an FX-8350 rig, same GPU. I've really been looking hard at Ryzen 3000, preferably a 3700X. But it occurred to me... am I really losing a ton by going with the 1000 series? Namely a 1700 overclocked to 4 GHz, or as close as I can get to it? It seems like the biggest benefit of Zen 2 is really going to be the higher clock speed potential. BUT, my uses are a combination of gaming (typically above 1080p) and productivity tasks: some video editing, very basic photo editing, some Handbrake sort of stuff. It seems like even with Zen (1), 1440p gaming basically removed any major advantage Intel had, for all intents and purposes. It is two gens later, though, and I'm not so delusional as to think a Ryzen 7 1700 overclocked to 4 GHz is going to do as well as an i9 9900K at 1440p. But how big is the difference? For my uses, would I be totally gimping an RX 5700/XT by going with one?
  17. It seemed like back in 2015, the Fury series was pretty badass! I think it was relatively competitive with the 980 Ti, and the 390 competed with the 980. Which makes sense, because the RX 480 was neck and neck with the 390 and 980, so that seemed to carry over nicely. But whenever someone does a recent test of the Fury, it seems to fall a good deal behind the 1070, which should be about neck and neck with the 980 Ti. What exactly is it about the Fury that makes it hold up so poorly in more recent games? Power consumption and all that stuff aside, it feels like it should be pretty close on paper, yet it seems to perform more in line with the 590, rather than making that next jump to 1070-level performance. What gives?
  18. That... actually seems like a real powerhouse. Seriously... And it looks like the single-threaded performance is significantly better as well. I was looking at them before, but saw a low base clock compared to the X5675 and figured the Passmark scores were lying, not realizing they'd all boost to 3 GHz. I guess that's the benefit of a newer architecture...
  19. Definitely not... But since it does well in most games without a huge bottleneck on the GPU, I just kind of accept it. The other weak point is having to add SATA III and USB 3.0 via PCIe cards. Not ideal, ha. So yeah, definitely some downsides, but all things considered, I never felt it had any dealbreakers or major weak points. It all depends on what you're looking for, though. A lot of people would be disgusted by the lack of native SATA III and USB 3.0, but overall, I was willing to accept somewhat gimped SSD performance and having to add it in. I'd totally understand the criticism from those who'd want to avoid the hassle, even if doing the upgrades themselves was no big deal.
  20. Does seem pretty crazy, those Ryzen 1st gen prices. Maybe I dump this dual socket guy and almost pay for a whole system with a 1700 and 16GB RAM. Not a bad option, all things considered. Plus, it'll look a lot prettier.
  21. For a while, a ton of people were building Xeon X5600-series systems. I believe Tech YES City started the trend. Me? I ended up going with a Dell T7500 workstation, swapped in X5675s for the old E-series Xeons that came with it, upgraded to 24GB of RAM, and put in a USB 3.0 PCIe card for good measure. The result? A $250 gaming and productivity rig that matches or beats a Ryzen 1700 at stock speeds, and that doesn't bottleneck most games with an RX 480, with the exception of the ones that only use one or two threads. Other than that? ~90%+ GPU utilization in most games. Not too shabby! It uses some more power, and is limited in some other ways. But price to performance? It was IMPOSSIBLE to beat last year, especially if you were willing to forgo overclocking support in favor of a dual-socket 24-thread monster workstation (the kind of napkin math I mean is below). But I have to wonder. While these things have come down in price, it seems like the Sandy Bridge stuff is starting to come down too. Is there a newer Sandy Bridge Xeon that seems to be the new "budget king"? I guess any of the lower-core, higher-clocked CPUs would be fine for gaming, relatively speaking. But me? I'm really looking at the whole package: a high-core-count processor that performs well enough in games, but also presents a serious value proposition in multi-core performance compared to newer mainstream desktop processors. Anyone?
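     By "value proposition" I mean dollars per point of multi-core benchmark score. The prices and scores below are placeholder guesses for illustration, not real quotes or results:

         # Napkin math: dollars per 1000 points of multi-core score.
         # Prices and scores are placeholder guesses, not real quotes.
         builds = {
             "Dual Xeon X5675 (used)": (250, 11000),  # (total $, multi-core score)
             "Ryzen 7 1700 (newer)":   (400, 13500),
         }
         for name, (dollars, score) in builds.items():
             print(f"{name}: ${1000 * dollars / score:.0f} per 1000 points")
         # -> ~$23 vs ~$30 per 1000 points: the used workstation wins on
         #    value, per these guesses.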
  22. I don't see how something like a used RX 480 8GB or something along those lines wouldn't serve you well in this role. If you go through a few eBay listings, you can almost definitely grab one for $110 tops. So you'll still have most of your money left from getting rid of the 1080 Ti.
  23. Really impressed by the 2600K. Not too bad for such an "old" CPU.