
vertigo220

Member
  • Posts

    516
  • Joined

  • Last visited



  1. I've tried with and without FreeSync enabled. I'll try your other suggestion.
  2. As mentioned in the OP, I not only used DDU but also did a fresh install. I'm sorry, but there's just no way it's a CPU issue. First, as stated, it's 1440p, not 1080p. Second, a slow CPU would mostly just limit the average and max framerate; it shouldn't cause constant stuttering, especially in basic games and in cutscenes (and mine is not slow). Finally, yes, my CPU is older, but its performance is slightly above that of the recommended CPU for Fortnite, so it should handle the game easily, and certainly without constant stuttering. I'm hopeful someone can shed some light on something I've overlooked, but at this point it really looks like an AMD architecture and/or driver issue.
  3. Specs:
Kingwin LZP-750W Platinum PSU
Asrock X299 Taichi
i7-7820X
Patriot Viper Elite DDR4 16GB x4
Samsung 960 EVO 1TB (OS)
EVGA GTX 770 4GB / Sapphire Pulse 6700XT 12GB / MSI Mech 6750XT 12GB
1440p monitor connected via DP

I recently decided to finally upgrade my 770, despite the fact it was still doing what I needed for gaming, mainly so I could output 4K for a TV and also so I could switch from a 1080p monitor to a 1440p one. Unfortunately, the 6700XT I first tried had lots of issues, ranging from frequent frame drops/stuttering in games, to videos often freezing (audio kept playing) when toggling between windowed and fullscreen mode in my video player, to various other programs becoming unresponsive. I mainly play Fortnite, and the framerate drops were worse on the new card than on the 10-year-old 770, no matter what settings I used (DirectX 11 or 12, ultra high or low settings, fullscreen or windowed fullscreen, etc). I tried other games as well, and it stuttered multiple times even in the opening cutscene of Metro Exodus. I did use DDU in safe mode when switching to the new card, but just to be sure, I did a fresh install on another drive, tested with Fortnite, and it still stuttered.

I replaced it with a 6750XT, figuring the card might be bad, and while the non-gaming issues mostly went away (it still occasionally takes a second for a video or game to respond after switching to it, something I don't recall experiencing with my 770 or even my laptop with a Ryzen 4650 and integrated graphics), I'm still getting stuttering just as bad as on the 6700XT. It was even doing it, regularly, in games it should be able to run smoothly with practically no effort, like Action Henk and All-Star Fruit Racing. I don't know if AMD's drivers are just that bad, or if I've had two bad cards in a row, or what, but these games should run with zero issues, and I should be able to run Fortnite on medium, if not high (and certainly on low), settings without going below 100 fps, much less *regularly* hitting the teens and even single digits. There seems to be something very, very wrong, but I have no idea what.

It doesn't seem to be a CPU bottleneck, since my CPU is slightly better than the *recommended* one for Fortnite, and it's certainly more than enough for these other games. Also, the frame dips happen even when CPU usage is relatively low, and as mentioned, I didn't have this issue with my old card. I've read AMD cards have issues with throttling down, and I've set the minimum clock speed to just below the maximum as suggested, but that hasn't made any difference. I have noticed that at least some of the times the framerate dropped, if not many or most of them, the GPU usage (per AMD's performance logging tool) was low, which is very strange, as it suggests the stuttering may be happening because the cards aren't being pushed to their full potential (a rough sketch of how I sift the exported logs for that pattern is at the end of this list). Then again, other times the usage is high, which also seems strange when playing at low settings. It's even pretty high (>50%) just sitting on the desktop a lot of the time.

I've tried tweaking all sorts of settings, which shouldn't be necessary to run basic games on cards as supposedly powerful as the 6700XT and 6750XT, but to no avail. I've seen scattered comments here and there about people having similar issues, but nothing consistent, which seems strange considering I've experienced it with two different cards.

I should also point out that the AMD Adrenalin software sucks. To see the performance overlay in-game, it has to be set to always on top, but then the main window is on top as well, so it has to be minimized to be able to see the game. The Alt+R shortcut to open Adrenalin rarely works, at least while in-game. The layout is also not great; it's confusing knowing where to go to do certain things. And it's a pretty stupid oversight that you can browse to select the folder for the performance logs but you can't actually open that folder to view the files; there should be a button for that. It would also be helpful to be able to set it to automatically start logging whenever a game is launched.
  4. Yeah, I realize they had to have been working on it for years already and couldn't just come up with it last minute, but what I was saying is it seems like they may have had it ready to go and just sat on it since they didn't have much competition in that regard. But I can also see that may not be the case due to Apple pushing them to improve it, and based on that it seems it was perhaps a situation of too little, too late, with them getting to that point just as Apple was breaking off the partnership.
  5. Makes sense, but I find the timing too coincidental. Intel released Xe shortly after AMD started pushing greatly improved iGPUs, which tells me they had it ready to go (possibly for the reasons you mentioned, i.e. development for the purpose of competing with Nvidia) and only actually started using it when they had to, as a reaction to AMD, as opposed to taking it upon themselves to improve it beforehand. I suppose it's possible they just weren't ready yet, or it was already in the pipeline, but I just find the timing a bit suspicious.
  6. Sorry, I took your statement as meaning they weren't really competitors, not that they were but that AMD just wasn't a "major" one. Makes more sense, but still doesn't seem to align with what porina said and what I recall. I don't even remember IBM being very big at that time, as it seems they had fallen out of the consumer space at least by then, and I don't recall SPARC at all. All I remember is Intel vs AMD. But that's not to say there wasn't more at the time that I wasn't aware of, though it definitely does seem Intel was competing with AMD a good bit.

Ok, but I'm sure it's hard to know that. I imagine they weren't just blindly investing and they thought they would have need of it. Had the opposite happened and they didn't invest in it and needed it, as seems to be the case now, they could be equally criticized for that. Of course, it's also possible they just didn't do due diligence and it was a case of irresponsible management.

Interesting info, and I have no doubt that all played a role. It may likely have even been the majority of it. All I'm saying is Intel using anti-competitive actions to gain/maintain laptop market share over AMD didn't help. It both cost AMD money in lost sales and it cost them name recognition, both of which resulted in continued costs down the road. It's absolutely possible that's a drop in the bucket compared to the other factors you mentioned and I'm overestimating its impact, but it doesn't change the fact that Intel engaging in those behaviors was harmful and egregious, which was my original/main point, and I still suspect the lost revenue due to it had to have had at least some impact on those other things.

Now that I think about it, I seem to recall feeling back then that maybe AMD bit off more than they could chew. Instead of being able to put their resources behind ATI to make it better, it just spread them too thin, especially, as you mentioned, due to Intel releasing the Core product. So basically they made a couple bad bets one after the other. That certainly explains a lot.

Hindsight is 20/20, so easy for him to say that now, but back then, I suppose it wasn't so obvious.

What's their reasoning though for keeping a crappy iGPU for years and only improving it when AMD did?
  7. This is interesting as it's in direct contradiction with Wolram's claim that Intel didn't compete with AMD in the 90s, though your account is more in line with what I recall from the time. Though according to you (and I'm not calling it into question), AMD beat Intel on frequency, and then on performance as a result of that frequency, while I recall AMD being competitive with, and possibly better than, Intel at lower frequencies due to higher IPC in the mid-90s.

I didn't realize Intel had HT that early on. My first build was with an Athlon 64, due to it being generally regarded as better than Intel's offerings at the time as well as cheaper, but I constantly struggled due to the 1C/1T design, and after a couple years or so replaced it with an X2, which was night-and-day. Hard to say now, but who knows, maybe an Intel with HT from the start would have been the better option, though I'm guessing it was prohibitively expensive, especially considering it was another several years or more before HT became standard on all their CPUs instead of something you had to pay more for.

As for the process issues, that's fair, though I wonder how hard they were really trying, or if they didn't push as hard and spend as much as they could have due to not having much competition. It just seems awfully coincidental that they suddenly started making massive improvements, especially finally replacing the extremely old iGPU with a halfway decent one, only after AMD started becoming a real threat again. Just like how they came out with the C2D/C2Q and then the i-series, all of which were substantial improvements over what they had been doing, when the Athlons started taking away a decent amount of market share, and even then it took them a couple years, indicating they weren't even ready.

In any event, whether they were anti-competitive so they wouldn't have to try as hard, or because they were struggling with manufacturing, or for any number of other reasons, their actions were both reprehensible and harmful. If it was done to keep them from falling behind due to manufacturing issues rather than simply to allow them to be complacent, that's just as bad if not worse. Though I don't know what the exact timeline was, and it seems their fab struggles happened toward the end of all that.

Pentium D was bad. 386 to 486 to 586 were late-80s/early-90s. Maybe I'm wrong, but I just don't recall Pentium 2 and 3 being that significant, and Pentium 4 was a decent improvement, but then it took another five years or so for the Pentium D, which again wasn't good, evidenced by the fact they released the C2D only a year later, which was a massive improvement and set them off on a several-year run of very good improvements until they started to stagnate again. Perhaps that more recent case was due to manufacturing issues, as mentioned by @porina, though as I said in my reply to them above, I seriously wonder if they tried as hard as they could/should have to resolve those issues vs figuring they didn't need to because they didn't have the competition to force them to do so.
  8. I don't disagree it could be better, as with most things. An I/A would certainly be nice, but again, if a consumer isn't going to spend an hour researching such basic things, they shouldn't be building their own computer. It's really not hard to know what CPUs go with what motherboards. In fact, I'd argue it's hard NOT to, and you'd almost have to try to mess it up. Companies already do a lot to make it clear, and they also need to do what they need to, within reason, to sell their products, and marketing them with similar numbers to the competition, so consumers will look at them too instead of dismissing different numbers as not as good, is part of that. Again, I find it interesting how people take issue with it in the CPU space but not the GPU space. It seems the main reason for that is potential compatibility issues, which don't really exist with GPUs, but again, all it takes is a tiny bit of research.

I guess it depends on what you consider a "review" and where you're going for them. I look at various sites, both at reviews and general articles. Considering you're going places with titles like that, I can understand why you feel the way you do. I've seen plenty of good and bad about both, but again, the general trend is that AMD has been outperforming Intel overall lately. They have better multi-core, better efficiency, and better graphics, all for typically a lower price. It's okay to be a fan of Intel, but it's not okay to ignore objective facts and substitute them with personal opinions, lol.

As for the rest, as I said, if you can't understand how loss of profits today affects a company's performance tomorrow, I can't help you.
  9. That was my point. Putting Ultra in the name may be a way to intentionally mislead customers into thinking something is much better than it is.

Yes. They released the Pentium in the early 90s, and then didn't do much for the next decade, essentially making incremental improvements and eventually releasing the Pentium D, which was pretty bad. Whether or not they were competing with AMD at the time is irrelevant, and I never said they were. They didn't have sufficient competition to push them along, so they were complacent. Only when AMD released the Athlon 64 and then the Athlon 64 X2 (which is what pushed Intel to go multi-core as well) did Intel get their act together and release the Core 2 Duo, etc, followed by the i-series chips. Then early last decade, they stagnated again, with multiple generations being only single-digit improvements over the last; then once AMD came out with Ryzen, which was a substantial improvement and started offering real competition, Intel started making bigger improvements again, with larger IPC increases, better iGPUs, etc.

I don't deny, and never said, there aren't other reasons for AMD's struggles. Also, Intel didn't do what they did in 2009; they did it for years and agreed to stop in 2009. So while yes, AMD did make mistakes of their own, being hamstrung, again for years, absolutely hurt them, causing a loss of profits AND market share, both of which have long-lasting detrimental effects. And I don't know exact timelines or their financials, so this is conjecture, but it certainly seems possible the loss of income from Intel's anti-competitive dominance in the laptop market contributed to the lackluster products you mentioned, as they had less money for R&D and less ability to take the time necessary to produce good products. People don't seem to understand the domino effect revenue (or lack thereof) has for a company.

As for video cards, I don't follow them as closely, but my impression was that ATI was behind Nvidia when they were acquired by AMD, so it's not that AMD mishandled things and caused Nvidia to pull ahead, but that they started already behind and just haven't managed to catch up (though they're getting close, save for ray tracing). I could be wrong on that, though.

And I do prefer AMD over Intel, but I can still recognize facts and timelines and the effects of various things. In fact, the main reason I prefer AMD isn't due to performance, but because I dislike Intel specifically due to their actions and complacency. Even so, I used Intel on my last desktop build because it was better suited for what I was doing. I don't play favorites; I look at the facts and choose the best and most deserving, whether in computer parts, politics, or whatever.

As for fabs, I'm curious why you'd think they shouldn't have invested in them and didn't/don't need them. One of AMD's biggest problems now is that they can't get their processors out in large enough numbers, which, if they had their own fabs, might not be (as much of) an issue. Then again, perhaps they would have had to spend enormous amounts of money keeping them updated, and I'd wonder if it makes sense for them to have their own, but Intel is building new fabs right now, so it seems to make just as much sense for AMD.
  10. That's a fair point, though internally they likely have NDAs, and while they may have whistleblower protection, that's a fairly new thing and still isn't a guarantee, and just because they would be protected legally doesn't mean they wouldn't still potentially lose friends, receive threats, etc., if they reported something like this that cost the company a lot of money and jobs. There's also group mentality to consider, and if it's the norm and is generally accepted as a way to collect data to keep them employed and/or make more money, people will often do things that aren't right. And I say that from experience, from seeing it happen all the time. Of course, the larger the organization, the less likely, so you're right, that does make it far less likely something like this would be going on with MS. But again, not impossible.

I should also clarify my concern isn't that they're stealing biometrics, which I do find very unlikely. My concern is more that having such low-level access, i.e. a chip that's integrated with the CPU, gives them an unprecedented backdoor capability. And if that's something they're doing in coordination with the government, there very well could be no whistleblower protection.
  11. Yeah, I realize that, but Intel is and has always been a bigger company with far more money than AMD, so I find it unlikely they would settle, especially since that gives the impression of guilt.

Do you really not realize how things in the past can affect things in the future? They spent years forcing large market share in the laptop space, which led to years of consumers both knowing the Intel name better and thinking Intel is automatically better, and of laptop OEMs continuing to use what they've always used, because it's what consumers know and "want" and it's what they've been working with and designing around for years. Not to mention that it cost AMD market share then, which cost them money then, which impaired their ability to do R&D and slowed their progress, resulting in a vicious cycle. If you can't understand that, I can't help.

And by the way, I'm in no way saying others, including AMD, don't do anti-competitive stuff too, but Intel is one of the bigger offenders, and they basically handicapped AMD for a long time. Between that and them spending years, multiple times, dragging their feet and only making major leaps when AMD offered significant competition, we are years behind where we should be technologically. They stagnated CPU development in the 90s and early 00s, and again early last decade, something they were able to do largely because holding AMD down meant they didn't have to compete against them.
  12. Except I see it as a help, not a burden, to the consumer, because they can look at it and say these two parts are directly competitive and roughly comparable (granted that's not always the case). You could also say it's a burden for them to have to try to compare two brands when they are significantly different in their naming. Nobody seems to take issue with i3/5/7/9 vs R3/5/7/9, or RTX 4080 vs RX 7800, despite them being so similar, because it allows a more-or-less direct comparison.

Most reviews I've read over the past few years have generally been in favor of AMD over Intel, including with otherwise similar models. AMD has almost as much single-core performance while having better multi-core performance, lower power usage, better efficiency (yes, those are two separate things), better iGPUs, and lower cost. There will of course always be exceptions, but in both professional reviews and in forums, I've seen a lot more praise of and recommendations for AMD than Intel in the past few years. And market share does not equal superiority. As I mentioned, AMD has suffered in laptop market share for years due to Intel's practices, and even despite that their share has increased dramatically over the past few years, because they are so good compared to Intel. As for benchmarking, I really hope you aren't referring to UserBenchmark.

Regarding the anticompetitive stuff, I don't need to provide anything, because, despite your claims that you couldn't find anything, you provided a link that says it all. Why would Intel have paid AMD $1.25B, agreed to abide by a set of ground rules, and entered into a consent decree with the FTC if they weren't engaged in anti-competitive behavior? And that link also details said behavior, with the very first point being the main one I read about years ago and the primary reason Intel has the market share you believe is due to better performance.

Exactly. There has to be a point at which people need to take responsibility for their own choices and actions. If somebody wants to spend hundreds or thousands of dollars on a computer without doing the most basic of research first, that's on them. People can say it's putting a burden on consumers, but companies aren't, and shouldn't be, responsible for holding consumers' hands, either. It's not about going overboard trying to make it idiot-proof, which nothing is; it's about people being lazy and not caring enough to bother spending an hour looking at a few noob-friendly articles on the topic, of which there are undoubtedly hundreds, if not thousands. There's a thing called the internet that exists now and has essentially the whole of human knowledge on it. If people can't be bothered to use such an immense resource to research before making a large purchase, that's on them.
  13. True, though again, MS is pushing for more user data in the cloud, including user accounts, so while not related to biometrics, it's still something to be wary about, and as such it's wise to be wary of anything they're doing. But I also never said they are doing it, just that, based on their and other companies' track records, I'm always suspicious and require proof they aren't before trusting them, rather than proof they are before not trusting them.

As for lawsuits, Google was found collecting location data even after users turned it off, and they weren't fined billions. I don't recall, but I don't think they were fined at all, and if they were, it was a very small amount. So that argument simply has no traction as far as I'm concerned. And location data is arguably more useful and invasive than biometrics, at least for companies like Google, MS, etc.

Supposedly, but if they were genuinely interested in using user data to determine what to work on, why are there issues that still exist after years of hundreds of people, if not more, complaining about them? I'm sure that's one use for it, but I suspect they both have other uses and don't really use it for that as much as they claim or people believe. Just the fact they keep trying to put advertising in their OS shows they are interested in user data, so it calls their motives into question. Heck, I contacted them a few years ago through their Feedback Hub, which I've become convinced is useless and is only there to make people feel better, telling them I'd found a way to consistently and easily crash every Explorer (file manager) window, and I never heard back, and I'm fairly certain the problem still exists. If they were genuinely concerned about what they should be working on, you'd think they would have followed up with me on that.

"Transparency" from a historically non-transparent company that consistently believes users shouldn't have a say in how things are done on their own computers, and that MS should be able to do whatever they want on them, means little to me. It's not unhinged or disingenuous to take a company's history, not to mention current actions, behavior, and principles, into account when determining how much to trust them and whether to question what they're doing. As I mentioned above, I never said they are doing these things, but that it's possible, and even likely, again based on their own actions. If they don't want people to make those assumptions about them, they shouldn't act in a way that all but begs for it. If it can be proven they aren't, then fine, but until then, I'm going to assume they are (note that's different from claiming they are, which, again, I didn't do).

@wanderingfool2 has added some insight into this, which is helpful as I don't know what is and isn't possible as far as determining what a closed-source OS is actually doing, though as I mentioned in my earlier reply to them, it still seems there are ways, however unlikely, that MS could hide such activity even from researchers if so inclined. And since they would know that researchers can inspect the traffic despite it being encrypted, using methods like key injection, if they wanted to siphon data they would certainly take additional measures to hide that.

Of course, ideally that would make it that much worse and get them into that much more trouble if caught, and ideally that risk would prevent them from doing it, but, as I mentioned, companies do illegal things all the time, and more often than not receive a minuscule fine as a "punishment."
  14. I guess I can see that, though anybody building their own computer should at least know enough to not mix AMD and Intel CPUs and motherboards, and if they don't, sorry, but that's on them, and they should have done just a tad more research. It's well-established and can likely be found easily with a bit of searching, so I don't want to get into it here. But there's a reason why laptops have historically been almost entirely Intel, and even now, with AMD being regarded by most as superior to Intel for a few years now, the majority of laptops continue to use Intel vs AMD.
  15. I'm not saying they're lifting prints off of doorknobs, and I agree that's highly unlikely (though not out of the realm of possibility); I'm just saying that, in general, it's wise to be suspicious/paranoid, because we have seen many, many cases of companies abusing our privacy and data. And history certainly matters, especially since MS isn't innocent in this, and as I said, companies do things they're not supposed to all the time, hoping to get away with it, often getting away with it, and when they do get caught, it's just a cost of doing business. MS has proven itself to be untrustworthy and unethical (though obviously everyone has different thresholds for this, and this is my opinion), and so deserves even more scrutiny and less trust. I also realize open-source isn't a guarantee, but it does mean it can be checked, whereas checking closed-source is much more difficult, if not impossible.

Of course they say so. But that proves nothing. Google and Facebook and other companies have also said things that proved to be lies. As I mentioned in my previous post and above, companies don't care about that; they just care about making money, and if they have to break the law and lie to do so, they often will. They cannot, and should not, be trusted. It's better to assume they're lying until proven otherwise than the opposite. And obviously researchers can investigate it, but as you mentioned, it will be encrypted, and for all we know it sends the data in bits at a time so as to not be obvious, so it would be extremely difficult to prove, if not impossible. I'm not saying it is done, just that it could be, and it's safer to assume it is.

I also don't know enough to know just how much researchers can determine, so your comment about injecting their own keys is interesting (there's a rough sketch of what that looks like at the end of this list). If that means they can read the encrypted data and monitor it fully, then that's certainly better. But, and I realize this is a bit tin-foily and unlikely, but it's not impossible, MS could have Windows alter its behavior in that case and/or send the data in bits and pieces over several minutes or even hours so it blends in with the rest of the telemetry.

I get that certain things seem overly paranoid, but given the history of companies' and governments' violations and general lack of respect for individuals' privacy, it frankly surprises me (though not really, sadly) how trusting most people are and how little they question things. If nothing else, MS could do a lot to improve people's trust in them by pulling back the reins on the telemetry, dropping the all-but-forcing of people to create online accounts for Windows, and actually providing explanations of what all the various, mysterious updates are doing, not to mention allowing users to easily opt out of individual updates. And of course many, many other things, the point of which is they don't exactly make it easy or provide reasons for people to trust them.
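
For anyone who wants to dig through their own logs the way I described in post 3: below is a rough sketch of how the CSV that Adrenalin's performance logging exports could be sifted for samples where the framerate drops while GPU utilization stays low. It's only an illustration, not a polished tool; the filename and the "FPS"/"GPU UTIL" column headers are assumptions that vary between Adrenalin versions, so check the header row of your own export and adjust the constants before running it.

```python
# Hypothetical sketch: correlate framerate dips with GPU utilization in an
# exported Adrenalin performance log. The file path and the column names
# ("FPS", "GPU UTIL") are assumptions -- check the header row of your own
# export and adjust them first.
import csv

LOG_PATH = "FPS.Latency.csv"  # assumed export filename
FPS_COL = "FPS"               # assumed column header
GPU_COL = "GPU UTIL"          # assumed column header (percent)

def find_stutter_samples(path, fps_floor=60.0, util_ceiling=70.0):
    """Yield (fps, util) for rows where FPS is low even though the GPU is
    not fully loaded, i.e. the 'card isn't being fed' pattern."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                fps = float(row[FPS_COL])
                util = float(row[GPU_COL])
            except (KeyError, ValueError):
                continue  # skip malformed rows or mismatched headers
            if fps < fps_floor and util < util_ceiling:
                yield fps, util

if __name__ == "__main__":
    hits = list(find_stutter_samples(LOG_PATH))
    print(f"{len(hits)} samples with low FPS and low GPU utilization")
    for fps, util in hits[:10]:
        print(f"  FPS {fps:6.1f}  GPU {util:5.1f}%")
```

If most of the flagged samples line up with the stutters, that points away from the GPU itself and toward whatever is failing to keep it busy (driver, scheduling, or something upstream).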
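
And to make the "key injection" idea from posts 13 and 15 a bit more concrete: this is roughly what traffic inspection looks like with mitmproxy, a common TLS-interception proxy. It's a generic sketch, not a claim about what Windows actually sends; the script name is made up, and it only works on a test machine that trusts mitmproxy's CA certificate and routes its traffic through the proxy (installing that CA is the "key injection" step that lets the proxy decrypt HTTPS).

```python
# Generic illustration of TLS interception with mitmproxy (hypothetical
# script name: telemetry_log.py). Run with:  mitmdump -s telemetry_log.py
# The test machine must trust mitmproxy's CA certificate and route its
# traffic through the proxy for decryption to work.
from mitmproxy import http

class TelemetryLogger:
    def request(self, flow: http.HTTPFlow) -> None:
        # Log method, host, path, and request body size for every
        # intercepted request so destinations and volumes can be reviewed.
        size = len(flow.request.content or b"")
        print(f"{flow.request.method} {flow.request.pretty_host}"
              f"{flow.request.path}  body={size} bytes")

addons = [TelemetryLogger()]
```

Even a log like this only shows where data goes and how much, not what it means, which is why I said a determined company could still blend things into normal telemetry.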