straight_stewie (Member, Content Count: 2,611)

Everything posted by straight_stewie

  1. You should probably know that I don't own a single Apple product, and likely never will... I'm just saying. It's not a new idea to use ARM on devices, even PC-like devices: The Surface Pro X, Microsoft's own flagship device, which runs Windows 10 no less, uses ARM. Your phone almost certainly uses an ARM product. Your modem and/or router probably runs on ARM. Cars from 2010 and newer run on ARM. Your TV or monitor runs on ARM. In fact, these days, pretty much everything that's not a PC runs on ARM. Wanna know an even bigger secret? Your x86/amd64 machine also runs on ARM. It's just hidden away. Modern x86/64 processors require a lot of help to start up and stay running. Help that comes from littler processors that do things for them. Littler processors that, these days, often come from ARM.
  2. If by "card" you mean "software compiled for that OS on that architecture", then yes. They'll have to distribute the SDK; otherwise they will start losing users when people realize that literally no software exists for their machine. Maybe. That depends on whose responsibility is what. If Apple is smart, they'll develop some PCIe IP for their new processors, so that they can just distribute the privileged SDK to hardware manufacturers, who can start providing drivers that will allow their existing cards to work. If Apple is even smarter, they'll contract with one of the three major desktop GPU manufacturers (well, soon to be three anyway) to get them to produce variants designed specifically to work with some proprietary solution, because that sounds flipping GREAT for marketing, and has the highly profitable side effect of requiring your users to purchase all upgrade and repair parts from you, which fits very nicely with Apple's current repair infrastructure. Now that I think about it, they'll end up partnering with Nvidia, to get those sweet, sweet discounts on ARM licensing fees. As an aside, it's not like ARM is some newfangled thing that no one knows how to use. Yet again Apple has managed to pull off being behind while making it seem like they are these huge innovators. But whatever. The vast majority of consumer compute devices (including home appliances) run an ARM processor of some type. Windows and Linux already have ARM versions, and pretty much every cellphone (smart or not, and including iPhones) uses an ARM processor.
  3. In the lead how? AMD's current 7nm offerings still lose to Intel's 14nm offerings in basically every real-world performance benchmark. This has been the case for two generations now. I don't see any reason why the move to 5nm itself will change this. I'll be very blunt here: from my perspective, it's a very tough sell to make any other metric more important than raw performance in desktop processors. I couldn't possibly care any less about what node size a manufacturer is using. The sole care should be benchmark results, and retail cost if that's your jam. Also, welcome to the forum! Please remember to quote users when you reply to them so that they get a notification that you've replied.
  4. I can say with 100% certainty that this is false. Intel has already reached 10nm, just not on socketed processors. However, if you'd read the OP, you'd know that they plan to release 10nm socketed processors in late 2021.
  5. Big-little can bring performance improvements. While it's true that it commonly means "purposeful performance hits", that's only because the only devices that use it are battery powered. There is a problem in processor design: when things get too far apart, you have to slow signalling down, because the signals can get "skewed", resulting in timing problems. This is actually a pretty large part of the reason why processors with more cores also generally have lower clocks. However, with a big-little architecture, you can design smaller "cores" focused on more specific tasks, each with their own clocks. So the bigger, fully featured cores run at slower clock speeds, while the smaller, more specialized cores can run at higher clock speeds. This could actually bring noticeable performance gains: most non-OS-call work is still just branching and integer math.
  6. There are a few different strategies. The first, most common, and easiest is to just not use extensions. Another way is to write multiple functions and control which ones get compiled with some flags. This way, you can compile the same application to take advantage of different extensions, or none if they are not available. A third way is to be dynamic about it: write a program that checks at runtime what extensions are available, and calls the appropriate function(s). A different way is to use a JIT, like the CLR or the JVM. These can decide at runtime how to compile your intermediate code (CIL or bytecode respectively) into machine instructions. @Sakuriru Not always: https://github.com/xoreaxeaxeax/sandsifter is an application that runs "random" opcodes and sees what happens. It has found many cases of undocumented instructions, some of which produce illegal behavior.
  7. It's not necessarily a bad thing. Some customers collect components for a long time before final assembly (how you build a top shelf machine with very low income). What this means is that, for example, one might have some DDR4 memory which they'd rather not just throw away, but could still benefit from PCIe 4. There is a market that will benefit from this kind of launch. As for the "just go for AMD" response I'm likely to get from that, good luck getting full performance out of 2 of these memory kits with an AMD processor.
  8. Caching is complicated, to say the least. It's easy enough to understand the very basics of what a CPU cache is. Understanding how it really works is fairly hard, and predicting whether or not it will actually improve performance for a given application is next to impossible. What I can tell you is that, for a well designed cache and cache policy, having a cache will improve performance over having no cache at all for most data-bound tasks. Having a larger cache or a better cache policy can improve performance even further. However, having a cache is unlikely to improve performance much for most IO-bound tasks. Which is where things get difficult for games specifically: games are generally both data and IO bound. So predicting whether more cache equals more better is, well, next to impossible without some serious, in-depth, application-specific analysis. My suggestion is: don't worry about how much cache a processor has when you are shopping. Modern processors are pretty well built, and generally fit nicely into their performance categories.
  9. I meant that you should google specifically for problems with the updates you received during the time it stopped working.
  10. Start googling, lol. Some update you got while you were away broke it. You gotta use Google to figure out which one.
  11. They might. The idea I'm working on is that if it was working fine before, and now it's suddenly not, then something absolutely had to change between then and now. Software updates are frequently a cause of this kind of issue.
  12. I think that both are scary. I think that alien contact (with the ability to travel to our planet) right now is extremely scary. I am personally scared of that idea. But more than that, I don't think that overall, we could treat them correctly enough to build a positive relationship. But, it's also scary to think that we might never be advanced enough to handle first contact.
  13. I'm so sick and tired of every single thing under the sun becoming a subscription service. PaaS (product as a service) is one of the worst things to befall the capitalistic world IMO.
  14. What date did you notice the stuttering on? The 20th or the 21st?
  15. Do you have backups? Can you roll back the update and see if that fixes the issue?
  16. Apple finally gets something that everyone else has had for 15 years. It's a ripoff anyway.
  17. So it was working just fine and then two days ago suddenly stopped working correctly? If it was working fine, and then suddenly stopped, it means that something changed. Our task now is to find out what exactly changed. Can you think of anything that might have changed? Did any software do an update in that timeframe? Did you change where you put your laptop while gaming? Did you change any Windows, Steam, or game settings? Did you install any new software at that time? Did anything at all change that you can think of?
  18. Does the game always stutter, or only after you play it for a while? How long between when you closed the game and when you started the benchmarks up? Have you run these benchmarks before this issue, and not experienced this low component usage?
  19. Without going too in-depth here, all of these companies really started developing their cloud infrastructure in response to a DoD program (remember recently when Amazon kept suing over Microsoft winning the contract?). The same year this program was publicly announced, the text of a bill called the "Cybersecurity Information Sharing Act" was passed as part of the 2015 Omnibus Spending Bill. Among other things, this bill waives a lot of a business's data protection liabilities provided that it shares "cybersecurity threat information" with the office of the Director of National Intelligence. Basically, as long as a business shares its "cybersecurity threat information" with the government, it cannot be held liable for data breaches. And just like that, everyone had this big cloud infrastructure, and this great way to never be held truly liable for faulty data protection policies. Suddenly, the way to make money was to be nice to people: convince them that you are friendly and trustworthy, and that they can trust you to protect all of their cloud-backed data. It didn't hurt that having an open mind when it came to the definition of "privacy" bought you friends in high places. Of course, the key to all of this is to convince people that you are nice and friendly, and that you can be trusted. For Microsoft to do that requires them to get away from the common perception that they are big bad evil Micro$oft.
  20. The FAA just now finalized their list of requirements for return-to-service. Well, "just now" as in last month. https://www.faa.gov/news/media/attachments/19_035n-R3-8-3-20.pdf Just to be clear, the government-approved verbiage for "autopilot nose-dive into the ground" is:
  21. Just learn by doing projects. Google your questions. C# is a very popular language with lots of StackExchange activity and very good lessons, documentation, and code examples directly from Microsoft. A good starting point on C#/.NET is here: https://docs.microsoft.com/en-us/dotnet/csharp/ The Class Library API reference can be found here: https://docs.microsoft.com/en-us/dotnet/api/ The source code for .NET Core can be found here (useful once you get some experience under your belt): https://source.dot.net/ In all honesty, the best text-based beginner C# online courses come free with your Microsoft account: https://docs.microsoft.com/en-us/users/dotnet/collections/yz26f8y64n7k07 The best video lessons used to be on MVA, but the videos are now only left on Channel 9. The C# Fundamentals for Absolute Beginners series by Bob Tabor has to be one of the best video series out there: https://channel9.msdn.com/Series/CSharp-Fundamentals-for-Absolute-Beginners Microsoft expends a metric buttload of time and money pushing C#/.NET. They've raised the bar for documentation and onboarding. Start with their materials and move on from there when you need to. That's not been my experience in the US, at least not until the later years of undergraduate studies. In elementary school, high school, and early university you kind of just get what you get. I suspect that telling the teacher you don't like their curriculum is a fantastic way to get judged harshly. It would be more diplomatic to say that you are struggling to understand and are looking for extra help; at least, that's been my experience. Just as an aside to the whole thing: I absolutely love C# personally. However, it might not actually be the best beginner language. It's got a lot of things going for it, not least of which is its excellent and easily accessible documentation. But it's also a complicated system. You could probably earn a master's degree studying the type system alone. It's nearly PhD-level work to understand all the nuances of garbage collection (especially now that they've introduced managed types for native memory: Memory<T> and Span<T>).
  22. There's another technique called "virginning", a form of clean-room design. Basically, the engineers doing the reversing must never have even touched the product they are reversing. You feed them separated bits and pieces of it without telling them what it's from, and have them figure it out. You keep them in a separate entity from the rest of the enterprise, and then no one can be sued for copyright violation: the enterprise never violated copyright, and the implementers weren't witting partners in a copyright violation. This is what Compaq did to IBM when it reverse-engineered the PC BIOS.