straight_stewie

Member
  • Content Count

    2,611
  • Joined

  • Last visited

Awards


8 Followers

About straight_stewie

  • Title
    Veteran
  • Birthday 1994-05-10

Profile Information

  • Gender
    Male
  • Location
    North Mississippi
  • Interests
    Audio, Programming, Engineering. Just a hobbyist now, unfortunately.

Recent Profile Visitors

5,651 profile views
  1. You should probably know that I don't own a single Apple product, and likely never will... I'm just saying. It's not a new idea to use ARM in devices, even PC-like devices: the Surface Pro X, Microsoft's own flagship device, which runs Windows 10 no less, uses ARM. Your phone almost certainly uses an ARM product. Your modem and/or router probably runs on ARM. Cars from 2010 onward run on ARM. Your TV or monitor runs on ARM. In fact, these days, pretty much everything that's not a PC runs on ARM. Wanna know an even bigger secret? Your x86/amd64 machine also runs on ARM; it's just hidden away. Modern x86-64 processors require a lot of help to start up and stay running, help that comes from littler processors that do things for them. Littler processors that, these days, often come from ARM.
  2. If by "card" you mean "software compiled for that OS on that architecture", then yes. They'll have to distribute the SDK, otherwise they will start losing users when they realize that literally no software exists for their machine. Maybe. That depends on who's responsibility is what. If Apple is smart, they'll develop some PCI-e IP for their new processors, that way they can just distribute the privileged SDK to hardware manufacturers and they can start providing drivers that will allow their existing cards to work. If Apple is even smarter, they'll contract with one of the three major desktop GPU manufacturers (well, soon to be three anyway) to get them to produce some variants designed specifically to work with some proprietary solution because that sounds flipping GREAT for marketing, and has the highly profitable side effect of requiring your users to purchase all upgrade and repair parts from you, which fits very nicely with Apples current repair infrastructure. Now that I think about it, they'll end up partnering with Nvidia, to get those sweet, sweet discounts on ARM licensing fees. As an aside, it's not like ARM is some newfangled thing that no one knows how to use. Yet again Apple has managed to pull off being behind while making it seem like they are these huge innovators. But whatever. The vast majority of consumer compute devices (including home appliances) run an ARM processor of some type. Windows and Linux already have versions for ARM, and pretty much every cellphone (smart or not, and including iPhones) uses an ARM processor.
  3. In the lead how? AMD's current 7nm offerings still lose to Intel's 14nm offerings in basically every real-world performance benchmark, and that's been the case for two generations now. I don't see any reason why the move to 5nm by itself will change this. I'll be very blunt here: from my perspective, it's a very tough sell to make any other metric more important than raw performance in desktop processors. I couldn't possibly care any less about what node size a manufacturer is using. The only things that should matter are benchmark results, and retail cost if that's your jam. Also, welcome to the forum! Please remember to quote users when you reply to them so that they get a notification that you've replied.
  4. I can say with 100% certainty that this is false. Intel has already reached 10nm, just not on socketed processors. However, if you'd read the OP, you'd know that they plan to release 10nm socketed processors in late 2021.
  5. Big-little can bring performance improvements. While it's true that it commonly means "purposeful performance hits", that's only because the only devices that use it are battery powered. There is a problem in processor design: when things get too far apart, you have to slow signalling down, because the signals can get "skewed", resulting in timing problems. This is actually a pretty large part of the reason why processors with more cores also generally have lower clocks. However, with a big-little architecture, you can design smaller "cores" focused on more specific tasks, each with their own clocks. So the bigger, fully featured cores run at slower clock speeds, while the smaller, more specialized cores can run at higher clock speeds. This could actually bring noticeable performance gains: most non-OS-call work is still just branching and integer math. (A rough sketch of how software can target a specific core follows this post.)
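A minimal sketch of the software side of this, assuming Linux with the GNU extensions available: the program pins itself to one chosen CPU with pthread_setaffinity_np. CPU index 0 is an arbitrary stand-in; which indices correspond to the big or the little cores is platform specific and would have to be read from the system (e.g. /sys/devices/system/cpu/).

```cpp
// Sketch: pin the calling thread to one specific CPU on Linux. This is the kind
// of control needed to place latency-sensitive work on a chosen core and leave
// background work to the others. CPU 0 is an arbitrary assumption here.
#include <cstdio>
#include <cstring>
#include <pthread.h>   // pthread_setaffinity_np (GNU extension)
#include <sched.h>     // cpu_set_t, CPU_ZERO, CPU_SET, sched_getcpu

int main() {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(0, &set);  // allow only CPU 0

    // pthread_setaffinity_np returns an error number instead of setting errno.
    int err = pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
    if (err != 0) {
        std::fprintf(stderr, "pthread_setaffinity_np failed: %s\n", std::strerror(err));
        return 1;
    }

    std::printf("now running on CPU %d\n", sched_getcpu());
    return 0;
}
```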
  6. There are a few different strategies. The first, most common, and easiest, is to just not use extensions. Another is to write multiple versions of a function and control which ones get compiled with build flags; that way you can compile the same application to take advantage of different extensions, or none if they aren't available. A third way is to be dynamic about it: write a program that checks at runtime which extensions are available and calls the appropriate function(s) (see the sketch after this post). A different way is to use a JIT, like the CLR or the JVM. These can decide at runtime how to compile your intermediate code (CIL or bytecode, respectively) into machine instructions. @Sakuriru Not always: https://github.com/xoreaxeaxeax/sandsifter is an application that runs "random" opcodes and sees what happens. It has found many cases of undocumented instructions, some of which produce illegal behavior.
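To make the "dynamic" strategy concrete, here's a minimal sketch assuming a GCC or Clang toolchain on x86, which provide __builtin_cpu_init and __builtin_cpu_supports. The function names (sum_scalar, sum_avx2) and the choice of AVX2 are just assumptions for the example.

```cpp
// Minimal sketch of runtime dispatch between an extension-specific path and a
// plain fallback, using the compiler's CPU feature helpers.
#include <cstddef>
#include <cstdio>

// Fallback that uses no extensions at all.
static long long sum_scalar(const int* data, std::size_t n) {
    long long total = 0;
    for (std::size_t i = 0; i < n; ++i) total += data[i];
    return total;
}

// "AVX2" variant: the same loop, but the target attribute lets the compiler
// vectorize it with AVX2 instructions for this one function.
__attribute__((target("avx2")))
static long long sum_avx2(const int* data, std::size_t n) {
    long long total = 0;
    for (std::size_t i = 0; i < n; ++i) total += data[i];
    return total;
}

int main() {
    __builtin_cpu_init();                              // populate the CPU feature flags
    const bool has_avx2 = __builtin_cpu_supports("avx2");

    int data[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    long long result = has_avx2 ? sum_avx2(data, 8) : sum_scalar(data, 8);

    std::printf("AVX2 available: %d, sum = %lld\n", has_avx2 ? 1 : 0, result);
    return 0;
}
```

GCC (and newer Clang) can also automate this pattern with __attribute__((target_clones(...))), which emits several variants of a function and selects one at program load time.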
  7. It's not necessarily a bad thing. Some customers collect components over a long time before final assembly (that's how you build a top-shelf machine on a very low income). What this means is that, for example, one might have some DDR4 memory which they'd rather not just throw away, but could still benefit from PCIe 4.0. There is a market that will benefit from this kind of launch. As for the "just go for AMD" response I'm likely to get from that: good luck getting full performance out of two of these memory kits with an AMD processor.
  8. Caching is complicated, to say the least. It's easy enough to understand the very basics of what a CPU cache is; understanding how it really works is fairly hard, and predicting whether or not it will actually improve performance for a given application is next to impossible. What I can tell you is that, for a well designed cache and cache policy, having a cache will improve performance over having no cache at all for most data-bound tasks. Having a larger cache or a better cache policy can improve performance even further. However, having a cache is unlikely to improve performance much for most IO-bound tasks. Which is where things get difficult for games specifically: games are generally both data and IO bound. So predicting whether more cache equals more better is, well, next to impossible without some serious, in-depth, application-specific analysis (the little sketch after this post shows how much the access pattern alone matters). My suggestion is: don't worry about how much cache a processor has when you are shopping. Modern processors are pretty well built, and generally fit nicely into their performance categories.
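As a hedged illustration of how much the access pattern (rather than raw cache size) decides things, the toy program below sums the same array twice: once sequentially, once in large strides. The array size and the stride of 16 ints (64 bytes, one typical cache line) are assumptions for the example, not measurements of any particular CPU.

```cpp
// Toy demonstration that the same amount of work can behave very differently
// depending on how friendly the access pattern is to the cache. Sequential
// traversal reuses each fetched cache line; a large stride wastes most of it.
// This is an illustration, not a rigorous benchmark.
#include <chrono>
#include <cstdio>
#include <vector>

int main() {
    const std::size_t n = 1 << 24;          // ~16M ints, far larger than typical caches
    std::vector<int> data(n, 1);

    auto time_sum = [&](std::size_t stride) {
        long long total = 0;
        auto start = std::chrono::steady_clock::now();
        // Visit every element exactly once, but in stride-sized jumps.
        for (std::size_t offset = 0; offset < stride; ++offset)
            for (std::size_t i = offset; i < n; i += stride)
                total += data[i];
        auto stop = std::chrono::steady_clock::now();
        double ms = std::chrono::duration<double, std::milli>(stop - start).count();
        std::printf("stride %4zu: sum=%lld, %.1f ms\n", stride, total, ms);
    };

    time_sum(1);    // cache-friendly: consecutive elements share cache lines
    time_sum(16);   // 64-byte jumps: roughly one cache line fetched per access
    return 0;
}
```

On most desktop CPUs the strided pass runs noticeably slower even though it performs exactly the same number of additions, which is the kind of effect that makes "more cache = more performance" impossible to predict in general.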
  9. I meant specifically: google for problems with the particular updates you received during the time it stopped working.
  10. Start googling, lol. Some update you got while you were away broke it. You gotta use Google to figure out which one.
  11. They might. The idea I'm working on is that if it was working fine before, and now it's suddenly not, then something absolutely had to change between then and now. Software updates are frequently a cause of this kind of issue.
  12. I think that both are scary. I think that alien contact (with aliens able to travel to our planet) right now is extremely scary, and I'm personally scared of that idea. More than that, I don't think we could treat them well enough, overall, to build a positive relationship. But it's also scary to think that we might never be advanced enough to handle first contact.
  13. I'm so sick and tired of every single thing under the sun becoming a subscription service. PaaS (product as a service) is one of the worst things to befall the capitalistic world IMO.
  14. What date did you notice the stuttering on? The 20th or the 21st?