
Why does Apple, a hardware company, have a better-liked OS than Microsoft (a software company)?

corrado33

If you consider Apple to be a hardware company then you don't really know that much about them. They make very little hardware.



2 hours ago, corrado33 said:

So how on earth does Apple, a company that makes money selling hardware, have a nicer (or at least equally nice) OS than Microsoft, which makes money selling software?

I'm not a very big fan of Apple, but Apple has never been "just a hardware" or "just a software" company. Apple has always been about engineering product experiences. That's what Steve Jobs was always going on about when he said that his stuff "just worked".



41 minutes ago, corrado33 said:

If the driver allows the hardware to communicate with Windows, then why does Windows need to be so much more complicated than macOS?

Not really sure what you mean. They are both 'complicated' because they are different kernels. The difficulty you may be thinking of likely comes from the variety of hardware that Windows runs on, whereas macOS only runs on a small set of Apple-controlled hardware.

 

 

 

Because they have different kernels, different drivers need to be written for each.

 

Do you have an example in mind where it is 'more complicated' on Windows?



Because Apple has some of the biggest fanboys in the world, while Microsoft users are critical consumers who speak up when something is bad. That then gets reflected in public opinion, where Apple's stuff seems absolutely perfect and Microsoft's seems endlessly broken, even if neither is actually true...


4 hours ago, corrado33 said:

Exactly. This is the point I'm trying to get at. If the driver allows the hardware to communicate with Windows, then why does Windows need to be so much more complicated than macOS? The drivers are provided by the hardware manufacturer. Even chipset drivers are provided by the motherboard manufacturer.

 

I guess the absolute best way I can phrase the question is this:

 

If drivers modify the kernel to work with certain hardware, then why does the Windows kernel need to be more complicated than macOS's when the drivers are made by the component manufacturer? Why is Windows not just super stripped down, providing the absolute BASE system and waiting for you to install drivers for your hardware (like you usually have to do anyway)?

 

I can understand how a Windows INSTALLATION ISO would have to be bigger than a macOS ISO, but once installed, I have no idea why Windows and macOS would be any different on the "supporting hardware" front when in reality it's the hardware manufacturer's drivers that allow the hardware to work.

I would say you are coming at the concept of an operating system from the wrong angle. It's not just intrinsic differences in the core of the kernel that make them different, but the evolution of those operating systems over literally decades with all manner of third-party hardware manufacturers and software developers.



iOS is a fairly closed environment, with little hardware variance to support.

 

Windows is a broad platform that can do a bit of everything and support all hardware types; it has to run on slow machines and fast machines, and support every type of configuration.

 

iOS is just focused on simplification, to the point where you do lack customization options.

 

It is two different products for two different customers.


9 hours ago, corrado33 said:

No, not at all. I think the question is relevant. By all means Microsoft SHOULD have the superior OS, but I don't think many people would classify it as such. (If you exclude software compatibility/gaming, of course.) And macOS is known for its usability. That's pretty much its calling card.

 

Not to mention Microsoft has been in the news a lot lately for possibly shitty things they'll be doing to their OS soon...

"MacOS is superior if you don't count all the things where Windows is superior." Well, that doesn't make much sense in my opinion. With that standard you can make anything seem superior as long as it beats the thing you are comparing it to on at least one aspect. 

I would take Windows over MacOS any day and not just because of gaming.


9 hours ago, corrado33 said:

If you exclude software compatibility/gaming of course.

Why would you exclude that? It's a fairly important part of the OS's usability. Your "question" is ultimately just you trying to express a "truth" of macOS being inherently better by embedding that truth in a related question, so we're not supposed to argue against Mac being better but instead try to determine why, solidifying your opinion and creating an echo chamber for you.

Personally, I think you're flat out wrong. I've tried using macOS many times, and I can't even begin to stand it. I find it terribly buggy, laid out horribly, and designed only with ecosystem restriction in mind. If you only intend to give them money, devices in the Apple ecosystem work very well together, but if you have even one accessory that isn't Apple's, problems tend to arise.

I used to work at Staples Easy Tech support and had people come to us all the time with problems on a Mac. We tried and tried and couldn't make simple things work properly. The settings were often a nightmare to find things in, and no amount of Google resolved many of the problems. We just advised those customers to talk to Apple, and they then confided that they had already done so, and that Apple couldn't figure it out either and told them they needed to buy a new computer.

Now you're gonna say, and have already said above, that you didn't intend for this to be an echo chamber and you weren't forcing a truth on us by sliding it into a question like that. But the truth is, if you didn't intend that, your actual post would have been:

    Given how long Microsoft has been in the business and how well they've done in the past with their software, why are they experiencing so many problems keeping updates stable on their current OS? 

To which the answer is: as we become more and more dependent on technology taking over parts of our lives and our activities, as we expect computers to do more and more at an ever-increasing speed and throughput, technology has undeniably become exponentially more complex over time. In the earlier days of the internet, there weren't as many things to account for when thinking about software compatibility. There were simply fewer programs, fewer viruses, fewer use cases, fewer things that a computer could be asked to do. Now a computer is expected to be able to do ANYTHING. I can EXPECT a Pentium to play Minecraft on low settings. If I design an OS today, I need to account for Spectre and Meltdown, along with a slew of other vulnerabilities and security concerns. I have to plan for everything, and it's inevitable that bugs will arise. Unfortunately, the latest bugs in this recent update have been more than annoying; they've been detrimental to the function of systems.

Now to get back to the ecosystem you're fanboying over. When designing macOS, they still have to plan for a large range of things given today's dependence on technology, but the truth is they don't have remotely as much to plan for. Due to the locked-down nature of their ecosystem, they cut out a lot of the things they would otherwise need to worry about.

 



11 hours ago, corrado33 said:

macOS, in my opinion (and many others'), is a really good OS. It's fast, secure, has good features, etc. Whereas Windows 10 is... OK, I guess. The whole "spying" thing wasn't cool, but I understand why they did it. And the possibility of ads in their stock programs in the future is terrifying.

 

So how on earth does Apple, a company that makes money selling hardware, have a nicer (or at least equally nice) OS than Microsoft, which makes money selling software?

Who said Apple is just a hardware company?  You defined Apple incorrectly to advance a nonsensical argument.

 

Kinda makes you look a little stupid.



11 hours ago, corrado33 said:

So how on earth does Apple, a company that makes money selling hardware, have a nicer (or at least equally nice) OS than Microsoft, which makes money selling software?

Who says Apple isn't a software company? Just because you can't directly buy macOS doesn't mean it's not part of what you're buying when you purchase a Mac. In fact, I would argue it's the main selling point for a lot of people.



Security and company definitions aside, what makes an OS superior?

IMO, depending on the use case, one OS may be superior to another. For my specific applications, Windows is superior to macOS.

There are other factors in play here too: Windows is more widely available than macOS, and I trust it to work out of the box on a wider variety of hardware. That on its own doesn't make the OS superior; if anything, we're talking about business models in this regard.


I'm pitching in my two cents on a few things here.

 

10 hours ago, corrado33 said:

Exactly. This is the point I'm trying to get at. If the driver allows the hardware to communicate with Windows, then why does Windows need to be so much more complicated than macOS? The drivers are provided by the hardware manufacturer. Even chipset drivers are provided by the motherboard manufacturer.

Windows likely isn't more complicated. The thing is that Microsoft didn't seem to have any influence on the hardware architecture of the IBM PC standard. The hardware side was often dictated by Intel and system builders. Microsoft was just along for the ride.

 

The other problem is that if Microsoft wrote the drivers, they would effectively dictate what hardware is compatible with the OS. This is against the original design philosophy of Windows NT, which modern Windows is based on.

 

10 hours ago, corrado33 said:

I guess the absolute best way I can phrase the question is this:

 

If drivers modify the kernel to work with certain hardware, then why does the Windows kernel need to be more complicated than macOS's when the drivers are made by the component manufacturer? Why is Windows not just super stripped down, providing the absolute BASE system and waiting for you to install drivers for your hardware (like you usually have to do anyway)?

Define an "absolute base system", because Microsoft once showcased a build of Windows NT that was stripped down, running a web server, and all you had to interface with it was a command line. It was neat from a technical-demo point of view, but it's also worthless for at least 99% of Windows users.

 

Also Apple doesn't write most of the drivers that are included in the OS. The component manufacturer still has to develop for macOS and supply Apple or the user with the drivers.

 

10 hours ago, corrado33 said:

I can understand how a Windows INSTALLATION ISO would have to be bigger than a macOS ISO, but once installed, I have no idea why Windows and macOS would be any different on the "supporting hardware" front when in reality it's the hardware manufacturer's drivers that allow the hardware to work.

They're not. It depends on whether the hardware manufacturer wants to support the OS or not.

 

11 hours ago, corrado33 said:

I'd argue that the internal connections are also standardized. IDE, SATA, PCIe, PCI. I mean... even chipsets are standardized right? Every board with a certain chipset will be using the same commands right?

If you're within the same platform, it's likely. But an Intel chipset isn't likely going to talk the same language as an AMD one.

 

11 hours ago, corrado33 said:

If I'm developing a piece of hardware that's destined to run on Windows, I'm going to make damn sure that it runs on Windows well. I'm not just going to assume that Microsoft will make it run well.

And that's how system builders and hardware developers actually do things. Well, they're supposed to anyway.

 

11 hours ago, corrado33 said:

OK, then why are internal components NOT standardized when external components ARE standardized? Every single flash drive works with macOS. External GPUs work on Mac and PC.

External GPUs don't actually "just work" like a flash drive. You still need drivers to use the video card to its fullest.

 

11 hours ago, corrado33 said:

Again, I'm confused. Why are all the drivers from different third parties not using the same set of protocols? I'm assuming Apple ALSO has these protocols.

The driver talking to the OS is standardized by the OS developer.

 

The driver talking to the hardware should not be standardized. This limits the implementation details of what the hardware developer can do. I'd argue unless this is extremely flexible (which standards shouldn't be, because they're standards), this will be a gross limiting factor in hardware development. Granted, perhaps some basic functionality to get started should be done (which for a lot of components, they are), but unleashing the full functionality of the hardware shouldn't be standardized.
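The split described above can be sketched in a toy model (Python here purely for illustration; all names are hypothetical, and real driver models such as Linux's `net_device_ops` or Windows' NDIS are far richer): the OS pins down the driver-facing interface, while each vendor implements the hardware side however it likes.

```python
from abc import ABC, abstractmethod

# The OS defines ONE driver-facing contract every network driver
# must satisfy (hypothetical shape).
class NetDriver(ABC):
    @abstractmethod
    def send(self, packet: bytes) -> int:
        """Queue a packet for transmission; return bytes accepted."""

# Vendor A's silicon uses a DMA ring buffer; vendor B uses a FIFO
# register. The OS neither knows nor cares -- only the driver-to-OS
# side is standardized, not the driver-to-hardware side.
class VendorADriver(NetDriver):
    def __init__(self):
        self.ring = []                       # stand-in for a DMA ring
    def send(self, packet: bytes) -> int:
        self.ring.append(packet)
        return len(packet)

class VendorBDriver(NetDriver):
    def __init__(self):
        self.fifo = bytearray()              # stand-in for a hardware FIFO
    def send(self, packet: bytes) -> int:
        self.fifo.extend(packet)
        return len(packet)

def os_transmit(driver: NetDriver, packet: bytes) -> int:
    # The OS only ever goes through the standardized interface.
    return driver.send(packet)
```

Both drivers look identical to the OS, yet their internals are free to match whatever the silicon actually does, which is the flexibility the post argues a hardware-side standard would take away.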

 

11 hours ago, corrado33 said:

As for not making an ASUS laptop, that's not what I was saying. If Microsoft drops support for certain NICs, manufacturers would stop using them. They would HAVE to. Microsoft could FORCE standardization to make their life easier.

And make a lot of people mad in the process. Microsoft's bread and butter isn't just the consumer market. It's the professional market. Companies are glacially slow in updating their systems and sometimes they never update something because they have a system that just works, has always worked, and they see no reason to fix what isn't broken. The reason why Microsoft has a lot of backwards compatibility support is because a big enough corporate customer demanded that Microsoft bend over to make their old stuff work.

 

Also it should not be the job of the OS developer to determine who gets to be compatible with them. It's the OS developer's job to provide a platform with which anyone can build on. The only reason why Apple gets to determine what goes into macOS is because they're also the system builder.


Apple has the advantage of knowing their hardware extremely well and can therefore provide a great experience to anyone using their products. Microsoft doesn't know the hardware and has to accommodate the wide range of it that will run Windows, so they have a much harder time providing that same smooth usability experience.



12 hours ago, corrado33 said:

But, I mean... Microsoft could easily make it easier on themselves. Tell vendors (Dell, HP, etc.) to all install one type of NIC. Then they only have to write for that one NIC. All other NICs would require their own drivers, written by the manufacturer.

 

Right? Microsoft definitely has the clout to do something like this, so why don't they?

 

I mean hell, when I install Windows 10 I have to download gigabytes of drivers just to get everything working on my computer, so what exactly did they write?

It has already been done, by IBM, and it was a big failure. The only one that did not learn from it was Apple.


12 hours ago, corrado33 said:

It just seems so back-ass-wards to me.

 

If I'm developing a piece of hardware that's destined to run on Windows, I'm going to make damn sure that it runs on Windows well. I'm not just going to assume that Microsoft will make it run well.

Microsoft does not make the drivers. Once you realize that, you might understand this.


I don't really know. But, IMO, they make pretty good software as well, and it runs smoothly without too many bugs. And there are a few things that are either Apple-exclusive or that Apple just does better, like having a good ecosystem and having software and security updates that arrive on time.


Something more to add: I believe a lot of interfaces for hardware, both internal and external, have generic enough standards that a modern OS doesn't need a third-party driver. You only need a driver if you want to get the most out of the hardware. For example, Windows comes with a basic graphics driver. You could still use the OS without the manufacturer's driver, but until you install it, you can't make the most of whatever GPU you have installed. And oftentimes the native driver is good enough. You could install a driver of sorts for Samsung NVMe drives, but it doesn't really make the drive perform any better.

 

And this is also why Microsoft discourages using Windows 7. Windows 7 is too old to know what USB 3.0 even is, because it was released before that was a thing.
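A minimal sketch of that fallback idea (Python, all device and feature names invented for illustration): the OS prefers a vendor driver when one is installed, but the generic class driver still gives you a working, if basic, device.

```python
# Baseline every device of this class must support per the class spec.
GENERIC_DISPLAY_FEATURES = {"basic_display"}

# Vendor drivers that happen to be installed on this machine.
INSTALLED_VENDOR_DRIVERS = {
    "AcmeGPU-9000": {"basic_display", "3d_accel", "video_decode"},
}

def features_for(device: str) -> set:
    # A vendor driver unlocks everything; the generic driver still gives
    # you a usable (if basic) device, which is why Windows can show a
    # desktop before any third-party driver is installed.
    return INSTALLED_VENDOR_DRIVERS.get(device, GENERIC_DISPLAY_FEATURES)
```

The design choice is that the generic path must assume nothing beyond the class spec, so any device, known or unknown, gets at least the baseline.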


3 hours ago, M.Yurizaki said:

The driver talking to the OS is standardized by the OS developer.

 

The driver talking to the hardware should not be standardized. This limits the implementation details of what the hardware developer can do. I'd argue unless this is extremely flexible (which standards shouldn't be, because they're standards), this will be a gross limiting factor in hardware development. Granted, perhaps some basic functionality to get started should be done (which for a lot of components, they are), but unleashing the full functionality of the hardware shouldn't be standardized.

 

Whilst, unlike the OP, I have a better understanding of the whys of this, I really only have one response to this:

 

Bull.

 

The implementations of many things a computer does at a hardware level are done according to various set-in-stone standards. Every USB 3.0 controller does exactly the same thing. So why does (or I should say did) every USB 3.0 controller require its own driver with its own custom communication protocol between it and the OS?

 

Don't get me wrong, there are devices out there that need some custom protocols because they involve proprietary features. But for most of those devices, only part of the functionality is proprietary. Yet they need to wholesale replace any default drivers that may exist for the standardised components.

 

To be fair, the real answer is that it's only recently that we've started to get really solid industry-wide standards applied to a lot of things on the hardware side. There's always been some standardisation, of course, but recently we've started to see things get nailed down a lot better in an overall sense. The still fairly broad range of graphics output standards is one of the few areas where we haven't settled on a single hardware-level standard yet.

 

You can't really have a consistent software standard without an equivalent level of hardware standardisation. And that's new enough that things are lagging. Add in simple inertia, and it's going to be a while before things get ironed out.


25 minutes ago, CarlBar said:

-Snip-

When I say "implementation detail" or whatever, I mean how the hardware does it. USB describes what a device is supposed to do, but it does not describe how to do it. If there are certain things that a particular USB controller wants to do outside of the USB spec, then it needs device drivers tailored for that controller so the OS can expose those features to applications that can take advantage of them. Otherwise, the default drivers are only going to expose the bog-standard USB spec. For example, some motherboards require either a utility or special drivers if you want to charge your iOS device faster, such as ASUS's Ai Charger.

Another semi-related area is DX12 and asynchronous compute. Asynchronous compute is actually not a feature required for DX12 hardware compliance. All DX12 does is allow applications to create multiple command queues. How the hardware handles this is an implementation detail; asynchronous compute is one implementation for addressing multiple command queues.

Or, put another way, the ARM architecture: Apple, NVIDIA, Samsung, and ARM implement it differently, yet they all process the same ISA. Similarly, even Intel's implementation of x86-64 is slightly different from AMD's.

If you demand that hardware implement something a certain way, there must be a damn good reason for it to be done that way. Otherwise you're limiting how hardware developers can design their products, which isn't really a good thing.
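The same point as a toy sketch (Python, hypothetical names): the in-box driver exposes only what the spec guarantees, while a vendor-tailored driver still speaks the whole spec but layers an out-of-spec extra, such as a fast-charge mode, on top.

```python
# Simplified "spec" for a USB device class: what every compliant
# controller must offer (operation names invented).
SPEC_OPS = {"enumerate", "bulk_transfer"}

class DefaultClassDriver:
    """Generic in-box driver: exposes only the bog-standard spec ops."""
    def operations(self) -> set:
        return set(SPEC_OPS)

class VendorDriver(DefaultClassDriver):
    """Vendor-tailored driver: still covers the whole spec, but also
    exposes an out-of-spec extra (think Ai Charger-style fast charge)."""
    def operations(self) -> set:
        return super().operations() | {"fast_charge"}
```

Note that the vendor driver is a superset of the class driver, never a replacement for the spec itself; that is why plugging the device into a machine with only the default driver still works, just without the extras.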


18 hours ago, corrado33 said:

Why does Microsoft spend so much time writing drivers for all different types of systems when you have to install a shit ton of drivers (from the vendor itself, not Microsoft) after you install Windows anyway? Surely Microsoft could make it simpler and depend on vendor drivers more often?

 

Windows is often useless before the motherboard drivers are installed, so what exactly is Microsoft writing for?

 

 

Just singling this piece out.

 

Take your graphics card, for example. If there were no drivers in Windows already to handle at least basic VGA compatibility, how would you be able to see your desktop to install drivers from a third party?

 

Not to mention drivers for the monitor itself, and USB functionality (that "plug-n-play" is all pre-installed basic drivers that MS made). Also other PCI cards and expansion devices. Imagine having to go out and find the driver for your USB keyboard before it would work on your computer.



19 hours ago, corrado33 said:

 

I mean... is it? The processors are the same; it's only drivers that differ, and those are written by the vendors/companies themselves... right? Hell, even the motherboards Windows is usually installed on only have so many different chipsets.

Apple only has to deal with a small selection of processors. Only a small selection of EFI configs. Only a small selection of GPUs. Only a small selection of storage controllers. Only a small selection of RAM configurations. Etc., etc.



1 hour ago, MedievalMatt said:

 

 

Just singling this piece out.

 

Take your graphics card, for example. If there were no drivers in Windows already to handle at least basic VGA compatibility, how would you be able to see your desktop to install drivers from a third party?

 

Not to mention drivers for the monitor itself, and USB functionality (that "plug-n-play" is all pre-installed basic drivers that MS made). Also other PCI cards and expansion devices. Imagine having to go out and find the driver for your USB keyboard before it would work on your computer.

EXACTLY. If they only have to write the basic VGA compatibility driver, then why the hell is it such a big deal? Is that VGA driver different for every video card out there? Surely this should be something the GPU makers do. À la "Every GPU must provide a basic VGA mode to interface with computers in which the correct drivers are not installed."

 

Sorry, that came off as aggressive. I'm not arguing, I'm genuinely curious. 

 

I've had Windows installs where none of the USB ports worked until I installed the chipset drivers... via freaking burned CD.


5 minutes ago, corrado33 said:

EXACTLY. If they only have to write the basic VGA compatibility driver, then why the hell is it such a big deal? Is that VGA driver different for every video card out there? Surely this should be something the GPU makers do. À la "Every GPU must provide a basic VGA mode to interface with computers in which the correct drivers are not installed."

 

I've had Windows installs where none of the USB ports worked until I installed the chipset drivers... via freaking burned CD.

The video cards likely implement something so they can talk to the basic VGA driver. But the VGA driver can't do anything else, because it can't assume anything about how the hardware works. Keep in mind the VGA driver has to work with everything from the original IBM VGA card all the way up to an RTX 2080. If the VGA driver supported every GPU up until now to its fullest, it would be a massively huge driver full of 99% dead code (which is dangerous from a security point of view).

 

(Heck, even a GTX 980 Ti has EGA compatibility, and likely, for even more fun compatibility reasons, CGA/MDA.)
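A toy model of why the generic driver stays small (Python, purely illustrative; the capability names are invented stand-ins for legacy VGA behavior): it only ever uses the baseline every card is guaranteed to have, and ignores whatever vendor-specific extras happen to exist.

```python
# Every card, old or new, is assumed to honor this tiny baseline.
VGA_BASELINE = {"set_mode_640x480", "write_framebuffer"}

class AnyGPU:
    """Stand-in for any graphics card: the guaranteed baseline plus
    whatever vendor-specific extras it happens to have."""
    def __init__(self, extras=()):
        self.capabilities = VGA_BASELINE | set(extras)

def generic_vga_driver(gpu: AnyGPU) -> set:
    # The generic driver never assumes more than the baseline, so it
    # stays tiny and works on everything; the extras need a real driver.
    return gpu.capabilities & VGA_BASELINE
```

The alternative (one driver that knows every card's extras) is exactly the "massively huge driver full of dead code" the post warns about.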


8 minutes ago, M.Yurizaki said:

The video cards likely implement something so they can talk to the basic VGA driver. But the VGA driver can't do anything else, because it can't assume anything about how the hardware works. Keep in mind the VGA driver has to work with everything from the original IBM VGA card all the way up to an RTX 2080. If the VGA driver supported every GPU up until now to its fullest, it would be a massively huge driver full of 99% dead code (which is dangerous from a security point of view).

 

(Heck, even a GTX 980 Ti has EGA compatibility, and likely, for even more fun compatibility reasons, CGA/MDA.)

I mean, surely Microsoft could say, for example with Windows 10, "We've dropped support for video cards over 15 years old," and that'd be a reasonable thing...

 

But if I understand you correctly, are you telling me that every video card that sits in a PCIe slot doesn't communicate in the same way? There MUST be standardization of the PCIe pins (à la "pins 1 and 2 must be 12V and ground" (not really, just as an example)). So again, has the "basic VGA mode" of video cards changed over time? If so... why? And how many changes have there been? A dozen? A few hundred? A thousand?

 

I just don't understand. You say that video cards already have a "compatibility mode," but that windows still has to "work hard" and/or "have lots of drivers" to support all these cards. Doesn't that defeat the point of the video card's "compatibility mode?" 

 

EDIT: I would assume that a video card's "compatibility mode" would mean that it can function (in a basic sense) with a singular windows display driver. 


2 hours ago, M.Yurizaki said:

When I say "implementation detail" or whatever, I mean how the hardware does it. USB describes what a device is supposed to do, but it does not describe how to do it. If there are certain things that a particular USB controller wants to do outside of the USB spec, then it needs device drivers tailored for that controller so the OS can expose those features to applications that can take advantage of them. Otherwise, the default drivers are only going to expose the bog-standard USB spec. For example, some motherboards require either a utility or special drivers if you want to charge your iOS device faster, such as ASUS's Ai Charger.

Another semi-related area is DX12 and asynchronous compute. Asynchronous compute is actually not a feature required for DX12 hardware compliance. All DX12 does is allow applications to create multiple command queues. How the hardware handles this is an implementation detail; asynchronous compute is one implementation for addressing multiple command queues.

Or, put another way, the ARM architecture: Apple, NVIDIA, Samsung, and ARM implement it differently, yet they all process the same ISA. Similarly, even Intel's implementation of x86-64 is slightly different from AMD's.

If you demand that hardware implement something a certain way, there must be a damn good reason for it to be done that way. Otherwise you're limiting how hardware developers can design their products, which isn't really a good thing.

 

I think you missed something in there.

 

My point wasn't (to stick with the USB 3.0 example) that they shouldn't be able to implement an extra feature on their USB 3.0 hub, but rather that any driver for that feature should add only the extra functionality, while the USB 3.0 functionality goes through the standard USB 3.0 interface that never changes.

 

12 minutes ago, corrado33 said:

I mean, surely Microsoft could say, for example with Windows 10, "We've dropped support for video cards over 15 years old," and that'd be a reasonable thing...

 

But if I understand you correctly, are you telling me that every video card that sits in a PCIe slot doesn't communicate in the same way? There MUST be standardization of the PCIe pins (à la "pins 1 and 2 must be 12V and ground" (not really, just as an example)). So again, has the "basic VGA mode" of video cards changed over time? If so... why? And how many changes have there been? A dozen? A few hundred? A thousand?

 

I just don't understand. You say that video cards already have a "compatibility mode," but that windows still has to "work hard" and/or "have lots of drivers" to support all these cards. Doesn't that defeat the point of the video card's "compatibility mode?" 

 

EDIT: I would assume that a video card's "compatibility mode" would mean that it can function (in a basic sense) with a singular windows display driver. 

 

Drivers tell the OS how to talk to things. Think of it like a language. All English-speaking people, as far back as English goes, have spoken English. But that doesn't mean all forms of English have used the same spelling, grammar, or, in some cases, exact word meanings. US English vs UK English vs Shakespearean English vs X type of English (take your pick) is full of differences. As humans we can handle these contradictions and subtleties quite well. Computers can't; they need to be told what the differences are and what they mean.
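The language analogy can be made concrete with a toy sketch (Python; the register layouts are entirely invented): the same logical request turns into different device-specific register writes, and encoding that mapping is exactly what a driver does.

```python
# The OS asks for one logical operation: "set brightness to <level>".
# Each driver translates it into its own device's register "dialect".

def set_brightness_device_a(level: int) -> list:
    # Device A: a single 8-bit brightness register at offset 0x10.
    return [("write", 0x10, level)]

def set_brightness_device_b(level: int) -> list:
    # Device B: must poke an unlock register first, then write a 4-bit
    # value (so the 8-bit level is scaled down) at offset 0x04.
    return [("write", 0xFF, 0x1), ("write", 0x04, level >> 4)]
```

Same request, two different "dialects"; without the per-device translation, the OS could not drive either one correctly.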

 

