Malware for the Linux desktop. Yes, really.

Ashley MLP Fangirl
13 minutes ago, mr moose said:

I don't, I am a Linux fan, I just won't censor myself to avoid offending those who aren't a part of said trend.

 

So you are still using arguments about it being safer/better to try and convince me that my experiences are wrong. I am talking about a behavior trait I have experienced; you are defending it by engaging in arguments about security that are not the issue. You are starting to compound my views.

 

 

 

 

 

 

What I meant to say was fewer discovered backdoors. And I know that is probably largely because Linux is a smaller market, but for me at this point it means the same thing.



11 minutes ago, will4623 said:

What I meant to say was fewer discovered backdoors. And I know that is probably largely because Linux is a smaller market, but for me at this point it means the same thing.

 

As a measure of probabilities that is perfectly fine. Security by obscurity (while frowned upon by some) still has some effect.



12 hours ago, leadeater said:

Because if you pay attention to the vulnerabilities that have or could apply to Linux then it gets really hard to argue it actually is more secure. 

This is a bit strange to say; what specific Linux release are you talking about?

Mainline kernel vulnerabilities? 

LTS?

Because obviously an open source project like the Linux Kernel has more "security vulnerabilities": every kernel release is exposed to everyone, and it doesn't mean every Linux build is affected.

Any kernel which isn't an LTS release shouldn't even be considered when speaking about servers.

 


12 hours ago, leadeater said:

You can't just update a server. Contrary to popular belief, while it's pretty well correct that Linux doesn't require reboots, that doesn't mean it's safe or has no impact to do so. Not only that, there are processes like change control which often dictate when or if you can actually do it.

This really depends. On a desktop it's totally harmless; issues can only happen when the specific service you provide has deprecated some config options, or, if you are not separating config files, you will be prompted to replace config files (pretty common for LDAP).

A good practice, obviously, is to split out as many services as you can using containers or virtual machines, minimizing any potential negative impact. In any case a system administrator should know the effects before upgrading anything, checking for known bugs or procedures, and the same can be said for Windows (where, in my company anyway, we have an insane amount of problems on clients rather than on Windows servers).


21 minutes ago, Chunchunmaru_ said:

This is a bit strange to say; what specific Linux release are you talking about?

Mainline kernel vulnerabilities? 

LTS?

Because obviously an open source project like the Linux Kernel has more "security vulnerabilities": every kernel release is exposed to everyone, and it doesn't mean every Linux build is affected.

Any kernel which isn't an LTS release shouldn't even be considered when speaking about servers.

By 'have or could apply' I mean all of them; I'm also including popular packages in that which are typically installed, like OpenSSH and OpenSSL. 'How many' really isn't that important either, which was part of the point.


12 hours ago, leadeater said:

You can't just update a server. Contrary to popular belief, while it's pretty well correct that Linux doesn't require reboots, that doesn't mean it's safe or has no impact to do so. Not only that, there are processes like change control which often dictate when or if you can actually do it.

In recent versions of Ubuntu you can have live kernel security updates without having to restart.

 

 

https://ubuntu.com/livepatch
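
For anyone curious, a minimal sketch of checking whether the Livepatch client is present and what it reports (this assumes the canonical-livepatch client from the link above is installed; the exact command and output can vary by release):

    import shutil
    import subprocess

    # Minimal status check for Canonical Livepatch (see https://ubuntu.com/livepatch).
    # Assumes the canonical-livepatch client is installed and enabled; the output is
    # printed as-is rather than parsed.
    if shutil.which("canonical-livepatch") is None:
        print("canonical-livepatch client not installed")
    else:
        result = subprocess.run(
            ["canonical-livepatch", "status"],
            capture_output=True,
            text=True,
        )
        print(result.stdout or result.stderr)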


Just now, vorticalbox said:

In recent versions of Ubuntu you can have live kernel security updates without having to restart.

 

 

https://ubuntu.com/livepatch

He was not talking about kernel upgrades but about system upgrades on a server, which is totally unrelated to this

 

(also livepatch is not something that should be used on a server)


2 minutes ago, Chunchunmaru_ said:

This really depends. On a desktop it's totally harmless; issues can only happen when the specific service you provide has deprecated some config options, or, if you are not separating config files, you will be prompted to replace config files (pretty common for LDAP).

A good practice, obviously, is to split out as many services as you can using containers or virtual machines, minimizing any potential negative impact. In any case a system administrator should know the effects before upgrading anything, checking for known bugs or procedures, and the same can be said for Windows (where, in my company anyway, we have an insane amount of problems on clients rather than on Windows servers).

We are a 95%+ VM environment. You can't check everything and not every combination can be known, so checking known bugs doesn't negate the potential risk; that's why we have dev/test/prod environments, so you can know the impacts before rolling updates into prod.

 

Change control is change control; it exists for a reason, and it doesn't really matter how many technical arguments you put forward to try to justify not having to do it, the answer is always 'every change must go through change control'. I've tried, and it doesn't work.

 

The reason it's done is to give proper notification of changes, and to justify the why and the potential risks and impacts. You can fill out all the paperwork and do all the dev env testing, and things can still go wrong in prod, because somehow someone managed to get a package a single sub-dot version more up to date than in dev, and then Moodle authentication is no longer working and 30k students can't log in because LDAP is broken, but only for Moodle and not for system authentication.

 

Basically shit happens, it's very hard to avoid it.


12 hours ago, leadeater said:

And it was a relevant reply, because his was to someone else talking about security, and how the Linux community generally proclaims that that OS is more secure, almost always based off opinion rather than any exhaustive analysis.

I made a post saying that I believe that GNU/Linux is safer than Windows a few pages back and gave reasons for it, including citing statistics. Would you say my post is wrong?

 

Just because there are a lot of "Linux is more secure" posts which don't explain why doesn't mean they are wrong and that the opposite is true. It also does not invalidate the posts which come to the same conclusion but with exhaustive analysis behind them.

 

 

12 hours ago, leadeater said:

Is it more secure because they are more often a non-interactive system, or is the other less secure because it's more often an interactive system with a stupid user in the driving seat?

That's a false dichotomy. The reasons I gave for my higher "security ranking" of GNU/Linux did not include either of those parameters and I still came to a conclusion.

 

 

12 hours ago, leadeater said:

It's a much more complicated situation, much more than just going and looking at vulnerability lists or infection rate tracking lists and then concluding that one is more secure than the other based on those, without accounting for other wider aspects of why or how.

What other aspects are you thinking about exactly?

 

 

12 hours ago, leadeater said:

Are cars less safe than motorbikes because there are more car crashes than motorbike crashes? Are trucks the safest vehicles on the road because they have the fewest crashes?

I think you're putting way too much emphasis on the (incorrect) assumption that Windows is only more vulnerable because it is the most widely used.

Not only is it incorrect that Windows is the most widely used OS (Linux is far more widely used) but you seem to brush all the other aspects like OS architecture and design under the rug.

 

 

12 hours ago, leadeater said:

Well then we should do the same for Windows; in fact the majority of compromised Windows systems come from users running malicious software and ignoring UAC. Seriously, no one truly cares specifically how or why a system is compromised in these debates, so I see no reason to do Linux a favor that is not given to Windows. If it got compromised it got compromised, ergo the Linux system was vulnerable.

Yes, we should absolutely not include a user running a Trojan as a vulnerability, or if we do we should only count that as one vulnerability that exists on all OSes.

However, what the program does after that is fair to count as vulnerabilities. That's how the industry standard for categorizing and ranking vulnerabilities works.

Running a program is not a vulnerability. A program being able to write data outside its memory buffer's boundary is, even if that means a user has to run the program. Two different malware using the same unchecked buffer should only count as one vulnerability for the OS, but one malware which exploits two different attack surfaces, such as two unprotected buffers, should count as two vulnerabilities.
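
As a rough sketch of that counting rule (the malware and flaw names below are hypothetical, purely for illustration): count distinct underlying flaws, not reports or samples.

    # Hypothetical (malware sample, underlying OS flaw) reports. Per the counting
    # rule above, the vulnerability count is the number of distinct flaws, not the
    # number of malware samples or reports.
    reports = [
        ("malware_a", "unchecked_buffer_in_driver_x"),
        ("malware_b", "unchecked_buffer_in_driver_x"),     # same flaw, different malware
        ("malware_c", "unchecked_buffer_in_driver_x"),
        ("malware_c", "unprotected_buffer_in_service_y"),  # one sample hitting a second flaw
    ]

    distinct_flaws = {flaw for _, flaw in reports}
    print(len(distinct_flaws), "vulnerabilities:", sorted(distinct_flaws))
    # -> 2 vulnerabilities, even though there are three malware samples and four reports.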

 

And yes, I do care specifically how or why system compromises happen when that's what we are talking about. If you want to debate which OS is the most secure then you should be prepared to actually do that and not brush very important details under the rug.

And no, I am not going to give one OS favors over another. Both are judged on the same criteria and in the same way.

 

 

12 hours ago, leadeater said:

Is the system secure or the user irresponsible. Where was the actual flaw, 'man or machine'. 

That depends on how and why the system was compromised.

As soon as the OS does not behave according to the design intent and that creates a vulnerability then it's the OS which is at fault. If it operates precisely as intended and the user does something which causes harm, then it's not a vulnerability.

 

A user running a script that deletes system files after giving that script admin privileges in the correct way (such as through UAC) is not a vulnerability.

A user running a script that deletes system files without having to give it elevated privileges, even though it's supposed to require that, is an OS vulnerability.
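
As a rough illustration of where that line sits (hypothetical script, purely for the sake of the example): a well-behaved script can check whether it was actually granted elevation before touching system files; if the same destructive call succeeded without any elevation, that would be the OS-level flaw described above.

    import ctypes
    import os
    import sys

    def is_elevated() -> bool:
        """Best-effort check for elevated privileges on Windows or POSIX."""
        if os.name == "nt":
            # Ask the Windows shell whether this process token is an administrator
            # one, i.e. the user actually went through UAC for this process.
            return bool(ctypes.windll.shell32.IsUserAnAdmin())
        # On Linux/macOS an effective UID of 0 means root.
        return os.geteuid() == 0

    if not is_elevated():
        sys.exit("not elevated: the OS should be refusing writes to system files")

    # ... privileged maintenance work would go here ...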


23 minutes ago, LAwLz said:

I made a post saying that I believe that GNU/Linux is safer than Windows a few pages back and gave reasons for it, including citing statistics. Would you say my post is wrong?

Honestly I didn't and haven't read back that far. I'll go and try and find it after this post.

 

23 minutes ago, LAwLz said:

Just because there are a lot of "Linux is more secure" posts which don't explain why doesn't mean they are wrong and that the opposite is true. It also does not invalidate the posts which come to the same conclusion but with exhaustive analysis behind them.

Correct, they don't mean anything at all. I take the debates about which is more secure as pointless because it's ultimately a very complicated situation, so I don't see much point trying to take a position on which is more secure than the other. I do however like to point out that it is in fact largely a waste to try; my counterpoints have nothing to do with one being better, or even worse, than the other. There have been critical security vulnerabilities in all operating systems and widely used software packages, and there are also ways to properly secure operating systems, which isn't always done.

 

I also object to using terms like 'backdoor' for exploits that target vulnerabilities such as BlueKeep, which are not backdoors; these vulnerabilities found in Windows get talked about as backdoors, with some tone of having been intentionally put there, which is what would make something a backdoor rather than a vulnerability. When does a vulnerability become a 'backdoor'? Seems to me, in the case of Windows, it's whenever it's being exploited, whereas with Linux the same situation is not commonly referred to as a backdoor.

 

23 minutes ago, LAwLz said:

I think you're putting way too much emphasis on the (incorrect) assumption that Windows is only more vulnerable because it is the most widely used.

Not only is it incorrect that Windows is the most widely used OS (Linux is far more widely used) but you seem to brush all the other aspects like OS architecture and design under the rug.

I actually never said it was more widely used, I said it's more commonly used as a user-interactive system in comparison to Linux (not including mobile phones). Users are one of those risk factors that add to the complexity of the situation. Users accessing a web server is not the same as a user logged in to the local system with administrative privileges, i.e. their personal laptop.


2 minutes ago, leadeater said:

Honestly I didn't and haven't read back that far. I'll go and try and find it after this post. 

I wouldn't blame you for not reading them. They are quite long. Here are the two I think explain my position:

 

 

 

 

4 minutes ago, leadeater said:

I take the debates about which is more secure as pointless because it's ultimately a very complicated situation, so I don't see much point trying to take a position on which is more secure than the other.

And I don't think the debate is pointless just because it's complicated. I think both sides should be able to state arguments based on facts if they want, and people are free to give counterarguments.

Personally, I have taken a position.

 

 

5 minutes ago, leadeater said:

There have been critical security vulnerabilities in all operating systems and widely used software packages, and there are also ways to properly secure operating systems, which isn't always done.

I really dislike this argument where someone says "all OSes have vulnerabilities" because it's a rather pointless observation to make.

Do all operating systems have vulnerabilities? Yes. But saying that as an argument is like saying it's okay to drink bleach because "everything we drink can be harmful".

Security is a spectrum, and while all operating systems are somewhere between "has no security whatsoever" and "perfectly secure in every way", different operating systems are closer to or further from one end of the spectrum.

 

The debate is "which OS is the most secure", not "which OS is perfect in terms of security". Those two questions do not have the same answers.

 

 

10 minutes ago, leadeater said:

When does a vulnerability become a  'backdoor'?

Hard to say. I think the real definition is just a way to access the system while bypassing the security mechanisms in place. But I think in everyday speech it means a deliberate design flaw. The real definition does not include some clause about it having to be intentional.

 

14 minutes ago, leadeater said:

Seems to me, in the case of Windows, it's whenever it's being exploited, whereas with Linux the same situation is not commonly referred to as a backdoor.

You sure about that? In general we hear far less about exploits for GNU/Linux so maybe statistically it's the same, but the true number of articles referring to backdoors on Windows is higher?

For example if 10% of articles refer to exploits as backdoors and 100 articles are written about Windows, you will see backdoors being mentioned 10 times.

If 10 articles about GNU/Linux are posted you'll only see backdoors mentioned once.

 

 

17 minutes ago, leadeater said:

I actually never said it was more widely used, I said it's more commonly used as a user-interactive system in comparison to Linux (not including mobile phones).

Sorry, that's the way I interpreted your post, especially when you started comparing cars vs trucks and such.


31 minutes ago, LAwLz said:

Security is a spectrum, and while all operating systems are somewhere between "has no security whatsoever" and "perfectly secure in every way", different operating systems are closer to or further from one end of the spectrum.

Basically one of the reasons I point this out, in conjunction with the fact that operating systems can be properly secured, is that it becomes a problem to define a baseline configuration to make the assessment against. Are the statistical analyses of systems actually fair and equal? Should we evaluate purely on defaults, with no configuration and system hardening? That would have to fall under security research, because that's not a configuration actually used in the wild.

 

It's such an open-ended issue, which is why I feel it's so pointless. You can make Linux and Windows very secure if you know how; then it comes down to if/when a vulnerability is found for either of them, whether it is published or not, the time between the vulnerability being found and publication, etc.

 

There are a great many bad system administrators and bad IT service companies out there; I'd probably even classify it as epidemic levels (I have seen it vastly improve in the last 5 years). Is an OS more secure than another because it's more secure in its default condition, even if the other can be hardened to the same level or better? I would lean towards yes, the more secure default is better. However, I have seen many cases of admins making a Windows system less secure than default, and the same for Linux.

 

Sample size should rule out much of the above problems, but I just don't feel the represented Windows systems encompass enough actually secured systems; I do agree there is therein evidence to use as justification that it is less secure.

 

A lot of freedom and control is actually given to Windows users with administrative privileges, which, in combination with some rather fundamentally different approaches compared to Linux, can and does cause systems to be less secure. I have seen problems with trying to address that: when Microsoft starts to try to develop things like package management, the Windows community vehemently fights against it. I will say the execution of those efforts has been poor, but I doubt there would be much difference either way. That said, Chocolatey does exist and is pretty awesome.


35 minutes ago, LAwLz said:

You sure about that? In general we hear far less about exploits for GNU/Linux so maybe statistically it's the same, but the true number of articles referring to backdoors on Windows is higher?

For example if 10% of articles refer to exploits as backdoors and 100 articles are written about Windows, you will see backdoors being mentioned 10 times.

If 10 articles about GNU/Linux are posted you'll only see backdoors mentioned once.

I was more referring to forum commentators. 


1 hour ago, leadeater said:

Basically one of the reasons I point this out, in conjunction with the fact that operating systems can be properly secured, is that it becomes a problem to define a baseline configuration to make the assessment against. Are the statistical analyses of systems actually fair and equal? Should we evaluate purely on defaults, with no configuration and system hardening? That would have to fall under security research, because that's not a configuration actually used in the wild.

 

It's such an open-ended issue, which is why I feel it's so pointless. You can make Linux and Windows very secure if you know how; then it comes down to if/when a vulnerability is found for either of them, whether it is published or not, the time between the vulnerability being found and publication, etc.

 

There are a great many bad system administrators and bad IT service companies out there; I'd probably even classify it as epidemic levels (I have seen it vastly improve in the last 5 years). Is an OS more secure than another because it's more secure in its default condition, even if the other can be hardened to the same level or better? I would lean towards yes, the more secure default is better. However, I have seen many cases of admins making a Windows system less secure than default, and the same for Linux.

 

Sample size should rule out much of the above problems, but I just don't feel the represented Windows systems encompass enough actually secured systems; I do agree there is therein evidence to use as justification that it is less secure.

 

A lot of freedom and control is actually given to Windows users with administrative privileges, which, in combination with some rather fundamentally different approaches compared to Linux, can and does cause systems to be less secure. I have seen problems with trying to address that: when Microsoft starts to try to develop things like package management, the Windows community vehemently fights against it. I will say the execution of those efforts has been poor, but I doubt there would be much difference either way. That said, Chocolatey does exist and is pretty awesome.

I agree, these discussions are pointless. Our OS (BS2000/OSD) is arguably very secure as no sod knows anything about it. Question is, do they actually need to? I would say no, as the weakest point in any security is the people. We make mistakes, we leave access where we shouldn't. We can harden security on the OS level as much as we wish, we can even control all the applications a user can install. What we cannot do is account for the ignorance of individuals.



2 hours ago, leadeater said:

Basically one of the reasons I point this out, in conjunction with the fact that operating systems can be properly secured, is that it becomes a problem to define a baseline configuration to make the assessment against. Are the statistical analyses of systems actually fair and equal? Should we evaluate purely on defaults, with no configuration and system hardening? That would have to fall under security research, because that's not a configuration actually used in the wild.

I'd say you can judge them three different ways.

1) The default configuration, which I think is the best because it should represent what the developers believe is the best configuration for the largest number of users. And if the defaults aren't tuned for that then the defaults are incorrect and should be changed.

 

2) A hardened version of each OS. I think this is a poor reference point because I can make any OS, including DOS, completely unhackable. I just have to change things until it is completely broken and unusable. Remove all the ways someone could potentially interact with the computer and all of a sudden it's completely secure. It's also completely useless though.

 

3) Some type of "average configuration". This is really hard to do and introduces a lot of things which might be completely unrelated to the actual OS. For example, Flash Player on Windows opens up a ton of security holes which I do not believe should be blamed on Microsoft.

 

So if you ask me, only number 1 makes any sense, but I wouldn't be surprised if GNU/Linux "wins" at both number 1 and number 3. Number 2 is a draw for all OSes and therefore meaningless.

 

 

2 hours ago, leadeater said:

Is an OS more secure than another because it's more secure in its default condition, even if the other can be hardened to the same level or better? I would lean towards yes, the more secure default is better.

I agree. Measure the security of an OS with its defaults.

At least in cases like this where we're only talking about generalizations.

 

 

2 hours ago, leadeater said:

However, I have seen many cases of admins making a Windows system less secure than default, and the same for Linux.

I think those should be ignored just like it isn't a vulnerability to deliberately give a script full access to delete a bunch of important files.

 

 

2 hours ago, leadeater said:

Sample size should rule out much of the above problems, but I just don't feel the represented Windows systems encompass enough actually secured systems; I do agree there is therein evidence to use as justification that it is less secure.

Well if the argument is that most Windows machines aren't secure because of poor configuration and decisions from Microsoft (such as bad defaults), would it not be fair to say that if we were to generalize the operating systems, Windows would be less secure?

I think that sounds completely fair.

 

Sure, there might be some hypothetical scenario where Windows is as secure as or more secure than GNU/Linux, but when making generalizations you should make reasonable assumptions. The most reasonable being either a clean, default install, or some kind of "average" install (which I think would give GNU/Linux a massive advantage because of a higher average user knowledge level).

 

 

2 hours ago, leadeater said:

when Microsoft starts to try to develop things like package management, the Windows community vehemently fights against it. I will say the execution of those efforts has been poor, but I doubt there would be much difference either way. That said, Chocolatey does exist and is pretty awesome.

That's because they have done such a terrible job with it. It has gotten a lot better, but if Microsoft wants to make changes they need to do them properly right from the start, and make improving things for the users the primary focus rather than making money.

 

If they had allowed win32 programs in the app store, without needing a Microsoft account, right from day 1, then I don't think there would have been as much backlash. I certainly wouldn't have shat on it as much as I did.

 

 

I think it's pretty obvious that the app store in Windows 10 was a poor attempt to save Windows Phone. That was their primary goal with it. They didn't push the store to desktop users because they felt like it would be good to have a package manager on Windows. They did it to artificially inflate the number of "UWP users" to make people interested in developing UWP apps. They felt like developers didn't support Windows Phone and tried to change that by pushing things onto desktop users (where they have a monopoly).

It's like when Google+ started to fail. Instead of fixing the issues they forced it onto all YouTube users to artificially inflate the use of their failed platform. Have failing things leech off the success of their popular products. Microsoft did a ton of stuff like that with Windows 10 and Windows Phone.

I hate shit like that.


2 hours ago, LAwLz said:

Instead of fixing the issues they forced it onto all YouTube users to artificially inflate the use of their failed platform.

AFAIK they did it to every single service. If you created a Gmail account it automatically made a Google+ profile too, for example.

 

4 hours ago, leadeater said:

package management the Windows community vehemently fights against it

Because unlike on Linux, if they have their way they could lock out any app they please... Plus Windows is still pretty monolithic and a huge mess. First they should get that spaghetti code fixed and broken up into modules that can be removed and added on demand. But they won't do it because they are still blinded by greed. (Or they are just incredibly stupid, take your pick.)


4 hours ago, jagdtigger said:

Because unlike on Linux, if they have their way they could lock out any app they please... Plus Windows is still pretty monolithic and a huge mess. First they should get that spaghetti code fixed and broken up into modules that can be removed and added on demand. But they won't do it because they are still blinded by greed. (Or they are just incredibly stupid, take your pick.)

That is much harder than you think and imply, much harder. They could, but try justifying that cost to finance for something which really doesn't bring that much benefit; being a better approach doesn't mean it's enough to justify the resource expense of doing it. That's generally why most applications with 20+ years of legacy in them suck.


6 hours ago, LAwLz said:

2) A hardened version of each OS. I think this is a poor reference point because I can make any OS, including DOS, completely unhackable. I just have to change things until it is completely broken and unusable. Remove all the ways someone could potentially interact with the computer and all of a sudden it's completely secure. It's also completely useless though.

The term hardened specifically means usable; it's an industry term for applying configuration to the system that makes it more secure. There isn't, however, a standardized way that is done or how to do it; it's more of a fluffy term, but it's pretty accurate to what is being done.


7 hours ago, LAwLz said:

Sure, there might be some hypothetical scenario where Windows is as secure as or more secure than GNU/Linux, but when making generalizations you should make reasonable assumptions.

Well, the problem is that generally every Windows system I configure is just as secure as many Linux systems; that's the problem with defaults versus what is used versus what should be used. It used to be, and still is, extremely common for Windows admins to turn off the Windows firewall because they are incapable of or too lazy to figure out what the applications on the server need. It was literally the first thing I changed here when I started working; this used to be the de facto norm until security gained more importance and businesses started to care, requiring IT to care.


3 minutes ago, leadeater said:

That is much harder than you think and imply, much harder. They could, but try justifying that cost to finance for something which really doesn't bring that much benefit; being a better approach doesn't mean it's enough to justify the resource expense of doing it. That's generally why most applications with 20+ years of legacy in them suck.

So then they just simply release a new, inferior version of the same OS, then rip off their paying customer base by basically stealing their data behind their backs? 9_9 (Remember, originally they wanted to sweep it under the carpet; they only admitted it after it blew up in their faces several times.)

To put it simply, they aren't going to do it because then they would lose most of the control they currently have over the platform.


3 hours ago, leadeater said:

The term hardened specifically means usable

No it doesn't.

You usually don't harden a computer to the point where it's unusable because that's kind of pointless. But it doesn't go against the definition of the word.

Hardening just means reducing the attack surface of a system.
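
For instance, one small (hypothetical) hardening-audit step is simply enumerating what is listening on the network, since every listening service is attack surface. A minimal sketch, assuming the third-party psutil package is installed:

    import psutil  # third-party: pip install psutil

    # List listening TCP sockets as a crude measure of network attack surface.
    # On some platforms this needs elevated privileges to see every process.
    for conn in psutil.net_connections(kind="inet"):
        if conn.status != psutil.CONN_LISTEN:
            continue
        try:
            proc = psutil.Process(conn.pid).name() if conn.pid else "unknown"
        except psutil.NoSuchProcess:
            proc = "exited"
        print(f"{conn.laddr.ip}:{conn.laddr.port}  {proc}")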

 

 

 

3 hours ago, leadeater said:

Well, the problem is that generally every Windows system I configure is just as secure as many Linux systems

You sure about that? In what way is it as secure? Number of exploits that can be taken advantage of? Number of unique malware which can run on the machine? The likelihood of a user getting their system compromised?

 

3 hours ago, leadeater said:

that's the problem with defaults versus what is used versus what should be used.

If it "should be used" then it should be the default. Otherwise the default is incorrect and Microsoft deserve to get shit for shipping incorrectly configured software.

 

3 hours ago, leadeater said:

It used to be, and still is, extremely common for Windows admins to turn off the Windows firewall because they are incapable of or too lazy to figure out what the applications on the server need. It was literally the first thing I changed here when I started working; this used to be the de facto norm until security gained more importance and businesses started to care, requiring IT to care.

I feel bad for where you work. That has never happened, outside of brief testing and troubleshooting scenarios, where I work.

But we take security pretty seriously so maybe it varies from company to company.


4 hours ago, LAwLz said:

Hardening just means reducing the attack surface of a system.

Yeah, which you said means it's unusable, or implied that hardening does that. I'm pretty sure you knew what I meant by hardening the OS, so eh, pointless to discuss that further. Your 3rd option is the hardened option; it's the one that is done when you harden a system, because the goal is to secure it while not rendering it unusable. If I wanted to do that I'd just pull the power out, in fact why not just put the entire thing through a wood chipper, there, perfect security. Option 2 was a bit redundant to the point.

 

4 hours ago, LAwLz said:

You sure about that? In what way is it as secure? Number of exploits that can be taken advantage of? Number of unique malware which can run on the machine? The likelihood of a user getting their system compromised?

You sure about that for Linux? Like I have said, you can secure a system, and it is secure until a vulnerability is found or operator error makes the system insecure. I'm sure you have picked up on it, but when operator error is a significant factor towards an OS being more vulnerable than another, I personally lean towards that not being an OS issue. It's a mix of both: like in woodworking, there are many dangerous tools, and if you get injured, was it the user that caused it or is the machine just unsafe? It can actually be 3 different possibilities: user, machine, or a combination of both.

 

As to whether I'm sure, yeah, I'm sure enough. You can only do so much to protect against what is unknown; you should have the known covered. We also get a security audit every year. Before it's brought up as to how our RHEL servers got compromised: that happened between audits, via a vulnerability not known about before or during the last one. Still, the fault lies in those systems not being patched when they should have been or could have been.

 

And therein is the huge problem with these 'which OS is more secure' comparisons: the variables are just too many to make it a worthwhile exercise. An OS should not be made more secure because a competing one is, it should be made more secure because it's possible to do so.

 

Modern security concepts have started focusing less on intrusion prevention (IPS) and more on intrusion detection (IDS), because it's pretty well accepted now that perfection is impossible, so you are in all likelihood going to be compromised in some way at some point. Detection is becoming more important than prevention, because if you have accepted the inevitable you should be putting maximum effort toward detecting it, without reducing efforts to prevent it. The cost of security is only going up.

 

4 hours ago, LAwLz said:

I feel bad for where you work. That has never happened, outside of brief testing and troubleshooting scenarios, where I work.

But we take security pretty seriously so maybe it varies from company to company.

As an IT contractor I've gone in to many clients, and the Windows firewall being off was very common. It's not now, but this was a very real thing for a long time, and if nobody reviews it then it stays the way it is until either the system is decommissioned and/or the provisioning process is changed so the new template has it on, not off. When you work for a contracting company you just have to deal with what you get presented with at the time; when you have thousands of clients, that's just how it goes. That's also what I base my 'very common' on. It's nice no longer doing that work, but it was also nice not doing the same thing every day at the same place; everything has its pros and cons.

 

4 hours ago, LAwLz said:

If it "should be used" then it should be the default. Otherwise the default is incorrect and Microsoft deserve to get shit for shipping incorrectly configured software.

Shipped defaults have to work with basically everything; running with defaults is on the user more than the OS in a lot of cases. I'm way more forgiving of that for client users, but not for system admins and engineers who should know better and are making configuration changes anyway.

 

The defaults are correct for their purpose; you can still use them improperly by just leaving everything at the defaults. How many Linux systems and software packages run with defaults? Is it even recommended to do so? No, it is not.


8 hours ago, leadeater said:

Yeah, which you said means it's unusable, or implied that hardening does that. I'm pretty sure you knew what I meant by hardening the OS, so eh, pointless to discuss that further.

I did not say "hardening = make it unusable".

What I said was that we can't really use a hardened OS as a reference point when making generalizations, because you can just take hardening to the extreme and make any OS completely secure. As soon as someone brings up any potential vulnerability against any system you can just go "well, let's imagine the OS was hardened against that". It is possible to protect any OS against any type of attack, but doing so simultaneously means that the computer can not be used for any real purposes either.

That's why I don't think a hardened version of an OS is a good reference point to compare against. Because who decides where on the usability spectrum we should be?

 

 

8 hours ago, leadeater said:

Your 3rd option is the hardened option; it's the one that is done when you harden a system, because the goal is to secure it while not rendering it unusable.

You sure about that? Does the average Joe harden Windows on their personal computer? If so, in what ways do they do it, and why isn't Microsoft shipping the OS like that by default?

 

 

8 hours ago, leadeater said:

You sure about that for Linux?

The burden of proof is on the one making a statement.

You can not make a claim and when I question it go "prove the opposite!".

 

 

8 hours ago, leadeater said:

Like I have said, you can secure a system, and it is secure until a vulnerability is found or operator error makes the system insecure. I'm sure you have picked up on it, but when operator error is a significant factor towards an OS being more vulnerable than another, I personally lean towards that not being an OS issue. It's a mix of both: like in woodworking, there are many dangerous tools, and if you get injured, was it the user that caused it or is the machine just unsafe? It can actually be 3 different possibilities: user, machine, or a combination of both.

I have already addressed this before.

 

"The user doing something stupid" is one vulnerability, but that is the same for all OSes and therefore isn't really relevant to the debate of "which OS is the most secure". What is relevant is things like the number of zero day vulnerabilities, the average time to patch vulnerabilities, the default configurations etc.

You keep bringing up things which are the same between all OSes for some reason. Why? Why do you keep bringing up that the user can do something stupid when it's completely irrelevant to the conversation? The only thing that should be discussed here is the number of vulnerabilities that can be blamed on the OS. The number of instances where the OS is not acting the way it was intended, design wise, and that leads to malicious acts being possible on the OS. That is all I want to talk about. I do not give a damn if "a user can do something stupid on both OSes" because everything that is equal for both OSes is irrelevant when making a comparison.

You don't focus on the similarities when comparing two things, right? You focus on the things that are different. The user is the same for both in this case so they should be ignored and not brought up.
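
On the 'average time to patch' metric mentioned above, here is a toy sketch of how such a number could be computed; the dates are entirely made up, and a real analysis would pull them from a CVE feed or vendor advisories:

    from datetime import date

    # Hypothetical (disclosure date, patch date) pairs for a handful of advisories.
    advisories = [
        (date(2019, 1, 8), date(2019, 2, 12)),
        (date(2019, 3, 5), date(2019, 3, 19)),
        (date(2019, 5, 14), date(2019, 5, 14)),
    ]

    days_to_patch = [(patched - disclosed).days for disclosed, patched in advisories]
    print("average days to patch:", sum(days_to_patch) / len(days_to_patch))
    # -> average days to patch: 16.33...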

 

 

8 hours ago, leadeater said:

And therein is the huge problem with these 'which OS is more secure' comparisons: the variables are just too many to make it a worthwhile exercise. An OS should not be made more secure because a competing one is, it should be made more secure because it's possible to do so.

What are you on about?

If someone asks "which OS puts me at the least risk of being compromised straight out of the box, or with a typical setup" then of course there is a possibility to get a generalized answer, and the answer to that question will most likely be a GNU/Linux distro (or possibly a BSD one).

What's next, you're gonna talk about how we can't generalize and say an RTX 2080 Ti is faster than an RX 580 because "there are too many variables"?

 

This whole conversation started when moose said GNU/Linux users were ignorant and delusional because they thought they were safe, to which I replied that in general it's true to say that GNU/Linux has less malware and can be considered safer on a wide range of criteria, and moose replied to that by saying Linux is no more secure than any other OS and that to say anything else is just naive.

 

 

Why shouldn't OS makers compete on who can make the most secure OS? Comparing how they perform from a security perspective can give very valuable insight into design decisions. If one way of doing something offers no drawbacks but better security, shouldn't that be taken into consideration for future releases? Just look at things like Retpoline. It started off as a GNU/Linux feature and now Microsoft has adopted it on Windows because it offers the same protection, but with a much smaller performance penalty. Do you not think it was a good thing Microsoft looked at how GNU/Linux solved that and implemented it in Windows? It provided a ~50% performance increase in some scenarios.

 

 

9 hours ago, leadeater said:

As an IT contractor I've gone in to many clients, and the Windows firewall being off was very common. It's not now, but this was a very real thing for a long time, and if nobody reviews it then it stays the way it is until either the system is decommissioned and/or the provisioning process is changed so the new template has it on, not off. When you work for a contracting company you just have to deal with what you get presented with at the time; when you have thousands of clients, that's just how it goes. That's also what I base my 'very common' on. It's nice no longer doing that work, but it was also nice not doing the same thing every day at the same place; everything has its pros and cons.

I also work for a consulting firm. I'm not sure how long ago you're talking about, but I have never been at a customer where they got a bunch of computers without a client firewall running.

It might also be a regional difference where it might have been common in one country, but not in another.

 

 

9 hours ago, leadeater said:

Shipped defaults have to work with basically everything; running with defaults is on the user more than the OS in a lot of cases. I'm way more forgiving of that for client users, but not for system admins and engineers who should know better and are making configuration changes anyway.

I think it's very important to note that I have all throughout this thread talked mainly about home users. Enterprise is a completely different beast altogether.

I have very little interest in discussing GNU/Linux vs Windows from a security perspective, for use in enterprise.

I want to talk about, and have been talking about, GNU/Linux vs Windows security for use as a personal computer for the average user at home.


2 hours ago, LAwLz said:

You sure about that? Does the average Joe harden Windows on their personal computer? If so, in what ways do they do it, and why isn't Microsoft shipping the OS like that by default?

So when I said Windows servers and was talking about servers, where do personal computers come in? True, I have mentioned them as part of the discussion, but that point was and has been about Windows Server. I don't expect any home user to do any security hardening at all; I expect them to do the opposite, in fact. Which, that last bit may I add, is one of my points as to why this is a useless exercise.

 

2 hours ago, LAwLz said:

The burden of proof is on the one making a statement.

You can not make a claim and when I question it go "prove the opposite!".

I have, via the point about security audits; I'm not just saying it, the external contracted company hasn't raised any major security concerns in that area. I'm not the one proclaiming that Linux is more secure than Windows; I'm saying there is no reason to bother, and as far as I'm concerned our Windows servers are no less secure than our Linux ones.

 

Defaults suck on both sides; do I care which defaults are worse? Two people are standing in an effluent pond arguing whose side is deeper; they're both standing in effluent. Probably better to get out than argue about who's in the worse position.

 

2 hours ago, LAwLz said:

Why shouldn't OS makers compete on who can make the most secure OS?

Compete all you like; an OS shouldn't be made more secure because another is, it should be made more secure because it can be. Not sure what was difficult to understand about that. Should one stop because it's the most secure? Yes? They are the market leader so they can stop, right? I think you are trying to over-analyse what is being said.

 

2 hours ago, LAwLz said:

What's next, you're gonna talk about how we can't generalize and say an RTX 2080 Ti is faster than an RX 580 because "there are too many variables"?

No, because there aren't too many variables; that has even been proven. I don't think this was a good example.

 

3 hours ago, LAwLz said:

I think it's very important to note that I have all throughout this thread talked mainly about home users. Enterprise is a completely different beast altogether.

I have very little interest in discussing GNU/Linux vs Windows from a security perspective, for use in enterprise.

I want to talk about, and have been talking about, GNU/Linux vs Windows security for use as a personal computer for the average user at home.

That's fine, you can have the discussion you want, but you addressed mine, which was about enterprise, and it was pretty evident that that was what I was talking about.

