"iOS Security is fucked" - Zerodium Stops Accepting iOS Exploits Because of Too Many Submissions

LAwLz
11 minutes ago, StDragon said:

Nope. I'm calling BS on this. As someone who works on the MSP side of things, most abandoned sites that stay up do so for several reasons, none of which relate to altruism.

 

-Rotating IT staff in which the previous admin was fired or left - no passing of institutional knowledge.

-Management keeps paying the invoice thinking it's something "important" without understanding it's just deadwood.

-The hosting provider also suffers from poor IT systems administration, such that sites are kept online long after the customer stops paying the bill.

 

It really is comical. IT seems to be the only industry where you can get away with not paying for something, or where you do pay and the content is still maintained improperly. It's quasi-dysfunctional given how rapidly the industry changes. Try getting away with that at a physical storage facility: if the bill isn't paid, they will forklift your junk to the curb. But in IT, storage is so cheap that a crappy little website is a statistical rounding error in the storage consumed on the SAN.

 

As always, "Never attribute to malice that which can be adequately explained by stupidity"

There are literally hundreds of sites that fall under my administrative access that are "left up" because they are, literally, memorial sites. Once in a while the owner of the site comes back and either wants it deleted or wants to update it.

 

VPS and bulk hosting is largely in this realm. If it costs nothing to host, then it will remain up perpetually, because it costs money (as in paying someone to handle all the proper legal and technical responsibilities) to shut down. When a host receives a DMCA or C&D about a site, it costs less to shut the site down than to investigate, and if the owner isn't coming back for a few years, it's likely not even a loss to them.


9 minutes ago, Kisai said:

There are literally hundreds of sites that fall under my administrative access that are "left up" because they are, literally, memorial sites. Once in a while the owner of the site comes back and either wants it deleted or wants to update it.

Sure, assuming it's all part of some contractual obligation between you and the client.

 

Quote

VPS and bulk hosting is largely in this realm. If it costs nothing to host, then it will remain up perpetually, because it costs money (as in paying someone to handle all the proper legal and technical responsibilities) to shut down. When a host receives a DMCA or C&D about a site, it costs less to shut the site down than to investigate, and if the owner isn't coming back for a few years, it's likely not even a loss to them.

 

Correct.


1 hour ago, Kisai said:

Part of Android's failure can also be laid at the feet of Linux developers (no not the kernel) insisting on changing things for fundamentally "not invented here" reasons. There is no reason to keep changing the ABI (yes ABI) with every major and minor OS version of all the libraries on the OS. If you need to make a breaking change, keep the old and new library around until nothing on the device uses the old library. 

I'd like to address this block, since it is of interest to me as well. Specifically, the part about ABI. This is what you're referring to, right?


Is this in any way related?

 


3 hours ago, Kisai said:

 

Manage a website, will you. Users complain in both directions: 1) the ads interfere with their viewing of the website, and 2) blocking the ads makes the site unnavigable, since all the ad spaces collapse and the content is no longer where it's supposed to be.

 

That's why I said it's PEBKAC. These are issues caused by the user trying to assert control over a website they have no business tampering with. DO REALIZE you are running software on your computer that can also be compromised. No honor among thieves.

Ah. Different viewpoint. I'm viewing it as a user. Embedded-in-the-page ads and pop-ups are different. I don't dislike the part-of-page ones nearly so much as the ones that hijack your screen and force interaction to break your concentration. I've sort of fantasized about the folks who built those being forced to wear some kind of shock collar: every time a thousand people were sufficiently annoyed by an ad to click a special button, they would receive a mild shock. The constant electrical shocks would probably kill them.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


23 hours ago, TopHatProductions115 said:

More or less, 

 

Basically, when a device is "ABI stable", that means that shared libraries on the OS only need to be deployed to the device once, so what might be a 300MB monolithic binary can be carved down to a 70MB binary, with the C/C++/Swift/C#/ObjC/Rust runtime and library bindings provided by the OS. However, this doesn't address the problem of migrating between devices. For one application to run on all devices with that OS, you need both 32-bit and 64-bit ABIs for each CPU type (Intel, Arm, any others such as RISC-V), multiplied by the OS versions. A version of an application designed for, say, iOS 5 will work on iOS 13 as long as the 32-bit ABI is still present on the device. But if you were actually developing iOS software to begin with, you would already have both 32-bit and 64-bit ARM and Intel builds anyway, because you need the Intel builds to run on the simulator on the Mac and the ARM builds to run on the device itself.

 

Android's problem is worse than iOS's, because Linux OSes do this a lot: they will change or remove libraries on the device for no apparent reason other than to refactor something or "improve security", yet doing so changes the actual C++ name mangling (there is no standard way to do this), which is why pretty much everything is required to use C bindings if it's expected to survive a version change. Yet simply adding a parameter to a function destroys its compatibility, since C has no overloading or default parameters. A consequence of this on both Windows and Linux is that you end up with dozens or even hundreds of versions of the same library, removing any space advantage of having them in the first place. You may as well just statically compile the entire thing, as that's the only way it's going to survive an OS version update.

 

Unity is a good example of doing things wrong. Under ideal circumstances, you should be able to just install a generic "Unity Engine" runtime on the OS, and then every Unity program should use that runtime (just like Flash). In practice, every Unity game ships with its own version of the Unity runtime, which cannot be updated. Even though it's still technically a shared library within its own program directory, all the C# assemblies will stop working if you change it. In theory you could pull an Android game from the app store, copy it to your PC, and run it with the Unity engine for Windows/Mac, if you can find the exact version used.

 

Which brings us to the other half of this, which a previous poster mentioned in passing: on iOS the webview MUST be used by all applications that access the web. This ensures that all applications have secure web access, and that third-party programs don't ship something like CEF (Chromium Embedded Framework) that doesn't play by the rules and never gets updated.

 

Many Mac/PC games that use something like RPG Maker MV, or "export to HTML5" from some other engine via CEF (e.g. nw.js) or Electron (also embedded Chrome), are relying on stable HTML5+JavaScript features, which Chrome doesn't really deliver. When they are exported to iOS they use the native webview, but when exported to Android they often use the native Chrome engine. So this is another layer where things can break, and most HTML5 "apps" have much poorer performance on Android because of broken GPU drivers. Though let's be honest, there are broken GPU drivers on all platforms, otherwise there wouldn't be a mile-long list of workarounds in Chrome.

 

Ideally, something that replaces Android would have a stable ABI, where anything released on Version 1.0 of the OS also works on Version 999. No more deliberate breaking of backwards compatibility just because some gee-whiz feature another OS has is popular. An OS should provide the bare minimum required to run software on top. No installing "user libraries" that stomp on each other. Just build a system compiler into the device that works against the intermediate code from the development environment, so that any application installed is compiled for the hardware the device has. Don't bother with shared libraries that don't interface with hardware. If this means every program is going to be 256MB instead of 64MB, maybe reconsider using all these obnoxiously large monolithic game engines to do small things we could do 15 years ago.


Sounds like EOL Windows 7 is more secure than iOS :D

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

On 5/18/2020 at 12:38 AM, Kisai said:

Because the vendors using it would rather use their own OS. Samsung in particular, who has 40% of the Android market, has phones that run Tizen and pushes Tizen on all their other products. All of them. If they own the app store, they get the money from it. Not that Android even has a healthy or profitable app store to begin with.

Again, doesn't mean the OS is dead.

Of course vendors would rather use their own OS if they could. Don't you think Dell, HP, Lenovo, etc would also like to ditch Windows if they could and would rather have their own OSes with their own stores if possible? It's a no-brainer.

 

On 5/18/2020 at 12:38 AM, Kisai said:

Not that Android even has a healthy or profitable app store to begin with.

Google had a gross revenue of 24.8 BILLION dollars from Google Play Store alone in 2018. I am pretty sure they make healthy profits from it. You are once again talking about stuff you don't have the first clue about. Something you seem to do quite a lot. Or are you going to try and say that 24.8 billion dollars isn't "profitable" or "healthy"?

 

 

On 5/18/2020 at 12:38 AM, Kisai said:

Both Google Play and Chrome extensions, full of bugs and rubbish, are examples of Google just not giving even a little thought to it. Android is dead, and it's better off that something better comes along that doesn't make any of Android's mistakes. Google v Oracle is STILL going on, and it's still possible for Oracle to win it and utterly wreck the Android OS due to the reliance on Java APIs, even though nobody liked the Dalvik Java junk to begin with.

Again, Android is not dead. It's about as far from dead as you can get. You don't know what "dead" means if that's what you think.

By the way, it's interesting that you bring up Dalvik, because that's actually dead, so not sure why you bring it up. And when I say dead I mean it. Dalvik has not been used in Android since version 5.0. And I don't mean "it is sometimes used but not always": Dalvik was completely removed in Android 5.0 and onward. Android phones haven't used Dalvik for 6 years.

 

 

On 5/18/2020 at 12:38 AM, Kisai said:

It does not prevent malware, it does not prevent viruses, it does not eliminate annoyances from websites.

Yes it does all three of those.

I get that you run websites and want to stop people from using adblockers, but stop lying. There are countless examples where adblockers have prevented malware, even on large and "trusted" sites. Like @Bombastinator said, it's edge cases, but it does happen.

It most certainly blocks annoyances from websites. I mean, did you even stop and think about what you were writing here? If adblockers didn't do these things (such as remove annoyances), people would not use them. People use adblockers because they do remove annoyances (ads, for example). Claiming otherwise is illogical.

 

 

On 5/18/2020 at 12:38 AM, Kisai said:

Most of the blockers are ginormous resource hogs because they have regex filters for every website that runs ads in them. That's ass-backwards software design. The only effective way to prevent "ads" is at the DNS level, by which the web browser is smart enough to cache things once looked up. Yet many of the worst ad vendors actually create new domains daily just to get around that as well.

And despite them being such resource hogs, they still end up increasing performance. Yes, you get higher performance when browsing your typical site with adblocking on than with it off.

Blocking ads at DNS level is the inefficient way because DNS can only block entire domains. The reason why you still see ads with DNS adblocking isn't because "ad vendors create new domains daily", it's because the ads are hosted on domains you can't block at the DNS level without also blocking non-ad content.

 

 

On 5/18/2020 at 12:38 AM, Kisai said:

And ad blockers would never have had to be a thing if Google didn't screw it up for everyone. Google keeps all the good paying ads for itself, and passes along the worthless ads to everyone else, makes egregious demands on both websites and advertisers that chase them off their platform, and right into the arms of the shitty ad services that make no attempt to check for quality.

I disagree. It's not Google's fault that sites started doing things like autoplaying video ads, or ads that pop in above the content so the content moves around while the page is loading, or all the fake download buttons, and so on. That's on the website owners.

 

 

On 5/18/2020 at 12:38 AM, Kisai said:

The best thing Google could do to come back from the hell they created for themselves is to stop making nitpicky bullshit demands about both who they advertise on and who is allowed to buy advertising, and instead silently blackhole ads that contain JavaScript and WASM by default, thereby only permitting ads that are PNG or JPEG, which computer-vision algorithms can check and humans can sign off on. Only permit ads to link to product websites that have existed for 3 months and have an SSL certificate from someone other than CentOS/Let's Encrypt/Cloudflare. Don't allow ads that link to sites behind Cloudflare or other "privacy hiding" nonsense. If a site is unwilling to pay for its own SSL certificate to sell a product, you don't want to go there anyway.

That would break the ad model. What you are describing sounds good: just static images and links in ads. But that's not what advertisers want. Advertisers want to track users. They want JS so they can know if you hovered over the ad, or how long you spent on the site, and things like that. It's not Google's fault.

You complained about ads paying little and Google hogging all the "good ads" for itself? The ads you describe would be beyond "shitty" in terms of revenue. Companies don't want to spend the same amount of money or more and get less back (and yes, less tracking and less insight into how their ads perform would be worth less to them).


58 minutes ago, LAwLz said:

Google had a gross revenue of 24.8 BILLION dollars from Google Play Store alone in 2018. I am pretty sure they make healthy profits from it. You are once again talking about stuff you don't have the first clue about. Something you seem to do quite a lot. Or are you going to try and say that 24.8 billion dollars isn't "profitable" or "healthy"?

Gross revenue is not profit.  Other than that I largely agree with what you had to say.


2 minutes ago, Koeshi said:

Gross revenue is not profit.  Other than that I largely agree with what you had to say.

It is cash flow, though, which seems to be enough for many investors in tech lately. Google is famous for leading with gross cash flow over actual profit margin, being unable to convert that gross cash flow into margin, and eventually killing the system in question. I don't think Google could kill the Play Store even if they can't convert, though. They're still battling Apple.



On 5/18/2020 at 2:33 AM, SpaceGhostC2C said:

Thieves? LOL, are you in charge of some terrible website not getting the income it mistakenly expected to get? :D 

From the sounds of it, yes.

That's why he is trying to convince people that adblocking is a bad idea.

 

 

From what I have gathered, Kisai runs some websites *** and he is mad at Google for not providing him with ads for the site.

In some other thread he was mad at Cloudflare for not shutting down sites that rehosted content he had made.

In this thread he is mad at people blocking ads.

 

Basically, from what I can gather most of his posts in these types of threads are "baww companies should make it so that I make more money baww!".

People pirate his content? All companies should stand in line to help him fight the pirates, even if they have no legal obligation to do so. Entire sites should also be shut down because someone posted Kisai's content on it!

Google doesn't want to run their ads on porn sites? Google is the devil because they do not allow Kisai to use them for advertising! Down with Google!

People use adblock? Those people are thieves!

Edited by LogicalDrm
*** = content removed by staff

@Kisai Instead of replacing Android, why not attempt to make it ABI-stable? Would it be less work to do so, or to make a completely new mobile OS? Do realise, I sarcastically mentioned creating a new Mobile OS competitor for a reason. For every successful OS/kernel that you've seen or used, there are many more that never see the light of day. They die, fast. Especially for the amount of effort that it takes to make them. The truly dead OS examples that I listed previously were simply the ones that I remembered off the top of my head. The real list is way longer...


7 minutes ago, Koeshi said:

Gross revenue is not profit.  Other than that I largely agree with what you had to say.

Yeah but I mean...

If you're going to call an OS "dead" maybe don't pick the one that has tens of billions of dollars in revenue and over 80% market share.

 

Pretty sure Google makes profits on those 24.8+ billion dollars as well. I just couldn't find numbers for it, so I went with the numbers I did find.


55 minutes ago, TopHatProductions115 said:

@Kisai Instead of replacing Android, why not attempt to make it ABI-stable? Would it be less work to do so, or to make a completely new mobile OS? Do realise, I sarcastically mentioned creating a new Mobile OS competitor for a reason. For every successful OS/kernel that you've seen or used, there are many more that never see the light of day. They die, fast. Especially for the amount of effort that it takes to make them. The truly dead OS examples that I listed previously were simply the ones that I remembered off the top of my head. The real list is way longer...

1) Don't trust what Kisai writes. He is a good example of "I understand a tiny bit about this, so I think I know everything!" and he makes up a lot of details. A lot of what he says might be right, but it might also be wrong.

 

2) Android has stable ABIs. For example, Project Treble brought a stable ABI between the user-space and kernel/device (HALs). You can read about it here under the "HAL interface definition language". That ensures a stable ABI for some parts of Android.

 

3) It's important to remember that ABI is not just one thing. When you use, for example, Windows, you will run into several ABIs in the background. Some of them are stable and some aren't. For example, the ABI for DirectX is not stable. The ABI for Swift (one of the primary languages iOS apps are written in) is not stable. "Stable" is also a relative term: how long without major breaking changes counts as "stable"? Is it 1 year? 3 years? 5 years? That's why drivers for one Windows version sometimes work on another Windows version: no changes were made to the ABI between those versions (fewer changes = more "stable").

 

 

Disclaimer: I know very little about ABIs and programming in general. Hence why I am light on details.


10 minutes ago, LAwLz said:

1) Don't trust what Kisai writes. He is a good example of "I understand a tiny bit about this, so I think I know everything!" and he makes up a lot of details. A lot of what he says might be right, but it might also be wrong.

 

2) Android has stable ABIs. For example, Project Treble brought a stable ABI for drivers and the HAL. You can read about it here under the "HAL interface definition language". That ensures a stable ABI for some parts of Android.

 

3) It's important to remember that ABI is not just one thing. When you use, for example, Windows, you will run into several ABIs in the background. Some of them are stable and some aren't. For example, the ABI for DirectX is not stable. The ABI for Swift (one of the primary languages iOS apps are written in) is not stable. "Stable" is also a relative term: how long without major breaking changes counts as "stable"? Is it 1 year? 3 years? 5 years? That's why drivers for one Windows version sometimes work on another Windows version: no changes were made to the ABI between those versions (fewer changes = more "stable").

 

 

Disclaimer: I know very little about ABIs and programming in general. Hence why I am light on details.

Don’t trust what anyone writes.  
 

We’re all just humans.
 

 I find @Kisai’s perspective to be interesting, just as I find @LAwLz’s perspective interesting.  It’s not binary.  Very little in this world is truly binary.  A lot less than gets treated as such.

 

The impression I get is that @Kisai is some sort of sysadmin for a large industrial system and works in a large non-tech company. This provides a very useful perspective. No one sees everything though. My perspective is generally much less useful. I was a sysadmin assistant for an art school in the '80s and early '90s that used Macs, I later was a Photoshop jockey, then bagged arts and went PC as a casual gamer. Mostly I merely had a decent education and I'm old. Much less useful generally. I don't know what @LAwLz's perspective is. It is probably also more useful than mine in many ways.



On 5/19/2020 at 5:44 PM, Bombastinator said:

Don’t trust what anyone writes.  We’re all just humans.  I find @Kisai’s perspective to be interesting, just as I find @LAwLz’s perspective interesting.  It’s not binary.  Very little in this world is truly binary.  A lot less than gets treated as such.

 

The impression I get is that @Kisai is some sort of sysadmin for a large industrial system and works in a large non-tech company. This provides a very useful perspective. No one sees everything though. My perspective is generally much less useful. I was a sysadmin assistant for an art school in the '80s and early '90s that used Macs, I later was a Photoshop jockey, then bagged arts and went PC as a casual gamer. Mostly I merely had a decent education and I'm old. Much less useful generally. I don't know what @LAwLz's perspective is. It is probably also more useful than mine in many ways.

I don't think Kisai is a sysadmin. More likely a student ***. That's the impression I get at least. But then again, a lot of things he has said are very outdated, so maybe he is too old to be a student.

 

If you're wondering what I do, I am a networking consultant at an MSP. I know very little programming and much less Android development. My job doesn't really give any additional insight into this discussion though, so my perspective is "someone who is interested in technology". Not much more than that.

 

It's true that a lot of things aren't black and white. But it's pretty easy to spot someone who is just flat-out wrong. For example, ranting about how someone hates Android and listing Dalvik as an example, when Dalvik has not been used in Android for 6 years, is quite telling. Or saying that Android doesn't have a stable ABI and therefore libraries break between versions (which they don't; like I said earlier, Android has stable APIs for userspace, and a stable ABI from user-land to the HAL). Those things are binary (black and white).

Edited by LogicalDrm
*** = content removed by staff

On 5/19/2020 at 5:57 PM, LAwLz said:

 

It’s possible I’m confusing @Kisai and @LAwLz. Battle of the pigtailed schoolgirl avatars.  One of the disadvantages of being old.  

Edited by LogicalDrm



9 minutes ago, foldingNoob said:

I want to know how exploits are found. It's not something they teach at university.

My info on this is very very old and dealt with much simpler systems. As such it may be even worse than what you have now.
 

Back in the day it was done by hand and involved developing a very deep, low-level understanding of the code. A lot of reverse engineering was done to produce readable versions of the code. Binary code is very, very hard to read, but it can be done. There are also programs that will take binary code and attempt to derive source from it. They used to get it very, very wrong as a rule, but even that was occasionally useful. Once some version of the source is available, it is much easier to look for holes. It required extremely detailed knowledge of systems, so deep and hard to obtain that criminals rarely acquired it.
Exploits came not from criminals but from feuding governments or corporations that could afford to hire the expertise. Generally, someone capable of doing these things could make a much better living doing non-criminal work.

 

My impression is that AI and machine learning could have, or may already have had, a drastic sea-change effect on this.



34 minutes ago, foldingNoob said:

I want to know how exploits are found. It's not something they teach at university.

It kind of is though.

For example I studied networking at college. Part of the curriculum was securing networks from common attacks. When learning about that, you obviously also learn how those attacks are made, and therefore can do them yourself.

 

A lot of knowledge about security comes from knowing the fundamentals of something and trying to find ways around it as well. A course at a university will never be able to teach you "here is how to hack 1) do this 2) do that 3) then do this 4) done!". But it will teach you "this is how memory is read by a program" and from that you can think of ways yourself on how to exploit it.


On 5/19/2020 at 5:57 PM, LAwLz said:

 

 

 

I work for myself and have several clients, one of which is an office of a Fortune 500 company worth about 20 billion dollars; the other two are internet-only clients that I had prior to them, whose clients are primarily artists who make very little money because they are popular enough to have their content stolen by petty criminals on the internet.

 

I've done sysadmin, web design/development, C/C++/PHP/Perl/Java programming, and PC hardware replacements/builds. I've done a lot, and when I see babies on the internet jumping on bandwagons that were crushed years or even decades ago, I point out the folly of beating the dead horse again.

 

Android -IS- a dead OS. It is harder to develop for than iOS, and it suffers from problems that stem from Google's techbro culture and Linux's "not-invented-here" culture. It basically aimed to steal RIM's clients, but Apple beat them to the punch, so they changed direction and aimed to steal Apple's much better user experience, while not changing the underlying garbage-and-gum that holds Android together. Someone else needs to step up and replace Android, and you can see Google doing that to itself with Fuchsia, Samsung with Tizen, LG with WebOS, Huawei with Harmony OS, and so forth. Nobody likes Android, and we're going to see it thrown under the bus sometime soon, because that garbage-and-gum holding it together doesn't scale across processing cores and is responsible for Android hardware being so fickle and difficult to develop for. Google was still promoting Java as the language to develop Android software in as late as Android 4.4, when it really should have been encouraging development in C/C++. Yet look at Fuchsia: they are making the same mistake again by trying to get developers to use Dart.

 

Remember the old "Linux is better than Windows" debates? Those debates were put to bed over a decade ago, because every year it was "will (this year) be the year of the Linux desktop" and every year the answer was no. Linux had its chance with Android, and Google buried it with ChromeOS; neither is marketed as Linux, nor has either gained significant market share where it mattered. They are money pits, and it would not surprise me in the least if Google discontinued development on Android once they get Fuchsia to run Android apps and ChromeOS. Literally, here's a company where OS development is not their core competency, and they are developing THREE at once. Two of these need to die, soon.

 

Edited by LogicalDrm

1 hour ago, foldingNoob said:

I want to know how exploits are found. It's not something they teach at university.

Exploits typically take advantage of one or more things:

1) Software bugs

2) Hardware bugs

3) Lax security.

 

You know what the big "exploit" was back when I was a teen? The "Back Orifice" kit. Script kiddiez loved to ship it around to their school PCs. The Win9x lab at the college was just loaded with viruses because the IT department there never updated AV products even once. Meanwhile the NT4 lab just worked, no AV product necessary; it was secured. More than one tool like BO had second-level backdoors in it, and BO itself had its password in plain text inside its binary, so if you wanted to hack any of the script kiddies running it, that was easy too.

 

Win9x was not securable in the slightest: you could boot into safe mode without any login, even if it was on a domain. NT4 and later you could break into with "locksmith" if you had access to the physical machine.

 

XP was where things went sideways, because people were still connecting their PCs directly to cable modems, and they were typically infected with worms/viruses within minutes, because infected machines on the same ISP would be constantly doing sweeps for them. If you wanted to secure an XP machine, you had to download all the updates on a second machine, burn them to a disc, and apply them manually before you even put it on the internet. Once cable modems started coming with a primitive firewall/NAT, this stopped being that big of an issue, but it's still something you will run into, and it's why you should never plug a PC into, or connect a smartphone to, the public internet without the OS being the most recent patched version. That's a core reason why Android is bad: one infected old device connecting to a public AP can infect everything else connected to the same WiFi AP.

 

Hardware bugs like those in Intel CPUs are one of those cases where you typically need physical access to the machine, or at least access to a privileged process on it. So you can't hack a server by uploading client-run JavaScript, but you can potentially hack that server by uploading an image if it runs vulnerable image-processing software such as GD or ImageMagick. That's why many programs that handle images "scrub" the metadata before processing them. All it takes is something malicious in a metadata tag, like JavaScript, and a processing system that spits the tag contents back out unsanitized.
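That last point is easy to demonstrate. Here's a minimal, hypothetical sketch in Python (the `render_tag` helper and the sample payload are my own, not from any real image pipeline): if a system echoes untrusted metadata back into a page verbatim, a script tag planted in, say, an EXIF comment becomes live code; escaping it first turns the payload into inert text.

```python
import html

# Hypothetical helper: render an image's comment/metadata field into HTML.
def render_tag(comment: str) -> str:
    # Unsafe would be: "<p>" + comment + "</p>" -- any embedded <script> executes.
    # Safe: escape the untrusted metadata before it touches the page.
    return "<p>" + html.escape(comment) + "</p>"

payload = '<script>alert(1)</script>'  # e.g. planted in an EXIF comment field
print(render_tag(payload))
# The <script> tag comes out as harmless &lt;script&gt; text, not executable markup.
```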

 

Exploits in software are usually found by people running debuggers on the software, or looking at the source for mistakes. Hardware exploits are harder to pull off, but you can "look at the source" on hardware as well, though it likely won't get you very far unless you are a chip engineer. Most exploits are simply taking advantage of user stupidity.
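To make "looking at the source for mistakes" concrete, here's a hypothetical Python sketch of the classic bug a source review turns up: user input pasted straight into a shell command line. The `build_cmd_*` helpers are illustrative, not from any real codebase.

```python
import shlex

# Bug an auditor would flag: attacker-controlled input concatenated into a command.
def build_cmd_bad(host: str) -> str:
    return "ping -c 1 " + host  # "8.8.8.8; rm -rf /" smuggles in a second command

# Fix: quote the untrusted argument so the shell treats it as one literal token.
def build_cmd_good(host: str) -> str:
    return "ping -c 1 " + shlex.quote(host)

print(build_cmd_bad("8.8.8.8; echo pwned"))   # injected command survives intact
print(build_cmd_good("8.8.8.8; echo pwned"))  # whole string is safely quoted
```

(Better still is to skip the shell entirely and pass an argument list to `subprocess`, but the quoting version shows the bug and the fix side by side.)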

 


Android hard to develop for? I can take the code I use on Windows, jam it into a Xamarin project, and send it to my phone via WiFi. Unlike iOS, I don't have to pay the mighty overlords for permission to put MY code on MY device that I paid nearly £1000 for. In addition, it doesn't stop working after a week, or after a reboot of the device, requiring recompiling and transferring over again.

 

But yeah, really hard to develop for Android, and pointless too as it's about to die due to a lack of market share and use.


10 minutes ago, valdyrgramr said:

But it isn't dead. What you're describing is developmental frustration meeting what you believe is obsolescence. Also, not everyone who jumped from RIM went to Apple; that's just your own theory. I and several others who were on that platform prefer Android over iOS from the user side to the development side, and Android is still alive and well. You're pretty much using the Linux-takeover logic that you said didn't work. Also, a lot of your argument is personal experience and preference, not fact. Apple is not factually better when it comes to a user's experience. That's a personal preference thing.

That's not the argument I'm making here. What I'm saying is that "Android" itself is rubbish, and the market has been moving against it because of things like Google v. Oracle and because all those Java bits are still in Android and kneecap it compared to software developed for iOS. This is why middleware like Unity or Apache Cordova ends up being used: Android is just the worst thing to develop software for. Not even iOS has that much nonsense just to run software.

 

Side-loading and rooting a device aside, there has never been anything but downsides to running Android, and the fact that most Android devices don't even last two years is just another reason not to invest in the Google ecosystem.


10 minutes ago, Kisai said:

Side-loading and rooting a device aside, there has never been anything but downsides to running Android, and the fact that most Android devices don't even last two years is just another reason not to invest in the Google ecosystem.

Galaxy S2, still supported by many applications. iPhone 5? Nah, Apple says that's too old, so you can't run your apps on it any more. Android devices don't last at all.


20 minutes ago, Curious Pineapple said:

Galaxy S2, still supported by many applications. iPhone 5, nahh, Apple says that's too old so you can't run your apps on it any more. Android devices don't last at all.

It's missing features.

 

What applications are we talking about here?

 

Android devices don't last as long if you want an up-to-date OS with security patches, like most people do.

