Reason why Android needs more RAM compared to iOS

RedRound2

Someone doesn't want to believe that he was wrong. ¯\_(ツ)_/¯

Implementing something in hardware for the sake of doing it does not count in the grand scheme of byte code vs machine code. Furthermore, Java byte code is not the only byte code. It's one of 13 that I'm aware of, including LLVM, JAVA, C#, LISP, and some which become increasingly obscure. 

 

Byte code is not machine code just because some people made a hardware simulator for shits and giggles. Sorry, but I'm not wrong. Byte code is machine code when and only when the language exists as a virtual construct and is implemented in hardware for direct acceleration purposes. This is a rare animal and is in no way a method by which to say machine code and byte code are equivalent.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Implementing something in hardware for the sake of doing it does not count in the grand scheme of byte code vs machine code. Furthermore, Java byte code is not the only byte code. It's one of 13 that I'm aware of, including LLVM, JAVA, C#, LISP, and some which become increasingly obscure.

Neither LLVM, JAVA (lol), Java, C# nor LISP is bytecode.

They didn't do it "for the sake of doing it". They did it because it actually makes it faster. You can buy ARM chips which support the Java "bytecode".

 

Byte code is not machine code just because some people made a hardware simulator for shits and giggles. Sorry, but I'm not wrong. Byte code is machine code when and only when the language exists as a virtual construct and is implemented in hardware for direct acceleration purposes. This is a rare animal and is in no way a method by which to say machine code and byte code are equivalent.

It's not a hardware simulator; it is hardware which supports the instruction set (IS) of the bytecode, which makes the bytecode also machine code.

 

You always claim to know so much, so you must also know that you can always build a machine which is equivalent to a program. That also means you can build a machine which supports any bytecode, because every bytecode can be converted to the instruction set of a target CPU (one which already physically exists) by a program (this is a subset of the previous sentence).

 

Now that I've shown that you can build a machine for any corresponding "bytecode", every "bytecode" is therefore also machine code.
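The translation argument above can be sketched in a few lines. This is a hypothetical toy bytecode (the opcode names and stack layout are invented for illustration, not taken from any real VM), with plain Python source standing in for the target machine's instruction set:

```python
# Sketch of the argument: a program can mechanically rewrite bytecode
# into the instruction set of a machine that already exists. Here,
# generated Python source stands in for the target CPU's machine code.

def translate(bytecode):
    """Rewrite toy stack bytecode into an equivalent Python expression."""
    stack = []
    for op, arg in bytecode:
        if op == "PUSH":
            stack.append(str(arg))
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(f"({a} + {b})")
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(f"({a} * {b})")
    return stack.pop()

# (2 + 3) * 4 expressed in the toy bytecode:
program = [("PUSH", 2), ("PUSH", 3), ("ADD", None),
           ("PUSH", 4), ("MUL", None)]
source = translate(program)
print(source)        # -> ((2 + 3) * 4)
print(eval(source))  # -> 20
```

The interesting part is that `translate` is itself just a program, which is the point being argued: the conversion from bytecode to a physical machine's code needs no special hardware.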


Neither LLVM, JAVA (lol), Java, C# nor LISP is bytecode.

They didn't do it "for the sake of doing it". They did it because it actually makes it faster. You can buy ARM chips which support the Java "bytecode".

 

It's not a hardware simulator; it is hardware which supports the instruction set (IS) of the bytecode, which makes the bytecode also machine code.

 

You always claim to know so much, so you must also know that you can always build a machine which is equivalent to a program. That also means you can build a machine which supports any bytecode, because every bytecode can be converted to the instruction set of a target CPU (one which already physically exists) by a program (this is a subset of the previous sentence).

 

Now that I've shown that you can build a machine for any corresponding "bytecode", every "bytecode" is therefore also machine code.

LLVM runs byte code in a language the developers have not released. C# is compiled to byte code and runs on a Microsoft virtual machine. Java is compiled to byte code to run on a Java virtual machine, as is LISP. Are you just wildly ignorant of how these languages work?
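For anyone following the thread, the compile-to-bytecode-then-interpret model both posters are arguing about can be illustrated with a toy stack machine. The opcodes here are invented for illustration; they are not real JVM or CLR opcodes:

```python
# Toy illustration: "bytecode" is instructions for a software-defined
# machine. Compile once to bytecode, then a program (the VM) executes
# it on whatever host CPU it happens to be running on.

PUSH, ADD, MUL = 0, 1, 2  # hypothetical opcodes

def run(bytecode):
    """Interpret a list of (opcode, arg) pairs on a stack machine."""
    stack = []
    for op, arg in bytecode:
        if op == PUSH:
            stack.append(arg)
        elif op == ADD:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == MUL:
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

# (2 + 3) * 4 compiled to the toy bytecode:
program = [(PUSH, 2), (PUSH, 3), (ADD, None), (PUSH, 4), (MUL, None)]
print(run(program))  # -> 20
```

The `run` function plays the role of the virtual machine: the bytecode never touches the host's instruction set directly, which is the distinction being drawn between bytecode and machine code.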

 

It was done for fun, and the only people who do it now, do so to accelerate web-based server applications.

 

And no, virtual machines do not simulate hardware in the general case. They're code-to-code translators, not the code-to-process translators that CPUs/GPUs are. Please get your terminology straight.



LLVM runs byte code in a language the developers have not released. C# is compiled to byte code and runs on a Microsoft virtual machine. Java is compiled to byte code to run on a Java virtual machine, as is LISP. Are you just wildly ignorant of how these languages work?

As you just noticed yourself, all of those terms refer to languages, not bytecode.

It was done for fun, and the only people who do it now, do so to accelerate web-based server applications.

Doesn't matter, it gets used.

And no, virtual machines do not simulate hardware in the general case. They're code-to-code translators, not the code-to-process translators that CPUs/GPUs are. Please get your terminology straight.

All of this doesn't even matter. You apparently didn't understand what I said and accuse me over it. Read it again.

Is adding an extra GB of RAM on phones really that expensive? If not, I don't see why they don't all just have 4GB of RAM.

Only expensive if the RAM is from Apple...

Anyone who has a sister hates the fact that his sister isn't Kasugano Sora.
Anyone who does not have a sister hates the fact that Kasugano Sora isn't his sister.
I'm not insulting anyone; I'm just being condescending. There is a difference, you see...


So we need to ditch Java? Also, I thought it was already known that Java was the big memory hog on Android? Either way, I hope they manage to improve this crap.

That would severely stall Android's advancement, and cause a period of stagnation while developers catch up, since all apps would have to be completely rewritten and the inevitable bugs fixed. From a personal and developer standpoint, there isn't a strong reason to ditch Java; considering the price of memory chips, it really isn't too hard to give Android phones 3-4GB of RAM.

 

Sure, it wouldn't be necessary if you could just switch the language; however, the benefits do not outweigh the costs.

 

 


As you just noticed yourself, all of those terms refer to languages, not bytecode.

Doesn't matter, it gets used.

All of this doesn't even matter. You apparently didn't understand what I said and accuse me over it. Read it again.

Okay, perhaps I should rephrase what I said earlier. There are 13 languages with associated byte codes to which they compile (or, 13 byte codes), including the aforementioned ones. It's not as though you couldn't infer that anyway.

 

It barely gets used and is by no means an industry staple. It's practically pointless.

 

It doesn't matter, because it just isn't done at any large scale outside of academia and bleeding-edge research within companies that only seek to make their own programs better. Byte code and machine code are not equivalent except in the tortured cases where someone forced them to be. In terms of usage, they're never the same. In terms of implementation, every once in a while.



That would severely stall Android's advancement, and cause a period of stagnation while developers catch up, since all apps would have to be completely rewritten and the inevitable bugs fixed. From a personal and developer standpoint, there isn't a strong reason to ditch Java; considering the price of memory chips, it really isn't too hard to give Android phones 3-4GB of RAM.

 

Sure, it wouldn't be necessary if you could just switch the language; however, the benefits do not outweigh the costs.

The benefits do outweigh the costs. This is why Android battery life sucks for the supplied milliamp-hours. It's completely worth it. We get stronger programmers down the line, too.

 

RAM is electrically very expensive. You think refreshing billions of capacitors, each thousands of times per second, is cheap?



I hate it when people say iPhones are overpriced....

 

(Sarcasm)

iPhones are overpriced.

|EVGA 850 P2| |1440p PG279Q| |X570 Aorus Extreme| |Ryzen 9 3950x WC| |FE 2080Ti WC|TridentZ Neo 64GB| |Samsung 970 EVO M.2 1TB x3

 |Logitech G900|K70 Cherry MX Speed|  |Logitech Z906 |  |HD650|  |CaseLabs SMA8 (one of the last ones made)

 

