
Will 128-bit be a thing?

Snickerzz

There's a really simple answer to this... Yes, but not any time soon, and most likely not in the consumer market for even longer.

"Any sufficiently advanced technology is indistinguishable from magic" - Arthur C. Clarke
Just because it may seem like magic, I'm not a wizard, just a nerd. I am fallible. 


Use the quote button or @<username> to reply to people | Mark solved troubleshooting topics as such, selecting the correct answer, and follow them to get replies!

Community Standards | Guides & Tutorials Troubleshooting Section

I wasn't trying to say 64-bit is useless; I was trying to say that Apple used it as a marketing gimmick more than anything else. On an iPad this would be useful, but right now I don't see it making a huge difference on the iPhone, considering there are not a whole lot of professional apps on it; again, it's a device targeted at 13-year-old kids with "swag". Keep in mind, the average consumer who buys Apple's stuff doesn't care about performance; only the power users do.

But again, it's free performance, battery life and future proofing. Those things are hardly marketing gimmicks.

Even if you don't desperately need higher performance it's still good to have. You can never have too much performance so why complain about getting more?

> But again, it's free performance, battery life and future proofing. Those things are hardly marketing gimmicks.
>
> Even if you don't desperately need higher performance it's still good to have. You can never have too much performance so why complain about getting more?

 

I understand that, but when the iPhone 5s was announced I constantly heard crap like "omg the iPhone 5s is twice as powerful as other phones because it has twice the bits". So again, while it very well may be useful, it just felt like a marketing gimmick to me, because that's pretty much what it is, just like 8-core phone CPUs, heart rate monitors and all that other crap that Samsung puts on their phones. For companies like Apple and Samsung it isn't about actual performance nearly as much as the kind of marketing stuff they can throw in their commercials. Regardless, I am excited for 64-bit phones, especially the stuff we are seeing with Android L, but I think it will be another few years until it's not "just a gimmick" like the bit wars on consoles.

Reddit user @SystemVirus has a good post:

There'd be little to no benefit in jumping to 128-bit instructions. For one, you'd have to introduce a whole new set of instructions (on x86), registers, etc., to be able to differentiate the new instructions from the old. Secondly, only a very small set of applications would be able to take advantage of the increased instruction sizes.

Also, you should disassociate the bit size of the bus from the speed of the system; the two are not directly related. Apple's new A7 processor is a great example: it's not faster because they went 64-bit. It's faster because they added more general-purpose registers, which means you can do more stuff in-CPU without having to fetch data from RAM, which slows things down by orders of magnitude.

The main reason x86 processors jumped from 32-bit to 64-bit was ballooning memory sizes. A 32-bit processor can only directly reference 4GB worth of memory. The performance hit of addressing more than 4GB (less on Windows) was about one or two extra instructions/calls per memory lookup, due to the way memory above 4GB was addressed on 32-bit systems. By moving to 64-bit instructions, CPUs were able to directly reference all memory without fancy lookup tables.
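The 4GB ceiling above falls straight out of pointer width; a quick sketch of the arithmetic (my own illustration, not from the post):

```python
# How word size bounds directly addressable memory: a flat 32-bit
# pointer can name 2**32 distinct bytes = 4 GiB, which is why 32-bit
# systems needed extra lookup machinery (e.g. PAE) to go past it.

def addressable_bytes(pointer_bits: int) -> int:
    """Bytes reachable with a flat pointer of the given width."""
    return 2 ** pointer_bits

GIB = 2 ** 30
print(addressable_bytes(32) // GIB)  # 4 -> the classic 32-bit 4 GiB ceiling
print(addressable_bytes(64) // GIB)  # 17179869184 -> 16 EiB expressed in GiB
```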

It's also pretty rare to need to do mathematical computations on values that will not fit into 64 bits. For the few situations where it is necessary, the performance hit is usually relatively negligible.
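Why the hit is negligible: a 64-bit CPU can emulate 128-bit arithmetic by chaining two 64-bit operations with a carry. A hypothetical Python sketch of that add-with-carry pattern (the function and its names are my illustration, not from the post):

```python
# 128-bit addition on a 64-bit machine: add the low halves, carry into
# the high halves. This is roughly what an add/add-with-carry
# instruction pair does in hardware -- only a couple of extra
# instructions per operation.

MASK64 = (1 << 64) - 1

def add128(a_lo, a_hi, b_lo, b_hi):
    """Add two 128-bit values given as (lo, hi) 64-bit halves."""
    total_lo = a_lo + b_lo
    lo = total_lo & MASK64
    carry = 1 if total_lo > MASK64 else 0
    hi = (a_hi + b_hi + carry) & MASK64
    return lo, hi

# 2**64 - 1 plus 1 overflows the low word and carries into the high word:
lo, hi = add128(MASK64, 0, 1, 0)
print(lo, hi)  # 0 1 -> i.e. the 128-bit value 2**64
```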

The majority of wide-bus usage is in graphics cards, since a lot of the math done there is complex vector math. The latest Intel CPUs have pretty good on-board graphics capabilities, and you'll often see Intel calling them 128-bit CPUs since they have some instructions that do vector math. They are also adding special wide instructions for specific purposes (e.g., Intel's on-CPU AES instructions). The truth of the matter is that eventually all computing devices will have GPUs capable of being used as general-purpose processors (via OpenCL or CUDA), which can fulfill the need for complex math operations directly.

Edit: Forgot to actually talk about bus width. PCI-Express lanes are serial, but multiple lanes are ganged together to transfer data. PCI-Express 3.0, as an example, can transfer data at ~1000MB/s per lane. So if you have an x16 card you have 16 lanes, giving you ~16000MB/s.

Edit2: To further clarify, PCI-Express slots can be up to 32 lanes wide. Each lane transfers 1 bit per cycle, so an x16 slot can transfer 16 bits per cycle. I've never seen an x32 slot or device, but the spec supports it.
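The lane arithmetic above, spelled out using the post's own approximate per-lane figure (the ~1000MB/s number is the post's rough value, not an official spec figure):

```python
# PCIe bandwidth scales linearly with lane count: per-lane rate times
# the number of lanes in the link.

def pcie_bandwidth_mb_s(lanes: int, per_lane_mb_s: int = 1000) -> int:
    """Approximate link bandwidth in MB/s for a given lane count."""
    return lanes * per_lane_mb_s

print(pcie_bandwidth_mb_s(16))  # 16000 -> the ~16000MB/s quoted for an x16 card
print(pcie_bandwidth_mb_s(4))   # 4000
```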

The stone cannot know why the chisel cleaves it; the iron cannot know why the fire scorches it. When thy life is cleft and scorched, when death and despair leap at thee, beat not thy breast and curse thy evil fate, but thank the Builder for the trials that shape thee.

> I wasn't trying to say 64-bit is useless; I was trying to say that Apple used it as a marketing gimmick more than anything else. On an iPad this would be useful, but right now I don't see it making a huge difference on the iPhone, considering there are not a whole lot of professional apps on it; again, it's a device targeted at 13-year-old kids with "swag". Keep in mind, the average consumer who buys Apple's stuff doesn't care about performance; only the power users do.

cubercaleb, you took the words right out of my mouth. As I said, it's not for now. And the point I was making is about the marketing surrounding it, NOT whether it's useful or not. I am a big supporter of 64-bit processors; I paid $750 for my 64-bit CPU. I was also a supporter of multi-core CPUs back when, at the time, I was fighting with people about how important and beneficial they would be today (well, back in 2005). I am surprised that smartphone CPUs aren't already 64-bit. But currently, smartphone software, even the great majority of games, isn't really pushing high-end phone processors (unless the phone is really old, not fairly current), and the marketing on the phone processor is pitched as if 32-bit is walking on one foot while 64-bit is driving a race car with no speed limit on a perfect road, acting like it's SUPER important NOW, when neither is the case.

 

I was simply pointing out that the marketing was similar to the game console days: 8-bit, 16-bit, 32-bit! 64-bit! 128-bit!!! I am sorry, but the 32-bit CPUs in our computers did a better job than those "128-bit" CPUs. A better example: the GameCube, falsely claimed to be a 128-bit console (the PowerPC processor it used could only support 32-bit instructions), was more powerful than the PS2. The games were better because the CPU was faster and the GPU was better and faster, not because of the bits (well, OK, for 8-bit and 16-bit, yes, but after 32-bit, no), which most people to this day have zero idea what it is or what it means. But the bigger the number, the better, right?

> I understand that, but when the iPhone 5s was announced I constantly heard crap like "omg the iPhone 5s is twice as powerful as other phones because it has twice the bits". So again, while it very well may be useful, it just felt like a marketing gimmick to me, because that's pretty much what it is, just like 8-core phone CPUs, heart rate monitors and all that other crap that Samsung puts on their phones. For companies like Apple and Samsung it isn't about actual performance nearly as much as the kind of marketing stuff they can throw in their commercials. Regardless, I am excited for 64-bit phones, especially the stuff we are seeing with Android L, but I think it will be another few years until it's not "just a gimmick" like the bit wars on consoles.

Well, that's a reason to dislike misinformed consumers, not a reason to dislike 64-bit phones. I think it is very unfair to call it a gimmick, though. Releasing it now makes a ton of sense, because by the time iOS devices desperately need it (for memory purposes) it will already be a mature platform with plenty of apps. In the meantime we get the benefits of higher performance.

Apple didn't even do *that* much marketing about 64-bit. A lot of the CPU talk was about the CPU itself (which is a beast; honestly, calling it desktop class isn't that far-fetched) and not about it being 64-bit.

 

8 cores is not "crap" or a marketing gimmick. big.LITTLE is a great idea: you put a low-power cluster and a high-performance cluster next to each other and then schedule tasks onto the appropriate cores. What you end up with is a far wider dynamic range: low power consumption for simple tasks and high performance for demanding ones. Samsung's first implementation was crap (a broken coherency bus caused the cache to be flushed when switching clusters, which is very wasteful, and it couldn't mix cores between the clusters; either the low-power or the high-performance cores were active, never both) but it's a step in the right direction. Snapdragon 810 looks like a fantastic SoC, and it uses four Cortex-A53 cores for the LITTLE part and four Cortex-A57 cores for the big part.

Like with 64-bit, a lot of people don't fully understand it and will therefore say it's awesome for the wrong reasons, or say it sucks for the wrong reasons. That's why you should always do your own research and form your own opinion. There are only benefits for us consumers in big.LITTLE SoCs like the Exynos Octa, and the benefits are quite big (longer battery life and higher performance). The only drawback is that it costs more for the manufacturers (more silicon and more licenses), but when you're talking about a ~25 dollar chip inside a 700 dollar device, a few more dollars won't matter that much.
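The core-routing idea behind big.LITTLE can be sketched as a toy model. To be clear, the threshold and examples below are made up for illustration; this is not Samsung's or ARM's actual scheduler logic:

```python
# Toy big.LITTLE routing: light tasks go to the efficient (LITTLE)
# cluster, heavy tasks to the fast (big) cluster. The 60% threshold
# here is an arbitrary illustrative value.

def pick_cluster(load_percent: float) -> str:
    """Route a task to the LITTLE (efficient) or big (fast) cluster."""
    return "big" if load_percent > 60 else "LITTLE"

print(pick_cluster(10))  # LITTLE -- e.g. background sync, idle screen
print(pick_cluster(90))  # big    -- e.g. a game or heavy page render
```

The point of the pattern is the wider dynamic range the post describes: the same workload mix costs less power without giving up peak performance.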

 

 

> cubercaleb, you took the words right out of my mouth. As I said, it's not for now. And the point I was making is about the marketing surrounding it, NOT whether it's useful or not. I am a big supporter of 64-bit processors; I paid $750 for my 64-bit CPU. I was also a supporter of multi-core CPUs back when, at the time, I was fighting with people about how important and beneficial they would be today (well, back in 2005). I am surprised that smartphone CPUs aren't already 64-bit. But currently, smartphone software, even the great majority of games, isn't really pushing high-end phone processors (unless the phone is really old, not fairly current), and the marketing on the phone processor is pitched as if 32-bit is walking on one foot while 64-bit is driving a race car with no speed limit on a perfect road, acting like it's SUPER important NOW, when neither is the case.

 

> I was simply pointing out that the marketing was similar to the game console days: 8-bit, 16-bit, 32-bit! 64-bit! 128-bit!!! I am sorry, but the 32-bit CPUs in our computers did a better job than those "128-bit" CPUs. A better example: the GameCube, falsely claimed to be a 128-bit console (the PowerPC processor it used could only support 32-bit instructions), was more powerful than the PS2. The games were better because the CPU was faster and the GPU was better and faster, not because of the bits (well, OK, for 8-bit and 16-bit, yes, but after 32-bit, no), which most people to this day have zero idea what it is or what it means. But the bigger the number, the better, right?

Backpedaling again? First you said 64-bit was useless and didn't matter, then you said you didn't need the higher performance, and now you're saying you only dislike the marketing surrounding it. Make up your mind, please.

Marketing is usually crap, so don't form your opinion about something based on it. Who cared how many bits Atari claimed the Jaguar had (a mix of 32 and 64)? What mattered was how it performed (poorly) and how the programs used it (horrendously badly). By Atari's logic we are at 512-bit computers these days, but again, it doesn't matter.

I'm not sure if it's even possible. The way 64-bit was explained to me, it has to do with the frequency wave of how information is sent: 32-bit only sends information on the crest, and 64-bit sends information on the crest and trough. So unless we find a way to send information on the crest, the trough and consistently in between, no.

 

I'm not sure. I don't really know whether the guy who taught me this was right or wrong.

I think they may have been explaining how DDR works. Someone correct me if I'm wrong, but 32-bit vs 64-bit is just how long each address can be. DDR stands for double data rate, which means the memory can transfer information on both the rising and falling edge of each clock cycle. So a DDR3-1600 DIMM is actually running at 800 MHz, but since it can transfer on both edges of each clock cycle it has an effective rate of 1600 MHz.

Well, I HOPE SO THAT WILL BE GREAT!

 

"Of course, its trash," Luke, 2014

 

> I think they may have been explaining how DDR works. Someone correct me if I'm wrong, but 32-bit vs 64-bit is just how long each address can be. DDR stands for double data rate, which means the memory can transfer information on both the rising and falling edge of each clock cycle. So a DDR3-1600 DIMM is actually running at 800 MHz, but since it can transfer on both edges of each clock cycle it has an effective rate of 1600 MHz.

That is correct. That's why programs like Speccy will show your 1600 MHz RAM as 800 MHz: they report the true frequency instead of the effective frequency.
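The true-versus-effective frequency arithmetic can be checked directly:

```python
# DDR ("double data rate") transfers on both the rising and falling
# edge of the clock, so the effective transfer rate is twice the true
# clock frequency.

def ddr_effective_rate(true_clock_mhz: int, transfers_per_cycle: int = 2) -> int:
    """Effective MT/s rate for a given true clock frequency."""
    return true_clock_mhz * transfers_per_cycle

print(ddr_effective_rate(800))   # 1600 -> "DDR3-1600" runs an 800 MHz clock
print(ddr_effective_rate(1066))  # 2133 (rounded naming) for DDR3-2133 kits
```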

On Windows 8 Professional, the 64-bit operating system supports up to 512 GB of RAM (http://msdn.microsoft.com/en-us/library/windows/desktop/aa366778%28v=vs.85%29.aspx#physical_memory_limits_windows_8). I do not foresee computers utilizing anywhere near that limit within 5-10 years. It will eventually be a thing, but probably not for a long time.

My PC specifications are in my profile.

There have been 128-bit CPUs built and working, but not for the mainstream.

There will be no 128-bit mainstream CPUs for a very long time.

DESKTOP - Motherboard - Gigabyte GA-Z77X-D3H Processor - Intel Core i5-2500K @ Stock 1.135v Cooling - Cooler Master Hyper TX3 RAM - Kingston Hyper-X Fury White 4x4GB DDR3-1866 Graphics Card - MSI GeForce GTX 780 Lightning PSU - Seasonic M12II EVO Edition 850w  HDD -  WD Caviar  Blue 500GB (Boot Drive)  /  WD Scorpio Black 750GB (Games Storage) / WD Green 2TB (Main Storage) Case - Cooler Master 335U Elite OS - Microsoft Windows 7 Ultimate

> On Windows 8 Professional, the 64-bit operating system supports up to 512 GB of RAM (http://msdn.microsoft.com/en-us/library/windows/desktop/aa366778%28v=vs.85%29.aspx#physical_memory_limits_windows_8). I do not foresee computers utilizing anywhere near that limit within 5-10 years. It will eventually be a thing, but probably not for a long time.

There are 128GB sticks of DDR4 in development, and I think 16GB will soon become low/mid-range.

My 4GB kit of DDR3 was about 1000 SEK 4 years ago. Today I can get 16GB for about the same price. That's a doubling every second year. So if we follow the same trend (which has been fairly slow compared to how it used to be) and assume that 32GB is high end these days, the high end will hit the 512GB limit in 8 years. 512GB will be what 64GB (not too uncommon on very high-end consumer desktops) is today in a mere 6 years. It's reasonable to assume that we will see 1TB of RAM in ~10 years.
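The doubling-every-two-years extrapolation above, spelled out (an assumed trend, not a guarantee; the starting figure is the post's own 32GB high-end estimate):

```python
# Capacity doubling every two years: start * 2**(years / 2).

def projected_ram_gb(start_gb: int, years: int, doubling_period_years: int = 2) -> int:
    """Project RAM capacity after `years`, doubling each period."""
    return start_gb * 2 ** (years // doubling_period_years)

print(projected_ram_gb(32, 8))   # 512  -> hits the Windows 8 Pro limit in ~8 years
print(projected_ram_gb(32, 10))  # 1024 -> ~1TB in ~10 years
```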

Maybe 64-bit vs 32-bit would make a good As Fast As Possible.

 

 

This user, mrbit10, does a pretty awesome job of explaining the difference.  On his channel, he has a playlist that addresses this further.

> There are 128GB sticks of DDR4 in development, and I think 16GB will soon become low/mid-range.
>
> My 4GB kit of DDR3 was about 1000 SEK 4 years ago. Today I can get 16GB for about the same price. That's a doubling every second year. So if we follow the same trend (which has been fairly slow compared to how it used to be) and assume that 32GB is high end these days, the high end will hit the 512GB limit in 8 years. 512GB will be what 64GB (not too uncommon on very high-end consumer desktops) is today in a mere 6 years. It's reasonable to assume that we will see 1TB of RAM in ~10 years.

 

Let me rephrase.  It's possible that we will have a ton of RAM available in the next few years.  Whether or not we will actually utilize that amount of RAM remains to be seen.

> Let me rephrase. It's possible that we will have a ton of RAM available in the next few years. Whether or not we will actually utilize that amount of RAM remains to be seen.

Using more RAM is very easy. You just have to flush the cache less frequently. Just a small change in a timer and all of a sudden you will start using more RAM and get a more responsive system.


> Using more RAM is very easy. You just have to flush the cache less frequently. Just a small change in a timer and all of a sudden you will start using more RAM and get a more responsive system.

You'll never exceed even 128 GB of RAM (let alone 512 GB) with one piece of consumer software. Everyone in this thread will more than likely be dead before that happens.

> You'll never exceed even 128 GB of RAM (let alone 512 GB) with one piece of consumer software. Everyone in this thread will more than likely be dead before that happens.

"640K ought to be enough for anybody"

This is a good way to sum up the way that technology is advancing.

Not anytime soon, I think. But in the future, as technology evolves and 64-bit is no longer sufficient, we'll move to 128-bit or something else.

But for now I'm pretty sure 64-bit is not even fully utilized. A lot of programs are still 32-bit.

And as for RAM, well, I don't think we will be breaking the 16 exbibytes of theoretical 64-bit address space anytime soon.

source: http://en.wikipedia.org/wiki/64-bit

MB :MSI Z77a G45 | Proc: I5 3570K (Stock) | HSF : CM 212X turbo | RAM : Corsair Vengeance 8GB (2X4GB) | VGA : MSI GTX 660 Twin Frozr | PSU : Corsair GS600 | Case : CM Storm Enforcer | Storage :  OCZ Vector 128GB, WD Blue 500GB , Samsung 840 Evo 120GB, WD Blue 1TB

Nope.avi

 

Until we regularly cap out 256TB of RAM (the 48-bit virtual address space that current x86-64 CPUs actually implement, out of the full 64-bit architecture), there would be no real reason to make the jump to 128-bit.

> Nope.avi
>
> Until we regularly cap out 256TB of RAM (the 48-bit virtual address space that current x86-64 CPUs actually implement, out of the full 64-bit architecture), there would be no real reason to make the jump to 128-bit.

Idk where you are getting 255TB from, but your math is off. Win 8 Enterprise 64-bit supports 512GB of RAM, while Server 2012 Datacenter goes up to 4TB, again on x64. Theoretically, the full 64-bit limit would be 16 EB (exabytes).
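For reference, the two figures being argued over can be computed directly: the 48-bit virtual address space that current x86-64 chips actually implement versus the full 64-bit architectural limit:

```python
# 48 bits of virtual address -> 2**48 bytes = 256 TiB (the ~256TB figure
# from the earlier post); the full 64-bit space is 2**64 bytes = 16 EiB.

TIB = 2 ** 40
EIB = 2 ** 60

print(2 ** 48 // TIB)  # 256 -> TiB reachable with 48-bit addressing
print(2 ** 64 // EIB)  # 16  -> EiB at the 64-bit architectural ceiling
```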

"Any sufficiently advanced technology is indistinguishable from magic" - Arthur C. Clarke
Just because it may seem like magic, I'm not a wizard, just a nerd. I am fallible. 


Use the quote button or @<username> to reply to people | Mark solved troubleshooting topics as such, selecting the correct answer, and follow them to get replies!

Community Standards | Guides & Tutorials Troubleshooting Section

128-bit CPUs? That depends. Technically speaking, processors with SSE/AVX already operate on 128-bit (and wider) vector registers.

▶ Learn from yesterday, live for today, hope for tomorrow. The important thing is not to stop questioning. - Einstein◀

Please remember to mark a thread as solved if your issue has been fixed; it helps others who may stumble across the thread at a later point in time.

> Idk where you are getting 255TB from, but your math is off. Win 8 Enterprise 64-bit supports 512GB of RAM, while Server 2012 Datacenter goes up to 4TB, again on x64. Theoretically, the full 64-bit limit would be 16 EB (exabytes).

 

Ripping numbers from my Investigating PAE thread: the exabyte range is only after you implement PAE-style extensions on x64; before then, it's 255TB.

Eventually, sure, but at the moment there is no need whatsoever to make it happen. The RAM limits of 64-bit are so high that we'll take years, if not decades, to saturate them, and more years still to actually need more. Also, a lot of programs are still 32-bit, so even if 128-bit became a thing tomorrow, the programs that would actually use it would be so few that the actual performance gain would be unnoticeable in most scenarios. (Look at the iPhone 5s: how many 64-bit apps for iOS are actually on the market? Wouldn't it have gained much more from going quad core?)

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*

> There are 128GB sticks of DDR4 in development, and I think 16GB will soon become low/mid-range.
>
> My 4GB kit of DDR3 was about 1000 SEK 4 years ago. Today I can get 16GB for about the same price. That's a doubling every second year. So if we follow the same trend (which has been fairly slow compared to how it used to be) and assume that 32GB is high end these days, the high end will hit the 512GB limit in 8 years. 512GB will be what 64GB (not too uncommon on very high-end consumer desktops) is today in a mere 6 years. It's reasonable to assume that we will see 1TB of RAM in ~10 years.

 

I can't see it happening. Why on earth would we require any more than 16GB (unless you edit videos and such)? 512GB in a mainstream computer will not be here for at least 20 years. Even if 512GB did actually arrive in 8 years, how would we even begin to make use of it?
