
Your GPU Can Now Run Quantum Simulations 50x Faster Using Algorithms Inspired by Quantum Computing

My_Computer_Is_Trash
4 minutes ago, 05032-Mendicant-Bias said:

There is no bridge. Either there are quantum bits, or it's classical.

 

Fun Fact:

Vitalik, the creator of Ethereum, tried to convince VCs to give him money to simulate a quantum computer on a classical computer and have it outperform a classical computer. It didn't work. Vitalik did manage to convince VCs to give him money to build Ethereum.

Bridge the gap in capabilities. Through software. Not hardware. For example: Quantum computers are good at one thing: numbers. They can't do anything we use a computer for on a daily basis. Bridging the gap means making classical computers better at what quantum computers do.


 


6 minutes ago, My_Computer_Is_Trash said:

Bridge the gap in capabilities. Through software. Not hardware. For example: Quantum computers are good at one thing: numbers. They can't do anything we use a computer for on a daily basis. Bridging the gap means making classical computers better at what quantum computers do.

This is how we started with modern computers. In the beginning they could only do math, now we have smartphones. I assume a similar progression will occur in the quantum era, except I foresee progress being more explosive after an initial breakthrough.


 


Just now, Remixt said:

This is how we started with modern computers. In the beginning they could only do math, now we have smartphones. I assume a similar progression will occur in the quantum era, except I foresee progress being more explosive after an initial breakthrough.

It makes sense now that I think about it. The only reason quantum computers can't run an OS is that there is no OS with a GUI that can use superpositions.


 


1 hour ago, Remixt said:

This is how we started with modern computers. In the beginning they could only do math, now we have smartphones. I assume a similar progression will occur in the quantum era, except I foresee progress being more explosive after an initial breakthrough.

I don't think that will be the case. For early computers there was still a well-defined concept of logic gates and how to use them to perform mathematical operations (and also the ability to know which instruction one was on and progress through the program, with things like jmp statements).

 

In the quantum computing world, at least from my understanding, we are just at the level of "make this problem into something that can be solved with a Fourier Transform, let the qubits figure it out, and read out the buckets" [an oversimplification]. Nothing has really been shown that indicates a quantum computer could even run an OS in the classical sense (and even if it could, you would need breakthroughs in materials science to get qubits that aren't chilled with helium and housed in giant chambers free of EM interference).
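Roughly what that kind of classical simulation looks like under the hood, as a toy sketch of my own in NumPy (an illustration only, not QC Ware's actual software): n qubits become a vector of 2^n complex amplitudes, and a circuit like the quantum Fourier transform is just a big matrix multiplied against that vector.

import numpy as np

n = 3                                    # 3 simulated qubits -> 2**3 = 8 amplitudes
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                           # start in |000>

# Quantum Fourier transform written out as an explicit 2**n x 2**n unitary
omega = np.exp(2j * np.pi / 2**n)
qft = np.array([[omega**(j * k) for k in range(2**n)]
                for j in range(2**n)]) / np.sqrt(2**n)

state = qft @ state                      # "run" the circuit

# "Read out the buckets": the measurement probability of each basis state
print(np.abs(state)**2)                  # uniform 1/8 for this input

The catch is that the state vector doubles with every extra qubit, which is exactly why classical simulation (GPU-accelerated or not) runs out of steam long before the problem sizes people actually want.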

 

1 hour ago, My_Computer_Is_Trash said:

Bridge the gap in capabilities. Through software. Not hardware. For example: Quantum computers are good at one thing: numbers. They can't do anything we use a computer for on a daily basis. Bridging the gap means making classical computers better at what quantum computers do.

I'm still not sure you understand what is being said in the article.

 

Literally, the term "bridge" in this case means simulating a quantum computer because there isn't one that exists yet...so they are banking on one existing. They are not bridging the gap between classical computers and quantum computers in terms of capabilities. They are instead making it so developers can start learning how to "program" one, even though one doesn't exist yet. So again, they are not increasing the performance of classical computing (even in this case they are essentially just using hardware acceleration).

 

It's akin to when a new video format is released: software encoders/decoders come first while the hardware catches up. It doesn't mean they are extending or improving the capabilities of the current hardware; they are just releasing a tool. [And later they might find they can use GPU acceleration to do the encode/decode faster.]

 

The way I look at it, they are just playing with the wording to put a positive spin on a failure. They failed at creating software for quantum computers because the ones they needed don't exist, so instead they wrote a simulation that runs their code; rather than bridging the gap, it seems to me more like a stop-gap solution.

 

 

It should be pointed out that quantum computers aren't good at "numbers". They can effectively solve a specific subset of problems, ones that can be coerced into revealing a solution. If you were to ask one what 2+2 is, it wouldn't be great at a task like that.
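To make that concrete, here's a toy sketch of my own (just NumPy on a CPU, so it's "simulated" too, not how real hardware works): Grover-style search is the classic example of the kind of problem that can be coerced into revealing its answer, which is a very different job from asking "what is 2+2".

import numpy as np

N = 16                                        # unstructured search over 16 items
marked = 11                                   # the item the oracle flags

state = np.full(N, 1 / np.sqrt(N))            # uniform superposition over all items

for _ in range(int(np.pi / 4 * np.sqrt(N))):  # ~optimal number of iterations (3 here)
    state[marked] *= -1                       # oracle: flip the sign of the marked amplitude
    state = 2 * state.mean() - state          # diffusion: inversion about the mean

probs = state**2                              # amplitudes stay real in this toy example
print(probs.argmax(), round(probs[marked], 2))  # -> 11 0.96: the marked item dominates

On real hardware you would repeat the circuit and measure, but the point is the same: the advantage only shows up for problems you can shape like this, not for everyday arithmetic.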



6 minutes ago, wanderingfool2 said:

I don't think that will be the case. For early computers there was still a well-defined concept of logic gates and how to use them to perform mathematical operations (and also the ability to know which instruction one was on and progress through the program, with things like jmp statements).

I think (predict, anyway) that in the long term quantum computers will be accessed via the cloud and will process the specific things that traditional computers suck at, so it will be more of a combination of modern tech and quantum tech working in tandem that springs the information age into a "quantum age", so to speak.


 


21 hours ago, My_Computer_Is_Trash said:

My next class. And giving Nvidia *graphics cards* the ability to run quantum simulations isn't nothing.

There is a draft section at the bottom of the forum. You can store your posts there and compose them better next time.


9 minutes ago, wanderingfool2 said:

Literally, the term "bridge" in this case means simulating a quantum computer because there isn't one that exists yet...so they are banking on one existing. They are not bridging the gap between classical computers and quantum computers in terms of capabilities. They are instead making it so developers can start learning how to "program" one, even though one doesn't exist yet. So again, they are not increasing the performance of classical computing (even in this case they are essentially just using hardware acceleration).

"So QC Ware CEO Matt Johnson said it turned to Nvidia Corp's (NVDA.O) graphic processing units (GPU) to "figure out how can we get them something that is a big step change in performance ... and build a bridge to quantum processing in the future."" (Jane Lee)

 

"This week, QC Ware is unveiling a quantum-inspired software platform called Promethium that will simulate chemical molecules - to see how they interact with things like protein - on a traditional computer using GPUs" (Jane Lee).

 

"The software can cut simulation time from hours to minutes for molecules of 100 atoms, and months to hours for molecules of up to 2000 atoms, compared with existing software solutions, said QC Ware's head of quantum chemistry Robert Parrish" (Jane Lee).


 


15 minutes ago, wanderingfool2 said:

It should be pointed out that quantum computers aren't good at "numbers". They can effectively solve a specific subset of problems, ones that can be coerced into revealing a solution. If you were to ask one what 2+2 is, it wouldn't be great at a task like that.

Huh. Interesting, I guess I was wrong. Would it be more accurate to say that quantum computers are good at running physics simulations?


 


21 minutes ago, Remixt said:

I think (predict, anyway) that in the long term quantum computers will be accessed via the cloud and will process the specific things that traditional computers suck at, so it will be more of a combination of modern tech and quantum tech working in tandem that springs the information age into a "quantum age", so to speak.

I think the issue is that the majority of people won't really have many problems that specifically call for quantum computing. Or at least I can't think of any right now. Data analysis stuff, maybe, but for the majority of tasks I don't think there are many consumer-facing use cases (it's probably more driven towards business use cases).

 

17 minutes ago, My_Computer_Is_Trash said:

"So QC Ware CEO Matt Johnson said it turned to Nvidia Corp's (NVDA.O) graphic processing units (GPU) to "figure out how can we get them something that is a big step change in performance ... and build a bridge to quantum processing in the future."" (Jane Lee)

 

"This week, QC Ware is unveiling a quantum-inspired software platform called Promethium that will simulate chemical molecules - to see how they interact with things like protein - on a traditional computer using GPUs" (Jane Lee).

 

"The software can cut simulation time from hours to minutes for molecules of 100 atoms, and months to hours for molecules of up to 2000 atoms, compared with existing software solutions, said QC Ware's head of quantum chemistry Robert Parrish" (Jane Lee).

That's still not increasing the performance of classical computing. They increased the performance of simulating a quantum computer...but that's like me saying that using a video card to speed up video encoding is increasing the performance of the system. It's increasing performance in a very, very specific use case, and in this case it also seems like they must be taking shortcuts, given they say code would still need to be adapted for the final quantum product.
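The video card comparison in code, as a sketch (assuming CuPy as the GPU library here, purely for illustration): the GPU makes this one operation faster; it doesn't make the machine better at anything else.

import numpy as np
import cupy as cp   # assumes a CUDA GPU and CuPy are available

rng = np.random.default_rng(0)
a = rng.standard_normal((4096, 4096))
b = rng.standard_normal((4096, 4096))

c_cpu = a @ b                                  # the workload, done on the CPU

# The same workload "hardware accelerated": copy to the GPU, multiply, copy back
c_gpu = cp.asnumpy(cp.asarray(a) @ cp.asarray(b))

print(np.allclose(c_cpu, c_gpu))               # same answer, just produced faster

Same idea with the quantum simulation: offloading the linear algebra to a GPU speeds up that specific simulation; it doesn't change what the underlying classical machine is capable of.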

 

Also, on the other bit regarding the bridge, I stand by what I said. They effectively made software that simulates a quantum computer, and the bridge they are talking about is the software. It's not bridging the gap between the actual classical computer and a quantum computer.



41 minutes ago, wanderingfool2 said:

Also, on the other bit regarding the bridge, I stand by what I said. They effectively made software that simulates a quantum computer, and the bridge they are talking about is the software. It's not bridging the gap between the actual classical computer and a quantum computer.

Are they not using software to take classical simulation performance closer to how effectively quantum computers would run quantum-level simulations? Therefore lessening the difference in simulation performance between the two, i.e. bridging the gap?


 

