
New Quantum CPU In The Works By D-Wave

ionbasa

(Image: D-Wave 128-qubit chip)

Earlier this month, D-Wave’s Chief Executive Vern Brownell and DFJ’s Steve Jurvetson, one of the company’s earliest and biggest investors, sat down for an hour-long interview with Re/code. The discussion spanned the critiques of the company, the science of quantum computing and the next steps for D-Wave.
 
Notably, the latter includes the forthcoming D-Wave processor, which Brownell says will end all doubt that they've leaped ahead of classical systems — and will forever leave them behind.

http://recode.net/2014/09/25/d-wave-ceo-our-next-quantum-processor-will-make-computer-science-history-video/

 

Essentially, there were some doubts about whether their first production-ready CPU actually used the principle of superposition. For those of you who don't know, quantum computers are not strictly binary; they use qubits, where a bit can be both a 1 and a 0 simultaneously. These quantum CPUs can also leverage the phenomenon of quantum entanglement. In the end, this means that a cluster of quantum computers can outperform traditional supercomputers in computational power.
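If the "both a 1 and a 0 at once" part sounds like magic, here's a rough toy sketch in Python (generic gate-model quantum computing, not D-Wave's annealing hardware or its actual API) of what a single qubit in superposition means:

```python
import random

# Toy single-qubit model: the state is a pair of amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. |alpha|^2 is the chance of reading a 0,
# |beta|^2 the chance of reading a 1.

def hadamard(state):
    """Hadamard gate: turns a definite |0> or |1> into an equal superposition."""
    alpha, beta = state
    s = 2 ** -0.5
    return (s * (alpha + beta), s * (alpha - beta))

def sample(state):
    """Simulate one measurement: 0 or 1 with the Born-rule probabilities."""
    alpha, _ = state
    return 0 if random.random() < abs(alpha) ** 2 else 1

qubit = (1.0, 0.0)                     # definitely 0
qubit = hadamard(qubit)                # now an equal mix of 0 and 1
counts = sum(sample(qubit) for _ in range(10_000))
print(counts / 10_000)                 # ~0.5: half the readouts come back 1
```

Entanglement is the multi-qubit version of the same bookkeeping, where the amplitudes of several qubits can no longer be described separately; demonstrating that on D-Wave's hardware is what some of the papers mentioned later in this thread are about.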

 

For those of you who want to know the 'specs':

Our next-generation processor will be 1,000 qubits, actually more precisely 1,152, and that’s going to be released early next year. We already have several customers waiting for that processor and we have about four of those systems in our laboratory today undergoing development and tests.

 

It not only increases the number of qubits, it also has significant improvements in other important dimensions of performance. So certainly this next processor is going to be very exciting.

 

Oh, and there's a video (embedded in the Re/code link above).

 

Personally, I find this pretty exciting. For now it will only be available to big players like NASA, who plan on building a quantum computing cluster using D-Wave's technology. Eventually this may become mainstream for the entire market; I suspect within the next 50 years, if all the kinks are ironed out. Also, AMD and Intel need to watch their backs: there is a new player in town, and next to it their traditional transistor-based CPUs look antiquated. Granted, there is still a large time gap before quantum computing becomes available to everyone, but for now it can at least power some of the world's fastest supercomputers.


Yeahhhhh very interesting, but first I'd like to see if it actually works.... They haven't provided any actual proof of their quantum computing yet....

"Great minds discuss ideas; average minds discuss events; small minds discuss people."

Main rig:

i7-4790 - 24GB RAM - GTX 970 - Samsung 840 240GB Evo - 2x 2TB Seagate. - 4 monitors - G710+ - G600 - Zalman Z9U3

Other devices

Oneplus One 64GB Sandstone

Surface Pro 3 - i7 - 256Gb

Surface RT

Server:

SuperMicro something - Xeon e3 1220 V2 - 12GB RAM - 16TB of Seagates 


I wouldn't say that something that isn't the absolute absolute leading edge of research and future technology is "antiquated". Intel and AMD are very much present technology.


Yeahhhhh very interesting, but first I'd like to see if it actually works.... They haven't provided any actual proof of their quantum computing yet....

If NASA, Google, and Lockheed Martin are willing to invest billions of dollars, I'm sure there is a reason for it. The article references over 30 studies and academic works that have been peer reviewed.

 

I would imagine that this early in the game they don't want to reveal too much information, which would make sense: they could be on the verge of being the first company to provide quantum computing.

As a company you don't just let everyone know your trade secrets.


If this works, it is going to be a... large step forward in technology.

All this just makes me want to try and learn more about quantum physics, and I already know nothing. 

 



"will end all doubt that they've leaped ahead of classical systems — and will forever leave them behind."

 

 

I hope that's true.

The 1st gen had a lot of criticism.



Isn't D-Wave that company that Google bought like 6 months ago? Or am I thinking of something else?


Isn't D-Wave that company that Google bought like 6 months ago? Or am I thinking of something else?

No, I don't remember what company it was exactly either. Maybe someone can chime in?

But Google is working on their own quantum CPUs along with the help of UC Santa Barbara: http://techcrunch.com/2014/09/02/google-partners-with-ucsb-to-build-quantum-processors-for-artificial-intelligence/

 

"will end all doubt that they've leaped ahead of classical systems — and will forever leave them behind."

 

 

I hope that's true.

The 1st gen had a lot of criticism.

I know, which is why this is exciting. I remember one of the problems with the 1st generation was that the actual traces in the CPU were not random enough. If D-Wave delivers on what they are claiming with the 2nd generation of their chips, I will be impressed beyond belief.


If NASA, Google, and Lockheed Martin are willing to invest billions of dollars, I'm sure there is a reason for it. The article references over 30 studies and academic works that have been peer reviewed.

 

I would imagine that this early in the game they don't want to reveal too much information, which would make sense: they could be on the verge of being the first company to provide quantum computing.

As a company you don't just let everyone know your trade secrets.

 

Yeah, and much of that peer review said the whole thing was complete and utter BS. See the criticism heading. The only public benchmarks we have of this "quantum computer" show it getting absolutely slaughtered, thousands of times over, by a single core of an old Sandy Bridge hexa at stock speeds.

 

http://en.wikipedia.org/wiki/D-Wave_Systems

 

In January 2014 researchers at UC Berkeley and IBM published a classical model reproducing the D-Wave machine's observed behavior, suggesting that it may not be a quantum computer.

 

Other independent researchers found that different software packages running on a single core of a desktop computer can solve those same problems as fast or faster than D-Wave's computers (at least 12,000 times faster for quadratic assignment problems, and between 1 and 50 times faster for quadratic unconstrained binary optimization problems).
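For context, a "quadratic unconstrained binary optimization" (QUBO) problem is the kind of problem D-Wave's hardware is built to minimise: pick 0/1 values for the variables so a quadratic cost comes out as small as possible. A minimal classical brute-force sketch (the Q matrix here is a made-up three-variable toy, purely to show the shape of the problem those benchmarks ran):

```python
from itertools import product

# QUBO: minimise sum over i<=j of Q[i][j] * x[i] * x[j], with each x[i] in {0, 1}.
# Hypothetical 3-variable instance, brute-forced on a classical machine.
Q = [
    [-1,  2,  0],
    [ 0, -1,  2],
    [ 0,  0, -1],
]

def energy(x, Q):
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(i, n))

best = min(product((0, 1), repeat=len(Q)), key=lambda x: energy(x, Q))
print(best, energy(best, Q))   # (1, 0, 1) with energy -2 for this toy instance
```

Brute force is fine at three variables but blows up like 2^n; the argument in this thread is about whether D-Wave's hardware or tuned classical heuristics reach good answers faster at realistic sizes.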


Yeah, and much of that peer review said the whole thing was complete and utter BS. See the criticism heading. The only public benchmarks we have of this "quantum computer" show it getting absolutely slaughtered, thousands of times over, by a single core of an old Sandy Bridge hexa at stock speeds.

 

http://en.wikipedia.org/wiki/D-Wave_Systems

 

In January 2014 researchers at UC Berkeley and IBM published a classical model reproducing the D-Wave machine's observed behavior, suggesting that it may not be a quantum computer.

 

Other independent researchers found that different software packages running on a single core of a desktop computer can solve those same problems as fast or faster than D-Wave's computers (at least 12,000 times faster for quadratic assignment problems, and between 1 and 50 times faster for quadratic unconstrained binary optimization problems).

Well, to be fair, you can't compare a supposed quantum CPU to basic binary CPUs. I understand the criticism, but again, why cherry-pick? From the same Wikipedia article you posted, it clearly says:

Prior to announcing this partnership, NASA, Google, and Universities Space Research Association put a D-Wave computer through a series of benchmark and acceptance tests, which it passed. Independent researchers found that D-Wave's computers could solve some problems as much as 3,600 times faster than particular software packages running on conventional digital computers.

and:

In March 2014, researchers at University College London and the University of Southern California (USC) published a paper comparing data obtained from a D-Wave Two computer with three possible explanations from classical physics and one quantum model. They found that their quantum model was a better fit to the experimental data than the Shin-Smith-Smolin-Vazirani classical model, and a much better fit than any of the other classical models. The authors conclude that "This suggests that an open system quantum dynamical description of the D-Wave device is well-justified even in the presence of relevant thermal excitations and fast single-qubit decoherence." 

 

In May 2014, researchers at D-Wave, Google, USC, Simon Fraser University, and National Research Tomsk Polytechnic University published a paper containing experimental results that demonstrated the presence of entanglement among D-Wave qubits. Qubit tunneling spectroscopy was used to measure the energy eigenspectrum of two and eight-qubit systems, demonstrating their coherence during a critical portion of the quantum annealing procedure.

 

This is like saying AMD's CPUs are fake because they suck in single-threaded applications, or because their IPC is too low. Maybe the quantum CPU wasn't the right tool for the job. You wouldn't hammer a screw into drywall, would you?

 

Don't cherry-pick, will you?


Isn't D-Wave that company that Google bought like 6 months ago? Or am I thinking of something else?

Google bought a computer from them.



Yeahhhhh very interesting, but first I'd like to see if it actually works.... They haven't provided any actual proof of their quantum computing yet....

Yes they have. I've gotten to work with the D-Wave 1 myself due to Miami's relationship with Lockheed Martin. They certainly work using true quantum properties.


Well, to be fair, you can't compare a supposed quantum CPU to basic binary CPUs. I understand the criticism, but again, why cherry-pick? From the same Wikipedia article you posted, it clearly says:

and:

 

This is like saying AMD's CPUs are fake because they suck in single-threaded applications, or because their IPC is too low. Maybe the quantum CPU wasn't the right tool for the job. You wouldn't hammer a screw into drywall, would you?

 

Don't cherry-pick, will you?

 

They replicated the "behavior" that is supposed to make it a "quantum computer" with a classical model at UC Berkeley. That means the claims are very suspicious, especially when not many people are given access. As far as the benchmarks go, the thing is very slow. Also, throwing out the names Google and NASA doesn't mean much to me when they release no public benchmarks or proof other than statements.

 

All that has been proven so far is that it can be done on a classical computer model, that existing OLDER technology by Intel crushes the thing, and that the CPU in my own PC, which is mid-tier for a personal computer, destroys it many times over. Add to that, my old E8400 from 2008 would be faster on one core. What exactly am I supposed to believe in here and be impressed by? Benchmarks? Nope. Possibly fraudulent claims? Nope. Oh, I am supposed to be impressed by the names NASA and Google. No thanks.


Yes they have. I've gotten to work with the D-Wave 1 myself due to Miami's relationship with Lockheed Martin. They certainly work using true quantum properties.

As much as that is cool, how do you know it works with true quantum properties? Surely you don't write in assembly for it but use some kind of high-level language, so how does the quantumness come into play?

"Great minds discuss ideas; average minds discuss events; small minds discuss people."

Main rig:

i7-4790 - 24GB RAM - GTX 970 - Samsung 840 240GB Evo - 2x 2TB Seagate. - 4 monitors - G710+ - G600 - Zalman Z9U3

Other devices

Oneplus One 64GB Sandstone

Surface Pro 3 - i7 - 256Gb

Surface RT

Server:

SuperMicro something - Xeon e3 1220 V2 - 12GB RAM - 16TB of Seagates 

Link to comment
Share on other sites

Link to post
Share on other sites

wow. that's high-level stuff! :blink:



If only you didn't have to cool it down to those exotic temps.



They replicated the "behavior" that is supposed to make it a "quantum computer" with a classical model at UC Berkeley. That means the claims are very suspicious, especially when not many people are given access. As far as the benchmarks go, the thing is very slow. Also, throwing out the names Google and NASA doesn't mean much to me when they release no public benchmarks or proof other than statements.

 

All that has been proven so far is that it can be done on a classical computer model, and that existing OLDER technology by Intel crushes the thing.

Ok, throw out what UC Berkeley said because you find it suspicious. Throw out the fact that it passed both NASA's and Google's tests because that is suspicious. Throw out what Simon Fraser University and National Research Tomsk Polytechnic University published because you are butthurt that they won't publish exactly how the CPU works, and that is suspicious. As if you could understand how it even works. 95% of people, including me, aren't quantum physicists. All I know about quantum mechanics is what I learned in a one-quarter physics class needed to work towards my mechanical engineering degree.

 

As for why only certain people are given access to the machine, do you really think any average Joe is qualified to use the computer? Let's not forget that this isn't mass-produced on an assembly line, so there is obviously a limited supply to work with, given the complexity of building the apparatus needed to house the chip.

 

Also, again, they won't release information on how the chips work because it is a trade secret. You don't see Intel, Nvidia, or AMD letting technical information out into the public. Does anyone other than Nvidia really know how an SMX unit works? Does anyone other than Intel know exactly how they designed their AVX1/AVX2 capabilities to be processed on their chips? The answer is no, because there is money at stake.

 

Also, you talk about how older CPUs can beat the quantum computer in solving quadratic models. Well, guess what? Maybe the quantum CPU is running at a really low IPC (or similar). Maybe the quantum CPU isn't optimized completely for that task yet. Or maybe software used wasn't fully optimized. On the last notion of software optimization, take a look at how long AMD CPUs couldn't run AVX1 properly when compared to Intel. Why?


As much as that is cool, how do you know it works with true quantum properties? Surely you don't write in assembly for it but use some kind of high-level language, so how does the quantumness come into play?

Because if you compare the emulator D-Wave provides for classical computers against the real deal, you can see the difference between pseudorandom and truly chaotic results. There's also the fact that using the quantum route-tracing algorithm (solving Traveling Salesman) you get a truly correct answer in linear time, instead of an approximation as happens on the emulators.

 

You can in fact program in q-assembly, though I prefer higher-level C libraries for it.
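For anyone wondering what "emulating it on a classical computer" even looks like: D-Wave's chips do quantum annealing, and the closest everyday classical analogue is plain simulated annealing. A rough sketch (toy problem and cooling schedule invented for illustration; this is not D-Wave's emulator or their API):

```python
import math
import random

# Simulated annealing on a tiny QUBO -- the classical cousin of the quantum
# annealing D-Wave's hardware performs. Toy instance, invented for illustration.
Q = {(0, 0): -1, (1, 1): -1, (2, 2): -1, (0, 1): 2, (1, 2): 2}

def energy(x):
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

def anneal(n_vars=3, steps=5000, t_start=2.0, t_end=0.01):
    x = [random.randint(0, 1) for _ in range(n_vars)]
    e = energy(x)
    for step in range(steps):
        # cool the "temperature" geometrically from t_start down to t_end
        t = t_start * (t_end / t_start) ** (step / steps)
        i = random.randrange(n_vars)
        x[i] ^= 1                                  # propose flipping one bit
        e_new = energy(x)
        if e_new <= e or random.random() < math.exp((e - e_new) / t):
            e = e_new                              # accept the flip
        else:
            x[i] ^= 1                              # reject: undo the flip
    return x, e

print(anneal())   # usually lands on the ground state [1, 0, 1] with energy -2
```

The quantum version replaces those thermal hops with tunnelling through the energy landscape; whether that buys a real speedup on D-Wave's chips is exactly what the papers being thrown around in this thread disagree on.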


Because if you compare the emulator D-Wave provides for classical computers against the real deal, you can see the difference between pseudorandom and truly chaotic results. There's also the fact that using the quantum route-tracing algorithm (solving Traveling Salesman) you get a truly correct answer in linear time, instead of an approximation as happens on the emulators.

 

You can in fact program in q-assembly, though I prefer higher-level C libraries for it.

This is pretty cool actually. Nice to see that someone has some experience using it.


They replicated the "behavior" that is supposed to make it a "quantum computer" with a classical model at UC Berkeley. That means the claims are very suspicious, especially when not many people are given access. As far as the benchmarks go, the thing is very slow. Also, throwing out the names Google and NASA doesn't mean much to me when they release no public benchmarks or proof other than statements.

 

All that has been proven so far is that it can be done on a classical computer model, that existing OLDER technology by Intel crushes the thing, and that the CPU in my own PC, which is mid-tier for a personal computer, destroys it many times over. Add to that, my old E8400 from 2008 would be faster on one core. What exactly am I supposed to believe in here and be impressed by? Benchmarks? Nope. Possibly fraudulent claims? Nope. Oh, I am supposed to be impressed by the names NASA and Google. No thanks.

Um, it cannot be done on a classical model. It can be approximated/emulated (with a high degree of error) on a classical model. Please do not go spreading misinformation.


Ok, throw out what UC Berkeley said because you find it suspicious. Throw out the fact that it passed both NASA's and Google's tests because that is suspicious. Throw out what Simon Fraser University and National Research Tomsk Polytechnic University published because you are butthurt that they won't publish exactly how the CPU works, and that is suspicious. As if you could understand how it even works. 95% of people, including me, aren't quantum physicists. All I know about quantum mechanics is what I learned in a one-quarter physics class needed to work towards my mechanical engineering degree.

 

As for why only certain people are given access to the machine, do you really think any average Joe is qualified to use the computer? Let's not forget that this isn't mass-produced on an assembly line, so there is obviously a limited supply to work with, given the complexity of building the apparatus needed to house the chip.

 

Also, again, they won't release information on how the chips work because it is a trade secret. You don't see Intel, Nvidia, or AMD letting technical information out into the public. Does anyone other than Nvidia really know how an SMX unit works? Does anyone other than Intel know exactly how they designed their AVX1/AVX2 capabilities to be processed on their chips? The answer is no, because there is money at stake.

 

Also, you talk about how older CPUs can beat the quantum computer in solving quadratic models. Well, guess what? Maybe the quantum CPU is running at a really low IPC (or similar). Maybe the quantum CPU isn't optimized completely for that task yet. Or maybe software used wasn't fully optimized. On the last notion of software optimization, take a look at how long AMD CPUs couldn't run AVX1 properly when compared to Intel. Why?

 

Just because government agencies and large corporations say something doesn't mean I have to take it at face value. Where are the benchmarks? Is it only 7,000 times slower than one core of a Sandy Bridge-E now? WOO HOO.

 

"Maybe the quantum CPU is running at a really low IPC (or similar). Maybe the quantum CPU isn't optimized completely for that task yet. Or maybe software used wasn't fully optimized."

 

Or maybe you should be skeptical of something whose only "proof" of being a quantum computer has already been replicated on classical computing by Berkeley. Maybe you should be capable of critical thought and be skeptical of what government agencies and large corporations say. The burden of proof is on them, and nothing has been proven yet.


Just because government agencies and large corporations say something doesn't mean I have to take it at face value. Where are the benchmarks? Is it only 7,000 times slower than one core of a Sandy Bridge-E now? WOO HOO.

 

"Maybe the quantum CPU is running at a really low IPC (or similar). Maybe the quantum CPU isn't optimized completely for that task yet. Or maybe software used wasn't fully optimized."

 

Or maybe you should be skeptical of something whose only "proof" of being a quantum computer has already been replicated on classical computing by Berkeley. Maybe you should be capable of critical thought and be skeptical of what government agencies and large corporations say. The burden of proof is on them, and nothing has been proven yet.

It's slower for serial-style computing. Getting Traveling Salesman in linear time, even if you work 7n times slower, is still faster than the 2^n that is the fastest a classical computer can actually solve the problem correctly. Benchmarks aren't everything, and there is no equating a quantum computer and a classical one. Please actually do some research.
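Just to spell out the arithmetic of that comparison (taking the linear-time claim as given for the sake of argument):

```python
# Growth comparison: a hypothetical linear-time method with a large constant
# factor (7n) versus exponential exact classical search (~2^n subsets, as in
# the Held-Karp dynamic program for TSP). The constant loses almost immediately.
for n in (5, 10, 20, 30, 40, 50):
    print(f"n = {n:2d}   7n = {7 * n:>10,}   2^n = {2 ** n:>20,}")
```

At n = 5 the exponential term is actually the smaller one, but by n = 50 it is a 16-digit number, which is why asymptotic arguments like this ignore constant factors. Whether the hardware really delivers anything like linear scaling is a separate, and contested, question.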


Google bought a computer from them.

 

Oh, I see. That's where I was confused then. :P


Um, it cannot be done on a classical model. It can be approximated/emulated (with a high degree of error) on a classical model. Please do not go spreading misinformation.

 

Cornell University. Download the PDF

 

http://arxiv.org/abs/1401.7087

 

http://www.scottaaronson.com/blog/?p=1400

 

^ People at MIT are calling this BS.


Just because government agencies and large corporations say something doesn't mean I have to take it at face value. Where are the benchmarks? Is it only 7,000 times slower than one core of a Sandy Bridge-E now? WOO HOO.

 

"Maybe the quantum CPU is running at a really low IPC (or similar). Maybe the quantum CPU isn't optimized completely for that task yet. Or maybe software used wasn't fully optimized."

 

Or maybe you should be skeptical of something whose only "proof" of being a quantum computer has already been replicated on classical computing by Berkeley. Maybe you should be capable of critical thought and be skeptical of what government agencies and large corporations say. The burden of proof is on them, and nothing has been proven yet.

Universities are government agencies? Ok, today I learned that the US government has infiltrated universities across the globe to manipulate published information. Yes, they are public institutions, but you really are taking it too far.

 

Again, your only argument is that you don't understand how it works; therefore, it is clearly the government and corporations lying to you.

 

I hope this thread isn't going to turn into that thread you trashed last week: http://linustechtips.com/main/topic/217960-microsoft-reveals-details-on-directx-113-yes-and-12-new-rendering/

 

This is about cool new technology, so you can be skeptical all you want, but stop posting half-truths that you've manipulated to try to confuse and misinform people. Please go do some research on the topic first and look through the academic papers published on the D-Wave. And using Wikipedia as a source is a joke.

