D-Wave homepage
They're doing a demonstration Feb 13 in Mountain View, CA, and another one Feb 15 in Vancouver. I'm going to attend the Vancouver demo. There isn't much more information than this right now, but I got wind of it at work and thought I'd pass it along.
UPDATE:
So I went to the demo day today. It started a bit later than expected due to the number of people who showed up, but once it got going it was okay.
First, the presentation was very business-oriented and not very technical. It leaned heavily on analogies, and most of the applications were discussed in terms of dollars, markets, and business opportunity. The first half was very dry, spent mostly on the business-side guys talking about the people behind the project and how great everyone was. These two didn't seem to actually know much about the computer itself, and their descriptions of how it worked were pretty lacking; I think they were engineers by training who were good with business and had financial contacts in the business community. There wasn't much substance to anything they said, which was quite unfortunate.
The technical director finished off the presentation by talking a bit about the problems the QC could solve. There was a little technical info, which was nice, but his audience clearly didn't come from a physics background, so he shied away from it for their sake. The demos were fairly impressive, though, particularly the molecule-search application. I didn't get to ask my question, and since most of the questions dealt with possible applications, I didn't get much out of the Q&A. The free food and beer afterwards were nice, though.
The only really technical thing I can speak to is something I had to infer from the information presented, since it wasn't addressed directly: their coherence time still seems fairly low. I say this because the QC doesn't always come up with the same solution to the same problem. The speaker blamed it on noise, which I suppose is true, but that noise serves to decohere the system. As a result, they run the same algorithm several times and take the answer that comes up the majority of the time as the correct one. The system is reliable enough that the majority answer has so far always been the right one (they've never seen an incorrect solution appear more often than a correct one), but there is currently no solution checking, nor any error correction. In fact, they don't plan on doing real-time error correction at all for the next several years; their next step is solution checking at the end of a run of the algorithm.
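For the curious, the repeat-and-vote strategy they described is easy to sketch in software. This is just my own toy illustration of the idea: the "solver" here is a classical stand-in with a made-up success probability, not anything from D-Wave.

```python
import random
from collections import Counter

def noisy_solver(correct_answer, p_correct=0.7, wrong_answers=(1, 2, 3)):
    """Stand-in for one noisy QC run: returns the right answer with
    probability p_correct, otherwise some wrong answer.
    (p_correct and wrong_answers are invented for illustration.)"""
    if random.random() < p_correct:
        return correct_answer
    return random.choice(wrong_answers)

def majority_vote(runs=101, correct_answer=42):
    """Run the solver several times and keep the most frequent answer,
    mirroring the 'take the majority result' approach they described."""
    results = [noisy_solver(correct_answer) for _ in range(runs)]
    answer, _count = Counter(results).most_common(1)[0]
    return answer

if __name__ == "__main__":
    random.seed(0)
    print(majority_vote())
```

As long as each run is right more often than it lands on any particular wrong answer, the majority answer becomes overwhelmingly likely to be correct as the number of runs grows, which matches their claim that the majority result has always been the right one so far.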
They also acknowledged that nearest-neighbour interactions aren't good enough for a larger system, and claim to have come up with a solution they're trying to implement in newer versions of the machine.