zlacker

[parent] [thread] 9 comments
1. elgfar+(OP)[view] [source] 2020-04-26 23:10:50
I recommend Computerphile's videos https://www.youtube.com/playlist?list=PLzg3FkRs7fcRJLgCmpy3o....

I had the same "problem" as you. What finally made me feel I sort of cracked it was those videos. The way I think of it now is: They let you do matrix multiplication. The internal state of the computer is the matrix, and the input is a vector, where each element is represented by a qubit. The elements can have any value 0 to 1, but in the output vector of the multiplication, they are collapsed into 0 or 1. You then run it many times to get statistical data on the output to be able to pinpoint the output values more closely.
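That picture translates almost directly into code. A minimal numpy sketch of it (my own toy example, not from the videos): the "computer" is a unitary matrix, the qubit register is a vector, and measurement statistics come from sampling the squared amplitudes.

```python
import numpy as np

# One qubit starts in state |0>, written as the vector (1, 0).
state = np.array([1.0, 0.0], dtype=complex)

# A "gate" is a unitary matrix; here the Hadamard gate, which
# puts the qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# The computation itself is just matrix-vector multiplication.
state = H @ state

# Measurement collapses the state: outcome k appears with
# probability |amplitude_k|^2. Run many shots for statistics.
probs = np.abs(state) ** 2
rng = np.random.default_rng(0)
shots = rng.choice([0, 1], size=10_000, p=probs)
print(probs)               # [0.5 0.5]
print(np.bincount(shots))  # roughly 5000 of each
```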

replies(1): >>ad404b+D3
2. ad404b+D3[view] [source] 2020-04-26 23:45:34
>>elgfar+(OP)
I don't know if it's accurate (because I never understand anything I read about it) but this is the most concise and clear explanation I've read on this subject to date. Thank you!
replies(1): >>dreamc+cp
3. dreamc+cp[view] [source] [discussion] 2020-04-27 03:48:46
>>ad404b+D3
It's accurate. QC is just linear algebra with complex numbers, plus probability extended to the complex domain. Why that's useful is something I'm still struggling with as well.
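The "probability extended to the complex domain" part is where it stops being ordinary linear algebra: amplitudes can be negative (or complex), so paths can cancel. A small sketch of destructive interference, assuming the same Hadamard gate as above:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
ket0 = np.array([1.0, 0.0], dtype=complex)

once = H @ ket0    # equal superposition: amplitudes (1/sqrt2, 1/sqrt2)
twice = H @ once   # back to |0>: the two paths to |1> cancel exactly

print(np.abs(once) ** 2)   # [0.5 0.5]
print(np.abs(twice) ** 2)  # [1. 0.]
```

With ordinary (nonnegative) probabilities, applying a "randomizing" step twice can never bring you back to a certain outcome; the cancellation is only possible because amplitudes carry signs.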
replies(1): >>sodafo+nt
4. sodafo+nt[view] [source] [discussion] 2020-04-27 04:50:51
>>dreamc+cp
I'd assume it's the speed advantage, but the only problem I can think of that would require that type of exponential speed is cracking hashing algorithms which just seems destructive and counterproductive, like building a digital nuclear bomb - and from my very limited understanding that's a long ways off from being achieved.

I assume there's probably many more complex computational problems outside of my domain that QC can help with. Does anybody know of any?

replies(3): >>skosch+ov >>brg+Bv >>Smaug1+Sy
5. skosch+ov[view] [source] [discussion] 2020-04-27 05:22:14
>>sodafo+nt
Many problems require lots of huge matrix multiplications—think simulations of physical systems, e.g. weather systems or molecular interactions, or numerical optimization.
replies(1): >>TJSome+EF
6. brg+Bv[view] [source] [discussion] 2020-04-27 05:24:43
>>sodafo+nt
Aside from Shor's, the other well-known one is Grover's algorithm, which deals with search in an unstructured database. More and more superpolynomial speedups have been discovered in applications of QC. A good enumeration of these is the quantum algorithm zoo.

https://quantumalgorithmzoo.org/

7. Smaug1+Sy[view] [source] [discussion] 2020-04-27 06:13:01
>>sodafo+nt
Grover's algorithm lets you search an unsorted list of four elements with just one "is this thing in the list?" query. Classically, of course, you need up to three queries in the worst case (after three misses, the fourth element must be the one).

More precisely, given f: {0,1}^n -> {0,1} which is guaranteed to hit 1 exactly once, Grover finds the one input that hits 1, and it does so using about 2^{n/2} queries of f; but the constants happen to line up so that when n=2, exactly one query is required.
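You can check the n=2 case by simulating one Grover iteration directly on the 4-dimensional state vector. A sketch (the marked index is an arbitrary choice for illustration):

```python
import numpy as np

N = 4        # 4-element list, i.e. n = 2 qubits
marked = 2   # the one index the oracle "recognizes" (arbitrary choice)

# Start in the uniform superposition over all four indices.
state = np.full(N, 1 / np.sqrt(N))

# Oracle: flip the sign of the marked element's amplitude (one query of f).
oracle = np.eye(N)
oracle[marked, marked] = -1

# Diffusion operator: reflect every amplitude about the mean, 2|s><s| - I.
s = np.full(N, 1 / np.sqrt(N))
diffusion = 2 * np.outer(s, s) - np.eye(N)

# One Grover iteration = one oracle query + one diffusion step.
state = diffusion @ (oracle @ state)

probs = np.abs(state) ** 2
print(probs)  # the marked index ends up with probability 1
```

Working through the arithmetic: all amplitudes start at 1/2; the oracle sends the marked one to -1/2; the mean is then 1/4, and reflecting about it sends the three unmarked amplitudes to 0 and the marked one to 1. Hence the exactness of the single query at n=2.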

8. TJSome+EF[view] [source] [discussion] 2020-04-27 07:35:38
>>skosch+ov
Additionally, many problems can be converted to matrix operations, like graph algorithms.

Note that matrix multiplication takes O(n^2) time with a quantum computer, but O(n^2.807) time on a classical computer (Strassen's algorithm).
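For the curious, the 2.807 exponent is log2(7): Strassen replaces the 8 block multiplications of naive divide-and-conquer with 7. A sketch for matrices whose size is a power of 2:

```python
import numpy as np

def strassen(A, B):
    """Strassen multiplication for n x n matrices, n a power of 2.
    Uses 7 recursive products instead of 8, giving O(n^log2(7)) ~ O(n^2.807)."""
    n = A.shape[0]
    if n == 1:
        return A * B
    m = n // 2
    A11, A12, A21, A22 = A[:m, :m], A[:m, m:], A[m:, :m], A[m:, m:]
    B11, B12, B21, B22 = B[:m, :m], B[:m, m:], B[m:, :m], B[m:, m:]
    # The seven Strassen products.
    M1 = strassen(A11 + A22, B11 + B22)
    M2 = strassen(A21 + A22, B11)
    M3 = strassen(A11, B12 - B22)
    M4 = strassen(A22, B21 - B11)
    M5 = strassen(A11 + A12, B22)
    M6 = strassen(A21 - A11, B11 + B12)
    M7 = strassen(A12 - A22, B21 + B22)
    # Reassemble the four blocks of the result.
    C = np.empty_like(A)
    C[:m, :m] = M1 + M4 - M5 + M7
    C[:m, m:] = M3 + M5
    C[m:, :m] = M2 + M4
    C[m:, m:] = M1 - M2 + M3 + M6
    return C

A = np.random.default_rng(1).integers(0, 10, (4, 4))
B = np.random.default_rng(2).integers(0, 10, (4, 4))
print(np.array_equal(strassen(A, B), A @ B))  # True
```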

replies(1): >>gnu-no+PJ
9. gnu-no+PJ[view] [source] [discussion] 2020-04-27 08:28:42
>>TJSome+EF
> but O(n^2.807) time on a classical computer

Optimizing matrix multiplication for classical computers is an open research problem, and according to Wikipedia there are algorithms with O(n^2.37) running time. Also according to Wikipedia, it has not been proven that matrix multiplication can't be done in O(n^2).

replies(1): >>infogu+fN
10. infogu+fN[view] [source] [discussion] 2020-04-27 09:09:59
>>gnu-no+PJ
Ah yes, we may as well add NP-completeness to this thread.