Hacker News
Qudits: The Real Future of Quantum Computing? (ieee.org)
63 points by jonbaer on June 30, 2017 | hide | past | favorite | 34 comments


You don't actually want qubits; you want an analog computer with differentiable signals. Most likely photonic. Qubits are a dead evolutionary branch.

I've been recently exploring computational metamaterials for photonic computation.

http://users.ece.utexas.edu/~aalu/research%20-%20page%203.ht... (there are quite a few papers on this, but unfortunately they are all paywalled. Spoiler alert: they seem to be based entirely on the Fourier transform).

These computational metamaterials don't need electricity to be powered (you do need something to shoot photons at them and read the values back off, though).

Machine learning would be much, much faster on these as you have O(1) differential calculus.
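(If it helps to see what "O(1) differential calculus" means here: differentiation becomes pointwise multiplication by ik in the Fourier domain, which an optical Fourier setup would perform physically. A rough digital sketch of the same math, with NumPy's FFT standing in for the optics:)

```python
import numpy as np

# Spectral differentiation: d/dx in real space is multiplication
# by i*k in Fourier space. An optical Fourier element would do the
# transform "for free"; here np.fft simulates it.
n = 256
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
f = np.sin(3 * x)                                  # test signal
k = np.fft.fftfreq(n, d=x[1] - x[0]) * 2 * np.pi   # angular wavenumbers
df = np.fft.ifft(1j * k * np.fft.fft(f)).real      # spectral derivative

# Matches the analytic derivative 3*cos(3x) to machine precision.
assert np.allclose(df, 3 * np.cos(3 * x))
```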

They don't heat up. You could possibly build a house-sized CPU out of these. I can see it: a city-block-sized CPU with a nuclear reactor next to it.

Did you know that on an analog machine, you can do sort in O(n)?

https://en.wikipedia.org/wiki/Spaghetti_sort
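(The spaghetti trick gets its O(n) because physics does the "find the tallest strand" step in O(1). The closest purely digital cousin is a counting sort over bounded integer keys; a rough sketch:)

```python
def counting_sort(values, max_value):
    """O(n + k) sort for non-negative integers bounded by max_value.
    The digital analogue of the spaghetti trick: instead of physics
    picking the tallest strand, we tally strand lengths directly."""
    counts = [0] * (max_value + 1)
    for v in values:
        counts[v] += 1
    out = []
    for v, c in enumerate(counts):
        out.extend([v] * c)
    return out

print(counting_sort([5, 1, 4, 1, 3], 5))  # [1, 1, 3, 4, 5]
```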

Hit me up if you wanna chat about this. I've seen the "light" (xdddd) now and can't go back to stupid bits.

I'm not like super married to the metamaterials but analog photonic trumps quantum for just about every task I can think of.


> You don't actually want qubits; you want an analog computer with differentiable signals. Most likely photonic. Qubits are a dead evolutionary branch.

that's quite a bold claim. im pretty intrigued, but it seems like there isn't a lot of easily accessible information on this topic. the wikipedia article on analog computers mostly focuses on mechanical ones from the past. do you have any recommended starter readings for analog computation? is it a model that has the computational time complexity of a non-deterministic turing machine? thanks.


There really isn't. Some of the old stuff still applies. But my approach to understanding this is very all over the place and I wouldn't recommend it for anyone besides myself. But I've found trying to understand electromagnetism and basic quantum theory (photon-matter interactions) to be quite helpful.

As for classification, there's some work on this e.g. this http://www.cs.princeton.edu/~ken/MCS86.pdf

I believe it's not non-deterministic; in some sense, if you can pose the problem in a differentiable way, it's better than non-determinism.

It's like being in a maze. An analog-computer-like robot can perceive the slight tilt of the floor from the entrance to the exit and just follow that tilt. It takes the robot an optimal number of steps, too.

A non-deterministic Turing machine would kind of split itself into N different robots and start exploring all the turns. It might find the answer eventually, but it will consume that much more energy.

This analogy might be silly but I hope it conveys how very different analog machines are.


thanks for the info. for the maze example, a ND robot would be able to find the exit in the optimal number of steps since one of the decision trees will in fact be the optimal path and will terminate first. so i dont see how analog computing can be more efficient.

unless you are saying that in an analog case we can ignore the rules of the maze and can go through walls. but then are we still solving the same problem? seems like the time complexity comparison no longer makes sense if you redefine the constraints of the problem.


> unless you are saying that in an analog case we can ignore the rules of the maze and can go through walls.

Not ignore, but you have a lot more information about the problem and you have a general direction of the solution.

A non-deterministic Turing machine can split and explore multiple corridors at once. But in the end, all those robots that don't end up finding the exit just wasted energy. I guess you don't care about this in complexity theory, but it's a huge gain in practice.


The description makes me think of a loud sound echoing through the maze: each branch further divides the volume of the sound, but a loud enough sound will "exit" in the optimal "O(1)" time because the sound wave "explores" every branch of the maze in "parallel".

Is this a reasonable mental model of what you're describing?


In your description you are still expending extra energy to explore the branches. Also, it's not O(1); it's just that every step you take is bound to be optimal. So in O(1) you are literally one step closer to the solution.


Did someone call my name?


Please make your criticism more concrete. Is what I'm proposing a stretch given the current circumstances? Definitely. However I can provide you with research on just about any point that I'm making. You do sometimes have to read between the lines but please point out one thing that you need clarified and I'll provide some resources.


I don't think you understand why quantum computers are powerful.


I do actually, but analog computers are potentially even more powerful. Quantum computers have bits and you don't want those.

Also when I say quantum I really mean digital quantum computer, I still have to decide what to think of quantum analog computers.


An ideal 1000-qubit system can explore 10^300 solutions simultaneously. A classical analog computer achieving a similar speedup would be larger than the visible universe (10^30 m) with precision below the Planck length (10^-35 m).

Unlikely.


They are orthogonal: digital/analog vs. non-quantum/quantum. You can have quantum analog. Note that analog is a better match, since quanta are fundamentally analog and you lose all this info when going to bits.

That's the thing: it can evaluate a lot of computations at the same time, but an analog computer can recognize the direction of the solution, so the number of simultaneous solutions doesn't really matter.


I agree they are orthogonal, but that does nothing to support the point that analog computers could be even more powerful than quantum computers.

You haven't really explained your second point, but assuming it's true, you'll only be able to "recognize the direction" within your margin of error. Your noise floor will be way above the required precision to explore a high-dimensional space.

Analog computers never achieved common use because of noise and lack of practical error correction. Even the human brain evolved to be mostly digital and we almost never focus on being exact.


Can you like google around please? What I'm saying isn't as unsupported as you seem to believe.

You are talking about electric analog, I'm talking about photonic analog.

How is the brain mostly digital?


This may be a rather naive question, as I've never seriously considered a modern take on analog computers with differentiable signals, but intuitively it's easy to see qualitative advantages in algorithms based on differential calculus, Fourier analysis, etc. What's less obvious to me is how such machines would do at fundamentally discrete tasks.

For a trivially simple concrete example, consider the task of computing the results of the SHA-256 hashing algorithm, which is primarily bottlenecked by bit shifts and rotates on integers. How would your analog machine compare to conventional digital computers? Is there any literature on this?
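For reference, those bottleneck operations look like this: the Σ mixing functions from the SHA-256 round (FIPS 180-4), built entirely out of rotates and XORs of discrete bit positions. A sketch in Python:

```python
def rotr(x, n, w=32):
    """Rotate a w-bit word right by n bits."""
    return ((x >> n) | (x << (w - n))) & ((1 << w) - 1)

# The big-sigma functions from the SHA-256 compression loop:
# pure shuffling of bit positions, with no obvious continuous analogue.
def big_sigma0(x):
    return rotr(x, 2) ^ rotr(x, 13) ^ rotr(x, 22)

def big_sigma1(x):
    return rotr(x, 6) ^ rotr(x, 11) ^ rotr(x, 25)

assert rotr(1, 1) == 0x80000000       # the single bit wraps around
assert rotr(0x80000000, 31) == 1
```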


> What's less obvious to me is how such machines would do at fundamentally discrete tasks.

I might be talking out of my butt but fundamentally I think that eventually we'll realize that a lot of the currently supposedly solved problems might need to be rethought. I'm inclined to believe that hashing falls under these but don't quote me on that.

As for literature, I can't recommend anything in particular. But there's something on Amazon, arXiv, etc. I'm not sure what part of this you are most interested in; I guess what you are talking about could be complexity?


I'm extremely skeptical of this. Analog on such small scales is overwhelmed by noise and very low precision. Analog computations aren't magic either, they still take energy and produce waste heat. You are fighting the laws of thermodynamics. I certainly don't see how it beats quantum computers at the things they are good at over classical computers.


> I'm extremely skeptical of this.

It's ok, it took me some time to realize that the bit-copper orthodoxy needs to be toppled as well.

> Analog on such small scales is overwhelmed by noise and very low precision.

But here we are making more and more photonic and optic things. Fiber optics are taking over. There's quite a bit of work in silicon-photonic design (you can check this book: https://smile.amazon.com/Silicon-Photonics-Design-Devices-Sy...).

I mean there's noise yes, but it's actually somewhat manageable in a photonic setup.

> You are fighting the laws of thermodynamics.

Nowhere near as much as with copper. A fiber optic cable doesn't heat up.

> I certainly don't see how it beats quantum computers at the things they are good at over classical computers.

O(1) differential calculus is magic for a lot of purposes.


> Did you know that on an analog machine, you can do sort in O(n)?

Can it sort names or addresses?

Otherwise, for integers you don't have to buy spaghetti and can just use a computer to do a radix sort, which is O(nw).

Another thing with analog computers is to think about how they scale. I can sort 1M items on my laptop. What is the practical application of buying 1M strands of spaghetti?
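For concreteness, a minimal LSD radix sort, the O(nw) algorithm mentioned above (this sketch assumes non-negative integers):

```python
def radix_sort(values, bits=32, radix_bits=8):
    """LSD radix sort: O(n * w) for n values of w bits,
    processed radix_bits bits at a time (here 4 passes of 8 bits)."""
    mask = (1 << radix_bits) - 1
    for shift in range(0, bits, radix_bits):
        buckets = [[] for _ in range(1 << radix_bits)]
        for v in values:
            buckets[(v >> shift) & mask].append(v)
        values = [v for bucket in buckets for v in bucket]
    return values

print(radix_sort([170, 45, 75, 90, 2, 802, 24, 66]))
# [2, 24, 45, 66, 75, 90, 170, 802]
```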


> Can it sort names or addresses?

Why couldn't it? Ideally they can represent real numbers with infinite precision, which can encode arbitrary strings.

https://en.wikipedia.org/wiki/Real_computation

But even in non-real computation this shouldn't be a problem, the same way you can sort strings even though you can hold only 32/64 bits of memory at a time, if that's what you are hinting at.

> Another thing with analog computers is to think how they scale.

I don't think that scale has ever been a problem for analog computers.


> I don't think that scale has ever been a problem for analog computers.

I'd think an array of 1B spaghetti strands might pose a few difficulties above what sorting those numbers on computers would.


I can't tell if you are joking or not. You don't actually think there are 1B strands of spaghetti, right?


Not actual pasta, no, but there would have to be an analog equivalent. How do you see a practical implementation of this working that still preserves the O(n) property?


There are several ways of achieving computation in this scheme. I'm not quite settled on which of these is the best. But why do you think that this property would translate when going from electric to photonic?


Is it yet possible to build the basic logic gates, using light only?


You don't want logic gates.


What do you want then? Is there some lecture on this that would explain the architecture of such a computer?


I want continuous functions and signals. I'm not aware of lectures, but search for analog computation on Amazon or Google.

There's this class https://www.utdallas.edu/atec/docs/virtual-analog-computing.... but the resources aren't online.


Can you please try to explain the basics?

What would such a machine look like, say, one that adds two numbers?


I know these guys understand quantum anything more than I ever will, but is having a larger radix specifically a practical way for increasing performance?

If a binary device has a hardware fault, bits that are in error still have a 50% chance of being correct. It seems like a single component failing here would make this worse (five times worse if it weren't quantum).

I don't know whether or not the pros outweigh the cons here, or at what radix they do (there's probably some statistical analysis that can answer that), but ten just _seems_ like too much.


I am not an expert.

The issue that quantum computers have is not so much that individual qubits get errors, but rather that qubits become entangled with the rest of the universe, erasing our ability to take advantage of any quantum effect. To prevent this, you need to very carefully contain the qubits to prevent them from interacting with anything apart from highly controlled manipulations. The more qubits you have, the harder this becomes. If you can replace them with qudits that have more states, you can do the same computation while needing fewer particles, making containment easier.
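The particle-count saving is easy to quantify: n qudits of dimension d span d^n states, so matching the state space of m qubits takes roughly m / log2(d) qudits. A quick back-of-the-envelope check, assuming ideal qudits:

```python
import math

def qudits_needed(m_qubits, d):
    """Number of d-level qudits whose state space (d**n) matches
    or exceeds that of m_qubits ideal qubits (2**m_qubits)."""
    return math.ceil(m_qubits / math.log2(d))

print(qudits_needed(1000, 2))   # 1000 (qubits are the d=2 case)
print(qudits_needed(1000, 10))  # 302  (~3.3x fewer particles)
```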


A qudit expands the state space, sure. But now you need universal control over this system, so I'm not sure you've bought anything. Also, a system will decay from energy level n to n-1 faster when n is larger... so, eh.


Quantum algorithms change the complexity class. The pro is simply too big if it can work in reality.



