Quantum Computing

Discussions about quantum computers

Quantum mechanics appears to be a local hidden variable theory, at least when analyzing quantum circuits. This is not to say that quantum mechanics can be made into a local hidden variable theory by adding extra ingredients to it; the claim is that it already appears to be one, without modifying it in any way.

The paper points out that the computed weak values for a quantum circuit seem to evolve deterministically, exactly as we would expect. Each qubit can be assigned a weak value, and this weak value isn't arbitrary: it is fully determined by the constraints of the theory, and it is the only value their weak value equation allows you to assign.

If you compute the weak values for all the observables throughout the system, an interesting pattern emerges: the values look like what we would expect the states of the system to be if quantum mechanics were a local hidden variable theory. For example, take a measurement on an entangled system. If you compute the weak values throughout the system's evolution prior to the measurement, you find that the observables become correlated with a specific value at the moment they locally interact, not later when you measure them, and the weak values tell you exactly which pre-chosen value they settled upon.

Sometimes the weak values you compute for an observable are strange, not even whole numbers. But whenever the qubits interact in a way that depends on those values, the interaction causes the weak values to "snap" to a definite eigenvalue. For example, playing around with the equation I managed to get a strange weak value of 2.414 for the Z observable, but when the CNOT operator was applied with that qubit as the control, which requires it to settle on a specific value, it "snapped" to +1 (which is |0>) in response.
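To make this concrete, here is a minimal NumPy sketch of a single-qubit weak value computation (I used Octave, but NumPy is equivalent). The pre- and post-selected states below are my own illustrative choices, not taken from the post; they happen to reproduce a "strange" Z weak value of about 2.414 = 1 + sqrt(2), while conditioning on a definite |0> outcome forces the value to a plain +1:

```python
import numpy as np

# Illustrative states (my own choices, not from the post) showing that a
# weak value of Z need not be one of Z's eigenvalues (+1 or -1).

Z = np.array([[1, 0], [0, -1]], dtype=complex)

def weak_value(phi, A, psi):
    """Standard weak value: <phi|A|psi> / <phi|psi>."""
    return (phi.conj() @ A @ psi) / (phi.conj() @ psi)

theta = np.pi / 8
psi = np.array([np.cos(theta), np.sin(theta)])   # pre-selected state
phi_minus = np.array([1, -1]) / np.sqrt(2)       # post-select on |->
phi_zero = np.array([1, 0])                      # post-select on |0>

print(weak_value(phi_minus, Z, psi).real)  # ~2.414, not an eigenvalue of Z
print(weak_value(phi_zero, Z, psi).real)   # exactly +1 once conditioned on |0>
```

The same fraction produces both behaviors; only the post-selection changes.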

The weak values evolve locally in the sense that they do not change unless an operator is applied to that specific qubit. Even for an entangled qubit, the weak value only changes to a value correlated with its partner at the moment of the interaction, when the CNOT operator is applied; it does not change afterwards when the qubit it is entangled with is acted upon.
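Here is a toy check of that locality claim (my own example circuit, not from the paper): after two qubits entangle via a CNOT, a later gate applied to qubit 1 alone leaves the weak value of Z on qubit 0 unchanged.

```python
import numpy as np

# Circuit: |00> -> (H on qubit 0) -> CNOT -> (H on qubit 1), post-select |01>.
# Compare the Z weak value of qubit 0 just after the CNOT and just after the
# later gate on qubit 1: it is the same at both time slices.

I2 = np.eye(2, dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
Z = np.diag([1.0, -1.0]).astype(complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

ket00 = np.array([1, 0, 0, 0], dtype=complex)
phi_f = np.array([0, 1, 0, 0], dtype=complex)   # post-selected final state |01>

U_late = np.kron(I2, H)                  # acts on qubit 1 only
psi_a = CNOT @ np.kron(H, I2) @ ket00    # Bell state, just after the CNOT
psi_b = U_late @ psi_a                   # just after the later gate

A = np.kron(Z, I2)                       # Z on qubit 0

def wv(phi, A, psi):
    return (phi.conj() @ A @ psi) / (phi.conj() @ psi)

# Backward-evolve the post-selection to the earlier slice, then compare:
early = wv(U_late.conj().T @ phi_f, A, psi_a)
late = wv(phi_f, A, psi_b)
print(early.real, late.real)   # identical: the gate on qubit 1 changed nothing
```

This works because the later gate acts only on qubit 1 and therefore commutes with Z on qubit 0; the weak value of qubit 0 is untouched by anything done to its partner after the interaction.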

The weak values cannot be used to violate the uncertainty principle because the equation for weak values requires you to time-evolve the system in both directions, meaning you have to condition it on both the initial state and a specific final state.

Due to the uncertainty principle, a forwards time evolution of a quantum circuit yields ambiguous results, forcing you to describe it in terms of phase space, capturing all possible configurations. A backwards time evolution would do the same thing, because unitary evolution is time-reversible.

However, the weak value equation does both simultaneously. If you want to know the weak value of a qubit at t=0.5 in an experiment that runs from t=0 to t=1, you compute the backwards time evolution of the final state from t=1 to t=0.5 and the forwards time evolution of the initial state from t=0 to t=0.5. Together these give you sufficient constraints that a deterministic value pops out without any ambiguity, and you can do this for all the observables of the system.
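The forward/backward recipe can be sketched in a few lines of NumPy. The circuit and post-selection below are illustrative choices of mine (a Bell circuit running from t=0 to t=1, with the final measurement outcome taken to be |11>), not an example from the paper:

```python
import numpy as np

# Circuit: |00> at t=0, H on qubit 0, then CNOT, ending at t=1.
# We want the weak value of Z on qubit 0 at t=0.5 (after H, before CNOT),
# conditioned on the final outcome |11>.

I2 = np.eye(2, dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
Z = np.diag([1.0, -1.0]).astype(complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

ket00 = np.array([1, 0, 0, 0], dtype=complex)   # initial state at t=0
ket11 = np.array([0, 0, 0, 1], dtype=complex)   # post-selected state at t=1

# Forward evolution of the initial state, t=0 -> t=0.5:
psi_mid = np.kron(H, I2) @ ket00
# Backward evolution of the final state, t=1 -> t=0.5:
phi_mid = CNOT.conj().T @ ket11

A = np.kron(Z, I2)   # Z on qubit 0
Zw = (phi_mid.conj() @ A @ psi_mid) / (phi_mid.conj() @ psi_mid)
print(Zw.real)   # -1.0: conditioned on |11>, qubit 0 is already |1> at t=0.5
```

Neither the forward state alone (an even superposition) nor the backward state alone pins the qubit down; combined, the fraction returns a single definite value.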

The equation is really simple and doesn't add anything to the theory. It is just a simple fraction, basically the same as the already agreed-upon weak value equation, with a clarification of how you should construct the initial and final state vectors.
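For reference, the standard weak value expression being alluded to (the Aharonov-Albert-Vaidman formula; the unitaries carry the pre-selected state forward and the post-selected state backward to the intermediate time t) has the form:

```latex
A_w \;=\; \frac{\langle \phi_f |\, U(t_f, t)\, A\, U(t, t_i)\, | \psi_i \rangle}
               {\langle \phi_f |\, U(t_f, t)\, U(t, t_i)\, | \psi_i \rangle}
```

The paper's contribution, as described above, is not a new formula but a prescription for how to build the initial and final state vectors that go into it.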

I have been playing around with it in Octave and it's blown me away.

If these weak values are indeed the physical state of the system at its intermediate steps, why doesn't that violate Bell's theorem? Because Bell's theorem assumes a hidden variable theory would depend solely on the initial state of the system evolved forwards in time. The weak value equation relies on both the forward and the backward evolution in time simultaneously, and either one alone is not a sufficient constraint to spit out a deterministic answer.

You shouldn't interpret this as the system actually evolving in both directions of time simultaneously. The paper suggests thinking of it more in terms of a kind of "all-at-once" dynamics: when the whole past and future are considered "all-at-once," the universe imposes certain constraints on the evolution of a system, namely the constraints given by quantum mechanics. The system always evolves locally and deterministically according to those constraints, but the nature of this kind of evolution makes it impossible to predict ahead of time from the initial state alone.

submitted 5 months ago* (last edited 5 months ago) by Sal@mander.xyz to c/qubits@mander.xyz

Abstract

Since the early 2000s there has existed the meme that “DOOM can run on anything”. Whether it be an ATM or a calculator, someone at some point has recompiled DOOM to run on it. Now the quantum computer finally joins the list. More specifically, this project represents the first level of DOOM loosely rewritten using Hadamards and Toffolis which, despite being a universal gate set, has been designed in such a way that it’s classically simulable, able to reach 10-20 frames per second on a laptop. The circuit uses 72,376 total qubits and at least 80 million gates, thus it may have use as a benchmark for quantum simulation software.


This is an ebook about quantum computing made for a general audience. The ebook was made freely available via arXiv by the author Chris Ferrie, who is known for writing science books for children such as the popular Quantum Physics for Babies.
