What are the differences between quantum computing and ternary computing?
From what I understand...
A standard sequence of n bits has 2^n configurations, and can be in only one of those configurations at a time.
A standard sequence of n trits has 3^n configurations, and can be in only one of those configurations at a time.
A standard sequence of n qubits can be in a superposition of its 2^n configurations, with a complex amplitude assigned to each; measuring it yields exactly one configuration.
A standard sequence of n qutrits can be in a superposition of its 3^n configurations, with a complex amplitude assigned to each; measuring it yields exactly one configuration.
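To make the contrast concrete, here is a minimal sketch (my own illustration, using NumPy) of the state each model carries: a classical bit or trit register is a single index into its configurations, while a qubit register is a normalized vector of complex amplitudes, one per configuration.

```python
import numpy as np

n = 2  # register size used throughout

# Classical bits: exactly one of the 2**n configurations at a time.
bit_config = 0b10
assert 0 <= bit_config < 2**n

# Classical trits: exactly one of the 3**n configurations at a time.
trit_config = 5  # an index in [0, 3**n)
assert 0 <= trit_config < 3**n

# Qubits: a vector of 2**n complex amplitudes, one per configuration.
# Amplitudes can interfere (note the sign and the imaginary parts),
# which is what distinguishes this from a mere "subset" of configurations.
state = np.array([0.5, 0.5j, -0.5, 0.5j], dtype=complex)
assert np.isclose(np.sum(np.abs(state) ** 2), 1.0)  # must be normalized

# Measurement collapses the register to one configuration,
# with probability |amplitude|**2 for each.
probs = np.abs(state) ** 2
print(probs)  # here each of the 4 configurations has probability 0.25
```

The key point the sketch makes: the qubit register's state is 2^n complex numbers, not a yes/no flag per configuration, and relative phases (like the -0.5 above) matter because they can cancel or reinforce under later operations.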