
QUANTUM COMPUTING

Quantum computing studies theoretical computation systems (quantum computers) that make direct use of quantum-mechanical phenomena, such as superposition and entanglement, to perform operations on data. Quantum computers differ from binary digital electronic computers based on transistors.

Fine Line Labs is currently putting together a kit that will allow anyone to build a qubit, the basic component of a quantum computer. With many qubits, standardized quantum computing will be possible. Though still in the research and development phases, we have been making progress in bringing the cost of a qubit kit down to a reasonable figure. Our idea is that, with a DIY kit that costs a fraction of what Universities are currently paying for the development of a single qubit, coupled with a cloud network that will connect the qubits, a citizen-built quantum computer will be possible.

WHY IT'S IMPORTANT

Encryption:

Quantum computing promises to solve certain problems dramatically faster than classical computing. Where a classical computer may have to try every possible combination to crack a password, a quantum computer running Grover's search algorithm needs only on the order of the square root of that many tries, and Shor's algorithm can factor the large numbers behind widely used public-key encryption exponentially faster than any known classical method. This power comes from superposition: a qubit is not restricted to 0 or 1 but can exist in a weighted combination of both at once, whereas a classical bit is always exactly 0 or 1.
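To put a number on the password example: the known quantum speedup for unstructured search (Grover's algorithm) is quadratic, roughly the square root of the number of candidates instead of all of them. A back-of-envelope comparison in Python (the 8-character lowercase password is an illustrative assumption, not a real attack model):

```python
import math

# Illustrative back-of-envelope comparison: brute-forcing an
# 8-character lowercase password (assumed scenario, not a real attack).
alphabet_size = 26
length = 8
search_space = alphabet_size ** length          # ~2.09e11 candidates

# Classical brute force: on average, half the space must be tried.
classical_tries = search_space // 2

# Grover's quantum search: roughly sqrt(N) oracle queries.
grover_queries = math.isqrt(search_space)

print(f"search space:      {search_space:,}")
print(f"classical (avg):   {classical_tries:,}")
print(f"Grover (~sqrt N):  {grover_queries:,}")
```

Here sqrt(26^8) is exactly 26^4 = 456,976 queries, versus roughly a hundred billion classical tries on average.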

Commercial Interests:

Commercial interests as the sole driving force behind quantum computing would have consequences detrimental to humanity, and the scope of this problem is bigger than anything faced before. As has been seen with government and pharmaceuticals, whenever there is a medical advancement, profit is the first indicator examined; if a discovery can't be patented, the breakthrough is often shelved indefinitely. There have been exceptions, such as the polio and smallpox vaccines, but in most cases, such as the disputes surrounding CRISPR, discoveries are battled over and often go nowhere if the companies involved in the research don't foresee profits. Quantum computing will face these same problems, and because it is feasible that quantum computers will be able to attack a vast range of computational problems, and quickly, humanity has a lot to lose if quantum computing rests in the hands of governments and corporations alone. One might argue that discoveries will remain in the educational domain because experiments are often conducted at universities, but that research is usually funded by corporations or governments and will not be accessible to the masses, as is the case with pharmaceuticals. This cannot be allowed to happen.

Certain aspects of quantum computing are currently being opened up to the public, such as the cloud interface to IBM's 5-qubit system, but the components and software remain proprietary and inaccessible.

Unlike drug development, there is no legal barrier to individuals exploring quantum computing, so we must pursue it and stay in the know. We must give citizens and the masses access to quantum computing, a technology that could help solve many of humanity's hardest computational problems, including those posed by disease, climate change, and countless others.

Stay tuned for further developments!

MORSE CODE AS AN ANALOGY TO CLASSICAL COMPUTERS

A quantum computer makes use of the quantum states of subatomic particles to store and process information, and differs from binary digital electronic computers based on transistors. Whereas common digital computing requires that data be encoded into binary digits (bits), each of which is always in one of two definite states (0 or 1), quantum computation uses quantum bits (qubits), which can be in a superposition of states.

A useful way to think about quantum superposition, in terms of quantum computation, is as follows. As an analogy to digital 0's and 1's, Morse code uses dots and lines to encode letters and numbers: dot dot dot, line line line, dot dot dot spells "SOS", since an S is encoded as dot dot dot and an O as line line line. This is exactly how a digital computer works, using combinations of 0s and 1s to encode all information.

So, using Morse but with 0 and 1 instead of dot and line, imagine you are standing at one end of a hallway, and your partner is standing at the other end holding up flashcards showing either 0 or 1. As in the Morse example, to spell out SOS your partner would have to hold up the following flashcards, one at a time: 0 0 0 1 1 1 0 0 0 (same as dot dot dot, line line line, dot dot dot). In essence, you have to look at nine flashcards to get the SOS message. As the message gets longer, you have to look at many more cards. Take the entire alphabet: to relay it to you, your flashcard partner would have to hold up the following cards (Letter / Morse / Binary):

A .- 0 1

B -... 1 0 0 0

C -.-. 1 0 1 0

D -.. 1 0 0

E . 0

F ..-. 0 0 1 0

G --. 1 1 0

H .... 0 0 0 0

I .. 0 0

J .--- 0 1 1 1

K -.- 1 0 1

L .-.. 0 1 0 0

M -- 1 1

N -. 1 0

O --- 1 1 1

P .--. 0 1 1 0

Q --.- 1 1 0 1

R .-. 0 1 0

S ... 0 0 0

T - 1

U ..- 0 0 1

V ...- 0 0 0 1

W .-- 0 1 1

X -..- 1 0 0 1

Y -.-- 1 0 1 1

Z --.. 1 1 0 0
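The flashcard scheme above can be sketched in a few lines of Python. The dictionary simply transcribes the table (dot = 0, line = 1), and the `flashcards` helper is a name invented here for illustration:

```python
# Morse-as-binary table from above (dot = 0, line = 1).
MORSE_BINARY = {
    "A": "01",   "B": "1000", "C": "1010", "D": "100",  "E": "0",
    "F": "0010", "G": "110",  "H": "0000", "I": "00",   "J": "0111",
    "K": "101",  "L": "0100", "M": "11",   "N": "10",   "O": "111",
    "P": "0110", "Q": "1101", "R": "010",  "S": "000",  "T": "1",
    "U": "001",  "V": "0001", "W": "011",  "X": "1001", "Y": "1011",
    "Z": "1100",
}

def flashcards(message: str) -> str:
    """Return the sequence of 0/1 flashcards needed to spell a message."""
    return "".join(MORSE_BINARY[ch] for ch in message.upper())

print(flashcards("SOS"))                                  # 000111000 (9 cards)
print(sum(len(code) for code in MORSE_BINARY.values()))   # 82 digits for A-Z
```

Counting the digits for A through Z gives 82 flashcards just for one pass through the alphabet.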

There are 82 binary digits (0's and 1's) in the alphabet above. That doesn't include another combination of 0s and 1s to indicate a space, and it doesn't include numbers or special characters. So, using the letters of the alphabet alone, it already takes 82 glances down the hall at your partner's flashcards to decode and transfer the alphabet.

Now, let's take a look at superposition and quantum computing. Imagine a quadrant (a quarter of a circle): you have an X axis, a Y axis, and the arc joining them, as if you cut a quarter out of a pie. On this quarter pie-chart, the top position (Y axis) is UP and the far-right position (X axis) is DOWN. You could just as well call UP 1 and DOWN 0; the name is arbitrary. Now trace along the arc of the quarter circle. This arc lies between UP and DOWN (between 1 and 0), and since we measure it in degrees, we can say we are 1 degree along the arc, or 2 degrees, 3, 4, ..., up to 90 degrees, as a quarter circle spans a 90 degree angle. We can also say 0.1 degrees, 0.2 degrees, and so on, or 0.000001, 0.000002, 0.000003, ... as finely as we like.

Now imagine we encode the alphabet as the first 0.000001, ..., 0.000026 degrees, but then we also encode entire words, such as 0.000027 = me, 0.000028 = you, ..., until we have encoded every known word. Then we encode sentences, and eventually paragraphs, and so on. We could encode every page of Wikipedia as a degree along the arc. Now, say you want your flashcard partner to convey Einstein's theory of relativity to you, and the theory is encoded as 0.0004187 degrees along the arc. Just by drawing a line from the origin of the X/Y axes at a 0.0004187 degree angle, your partner only has to show you one card.
So to convey Einstein's theory of relativity in binary (as in Morse code), it would take an average of about 4 flashcards per letter, adding up to thousands upon thousands of flashcards, thousands of "checks" down the hallway to your partner. Using a superposition as a degree between the states 0 and 1, you only need to check once. This is the promise of quantum computing. Of course, measuring the position between 0 and 1 (the superposition) gives only a probability of that position (this is the nature of quantum measurement), so the single flashcard "check" needs to be run a few times, say ten, to make sure you get the same result each time, but this is still thousands of times more efficient than the digital approach.
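The repeated-check idea can be sketched as a toy simulation. To be clear, this is a cartoon of the analogy, not real quantum mechanics (a real measurement collapses a qubit to 0 or 1); the codebook, noise level, and angles below are made-up values for illustration:

```python
import random

# Toy simulation of the angle-encoding analogy -- NOT real quantum mechanics.
# A "message" is encoded as an angle along the quarter-circle arc, and each
# "measurement" returns the angle plus some noise, so we repeat the check
# several times and average, as the text suggests.

MESSAGES = {0.000026: "Z", 0.000027: "me", 0.000028: "you"}  # hypothetical codebook

def measure(true_angle: float, noise: float = 1e-7) -> float:
    """One noisy read of the encoded angle."""
    return true_angle + random.uniform(-noise, noise)

def decode(true_angle: float, repeats: int = 10) -> float:
    """Average several measurements, then pick the nearest codebook angle."""
    estimate = sum(measure(true_angle) for _ in range(repeats)) / repeats
    return min(MESSAGES, key=lambda angle: abs(angle - estimate))

angle = 0.000027  # one flashcard's worth of information
print(MESSAGES[decode(angle)])  # "me"
```

One noisy "card" checked ten times recovers the whole word, whereas the binary flashcard scheme would need one card per digit of its encoding.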