CERN (Conseil Européen pour la Recherche Nucléaire), otherwise known as the European Organization for Nuclear Research, has launched a free online course that explains the basics of quantum computing. The first lesson aired on Friday at 10:30 a.m. CET, and subsequent lessons will air at the same time every Friday through December 18.

Quantum computing is a nascent field of computing that promises dramatic speedups for certain problems: someday quantum machines could complete calculations that existing computers cannot finish in any reasonable amount of time.

__A monumental step —__ Nobody has quite figured out exactly which applications quantum computing will be used for, but it will likely be useful in areas like optimization: problems that require weighing many variables to find the best course of action. In self-driving, for instance, where simulated driving is used to supplement real-world driving, companies could run many more simulated scenarios for machine learning in a short time frame.

Weather models are also expected to become far more accurate once quantum computing is affordable and accessible, and experts say nearly every industry could benefit from the productivity gains the technology unlocks.

Google recently said its quantum chip sampled the output of a random quantum circuit one million times in roughly three minutes, a task the company estimated would take today's most powerful supercomputers 10,000 years.

__Early days —__ Quantum bits, or qubits, promise much faster computing for certain problems because each qubit can exist in a superposition of one and zero at the same time, whereas a classical bit is always fixed at exactly one of the two values. Think of cracking a password: in principle, a quantum computer can explore many more candidate inputs at once. But the main hurdle to using quantum computing for serious applications is that qubits are extremely fragile and must be kept at temperatures near absolute zero, which has made the computers expensive and bulky. IBM's best quantum computer holds only 100 qubits, not enough to do serious processing just yet. The company hopes to have a 1,000-qubit quantum computer by 2023.
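The superposition idea above is easy to see in code. Below is a minimal classical sketch (plain Python, no quantum hardware or library involved, so it illustrates only the math, not any speedup): a qubit is a pair of amplitudes for the states |0⟩ and |1⟩, a Hadamard gate puts a |0⟩ qubit into an equal superposition, and measurement collapses it to 0 or 1 at random.

```python
import random

SQRT_HALF = 2 ** -0.5

def hadamard(state):
    """Apply the Hadamard gate: maps |0> to an equal superposition of |0> and |1>."""
    a, b = state
    return (SQRT_HALF * (a + b), SQRT_HALF * (a - b))

def measure(state, rng=random.random):
    """Collapse the qubit: return 0 with probability |a|^2, else 1 (Born rule)."""
    a, _ = state
    return 0 if rng() < abs(a) ** 2 else 1

qubit = (1.0, 0.0)        # start in the definite state |0>
qubit = hadamard(qubit)   # now both amplitudes are 1/sqrt(2): a superposition

# Repeated measurements split roughly 50/50 between 0 and 1.
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(qubit)] += 1
print(counts)
```

Simulating one qubit this way is trivial, but the amplitude list doubles with every qubit added, which is exactly why classical machines struggle to simulate larger quantum systems.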

CERN's class begins by walking viewers through the basic concepts of quantum bits, then continues with applications of quantum computing in optimization and simulation. The class will then offer an introduction to the IBM Quantum Experience, IBM's interface through which researchers can gain access to its actual quantum computers and begin experimenting.
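Hands-on access to IBM's machines goes through Qiskit, IBM's open-source Python SDK, but the underlying math of a small circuit is just matrix-vector arithmetic. As a taste of the kind of experiment the course builds up to, here is a plain-Python sketch (no Qiskit required; the gate ordering and state layout are standard, but this is an illustration, not the course's own code) of the classic two-qubit Bell-state circuit, whose measurements come out perfectly correlated:

```python
import random

SQRT_HALF = 2 ** -0.5

# Two qubits = 4 amplitudes, ordered |00>, |01>, |10>, |11>.
state = [1.0, 0.0, 0.0, 0.0]   # start in |00>

# Hadamard on the first qubit: mixes |00> with |10>, and |01> with |11>.
h0 = [SQRT_HALF * (state[0] + state[2]),
      SQRT_HALF * (state[1] + state[3]),
      SQRT_HALF * (state[0] - state[2]),
      SQRT_HALF * (state[1] - state[3])]

# CNOT (first qubit controls the second): swaps the |10> and |11> amplitudes.
bell = [h0[0], h0[1], h0[3], h0[2]]

def measure(amplitudes, rng=random.random):
    """Sample one outcome index with probability |amplitude|^2 (Born rule)."""
    r, total = rng(), 0.0
    for outcome, a in enumerate(amplitudes):
        total += abs(a) ** 2
        if r < total:
            return outcome
    return len(amplitudes) - 1

# Outcomes land only on |00> (index 0) or |11> (index 3): entanglement.
counts = {}
for _ in range(10_000):
    outcome = measure(bell)
    counts[outcome] = counts.get(outcome, 0) + 1
print(counts)
```

On the IBM Quantum Experience, the same circuit runs on real superconducting hardware, where noise makes the correlations slightly imperfect rather than exact.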

You don't need any quantum computing knowledge to take the course, but basic linear algebra and familiarity with the Python programming language are helpful. Recordings will be posted in the event you can't tune in live.