Quantum computing is an idea whose time has come. It is radically new and fundamentally different from the classical computers we're used to. Quantum computers process information with microscopic physical systems, such as individual atoms, superconducting circuits, or particles of light. These systems don't follow the classical laws that govern the macroscopic universe; they follow the laws of quantum physics. Quantum computers harness uniquely quantum behaviour, such as superposition, entanglement, and interference, and apply it to computing, which introduces concepts entirely new to traditional programming.

Like traditional computers, quantum computers use tiny circuits to perform calculations. But instead of evaluating possibilities one after another, a quantum computer manipulates many possibilities in superposition at once, and a well-designed algorithm uses interference to make the correct answers the most likely ones to be read out. Regular computers process information in units called bits, which can represent one of two possible states, 0 or 1, corresponding to whether a tiny switch on the chip is open or closed. Before a traditional computer moves on to process the next piece of information, it must have assigned the previous piece a definite value. By contrast, thanks to the probabilistic nature of quantum mechanics, the qubits in a quantum computer don't have to take a definite value until they are measured at the end of the calculation. A register of 4 qubits is described by 2^4 = 16 amplitudes, sixteen times as many values as 4 classical bits can hold, and that state space doubles with every qubit added. That is why a quantum computer can, in principle, process exponentially more information than a classical one; the first sketch below makes this scaling concrete.

Quantum computers are built from qubits, and the race to add more of them is on. The current record for qubits in a commercially available quantum computer is 1,180, achieved by California startup Atom Computing in October 2023, more than double the previous record of 433 set by IBM in November 2022. The race for quantum advantage and fault-tolerant quantum computing is accelerating. In the next 12-24 months we expect to see demonstrations of quantum advantage over classical systems on useful problems. Between now and 2029/2030, the industry's progress on chip scalability, algorithm development, and quantum-classical integration will dictate the commercial timelines of useful systems.

While there are many applications in materials science, weather forecasting, drug discovery, and optimisation, the two most critical areas of impact are cybersecurity and AI training. If a stable 100k-qubit machine is achieved by 2030, current cryptocurrencies will lose their value, as quantum computing will be able to compromise their keys; and not only crypto, but most of today's industry security networks rest on the same public-key cryptography (the second sketch below walks through the number theory behind the attack). Quantum computing could also make large data centers redundant and reduce the energy needs of AI training: for example, it is generally more efficient to use ~100 qubits than it is to use ~10^18 classical bits. Another potential opportunity is the ability of QPUs to train models on much smaller data sets, particularly where data scarcity prevents AI models from producing good results.
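To make the bit-versus-qubit scaling concrete, here is a minimal sketch in plain NumPy, a toy simulator written for this article rather than a real quantum runtime or any vendor's API: an n-qubit register is described by 2^n complex amplitudes, and reading it out returns a single probabilistic outcome.

```python
# Toy state-vector simulator (illustrative only, not a real quantum runtime).
# An n-qubit register needs 2**n complex amplitudes; measurement samples one
# basis state with probability |amplitude|**2.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate: |0> -> (|0>+|1>)/sqrt(2)

def uniform_superposition(n_qubits):
    """Start in |00...0> and apply a Hadamard to every qubit."""
    state = np.zeros(2 ** n_qubits, dtype=complex)
    state[0] = 1.0                        # the all-zeros basis state
    op = H
    for _ in range(n_qubits - 1):
        op = np.kron(op, H)               # H on each qubit: H (x) H (x) ... (x) H
    return op @ state

for n in (1, 2, 4):
    print(f"{n} qubit(s): {len(uniform_superposition(n))} amplitudes")  # 2, 4, 16

# "Measurement": sample one basis state, weighted by |amplitude|**2
# (uniform over all 16 outcomes here). Only this single outcome is observed.
state = uniform_superposition(4)
probs = np.abs(state) ** 2
outcome = int(np.random.choice(len(state), p=probs))
print(f"measured basis state: {outcome:04b}")
```

Each added qubit doubles the amplitude count, which is exactly why a full classical simulation of even a few dozen qubits already strains the memory of the largest supercomputers.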
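The cybersecurity claim rests on Shor's algorithm. Below is a toy, purely classical walk-through of the number theory involved; the modulus N, the base a, and the brute-force period search are illustrative, and the exponential quantum speedup (finding the period via the quantum Fourier transform) is precisely the part this sketch does not attempt. Once the period of a^x mod N is known, the factors of N, and hence an RSA-style private key, fall out from two gcds.

```python
# Toy classical demo of the number theory behind Shor's algorithm.
# A quantum computer would find the period r exponentially faster; the
# brute-force search below is the bottleneck that quantum hardware removes.
from math import gcd

N, a = 15, 7                      # tiny demo modulus and a base coprime to N

# Find the period r of a^x mod N (classically this scales terribly).
r = 1
while pow(a, r, N) != 1:
    r += 1

# With an even period (and a^(r/2) != -1 mod N), two gcds reveal the factors.
if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
    p = gcd(pow(a, r // 2) - 1, N)
    q = gcd(pow(a, r // 2) + 1, N)
    print(f"period r = {r}, factors of {N}: {p} x {q}")   # prints 3 x 5
```

Factoring 15 is trivial; the point is that the same recipe applied to a 2048-bit RSA modulus becomes feasible once the period-finding step runs on a large, error-corrected quantum machine.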
Summary: Niels Bohr once said, "Those who are not shocked when they first come across quantum theory cannot possibly have understood it." We believe quantum computing might change the world faster than the combined impact of the telecom revolution of the 2000s and the current AI boom. The development curve in quantum computing will be far sharper than what we are seeing in AI, and the impact far more widespread. The key remaining technical challenge on this journey is scaling the number of stable, low-error physical qubits (which sit on the quantum chip) far enough to produce error-corrected logical qubits; the sketch below gives a classical intuition for how that redundancy buys reliability. Once the industry solves this problem, we may be looking at answers to many questions that have so far gone unresolved in human history. At the same time, it will pose hard questions for current asset valuations, from crypto to AI-led names.
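As a hedged classical analogy for the physical-to-logical qubit trade (real quantum error correction, such as the surface code, is far more involved and must also handle phase errors): encoding one logical bit into three noisy physical bits and decoding by majority vote drops the error rate from p to 3p^2 - 2p^3, a win whenever p < 0.5. Scaling quantum hardware means winning this trade at much larger physical-to-logical ratios.

```python
# Classical 3-bit repetition code: spend three noisy physical bits to get
# one more reliable logical bit. Real quantum codes are far more complex,
# but the scaling intuition is the same.
import random

def noisy_copy(bit, p):
    """Send a bit through a channel that flips it with probability p."""
    return bit ^ (random.random() < p)

def logical_error_rate(p, trials=100_000):
    """Encode 0 into three physical bits, decode by majority vote."""
    errors = 0
    for _ in range(trials):
        physical = [noisy_copy(0, p) for _ in range(3)]
        decoded = int(sum(physical) >= 2)   # majority vote
        errors += decoded != 0
    return errors / trials

for p in (0.1, 0.01):
    print(f"physical error rate {p}: logical ~ {logical_error_rate(p):.5f} "
          f"(theory 3p^2 - 2p^3 = {3 * p**2 - 2 * p**3:.5f})")
```

At a 1% physical error rate the logical error rate falls to roughly 0.03%, and stacking more redundancy suppresses it further; that, in essence, is the engineering race the quantum industry now has to win.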