Artificial Intelligence (AI) is one of the most popular areas of computer science and engineering. AI attempts to reproduce what the human brain does: analysing, understanding, responding, learning and finding solutions. Since it is a computer that performs this reproduction, it is software that provides the intelligence. Such intelligent software can find solutions on its own through big-data analysis, recognise speech, translate languages, manage mail, improve computer games, or analyse the behaviour of online browsers for customer-specific marketing. Banks use AI to predict changes in exchange rates or stock-market movements, the automotive industry is working to deploy AI in autonomous cars that drive without human drivers, and smartphones use AI to recognise faces, voices, etc. The prospect of superintelligence is being discussed, as are the risks of AI. Big players such as Google, Amazon, Baidu and Microsoft are investing billions in AI, and the AI labour market is growing at high speed [1].
Health is not outside this “game”, and AI is expected to have a massive impact on currently established methods and therapeutic practices. This applies to diagnostics, medical therapy, new drug development, personalised treatment, and overall health and prevention management, including gene editing [2].
How can computers learn?
Machine learning – algorithms can learn to find connections and patterns related to diseases much as a doctor does. It is crucial for the computer to have as many specific examples as possible – from several hundred thousand to millions of pieces of data to learn from. And, of course, this information must be digitised so that the software can process it. Machine learning is therefore particularly helpful in areas where diagnostic information is already digitised [3].
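The idea above – a program that is never given a diagnostic rule but infers one from labelled examples – can be sketched in a few lines. This is a minimal nearest-centroid classifier with invented feature values and labels, purely for illustration; real medical machine learning uses far larger datasets and far more sophisticated models.

```python
# Minimal sketch of supervised learning: infer a diagnostic rule from
# labelled examples instead of programming it explicitly.
# All data below is hypothetical, for illustration only.

def train_centroids(examples):
    """Average the feature vectors of each class (nearest-centroid model)."""
    sums, counts = {}, {}
    for features, label in examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: [x / counts[label] for x in acc]
            for label, acc in sums.items()}

def predict(centroids, features):
    """Assign the class whose centroid is closest (squared Euclidean distance)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(features, c))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Hypothetical digitised records: (temperature in °C, white-cell count), diagnosis.
training = [
    ((38.9, 14.2), "infection"),
    ((39.4, 15.1), "infection"),
    ((36.6, 6.0), "healthy"),
    ((36.8, 7.1), "healthy"),
]
model = train_centroids(training)
print(predict(model, (39.0, 13.5)))  # → infection
```

The model "learns" only in the sense that its centroids are computed from the examples; with more digitised examples, the averages become more representative, which is why data volume matters so much.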
However, big data brings with it a new problem: the sheer volume makes processing slow. This is where quantum computers come into the discussion – machines that exploit quantum effects to process many possibilities at once, delivering an answer at the moment of measurement. Quantum computers are changing the paradigm of contemporary computer science and are giving us a new level of understanding of reality.
Today’s computers use bits—a stream of electrical or optical pulses representing 1s or 0s. Everything from your tweets and emails to your iTunes songs and YouTube videos is essentially a long string of these binary digits.
Quantum computers, on the other hand, use qubits, which are typically realised in subatomic particles such as electrons or photons. Generating and managing qubits is a scientific and engineering challenge. Qubits have some quirky quantum properties that mean a connected group of them can provide far more processing power than the same number of binary bits. One of those properties is known as superposition and another is called entanglement.
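Both properties can be illustrated with a toy simulation. In this sketch a qubit is just a pair of complex amplitudes; the Hadamard gate puts a definite 0 into an equal superposition, and a CNOT gate then entangles two qubits into a Bell state, where measuring one qubit fixes the other. This is a classical simulation of the mathematics, not a quantum computer, and the gate set shown is the standard textbook one.

```python
import math

def hadamard(state):
    """Hadamard gate on one qubit: (a, b) are the amplitudes of |0> and |1>."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement probabilities are the squared magnitudes of the amplitudes."""
    return tuple(abs(amp) ** 2 for amp in state)

# Superposition: a qubit that starts definitely in |0>...
qubit = (1 + 0j, 0 + 0j)
qubit = hadamard(qubit)
print(probabilities(qubit))  # ≈ (0.5, 0.5): either outcome is equally likely

def cnot(state4):
    """CNOT on a 2-qubit state (|00>, |01>, |10>, |11>): flips the second
    qubit when the first is 1, i.e. swaps the |10> and |11> amplitudes."""
    a00, a01, a10, a11 = state4
    return (a00, a01, a11, a10)

# Entanglement: Hadamard on the first qubit of |00> gives amplitudes
# (s, 0, s, 0); CNOT then produces the Bell state (s, 0, 0, s).
s = 1 / math.sqrt(2)
bell = cnot((s, 0, s, 0))
print([round(abs(a) ** 2, 2) for a in bell])  # [0.5, 0.0, 0.0, 0.5]
```

The Bell state's probabilities show why entanglement is powerful: only the outcomes 00 and 11 are possible, so the two qubits are perfectly correlated even though neither has a definite value before measurement.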