Introduction to Quantum Computing
"Big O Notation is used to classify algorithms according to how their run time or space requirements grow as the input size grows."
In other words, Big O notation lets us compare how the resources (runtime or memory) required by an algorithm scale with the input size.
To reason about this, we turn to complexity theory: the study of the computational effort required to run an algorithm.
The image below shows examples of common growth behaviours of the runtime N as a function of the input size n.
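As a rough textual complement to the figure (a sketch added here, not part of the original material), the following Python snippet tabulates a few typical growth classes to show how differently they behave as n increases; the exact set of curves in the figure may differ:

```python
# Tabulate a few common runtime growth classes (typical examples only;
# the original figure may show a different selection).
import math

print(f"{'n':>6} {'log n':>8} {'n':>8} {'n^2':>10} {'2^n':>22}")
for n in (1, 2, 4, 8, 16, 32, 64):
    print(f"{n:>6} {math.log2(n):>8.1f} {n:>8} {n**2:>10} {2**n:>22}")
```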
For example, the computational effort of addition grows linearly with the number of digits, while the effort of schoolbook multiplication grows with the square of the number of digits. Such problems can be solved in polynomial time.
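As a small illustration of my own (not code from the original), we can count the elementary digit operations for schoolbook addition and multiplication to make the linear vs. quadratic growth concrete:

```python
# Count elementary digit operations for schoolbook arithmetic on d-digit numbers.
# Addition processes each digit position once (linear); schoolbook multiplication
# pairs every digit of one operand with every digit of the other (quadratic).

def addition_ops(d: int) -> int:
    return d          # one add-with-carry step per digit position

def multiplication_ops(d: int) -> int:
    return d * d      # one single-digit product per pair of positions

for d in (1, 10, 100, 1000):
    print(f"{d:>5} digits: addition ~ {addition_ops(d):>7} ops, "
          f"multiplication ~ {multiplication_ops(d):>9} ops")
```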
In contrast, the computational effort to find the prime factors of a number grows exponentially with the number of digits; no polynomial-time classical factoring algorithm is known.
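To make this concrete, here is a minimal sketch using naive trial division (one possible classical factoring approach, chosen here only for illustration): the number of candidate divisors grows roughly like 10^(d/2) for a d-digit number, i.e. exponentially in the number of digits.

```python
# Naive trial-division factoring. In the worst case (n prime, or a product of
# two similarly sized primes) the loop tries ~sqrt(n) ≈ 10^(d/2) candidates
# for a d-digit number n — exponential in the number of digits d.
import math

def trial_division(n: int) -> list[int]:
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(trial_division(1001))              # 1001 = 7 * 11 * 13
for digits in (4, 8, 16, 32):
    bound = math.isqrt(10**digits)       # worst-case number of trial divisors
    print(f"{digits}-digit number: up to ~{bound} trial divisions")
```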
References:
- Big O Notation: https://en.wikipedia.org/wiki/Big_O_notation
- Hands-on Quantum Machine Learning with Python: https://www.pyqml.com/
- Basic Qiskit Syntax: https://qiskit.org/textbook/ch-appendix/qiskit.html
- Learn in Qiskit: https://qiskit.org/learn/
- Qiskit: https://qiskit.org
- Qiskit Documentation: https://qiskit.org/documentation/index.html