Big O notation is a tool for describing the time complexity of algorithms.
Rather than measuring the actual time an algorithm takes, it describes how the running time grows as the input size grows.
In other words, it expresses the worst-case time complexity of an algorithm: an upper bound on its runtime.
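As a minimal sketch (the function names here are illustrative, not from any particular library), the two functions below scale differently with input size: the first runs in O(1) regardless of how long the list is, while the second runs in O(n) because its loop executes once per element.

```python
def first_element(items):
    # O(1): a single index access, independent of len(items)
    return items[0]

def linear_sum(items):
    # O(n): the loop body runs once per element
    total = 0
    for x in items:
        total += x
    return total
```

Doubling the length of `items` leaves the cost of `first_element` unchanged but roughly doubles the cost of `linear_sum`; Big O captures exactly this growth behavior.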
Performance analysis of an algorithm depends on two factors: the amount of memory it uses and the amount of CPU time it consumes.
Formally, these are expressed as two complexities: space complexity and time complexity.
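To illustrate how the two complexities can trade off against each other (a sketch under the assumption of a simple duplicate-detection task, not an example from the original text), the same problem is solved below in two ways: one cheap in space but expensive in time, the other the reverse.

```python
def has_duplicate_quadratic(items):
    # O(n^2) time, O(1) extra space: compare every pair of elements
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_linear(items):
    # O(n) time, O(n) extra space: remember every element seen so far
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False
```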
Problem complexity (lower bounds)
The complexity of a problem is defined as the complexity of the best algorithm that solves it.
Thus the complexity of a problem is not greater than the complexity of any algorithm that solves the problem.
It follows that the complexity of any algorithm, expressed in big O notation, is also an upper bound on the complexity of the corresponding problem.
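For example (a sketch assuming standard linear search), the function below solves the membership problem for an unsorted list in O(n) time. Its existence alone proves that the problem's complexity is at most O(n), even though faster algorithms exist for structured inputs, such as O(log n) binary search on sorted data.

```python
def contains(items, target):
    # Linear search: O(n) time in the worst case.
    # Because this algorithm solves the membership problem,
    # O(n) is an upper bound on that problem's complexity.
    for x in items:
        if x == target:
            return True
    return False
```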