...to determine the costs of the operations I perform code-wise. Is learning asymptotic notation enough?
Asymptotic notation usually deals with the number of operations, not the time each operation takes.
...so what do you recommend?
What do you want exactly? To know how long an operation like "+" or "*" takes on your CPU?
To be able to mathematically analyze designs I come up with, which will enable me to make comparisons before implementation
...so I can be sure I meet requirements
For general analysis, big-O notation may be fine. But the whole point of the notation is to ignore the time each operation takes and the exact number of operations in an algorithm. It's more high-level than that.
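For illustration, here's a minimal sketch of the kind of design comparison big-O supports before any code is benchmarked (the duplicate-check task and function names are hypothetical, not from this thread):

```python
# Two ways to check a list for duplicates. Big-O alone predicts which
# design scales better, without timing individual operations.

def has_duplicates_quadratic(items):
    # Compares every pair: roughly N*(N-1)/2 comparisons, so O(N^2).
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_linear(items):
    # One pass with a hash set: O(N) on average.
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```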
I think I get your point. Ultimately, the time is machine- and/or OS-dependent. I think big-O and some research will do the job, then.
The only real way to know is to benchmark on realistic inputs. Other factors, like memory access, affect the overall time too, and big-O notation doesn't take them into consideration. It also ignores constant factors, because it assumes N approaches infinity. In reality, N and 2N operations make a huge difference, even though both are O(N).
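A minimal sketch of that benchmarking point (the functions and input size here are hypothetical), using Python's timeit: both loops are O(N), but the one doing roughly twice the work per element is measurably slower on a realistic input.

```python
import timeit

def sum_once(data):
    total = 0
    for x in data:       # ~N additions
        total += x
    return total

def sum_twice(data):
    total = 0
    for x in data:       # ~2N additions: same O(N), bigger constant factor
        total += x
        total += 0
    return total

data = list(range(1_000_000))
print("sum_once: ", timeit.timeit(lambda: sum_once(data), number=10))
print("sum_twice:", timeit.timeit(lambda: sum_twice(data), number=10))
```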
I'll come back to this when I can relate