
🖖 (question author):
I want to be able to use a mathematical approach to determine the cost of the operations my code performs. Is learning asymptotic notation enough?

9 replies · 16 views

Ihor:
Asymptotic notation usually deals with the number of operations, not the time each operation takes.
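As a minimal sketch of what "counting operations" means here (Python; the function name and the explicit comparison counter are illustrative, not something from the thread):

```python
def linear_search(items, target):
    """Return (index, comparisons); the comparison count is what big-O describes."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1  # one unit of "work" in the asymptotic model
        if value == target:
            return i, comparisons
    return -1, comparisons

# Worst case (target absent): exactly len(items) comparisons, i.e. O(n).
# How many nanoseconds each comparison takes is a separate,
# machine-dependent question that the notation deliberately ignores.
print(linear_search(list(range(1000)), -1))  # (-1, 1000)
```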

🖖 (question author):
...so what do you recommend?

Ihor:
What exactly do you want? To know how long operations like "+" or "*" take on your CPU?

🖖 (question author):
To be able to mathematically analyze designs I come up with, which will enable me to make comparisons before implementation... so I can be sure I meet requirements.

Ihor:
For general analysis, big-O notation may be fine. But the whole point of the notation is to ignore the time each operation takes and the exact number of operations in an algorithm; it's more high-level than that.
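To make that concrete, here is a hypothetical illustration (the functions below are mine, not from the thread): big-O keeps only the growth rate, so per-iteration cost and exact operation counts disappear.

```python
def cheap_pass(xs):
    total = 0
    for x in xs:
        total += x            # ~1 operation per element -> O(n)
    return total

def pricier_pass(xs):
    total = 0
    for x in xs:
        total += x * x + 1    # ~3 operations per element -> still O(n)
    return total

def all_pairs(xs):
    total = 0
    for x in xs:
        for y in xs:          # n inner iterations per element -> O(n^2)
            total += x * y
    return total
```

Big-O only tells you that all_pairs will eventually lose to either single-pass version as n grows; it says nothing about whether cheap_pass beats pricier_pass on your hardware.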

🖖 (question author):
I think I get your point. Ultimately, the time is machine- and/or OS-dependent. I think big-O and some research will do the job, then.

Ihor:
The only real way to know is to benchmark on realistic inputs. Other factors, such as memory access, affect overall time too, and big-O notation doesn't take them into consideration. It also ignores constant factors, because it assumes N approaches infinity; in reality, N operations versus 2N operations make a huge difference, even though both are O(N).
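A minimal benchmarking sketch with Python's standard timeit module, assuming the point being illustrated is that big-O hides constant factors (the input size and repeat count are arbitrary choices, not from the thread):

```python
import timeit

xs = list(range(1_000_000))

# Both measurements are O(N) per call, but the second does two passes (2N).
one_pass = timeit.timeit(lambda: sum(xs), number=20)
two_passes = timeit.timeit(lambda: (sum(xs), sum(xs)), number=20)

print(f"one pass:   {one_pass:.3f}s")
print(f"two passes: {two_passes:.3f}s  # roughly 2x, despite identical big-O")
```

On real hardware a benchmark like this also picks up effects big-O ignores entirely, such as cache-friendly versus cache-hostile memory access patterns.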

🖖 (question author):
I'll come back to this when I can relate.
