is not very memory-intensive. (Correct me if I'm wrong!)
That's because it reads the file backward from EOF until it has seen $someBigNumber "\n" characters. Then it prints the lines in the forward direction, reading them directly from the file line by line, without having to hold them all in memory.
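A minimal sketch of that backward scan, in Python rather than tail's actual C (so the function name and block size here are just illustrative assumptions): it walks back from EOF in fixed-size blocks only to find the offset where the last n lines begin, then streams forward from there, so memory stays bounded by the block size rather than by n or the file size.

```python
import os
import sys


def tail_file(path, n, block_size=8192):
    """Print the last n lines of a seekable file (hypothetical sketch).

    Scans backward from EOF counting newline bytes to find where the
    last n lines start, then seeks there and streams forward.  Memory
    use is bounded by block_size, not by n or the file size."""
    with open(path, "rb") as f:
        f.seek(0, os.SEEK_END)
        pos = f.tell()
        if pos == 0:
            return  # empty file, nothing to print
        # If the file doesn't end with '\n', count an implicit final
        # newline so the trailing partial line counts as a line too.
        f.seek(pos - 1)
        newlines = 0 if f.read(1) == b"\n" else 1
        start = 0
        while pos > 0 and newlines <= n:
            read_size = min(block_size, pos)
            pos -= read_size
            f.seek(pos)
            chunk = f.read(read_size)
            # Scan the chunk from its end toward its start.
            for i in range(len(chunk) - 1, -1, -1):
                if chunk[i] == 0x0A:  # b'\n'
                    newlines += 1
                    if newlines > n:
                        # The (n+1)-th newline from the end sits just
                        # before the first of the last n lines.
                        start = pos + i + 1
                        break
        # Stream forward from the found offset, line by line.
        f.seek(start)
        for line in f:
            sys.stdout.buffer.write(line)
```

Calling something like `tail_file("big.log", 1000)` only touches the last few blocks of the file, which is why `tail -n $someBigNumber bigfile` stays cheap even for huge files and huge values of $someBigNumber.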
But I think that when I run ./getSomeBigStream | tail -n $someBigNumber, it needs a large amount of memory, proportional to $someBigNumber, because tail cannot know when it will reach EOF, so it always has to keep the last $someBigNumber lines in memory.
Am I right?
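The pipe case can be sketched the same way (again in Python, with a deque standing in for whatever buffer GNU tail actually uses internally): since the stream is not seekable, the only option is to keep the most recent n lines around until EOF finally arrives, so the working set grows with n and with line length.

```python
import sys
from collections import deque


def tail_stream(stream, n):
    """Print the last n lines of a non-seekable stream such as a pipe
    (hypothetical sketch).  There is no way to know in advance where
    EOF is, so the most recent n lines must stay in memory throughout."""
    # A deque with maxlen=n silently drops the oldest line once more
    # than n lines have been read: memory is O(n lines), not O(stream).
    last_lines = deque(stream, maxlen=n)
    sys.stdout.writelines(last_lines)


if __name__ == "__main__":
    # Roughly what ./getSomeBigStream | tail -n "$someBigNumber" has to do;
    # 1000 here is just a placeholder for $someBigNumber.
    tail_stream(sys.stdin, 1000)
```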
You can check tail's source to see how it works.