I have been studying Big O notation, watching videos and reading books about it. I understand its purpose, but I want to make sure I have grasped the whole concept, because I might have gotten the wrong idea, and I am hoping someone can clarify it with me.

Let's start: I know Big O notation covers different time complexities:

First, we have constant time, O(1), which is when the code contains no for loops and prints the output instantly, regardless of input size, correct?
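To show what I mean, here is a minimal Python sketch of what I understand to be constant time (the function name is just my own example):

```python
def get_first(items):
    # O(1): a single indexing step, no loops;
    # the cost does not depend on how long the list is.
    return items[0]
```

If this runs the same number of steps for a list of 10 items or 10 million, that would be O(1), right?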

Second, we have O(N), which is when the code contains one for loop and the running time grows in proportion to the input size, correct?
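Again, a small Python sketch of what I picture as linear time (my own example function):

```python
def total(items):
    # O(N): one for loop that touches each element exactly once,
    # so the work grows in direct proportion to the input size.
    result = 0
    for x in items:
        result += x
    return result
```

Doubling the length of the list should roughly double the number of steps here, if I understand correctly.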

Third, we have O(N^2), which comes from nested for loops, and the complexity keeps increasing the deeper the nesting goes.
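Here is what I believe a quadratic example looks like in Python (two nested loops over the same input; the function name is mine):

```python
def has_duplicate(items):
    # O(N^2): for each element, the inner loop scans the rest
    # of the list, so every pair of elements gets compared.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False
```

With three nested loops it would become O(N^3), and so on, which is what I mean by "the deeper we go, the more it increases."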

Finally, we have logarithmic time, O(log N), which is used for things like binary search or binary search trees, correct? <-- an example would be awesome if possible
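This is the part I am least sure about, so here is my attempt at a logarithmic example in Python: a binary search on a sorted list (not a tree, but I believe the idea of halving is the same):

```python
def binary_search(sorted_items, target):
    # O(log N): each comparison discards half of the remaining
    # range, so at most about log2(N) iterations are needed.
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid        # index of the target
        if sorted_items[mid] < target:
            lo = mid + 1      # target is in the right half
        else:
            hi = mid - 1      # target is in the left half
    return -1                 # target not present
```

Is it correct that this is O(log N) because the search range halves on every step, the same way descending one level in a balanced binary search tree halves the remaining nodes?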

Thanks in advance, I just want to have the concept solidified