O(·) is not a function, so how can a function be equal to it?
I totally understand what big $O$ notation means. My issue is when we say $T(n)=O(f(n))$, where $T(n)$ is the running time of an algorithm on an input of size $n$.
I understand the semantics of it. But $T(n)$ and $O(f(n))$ are two different things.
$T(n)$ is an exact number, but $O(f(n))$ is not a function that spits out a number, so technically we can't say $T(n)$ equals $O(f(n))$. If someone asks you what the value of $O(f(n))$ is, what would your answer be? There is no answer.
complexity-theory asymptotics notation
asked by Mediocre
6 Answers
Answer by Vincenzo (27 votes)
Strictly speaking, $O(f(n))$ is a set of functions. So the value of $O(f(n))$ is simply the set of all functions that grow asymptotically no faster than $f(n)$. The notation $T(n) = O(f(n))$ is just a simplified way to write $T(n) \in O(f(n))$.
Note that this also clarifies some issues of the $O$ notation as it is normally used.
For example, $n^2 \in O(n^3)$ but $n^3 \notin O(n^2)$. So you can write $O(n^2) = O(n^3)$, but you cannot write $O(n^3) = O(n^2)$, even though one normally expects $=$ to represent an equivalence relation (in particular, a symmetric one).
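For concreteness, here is one common way to write the set out explicitly, together with explicit constants witnessing the membership $n^2 \in O(n^3)$ (the particular choice $c = 1$, $n_0 = 1$ is just one valid witness, and textbook definitions differ slightly in details such as the absolute value):
$$O(f) = \{\, g : \mathbb{N}\to\mathbb{R} \mid \exists\, c > 0,\ \exists\, n_0,\ \forall n \ge n_0 : |g(n)| \le c\, f(n) \,\}$$
$$n^2 \le 1 \cdot n^3 \ \text{ for all } n \ge 1, \quad\text{hence}\quad n^2 \in O(n^3).$$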
Using $=O(f(n))$ is not simplified notation, it's an abuse of notation. And while it might be convenient for some uses, it's still confusing when you think about it.
– einpoklum
13 hours ago
Thanks for clarifying the last paragraph. Having said that, I don't think anyone would ever write "$O(n^2)=O(n^3)$", so criticizing "$=O(...)$" on those grounds is something of a straw man.
– David Richerby
13 hours ago
That example may be too simple to appear in practice; however, in the CS literature you will very easily find calculations such as $O(n \log n + n^2) = O(n^2)$, which apply the same principle to conclude that a particular phase of an algorithm, taking time e.g. $O(n \log n)$, is dominated by another phase of cost $O(n^2)$.
– Vincenzo
11 hours ago
Your $T(n)\in O(f(n))$ could also be considered (mild) abuse of notation, since the function is $T$ or $n\mapsto T(n)$, so it would be more correct to write $T\in O(f(n))$ or $(n\mapsto T(n))\in O(f(n))$. But in many branches of mathematics $f(x)$ is used both for the function $f$ in general, and for the value of $f$ at the particular point $x$, so that should lead to little risk of confusion (unless you are a compiler or something).
– Jeppe Stig Nielsen
11 hours ago
@Vincenzo in that example both sets of functions are actually equal.
– Paŭlo Ebermann
5 hours ago
Answer by leftaroundabout (14 votes)
$O$ is a function
$$\begin{align}
O : (\mathbb{N}\to \mathbb{R}) &\to \mathbf{P}(\mathbb{N}\to \mathbb{R})
\\ f &\mapsto O(f)
\end{align}$$
i.e., it accepts a function $f$ and yields the set of functions that share the asymptotic bound of (at most) $f$. Strictly speaking, the correct notation is thus
$$
(n \mapsto T(n)) \in O(n\mapsto f(n))
$$
or, for short,
$$
T \in O(f)
$$
but it is customary in maths, science and CS to just use a variable somewhere in the expression to denote that you are considering functions of the argument $n$ on both sides. So $T(n) \in O(f(n))$ is quite fine as well. $T(n) = O(f(n))$ is pretty much wrong, though, as you suspected. It is very commonly used nonetheless, so definitely keep in mind what people mean when they write this.
I would advise against ever writing $T(n) = O(f(n))$, but opinions differ.
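As an aside, this view of $O$ as a higher-order map can be modelled in a few lines of code. The sketch below is only an illustration under stated assumptions: the caller supplies candidate witnesses c and n0, and the bound is checked on a finite sample of inputs, so it can suggest membership but never prove an asymptotic statement; the names big_O and in_O_cubic are invented for this example.

    def big_O(f):
        """Model O(f) as a membership test on functions g: given candidate
        witnesses c and n0, check |g(n)| <= c * f(n) on a finite sample.
        This is a sanity check, not a proof of an asymptotic bound."""
        def contains(g, c, n0, sample_up_to=10_000):
            return all(abs(g(n)) <= c * f(n) for n in range(n0, sample_up_to))
        return contains

    # n^2 is in O(n^3): the witnesses c = 1, n0 = 1 pass the sampled check.
    in_O_cubic = big_O(lambda n: n ** 3)
    print(in_O_cubic(lambda n: n ** 2, c=1, n0=1))  # True
    print(in_O_cubic(lambda n: n ** 4, c=1, n0=1))  # False: the bound already fails at n = 2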
$T(n)=O(f(n))$ is a completely standard use of notation so claiming that it's wrong is unhelpful. (As, IMO, is claiming that $O$ is a function; that's technically true, but it's not really a helpful way to think about it.)
– David Richerby
15 hours ago
@DavidRicherby some things are completely standard but shouldn't be. $T(n) = O(f(n))$ is one example. Sure, it's still good to know what people mean by this (as the OP does already), but how is it not helpful to confirm that this notation is technically speaking bogus? Why would you use it? Even if the $=$ version isn't ambiguous, neither is the $\in$ one, and the more people switch to that notation the better. It's always better to stick to what actually makes sense mathematically, unless it's much more awkward to write. $\in$ is perfectly readable and easy to write.
– leftaroundabout
15 hours ago
I like your answer, +1. I'd like to suggest, though, that you clarify in the last sentence that although it's wrong, everyone uses it to mean "in" instead of "equal to", unfortunately.
– Pedro A
14 hours ago
Comments are not for extended discussion; this conversation has been moved to chat.
– D.W.♦
8 mins ago
Answer by David Richerby (3 votes)
Formally speaking, $O(f(n))$ is the set of functions $g$ such that $g(n)\leq k\,f(n)$ for some constant $k$ and all large enough $n$. Thus, the most pedantically accurate way of writing it would be $T(n)\in O(f(n))$. However, using $=$ instead of $\in$ is completely standard, and $T(n)=O(f(n))$ just means $T(n)\in O(f(n))$. This is essentially never ambiguous because we almost never manipulate the set $O(f(n))$.
In a sense, using equality makes $O(f(n))$ mean "some function $g$ such that $g(n)\leq k\,f(n)$ for all large enough $n$", and this means that you can write things like $f(n) = 3n + O(\log n)$. Note that this is much more precise than, e.g., $f(n)=\Theta(n)$ or $f(n)=O(n+\log n)$.
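To spell this reading out on the example just given, $f(n) = 3n + O(\log n)$ abbreviates the statement
$$\exists\, g,\ \exists\, k > 0,\ \exists\, n_0 \ \text{ such that } \ f(n) = 3n + g(n) \ \text{ and } \ |g(n)| \le k \log n \ \text{ for all } n \ge n_0.$$
For instance (an illustrative $f$, not one taken from the answer), $f(n) = 3n + 2\log n + 7$ satisfies it with $g(n) = 2\log n + 7$: since $\log n \ge 1$ for all $n \ge 10$ in any of the usual bases, $|g(n)| \le 9\log n$ for all $n \ge 10$, so $k = 9$ and $n_0 = 10$ are valid witnesses.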
You could also write $f(n) - 3n \in O(\log n)$. Though I admit that it can be handy to conclude a multiple-step computation with $f(n) = \ldots = 3n + O(\log n)$.
– leftaroundabout
14 hours ago
The rearrangement only works in standalone statements. It's much more common in the middle of calculations, where that kind of thing doesn't work, and where multiple functions get absorbed together into the Landau notation. (Stuff like $f(x) = e^{-x}(e^{2x}+O(x)) = e^x+o(1)$).
– David Richerby
14 hours ago
I find computations like that jarring; those equals signs aren't bidirectional anymore. I'm not sure there's more of a problem with writing $f(x) \in e^{-x}(e^{2x}+O(x)) \subset e^x + o(1)$. I suppose that's also abuse of notation; basically you're overloading the $=$ operator, whereas I prefer to lift $+$ and $\cdot$ to also operate on sets.
– leftaroundabout
14 hours ago
Answer by Mario Cervera (3 votes)
In The Algorithm Design Manual [1], you can find a paragraph about this issue:
The Big Oh notation [including $O$, $\Omega$ and $\Theta$] provides for a rough notion of equality when comparing functions. It is somewhat jarring to see an expression like $n^2 = O(n^3)$, but its meaning can always be resolved by going back to the definitions in terms of upper and lower bounds. It is perhaps most instructive to read the "=" here as meaning "one of the functions that are". Clearly, $n^2$ is one of the functions that are $O(n^3)$.
Strictly speaking (as noted in David Richerby's comment), $\Theta$ gives you a rough notion of equality, $O$ a rough notion of less-than-or-equal-to, and $\Omega$ a rough notion of greater-than-or-equal-to.
Nonetheless, I agree with Vincenzo's answer: you can simply interpret $O(f(n))$ as a set of functions and the $=$ symbol as the set membership symbol $\in$.
[1] Skiena, S. S. The Algorithm Design Manual (Second Edition). Springer (2008)
Answer by usul (2 votes)
Usually, statements like
$$f = O(g)$$
can be interpreted as
$$ \text{there exists } h \in O(g) \text{ such that } f = h\,. $$
This becomes more useful in contexts like David Richerby mentions, where we write $f(n) = n^3 + O(n^2)$ to mean "there exists $g(n) \in O(n^2)$ such that $f(n) = n^3 + g(n)$."
I find this existential quantifier interpretation so useful that I am tempted to write things like
$$ f(n) \leq O(n^3) $$
which some will find an even more egregious style violation, but it is just a space-saving way of writing "there exists $C$ such that $f(n) \leq C n^3$."
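For instance (with an illustrative $f$, not one from the question): if $f(n) = 2n^3 + 7n^2$, then
$$f(n) = 2n^3 + 7n^2 \le 2n^3 + 7n^3 = 9n^3 \quad\text{for all } n \ge 1,$$
so one may write $f(n) \leq O(n^3)$ in the above sense, with witness $C = 9$.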
Answer (1 vote)
Prologue: The big $O$ notation is a classic example of the power and ambiguity of some notations as part of language, loved by the human mind. No matter how much confusion it has caused, it remains the notation of choice for conveying ideas that we can easily identify and agree on efficiently.
I totally understand what big $O$ notation means. My issue is when we say $T(n)=O(f(n))$, where $T(n)$ is the running time of an algorithm on an input of size $n$.
Sorry, but you do not have an issue if you understand the meaning of big $O$ notation.
I understand the semantics of it. But $T(n)$ and $O(f(n))$ are two different things. $T(n)$ is an exact number, but $O(f(n))$ is not a function that spits out a number, so technically we can't say $T(n)$ equals $O(f(n))$. If someone asks you what the value of $O(f(n))$ is, what would your answer be? There is no answer.
What is important is the semantics. What is important is (how) people can agree easily on (one of) its precise interpretations that describe the asymptotic behavior of the time or space complexity we are interested in. The default precise interpretation/definition of $T(n)=O(f(n))$ is, as translated from Wikipedia,
$T$ is a real or complex valued function and $f$ is a real valued function, both defined on some unbounded subset of the positive real numbers, such that $f(n)$ is strictly positive for all large enough values of $n$. Then $T(n)=O(f(n))$ means that the absolute value of $T(n)$ is at most a positive constant multiple of $f(n)$ for all sufficiently large values of $n$. That is, there exists a positive real number $M$ and a real number $n_0$ such that
$$|T(n)| \leq M\,f(n) \quad\text{ for all } n \geq n_{0}.$$
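As a quick concrete instance of this definition (with an illustrative running time, not one taken from the question): if $T(n) = 5n^2 + 3n + 7$, then
$$T(n) = 5n^2 + 3n + 7 \le 5n^2 + 3n^2 + 7n^2 = 15n^2 \quad\text{for all } n \ge 1,$$
so $T(n) = O(n^2)$, witnessed by $M = 15$ and $n_0 = 1$.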
Please note this interpretation is considered the definition. All other interpretations and understandings, which may help you greatly in various ways, are secondary and corollary. Everyone (well, at least every answerer here) agrees to this interpretation/definition/semantics. As long as you can apply this interpretation, you are probably good most of the time. Relax and be comfortable. You do not want to think too much, just as you do not think too much about some of the irregularities of English or French or most natural languages. Just use the notation by that definition.
$T(n)$ is an exact number, but $O(f(n))$ is not a function that spits out a number, so technically we can't say $T(n)$ equals $O(f(n))$. If someone asks you what the value of $O(f(n))$ is, what would your answer be? There is no answer.
Indeed, there could be no answer, since the question is ill-posed. $T(n)$ does not denote an exact number. It is meant to stand for a function whose name is $T$ and whose formal parameter is $n$ (which is in a sense bound to the $n$ in $f(n)$). It is just as correct, and even more so, if we write $T=O(f)$. If $T$ is the function that maps $n$ to $n^2$ and $f$ is the function that maps $n$ to $n^3$, it is also conventional to write $T(n)=O(n^3)$ or $n^2=O(n^3)$. You are right that the equal sign does not mean equality in its ordinary sense. (Another example of abuse of the equality sign is the use of the equal sign to mean assignment in most programming languages, instead of the more cumbersome := used in some languages.)
If we are only concerned with that one equality (I am starting to abuse language as well: it is not an equality; however, it is an equality in the sense that there is an equality sign in the notation, or it could be construed as some kind of equality), $T(n)=O(f(n))$, then this answer is done.
However, the question actually goes on. What is meant by, for example, $f(n)=3n+O(\log n)$? This equality is not covered by the definition above, so we need to introduce another convention, the placeholder convention. Here is the full statement of the placeholder convention as given on Wikipedia.
In more complicated usage, $O(\cdots)$ can appear in different places in an equation, even several times on each side. For example, the following are true for $n\to \infty$:
$(n+1)^{2}=n^{2}+O(n)$
$(n+O(n^{1/2}))(n+O(\log n))^{2}=n^{3}+O(n^{5/2})$
$n^{O(1)}=O(e^{n})$
The meaning of such statements is as follows: for any functions which satisfy each $O(\cdots)$ on the left side, there are some functions satisfying each $O(\cdots)$ on the right side, such that substituting all these functions into the equation makes the two sides equal. For example, the third equation above means: "For any function $f(n) = O(1)$, there is some function $g(n) = O(e^n)$ such that $n^{f(n)} = g(n)$."
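As a small check of the first identity under this convention:
$$(n+1)^2 = n^2 + 2n + 1, \qquad |2n+1| \le 3n \ \text{ for all } n \ge 1,$$
so the function $g(n) = 2n+1$ can play the role of the $O(n)$ term on the right-hand side (with constant $3$ and threshold $n_0 = 1$).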
You may want to check here for another example of placeholder convention in action.
You might have noticed by now that I have not used the set-theoretic explanation of the big $O$ notation. All I have done is to show that even without a set-theoretic explanation such as "$O(f(n))$ is a set of functions", we can still understand big $O$ notation fully and perfectly. If you find that set-theoretic explanation useful, please go ahead and use it anyway.
You can check the section on asymptotic notation in CLRS for a more detailed analysis of, and usage patterns for, the family of notations for asymptotic behavior, such as big $\Theta$, $\Omega$, small $o$, small $\omega$, multivariable usage and more. The Wikipedia entry is also a pretty good reference.
Lastly, there is some inherent ambiguity/controversy around big $O$ notation with multiple variables (see 1 and 2). You might want to think twice before using it in that setting.
6 Answers
6
active
oldest
votes
6 Answers
6
active
oldest
votes
active
oldest
votes
active
oldest
votes
up vote
27
down vote
Strictly speaking, $O(f(n))$ is a set of functions. So the value of $O(f(n))$ is simply the set of all functions that grow asymptotically not faster than $f(n)$. The notation $T(n) = O(f(n))$ is just a simplified way to write that $T(n) in O(f(n))$.
Note that this also clarifies some issues of the $O$ notation as it is normally used.
For example, $n^2 in O(n^3)$ but $n^3 notin O(n^2)$. So you can write $O(n^2) = O(n^3)$, but you cannot write $O(n^3) = O(n^2)$, even though one normally expects $=$ to represent an equivalence relation (in particular, symmetric).
20
Using $=O(f(n))$ is not simplified notation, it's an abuse of notation. And while it might be convenient for some uses, it's still confusing when you think about it.
– einpoklum
13 hours ago
1
Thanks for clarifying the last paragraph. Having said that, I don't think anyone would ever write "$O(n^2)=O(n^3)$", so criticizing "$=O(...)$" on those grounds is something of a straw man.
– David Richerby
13 hours ago
1
That example may be too simple to appear in practice, however in CS literature you will find very easily calculations such as e.g. $O(n log n + n^2) = O(n^2)$, which are applying the same principle to conclude that a particular phase of an algorithm, taking time e.g. $O(n log n)$, is dominated by another phase of cost $O(n^2)$.
– Vincenzo
11 hours ago
3
Your $T(n)in O(f(n))$ could also be considered (mild) abuse of notation, since the function is $T$ or $nmapsto T(n)$, so it should be more correct to write $Tin O(f(n))$ or $(nmapsto T(n))in O(f(n))$. But in many branches of mathematics $f(x)$ is used both for the function $f$ in general, and for the value of $f$ at the particular point $x$, so that should lead to little risk of confusion (unless you are a compiler or something).
– Jeppe Stig Nielsen
11 hours ago
2
@Vincenzo in that example both sets of functions are actually equal.
– Paŭlo Ebermann
5 hours ago
|
show 3 more comments
up vote
27
down vote
Strictly speaking, $O(f(n))$ is a set of functions. So the value of $O(f(n))$ is simply the set of all functions that grow asymptotically not faster than $f(n)$. The notation $T(n) = O(f(n))$ is just a simplified way to write that $T(n) in O(f(n))$.
Note that this also clarifies some issues of the $O$ notation as it is normally used.
For example, $n^2 in O(n^3)$ but $n^3 notin O(n^2)$. So you can write $O(n^2) = O(n^3)$, but you cannot write $O(n^3) = O(n^2)$, even though one normally expects $=$ to represent an equivalence relation (in particular, symmetric).
20
Using $=O(f(n))$ is not simplified notation, it's an abuse of notation. And while it might be convenient for some uses, it's still confusing when you think about it.
– einpoklum
13 hours ago
1
Thanks for clarifying the last paragraph. Having said that, I don't think anyone would ever write "$O(n^2)=O(n^3)$", so criticizing "$=O(...)$" on those grounds is something of a straw man.
– David Richerby
13 hours ago
1
That example may be too simple to appear in practice, however in CS literature you will find very easily calculations such as e.g. $O(n log n + n^2) = O(n^2)$, which are applying the same principle to conclude that a particular phase of an algorithm, taking time e.g. $O(n log n)$, is dominated by another phase of cost $O(n^2)$.
– Vincenzo
11 hours ago
3
Your $T(n)in O(f(n))$ could also be considered (mild) abuse of notation, since the function is $T$ or $nmapsto T(n)$, so it should be more correct to write $Tin O(f(n))$ or $(nmapsto T(n))in O(f(n))$. But in many branches of mathematics $f(x)$ is used both for the function $f$ in general, and for the value of $f$ at the particular point $x$, so that should lead to little risk of confusion (unless you are a compiler or something).
– Jeppe Stig Nielsen
11 hours ago
2
@Vincenzo in that example both sets of functions are actually equal.
– Paŭlo Ebermann
5 hours ago
|
show 3 more comments
up vote
27
down vote
up vote
27
down vote
Strictly speaking, $O(f(n))$ is a set of functions. So the value of $O(f(n))$ is simply the set of all functions that grow asymptotically not faster than $f(n)$. The notation $T(n) = O(f(n))$ is just a simplified way to write that $T(n) in O(f(n))$.
Note that this also clarifies some issues of the $O$ notation as it is normally used.
For example, $n^2 in O(n^3)$ but $n^3 notin O(n^2)$. So you can write $O(n^2) = O(n^3)$, but you cannot write $O(n^3) = O(n^2)$, even though one normally expects $=$ to represent an equivalence relation (in particular, symmetric).
Strictly speaking, $O(f(n))$ is a set of functions. So the value of $O(f(n))$ is simply the set of all functions that grow asymptotically not faster than $f(n)$. The notation $T(n) = O(f(n))$ is just a simplified way to write that $T(n) in O(f(n))$.
Note that this also clarifies some issues of the $O$ notation as it is normally used.
For example, $n^2 in O(n^3)$ but $n^3 notin O(n^2)$. So you can write $O(n^2) = O(n^3)$, but you cannot write $O(n^3) = O(n^2)$, even though one normally expects $=$ to represent an equivalence relation (in particular, symmetric).
edited 14 hours ago
answered 18 hours ago
Vincenzo
54418
54418
20
Using $=O(f(n))$ is not simplified notation, it's an abuse of notation. And while it might be convenient for some uses, it's still confusing when you think about it.
– einpoklum
13 hours ago
1
Thanks for clarifying the last paragraph. Having said that, I don't think anyone would ever write "$O(n^2)=O(n^3)$", so criticizing "$=O(...)$" on those grounds is something of a straw man.
– David Richerby
13 hours ago
1
That example may be too simple to appear in practice, however in CS literature you will find very easily calculations such as e.g. $O(n log n + n^2) = O(n^2)$, which are applying the same principle to conclude that a particular phase of an algorithm, taking time e.g. $O(n log n)$, is dominated by another phase of cost $O(n^2)$.
– Vincenzo
11 hours ago
3
Your $T(n)in O(f(n))$ could also be considered (mild) abuse of notation, since the function is $T$ or $nmapsto T(n)$, so it should be more correct to write $Tin O(f(n))$ or $(nmapsto T(n))in O(f(n))$. But in many branches of mathematics $f(x)$ is used both for the function $f$ in general, and for the value of $f$ at the particular point $x$, so that should lead to little risk of confusion (unless you are a compiler or something).
– Jeppe Stig Nielsen
11 hours ago
2
@Vincenzo in that example both sets of functions are actually equal.
– Paŭlo Ebermann
5 hours ago
|
show 3 more comments
20
Using $=O(f(n))$ is not simplified notation, it's an abuse of notation. And while it might be convenient for some uses, it's still confusing when you think about it.
– einpoklum
13 hours ago
1
Thanks for clarifying the last paragraph. Having said that, I don't think anyone would ever write "$O(n^2)=O(n^3)$", so criticizing "$=O(...)$" on those grounds is something of a straw man.
– David Richerby
13 hours ago
1
That example may be too simple to appear in practice, however in CS literature you will find very easily calculations such as e.g. $O(n log n + n^2) = O(n^2)$, which are applying the same principle to conclude that a particular phase of an algorithm, taking time e.g. $O(n log n)$, is dominated by another phase of cost $O(n^2)$.
– Vincenzo
11 hours ago
3
Your $T(n)in O(f(n))$ could also be considered (mild) abuse of notation, since the function is $T$ or $nmapsto T(n)$, so it should be more correct to write $Tin O(f(n))$ or $(nmapsto T(n))in O(f(n))$. But in many branches of mathematics $f(x)$ is used both for the function $f$ in general, and for the value of $f$ at the particular point $x$, so that should lead to little risk of confusion (unless you are a compiler or something).
– Jeppe Stig Nielsen
11 hours ago
2
@Vincenzo in that example both sets of functions are actually equal.
– Paŭlo Ebermann
5 hours ago
20
20
Using $=O(f(n))$ is not simplified notation, it's an abuse of notation. And while it might be convenient for some uses, it's still confusing when you think about it.
– einpoklum
13 hours ago
Using $=O(f(n))$ is not simplified notation, it's an abuse of notation. And while it might be convenient for some uses, it's still confusing when you think about it.
– einpoklum
13 hours ago
1
1
Thanks for clarifying the last paragraph. Having said that, I don't think anyone would ever write "$O(n^2)=O(n^3)$", so criticizing "$=O(...)$" on those grounds is something of a straw man.
– David Richerby
13 hours ago
Thanks for clarifying the last paragraph. Having said that, I don't think anyone would ever write "$O(n^2)=O(n^3)$", so criticizing "$=O(...)$" on those grounds is something of a straw man.
– David Richerby
13 hours ago
1
1
That example may be too simple to appear in practice, however in CS literature you will find very easily calculations such as e.g. $O(n log n + n^2) = O(n^2)$, which are applying the same principle to conclude that a particular phase of an algorithm, taking time e.g. $O(n log n)$, is dominated by another phase of cost $O(n^2)$.
– Vincenzo
11 hours ago
That example may be too simple to appear in practice, however in CS literature you will find very easily calculations such as e.g. $O(n log n + n^2) = O(n^2)$, which are applying the same principle to conclude that a particular phase of an algorithm, taking time e.g. $O(n log n)$, is dominated by another phase of cost $O(n^2)$.
– Vincenzo
11 hours ago
3
3
Your $T(n)in O(f(n))$ could also be considered (mild) abuse of notation, since the function is $T$ or $nmapsto T(n)$, so it should be more correct to write $Tin O(f(n))$ or $(nmapsto T(n))in O(f(n))$. But in many branches of mathematics $f(x)$ is used both for the function $f$ in general, and for the value of $f$ at the particular point $x$, so that should lead to little risk of confusion (unless you are a compiler or something).
– Jeppe Stig Nielsen
11 hours ago
Your $T(n)in O(f(n))$ could also be considered (mild) abuse of notation, since the function is $T$ or $nmapsto T(n)$, so it should be more correct to write $Tin O(f(n))$ or $(nmapsto T(n))in O(f(n))$. But in many branches of mathematics $f(x)$ is used both for the function $f$ in general, and for the value of $f$ at the particular point $x$, so that should lead to little risk of confusion (unless you are a compiler or something).
– Jeppe Stig Nielsen
11 hours ago
2
2
@Vincenzo in that example both sets of functions are actually equal.
– Paŭlo Ebermann
5 hours ago
@Vincenzo in that example both sets of functions are actually equal.
– Paŭlo Ebermann
5 hours ago
|
show 3 more comments
up vote
14
down vote
$O$ is a function
$$begin{align}
O : (mathbb{N}to mathbb{R}) &to mathbf{P}(mathbb{N}to mathbb{R})
\ f &mapsto O(f)
end{align}$$
i.e. it accepts a function $f$ and yields a set of functions that share the asymptotic bound of (at most) $f$. And strictly speaking the correct notation is thus
$$
(n mapsto T(n)) in O(nmapsto f(n))
$$
or short
$$
T in O(f)
$$
but it's customary in maths, science and CS to just use a variable somewhere in the expression to denote that you're considering functions of the argument $n$ on both sides. So $T(n) in O(f(n))$ is quite fine as well. $T(n) = O(f(n))$ is pretty much wrong though, as you suspected. It is very commonly used though, so definitely keep in mind what people mean when they write this.
I would advise against ever writing $T(n) = O(f(n))$, but opinions differ.
1
$T(n)=O(f(n)$ is a completely standard use of notation so claiming that it's wrong is unhelpful. (As, IMO, is claiming that $O$ is a function; that's technically true, but it's not really a helpful way to think about it.)
– David Richerby
15 hours ago
7
@DavidRicherby some things are completely standard but shouldn't be. $T(n) = O(f(n))$ is one example. Sure it's still good to know what people mean by this (as the OP does already), but how is it not helpful to confirm that this notation is technically speaking bogus? Why would you use it? Even if the $=$ version isn't ambiguous, neither is the $in$ one, and the more people switch to that notation the better. It's always better to stick to what actually makes sense mathematically, unless it's much more awkward to write. $in$ is perfectly readable and easy to write.
– leftaroundabout
15 hours ago
I like your answer., +1. I'd like to suggest though that you clarify in the last sentence that although it's wrong everyone uses it to mean "in" instead of "equal to", unfortunately.
– Pedro A
14 hours ago
Comments are not for extended discussion; this conversation has been moved to chat.
– D.W.♦
8 mins ago
add a comment |
up vote
14
down vote
$O$ is a function
$$begin{align}
O : (mathbb{N}to mathbb{R}) &to mathbf{P}(mathbb{N}to mathbb{R})
\ f &mapsto O(f)
end{align}$$
i.e. it accepts a function $f$ and yields a set of functions that share the asymptotic bound of (at most) $f$. And strictly speaking the correct notation is thus
$$
(n mapsto T(n)) in O(nmapsto f(n))
$$
or short
$$
T in O(f)
$$
but it's customary in maths, science and CS to just use a variable somewhere in the expression to denote that you're considering functions of the argument $n$ on both sides. So $T(n) in O(f(n))$ is quite fine as well. $T(n) = O(f(n))$ is pretty much wrong though, as you suspected. It is very commonly used though, so definitely keep in mind what people mean when they write this.
I would advise against ever writing $T(n) = O(f(n))$, but opinions differ.
1
$T(n)=O(f(n)$ is a completely standard use of notation so claiming that it's wrong is unhelpful. (As, IMO, is claiming that $O$ is a function; that's technically true, but it's not really a helpful way to think about it.)
– David Richerby
15 hours ago
7
@DavidRicherby some things are completely standard but shouldn't be. $T(n) = O(f(n))$ is one example. Sure it's still good to know what people mean by this (as the OP does already), but how is it not helpful to confirm that this notation is technically speaking bogus? Why would you use it? Even if the $=$ version isn't ambiguous, neither is the $in$ one, and the more people switch to that notation the better. It's always better to stick to what actually makes sense mathematically, unless it's much more awkward to write. $in$ is perfectly readable and easy to write.
– leftaroundabout
15 hours ago
I like your answer., +1. I'd like to suggest though that you clarify in the last sentence that although it's wrong everyone uses it to mean "in" instead of "equal to", unfortunately.
– Pedro A
14 hours ago
Comments are not for extended discussion; this conversation has been moved to chat.
– D.W.♦
8 mins ago
add a comment |
up vote
14
down vote
up vote
14
down vote
$O$ is a function
$$begin{align}
O : (mathbb{N}to mathbb{R}) &to mathbf{P}(mathbb{N}to mathbb{R})
\ f &mapsto O(f)
end{align}$$
i.e. it accepts a function $f$ and yields a set of functions that share the asymptotic bound of (at most) $f$. And strictly speaking the correct notation is thus
$$
(n mapsto T(n)) in O(nmapsto f(n))
$$
or short
$$
T in O(f)
$$
but it's customary in maths, science and CS to just use a variable somewhere in the expression to denote that you're considering functions of the argument $n$ on both sides. So $T(n) in O(f(n))$ is quite fine as well. $T(n) = O(f(n))$ is pretty much wrong though, as you suspected. It is very commonly used though, so definitely keep in mind what people mean when they write this.
I would advise against ever writing $T(n) = O(f(n))$, but opinions differ.
$O$ is a function
$$begin{align}
O : (mathbb{N}to mathbb{R}) &to mathbf{P}(mathbb{N}to mathbb{R})
\ f &mapsto O(f)
end{align}$$
i.e. it accepts a function $f$ and yields a set of functions that share the asymptotic bound of (at most) $f$. And strictly speaking the correct notation is thus
$$
(n mapsto T(n)) in O(nmapsto f(n))
$$
or short
$$
T in O(f)
$$
but it's customary in maths, science and CS to just use a variable somewhere in the expression to denote that you're considering functions of the argument $n$ on both sides. So $T(n) in O(f(n))$ is quite fine as well. $T(n) = O(f(n))$ is pretty much wrong though, as you suspected. It is very commonly used though, so definitely keep in mind what people mean when they write this.
I would advise against ever writing $T(n) = O(f(n))$, but opinions differ.
edited 13 hours ago
answered 16 hours ago
leftaroundabout
1,01559
1,01559
1
$T(n)=O(f(n)$ is a completely standard use of notation so claiming that it's wrong is unhelpful. (As, IMO, is claiming that $O$ is a function; that's technically true, but it's not really a helpful way to think about it.)
– David Richerby
15 hours ago
7
@DavidRicherby some things are completely standard but shouldn't be. $T(n) = O(f(n))$ is one example. Sure it's still good to know what people mean by this (as the OP does already), but how is it not helpful to confirm that this notation is technically speaking bogus? Why would you use it? Even if the $=$ version isn't ambiguous, neither is the $in$ one, and the more people switch to that notation the better. It's always better to stick to what actually makes sense mathematically, unless it's much more awkward to write. $in$ is perfectly readable and easy to write.
– leftaroundabout
15 hours ago
I like your answer., +1. I'd like to suggest though that you clarify in the last sentence that although it's wrong everyone uses it to mean "in" instead of "equal to", unfortunately.
– Pedro A
14 hours ago
Comments are not for extended discussion; this conversation has been moved to chat.
– D.W.♦
8 mins ago
add a comment |
1
$T(n)=O(f(n)$ is a completely standard use of notation so claiming that it's wrong is unhelpful. (As, IMO, is claiming that $O$ is a function; that's technically true, but it's not really a helpful way to think about it.)
– David Richerby
15 hours ago
7
@DavidRicherby some things are completely standard but shouldn't be. $T(n) = O(f(n))$ is one example. Sure it's still good to know what people mean by this (as the OP does already), but how is it not helpful to confirm that this notation is technically speaking bogus? Why would you use it? Even if the $=$ version isn't ambiguous, neither is the $in$ one, and the more people switch to that notation the better. It's always better to stick to what actually makes sense mathematically, unless it's much more awkward to write. $in$ is perfectly readable and easy to write.
– leftaroundabout
15 hours ago
I like your answer., +1. I'd like to suggest though that you clarify in the last sentence that although it's wrong everyone uses it to mean "in" instead of "equal to", unfortunately.
– Pedro A
14 hours ago
Comments are not for extended discussion; this conversation has been moved to chat.
– D.W.♦
8 mins ago
1
1
$T(n)=O(f(n)$ is a completely standard use of notation so claiming that it's wrong is unhelpful. (As, IMO, is claiming that $O$ is a function; that's technically true, but it's not really a helpful way to think about it.)
– David Richerby
15 hours ago
$T(n)=O(f(n)$ is a completely standard use of notation so claiming that it's wrong is unhelpful. (As, IMO, is claiming that $O$ is a function; that's technically true, but it's not really a helpful way to think about it.)
– David Richerby
15 hours ago
7
7
@DavidRicherby some things are completely standard but shouldn't be. $T(n) = O(f(n))$ is one example. Sure it's still good to know what people mean by this (as the OP does already), but how is it not helpful to confirm that this notation is technically speaking bogus? Why would you use it? Even if the $=$ version isn't ambiguous, neither is the $in$ one, and the more people switch to that notation the better. It's always better to stick to what actually makes sense mathematically, unless it's much more awkward to write. $in$ is perfectly readable and easy to write.
– leftaroundabout
15 hours ago
@DavidRicherby some things are completely standard but shouldn't be. $T(n) = O(f(n))$ is one example. Sure it's still good to know what people mean by this (as the OP does already), but how is it not helpful to confirm that this notation is technically speaking bogus? Why would you use it? Even if the $=$ version isn't ambiguous, neither is the $in$ one, and the more people switch to that notation the better. It's always better to stick to what actually makes sense mathematically, unless it's much more awkward to write. $in$ is perfectly readable and easy to write.
– leftaroundabout
15 hours ago
I like your answer., +1. I'd like to suggest though that you clarify in the last sentence that although it's wrong everyone uses it to mean "in" instead of "equal to", unfortunately.
– Pedro A
14 hours ago
I like your answer., +1. I'd like to suggest though that you clarify in the last sentence that although it's wrong everyone uses it to mean "in" instead of "equal to", unfortunately.
– Pedro A
14 hours ago
Comments are not for extended discussion; this conversation has been moved to chat.
– D.W.♦
8 mins ago
Comments are not for extended discussion; this conversation has been moved to chat.
– D.W.♦
8 mins ago
add a comment |
up vote
3
down vote
Formally speaking, $O(f(n))$ is a the set of functions $g$ such that $g(n)leq k,f(n)$ for some constant $k$ and all large enough $n$. Thus, the most pedantically accurate way of writing it would be $T(n)in O(f(n))$. However, using $=$ instead of $in$ is completely standard, and $T(n)=O(f(n))$ just means $T(n)in O(f(n))$. This is essentially never ambiguous because we almost never manipulate the set $O(f(n))$.
In a sense, using equality makes $O(f(n))$ mean "some function $g$ such that $g(n)leq f,g(n)$ for all large enough $n$", and this means that you can write things like $f(n) = 3n + O(log n)$. Note that this is much more precise than, e.g., $f(n)=Theta(n)$ or $f(n)=O(n+log n)$.
You could also write $f(n) - 3n in O(log n)$. Though I admit that it can be handy to conclude a multiple-step computation with $f(n) = ldots = 3n + O(log n)$.
– leftaroundabout
14 hours ago
The rearrangement only works in standalone statements. It's much more common in the middle of calculations, where that kind of thing doesn't work, and where multiple functions get absorbed together into the Landau notation. (Stuff like $f(x) = e^{-x}(e^{2x}+O(x)) = e^x+o(1)$).
– David Richerby
14 hours ago
I find computations like that jarring; those equals signs aren't bidirectional anymore. I'm not sure there's more of a problem with writing $f(x) in e^x(e^{2x}+O(x)) subset e^x + o(1)$. I suppose that's also abuse of notation; basically you're overloading the $=$ operator whereas I prefer to lift $+$ and $cdot$ to also operate on sets.
– leftaroundabout
14 hours ago
add a comment |
up vote
3
down vote
Formally speaking, $O(f(n))$ is a the set of functions $g$ such that $g(n)leq k,f(n)$ for some constant $k$ and all large enough $n$. Thus, the most pedantically accurate way of writing it would be $T(n)in O(f(n))$. However, using $=$ instead of $in$ is completely standard, and $T(n)=O(f(n))$ just means $T(n)in O(f(n))$. This is essentially never ambiguous because we almost never manipulate the set $O(f(n))$.
In a sense, using equality makes $O(f(n))$ mean "some function $g$ such that $g(n)leq f,g(n)$ for all large enough $n$", and this means that you can write things like $f(n) = 3n + O(log n)$. Note that this is much more precise than, e.g., $f(n)=Theta(n)$ or $f(n)=O(n+log n)$.
You could also write $f(n) - 3n in O(log n)$. Though I admit that it can be handy to conclude a multiple-step computation with $f(n) = ldots = 3n + O(log n)$.
– leftaroundabout
14 hours ago
The rearrangement only works in standalone statements. It's much more common in the middle of calculations, where that kind of thing doesn't work, and where multiple functions get absorbed together into the Landau notation. (Stuff like $f(x) = e^{-x}(e^{2x}+O(x)) = e^x+o(1)$).
– David Richerby
14 hours ago
I find computations like that jarring; those equals signs aren't bidirectional anymore. I'm not sure there's more of a problem with writing $f(x) in e^x(e^{2x}+O(x)) subset e^x + o(1)$. I suppose that's also abuse of notation; basically you're overloading the $=$ operator whereas I prefer to lift $+$ and $cdot$ to also operate on sets.
– leftaroundabout
14 hours ago
add a comment |
up vote
3
down vote
up vote
3
down vote
Formally speaking, $O(f(n))$ is a the set of functions $g$ such that $g(n)leq k,f(n)$ for some constant $k$ and all large enough $n$. Thus, the most pedantically accurate way of writing it would be $T(n)in O(f(n))$. However, using $=$ instead of $in$ is completely standard, and $T(n)=O(f(n))$ just means $T(n)in O(f(n))$. This is essentially never ambiguous because we almost never manipulate the set $O(f(n))$.
In a sense, using equality makes $O(f(n))$ mean "some function $g$ such that $g(n)leq f,g(n)$ for all large enough $n$", and this means that you can write things like $f(n) = 3n + O(log n)$. Note that this is much more precise than, e.g., $f(n)=Theta(n)$ or $f(n)=O(n+log n)$.
Formally speaking, $O(f(n))$ is a the set of functions $g$ such that $g(n)leq k,f(n)$ for some constant $k$ and all large enough $n$. Thus, the most pedantically accurate way of writing it would be $T(n)in O(f(n))$. However, using $=$ instead of $in$ is completely standard, and $T(n)=O(f(n))$ just means $T(n)in O(f(n))$. This is essentially never ambiguous because we almost never manipulate the set $O(f(n))$.
In a sense, using equality makes $O(f(n))$ mean "some function $g$ such that $g(n)leq f,g(n)$ for all large enough $n$", and this means that you can write things like $f(n) = 3n + O(log n)$. Note that this is much more precise than, e.g., $f(n)=Theta(n)$ or $f(n)=O(n+log n)$.
answered 15 hours ago
David Richerby
65.3k1597186
65.3k1597186
You could also write $f(n) - 3n in O(log n)$. Though I admit that it can be handy to conclude a multiple-step computation with $f(n) = ldots = 3n + O(log n)$.
– leftaroundabout
14 hours ago
The rearrangement only works in standalone statements. It's much more common in the middle of calculations, where that kind of thing doesn't work, and where multiple functions get absorbed together into the Landau notation. (Stuff like $f(x) = e^{-x}(e^{2x}+O(x)) = e^x+o(1)$).
– David Richerby
14 hours ago
I find computations like that jarring; those equals signs aren't bidirectional anymore. I'm not sure there's more of a problem with writing $f(x) in e^x(e^{2x}+O(x)) subset e^x + o(1)$. I suppose that's also abuse of notation; basically you're overloading the $=$ operator whereas I prefer to lift $+$ and $cdot$ to also operate on sets.
– leftaroundabout
14 hours ago
add a comment |
You could also write $f(n) - 3n in O(log n)$. Though I admit that it can be handy to conclude a multiple-step computation with $f(n) = ldots = 3n + O(log n)$.
– leftaroundabout
14 hours ago
The rearrangement only works in standalone statements. It's much more common in the middle of calculations, where that kind of thing doesn't work, and where multiple functions get absorbed together into the Landau notation. (Stuff like $f(x) = e^{-x}(e^{2x}+O(x)) = e^x+o(1)$).
– David Richerby
14 hours ago
I find computations like that jarring; those equals signs aren't bidirectional anymore. I'm not sure there's more of a problem with writing $f(x) in e^x(e^{2x}+O(x)) subset e^x + o(1)$. I suppose that's also abuse of notation; basically you're overloading the $=$ operator whereas I prefer to lift $+$ and $cdot$ to also operate on sets.
– leftaroundabout
14 hours ago
You could also write $f(n) - 3n in O(log n)$. Though I admit that it can be handy to conclude a multiple-step computation with $f(n) = ldots = 3n + O(log n)$.
– leftaroundabout
14 hours ago
You could also write $f(n) - 3n in O(log n)$. Though I admit that it can be handy to conclude a multiple-step computation with $f(n) = ldots = 3n + O(log n)$.
– leftaroundabout
14 hours ago
The rearrangement only works in standalone statements. It's much more common in the middle of calculations, where that kind of thing doesn't work, and where multiple functions get absorbed together into the Landau notation. (Stuff like $f(x) = e^{-x}(e^{2x}+O(x)) = e^x+o(1)$).
– David Richerby
14 hours ago
The rearrangement only works in standalone statements. It's much more common in the middle of calculations, where that kind of thing doesn't work, and where multiple functions get absorbed together into the Landau notation. (Stuff like $f(x) = e^{-x}(e^{2x}+O(x)) = e^x+o(1)$).
– David Richerby
14 hours ago
I find computations like that jarring; those equals signs aren't bidirectional anymore. I'm not sure there's more of a problem with writing $f(x) in e^x(e^{2x}+O(x)) subset e^x + o(1)$. I suppose that's also abuse of notation; basically you're overloading the $=$ operator whereas I prefer to lift $+$ and $cdot$ to also operate on sets.
– leftaroundabout
14 hours ago
I find computations like that jarring; those equals signs aren't bidirectional anymore. I'm not sure there's more of a problem with writing $f(x) in e^x(e^{2x}+O(x)) subset e^x + o(1)$. I suppose that's also abuse of notation; basically you're overloading the $=$ operator whereas I prefer to lift $+$ and $cdot$ to also operate on sets.
– leftaroundabout
14 hours ago
add a comment |
up vote
3
down vote
In The Algorithm Design Manual [1], you can find a paragraph about this issue:
The Big Oh notation [including $O$, $Omega$ and $Theta$] provides for a rough notion of equality when comparing
functions. It is somewhat jarring to see an expression like $n^2 = O(n^3)$, but its
meaning can always be resolved by going back to the definitions in terms of upper
and lower bounds. It is perhaps most instructive to read the " = " here as meaning
"one of the functions that are". Clearly, $n^2$ is one of functions that are $O(n^3)$.
Strictly speaking (as noted by David Richerby's comment), $Theta$ gives you a rough notion of equality, $O$ a rough notion of less-than-or-equal-to, and $Omega$ and rough notion of greater-than-or-equal-to.
Nonetheless, I agree with Vincenzo's answer: you can simply interpret $O(f(n))$ as a set of functions and the = symbol as a set membership symbol $in$.
[1] Skiena, S. S. The Algorithm Design Manual (Second Edition). Springer (2008)
add a comment |
up vote
3
down vote
In The Algorithm Design Manual [1], you can find a paragraph about this issue:
The Big Oh notation [including $O$, $Omega$ and $Theta$] provides for a rough notion of equality when comparing
functions. It is somewhat jarring to see an expression like $n^2 = O(n^3)$, but its
meaning can always be resolved by going back to the definitions in terms of upper
and lower bounds. It is perhaps most instructive to read the " = " here as meaning
"one of the functions that are". Clearly, $n^2$ is one of functions that are $O(n^3)$.
Strictly speaking (as noted by David Richerby's comment), $Theta$ gives you a rough notion of equality, $O$ a rough notion of less-than-or-equal-to, and $Omega$ and rough notion of greater-than-or-equal-to.
Nonetheless, I agree with Vincenzo's answer: you can simply interpret $O(f(n))$ as a set of functions and the = symbol as a set membership symbol $in$.
[1] Skiena, S. S. The Algorithm Design Manual (Second Edition). Springer (2008)
add a comment |
up vote
3
down vote
up vote
3
down vote
In The Algorithm Design Manual [1], you can find a paragraph about this issue:
The Big Oh notation [including $O$, $Omega$ and $Theta$] provides for a rough notion of equality when comparing
functions. It is somewhat jarring to see an expression like $n^2 = O(n^3)$, but its
meaning can always be resolved by going back to the definitions in terms of upper
and lower bounds. It is perhaps most instructive to read the " = " here as meaning
"one of the functions that are". Clearly, $n^2$ is one of functions that are $O(n^3)$.
Strictly speaking (as noted by David Richerby's comment), $Theta$ gives you a rough notion of equality, $O$ a rough notion of less-than-or-equal-to, and $Omega$ and rough notion of greater-than-or-equal-to.
Nonetheless, I agree with Vincenzo's answer: you can simply interpret $O(f(n))$ as a set of functions and the = symbol as a set membership symbol $in$.
[1] Skiena, S. S. The Algorithm Design Manual (Second Edition). Springer (2008)
In The Algorithm Design Manual [1], you can find a paragraph about this issue:
The Big Oh notation [including $O$, $Omega$ and $Theta$] provides for a rough notion of equality when comparing
functions. It is somewhat jarring to see an expression like $n^2 = O(n^3)$, but its
meaning can always be resolved by going back to the definitions in terms of upper
and lower bounds. It is perhaps most instructive to read the " = " here as meaning
"one of the functions that are". Clearly, $n^2$ is one of functions that are $O(n^3)$.
Strictly speaking (as noted by David Richerby's comment), $Theta$ gives you a rough notion of equality, $O$ a rough notion of less-than-or-equal-to, and $Omega$ and rough notion of greater-than-or-equal-to.
Nonetheless, I agree with Vincenzo's answer: you can simply interpret $O(f(n))$ as a set of functions and the = symbol as a set membership symbol $in$.
[1] Skiena, S. S. The Algorithm Design Manual (Second Edition). Springer (2008)
edited 9 hours ago
answered 17 hours ago
Mario Cervera
2,36411120
2,36411120
add a comment |
add a comment |
up vote
2
down vote
Usually, statements like
$$f = O(g)$$
can be interpreted as
$$ text{there exists } h in O(g) text{ such that }f = h,. $$
This becomes more useful in contexts like David Richerby mentions, where we write $f(n) = n^3 + O(n^2)$ to mean "there exists $g(n) in O(n^2)$ such that $f(n) = n^2 + g(n)$."
I find this existential quantifier interpretation so useful that I am tempted to write things like
$$ f(n) leq O(n^3) $$
which some will find an even more egregious style violation, but it is just a space-saving way of writing "there exists $C$ such that $f(n) leq C n^3$."
add a comment |
up vote
2
down vote
Usually, statements like
$$f = O(g)$$
can be interpreted as
$$ text{there exists } h in O(g) text{ such that }f = h,. $$
This becomes more useful in contexts like David Richerby mentions, where we write $f(n) = n^3 + O(n^2)$ to mean "there exists $g(n) in O(n^2)$ such that $f(n) = n^2 + g(n)$."
I find this existential quantifier interpretation so useful that I am tempted to write things like
$$ f(n) leq O(n^3) $$
which some will find an even more egregious style violation, but it is just a space-saving way of writing "there exists $C$ such that $f(n) leq C n^3$."
add a comment |
up vote
2
down vote
up vote
2
down vote
Usually, statements like
$$f = O(g)$$
can be interpreted as
$$ text{there exists } h in O(g) text{ such that }f = h,. $$
This becomes more useful in contexts like David Richerby mentions, where we write $f(n) = n^3 + O(n^2)$ to mean "there exists $g(n) in O(n^2)$ such that $f(n) = n^2 + g(n)$."
I find this existential quantifier interpretation so useful that I am tempted to write things like
$$ f(n) leq O(n^3) $$
which some will find an even more egregious style violation, but it is just a space-saving way of writing "there exists $C$ such that $f(n) leq C n^3$."
Usually, statements like
$$f = O(g)$$
can be interpreted as
$$ text{there exists } h in O(g) text{ such that }f = h,. $$
This becomes more useful in contexts like David Richerby mentions, where we write $f(n) = n^3 + O(n^2)$ to mean "there exists $g(n) in O(n^2)$ such that $f(n) = n^2 + g(n)$."
I find this existential quantifier interpretation so useful that I am tempted to write things like
$$ f(n) leq O(n^3) $$
which some will find an even more egregious style violation, but it is just a space-saving way of writing "there exists $C$ such that $f(n) leq C n^3$."
edited 11 hours ago
David Richerby
65.3k1597186
65.3k1597186
answered 11 hours ago
usul
3,3681421
3,3681421
add a comment |
add a comment |
up vote
1
down vote
Prologue: The big $O$ notation is a classic example of the power and ambiguity of some notations as part of language loved by human mind. No matter how much confusion it have caused, it remains the choice of notation to convey the ideas that we can easily identify and agree to efficiently.
I totally understand what big $O$ notation means. My issue is when we say $T(n)=O(f(n))$ , where $T(n)$ is running time of an algorithm on input of size $n$.
Sorry, but you do not have an issue if you understand the meaning of big $O$ notation.
I understand semantics of it. But $T(n)$ and $O(f(n))$ are two different things. $T(n)$ is an exact number, But $O(f(n))$ is not a function that spits out a number, so technically we can't say $T(n)$ equals $O(f(n))$, if one asks you what's the value of $O(f(n))$, what would be your answer? There is no answer.
What is important is the semantics. What is important is (how) people can agree easily on (one of) its precise interpretations that will describe asymptotic behavior or time or space complexity we are interested in. The default precise interpretation/definition of $T(n)=O(f(n))$ is, as translated from Wikipedia,
$T$ is a real or complex valued function and $f$ is a real valued function, both defined on some unbounded subset of the real positive numbers, such that $f(n)$ is strictly positive for all large enough values of $n$. For for all sufficiently large values of $n$, the absolute value of $T(n)$ is at most a positive constant multiple of $f(n)$. That is, there exists a positive real number $M$ and a real number $n_0$ such that
${text{ for all }ngeq n_{0}, |T(n)|leq ;Mf(n){text{ for all }}ngeq n_{0}.}$
Please note this interpretation is considered the definition. All other interpretations and understandings, which may help you greatly in various ways, are secondary and corollary. Everyone (well, at least every answerer here) agrees to this interpretation/definition/semantics. As long as you can apply this interpretation, you are probably good most of time. Relax and be comfortable. You do not want to think too much, just as you do not think too much about some of the irregularity of English or French or most of natural languages. Just use the notation by that definition.
$T(n)$ is an exact number, But $O(f(n))$ is not a function that spits out a number, so technically we can't say $T(n)$ equals $O(f(n))$, if one asks you what's the value of $O(f(n))$, what would be your answer? There is no answer.
Indeed, there could be no answer, since the question is ill-posed. $T(n)$ does not mean an exact number. It is meant to stand for a function whose name is $T$ and whose formal parameter is $n$ (which is sort of bounded to the $n$ in $f(n)$). It is just as correct and even more so if we write $T=O(f)$. If $T$ is the function that maps $n$ to $n^2$ and $f$ is the function that maps $n$ to $n^3$, it is also conventional to write $f(n)=O(n^3)$ or $n^2=O(n^3)$. You are right that the equal sign does not mean equality in its ordinary sense. (Another example of abuse of the equality sign is the usage of equal sign to mean assignment in most programming languages, instead of more cumbersome :=
as in some languages.)
If we are only concerned about that one equality (I am starting to abuse language as well. It is not an equality; however, it is an equality since there is an equality sign in the notation or it could be construed as some kind of equality), $T(n)=O(f(n))$, this answer is done.
However, the question actually goes on. What does it mean by, for example, $f(n)=3n+O(log n)$? This equality is not covered by the definition above. We would like to introduce another convention, the placeholder convention. Here is the full statement of placeholder convention as stated in Wikipedia.
In more complicated usage, $O(cdots)$ can appear in different places in an equation, even several times on each side. For example, the following are true for $nto infty$.
$(n+1)^{2}=n^{2}+O(n)$
$(n+O(n^{1/2}))(n+O(log n))^{2}=n^{3}+O(n^{5/2})$
$n^{O(1)}=O(e^{n})$
The meaning of such statements is as follows: for any functions which satisfy each $O(cdots)$ on the left side, there are some functions satisfying each $O(cdots)$ on the right side, such that substituting all these functions into the equation makes the two sides equal. For example, the third equation above means: "For any function $f(n) = O(1)$, there is some function $g(n) = O(e^n)$ such that $n^{f(n)} = g(n)$."
You may want to check here for another example of placeholder convention in action.
You might have noticed by now that I have not used the set-theoretic explanation of the big $O$-notation. All I have done is just to show even without that set-theoretic explanation such as "$O(f(n))$ is a set of functions", we can still understand big $O$-notation fully and perfectly. If you find that set-theoretic explanation useful, please go ahead anyway.
You can check the section in "asymptotic notation" of CLRS for a more detailed analysis and usage pattern for the family of notations for asymptotic behavior, such as big $Theta$, $Omega$, small $o$, small $omega$, multivariable usage and more. The Wikipedia entry is also a pretty good reference.
Lastly, there is some inherent ambiguity/controversy with big $O$ notation with multiple variables,1 and 2. You might want to think twice when you are using those.
add a comment |
up vote
1
down vote
Prologue: The big $O$ notation is a classic example of the power and ambiguity of some notations as part of language loved by human mind. No matter how much confusion it have caused, it remains the choice of notation to convey the ideas that we can easily identify and agree to efficiently.
I totally understand what big $O$ notation means. My issue is when we say $T(n)=O(f(n))$ , where $T(n)$ is running time of an algorithm on input of size $n$.
Sorry, but you do not have an issue if you understand the meaning of big $O$ notation.
I understand semantics of it. But $T(n)$ and $O(f(n))$ are two different things. $T(n)$ is an exact number, But $O(f(n))$ is not a function that spits out a number, so technically we can't say $T(n)$ equals $O(f(n))$, if one asks you what's the value of $O(f(n))$, what would be your answer? There is no answer.
What matters is the semantics: how easily people can agree on a precise interpretation that describes the asymptotic behavior, or the time or space complexity, we are interested in. The default precise interpretation/definition of $T(n)=O(f(n))$, adapted from Wikipedia, is:
$T$ is a real- or complex-valued function and $f$ is a real-valued function, both defined on some unbounded subset of the positive real numbers, such that $f(n)$ is strictly positive for all large enough values of $n$. Then $T(n)=O(f(n))$ if the absolute value of $T(n)$ is at most a positive constant multiple of $f(n)$ for all sufficiently large values of $n$. That is, there exist a positive real number $M$ and a real number $n_0$ such that
$|T(n)|\leq M\,f(n)\quad\text{ for all }n\geq n_{0}.$
Please note that this interpretation is considered the definition. All other interpretations and understandings, which may help you greatly in various ways, are secondary and corollary. Everyone (well, at least every answerer here) agrees to this interpretation/definition/semantics. As long as you can apply this interpretation, you are probably good most of the time. Relax and be comfortable. You do not want to think too much, just as you do not think too much about some of the irregularities of English or French or most natural languages. Just use the notation according to that definition.
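As a small illustration of that definition (my own example, not part of the quoted text): take $T(n)=5n+3$ and $f(n)=n$. Then $|T(n)|=5n+3\leq 6n=M\,f(n)$ for all $n\geq 3$, so the witnesses $M=6$ and $n_0=3$ show that $5n+3=O(n)$.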
$T(n)$ is an exact number, But $O(f(n))$ is not a function that spits out a number, so technically we can't say $T(n)$ equals $O(f(n))$, if one asks you what's the value of $O(f(n))$, what would be your answer? There is no answer.
Indeed, there could be no answer, since the question is ill-posed. $T(n)$ does not denote an exact number. It stands for a function whose name is $T$ and whose formal parameter is $n$ (which is, loosely speaking, bound to the $n$ in $f(n)$). It is just as correct, and arguably more so, to write $T=O(f)$. If $T$ is the function that maps $n$ to $n^2$ and $f$ is the function that maps $n$ to $n^3$, it is also conventional to write $T(n)=O(n^3)$ or $n^2=O(n^3)$. You are right that the equal sign here does not mean equality in its ordinary sense. (Another example of abuse of the equal sign is its use to mean assignment in most programming languages, instead of the more cumbersome := used in some languages.)
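For concreteness, here is a minimal sketch (my own illustration in Python, not from any quoted source) of what the definition asks you to check for this specific pair $T(n)=n^2$, $f(n)=n^3$, using the candidate witnesses $M=1$ and $n_0=1$. A finite check like this cannot prove an asymptotic claim; it only shows what the inequality in the definition looks like in practice.

    # Illustrative only: a finite check of |T(n)| <= M * f(n) for n >= n0.
    # The choice of T, f, M and n0 below is mine, made for this example.
    def T(n):
        return n ** 2   # stands in for the running time n^2

    def f(n):
        return n ** 3   # the comparison function n^3

    M, n0 = 1, 1        # candidate witnesses for T(n) = O(f(n))

    # For n >= 1 we indeed have n^2 <= n^3, so no counterexample is found.
    assert all(abs(T(n)) <= M * f(n) for n in range(n0, 10_000))
    print("Consistent with n^2 = O(n^3) on the sampled range.")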
If we were only concerned with that one "equality", $T(n)=O(f(n))$, this answer would be done. (I am starting to abuse language as well: it is not an equality in the ordinary sense, although it is written with an equal sign and can be construed as a kind of equality.)
However, the question actually goes further. What does, for example, $f(n)=3n+O(\log n)$ mean? This usage is not covered by the definition above. We need another convention, the placeholder convention. Here is the full statement of the placeholder convention as given in Wikipedia.
In more complicated usage, $O(\cdots)$ can appear in different places in an equation, even several times on each side. For example, the following are true for $n\to\infty$.
$(n+1)^{2}=n^{2}+O(n)$
$(n+O(n^{1/2}))(n+O(\log n))^{2}=n^{3}+O(n^{5/2})$
$n^{O(1)}=O(e^{n})$
The meaning of such statements is as follows: for any functions which satisfy each $O(\cdots)$ on the left side, there are some functions satisfying each $O(\cdots)$ on the right side, such that substituting all these functions into the equation makes the two sides equal. For example, the third equation above means: "For any function $f(n) = O(1)$, there is some function $g(n) = O(e^n)$ such that $n^{f(n)} = g(n)$."
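To see how this plays out for the first equation (a check of my own, not part of the Wikipedia excerpt): $(n+1)^2 = n^2 + (2n+1)$, and the function $g(n)=2n+1$ satisfies $|g(n)|\leq 3n$ for all $n\geq 1$, so $g(n)=O(n)$. Hence $(n+1)^{2}=n^{2}+O(n)$ in the placeholder sense.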
You may want to check here for another example of the placeholder convention in action.
You might have noticed by now that I have not used the set-theoretic explanation of big $O$ notation. All I have done is show that, even without the set-theoretic explanation ("$O(f(n))$ is a set of functions"), we can still understand big $O$ notation fully. If you find the set-theoretic explanation useful, by all means use it.
You can check the section on asymptotic notation in CLRS for a more detailed analysis and usage patterns for the whole family of asymptotic notations, such as big $\Theta$, big $\Omega$, little $o$, little $\omega$, multivariable usage, and more. The Wikipedia entry is also a pretty good reference.
Lastly, there is some inherent ambiguity/controversy around big $O$ notation with multiple variables (see 1 and 2). You might want to think twice when using it in that setting.
answered 3 hours ago
Apass.Jack
5,5681531
add a comment |
Mediocre is a new contributor. Be nice, and check out our Code of Conduct.