# What is Big O

Big O is a way for us to talk generally about how long an algorithm takes to run, or in other words its ‘runtime’. Specifically, we look at how this runtime changes as the algorithm is given more and more data. We don’t measure the runtime in minutes or seconds; instead, we measure it relative to the size of the input, which we call *n*.

Remember (and we’ll repeat this a lot in this tutorial), Big O isn’t about the exact time an algorithm will take to run, so we can’t say it’ll take 5 minutes. Instead, we say the algorithm runs in *n* time (or, as some people put it, it has a time complexity of *n*). We write this as O(n), read as ‘big O of n’.

Saying an algorithm runs in *n* time means its runtime grows at the same rate *n* grows. If an input of size 5 takes 10 minutes to run, then an input of size 10 will take roughly 20 minutes. Double the input and the runtime doubles; multiply it by 10 and the runtime increases by a factor of 10. There are many different ways the runtime can change as the input grows, not just *n*.
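To make this concrete, here is a small sketch in Python (the function name and the example inputs are just illustrative). A linear search looks at each element once, so in the worst case the number of comparisons it does grows in direct proportion to the input size — a classic O(n) algorithm:

```python
def contains(items, target):
    """Linear search: checks each element at most once, so the work
    grows in direct proportion to len(items) — that is, O(n)."""
    for item in items:
        if item == target:
            return True
    return False

# In the worst case (target absent), searching 1,000 items does
# ~1,000 comparisons; searching 2,000 items does ~2,000.
# Double the input, double the work.
print(contains(list(range(1000)), 999))    # True (found)
print(contains(list(range(1000)), 2000))   # False (checked every item)
```

Note that we count *comparisons*, not seconds: the same code runs faster on a faster machine, but the growth rate as *n* increases stays the same.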