I was on the phone trying to explain to my 5-year-old niece Skylar what I’ve been learning.
This is the first and most basic intro to Big O: learning about performance. It’s the foundation for things like AI systems that follow a user around using graph algorithms, or recommendations using k-nearest neighbors.
I was going to use the phone book or the guess-the-number game, but soon realized she probably doesn’t even know what a phone book is. So I used a variation of guess-the-number with a Mario picture instead.
The simple (or stupid?) search would be letting Mario hit every single block, one at a time, until he finds the mushroom.
The problem is, this might be quick if the mushroom is in the very first block, but it will be very slow if the mushroom is in the very last one.
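Mario’s block-by-block hunt can be sketched in a few lines of Python. The function and variable names here are my own, just for illustration:

```python
def simple_search(blocks, target):
    """Hit every block in order until we find the target."""
    for i, block in enumerate(blocks):
        if block == target:
            return i  # found it after i + 1 hits
    return -1  # no mushroom anywhere

blocks = ["coin", "coin", "mushroom", "coin", "coin"]
print(simple_search(blocks, "mushroom"))  # 2 (third block)
```

If the mushroom is in the first block, the loop stops immediately; if it’s in the last block, Mario hits every single one.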
So I told her there is one mushroom inside all of those blocks, and every time Mario jumps into one of them, I’ll tell her whether the mushroom is closer to the left or the right. With that search we have reduced the number of “guesses” we need to find the mushroom: each guess rules out half of the remaining blocks.
Suppose you’re looking for a word in the dictionary (well, she doesn’t have one since she just Googles everything, but let’s pretend she has one at home).
The dictionary might have 100,000 words in it, so in the worst case simple search could take 100,000 steps! With each step of binary search you cut the number of words in half until you’re left with only one word, so binary search will take at most 17 steps. A big difference!
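Here is a small sketch of binary search over a sorted word list, playing the same “closer to the left or right” game. The word list is made up for the example:

```python
def binary_search(sorted_list, target):
    """Repeatedly halve the search range until we find the target."""
    low, high = 0, len(sorted_list) - 1
    steps = 0
    while low <= high:
        steps += 1
        mid = (low + high) // 2
        guess = sorted_list[mid]
        if guess == target:
            return mid, steps
        elif guess < target:
            low = mid + 1   # "it's to the right"
        else:
            high = mid - 1  # "it's to the left"
    return -1, steps        # not in the list

words = ["apple", "banana", "cherry", "grape", "mango", "peach", "plum"]
print(binary_search(words, "mango"))  # (4, 3): found at index 4 in 3 guesses
```

Note that binary search only works because the list is sorted, which is exactly why a dictionary (or a row of blocks with a known direction hint) is a good analogy.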
How much time do you save by using it?
With Mario’s simple search, if there are 100 blocks, it could take up to 100 guesses; if there are 1 million blocks, it could take up to 1 million guesses. The maximum number of guesses is the same as the size of the list. This is called linear time.
Binary search is different: if the list has 100 items, you need at most 7 guesses; if the list has 4 billion items, it takes at most 32 guesses. This is called logarithmic time.
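Those worst-case guess counts come from halving: with n items, binary search needs at most ceil(log2(n)) guesses. A quick check of the numbers above:

```python
import math

def max_guesses(n):
    # Each guess halves the list, so at most ceil(log2(n)) guesses are needed.
    return math.ceil(math.log2(n))

print(max_guesses(100))            # 7
print(max_guesses(100_000))        # 17
print(max_guesses(4_000_000_000))  # 32
```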
As for the coding part, since that’s too technical for Skylar, here is an article that already explains it well enough.