No, I'm saying it reduces the load of fetching them... you know, unless you forgot how SQL works, you can provide a LIMIT, and it will stop fetching after it reaches that limit. The speed is impressive, but it's nothing someone else couldn't match with a bit of effort.
It still has to sort it all, based on your search criteria, before it can apply the limit. Sorting is generally pretty fast with a decent amount of memory, but sorting a database that doesn't fit into main memory takes something like:
2p * (1 + log_b(p/b))
I/O operations, where b is the number of pages that can fit in memory at once and p is the number of pages being sorted. The result is the number of disk I/Os required, and disk I/O is the bottleneck operation.
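To put rough numbers on that formula (the page counts below are made up purely for illustration):

```python
import math

def external_sort_ios(p, b):
    # Plug straight into the formula above: 2p * (1 + log_b(p/b)).
    # In practice the log term is a whole number of merge passes (rounded up).
    return 2 * p * (1 + math.log(p / b, b))

# Hypothetical workload: 1,000,000 data pages, 1,000 pages of buffer memory.
print(external_sort_ios(1_000_000, 1_000))  # 4000000.0 page I/Os
```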
Since Google doesn't know what you're going to type before you type it, that sort has to be done every time somebody types something, unless they've come up with a different way of indexing; that is certainly the way it's done in the popular relational databases (Oracle, MSSql, and MySQL for sure).
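For what it's worth, you can watch the sort-before-limit behaviour in a query planner. A quick sketch with Python's built-in sqlite3 (not the big commercial engines the post names, and the table/column names are invented), so the exact plan text will vary by engine and version:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, score REAL)")

# No index on score: even with LIMIT, the planner has to sort first.
plan = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM docs ORDER BY score DESC LIMIT 10"
)
print(plan.fetchall())   # typically includes 'USE TEMP B-TREE FOR ORDER BY'

# With an index, rows can be read in sorted order and the scan stops after 10.
con.execute("CREATE INDEX idx_score ON docs (score)")
plan = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM docs ORDER BY score DESC LIMIT 10"
)
print(plan.fetchall())   # typically 'SCAN docs USING INDEX idx_score'
```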
For more information on how data is sorted in a database, Google (ha) for "external mergesort".
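And if you'd rather see the shape of the algorithm than read the textbook description, here's a toy Python sketch of the two-phase idea (sorted runs spilled to temp files, then one k-way merge; real implementations merge in multiple passes, which is where the log term above comes from):

```python
import heapq
import tempfile

def external_mergesort(values, run_size):
    """Toy external mergesort: run_size stands in for 'pages that fit in memory'."""
    runs = []
    # Phase 1: sort memory-sized runs and spill each one to its own temp file.
    for start in range(0, len(values), run_size):
        run = sorted(values[start:start + run_size])
        f = tempfile.TemporaryFile(mode="w+")
        f.writelines(f"{v}\n" for v in run)
        f.seek(0)
        runs.append(f)
    # Phase 2: k-way merge of the sorted runs, streaming from the files.
    return list(heapq.merge(*[map(int, f) for f in runs]))

print(external_mergesort([5, 3, 8, 1, 9, 2, 7, 4, 6, 0], run_size=3))
# [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```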