I’m not sure why a lot of people are jumping on the bandwagon about the word “scalable”. I read another article the other day that said it simply means “it can grow”. This is far from the truth.
Scalable has a well-understood meaning: something is scalable if its architecture lets it grow to any given size without inordinate amounts of upgrades. If you had to triple your web servers just to handle 1,000 more hits a day, that’s not scalable.
Computer scientists have always had a way to describe the performance of an algorithm, which is essentially its scalability. It’s called “Big O notation”.
Something that is O(1) costs the same no matter how big the problem gets. O(N) grows linearly with the problem: double the number of hits and you double the number of web servers.
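A quick sketch of the difference, using nothing but Python’s built-in structures (the function names are mine, just for illustration):

```python
# O(1) vs O(N), roughly: a dict lookup costs about the same however
# many entries it holds; a linear scan of a list costs time
# proportional to its length.

def constant_time_lookup(index, key):
    # dict lookup: roughly O(1) -- cost doesn't grow with len(index)
    return index.get(key)

def linear_scan(records, key):
    # list scan: O(N) -- double the records, double the comparisons
    for record in records:
        if record == key:
            return record
    return None
```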
Once you get beyond O(N), the problem can grow faster than you can scale your resources. For example, doubling the input to an O(N^2) (that’s N-squared) problem requires quadrupling the resources.
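You can see that quadrupling with a toy example. Comparing every item against every other item is O(N^2), so doubling N roughly quadruples the work:

```python
# Quadratic growth in miniature: count the comparisons made by a
# nested loop over N items.

def count_pair_comparisons(n):
    comparisons = 0
    for i in range(n):
        for j in range(n):
            comparisons += 1
    return comparisons

print(count_pair_comparisons(1000))   # 1,000,000 comparisons
print(count_pair_comparisons(2000))   # 4,000,000 -- double the input, quadruple the work
```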
The reason encryption works is that decryption without the key is not a scalable problem. In a symmetric key system such as DES or AES, adding a single bit to the key doubles the complexity of the problem. By the time you get to 128 bits of keyspace, checking every key would take more energy and time than the universe has to offer.
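The back-of-the-envelope arithmetic is easy to check. The rate below (a trillion keys per second) is an assumption I’ve picked just to show the scale; pick a rate a million times faster and the conclusion doesn’t change much:

```python
# Brute-forcing a 128-bit key, order-of-magnitude only.

keys = 2 ** 128                      # total keyspace
rate = 10 ** 12                      # assumed keys checked per second
seconds_per_year = 60 * 60 * 24 * 365

years = keys / (rate * seconds_per_year)
print(f"{years:.2e} years")          # on the order of 10^19 years
```

For comparison, the universe is only on the order of 10^10 years old.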
Of course, a lot of people don’t like to talk about scalability because it’s easier to slap together a system, collect the bonus, and leave the maintenance to someone else. But a good, scalable architecture that actually solves the IT problem is vital.