A 64-bit system can address up to 16 exabytes of RAM, and machines with 128 GB of RAM or more are becoming common in this era of cloud computing and big data. Even so, big data sets are outgrowing memory: in some cases they no longer fit in RAM even when spread across a cluster of heavily provisioned machines. Researchers at MIT built a cluster called BlueDBM that uses Solid-State Drives (SSDs) to work around this memory bottleneck. They also moved some of the computational power off the servers and onto chips attached to the flash storage. By pre-processing known parts of the data on the flash drives before passing results back to the servers, these chips made distributed computation much more efficient, and the work done on the chips avoids the overhead of running an operating system. Read more at: http://www.itworld.com/article/2947839/big-data/mit-comes-up-with-a-no-memory-solution-for-big-data.html
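The 16-exabyte figure follows directly from the width of a 64-bit address space. A quick sanity check (a sketch using binary units, i.e. EiB, and a hypothetical 128 GiB machine size for scale):

```python
# A 64-bit address space spans 2^64 bytes.
address_space_bytes = 2 ** 64
exbibyte = 2 ** 60  # 1 EiB (binary exabyte)

# 2^64 / 2^60 = 2^4 = 16, the "16 exabytes" quoted for 64-bit systems.
print(address_space_bytes // exbibyte, "EiB")

# For scale: how many 128 GiB machines would it take to hold
# that much data entirely in RAM? (illustrative, not from the article)
machine_ram = 128 * 2 ** 30  # 128 GiB per machine
print(address_space_bytes // machine_ram, "machines")
```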