Home to the largest and most complex science experiment ever built, the Large Hadron Collider (LHC), CERN is trying to uncover the secrets of the universe. Its major experiments are all firmly in the domain of Big Data. How? Well, the LHC alone generates around 30 petabytes of data every year. CERN relies on data warehousing and Big Data analytics tools to make sense of such vast amounts of raw data. Crunching the data collected from 600 million particle collisions per second requires enormous processing power. Distributed computing gives CERN access to processing power and storage capacity that would be far too costly to build into a single data center. It also means researchers anywhere in the world can access the data at greater speed. Read more at: https://www.linkedin.com/pulse/big-data-uncovering-secrets-our-universe-cern-bernard-marr
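To get a feel for these figures, here is a minimal back-of-envelope sketch in Python, using only the two numbers quoted above (30 PB stored per year, 600 million collisions per second); the constants and variable names are illustrative, not part of CERN's actual pipeline. It shows that 30 PB/year works out to roughly 1 GB/s of sustained storage, and less than 2 bytes per collision on average, which is why the vast majority of raw collision data must be filtered out before it is ever stored.

```python
# Back-of-envelope estimate from the figures quoted above.
# These are the article's round numbers, not official CERN specifications.

PETABYTE = 10**15                   # bytes (decimal petabyte)
SECONDS_PER_YEAR = 365 * 24 * 3600  # ~3.15e7 seconds

stored_per_year = 30 * PETABYTE     # ~30 PB written to storage each year
collisions_per_second = 600_000_000

# Average sustained write rate implied by 30 PB/year.
avg_bytes_per_second = stored_per_year / SECONDS_PER_YEAR

# Stored bytes per collision if every collision contributed equally.
# In reality, trigger systems discard almost all events, so the few
# that survive are recorded in far more detail than this average.
bytes_per_collision = avg_bytes_per_second / collisions_per_second

print(f"Average write rate: {avg_bytes_per_second / 10**9:.2f} GB/s")
print(f"Stored data per collision: {bytes_per_collision:.2f} bytes")
```

Running this prints an average write rate of about 0.95 GB/s and roughly 1.6 bytes per collision, which makes clear why distributed storage and aggressive event filtering are both essential.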