Proficiency in big data provides a competitive advantage to banks. Banks have too often depended on traditional techniques such as aggregation and normalization of data, which carry several weaknesses: inflexibility in responding to upstream and downstream data changes, loss of data lineage after aggregation and summarization, and weakened data governance when several constituents share responsibility for an extended, multi-stage data flow. These weaknesses are detrimental to the success of big data initiatives, so a new approach is required.

Big data represents a new way for banks to interact with and leverage their data. Banks therefore need to shift the paradigm for designing, developing, deploying, and maintaining big data solutions, drawing on new approaches to data storage (e.g., NoSQL databases) and on maturing distributed-computation frameworks (e.g., Hadoop). The approach to implementation also needs to change: solutions should be deployed rapidly, iteratively, and incrementally, at a pace aligned with the speed at which the underlying data are produced, understood, and parsed. This will bring banks to an acceptable level of competency and capability. Read more at:

http://www.informationweek.in/informationweek/news-analysis/297426/mean-banks
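The lineage problem mentioned above can be made concrete with a minimal, hypothetical sketch (the data and names below are illustrative, not from the article): once row-level records are summarized into totals, the original rows, and with them the data lineage, can no longer be recovered from the summary alone.

```python
from collections import defaultdict

# Hypothetical raw, row-level transaction data (source system, account, amount).
transactions = [
    {"source": "core_banking", "account": "A-1", "amount": 120.0},
    {"source": "core_banking", "account": "A-2", "amount": -45.0},
    {"source": "cards",        "account": "A-1", "amount": 30.0},
]

# Traditional approach: aggregate before passing data downstream.
totals = defaultdict(float)
for t in transactions:
    totals[t["account"]] += t["amount"]

print(dict(totals))  # {'A-1': 150.0, 'A-2': -45.0}

# From `totals` alone there is no way to tell that A-1's balance came from
# two different source systems -- the lineage is lost in the summary.
```

A downstream consumer of `totals` cannot answer questions like "which source fed this figure?", which is exactly why multi-stage aggregation pipelines weaken governance unless lineage is tracked explicitly.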