
Thinking big, really big: "Big data" a force banks must deal with

A new term is cropping up more and more often in technology circles: “Big data.”

It refers to the massive amount of digital data that’s continually created, used, modified, and stored in databases around the world. It’s data that’s entered, one way or another, through computers of all sorts, telephones, mobile devices, and other means, including the coding that attaches to the raw data.

When people say “big data,” they mean really, really big. A study by IDC estimates that 1.8 zettabytes of data will be created and replicated in 2011. In terms of sheer volume, 1.8 zettabytes is equivalent to:

•    Every person in the United States tweeting three tweets per minute for 26,976 years nonstop.

•    Every person in the world having over 215 million high-resolution MRI scans per day.

•    Over 200 billion 2-hour HD movies, which would take one person 47 million years to watch if they watched 24/7.
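Comparisons like the last one can be sanity-checked with quick arithmetic. The sketch below assumes roughly 9 GB per 2-hour HD movie, an illustrative figure not given in the article, and works back from 1.8 zettabytes:

```python
# Back-of-the-envelope check of the HD-movie comparison.
ZETTABYTE = 10**21  # bytes

total_bytes = 1.8 * ZETTABYTE

# Assumed size of one 2-hour HD movie (~9 GB); pick a different
# bitrate and the movie count shifts proportionally.
movie_bytes = 9 * 10**9

movies = total_bytes / movie_bytes   # about 2e11, i.e. 200 billion movies
hours = movies * 2                   # each movie runs 2 hours
years = hours / (24 * 365)           # watching around the clock

print(f"{movies:.2e} movies, about {years / 1e6:.0f} million years of viewing")
```

Under that assumed movie size, the arithmetic lands near the article's figures: on the order of 200 billion movies and tens of millions of years of nonstop viewing.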

As IBM said in releasing a new product designed to analyze big data, banks, insurance companies, healthcare organizations, and communications services providers are required by industry regulators to retain massive amounts of data—in some cases up to a decade. As data retention laws continue to evolve, organizations are faced with a unique challenge to store and analyze ever-expanding “big data” sets that may not be directly related to daily operations, yet still hold potential business value.

At the same time, merely handling the volume of data is not enough. A study by Gartner concludes the real issue is making sense of big data and finding patterns in it that help organizations make better business decisions.

Granted, thinking about big data is very big-picture stuff and by its nature applies far and wide. Still, the following three articles ought to give bank CIOs a better understanding of opportunities and challenges the rapidly expanding digital generation will bring in the not-so-distant future.

World’s data more than doubling every two years—driving “big data” opportunity, new IT roles  
John Ginovsky

John Ginovsky is a contributing editor of Banking Exchange and editor of the publication’s Tech Exchange e-newsletter. For more than two decades he’s written about the commercial banking industry, specializing in its technological side and how it relates to the actual business of banking. In addition to his weekly blogs—"Making Sense of It All"—he contributes fresh, original stories to each Tech Exchange issue based on personal interviews or exclusive contributed pieces. He previously was senior editor for Community Banker magazine (which merged into ABA Banking Journal) and for ABA Banking Journal and was managing editor and staff reporter for ABA’s Bankers News. Email him at [email protected].
