Big Data Demands Keep Growing


Are you the type of person who looks forward to working with numbers? Are you easily excited by large amounts of data and drawn to the challenge of identifying micro trends within that data - a data super sleuth, if you will? Then perhaps a career as a Big Data analyst might be the opportunity you are looking for.

Companies such as IBM are beefing up employee expertise in the fast-growing field of data consulting. These data specialists bring insights on how to help businesses make sense of the explosion of data sources, including data from Web traffic and social network comments, as well as from software and sensors that monitor shipments, suppliers and customers. Companies are quickly learning that they can mine this data for insights that help guide them to important decisions, trim costs and lift sales.

To exploit the data flood, companies are going to need many more data analysts in the coming years. A report last year by the McKinsey Global Institute, the research arm of the consulting firm, projected that the United States needs 140,000 to 190,000 more workers with “deep analytical” expertise and 1.5 million more data-literate managers, whether retrained or hired.

According to a recent NYT article, “It’s a revolution,” says Gary King, director of Harvard’s Institute for Quantitative Social Science. “We’re really just getting under way. But the march of quantification, made possible by enormous new sources of data, will sweep through academia, business and government. There is no area that is going to be untouched.”

What is Big Data? It is shorthand for both the data itself and the toolsets for collecting, storing and analyzing it, advancing trends in technology that open the door to a new approach to understanding the world and making decisions. There is a lot more data, all the time, growing at 50 percent a year, or more than doubling every two years, estimates IDC, the technology research firm. It’s not just more streams of data, but entirely new ones. For example, there are now countless digital sensors worldwide in industrial equipment, automobiles, electrical meters and shipping crates. They can measure and communicate location, movement, vibration, temperature, humidity, even chemical changes in the air.

We recently discussed the possibility that Big Data might be more appropriately named Fast Data. How does Big Data compare with Open Data? Which holds the promise of delivering more value to CIOs and their organizations?

Please see Open Data Vs. Big Data

Big Data describes the process of collecting, storing, and analyzing fragments of information that can be rapidly assembled to identify subtle macro trends or create actionable profiles that precisely target unique individuals.

Open Data, on the other hand, describes a corporate policy that permits certain data to be freely available to everyone to use and republish as they wish, without restrictions from copyright, patents or other forms of control.

Each day, organizations and individuals create nearly 2.5 quintillion bytes of data. At that rate, nearly 90% of the data in the world today has been created in the last two years alone. This data comes, for example, from sensors used to gather climate information, posts to social media sites, digital pictures and videos, purchase-transaction records, and cell phone GPS signals.

The latest phase of the information revolution has brought us face to face with the Age of Big Data. Web companies such as Google, Facebook, and Twitter have organically grown into experts at harnessing the data of the Web. This data is produced from online searches, blog posts, social commentary, and of course brief messages, or tweets.

The goal is to match the proper (i.e., most relevant) advertisement with the intent of the searcher or Web consumer in order to drive Internet advertising sales.

At the World Economic Forum last month in Davos, Switzerland, Big Data was a leading topic. A report by the forum, “Big Data, Big Impact,” declared data a new class of economic asset, like currency or gold.

Data is not only growing more rapidly than ever before, but it is also more easily processed by computers. Most of the Big Data produced today is known as unstructured data. Examples include the words, images and video on the Web and those streams of sensor data. Many of these data streams are stored as BLOBs (Binary Large Objects) and do not fit easily into traditional database management systems for processing. They require a special set of tools for filtering, processing, and loading. The typical ETL process (Extract, Transform, Load) is now supplemented with specialized Big Data tools, many of which are available in the Open Source world. The problem is that many organizations do not know how to leverage them.
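To make the ETL idea concrete, here is a minimal sketch in Python using only the standard library. The input file sensor_readings.jsonl, its field names, and the output table are hypothetical examples chosen for illustration, not tools named in this article; a real Big Data pipeline would swap in distributed, open-source tooling for the same three steps.

import json
import sqlite3

def extract(path):
    """Extract: read raw, semi-structured sensor records from a JSON-lines file."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)

def transform(records):
    """Transform: drop incomplete readings and normalize temperature to Celsius."""
    for r in records:
        if "device_id" not in r or "temp_f" not in r:
            continue  # skip malformed records rather than failing the whole batch
        yield (r["device_id"], (r["temp_f"] - 32) * 5.0 / 9.0, r.get("ts"))

def load(rows, db_path="warehouse.db"):
    """Load: insert cleaned rows into a relational store for downstream analysis."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS readings (device_id TEXT, temp_c REAL, ts TEXT)")
    con.executemany("INSERT INTO readings VALUES (?, ?, ?)", rows)
    con.commit()
    con.close()

if __name__ == "__main__":
    # Hypothetical input file; in practice this would be a stream from sensors or logs.
    load(transform(extract("sensor_readings.jsonl")))

The extract-transform-load shape stays the same as the data grows; the specialized tools mentioned above mainly change where each step runs and how much data it can handle at once.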

The computer tools for gleaning knowledge and insights from the mountains of unstructured data are fast gaining ground. Rapidly advancing artificial intelligence techniques, such as natural-language processing, pattern recognition, machine learning, and contextual language processing, are leading the way.
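As a small illustration of what machine learning on unstructured text can look like, here is a sketch using the open-source scikit-learn library, one of the freely available tools of the kind described above; the handful of training comments and labels are made up for the example.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy, made-up social-media comments; a real pipeline would pull these from the Web.
comments = [
    "love the new phone, battery lasts all day",
    "shipping was fast and the package arrived intact",
    "terrible support, still waiting for a refund",
    "the app crashes every time I open it",
]
labels = ["positive", "positive", "negative", "negative"]

# Bag-of-words features plus Naive Bayes: a classic baseline for text classification.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(comments, labels)

# Classify a new, unseen comment.
print(model.predict(["the battery died and support never answered"]))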

The wealth of new data, in turn, accelerates advances in computing, a virtuous circle of Big Data. Machine-learning algorithms, for example, learn from data, and the more data there is, the more the machines learn.
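That feedback loop is easy to see in miniature. The sketch below trains the same classifier on progressively larger slices of scikit-learn's bundled handwritten-digits dataset (a stand-in chosen for this example, not data discussed in this article); exact numbers will vary, but accuracy generally climbs as the training set grows.

from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for n in (50, 200, 800, len(X_train)):
    # Same algorithm each time; only the amount of training data changes.
    clf = LogisticRegression(max_iter=5000)
    clf.fit(X_train[:n], y_train[:n])
    print(f"trained on {n:4d} examples -> test accuracy {clf.score(X_test, y_test):.2f}")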

 

Seeing The Big Picture with Big Data

The capacity of the enterprise to mesh with its physical and social environment will become increasingly vital to its survival in the future. Without such a grounded relationship, poor decision making based on an inwardly focused mindset will continue to drive many large enterprises to bankruptcy.

It will also be insufficient to plan just one, two or five years ahead. Although near-term sales and cost forecasts are important, understanding the bigger shifts likely to impact all businesses in a future dominated by climate change, geopolitics and globalization will be even more essential to survival, allowing a better balance of creative planning, adaptive resilience and risk avoidance.

In fact, enterprises, particularly the biggest, have a poor history of seeing the big picture. The larger the enterprise, the more likely it is to believe in its own invincibility in the marketplace.

More recently, GM and Chrysler went virtually bankrupt and had to be bailed out by the public purse because they would not or could not see the obvious shift in consumer sentiment toward smaller cars with lower fuel usage. Then there were AIG, Lehman Brothers, Citibank and Fannie Mae, which also thought they were too big to fail. And now giants such as Kodak and Sony, among many others, are struggling.

 

In all the above cases, enterprise management ignored the signals coming loud and clear from their environments via consumers and customers, through a combination of ignorance and arrogance. In the meantime, other, more agile companies such as Microsoft and Toyota picked up the vibes and exploited the opportunities. But then Microsoft almost lost the plot to Google by not seeing the emerging power of the Internet as the dominant driver of information in today's society.

Companies are really starting to realize that when Big Data speaks, they should listen. As an information systems analyst, you can help them speak the right language.

 

 


Bill has been a member of the technology and publishing industries for more than 25 years and brings extensive expertise to the roles of CEO, CIO, and Executive Editor. Most recently, Bill was COO and Co-Founder of CIOZone.com and the parent company PSN Inc. Previously, Bill held the position of CTO of both Wiseads New Media and About.com.
