“Big Data is data that exceeds the processing capacity of conventional database systems. It is too big, moves too fast, or doesn't fit the structures of traditional database architectures. To gain value from this data, we must choose an alternative way to process it. Big Data is commonly characterized by volume, velocity, variety and veracity.”
The meaning of this statement varies across organizations, as each interprets Big Data differently depending on its technologies and market pressures. The popular consensus, however, is that an organization using Big Data can, to a certain extent, extract value from data that would previously have been considered 'dead'.
P&G Turned to Big Data to Predict Demand
New products, fresh markets and increased economic uncertainty have created pervasive volatility, and today’s supply chain requires technology that can access and analyze as much information as possible to forecast consumer demand accurately. P&G, for example, used big data from Wal-Mart stores to find out how many consumers visited Wal-Mart in the summer to purchase sunscreen.
Obama’s Campaign Used Big Data to Rally Individual Voters
Team Obama was able to predict who was going to vote online, identify people who were going to vote by mail, and select volunteers accordingly. In the end, modeling became something far bigger for them in 2012 than it had been in 2008.
Delta Airlines Identified Customer Pain Points and Solved Them
All airlines know that a primary concern for passengers is lost baggage, especially on delayed flights where missed connections are involved. Delta employees looked deeper into their data and created a solution to remove the uncertainty of where a passenger’s bag might be.
To tackle this problem, they developed a solution wherein customers snap a photo of their baggage tag using the “Track My Bag” feature on the Delta app and then keep tabs on their luggage as it makes its way to its final destination. Even if a bag doesn’t make it onto the booked flight, passengers can save time by tracking it through the application. Finding a new way to put big data to use for the benefit of their passengers put Delta out front in a competitive market.
Big Data is still a niche technology worldwide; however, many MNCs in India have already adopted it to improve efficiency, decision-making, and sales. The biggest challenges companies face are integrating the technology into existing systems and business models, recognizing what data to accumulate, and capturing that data from all areas of the business.
In the future, Big Data will present a vast opportunity for businesses to uncover important insights by applying principles of statistics, artificial intelligence and mathematics. Big Data analytics can help discover hidden patterns, develop innovative products, enhance customer intimacy, and improve supply chain effectiveness.
Some of the interesting technologies applied to handling big data are:
Apache Hadoop – Apache Hadoop is a key technology for handling big data and its analytics. Its major advantages are that it does not rely on high-end hardware and that it detects and handles failures at the application layer.
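The MapReduce pattern at the heart of Hadoop can be illustrated without a cluster. Below is a minimal sketch of the classic word-count job in plain Python; the function names (`map_phase`, `reduce_phase`) and the sample documents are illustrative, not part of any Hadoop API.

```python
from collections import defaultdict

def map_phase(document):
    # Mapper: emit a (word, 1) pair for every word in the input split.
    for word in document.lower().split():
        yield (word, 1)

def reduce_phase(pairs):
    # Reducer: sum the counts for each key (word).
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data moves fast", "big data is big"]
pairs = [p for d in docs for p in map_phase(d)]
counts = reduce_phase(pairs)  # e.g. counts["big"] == 3
```

In a real Hadoop job the map and reduce phases run on many machines, with the framework shuffling the intermediate pairs between them; the logic per record is the same.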
Massively Parallel Processing (MPP) – Also known as a “loosely coupled” or “shared nothing” system. An MPP system runs a program on multiple processors that coordinate with each other, with each processor using its own operating system and memory and working on a different part of the program.
Distributed File System – Also known as a “Network File System”. Client nodes are allowed to access files through a computer network; as a result, users working on multiple machines can share files and storage resources. Client nodes do not get direct access to the underlying block storage; instead, they interact with the storage through a network protocol.
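The access pattern described above can be mimicked in a few lines: clients reach files only through a protocol-style interface exposed by a server, never through the block storage behind it. The class names and the sample file are invented for illustration; a real distributed file system would involve an actual network protocol such as NFS rather than in-process method calls.

```python
class FileServer:
    def __init__(self):
        # Block storage lives behind the server and is never
        # handed to clients directly.
        self._blocks = {"report.txt": b"quarterly numbers"}

    def read(self, path):
        # Protocol endpoint: the only way clients get at the data.
        return self._blocks[path]

class ClientNode:
    def __init__(self, server):
        self.server = server  # no direct block-storage handle

    def open_file(self, path):
        return self.server.read(path)

server = FileServer()
clients = [ClientNode(server) for _ in range(3)]
# Several client nodes share the same file through the protocol layer.
contents = [c.open_file("report.txt") for c in clients]
```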
Data Intensive Computing – Data Intensive Computing is a class of parallel computing applications that use a data-parallel approach to process big data. It works on the principle of collocating the programs or algorithms used to perform the computation with the data itself.
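The collocation principle can be shown with a toy example: instead of pulling all raw data to one place, the computation (here a simple `max`) runs against each node's local partition, and only small partial results travel. The node names and the data are made up for illustration.

```python
# Each "node" holds its own partition of the data.
nodes = {
    "node-a": [3, 1, 4, 1, 5],
    "node-b": [9, 2, 6, 5, 3],
    "node-c": [5, 8, 9, 7, 9],
}

def local_max(partition):
    # This runs "on" the node that already stores the partition,
    # so the raw data never crosses the (hypothetical) network.
    return max(partition)

# Only one small number per node travels back to the coordinator.
partials = {name: local_max(part) for name, part in nodes.items()}
global_max = max(partials.values())
```

Moving the algorithm to the data rather than the data to the algorithm is what makes this approach pay off when the partitions are terabytes rather than five integers.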