How Shell uses Big Data in practice
With rising costs and dwindling natural resources, it’s crucial that drilling takes place in the locations that will provide the biggest rewards. With this in mind, Shell collects data that allows them to predict the likely size of oil and gas resources by monitoring low-frequency seismic waves below the earth’s surface. These waves register differently on sensors as they travel through the earth’s crust, depending on whether they are passing through solid rock, liquids or gaseous material – indicating the probable location of hydrocarbon deposits.
Until recently, this method often proved unreliable; time-consuming, costly exploratory drills would usually be needed to confirm findings. Now, with an increased ability to capture, monitor, store and interpret large volumes of data, Shell’s capability to identify reserves has improved. While a previous survey might have involved taking a few thousand readings, today it will typically involve more than a million readings. This data is then uploaded to analytics systems and compared with data from other sites around the world. The more closely the data matches the profiles of other drilling sites with abundant reserves, the higher the probability that a full-scale drilling operation will pay off.
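The matching step described above can be thought of as a similarity search: summarise a candidate site’s seismic readings as a feature vector and compare it against the profiles of known sites. The sketch below is purely illustrative – Shell does not publish its methods, and the site names, frequency-band features and choice of cosine similarity are all hypothetical assumptions.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def best_match(candidate, known_profiles):
    """Return the known site whose profile most resembles the candidate's."""
    return max(known_profiles.items(),
               key=lambda item: cosine_similarity(candidate, item[1]))

# Toy seismic "profiles": average sensor response in three frequency bands.
known_profiles = {
    "productive_site_A": [0.9, 0.4, 0.1],
    "dry_site_B": [0.2, 0.8, 0.7],
}
candidate = [0.85, 0.45, 0.15]

site, profile = best_match(candidate, known_profiles)
print(site)  # → productive_site_A
```

In practice the feature vectors would contain thousands of dimensions and the comparison would run over a global library of surveyed sites, but the principle – rank candidates by how closely they resemble sites with proven reserves – is the same.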
The technical details
When surveying potential new sites, Shell uses fibre-optic cables and sensor technology from Hewlett-Packard. All this data is stored and analysed using a Hadoop infrastructure that runs on Amazon Web Services servers. While the company doesn’t publicise data volumes, it is known that the first test of the system collected around one petabyte of information, and it is estimated that Shell’s data-driven oilfield approach has so far generated around 46 petabytes. Shell’s analytics team is thought to consist of around 70 people.
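Hadoop handles volumes of this size by splitting work into a map phase, a shuffle that groups intermediate results by key, and a reduce phase that aggregates each group. The toy sketch below runs in memory to show that pattern; the sensor names, record layout and mean-reading aggregation are illustrative assumptions, not details of Shell’s actual pipeline.

```python
from collections import defaultdict

# Toy records: (sensor_id, reading) pairs, standing in for the streams
# a Hadoop cluster would process across many machines.
records = [("sensor_1", 0.42), ("sensor_2", 0.51),
           ("sensor_1", 0.38), ("sensor_2", 0.49)]

def map_phase(record):
    """Map: emit a key/value pair per input record."""
    sensor_id, reading = record
    yield sensor_id, reading

def shuffle(pairs):
    """Shuffle: group all values under their key."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(key, values):
    """Reduce: aggregate each group (here, the mean reading per sensor)."""
    return key, sum(values) / len(values)

pairs = [pair for record in records for pair in map_phase(record)]
results = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(results)
```

On a real cluster the map and reduce functions run in parallel across the nodes holding the data, which is what makes petabyte-scale analysis tractable; the logic each function carries is no more complicated than this.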
Ideas and insights you can steal
As one of the largest companies in the world, Shell is undoubtedly able to invest heavily in new technology. But their reason for moving to a data-driven approach will ring true among companies of any size: a need to achieve better, more reliable results while reducing costs and risk. With the vast volumes of data available these days, this is a goal that almost any company can aim for and achieve. And this doesn’t have to mean investing in expensive proprietary technology – third-party data, whether it’s freely available public data or part of a low-cost paid-for data solution, can provide a wealth of business-critical insights.
You can read more about how Shell is using Big Data to drive success in Big Data in Practice: How 45 Successful Companies Used Big Data Analytics to Deliver Extraordinary Results.
Bernard Marr is a bestselling author, keynote speaker, and advisor to companies and governments. He has worked with and advised many of the world's best-known organisations. LinkedIn has recently ranked Bernard as one of the top 10 Business Influencers in the world (in fact, No. 5 – just behind Bill Gates and Richard Branson). He writes on the topics of intelligent business performance for various publications including Forbes, HuffPost, and LinkedIn Pulse. His blogs and SlideShare presentations have millions of readers.