When it comes to making the big business decisions required in the oil and gas industry, major companies like Shell, BP, Chevron, BHP Billiton, Hess, Total and ExxonMobil rely heavily on Big Data to improve efficiency and reduce operating costs. After all, you don’t break ground on a $100 million well without some numbers to back up the decision.
To drive decision-making, oil and gas giants employ massive amounts of geologic, operational and performance data to optimize drilling parameters, monitor pipelines and equipment, and foster predictable, precise approaches to equipment maintenance.
A recent article in Forbes described oil and gas giant Royal Dutch Shell’s process of gathering, storing and transferring the data it uses to make billion-dollar drilling decisions. Shell places sensors deep inside wells and drilling equipment that relay constant streams of information via fibre optic cable. The data is then moved to cloud-based storage servers maintained by FileCatalyst partner, Amazon Web Services, where it can be analyzed and dissected by industry professionals.
Big Data in Oil & Gas
Oil and gas companies regularly make use of files containing geophysical and seismic data – even the smallest of which run from two to five GB each. Multiply that by just 20 such “small scale” files and you’re looking at 40 GB or more of data!
Moving massive files is challenging enough in urban landscapes with existing infrastructure. Oil and gas companies, however, set up shop in remote, off-grid locations in order to strike it big. These locations are not equipped with the networks required to move large files fast.
“If oil and gas companies want to be kept informed of the status of their wells and pipelines in real-time, they can’t be sitting around waiting for files to transfer,” said Chris Bailey, CEO of FileCatalyst, maker of an accelerated file transfer solution and a pioneer in managed file transfer technology. “The faster data can get into the hands of data scientists, the quicker important decisions can be made. In the oil and gas industry, million dollar mistakes can happen in the time it takes for a file to send.”
FileCatalyst has primarily offered their accelerated file transfer services to the broadcast and entertainment industry. Their most recent achievement involved sending 240TB of data for NBC Olympics during their coverage of the Olympic Games in Rio, allowing for remote editing in near real-time. However, they have seen an influx of customers in the oil and gas industry looking to harness the power of accelerated file transfer.
“After integrating an accelerated file transfer solution, FileCatalyst customers can effectively transfer large amounts of data across large distances at full line speed, regardless of bandwidth,” Bailey said. “File transfers will no longer take hours or days, instead files can be transferred at an accelerated rate of up to 10Gbps, without slowing other processes.”
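To put those numbers in perspective, here is a back-of-envelope calculation (a simplified sketch — it assumes the link runs at its full nominal rate with no protocol overhead, latency effects or retransmissions, which is precisely what accelerated transfer tools aim to approach) comparing how long a 40 GB seismic data set takes to move over a typical remote link versus a 10 Gbps one:

```python
def transfer_time_seconds(size_gb: float, link_mbps: float) -> float:
    """Seconds to move size_gb gigabytes (decimal GB) over a link_mbps link,
    assuming the link is fully utilized for the whole transfer."""
    bits = size_gb * 8 * 1e9          # GB -> bits
    return bits / (link_mbps * 1e6)   # Mbps -> bits per second

# A 40 GB data set (twenty 2 GB seismic files):
print(transfer_time_seconds(40, 100) / 60)   # ~53 minutes on a 100 Mbps link
print(transfer_time_seconds(40, 10_000))     # 32 seconds at 10 Gbps
```

In practice a plain TCP transfer over a high-latency intercontinental link rarely reaches its nominal rate, which is the gap accelerated file transfer protocols are designed to close.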
NICE Software: Moving Big Data in the Oil & Gas Industry
In a recent case study, NICE Software, which delivers comprehensive grid and cloud solutions for companies and institutions in the oil and gas industry, employed FileCatalyst to help a client send big data files from Perth, Australia to Houston, Texas.
“Our client’s engineers had to transfer the files to Houston using Windows Explorer, submit to the cluster and then copy back the results,” said Antonio Arena, Solutions Architect at NICE Software. “The data sets can vary in size from 2GB to 5GB. Furthermore, network conditions often add to file transfer issues, causing many failed and unreliable transfers of crucial data.”
In addition to NICE Software, FileCatalyst has been employed by ADGas, HESS, Tullow Oil and other natural resource companies to increase the efficiency, security and reliability of big data transfers.
“No industry is immune to increasing file sizes and the increased reliance on big data,” Bailey said. “We don’t mind though, in fact, the bigger the files, the better.”