Today’s oil and gas industry has changed drastically as a result of the oil price slump that began in 2015. The price of oil has declined by more than 50%, with global impact on operators, oilfield services companies, and other contractors and vendors. Operators have adopted a leaner approach, investing only in the most critical operating enhancements. A key area of focus is driving new growth by tapping critical legacy data from older assets.
Why is unlocking the value of legacy data important?
Legacy data is mostly ‘out of sight, out of mind’. We all know we have it, but we don’t know its exact volume or its exact value. For some companies, legacy data may include seismic surveys that reveal areas of rich oil and gas reserves. For others, the value may lie in the role legacy data can play in informing decisions about plant-related enhancements or maintenance.
For too long we have squirreled data away into unstructured folders or exported it to external storage sites, where locating it can be challenging. To further complicate the situation, storing legacy data offsite, in hard or soft copies, or on an unstructured drive can carry significant cost and schedule implications. It may take employees days to find a piece of information that is critical to a key decision, and project dates and deadlines may be missed as a result, causing problematic delays. Companies are realizing there may be significant value hidden in their archives, but there is no value in retaining legacy data that is inaccessible. When a company digitizes and organizes data that was previously paper-based or stored in an unstructured format, it enables rapid access to new decision support.
What is an asset-rich, data-heavy company to do?
Businesses are finding that with an organized set of techniques and artificial intelligence, the journey to emptying those warehouses and old drives and harnessing previously inaccessible data is not as troublesome as it may seem.
Parallel processing is applied to the digitization of large volumes of documents, along with techniques such as image correction, digital and perceptual hashing, key calculations, and optical character recognition (OCR). Advanced imaging routines ensure images are captured effectively, and sophisticated image and text deduplication supplements standard deduplication methods. Once files are digitized they must be accessible, so metadata extraction, cleansing, and mapping are required to release the true value of previously inaccessible data; methods include tailored extractions, fuzzy logic, and specialized matching. The key is an approach that adapts to a company’s business complexities, learns from nuances in its business rules, and recognizes patterns.
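To make the perceptual-hashing idea concrete, here is a minimal sketch in Python of an average-hash comparison for detecting near-duplicate document scans. The pixel grids, function names, and the 2-bit tolerance are illustrative assumptions, not part of any specific product described above; real pipelines would operate on full-resolution images via an imaging library.

```python
def average_hash(pixels):
    """Simple perceptual (average) hash of a grayscale pixel grid:
    each pixel maps to 1 if it is brighter than the grid's mean.
    Near-identical scans produce near-identical bit patterns."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming_distance(h1, h2):
    """Number of bit positions where two hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

def near_duplicates(h1, h2, max_bits=2):
    """Treat two scans as duplicates if their hashes differ
    in at most `max_bits` positions (tolerance is an assumption)."""
    return hamming_distance(h1, h2) <= max_bits

# Two scans of the same page; the second has slight brightness noise.
scan_a = [[200, 198, 40], [45, 210, 50], [35, 205, 190]]
scan_b = [[205, 196, 42], [44, 212, 48], [33, 207, 188]]
# An unrelated page.
scan_c = [[10, 250, 10], [250, 10, 250], [10, 250, 10]]

ha, hb, hc = (average_hash(s) for s in (scan_a, scan_b, scan_c))
print(near_duplicates(ha, hb))  # True  (noise does not change the hash)
print(near_duplicates(ha, hc))  # False (different content)
```

Because the hash depends on brightness relative to the image’s own mean, it is stable under uniform lighting changes, which is what lets it catch duplicates that byte-level (digital) hashing would miss.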
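The fuzzy-matching step of metadata cleansing can be sketched the same way: mapping noisy OCR output to a canonical vocabulary. The well names, the `0.7` similarity cutoff, and the helper name below are hypothetical; this uses only Python’s standard-library `difflib`.

```python
from difflib import get_close_matches

# Hypothetical canonical vocabulary of well identifiers a company
# might maintain; OCR output referencing them is often noisy.
CANONICAL_WELLS = ["EAGLE FORD 12-H", "PERMIAN A-07", "BAKKEN 3-22X"]

def map_to_canonical(ocr_value, vocabulary, cutoff=0.7):
    """Map a noisy OCR string to its closest canonical entry,
    or return None when nothing is similar enough."""
    matches = get_close_matches(ocr_value.upper(), vocabulary,
                                n=1, cutoff=cutoff)
    return matches[0] if matches else None

# OCR commonly misreads 'O' as '0'; fuzzy matching recovers the intent.
print(map_to_canonical("EAGLE F0RD 12-H", CANONICAL_WELLS))
print(map_to_canonical("UNKNOWN DOC", CANONICAL_WELLS))
```

The cutoff is the lever that encodes a company’s business rules: raising it favors precision (fewer false merges), lowering it favors recall (fewer orphaned records).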
In some cases, this can be performed at a net-zero cost by eliminating paper storage costs. But companies are finding that the real value emerges when decision support that was once locked away becomes quickly available and close at hand.