Halliburton forms strategic agreement with Microsoft and Accenture to advance digital capabilities

HOUSTON – July 17, 2020 – Halliburton (NYSE: HAL), Microsoft Corp. (Nasdaq: MSFT) and Accenture (NYSE: ACN) today announced they have entered into a five-year strategic agreement to advance Halliburton’s digital capabilities in Microsoft Azure.

Under the agreement, Halliburton will complete its move to cloud-based digital platforms and strengthen its customer offerings by:

  • Enhancing real-time platforms for expanded remote operations,

  • Improving analytics capability with the Halliburton Data Lake utilizing machine learning and artificial intelligence, and

  • Accelerating the deployment of new technology and applications, including SOC2 compliance for Halliburton’s overall system reliability and security.

“The strategic agreement with Microsoft and Accenture is an important step in our adoption of new technology and applications to enhance our digital capabilities, drive additional business agility and reduce capital expenditures,” said Jeff Miller, Halliburton chairman, president & CEO. “We are excited about the benefits our customers and employees will realize through this agreement, and the opportunity to further leverage our open architecture approach to software delivery.”

“Moving to the cloud allows companies to create market-shaping customer offerings and drive tangible business outcomes,” said Judson Althoff, executive vice president, Microsoft’s Worldwide Commercial Business. “Through this alliance with Halliburton and Accenture, we will apply the power of the cloud to unlock digital capabilities that deliver benefits for Halliburton and its customers.”

The agreement also enables the migration of all Halliburton physical data centers to Azure, which delivers enterprise-grade cloud services at global scale and offers sustainability benefits. Accenture will work closely with Microsoft, in conjunction with their Avanade joint venture, to help transition Halliburton’s digital capabilities and business-critical applications to Azure. Accenture will leverage its comprehensive cloud migration framework, which brings industrialized capabilities together with exclusive tools, methods, and automation to accelerate Halliburton’s data center migration and provide for additional transformation opportunities.

“Building a digital core and scaling it quickly across a business is only possible with a strong foundation in the cloud,” said Julie Sweet, chief executive officer, Accenture. “Halliburton recognizes that this essential foundation will provide the innovation, efficiency and talent advantages to do things differently and fast. We are proud to be part of driving this transformational change, which builds on our long history of working with Halliburton and Microsoft.”

The companies expect to complete the staged migration by 2022.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

About Halliburton

Founded in 1919, Halliburton is one of the world’s largest providers of products and services to the energy industry. With approximately 50,000 employees, representing 140 nationalities in more than 80 countries, the company helps its customers maximize value throughout the lifecycle of the reservoir – from locating hydrocarbons and managing geological data, to drilling and formation evaluation, well construction and completion, and optimizing production throughout the life of the asset. Visit the company’s website at www.halliburton.com. Connect with Halliburton on Facebook, Twitter, LinkedIn, Instagram and YouTube.

About Accenture

Accenture is a leading global professional services company, providing a broad range of services in strategy and consulting, interactive, technology and operations, with digital capabilities across all of these services. We combine unmatched experience and specialized capabilities across more than 40 industries — powered by the world’s largest network of Advanced Technology and Intelligent Operations centers. With 513,000 people serving clients in more than 120 countries, Accenture brings continuous innovation to help clients improve their performance and create lasting value across their enterprises. Visit us at www.accenture.com.

For Microsoft

Microsoft Media Relations
WE Communications for Microsoft
(425) 638-7777
rrt@we-worldwide.com

For Halliburton

Investors:
Abu Zeya
Halliburton, Investor Relations
Investors@Halliburton.com
281-871-2633

Media:
Emily Mir
Halliburton, Public Relations
PR@Halliburton.com
281-871-2601

For Accenture
 Christian Harper
Accenture Media Relations
Christian.harper@accenture.com
516-434-8615

Subsurface Data in the Oil and Gas Industry

Probing beneath the Earth’s surface for exploration and hazard mitigation

Drilling for oil and gas is expensive. A single well generally costs $5-8 million onshore and $100-200 million or more in deep water.1 To maximize the chances of drilling a productive well, oil and gas companies collect and study large amounts of information about the Earth’s subsurface both before and during drilling. Data are collected at a variety of scales, from regional (tens to hundreds of miles) to microscopic (such as tiny grains and cracks in the rocks being drilled). This information, much of which will have been acquired in earlier exploration efforts and preserved in public or private repositories, helps companies to find and produce more oil and gas and avoid drilling unproductive wells, but can also help to identify potential hazards such as earthquake-prone zones or areas of potential land subsidence and sinkhole formation.

Mapping the Subsurface 1: Regional Data from Geophysics

In the 21st century, much is already known about the distribution of rocks on Earth. When looking for new resources, oil and gas producers will use existing maps and subsurface data to identify an area for more detailed exploration. A number of geophysical techniques are then used to obtain more information about what lies beneath the surface. These methods include measurements of variations in the Earth’s gravity and magnetic field, but the most common technique is seismic imaging.

Seismic images are like an ultrasound for the Earth and provide detailed regional information about the structure of the subsurface, including buried faults, folds, salt domes, and the size, shape, and orientation of rock layers. They are collected by using truck-mounted vibrators or dynamite (onshore), or air guns towed by ships (offshore), to generate sound waves; these waves travel into the Earth and are reflected by underground rock layers; instruments at the surface record these reflected waves; and the recorded waves are mathematically processed to produce 2-D or 3-D images of subsurface features. These images, which cover many square miles and have a resolution of tens to hundreds of feet, help to pinpoint the areas most likely to contain oil and/or gas.

A typical setup for offshore seismic imaging. Image Credit: U.S. Bureau of Ocean Energy Management.2
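The processing step described above can be illustrated with a toy calculation. The sketch below (plain NumPy, with entirely made-up layer velocities, densities, and thicknesses) builds a 1-D synthetic trace by converting layer properties to reflection coefficients and convolving them with a wavelet; real seismic processing involves far more steps, but the core idea is the same.

```python
# Minimal sketch (not any vendor's workflow): a 1-D synthetic seismic trace built
# by convolving a reflectivity series with a Ricker wavelet. Layer velocities,
# densities, and thicknesses are illustrative values, not real survey data.
import numpy as np

def ricker(f_peak, dt, length=0.128):
    """Ricker (Mexican-hat) wavelet with peak frequency f_peak (Hz)."""
    t = np.arange(-length / 2, length / 2, dt)
    a = (np.pi * f_peak * t) ** 2
    return (1 - 2 * a) * np.exp(-a)

# Hypothetical 1-D layered earth: (velocity m/s, density kg/m^3, thickness m)
layers = [(2000, 2100, 300), (2400, 2300, 200), (3000, 2500, 400), (2600, 2400, 0)]

dt = 0.002                                   # 2 ms sample interval
impedance = np.array([v * rho for v, rho, _ in layers])
# Reflection coefficient at each interface between layers
rc = (impedance[1:] - impedance[:-1]) / (impedance[1:] + impedance[:-1])

# Two-way travel time to each interface, placed on a regular time axis
twt = np.cumsum([2 * h / v for v, _, h in layers[:-1]])
trace = np.zeros(int(1.0 / dt))              # 1 s of record
for t_i, r in zip(twt, rc):
    trace[int(round(t_i / dt))] += r

synthetic = np.convolve(trace, ricker(30, dt), mode="same")
print("interfaces at (s):", np.round(twt, 3), "coefficients:", np.round(rc, 3))
```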

Mapping the Subsurface 2: Local Data from Well Logs, Samples, and Cores

Drilling a small number of exploratory holes or using data from previously drilled wells (common in areas of existing oil and gas production) allows geologists to develop a much more complete map of the subsurface using well logs and cores:

  • A well log is produced by lowering geophysical devices into a wellbore, before (and sometimes after) the steel well casing is inserted, to record the rock’s response to electrical currents and sound waves and to measure the radioactive and electromagnetic properties of the rocks and their contained fluids.3 Well logs have been used for almost 100 years4 and are recorded in essentially all modern wells.

  • A core is a cylindrical column of rock, commonly 3-4 inches in diameter, that is cut and extracted as a well is drilled. A core provides a small cross-section of the sequence of rocks being drilled through, providing more comprehensive information than the measurements made by tools inside the wellbore.5 Core analysis gives the most detailed information about the rock layers, faults and fractures, rock and fluid compositions, and how easily fluids (especially oil and gas) can flow through the rock and thus into the well.

By comparing the depth, thickness, and composition of subsurface rock formations in nearby wells, geoscientists can predict the location and productive potential of oil and gas deposits before drilling a new well. As a new well is being drilled, well logs and cores also help geoscientists and petroleum engineers to predict whether the rocks can produce enough oil or natural gas to justify the cost of preparing the well for production.7
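As a toy illustration of how picks from nearby wells can be used to predict what a new well will encounter, the sketch below estimates the depth of a formation top at a proposed location by inverse-distance weighting of hypothetical offset-well picks. Real correlation work uses full log curves, stratigraphic judgment, and geostatistics, not just three numbers; the well names, coordinates, and depths here are invented.

```python
# Hedged sketch of well-to-well correlation: estimate a formation-top depth at a
# proposed well from picks in nearby wells using inverse-distance weighting.
import numpy as np

# (x_mi, y_mi, top_depth_ft) picks for the same formation in nearby wells (hypothetical)
offset_wells = {
    "Well A": (0.0, 0.0, 7420.0),
    "Well B": (1.8, 0.5, 7465.0),
    "Well C": (0.9, 2.1, 7510.0),
}

def estimate_top(x, y, wells, power=2.0):
    """Inverse-distance-weighted estimate of a formation-top depth at (x, y)."""
    coords = np.array([(wx, wy) for wx, wy, _ in wells.values()])
    depths = np.array([d for _, _, d in wells.values()])
    dist = np.hypot(coords[:, 0] - x, coords[:, 1] - y)
    if np.any(dist == 0):                 # exactly on an existing well
        return depths[dist == 0][0]
    w = 1.0 / dist ** power
    return float(np.sum(w * depths) / np.sum(w))

print(f"Predicted top at proposed location: {estimate_top(1.0, 1.0, offset_wells):.0f} ft")
```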

A box containing 9 feet of 4-inch diameter core from the National Petroleum Reserve, Alaska, showing the fine-scale structure and composition of the rock layers being drilled. Image Source: U.S. Geological Survey.6

Data Preservation

Preservation of subsurface data is an ongoing challenge, both because there is so much of it and because a lot of older data predate computer storage. A modern seismic survey produces a few to thousands of terabytes of data;8 state and federal repositories collectively hold hundreds of miles of core;9 and millions of digital and paper records are housed at state geological surveys. For example, the Kansas Geological Society library maintains over 2.5 million digitized well logs and associated records for the state.10 Oil companies also retain huge stores of their own data. Preserving these data, which cost many millions of dollars to collect, allows them to be used in the future for a variety of purposes, some of which may not have been anticipated when the data were originally collected. For example, the shale formations that are now yielding large volumes of oil and natural gas in the United States were known but not considered for development for decades while conventional oil and gas resources were being extracted in many of the same areas. Archived well logs from these areas have helped many oil and gas producers to focus on these shale resources now that the combination of hydraulic fracturing and horizontal drilling allows for their development.

Data for Hazard Mitigation

Oil and gas exploration is a major source of information about the subsurface that can be used to help identify geologic hazards:

  • Since 2013, the oil and gas industry has provided more than 2,500 square miles of seismic data to Louisiana universities to assist with research into the causes and effects of subsidence in coastal wetlands. For example, seismic and well data have been used to link faults to historic subsidence and wetland loss near Lake Boudreaux.11

  • To improve earthquake risk assessment and mitigation in metropolitan Los Angeles, scientists have used seismic and well data from the oil and gas industry to map out previously unidentified faults. This work was motivated by the 1994 Northridge earthquake, which occurred on an unknown fault that was not visible at the Earth’s surface.12

More Resources

U.S. Geological Survey – National Geological and Geophysical Data Preservation Program.

References

1 U.S. Energy Information Administration (2016). Trends in U.S. Oil and Natural Gas Upstream Costs.
2 Bureau of Ocean Energy Management – Record of Decision, Atlantic OCS Region Geological and Geophysical Activities.
3 Varhaug, M. (2016). Basic Well Log Interpretation. The Defining Series, Oilfield Review.
4 Schlumberger – 1920s: The First Well Log.
5 AAPGWiki – Overview of Routine Core Analysis.
6 Zihlman, F.N. et al. (2000). Selected Data from Fourteen Wildcat Wells in the National Petroleum Reserve in Alaska. USGS Open-File Report 00-200. Core from the well “East Simpson 2”, Image no. 0462077.
7 Society of Petroleum Engineers PetroWiki – Petrophysics.
8 “Big Data Growth Continues in Seismic Surveys.” K. Boman, Rigzone, September 2, 2015.
9 U.S. Geological Survey Core Research Center – Frequently Asked Questions.
10 Kansas Geological Society & Library – Oil and Gas Well Data.
11 Akintomide, A.O. and Dawers, N.H. (2016). Structure of the Northern Margin of the Terrebonne Trough, Southeastern Louisiana: Implications for Salt Withdrawal and Miocene to Holocene Fault Activity. Geological Society of America Abstracts with Programs, 48(7), Paper No. 244-2.
12 Shaw, J. and Shearer, P. (1999). An Elusive Blind-Thrust Fault Beneath Metropolitan Los Angeles. Science, 283, 1516-1518.

Date updated: 2018-06-01
Petroleum and the Environment, Part 23/24
Written by E. Allison and B. Mandler for AGI, 2018

Enterprise AI with the CIO and CMO: Better together benefits

Here’s a look at how AI is transforming entire enterprises, particularly through the lens of marketing and IT, and why the two teams must work together.

The massive impact AI has already had in marketing, and what we expect to see of it in the near future, is a hot topic here at MarTech Today. In my previous columns, we’ve explored how AI will be woven into marketing organizations, where it belongs in your marketing stack, and where CMOs should focus today to get the best results from their investments in AI.

There’s no doubt it’s become widespread; in fact, global spend on artificial intelligence is expected to grow from an estimated $2 billion this year to $7.3 billion per year by 2022, according to a study from Juniper Research. Yet, as abundant as it is, artificial intelligence is still a mystery to many.

Case in point: Only 33 percent of consumers think they use AI-enabled technology, yet new research shows that 77 percent actually use an AI-powered service or device.

Marketers are perhaps savvier to the opportunities than most, so it was no surprise that when my company, BrightEdge, recently asked over 500 marketers to identify the next “big trend in marketing,” 75 percent pointed to some type of AI application.

CMOs are challenged now to not only identify the right AI applications to solve specific problems but to then sell those to the CEO, other company leaders and the teams that will use the technology. Today, we’re going to broaden the scope and take a look at just a few of the ways AI is transforming entire enterprises, particularly through the lens of marketing and IT integration.

The CIO, CMO and AI

We learned in recent Adobe research that 47 percent of digitally mature organizations, or those that have advanced digital practices, said they have a defined AI strategy.

We all know that Google has one. The search giant dropped a whopping $3.2 billion acquiring Nest Labs, the largest of its $3.9 billion in disclosed AI acquisitions since 2006 – more than any other company has invested in AI deals.

Behind Google, Microsoft, Apple, Intel and Salesforce round out the top five companies acquiring AI firms. (Intel takes the crown for the highest number of unique investments in AI companies, at 81.)

Sixty-one percent of over 1,600 marketing professionals from companies of all sizes pointed to machine learning and AI as their company’s most significant data initiative for next year, a MemSQL survey found.

But where is all of this interest and investment headed?

Take a look at Amazon for a sneak preview. The e-commerce giant completely rebuilt itself around AI, with spectacular results, according to a feature published in Wired. In 2014, according to the article, Srikanth Thirumalai, computer scientist and head of Amazon’s recommendations team, brought CEO Jeff Bezos the idea that Amazon could use deep learning to revamp the way recommendations work.

Thirumalai was only one department leader who included AI in his visionary proposal to Bezos. The revolution came, he told Wired, when leaders in isolated pockets of AI came together to discuss the possibilities and ultimately begin collaborating across projects. As Thirumalai told Wired:

We would talk, we would have conversations, but we wouldn’t share a lot of artifacts with each other because the lessons were not easily or directly transferable.

What followed was a revolutionary AI-centric management strategy that has baked artificial intelligence into Alexa, Amazon Web Services and almost every other facet of the $1 trillion company. Amazon takes a “flywheel” approach to AI.

Modeled after the simple tool that stores rotational energy, Amazon’s AI flywheel enables teams to build off of AI applications developed elsewhere in the organization. It’s an entirely collaborative approach that has proven a revenue generator, as well, by offering select tools to third-party companies.

That collaboration — the shift from competing for the budget for AI to working across departments — has paid huge dividends for Amazon. What could it do for your brand?

Solving persistent challenges

In 2018, CMOs have had access to more third-party AI-powered tool options than they can shake a stick at. Our firm found in recent research that more than 50 percent of marketers simply expect marketing technology providers to have native AI capabilities and consider it important or a must-have.

CIOs have been slower on the draw. Gartner’s 2018 CIO Agenda Survey found that just 4 percent of CIOs have already implemented AI in the corporate realm. However, 46 percent plan to do so in the near future. This doesn’t mean IT is being left behind. After all, the best use of AI isn’t about providing tools; it’s the catalyst in massive organizational change and even creating a new type of organization.

In the Texas A&M University System, for example, Cyber Security Intelligence reports that AI has been put to work in IT enhancing cybersecurity via Artemis, an intelligent assistant from Endgame.

“We monitor the networks for 11 universities and 7 state agencies,” Barbara Gallaway, a security analyst at the Texas A&M University System, told the publication.

Using an AI application that enables her staff to ask simple questions has helped train them in their jobs as a side benefit, she reportedly said. In addition to nine full-time IT staff, her team now includes eight part-time student workers who don’t need extensive experience in dealing with security incidents.

AI-powered products and services are helping IT teams improve productivity and effectiveness through logs analysis, employee support, enhanced cybersecurity, deep learning, natural-language processing and more. CIOs have the opportunity to transform IT from cost center to organizational trailblazer with AI.

However, as we’ve seen with Amazon, the real magic happens when CMOs, CIOs and other company leaders work together to facilitate collaborative workflows and enhanced customer experiences through AI.

Analysis of data is already a key AI focus for businesses, with on-site personalization the second most commonly cited use case for AI. Working across departments and projects, teams are discovering new and unexpected use cases for AI in their organizations.

For example, Mike Orr, IT director of digital transformation at Murphy Oil, shared the following story with CIO.com. Murphy Oil turned to an AI-powered system from Turbonomic to make recommendations about how to optimize its infrastructure while moving it from traditional on-premises and colocation to cloud and SaaS models. Once the company grew comfortable with the system, it began to trust it to perform placement and sizing automatically. Prior to the move, Orr had 4 1/2 full-time equivalents working on nothing but tickets. “Now it’s one-tenth of an FTE [full-time equivalent],” he says.

This is something we’re going to see more and more; in fact, Gartner predicts that while 1.8 million jobs will be eliminated due to AI by 2020, 2.3 million more jobs will be created in their place. Rather than the robots “stealing our jobs,” the impact of AI technologies on business is projected to increase labor productivity by up to 40 percent and enable people to make more efficient use of their time.

So, how can CIOs and CMOs work together?

  • As the worldwide volume of data continues to grow at some 40 percent per year, the CIO and CMO need to work closely and collaborate early on new initiatives. IT is a critical strategic partner for marketing and should be involved and consulted from conception and through all stages of planning.

  • Constantly connected consumers are generating a wealth of data for marketing — so much that most teams struggle to uncover the actionable insights that drive smarter, more informed campaigns. Who better than IT to assist? In addition to their information architecture and analysis prowess, IT is also in a position to share relevant insights with other departments. CMOs and CIOs must each take steps to come closer together. For CMOs, this means mastering not only the art of creativity and strategy but also the science of analytics. CIOs need to shift from a mindset of control and prevention to that of a facilitator and enabler.

  • The CIO is in a position to execute massive organizational change, while the CMO can be critical in selling it internally, to the rest of the C-suite and right on down to individual team members.

  • The CMO must be able to articulate and clearly define business goals for the CIO to evaluate and cost out. This is a give-and-take relationship that may require some negotiation but is sure to result in more purposeful tracking, measurement, and analysis.

  • Each must demonstrate a willingness to communicate on the same level: to adopt a common vernacular and a clear set of expectations of one another.

  • Both the CIO and CMO must enable and support integrated teams. This means not only giving employees the time and space to work together, but also giving recognition and sharing results out to the company when these partnerships result in innovative, successful uses of AI within the organization.

It all sounds great in theory, doesn’t it? In reality, changing the complexities of traditional organizational hierarchy and deep-seated business practice has proven incredibly challenging. Industry recommendations suggest CIOs and CMOs ensure they have these five prerequisites in place (with the CEO’s explicit support) as the foundation on which to build this relationship:

  1. Be clear on decision governance.

  2. Build the right teams.

  3. Provide transparency.

  4. Hire IT and marketing translators.

  5. Learn to drive before you fly.

In an age where the growth of big data brings complexity and a universe of AI-powered possibility spreads out before us, marketing and IT simply do better together.

ABOUT THE AUTHOR

Jim Yu is the founder and CEO of BrightEdge, the leading enterprise content performance and SEO platform. He combines in-depth expertise in developing and marketing large on-demand software platforms with hands-on experience in advanced digital, content and SEO practices.

Credit Source: MarTech Today  

Published on September 19, 2018, at 2:42 pm

BP deploys Plant Operations Advisor on Gulf of Mexico platforms

Advanced analytics solution, developed with BHGE, will be installed on BP’s upstream assets around the world

HOUSTON – BP announced today that it has successfully deployed Plant Operations Advisor (POA), a cloud-based advanced analytics solution developed with Baker Hughes, a GE company, across all four of its operated production platforms in the deepwater Gulf of Mexico.

The announcement comes after an initial deployment of POA proved the technology could help prevent unplanned downtime at BP’s Atlantis platform in the Gulf.

The technology has now been successfully installed and tested at BP’s Thunder Horse, Na Kika, and Mad Dog platforms – and it will continue to be deployed to more than 30 of BP’s upstream assets across the globe.

Diana and Barth keep a close eye on the plant

“BP has been one of the pioneers in digital technology in our industry, and co-development of Plant Operations Advisor with BHGE is a key plank of modernizing and transforming our upstream operations,” said Ahmed Hashmi, BP’s global head of upstream technology. “We expect the deployment of this technology not only to deliver improvements in safety, reliability, and performance of our assets but also to help raise the bar for the entire oil and gas industry.”

Built on GE’s Predix platform, POA applies analytics to real-time data from the production system and provides system-level insights to engineers so operational issues on processes and equipment can be addressed before they become significant. POA helps engineers manage the performance of BP’s offshore assets by further ensuring that assets operate within safe operating limits to reduce unplanned downtime.
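BP and BHGE have not published POA’s internals; purely as a hedged sketch of the general pattern described above (streaming readings checked against safe operating limits, with exceedances surfaced to engineers before they escalate), something like the following could sit at the core of such a system. The tag names and limit values are invented.

```python
# Generic sketch, not BP's or BHGE's implementation: compare the latest sensor
# readings against safe operating limits and collect alerts for engineers.
from dataclasses import dataclass
from typing import Optional

@dataclass
class OperatingLimit:
    tag: str          # sensor/equipment identifier (hypothetical names below)
    low: float        # lower safe operating limit
    high: float       # upper safe operating limit

LIMITS = [
    OperatingLimit("separator_pressure_psi", 150.0, 285.0),
    OperatingLimit("compressor_discharge_temp_f", 60.0, 300.0),
]

def check_reading(limit: OperatingLimit, value: float) -> Optional[str]:
    """Return an alert message if a reading falls outside its safe envelope."""
    if value < limit.low:
        return f"{limit.tag}: {value} below low limit {limit.low}"
    if value > limit.high:
        return f"{limit.tag}: {value} above high limit {limit.high}"
    return None

# One pass over the latest readings pulled from the historian/stream
latest = {"separator_pressure_psi": 291.4, "compressor_discharge_temp_f": 212.0}
alerts = [check_reading(lim, latest[lim.tag]) for lim in LIMITS]
alerts = [a for a in alerts if a is not None]
print(alerts)  # ['separator_pressure_psi: 291.4 above high limit 285.0']
```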


Now live across the Gulf of Mexico, POA works across more than 1,200 mission-critical pieces of equipment, analyzing more than 155 million data points per day and delivering insights on performance and maintenance. There are plans to continue augmenting the analytical capabilities in the system as POA is expanded to BP’s upstream assets around the globe.

BP and BHGE announced a partnership in 2016 to develop POA, an industry-wide solution for improved plant reliability. The teams have built a suite of cloud-based Industrial ‘internet of things’ (IoT) solutions that have been tailor-fit for BP’s oil and gas operations.

“The partnership between BP and BHGE has resulted in a unique set of capabilities that quickly find valuable insights in streams of operational data,” said Matthias Heilmann, president and CEO of Digital Solutions and chief digital officer for Baker Hughes, a GE company. “Together, we are creating leading-edge technologies to automate processes and increase the safety and reliability of BP’s upstream assets. As we extend the solution globally, this will become the largest upstream Industrial IoT deployment in the world when complete.”

BP is currently in the process of deploying POA to its operations in Angola with additional deployments in Oman and the North Sea scheduled for 2019.


About BP

BP is a global producer of oil and gas with operations in over 70 countries. BP has a larger economic footprint in the U.S. than in any other nation, and it has invested more than $100 billion here since 2005. BP employs about 14,000 people across the U.S. and supports more than 106,000 additional jobs through all its business activities. For more information on BP in America, visit www.bp.com/us.

About Baker Hughes, a GE company

Baker Hughes, a GE company (NYSE: BHGE) is the world’s first and only full stream provider of integrated oilfield products, services, and digital solutions. We deploy minds and machines to enhance customer productivity, safety, and environmental stewardship while minimizing costs and risks at every step of the energy value chain. With operations in over 120 countries, we infuse over a century of experience with the spirit of a startup – inventing smarter ways to bring energy to the world.

Further Information

Name: BP U.S. Media Affairs
Email: uspress@bp.com

Name: Ashley Nelson
Phone: +1 925 316-9197
Email: ashley.nelson1@ge.com

Name: Gavin Roberts
Phone: +44 7775547365
Email: gavin.roberts@bhge.com

While politicians court Google and Uber, fracking industry offers a different sort of high-tech job

CANNONSBURG, Pa. — If there is mud on the floor, they say in the shale industry, that means cash is coming in the door. That is, when workers are out in the field and the boots are getting dirty, money is being made.

Thanks to an infusion of high technology driving the natural gas industry, it’s not just about dirty boots anymore – and it’s a good story. It’s a marriage of advanced technologies and dirt-under-your-nails hard work rarely told, because extracting shale is not a popular business politically.

Fracking, it turns out, is the one high-tech industry not embraced by politicians in Pittsburgh who are rushing to embrace the likes of Uber and Google. Why? Because local progressive Democrats, very vocal climate activists, and the burgeoning Democratic Socialists of America party demand a wholesale repudiation of the natural gas industry. Local Democratic officials thus have to oppose fracking or risk losing in a Democratic primary.

Vice President of Engineering and Development of CNX Resources Corporation Andrea Passman stands in a control room that is used for predicting drilling locations at CNX’s headquarters on July 30 in Cannonsburg, Pa.

(Justin Merriman for the Washington Examiner)

Today’s natural gas industry isn’t the same petroleum job your grandfather or your father would have applied for. It not only attracts computer scientists, software engineers, mathematicians, and geologists to relocate to Western Pennsylvania from around the country, but it also provides careers for locals who thought those good jobs left for good when the coal mines and steel mills closed a generation ago.

Plenty of locals, who perhaps were not cut out for college, just wanted an opportunity to work hard in an industry with a future. All the better if that industry utilized the resources of the land while conserving it — nobody wants to spoil the places for hunting, fishing, climbing, hiking, and camping. Even better, a local job would allow them to live near family.

Mike May is one such guy.

The 33-year-old grew up in Imperial, Pa., along the Lincoln Highway. After graduating from West Allegheny High School, May joined the Marines. When he left the service, he wanted to come back home to Western Pennsylvania and work his way up in the world, but he just didn’t know if he had the career skills.

Mike May, 33, of Oakdale, Pa., works in the control room of CNX Resources Corporation on July 30 at their headquarters in Cannonsburg, Pa. The control room is able to monitor and adjust well sites throughout several states.

(Justin Merriman for the Washington Examiner)

“So, I started in the gas and oil fields literally working with my hands; I have worked in the industry from the bottom up,” he says as he stands in front of three monitors doing the same thing he did in the field.

No dirt under the nails. No weather dictating field conditions. No mud on the boots. Just precision automation that does the job a team of workers used to do in the field. Now, May does it inside the offices of CNX, a fracking company that was spun off from energy giant CONSOL.

“Basically, I was a production operator,” explains May, “I ran all the physical operations, manual chokes, fixing anything that would break or go down; adjusting water dumps to increase the efficiency of the separators, water, and tank levels out there,” he says of the drilling sites.

Now, he does almost all of that remotely.

“See, this is the digital twin of the well site,” he says, pointing to one of several screens he is monitoring on a highly secure floor of the complex. “So, over here, we have all of our physical assets. This is the data surveillance side of the house. We’re also able to control and push parameters out to the field level. So, things I used to have to do at the site by making physical changes, I can do using technology,” he says.
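May’s description boils down to validating a requested change against the site’s allowed envelope and then writing it out to field hardware. The sketch below is not CNX’s software; it is a minimal, hypothetical illustration of that “push a parameter from the twin to the field” step, with made-up tag names, limits, and site identifiers standing in for a real SCADA/IoT interface.

```python
# Hedged sketch of a remote setpoint push through a digital twin (hypothetical API).
class WellSiteTwin:
    """Digital twin mirroring live field values and accepting setpoint pushes."""

    def __init__(self, site_id, limits):
        self.site_id = site_id
        self.limits = limits          # {tag: (min, max)} allowed setpoint ranges
        self.setpoints = {}

    def push_setpoint(self, tag, value):
        lo, hi = self.limits[tag]
        if not lo <= value <= hi:
            raise ValueError(f"{tag}={value} outside allowed range [{lo}, {hi}]")
        # In a real system this would issue a write to the field RTU/PLC and wait
        # for confirmation; here we just record the accepted value.
        self.setpoints[tag] = value
        return f"{self.site_id}: {tag} set to {value}"

twin = WellSiteTwin("PAD-07", {"choke_pct": (0, 100), "separator_dump_level_pct": (20, 80)})
print(twin.push_setpoint("choke_pct", 42))                  # remote change, no site visit
print(twin.push_setpoint("separator_dump_level_pct", 55))
```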

Twenty miles north of this office, in Pittsburgh, several dozen young climate activists — about May’s age — protested last week in front of the mayor’s office. They pressed Democratic city and county leaders to stop the expansion of fracking in the county and to speak out against the Shell cracker plant under construction in the region.

Twenty miles in the opposite direction, public high schools are offering vocational training that prepares their students to walk off the high school football field on graduation day with their diplomas and into jobs that start at $129,000 a year.

Compared to the kids closer to Pittsburgh, these kids from rural high schools won’t have an inside track for jobs at the likes of Google, Uber, and others whom the Democratic mayor celebrates as part of the “new Pittsburgh.”

And the Shell cracker plant the climate activists were protesting? It doesn’t really make crackers — cracking is the process that converts natural gas products into ethylene and then into plastics. The $6 billion plant began construction last year, with construction employment expected to exceed 6,000 workers over the next ten years and 600 permanent positions once the plant is complete.

Since the 1920s, technology and automation have been disrupting the manufacturing world — eliminating jobs and growth opportunities throughout the different regions in the country. Here, technology is creating jobs. For May, automation and high technology didn’t take his job; it enriched it.

“Correct. I kinda evolved with the times. I am truly living the American Dream.”

AkerBP, Cognite and Solution Seeker have entered into a partnership agreement to leverage artificial intelligence (AI) for real-time, data-driven production optimization.

First out is the Alvheim field, where Solution Seeker’s ProductionCompass AI solution will utilize all available and relevant data to perform real-time production data analytics and production optimization, including management of the challenging slugging problem at the field through advanced slug data analytics.

“With Alvheim, we embark on a very exciting journey with AkerBP and Cognite to deliver artificial intelligence to maximize oil and gas production based on pure data-driven models. We are honored and proud to be chosen as a strategic partner to AkerBP and Cognite, as AkerBP is clearly one of the most ambitious oil companies driving the digital oilfield agenda,” says Vidar Gunnerud, founder and CEO of Solution Seeker.

The production data is streamed live from Cognite’s Data Platform, developed in close collaboration with AkerBP to make all data and models readily accessible for all users and systems. The platform facilitates an open ecosystem for advanced applications such as Solution Seeker’s AI.

“We believe Solution Seeker’s AI will enable us to fully leverage and make sense of all our production data, build robust, fast and precise prediction models, and maximize our production in real-time. Their solution plugs directly onto the Cognite Data Platform, accessing all relevant production data, and writing all relevant results from their artificial intelligence back to the platform so other systems and users, in turn, can utilize these new data. In addition to the value this project creates from production optimization, this is a real demonstration of how we want to work with partners through the Cognite platform. This is data liberalization in practice – creating tangible results at every step,” says Signy Vefring, Manager Digitalization Program Office at AkerBP.

Solution Seeker is developing the first artificial intelligence for oil and gas production optimization, leveraging big data and machine learning techniques to solve the continuous optimization problem. The AI is capable of analyzing thousands of historical and live production data streams, identifying field behavior and relations, and automatically and continuously providing the most up to date prediction model to make the optimal choice of production settings.
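Solution Seeker’s models are proprietary, but the pattern the paragraph describes (fit a prediction model to historical production data, then search for the settings the model predicts are best) can be sketched in a few lines. Everything below – the well count, choke ranges, the combined-choke constraint, and the synthetic data – is illustrative only.

```python
# Toy illustration of data-driven production optimization, not the vendor's AI:
# fit a model to synthetic choke/rate history, then grid-search the best settings.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic history: choke openings (%) for two wells and the observed total oil rate
chokes = rng.uniform(20, 90, size=(200, 2))
true_rate = 40 * np.sqrt(chokes[:, 0]) + 25 * np.sqrt(chokes[:, 1])
observed = true_rate + rng.normal(0, 10, size=200)

# "Prediction model": least-squares fit on square-root-transformed features
X = np.column_stack([np.sqrt(chokes), np.ones(len(chokes))])
coef, *_ = np.linalg.lstsq(X, observed, rcond=None)

def predicted_rate(c1, c2):
    return coef[0] * np.sqrt(c1) + coef[1] * np.sqrt(c2) + coef[2]

# Grid search for the best settings subject to a made-up combined choke limit
best = max(((c1, c2) for c1 in range(20, 91) for c2 in range(20, 91)
            if c1 + c2 <= 140), key=lambda s: predicted_rate(*s))
print("suggested chokes:", best, "predicted rate:", round(predicted_rate(*best), 1))
```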

The AI is currently being developed and deployed in collaboration with ConocoPhillips, Neptune Energy, Wintershall, Lundin, and AkerBP, and will be launched and made commercially available to all operators in 2018. This will disrupt the way operators can maximize production and improve their operations.

Solution Seeker is a technology spin-off from the ICT research group at NTNU Engineering Cybernetics and NTNU’s Centre for Integrated Operations.

Read more about this on Sysla (in Norwegian).

Repsol and Google Cloud to optimize refinery management using big data and artificial intelligence

  • Repsol’s goal is to maximize the performance and efficiency of a refinery, which is among the largest and most complex industrial facilities.

  • Google Cloud will provide its computing power, experience with big data and machine learning expertise.

  • The initiative is part of Repsol’s commitment to digitalization, innovation and technology across all of its business areas.

Repsol has today announced that it is working with Google Cloud to launch a project that will use big data and artificial intelligence to optimize management of the Tarragona refinery. Refineries are among the largest and most complex industrial facilities.

Repsol’s Executive Managing Director of Downstream, María Victoria Zingoni, and Google’s Country Manager for Spain and Portugal, Fuencisla Clemares, participated in the launch of the project, which will be carried out in the Tarragona Industrial Complex and marks a pioneering challenge in the global refining industry.

This initiative puts the latest cloud technology from Google at the service of the refinery’s operators. Repsol’s objectives are to maximize efficiency, both in energy consumption as well as consumption of other resources, and to improve the performance of the refinery’s overall operations.

To achieve this, Google will make available to Repsol its data and analytics products, the experience of its professional services consultants and its machine learning managed service, Google Cloud ML, which will help Repsol’s developers to build and bring machine learning models to production in their refinery environment.

The management of a refinery involves around 400 variables, which demands a high level of computational capacity and control over vast amounts of data. This is an unprecedented challenge in the refining world.

Until now, the highest number of variables integrated digitally in an industrial plant has been around 30, demonstrating the scale of the challenge this project presents: it aims to increase the number of variables being managed more than tenfold. Repsol chose the Tarragona refinery to develop this initiative because the online configuration of its production schematics facilitates testing and implementation.
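Neither the refinery data nor Repsol’s models are public, so the following is only a scale sketch: a regularized linear fit relating roughly 400 synthetic process variables to an efficiency target, the kind of model a managed service such as Cloud ML would train and serve at production scale. The variable count is taken from the article; everything else is made up.

```python
# Scale sketch only: ridge regression over ~400 synthetic process variables.
import numpy as np

rng = np.random.default_rng(7)
n_samples, n_vars = 5000, 400            # ~400 variables, per the article

X = rng.normal(size=(n_samples, n_vars))           # standardized process variables
true_w = np.zeros(n_vars)
true_w[:25] = rng.normal(size=25)                  # only a few variables really matter
energy_use = X @ true_w + rng.normal(0, 0.5, n_samples)

# Ridge regression via the normal equations: w = (X^T X + lambda*I)^-1 X^T y
lam = 10.0
w = np.linalg.solve(X.T @ X + lam * np.eye(n_vars), X.T @ energy_use)

top = np.argsort(-np.abs(w))[:5]
print("most influential variable indices:", top)
print("their fitted coefficients:", np.round(w[top], 2))
```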

This project, as well as the collaboration with Google Cloud, is part of Repsol’s ongoing digitalization, innovation and technology projects development in all of its business units to improve its competitiveness and efficiency.

The project has the potential to add 30 cents per barrel to Repsol’s refining margin, which could translate to 20 million dollars annually for the Tarragona refinery, with significant further upside if all optimization objectives are achieved.
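A quick back-of-the-envelope check of those figures, assuming the 186,000 barrels-per-day distillation capacity quoted later in the release and near-continuous operation (an assumption on our part):

```python
# Rough sanity check of the quoted annual benefit (assumes ~365 operating days).
margin_gain_usd_per_bbl = 0.30
barrels_per_day = 186_000
annual_gain = margin_gain_usd_per_bbl * barrels_per_day * 365
print(f"~${annual_gain / 1e6:.1f} million per year")   # roughly $20 million
```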

Improvement of industrial processes

For María Victoria Zingoni, “this is an efficiency project in all senses: it seeks to consume fewer resources; reduce energy consumption, which is the highest cost of a refinery; increase unit reliability and, by extension, improve economic performance.”

“This initiative belongs to a more comprehensive plan to take advantage of the possibilities afforded us by the latest in technology, and improve industrial processes. We are not afraid of aiming for the stars, even if some projects will fail. This is about learning as fast as possible and that machines help people in their work,” said Repsol’s Executive Managing Director of Downstream.

Google’s Country Manager for Spain and Portugal said that “This project demonstrates the commitment from Spanish companies to digital transformation and the application of machine learning in industrial processes, of which Repsol is a pioneer.”

“At Google, we are deeply committed to sustainability and ensuring that we have a positive impact on the environment – and we see technology such as machine learning and data analytics play an important role in helping our customers maximize their own efficiency. We are proud to collaborate with a company such as Repsol, which has been a leader for many years in leveraging technological innovation to reduce its environmental impact,” said Fuencisla Clemares, Country Manager Google España y Portugal.


This project is compatible with other digital initiatives that are already in use at Repsol’s industrial facilities, such as Siclos, with which Repsol’s refinery control panel operators learn, in real time, the economic implications of operating decisions; or Nepxus, which increases planning, analysis and agility in decision-making in the control rooms of these industrial installations.

Tarragona is one of the six refineries that Repsol operates in Spain and Peru. This plant has the capacity to distill 186,000 barrels of oil a day and is Repsol’s third-largest unit.

The facility occupies over 500 hectares and is as large as Tarragona’s city center. The refining unit processes 9.5 million tons of raw material a year and the storage tanks can hold a million cubic meters.

REPSOL Press Release 
