Monday, February 18, 2008

Innovation and Technology

Deepwater Activity in the US Gulf of Mexico Continues to Drive Innovation and Technology
The deepwater Gulf of Mexico (GOM) is an integral part of the US energy supply and one of the world’s most important oil and gas provinces. As a result of its proximity to key markets and a long history of exploration and development, the region is now seeing a transition towards deeper and more challenging exploration and development, including both deeper water and deeper wells. These trends are the motivation behind a renewed drive towards advancing the state of key technologies.

Growth of US Deepwater
The boundary between shallow water and deepwater is variously placed anywhere from 656ft (200m) to 1,500ft (457m) of water. Here, Minerals Management Service (MMS) definitions are used: deepwater means water depths of 1,000ft (305m) or more, and ultra-deepwater means water depths of 5,000ft (1,524m) or more.1 As of early 2006, there were 118 deepwater projects on production. Production from deepwater by the end of 2004 was approximately 950,000 barrels of oil and 3.8 billion cubic feet of natural gas per day. More than 980 deepwater exploration wells have been drilled since 1995, and at least 126 deepwater discoveries have been announced from that effort. In the last six years, there have been 22 discoveries in water depths greater than 7,000ft (2,134m), with 11 of those discoveries in the last two years. Approximately one-third of the world’s deepwater rig fleet is committed to GOM service. The average deepwater GOM field discovery is several times larger than the average shallow-water discovery, and deepwater fields are some of the most prolific producers in the GOM. Announced volumes for these deepwater discoveries are more than 1.8 billion barrels of oil equivalent (BOE). The growth of activity in the deepwater GOM has accelerated over the last few years, although it has been developing for over two decades.
Deepwater production began in 1979 with Shell’s Cognac field, and it took five more years before the next deepwater field (ExxonMobil’s Lena field) came online. Both developments relied on extending platform technology to greater water depths. Over the last 14 years, all phases of deepwater activity have expanded. There are over 8,200 active GOM leases, with 54% of those in deepwater. Contrast this with approximately 5,600 active GOM leases in 1992, with only 27% in deepwater. On average, there were 30 rigs drilling in deepwater in 2005 compared with only three rigs in 1992. In the period 1992–2002, deepwater oil production rose by over 840% and deepwater gas production increased by about 1,600%. The Deep Water Royalty Relief Act (DWRRA), which provides economic incentives to develop leases in deepwater, has clearly had a significant impact on deepwater GOM activities. Deepwater exploration and production growth have been enabled by remarkable technology advances over time. These advances continue today, with many new technologies currently in the research phase for future deployment.

Discoveries
Recent discoveries continue to expand the exploration potential of the deepwater GOM. Of total GOM proved reserves, 99% are in Neogene-age and younger reservoirs (Pleistocene, Pliocene and Miocene). However, several recent deepwater discoveries encountered large potential reservoirs in sands of Paleogene age (Oligocene, Eocene and Paleocene). The discovery of these Paleogene-age reservoirs has opened wide areas of the deepwater GOM to further drilling, focused on two frontier plays: the Mississippi Fan Foldbelt and the Perdido Foldbelt. With the drilling of the Trident and Cascade discoveries (AC 903 and WR 206) in 2001 and 2002, the potential for an extensive Lower Wilcox sand extending from Alaminos Canyon to Walker Ridge was established.
Deposition of the Lower Wilcox appears to have been largely unaffected by salt tectonics, resulting in a thick sand across a broad geographic area. The Cascade discovery established turbidite sands more than 350 miles downdip of their source deltas in south Texas. Two further subsalt discoveries have been made in the Lower Wilcox: St Malo (WR 678) and Jack (WR 759). To date, there have been five Lower Wilcox and/or Paleogene discoveries in Alaminos Canyon and four Lower Wilcox discoveries in Walker Ridge. The Paleogene-age reservoirs provide a promising exploration trend. However, a number of challenges must be addressed before production can begin. Appraisal wells must be drilled to test reservoir quality and producibility. Other challenges include the completion and production of deep reservoirs in the ultra-deepwater GOM, for which infrastructure must be developed. Successful exploration has also occurred in the eastern GOM, with announced discoveries in DeSoto Canyon (Spiderman/Amazon and San Jacinto) and Lloyd Ridge (Atlas, Atlas NW, Cheyenne and Mondo Northwest). At least six of these discoveries encountered Miocene-age reservoirs, and all ten are in water depths greater than 7,800ft (2,378m). The Mississippi Fan Foldbelt trend saw three Lower Miocene oil discoveries in 2005: Knotty Head (GC 512), Genghis Khan (GC 652) and Big Foot (WR 29). Chevron’s successful production test at its Tahiti discovery well (GC 640) in 2004 undoubtedly spurred further exploration of the trend. Tahiti tested a structural trap beneath an 11,000ft (3,354m) thick salt canopy. The discovery well produced at a restricted rate of 15,000 barrels of oil per day (15 MBOPD). Rate and pressure analyses indicate that the well may be capable of a sustained flow of as much as 30 MBOPD. Until recently, there had been only a gradual increase in drilling depth.
However, since 1996 the maximum drilling depth has increased rapidly, reaching true vertical depths (TVDs) just below 30,000ft (9,144m) in 2002. The Transocean Discoverer Spirit drilled the deepest well in the GOM to date: Chevron/Unocal’s Knotty Head discovery in Green Canyon Block 512, at a TVD of 34,157ft (10,411m), in December 2005. This recent dramatic increase in TVD may be attributed to several factors, including enhanced rig capabilities, deeper exploration targets and the general trend towards greater water depths. In the last five years, 12 wells have been drilled in water depths exceeding 9,000ft (2,744m) and, in December 2003, the first well in water depths over 10,000ft (3,050m) was drilled. The water-depth drilling record of 10,011ft (3,051m) was set by Chevron in Alaminos Canyon Block 951 in late 2003.

Productivity
High production rates have been a driving force behind the success of deepwater operations. For example, a Shell Bullwinkle well produced approximately 5,000 barrels of oil per day (BOPD) in 1992. In 1994, a Shell Auger well set a new record, producing about 10,000 BOPD. From 1994 to mid-1999, maximum deepwater oil production rates continued to climb. BP’s Horn Mountain project came online in early 2002 in a water depth of 5,400ft (1,646m), with a single-well maximum rate of more than 30,000 BOPD. Since mid-2002, oil production rates have declined in the 1,500–4,999ft (457–1,524m) water-depth interval. However, production rates have increased steeply in the 5,000ft (1,524m) and greater water-depth interval. The record daily oil production rate for a single well is 41,532 BOPD (Troika). In terms of gas production, maximum well rates were around 25 million cubic feet per day (MMCFPD) until a well in Shell’s Popeye field raised the deepwater production record to over 100 MMCFPD in 1996. Since then, the deepwater has yielded even higher maximum production rates.
In 1997, Shell’s Mensa field showed the potential for deepwater production rates beyond the 5,000ft (1,524m) water-depth interval. The record GOM daily gas production rate is 158 MMCFPD (Mensa). The average GOM deepwater oil well currently produces at about 25 times the rate of the average shallow-water oil well, while the average GOM deepwater gas well currently produces at about eight times the rate of the average shallow-water gas well.

Subsea Toolkit
There were fewer than ten subsea completions per year until 1993, but this number increased dramatically throughout the 1990s. Shallow-water subsea wells began to make up a significant proportion of the total number of GOM subsea wells, accounting for 151 of the 348 subsea wells by year-end 2005. Operators have found subsea tiebacks to be valuable for marginal shallow-water fields because of the extensive infrastructure of available platforms and pipelines. As a result of these factors, there has been an increasing reliance on subsea technology to develop both shallow-water and deepwater fields. The technology required to implement subsea production systems in deepwater has evolved significantly in the last 17 years. A water depth of 350ft (107m) was the deepest subsea completion until 1988, when the water-depth record for the GOM jumped to 2,243ft (684m) with the Green Canyon 31 project. In 1996, another record was reached with a subsea completion in 2,956ft (901m) of water (Mars project), followed by a 1997 subsea completion in 5,295ft (1,614m) of water (Mensa project). Currently, Coulomb has the deepest subsea production in the GOM, in a water depth of 7,591ft (2,313m). Nearly 70% of subsea completions are in water depths of less than 2,500ft (762m). For subsea wells to continue to advance to greater water depths and harsher environments, technological improvements are needed. Currently, the industry is working to ensure that new advancements are developed in a safe and environmentally conscientious manner.
Technologies currently under evaluation include high-integrity pressure protection systems (HIPPS), high-pressure, high-temperature (HPHT) materials and subsea processing.

High-pressure, High-temperature Future
As deepwater wells are drilled to greater depths, they begin to encounter the same HPHT conditions that shallow-water wells see at shallower depths. HPHT development is therefore one of the greatest technical challenges facing the oil and gas industry today. Materials that have been used for many years now face unique and critical environmental conditions. The industry is working on a number of collaborative fronts to evaluate these issues and to develop appropriate technologies to mitigate potential hazards. Such efforts include joint research, knowledge sharing via industry conferences and focused standards development via technical committees of the American Petroleum Institute (API), the National Association of Corrosion Engineers (NACE) and other groups.

Summary
Significant challenges exist for deepwater exploration and development. Deepwater operations are expensive and require significant amounts of time between initial discovery and first production. Despite these challenges, deepwater fields have demonstrated prolific performance, with successful developments providing great rewards. For the growth and deepening trends to continue, technology will be a required point of leverage: it will be needed to resolve current gap challenges, such as those posed by HPHT prospects in deepwater, and to bring further cost efficiencies into the exploration, development and production phases.
High Integrity Protection Systems (HIPS) – Making SIL Calculations Effective
In the oil industry, traditional protection systems as defined in American Petroleum Institute (API) 14C are more and more often being replaced by high integrity protection systems (HIPS). In particular, this encompasses the well-known high integrity pressure protection systems (HIPPS), used specifically to protect against overpressure. As safety instrumented systems (SIS), they have to be analysed through the formal processes described in the International Electrotechnical Commission (IEC) 61508 and IEC 61511 standards in order to assess which Safety Integrity Levels (SIL) they can claim. What really matters when dealing with safety systems is that the probability of an accident is low enough to be acceptable given the magnitude of the consequences. This can be demonstrated in many different ways: applying rules, know-how or standards that may be deterministic, probabilistic, qualitative or quantitative; using reliability analysis methods and tools; collecting statistics; and so on. Among them we find SIL calculations as per IEC 61508 and IEC 61511. We have to keep in mind that calculating a SIL is not an end in itself: it is only one tool among many to help engineers master safety through the whole life cycle of a safety system. This proves very efficient from an organisational point of view but, unfortunately, problems arise when probabilistic calculations are performed by analysts who think it is a very easy job, consisting only of applying some magical formulae (found in IEC 61508, Part 6) or of assembling a kind of ‘Lego’ from certified, SIL-rated elements bought off the shelf. Beyond the fact that sound mathematical theorems (Bellman or Gödel) demonstrate that doing it that way gives no guarantee of good results, this is the complete negation of the spirit developed in the reliability field over the last 50 years, which is based on a sound knowledge of probabilistic concepts and in-depth analysis of the systems under study.
Therefore, a skilled reliability analyst who aims to use the above standards in a way that is both clever and compatible with traditional analysis has to resolve several difficulties. This is simple for the relationship between the IEC standards’ probability concepts and those recognised in the reliability field, or for the failure taxonomy and definitions, which may need improvement; it is more difficult for handling the complex test and maintenance procedures encountered in the oil industry; and it is almost impossible for some concepts, such as the ‘Safe Failure Fraction’ (SFF), which is not really relevant in our field, where spurious failures have to be thoroughly considered and avoided.

SIL versus Traditional Concepts
The size of this article being limited, we will only give some indications about our way of managing SIL calculations efficiently for oil production installations. Figure 1 shows the links with the traditional concepts. The first protection layer works in continuous mode, and the standards require calculation of its Probability of Failure per Hour (PFH). This is actually an average frequency of failure. When the expected number of failures over [0, T] is small compared with 1, PFH may be approximated by F1(T)/T; when this is not the case, 1/MTTF shall be used instead. In these formulae, F1(T) is the unreliability of this layer over [0, T] and MTTF its classical Mean Time To Fail. In the general case, therefore, PFH cannot be assimilated to a failure rate. This gives the demand frequency on the second layer, which runs in low-demand mode (if the first layer is efficient). Its Probability of Failure on Demand (PFD), as per the standards, is in fact its average unavailability P2. Then F1(T)·P2 is the probability that both protection layers fail during a given period T. If there is no further protection layer, this is the probability of an accident. If a third protection layer is installed, this will be the demand on that layer.
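As a rough numerical illustration of these relationships, the sketch below computes the PFH of a continuous-mode first layer, the average PFD of a periodically proof-tested second layer (using the classic 1oo1 approximation from IEC 61508, Part 6), and the combined probability F1(T)·P2 that both layers fail over a period T. All the numbers are assumptions chosen for the example, not data from this article.

```python
import math

# Illustrative parameters -- assumptions for the sketch, not values from the article
lam1 = 1.0e-4     # failure rate of the first (continuous-mode) layer, per hour
lam2_du = 2.0e-6  # dangerous undetected failure rate of the second layer, per hour
tau = 8760.0      # proof-test interval of the second layer (one year, in hours)
T = 8760.0        # observation period, in hours

# Unreliability of the first layer over [0, T], assuming a constant failure rate
F1 = 1.0 - math.exp(-lam1 * T)

# PFH: when the expected number of failures over [0, T] is small compared
# with 1, F1(T)/T is a good approximation; otherwise fall back on 1/MTTF
# (for a constant failure rate, MTTF = 1/lam1, so 1/MTTF = lam1).
pfh = F1 / T if lam1 * T < 0.1 else lam1

# Average PFD of the periodically tested second layer, using the classic
# 1oo1 low-demand approximation lambda_DU * tau / 2
pfd2 = lam2_du * tau / 2.0

# Probability that both protection layers fail during [0, T]: F1(T) * P2
p_accident = F1 * pfd2

print(f"F1(T)              = {F1:.3e}")
print(f"PFH                = {pfh:.3e} per hour")
print(f"average PFD        = {pfd2:.3e}")
print(f"P(accident over T) = {p_accident:.3e}")
```

Note that with these assumed values lam1·T is close to 1, so F1(T)/T would understate the failure frequency and the 1/MTTF branch is used, exactly the distinction the text draws.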
Note that the Risk Reduction Factor (RRF) is infinite when working in continuous mode. The standards split the demand mode between low and high according to the demand frequency (lower or greater than once per year). From a probabilistic calculation point of view, we prefer to consider the relationship between the test and demand frequencies instead: when the test frequency is high compared with the demand frequency, PFD may be used; otherwise it is better to use the unreliability, which provides a conservative estimate. From a failure-mode point of view, the main problem encountered is that genuine on-demand failures are forgotten by the standards. They are likely to occur when the system experiences sudden changes of state. Therefore, they shall be taken into consideration when calculating the PFD, which comprises both hidden failures (occurring within test intervals) and genuine on-demand failures (due to the tests or demands themselves). Another commonly encountered problem is that a superficial reading of the standard leads one to think that every revealed failure automatically becomes safe. This, of course, is not true: it remains unsafe until something is done to make it safe. This also has to be considered in the calculations.
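To make the decomposition of the PFD concrete, the minimal sketch below combines the two contributions just described: hidden failures that accumulate between proof tests, and genuine on-demand failures triggered by the demand or test itself. The rate lam_du, the test interval tau and the per-demand probability gamma are assumed example values, not figures from this article.

```python
# Illustrative parameters -- assumptions, not values from the article
lam_du = 2.0e-6   # dangerous undetected failure rate, per hour
tau = 8760.0      # proof-test interval, in hours (yearly proof test)
gamma = 1.0e-4    # genuine on-demand failure probability, per demand

# Hidden failures accumulate between proof tests; their average
# contribution to unavailability is lambda_DU * tau / 2.
pfd_hidden = lam_du * tau / 2.0

# Genuine on-demand failures happen at the demand (or test) itself,
# whatever the time since the last test, so they add directly.
pfd_total = pfd_hidden + gamma

print(f"hidden-failure contribution: {pfd_hidden:.3e}")
print(f"on-demand contribution:      {gamma:.3e}")
print(f"total PFD:                   {pfd_total:.3e}")
```

Omitting gamma, as a literal reading of the standard invites, understates the PFD whenever the on-demand term is comparable to the hidden-failure term.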

Health and Safety issues in the offshore industry – a precautionary tale
There have been significant improvements in health and safety in the offshore industry since the catastrophic Piper Alpha disaster in 1988, when an explosion and the resulting fire cost the lives of 167 workers. Despite these improvements, risks for the 20,000-strong offshore workforce are still ever present: fire, explosion and infrastructure failure all have the potential to cause major loss of life.

News broke on Tuesday evening that workers based 120 miles offshore had been evacuated by helicopter from a semi-submersible drilling rig following an engine fire. In fact, a total of 32 of the 87 personnel were taken off the Ocean Guardian, owned by Diamond Offshore, before the fire was brought under control. Fortunately, no one was injured. "We have had a fire in the engine room," said a spokesman for Diamond. "As a precaution we began down-manning non-essential personnel." A fire suppression system was activated when the fire broke out. Innovative water mist systems are held in high regard in the industry and are becoming the sought-after solution to fire on offshore installations.

With a geographically isolated workforce, as well as the inherent dangers of working offshore, the industry needs the best health and safety management: the kind that Diamond Offshore showed through its precautionary evacuation.