The company is planning to pressure test the A annulus. In 2017, multi-barrier imaging (Empulse) was deployed to detect metal loss due to corrosion and metal wear. The maximum metal loss in specific intervals ranged between 13% and 18%. One of the wells was found to have significant corrosion in the 7" production tubing, with 34% metal loss below the casing tie-back shoe. API SPEC 5CT (Specification for Casing and Tubing) specifies that the minimum permissible pipe wall thickness is 87.5% of the nominal wall thickness, i.e. a tolerance of -12.5%. Metal loss higher than 12.5% requires that the casing be downgraded or the corroded pipe replaced. How is the remaining strength of corroded pipe determined? That is, by what proportion does the loss of metal reduce the strength of the casing when the metal loss exceeds the specified 12.5%?
Supplementing David's and Doug's responses, a few additional points might help:
1. Was the pipe ordered with the API tolerance? That is, is Kwall (the wall-thickness tolerance) 12.5% or less?
2. Is the wall loss at the pipe body only, or at the connections as well? Connection wear and integrity assessment is another complex subject.
3. Consider the potential collapse risk of tubing with 34% wall loss. For collapse strength, a linear derating can be applied (see the first sketch after this list).
4. If you cannot determine how the logged WT (wall thickness) was computed, assume the wall loss is referenced to the minimum wall. That is, add Kwall to the reported wear (worst case) and check whether the recalculated burst strength still suffices (see the second sketch after this list).
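A minimal sketch of the linear derating in point 3, in Python. The nominal collapse rating used below is illustrative only, not a value from this thread; take the actual rating for your exact pipe grade and weight from the API 5C3 tables.

```python
def derated_collapse(nominal_collapse_psi: float, wall_loss_frac: float) -> float:
    """Linearly derate a published collapse rating by the wall-loss fraction."""
    return nominal_collapse_psi * (1.0 - wall_loss_frac)

# Illustrative only: a nominal 7" collapse rating of 7,000 psi with 34% wall loss
print(derated_collapse(7000.0, 0.34))  # -> 4620.0 psi remaining
```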
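And a sketch of the worst-case burst check in point 4, using Barlow's formula P = 2*Yp*t/D. The pipe properties assumed here (7" OD, 0.408" nominal wall, 80,000 psi yield) are for illustration; substitute your pipe's actual properties and compare the result against the anticipated pressure-test load with your design factor.

```python
def barlow_burst(yield_psi: float, wall_in: float, od_in: float) -> float:
    """Barlow's formula: internal yield pressure for a given remaining wall."""
    return 2.0 * yield_psi * wall_in / od_in

def worst_case_wall(nominal_wall_in: float, wear_frac: float,
                    kwall_frac: float = 0.125) -> float:
    """Worst case: assume the logged wear is referenced to minimum wall,
    so stack the mill tolerance (Kwall) on top of the reported wear."""
    return nominal_wall_in * (1.0 - wear_frac - kwall_frac)

# Assumed example: 7" OD, 0.408" nominal wall, 80,000 psi yield, 34% reported wear
t_wc = worst_case_wall(0.408, 0.34)        # ~0.218" remaining wall, worst case
print(barlow_burst(80_000.0, t_wc, 7.0))   # ~4,990 psi recalculated burst
```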
Hope this helps!