Shale gouge ratio


Shale Gouge Ratio (SGR) is a mathematical algorithm for predicting the fault rock types of simple fault zones developed in sedimentary sequences dominated by sandstone and shale.
The parameter is widely used in the oil and gas exploration and production industries to make quantitative predictions about the hydrodynamic behavior of faults.

Definition

At any point on a fault surface, the shale gouge ratio equals the net shale/clay content of the rocks that have slipped past that point. For a fault throw t, it is computed as SGR = (Σ Vcl,i × Δz,i / t) × 100%, where the sum runs over the beds of thickness Δz,i and volumetric clay fraction Vcl,i within the throw window.
The SGR algorithm assumes complete mixing of the wall-rock components within any particular throw interval, so the parameter measures the fault zone's 'upscaled' composition.
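The definition above can be sketched in a few lines of code. This is an illustrative implementation, not from the article itself: it assumes the wall rock is described as a down-section list of (bed thickness, clay fraction) pairs, and it evaluates SGR at a point where the given throw interval has slipped past.

```python
# Illustrative sketch of the SGR calculation at one point on a fault.
# Assumed inputs (not specified in the article): beds as (thickness_m, vclay)
# tuples for the section that has slipped past the point, and throw in metres.

def shale_gouge_ratio(beds, throw):
    """Return SGR as a percentage for the given throw window."""
    remaining = throw
    clay_thickness = 0.0
    for thickness, vclay in beds:
        t = min(thickness, remaining)   # clip the last bed at the throw window
        clay_thickness += t * vclay     # clay contributed by this bed
        remaining -= t
        if remaining <= 0:
            break
    return 100.0 * clay_thickness / throw

# Example: 60 m of throw across an interbedded sand-shale sequence
beds = [(20.0, 0.15), (10.0, 0.80), (25.0, 0.10), (15.0, 0.90)]
print(shale_gouge_ratio(beds, 60.0))  # → 30.0
```

Because the beds are averaged over the throw window, SGR varies smoothly across the fault surface even where individual shale beds are thin.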

Application to hydrocarbon exploration

Hydrocarbon exploration involves identifying and defining accumulations of hydrocarbons trapped in subsurface structures, which are often segmented by faults. A thorough trap evaluation requires predicting whether each fault will seal or leak hydrocarbons, and estimating how 'strong' any fault seal might be. The 'strength' of a fault seal can be quantified as the subsurface pressure, arising from the buoyancy forces within the hydrocarbon column, that the fault can support before it leaks. The maximum such pressure a fault zone can withstand is its capillary threshold pressure.
For faults developed in sandstone and shale sequences, the first-order control on capillary threshold pressure is likely to be the composition of the fault-zone material, particularly the shale or clay content. SGR is used to estimate the shale content of the fault zone.
Generally, fault zones with higher clay content, equivalent to higher SGR values, can support higher capillary threshold pressures. Other factors also exert control on the threshold pressure, such as the depth of the rock sequence at the time of faulting and the maximum depth to which it has subsequently been buried. Once maximum burial depth exceeds about 3 km, the effective strength of the fault seal increases for all fault-zone compositions.
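The two trends described here can be combined into a schematic model. The sketch below is purely illustrative: it assumes a placeholder empirical form in which threshold pressure rises exponentially with SGR, plus an offset for maximum burial deeper than 3 km; the constants are invented for illustration and do not represent any published calibration.

```python
# Schematic only: placeholder constants, NOT a calibrated seal-strength model.
# Encodes the two qualitative trends from the text: higher SGR -> stronger
# seal, and maximum burial beyond ~3 km -> stronger seal at any composition.

def threshold_pressure_bar(sgr_percent, max_burial_km):
    """Illustrative capillary threshold pressure (bar) from SGR (%) and
    maximum burial depth (km)."""
    depth_bonus = 0.5 if max_burial_km > 3.0 else 0.0  # deeper burial strengthens the seal
    return 10 ** (sgr_percent / 27.0 - 0.5 + depth_bonus)

for sgr in (20, 40, 60):
    print(sgr,
          round(threshold_pressure_bar(sgr, 2.0), 2),   # shallow maximum burial
          round(threshold_pressure_bar(sgr, 3.5), 2))   # deep maximum burial
```

In practice, such relationships are calibrated against observed pressure differences across faults in drilled fields rather than assumed.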