DISCUSSION

Reply to the discussion by J.M. Duncan, M. Navin, and T.F. Wolff on “Probabilistic slope stability analysis for practice”¹

H. El-Ramly, N.R. Morgenstern, and D.M. Cruden
The authors wish to thank the discussers for their interest in the paper and their valuable comments. Our response takes up the important issues raised and provides more details of our methodology. The response considers the four issues in their original sequence.
Variance reduction due to spatial averaging
Since the performance of earth slopes is controlled, in most cases, by the average material properties and pore pressures over the area of the slip surface, it is logical to consider the variability of the average values (rather than the variability between discrete points) in any probabilistic slope assessment. As demonstrated in the paper (El-Ramly et al. 2002), the variance of the spatial average of a property over an area or volume can be significantly less than that of the measured data. While this concept has been recognized for some time (Anderson et al. 1984; Li and Lumb 1987; Baecher 1987), for simplicity, several studies have used the variance of the measured data without any reduction. The underlying assumption is that the operational shear strength, for example, controlling the stability of the slope is constant at all locations within the domain of the problem and can assume any value within the measured minimum–maximum range. This assumption has two main consequences: first, the probability of unsatisfactory performance can be grossly overestimated; second, the probability of unsatisfactory performance becomes detached from the scale at which failure takes place.
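The reduction can be quantified with a variance function. As an illustrative sketch (not the authors' computation), assume an exponential autocorrelation ρ(τ) = exp(−2τ/δ), where δ is the scale of fluctuation; the factor Γ²(L) by which the point variance is reduced when a property is averaged over a length L then has a closed form:

```python
import math

def variance_reduction(L, delta):
    """Variance reduction factor Gamma^2(L) for the spatial average of a
    property over a length L, assuming an exponential autocorrelation
    rho(tau) = exp(-2*tau/delta) with scale of fluctuation delta.
    Closed form of (2/L^2) * integral_0^L (L - tau) * rho(tau) d(tau)."""
    r = delta / L
    return r * (1.0 - 0.5 * r * (1.0 - math.exp(-2.0 / r)))

# Averaging over a domain comparable to delta gives little reduction;
# a domain much larger than delta reduces the variance substantially.
delta = 10.0                   # assumed scale of fluctuation (m)
for L in (10.0, 30.0, 100.0):  # size of the spatial averaging domain (m)
    print(f"L = {L:5.1f} m  Gamma^2 = {variance_reduction(L, delta):.3f}")
```

The printed factors fall from about 0.57 toward δ/L as the averaging domain grows, which is the behaviour the paragraph above describes.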
To illustrate the above points, El-Ramly (2001) analysed a hypothetical slope configuration, Fig. D1, using soil data from an actual failure of the cut slope behind housing block No. 36 of the Shek Kip Mei Estate in Hong Kong (FMSW 2000). Three slip surfaces, corresponding to failure of the entire slope (slip surface 1), failure of the upper 2/3 of the slope (slip surface 2), and failure of the top 1/4 of the slope height (slip surface 3), were analysed probabilistically. The analyses were performed using our probabilistic slope analysis methodology, as well as the simplified probabilistic approach with no variance reduction due to spatial averaging. The Spencer method of slices (Spencer 1967) was used in the stability calculations. The results of the analyses are summarized in Table D1.
As in the James Bay dyke study, the simplified analysis overestimates the probability of unsatisfactory performance. The difference in the estimated probabilities arises from ignoring the reduction in the variances of the input variables due to spatial averaging in the simplified analysis. For slip surface 3, the spatial averaging domain (the slip surface) is comparable in size to the assumed scale of fluctuation (δ ≈ 2rₒ = 10 m), and no variance reduction was introduced. Hence, the estimated probabilities of unsatisfactory performance were identical. For slip surface 1, the averaging domain is significantly larger than the scale of fluctuation, and the reduction in the variances of the input variables is substantial. The probability of unsatisfactory performance from the simplified analysis is 38 times that of the analysis accounting for spatial averaging. Such a large difference should have a significant impact on any decision making.
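The inflation can be illustrated with a short simulation. The sketch below takes the two first-moment pairs for slip surface 1 from Table D1 and, purely for illustration, assumes a lognormal factor of safety (the actual distributions in the paper are generated by simulation and are skewed, so the probabilities only roughly track the table; the point is the ratio):

```python
import math
import random

random.seed(0)

def p_unsatisfactory(mean_fs, sd_fs, n=200_000):
    """Monte Carlo estimate of P[FS < 1], assuming (for illustration only)
    a lognormally distributed factor of safety with the given moments."""
    zeta2 = math.log(1.0 + (sd_fs / mean_fs) ** 2)  # variance of ln(FS)
    lam = math.log(mean_fs) - 0.5 * zeta2           # mean of ln(FS)
    sigma = math.sqrt(zeta2)
    failures = sum(random.lognormvariate(lam, sigma) < 1.0 for _ in range(n))
    return failures / n

# Moments for slip surface 1 (Table D1): with and without variance
# reduction due to spatial averaging.
p_reduced = p_unsatisfactory(1.45, 0.174)     # spatial averaging accounted for
p_simplified = p_unsatisfactory(1.45, 0.347)  # no variance reduction
print(f"P_u with averaging: {p_reduced:.1e}; simplified: {p_simplified:.1e}")
```

Even under this simplified distributional assumption, ignoring the variance reduction inflates the estimated probability by more than an order of magnitude.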
For the analyses based on our methodology, the probabilities of unsatisfactory performance of larger slip surfaces are lower than those of the smaller surfaces, even though the latter have higher factors of safety. This might seem unusual from a conventional point of view, but not from a probabilistic perspective. Because of the spatial variability of soil properties, encountering a sufficiently low strength to induce failure in localized areas of the slope is more likely than such an encounter over the entire slope height. Similarly, the probability of occurrence of small local zones of high pore pressure is much higher than that of large zones. In other words, the uncertainty in the average properties and pore pressure along the slip surface, and consequently the probability of unsatisfactory performance, are much higher for a small failure than for a large failure. This is shown clearly in the database of landslide incidents in Hong Kong, Table D2.
The frequency of occurrence of minor failures is almost 30 times that of massive failures. Both the conventional slope analysis based on the factor of safety and the simplified probabilistic analysis fail to address this issue of scale of failure.

Can. Geotech. J. 40: 851–855 (2003). doi: 10.1139/T03-031. © 2003 NRC Canada

Received 29 November 2002. Accepted 13 March 2003. Published on the NRC Research Press Web site at http://cgj.nrc.ca on 11 August 2003.

H. El-Ramly.² AMEC Earth and Environmental, 4810–93 Street, Edmonton, AB T6G 5M4, Canada.
N.R. Morgenstern and D.M. Cruden. Geotechnical and Geoenvironmental Group, Department of Civil and Environmental Engineering, University of Alberta, Edmonton, AB T6G 2G7, Canada.

¹Appears in Canadian Geotechnical Journal, 40: 848–850.
²Corresponding author (e-mail: hassan.el-ramly@ualberta.net).
The concepts of spatial variability of soil data and variance reduction due to spatial averaging are independent of the methods and techniques used to perform a probabilistic analysis. Monte Carlo simulation, using Microsoft® Excel and @Risk software, is only a tool to conduct probabilistic analyses, similar to the first-order second-moment (FOSM) and the point estimate methods. As the discussers indicated, the reduction in variance due to spatial averaging can be implemented in conjunction with any of these techniques.
The authors agree that simple methods to estimate the autocorrelation distance are needed to facilitate the implementation of these concepts in practice. The analytical techniques used to estimate the autocorrelation distance require
not only substantial amounts of data, but also data at very close spacings that would be deemed redundant in conventional practice. At present, the information gathered in a typical site investigation program would likely be inadequate for an analytical assessment of the autocorrelation distance.

Table D1. Summary of results of probabilistic analyses for different slip surfaces.

Slip surface            El-Ramly et al. (2002) methodology     Simplified probabilistic analysis(a)
No. (Fig. D1)   E[FS]   σ[FS]   Skewness   Pu                  σ[FS]   Skewness   Pu
1               1.45    0.174   0.24       2.1×10⁻³            0.347   0.47       8.0×10⁻²
2               1.56    0.275   0.43       8.4×10⁻³            0.466   0.75       8.6×10⁻²
3               1.83    0.679   1.00       6.1×10⁻²            0.679   1.00       6.1×10⁻²

Note: E[FS], mean factor of safety; σ[FS], standard deviation of factor of safety; skewness, coefficient of skewness of the probability density function of the factor of safety; Pu, probability of unsatisfactory performance.
(a) Variance reduction due to spatial averaging is ignored.

Fig. D1. Cross-section of the slope showing stratigraphy and slip surfaces corresponding to different failure scales.

Table D2. Number of landslide incidents in Hong Kong.

                                                      Minor failure   Major failure   Massive failure
Number of landslides                                  (<50 m³)        (50–500 m³)     (>500 m³)
Landslide incidents in 1997 and 1998 (GEO 1999)       703             58              26
Landslide incidents along BRIL roads for the
period 1982–1996 (ERM-Hong Kong Limited 1997)         286             27              9

Note: Classification of landslides into minor, major, or massive based on the volume of the slide mass is proposed by the Geotechnical Engineering Office of Hong Kong (GEO 1999).
The authors (El-Ramly et al. 2003) conducted a literature review of typical autocorrelation distances for different soil types and properties. The data showed that the variation in the autocorrelation distance is not large; in horizontal directions the distance varies from 10 to 40 m, while in the vertical direction it ranges from 1 to 3 m. In the absence of adequate data, they suggested that estimates of autocorrelation distances could be inferred from within these ranges. They pointed out, however, that the selection of an autocorrelation distance should be based on an understanding of the geological processes that formed any layering and of the direction, be it vertical or horizontal, in which the variabilities of properties would have more profound impacts on the slope performance. A parametric study using different autocorrelation distances gives further insights.
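Such a parametric study is easy to automate. The sketch below (an illustration, not the authors' study) sweeps assumed horizontal scales of fluctuation over the 10–40 m range from the literature review and reports the variance reduction factor for a hypothetical 60 m long averaging domain, again assuming an exponential autocorrelation ρ(τ) = exp(−2τ/δ):

```python
import math

def variance_reduction(L, delta):
    """Variance reduction factor for averaging over length L, assuming an
    exponential autocorrelation with scale of fluctuation delta."""
    r = delta / L
    return r * (1.0 - 0.5 * r * (1.0 - math.exp(-2.0 / r)))

L = 60.0  # hypothetical length of the spatial averaging domain (m)
for delta in (10.0, 20.0, 30.0, 40.0):  # assumed horizontal scales (m)
    gamma2 = variance_reduction(L, delta)
    print(f"delta = {delta:4.1f} m -> variance reduced to {gamma2:.2f} of the point variance")
```

The sweep shows how sensitive the retained variance, and hence the computed probability of unsatisfactory performance, is to the assumed autocorrelation distance.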
Relationship between reliability index and
probability of unsatisfactory performance
The marginal differences in the estimated probabilities of unsatisfactory performance noted by the discussers in the results of the FOSM analysis are attributed to rounding the numbers to two decimal places. The estimated probabilities are more consistent with the reliability index, β, and the mean and standard deviation of the factor of safety when the results are rounded to three decimal places, as shown in Table D3. As for the results of the analyses based on Monte Carlo simulation, using either the simplified analysis or our spreadsheet-based methodology, the probability density functions of the factor of safety are not normally distributed, as implied by the nonzero coefficients of skewness in Table D3. The discussers' estimate of Pu = 3.29 × 10⁻² is based on an assumed normal distribution and, hence, cannot be compared with the Monte Carlo simulation results.
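Under an assumed normal distribution of the factor of safety, the reliability index and the probability of unsatisfactory performance are linked by Pu = Φ(−β), with β = (E[FS] − 1)/σ[FS]. A short sketch using the circular-surface FOSM moments from Table D3 illustrates the link, together with the corresponding lognormal-assumption value:

```python
import math

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * math.erfc(-z / math.sqrt(2.0))

mean_fs, sd_fs = 1.457, 0.191  # FOSM moments, circular surface (Table D3)

# Normal assumption: beta = (E[FS] - 1) / sigma[FS], P_u = Phi(-beta)
beta = (mean_fs - 1.0) / sd_fs
p_normal = phi(-beta)

# Lognormal assumption: work with the first two moments of ln(FS)
zeta = math.sqrt(math.log(1.0 + (sd_fs / mean_fs) ** 2))
lam = math.log(mean_fs) - 0.5 * zeta**2
p_lognormal = phi(-lam / zeta)

print(f"beta = {beta:.3f}; P_u(normal) = {p_normal:.2e}; P_u(lognormal) = {p_lognormal:.2e}")
```

With three-decimal moments the computed values (β ≈ 2.393, Pu ≈ 8.4 × 10⁻³ normal, ≈ 2.4 × 10⁻³ lognormal) are consistent with the FOSM row of Table D3 to within input rounding.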
Importance of identifying the critical failure
mechanism
The authors emphasized in the paper the importance of identifying all potentially critical slip surfaces and agree with the discussers that no analysis can overcome missing a credible failure mechanism. This statement is not limited to probabilistic analyses; it applies to any slope analysis, including conventional deterministic analyses. The authors' intent in analyzing the James Bay dyke was to calibrate the proposed probabilistic methodology against a well established case study. No effort was put into investigating a failure mechanism different from that reported by Christian et al. (1994).
However, we now report deterministic and probabilistic analyses of the wedge failure mechanism identified by the discussers. The probabilistic analyses were conducted using our methodology, the FOSM method, and the simplified approach. The Spencer method of slices (Spencer 1967), used in the stability calculations, gave a minimum factor of safety of 1.22. The details of the spreadsheet model used in the probabilistic analyses are not presented here, as a description of a spreadsheet for Spencer's method is presented in El-Ramly (2001). Table D3 summarizes the results of all analyses. Using our methodology, the probability of unsatisfactory performance is estimated to be 1.2 × 10⁻¹; 25 times the probability of unsatisfactory performance of a circular slip surface. The curved, noncircular slip surface reported by the discussers can also be analysed using our methodology if its geometry can be defined analytically.
The writers compliment the discussers on having isolated a potential failure mechanism for this well documented case that has a significantly lower factor of safety than the original circular mechanism. It is of interest to note that, notwithstanding the distinguished engineers who participated in the original James Bay study, it remained for the discussers to make this important finding. This highlights other aspects of uncertainty in geotechnical engineering.
Advantages and disadvantages of
Microsoft® Excel and @Risk for evaluating
reliability of slopes
We have the following comments to make concerning the advantages and disadvantages of Monte Carlo simulation using Microsoft® Excel and @Risk software.
(1) Our probabilistic methodology is applicable to any geotechnical model that can be represented in a spreadsheet, including slope analyses for circular and noncircular slip surfaces. Advances in software engineering will further facilitate spreadsheet modelling of more complex geotechnical analyses.
(2) The initial effort required to develop a spreadsheet for a slope analysis is greater than that of using the FOSM method combined with commercial slope analysis software. However, once developed, the spreadsheet can be rapidly updated as required. Such changes are often needed in a probabilistic slope analysis as several failure mechanisms and sources of uncertainty are investigated. Also, the spreadsheet can be modified with little effort for a different slope problem.
(3) Most recent engineering graduates are familiar with spreadsheet software, and modelling a slope analysis in a spreadsheet is no longer a daunting task.
(4) The use of Monte Carlo simulation and our methodology overcomes the limitations of the FOSM method, which we outline below.
Limitations of the FOSM method
While the FOSM method is a useful tool for probabilistic analyses, the simplifying assumptions made in formulating this technique are seldom mentioned in the literature. To avoid misuse of the FOSM method, engineers conducting probabilistic analyses should be reminded of its limitations.
The FOSM method is based on a Taylor's series expansion of the performance function (the factor of safety equation, for example), retaining only the first-order terms. The discarded terms are functions of the second- and higher-order derivatives of the performance function, the variances and shapes of the probability density functions of the input variables, and the correlations among input variables. For a linear performance function, the second-order derivatives are equal to zero and the FOSM method is exact. For nonlinear functions, the accuracy of the FOSM method diminishes as the nonlinearity of the performance function increases. As an illustration, the second-order binomial y(x) = x² − x + 1 was analysed probabilistically using the FOSM method, Monte Carlo simulation using Microsoft® Excel and @Risk software, and Taylor's series expansion retaining the first- and second-order terms. The input variable, x, was assumed to be normally distributed with a mean of 2.0 and a standard deviation of 0.70. Table D4 summarizes the results of the analyses. The mean and standard deviation from Monte Carlo simulation and Taylor's series expansion with the second-order terms are identical, whereas the FOSM method shows errors of 14 and 5% in the mean and standard deviation, respectively. Unfortunately, retaining the second- and higher-order terms of the Taylor's series expansion of a complex performance function with more than one input variable is mathematically formidable. Engineers should, thus, consider the nonlinearity of the geotechnical model being analysed and the impact it might have on the predicted mean and variance. The nonlinearity of the performance function is not an issue in Monte Carlo simulation.
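The normal-input comparison can be reproduced with a few lines of code (here in Python rather than Excel/@Risk; the numbers mirror the first rows of Table D4):

```python
import math
import random

random.seed(0)

mu, sigma = 2.0, 0.70  # moments of the normally distributed input x

# FOSM (first order): E[y] = g(mu), sigma[y] = |g'(mu)| * sigma
fosm_mean = mu**2 - mu + 1.0            # g(mu) = 3.00
fosm_sd = abs(2.0 * mu - 1.0) * sigma   # |g'(mu)| * sigma = 2.10

# Second-order Taylor: E[y] = g(mu) + 0.5 * g''(mu) * sigma^2; for
# y = x^2 - x + 1 with normal x, Var[y] = (2*mu - 1)^2 * sigma^2 + 2 * sigma^4
second_mean = fosm_mean + 0.5 * 2.0 * sigma**2                    # 3.49
second_sd = math.sqrt((2*mu - 1)**2 * sigma**2 + 2 * sigma**4)    # 2.211

# Monte Carlo simulation
ys = []
for _ in range(200_000):
    x = random.gauss(mu, sigma)
    ys.append(x * x - x + 1.0)
mc_mean = sum(ys) / len(ys)
mc_sd = math.sqrt(sum((y - mc_mean) ** 2 for y in ys) / (len(ys) - 1))

print(f"FOSM:         E[y] = {fosm_mean:.2f}, sigma[y] = {fosm_sd:.3f}")
print(f"second order: E[y] = {second_mean:.2f}, sigma[y] = {second_sd:.3f}")
print(f"Monte Carlo:  E[y] = {mc_mean:.2f}, sigma[y] = {mc_sd:.3f}")
```

The first-order estimates miss the curvature term 0.5 g″(μ)σ², which is the source of the 14% error in the mean.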
Slope analysis methods were noted to be “reasonably linear” (Mostyn and Li 1993). For such functions, the accuracy of the FOSM method depends on the variances of the input variables and the shapes of their probability density functions. As the level of uncertainty in the input variables increases and their probability density functions become more skewed, the accuracy of the FOSM method decreases. The probabilistic analyses of the above binomial function were repeated assuming the input variable, x, has a positively skewed triangular distribution with a minimum of 1.01, a most likely value of 1.01, and a maximum of 3.98. The mean and standard deviation of this triangular distribution are the same as those of the normal distribution assumed above: 2.0 and 0.7, respectively. The results of the analyses are summarized in Table D4. In the FOSM analysis, the shapes of the probability distribution functions of the input variables have no impact on the analysis and, hence, the mean and standard deviation of y(x) remain unchanged. The results of Monte Carlo simulation and Taylor's series expansion with the second-order terms are identical and show an increase in the standard deviation, or uncertainty, of y(x). Hence, the error in the standard deviation σ[y] incurred by the FOSM method increased from 5 to 14% because of the skewness of the probability distribution of x.
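Python's standard library can reproduce the triangular case directly (random.triangular takes the minimum, maximum, and most likely value); the sample moments again mirror Table D4:

```python
import math
import random

random.seed(0)

low, high, mode = 1.01, 3.98, 1.01  # positively skewed triangular input

ys = []
for _ in range(200_000):
    x = random.triangular(low, high, mode)
    ys.append(x * x - x + 1.0)       # y(x) = x^2 - x + 1

mean_y = sum(ys) / len(ys)
sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys) / (len(ys) - 1))
p_low = sum(y <= 0.8 for y in ys) / len(ys)

# y(x) is increasing for x > 0.5, so min y = y(1.01) = 1.01 and P(y <= 0.8) = 0
print(f"E[y] = {mean_y:.2f}, sigma[y] = {sd_y:.3f}, P(y <= 0.8) = {p_low}")
```

The simulated standard deviation rises to about 2.43 even though the input mean and standard deviation are unchanged, which is exactly the skewness effect the FOSM method cannot see.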
The FOSM method allows the estimation of the mean and variance of the performance function, but provides no information about the shape of the probability density function. To estimate any probability, the shape of the probability distribution of the output has to be assumed. This assumption of, typically, a normal or a lognormal distribution introduces another source of inaccuracy. As an illustration, the probability of y(x) being less than 0.8 is estimated in Table D4 for both the normal and triangular distributions of x. Even for the accurate analysis accounting for the second-order terms of Taylor's series, there is almost an order of magnitude difference between the estimated probabilities based on normal and lognormal distributions. Similarly, our FOSM analysis and the discussers' analysis of the James Bay dyke assuming a circular slip surface show a ratio of four between the probabilities of unsatisfactory performance based on normal and lognormal distributions (Table D3 (this paper) and Table D2 (discussers' note)). Such a wide range could create difficulties for decision-making based on a probabilistic criterion. In Monte Carlo simulation, the shape of the probability density function of the factor of safety is generated in the simulation process, and there is no need to make assumptions.

Table D3. Comparing the outputs of different analysis approaches.

Circular slip surface using Bishop's method
Method of analysis                              E[FS]   σ[FS]   Skewness        Pu                            β
Spreadsheet-based probabilistic slope analysis  1.464   0.200   0.304           4.67×10⁻³                     2.320
FOSM                                            1.457   0.191   Not available   8.36×10⁻³(a); 2.46×10⁻³(b)    2.393
Simplified analysis(c)                          1.463   0.252   0.315           2.37×10⁻²                     1.837

Noncircular wedge slip surface using Spencer's method
Spreadsheet-based probabilistic slope analysis  1.217   0.191   0.388           1.20×10⁻¹                     1.136
FOSM                                            1.215   0.183   Not available   1.20×10⁻¹(a); 1.11×10⁻¹(b)    1.175
Simplified analysis(c)                          1.213   0.253   0.309           2.01×10⁻¹                     0.842

(a) Assuming the probability density function of the factor of safety is normal.
(b) Assuming the probability density function of the factor of safety is lognormal.
(c) Using Microsoft® Excel and @Risk software, and ignoring variance reduction due to spatial averaging.

Table D4. Comparison of results of a probabilistic analysis of a nonlinear function.

Method of analysis                Probability distribution of x   E[y]   σ[y]    P(y ≤ 0.8)
FOSM                              Not applicable                  3.00   2.100   14.74×10⁻²(a); 3.78×10⁻²(b)
Taylor's series retaining first-  Normal                          3.49   2.211   11.19×10⁻²(a); 1.24×10⁻²(b)
and second-order terms            Triangular                      3.49   2.431   13.43×10⁻²(a); 2.13×10⁻²(b)
Monte Carlo simulation(c)         Normal                          3.49   2.211   2.72×10⁻²
                                  Triangular                      3.49   2.431   0

Note: E[y], mean of y; σ[y], standard deviation of y; P(y ≤ 0.8), probability of y equal to or less than 0.8.
(a) Assuming the probability density function of y(x) is normal.
(b) Assuming the probability density function of y(x) is lognormal.
(c) Using Microsoft® Excel and @Risk software.
Since the assumed triangular distribution of x has a finite minimum of 1.01, the function y(x) has a minimum of 1.01, and hence the probability P[y(x) < 0.8] is equal to zero, as predicted by Monte Carlo simulation (Table D4). Probability estimates by both the FOSM method and Taylor's series with the second-order terms are based on assumed distributions with no finite minimums, and hence neither of the two approaches would have yielded a correct probability estimate. In practice, the designer may elect to set a practical minimum or maximum for one or more of the input variables, such as a minimum friction angle or a maximum pore pressure ratio. This can be done only in Monte Carlo simulation, by truncating the probability distributions of the input variable(s) at the desired limits.
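Truncation is straightforward to implement in a simulation. A minimal sketch (with hypothetical parameter values, not figures from the paper) truncates a normally distributed friction angle at designer-set limits by rejection sampling:

```python
import random

random.seed(0)

def truncated_gauss(mean, sd, minimum, maximum):
    """Sample a normal variate, rejecting values outside [minimum, maximum]."""
    while True:
        x = random.gauss(mean, sd)
        if minimum <= x <= maximum:
            return x

# Hypothetical input: a friction angle (degrees) with a practical floor of
# 8 degrees and a ceiling of 35 degrees set by the designer.
samples = [truncated_gauss(14.0, 3.0, 8.0, 35.0) for _ in range(50_000)]
print(f"min = {min(samples):.2f}, max = {max(samples):.2f}")
```

Every simulated realization then respects the physical limits, something a closed-form moment method cannot guarantee.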
Finally, the FOSM method is applied primarily to problems where there are no correlations among input variables. With extra effort, the method can also be applied where there is a correlation between two variables, but for more than two correlated variables, the FOSM method is very cumbersome. Acknowledging correlations among input variables can be crucial to the analysis. Analysing the stability of a tailings dyke, El-Ramly et al. (2003) noted a linear trend in the recorded pore pressure ratio in a clay-shale beneath the dyke; the scatter of the measurements around the mean trend was significant. To account for the statistical uncertainty in the linear trend, the intercept and slope of the trend line were modelled as negatively correlated random variables. Recently, El-Ramly conducted a probabilistic assessment of the stability of natural slopes in the Lethbridge area in southern Alberta, Canada. The stability of the slopes in the study area was largely governed by pore-water pressures and residual strengths of very thin clay layers. Among the input variables to the analysis were the slope angle, θ, and the residual friction angle of the clay, φ′r. Accounting for a positive correlation between θ and φ′r, in that steeper slope angles are associated with higher friction angles, was fundamental to the analysis. In Monte Carlo simulation using Microsoft® Excel and @Risk software, correlations between several pairs of input variables can be handled easily.
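In a general-purpose language the same effect can be obtained by construction. A sketch (with hypothetical moments and correlation, not the Lethbridge values) generates a positively correlated pair (θ, φ′r) from two independent standard normals:

```python
import math
import random

random.seed(0)

rho = 0.5                        # assumed correlation between theta and phi_r
mu_theta, sd_theta = 20.0, 4.0   # hypothetical slope angle (degrees)
mu_phi, sd_phi = 12.0, 2.0       # hypothetical residual friction angle (degrees)

thetas, phis = [], []
for _ in range(200_000):
    z1 = random.gauss(0.0, 1.0)
    # Cholesky construction: z2 has correlation rho with z1
    z2 = rho * z1 + math.sqrt(1.0 - rho**2) * random.gauss(0.0, 1.0)
    thetas.append(mu_theta + sd_theta * z1)
    phis.append(mu_phi + sd_phi * z2)

# The sample correlation should recover rho
n = len(thetas)
mt, mp = sum(thetas) / n, sum(phis) / n
cov = sum((t - mt) * (p - mp) for t, p in zip(thetas, phis)) / n
st = math.sqrt(sum((t - mt) ** 2 for t in thetas) / n)
sp = math.sqrt(sum((p - mp) ** 2 for p in phis) / n)
r = cov / (st * sp)
print(f"sample correlation = {r:.3f}")
```

This two-variable construction generalizes to any number of correlated inputs through a Cholesky factorization of the full correlation matrix, which is effectively what @Risk does internally.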
References

Anderson, L.R., Sharp, K.D., Bowles, D.S., and Canfield, R.V. 1984. Application of methods of probabilistic characterization of soil properties. In Proceedings of the ASCE Symposium on Probabilistic Characterization of Soil Properties — Bridge Between Theory and Practice, ASCE National Convention, Atlanta, Ga., 17 May. American Society of Civil Engineers, New York. pp. 90–105.
Baecher, G.B. 1987. Statistical analysis of geotechnical data. Final Report No. GL-87-1, USACE Waterways Experiment Station, Vicksburg, Miss.
Christian, J.T., Ladd, C.C., and Baecher, G.B. 1994. Reliability and probability in stability analysis. Journal of Geotechnical Engineering Division, ASCE, 120: 1071–1111.
El-Ramly, H. 2001. Probabilistic analyses of landslide hazards and risks: bridging theory and practice. Ph.D. thesis, University of Alberta, Edmonton, Alta.
El-Ramly, H., Morgenstern, N.R., and Cruden, D. 2002. Probabilistic slope stability analysis for practice. Canadian Geotechnical Journal, 39: 665–683.
El-Ramly, H., Morgenstern, N.R., and Cruden, D. 2003. Probabilistic stability analysis of a tailings dyke on presheared clay–shale. Canadian Geotechnical Journal, 40: 192–208.
ERM-Hong Kong Limited. 1997. Slope failures along BRIL roads: quantitative risk assessment & ranking. Report No. C1644. Prepared for the Geotechnical Engineering Office, Civil Engineering Department of the Government of Hong Kong.
FMSW. 2000. Report on the Shek Kip Mei landslide of 25 August 1999. Fugro Maunsell Scott Wilson Joint Venture for the Geotechnical Engineering Office, Government of Hong Kong.
GEO. 1999. Review of 1997 and 1998 landslides. Landslide Study Report No. LSR 15/99. Geotechnical Engineering Office, Civil Engineering Department, The Government of Hong Kong.
Li, K.S., and Lumb, P. 1987. Probabilistic design of slopes. Canadian Geotechnical Journal, 24: 520–535.
Mostyn, G.R., and Li, K.S. 1993. Probabilistic slope stability analysis — State-of-play. In Probabilistic methods in geotechnical engineering. Edited by K.S. Li and S.-C.R. Lo. A.A. Balkema, Rotterdam. pp. 89–109.
Spencer, E. 1967. A method of analysis of the stability of embankments assuming inter-slice forces. Géotechnique, 17: 11–26.