The stability analysis of earth slopes is a geotechnical problem characterized by various sources of uncertainty. Traditionally, deterministic methods are used for slope stability assessment, with the factor of safety taken as an index of stability. However, these methods cannot handle the uncertainties involved in the analysis. The probabilistic approach, by contrast, is able to take these uncertainties into account when analyzing the safety of slopes. In this study, an attempt has been made to present an effective method for probabilistic slope stability analysis. For slope safety assessment, the reliability index defined by Hasofer and Lind is employed to estimate the reliability index and probability of failure. The performance function is formulated using a new approach to the Morgenstern-Price method for slip surfaces of general shape. To calculate the minimum reliability index and the corresponding critical probabilistic slip surface, a modified particle swarm optimization is introduced. With this method, the reliability of nonlinear and complex performance functions can be evaluated without derivatives. The applicability and efficiency of the proposed algorithm are examined on a number of published cases, and the results indicate that the methodology works successfully.
1. Introduction
Slope stability is one of the important issues in geotechnical engineering and has been studied extensively for a long time. The stability of natural and man-made slopes has traditionally been analyzed using deterministic methods. In a deterministic procedure, variables are represented by single values. The significant variables involved in slope stability analysis include the soil strength, soil density, and pore water pressure. Representing these variables by single values implies that they are predicted with certainty, which is rarely the case. Slope stability problems are characterized by many uncertainties, and deterministic methods are unable to account for them. A slope may fail even though the factor of safety calculated from a deterministic model is greater than unity. This indicates the need for a more objectively structured and quantitative approach to handling the uncertainties involved. The probabilistic approach is a natural choice for this type of analysis, because it allows the direct incorporation of uncertainties into the analytical model. In recent years, several attempts have been made to develop probabilistic slope stability analyses [1-5].
The results of a probabilistic analysis may be expressed as a probability of failure or a reliability index. The most commonly used approach to the probabilistic analysis of slopes is based on computing the reliability index associated with the critical deterministic slip surface [1, 6-7]. However, the critical deterministic surface with the minimum factor of safety may not be the same as the surface with the lowest reliability index or the highest probability of failure [8-9].
The common approach to estimating the reliability index of an earth slope is the Mean-Value First-Order Second-Moment (MFOSM) method [10]. In MFOSM, the performance function is expanded about the mean values of the parameters and only the first-order terms are kept. Furthermore, calculating the reliability index requires the partial derivatives of the performance function. Because the performance function in slope stability analysis is usually implicit, these partial derivatives are often approximated numerically [1]. To overcome the dependence of the reliability index on the form of the performance function, Hasofer and Lind [11] proposed an invariant definition of the reliability index. They defined the reliability index β as the minimum distance from the origin in the standard normal space to the limit state surface. To apply the probabilistic analysis using the Hasofer-Lind reliability index (βHL), a constrained optimization problem must be solved to find the minimum reliability index, or maximum probability of failure, using an appropriate optimization technique.
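As a concrete illustration of the MFOSM calculation described above, the sketch below approximates the partial derivatives of the performance function by central finite differences, as is commonly done when the function is implicit, and evaluates β = g(μ)/σg. The performance function used here is a deliberately simple stand-in, not an actual slope model; all names and numbers are illustrative.

```python
import math

def mfosm_beta(g, means, sds, h=1e-6):
    """Mean-value first-order second-moment reliability index.

    g     : performance function taking a list of variable values
    means : mean value of each random variable
    sds   : standard deviation of each random variable
    The gradient of g is approximated by central finite differences.
    """
    g_mean = g(means)
    var_g = 0.0
    for i, (m, s) in enumerate(zip(means, sds)):
        xp = list(means); xp[i] = m + h
        xm = list(means); xm[i] = m - h
        dgdx = (g(xp) - g(xm)) / (2.0 * h)   # numerical partial derivative
        var_g += (dgdx * s) ** 2             # first-order variance contribution
    return g_mean / math.sqrt(var_g)

# Toy performance function g(X) = FS - 1 with a linear FS expression
# (purely illustrative, not a slope model).
def g(x):
    c, tanphi = x
    return (c + 100.0 * tanphi) / 80.0 - 1.0

beta = mfosm_beta(g, means=[30.0, 0.577], sds=[4.0, 0.06])
```

Because the toy g is linear, the finite-difference gradient is exact and the MFOSM result coincides with the exact first-order index.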
To date, the most commonly used optimization techniques are gradient algorithms, which rely on gradient information. However, gradient information can be costly or even impossible to obtain. Another class of optimization techniques, known as evolutionary algorithms (EA), is not restricted in this manner. As a relatively recent member of this class, particle swarm optimization has demonstrated many advantages and a robust nature over recent decades. It is derived from social psychology and, in particular, from the simulation of the social behaviour of bird flocks. Inspired by swarm intelligence theory, Kennedy created a model that Eberhart then extended into the practical optimization method known as particle swarm optimization (PSO) [12]. The PSO algorithm has some advantages over other optimization algorithms. It is a simple algorithm with only a few parameters to be adjusted during the optimization process, making it easy to implement in any modern programming language. It is also a very powerful algorithm, because its range of application is virtually unlimited. Recently, various researchers have analyzed and experimented with PSO, and many variants have been created to further improve its performance.
In this paper, we propose a modified particle swarm optimization (MPSO) for minimizing the Hasofer-Lind reliability index (βHL) and finding the critical probabilistic slip surface of an earth slope. The proposed algorithm uses a new approach to the Morgenstern-Price method introduced by Zhu et al. [13] for the formulation of the performance function, coupled with the reliability index defined by Hasofer and Lind. The method presented herein is simple but effective for searching for the critical probabilistic slip surface in slope stability analysis. Furthermore, it can also be used to find the critical deterministic slip surface and the minimum factor of safety.
2. Deterministic slope stability analysis
The generally adopted approach to the deterministic analysis of slopes is the limit equilibrium method of slices. Various methods of slices have been proposed over the years, such as those of Bishop [14], Morgenstern-Price [15], Spencer [16] and Janbu [17]. The essence of these methods is to divide the sliding mass into a finite number of vertical slices in order to calculate the factor of safety. Some of these methods are applicable only to circular slip surfaces and satisfy overall moment equilibrium alone, such as the ordinary and simplified Bishop methods, while others are applicable to slip surfaces of any shape and satisfy both moment and force equilibrium, such as the Morgenstern-Price (M-P) and Spencer methods.
In this study, a concise algorithm for the M-P method suggested by Zhu et al. [13] is used to compute the factor of safety. The original formulation introduced by Morgenstern and Price is very complicated and difficult to use, especially in the context of probabilistic analysis. In the solution developed by Zhu et al., the two equilibrium equations used in the M-P method are re-derived to obtain two explicit expressions for the factor of safety (FS) and the scaling factor (λ). The procedure of this method is presented below; it differs in detail from the original M-P method. Fig. 1 presents the details of the inter-slice forces for a typical vertical slice of a natural slope with a slip surface of general shape.
In this method, as in the original M-P method, the inclination of the resultant inter-slice force is assumed to vary across the slide mass according to a prescribed function. Therefore, the relationship between the normal and shear inter-slice forces may be expressed as:
T = λ f(x) E   (1)
where λ is a scaling factor to be evaluated in solving for the safety factor, and f(x) is the assumed inter-slice force function with respect to x. Several functions may be used for f(x), such as a constant function, a trapezoidal function, a sine function, or a half-sine function [18].
Fig. 1 Forces acting on a typical slice
The following equations can be derived from the force equilibrium of the i-th slice in the directions normal and tangential to the slip surface, together with the Mohr-Coulomb failure criterion [13]:
(2)
in which:
(3)
(4)
(5)
(6)
In the above equations, φ′ and c′ denote the effective friction angle and the effective cohesion along the base, respectively. In addition, u represents the average pore water pressure.
With the boundary conditions E0 = 0 and En = 0 (where E0 and En are the inter-slice forces at the boundaries), the force equilibrium equation derived from Eq. 2 yields an explicit expression for the factor of safety of the form below:
(7)
Consider the summation of moments about the midpoint of the base of the i-th slice. After simplification, the moment equilibrium equation yields an explicit expression for the scaling factor as follows:
(8)
To solve for the factor of safety, first specify the form of the inter-slice function f(x) and assume initial values for FS and λ. As suggested by Zhu et al. [13], appropriate initial values of FS and λ are 1 and 0, respectively. FS is then obtained by an iterative procedure: the values of Ei and λ are calculated from Eqs. 2 and 8, and the factor of safety is recalculated with the updated scaling factor. The iteration terminates when the changes in the computed values of FS and λ are within an acceptable tolerance.
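The alternating FS-λ iteration described above can be sketched as follows. Because the explicit expressions of Eqs. 7 and 8 depend on the slice geometry, they are represented here by placeholder callables (fs_from_lambda and lambda_from_fs are hypothetical names standing in for those equations); the toy closures at the end merely demonstrate convergence of the scheme.

```python
def solve_fs_lambda(fs_from_lambda, lambda_from_fs, tol=1e-6, max_iter=100):
    """Alternating iteration for the factor of safety FS and the
    scaling factor lambda: start from FS = 1 and lambda = 0, then
    update each in turn until both changes fall below the tolerance."""
    fs, lam = 1.0, 0.0
    for _ in range(max_iter):
        fs_new = fs_from_lambda(lam)    # stand-in for Eq. 7
        lam_new = lambda_from_fs(fs_new)  # stand-in for Eq. 8
        if abs(fs_new - fs) < tol and abs(lam_new - lam) < tol:
            return fs_new, lam_new
        fs, lam = fs_new, lam_new
    return fs, lam

# Toy stand-ins for Eqs. 7 and 8 (illustrative only): a contraction
# mapping with fixed point FS = 1.28 / 0.98.
fs, lam = solve_fs_lambda(lambda lam: 1.3 + 0.1 * lam,
                          lambda fs: 0.2 * (fs - 1.0))
```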
3. Probabilistic slope stability analysis
In general, the factor of safety is not a consistent measure of risk. Slopes with the same value of FS may be at different risk levels depending on the variability of the soil properties. The factor of safety alone cannot quantify how much safer a slope becomes as it is increased [1, 8], because the various uncertainties are not considered. As a result, there have been attempts in recent years to use probabilistic techniques for analyzing the safety of soil slopes in which the natural variability and uncertainty inherent in the soil parameters are considered. One advantage of working within a probabilistic framework is that the various soil parameter uncertainties can be treated rationally.
Two approaches can be used for the probabilistic analysis of earth slopes. The first is based on calculating the probability of failure corresponding to the critical deterministic surface with the minimum factor of safety. However, the surface with the minimum factor of safety may not be the surface with the maximum probability of failure [9]. The second approach is based on locating the critical probabilistic surface, which is associated with the highest probability of failure or the lowest reliability [3, 8, 19].
The probabilistic analysis problem is formulated in terms of a vector X = [X1, X2, X3, …, Xn] representing a set of random variables. From the uncertain variables, a performance function g(X) is formulated to describe the limit state in the space of X. The performance function divides the vector space of X into two distinct parts: the safe region, where g(X) > 0, and the failure region, where g(X) < 0, while the limit state surface is g(X) = 0. The performance function for slope stability is a function of the factor of safety (FS), usually defined as:
g(X) = FS − 1   (9)
The probability of failure of the slope can be expressed in terms of the performance function by the following integral:
Pf = P[g(X) < 0] = ∫_{g(X)<0} fX(X) dX   (10)
where fX(X) represents the joint probability density function of the vector of random variables and the integral is carried out over the failure domain. Consequently, the probability of safe performance, or the reliability of the slope, is given by:
Reliability = P[g(X) > 0] = 1 − Pf   (11)
The performance function, as defined by Eq. 9, is a function of several random variables. To determine the reliability (or probability of failure), the probability density function of the performance function must be evaluated. This requires multiple integration of the joint probability density function of the random variables over the entire safe (or failure) domain. The joint probability density function of the random variables is generally not well defined, and the performance function is very often implicit. Hence, evaluating the probability density function of the performance function is often not possible. In addition, even if the joint probability density function of the random variables can be specified, the multi-dimensional integral in Eq. 10 cannot usually be solved analytically, and numerical approaches are often required. Therefore, the most effective applications of probability theory to slope stability analysis have expressed the uncertainties in the form of a reliability index (β). The reliability index provides more information, and is a better indication of the stability of a slope, than the factor of safety alone, because it incorporates information on the uncertainty in the values of the performance function. It also provides a good relative measure of safety; slopes with higher β are considered safer than slopes with lower β.
Several definitions of the reliability index exist, depending on the form of the performance function. Hasofer and Lind [11] proposed an invariant definition of the reliability index as the minimum distance from the origin in the standard normal space to the limit state surface. This distance is denoted βHL, and the approach is referred to as the first-order reliability method (FORM). The closest point on the failure surface is called the design point or the most probable point (MPP) of failure. To determine the H-L reliability index (βHL), all the random variables X should be transformed into a standard normal space U by an orthogonal transformation such that:
ui = (xi − μi) / σi   (12)
where μi and σi represent the mean and the standard deviation of xi, respectively. The mean and standard deviation of the standard normally distributed variable ui are zero and unity, respectively. Under the transformation of Eq. 12, the mean value point in the original space (X-space) is mapped to the origin of the standard normal space (U-space). The failure surface g(X) = 0 in X-space is mapped to the corresponding failure surface g(U) = 0 in U-space, as shown in Fig. 2. The reliability index is the shortest distance from the origin to the failure surface.
Fig. 2 Geometrical representation of the definition of the reliability index
The matrix formulation of the Hasofer-Lind reliability index (βHL) is defined in the following form [20, 21]:
βHL = min_{X ∈ F} √[(X − μ)^T C^{-1} (X − μ)]   (13)
or, equivalently:
βHL = min_{X ∈ F} √[[(xi − μi)/σi]^T R^{-1} [(xi − μi)/σi]]   (14)
where X is a vector representing the set of random variables xi, F is the failure domain, μ is the vector of mean values μi, σi is the standard deviation, C is the covariance matrix and R is the correlation matrix. Mathematically, R = [ρij] (i, j = 1, 2, …, n) is a square matrix that contains the correlations among the set of n random variables. As noted by Low and Tang [21], Eq. 14 is preferred to Eq. 13 because the correlation matrix R is easier to set up and conveys the correlation structure more explicitly than the covariance matrix C. Although the correlation coefficient between two random variables lies in the range −1 < ρij < 1, one is not completely free to assign arbitrary values within this range to the correlation matrix: it must be emphasized that the correlation matrix has to be positive definite [22].
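For a two-variable case, the quantity under the minimum in Eq. 14 can be evaluated as below. This is a minimal sketch assuming the correlation matrix [[1, ρ], [ρ, 1]], which is inverted in closed form; βHL would then be the minimum of this quantity over points on the limit state surface, a search not performed here.

```python
import math

def beta_candidate(x, means, sds, rho):
    """Value of sqrt(v^T R^-1 v) in Eq. 14 for a two-variable case,
    where v_i = (x_i - mu_i) / sigma_i and R = [[1, rho], [rho, 1]].
    R must be positive definite, i.e. |rho| < 1."""
    v1 = (x[0] - means[0]) / sds[0]
    v2 = (x[1] - means[1]) / sds[1]
    det = 1.0 - rho ** 2
    # Closed-form inverse of the 2x2 correlation matrix:
    # R^-1 = (1/det) * [[1, -rho], [-rho, 1]]
    quad = (v1 * v1 - 2.0 * rho * v1 * v2 + v2 * v2) / det
    return math.sqrt(quad)

# Example: a trial point two standard deviations below both means,
# with negative correlation between cohesion and tan(friction angle).
b = beta_candidate(x=[10.8, 0.462], means=[18.0, 0.577],
                   sds=[3.6, 0.0577], rho=-0.3)
```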
As mentioned before, the H-L reliability index (βHL) is defined as the minimum distance from the origin of the axes in the standard normal space to the limit state surface. To evaluate βHL, the following constrained optimization problem should be solved:
βHL = min √(U^T U), subject to g(U) = 0   (15)
The solution of the above optimization problem is the design point, or MPP, in the standardized normal variable space. The probability of failure (Pf) can then be estimated from the reliability index using the well-established relation:
Pf = Φ(−βHL)   (16)
where Φ is the standard normal cumulative distribution function. Several algorithms have been recommended for the solution of the optimization problem in Eq. 15 [23]. In the current study, a new variant of particle swarm optimization is proposed for the solution.
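Eq. 16 is straightforward to evaluate with only the standard library, writing the normal CDF in terms of the complementary error function:

```python
import math

def failure_probability(beta):
    """Probability of failure from the reliability index, Pf = Phi(-beta)
    (Eq. 16), where Phi is the standard normal CDF. Expressed via erfc so
    that no external statistics library is needed:
    Phi(-beta) = 0.5 * erfc(beta / sqrt(2))."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

# Minimum reliability index reported for test problem 1 below.
pf = failure_probability(2.203)
```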
4. Modified particle swarm optimization
Particle swarm optimization is a population-based stochastic optimization method. It searches for the optimal solution using a population of moving particles, guided by a fitness function. Each particle represents a candidate solution and has a position (Xik) and a velocity (Vik) in the problem space. Each particle keeps a record of its individual best position (Pik) and of the global best position (Pgk). The new position and velocity of a particle are updated according to the following equations [24]:
Vi^{k+1} = w Vi^k + c1 r1 (Pi^k − Xi^k) + c2 r2 (Pg^k − Xi^k),  i = 1, 2, …, N   (17)
Xi^{k+1} = Xi^k + Vi^{k+1}   (18)
where w is an inertia weight that controls a particle's exploration during the search, c1 and c2 are positive numbers expressing the weights of the acceleration terms that guide each particle toward the individual best and the swarm best positions respectively, r1 and r2 are uniformly distributed random numbers in (0, 1), and N is the number of particles in the swarm. The inertia weight in Eq. 17 is usually calculated using the following equation:
w = wmax − (wmax − wmin) k / G   (19)
where wmax and wmin are the maximum and minimum values of w, G is the maximum number of iterations and k is the current iteration number. As the PSO equations reveal, unlike traditional model-based optimization algorithms such as Newton's method, the PSO algorithm does not require a mathematical model of the problem. The only information required by PSO to search for the optimal solution is the evaluation of the fitness function.
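A minimal, self-contained sketch of the standard PSO loop of Eqs. 17-19 is given below, using the sphere function as a stand-in fitness. The parameter names follow the text; everything else (function names, bounds, seed) is illustrative, not the authors' implementation.

```python
import random

def pso_minimize(f, bounds, n_particles=30, iters=200,
                 w_max=0.95, w_min=0.45, c1=2.0, c2=2.0, seed=0):
    """Minimal particle swarm optimizer implementing Eqs. 17-19:
    velocity update with a linearly decreasing inertia weight, then
    position update. Returns the best position and value found."""
    rng = random.Random(seed)
    dim = len(bounds)
    xs = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(x) for x in xs]
    pbest_val = [f(x) for x in xs]
    g_idx = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = list(pbest[g_idx]), pbest_val[g_idx]
    for k in range(iters):
        w = w_max - (w_max - w_min) * k / iters           # Eq. 19
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))  # Eq. 17
                xs[i][d] += vs[i][d]                             # Eq. 18
            val = f(xs[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = list(xs[i]), val
                if val < gbest_val:
                    gbest, gbest_val = list(xs[i]), val
    return gbest, gbest_val

# Sphere function as a stand-in fitness: minimum 0 at the origin.
best, best_val = pso_minimize(lambda x: sum(xi * xi for xi in x),
                              bounds=[(-5.0, 5.0)] * 2)
```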
This study proposes a modified particle swarm optimization (MPSO), based on particle swarm optimization with passive congregation (PSOPC), to search for the minimum reliability index in probabilistic slope stability analysis. PSOPC adds an extra term, known as the passive congregation term, at the end of the velocity update formula (Eq. 17). To improve the performance of PSO, this paper proposes a new velocity update equation. The basic idea is that individuals need to monitor both their environment and their surroundings; each group member therefore receives a multitude of information from other members, which may decrease the chance of a failed detection attempt or a meaningless search. The velocity update equation in MPSO is thus defined as follows:
Vi^{k+1} = χ [w Vi^k + c1 r1 (Pi^k − Xi^k) + c2 r2 (Pg^k − Xi^k) + c3 r3 (Ri^k − Xi^k)]   (20)
where Rik is the position of a particle selected randomly from the swarm, c3 is the passive congregation coefficient, r3 is a uniform random number in the range (0, 1), and χ is a restriction factor used to control and compress the velocities, defined as follows:
χ = χmax − (χmax − χmin) k / G   (21)
In Eq. 21, χmax and χmin are the maximum and minimum values of χ, and G and k are as defined for Eq. 19. The restriction coefficient is larger during the early iterations and decreases over the iterations. The restriction factor reduces the amplitude of a particle's oscillations as it concentrates on the local and neighbourhood previous best positions; the particle oscillates around the weighted mean of the global and local best. On the one hand, if the previous best position and the neighbourhood best position are near each other, the particle performs a local search. On the other hand, if the previous best position and the neighbourhood best position are far apart, the particle performs a more exploratory search. During the search, the neighbourhood and previous best positions change, and the particle moves from local search back to global search. The restriction factor method thus balances the need for local and global search depending on the prevailing social conditions.
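One plausible reading of the MPSO velocity update (Eqs. 20-21), assuming the restriction factor χ multiplies the whole bracket in the manner of a constriction coefficient, is sketched below for a single particle. All names are illustrative; r_particle stands for the randomly selected swarm member Rik.

```python
import random

def mpso_velocity(v, x, pbest, gbest, r_particle, k, G,
                  w_max=0.95, w_min=0.45, c1=2.0, c2=2.0, c3=0.4,
                  chi_max=0.9, chi_min=0.7, rng=random):
    """Modified velocity update of Eq. 20 for one particle.
    r_particle is the position of a particle picked at random from the
    swarm (passive congregation term), and chi is the restriction factor
    of Eq. 21, decreasing linearly from chi_max to chi_min over the
    G iterations."""
    w = w_max - (w_max - w_min) * k / G          # Eq. 19
    chi = chi_max - (chi_max - chi_min) * k / G  # Eq. 21
    return [chi * (w * vd
                   + c1 * rng.random() * (pb - xd)
                   + c2 * rng.random() * (gb - xd)
                   + c3 * rng.random() * (rp - xd))  # Eq. 20
            for vd, xd, pb, gb, rp in zip(v, x, pbest, gbest, r_particle)]

# One update step for a particle at (1, 1), at the first iteration (k = 0).
v_new = mpso_velocity(v=[0.0, 0.0], x=[1.0, 1.0],
                      pbest=[0.5, 0.5], gbest=[0.0, 0.0],
                      r_particle=[0.8, 0.2], k=0, G=3000)
```

Since every attractor here lies below the particle in the first coordinate, the first velocity component is non-positive by construction.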
5. Test problems and results
This section investigates the validity and effectiveness of the proposed algorithm for probabilistic slope stability analysis. The implementation of the proposed MPSO for the reliability analysis of earth slopes is shown as a flow chart in Fig. 3. The flow chart is self-explanatory, as each part of the algorithm has already been discussed in the previous sections. To verify and assess the applicability of the proposed algorithm in searching for the minimum reliability index and the associated probabilistic slip surface, the following benchmark problems were selected from the literature. The procedure was carried out using a computer program developed in MATLAB. The program searches for the most critical deterministic and probabilistic slip surfaces. The factor of safety is defined based on the new approach to the Morgenstern-Price method for slip surfaces of general shape. To calculate the reliability or probability of failure, the Hasofer-Lind reliability index (βHL) is evaluated using the modified particle swarm optimization.
Fig. 3 Reliability analysis of an earth slope using the MPSO algorithm
In the current study, the random variables are assumed to be described statistically by a lognormal distribution defined by a mean μX and a standard deviation σX. The lognormal distribution ranges between zero and infinity, is skewed toward the low range, and is therefore particularly suited to parameters that cannot take negative values. The mean and standard deviation of the underlying normal distribution of ln X are then given by the following equations [25]:
μ_lnX = ln[μX / √(1 + (σX/μX)^2)]   (22)
σ_lnX = √[ln(1 + (σX/μX)^2)]   (23)
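A small helper implementing these moment relations for the lognormal distribution might look like the following; the example values are the effective cohesion statistics from test problem 1 below.

```python
import math

def lognormal_params(mean, sd):
    """Parameters of the underlying normal distribution of ln X for a
    lognormal variable with the given mean and standard deviation
    (Eqs. 22-23)."""
    sigma_ln = math.sqrt(math.log(1.0 + (sd / mean) ** 2))
    mu_ln = math.log(mean) - 0.5 * sigma_ln ** 2
    return mu_ln, sigma_ln

# Effective cohesion from test problem 1: mean 18.0 kN/m2, sd 3.6 kN/m2.
mu_ln, sigma_ln = lognormal_params(18.0, 3.6)
```

As a consistency check, exp(μ_lnX + σ_lnX^2 / 2) recovers the original mean.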
Furthermore, probabilistic analysis requires estimating or assuming the correlation coefficients between the random variables. The parameters that might be correlated are the friction angle, unit weight and cohesion. In the following cases, the cohesion and friction angle are assumed to be negatively correlated with each other (ρcφ = −0.3) and positively correlated with the unit weight (ρcγ = ρφγ = 0.5), in accordance with the values reported in the literature [21, 26].
To calculate the minimum value of βHL with the proposed MPSO, the parameters of the algorithm should be chosen carefully. The parameters that may affect the performance of the algorithm include the acceleration constants (c1 and c2), the passive congregation coefficient (c3), the maximum and minimum values of the inertia weight (wmax and wmin), the maximum and minimum values of the restriction factor (χmax and χmin), and the swarm size (N). In our study, proper fine tuning of these parameters was carried out through several experimental studies examining the effect of each parameter on the final solution and the convergence of the algorithm. As a result, for all algorithms a population of 40 individuals was used; wmax and wmin were chosen as 0.95 and 0.45 respectively; and the acceleration constants (c1 and c2) were both set to 2. The passive congregation coefficient (c3) was set to 0.4. The maximum and minimum values of the restriction factor (χmax and χmin) were selected as 0.9 and 0.7 respectively. Finally, a fixed maximum number of iterations (G) of 3000 was applied. The optimization procedure was terminated when one of the following stopping criteria was met: (i) the maximum number of generations is reached; (ii) after a given number of iterations, there is no significant improvement of the solution.
5.1 Test problem 1: application to a homogeneous slope
Figure 4 shows the geometry of a slope in homogeneous soil. The parameters considered as random variables in the probabilistic analysis are the effective friction angle, the effective cohesion, the unit weight and the pore water pressure ratio. Table 1 presents the mean value and standard deviation associated with each random variable.
Fig. 4 Cross section of the homogeneous slope - test problem 1
Table 1 Statistical properties of soil parameters - test problem 1
Random variable    Mean          Standard deviation    Distribution
c′                 18.0 kN/m2    3.6 kN/m2             Log-normal
tan φ′             tan 30°       0.0577                Log-normal
γ                  18.0 kN/m3    0.9 kN/m3             Log-normal
Ru                 0.2           0.02                  Log-normal
The problem was previously solved by Li and Lumb [8], Hassan and Wolff [9] and Bhattacharya et al. [3]. Li and Lumb [8] determined the reliability index using the Hasofer-Lind approach; they used the Morgenstern-Price method for the formulation of the safety factor and the performance function. Hassan and Wolff [9] calculated the reliability index corresponding to the searched critical probabilistic surface using the MFOSM method, assuming a lognormal distribution for the factor of safety; the method used in their study to evaluate the safety factor was Spencer's method [16] for circular and non-circular slip surfaces. Finally, Bhattacharya et al. [3] solved the problem with the same methodology as Hassan and Wolff [9]; they used Spencer's method in conjunction with a direct search method to determine the minimum reliability index. Note that different limit equilibrium methods have different accuracies and will usually give different factors of safety even for the same slope. Therefore, in this study, for the sake of a fair comparison, the function f(x) in the M-P method is taken as unity (equivalent to Spencer's method) to calculate the factor of safety and the performance function. The results of the proposed method and of the previous studies are summarized in Table 2. Furthermore, to verify the accuracy and efficiency of the proposed MPSO, the result of the standard PSO is also presented.
In Table 2, FSmin and βFS are the minimum factor of safety and the reliability index associated with the critical deterministic slip surface; βmin and FSβ are the minimum reliability index and the factor of safety corresponding to the critical probabilistic slip surface.
From the results in this table, it can be observed that the minimum reliability index calculated using the presented method is 2.203, which is lower than the values reported by Li and Lumb (2.5), Hassan and Wolff (2.293) and Bhattacharya et al. (2.239), and also lower than that of the standard PSO (2.212). Further, the minimum factor of safety calculated from a deterministic analysis based on the mean values of the soil properties obtained by MPSO is 1.302, which is lower than the 1.326 reported by Bhattacharya et al. and slightly lower than the 1.309 achieved by PSO. The values of βFS and FSβ are likewise comparable with the reported values.
The corresponding critical deterministic and critical probabilistic slip surfaces are also presented in Fig. 4. As can be seen, the two surfaces are located reasonably close to each other, as expected for a homogeneous slope. This is consistent with the proximity of the values of βFS and βmin presented in Table 2. The failure surfaces reported by previous researchers are similarly located.
Table 2 Comparison of results - test problem 1
Method                     βFS      βmin     FSmin    FSβ
Li and Lumb [8]            –        2.5      –        –
Hassan and Wolff [9]       2.336    2.293    –        –
Bhattacharya et al. [3]    2.306    2.239    1.326    1.337
Present study (PSO)        2.295    2.212    1.309    1.319
Present study (MPSO)       2.288    2.203    1.302    1.312
The reliability (or probability of failure) of slopes is sensitive to the variability of the soil properties and the pore pressure ratio. To investigate the effect of these parameters on the reliability of the earth slope in this problem, each variable was allowed to vary independently about its respective mean value while the other properties were kept fixed at their mean values. In addition, the coefficient of variation (COV) of each of these parameters was allowed to change. Fig. 5 shows the relationships between the reliability index and the effective friction angle, effective cohesion, unit weight, and pore pressure ratio, respectively, obtained with the presented probabilistic analysis procedure. Figures 5(a) and (b) show how the reliability index is affected by the uncertainty in soil strength. These figures illustrate that an increase in the mean of c′ and φ′ leads to an increase of the reliability index (or a decrease of Pf). It can also be seen that the reliability index can decrease by orders of magnitude as the COV of c′ and φ′ increases; the reliability is thus significantly sensitive to the uncertainty in soil strength. Similarly, Figure 5(c) demonstrates that as the mean unit weight (γ) increases, the reliability index decreases (or Pf increases), as expected. This figure also suggests that the impact of the COV of γ on the reliability index is not very significant, and the reliability is almost independent of the considered COV of γ. Finally, the effect of the uncertainty in the pore pressure ratio on the reliability index is presented in Fig. 5(d). The figure indicates an inverse relationship between the pore pressure ratio and the reliability index.
Fig. 5(a) Effect of varying the statistics of c′ on β; Fig. 5(b) effect of varying the statistics of φ′ on β
Fig. 5(c) Effect of varying the statistics of γ on β; Fig. 5(d) effect of varying the statistics of Ru on β
Fig. 5 Variation of the reliability index with the random variables
5.2 Test problem 2: application to a non-homogeneous slope
Figure 6 shows the cross section and geometry of a two-layered slope in clay, bounded below by a hard layer parallel to the ground surface. The soil strength parameters related to the stability of the slope, namely the friction angle φ and the cohesion c, are considered as random variables. The statistical moments (mean value and standard deviation) of the parameters are summarized in Table 3.
Fig. 6 Cross section of the non-homogeneous slope - test problem 2
Table 3 Statistical properties of soil parameters - test problem 2
Material    Parameter    Mean           Standard deviation    Distribution
Soil 1      c1           38.31 kN/m2    7.662 kN/m2           Log-normal
            φ1           0              –                     Log-normal
Soil 2      c2           23.94 kN/m2    4.788 kN/m2           Log-normal
            φ2           12             1.2                   Log-normal
This example was also solved previously by Hassan and Wolff [9] and Bhattacharya et al. [3] in terms of FSmin, βFS, βmin and FSβ; the methodology used in their research is described under test problem 1. The results obtained from the current study, together with a comparison with those reported by previous researchers, are summarized in Table 4. From the results shown in this table, it can be seen that the minimum reliability index evaluated using MPSO is 2.768, which is lower than those reported by Hassan and Wolff [9] and Bhattacharya et al. [3], and lower than that of PSO. In addition, the minimum factor of safety obtained by MPSO is found to be smaller than the others. The corresponding critical deterministic and critical probabilistic slip surfaces are presented in Fig. 6. Consistent with the difference between the values of βFS and βmin presented in Table 4, the two surfaces are located significantly apart.
Table 4 Comparison of results - test problem 2
Method                     βFS      βmin     FSmin    FSβ
Hassan and Wolff [9]       4.442    2.869    1.663    –
Bhattacharya et al. [3]    5.064    2.861    1.665    1.797
Present study (PSO)        4.545    2.771    1.655    1.784
Present study (MPSO)       4.538    2.768    1.651    1.782
5.3 Test problem 3: application to a case study of the Cannon Dam
The probabilistic analysis of the end-of-construction stage of the Cannon Dam reported in Hassan and Wolff [9] is investigated in this section. A typical cross section of the dam showing the soil profile is presented in Fig. 7. The structure consists of two zones of compacted clay, the Phase I fill and the Phase II fill, over layers of sand and limestone. The strength parameters of the two clay layers were considered as random variables (c1, φ1, c2, φ2). Table 5 shows the mean value and standard deviation for these parameters, based on UU tests of recorded samples from the embankment. Hassan and Wolff [9] made no reductions in the variance for spatial correlation; in their study it was conservatively assumed that the variance over a failure zone was potentially as large as the point variance.
Table 5 Statistical properties of soil parameters: test problem 3

Material      | Parameter | Mean          | Standard deviation | Correlation coefficient (c, φ)
Phase I Fill  | c1        | 117.79 kN/m2  | 58.89 kN/m2        | +0.10
Phase I Fill  | φ1        | 8.5           | 8.5                |
Phase II Fill | c2        | 143.64 kN/m2  | 79 kN/m2           | −0.55
Phase II Fill | φ2        | 15            | 9                  |
Fig. 7 Cross section of the Cannon Dam [9]: test problem 3
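When the Hasofer-Lind index is evaluated with correlated variables such as the (c, φ) pairs of Table 5, the variables are first mapped to uncorrelated standard normal space. The sketch below (an illustrative reconstruction, not the authors' code) shows this standard transformation via a Cholesky factorization, using the Phase I fill values from Table 5; NumPy is assumed.

```python
import numpy as np

# Phase I fill statistics from Table 5 (c1 in kN/m^2, phi1 as listed).
mean = np.array([117.79, 8.5])   # means of c1 and phi1
std = np.array([58.89, 8.5])     # standard deviations of c1 and phi1
rho = 0.10                       # correlation coefficient of (c1, phi1)

C = np.array([[1.0, rho], [rho, 1.0]])  # correlation matrix
L = np.linalg.cholesky(C)               # lower-triangular factor, C = L L^T

def to_physical(u):
    """Map a point u in uncorrelated standard normal space to (c1, phi1)."""
    return mean + std * (L @ u)

def to_standard(x):
    """Inverse map: physical variables back to standard normal space."""
    return np.linalg.solve(L, (x - mean) / std)

# The mean point maps to the origin of standard normal space, where the
# Hasofer-Lind index is the distance from the origin to the limit state.
print(to_standard(mean))  # -> [0. 0.]
```

In this space the reliability index βHL is simply the distance from the origin to the nearest point on the limit-state surface, which is what the optimization searches for.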
The minimum reliability index and the factor of safety corresponding to the critical probabilistic and deterministic slip surfaces obtained by the different methods are shown in Table 6. As can be seen from Table 6, the factor of safety evaluated by the presented method is slightly lower than both the value of 2.647 obtained by Hassan and Wolff [9] and the value of 2.612 obtained by Bhattacharya et al. [3], and is significantly lower than that reported by Hassan and Wolff [9] for a circular slip surface. Further, the reliability index evaluated herein is lower than those reported in the previous studies. The critical probabilistic surface determined by MPSO passes through the Phase I fill, as shown in Fig. 8.
Table 6 Results comparison: test problem 3

Method                    | βFS   | βmin  | FSmin | FSβ
Hassan and Wolff [9]      | 7.028 | 2.664 | 2.647 | –
Bhattacharya et al. [3]   | 3.695 | 2.674 | 2.612 | 2.98
Present study (PSO)       | 3.223 | 2.657 | 2.595 | 1.782
Present study (MPSO)      | 3.218 | 2.655 | 2.595 | 1.782
Fig. 8 Critical failure surface of the Cannon Dam: test problem 3
6. Conclusions
This paper outlines a procedure for the probabilistic analysis of earth slopes. The detailed development of the search procedure for locating the critical probabilistic failure surface, presented herein, is based on the first-order reliability method (FORM) as the reliability model and the Morgenstern-Price method of slices as the slope stability model. The procedure can be applied to the analysis of the stability of a general slip surface. The Hasofer-Lind reliability index (βHL) is used instead of the conventional reliability index β. The problem of searching for the critical probabilistic surface with the minimum reliability index, βmin, is formulated as an optimization problem, and a modified particle swarm optimization (MPSO) is proposed for its solution. Although the modification made in the current study to the original PSO is not major, its advantages are clearly demonstrated through a number of numerical problems. The described framework has been coded in MATLAB and used to carry out parametric studies for the numerical problems. The method makes no assumptions about the geometry of the failure surface and can be applied to any complex slope geometry, layering and pore pressure conditions. The applicability of the proposed methodology has been examined over a variety of slope stability problems from the literature. A comparison of results shows that, in all cases, the present study using MPSO evaluates minimum reliability indices that are moderately lower than those reported in the literature, and that the method is capable of identifying the failure sequence.
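The search for βmin described above is a global optimization problem. As a rough illustration of the underlying technique only (a generic particle swarm optimizer, not the authors' MATLAB MPSO; all parameter values are illustrative), the following minimal Python sketch minimizes a simple box-constrained objective:

```python
import random

def pso(objective, bounds, n_particles=30, n_iter=200,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal generic PSO: minimize `objective` over the box `bounds`."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    pval = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]          # swarm's best position so far
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # move and clamp to the search box
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            f = objective(pos[i])
            if f < pval[i]:
                pbest[i], pval[i] = pos[i][:], f
                if f < gval:
                    gbest, gval = pos[i][:], f
    return gbest, gval

# Usage on a simple quadratic with its minimum at (1, -2):
best, val = pso(lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2,
                [(-5, 5), (-5, 5)])
```

In the slope problem, the decision variables would instead parameterize the slip surface, and the objective would be the Hasofer-Lind reliability index of that surface; this derivative-free structure is what lets the nonlinear, implicit performance function be handled without differentiation.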
From the probabilistic point of view, the results show that the critical failure surface with the minimum factor of safety is not necessarily the surface at which the probability of failure is maximum. Therefore, the deterministic critical surface is not always the actual failure surface, and the factor of safety does not, and cannot, measure safety. Further, as illustrated through the test problems, the critical probabilistic and deterministic slip surfaces are nearly coincident for a slope in homogeneous soil, whereas these surfaces are located quite far apart for non-homogeneous slopes. Furthermore, from the sensitivity analysis performed herein, the cohesion and the angle of friction were found to be the most significant input parameters for the probabilistic analysis of slopes, and the unit weight of the soil was found to be the least significant input parameter. The probability of failure does not change much with changes in the COV of γ; however, it changes significantly with changes in the COV of c and φ. Therefore, determining the exact distribution of the unit weight of the soil for a probabilistic analysis is less critical than for the cohesion and the angle of friction.
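The connection between the reliability index and the probability of failure discussed above follows, under the usual FORM normality assumption, from Pf = Φ(−β), where Φ is the standard normal CDF. A minimal sketch using only the Python standard library:

```python
import math

def failure_probability(beta):
    """Pf = Phi(-beta), with Phi the standard normal CDF (FORM assumption)."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

# A larger reliability index means a smaller probability of failure.
# For the beta_min values reported in Tables 4 and 6 (2.768 and 2.655),
# Pf is on the order of a few tenths of a percent.
for beta in (2.655, 2.768):
    print(beta, failure_probability(beta))
```

This mapping is why searching for the surface with minimum β is equivalent to searching for the surface with maximum probability of failure, which is the critical probabilistic surface rather than the minimum-factor-of-safety surface.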