
Proceedings of 2008 IEEE International Conference on Mechatronics and Automation

TC1-6

Identification of Ship Steering Dynamics Based on ACA-SVR


Liu Sheng
Department of Automation, Harbin Engineering University, Harbin, Heilongjiang Province, China
Liu.sch@163.com
Abstract - Because of the high-order nonlinearity and parameter uncertainty of ship steering dynamics, it is difficult to establish an accurate mathematical model with conventional identification methods. To solve this problem, a new kind of Support Vector Regression based on the Ant Colony Algorithm (ACA-SVR) is proposed. The method selects the parameters of SVR automatically, without trial and error, and thereby ensures the accuracy of parameter optimization. The method is applied to model identification of ship steering dynamics, and the identification results are compared with experimental reference data. The SVR obtained by this method establishes the system model effectively; its structure is simple and its generalization ability is good.

Index Terms - Ship Maneuvering, Support Vector Regression, Ant Colony Algorithm, Nonlinear System Identification.

Song Jia, Li Bing and Li Gao-yun
Department of Automation, Harbin Engineering University, Harbin, Heilongjiang Province, China
Jia.sch@163.com

I. INTRODUCTION

The Support Vector Machine (SVM), proposed by Vapnik et al. in the 1990s, is a machine learning method with strong prospects for development [1]. In recent years, owing to its superior learning capability, the SVM has received widespread recognition from the international academic community. The SVM is built on the foundation of statistical learning theory; it aims to realize structural risk minimization and improves the generalization ability of learning machines. Compared with the Artificial Neural Network (ANN), the SVM overcomes inherent defects of the ANN such as local minima and overfitting. The regression estimation method based on Support Vector Regression (SVR) can approximate an arbitrary nonlinear function with adjustable precision and has the advantages of global optimization and strong generalization ability, so SVR is widely used [2][3][4]. However, the performance of SVR depends largely on the selection of its learning parameters, and so far there is no general method for SVR parameter selection [5].

In parallel with the development of the SVM, the Italian scholar Marco Dorigo et al. proposed the Ant Colony Algorithm (ACA) by simulating the way ants collectively seek paths in nature [6]. The ACA is a population-based heuristic evolutionary algorithm, first successfully applied to the famous Traveling Salesman Problem (TSP) [7]. It uses a distributed parallel computing mechanism, has strong robustness, integrates easily with other methods, and has attracted great attention in recent years. Compared with tabu search, the ACA does not depend strongly on its initial solution. Compared with the Genetic Algorithm (GA), communication among the individuals of the ACA is carried on continuously, and its positive-feedback mechanism is more advantageous for discovering better solutions [8][9].

This article proposes a new kind of Support Vector Regression based on the Ant Colony Algorithm (ACA-SVR), which selects the SVR parameters automatically: the global search ability of the ACA is used to solve the SVR parameter selection problem. Because of the uncertainty, nonlinearity and high-order complexity of ship steering dynamics, it is difficult to establish a precise mathematical model by conventional identification methods. Applying the ACA-SVR algorithm to ship steering dynamics identification, the simulations show that a precise mathematical model can be obtained.

II. INFLUENCE OF SVR PARAMETERS

Suppose the input sample is $x \in R^n$, and the $l$ pairs of input samples and corresponding outputs are

$(x_1, y_1), (x_2, y_2), \ldots, (x_l, y_l) \in R^n \times R$

For the regression problem, the basic idea is to map the data to a high-dimensional feature space $F$ through a nonlinear mapping and to carry out linear regression in that space, as in (1):

$f(x) = w^T \phi(x) + b, \quad (\phi: R^n \to F,\; w \in F)$   (1)

Here $b$ is the threshold. The linear regression in the high-dimensional feature space $F$ thus corresponds to a nonlinear regression in the low-dimensional input space: $\phi(\cdot)$ is a nonlinear mapping from the input space $R^n$ to the feature space $F$, and $f(\cdot)$ is a linear function in $F$. The aim of SVR is to fit the sample data with $f(\cdot)$ while simultaneously guaranteeing good generalization.

To strengthen the robustness of the regression, Vapnik proposed the $\varepsilon$-insensitive loss criterion, which neglects fitting errors smaller than $\varepsilon$. The parameter $\varepsilon$ determines the number of support vectors [1]: the bigger $\varepsilon$ is, the fewer the support vectors and the lower the estimation precision of the function; the smaller $\varepsilon$ is, the more the support vectors and the higher the estimation precision. But a smaller $\varepsilon$ is not always better: although precision is enhanced, the computing
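The effect of the $\varepsilon$-insensitive criterion described above can be illustrated with a few lines of Python; this is not the paper's code, just a sketch of the loss itself:

```python
# Epsilon-insensitive loss: errors smaller than eps contribute nothing,
# which is what lets eps control the number of support vectors.
def eps_insensitive_loss(y_true, y_pred, eps):
    err = abs(y_true - y_pred)
    return max(0.0, err - eps)

print(eps_insensitive_loss(1.0, 1.05, 0.1))  # inside the tube -> 0.0, ignored
print(eps_insensitive_loss(1.0, 1.30, 0.1))  # outside the tube -> penalized (~0.2)
```

A larger `eps` widens the tube, so more points fall inside it and drop out of the solution, which is exactly the support-vector trade-off discussed above.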

978-1-4244-2632-4/08/$25.00 2008 IEEE


time is lengthened. Therefore, a suitable $\varepsilon$ must be chosen to balance the computing speed and the precision of the SVR. The parameter $\varepsilon$ also reflects the noise level of the data: for bigger noise choose a greater $\varepsilon$, for smaller noise a smaller one. Considering both the fitting error and the function complexity, SVR can be expressed as the optimization problem

$$\min \; \frac{1}{2} w^T w + C \sum_{i=1}^{l} (\xi_i + \xi_i^*)$$   (2)

$$\text{s.t.} \quad y_i - w^T \phi(x_i) - b \le \varepsilon + \xi_i, \qquad w^T \phi(x_i) + b - y_i \le \varepsilon + \xi_i^*, \qquad \xi_i, \xi_i^* \ge 0$$

Here $C > 0$ weighs function complexity against loss error. The penalty parameter $C$ adjusts the proportion between the confidence range of the learning machine and the empirical risk in a given data subspace, with the purpose of reaching the best generalization ability; its optimal value differs from one data subspace to another. In a given data subspace, a smaller $C$ means a smaller penalty on the empirical error, a less complex learning machine and a bigger empirical risk, and vice versa: the former case is underfitting, the latter overfitting. There is at least one appropriate $C$ in each data subspace that gives the SVR the best generalization ability. Problem (2) can be solved through its dual Lagrangian:

$$\min \; \frac{1}{2} \sum_{i,j=1}^{l} (\alpha_i - \alpha_i^*)(\alpha_j - \alpha_j^*) k(x_i, x_j) - \sum_{i=1}^{l} (\alpha_i - \alpha_i^*) y_i + \varepsilon \sum_{i=1}^{l} (\alpha_i + \alpha_i^*)$$   (3)

$$\text{s.t.} \quad \sum_{i=1}^{l} (\alpha_i - \alpha_i^*) = 0, \qquad 0 \le \alpha_i, \alpha_i^* \le C, \quad i = 1, \ldots, l$$

Here $k(x_i, x_j) = \phi(x_i)^T \phi(x_j)$ is the kernel function, which satisfies the Mercer condition. In this paper we use the Radial Basis Function (RBF) kernel:

$$k(x_i, x_j) = \exp\left(-\frac{\|x_i - x_j\|^2}{2\sigma^2}\right)$$

The output of the SVR is

$$f(x) = \sum_{i=1}^{l} (\alpha_i - \alpha_i^*) k(x_i, x) + b$$   (4)

with

$$b = \frac{1}{2}\left[\min_i\left(y_i - \sum_{j=1}^{l} (\alpha_j - \alpha_j^*) k(x_j, x_i)\right) + \max_i\left(y_i - \sum_{j=1}^{l} (\alpha_j - \alpha_j^*) k(x_j, x_i)\right)\right]$$   (5)

SVR thus extends the linear regression method to nonlinear regression problems by introducing the kernel function. Vapnik discovered that the choice among different kernel functions has little effect on the performance of the SVM, but the choice of kernel parameters is a key factor for SVR performance. For the RBF kernel the main parameter is the kernel width $\sigma^2$, which mainly affects the complexity of the sample data distribution in the high-dimensional feature space. When the RBF kernel width is big, the performance is similar to the polynomial kernel; when the width is small, the performance is similar to the linear kernel [10]. With the RBF kernel, the main parameters of the SVR are therefore the kernel width $\sigma^2$, the penalty parameter $C$ and the insensitivity parameter $\varepsilon$.

III. PARAMETER OPTIMIZATION BASED ON ACA

A. Problem Description

The final path obtained by the ant colony algorithm for SVR differs from that in the TSP: it represents the extremum of the objective function. The node values in the ant colony algorithm represent $\sigma^2$, $C$ and $\varepsilon$. Pheromone is deposited on each node the ants pass through (the nodes play the role of the cities in the TSP). The ant colony algorithm used for SVR parameter optimization renews the pheromone concentration according to the objective function value rather than the length of the path; the objective function contains the information of all the nodes an ant has passed through together with the current performance index.

The three parameters of SVR are taken as the optimization variables, and each is expressed by 4 significant decimal digits: $\sigma^2$ and $\varepsilon$ have one digit before the decimal point and three after it, while $C$ has two digits before the decimal point and two after it. To apply the ACA conveniently, a planar structure of 10 rows and 12 columns is used as the ant performance diagram for SVR parameter optimization. The 10 rows represent the 10 values from 0 to 9, and the 12 columns represent the 4 digit positions of each of $\sigma^2$, $C$ and $\varepsilon$: columns 1 to 4 represent the 1st to 4th digits of $\sigma^2$, columns 5 to 8 those of $C$, and columns 9 to 12 those of $\varepsilon$. There are $10 \times 12$ nodes in the ACA-SVR; $Knot(h_i, f_{ij})$ denotes a node, where $h_i$ is the $i$th column and $f_{ij}$ means the value taken in the $i$th column is $j$. For convenience each column is called a set; each set has 10 elements whose values run from 0 to 9.

The ACA used for SVR parameter optimization can therefore be described as $m$ ants seeking food from their nest. Every ant starts from the 1st set, chooses an element stochastically from each set according to the pheromone condition of the elements, and makes the corresponding adjustment to the choice. When an ant has completed the choice in all sets, it reaches the food source. After that, the ant returns to the nest along the original way and meanwhile adjusts the pheromone
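The influence of the kernel width $\sigma^2$ discussed above is easy to see numerically. The following sketch (illustrative only, not the paper's code) evaluates the RBF kernel from the formula above for one pair of points at several widths:

```python
import math

# RBF kernel as in the paper: k(xi, xj) = exp(-||xi - xj||^2 / (2 sigma^2)).
def rbf(xi, xj, sigma2):
    d2 = sum((a - b) ** 2 for a, b in zip(xi, xj))
    return math.exp(-d2 / (2.0 * sigma2))

x1, x2 = [0.0, 0.0], [1.0, 1.0]   # squared distance 2
for sigma2 in (0.1, 1.0, 10.0):
    # small width -> kernel value near 0 (points look unrelated);
    # big width   -> kernel value near 1 (very smooth similarity)
    print(sigma2, rbf(x1, x2, sigma2))
```

This is why $\sigma^2$ governs the complexity of the data distribution in feature space: a tiny width makes every sample nearly orthogonal to the others, while a large width smooths all similarities toward 1.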


condition in the set. This process is repeated; when all the ants converge to the identical way, the optimal solution of the SVR parameters has been found.

B. Establishment of the Objective Function

The goal of the SVR is to approximate the nonlinear model of the system, so the objective function can be written as

$$\min J = \| y - f(x) \|$$   (6)

where $y$ is the actual output of the system and $f(x)$ is the output of the SVR, as in (4).

C. Parameter Optimization

The process of ACA-SVR parameter optimization is:

Step one: Set $m$ ants; each ant $K$ ($K = 1 \sim m$) has a one-dimensional array $Path_K$ of 12 elements, in which the values the $K$th ant takes in the 12 sets are deposited; these values express the crawling way of the $K$th ant.

Step two: Set the time counter $t = 0$ and the cycle index $N = 0$, fix the maximum cycle index $N_{max}$ and the initial pheromone concentration $\tau(h_i, f_{ij}) = C_0$ ($i = 1 \sim 12$, $j = 0 \sim 9$) on every node, with $\Delta\tau(h_i, f_{ij}) = 0$. Put all the ants in the nest (i.e., the initial station 0).

Step three: Set the variable $i = 1$.

Step four: Each ant chooses the next node $Knot(h_{i+1}, f_{(i+1)j})$ from its current node and joins it to its tabu list $tabu_K$. The strategy for choosing the next node is: if $q < q_0$, choose the node corresponding to $\max_{j \notin tabu_K} \{\tau_{ij}(t)\eta_{ij}\}$; otherwise choose the node by roulette selection according to (7) [11]:

$$P_{ij}^{K} = \begin{cases} \dfrac{\tau_{ij}(t)\,\eta_{ij}}{\sum_{r \in allowed_k} \tau_{ir}(t)\,\eta_{ir}}, & j \in allowed_k \\[2mm] 0, & \text{otherwise} \end{cases}$$   (7)

Here $q_0$ is fixed in advance between 0 and 1, and $q$ is a random number ($0 < q < 1$).

Step five: After an ant walks past a node, renew the local information according to

$$\tau_{ij}^{K}(t) \leftarrow (1 - \rho_1)\,\tau_{ij}^{K}(t) + \rho_1\, \Delta\tau_{ij}^{K}(t)$$

where $\Delta\tau_{ij}^{K}(t) = Q_1 / L_j^{K}$. $\rho_1$ ($0 < \rho_1 < 1$) is the volatility parameter of the local information, and $Q_1$ is a constant used to adjust the updating speed of the pheromone. $L_j^{K} = \max \| y - f(x) \|$ is the biggest error between the output of the SVR corresponding to the $K$th ant and the actual expected output.

Step six: Set $i = i + 1$; if $i \le 12$ go to step four, otherwise go to step seven.

Step seven: According to the way passed by the $K$th ($K = 1 \sim m$) ant, namely the array $Path_K$, calculate $\sigma_K^2$, $C_K$ and $\varepsilon_K$ of the SVR, calculate the objective function corresponding to the $K$th ant using (6), record the optimal choice of this round of the cycle (i.e., the best performance index of this round), and store the corresponding SVR parameters in $\sigma^{2*}$, $C^*$ and $\varepsilon^*$.

Step eight: Make $t \leftarrow t + 12$, $N \leftarrow N + 1$, and renew the global information according to

$$\tau_{ij}(t) \leftarrow (1 - \rho_2)\,\tau_{ij}(t) + \rho_2\, \Delta\tau_{ij}(t)$$

where $\rho_2$ ($0 < \rho_2 < 1$) is the volatility parameter of the global information, $Q_2$ is a constant, $\Delta\tau_{ij}(t) = Q_2 / L_j$, and $L_j = \sum_{K=1}^{m} L_j^{K}$ is the sum, over all the nodes the ants passed in this cycle, of the biggest errors between the SVR outputs and the expected values.

Step nine: Reset all the elements of $Path_K$ ($K = 1 \sim m$) to zero. If $N < N_{max}$ and all the ants have converged to the identical way, end the loop and output the optimal choice together with the optimal SVR parameters $\sigma^{2*}$, $C^*$ and $\varepsilon^*$; otherwise put all the ants back at the initial station and go to step three.

IV. REGRESSION OF THE SHIP'S MANEUVERING NONLINEAR MODEL

Considering the ship's three degrees of freedom of motion, swaying, rolling and yawing, the external forces (torques) acting on the ship can be divided into: main (controlling) force (torque), inertial force (torque), damping force (torque), restoring force (torque), environmental disturbance force (torque) and so on. Accordingly, the three-degree-of-freedom nonlinear model of swaying, rolling and yawing is established, where $Y_R$, $K_R$ and $N_R$ represent the rudder-induced swaying force, rolling torque and yawing torque, and $Y_D$, $K_D$ and $N_D$ represent the corresponding disturbances (the roll equation is reconstructed here from a damaged original: $N_H(\dot{\phi})$ denotes the roll damping torque and $Wh(\phi)$ the restoring torque):

$$m(\dot{v} + ur) = -m_y \dot{v} - m_x ur + Y_v v + Y_r r + Y_{v|v|} v|v| + Y_{r|r|} r|r| + Y_{vvr} v^2 r + Y_{vrr} v r^2 + Y_R + Y_D$$

$$(J_{xx} + \Delta J_{xx})\dot{p} = N_H(\dot{\phi}) - Wh(\phi) - z_G\left(Y_v v + Y_r r + Y_{v|v|} v|v| + Y_{r|r|} r|r| + Y_{vvr} v^2 r + Y_{vrr} v r^2\right) + K_R + K_D$$   (8)

$$(J_{zz} + \Delta J_{zz})\dot{r} = N_v v + N_r r + N_\phi \phi + N_{v|v|} v|v| + N_{r|r|} r|r| + N_{v|\phi|} v|\phi| + N_{r|\phi|} r|\phi| + N_{vvr} v^2 r + N_{vrr} v r^2 + N_R + N_D$$

Using SVR to identify the ship maneuvering model, take the heading angle $\psi(k)$ and the rudder angle $\delta(k)$ as inputs of the SVR and, at the same time, consider the influence of the ship rolling angle $\phi(k)$; the input vector is

$$X = [\psi(k), \psi(k-1), \ldots, \psi(k-n+1), \; \delta(k), \delta(k-1), \ldots, \delta(k-m+1), \; \phi(k), \phi(k-1), \ldots, \phi(k-p+1)]$$

where $n$, $m$ and $p$ are the orders of the heading angle, rudder angle and rolling angle. The output of the SVR is the heading angle $\psi_k = \psi(k+1)$. The given training sample is

$$D = \{(X_1, \psi_1), (X_2, \psi_2), \ldots, (X_l, \psi_l)\} \subset R^N \times R$$
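The nine steps above can be condensed into a short runnable sketch. This is a deliberately simplified illustration, not the paper's implementation: the digit-grid encoding and pheromone reinforcement follow Section III, but the objective, which in the paper is the SVR training error $J = \|y - f(x)\|$ for the decoded parameters, is replaced here by a stand-in distance to a hypothetical optimum (the ACA-SVR values from TABLE II), so the sketch runs on its own:

```python
import random

random.seed(0)

# Positions 0-3 encode sigma^2 (1 digit before the decimal point, 3 after),
# positions 4-7 encode C (2 before, 2 after), 8-11 encode epsilon (1 before, 3 after).
def decode(path):
    sigma2 = path[0] + path[1] / 10 + path[2] / 100 + path[3] / 1000
    C = path[4] * 10 + path[5] + path[6] / 10 + path[7] / 100
    eps = path[8] + path[9] / 10 + path[10] / 100 + path[11] / 1000
    return sigma2, C, eps

# Stand-in objective (hypothetical): distance to the TABLE II optimum. In the
# real algorithm this would be the error of an SVR trained with these parameters.
TARGET = (0.032, 21.76, 0.184)
def objective(params):
    return sum(abs(p - t) for p, t in zip(params, TARGET))

def aca_search(n_ants=20, n_iter=50, rho=0.1, Q=1.0):
    tau = [[1.0] * 10 for _ in range(12)]      # pheromone per (position, digit)
    best_path, best_err = None, float("inf")
    for _ in range(n_iter):
        for _ in range(n_ants):
            # each ant picks one digit per set, biased by pheromone (roulette rule)
            path = [random.choices(range(10), weights=tau[i])[0] for i in range(12)]
            err = objective(decode(path))
            if err < best_err:
                best_path, best_err = path, err
            # reinforce the visited digits in proportion to solution quality
            for i, d in enumerate(path):
                tau[i][d] = (1 - rho) * tau[i][d] + rho * Q / (err + 1e-9)
    return decode(best_path), best_err
```

The positive feedback is visible in the update line: paths that decode to low-error parameters deposit more pheromone, so later ants concentrate on the same digits, which is the convergence criterion of step nine.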


where $X_1, \ldots, X_l$ are the input vectors from time $k$ to time $k + l - 1$:

$$X_1 = [\psi(k), \psi(k-1), \ldots, \psi(k-n+1), \; \delta(k), \delta(k-1), \ldots, \delta(k-m+1), \; \phi(k), \phi(k-1), \ldots, \phi(k-p+1)]$$
$$X_2 = [\psi(k+1), \psi(k), \ldots, \psi(k-n+2), \; \delta(k+1), \delta(k), \ldots, \delta(k-m+2), \; \phi(k+1), \phi(k), \ldots, \phi(k-p+2)]$$
$$\vdots$$
$$X_l = [\psi(k+l-1), \psi(k+l-2), \ldots, \psi(k-n+l), \; \delta(k+l-1), \delta(k+l-2), \ldots, \delta(k-m+l), \; \phi(k+l-1), \phi(k+l-2), \ldots, \phi(k-p+l)]$$

In the identification of the ship steering dynamics through ACA-SVR, first collect sample data of the heading angle, the rudder angle and the rolling angle, and form the input vectors $X$ as above. Take the data obtained as training samples of the ACA-SVR, use the ACA optimization to determine the kernel width $\sigma^2$, the penalty parameter $C$ and the insensitivity parameter $\varepsilon$, substitute them into the SVR, and solve for the unknown parameters $\alpha_i$, $\alpha_i^*$ and the threshold value $b$; the ship maneuvering model from the rudder angle to the heading angle is then obtained. For new input data $X(k)$, the output $\psi(k+1)$ is

$$\psi(k+1) = f(X(k)) = w^T \phi(X(k)) + b = \sum_{i=1}^{N} (\alpha_i^* - \alpha_i) K(X(i), X(k)) + b$$   (9)

V. EXPERIMENT AND CONCLUSION

In order to confirm the validity and generality of the proposed algorithm, ACA-SVR and SVR are used separately to regress the ship's maneuvering nonlinear model. The main parameters of the ship used in this article are given in TABLE I:

TABLE I  PARAMETERS OF SHIP

Symbol   Significance         Units   Value
L        length               m       100
B        width                m       15.2
Cb       square coefficient   -       0.66
-        tonnage              m^3     4200
d        sea gauge            m       4.2
AR       rudder area          m^2     7.5
-        aspect ratio         -       1.2
v        sailing speed        kn      18

In order to determine the system model, an experiment is carried out on the object model to collect sample data for training. A pseudo-random rudder angle signal with an amplitude of 35 degrees is input and a 1200-second simulation is carried out with a sampling time of 1 second, giving 1200 input-output data pairs; the first 600 pairs are taken as the training samples and the latter 600 pairs as the test samples. The curve of the input rudder angle training data is shown in Fig.1.

Fig.1 Training rudder angle

The identification output and the reference model output are compared in Fig.2, where curve 3 is the actual heading angle output of the system, curve 2 is the identification output of the ACA-SVR, and curve 1 is the identification output of the SVR. Fig.3 shows the error curve of the system identification. As seen from Fig.2 and Fig.3, the ACA-SVR proposed in this article approximates the ship's maneuvering nonlinear model, a typical nonlinear model, very well, and compared with the SVR whose parameters are estimated from experience, the ACA-SVR has better nonlinear system approximation ability.

Fig.2 Comparison of regression output and reference model output
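Assembling the lagged input vectors $X_1, \ldots, X_l$ defined above from the sampled series is mechanical but easy to get wrong; the following sketch builds them for arbitrary orders $n$, $m$, $p$. The short series here are synthetic stand-ins for the sampled heading, rudder and roll angles, not data from the paper:

```python
# Build samples X_k = [psi(k..k-n+1), delta(k..k-m+1), phi(k..k-p+1)]
# with target psi(k+1), matching the identification setup above.
def build_samples(psi, delta, phi, n, m, p):
    X, y = [], []
    start = max(n, m, p) - 1          # earliest k with all lags available
    for k in range(start, len(psi) - 1):
        x = (psi[k - n + 1:k + 1][::-1]        # psi(k), psi(k-1), ..., psi(k-n+1)
             + delta[k - m + 1:k + 1][::-1]    # delta(k), ..., delta(k-m+1)
             + phi[k - p + 1:k + 1][::-1])     # phi(k), ..., phi(k-p+1)
        X.append(x)
        y.append(psi[k + 1])                   # target: next heading angle
    return X, y

psi   = [0.0, 0.1, 0.3, 0.6, 1.0, 1.5]   # heading angle samples (synthetic)
delta = [5.0, 5.0, 4.0, 3.0, 2.0, 1.0]   # rudder angle samples (synthetic)
phi   = [0.0, 0.2, 0.1, -0.1, 0.0, 0.1]  # roll angle samples (synthetic)
X, y = build_samples(psi, delta, phi, n=3, m=2, p=2)
print(X[0], y[0])
```

Each row of `X` is one training input $X_i$ and the matching entry of `y` is the target $\psi(k+1)$, so `(X, y)` is exactly the sample set $D$ fed to the ACA-SVR.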


Fig.3 Error curve of training data

In order to validate the generalization ability of the regression method, the latter 600 seconds of sampled data are used to carry out the test simulation. Fig.4 shows the curve of the test rudder angle signal, and Fig.5 compares the test data output with the reference model output: curve 3 is the actual heading angle output of the system, curve 2 is the identification output of the ACA-SVR, and curve 1 is the identification output of the SVR. The simulation results show that the ACA-SVR has good generalization ability, achieves the project requirement standard, and surpasses the method of estimating the SVR parameters from experience.

Fig.4 Test rudder angle

Fig.5 Comparison of testing output and reference model output

In order to demonstrate the superiority of the ACA-SVR optimization algorithm, the experience-estimated SVR and the ACA-SVR are compared in TABLE II, where the MSE is defined as $\frac{1}{n}\sum_{i=1}^{n} \left(y_i - f(x_i)\right)^2$. As seen from TABLE II, the error of the ACA-SVR is smaller than that of the experience-estimated SVR.

TABLE II  PARAMETERS OF SVR

Arithmetic   sigma^2   C       epsilon   MSE
SVR          0.105     22.47   0.452     0.732
ACA-SVR      0.032     21.76   0.184     0.354

A new kind of SVR (ACA-SVR) is proposed in this article, which selects the SVR parameters by using the ACA. The ACA-SVR can automatically choose the appropriate parameters according to the input-output data pairs; it avoids adjusting the SVR parameters manually by experience, and provides a solution to the SVR parameter selection problem. The ACA has the merits of fast convergence and few iterations, which effectively raises the speed of the algorithm. Regressing the ship's maneuvering nonlinear model with the ACA-SVR yields an accurate system model, and compared with estimating the SVR parameters from experience, the ACA-SVR has stronger generalization ability.

REFERENCES
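The MSE figure reported in TABLE II follows directly from its definition above; a minimal sketch (the short vectors are placeholders, not the paper's data):

```python
# MSE as defined for TABLE II: (1/n) * sum((y_i - f(x_i))^2).
def mse(y_true, y_pred):
    n = len(y_true)
    return sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / n

y_actual = [1.0, 2.0, 3.0]   # actual system outputs (placeholder values)
y_model  = [1.1, 1.9, 3.2]   # SVR outputs (placeholder values)
print(mse(y_actual, y_model))
```

Applied to the 600 test outputs of each model, this statistic gives the 0.732 vs 0.354 comparison in the table.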

[1] Vladimir N. Vapnik (trans. Zhang Xuegong). The Nature of Statistical Learning Theory [M]. Beijing: Tsinghua University Press, 2000.
[2] Sheng Liu, Yanyan Li. Application of Compound Controller Based on Fuzzy Control and Support Vector Machine on Ship Boiler-Turbine Coordinated Control System [A]. 2007 IEEE International Conference on Mechatronics and Automation [C]. Harbin, 2007.
[3] SONG Fu-hua, LI Ping. Nonlinear Internal Model Control Based on Support Vector Machine th-order Inverse System Method [J]. Acta Automatica Sinica, 2007, 33(7): 778-781.
[4] ZHANG Hao-ran, HAN Zheng-zhi, LI Chang-gang. Support Vector Machine Based Unknown Nonlinear Systems Identification and Control [J]. Journal of Shanghai Jiaotong University, 2003, 37(6): 927-930.


[5] LIU Sheng, LI Yan-yan. Parameter selection algorithm for support vector machines based on adaptive genetic algorithm [J]. Journal of Harbin Engineering University, 2007, (4): 398-402.
[6] Dorigo M, Maniezzo V. Ant system: Optimization by a colony of cooperating agents [J]. IEEE Trans on SMC, Part B, 1996, 26(1): 29-41.
[7] Dorigo M, Gambardella L M. Ant colony system: A cooperative learning approach to the traveling salesman problem [J]. IEEE Trans on Evolutionary Computation, 1997, 1(1): 53-66.
[8] XIONG Wei-qing, WEI Ping. Binary Ant Colony Evolutionary Algorithm [J]. Acta Automatica Sinica, 2007, 33(3): 259-264.
[9] HAO Jin, SHI Li-bao, ZHOU Jia-qi. An Ant Colony Algorithm with Stochastic Disturbance Features [J]. Chinese Journal of Scientific Instrument, 2001, (8): 350-352.
[10] ZHU Yong-sheng, ZHANG You-yun. The Study on Some Problems of Support Vector Classifier [J]. Computer Engineering and Applications, 2003, 39(13): 36-38.
[11] SUN Xue-qin, LIU Li, FU Ping, WANG Xue-hou. Ant Colony Algorithm in Continuous Space [J]. Computer Engineering and Applications, 2005, 41(34): 216-220.
[12] PENG Pei-fu, LIN Ya-ping. Optimized PID Parameter Self-adapted Ant Colony Algorithm with Hybridizing and Aberrance Gene [J]. Computer Engineering and Applications, 2006, 42(6): 88-91.

