Double-Column Stacker Crane -- Instruction Manual


the maximum deflection is

Where
Ft1 - the tangential force on the worm (N),
Fr1 - the radial force on the worm (N),
E - the modulus of elasticity (MPa),
I - the moment of inertia of the dangerous cross-section of the worm (mm^4),
L - the distance between the worm bearings (mm), L = 0.9·m·u·z1.
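The deflection expression itself is not reproduced above. For reference, a commonly used check for a worm treated as a simply supported shaft loaded at mid-span (an assumption; it may differ from the paper's own formula) is $y = \sqrt{F_{t1}^2 + F_{r1}^2}\,L^3 / (48\,E\,I)$, which is then required not to exceed the allowable deflection.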

IV. FUZZY OPTIMIZATION MATHEMATICAL MODEL OF TRACTION MECHANISM

The key to this method is how to determine the optimal level value. Several factors, such as the factor class, the factor fuzziness, and the different influence of each factor on the optimal level value, were considered, and the method of second-class comprehensive evaluation based on the optimal level cut set was used. Thus the optimal level value λ* of every fuzzy constraint can be obtained, namely λ* = 0.71, and the fuzzy optimization problem is converted into a usual optimization problem.
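For illustration only (the membership functions and transition intervals are not listed in the paper, so the numbers below are placeholders), the crisp allowable value corresponding to the level λ* can be sketched in MATLAB as a linear interpolation over an assumed transition interval of the fuzzy allowable contact stress:

lambdaStar = 0.71;                          % optimal level value obtained above
sigLow  = 180;                              % assumed lower bound of the transition interval (MPa)
sigHigh = 220;                              % assumed upper bound of the transition interval (MPa)
sigAllow = sigLow + lambdaStar*(sigHigh - sigLow);   % crisp allowable stress at the lambda* cut
fprintf('crisp allowable contact stress: %.1f MPa\n', sigAllow);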

V. TRAINING THE RELATION COEFFICIENT BY NEURAL NETWORKS

Neural networks are composed of simple elements operating in parallel. These elements are inspired by biological nervous systems. As in nature, the network function is determined largely by the connections between elements. We can train a neural network to perform a particular function by adjusting the values of the connections (weights) between elements. Commonly, neural networks are adjusted, or trained, so that a particular input leads to a specific target output: the network output is compared with the target and the weights are updated until the output matches the target. Some points on the relation curve between the tooth number z2 and the profile factor YF of the worm gear are selected as training sample data, the fast back-propagation algorithm is adopted to train a feed-forward network, and the weights and biases of the network are updated. The network is then simulated with the functions of the Neural Network Toolbox in MATLAB. The program is as follows:

Z2 = 0:10:90;                                   % sample tooth numbers of the worm gear
YF = [2.58 2.5176 2.4566 2.3972 2.3392 2.2825 2.2273 2.1734 2.1208 2.0695];  % profile factors
n1 = 5;                                         % number of hidden-layer neurons
[W1,b1,W2,b2] = initff(Z2,n1,'tansig',YF,'purelin');        % initialize the feed-forward network
fpd = 100; mne = 20000; sse = 0.001; lr = 0.01; % display frequency, max epochs, error goal, learning rate
tp = [fpd mne sse lr];
[W1,b1,W2,b2,te,tr] = trainbpx(W1,b1,'tansig',W2,b2,'purelin',Z2,YF,tp);   % fast back-propagation training
y = simuff(Z2,W1,b1,'tansig',W2,b2,'purelin')   % simulate the trained network
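initff, trainbpx and simuff come from an early release of the Neural Network Toolbox and are no longer shipped with MATLAB; assuming a current release is used, a roughly equivalent fit of YF over Z2 can be sketched with feedforwardnet (a sketch, not part of the paper):

net = feedforwardnet(5);          % one hidden layer with 5 neurons (tansig hidden, linear output)
net.trainParam.epochs = 20000;    % maximum number of training epochs
net.trainParam.goal = 0.001;      % error goal
net = train(net,Z2,YF);           % train on the sample data defined above
y = net(Z2);                      % simulate the trained network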

VI. SOLVING THE USUAL OPTIMIZATION MATHEMATICAL MODEL BY THE GENETIC ALGORITHM TOOLBOX

One key to successfully solving many types of optimization problems is choosing the method that best suits the problem. The Genetic Algorithm and Direct Search Toolbox is a collection of functions that extend the capabilities of the Optimization Toolbox and the MATLAB numeric computing environment. The Genetic Algorithm Toolbox includes routines for solving optimization problems using genetic algorithms and direct search. These algorithms enable you to solve a variety of optimization problems that lie outside the scope of the standard Optimization Toolbox. First, the fitness function with penalty terms is built by an addition-type penalty strategy and programmed in the MATLAB language, the above neural network program fitting the profile factor of the worm gear teeth is recalled, then the nonlinear constraint function is programmed and the solver functions of the Genetic Algorithm Toolbox are adopted. The program is as follows:

options = gaoptimset('PopulationSize',20);
options = gaoptimset(options,'Generations',100);
options = gaoptimset(options,'CrossoverFraction',0.95,'MigrationFraction',0.01);
options = gaoptimset(options,'SelectionFcn',@selectiontournament, ...
    'CrossoverFcn',@crossoverscattered,'MutationFcn',@mutationgaussian);
nvars = 3;                        % design variables: z1, m, d1
lb = [1;2;10]; ub = [2;8;150];    % lower and upper bounds of the design variables
[x,Fval,exitFlag,Output] = ga(@fitnessfun,nvars,[],[],[],[],lb,ub,@yueshufun,options)
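The bodies of fitnessfun and yueshufun are not listed in the paper. As an illustration only, the nonlinear constraint function passed to ga can be sketched from the lead-angle limit quoted in Appendix B (3° ≤ γ ≤ 8°, tan γ = m·z1/d1); the contact-strength, bending-strength and deflection constraints would be appended to c in the same way:

function [c,ceq] = yueshufun(x)
% Sketch of the nonlinear constraint function for ga (illustrative only).
% The design variables follow the bounds above: x = [z1; m; d1].
z1 = x(1); m = x(2); d1 = x(3);
gamma = atand(m*z1/d1);       % lead angle of the worm, in degrees
c = [3 - gamma;               % 3 <= gamma   ->  3 - gamma <= 0
     gamma - 8];              % gamma <= 8   ->  gamma - 8 <= 0
ceq = [];                     % no equality constraints
end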

After 108 function evaluations and 326 iterations, the final output of the above program is: x1 = 1.0102, x2 = 4.8889, x3 = 78.2222, f(X) = 1090628.

VII. CONCLUSION

This paper explored the methods available in the Genetic Algorithm and Neural Network Toolboxes. Compared with the standard optimization algorithm, whose optimum is f(X) = 1269257.5, the optimum found by the genetic algorithm, f(X) = 1090628, is smaller by about 16.37%. Therefore the genetic algorithm is an effective solver for nonsmooth problems. Additionally, the genetic algorithm can be combined with other solvers, such as fuzzy logic and neural networks, to find a more accurate solution efficiently.

TABLE I

OUTPUT OF STANDARD OPTIMIZATION AND GENETIC ALGORITHM

APPENDIX B

Genetic Algorithm Optimization of Crane Transmission Based on Neural Networks

Abstract: The fuzzy optimization mathematical model is established for the design of the crane transmission. The method of second-class comprehensive evaluation based on the optimal level cut set is used, so the optimal level value of every fuzzy constraint can be obtained, and the fuzzy optimization problem is converted into the usual optimization problem. The fast back-propagation algorithm of neural networks is adopted to train feed-forward networks so as to fit the relation coefficient. Then the fitness function with penalty terms is built by the penalty strategy, the neural network program is recalled, and the solver functions of the Genetic Algorithm Toolbox of the MATLAB software are adopted to solve the optimization mathematical model.

Index terms: crane mechanism; genetic algorithm optimization; neural networks.

FUZZY OPTIMIZATION MATHEMATICAL MODEL OF THE TRACTION MECHANISM

An involute helical worm gear transmission is adopted in the crane transmission, with the following main parameters: rated power Pe = 1.5 kW, output speed 28.4 r/min, output torque T2 = 2295.87 N·m, gear ratio u = 49.3, and working load factor k = 1.05. The worm is machined and heat-treated from 45 steel, and the gear rim of the worm wheel is made of ZQAl9-4.

A. Specifying the objective function

In order to save the nonferrous metal of the worm gear rim, the objective function is specified as minimizing the mass of the gear rim of the worm gear in the traction mechanism. According to Fig. 1, d0, di2 and b are respectively the outside diameter, the inside diameter and the face width of the worm gear rim, so the mass of the gear rim can be written in terms of these dimensions;

so the objective function is

where m - the gear module; d1 - the reference circle diameter of the worm; z1 - the number of worm threads (starts).
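The mass and objective-function expressions themselves are not reproduced here. Based only on the annular-rim geometry implied by the symbols above (an assumption, not the paper's own formula), the rim mass is roughly $m_{rim} = (\pi/4)\,\rho\,(d_0^2 - d_{i2}^2)\,b$, where ρ denotes the density of the rim material; the objective function then follows by expressing d0, di2 and b in terms of the design variables z1, m and d1.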

Fig. 1. Diagram of the worm gear transmission.

B. Selecting the design variables

According to the objective function equation, z1, m and d1 should be selected as the design variables, in brief: X = (x1, x2, x3)^T = (z1, m, d1)^T.

C. Establishing the fuzzy constraints

Considering the fuzzy character of the design parameters and of factors whose values are quite uncertain, such as the load character and the material quality, the fuzzy constraints are established, including both performance and boundary constraints.
1) Limit on the number of worm threads (starts): z1 = 1~2;
2) Limit on the gear module: 2 ≤ m ≤ 8;
3) Limit on the lead angle of the worm, to guarantee the efficiency of the worm gear transmission: 3° ≤ γ ≤ 8°, with tan γ = m·z1/d1 (a numerical check at the reported optimum follows);
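As a quick check, using the optimum reported in Section VI of the English text (z1 = 1.0102, m = 4.8889, d1 = 78.2222): tan γ = 4.8889 × 1.0102 / 78.2222 ≈ 0.0631, so γ ≈ 3.6°, which satisfies 3° ≤ γ ≤ 8°.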

4) Constraint on the contact strength of the worm gear:
where
- the elastic coefficient of the materials,
σH - the contact stress of the worm gear,
[σH] - the fuzzy allowable contact stress of the worm gear;
5) Constraint on the tooth bending (beam) strength of the worm gear:
where
σF - the bending stress of the worm gear teeth,
[σF] - the fuzzy allowable bending stress of the worm gear teeth,
YF - the profile factor of the worm gear teeth;

6) Constraint on the stiffness of the worm: the worm is supported between two bearings; if the worm shaft deflects too much, the teeth will not mesh properly, so the maximum deflection must be restricted.