
Search resources - activation function
Search results
-
0 downloads:
the activation function
-
-
1 download:
VB version of a neural network: a BP neural network experiment platform with an ATN activation function whose output amplitude is adjustable. A clever algorithm; neural network algorithms were once one of the most-discussed topics on this forum, so this one has real reference value for beginners and experts alike.
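The VB source itself is not shown; a minimal sketch of what an output-amplitude-adjustable tanh-style activation could look like (the name `atn` and the scaling scheme are assumptions, not taken from the package):

```python
import math

def atn(x, amplitude=1.0):
    """Hypothetical amplitude-adjustable tanh-style ("ATN") activation:
    output is bounded in (-amplitude, +amplitude)."""
    return amplitude * math.tanh(x)

print(atn(0.0, 2.0))    # 0.0
print(atn(100.0, 2.0))  # saturates just below +2.0
```

Scaling the saturation level lets the output layer cover a target range directly instead of post-scaling a fixed (-1, 1) activation.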
-
-
0 downloads:
* Lightweight backpropagation neural network.
* This is a lightweight library implementing a neural network for use
* in C and C++ programs. It is intended for use in applications that
* just happen to need a simple neural network and do not …
-
-
0 downloads:
RADIAL Basis Function (RBF) networks were introduced into the neural network literature by Broomhead and Lowe [1], motivated by observations of the local response of biological neurons. Due to their better approximation capabilities, …
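The "local response" the entry refers to is usually realised with a Gaussian basis function; a minimal sketch (center and width values are illustrative):

```python
import math

def gaussian_rbf(x, center, width):
    """Gaussian radial basis function: response peaks at the center
    and decays with distance, i.e. each unit responds only locally."""
    dist_sq = sum((xi - ci) ** 2 for xi, ci in zip(x, center))
    return math.exp(-dist_sq / (2.0 * width ** 2))

print(gaussian_rbf([1.0, 2.0], [1.0, 2.0], 0.5))  # 1.0 at the center
print(gaussian_rbf([5.0, 5.0], [1.0, 2.0], 0.5))  # ~0 far away
```

An RBF network's output is then a weighted sum of such units, one per center.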
-
-
1 download:
A wavelet network (a wavelet function used as the activation function of the nodes in the hidden layer of an MLP-style neural network) that can be used for classification.
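The package does not say which wavelet it uses; the "Mexican hat" wavelet (second derivative of a Gaussian) is one common choice of hidden-unit activation in wavelet networks, sketched here as an assumption:

```python
import math

def mexican_hat(x):
    """'Mexican hat' wavelet: psi(x) = (1 - x^2) * exp(-x^2 / 2).
    Localized in both space and frequency, unlike a sigmoid."""
    return (1.0 - x * x) * math.exp(-x * x / 2.0)

print(mexican_hat(0.0))  # 1.0 at the origin
print(mexican_hat(1.0))  # 0.0 at the zero crossings x = ±1
```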
-
-
0 downloads:
Sigmoid activation function for neural networks.
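For reference, the logistic sigmoid and the derivative form used in backpropagation can be sketched as:

```python
import math

def sigmoid(x):
    """Logistic sigmoid: maps any real x into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(y):
    """Derivative expressed via the output y = sigmoid(x):
    sigma'(x) = y * (1 - y), the form backpropagation uses."""
    return y * (1.0 - y)

print(sigmoid(0.0))                  # 0.5
print(sigmoid_prime(sigmoid(0.0)))   # 0.25
```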
-
-
0 downloads:
----- Algorithm with a momentum term.
Builds a 3-layer (input layer included) BP neural network and trains it.
The input layer does no data processing; the hidden-layer activation function is the sigmoid, and the output layer is linear.
Input and output data are normalized to [-1, 1], with data stored in the matrix as row vectors,
i.e. x = [x11,x12; x21,x22; ... xp1,xp2], y = [y1 y2 ... yp], where p is the number of samples.
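The two pieces the entry names, the momentum term and the [-1, 1] normalization, can be sketched as follows (the learning rate `lr` and momentum coefficient `mu` are illustrative, not the package's values):

```python
def momentum_step(weight, velocity, gradient, lr=0.1, mu=0.9):
    """One BP weight update with momentum: the previous change is
    re-applied scaled by mu, smoothing the descent direction."""
    velocity = mu * velocity - lr * gradient
    return weight + velocity, velocity

def normalize(xs):
    """Rescale a data column linearly to [-1, 1]."""
    lo, hi = min(xs), max(xs)
    return [2.0 * (x - lo) / (hi - lo) - 1.0 for x in xs]

print(normalize([0.0, 5.0, 10.0]))  # [-1.0, 0.0, 1.0]
```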
-
-
0 downloads:
Discusses the shortcomings of BP networks, studies in depth the various activation functions used in such networks, and analyzes and compares them comprehensively.
-
-
1 download:
The L-M (Levenberg-Marquardt) optimization algorithm is used to train a BP neural network to fit sine sample data with additive white noise. A three-layer BP network is designed: the hidden-layer neurons use a hyperbolic-tangent activation function, the output-layer neurons use a linear activation function, and the hidden layer has 6 neurons.
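L-M training itself is involved; just the forward pass of the network this entry describes (6 tanh hidden units, one linear output, scalar input assumed) can be sketched as:

```python
import math
import random

def forward(x, w_hidden, b_hidden, w_out, b_out):
    """Three-layer network: tanh hidden units, one linear output."""
    hidden = [math.tanh(w * x + b) for w, b in zip(w_hidden, b_hidden)]
    return sum(wo * h for wo, h in zip(w_out, hidden)) + b_out

random.seed(0)
w_h = [random.uniform(-1, 1) for _ in range(6)]  # 6 hidden neurons
b_h = [random.uniform(-1, 1) for _ in range(6)]
w_o = [random.uniform(-1, 1) for _ in range(6)]
y = forward(0.5, w_h, b_h, w_o, 0.0)
```

With the output layer linear, the network can produce values outside (-1, 1), which is what lets it fit an arbitrary-amplitude sine.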
-
-
0 downloads:
To improve the convergence speed and separation performance of a speech-separation algorithm, a Laplacian-normal mixture probability density function is proposed as an estimate of the speech signal's probability density, yielding an activation function better suited to speech separation; a fast speech-separation algorithm is then built on this function.
-
-
0 downloads:
MATLAB program for function approximation. The program designs a two-layer BP network for function approximation; the hidden-layer activation function is tansig, and the output-layer activation function is the linear purelin function.
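For readers outside MATLAB, the two transfer functions the entry names have simple closed forms; tansig is mathematically identical to tanh:

```python
import math

def tansig(x):
    """MATLAB's tansig written out: 2 / (1 + exp(-2x)) - 1."""
    return 2.0 / (1.0 + math.exp(-2.0 * x)) - 1.0

def purelin(x):
    """MATLAB's purelin is simply the identity (linear) function."""
    return x

# tansig coincides with tanh to machine precision.
print(abs(tansig(0.7) - math.tanh(0.7)) < 1e-12)  # True
```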
-
-
0 downloads:
A three-layer BP neural network forecasts drug sales. The input layer has four nodes; the hidden layer has 5 nodes with a tansig activation function; the output layer has 1 node with a logsig activation function. The network predicts sales volume using a rolling forecast: the sales of the preceding four months are used to predict the next month's sales.
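The rolling-forecast scheme the entry describes is independent of the network itself; a minimal sketch, with a stand-in `predict` function in place of the trained network (the mean-of-window model below is purely illustrative):

```python
def rolling_forecast(history, predict, window=4):
    """Forecast each point from the preceding `window` points,
    sliding the window forward one step at a time."""
    forecasts = []
    for i in range(window, len(history)):
        forecasts.append(predict(history[i - window:i]))
    return forecasts

# Stand-in "model": forecast the mean of the window.
mean = lambda w: sum(w) / len(w)
print(rolling_forecast([10, 12, 11, 13, 14, 15], mean))
# [11.5, 12.5]
```

In the actual program the trained 4-5-1 network plays the role of `predict`, taking the four-month window as its input vector.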
-
-
0 downloads:
A control algorithm for the Logistic chaotic system based on an adaptive Chebyshev polynomial neural network (ACNN). The algorithm uses Chebyshev orthogonal polynomials as the neural network's activation functions to build a prediction and control model for the Logistic chaotic system. To guarantee stability, a convergence theorem is proposed and proved, and an adaptive learning-rate algorithm is used to improve the network's learning efficiency and convergence speed. The adaptive Chebyshev network directly learns the dynamics of the Logistic chaotic system and applies objective-function control to it. Simulation results show that the algorithm …
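The Chebyshev basis this entry uses as activations is defined by a three-term recurrence, which is also how it is evaluated in practice:

```python
def chebyshev(n, x):
    """Chebyshev polynomial of the first kind, via the recurrence
    T0 = 1, T1 = x, T_{n+1} = 2x * T_n - T_{n-1}."""
    t_prev, t = 1.0, x
    if n == 0:
        return t_prev
    for _ in range(n - 1):
        t_prev, t = t, 2.0 * x * t - t_prev
    return t

print(chebyshev(3, 0.5))  # T3(x) = 4x^3 - 3x, so T3(0.5) = -1.0
```

A Chebyshev polynomial network then feeds the same scalar input through T0..Tn and learns only the linear weights on these basis outputs.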
-
-
0 downloads:
Digital image recognition based on a BP neural network, using an improved BP network with a momentum term and a variable step size.
train.m - training
shibie.m - recognition
actfun.m - activation function
image.rar - image library
-
-
0 downloads:
A wavelet neural network is a method that combines wavelet analysis with a neural network, replacing the network's activation function with a wavelet analysis function.
-
-
0 downloads:
Activation function design
-
-
0 downloads:
Creation and training of a single neuron with the logsig activation function.
-
-
0 downloads:
A small program implementing BP neural network training. It uses a single-input single-output 3-layer BP network; the hidden layer uses the standard sigmoidal activation function, and the output layer uses a linear activation function.
-
-
0 downloads:
MATLAB source code for BP: the network model, principles, algorithm, and key techniques of a multilayer feedforward network for image compression. Simulation experiments show that in BP-network image compression, the choice of parameters such as the algorithm, the activation function, and the compression ratio is crucial; these are closely tied to convergence time and to the compression quality of the reconstructed image.
-
-
0 downloads:
Trains the weighting coefficients using a two-layer BP neural network. The hidden layer's activation function is an S-shaped (sigmoid) transfer function, and the output layer's activation function is a linear transfer function.
-