Feedforward neural networks (FFNNs) are among the most widely used machine learning models in the literature. A key component of this model is the activation function in each layer, and it is recommended to choose, among the available candidates, an activation function that minimizes the error and maximizes generalization performance. In this paper, the conformable definition is adopted as the fractional derivative in neural networks. The conformable derivative has several beneficial properties compared to other fractional derivative definitions. The sigmoid activation, the most widely used function in neural networks, was combined with the conformable derivative method, and an alternative back-propagation scheme with high generalization capacity was proposed.
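As a rough sketch of the idea, the conformable derivative of order α ∈ (0, 1] of a differentiable function f at t > 0 reduces to T_α(f)(t) = t^(1−α) f′(t), which for the sigmoid gives a simple fractional-order gradient term. The snippet below illustrates this form; the specific choice of α and the exact way the term enters back-propagation are assumptions for illustration, not the paper's exact formulation.

```python
import math

def sigmoid(t):
    """Standard logistic sigmoid."""
    return 1.0 / (1.0 + math.exp(-t))

def sigmoid_prime(t):
    """Classical derivative: sigma'(t) = sigma(t) * (1 - sigma(t))."""
    s = sigmoid(t)
    return s * (1.0 - s)

def conformable_sigmoid_prime(t, alpha):
    """Conformable derivative of order alpha in (0, 1] for t > 0:
    T_alpha(sigma)(t) = t**(1 - alpha) * sigma'(t).
    At alpha = 1 this reduces to the classical derivative."""
    return t ** (1.0 - alpha) * sigmoid_prime(t)
```

In a back-propagation step, `conformable_sigmoid_prime` would replace the classical `sigmoid_prime` when computing the layer's local gradient; note that at α = 1 the two coincide, so the classical rule is recovered as a special case.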