Parabola-Based Artificial Neural Network Activation Functions

The paper addresses the construction of logical functions using artificial neurons with various activation functions. In particular, we propose a method for constructing a parabola-based nonlinearity that expands the logical capabilities of neurons and of neural networks built from them. Like the sigmoid, the alternative nonlinearity is suitable for training a neural network by error backpropagation. We give an algorithm for tuning the XOR function in this way using two neurons. The paper also shows that the XOR function can be built with a single neuron by applying a rotated parabola (solving the XOR problem). We conduct experimental research on neural network training with the standard sigmoid, s-parabola, and rotated parabola activation functions, and the results show the promise of the proposed approach. The new nonlinearity makes it possible to reduce the overall computational complexity of training a neural network and to speed up calculations in tasks that require a large number of neurons. © 2023 IEEE.
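To illustrate the single-neuron XOR claim, the minimal Python sketch below uses a plain downward parabola f(z) = 1 - (z - 1)^2 as a hypothetical stand-in for the rotated-parabola activation; the exact nonlinearity, weights (1, 1), and zero bias are assumptions for illustration and are not taken from the paper.

    # Sketch: one neuron with a quadratic (parabola-shaped) activation can compute XOR.
    # The activation below is an assumed stand-in, not the authors' rotated parabola.

    def parabola_activation(z: float) -> float:
        # Hypothetical parabola nonlinearity: peaks at z = 1, returns 0 at z = 0 and z = 2.
        return 1.0 - (z - 1.0) ** 2

    def single_neuron_xor(x1: int, x2: int) -> int:
        # Single neuron: assumed weights (1, 1), bias 0, quadratic activation, 0.5 threshold.
        z = 1.0 * x1 + 1.0 * x2
        return 1 if parabola_activation(z) > 0.5 else 0

    if __name__ == "__main__":
        for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
            print(a, b, "->", single_neuron_xor(a, b))  # prints 0, 1, 1, 0

A sigmoid neuron cannot do this, since its decision boundary is a single line; the quadratic activation yields a curved boundary that separates (0,1) and (1,0) from (0,0) and (1,1), which is the essence of the XOR construction described in the abstract.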

Authors
Khachumov M., Emelyanova Y., Khachumov V.
Conference proceedings
Publisher
Institute of Electrical and Electronics Engineers Inc.
Language
English
Pages
249-254
Status
Published
Year
2023
Organizations
  • 1 Program Systems Institute of RAS, Research Center for Multiprocessor Systems, Pereslavl-Zalessky, Russian Federation
  • 2 Federal Research Center 'Computer Science and Control' of RAS, Moscow, Russian Federation
  • 3 RUDN University, Moscow, Russian Federation
Keywords
activation function; artificial neural network; neuron; nonlinearity; parabola; sigmoid; XOR problem