This paper addresses the construction of logical functions using artificial neurons with various activation functions. In particular, we propose a method for constructing a parabola-based nonlinearity that expands the logical capabilities of neurons and of neural networks built from them. Like the sigmoid, the alternative nonlinearity is suitable for tuning a neural network by error backpropagation. We give an algorithm for tuning the XOR function in this way using two neurons. The paper also shows that the XOR function can be built with a single neuron by applying a rotated parabola, thereby solving the XOR problem. We conduct experiments on tuning neural networks with the standard sigmoid, s-parabola, and rotated-parabola activation functions; the results demonstrate the promise of the proposed approach. The new nonlinearity reduces the overall computational cost of training a neural network and speeds up computation in tasks that require a large number of neurons. © 2023 IEEE.
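
The following minimal sketch illustrates why a parabola-shaped activation lets a single neuron compute XOR. The paper's exact rotated-parabola formula is not given in this abstract, so a plain quadratic activation f(z) = z^2 and the weights w1 = 1, w2 = -1, b = 0 are assumptions chosen purely for illustration:

# Sketch only: the paper's rotated-parabola activation is not specified in the
# abstract, so a plain quadratic f(z) = z**2 stands in for it here. Because a
# parabola is non-monotonic, one neuron can map both (0,0) and (1,1) to 0 while
# mapping (0,1) and (1,0) to 1, which a monotonic sigmoid neuron cannot do.

def xor_neuron(x1, x2, w1=1.0, w2=-1.0, b=0.0):
    """Single neuron: a parabola-based activation applied to a weighted sum."""
    z = w1 * x1 + w2 * x2 + b   # pre-activation
    return z ** 2               # assumed parabolic nonlinearity

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", xor_neuron(x1, x2))
# 0 0 -> 0.0
# 0 1 -> 1.0
# 1 0 -> 1.0
# 1 1 -> 0.0

Since f(z) = z**2 is smooth, its derivative f'(z) = 2z is available everywhere, which is consistent with the abstract's claim that the proposed nonlinearity can be tuned by error backpropagation.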