Fifth-generation (5G) networks are expected to revolutionize wireless communication by enabling faster and more reliable data transfer. However, the use of millimeter-wave frequencies in 5G networks introduces unique challenges in data transmission due to phenomena such as dynamic signal blockage, which can cause sudden and intense spikes in the resource demands of sessions. This paper presents a mathematical model of a 5G base station as a resource allocation system with waiting and non-homogeneous resource requirements of sessions. The study aims to determine the most efficient way to select sessions from the waiting buffer so as to increase system performance, minimize the probability of blocking data sessions, and reduce waiting time in the buffer. The research found that, for certain parameter settings, the strategy that prioritizes sessions with the highest resource requirements and takes them from the waiting buffer until the system reaches its capacity limit delivers the best system performance.
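To make the selection strategy concrete, the sketch below is a minimal illustration (not taken from the paper): the `Session` structure, the field names, and the exact stopping rule when a candidate no longer fits are assumptions made only for illustration of a largest-requirement-first pick from the waiting buffer.

```python
from dataclasses import dataclass


@dataclass
class Session:
    session_id: int
    demand: int  # resource units requested by the session (illustrative unit)


def admit_largest_first(buffer: list[Session], free_capacity: int) -> list[Session]:
    """Admit waiting sessions with the largest resource demand first,
    stopping once the next candidate no longer fits the free capacity
    (one possible reading of "until the system reaches its limit")."""
    admitted = []
    for session in sorted(buffer, key=lambda s: s.demand, reverse=True):
        if session.demand > free_capacity:
            break  # assumed stopping rule: halt at the first session that does not fit
        admitted.append(session)
        free_capacity -= session.demand
    return admitted


# Example: 10 free resource units, four sessions waiting
waiting = [Session(1, 3), Session(2, 7), Session(3, 2), Session(4, 5)]
print([s.session_id for s in admit_largest_first(waiting, 10)])  # prints [2]
```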