Posted on 2024-07-11, 20:24. Authored by Vyacheslav M. Abramov.
The goal of this paper is to study the asymptotic behavior of the number of lost messages. Long messages are assumed to be divided into a random number of packets, which are transmitted independently of one another. An error in the transmission of a single packet results in the loss of the entire message. Messages arrive at an M/GI/1 queueing system with a finite buffer and can be lost in two ways: either at least one of a message's packets is corrupted, or the buffer overflows. Under system parameters typical of information transmission in real networks, we obtain theorems on the asymptotic behavior of the number of lost messages. We also study how the loss probability changes when redundant packets are added. Our asymptotic analysis is based on Tauberian theorems with remainder.
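To give intuition for the packet-corruption loss mechanism described in the abstract, the sketch below computes the message loss probability under simplifying assumptions that are not from the paper: it ignores buffer overflow, assumes each packet is corrupted independently with probability `p`, and assumes an erasure-coding scheme in which a message with `r` redundant packets survives whenever at most `r` of its `n + r` packets are corrupted.

```python
from math import comb

def message_loss_prob(n: int, p: float, r: int = 0) -> float:
    """Probability a message of n packets is lost to corruption.

    Illustrative model only: packets are corrupted independently with
    probability p, and with r redundant packets the message is
    recoverable iff at most r of the n + r packets are corrupted.
    Buffer-overflow losses are not modeled here.
    """
    total = n + r
    # Probability that at most r packets are corrupted (message survives).
    survive = sum(
        comb(total, k) * p**k * (1 - p) ** (total - k)
        for k in range(r + 1)
    )
    return 1 - survive

# With no redundancy, loss requires any single corrupted packet:
# message_loss_prob(n, p, 0) == 1 - (1 - p)**n.
```

Under these assumptions, even a small amount of redundancy sharply reduces the loss probability when `p` is small, which is the trade-off the paper's asymptotic analysis quantifies.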