Methodology guidelines are given to enable consistent
performance evaluations. The guidelines may serve as a
framework with aligned assumptions, a consistent choice of models,
and common simulation reference metrics, ensuring that the results
can be compared. Results obtained at different levels are not meant
to be compared with each other but to be used as possible input,
e.g. link-level simulations can be used as input to system-level
simulations but should not be compared to them. Below, the main performance
indicators, as well as suitable channel and propagation models,
are explained and defined.
3. Performance indicators
The main performance indicators to be used in the evaluation of the 5G system are
defined and explained hereafter:
User throughput:
The user throughput is defined as the total amount of information bits received at the receiver
divided by the total active session time at the data link layer. The active session time does not include
the waiting time at the application layer, e.g. reading time for web browsing, or back-off time
introduced by TCP/IP’s traffic control, and is therefore, in general, different from the total session time.
A second definition of the user throughput accounts for the whole session time, instead of only the
active session time. Both definitions are equivalent for the full-buffer traffic model, which has
neither reading nor back-off times.
A third definition considers that the user throughput is the average of the throughput experienced
by all the packets received by the user.
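As an illustration, the three definitions can be contrasted on a toy session trace; all numbers below are invented for the example, not taken from any evaluation:

```python
# Hypothetical per-session trace (illustrative values only).
# Times in seconds, payloads in bits.
received_bits = 8_000_000   # information bits delivered at the data link layer
active_time = 2.0           # time the session was actively transferring data
reading_time = 6.0          # application-layer waiting time (e.g. web reading time)

# Definition 1: bits divided by the active session time only.
throughput_active = received_bits / active_time                    # 4e6 bit/s

# Definition 2: bits divided by the whole session time (active + waiting).
throughput_session = received_bits / (active_time + reading_time)  # 1e6 bit/s

# Definition 3: average of the throughputs experienced by the received packets.
packet_bits = [12_000, 12_000, 12_000]
packet_times = [0.004, 0.003, 0.002]  # per-packet transfer times
throughput_packets = sum(b / t for b, t in zip(packet_bits, packet_times)) / len(packet_bits)
```

Note how the same trace yields a fourfold difference between the first two definitions as soon as reading time dominates the session.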
4. Application data rate:
The application data rate is defined as the data bit rate from the application layer of the user, i.e. data bits related to
Transmission Control Protocol (TCP) and protocol overhead are excluded. This definition facilitates the
comparison of technology components that can implement changes at any layer in the protocol stack.
Cell throughput:
The cell throughput is defined as the total amount of received information bits in the cell within a pre-specified time
interval. The cell is defined as a single point of data aggregation for which the cell throughput is measured, e.g. a
traditional Third Generation Partnership Project (3GPP) cell or a Wi-Fi access point.
Spectral efficiency:
The spectral efficiency is defined as the aggregated user throughput divided by the aggregated spectrum used per
measurement unit in the data link layer. Note that the aggregated spectrum includes the spectrum used for, e.g.,
control and broadcast signaling.
The measurement unit is a cell or an area unit, e.g. square kilometers. The cell spectral efficiency is defined as the
spectral efficiency where the aggregation takes place per cell. The normalized user throughput is defined as
the user throughput divided by the channel bandwidth of the user’s serving cell; this indicator is equivalent to a
user spectral efficiency. The cell-edge user spectral efficiency is defined as the 5% point of the Cumulative
Distribution Function (CDF) of the normalized user throughput.
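A small sketch of how the normalized user throughput and the 5% cell-edge point could be computed from a set of per-user throughputs; the figures and the simple empirical-CDF convention used here are illustrative assumptions:

```python
import math

# Hypothetical per-user throughputs (bit/s) in one cell; values are made up.
user_throughputs = [1e6, 2e6, 5e6, 8e6, 0.5e6, 3e6, 4e6, 6e6, 7e6, 1.5e6]
channel_bandwidth = 10e6  # Hz of the serving cell (illustrative)

# Normalized user throughput = user spectral efficiency (bit/s/Hz).
normalized = sorted(t / channel_bandwidth for t in user_throughputs)

# Cell-edge user spectral efficiency: the 5% point of the empirical CDF.
# With this simple convention, the 5% point of 10 samples is the smallest one.
idx = max(0, math.ceil(0.05 * len(normalized)) - 1)
cell_edge = normalized[idx]

# Cell spectral efficiency: aggregated user throughput over aggregated spectrum.
cell_se = sum(user_throughputs) / channel_bandwidth
```

With many users, a proper quantile estimator would replace the single-sample convention above.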
Traffic volume:
The traffic volume is defined at the application layer as the aggregated served traffic to all users, either in total for
the setting or per area unit.
5. Error rate:
The bit error rate is defined as the error rate of the transmitted bits after raw demodulation in
the investigated technology.
The frame error rate is defined as the error rate of transmitted information blocks. For
example, the information block can be a link-level codeword or a system-level transport
block at the data link layer.
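The two error rates can be illustrated with a toy link in which each transmitted bit is flipped with a fixed probability; the flip probability and frame length below are arbitrary choices, not values from any particular technology:

```python
import random

random.seed(0)

# Hypothetical link: each transmitted bit is flipped with probability p,
# mimicking raw demodulation errors (p is an illustrative assumption).
p = 0.01
n_bits = 100_000
tx = [random.getrandbits(1) for _ in range(n_bits)]
rx = [b ^ (random.random() < p) for b in tx]

# Bit error rate: fraction of bits received in error.
bit_errors = sum(t != r for t, r in zip(tx, rx))
ber = bit_errors / n_bits

# Frame error rate: a frame (information block, e.g. a transport block)
# is in error if any of its bits is in error.
frame_len = 100
frames = n_bits // frame_len
frame_errors = sum(
    any(tx[i] != rx[i] for i in range(f * frame_len, (f + 1) * frame_len))
    for f in range(frames)
)
fer = frame_errors / frames
```

The example also shows why the frame error rate always upper-bounds the bit error rate: one flipped bit is enough to spoil a whole block.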
Delay:
The application end-to-end delay is defined as the time elapsed from the application layer at
the source to the application layer at the destination. The Medium Access Control (MAC)
layer delay is defined as the time elapsed from the MAC layer at the source to the MAC
layer at the destination.
Network energy performance:
The network energy performance is defined as the ratio of the energy consumed to the number of
bits served at the data link layer.
Cost:
The cost is the amount of capital consumed to reach a certain solution. To enable an easy
comparison, the cost can be normalized by the system data rate, resulting in the metric of
cost per served bit.
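Both normalizations amount to a simple division; the sketch below uses invented numbers purely to fix the units:

```python
# Illustrative numbers only; nothing here comes from a measured network.
energy_consumed_j = 5_000.0   # network energy consumed over the interval (joules)
served_bits = 2e12            # bits served at the data link layer in that interval
energy_per_bit = energy_consumed_j / served_bits  # network energy performance, J/bit

capital_cost = 1_000_000.0    # capital consumed for the solution (currency units)
system_rate = 1e9             # system data rate (bit/s)
cost_per_served_bit = capital_cost / system_rate  # cost per bit/s of capacity
```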
6. Channel simplifications
The choice of channel and propagation models for simulation evaluations should be
made according to the required level of accuracy, but should take into account the
computational complexity of the model. In fact, the channel propagation modeling
heavily impacts the total computational burden of a simulator.
Stochastic and geometric models, as compared with the ray-tracing option, are simpler to
implement, but often lack the level of realism that 5G assessment requires.
They use two different sets of channel parameters. The first one concerns small-scale
parameters, including Angle-of-Arrival (AoA) and Angle-of-Departure (AoD) or delay
of the rays. The second one is related to the large-scale parameters, such as shadow
fading and path loss.
A reasonable alternative consists of a simplified ray-based approach for the
characterization of the large-scale effects, followed by the use of a pure stochastic and
geometric approach for the characterization of small-scale effects.
This alternative, being much simpler than ray-tracing, still allows for a proper
characterization of real environments.
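A minimal sketch of this hybrid approach, assuming free-space path loss stands in for the simplified ray-based large-scale part and a Rayleigh ray ensemble for the stochastic small-scale part; all constants below are illustrative, not taken from any standard model:

```python
import math
import random

random.seed(1)

def large_scale_loss_db(distance_m, f_ghz=3.5, shadow_sigma_db=6.0):
    """Large-scale effects: free-space path loss for a direct ray plus
    log-normal shadow fading (all parameter values are illustrative)."""
    fspl = 20 * math.log10(distance_m) + 20 * math.log10(f_ghz * 1e9) - 147.55
    shadow = random.gauss(0.0, shadow_sigma_db)
    return fspl + shadow

def small_scale_taps(n_paths=8):
    """Small-scale effects: stochastic geometric rays, each with a random
    delay, Angle-of-Departure, Angle-of-Arrival and complex Rayleigh gain."""
    taps = []
    for _ in range(n_paths):
        delay_ns = random.uniform(0.0, 300.0)
        aod = random.uniform(0.0, 2 * math.pi)
        aoa = random.uniform(0.0, 2 * math.pi)
        gain = complex(random.gauss(0, 1), random.gauss(0, 1)) / math.sqrt(2 * n_paths)
        taps.append((delay_ns, aod, aoa, gain))
    return taps

loss_db = large_scale_loss_db(100.0)
taps = small_scale_taps()
# Narrowband channel coefficient: sum of small-scale rays, scaled by the
# large-scale loss computed along the simplified ray.
h = 10 ** (-loss_db / 20) * sum(g for _, _, _, g in taps)
```

The split mirrors the two parameter sets described above: the ray-based stage produces path loss and shadow fading, while the stochastic stage draws the per-ray delays, angles and gains.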
7. Small-scale modeling
Concerning small-scale parameter characterization, International
Telecommunication Union – Radiocommunication Sector (ITU-R) M.2135
models are the ones most widely accepted by the research community.
Although some propagation scenarios commonly considered in 5G studies,
such as Device-to-Device (D2D) and Vehicle-to-Everything (V2X), are not
covered by M.2135, a mapping could be defined between those propagation
scenarios and the M.2135 channel models.