
LTE RF conditions classification

It is common sense that the performance of any wireless system is directly related to the RF conditions at the time. To aid performance analysis, we typically define ranges of RF measurements that correspond to the typical RF conditions one might find oneself in.
When it comes to LTE, I came across the above table, which presents a good classification. The source of this table is an E-UTRAN vendor, and it was compiled during the RF tuning process for a major US operator. Of course there are no rules as to how the various RF conditions are classified, so different tables will exist, but to a great extent you can expect them to align.
In this particular example, three measurement quantities are used: RSRP (Reference Signal Received Power), RSRQ (Reference Signal Received Quality) and SINR (Signal to Interference & Noise Ratio).
RSRP is a measure of signal strength. It is the most important of the three, as it is used by the UE for the cell selection and reselection process and is reported to the network to aid the handover procedure. For those used to working in UMTS WCDMA, it is equivalent to CPICH RSCP.
The 3GPP spec description is "The RSRP (Reference Signal Received Power) is determined for a
considered cell as the linear average over the power contributions (Watts) of the resource
elements that carry cell specific Reference Signals within the considered measurement frequency
bandwidth."
In simple terms, the Reference Signal (RS) is mapped to Resource Elements (REs), and this mapping follows a specific pattern (see below). So at any point in time the UE measures all the REs that carry the RS and averages those measurements to obtain an RSRP reading.
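As a rough numerical illustration of that definition, here is a minimal sketch in Python (the function name and the example RE power values are hypothetical, chosen only to show the arithmetic): the per-RE powers in watts are averaged linearly and the result is converted to dBm, which is how RSRP is normally reported.

```python
import math

def rsrp_dbm(rs_re_powers_watts):
    """RSRP per the definition above: linear average of the power (W) of the
    resource elements carrying the cell-specific RS, converted to dBm."""
    avg_watts = sum(rs_re_powers_watts) / len(rs_re_powers_watts)
    return 10 * math.log10(avg_watts * 1e3)  # watts -> milliwatts -> dBm

# Hypothetical RS resource-element measurements of roughly -100 dBm each (1e-13 W)
print(rsrp_dbm([1.0e-13, 1.1e-13, 0.9e-13]))  # approximately -100 dBm
```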

RSRQ is a measure of signal quality. It is measured by the UE and reported back to the network to aid the handover procedure. For those used to working in UMTS WCDMA, it is equivalent to CPICH Ec/N0. Unlike in UMTS WCDMA, though, it is not used for the cell selection and reselection process (at least in the Rel-8 version of the specs).
The 3GPP spec description is "RSRQ (Reference Signal Received Quality) is defined as the ratio N × RSRP / (E-UTRA carrier RSSI), where N is the number of Resource Blocks of the E-UTRA carrier RSSI measurement bandwidth."
The new term that appears here is RSSI (Received Signal Strength Indicator). RSSI is effectively a measurement of all the power contained in the applicable spectrum (1.4, 3, 5, 10, 15 or 20 MHz). This could be signals, control channels, data channels, adjacent-cell power, background noise, everything. As RSSI applies to the whole spectrum, we need to multiply the RSRP measurement by N (the number of resource blocks), which effectively scales the RSRP measurement up to the whole spectrum and allows us to compare the two.
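To make the role of N concrete, here is a minimal sketch of the RSRQ formula. Working directly in dBm, the ratio N × RSRP / RSSI becomes 10*log10(N) + RSRP - RSSI. The example values (a 10 MHz carrier, which has N = 50 resource blocks, plus assumed RSRP and RSSI readings) are for illustration only.

```python
import math

def rsrq_db(rsrp_dbm, rssi_dbm, n_rb):
    """RSRQ in dB from RSRP (dBm), E-UTRA carrier RSSI (dBm) and N resource blocks."""
    return 10 * math.log10(n_rb) + rsrp_dbm - rssi_dbm

# 10 MHz carrier -> N = 50 RBs; assumed RSRP = -95 dBm, RSSI = -65 dBm
print(rsrq_db(-95, -65, 50))  # approximately -13 dB
```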
Finally, SINR is also a measure of signal quality. Unlike RSRQ, it is not defined in the 3GPP specs but by the UE vendor, and it is not reported to the network. SINR is used a lot by operators, and the LTE industry in general, as it better quantifies the relationship between RF conditions and throughput. UEs typically use SINR to calculate the CQI (Channel Quality Indicator) they report to the network.
The components of the SINR calculation can be defined as follows (a short numerical sketch is given after the list):
S: the power of the measured usable signal, mainly the reference signals (RS) and the physical downlink shared channels (PDSCHs)
I: the measured interference power from signals and channels of other cells in the current system
N: the background noise power, which is related to the measurement bandwidth and the receiver noise figure
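Putting the three components together, the sketch below follows the description above: SINR is the usable signal power S divided by the sum of interference I and noise N, expressed in dB. Since SINR is vendor-defined, this is only an illustration of the general form, and the input powers are hypothetical.

```python
import math

def sinr_db(s_watts, i_watts, n_watts):
    """SINR in dB: usable signal power over interference plus noise (linear units)."""
    return 10 * math.log10(s_watts / (i_watts + n_watts))

# Hypothetical powers: S = 1e-12 W, I = 4e-14 W, N = 1e-14 W
print(sinr_db(1e-12, 4e-14, 1e-14))  # 10*log10(20), approximately 13 dB
```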
So that is it! I have also included a real-life measurement from a Sierra Wireless card that includes the above-mentioned metrics, so you can see what the typical output from a UE looks like. Using that and the table above, you should be able to deduce the RF condition category the UE is in at the time of measurement.
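For completeness, here is a minimal sketch of how such a table can be applied to a reading. The helper maps a measurement to a category given boundaries in descending order; the labels and boundary values shown are placeholders for illustration only, not the values from the table referenced above.

```python
def classify(value, boundaries, labels):
    """Return the first label whose lower boundary the value meets or exceeds.
    Boundaries are in descending order and len(labels) == len(boundaries) + 1."""
    for bound, label in zip(boundaries, labels):
        if value >= bound:
            return label
    return labels[-1]

# Placeholder RSRP boundaries (dBm) and category names, for illustration only
labels = ["Excellent", "Good", "Mid cell", "Cell edge"]
rsrp_bounds_dbm = [-80.0, -90.0, -100.0]
print(classify(-97.0, rsrp_bounds_dbm, labels))  # "Mid cell" with these placeholder values
```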
