3GPP Long Term Evolution (LTE) is the latest standard in the mobile network technology tree that produced the GSM/EDGE and UMTS/HSxPA network technologies. It is a project of the 3rd Generation Partnership Project (3GPP), operating under a name trademarked by one of the associations within the partnership, the European Telecommunications Standards Institute.
Long Term Evolution (LTE) is the latest step in moving forward from the cellular 3G services (e.g. GSM to UMTS to HSPA to LTE, or CDMA to LTE). LTE is based on standards developed by the 3rd Generation Partnership Project (3GPP). LTE may also be referred to more formally as Evolved UMTS Terrestrial Radio Access (E-UTRA) and Evolved UMTS Terrestrial Radio Access Network (E-UTRAN). Even though 3GPP created the standards for the GSM/UMTS family, the LTE standards are completely new, with exceptions where reuse made sense. The following are the main objectives for LTE:
- Increased downlink and uplink peak data rates
- Scalable bandwidth
- Improved spectral efficiency
- All-IP network
- A standards-based interface that can support a multitude of user types
LTE networks are intended to bridge the functional data exchange gap between very high data rate fixed wireless Local Area Networks (LANs) and very high mobility cellular networks.
- Efficient spectrum utilization
- Flexible spectrum allocation
- Reduced cost for the operator
- Improved system capacity and coverage
- Higher data rate with reduced latency
Some specific targets set for LTE are listed below [3GPP TR 25.913]:
- Increased peak data rate: 100 Mbps for DL with 20 MHz (2 Rx antennas at the UE), 50 Mbps for UL with 20 MHz
- Improved spectral efficiency: 5 bps/Hz for DL and 2.5 bps/Hz for UL
- Improved cell edge performance (in terms of bit rate)
- Reduced latency
The E-UTRAN uses a simplified single-node architecture consisting of the eNBs (E-UTRAN Node B). The eNB communicates with the Evolved Packet Core (EPC) using the S1 interface, specifically with the MME (Mobility Management Entity) and the UPE (User Plane Entity), identified as the S-GW (Serving Gateway), using S1-C and S1-U for the control plane and user plane respectively. The MME and the UPE are preferably implemented as separate network nodes so as to facilitate independent scaling of the control and user planes. The eNB also communicates with other eNBs using the X2 interface (X2-C and X2-U for the control and user plane respectively). Please refer to LTE Network Infrastructure and Elements for a detailed overview of the individual network elements.
LTE supports an option of Multicast/Broadcast over a Single Frequency Network (MBSFN), where a common signal is transmitted from multiple cells with appropriate time synchronization. The eNB, being the only entity of the E-UTRAN, supports all the functions of a typical radio network, such as radio bearer control, mobility management, admission control and scheduling. The Access Stratum resides completely at the eNB.
The LTE physical layer is based on the Orthogonal Frequency Division Multiplexing (OFDM) scheme to meet the targets of high data rate and improved spectral efficiency. The spectral resources are allocated/used as a combination of both time (slot) and frequency (subcarrier) units. MIMO options with 2 or 4 antennas are supported. Multi-user MIMO is supported in both UL and DL. The modulation schemes supported in the downlink and uplink are QPSK, 16QAM and 64QAM.
The downlink transmission uses OFDM with a cyclic prefix. Some of the reasons for using OFDM are given below:
- Multiple carrier modulation (MCM) helps in countering frequency-selective fading, as the channel appears to have a nearly flat frequency response over each narrowband subcarrier.
- The frequency range of a resource block and the number of resource blocks can be changed (or adapted to the channel condition), allowing flexible spectrum allocation.
- Higher peak data rates can be achieved by using multiple resource blocks rather than by reducing the symbol duration or using a still higher order modulation, thereby reducing receiver complexity.
- The multiple orthogonal subcarriers inherently provide higher spectral efficiency.
The cyclic prefix (CP) is a partial repetition of the symbol sequence from the end to the beginning. This makes the time domain input sequence appear periodic over a duration, so that a DFT representation is possible for any frequency domain processing. Also, if the CP duration is chosen larger than the channel delay spread, it helps in reducing inter-symbol interference. The following pilot signals are defined for the downlink physical layer:
- Reference signal: The reference signal consists of known symbols transmitted at well-defined OFDM symbol positions in the slot. This assists the receiver at the user terminal in estimating the channel impulse response so that channel distortion in the received signal can be compensated for. There is one reference signal transmitted per downlink antenna port, and an exclusive symbol position is assigned to each antenna port (when one antenna port transmits a reference signal, the other ports are silent).
- Synchronization signal: Primary and secondary synchronization signals are transmitted at fixed subframe positions (first and sixth) in a frame and assist in the cell search and synchronization process at the user terminal. Each cell is assigned a unique primary sync signal.
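The CP insertion described above can be sketched in a few lines of NumPy. The FFT size and CP length below are illustrative placeholders, not the actual LTE numerology:

```python
import numpy as np

# Sketch of cyclic-prefix insertion for one OFDM symbol.
# n_fft and cp_len are illustrative, not LTE values.
n_fft = 64   # number of subcarriers
cp_len = 16  # cyclic-prefix length in samples

# Random QPSK symbols on each subcarrier (frequency domain)
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=(n_fft, 2))
symbols = ((1 - 2 * bits[:, 0]) + 1j * (1 - 2 * bits[:, 1])) / np.sqrt(2)

# OFDM modulation: IFFT, then prepend the last cp_len time-domain samples
time_signal = np.fft.ifft(symbols)
with_cp = np.concatenate([time_signal[-cp_len:], time_signal])

# The prefix is an exact copy of the symbol tail, making it appear periodic
assert np.allclose(with_cp[:cp_len], with_cp[-cp_len:])
```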
The transformation is equivalent to a shift in the center frequency of the single-carrier input signal. The subcarriers do not combine in random phases to cause large variations in the instantaneous power of the modulated signal. This means a lower PAPR (Peak to Average Power Ratio).
The PAPR (Peak to Average Power Ratio) of SC-FDMA is lower than that of conventional OFDMA, so the RF power amplifier (PA) can be operated at a point nearer to its recommended operating point. This increases the efficiency of the PA, thereby reducing the power consumption at the user terminal.
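The PAPR difference can be illustrated numerically. The sketch below, assuming random QPSK data, compares a multicarrier (IFFT-combined) waveform against a plain single-carrier QPSK waveform; it is a simplified illustration, not a full SC-FDMA chain:

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

rng = np.random.default_rng(1)
n = 256
qpsk = (1 - 2 * rng.integers(0, 2, n)) + 1j * (1 - 2 * rng.integers(0, 2, n))

# Multicarrier: subcarriers add in time, producing occasional large peaks
ofdm_symbol = np.fft.ifft(qpsk)
# Single carrier: QPSK symbols sent directly, constant envelope (0 dB PAPR)
sc_symbol = qpsk / np.sqrt(2)

print(papr_db(ofdm_symbol), ">", papr_db(sc_symbol))
```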
The following pilot signals are defined for the uplink physical layer:
- Demodulation Reference Signal: This signal, sent by the user terminal along with the uplink transmission, assists the network in estimating the channel impulse response for the uplink bursts so as to effectively demodulate the uplink channel.
- Sounding Reference Signal: This signal, sent by the user terminal, assists the network in estimating the overall channel conditions and in allocating appropriate frequency resources for uplink transmission.
Up-to-date versions of the RLC [3GPP TS 36.322] and MAC [3GPP TS 36.321] specifications are available, and the majority of procedures are specified. Hybrid-ARQ is provided at the MAC layer in addition to the ARQ at the RLC layer.
- Radio bearer control
- Radio admission control
- Connection mobility management
- Dynamic resource allocation
- Inter-cell interference coordination
- Load balancing
- Inter-RAT RRM functions
Contents
1 Introduction
2 LTE Initial Access
3 Initial synchronization
3.1 Primary Synchronization Signal (PSS)
3.2 Secondary Synchronization Signal (SSS)
4 LTE Cell selection and reselection criteria
Introduction
Cell reselection is a complex process in LTE. The following extract from [1] provides a very good understanding of the overall procedure.
Initial synchronization
Successful execution of the cell search and selection procedure as well as acquiring initial system information is essential for the UE before taking further steps to communicate with the network. For this reason, it is important to take a closer look at
this fundamental physical layer procedure. This section focuses on the cell-search scheme defined for LTE, and the next chapter describes reception of the essential system information. As in 3G (WCDMA), LTE uses a hierarchical cell-search procedure in which an LTE radio cell is identified by a cell identity, comparable to the scrambling code that is used to separate base stations and cells in WCDMA. To avoid the need for expensive and complicated network and cell planning, a set of 504 physical layer cell identities is sufficiently large. With a hierarchical cell search scheme, these identities are divided into 168 unique physical layer cell identity groups, where each group consists of three physical layer identities. To remember this hierarchical principle, consider the example of first names and surnames. According to statistics, the most common English surname is Smith, which corresponds to physical layer cell identity group 0. The second most common surname is Johnson, which represents physical layer cell identity group 1. This example can be extended to the last group, which would be Rose. The most common male first names are James, John, or Robert, and female names are Mary, Patricia, and Linda. Each first name represents one of the three physical layer identities. This information is transmitted using two different signals generated by Layer 1. The two signals, carrying the physical layer identity and the physical layer cell identity group, are the primary and the secondary synchronization signals respectively. This means that the complete cell search procedure consists of two steps to identify the cell's identity, as shown graphically in the Figure below:
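The two-step identity split described above can be expressed directly: the SSS carries the group (0-167) and the PSS the identity within the group (0-2), combining as PCI = 3 x group + identity, per 3GPP TS 36.211. A minimal sketch:

```python
# Sketch: the 504 physical-layer cell identities are organized as
# 168 groups (carried by the SSS) of 3 identities each (carried by the PSS).

def physical_cell_id(group_id, identity):
    """Combine SSS group (0-167) and PSS identity (0-2) into a PCI (0-503)."""
    assert 0 <= group_id <= 167 and 0 <= identity <= 2
    return 3 * group_id + identity

assert physical_cell_id(0, 0) == 0      # first cell identity
assert physical_cell_id(167, 2) == 503  # last cell identity
```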
twice per radio frame, so it is repeated in subframe 5 (in time slot 11). This enables the UE to get time synchronized on a 5 ms basis, which was selected to simplify the required inter-frequency and inter-RAT measurements: LTE must accommodate handover to and from other radio access technologies, such as GSM/GPRS/EDGE, WCDMA/HSPA or CDMA2000 1xRTT/1xEV-DO.
The previous section described how initial cell selection works and the difference between LTE FDD and TD-LTE. However, the UE is only allowed to camp on a cell when specific criteria are fulfilled. These criteria for cell selection as well as cell reselection for LTE are specified in [3]. This is further illustrated by a description of the two procedures: In the initial cell selection procedure, as described in the previous sections, no knowledge about RF channels carrying an E-UTRA signal is available at the UE. In that case the UE scans the supported E-UTRA frequency bands to find a suitable cell, and only the cell with the strongest signal per carrier will be selected by the UE. The second procedure relies on information about carrier frequencies and, optionally, cell parameters received and stored from previously detected cells. If no suitable cell is found using the stored information, the UE starts with the initial cell selection procedure. S is the criterion defined to decide whether the cell is still suitable. This criterion is fulfilled when the cell selection receive level satisfies:

Srxlev = Qrxlevmeas - (Qrxlevmin + Qrxlevminoffset) - Pcompensation > 0
Qrxlevmeas is the measured receive level value for this cell, i.e. the Reference Signal Received Power (RSRP). This measured value is the linear average over the power of the resource elements that carry the cell-specific reference signals within the considered measurement bandwidth; consequently, it depends on the configured signal bandwidth. If receiver diversity is configured at the UE, the reported value is equivalent to the linear average of the power values of all diversity branches.
Qrxlevmin is the minimum required receive level in this cell, given in dBm. This value is signaled as QrxLevmin by higher layers as part of the System Information Block Type 1 (SIB Type 1). QrxLevmin is calculated from the value provided within the information element (an integer between -70 and -22) multiplied by a factor of 2, in dBm.
Qrxlevminoffset is an offset to Qrxlevmin that is only taken into account as the result of a periodic search for a higher priority PLMN while camped normally in a Visited PLMN (VPLMN). This offset is based on the information element provided within SIB Type 1, taking integer values between 1 and 8, also multiplied by a factor of 2, in dB. This gives a wider range while keeping down the number of bits needed to transmit this information. The offset is defined to avoid ping-pong between different PLMNs. If it is not available, Qrxlevminoffset is assumed to be 0 dB.
Pcompensation is a maximum function:

Pcompensation = max(PEMAX - PUMAX, 0)

Whichever value is higher, PEMAX - PUMAX or 0, is used for Pcompensation. PEMAX [dBm] is the maximum power a UE is allowed to use in this cell, whereas PUMAX [dBm] is the maximum transmit power of a UE according to the power class the UE belongs to. At the moment only one power class is defined for LTE, which corresponds to Power Class 3 in WCDMA and specifies +23 dBm. PEMAX is defined by higher layers and corresponds to the parameter P-MAX defined in [2]. Based on this relationship, PEMAX can take values between -30 and +33 dBm. Only when PEMAX > +23 dBm is Pcompensation considered when calculating Srxlev. The P-MAX information element (IE) is part of SIB Type 1 as well as of the "RadioResourceConfigCommon" IE, which is part of SIB Type 2. As explained above, all parameters except Qrxlevmeas are provided via system information. In a real network a UE will receive several cells, possibly from different network operators. Only after reading SIB Type 1 does the UE know whether a cell belongs to its operator's network (PLMN Identity). First the UE looks for the strongest cell per carrier, then for the PLMN identity by decoding SIB Type 1 to decide whether this PLMN is suitable. Afterwards it computes the S criterion and decides whether the cell is suitable.
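The S-criterion arithmetic described above can be sketched as follows. Parameter names mirror the 3GPP terms; the numeric inputs are illustrative:

```python
# Sketch of the cell-selection S criterion (illustrative inputs).

def p_compensation(p_emax_dbm, p_umax_dbm=23.0):
    """max(PEMAX - PUMAX, 0); PUMAX is +23 dBm for the single LTE power class."""
    return max(p_emax_dbm - p_umax_dbm, 0.0)

def srxlev(q_rxlevmeas_dbm, q_rxlevmin_dbm, q_rxlevminoffset_db=0.0,
           p_emax_dbm=23.0):
    """Srxlev = Qrxlevmeas - (Qrxlevmin + Qrxlevminoffset) - Pcompensation."""
    return (q_rxlevmeas_dbm - (q_rxlevmin_dbm + q_rxlevminoffset_db)
            - p_compensation(p_emax_dbm))

# A cell is suitable when Srxlev > 0, e.g. RSRP -110 dBm vs Qrxlevmin -120 dBm:
assert srxlev(-110.0, -120.0) > 0
```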
The Figure above shows one possible scenario in a real network. Assume that the UE belongs to network operator 1. There are two other carriers also operating an LTE network, of course at different frequencies. The terminal receives all base stations, but at different power levels. Based on the above definition the UE will select the strongest cell for each carrier. The UE starts with network operator 3 and figures out, after decoding SIB Type 1, that the PLMN saved on the USIM does not match the transmitted one. It therefore stops this attempt and proceeds to the next strongest signal, which belongs to operator 2. Again the PLMN does not correspond, so the UE continues with signal 3 (green), where the PLMN matches. The UE continues to use the information in SIB Type 1 and Type 2 to compute the cell selection criteria. In this example, the parameters transferred and belonging to eNB1 do not fulfill S > 0, so the UE moves on to demodulating and decoding the information provided by eNB2. Here S > 0 is fulfilled and the UE starts camping on this cell.
2.2 eNode Bs: The Single E-UTRAN Node
2.3 The X2 Interface
2.4 eNode B Functionalities
3 Evolved Packet Core (EPC) and its Components
3.1 MME (Mobility Management Entity)
3.2 HSS (Home Subscriber Server)
3.3 The Serving GW (Serving Gateway)
3.4 The PDN GW (Packet Data Network Gateway)
3.5 The PCRF (Policy and Charging Rules Function) Server
4 References
5 External Links
1. Introduction
The following extract from [1] provides a very good understanding of the overall LTE Network Infrastructure and elements. The Figure below describes the LTE & UMTS overall network architecture, not only
including the Evolved Packet Core (EPC) and Evolved UMTS Terrestrial Access Network (EUTRAN), but also other components, in order to show the relationship between them. For simplification, the picture only shows the signalling interfaces. In some cases, both user data and signalling are supported by the interface (like the S1, S2 or 3G PS Gi interfaces) but, in some other cases, the interfaces are dedicated to the Control plane, and only support signalling (like the S6 and S7 interfaces).
The new blocks specific to the Evolved UMTS evolution, also known as the Evolved Packet System (EPS), are the Evolved Packet Core (or EPC) and the Evolved UTRAN (or E-UTRAN). Other blocks from the classical UMTS architecture are also displayed, such as the UTRAN (the UMTS Access Network) and the PS and CS Core Networks, respectively connected to the public (or any private) IP and Telephone Networks. The IMS (IP Multimedia Subsystem) is located on top of the Packet Core blocks and provides access to both public and private IP networks, and to the public telephone network via Media Gateway network entities. The HSS, managing user subscription information, is shown as a central node, providing services to all Core Network blocks of the 3G and evolved 3G architecture. Note: the picture does not represent the nodes involved in the support of the charging function. Discussed below are the individual sub-components:
Network concepts. The general architecture follows the good old 2G/GSM star model, meaning that a single controller (the RNC) may control a large number of radio Base Stations (the Node Bs) over the Iub interface (the typical number in commercial networks is several hundred). In addition, an inter-RNC Iur interface was defined to allow UTRAN call anchoring at the RNC level and macro-diversity between Node Bs controlled by different RNCs. Macro-diversity was a consequence of the CDMA-based UTRAN physical layer, as a means to reduce radio interference and preserve network capacity. The initial UTRAN architecture resulted in a simplified Node B implementation and a relatively complex, sensitive, high-capacity and feature-rich RNC design. In this model, the RNC had to support resource and traffic management features as well as a significant part of the radio protocols.
Radio Resource Control: this relates to the allocation, modification and release of resources for the transmission
over the radio interface between the user terminal and the eNodeB.
Radio Mobility management: this refers to measurement processing and handover decisions.
Radio interface full Layer 2 protocol: in the OSI Data Link sense, the Layer 2 purpose is to ensure the transfer of data between network entities. This implies detection, and possibly correction, of errors that may occur in the physical layer.
- The MME (Mobility Management Entity)
- The HSS (Home Subscriber Server)
- The Serving Gateway
- The PDN Gateway (Packet Data Network Gateway)
- The PCRF (Policy and Charging Rules Function) Server
The following sub-sections discuss each of these in detail:
Terminal-to-network session handling: this relates to all the signalling procedures used to set up Packet Data contexts and negotiate associated parameters like the Quality of Service.
Idle terminal location management: this relates to the tracking area update process, used so that the network is able to reach terminals in case of incoming sessions. The MME is linked through the S6 interface to the HSS, which supports the database containing all the user subscription information.
User profile information: this includes service subscription states and user-subscribed Quality of Service information (such as the maximum allowed bit rate or allowed traffic class). The AuC part of the HSS is in charge of generating security information from user identity keys. This security information is provided to the HLR and further communicated to other entities in the network. Security information is mainly used for:
- Mutual network-terminal authentication.
- Radio path ciphering and integrity protection, to ensure that data and signalling transmitted between the network and the terminal are neither eavesdropped nor altered.
- The Policy Decision Function (PDF)
- The Charging Rules Function (CRF)
The PDF is the network entity where the policy decisions are made. As the IMS session is being set up, SIP signalling containing media requirements is exchanged between the terminal and the P-CSCF. At some point in the session establishment process, the PDF receives those requirements from the P-CSCF and makes decisions based on network operator rules, such as:
- Allowing or rejecting the media request.
- Using a new or existing PDP context for an incoming media request.
- Checking the allocation of new resources against the maximum authorized.
The CRF's role is to provide operator-defined charging rules applicable to each service data flow. The CRF selects the relevant charging rules based on information provided by the P-CSCF, such as Application Identifier, Type of Stream (audio, video, etc.) and Application Data Rate.
Contents
1 Introduction
2 LTE Radio Link Budgeting
2.1 Uplink Budget
2.2 Downlink Budget
2.3 Propagation (Path Loss) Models
2.4 Mapping of Path Losses to Cell Sizes
2.5 Comparison to Other Radio Access Technologies
3 References
1. Introduction
The initial planning of any Radio Access Network begins with a Radio Link Budget. As the name suggests, a link budget is simply the accounting of all of the gains and losses from the transmitter, through the medium (free space, cable, waveguide, fiber, etc.) to the receiver in a telecommunication system. In this page, we will briefly discuss link budget calculations for LTE.
attenuation between the mobile and the base station antenna. The maximum path loss allows the maximum cell range to be estimated with a suitable propagation model. The cell range gives the number of base station sites required to cover the target geographical area. The following table shows typical (practical) parameter values used for an LTE Radio Link Budget.
The link budget parameters are defined as follows:

a. Base station maximum transmission power. A typical value for a macro-cell base station is 20-60 W at the antenna connector.
b. Base station antenna gain.
c. Cable loss between the base station antenna connector and the antenna. The cable loss value depends on the cable length, cable thickness and frequency band. Many installations today use RF heads where the power amplifiers are close to the antenna, making the cable loss very small.
d. Base station EIRP, calculated as a + b - c.
e. UE RF noise figure. Depends on the frequency band, duplex separation and the allocated bandwidth.
f. Terminal thermal noise, calculated as k (Boltzmann constant) x T (290 K) x bandwidth. The bandwidth depends on the bit rate, which defines the number of resource blocks. We assume a 50 resource block (9 MHz) transmission for 1 Mbps downlink, giving -104.5 dBm.
g. Receiver noise floor, calculated as e + f.
h. Signal-to-noise ratio from link simulations or measurements. The value depends on the modulation and coding scheme, which in turn depends on the data rate and the number of resource blocks allocated.
i. Receiver sensitivity, calculated as g + h.
j. Interference margin, which accounts for the increase in the terminal noise level caused by other cells. If we assume a minimum G-factor of -4 dB, that corresponds to an interference margin of 10*log10(1 + 10^(4/10)) = 5.5 dB.
k. Control channel overhead, which includes the overhead from reference signals, PBCH, PDCCH and PHICH.
l. UE antenna gain.
m. Body loss.

Uplink budget (64 kbps):

Transmitter - UE
a  Max. TX power (dBm)            24.0
b  TX antenna gain (dBi)           0.0
c  Body loss (dB)                  0.0
d  EIRP (dBm)                     24.0 = a + b - c

Receiver - eNode B
e  Node B noise figure (dB)        2.0
f  Thermal noise (dBm)          -118.4 = k(Boltzmann) x T(290 K) x B(360 kHz)
g  Receiver noise floor (dBm)   -116.4 = e + f
h  SINR (dB)                      -7.0 (from simulations performed in [1])
i  Receiver sensitivity (dBm)   -123.4 = g + h
j  Interference margin (dB)
k  Cable loss (dB)
l  RX antenna gain (dBi)
m  MHA gain (dB)

Downlink budget (1 Mbps, 50 resource blocks):

Transmitter - eNode B
a  Max. TX power (dBm)            46.0
b  TX antenna gain (dBi)          18.0
c  Cable loss (dB)                 2.0
d  EIRP (dBm)                     62.0 = a + b - c

Receiver - UE
e  UE noise figure (dB)            7.0
f  Thermal noise (dBm)          -104.5 = k(Boltzmann) x T(290 K) x B(9 MHz)
g  Receiver noise floor (dBm)    -97.5 = e + f
h  SINR (dB)                     -10.0 (from simulations performed in [1])
i  Receiver sensitivity (dBm)  -107.5 = g + h
j  Interference margin (dB)        3.0
k  Control channel overhead (dB)   1.0
l  RX antenna gain (dBi)           0.0
m  Body loss (dB)                  0.0
   Maximum path loss (dB)        165.5 = d - i - j - k + l - m
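The downlink column arithmetic above can be reproduced with a short script; the values are taken directly from the table:

```python
import math

# Sketch reproducing the downlink link-budget arithmetic from the table above.

def thermal_noise_dbm(bandwidth_hz):
    """k*T*B in dBm, with kT at T = 290 K equal to -174 dBm/Hz."""
    return -174.0 + 10 * math.log10(bandwidth_hz)

# Transmitter (eNode B): d = a + b - c
eirp_dbm = 46.0 + 18.0 - 2.0                  # 62.0 dBm

# Receiver (UE), 50 resource blocks = 9 MHz
noise_floor = 7.0 + thermal_noise_dbm(9e6)    # e + f, about -97.5 dBm
sensitivity = noise_floor + (-10.0)           # g + h, about -107.5 dBm

# Maximum path loss: d - i - j - k + l - m
max_path_loss = eirp_dbm - sensitivity - 3.0 - 1.0 + 0.0 - 0.0
print(round(max_path_loss, 1))  # 165.5
```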
Common examples include the Free Space, Walfish-Ikegami, Okumura-Hata, Longley-Rice, Lee and Young models. The most commonly used model in urban environments is the Okumura-Hata model, given below for urban areas:

L = 69.55 + 26.16 log10(f) - 13.82 log10(hb) - a(hm) + (44.9 - 6.55 log10(hb)) log10(d)

where L is the median path loss in dB, f is the carrier frequency in MHz, hb is the base station antenna height in m, a(hm) is the mobile antenna height correction factor, and d is the distance in km.
The Okumura-Hata parameters include:
- Base station antenna height (m)
- Mobile antenna height (m)
- Mobile antenna gain (dBi)
- Slow fading standard deviation (dB)
- Location probability (%)
- Correction factor (dB)
- Indoor loss (dB)
- Slow fading margin (dB)
[Figure: mapping of maximum path loss to cell size (km)]
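The mapping from maximum path loss to cell range can be sketched by inverting the urban Okumura-Hata model. The small/medium-city correction factor a(hm) is assumed here, and the numeric inputs are illustrative:

```python
import math

# Sketch of the urban Okumura-Hata model (valid roughly for 150-1500 MHz)
# and its inversion from a maximum allowed path loss to a cell range.

def hata_urban_db(f_mhz, h_base_m, h_mobile_m, d_km):
    """Median urban path loss in dB, small/medium-city correction a(hm)."""
    a_hm = ((1.1 * math.log10(f_mhz) - 0.7) * h_mobile_m
            - (1.56 * math.log10(f_mhz) - 0.8))
    return (69.55 + 26.16 * math.log10(f_mhz)
            - 13.82 * math.log10(h_base_m) - a_hm
            + (44.9 - 6.55 * math.log10(h_base_m)) * math.log10(d_km))

def cell_range_km(max_path_loss_db, f_mhz, h_base_m, h_mobile_m):
    """Invert the model: distance at which the loss equals the allowed maximum."""
    slope = 44.9 - 6.55 * math.log10(h_base_m)
    loss_at_1km = hata_urban_db(f_mhz, h_base_m, h_mobile_m, 1.0)
    return 10 ** ((max_path_loss_db - loss_at_1km) / slope)

# e.g. 900 MHz, 30 m base station antenna, 1.5 m mobile, 150 dB allowed loss
print(round(cell_range_km(150.0, 900.0, 30.0, 1.5), 2))
```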
In contrast to HSPA link budgets, LTE link budgets show up to roughly 2 dB higher values, mainly as a result of the low interference margins that can be achieved with orthogonal modulation. For a detailed comparison please refer to LTE Link Budget Comparison.
1. Introduction
The tables below show a link budget comparison between LTE, GSM and UMTS HSPA.
The following table based on [1],[2] compares the uplink budget for LTE, HSPA and GSM
Uplink budget comparison:

Parameter                        GSM     HSPA    LTE
Data rate (kbps)                 12.2    64      64
Max. TX power (dBm)              33      23      23
TX antenna gain (dBi)            0       0       0
Body loss (dB)                   3       0       0
EIRP (dBm)                       30      23      23
Receiver sensitivity (dBm)      -114     -      -123.4
Interference margin (dB)         0       -       1
Cable loss (dB)                  0       -       0
RX antenna gain (dBi)            18      -       18
Fast fade margin (dB)            0       -       0
Soft handover gain (dB)          0       -       0
Maximum path loss (dB)           162     -       163.4
The uplink link budget has some differences in comparison to HSPA: specifically a smaller interference margin, no macro diversity gain (soft handover gain) and no fast fading margin. As can be seen from the table above, the link budget was calculated for 64 kbps uplink, which cannot be classified as a high enough data rate for a true broadband service. To guarantee higher data rates for LTE, a low frequency deployment may be required in addition to additional sites, active antenna solutions or local area solutions.
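The maximum-path-loss figures in the uplink comparison above can be checked with a few lines (HSPA values are omitted, since they did not survive in the table):

```python
# Sketch checking the uplink maximum-path-loss figures in the table above.

def max_path_loss(eirp_dbm, sensitivity_dbm, interference_margin_db,
                  cable_loss_db, rx_gain_dbi,
                  fast_fade_db=0.0, sho_gain_db=0.0):
    """MAPL = EIRP - sensitivity - margins - cable loss + RX gain + SHO gain."""
    return (eirp_dbm - sensitivity_dbm - interference_margin_db
            - cable_loss_db + rx_gain_dbi - fast_fade_db + sho_gain_db)

gsm = max_path_loss(30.0, -114.0, 0.0, 0.0, 18.0)   # 162.0 dB
lte = max_path_loss(23.0, -123.4, 1.0, 0.0, 18.0)   # 163.4 dB
print(gsm, lte)
```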
The following table based on [1],[2] compares the downlink budget for LTE, HSPA and GSM
Downlink budget comparison:

Parameter                        GSM     HSPA    LTE
Data rate (kbps)                 12.2    1024    1024
Max. TX power (dBm)              44.5    46      46
TX antenna gain (dBi)            18      18      18
Cable loss (dB)                  2       2       2
EIRP (dBm)                       60.5    62      62
UE noise figure (dB)             -       7       7
Thermal noise (dBm)             -119.7  -108.2  -104.5
Receiver noise floor (dBm)       -      -101.2  -97.5
SINR (dB)                        -      -5.2    -9
Receiver sensitivity (dBm)      -104    -106.4  -106.5
Interference margin (dB)         0       4       4
Control channel overhead (%)     0       20      20
RX antenna gain (dBi)            0       0       0
Body loss (dB)                   3       0       0
Maximum path loss (dB)           161.5   163.4   163.5
The LTE downlink link budget has several similarities with HSPA, and the maximum path loss is similar. The link budgets show that LTE can be deployed using existing GSM and HSPA sites, assuming that the same frequency is used for LTE as for GSM and HSPA. LTE itself does not provide any major boost in coverage. That is because the transmission power levels and the RF noise figures are similar to those of the GSM and HSPA technologies, and the link performance at low data rates is not much different in LTE than in HSPA.