Small Cells
Dec 21st, 2013 by Dan Lampie

Small cells are currently the big buzzword in the wireless industry.  Many of the major wireless operators in the US, including AT&T, Verizon Wireless, and Sprint, have committed to using small cells throughout their networks.  In fact, AT&T has committed to deploying 40,000 small cells by the end of 2015, which shows that wireless operators are serious about this technology.  This raises the question: what are small cells, and why are they needed?

Macro vs Small Cell in New York City


Small cells, as the name suggests, are smaller cellular base stations.  "Smaller" here covers physical size, RF coverage area, and cost.  Another term used in the same context as small cells is distributed antenna systems, or DAS.  A DAS is made up of a number of small antenna nodes which connect back via fiber to a cellular base station.  With a small cell, all the intelligence is housed within the device, while a DAS node is just a dumb transmitter and receiver with the intelligence housed at a remote location.  DAS technology has been in the marketplace for a couple of years and has been successfully deployed in both outdoor and indoor environments.  Small cells are brand new to the market, and they are gaining popularity as they should be cheaper and simpler to deploy than a DAS.

DAS Node


The need for small cells is being driven by the surge in mobile data consumption.  The popularity of smartphones and tablets means that people are consuming large amounts of data on their mobile devices.  People are not just using their mobile devices to surf the web; they are streaming videos and uploading pictures using applications such as Netflix, YouTube, and Instagram.  While LTE was designed to support these mobile applications, usage is growing faster than improvements in wireless efficiency, which is leaving networks congested.



To better understand the situation, it is important to look at the capacity of an LTE base station, which is called an eNB.  LTE is similar in technology to the 802.11n Wi-Fi standard.  Both use similar modulation schemes and data transmission technologies.  Many LTE networks in the US use 10MHz LTE carriers with a technology called frequency division duplex, or FDD.  This means that 10MHz of spectrum is used for each of the separate downlink and uplink channels.  This allows for full duplex communication and means the total amount of spectrum used is 20MHz.  Wi-Fi, along with some forms of LTE, uses a technology called time division duplex, or TDD.  With this technology the downlink and uplink data is interleaved in the time domain on the same channel.



The standard Wi-Fi channel is 20MHz wide, which uses the same amount of spectrum as a 10MHz FDD LTE carrier.  While a normal Wi-Fi access point might only serve a few people, an LTE base station has to support hundreds of users in the same bandwidth.  An LTE base station is essentially a high-tech Wi-Fi router with advanced resource and user scheduling technology.  If a hundred people tried streaming videos from the same Wi-Fi router the performance would be mediocre, and the same holds true with LTE.  To improve performance, a simple solution is to decrease the number of people using the connection.  While this might seem obvious, this is one way cellular operators ensure that their networks do not become overloaded and congested.
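The congestion argument comes down to simple arithmetic.  A toy sketch (the 50 Mbps aggregate figure is an assumed round number for illustration, not a measured LTE or Wi-Fi value):

```python
def per_user_mbps(aggregate_mbps: float, users: int) -> float:
    """Ideal equal split of one channel's throughput among active users."""
    return aggregate_mbps / users

# One channel, more users: the per-user share collapses quickly.
for n in (1, 10, 100):
    print(f"{n:3d} users -> {per_user_mbps(50.0, n):5.1f} Mbps each")
```

With a hundred active users the per-user share drops to half a megabit, far short of what video streaming needs, regardless of how clever the scheduler is.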

For the last twenty years the number of cell sites has been growing while the coverage area of each cell has been shrinking.  The concept is relatively simple and is known as cell splitting.  Instead of having one large cell site serving an area, if two smaller cell sites serve the same geographical area there will be close to double the capacity.  This concept has been successfully used for a long time, but today there is so much usage in major cities that a cell site is needed on every block.  It is impractical, due to cost and space requirements, to put conventional cellular base stations on every block.  This is why small cells are being utilized.  They allow for a denser deployment as they can be mounted on light poles and sides of buildings instead of on towers or rooftops.  Instead of having one conventional cell site every four blocks, it is now possible to have a small cell on every block, greatly increasing capacity.
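The cell-splitting payoff can be sketched the same way.  The per-cell capacity below is an assumed figure, and the model is idealized (it ignores inter-cell interference); the point is only that capacity over a fixed area scales with the number of cells covering it:

```python
def area_capacity_mbps(cells: int, per_cell_mbps: float = 50.0) -> float:
    """Total capacity over a fixed area served by `cells` sites (idealized)."""
    return cells * per_cell_mbps

one_macro = area_capacity_mbps(1)    # one conventional site per four blocks
four_small = area_capacity_mbps(4)   # one small cell per block
print(four_small / one_macro)        # -> 4.0, i.e. four times the capacity
```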



Given that small cells are a new technology, there are still many questions that need to be answered.  Will small cells be economically viable?  Will small cells be reliable?  The big question remains whether small cells are the solution to the explosive mobile data growth that is occurring.  Regardless of the success of small cells, the increase in mobile data consumption will force wireless operators to come up with innovative ways to meet mobile data demands.

Making Sense of Sprint’s Network Vision
Apr 7th, 2012 by Dan Lampie

A little over six months ago, Sprint-Nextel laid out its strategy for revamping its wireless network and called the plan “Network Vision.”  If you have read any of my previous articles about Sprint, you would know that Sprint has not had any real network strategy since purchasing Nextel back in 2005.  Today Sprint still has numerous sites where it has yet to combine its iDEN, CDMA/EVDO, and Clearwire WiMAX networks, which has resulted in poor coverage and high maintenance and real estate costs.  Well, this is all about to change with Network Vision.  After seven years without any true network plan, Sprint-Nextel has something that actually makes sense.

Sprint Network Vision Tower (Alcatel-Lucent Equipment)


Here is a brief overview of what “Network Vision” entails.  The website has some excellent detailed information on what “Network Vision” really means from a technical perspective.

- Consolidating its cell sites by removing sites that are not needed.  Sprint currently has 68,000 sites and will reduce this number by 44%, eventually leaving 38,000 sites.

* AT&T claims it has 55,000 cell sites, so once Network Vision is complete Sprint's nationwide coverage will still lag behind AT&T's.

- Shutting down iDEN and reusing the spectrum to support at least one 800MHz CDMA 1X Advanced carrier.

* Deploying a 1X carrier in the 800MHz spectrum will greatly improve voice performance and coverage for Sprint, especially inside buildings.

Frequency plan for new 1X advanced carriers. Source:


- Deploying an LTE carrier in a 5x5MHz channel configuration in its 1900MHz (PCS) spectrum.

* LTE is the future and this will give Sprint the opportunity to have a nationwide LTE network.
* 5MHz channels will offer only half the data speeds of the 10MHz channels that Verizon Wireless uses.  Still, it will be vastly faster than EVDO, with latency of around 50ms.
* Deploying on the 1900MHz spectrum will mean that Sprint will not have as good coverage or indoor penetration as either Verizon Wireless or AT&T which are using 700MHz.

PCS Band Plan. Source:

- Using Remote Radio Heads (RRH) for its existing CDMA/EVDO network and upcoming LTE network
* An RRH moves the base station amplifier from the bottom of the tower to the top.  This eliminates the attenuation of long runs of coax cable up the tower.  According to this spec sheet from Andrew, 100FT of 1 ¼ coax has a loss of 1.6dB at 1900MHz, or a power loss of 31% at the top of the tower.  Thus going with the RRH solution should yield 30%+ more output power along with a 30% increase in received power over today's coax solution.
* This should improve the coverage and performance of Sprint's existing CDMA/EVDO network.  The combination of the 800MHz spectrum and RRH should really help Sprint's voice coverage.
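The ~31% figure follows directly from converting decibels to a linear power ratio; a quick sketch:

```python
def fraction_lost(loss_db: float) -> float:
    """Fraction of power dissipated for a given cable attenuation in dB."""
    return 1.0 - 10 ** (-loss_db / 10.0)

# 1.6 dB for 100 ft of 1-1/4" coax at 1900 MHz (figure from the spec sheet)
print(f"{fraction_lost(1.6):.1%}")  # -> 30.8%, i.e. the ~31% cited above
```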

Sprint is using three RRH per face (1 for 800MHz CDMA, 1 for PCS EVDO, 1 for PCS LTE)


At the end of the day Sprint will have a CDMA/EVDO/LTE network, just like Verizon Wireless.  By consolidating its cell sites and turning off iDEN, Sprint will save a ton of money on operating expenses.  It is interesting that Sprint is investing a lot of time and money upgrading CDMA/EVDO instead of just focusing on deploying LTE.  Additionally, with MetroPCS, AT&T, and Verizon Wireless all committing to VoLTE, it is interesting that Sprint is planning on deploying CDMA 1X Advanced for voice calls.  Sprint must believe that its CDMA/EVDO networks can be greatly improved with Network Vision and that both these technologies will be around for some time.  Sprint has been successful at finding ways to monetize its old networks, such as offering Boost Mobile prepaid service over its iDEN network.  As postpaid customers move to LTE, Sprint could offer competitively priced but slower data services over its CDMA/EVDO network to maximize its investment.

The one element that was left out of Network Vision is Clearwire, of which Sprint owns 54%.  If Clearwire partnered with Sprint, like LightSquared attempted before all of its GPS interference issues, Clearwire's network consolidation could save a great deal of money for the small carrier.  Clearwire will be upgrading its network to LTE, but it will be based on TD-LTE technology instead of the FDD-LTE that all the other US carriers are using.  Clearwire's 2.5GHz spectrum limits its usefulness to urban areas, and the high cell density needed for good coverage makes network expansion expensive.  Clearwire is hoping to sell extra LTE capacity to the major wireless carriers, but using a different LTE technology and a separate frequency band than everyone else will make this difficult.  While Sprint's issues with Clearwire remain, Network Vision is a huge step in the right direction for Sprint.  Once complete, it will offer much greater voice coverage, improved EVDO performance, and, most importantly, bring Sprint into the LTE game.

A single dual band antenna supports all three technologies

mDDoS: A New Threat to Cellular Networks
Feb 12th, 2012 by Dan Lampie

Mobile security has become a growing concern with the rise of smartphones and tablets.  Vulnerabilities in mobile operating systems and malicious applications are an ever increasing threat to personal information and security.  Almost all of the focus in mobile security has been centered on protecting personal information stored on mobile devices.  Very little research has been done on protecting cellular networks from distributed denial of service attacks, or DDoS attacks.  DDoS attacks have long been used on the internet to disrupt websites and web services.  A DDoS attack is simple in theory: thousands or even millions of computers simultaneously request information from a website, which overloads the servers and causes the website to stop functioning.

There is a new type of DDoS attack that is possible with mobile devices, called a mobile distributed denial of service attack, or mDDoS.  An mDDoS attack is different from a traditional DDoS attack in that it targets a cellular network, with the purpose of disrupting or even bringing it down, instead of a website or web application.  mDDoS attacks are possible because cellular networks can only support a very limited number of active users at a time.  The number of active users supported varies depending on the network technology, such as UMTS, EVDO, or LTE, but eventually it all comes down to the number of orthogonal codes or OFDM symbols that the different technologies utilize.  Usually the small number of active users isn't an issue, as the probability that everyone on a particular cell site is updating Facebook, surfing the net, or calling Grandma at a given instant is low.

When a mobile device sends a request through the airwaves to the cell site, it expects to receive a response.  A cell site at capacity either rejects or ignores the service request.  This causes the mobile device to retry the service request, but this time the request is sent at a higher power.  When numerous mobile devices increase their transmission power, it eventually creates enough interference that the base station's capacity to serve users is severely reduced.  When this happens, mobile devices have a difficult time accessing the network and their data connections start to fail.  Usually this is not an issue, as cellular networks are designed to handle a normal traffic distribution through a process known as traffic engineering.
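Traffic engineering of this kind classically uses the Erlang B formula, which gives the probability that a service request is blocked for a given offered traffic load and channel count.  A minimal sketch (the traffic and channel figures are illustrative, not real cell-site numbers):

```python
def erlang_b(offered_erlangs: float, channels: int) -> float:
    """Blocking probability via the standard Erlang B recursion."""
    b = 1.0
    for n in range(1, channels + 1):
        b = (offered_erlangs * b) / (n + offered_erlangs * b)
    return b

# A site engineered for 20 Erlangs on 30 channels blocks under 1% of
# requests; an attack that doubles the offered traffic to 40 Erlangs
# gets roughly a quarter of all requests rejected, triggering the
# power-ramping retries described above.
print(f"{erlang_b(20, 30):.2%}")
print(f"{erlang_b(40, 30):.2%}")
```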

An mDDoS attack uses this technique of overloading the RF air interface to effectively bring down a cellular network.  To execute an mDDoS attack, multiple mobile devices need to start sending data or initiate a phone call at the same instant in time.  To accomplish this, a malicious app can be used to coordinate and execute the attack.  How to install and spread a malicious app across millions of mobile devices is an entirely different topic, but if this can be achieved the results could be devastating.  Unlike a DDoS attack, where the attacking traffic can be rerouted by ISPs and network operators to minimize the impact, an mDDoS attack is much harder to stop.  An mDDoS attack focuses on layer 1, the RF signaling link.  Since mobile devices must follow the protocols laid out in the 3GPP and 3GPP2 standards, it is difficult to engineer countermeasures.

The good news is that it is challenging to launch a successful mDDoS attack.  Most cellular networks are designed with extra capacity in mind to prevent unexpected events from causing network outages.  This means that if only a small number of mobiles connected to a cell site are vulnerable and an mDDoS attack is launched, it is unlikely the attack will have an impact.  While mDDoS attacks are improbable, their possibility reveals how vulnerable cellular networks are to organized attacks.  To conclude, the adoption of highly advanced mobile devices such as smartphones not only increases the security threat to consumers, but also to cellular network operators.

4G LTE Battle: AT&T vs Verizon Wireless
Sep 18th, 2011 by Dan Lampie

Today AT&T launched its 4G LTE network in five cities.  While this is a big step for AT&T, its main competitor, Verizon Wireless, launched its 4G LTE network almost ten months ago.  Verizon's 4G LTE network now has service in 143 markets and covers over half the US population.  Wireless carriers that utilize EVDO technology for 3G services, such as Sprint-Nextel and Verizon Wireless, have been the first to move to 4G technologies, as EVDO offers slower theoretical speeds than the competing HSPA+ technology.  A single EVDO rev. A channel's theoretical speed is 3.1Mbps in the downlink and 1.8Mbps in the uplink, while a single HSPA+ channel (not utilizing channel bonding or MIMO) can offer 21Mbps in the downlink and 5.8Mbps in the uplink.  In real world applications the speed advantage for HSPA+ is much smaller because HSPA+ carries voice calls over the same channel, which reduces the data speeds.  Verizon Wireless and AT&T have always used different network technologies, with Verizon choosing the CDMA (3GPP2) path and AT&T electing the GSM (3GPP) route.  While both technologies offer their share of advantages and disadvantages, going forward both companies will use the same network technology – Long Term Evolution, or LTE.

LTE is not a one-size-fits-all technology, but instead a technology that allows for a variety of different configurations which greatly impact how it is deployed and how it performs.  Both AT&T and Verizon Wireless utilize frequency division duplex (FDD) mode, which means that the upload and download channels are on two separate frequencies.  LTE also offers the capability to use time division duplex (TDD), which allows both the download and upload channels to use one frequency, with the download and upload being allocated different time slots.



Another similarity between AT&T and Verizon Wireless is that they are both utilizing 2×2 MIMO antenna technology.  MIMO is an extremely complicated topic, but basically it allows double the amount of data to be transferred in a single channel by utilizing two transmit and two receive antennas instead of one.


LTE release 8 supports options for one, two, or four antenna configurations, where the highest performance is achieved utilizing a 4×4 MIMO solution.  Almost all wireless carriers are choosing the 2×2 MIMO route as it offers the best performance/price ratio.  Going with a 4×4 MIMO solution over 2×2 MIMO means that double the number of antennas and amplifiers are needed, along with more powerful processors in mobile handsets and base stations to decode the additional data streams.  Additionally, according to research by Ericsson, MIMO only provides performance improvements when a receiver has a signal to noise ratio (SNR) of approximately 10dB or better.  This means that MIMO is most beneficial when a user is close to the cell site, which for most cell sites is only a small percentage of users.

Source: Ericsson
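The SNR dependence can be illustrated with a toy Shannon-capacity model.  This is a simplification I'm assuming for illustration (transmit power split evenly across independent spatial streams), not Ericsson's actual analysis:

```python
import math

def shannon_mbps(bw_mhz: float, snr_db: float, streams: int = 1) -> float:
    """Idealized capacity with transmit power split across `streams`."""
    snr = 10 ** (snr_db / 10.0)
    return streams * bw_mhz * math.log2(1 + snr / streams)

for snr_db in (0, 10, 20):
    siso = shannon_mbps(10, snr_db, 1)
    mimo = shannon_mbps(10, snr_db, 2)
    print(f"SNR {snr_db:2d} dB: gain from 2x2 MIMO = {mimo / siso:.2f}x")
```

At 0 dB the spatial-multiplexing gain is marginal (about 1.17x); at 20 dB it approaches the theoretical doubling, which matches the observation that MIMO mostly helps users close to the site.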


The main difference between AT&T's and Verizon Wireless' 4G LTE networks is the bandwidth that each channel uses, and this is based on the spectrum allocations that the two companies own.  AT&T is using a mixture of 10MHz and 5MHz channels, while Verizon Wireless is solely using 10MHz channels.  In areas where AT&T uses 5MHz channels, Verizon Wireless' network will theoretically offer double the performance of AT&T's.  Theoretically, a 10MHz channel utilizing 2×2 MIMO supports peak downlink data rates of 73Mbps, while a 5MHz channel will only support 37Mbps.  As with any wireless technology, reaching anywhere near these theoretical numbers is extremely unlikely.  AT&T knows that its 5MHz channels will put it at a large capacity and speed disadvantage compared to Verizon Wireless, so in markets where it has both 700MHz and AWS spectrum it will try to utilize two 5MHz channels instead of just one.  The two channels become beneficial when a large number of users are on the network and the load is distributed across two 5MHz channels instead of crowding everyone into one.  Currently, these two channels can't be bonded for higher throughput, but this capability will become available in LTE Advanced and is known as carrier aggregation.
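The 73/37 Mbps figures can be roughly reproduced with back-of-the-envelope LTE release 8 arithmetic.  The resource-block geometry (12 subcarriers × 14 OFDM symbols per resource block per 1 ms subframe) and 64-QAM at 6 bits per symbol come from the standard; the 25% control/reference-signal overhead is a round assumption on my part:

```python
def lte_peak_mbps(resource_blocks: int, streams: int = 2,
                  overhead: float = 0.25) -> float:
    """Rough peak downlink rate: RBs x 12 subcarriers x 14 symbols x
    6 bits (64-QAM) x spatial streams per 1 ms subframe, minus overhead.
    bits per ms divided by 1000 gives Mbit/s."""
    bits_per_ms = resource_blocks * 12 * 14 * 6 * streams
    return bits_per_ms / 1000.0 * (1.0 - overhead)

print(lte_peak_mbps(50))  # 10 MHz (50 RBs) -> ~76 Mbps vs the 73 cited
print(lte_peak_mbps(25))  # 5 MHz (25 RBs)  -> ~38 Mbps vs the 37 cited
```

The halving of resource blocks from 50 to 25 is exactly why the 5MHz channel delivers half the peak rate.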



The final difference between the two 4G LTE networks is the base station radios used.  A growing trend in the wireless industry is to mount the base station's radio and amplifier at the top of the tower.  This is known as remote radio heads (RRH), and this technology minimizes the cable attenuation experienced by an antenna system.  In traditional base station deployments, the radios and amplifiers are mounted on the ground, where they can be easily upgraded and repaired, and thick coax runs up the tower to the antennas.  The issue with this is that long runs of coax cable experience attenuation.  According to this spec sheet, 100FT of 1 ¼ coax cable has a loss of 1.6dB, or roughly 31% less power at the top of the tower compared to when the signal left the amplifier.  Clearly, reducing the antenna's output power by 31% not only reduces coverage and degrades downlink throughput, but it also affects the uplink.  The antenna at the top of the tower sees 31% more power from the mobile handset than what actually makes it to the base station.  This results in decreased coverage, reduced uplink throughput, and diminished battery life for handsets.  By mounting the radio and amplifier at the top of the tower, the 1.6dB cable loss is practically eliminated, greatly improving performance over traditional base station deployments.  Instead of running thick coax up the tower, a much thinner combined fiber optic and power cable is run to each remote radio head.


Clearwire was the first wireless operator to solely use RRH, on its 4G WiMAX network, and the units can be easily spotted as the large boxes connecting to the antennas.  AT&T is following in Clearwire's footsteps by primarily using RRH for its LTE deployment.  Currently an RRH can only support one technology and frequency band, so with AT&T's dual-frequency (700MHz and AWS) LTE deployment this means two RRH are needed for each face of the tower (most towers have three faces).  Most of the Verizon Wireless towers I have observed do not have any RRH mounted, so it is probable they are going with the conventional base station deployment model with the radios and amplifiers mounted at the bottom of the tower.  The benefit of this solution is that equipment can be quickly and easily repaired and upgraded while staying protected from the outdoor elements.  When an RRH goes bad or needs to be upgraded it requires someone to climb the tower, which is time consuming and can become very costly.  Given that RRH technology is still very new, it will take time to see whether the performance gains of RRH make up for the limitations in repair and upgradability.

Overall, the technology is similar for both Verizon Wireless' and AT&T's LTE networks.  While the technology might be similar, Verizon Wireless is the clear leader, already having half the US population covered compared to just five cities for AT&T.  In the end, regardless of whether one chooses Verizon Wireless or AT&T, US consumers are the true winners, having access to multiple advanced 4G LTE networks.
