
Antennagate: the real engineering behind the iPhone 4 antenna issue

One of our newest associates, Robert Thorpe, is a real mobile antenna expert. He took some time to explain the reality of the much-reported iPhone 4 antenna saga:

Antenna engineering is rarely big news, but the problems with the Apple iPhone 4 brought it into the spotlight.  It’s highlighted issues that many smaller businesses designing wireless products struggle with.

Back in June Apple announced the iPhone 4, and in his keynote speech Steve Jobs promoted the new antenna design.  I was asked to comment on the antenna for an article on the Wall Street Journal website.  The journalist sent me a long list of questions and I replied with a long list of comments.  Only one bit was quoted directly: “This is a very difficult thing to do”.

New ideas in antenna technology are difficult to get right the first time.  It’s normal for a new type of antenna to be tried out on a low-volume product before it is used on a high-volume product.  Apple decided to skip that step and implement a new type of design on a high-volume product, which was a very brave decision.
The conventional type of internal antenna for cellphones is the Planar Inverted-F Antenna – the PIFA.  The PIFA is normally placed at the top or the bottom of the handset.  It is a flat piece of conductor cut into a shape, placed above the handset’s PCB, which provides a groundplane.  It has a feed pin which connects the RF feed to the PIFA and a ground pin that connects the PIFA to ground.
For a PIFA the distance between the groundplane and the antenna strongly affects the impedance bandwidth.  The greater the distance, the greater the bandwidth that can be achieved.  In a handset that distance is directly related to the thickness of the phone.  That is the major problem with the PIFA: if more bands are needed then a thicker PIFA is needed, which in turn requires a thicker handset.  Modern cellphones must cover five bands to work internationally.  The effect this has on the profile of the handset can be clearly seen in the HTC Legend and the Samsung Galaxy S, both of which have a “power bulge” to accommodate the antenna.
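To give a feel for that trade-off, here is a minimal sketch (the scaling constant is an illustrative assumption, not measured data): a common rule of thumb is that a PIFA’s fractional impedance bandwidth grows roughly in proportion to its height above the groundplane, expressed as a fraction of the wavelength.

```python
# Rough illustration of the PIFA height/bandwidth trade-off.
# The scaling constant k is illustrative, not a measured value.

C = 3e8  # speed of light, m/s

def pifa_fractional_bandwidth(height_m, freq_hz, k=4.0):
    """Crude estimate: BW/f0 ~ k * (height / wavelength).
    k is an assumed fitting constant; real values depend on the
    feed design, groundplane size and matching network."""
    wavelength = C / freq_hz
    return k * height_m / wavelength

for height_mm in (3, 6, 9):
    bw = pifa_fractional_bandwidth(height_mm * 1e-3, 900e6)
    print(f"height {height_mm} mm at 900 MHz -> ~{bw:.1%} fractional bandwidth")

# Doubling the height roughly doubles the bandwidth -- which is why
# covering five bands with a conventional PIFA pushes designers
# towards thicker handsets (or a "power bulge").
```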
For Apple, though, aesthetics and good product design are very important.  I think it was this that led them to adopt a radically different antenna technology.  They wanted to make a thin five-band handset and they got a group of antenna engineers to research the problem until they found an answer.  A search through Apple’s patents shows that they’ve been working on this research program for several years.
Apple haven’t revealed exactly what sort of antenna they’ve used in the iPhone 4.  They have said that the outer frame around the edge of the handset forms the antenna, but they haven’t said exactly how this was done.  However, their patent applications are revealing.  In this post, I explain how two of Apple’s patent applications suggest that the antenna is a form of slot antenna.
As soon as the iPhone 4 was released rumours on the internet began about poor performance.  Only a couple of days after it was released there were dozens of videos on YouTube showing the problem.  Most of the videos showed something similar, when a hand is put around the phone the signal strength bars fall dramatically.
This ignited a lot of debate about the relevance of signal strength bars.  Some suggested that the problem was caused by the algorithm displaying the bars and didn’t represent a real issue.  (Simon and I were asked to comment in two further articles about that on the WSJ website.)  While some were blaming software, people who had iPhone 4s were doing experiments, and videos were recorded showing calls dropping.  Then the tech website Anandtech hacked into the software to enable a handset to report the signal strength in dBm.  This showed that the fall in signal strength was large: around 20 dB.
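To put that number in context (my arithmetic, with an illustrative starting level rather than a measured one): decibels are logarithmic, so a 20 dB drop is a hundredfold reduction in received power.

```python
# What a ~20 dB drop in received signal means in linear terms.
def db_to_linear(db):
    return 10 ** (db / 10)

drop_db = 20
print(f"{drop_db} dB drop = power reduced by a factor of {db_to_linear(drop_db):.0f}")

# Illustrative (assumed) levels: a phone receiving -85 dBm falls to
# -105 dBm, close to the sensitivity floor of many cellular
# receivers -- hence the dropped calls.
before_dbm = -85
after_dbm = before_dbm - drop_db
print(f"{before_dbm} dBm -> {after_dbm} dBm")
```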
There are several possible reasons for this; even now it’s hard to be sure.
Many people have identified that touching the slot in the bottom left side of the handset causes the degradation.  That could have two underlying causes.  The antenna could be detuned by the dielectric material in the hand causing the electric fields to change.  Or it could be that RF current is being conducted from the antenna into the hand, where it is dissipated by the hand’s resistance.  It could also be a combination of both of these effects.  Without careful measurement it’s difficult to tell precisely.
A problem of this sort can only be completely solved by redesigning the antenna.  An antenna redesign can’t be done quickly or easily.  Apple decided to give iPhone buyers a free “bumper” case instead.  That case provides a spacing between the user’s hand and the sensitive parts of the antenna; it prevents conduction into the hand and reduces detuning.  Some users have said that the problem still occurs with the bumper on, though it’s lessened.
This is a high-profile example of a common sort of problem in handset design.  Though few handsets use radical new antenna designs, many suffer from performance problems in particular usage scenarios.  For example, a sliding handset must work well with the sliding portion extended and retracted.  In both of those states it must work well in empty space, close to the head, and in the hand.  I’ve worked on many of these handset antenna problems before.  If they’re only found late in the project then they can be very difficult to solve.  Some projects over-run because of these problems, some switch the antenna type or antenna provider, and some have to be cancelled.
Hopefully the iPhone 4 antenna saga will encourage people to take antenna issues more seriously.  Perhaps in the future users will be slower to blame poor performance on the network operator and consider the possibility that their handset could be the problem.

More haste, less speed needed for mobile data networks? Making the best of net neutrality.

Broadband connections, whether on fixed or mobile networks, tend to get sold to customers on the basis of their headline (or ‘up to’) data speed or on the size of the monthly data allowance. Aside from the obvious additional issue of the price, that’s often about all that seems to differentiate broadband offers. Regarding the headline speed, service providers are quite reasonably coming under increasing scrutiny because those speeds are rarely available to customers. On average, Ofcom estimates that UK download speeds are only 45% of the advertised ‘up to’ speed.

For mobile networks it’s even harder to make specific claims about differentiation between offerings from different networks. As well as experiencing the same contention issues as fixed networks, which make it hard to guarantee service when there are other heavy users in the same area, mobile connections are also subject to interference from surrounding cells, signal levels which can vary by factors of thousands over distances of a few centimetres, and even users daring to hold their phones.
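Those huge signal swings are ordinary multipath fading. As a minimal illustration (a textbook two-path model with assumed geometry, not measurements from any real network), summing just two equal-strength copies of a 900 MHz signal produces a deep fade within a few centimetres of movement:

```python
# Two-path fading illustration: the received signal is the sum of a
# direct path and a reflected path; moving the handset shifts their
# relative phase and the sum can drop into a deep null.
import math

FREQ_HZ = 900e6                  # illustrative carrier frequency
WAVELENGTH = 3e8 / FREQ_HZ       # about 33 cm at 900 MHz
PATH_DELTA_M = 0.40              # assumed extra length of the reflected path

for pos_cm in range(0, 11):
    d = pos_cm / 100.0
    # Moving towards the source and away from the reflector changes
    # the path-length difference by 2*d.
    phase = 2 * math.pi * (PATH_DELTA_M + 2 * d) / WAVELENGTH
    amp = abs(1 + complex(math.cos(phase), math.sin(phase))) / 2
    level_db = 20 * math.log10(amp) if amp > 1e-9 else -120.0
    print(f"moved {pos_cm:2d} cm: relative level {level_db:7.1f} dB")

# With exactly equal path strengths the null is unrealistically deep;
# real fades are more typically 20-40 dB, which still matches the
# "factors of thousands" variation described above.
```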

Meanwhile, the “net neutrality” debate is raging around the world about the extent to which it is legitimate to manage traffic in order to avoid a small number of heavy users massively degrading the service for everyone else. Verizon and Google in the US have come up with a joint policy proposal, which argues that mobile networks do need such management. However, the FCC appears underwhelmed by the proposal. The European Commission is also consulting on the issue, while Ofcom in the UK is joining the debate.

One argument (and broadly the situation which applies in Europe today) holds that no regulation is necessary, and traffic management should be permissible, because consumers will vote with their feet against excessive restrictions. This relies on the broadband market being sufficiently competitive that consumers can easily switch providers and have other providers with different policies to switch to. The counterargument is that this is all too hard for consumers and the internet should be free and open without discrimination to any service.

An important approach to resolving this is for regulation to focus on transparency, so that consumers are told in clear terms when traffic management is applied. This may also require some definition of the types of traffic management which are permissible.

The quantity of mobile spectrum, the viable size of network and the spectrum efficiency are all finite and are bound to lead to capacity limitations from place to place and from time to time. It then seems to me that traffic management is essential to protect consumers, ensuring they get a service which is useful despite the excesses of others in the network. It’s a bit like a trip to the pub: one loud and obnoxious drinker can spoil the enjoyment for all, so I have no objection to the publican having powers to eject those who breach accepted norms. I would however object if only one type of drink was available.

Further, the application of appropriate traffic management processes may have another positive side effect for consumers. Systems like LTE are capable of assigning very different quality-of-service levels to different users, even though they share the same spectrum. So one user could get a best-effort web browsing service, contending with other users on the same service for the resources allocated to it, while others can gain access to a guaranteed bit-rate service which ensures (within reason) that they’ll always get a solid service with a given level of quality.
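A toy sketch of the principle (a deliberate simplification, not the actual LTE MAC scheduler; all names and numbers below are made up): serve guaranteed-bit-rate users up to their guarantees first, then share whatever capacity remains among the best-effort users.

```python
# Toy illustration of guaranteed-bit-rate (GBR) vs best-effort
# scheduling over a shared capacity pool. This is a simplification,
# not the real LTE scheduling algorithm.

CELL_CAPACITY_MBPS = 50.0                 # assumed available capacity
gbr_users = {"alice": 5.0, "bob": 8.0}    # assumed guaranteed rates, Mbit/s
best_effort_users = ["carol", "dave", "erin"]

def allocate(capacity, gbr, best_effort):
    rates = {}
    # Serve guaranteed-bit-rate users first, up to their guarantee.
    for user, guarantee in gbr.items():
        rates[user] = min(guarantee, capacity)
        capacity -= rates[user]
    # Split whatever is left equally among the best-effort users.
    share = capacity / len(best_effort) if best_effort else 0.0
    for user in best_effort:
        rates[user] = share
    return rates

for user, rate in allocate(CELL_CAPACITY_MBPS, gbr_users, best_effort_users).items():
    print(f"{user}: {rate:.1f} Mbit/s")

# As the cell loads up, only the best-effort shares shrink; the GBR
# users keep their guaranteed rates (within reason), which is the
# "solid service with a given level of quality" described above.
```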

This feature of LTE is much less talked about than LTE’s speed, but is potentially far more significant. I hope it leads to a wider range of choices for consumers and ultimately a better service. Here’s an example from the fixed world, where the service is sold as much on its latency as its speed.

So less focus on speed and more on other service quality dimensions could be good for mobile operators and consumers alike, while supporting the need for transparency under any future net neutrality regulations.

Mobile factoids from Ofcom Communications Market Report



Ofcom has just published its seventh Communications Market Report. As usual this contains a wealth of information on the UK communications scene. I’ve picked out a few factoids which provide an insight into the mobile market. The figure numbers after each are those in the original report, to help those who wish to delve more deeply.

  • 37% say mobile broadband coverage is poor in the areas they want to use it (fig 5.18)
  • It looks like mobile voice minutes will surpass fixed voice minutes this year (fig 5.1)
  • 13.5m people are now using the internet on mobile phones (fig 5.18)
  • Smartphone penetration is now 26.5% / 12.8m (fig 5.19) (that’s low compared with some places: Italy’s at least twice that)
  • iPhone share of traffic volume is declining steeply – tilting towards Android. (fig 5.21)
  • Voice is still growing: 6.7% growth in 2009, for a 13% CAGR over 5 years (fig 5.32, 5.43)
  • Contract customers are growing (10.3% 5-year CAGR) far more steeply than pre-pay (3.5% CAGR) (fig 5.47)
  • 32% of mobile subscriptions are now 3G (fig 5.55)
  • Household expenditure on mobile services is continuing to decline: from £33.23/month in 2004 to £30.66/month in 2009 (at 2009 prices) (fig 5.64)
  • 45.6% of mobile internet access is at home, and 17.8% at work (fig 5.88)

(CAGR = compound annual growth rate)
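For anyone who wants to sanity-check growth figures like the 13% above, CAGR is straightforward to compute (the start and end values below are illustrative, not taken from the report):

```python
# Compound annual growth rate: the constant yearly growth rate that
# takes a start value to an end value over n years.
def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

# Illustrative example: a quantity growing from 100 to 184 over five
# years corresponds to roughly the 13% CAGR quoted above.
print(f"CAGR = {cagr(100, 184, 5):.1%}")
```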

Also, there has been much talk about the future disparity between the fast growth in mobile data volumes and the more moderate growth in the associated revenues (“exponential traffic, linear revenue”). However, most of this has been forward-looking prediction, and it’s interesting to see some real historical data supporting the contention that mobile operators need to find ways to optimise delivery costs in order to profitably sustain this growth:

(Figure 5.6)

Ofcom appoints Real Wireless to study “4G Capacity Enhancements”



Real Wireless has been appointed by Ofcom to conduct a study into “4G Capacity Enhancements”.

The study is to investigate the potential for IMT-Advanced technologies, such as LTE-Advanced and WiMAX 2, to provide gains in mobile network capacity to accommodate long-term future growth in mobile broadband traffic demands.

The emphasis is on the real-world performance of the relevant technologies via testbed and trial results rather than on purely theoretical analysis.

Smart thinking needed for smart metering communications network

Smart metering is part of a broader package of measures to make consumers more aware of their energy usage and, in the process, hopefully help them to reduce it. It’s a simple concept: give people historical and real-time data on their usage and what it is costing them, and they will try to reduce their consumption. Plus it means the end of the meter reader and estimated bills, along with the introduction of real-time tariffs. Include electricity, gas and water and it’s an unstoppable idea – common sense really.
It’s going to cover nearly 50m homes and business locations in the UK. Think natural gas conversion, add the transition to digital TV and include Electronic Patient Records, and you’ve got some idea of the scale and complexity of this project.
So let’s just focus on one small but crucial part of this – how to provide two-way communication to electricity smart meters located in every household and business premises across the UK. Why two-way? Well, aside from any error correction requirements, power companies want to be able to use variable tariffs dependent on demand – so the price per unit at 07:30 might be higher than at, say, 11:00 that same day. A sort of real-time version of ‘off peak’. This requires updating the meters perhaps several times a day, depending on the way the smart meters work.
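To get a feel for the traffic involved, here is a back-of-envelope estimate (every message size and frequency below is an assumption for illustration; no published requirement specifies them):

```python
# Back-of-envelope daily traffic per smart meter. All message sizes
# and frequencies are assumptions for illustration only.
TARIFF_UPDATE_BYTES = 200      # assumed size of one tariff message
READING_BYTES = 100            # assumed size of one usage reading
UPDATES_PER_DAY = 4
READINGS_PER_DAY = 48          # e.g. one reading per half hour
PROTOCOL_OVERHEAD = 2.0        # assumed factor for headers, security, acks

daily_bytes = PROTOCOL_OVERHEAD * (
    UPDATES_PER_DAY * TARIFF_UPDATE_BYTES
    + READINGS_PER_DAY * READING_BYTES
)
print(f"~{daily_bytes / 1024:.1f} KB per meter per day")
print(f"~{daily_bytes * 50e6 / 1e12:.2f} TB per day across ~50m meters")
```

On these assumptions that comes to around 10 KB per meter per day – tiny by broadband standards, which suggests the hard problems are reach and reliability rather than raw capacity.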
There are perhaps three basic paths to achieving this connectivity on such a scale:

  • Wireless
  • Broadband via the telephone line or cable service
  • Power line communications.

Using domestic or business broadband can be problematic because it isn’t a controlled connection – for example, if you use one of those smart plug adaptors, your router might be turned off when you turn off the PC, in which case the smart meter would disappear from the network! Power line communications looks more promising, but something on this scale hasn’t really been done before and there seem to be many issues.
So perhaps wireless is the most suitable solution – but what sort of wireless? Should we use a current network – cellular, satellite or terrestrial broadcast infrastructure – or build a new network dedicated to smart metering? This is the big question, but not unexpectedly it gets hard to separate fact from sales pitch. And nobody has designed, built and operated such a network on this scale before – so we’re heading into uncharted territory.
Currently the user requirements for such a network have not been published, as the formal procurement process has yet to start, but we can guess the following based on the fact that this is basically part of a billing system for what is considered part of our critical national infrastructure:
1. High availability with a reliable connection to all premises right at the existing meter location
2. Secure – to protect from fraud and cyber attack
3. Simple to install – essentially the meter will be installed by an electrician and the comms will just be expected to work
4. Reliable and self-powered for the lifetime of the network – 20-25 years
5. Simple to manage and fault-find over the network, with easy first-line maintenance
The smart meter will probably incorporate the wireless unit, including the antenna, and this unit will be installed by an electrician in the same location as the current meter. In older houses and buildings these are typically in cellars or under the stairs – probably the last location you would choose to ensure good wireless connectivity!
Of the existing networks, cellular is probably the most suitable – after all, this is just M2M on a grand scale, an area that the network operators are keen to grow. Satellite probably fails the common sense test due to the location of the meters – installing a satellite dish on properties is not part of the plan! The utilities already run SCADA networks to control their switches and valves and to monitor flows – but these are slow-speed systems designed for low numbers of terminals, so they wouldn’t scale up.
The key questions for cellular are capacity, coverage and resilience. Like most M2M services, the data volume per meter per day is probably quite low – especially compared to smartphone usage, which is currently the holy grail of service providers. So even in dense urban environments capacity shouldn’t be a problem.
Coverage, however, is likely to be challenging – indoor coverage is often a problem even for phone use, hence the reason femtocells were developed. Coverage under the stairs or in a cellar just makes the problem worse. So the coverage question is somewhat unquantified at the moment, but our day-to-day experience suggests there are issues that need to be addressed here.
Resilience of the service – no doubt set against tough service-level obligations under various failure scenarios – will also present significant challenges to a public cellular network operator. While cellular infrastructure is generally very reliable, smart metering presents a different environment and unique challenges. Much work and risk analysis will need to be undertaken against the specific service-level requirements of the smart metering network for a cellular operator to determine whether they would be prepared to contract to such obligations and stand behind the financial consequences of failing to meet them.
Then, just to add to the interest, there’s the question of the transition to 4G and the possible ceasing of 2G, 2.5G and 3G services within the lifetime of the smart meter network. A technology refresh at the premises level would be something to try to avoid.
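On the coverage point, a crude link-margin sketch shows the problem (every figure below is an assumption for illustration, not a measured value):

```python
# Illustrative link-margin check for a meter under the stairs or in
# a cellar. All numbers are assumptions for illustration only.
OUTDOOR_RX_DBM = -85            # assumed signal level at the street
RECEIVER_SENSITIVITY_DBM = -110
BUILDING_PENETRATION_DB = 15    # assumed loss through the outer walls
CELLAR_EXTRA_DB = 15            # assumed extra loss below ground level

indoor_level = OUTDOOR_RX_DBM - BUILDING_PENETRATION_DB - CELLAR_EXTRA_DB
margin = indoor_level - RECEIVER_SENSITIVITY_DBM
print(f"level at meter: {indoor_level} dBm, margin: {margin} dB")

# A site that looks comfortable outdoors (-85 dBm) can end up below
# receiver sensitivity at the meter position -- which is why the
# coverage question needs real measurements next to existing meters.
```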
So a lot of questions to work through and no doubt some unique solutions will be required to enable a cellular network operator to provide a nationwide service suitable for smart metering.
Looking at a new network build: while you are starting from a green field and so will end up with the optimal network, the capital and operational costs start to be a major concern, as do the availability and cost of suitable spectrum. Plus there is the general hassle and risk associated with building a new national network. Given the number of shared antenna sites available, it is unlikely that many new sites will need to be constructed, but antennas, base stations, IP networking equipment and backhaul will need to be installed at several thousand sites.
Ideal spectrum is probably around the UHF range – balancing range against in-building coverage – so digital dividend spectrum would suffice. Again, trials will need to be done, and a much larger series of measurements undertaken right next to existing meters, to assess just how many base stations will be required to achieve the necessary blanket coverage.
Then there is the question of which wireless technology should be used – a proprietary technology, WiMAX, Wi-Fi, mesh or more traditional UHF mobile radio – or perhaps a combination. In many ways this decision is tied to the spectrum to be used and to the architecture and signalling protocols used by the smart meter system itself. Currently most smart meter vendors appear to use proprietary signalling schemes, bringing the potential for customer ‘lock-in’. For a project of this size, multiple suppliers in open competition for the smart meters themselves will deliver cost benefits as well as minimising the risks of delivering the sheer quantities needed. So perhaps some open standards are called for, to allow the smart meter market to develop rapidly without concerns about the technology choice.
What is clear is that the choice of spectrum, the type of wireless technology, the network architecture and the over-the-air signalling scheme need to be closely tied together, modelled and field-tested to ensure that the system is scalable and robust in the real world.
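As a flavour of the modelling involved (a crude sketch: the link budget, antenna heights and frequency are all assumptions, and a real plan would use measurements and proper clutter data), an empirical path-loss model such as Okumura-Hata gives a rough cell range and hence a rough national site count:

```python
# Crude coverage sketch using the Okumura-Hata urban path-loss model.
# All link-budget numbers are assumptions for illustration only.
import math

def hata_urban_loss_db(f_mhz, h_base_m, h_mobile_m, d_km):
    """Okumura-Hata median urban path loss (valid ~150-1500 MHz,
    base height 30-200 m, distance 1-20 km)."""
    a_hm = ((1.1 * math.log10(f_mhz) - 0.7) * h_mobile_m
            - (1.56 * math.log10(f_mhz) - 0.8))
    return (69.55 + 26.16 * math.log10(f_mhz)
            - 13.82 * math.log10(h_base_m) - a_hm
            + (44.9 - 6.55 * math.log10(h_base_m)) * math.log10(d_km))

MAX_PATH_LOSS_DB = 140.0   # assumed link budget incl. indoor margin
F_MHZ, H_BASE, H_METER = 450.0, 30.0, 1.5

# Step the distance out until the model hits the allowed path loss.
d = 0.1
while hata_urban_loss_db(F_MHZ, H_BASE, H_METER, d) < MAX_PATH_LOSS_DB:
    d += 0.1

cell_area_km2 = 2.6 * d ** 2   # hexagonal cell approximation
uk_area_km2 = 244_000
print(f"range ~{d:.1f} km, ~{uk_area_km2 / cell_area_km2:,.0f} sites for blanket coverage")
```

Under these assumptions the answer comes out in the low thousands of sites, consistent with the “several thousand sites” suggested above – but the result is very sensitive to the assumed indoor margin, which is exactly why field measurements at real meter positions matter.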
Whether this new dedicated network build approach turns out to be a lower cost solution – in terms of total cost of ownership over the lifetime of the network – compared to using a public cellular network is a tough call and both options need to be looked at in depth using real costs and well founded engineering assumptions.
Either way it’s a great opportunity for many businesses, and consortia are already being formed to bid as you read this. The Government is pushing to accelerate the rollout of smart metering in order to deliver the benefits earlier – indeed, in the few days since this blog was written, fifteen new documents – amounting to several hundred pages – providing further detail on the project have been released by the Department of Energy & Climate Change and are available here. The procurement process is planned to formally start in the autumn.
The smart metering project is hugely complex and the communications network is just one small but equally challenging piece with many questions currently unanswered. We all know that large projects such as this typically face huge cost increases and delays downstream as forecasts, optimism and expectations give way to Murphy’s Law, changes in the requirements and the gritty reality of the implementation team. If only 1% of the installed meters fail to appear on the network or have unreliable connectivity that could be almost 0.5 million locations that have to be visited and fixed. That’s going to be expensive!
Oh and we haven’t even talked about how the electricity meter is going to communicate with the gas meter and water meter.
– John Okas