Each week David Souter comments on an important issue for APC members and others concerned about the Information Society. This week’s blog post is about the latest edition of the ITU’s ICT Development Index.
Each year, the International Telecommunication Union (ITU) publishes a report which it calls Measuring the Information Society. This updates its ICT Development Index (IDI), which tries to give a comprehensive picture of progress being made towards an Information Society.
This year’s report came out last week, with latest findings on the IDI and new research on ICT prices, mobile uptake and Internet user trends. This week, I’ll comment on the IDI and the long-term trends it shows since 2010 (a period covered in last year’s edition). Full disclosure: I worked with the ITU as a consultant on these two reports.
Why do we need indices?
We need to measure what is happening in the Information Society if we are to make the right policy choices and ensure inclusion.
Indices bring together different indicators to provide a more comprehensive picture of what’s happening than one indicator could do on its own. A good index will even out anomalies between these indicators. It will help us to compare countries with one another, and to understand trends over time. That’s important if policy and investment decisions are going to be based on evidence.
What’s in this Index?
The IDI’s been running now since 2008. It includes eleven indicators – five on access (fixed and mobile phone subscriptions, households with a computer and with Internet, and international bandwidth per Internet user), three on use (fixed and mobile broadband subscriptions, and the proportion of individuals using the Internet) and three on skills (enrolment in secondary and tertiary education, and mean years of schooling).
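As an illustration of how a composite index of this kind works, each indicator is typically normalised against a reference value, averaged within its sub-index, and the sub-indices are then weighted. The sketch below is hypothetical – the indicator values, reference points and 40/40/20 weighting are illustrative assumptions, not the ITU’s exact methodology:

```python
# Illustrative composite index: normalise indicators against reference
# values, average within sub-indices, then weight the sub-indices.
# All figures and weights below are hypothetical, not the ITU's own.

def normalise(value, reference):
    """Scale an indicator to 0-10 against a reference (ideal) value."""
    return min(value / reference, 1.0) * 10

def sub_index(indicators):
    """Average the normalised scores of one sub-index."""
    scores = [normalise(value, ref) for value, ref in indicators]
    return sum(scores) / len(scores)

def composite_index(access, use, skills):
    """Weight the three sub-indices (illustrative 40/40/20 split)."""
    return 0.4 * access + 0.4 * use + 0.2 * skills

# Hypothetical country data: (observed value, reference value) pairs.
access = sub_index([(120, 120),   # mobile subscriptions per 100 people
                    (15, 60),     # fixed subscriptions per 100 people
                    (40, 100)])   # households with Internet, %
use = sub_index([(30, 100),       # individuals using the Internet, %
                 (25, 100)])      # mobile broadband subs per 100 people
skills = sub_index([(8, 15),      # mean years of schooling
                    (70, 100)])   # secondary enrolment, %

idi = composite_index(access, use, skills)
print(round(idi, 2))  # a single 0-10 score for the country
```

One design point this makes visible: because every indicator is capped at its reference value, a country that saturates one indicator (for example, mobile subscriptions) can only improve its score through the others – which is why the influence of near-universal indicators on the Index declines over time.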
This year’s Index includes 175 countries and territories out of the 200 or so that might be included if data were available for all.
A lot of attention’s paid to those countries that come highest in the Index (this year, that’s the Republic of Korea (South Korea), Iceland and Denmark), and those that have made the most dynamic progress (this year, St Kitts and Nevis, Algeria, Bhutan and Myanmar). But the IDI’s most useful in comparing groups of countries and identifying long-term trends. That’s what I’ll summarise below.
How good is it?
No index is perfect. Building an index for ICTs is especially difficult, for two reasons.
First, it’s harder to measure use than access. Access can be measured from the supply side – with data from businesses and regulators. That’s much less true of usage, where household surveys are also needed – and they’re much less common in developing countries than developed ones. In many cases, estimates have to be made.
Second, changes in ICTs happen very fast, so benchmarks rapidly fall out of date. Broadband today, for example, means something different from what it meant five years ago. ICT indices need updating regularly if they’re to stay relevant.
The skills part of the Index is also problematic. Most countries don’t yet have good data on ICT skill levels, so the IDI has to rely on ‘proxy indicators’ – education indicators that suggest people’s ability to acquire ICT skills rather than whether they actually have them.
But every index has its challenges. The IDI’s as good as anything available at present if we want a comprehensive picture of progress towards the WSIS goal of ‘a people-centred, inclusive and development-oriented Information Society.’
What does it tell us about digital divides?
It tells us that digital divides are substantial and resilient.
First, the good news. Every country improved its index rating between 2010 and 2015, and almost every country did so this last year. That’s true for all but one of the access and usage indicators, too (see below). Where it didn’t happen, special factors were usually in play – for example tax or registration changes which led to reductions in the number of recorded mobile phone subscriptions.
But few countries have moved up or down the rankings by large margins in the last six years. The range of countries at the top and bottom of the rankings is similar today to what it was in 2010. Where there have been bigger changes for individual countries, special factors are again often at play. Myanmar, for example, one of the biggest movers in the last year, has seen an upsurge in mobile adoption following recent market liberalisation and political and economic change.
The gaps between high and low performing countries are truly substantial. The average IDI rating for developed countries in 2016 is 7.40 (out of a maximum of 10) compared with 4.07 for developing countries and 2.07 for least developed countries. (The UN category of developing countries, it should be noted, includes some OECD member-countries such as South Korea and Singapore: take these out and the gap between developed and developing would be greater.)
And gaps are large between and within regions. This map shows where countries fall if you divide them into four quartiles according to their Index rankings.
All but one of Europe’s countries fall within the top half of the rankings, while all of those in sub-Saharan Africa are in the lower half. Regions such as the Americas and the Arab States show large differences between their richer countries (OECD members, oil producers) and those with lower levels of development.
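Quartile groupings of the kind the map uses can be reproduced directly from the rankings. A minimal sketch, using hypothetical country names and scores (and assuming, for simplicity, that the number of countries divides evenly by four):

```python
# Divide countries into four quartiles by their index score.
# Country names and scores here are hypothetical.
scores = {"A": 8.8, "B": 7.4, "C": 6.1, "D": 5.0,
          "E": 4.2, "F": 3.3, "G": 2.4, "H": 1.6}

# Rank countries from highest to lowest score.
ranked = sorted(scores, key=scores.get, reverse=True)

# Split the ranking into four equal groups, top quartile first.
# (Real data with a count not divisible by 4 would need remainder handling.)
size = len(ranked) // 4
quartiles = [ranked[i * size:(i + 1) * size] for i in range(4)]
print(quartiles)
```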
The underlying reason for this is clear enough. There’s a strong relationship, explored in the report, between levels of economic development and ICT development. And there’s a strong relationship between least developed countries and the ‘least connected countries’ at the bottom of the Index. Thirty-six of the 44 lowest-ranking countries are also LDCs. While middle-income developing countries are keeping pace with those at the top of the Index, these LDCs seem to have fallen further back.
What does it tell us about long-term trends?
Some worldwide patterns of ICT access and usage are shown below.
The fastest growth between 2010 and 2015 was in mobile phone subscriptions. In the last year, for most countries, it’s been in mobile broadband. Now that mobile phones are so widespread, there’s little room for them to grow in many countries, so their influence on the Index has declined. In only one region last year – sub-Saharan Africa – was the growth of mobile phones more influential on IDI levels than the growth of mobile broadband.
At the same time, there’s been continuing decline in fixed phone subscriptions, in all regions, with very low fixed connectivity in sub-Saharan Africa. That feeds through into low levels of fixed broadband. It may have an impact on the quality and reliability of future connectivity: time will tell.
More rapid improvements are taking place in the usage part of the Index than in the part concerned with access. Partly that’s because of where broadband’s located in the Index. But it also reflects the growing emphasis in international policy on improving the quality of connectivity, the affordability of access, and the capability of end-users to take advantage of the Internet.
What does this all tell us?
Indices like the IDI are difficult to compile, because data are hard to get and sometimes unreliable, but they’re important because they show up long-term trends as well as differences between countries. This should suggest policy priorities.
It’s worth looking at policy approaches in those countries that are outperforming others with similar economic characteristics. Is there something in their policy choices that enables them to do so? (The ITU thinks there is, including competition and enabling environments for investment and innovation.)
It’s worth looking at the balance between different indicators in individual countries. Those countries that do better on usage than on access need to focus policy initiatives on infrastructure; those that do better on access than on usage need to pay more attention to the demand side (including affordability and skills).
Developing the Index
More reliable data are needed in many countries, but that depends on improvements in national statistical systems, which are under pressure from the large number of indicators required for measuring the Sustainable Development Goals (SDGs).
The IDI also needs regular revision to ensure that it stays relevant. Questions can be asked, for example, about whether fixed telephony should be retained when there is so much fixed-mobile substitution; about the need to move from indicators for mobile phone subscriptions to mobile phone ownership or use, which more accurately reflect what people do (see Chapter 5 in this year’s report); and about the best definition for broadband. The ITU’s considering all of these.
It would be better to have skills indicators that are concerned specifically with ICTs rather than proxy indicators that measure education. However, these will be hard to obtain for some years yet.
But the IDI has real value in focusing attention on long-term differences between countries and regions, and in identifying long-term trends. Worth a download and some time reflecting on where your country stands.
Next week’s post will look ahead to this year’s Internet Governance Forum, on the theme ‘enabling inclusive and sustainable growth’.
David Souter is a longstanding associate of APC, and has worked for more than twenty years on the relationship between ICTs and public policy, particularly development, environment, governance (including Internet governance) and rights. David writes a weekly blog for APC, looking at different aspects of the Information Society, development and rights. David’s blog takes a fresh look at many of the issues that concern APC and its members, with the aim of provoking discussion and debate. It comments on current topics and international meetings, draws attention to new reports and publications, critiques assumptions and suggests alternative perspectives. The views are his own, not APC’s. We hope that they will stimulate discussion, and that others will contribute their ideas in complementary blogs in future. More about David Souter. Follow him on Twitter.