Redmill Marketing Associates

Service Innovation – The B2B Perspective

B2B markets, by which we mean both classical B2B services and emerging M2M applications, remain the area in which operators are most likely to display innovation. While there are clear opportunities in traditional B2B customer segments, operator advantages there may be eroded more rapidly than many expect. M2M, however, has much more interesting potential for operator-led innovation, even as traditional stakeholder relationships are turned on their heads.

In our ongoing discussions on the subject of innovation, we recently turned our attention to services. Based on feedback from operators we have met this year, we considered and proposed the Digital Selfridges model for consumer service innovation, which highlighted the role of an operator in curating quality content and partnerships, rather than in direct, in-house service innovation per se (which, in any event, we considered highly unlikely). Now it’s time to turn our attention to the B2B perspective.

This has two distinct but related elements. First, classical B2B services, such as managed connectivity and hosted PBX, and their evolution towards a broader-based cloud ICT portfolio. Second, M2M and IoT applications. Customers for these may be the same, may overlap, or may indeed be very different. However, it is clear from all of the work in which we have been engaged and the research we have reviewed that it is in these two domains that operators have the best prospects for service revenue growth.

So far, so obvious, but it is striking how long it has taken for this to become more or less accepted. For years, there have been efforts to stimulate service innovation in the operator domain. Yet, while many of these have been laudable, they have largely produced unsatisfactory results, and we hold out no particular hope that there will be any meaningful consumer-focused innovation in telco land, with or without relevant partnerships.

Hence, our thoughts turn to B2B services. Here, the expectations for better service and higher-quality solutions, combined with a greater willingness to pay, mean that operators have several assets that they can deploy for business customers. However, our concern is that, while the opportunities are clear – indeed, there are many growth opportunities among enterprise customers – most operators fail to target them adequately, used as they are to long-term relationships with high-profile multinationals.

Moreover, the situation is made more complex by the fact that new entrants have also recognised the opportunities in B2B markets and are rapidly introducing attractive, highly focused products and services. Given the overall size of the business market, it’s astonishing how few operators take this seriously. Our primary concern here is that too many operators take larger enterprise customers for granted while ignoring SMEs. Increased competition for smaller companies, combined with attractive new cloud-based offers for larger organisations, could easily erode any advantages they have.

What they should do is think more seriously about segmentation within the market and consider what business customers actually need. Clue: it’s not just connectivity and QoS capabilities. They want better service, better attention, better voice products, more conferencing and collaboration capabilities, security, protection, backup, storage and more – from as few suppliers as possible. In short, it’s a combination of things that operators can easily provide (UC, guaranteed SLAs) with useful tools such as those delivered today by Dropbox et al.

There are a host of ways in which operators can innovate for B2B customers, but they should start by more careful consideration of their needs – which means paying attention to all segments in the market. Otherwise, the services that B2B users consume will shift towards alternative providers, leaving the operators with nothing other than connectivity to offer. In summary, there is a huge opportunity for operators but there is a limited time (three to five years, we estimate) in which to capitalise on this before their value starts to be eroded.

M2M is rather different. There has been a plethora of estimates of the sheer volume of connected devices that analysts expect by 2020, 2022 or some other future date. However, the only thing that matters about these estimates is that there will be a lot. Quite how many doesn’t really matter at this point; it’s sufficient that it’s a large and exciting number. What really matters is the role that operators have to play in this landscape.

And this is where things get interesting. First, in M2M and IoT applications, there is a significantly expanded range of stakeholders – ranging from cities, to healthcare providers, tourist agencies, motor manufacturers, and so on. In this context, operators are not necessarily the primary stakeholder; they are part of a diverse and expanded ecosystem. For operators used to bilateral relationships (network provider > customer), this requires a significant shift in thinking. The traditional balance of power has shifted.

Second, while many have pointed to the low demands and value of M2M applications, the situation is more complicated than this simplistic view suggests. Of course, many applications will indeed have minimal data transfer requirements. However, many will have highly variable requirements and will be volatile, insofar as they may shift from one state to another, each requiring a different level of service performance. Worse (or better, depending on perspective), we are only just beginning to explore the boundaries of M2M applications: we simply don’t know what use cases and demands will emerge over time.

All of which means that, while operators have an uncertain position in the ecosystem, they will no longer simply be supporting traditional services; they are likely to have to support an increasingly diverse range of digital capabilities. Network operators will either be the only players that can deliver such variable SLA capabilities or the only ones that can enable others to do so.

Thus, M2M represents both uncertainty and a huge opportunity for operators to assert a unique role, even while the driving force behind many applications may reside with other stakeholders. Of course, the situation is still emerging but too few operators have really grasped this opportunity – the key is that it’s not simply about commoditised services, it’s about a hugely variable range of services, many of which will require highly specialised capabilities.

In terms of service innovation, it is in these two areas that operators still have opportunities to make a difference. However, B2B services probably have a shorter – though still good – window in which to act: the advantages of mobility and network coverage will start to diminish soon, but there is both a significant opportunity and overlooked markets to target. M2M, on the other hand, offers unprecedented potential for the future, but operators must become accustomed both to different roles and to a wider range of capabilities that they are well placed to deliver.

Early success does not necessarily translate to long-term sustained success

Some consumers repeatedly buy products that fail – and seem to have a propensity to become early adopters of new products and solutions. Their early enthusiasm can be seriously misleading, suggesting that a product will be a hit when it will, in fact, fail dismally. But if big data can help identify such consumers, operators can learn to evolve offers and shift them towards more mainstream customers, increasing the chances of securing lasting success.

Operators that launch new services may observe early enthusiasm for their offer, but this may not be an indication of future success. In fact, it may signal the contrary – a dismal failure. This is the intriguing suggestion of new research from the renowned Wharton School at the University of Pennsylvania. In a paper with the appealingly gloomy title “Harbingers of Failure”, Anderson et al (2015)[1] suggest that some early adopters of new services may “systematically purchase new products that flop”. This finding, based on a lengthy and detailed study of retail purchases, has significant implications.

Operators need to launch new products and offers to remain relevant. Yet, their track record in doing so, particularly when compared to Internet players, is patchy at best. In part, the success of a few runaway Internet stars can mask the volume of failures, but this is also instructive, since the Internet world is characterised by a relatively low cost of failure. Many companies fail – some even fail repeatedly – but from the failures, world-conquering success can be found.

In this context, the ability to spot a successful product is clearly important – but if a service does achieve early success, the research points out that this does not necessarily mean it will “cross the chasm” and go on to attain star status. What does this mean for operators? Well, many operators are beginning to invest heavily in big data analysis, with the aim of discovering more about their customers’ preferences and tastes. They hope to use this data to spot emerging trends and launch more effective, successful offers.

However, it seems that there are some consumers who not only choose products that others do not but do so repeatedly. They are, as Anderson et al put it, harbingers of failure. Such consumers offer valuable lessons. They may, time after time, choose niche or unusual products that are doomed to failure. Worse, it appears that such consumers are characterised by their ability to discover such products before others – in the aggregate, this enthusiasm provides misleading results which are suggestive of future success but in fact turn out to be signs of failure.

Operators that are experimenting with big data analysis need to consider the implications of this research. What if such users could be identified? If that were the case, then providers could sieve data to see if the known harbingers are avoiding a particular product. If they are, then it might be indicative of a slow start but more likely future success. On the other hand, if the harbingers enthusiastically embrace the new product, then it could be a clear sign that something is wrong with it.
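As a rough illustration of how such a sieve might work, the sketch below flags customers whose purchase history is dominated by products that later failed, then measures what share of a new product’s early adopters are such “harbingers”. The function names, data shapes and thresholds here are our own illustrative assumptions, not taken from the Anderson et al paper:

```python
# Hypothetical sketch of the "sieve" described above. Thresholds and field
# names are illustrative assumptions, not from the research itself.

def find_harbingers(purchases, failed_products, min_purchases=3, failure_share=0.5):
    """Return the set of customers most of whose purchases later failed.

    purchases: iterable of (customer, product) pairs.
    failed_products: set of product ids known to have flopped.
    """
    by_customer = {}
    for customer, product in purchases:
        by_customer.setdefault(customer, []).append(product)
    harbingers = set()
    for customer, products in by_customer.items():
        if len(products) < min_purchases:
            continue  # too little history to judge this customer
        failed = sum(1 for p in products if p in failed_products)
        if failed / len(products) >= failure_share:
            harbingers.add(customer)
    return harbingers

def harbinger_share(early_adopters, harbingers):
    """Fraction of a product's early adopters who are known harbingers.

    A high share is a warning sign for the product; a low share suggests
    a slower but potentially healthier start.
    """
    if not early_adopters:
        return 0.0
    flagged = sum(1 for c in early_adopters if c in harbingers)
    return flagged / len(early_adopters)
```

Run against a real purchase history, a high `harbinger_share` for a newly launched offer would be the early-warning signal the research describes; the 50% threshold above is, again, only an assumption.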

Careful use of big data gives operators the potential to achieve this. If they can succeed in identifying the harbingers, this information can prove crucial – helping operators design more effective trial and release programmes, make changes and adjustments before it’s too late, and reduce the cost of failure. Interestingly, harbingers of failure could well be invaluable for increasing the chances of success.

[1] An article by the Undercover Economist in the FT first drew our attention to this interesting work.

Protecting your website from the dreaded scrapers…

What is ‘scraping’ and why should I care?

The lawless hinterlands of the internet are full of unscrupulous webmasters whose business model consists solely of stealing content from other sites and re-purposing it (sometimes in hilariously inept ways) for their own ends. These are the notorious ‘scrapers’ – websites that auto-generate content from other sources without permission.

While it might be flattering that someone thinks your content is good enough to steal, it’s not necessarily beneficial for your site’s search engine rankings. Google typically frowns on duplicated content, but it’s still surprisingly common to see a ‘scraper’ site outrank the original source domain for a specific content-related search.

Search engines are very discerning about duplicate content and rather than showing multiple results for the same content, they’ll instead try to ascertain which version is likely to be the original. The challenge here is to defeat the scrapers and ensure that it’s your (original) page that shows rather than the bogus alternative.

What can I do?

Fortunately there are several things you can do to ensure that your content is prioritised as the original source:

  • Set up a Google Alert – While this won’t protect you against scraping, it will at least alert you if any of your content is duplicated. Simply set up a Google Alert using a sentence from your newly published page. Scrapers are, by definition, lazy, so there’s a good chance that your content will be republished in its entirety. When this happens, you’ll be alerted.
  • Ping the major blogging services – The simplest way to let Google et al. know that you are the source of the content is to alert them to the content as soon as it’s posted. You can find instructions on how to set up these ping alerts at Google Blog Search, Technorati, Yahoo etc. It’s a simple process and one that should only take a few moments.
  • Use a pinging service – Alternatively (and even easier), services such as Ping-O-Matic take care of the process for you. Simply go to Ping-O-Matic every time your content is updated, enter a few details and the whole process is automated.
  • Finally – If scraping is becoming a real challenge and your content is regularly duplicated, there are a number of tools and services available that will actively block certain ‘bots’ from accessing your site. They’ll blacklist known scrapers and provide obstacles such as CAPTCHAs for the bots to navigate. If all else fails, services from companies such as ShieldSquare or Distil Networks will help you to identify and block rogue site visitors.
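The manual ping described above can also be scripted. As a minimal sketch using only Python’s standard library, the snippet below serialises a standard weblogUpdates.ping XML-RPC call; the blog name and URL are placeholders, and the resulting payload would then be POSTed to your chosen ping service’s published endpoint (omitted here to keep the sketch offline):

```python
# Sketch: build a weblogUpdates.ping XML-RPC request body, the protocol
# that blog-ping services accept. Blog name and URL are placeholders.
import xmlrpc.client

def build_ping_payload(blog_name, blog_url):
    """Serialise a weblogUpdates.ping call as an XML-RPC request body."""
    return xmlrpc.client.dumps((blog_name, blog_url), "weblogUpdates.ping")

payload = build_ping_payload("Example Blog", "https://example.com/")
# In practice, POST this payload (Content-Type: text/xml) to the ping
# service's endpoint, e.g. with urllib.request.
```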
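To make the last point concrete, here is a minimal sketch of the core technique such blocking tools apply: rejecting requests whose User-Agent matches a blacklist of known scraper bots. The bot names below are purely illustrative placeholders, and a real deployment would apply the check in web server configuration or application middleware rather than a standalone function:

```python
# Minimal illustration of User-Agent blacklisting. The entries in
# BLOCKED_AGENTS are illustrative placeholders, not a vetted list.

BLOCKED_AGENTS = ("badbot", "contentthief", "scrapercrawler")

def is_blocked(user_agent):
    """Return True if the request's User-Agent matches a blacklisted bot.

    Matching is case-insensitive and substring-based, since bots often
    embed their name inside a longer User-Agent string.
    """
    ua = (user_agent or "").lower()
    return any(bot in ua for bot in BLOCKED_AGENTS)

# A real deployment would respond to a blocked request with HTTP 403,
# or serve a CAPTCHA challenge, as the commercial services described
# above do.
```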

Scraper networks and domains are still a real issue for the legitimate webmaster and can have a serious impact on your site’s performance, undoing all the good work from your content marketing. Fortunately, with a little vigilance and some timely ‘pinging’ you can protect your site’s content from falling into the hands of the scrapers.