
Growing market confidence in IoT through benchmarking

Posted 24 Sep 2020

In its early days, the internet of things (IoT) saw a great deal of hype, with obvious benefits for broad markets and applications. With more recently emerging technologies such as 5G and artificial intelligence in the spotlight, IoT appears to have lost both media and industry attention. The underlying benefits and impact potential of IoT, however, remain unchanged. In fact, IoT has quietly “crossed the chasm” and is now on the path to mainstream adoption.

Many organisations, both large and small, have engaged in some form of IoT pilot activity in recent years, while others are further ahead, scaling solutions across the organisation. There is new-found confidence in IoT in mainstream markets: the technology is becoming part of the everyday IT and OT infrastructure agendas of today’s businesses, organisations better understand the opportunities it offers for business growth, and there is a desire from the top to scale IoT solutions.

With more than just early adopters ready to embrace IoT technology, will the IoT adoption journey become an easy one for all others to follow?

Digital Catapult’s own experience shows that finding the right IoT technology solution for a business challenge remains a major issue for many organisations. Although some early adopters are becoming more educated about the technologies and their potential use cases, more needs to be done to help them navigate the IoT solutions and supplier ecosystem.

Several factors contribute to this situation.

The IoT hype of the past decade has attracted a large number of businesses bringing competing solutions to market. While competition is great for the buyer once a market is mature, it poses significant difficulties for early adopters because of the diversity of proprietary and standards-based solutions and the very different levels of quality in the market.

In the absence of extensive market experience, buying decisions are based on promises made by suppliers rather than on substantive technology assessments and market evaluations. The wide choice and supplier noise can confuse adopters, and a wrong choice can quickly put them off exploring IoT technologies for some time. This in turn slows the adoption of IoT technologies, even when suitable ones are available.

Why benchmarking is important

Making the right technology choice is particularly challenging for emerging technologies, where the market is still immature and there is little usage evidence, track record or prior learning from which to make informed procurement decisions. A wrong choice can quickly lead to an unsatisfactory pilot experience, causing the expected business case to fall apart and jeopardising the investment made. For example, the labour costs of maintaining an IoT device, such as manual intervention for battery replacement or a device reset, can exceed the cost of the device or solution itself. If the initial business case assumed a device lifetime of three years without maintenance interventions, a shorter-than-expected battery life or buggy device firmware can quickly nullify any anticipated savings.
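The battery-life effect on a business case can be sketched with some simple arithmetic. All figures below (device cost, annual saving, swap cost) are made-up assumptions for illustration, not real deployment numbers:

```python
import math

# Hypothetical figures for illustration only.
DEVICE_COST = 30.0        # purchase cost per device (GBP)
ANNUAL_SAVING = 25.0      # expected saving per device per year (GBP)
BATTERY_SWAP_COST = 40.0  # labour cost of one manual battery replacement (GBP)

def net_benefit(horizon_years: float, battery_life_years: float) -> float:
    """Net benefit per device over the horizon, counting battery swaps."""
    # A swap is needed each time the battery runs out before the horizon ends.
    swaps = math.ceil(horizon_years / battery_life_years) - 1
    return horizon_years * ANNUAL_SAVING - DEVICE_COST - swaps * BATTERY_SWAP_COST

# As planned: three-year battery, no swaps, the case is positive.
print(net_benefit(3, 3))  # 45.0
# Battery lasts only one year: two swaps wipe out the savings.
print(net_benefit(3, 1))  # -35.0
```

With these illustrative numbers, a battery that lasts one year instead of three turns a £45 benefit per device into a £35 loss, which is exactly the kind of gap that rigorous testing before rollout would expose.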

In fact, Digital Catapult’s own tests and experience have shown that data sheets from different IoT product vendors in many cases do not reflect reality: parameters such as performance and battery life are often overestimated, giving rise to a false sense of confidence in the product’s capabilities. For example, we examined 10 different LoRaWAN asset trackers and were highly surprised by the findings, which we will share in an upcoming blog.

In the best cases, an expectation mismatch is discovered after ordering a few samples and performing rigorous tests in the envisioned pilot environment, before major investments are made. However, many organisations lack these capabilities and skills in-house, or overlook the need for such testing.

Technology benchmarking is an activity that allows industry players to better understand the state of play of rapidly emerging solutions in the market. By establishing a comparison between “like for like” products and solutions and how they perform under specific use case scenarios, it provides valuable insights for both demand and supply side stakeholders.

For potential adopters, technology benchmarks provide useful guidance for making the right technology selection for business needs. Benchmarks allow a quick understanding of the performance and features that each technology product or solution offers and the trade-offs against technology requirements and business cases. This in turn allows the company to develop a more informed business case for a specific IoT solution and maximise the success of an IoT pilot deployment or larger scale rollout.

For product and solution providers, benchmarking provides an in-depth understanding of how their products and solutions compare to others in the market. This gives valuable feedback on potential performance issues and opportunities for improvement, and allows businesses to adjust their pricing to remain competitive or maximise profits. The awareness brought about by benchmarking thus provides the means to raise the bar on quality in an emerging technology market and to accelerate the maturity of the corresponding products and solutions.

Benchmarking in the ICT industry

Benchmarking is fairly well established in the ICT industry. For example, performance benchmarking is common practice for mainstream microprocessors. So-called CPU benchmarks allow easy comparison by judging performance against a standardised series of tests. These tests are usually based on different sets of workloads that execute on the CPU under test.

Workloads can be synthetic or based on real-world applications. Synthetic benchmarks are programs that simulate many different tasks: 3D rendering, file compression, web browsing, floating-point calculations, and so on. While they are not an exact predictor of performance on typical workloads, they are useful for comparing the relative performance of CPUs.
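The idea of a synthetic benchmark can be sketched in a few lines: run a fixed, artificial workload and time it. The loop below is a minimal illustration (a floating-point accumulation), not any standard benchmark suite:

```python
import math
import time

def fp_workload(n: int) -> float:
    """Synthetic task: accumulate floating-point results in a tight loop."""
    total = 0.0
    for i in range(1, n + 1):
        total += math.sqrt(i) * math.sin(i)
    return total

def run_benchmark(n: int = 200_000) -> float:
    """Return wall-clock seconds taken to complete the fixed workload."""
    start = time.perf_counter()
    fp_workload(n)
    return time.perf_counter() - start

print(f"synthetic workload completed in {run_benchmark():.3f}s")
```

Because the workload is identical on every machine, the elapsed time gives a relative score: a CPU that finishes in half the time scores twice as high, even though the task itself resembles no real application.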

In contrast, real-world benchmarks use representative applications that are actually used by end users, such as programs for file compression, 3D rendering, video encoding or games. By giving such a program a heavy workload and measuring the time to completion or the throughput, the benchmark provides a preview of system performance in a similar future setting.
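A real-world benchmark of this kind can be sketched by timing an actual task end users care about, here file compression with zlib, and reporting throughput. The payload is a hypothetical stand-in for representative data:

```python
import time
import zlib

def compression_throughput(payload: bytes, rounds: int = 20) -> float:
    """Compress the payload repeatedly; return throughput in MB/s."""
    start = time.perf_counter()
    for _ in range(rounds):
        zlib.compress(payload, 6)  # level 6 is zlib's default trade-off
    elapsed = time.perf_counter() - start
    return len(payload) * rounds / elapsed / 1e6

# ~1 MB of repetitive, sensor-like sample data (hypothetical payload).
payload = b"sensor-reading,42.0,ok\n" * 50_000
print(f"zlib throughput: {compression_throughput(payload):.1f} MB/s")
```

The resulting MB/s figure only predicts performance for workloads that resemble the chosen task, which is exactly the trade-off between real-world and synthetic benchmarks described above.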

Benchmarks such as the aforementioned CPU examples focus on performance at the component level; memory benchmarks are another example. Benchmark tests can, however, also examine the performance of an entire system. The latter is more complex to handle, as comparisons become more difficult with the increasing degrees of freedom of the test scenarios and the interdependence of different system components. Network performance tests are one example of a system-level benchmark.

Benchmarking in the IoT industry

Within the IoT industry, some component-level benchmarking activities have started to emerge. One example is the benchmarking of embedded microprocessors by the EEMBC community, which looks at both component- and system-level performance.

Other emerging benchmark initiatives such as MLPerf look at how machine learning workloads can run on smaller embedded microcontrollers. A working group called TinyMLPerf has recently been set up with the goal of allowing end users to understand how well different MCUs would perform in different edge processing scenarios.

An understanding of component performance is relevant to the IoT product designer, but there is little to guarantee that the final product is actually performant, even if suitable component choices have been made.

At the IoT product or device level, the only assurance mechanisms are certification tests, such as compliance checks against European regulations (for example, CE marking) or adherence to specific radio standard specifications. While these ensure that an IoT device will operate safely and conform to specific standards, they say little about the actual performance of an IoT device or system and the associated end user experience.

In the consumer IoT space, organisations such as Which? have started to feature product reviews for connected devices, such as those used in smart homes. However, the majority of IoT products and services will be part of enterprise B2B solutions, and there are no organisations that provide impartial guidance to help businesses navigate the emerging IoT solution space.

Why Digital Catapult is engaged in IoT benchmarking

Digital Catapult’s role is to accelerate the adoption of advanced digital technologies in the UK and the Internet of Things represents one of these. We do this by supporting startups and scaleups to leverage these technologies more effectively to build more competitive products that the market needs. We also help larger organisations and early adopters to navigate the complex and rapidly changing landscape of these technologies and emerging products by de-risking their technology experimentation and finding the right solutions that work for their business.

We believe that benchmarking of IoT technology will play a crucial role in accelerating market adoption. By creating a more educated community of potential buyers, benchmarking will enable adopters to make the right technology choice and minimise the number of failed IoT pilots and wrong technology investments. At the same time benchmarking will provide valuable feedback to supply side stakeholders, allowing improvements to existing IoT products, both in terms of quality and performance, to be more rapidly made.

Carrying out IoT benchmarking activities requires significant technology expertise, trusted and transparent processes and the right facilities. It also requires market independence and vendor neutrality to gain the right trust from the market.

As a not-for-profit organisation established and supported by the UK Government, with deep technology expertise, Digital Catapult is in a great position to take a leading role in providing IoT benchmarking services to the market. Our Future Networks Lab provides suitable infrastructure and facilities to carry out the technology assessments that underpin IoT benchmarking activities. Based on industry-leading methodologies and tools, our recognised experts are able to support a variety of IoT benchmarking campaigns, ranging from benchmarking IoT devices for specific applications to benchmarking end-to-end IoT solutions.

Businesses interested in learning more about our IoT benchmarking services can get in touch with Ramona Marfievici: [email protected].

Watch this space: we will be sharing more insightful IoT benchmarking outcomes in the near future.