As M2M systems become more pervasive, will vendors need to agree upon standards for device-to-device communication? What kind of standards, if any, are needed?
Steve Jennis, SVP Corporate Development, PrismTech
Let's first address the definition of M2M systems. A traditional M2M system involves point-to-point (north-south) connections over the Internet between a device and a Cloud service. This is adequate for limited applications like device management and basic data collection, and these systems predate the advent of the Internet of Things (IoT). Unfortunately, these systems create data silos that do not permit peer-to-peer (east-west) data connectivity between devices, or multi-tier system topologies (e.g. distributed computing concepts with edge, gateway, fog, or Cloud nodes) and thus do not allow the full potential of the IoT to be exploited (i.e. offer sub-optimal returns-on-investment).
Typically, these first-generation systems only address specific tactical applications and have their own embedded ways of enabling communications--standards-based or not. As such, they have only generated a relatively small market (compared to enterprise computing markets and the potential of the IoT) and are thus predominantly serviced by relatively small vendors.
Now let's look at the approach required to fully exploit the potential of the IoT. The infrastructure required is quite different from point-to-point first-generation M2M systems. It is being defined by industry giants such as Intel, Cisco, IBM, and Microsoft in collaboration forums such as the Industrial Internet Consortium (IIC) and the OpenFog Consortium.
This infrastructure is required to enable a digital enterprise by supporting ubiquitous, yet secure, data accessibility, system-wide interoperability, and composability. That is, it must provide both north-south and east-west data connectivity to support business value-add through applications at the edge (in devices), in gateways, in fog nodes (in appliances), or in the Cloud (as remote services). Many analysts (e.g. IDC) comment that soon 40% of IoT data will be "stored, processed, analyzed at or near the edge" and that 50% of IoT systems will be "network constrained."
Therefore, first-generation M2M systems (with their dependence on Cloud services and always-on Internet connectivity) will not be acceptable for many reasons (latency, bandwidth, security, reliability, robustness, recovery, etc.). This has resulted in industry efforts to support concepts like edge intelligence, fog computing, and distributed analytics, in addition to, and complementing, Cloud services. If first-generation M2M systems pioneered the use of the Internet and Cloud services to add value to device data, then these second-generation systems are those that will enable the Industrial IoT.
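The "network constrained" figures above imply a common edge-intelligence pattern: aggregate or filter raw sensor samples locally and send only summaries over the uplink. A minimal sketch in Python (the window size and sample values here are arbitrary, illustrative choices, not from any particular product):

```python
from statistics import mean

def summarize_at_edge(samples, window=10):
    """Reduce raw sensor samples to per-window summaries before uplink,
    cutting bandwidth on a constrained network.

    Illustrative only: a real edge node would also handle timestamps,
    thresholds, and store-and-forward on link loss.
    """
    summaries = []
    for i in range(0, len(samples), window):
        window_samples = samples[i:i + window]
        summaries.append({
            "mean": mean(window_samples),
            "max": max(window_samples),
            "count": len(window_samples),
        })
    return summaries

# Ten raw temperature readings become a single summary record.
raw = [20.1, 20.3, 19.8, 20.0, 35.2, 20.2, 20.1, 19.9, 20.0, 20.3]
print(summarize_at_edge(raw))
```

The point is not the arithmetic but the topology: the reduction happens at or near the device, so the Cloud sees one record instead of ten, and the system degrades gracefully when the uplink is slow or absent.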
Obviously there are many standards that already exist and are highly relevant to these second-generation systems. The IIC has published a Reference Architecture that identifies relevant standards, and the OpenFog Consortium similarly has working groups on open architectures (using open standards). Both of these organizations were founded and are led by some of the biggest names in IT, those with the influence to define the next-generation industrial digital infrastructure. My recommendations:
1. Look beyond traditional (first-gen) M2M systems. Integrating multiple north-south data silos later will be a nightmare.
2. Look to industry consortia such as the IIC and the OpenFog Consortium to help you understand concepts such as edge intelligence, distributed analytics, and fog computing.
3. New standards are rarely required. Applying existing proven and recommended standards (such as DDS and MQTT) within an open reference architecture will give you a timely solution with minimal risk.
4. Consider the reference architectures being collaboratively developed by the biggest names in IT, and beware of M2M vendors' data silos.
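The data-centric publish/subscribe model behind standards such as DDS and MQTT (point 3 above) can be illustrated with a small in-process sketch. This is plain Python with no real broker, wire protocol, or QoS; the topic name and payload are invented for illustration:

```python
from collections import defaultdict

class TopicBus:
    """Toy topic-based publish/subscribe bus, illustrating the data-centric
    model used by standards such as DDS and MQTT.

    In-process sketch only: real middleware adds discovery, QoS policies,
    security, and network transport.
    """
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, payload):
        # Every subscriber on the topic receives the sample: peers share
        # data directly (east-west), with no cloud round trip and no
        # per-vendor silo in the middle.
        for callback in self._subscribers[topic]:
            callback(payload)

bus = TopicBus()
readings = []
bus.subscribe("factory/line1/temperature", readings.append)  # edge analytics node
bus.subscribe("factory/line1/temperature", readings.append)  # gateway logger
bus.publish("factory/line1/temperature", {"celsius": 72.5})
```

Because publishers and subscribers agree only on the topic and data model, not on each other's identities, new nodes (edge, gateway, fog, or Cloud) can be composed into the system without rewiring existing ones, which is precisely what the point-to-point silos of first-generation M2M prevent.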
Dr. Edward Griffor, Associate Director, Smart Grid & Cyber Physical Systems Program Office, NIST
Yes and no. Yes, because the vision for M2M, or more generally the IoT, anticipates a high degree of interoperability, and standards are a traditional way of achieving that goal. Standardized interfaces would also, as in other domains, create opportunities for commercial products based on a broad potential usage for solutions.
These standards would enable uniformity at several levels: from a common data model, to common communications protocols and network management algorithms by domain, all the way to common 'parts' (i.e. common communications hardware and software).
No, because there remain, in the commercial space, multiple perspectives, participants, and goals that support divergence in device-to-device communications. Vendors producing customized variants have independent revenue streams associated with each variant.
This is the case even if much of the abstract data being communicated is largely common. Their customers still regard this communication layer as proprietary, as it closely reflects and supports their design. Standards may cut into some of those revenue streams; agreeing on standards would potentially render many of the vendors' products commodities.
Olivier Pauzet, Vice President, Marketing & Market Strategy, Sierra Wireless
Standardization, and the strong ecosystem support that results from these efforts, is needed for device-to-device communication to become pervasive.
Low Power Wide Area (LPWA) wireless networking technologies have taken center stage in Internet of Things (IoT) connectivity because of their promise to deliver low power, low speed, and low cost with very high network coverage. The technologies that are best positioned for LPWA longevity are the ones that will be standardized and will therefore be able to offer the greatest ecosystem support and interoperability. From this perspective, the 3GPP LTE-Machine Type Communications (LTE-M) LPWA technology stands out as a leader.
LTE-M is slated for commercial availability in the second half of 2017, with multi-national network deployments across many countries, and it is already proving itself in IoT trials. At Mobile World Congress in March 2016, Sierra Wireless demonstrated two devices communicating with each other using an LTE-M module, in which we implemented low power features for both transmission and reception, using Ericsson base station infrastructure. It was a world first to have both features in a cellular module.
Market timing and solution readiness are, of course, only two of the many forces at play. Cost, power, and coverage are all key factors when it comes to predicting which standards and technologies will have the greatest impact.
Because the complexities of all LPWA solutions are similar, costs are much more likely to be influenced by economies of scale. Standardized solutions with worldwide adoption that support a massive M2M ecosystem could make the technology much less expensive than other niche proprietary solutions.
Standardized solutions like LTE-M stand to fare much better over the long haul than proprietary ones, and they will best serve the industry and the IoT as a whole.
Stephanie Montgomery, Vice President of Technology & Standards, TIA
In any communication device design, it is ideal to have interoperability at the interface, connectivity, and protocol levels between manufacturers. In addition, using a common framework for programming and defining capabilities allows for greater innovation, as creative new uses are not hampered by the need to custom-engineer hardware and platform software for each solution.
Manufacturers, vendors, and users tend to come together in each industry vertical to create harmonized platforms that ease the end user's transition from one feature set to another. Of course, there are proprietary solutions that achieve broad deployment and in effect become 'de-facto' standards. However, as markets mature and competition increasingly overlaps, vendors of M2M systems may find distinct advantages in creating industry standards with which their products comply, ensuring compatibility and encouraging innovation.
Standards drive interoperability and avoid re-inventing the wheel for each product. The time-proven model for developing standards encourages open participation and engagement from experts in the industry. There are already some M2M standards in the market, from the TIA, ETSI, and oneM2M standards programs among others, which define the core platform and the base architecture. As the community moves deeper into the explosion of M2M, use cases are being developed with a look toward defining the recommended interfaces.