Changing the paradigm
According to forecasts, there will be close to 21 billion edge devices by 2020ii. As this projection becomes reality, the underlying networks transporting the data are likely to run into bandwidth congestion. In such an environment, dependence on the cloud to take action could result in lags and failures, seriously jeopardizing the operations of critical systems such as utilities, power plants, railway systems, mining operations and surveillance systems, which rely increasingly on IoT and cloud technologies.
The level of intelligence placed at the edge is restricted only by our reluctance to trust a system to take actions. A well-built system will take any action it is designed to take. Caution is imperative, but it should not hold us back from building increasingly autonomous systems.
The solution is twofold:
- Create storage and data filters at the edge. This reduces the volume of (real-time) data published to the backend, thereby easing network congestion
- Provide the edge with a degree of decision-making ability. The extent of intelligence at the edge can be increased and matured gradually. This way, solutions become self-reliant and decisions can be taken without depending on backend or cloud systems
Why this has not happened yet is not difficult to explain. The cloud has become a cost-effective and natural storage medium over time. Moreover, it is in the interest of cloud providers to encourage publishing data to the backend, since this ensures their systems are used for analytics and decision-making. But as the number of edge devices grows, the structure will be crippled by its own weight.
In the case of the first solution, creating data filters can improve the performance of edge devices significantly. Consider a sensor monitoring ambient temperature and publishing data every 30 seconds to the backend for analysis and action (for instance, if the temperature exceeds 30°C, switch on the fan). When the temperature rises, the backend triggers the appropriate action. In most situations, however, the temperature may not change for hours, even days, and there is no real reason for the edge device to publish data every 30 seconds. In such cases, the edge device can store the data locally and batch-publish it when convenient. The edge device must also be armed with a degree of intelligence: if the temperature varies, it can either send the data immediately to the remote backend for analysis or take the decision locally to produce instantaneous action.
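The store-and-batch behavior described above can be sketched in a few lines. This is a minimal illustration, not a reference implementation; the class name, thresholds and the in-memory stand-in for the backend link are all assumptions for the sake of the example:

```python
# Illustrative sketch of an edge-side data filter (all names and
# thresholds here are hypothetical, not from any specific product).
# Readings are buffered locally; only a significant change triggers
# an immediate publish, and the rest go out in batches.

THRESHOLD_C = 30.0   # act locally above this temperature
DELTA_C = 0.5        # publish immediately if change exceeds this
BATCH_SIZE = 10      # flush the local buffer at this size

class EdgeFilter:
    def __init__(self):
        self.buffer = []        # locally stored readings
        self.last_sent = None   # last value pushed to the backend
        self.published = []     # stands in for the backend link
        self.fan_on = False     # local actuator state

    def on_reading(self, temp_c):
        # Local decision: switching the fan needs no backend round trip.
        self.fan_on = temp_c > THRESHOLD_C
        if self.last_sent is None or abs(temp_c - self.last_sent) >= DELTA_C:
            # Significant change: publish immediately for remote analysis.
            self.published.append(temp_c)
            self.last_sent = temp_c
        else:
            # Otherwise store locally and batch-publish when convenient.
            self.buffer.append(temp_c)
            if len(self.buffer) >= BATCH_SIZE:
                self.published.extend(self.buffer)
                self.buffer.clear()
```

With steady readings, only the first value reaches the backend immediately; the rest accumulate locally, cutting network traffic while the fan decision is still taken on the device itself.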
The second solution is especially efficient and useful in situations where a lag in decision-making can lead to disaster – such as a gas leak in a mining operation or a potentially dangerous traffic situation for an autonomous car. Further, edge devices must be smart enough to separate valuable data from noise. An incorrect value propagating through the system may create false alerts and cause havoc. Multiple gates must be built into the system to avoid such occurrences. Techniques such as FFT (fast Fourier transform), ANR (active noise reduction), LQE (linear quadratic estimation), bandpass filters and nonlinear filters can be used, in addition to physical devices such as attenuators, to that effect.
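One of the simplest software gates of this kind is a nonlinear filter that rejects readings far from the rolling median of recent samples. The sketch below is a hedged illustration of that idea – the class name, window size and jump threshold are assumptions chosen for the example, not parameters from any of the techniques named above:

```python
# Illustrative noise gate: a single spurious value is dropped rather
# than propagated, so it cannot trigger a false alert downstream.
# Window size and max_jump are hypothetical tuning parameters.

from collections import deque
from statistics import median

class NoiseGate:
    def __init__(self, window=5, max_jump=5.0):
        self.history = deque(maxlen=window)  # recent accepted readings
        self.max_jump = max_jump             # tolerated deviation

    def accept(self, value):
        # With too little history, accept the reading and learn from it.
        if len(self.history) < self.history.maxlen:
            self.history.append(value)
            return True
        # Reject values far from the recent median (a likely glitch);
        # the bad value is NOT added to the history.
        if abs(value - median(self.history)) > self.max_jump:
            return False
        self.history.append(value)
        return True
```

Because the median is robust to outliers, one glitch does not distort the baseline, and normal readings continue to pass through the gate afterwards.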
Collaborative clusters – higher level of intelligence
As a first step, embedding intelligence in edge devices need not mean sophisticated systems. To begin with, intelligence can consist of simple, low-compute rules and algorithms, which come with the added advantage of low storage and low energy usage.
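To make the claim concrete, such low-compute intelligence can be as small as a list of condition–action pairs evaluated against each reading. Every name, field and threshold in this sketch is hypothetical; it only shows how little compute and storage a first layer of edge intelligence can require:

```python
# Illustrative rule table for an edge device: no model, no training,
# negligible storage and compute. All fields and thresholds are
# invented for this example.

RULES = [
    (lambda r: r["temp_c"] > 30.0, "fan_on"),
    (lambda r: r["temp_c"] < 5.0, "heater_on"),
    (lambda r: r["gas_ppm"] > 400, "raise_alarm"),
]

def evaluate(reading):
    """Return the actions triggered by one sensor reading."""
    return [action for cond, action in RULES if cond(reading)]
```

Rules expressed as data like this can later be replaced or augmented by more sophisticated logic, which is what allows the intelligence at the edge to mature gradually.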