By the end of the decade, according to a recent ABI Research study titled “Edge Analytics in IoT,” M2M and other IoT-enabled devices will likely have collected more than 1.6 zettabytes (1.6 trillion GB) of data. As a point of comparison, the entire internet is currently estimated to hold somewhere between 4 and 7 trillion GB of data.
“The data originating from connected products and processes follows a certain journey of magnitudes. The yearly volumes that are generated within endpoints are counted in yottabytes, but only a tiny fraction of this vast data mass is actually being captured for storage or further analysis,” said Aapo Markkanen, principal analyst at ABI. “And of the captured volume, on average over 90 percent is stored or processed locally without a cloud element, even though this ratio can vary greatly by application segment. So far, the locally dealt data has typically been largely inaccessible for analytics, but that is now starting to change.”
Part of the reason for this explosion is the nascent migration of data from traditional cloud computing toward edge computing. The shift is intended to make analysis easier and more meaningful by distributing analytic workloads closer to where data originates, and the data itself is becoming richer and more contextually actionable, according to the report.
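The idea behind the edge approach the report describes can be illustrated with a small sketch: an edge node aggregates raw sensor readings locally and forwards only a compact summary upstream, which is why the vast majority of captured IoT data never needs a round trip to the cloud. The function name, field names, and threshold below are hypothetical, chosen for illustration rather than taken from the report.

```python
# Hypothetical sketch of edge-side analytics: reduce a batch of raw
# readings to a small summary record. Only the summary (a handful of
# values) would be sent to the cloud; the raw batch stays on the device.
from statistics import mean

def summarize_readings(readings, threshold=75.0):
    """Summarize a batch of raw sensor readings locally.

    `threshold` flags readings worth reporting upstream as alerts;
    the value here is an illustrative placeholder.
    """
    alerts = [r for r in readings if r > threshold]
    return {
        "count": len(readings),          # size of the raw batch kept local
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "alerts": len(alerts),           # readings above the threshold
    }

# Example batch of raw readings collected at the edge.
batch = [62.1, 64.8, 63.5, 79.2, 61.9, 80.4]
summary = summarize_readings(batch)
print(summary)
```

In this sketch, six raw readings collapse into a four-field summary, a reduction that scales with batch size and mirrors the report's observation that only a tiny fraction of generated data is ever shipped off-device.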
The value of this prediction lies in preparation: continuing to build distributed fog networks that can parse data into meaningful intelligence to inform the larger IoT.
Edited by Maurice Nagle