Data analytics and the benefits of 5G technology

The benefits of 5G technology include better speed and performance for data transfer—a boon for data analytics processes. But enterprises need to consider some unintended risks with 5G as well.

With data increasingly important to most business operations today, next-generation connectivity is critical. Moving business data demands speed and performance, but until recently, connectivity couldn’t keep up. Over the past couple of years, though, 5G technology has come into focus. As the latest generation of cellular technology, 5G is engineered to greatly increase the speed and responsiveness of wireless networks.

It’s important to note that 5G technology won’t work alone in improving wireless experiences. It will be aided by other wireless technology advancements, including Wi-Fi 6, the next generation of Wi-Fi (based on the 802.11ax standard). Wi-Fi 6 will provide support for wireless services in congested areas (think airports and universities) and for applications with high-volume data density needs, such as video. These enhancements to speed and performance work together to make data-intensive experiences more seamless and rapid.

With their ample data transmission potential, low latency and network management capabilities (through slicing, a form of virtualization that allows multiple logical networks to run on top of a shared physical network infrastructure), 5G and Wi-Fi 6 will usher in a new era of data collection and analytics. Businesses will have access to data in volumes that were simply too large to collect prior to the advent of 5G. But how should companies evaluate the benefits of 5G—and risks—as they collect and analyze new data sets?

In developing data management strategies, enterprises need to consider the obvious costs of transmission and storage, but also the inherent risks surrounding security and privacy. Businesses will also need to rethink and streamline data management practices and develop innovative ways of extracting value from an ever-expanding sea of data. Without transforming raw data into actionable information, the data is of little use. Enterprises need to pair data analysis with innovative tools and good judgment. The following are some critical issues to consider.

Evaluating the marginal value of additional data

A key consideration for enterprises is whether the collection of additional customer data accelerates and improves decision making. For example, imagine collecting location data from mobile devices. Will additional, up-to-the-minute geo-spatial data improve decision making about presenting just-in-time offers to individual consumers? On a more macro level, will additional data offer insight into new trends in customer behavior? Could these trends be detected with existing data sources? We must evaluate the value of potential data within the context of data already in our possession.
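
To make this concrete, here is a minimal, hypothetical sketch of one way to estimate marginal value: compare a simple model's cross-validated performance with and without the candidate data source. The synthetic data, feature names and the offer-response task are illustrative assumptions, not a prescribed method.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 5000

# Existing features, e.g., purchase history and recency (synthetic stand-ins).
existing = rng.normal(size=(n, 2))

# Candidate 5G-enabled feature, e.g., up-to-the-minute distance from a store.
candidate = rng.normal(size=(n, 1))

# Synthetic "responded to offer" labels that depend partly on the candidate feature.
signal = 0.8 * existing[:, 0] + 0.3 * candidate[:, 0]
y = (signal + rng.normal(scale=1.0, size=n)) > 0

baseline = cross_val_score(LogisticRegression(), existing, y, cv=5, scoring="roc_auc")
augmented = cross_val_score(
    LogisticRegression(), np.hstack([existing, candidate]), y, cv=5, scoring="roc_auc")

# If the lift is negligible, collecting the extra data may not justify its cost.
print(f"baseline AUC:  {baseline.mean():.3f}")
print(f"augmented AUC: {augmented.mean():.3f}")
print(f"marginal lift: {augmented.mean() - baseline.mean():+.3f}")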

The bottom line: Determining the marginal value of additional data will be a key task of businesses as 5G technology begins deployment.

The impact of additional volume on current data analysis processes

While the benefits of 5G technology include business agility, speed and improved decision making, the enhanced performance enabled by wireless improvements also has a downside. The growing data volumes enabled by 5G technology will inevitably carry additional noise into the data stream.

An archetypal example of this occurred in 2017, when scientists at CERN announced the detection of a signal from the Large Hadron Collider that could indicate an undiscovered particle. Excited researchers from around the world authored hundreds of papers explaining the unanticipated results. After eight months of review, however, CERN researchers determined that the unanticipated data was simply noise. Decision makers need to ask how they will protect their business from making incorrect inferences due to noise in new data sources.

The bottom line: Businesses need to develop and deploy mechanisms to separate noise from substance in their data so as not to cloud their results. This will happen at a time when the demand for data scientists will be growing.

Maintaining and ensuring data quality

After winnowing potentially useful data, businesses need to assess and measure the quality of this new data. It’s inevitable that data transmission will experience network interruptions and other disruptions that cause data loss. Businesses that are not already working with time-series data will need to develop strategies to handle missing and corrupt data. There are a variety of ways to fill in missing data, such as using a moving average of recent data points. All options have pros and cons, and the best choice will be specific to the application.
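
As a minimal sketch of the moving-average option mentioned above (the sensor readings, one-minute interval and three-point window are illustrative assumptions), missing points can be filled from a trailing average of recent observations:

import numpy as np
import pandas as pd

# Synthetic sensor readings at one-minute intervals, with dropped transmissions.
idx = pd.date_range("2024-01-01", periods=12, freq="min")
readings = pd.Series(
    [20.1, 20.3, np.nan, 20.6, np.nan, np.nan, 21.0, 21.2, 21.1, np.nan, 21.4, 21.5],
    index=idx)

# Moving average over the last three points; NaN gaps are ignored in the window.
moving_avg = readings.rolling(window=3, min_periods=1).mean()

# Keep the real readings and fill only the missing points from the moving average.
filled = readings.fillna(moving_avg)
print(filled)

Forward-fill or interpolation may suit other cases; the point is that the choice is explicit and its effect on downstream analysis can be measured.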

The bottom line: Companies need to develop awareness about the quality of data they are using. Missing or corrupt data introduces uncertainty and reduces confidence in analysis results.

Storage and data analysis costs

As data generation moves to fill 5G capacity, the cost of storage and data analysis will exceed transmission costs. These economic forces will drive efforts to extract the most information from data. A critical factor will be determining which data is interesting from a decision-making perspective.

If a measured variable has the same value for 100 consecutive readings, for example, then the 101st reading of the same value is hardly informative; but if a recent series of data values is 10 times greater than any data value received in the previous two hours, the data is notable. Research and application development work in anomaly detection is advancing the field. These techniques analyze signals and data sets to find unexpected or anomalous information. One important area of research is in understanding anomalies in a broader context than a lone signal. Data generated in complex systems cannot be monitored in isolation; it must be analyzed in the context of other aspects of the system. This will drive the adoption of increasingly sophisticated anomaly-detection techniques.
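
As a rough illustration (not a prescribed technique), a trailing-window z-score captures both cases described above: it flags readings that jump far beyond recent history, while runs of unchanged values stand out as adding little new information. The synthetic stream, window size and threshold below are assumptions.

import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Synthetic sensor stream: steady readings with one spike roughly 10x the baseline.
values = np.round(rng.normal(loc=5.0, scale=0.2, size=120), 1)
values[90] = 50.0
stream = pd.Series(values)

# Trailing statistics over the previous 30 readings (shifted so a point
# is not compared against a window that already contains it).
window = 30
trailing_mean = stream.rolling(window).mean().shift(1)
trailing_std = stream.rolling(window).std().shift(1)

# Large z-scores mark readings that deviate sharply from recent history.
z_score = (stream - trailing_mean) / trailing_std
anomalies = stream[z_score.abs() > 6]

# Readings identical to the previous one add little new information on their own.
redundant = stream[stream.diff() == 0]

print("anomalous readings:")
print(anomalies)
print("unchanged readings:", len(redundant))

In practice the window, threshold and notion of context would come from the system being monitored; the broader research the paragraph above describes is about making that context explicit rather than judging each signal in isolation.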

The bottom line: Businesses need to develop ways to identify the most useful information in data sets, in real time. This will drive innovation to process more data at the edge.

The privacy risks of collecting and storing data

In addition to the obvious costs associated with acquisition and storage, enterprises need to consider the associated risks of storing customers’ personal data. Does the risk of a data breach outweigh the benefits? The Privacy Rights Clearinghouse reports almost 1.4 billion known records breached in the U.S. alone in 2018. Any measure of the security risks of increased data collection must include the potential for data loss, which can have regulatory and market impacts—a lesson that companies have learned after incurring major fines.

Finally, enterprises must consider the dynamic regulatory environment surrounding digital privacy. Privacy regulations such as the General Data Protection Regulation (GDPR) are only beginning to be interpreted in courts. Further, privacy advocates have proposed new legal frameworks to regulate not only the capture of personal data but also how that data can be used to make inferences about an individual’s attributes, including race or health status. Businesses must ask whether it’s worth collecting and storing massive amounts of customer data in this evolving legal landscape.

The bottom line: Not all data is created equal. Companies need to assess the costs of storing and processing data, including the security and regulatory risks involved.

The future of data analysis and 5G

As 5G gathers steam, it will provide more opportunity to collect data from a growing array of mobile devices and connected sensors, but enterprises need to assess the value of this additional data. They need to consider how this data can improve decision making, in terms of timeliness and effectiveness. More data means the potential for more noise. Without rigorous methods for separating noise from signal, we risk undermining the value of information we derive from these new data sources. Increasingly, companies need to enlist anomaly detection and related techniques to help in this area. Finally, we should not underestimate the potential impact of increasing privacy protections that can alter the cost-benefit evaluation of collecting potentially personal data.

Dan Sullivan

Dan Sullivan is a software architect specializing in streaming analytics, machine learning and cloud computing. Sullivan is the author of NoSQL for Mere Mortals and several LinkedIn Learning courses on databases, data science and machine learning.