Gartner has published five specific privacy trends it expects to play out through 2024. Why 2024? Because by the end of 2024, Gartner predicts that 75% of the world’s population will have its personal data covered under modern privacy regulations.
So what are the five? Let’s quote ChannelLife.
Data Localization
In a borderless digital society, seeking to control the country where data resides seems counterintuitive. However, this control is either a direct requirement or a byproduct of many emerging privacy laws.
The risks to a multicountry business strategy drive a new approach to the design and acquisition of cloud across all service models, as security & risk management leaders face an uneven regulatory landscape with different regions requiring different localization strategies. As a result, data localization planning will shift to a top priority in the design and acquisition of cloud services.
Privacy-Enhancing Computation Techniques
Data processing in untrusted environments – such as public cloud – and multiparty data sharing and analytics have become foundational to an organization’s success. Rather than taking a bolt-on approach, the increasing complexity of analytics engines and architectures mandates that vendors incorporate a by-design privacy capability. The pervasiveness of AI models and the necessity of training them are only the latest additions to privacy concerns.
Unlike common data-at-rest security controls, privacy-enhancing computation (PEC) protects data in use. As a result, organizations can implement data processing and analytics that were previously impossible because of privacy or security concerns. Gartner predicts that by 2025, 60% of large organizations will use at least one PEC technique in analytics, business intelligence, and cloud computing.
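PEC is a family of techniques rather than a single product – it spans homomorphic encryption, secure multiparty computation, trusted execution environments, and differential privacy. As a toy sketch of the secure multiparty computation family, here is a minimal additive secret-sharing example in Python (the function names and parameters are illustrative, not from Gartner’s material): each party’s value is split into random shares, shareholders compute on shares locally, and only the aggregate is ever reconstructed.

```python
import secrets

PRIME = 2**61 - 1  # all arithmetic is done modulo a large prime


def share(value, n_parties=3):
    """Split a secret into n additive shares; any n-1 shares reveal nothing."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares


def reconstruct(shares):
    """Recombine shares to recover the (aggregate) value."""
    return sum(shares) % PRIME


# Two parties' private inputs, shared out to three shareholders
a_shares = share(120)
b_shares = share(45)

# Each shareholder adds its two shares locally -- no one ever sees 120 or 45
sum_shares = [(a + b) % PRIME for a, b in zip(a_shares, b_shares)]

print(reconstruct(sum_shares))  # 165 -- only the sum is revealed
```

This is the "data in use" property in miniature: the computation happens on protected representations, and only the agreed-upon result is ever decoded.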
AI Governance
A Gartner survey found that 40% of organizations had an AI privacy breach and that, of those breaches, only one in four was malicious. Whether organizations process personal data through an AI-based module integrated into a vendor offering, or a discrete platform managed by an in-house data science team, the risks to privacy and potential misuse of personal data are clear.
“Much of the AI running across organizations today is built into larger solutions, with little oversight available to assess the impact to privacy,” says Henein.
“These embedded AI capabilities are used to track employee behavior, assess consumer sentiment and build ‘smart’ products that learn on the go.
“Furthermore, the data being fed into these learning models today will have an influence on decisions being made years down the line,” Henein says.
“Once AI regulation becomes more established, it will be nearly impossible to untangle toxic data ingested in the absence of an AI governance program. IT leaders will be left having to rip out systems wholesale, at great expense to their organizations and to their standing.”
Centralized Privacy UX
Increased consumer demand for subject rights and raised expectations about transparency will drive the need for a centralized privacy user experience (UX). Forward-thinking organizations understand the advantage of bringing together all aspects of the privacy UX — notices, cookies, consent management, and subject rights requests (SRR) handling — into one self-service portal. This approach yields convenience for key constituents, customers, and employees and generates significant time and cost savings. By 2023, Gartner predicts that 30% of consumer-facing organizations will offer a self-service transparency portal to provide for preference and consent management.
Remote Becomes “Hybrid Everything”
As engagement models in work and life settle into hybrid arrangements, both the opportunity and the desire for increased tracking, monitoring, and other personal data processing activities rise, and privacy risk becomes paramount.
Alongside these privacy implications, productivity and work-life balance satisfaction have also increased across industries and disciplines. Organizations should take a human-centric approach to privacy, and monitoring data should be used minimally and with clear purposes, such as improving employee experience by removing unnecessary friction or mitigating burnout risk by flagging well-being risks.
Why do we care?
I’m convinced there is space to make money and deliver real value around data privacy. It’s why I spend time on it – mining this for value is clearly an opportunity. Most customers will not be able to do this on their own… so step in.