May 31, 2019
The generation and use of big data – and its role in the economy and across society – is growing exponentially. Many are already declaring it the most important resource of the 21st century.
This large-scale data collection offers novel and significant opportunities, especially in our increasing ability to use the resulting data to generate valuable insights. Simultaneously, however, citizens and policymakers across the world are grappling with the equally novel and significant concerns that this use generates.
In particular, the emergence of “surveillance capitalism,” a digital business model focused on the collection and exploitation of users’ data, has raised numerous concerns. Security breaches like those at Equifax and Yahoo, and scandals like those arising from Facebook’s links to Cambridge Analytica, are just a few examples.
Events like these demonstrate the need for relevant governance instruments to promote informed consent, data security, privacy, accountability and ethical use of data. Indeed, in November 2018, Canada’s Privacy Commissioner wrote to the Minister of Innovation stating that:
“Recent events have shed light on how personal information can be manipulated and used in unintended, even nefarious, ways. I am growing increasingly troubled that longstanding privacy rights and values in Canada are not being given equal importance within a new digital ecosystem eagerly focused on embracing and leveraging data for various purposes. Individual privacy is not a right we simply trade away for innovation, efficiency or commercial gain.”
Mowat recently explored whether there is an opportunity for standards-based solutions to play a larger role in addressing the challenges arising from the emergence of the digital economy. The report focused on three key areas – data governance, artificial intelligence (AI) and algorithms, and digital platforms. This TLDR provides insights generated by this research pertaining to data governance.
Big data promises to provide important and novel insights and to improve our decision-making systems. However, it also poses serious challenges in safeguarding individual privacy and security, and in preventing unauthorized data collection and usage by powerful interests at the expense of society at large. Some areas of concern include:
- Unauthorized personal data use
In recent years, concerns related to online data collection – in contexts as diverse as social media, voice-operated virtual assistants and smart cities – have become widespread. Canada’s existing privacy regime requires that collection of personal data be limited to the purposes of a service. But technology companies routinely use lengthy, broadly-worded and opaque Terms of Service (ToS) agreements that enable them to stockpile highly detailed user profiles. This comprehensive data is often used – and even sold – for secondary purposes such as targeted advertisements. Users are often unaware of what data is being collected, how their digital activities are being tracked or the purposes for which their data is being used. Currently, companies are generally not required to obtain meaningfully informed consent, and governance measures are severely lacking.
- Lack of industrial data standards
While there is likely too much collection of personal data, there is probably far too little collection and sharing of industrial data in Canada. Currently, organizations are often unwilling to exchange data because they are unsure if their data is of sufficient quality, and fear liability for any negative repercussions that might occur. But access to data, particularly non-personal industrial and commercial data, will be essential to future economic success, both for individual firms and nationally. This is because the ability to collect this data – in the right format, and at a high level of quality – and draw insights from it is critical to the optimization of industrial and commercial processes, the design of new products and services, and the creation of valuable new tools like AI. But without robust standards for data quality and interoperability, such exchanges, and the innovations they can unleash, will simply not occur.
- Limited liability
Currently, data security across the economy tends to be quite poor. In part, this is because the value chains for many of the electronic devices that collect data have become very long and complicated, with many different firms involved in various stages of production. The result is that very few, if any, people have the end-to-end view of the process needed to ensure the device’s security. When compounded by the fact that software firms are generally not held liable for flaws in their software, even if these flaws are responsible for data breaches, it means that ensuring security is difficult and the incentives to do so are weak. This situation is only made worse by the failure of the market to reward organizations that do prioritize privacy or data security features – a failure that is largely the result of uncompetitive markets and a lack of easy-to-use standards.
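The interoperability gap described under "Lack of industrial data standards" above is concrete: two firms cannot usefully exchange data unless they agree on field names, types and units in advance. The following is a minimal sketch of what conformance-checking against such an agreed schema might look like; the schema format and field names are invented for illustration and do not correspond to any real standard.

```python
# Minimal sketch: validating industrial sensor records against a shared,
# hypothetical schema before exchanging them with a partner organization.
# The field names, types and units here are illustrative only.

REQUIRED_FIELDS = {
    "machine_id": str,      # stable identifier agreed between partners
    "timestamp": str,       # ISO 8601, so both parties parse it identically
    "temperature_c": float  # unit fixed by the schema to avoid ambiguity
}

def validate_record(record):
    """Return a list of problems; an empty list means the record conforms."""
    problems = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            problems.append("missing field: " + field)
        elif not isinstance(record[field], expected_type):
            problems.append("wrong type for " + field +
                            ": expected " + expected_type.__name__)
    return problems

good = {"machine_id": "press-07",
        "timestamp": "2019-05-31T10:00:00Z",
        "temperature_c": 71.5}
bad = {"machine_id": "press-07", "temperature_c": "71.5"}

print(validate_record(good))  # []
print(validate_record(bad))   # two problems: missing timestamp, wrong type
```

With a shared schema like this, the sending firm can check quality before release and the receiving firm knows exactly what it is getting, which is the trust-building role standards would play here.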
Current governance landscape
In Canada, data collection and usage are governed by a small number of federal and provincial laws.
Federally, the most important of these are the Privacy Act, which governs the personal information-handling practices of the federal government, and the Personal Information Protection and Electronic Documents Act (PIPEDA), which regulates how private firms and not-for-profit organizations must handle personal information.
Provincially, different governments have passed a series of similar laws to govern areas of provincial responsibility. For instance, the Province of Ontario has passed the Freedom of Information and Protection of Privacy Act (FIPPA), the Municipal Freedom of Information and Protection of Privacy Act (MFIPPA) and the Personal Health Information Protection Act (PHIPA).
However, these laws are proving insufficient to address today's increasingly complex challenges.
Perhaps the most robust attempt at crafting a data governance regime so far has come from the EU. Its General Data Protection Regulation (GDPR) includes elements such as strengthened consent requirements, the right to be forgotten and the integration of ‘privacy by design’, among others. The size and importance of the EU market has led many firms to adopt the GDPR’s requirements on a global basis. It is important to note, however, that since the GDPR is a new law, many of the details around its interpretation and enforcement remain to be determined.
There are also some private initiatives underway. One example is the Solid project being led by Sir Tim Berners-Lee, inventor of the World Wide Web. The aim of this project is to enable users to maintain their own data stores and control the extent to which websites, apps and other users are able to access them.
For example, instead of sharing photos with friends by uploading these photos to a social media app – which then has a licence to use the photos as it wishes – users would upload the photos to their own Personal Online Data Stores (or PODS) and share access with their friends or firms through “dApps” or decentralized apps. Unlike existing apps, these dApps would not store user data. Rather, they would simply provide a user interface and facilitate links to data that individuals have stored themselves.
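The access-control model described above can be sketched in a few lines. This is a toy illustration of the idea, not the actual Solid API: the class and method names are invented, and a real implementation involves web protocols and identity systems omitted here. The key property it demonstrates is that the data stays in the user's store, and the "dApp" reads through a revocable grant without keeping a copy.

```python
# Toy sketch of the Solid-style model: the user holds the data and grants
# (or revokes) access; the "dApp" displays data but stores nothing itself.
# All names here are illustrative, not part of the real Solid project.

class PersonalDataStore:
    def __init__(self, owner):
        self.owner = owner
        self._items = {}      # the user's own data, held by the user
        self._grants = set()  # parties currently allowed to read

    def put(self, key, value):
        self._items[key] = value

    def grant_access(self, who):
        self._grants.add(who)

    def revoke_access(self, who):
        self._grants.discard(who)

    def read(self, who, key):
        if who != self.owner and who not in self._grants:
            raise PermissionError(who + " has no access to this store")
        return self._items[key]

def photo_viewer_dapp(store, viewer, key):
    """A 'dApp' that only links to data: it renders it, but keeps no copy."""
    return "[" + viewer + " viewing] " + store.read(viewer, key)

alice = PersonalDataStore("alice")
alice.put("holiday.jpg", "<photo bytes>")
alice.grant_access("bob")
print(photo_viewer_dapp(alice, "bob", "holiday.jpg"))
alice.revoke_access("bob")  # Bob loses access; no copies exist in the app
```

Because the app never stores the photo, revoking the grant is sufficient to end access – the reverse of the current model, where uploaded data is effectively out of the user's hands.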
What role can standards play?
Several voluntary standards on data protection, security and privacy, created by the International Organization for Standardization (ISO), already exist, and more are currently in development. Thus, the most obvious opportunity for standards to play a larger role in data governance involves building on those existing standards which have proven valuable. High-level concepts such as ‘privacy by design’ and ‘security by design’ are useful principles, but what they mean for business practices needs to be articulated with more specificity.
In particular, standards developing organizations (SDOs) can play a key role in helping to flesh out what the requirements of new laws like the GDPR and California’s Consumer Privacy Act mean in practical terms. This can be achieved by developing standards that can serve as legal best practices or “safe harbours” for organizations seeking to comply with them. SDOs are uniquely well-placed to do this sort of work because of their ability to convene relevant experts and stakeholders to help define how to translate these principles into successful, implementable standards.
Finally, there are also opportunities to create consumer-facing standards that could enable organizations to indicate their adherence to instruments like the GDPR and allow consumers to confidently select services on the basis of such compliance. Certification can also serve as a useful tool for customers in deciding whether or not to trust a product or service – an area where demand is likely to increase, and SDOs will be well-placed to respond. More boldly, the creation of these standards could serve as the first step towards the creation of “machine-readable” standards that could support significant automation – through tools like web browser plug-ins or digital agents – of the due diligence required for consumers to protect their own data.
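To make the "machine-readable standards" idea more concrete, here is a minimal sketch of how a browser plug-in or digital agent might automate that due diligence: a service publishes a structured compliance declaration, and the agent checks it against the user's requirements. The declaration format, field names and thresholds are entirely invented for illustration; no such standard currently exists.

```python
# Minimal sketch: automated checking of a hypothetical machine-readable
# compliance declaration. The format and field names are invented here
# purely to illustrate the idea described in the text.
import json

declaration_json = """
{
  "service": "example-photo-app",
  "claims": {
    "gdpr_compliant": true,
    "sells_personal_data": false,
    "data_retention_days": 30
  }
}
"""

# The user's own policy: each field maps to a test its value must pass.
USER_REQUIREMENTS = {
    "gdpr_compliant": lambda v: v is True,
    "sells_personal_data": lambda v: v is False,
    "data_retention_days": lambda v: v <= 90,
}

def acceptable(declaration):
    """True only if every required claim is present and passes its check."""
    claims = declaration["claims"]
    return all(field in claims and check(claims[field])
               for field, check in USER_REQUIREMENTS.items())

print(acceptable(json.loads(declaration_json)))  # True
```

An agent like this could run automatically on every site visit, turning compliance claims that today sit buried in Terms of Service into something a consumer's software can act on – provided, of course, that certification gives those claims credibility.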
This TLDR is the first post in a three-part series examining the potential for standards-based solutions to play a positive role in governing the digital economy. Other posts examine opportunities for standards in the contexts of artificial intelligence (AI) and algorithms, and digital platforms. All three posts draw on research conducted for a project commissioned by the CSA Group which produced a report entitled: The Digital Age: Exploring the Role of Standards for Data Governance, Artificial Intelligence and Emerging Platforms.