Will You Trust? An Idea of Trust for the Digital Future

Amid the contemporary surveillance arms race, fostering digital trust has become increasingly challenging. More than ever, trust in data-driven environments should be clearly defined, and the use of trust-enabling mechanisms and individuals’ capacity-building promoted.

Ever since Snowden’s revelations[1] about government surveillance and Cambridge Analytica’s efforts to influence elections[2], our perception of digital technologies and data collection has changed forever. We are currently witnessing a growing lack of trust in institutions of all kinds and a dangerous widening of the digital trust deficit[3]. Yet trust is a core principle of the digital economy and of consumer and data protection, underpinning all digital relations. Understandably, the proliferation of data-intensive technologies has intensified the risks of harm to data protection and privacy, while persistent data leaks and unlawful data sharing seemingly confirm the current climate of technological distrust.

Additionally, the Smart Cities paradigm adds an extra layer of complexity, since public services interact directly with citizens and affect the public at large. This may ultimately mean that citizens who do not trust providers are effectively barred from accessing basic services. Furthermore, pervasive surveillance, rising automation and multiple algorithm-based practices increase the opacity of decision-making processes, resulting in biased solutions that detrimentally affect individual agency and the sense of trust in technologies. For these reasons, it is of utmost importance to understand how to improve digital trust and promote public engagement with technology. Conversely, without trust, modern networked life based on continuous data sharing with service providers across sectors, as well as technological development and its societal adoption, will likely be compromised.

1. What is Digital Trust?

Digital trust is the individual belief that digital processes provide technical security against potential data breaches and privacy violations. It works as a measure of individual and collective confidence in technology-based processes, as well as a positive expectation about their outcome. In this sense, when users “trust” a given social network, they expect their personal data to be kept private.

Notably, fostering digital trust goes beyond isolated individual actions: it entails relying on other parties without directly controlling the outcome of their decisions. That being said, recent data scandals, as well as growing societal automation, strongly threaten the value of individual autonomy in the data-driven society, particularly because of the power imbalances between individuals and corporations, which leave consumers in a more vulnerable position and, for that reason, less willing to trust.

A trustworthy digital society entails establishing an adequate system of accountability and oversight to minimize the impact of potential breaches of trust. Digital trust therefore relies heavily on privacy and security, along with a number of broader ethical questions that arise when organizations blindly deploy automation tools. Given this, consumers, as users of digital services, must be able to make use of technology without their individual autonomy being hampered, and with the value of choice and consent preserved.

Digital trust is thus the result of multiple factors, namely resilient technical security and reliability, in combination with adequate and enforceable regulation. Even so, security must always be subject to a proportionality assessment: a secured cyberspace may increase online transactions and e-banking and improve online interactions between individuals and different entities, but it may also impose excessive burdens upon those concerned.

In sum, trustworthiness in digital environments implies continuous action to promote transparency in data-driven processes and to implement adequate accountability mechanisms.

2. Why Digital Trust Matters

Understandably, many concerns arise from the growing capabilities of corporate giants, whose activities include massive data collection and the use of technological resources to sustain near-uninterrupted surveillance and guarantee maximum data monetization, at a time when controlling users’ data has become an expression of corporate power over what users see, do, or will predictably do online.

As outlined, there are inherent risks in the use of data-dependent technologies, as collecting and managing information increases the risk of privacy overreach. Nevertheless, there are also many beneficial outcomes, such as the growing convenience and efficiency of today’s services. Trust in technologies enables long-term planning and building, as well as higher risk tolerance.

In the context of smart environments, it is essential to develop trust-based solutions and digital empowerment mechanisms to promote individual trust and minimize concerns in this regard. Moreover, privacy in this context is about the capacity to minimize individual exposure; however, most of the privacy notices we accept go unread, which aggravates the difficulty of applying privacy regulations in ever more data-dependent environments.

Policy frameworks in this regard must also converge towards the idea of trust-verifiable institutions and individuals through mechanisms such as standards or certifications, which would transparently identify the safeguards and data management instruments adopted to protect personal data. Users would then more easily trust them with their data.

Although consumers benefit from the convenience of digital technologies, digital providers must undoubtedly be held accountable. As outlined, information is an instrument of corporate power over individuals. Hence, there is a growing awareness of data monetization practices and, simultaneously, mounting pressure to reinforce transparency mechanisms and to empower consumers by increasing their control over how their data is managed.

Building digital trust is thus necessary, but also a highly complex task, requiring companies and policymakers to rethink how to unlock the potential of technology.

3. Final Remarks

Consequently, technologies must evolve towards trust-building solutions by gradually empowering individuals to make informed choices and avoid deception, while promoting trustworthy relations between organizations and data subjects. Additionally, considering recent events, cybersecurity and cyber-resilience are means of avoiding the damage caused by unlawful access to personal information, since trustworthiness in this context relies heavily on security, confidentiality, and de facto accountability.

In this sense, it is imperative to implement further regulatory and technical guidance aimed at increasing transparency in data-driven and artificial intelligence-based processes, since trust and transparency are inherently associated.

To this end, the solution partially lies in setting out technical standards and improving individuals’ digital capacities throughout digital decision-making processes. With safeguards in place to foster digital trust, digital relations may become steadier and individual choices about privacy less onerous. Indeed, better data management may help individuals regain a sense of control over their digital selves and their choices in digital environments; this must be carefully considered not as a solution in itself, but as part of a strategy for implementing trust-reliant mechanisms to secure our digital future.

As such, individual capacity building can be a vital tool for safeguarding individuals against distrust, cyber risks and, ultimately, digital exclusion. However, this will require public policies that foster digital literacy, encourage user-friendly tools and improve individual online skills, so as ultimately to enhance digital trust and minimize digital divides.

As a final note, digital trust cuts across the various debates arising from the progressive digitalization of life. Although there is no “one-size-fits-all” solution, trust is mostly about ensuring individual privacy and the confidence that consumers will not be exploited or harmed by consenting to certain data collection operations. Therefore, digital trust is not only essential to fostering emerging technologies, but also the key to preserving our societal values in today’s digital reality.


[1] Glenn Greenwald, ‘Edward Snowden: the whistleblower behind the NSA surveillance revelations’, The Guardian, 9 June 2013. https://www.theguardian.com/world/2013/jun/09/edward-snowden-nsa-whistleblower-surveillance

[2] Robinson Meyer, ‘The Cambridge Analytica Scandal in Three Paragraphs’, The Atlantic, 20 March 2018. https://www.theatlantic.com/technology/archive/2018/03/the-cambridge-analytica-scandal-in-three-paragraphs/556046/

[3] For more on this definition, see ‘Digital Trust’, World Economic Forum. https://www.weforum.org/projects/digital-trust

