No summary can capture the richness of the discussions held; nevertheless, the following is an overview of some of the thoughts and notes that came out of the workshop.
A recurring topic was that, unsurprisingly, we need to be cognisant of the realities that different groups of people face. A number of attendees raised contextual challenges such as:
- illiteracy, at a time when the use of secure technologies is increasingly required
- communities in developing countries where, for example, access to reliable, well-performing technology is difficult
- communities for which basic instruments such as an official identity or a bank account do not exist
- simple cases of digital exclusion.
This is a circular problem: digital exclusion leaves people with less of a voice in an increasingly digital world.
This was at times referred to as the “last mile” problem. Technology can certainly play a role here in creating a fairer and more inclusive society by bringing Identity, Trust, Security and Privacy to the fore; these technologies already exist but need to be proven at scale.
It was further recognised that expecting others to have Internet connectivity at all times is itself an element of exclusion. Seen from a wider angle, constant connectivity is far from universal. In fact, there are interesting applications for cases where connectivity is intermittent and digital services (such as proving one's identity) are not immediately available. Furthermore, sharing devices may be a necessity, which raises Privacy and Security problems.
It was also clear that COVID-19, a global phenomenon affecting virtually everyone, widened the digital divide. A first reason is our dependence on technology to work, live and communicate, technology that is, more often than not, complex and difficult to understand and use. A second reason is that restricted physical mobility made access to shared computers, such as those in a library, much more difficult. Being confined to our homes also meant that face-to-face support in using those technologies was harder to obtain.
COVID-19 also made our lives more dependent on Social Media, for better and for worse. One consequence is that we are now more vulnerable to, and more often targeted by, for example, disinformation and cyber security attacks.
An interesting thought concerned Privacy. Whereas in the West the debate centres on the commercialisation of personal data and state surveillance, it was recognised that Privacy also matters in certain contexts in the developing world. An interesting case was given about the use of technology in countries where Democracy has not fully developed. In particular, technology was met with reservations because a simple USB stick or phone would hold too much information in one single place, and would need special protection against an adversary such as the Army siding with an unfriendly government. In other words, having a less documented existence (or simply using paper) was a much more secure way of living.
Online harms were, equally, an unavoidable topic. Technology is a wonderful thing but brings threats with it. Children, for example, should not have the same access to Internet content as adults. How to regulate, enforce and design technologies is a difficult challenge and requires striking a delicate balance.
Another interesting idea was the Identity of communities. One tends to associate Identity (in the sense of a name or a national number) with single entities (individuals, devices, organisations, etc.). The concept of Identity for a community, however, is far less familiar. Several participants raised scenarios where a group of people may need to be authenticated and authorised at the level of group membership, in contrast to the far more familiar case of a single entity. For example, someone may not have an individual digital Identity and yet have digital membership of an asset that is owned by a group.
A final note addressed the very difficult problem of Trust in services. There was consensus that organisations, up to and including States, need better accountability. On one hand, the normal route to establishing Trust is to create transparency and auditing mechanisms, but that stumbles on the problem of who we can ultimately trust: a single element in a chain of Trust can cast a shadow of doubt over the whole chain. This will probably never have a full solution, as we are only human, and the trade-off between practicality and complexity will likely be impossible to overcome. On the other hand, there is a problem of alignment of incentives. Market forces seem to work well up to a certain (early?) point but, past it, quickly show deficiencies, a key one being Power & Control, the theme at the heart of our group. How can we design a system, with and for Humans, that is universally accepted as transparent?