Pansy Tlakula: ‘Why AI is giving me sleepless nights’

The author, Information Regulator chair Pansy Tlakula

It is almost inconceivable that 75 years have passed since the General Assembly of the United Nations adopted the Universal Declaration of Human Rights (UDHR). The UDHR turned 75 on 10 December and remains the foundation of all international human rights law.

The significance of the UDHR’s 75th anniversary is that, 75 years after the adoption of a declaration proclaiming that all human beings are born free and equal in dignity and rights, we now have the opportunity to reflect on whether this ideal has been realised, particularly given the number of conflicts and wars around the world.

For the Information Regulator, it is a moment to consider whether, after these 75 years, the fundamental right to privacy, as it relates to the protection of personal information, has been fully recognised and appreciated.

Reflecting on this, one thing is certain: people need to be able to assert their data privacy rights. However, rapid technological advancement makes this right difficult to guard jealously, because many people are willing to trade away parts of it to digital platforms in order to participate in the information society. This is not an argument for neo-Luddism, but a call to recognise that rapid technological change has proven to be a thorny issue for the right to privacy.

The United Nations declared in Article 12 of the UDHR: “No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.” The problem, however, is that technology has galloped ahead of everyone, from data privacy authorities and regulators to governments, political parties, legislators and even businesses. Everyone is constantly having to play catch-up.

Breakneck

The breakneck pace at which technological innovation is moving far outstrips the pace at which data subjects can inform themselves thoroughly about the privacy trade-offs of adopting new technologies. Take, for example, recent developments at the company formerly known as Twitter, now X, which has launched an artificial intelligence platform called Grok. Grok’s privacy policy states that data subjects have certain rights over the use of their personal information, “depending on where you are based and subject to applicable legal exceptions”.

This means that data subjects in some regions will have rights, and others will not. Where applicable laws do not exist, as is the case in 20 African countries, data subjects will be at the mercy of Grok when it comes to the handling of their personal information. It is safe to assume that most people who adopt Grok will use the platform without first having informed themselves of the privacy implications of doing so.


Even more worrying is that the pace of technological change is faster than the ability of data protection authorities and legislators to innovate or equip themselves to match the speed of change in the areas they must regulate or legislate on. We have seen the tragicomic scenes at the US Capitol, where digital platform companies were summoned to account for aspects of their technologies, only for the deep knowledge gap between legislators and Big Tech representatives to be laid bare.

For the Information Regulator, the impact of AI on the protection of personal information is proving to be an even bigger issue, one that often gives me sleepless nights. AI relies on big data, which can include personal and sensitive information. Different types of AI, particularly large language models, are trained on many kinds of data, and some of this training data can, and does, contain personal information.

The collection, processing and storage of this information poses a huge risk of security compromises (data breaches) should some of these training databases be leaked or illegally accessed. Security compromises in general are a major concern for the regulator: since 2021, 1 262 data breaches have been reported to the Information Regulator. That is the scary reality we are grappling with. It is up to everyone who processes personal information to come to the party. When they process personal information, they must be alive to the fact that this information is part of the human dignity of a data subject (any identifiable natural person to whom the personal information relates) and, more importantly, that they cannot do with it as they please.

Although the private sector is ahead of other sectors, there is still room for improvement. I do not think the other sectors grasp the importance of data protection in the digital age we live in, and the biggest threat to privacy remains unregulated technological innovation. It may also be that the right to privacy is simply not well understood: it is still largely regarded as a technology issue, and the regulator has a huge responsibility to change this narrative.

Seventy-five years after the adoption of the UDHR, we still have a long way to go in realising this vital right, not least because the right to privacy as it relates to the protection of personal information was not a consideration in 1948. But with collaboration, adequate resources, education and compliance by all, we can get closer to ensuring that the right to privacy is fulfilled. As a human rights advocate, I always say the right to privacy is one of the most essential human rights we can fight for.

  • The author, Adv Pansy Tlakula, is chair of the Information Regulator of South Africa.


Source: techcentral.co.za