Human rights under threat from new technologies

Virtually all human rights treaties are much older than the modern technologies that determine the rhythm of our day-to-day lives. But these rights are under threat, especially from large tech companies. Professors Ruben Verborgh and Eva Lievens advocate an entirely new mindset: “The first step must always be to think about what kind of society we want. Only then can we build the technology necessary to achieve it, not the other way around.”

“Human rights were once codified in treaties to protect people from institutions such as the government. They were intended, for example, to prevent censorship and thereby guarantee free speech. Today, we see that private actors, such as large tech companies, have an ever greater impact on those rights. We must guard against this.”

Ruben Verborgh is a professor of Decentralised Web Technology at IDLab at Ghent University and imec. He has long been a public voice against the influence of big tech companies on our human rights and fundamental freedoms.

Protecting the children

He is not alone in this fight, Professor Eva Lievens makes clear. She leads the research group Law & Technology at the Faculty of Law and Criminology at Ghent University. “Take the United Nations Convention on the Rights of the Child, for example, which brings children’s rights together in a single treaty. It was adopted in 1989, the same year as the invention of the World Wide Web, but it doesn’t say a word about technology,” she says.

Now, more than thirty years later, the situation has changed. “You can’t ignore the fact that technology plays a prominent role in the lives of our children. It offers many opportunities for enrichment, but, at the same time, it poses risks that we must protect them from,” she continues. “This is why the UN Committee on the Rights of the Child had to re-evaluate children’s rights in the context of new technological applications. Its conclusion? Every specific children’s right can be linked to technology. Whether you’re talking about apps, connected toys, learning platforms or smart devices: children’s rights must be protected.”

Running roughshod over human rights

The same holds true for adults’ rights. Every human right is influenced by technology in one way or another, for better and for worse. “It doesn’t matter whether you’re talking about privacy, free speech or even the right to a fair trial or to free and fair elections. The problem with new technology is that the advantages often blind us to the ethical questions,” says Ruben.

“Take Google Street View, for example. All of a sudden, Google had cars driving around every street in our country to take pictures. Now, you can look up any house you want online. Only afterwards do we ask the question: do we really want this? Isn’t our right to privacy being infringed upon?”

Artificial intelligence

While we are still reflecting on one kind of technology, another is already being developed, the consequences of which are anyone’s guess. “We don’t know, for example, what the long-term ramifications of a great many artificial intelligence applications will be,” Ruben says.

“There are some applications you simply don’t want as a society,” Eva jumps in. “Think of AI-based ‘social credit systems’ like the one the Chinese government is experimenting with. A social credit system assigns scores to individuals based on their behaviour. Do you have a bad score? Then, as a citizen, you end up on a blacklist and lose certain rights.”

Legislation lags behind

The problem is that legislation often lags behind. “In many cases, legislation that theoretically applies to technological phenomena does exist, but in practice it’s difficult to apply it,” Eva says. “For example, who exactly is responsible in traffic accidents involving a self-driving car?”

“Furthermore, companies that deal in technology often oppose legislation,” Eva continues. “Too much regulation would hinder innovation, they claim. I, on the other hand, believe a sound regulatory framework would benefit developers as well.” Ruben goes one step further: “The creativity necessary to develop things can only exist within constraints. The first step must always be to think about what kind of society we want. Only then can we build the technology necessary to achieve it. Right now, we build technology without considering the consequences.”

Societal influence of private companies

A lack of clear legislation also creates problems on social media. “Fake news and disinformation pop up everywhere, which means we can’t take any image we see at face value,” Ruben explains. “Social media’s algorithms are able to influence public perception. And these platforms are usually in the hands of American tech giants. Nobody wants these private companies to have the influence they have, but that is exactly the danger they pose today,” Ruben warns.

Case in point: Cambridge Analytica, the big data company owned by an American billionaire who influenced both the previous US elections and the Brexit referendum to his advantage. “More than thirty years after its conception, the web has become incredibly centralised. The big internet players are gaining more and more power, but do nothing to try to solve society’s problems, simply because it doesn’t earn them any money.”

GDPR as an example of good practice

“We have to ensure that data and technology once again work for people, instead of for companies,” says Ruben. Eva adds: “You have to start by developing a strong regulatory framework for tech giants. It’s not perfect yet, but the European Union has made big strides in this fight. If Europe can take the lead and impose a regulatory framework on the big tech platforms, chances are they’ll apply that same framework to the rest of the world. That would be a significant evolution.”

There are examples of good practice. “In the case of the GDPR, the European legislation that standardises the rules governing the processing of personal data by companies and public authorities across the entire EU, legislators were one step ahead of the technology,” says Ruben. “Even companies that aren’t located in the EU, but do offer services in Europe, have to comply with these rules. That’s a step in the right direction. But it’s important to remain vigilant,” Eva adds.

Data locked away tight

Another possible way of dealing with the threats to our privacy would be to re-examine how we handle our data. Solid, a technology developed by Sir Tim Berners-Lee, the inventor of the World Wide Web, could be of great help here.

Ruben, who works on the project together with Eva, explains: “With Solid, we want to give control of data back to people and allow them to manage it themselves. We are developing a kind of personal data vault in which you can keep your data. You yourself determine which data you give to which company, instead of the other way around. In this way, you are less dependent on the platforms.” Eva adds: “Technology also offers the potential to exercise and realise our fundamental freedoms and human rights more easily. As a society, we have to invest much more in these possibilities in the future.”
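To make the idea a little more concrete, the sketch below is a minimal, hypothetical illustration in TypeScript of a personal data vault. It is not Solid’s actual API, and every name in it is invented for this example; it only shows the principle that the owner of the data, not the platform, decides which application may read which item.

```typescript
// Hypothetical sketch of a personal data vault (illustration only, not the Solid API).
// The owner stores data and explicitly grants or revokes each application's access
// to individual items.

type Permission = "read" | "write";

class PersonalDataVault {
  private data = new Map<string, unknown>();            // e.g. "profile/email" -> "ann@example.org"
  private grants = new Map<string, Set<Permission>>();  // "appId::key" -> granted permissions

  store(key: string, value: unknown): void {
    this.data.set(key, value);
  }

  // The owner explicitly grants an application access to one specific item...
  grant(appId: string, key: string, permission: Permission): void {
    const id = `${appId}::${key}`;
    if (!this.grants.has(id)) this.grants.set(id, new Set());
    this.grants.get(id)!.add(permission);
  }

  // ...and can withdraw that access again at any time.
  revoke(appId: string, key: string): void {
    this.grants.delete(`${appId}::${key}`);
  }

  // An application only ever sees the items it was granted.
  readAs(appId: string, key: string): unknown {
    const allowed = this.grants.get(`${appId}::${key}`)?.has("read") ?? false;
    if (!allowed) throw new Error(`${appId} has no read access to ${key}`);
    return this.data.get(key);
  }
}

// Example: share an e-mail address with one service but not another.
const vault = new PersonalDataVault();
vault.store("profile/email", "ann@example.org");
vault.grant("newsletter-service", "profile/email", "read");

console.log(vault.readAs("newsletter-service", "profile/email")); // "ann@example.org"
// vault.readAs("ad-network", "profile/email");                   // throws: no access granted
```

The point of the design is that access decisions live with the person who owns the data, rather than with the platforms that want to use it.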

Whichever way you look at it, there is a lot to be done to bring technology into alignment with human rights. “They’re so intimately connected that we have no choice but to balance them against each other. Unfortunately, it’s still very much a work in progress.”

Eva Lievens

Eva Lievens is a professor of Law & Technology at the Faculty of Law and Criminology. Her favourite place at Ghent University is the marvellous courtyard garden at her faculty.

Ruben Verborgh

Ruben Verborgh is a professor of Decentralised Web Technology at the IDLab at Ghent University and imec. He’ll never forget the wise words the rector once told him: “Every researcher has the right to try to catch lightning in a bottle.”

Read also

Ghent University to train the language technologists of the future

At the Faculty of Arts and Philosophy, starting next academic year, you can opt for an educational track to become a language technologist: a profile that appears to be in high demand on the job market.

Accepting cookies? Less innocent than you think

We all do it, accepting cookies without thinking when we visit a website. It seems quite harmless, but it is not. In fact, it is downright dangerous, according to human rights expert Professor Joe Cannataci.

Coach CoDi: the motivation-boosting tool that helps children become independent coders in Scratch

Getting every child coding: that is the shared mission of several Ghent University researchers and the learning platform FTRPRF. Together, with the support of VLAIO, they developed a digital co-teacher for the popular programming language Scratch.

Elisabeth and Rune want to bring people together on an interactive playground

Inventing technological solutions to make local communities stronger, that is the challenge of Samsung Electronics' Solve for Tomorrow ideas competition. Elisabeth and Rune won the Belgian qualifying round at the beginning of May and will soon go to the European finals with their GameYard.
