This is how I would redesign data governance. How would you?

I think of my own elderly mother, someone without a sophisticated smartphone. In the early days of the COVID-19 pandemic, she had difficulty accessing MySejahtera, the contact tracing app mandated for Malaysians. In an instance of digital gating in the first few months of the pandemic, she was not allowed to enter several premises without first showing the app on her phone. While she was resourceful and managed to get help from neighbours with her grocery runs, I think of other groups of people also disenfranchised by the mandated use of a contact tracing app – those with no money to get their own digital devices, people living in remote areas, and many others.

My mother’s is a small case of digital exclusion amidst the pandemic. Similar cases have emerged across the Global South, such as the Thai citizens who were threatened with arrest if they refused to use Mor Chana, the country’s (now discontinued) contact tracing app. These are instances where lived experiences are blatantly ignored and excluded in the process of data governance and policymaking.

A few weeks ago I had the chance to digitally attend the Asia Pacific Regional Internet Governance Forum (APrIGF), held from 12 to 14 September 2022 in Singapore in a hybrid format. In a session called “Advancing data justice in (post)pandemic data governance: Perspectives from Southeast Asia”, a participant posed this question to the speakers:

“If you could redesign data governance, how would you do it?”

The session discussion covered how governments in the region – particularly Indonesia, the Philippines, Singapore, Thailand, Malaysia and Cambodia – approached data governance during the pandemic, along with the broader implications for outcomes, safety, security and privacy. It was surreal to hear how many of the data injustices in the region were shared across these countries, including digital exclusion, data breaches and violations, weak data protection safeguards, lack of transparency in data ownership, digital gating (excluding those not vaccinated or without digital devices from public places), and increased surveillance and policing.

It was apparent from the discussion at the session that several governments in the region not only failed to use technology to mitigate the pandemic in a people-centric way aligned with human rights, but instead used these technologies as a means to exert control over their citizens.

So how would I redesign data governance?

I am particularly fond of questions that ask you to imagine new ways of doing things. They are an invitation to re-examine existing practices and take stock of multiple truths and knowledges. Above all, they are an exercise in hope – that a world accommodating these multiple truths is possible if we work for it.

Data is never neutral. The decisions involved in datafication – from creation, collection, filtering, cleaning, interpretation and results, all the way to governance – are almost always dictated by a select few. These groups are often disproportionately male and privileged in terms of finances, ethnicity, class, geography and more. Their decisions do not reflect the lived experiences of the communities they most impact. These exclusionary, sometimes harmful, decisions can limit communities’ access to resources and services, as well as undermine their safety and privacy. This is power reproducing already existing inequalities, and it is important to think about how these power differentials contribute to the imbalance surrounding data governance.

Bearing this in mind, I would pose the question to myself:

“If I could redesign data governance, how would I do it, bearing in mind the unbalanced power dynamics and its exclusionary nature?”

Here are some of my suggestions:

Address exclusion at all levels of data governance. All levels of data governance should be inclusive and developed alongside diverse stakeholders in the community. Processes involving communities, especially marginalised ones, should make use of co-creation methods to identify needs and ensure everyone has the opportunity to participate. 

Constantly re-evaluate and assess. Just like any technical product that goes through quality control (QC) and testing before deployment, data governance should be subject to evaluation, risk assessment and mitigation, as well as continuous monitoring. These processes should include avenues to raise grievances and offer feedback, with effective follow-up. At the same time, there should be transparent steps to mitigate harm.

Consider techno-political impacts. Developers and engineers often build technological products to serve the business needs of a few (read above: male, with power, money, etc.) without exploring the social and political repercussions. This often leads to data being misused or misinterpreted based on the contexts of those in power. We must protect individuals and communities based on the principles of human rights and “do no harm”. Data collected must also be representative of the community, and disaggregated data must include all genders beyond the male/female binary.

Adopt transparent practices. The data governance I envision needs to be transparent, giving users complete agency over their own data. This means everyone should be informed, in every project, of what data will be collected, how it will be used and stored, who will have access, when it will be deleted, and so on. Data ownership and its limits must also be transparent, especially when data is handed over to anyone outside of the organisation originally commissioning the exercise. People should be free to request data deletion within reason, and documents must be written in easily accessible language and formats.

Consider the possibility that less or no technology may be better. Fighting for an ideal, people-centric data governance means not only advocating for the above; I believe it also means people should have the freedom to opt out of all data collection, within reason. Where representative data is needed to improve services and indicators, we must consider alternatives that prioritise safety and privacy, such as a census survey.

Why don’t you try?

I realise that it is impossible to consider the needs of all stakeholders, given the limitations of our knowledge and backgrounds, our lack of consultation with other parties, and our lack of insight into the future of this fast-moving field of data governance and data justice. However, as digital rights advocates, we would be remiss if we did not imagine a world where people from almost all walks of life are included and their needs taken into consideration. This is where a multistakeholder forum such as APrIGF is so useful. More than just convening multiple groups to discuss, network and learn, it also sparked conversations such as this – conversations of hope and possibility.

And so, I invite you to re-imagine with me:  

“If you could redesign data governance, how would you do it, bearing in mind the unbalanced power dynamics and its exclusionary nature?”


This blog post was produced as part of APC's Digital Rights Collaborative - Southeast Asia, an initiative that aims to strengthen the digital rights movement by gathering a cohort of 15 organisations and individuals from the region who identify as new entrants to the space to advocate for and uphold digital rights. 

Image: Data Governance and SMART cities, @NTusikov lays down the concerns #viznotes by Giulia Forsythe via Flickr (CC BY 2.0) 

Zana Fauzi is a writer, rapporteur and researcher working mainly in the area of digital justice. She holds a Ph.D., with research that looked into the relationship between movement leadership and social media. She is part of APC's Southeast Asia Digital Rights Collaborative.

