Profit motives and other incentives could be harnessed to encourage private sector actors to redesign their technologies to counter and mitigate gendered cyber harms.
Cyber insecurity can be profitable. If companies design technologies without gender-sensitive (cyber)security measures, third parties (including law enforcement, private actors and others) can exploit design choices to serve their own incentives and motivations. But profit motives and other incentives can also be harnessed to encourage private actors to redesign their technologies to counter and mitigate harms. While some technologies may not be gender-sensitive in their design, they can be used to enhance gender-transformative cybersecurity (for instance, by being implemented to directly prevent, monitor and respond to gendered harms). This phenomenon is particularly significant when technologies are readapted and deployed by state and police actors, as explored in Chapter 5.
Each case study provides a commentary on individual experiences of cybersecurity and the gendered cyber harms that individuals may be confronted with. Chapter 3 documents cyber harms faced by queer social media users in Nigeria and South Africa, while the case studies in Chapter 4 expose how these harms can be generated through the commercialization and ‘weaponization’ of sensitive data. Chapter 5 shows how technologies can be used to bolster cybersecurity and combat gender-based violence.
Gendered cyber harms and cyber insecurity, while experienced individually, pose a global security challenge. In cyberspace, insecurity at an individual or community level can have far-reaching global repercussions. Technologies developed and deployed by private actors have a global impact, and technology companies are increasingly involved in developing cybersecurity solutions to advance a stable, peaceful cyberspace in which all people can participate safely and securely.
To this end, the following general recommendations for countering and mitigating harms, and incentivizing gender-transformative cybersecurity, are aimed at private sector stakeholders developing or deploying technologies (including not only technology companies but also data brokers). Other stakeholders can and should seek to assist in their implementation.
Recommendations
Critically evaluate data sharing and cooperation with state entities, adopting a human rights and gender perspective to map potential harms.
In exploring data sharing agreements with state entities (such as law enforcement and national security agencies), private actors need to consider the following questions:
- Is cooperation necessary according to local laws?
- What gendered risks and harms might such cooperation lead to?
- How can these risks be managed or limited?
- What compromises are being made between market incentives (e.g. expanding the reach of a product or identifying new customers) and data privacy, especially for data relating to gender and sexual orientation?
These considerations should be rooted in international standards and principles, such as the UN Guiding Principles on Business and Human Rights. For example, data brokers could pledge not to enter into commercial relationships with organizations or industries that hold discriminatory, exclusionary or harmful stances on reproductive healthcare.
Assess the efficacy of user privacy and data sharing settings, and the accessibility and ease of changing those options, adopting a gender perspective to map potential harms.
Private actors should seek to identify where data collection and sharing are necessary for the functioning of a product, and assess the actual added value of those functions (e.g. for advertising) alongside potential risks – particularly in instances where additional data collection brings extra revenues.
When collecting or handling sensitive data, private actors should implement best practices in data minimization (covering, for example, storage time and location, and anonymization); allow pseudonymous or anonymous access to services; stop behavioural tracking; enable end-to-end encryption by default; and refrain from collecting location-based information. For example, healthcare app providers could refrain from collecting geolocation data that tracks visits to abortion clinics and other similarly stigmatized healthcare facilities. Compliance with regulatory standards on data privacy is an important lever here: for private companies, such compliance is both necessary and a means of mitigating potential risks. These standards should also be applied when private companies consider data sharing agreements with state entities such as law enforcement and national security agencies.
Map technological relationships with commercial partners and potential risks.
Nearly all private sector actors depend on, and operate within, a complex and often poorly understood web of technology and data. Such actors should dedicate resources to mapping and evaluating their technological relationships, including direct and indirect data flows, from a gender-sensitive perspective, with the aim of identifying and mitigating gendered risks. For example, mapping direct and indirect data dependencies between social media platforms is essential for mitigating potential harms to users whose content and personal information may move between platforms without their knowledge or consent.
Implement additional technical features and mitigations that enable users to reduce risks.
Technology designers should proactively identify high-risk situations and contexts faced by their users, and design and implement features that enable users to reduce risks – for example, allowing users to disable location services. Additional features and mitigations should be incorporated into design scoping and assessed as part of the overall profitability of the product, ideally within a ‘security-by-design’ process. The commercial value of additional security features should also be considered, as strong security features may attract more users. Finally, information about opting out and reducing risk must be shared with users proactively, accessibly and regularly (for example, via a clearly labelled reporting function).
Incorporate user experiences and feedback into technology design, redesign and readaptation.
‘Building in’ gender sensitivity at all points of the technology design process is essential for mitigating gendered harms – especially when the technologies themselves may be deployed by third parties to mitigate harms. Particular attention should be given to how the same technologies are accessed and used by people of different genders, and to the efficacy of cybersecurity measures for affected individuals. This can be partially achieved through incorporating user experience and feedback, whether through research, consultations or demonstrations.
Without sufficient safeguards and gender sensitivity from the outset – drawing directly on individual experiences and implementation contexts – technologies can intentionally or inadvertently further entrench harms faced by marginalized groups (for example, through the over-censorship of online content or over-surveillance).
Build internal, independent gender expertise and connect to international networks for best practices.
Technology companies should nominate internal gender champions with sufficient seniority and independence from commercial decision-making processes. These champions should be supported with sustainable resources and training, and be authorized to develop information sharing networks internally. The latter is an important step towards incorporating gender into broader corporate social responsibility commitments. By connecting to existing international, multi-stakeholder networks, events and initiatives, private sector actors can share and further develop best practices, and facilitate public-private partnerships for the redesign and readaptation of technologies for the purpose of harm mitigation.