As technology becomes integrated into more aspects of our lives, the profile of risks associated with technology is also expanding. New advances in many kinds of technologies pose potentially significant ethical challenges (e.g. ‘Artificial intelligence’ (AI), ‘Biotechnology’, or decarbonization technologies). This coincides with our increasing use of these technologies, creating potential risks at a macro-level (e.g. cybersecurity of a nation’s critical infrastructures) and at a micro-level (e.g. security of personal data and individuals’ vulnerability to online manipulation). Such risks are to be expected with the advent of disruptive technologies and are the price we pay for the great benefits these technologies offer us; the question is how well we recognize and mitigate these risks so as to ensure that new technologies can be used for the benefit of all.
Society trends
Many governments around the world are turning their attention to the ethics of technology and the implications of fast-developing technology for future societies.
The ethics of using ‘Artificial intelligence’ for automated vehicles, automated decisions, and consumer interactions is a frequently raised topic,[1] and governments will increasingly be expected to address concerns around digital harm, disinformation, antitrust and foreign interference.[2] According to the UK Ministry of Defence, the AI-enabled technologies of the future must benefit from effective ‘technical, legal, and ethical frameworks’. Ethical questions are perhaps most critical in the area of militarized AI and the use of technology in conflict. While machines could behave without regard for human suffering, they may also calculate the costs of conflict more accurately. Complexities can be expected to arise if countries develop conflicting ethical and legal frameworks for AI, both in military contexts and more broadly.[3] Other key ethical issues related to AI systems include unwanted bias, eavesdropping, and safety, and industry is already working to address these. The ISO/IEC committee on AI (ISO/IEC JTC 1/SC 42) has collected 132 use cases for AI, documenting the ethical considerations and societal concerns for each (for more details, see ISO/IEC TR 24030:2021, Information technology – Artificial intelligence (AI) – Use cases).
When considering the ethics of using AI, however, it is equally important to consider the ethics of not using AI. The risks of using AI are frequently discussed, but one question that is not addressed often enough is: when does it become unethical for us not to use AI? For example, if AI technology could predict the next pandemic or speed up vaccine development, one could argue that it would be unethical not to use this technology. There are plenty of examples like this. For instance, a commonly posed question is: if an AI-enabled autonomous vehicle had to hit someone, who should it hit? But is this the right question if the proper use of AI-enabled autonomous driving can help save lives by reducing accidents overall?
Of course, AI is not the only emerging technology that could pose significant ethical challenges in the future. Advancements in biotechnology could – alone, or in combination with AI – lead to the creation of synthetic life forms or augmented human beings with enhanced physical or cognitive abilities. How to regulate technologies that can fundamentally alter human capabilities or change the human gene pool “could prompt strident domestic and international battles” in coming decades (see ‘Gene editing’).[4] Even technological advances to treat diseases could engender political debates about the ethics of access (since treatments are likely to be available only to those who can afford them).[4] That is not to mention the continued ethical debates about genetically engineered crops and foods and their potential ecological or health-related consequences.[5]
As the climate crisis becomes more urgent, we may also soon face ethical issues related to the use of new technologies for decarbonization. While geoengineering technologies (carbon dioxide [CO2] removal and solar-radiation management) have for many years been considered morally unacceptable, they are now gaining more attention as potential solutions of last resort.[6] Ethical concerns here range from distributive justice for future generations or vulnerable populations (negative effects of geoengineering actions could disproportionately affect some countries or populations, e.g. by increasing drought in Africa and Asia) to procedural justice questions (who should decide whether to use these technologies, and how?).
- Published 20 Standards | Developing 32 Projects
- Information technology – Artificial intelligence – Guidance on risk management
- Information technology – Artificial intelligence (AI) – Use cases
- Information technology – Artificial intelligence – Overview of ethical and societal concerns
- Published 3 Standards | Developing 1 Project
- Road traffic safety (RTS) – Guidance on ethical considerations relating to safety for autonomous vehicles
- Published 37 Standards | Developing 7 Projects
- Ethical claims and supporting information – Principles and requirements
“Trust and accountability are the new litmus tests for businesses in a world where digital is everywhere.”[7]
In the future, will data privacy be a thing of the past? Many sources agree that there is a clear trend towards a progressive loss of privacy accompanying new developments in technology. According to the UK Ministry of Defence, “In the coming decades, every facet of one’s life is likely to be recorded by the ubiquitous presence of wearable devices, smart sensors and the ‘Internet of Things’”.[3] At the same time, there is also a trend towards emphasizing privacy, for example through privacy-by-design development. Once privacy-respecting technology is available, the market has a choice, and the global success of the European Union’s General Data Protection Regulation (GDPR) principles is an indicator of this trend.[8]
The use of biometric data, such as fingerprints and facial mapping, is increasing in both private (e.g. social media and personal technology products) and public (law enforcement and population surveillance) contexts.[9,10] Consumer trust will become an ever more important issue as technology becomes more pervasive in everyday activities. Already, a majority of consumers are wary of connected devices and fearful of misuse of their personal data.[7,11] Some even suggest there may be a ‘digital bubble’, the bursting of which would be due in part to privacy concerns – “Concerns about data privacy have called into question whether digital technologies will continue to grow at this rate.”[11] At the same time, companies are adjusting to market conditions and, if the market demands privacy, industry will develop appropriate products.[7] Industry needs to realize that privacy-respecting products are not much more expensive (if well done) and can instead provide a competitive advantage, since trust is a key decision factor for consumers faced with multiple options. Initiatives allowing the creation of ‘digital trust’, such as Yelp and Foursquare, are thus likely to grow in popularity.[12] Once society acknowledges that data has a value, and that the data owner therefore needs to be paid, a ‘new balance’ will be established. The question is if, and when, such an acknowledgement will come.
In the meantime, to reassure consumers, both government regulation and business leadership are necessary to establish privacy and data management standards that keep pace with emerging needs.[10] Indeed, this will be a growing consumer expectation.[7] Ultimately, it seems inevitable that technology will permeate almost everything we do and lead to enormous improvements in quality of life across society. However, these benefits will need to be carefully balanced with the accompanying risks to privacy and security.[12]
- Published 237 Standards | Developing 61 Projects
- ISO/IEC 27009:2020 [Withdrawn] – Information security, cybersecurity and privacy protection – Sector-specific application of ISO/IEC 27001 – Requirements
- Security techniques – Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management – Requirements and guidelines
- Published 2 Standards
- ISO/DIS 31700 [Deleted] – Consumer protection – Privacy by design for consumer goods and services
Increasing reliance on technology and the proliferation of digital devices in daily life will create growing risks related to ‘Data privacy’, cyberattacks, and the consequences of system failure.[3,13] The key factors for prevention are risk awareness and proactive risk mitigation.
New digital technologies present serious challenges for governments and organizations, and cybersecurity will remain a priority as critical infrastructure is increasingly connected to online systems and technological dependence on the Internet continues to rise (see ‘Spread of the Internet’). Internationally, countries will have to respond to evolving cyber-threats and prepare for cyberattacks used as instruments of war, counterintelligence, and political interference.[9,13,14] A single data breach can impact multiple nations sharing online systems.[15] National leaders who are aware of these risks can take appropriate steps to protect large-scale systems such as electrical, communications, financial, logistical, and food-production grids,[9] but they need to be proactive. The Common Criteria for Information Technology Security Evaluation and the EU Cybersecurity Act are two examples of such proactive ventures.
Questions around ‘cyber borders’ may become part of the discussion around ensuring protection from attacks; countries and organizations alike must therefore prepare for developments in cyber-crime.[3] As increasing numbers of citizens become connected to, and reliant on, online networks, the potential for terrorist attacks will grow if systems are not sufficiently resilient and protected.[9] For developing countries in particular, preparedness for cyber-threats will need to accompany digitalization programmes and the development of connected systems.[16]
Finally, cyber-vulnerability does not exist only at the level of countries and organizations. Looked at from a slightly different perspective, the vulnerability of individuals is also set to increase because of their online exposure. For example, more people will get their information online, leaving them potentially more exposed to misinformation (‘fake news’), which could be used to manipulate individuals or, on a larger scale, to influence public opinion.[13]
To effectively mitigate these risks related to cyber-vulnerability, people cannot rely on government action alone – society needs to be the driving force. It must demand that organizations maintain highly sophisticated information security systems in order to foster consumer trust and remain competitive.[2]
- Published 237 Standards | Developing 61 Projects
- Cybersecurity – Multi-party coordinated vulnerability disclosure and handling
- Information technology – Security techniques – Vulnerability disclosure
- Information technology – Security techniques – Vulnerability handling processes
- Published 162 Standards | Developing 31 Projects
- Road vehicles – Cybersecurity engineering
- Published 232 Standards | Developing 59 Projects
- ISO/CD TS 6268-1 [Under development] – Health informatics – Cybersecurity framework for telehealth environments – Part 1: Overview and concepts
- Health informatics – Device interoperability – Part 40101: Foundational – Cybersecurity
References
- Digital megatrends. A perspective on the coming decade of digital disruption (Commonwealth Scientific and Industrial Research Organisation, 2019)
- The global risks report 2021 (World Economic Forum, 2021)
- Global strategic trends. The future starts today (UK Ministry of Defence, 2018)
- Global trends. Paradox of Progress (US National Intelligence Council, 2017)
- Global trends 2040. A more contested world (US National Intelligence Council, 2021)
- Ethics of geoengineering (Viterbi Conversations in Ethics, 2021)
- Technology vision 2020. We, the post-digital people (Accenture, 2020)
- Two years of GDPR. Questions and answers (European Commission, 2020)
- Global trends and the future of Latin America. Why and how Latin America should think about the future (Inter-American Development Bank, Inter-American Dialogue, 2016)
- 20 New technology trends we will see in the 2020s (BBC Science Focus Magazine, 2020)
- Beyond the noise. The megatrends of tomorrow's world (Deloitte, 2017)
- Future outlook. 100 Global trends for 2050 (UAE Ministry of Cabinet Affairs and the Future, 2017)
- Global trends to 2030. Challenges and choices for Europe (European Strategy and Policy Analysis System, 2019)
- Global risks 2035 update. Decline or new renaissance? (Atlantic Council, 2019)
- Asia pacific megatrends 2040 (Commonwealth Scientific and Industrial Research Organisation, 2019)
- Foresight Africa. Top priorities for the continent 2020-2030 (Brookings Institution, 2020)