Authored by: Ben Howkins and Julie Vinders
Reviewed by: Corinna Pannofino and Anaïs Resseguier
Technological innovation can both enhance and disrupt society in various ways. It often raises complex legal questions about whether existing laws are suited to maximising the benefits to society, whilst mitigating potential negative consequences. Some technologies, whose development and application are not yet fully realised and whose potential lies in the future, even challenge us to rethink the ways in which our fundamental rights as human beings are protected by law.
For example, how do your rights as an individual stack up if your local environment is affected by climate engineering, a technology family that TechEthos defines as enabling the modification of natural processes to address global climate change? Or what would happen to the right not to self-incriminate if advanced neurotechnologies could provide insights into a defendant’s mental state in the courtroom? And how might digital extended reality (XR) affect online safety and the emerging rights to be online and to disconnect?
A human rights impact assessment
A recent study by the TechEthos project analysed international and European Union (EU) human rights law, including the International Bill of Human Rights, the European Convention on Human Rights (ECHR), and the Charter of Fundamental Rights of the EU (CFREU), in relation to climate engineering, neurotechnologies and digital extended reality (XR). While such legal frameworks do not explicitly mention these technologies, many of the provisions contained therein are nonetheless likely to be directly applicable. By highlighting the potential for enhancements to and interferences with various human rights, the study essentially provides a human rights impact assessment of the three technology families. It identifies gaps and legal uncertainties, which may give rise to the need for further legislation, or at least further legal clarification, in the future.