[PHILOS-L] JOB: 3-year Post-doc on the Philosophy of Military AI at TU Delft

Stefan Buijsman

Jul 12, 2024, 5:36:39 PM
to PHIL...@liverpool.ac.uk


TU Delft has a 3-year postdoc position open as part of the ELSA Lab Defence and the TU Delft Digital Ethics Centre:

Military AI-enabled systems pose serious ethical, legal and societal challenges. There should be meaningful human control (with clear responsibilities for the people involved) over the development and deployment of automated processes in military operations. Such control should accommodate concerns related to the increasing levels of AI-enabled machine autonomy in defence, as well as AI technologies that play a role in providing situational awareness, collecting intelligence or conducting cognitive warfare. All these applications have serious ethical and legal ramifications, as they may bias decision-making, hand over some degree of autonomy to machines, and affect human agency, human dignity and human rights. We therefore need systematic consideration of the ethical, legal and societal aspects (ELSA) of the use of AI in the defence domain.

These considerations should ultimately inform us which AI-enabled systems are acceptable and which are not, and under what conditions or circumstances. The current lack of clarity on ELSA concerns over AI leads to both “over-use” (e.g., deploying too many AI systems in too many situations, without considering the consequences) and “under-use” (e.g., not using AI at all, due to a lack of knowledge or a fear of the consequences). Both over-use and under-use of AI in defence pose risks for protecting the freedom, safety and security of society.

This post-doc position will focus on a methodology for the safe and responsible use of AI in the defence domain, with particular interest in questions of responsibility and accountability of the people involved. The methodology will have to ensure ethical, legal and societal alignment in all stages of the design, acquisition and operationalization of autonomous systems and military human-machine teams. The project may also identify co-design methods for human-machine teams that can be used to achieve ethical, legal and societal compliance. It will help identify the algorithms to be used to ensure this compliance, and will support efforts to incorporate ethical, legal and societal aspects in a system-of-AI-systems.

The position is part of the ELSA (ethical, legal and societal aspects) Lab Defence, granted under the NWA call “Human-centred AI for an inclusive society – towards an ecosystem of trust”, which focuses primarily on use cases for AI in countering cognitive warfare, in (autonomous) drones, and in (non-lethal) autonomous robots. The successful candidate will work under the direct supervision of Stefan Buijsman, assistant professor in the ethics of technology, and Mark Neerincx, full professor in Human-Centered Computing. In addition, you will work with the partners in the ELSA Lab consortium, who have ample experience with (military) AI systems, with a focus on human factors (TNO), legal (Leiden University, Asser), ethical (TU Delft), societal (HHS), and technical (NLDA) aspects. Furthermore, the position is part of the Delft Digital Ethics Centre, a group of 30 philosophers of digital technologies who work on ethics by design in a wide range of domains.

The successful candidate is expected to play an active role in the ELSA Lab project described above and to participate actively in the (stakeholder) workshops, public events, and other project activities.

Salary: €3,266 - €5,090
Deadline: September 1st



Philos-L "The Liverpool List" is run by the Department of Philosophy, University of Liverpool: https://www.liverpool.ac.uk/philosophy/philos-l/. Messages to the list are archived at http://listserv.liv.ac.uk/archives/philos-l.html. Recent posts can also be read in a Facebook group: https://www.facebook.com/PhilosL/. Follow the list on Twitter @PhilosL and the Department of Philosophy @LiverpoolPhilos. To sign off the list, send a blank message to philos-l-unsub...@liverpool.ac.uk.
