Sparrow Technologies


Azalee Freas

Aug 5, 2024
SparrowQuantum has a devoted team of experts who design, develop, and produce high-quality photonic components that can elevate your quantum projects. You can rely on our cutting-edge building blocks and exceptional knowledge to help you succeed in photonic quantum technology and applications.

Solid-state deterministic single-photon sources are expected to become the building blocks of future quantum technologies spanning from quantum communication and computing to quantum simulations, sensing, and metrology.


Unlike probabilistic single-photon sources, deterministic quantum emitters deliver high on-demand single-photon rates with excellent photonic quantum properties. This combination is essential for any quantum experiment or application that uses single photons as carriers of quantum information.


Lodahl, who currently heads the Hybrid Quantum Networks (Hy-Q) Center of Excellence, was at that time studying the control of light emission using intricate photonic nanostructures. What started out as fundamental research began to reveal commercial possibilities, particularly as the quantum technologies market matured.


The goal of the company, which Lodahl officially founded in 2015, is as simple as it is ambitious: to become the market leader in quantum light-matter interfaces for commercial quantum technologies.


The Sparrow Quantum team is making progress toward that goal by offering a product to researchers today and by engaging with the companies that will make up its growing customer base: vendors and providers of photonic quantum computing systems, as well as quantum communication and quantum internet companies.


Sparrow Advantages

According to Lodahl, Sparrow Quantum equipment offers distinct advantages for its clients.

Perhaps the most important advantage is the decades of fundamental research and years of experience applying the technology for commercial use. This experience provides a solid foundation of knowledge, expertise, and best practices that can be leveraged to optimize and improve the technology for real-world applications.


In this interview, Professor Robert Sparrow of Monash University speaks with EIA managing editor Adam Read-Brown about his work on ethical issues raised by new technologies. The conversation focuses on Autonomous Weapon Systems (AWS), often referred to as "killer robots." Unlike drones, which are remotely operated by humans, with AWS the robot itself determines who should live or die. What are the ethical arguments for and against these killing machines? Sparrow's article for EIA on this topic appeared in the Spring 2016 issue.


My name is Adam Read-Brown and I'm the managing editor of Ethics & International Affairs, the Council's quarterly peer-reviewed journal, which is now in its 30th year and is published by Cambridge University Press.


With me today is Professor Robert Sparrow, author of the article "Robots and Respect: Assessing the Case Against Autonomous Weapon Systems," which appears in the Spring 2016 issue of the journal, published earlier this year.


ADAM READ-BROWN: Speaking with me from Australia, Robert Sparrow is a professor in the philosophy program, a chief investigator in the Australian Research Council Centre of Excellence for Electromaterials Science, and an adjunct professor in the Centre for Human Bioethics at Monash University, where he works on ethical issues raised by new technologies. He is the author of some 75 refereed papers and book chapters on topics as diverse as the ethics of military robotics, aged-care robotics, just war theory, human enhancement, pre-implantation genetic diagnosis, and nanotechnology. He is the co-chair of the IEEE Technical Committee on Robot Ethics and was one of the founding members of the International Committee for Robot Arms Control.


To start, Professor Sparrow, the subheading of your article in Ethics & International Affairs is "Assessing the Case Against Autonomous Weapon Systems." This term, "autonomous weapon systems," commonly abbreviated as AWS, might not be familiar to all of our listeners. So could you briefly describe what we are referring to when we use this term? What are the characteristics and capabilities of these weapons?


ROBERT SPARROW: Colloquially, we're talking about "killer robots." Here it's important to distinguish between remote-controlled weapon systems, like the Predator and Reaper drones that the United States deploys around the world, and systems where an onboard computer is choosing the targets for the system.


Now, there is controversy in the literature about what we mean by "choosing the targets" here. But the basic idea is that, in some sense, it's the weapon system itself that is determining who should live or die. While of course it's possible that some systems might carry out large parts of their operations autonomously, I and others writing in this area are particularly interested in autonomous targeting, and especially autonomous targeting using lethal force. So essentially, machines that decide who to kill.


Inevitably, this debate features both of those tendencies. People who want to be deflationary about autonomous weapon systems might point to a modern anti-tank mine and say, "Here's a weapon that is not remote-controlled, is not active all the time, and decides when to explode." Or you could point to an anti-submarine weapon in the tradition of naval mines, the CAPTOR system. This was a United States weapon system, essentially a tethered torpedo with a sensor package, which fired when it detected a submarine in the area. Now, if you want to think of those weapons as choosing who to kill, then autonomous weapon systems have in fact been around for a long time.


If you think that the required standard of choice can't merely be automatic, or follow very narrow procedures for determining when to launch the weapon, then you might see autonomous weapon systems as on the horizon. And there are things like cruise missiles, which already have a limited capacity to determine which of a list of targets they will strike.


And then, of course, there's an enormous amount of interest in connecting the sorts of data-mining and pattern-recognition algorithms that are widely used now across a range of military applications to the sensor systems and the targeting systems of weapons to give them this kind of capacity to choose from a range of targets.


ROBERT SPARROW: I was originally interested in robots and computers because they are interesting examples with which to investigate some traditional philosophical arguments, particularly around the moral status of our fellow creatures. If you think of someone kicking a robot dog, you can think about the concept of virtue and the concept of a virtuous agent without having to deal with intuitions about the pain and suffering of the robot. So you can investigate the contribution that the human end of our relations with other creatures makes.


I was originally writing about the moral status of hypothetical future artificial intelligences and the role played by the form of their embodiment in establishing their moral status. Then, I was interested in the ethics of what's called "ersatz companionship." So robots are being designed for the aged-care context as companions for lonely older citizens. I was interested in how much of a contribution that would make to human well-being.


But in the course of doing that research, I realized just how much robotics research was actually being funded by the military. Essentially, the vast majority of cutting-edge robotics research is funded by military programs.


I remember that a famous, essentially puff piece for an early guided weapon system was titled "This bomb can think before it acts." [Air Armament Center Public Affairs Report (2000) 'This bomb can think before it acts,' Leading Edge Magazine, 42(2), Feb., p. 12.] I remember being struck by that and thinking, "Well, in what sense is this true? Could it be true?"


So I wrote a paper about autonomous weapon systems arguing about who might be held responsible when they commit a war crime: whether you'd hold the weapon itself, the programmer, or a commanding officer responsible when a hypothetical autonomous weapon system deliberately attacks a refugee camp, for instance. That paper turned out to be quite influential and marked the beginning of a debate about the ethics of autonomous weapon systems.


Since then, I've been more interested in weapons that are closer to application. So I've been writing about drones and about autonomous submersibles, where I've been less focused on the possibility that these might be artificial intelligences and more on something nearer term, where they're either remote-controlled or autonomous in some sense without our wanting to describe them as potentially full moral agents.


ADAM READ-BROWN: I'd like to turn now to those ethical issues that you alluded to and to your research on them, specifically some of the issues you bring up in the article you wrote for Ethics & International Affairs. If you would, I'd like to have you start by laying out some of the basics of your argument surrounding robots and respect.


ROBERT SPARROW: This is a category of weapons that is hypothesized, or proposed, or identified within the just war tradition as being "evils in themselves." They are weapons that should never be used in war.


Now, it quickly becomes quite hard to explain what unites this category. But throughout the history of warfare, there has been a strong intuition that some weapons are simply wrong: that we shouldn't use them even if they have military advantages, perhaps not even if victory is at stake, because to use these kinds of weapons is to violate a profound moral imperative. The traditional examples are things like poisonous gases, fragmenting bullets, and rape as a weapon of war. There's a significant portion of the international community that thinks nuclear weapons are mala in se and that it would never be ethical to use them.
