From the U.S. Army's Mad Scientist Laboratory blog:
183. Ethics, Morals, and Legal Implications
[Editor’s Note: The U.S. Army Futures Command (AFC) and Training and Doctrine Command (TRADOC) co-sponsored the Mad Scientist Disruption and the Operational Environment
Conference with the Cockrell School of Engineering
at The University of Texas at Austin on 24-25 April 2019 in Austin,
Texas. Today’s post is excerpted from this conference’s Final Report and
addresses how the speed of technological innovation and convergence
continues to outpace human governance. The U.S. Army must not only
consider how best to employ these advances in modernizing the force, but
also the concomitant ethical, moral, and legal implications their use
may present in the Operational Environment (see links to the newly
published TRADOC Pamphlet 525-92, The Operational Environment and the Changing Character of Warfare, and the complete Mad Scientist Disruption and the Operational Environment Conference Final Report at the bottom of this post).]
Technological advancement and its subsequent employment often outpace
moral, ethical, and legal standards. Governmental and regulatory bodies
are then caught between technological progress and the evolution of
social thinking. The Disruption and the Operational Environment
Conference uncovered and explored several tension points that the Army
may be challenged by in the future. Space Cubesats in LEO / Source: NASASpace
is one of the least explored domains in which the Army will operate; as
such, we may encounter a host of associated ethical and legal dilemmas.
In the course of warfare, if the Army or an adversary intentionally or
inadvertently destroys commercial space infrastructure – such as GPS
satellites – the ramifications for the economy, transportation, and
emergency services would be dire and deadly. The Army will be challenged
to consider how and where National Defense measures in space affect
non-combatants and American civilians on the ground. Per proclaimed Mad Scientists Dr. Moriba Jah and Dr. Diane Howard, there are ~500,000 objects
orbiting the Earth posing potential hazards to our space-based
services. We are currently able to track less than one percent of
them — those the size of a smartphone or softball or larger.

Source: NASA Orbital Debris Office

International governing bodies may have to consider what
responsibility space-faring entities – countries, universities, private
companies – will have for mitigating orbital congestion caused by
excessive launching and the aggressive exploitation of space. If the
Army is judicious with its own footprint in space, it could reduce the
risk of accidental collisions and unnecessary clutter and congestion. It
is extremely expensive to clean up space debris, and deconflicting
active operations is essential. With each entity acting in its own
self-interest, with limited binding law or governance and no
enforcement, overuse of space could lead to a “tragedy of the commons”
effect.1
The Army has the opportunity to more closely align itself with
international partners to develop guidelines and protocols for space
operations to avoid potential conflicts and to influence and shape
future policy. Without this early intervention, the Army may face
ethical and moral challenges in the future regarding its addition of
orbital objects to an already dangerously cluttered Low Earth Orbit.
What will the Army be responsible for in democratized space? Will there
be a moral or ethical limit on space launches?
Autonomy in Robotics

AFC’s Future Force Modernization Enterprise of Cross-Functional Teams,
Acquisition Programs of Record, and Research and Development centers
executed a radio rodeo with Industry throughout June 2019 to inform the
Army of the network requirements needed to enable autonomous vehicle
support in contested, multi-domain environments. / Source: Army.mil

Robotics has been pervasive and normalized in military operations in
the post-9/11 Operational Environment. However, the burgeoning field of
autonomy in robotics with the potential to supplant humans in
time-critical decision-making will bring about significant ethical,
moral, and legal challenges that the Army and the larger DoD are currently
facing. This issue will be exacerbated in the Operational Environment by
an increased utilization and reliance on autonomy.
The increasing prevalence of autonomy will raise a number of
important questions. At what point is it more ethical to allow a machine
to make a decision that may save lives of either combatants or
civilians? Where does fault, responsibility, or attribution lie when an
autonomous system takes lives? Will defensive autonomous operations –
air defense systems, active protection systems – be more ethically
acceptable than offensive – airstrikes, fire missions – autonomy? Can
Artificial Intelligence/Machine Learning (AI/ML) make decisions in line
with Army core values?
Deepfakes and AI-Generated Identities, Personas, and Content

Source: U.S. Air Force

A new era of Information Operations (IO) is
emerging due to disruptive technologies such as deepfakes – videos that
are constructed to make a person appear to say or do something that
they never said or did – and AI Generative Adversarial Networks (GANs)
that produce fully original faces, bodies, personas, and robust
identities.2
Deepfakes and GANs are alarming to national security experts as they
could trigger accidental escalation, undermine trust in authorities, and
cause unforeseen havoc. This is amplified by content such as news,
sports, and creative writing similarly being generated by AI/ML
applications.
This new era of IO has many ethical and moral implications for the
Army. In the past, the Army has utilized industrial and early
information age IO tools such as leaflets, open-air messaging, and cyber
influence mechanisms to shape perceptions around the world. Today and
moving forward in the Operational Environment, advances in technology
create ethical questions such as: Is it ethical or legal to use cyber or
digital manipulations against populations of both U.S. allies and
strategic competitors? Under what title or authority does the use of
deepfakes and AI-generated images fall? How will the Army need to
supplement existing policy to include technologies that didn’t exist
when it was written? …