Hello Pima, I really didn't understand your comments about STEM. I have selected an Architectural major and am now leaning toward a Robotics Engineering major. How difficult would this be?
Talk about barking up the right tree.
granpa, I would suggest reading this before requesting a change in major.
I don't think there is a Robotics Engineering major at most schools. Robotics is generally part of the Mechanical Engineering department, and I know Mechanical is an approved major.
From yesterday's Wall Street Journal:
KILLER DRONES ARE SCIENCE FICTION
By WERNER J.A. DAHM
There's been a lot written lately about a future in which our military might use autonomous "killer drones" to hunt, identify and kill human targets. We owe it to a public made uneasy by this—rightfully so—to point out that while such stories make intriguing science fiction, autonomous lethal military strikes are unlikely to occur for a very long time, if ever.
I should know. As the chief scientist of the U.S. Air Force, I led a major 2010 study, called "Technology Horizons," on the technology-enabled capabilities that we will need to meet the challenges we face in the next 10-20 years. The first volume of that report is publicly available.
It describes the growing role of autonomous systems in reducing the Air Force's manpower costs, increasing its capabilities, and meeting the demands of modern warfare. Crucially, though, the report documented why we will continue to keep humans "in the loop" even as we move toward such increasingly sophisticated autonomous systems.
First, it's not technology that has held us back from fully autonomous military strikes—from a purely technical perspective, it has been possible for some time to conduct them. Nor are the restraints merely legal and ethical. Instead, there is a less obvious technical reason why the military is unlikely to employ fully autonomous lethal strikes.
The key is to understand that regardless of whether a military strike is conducted autonomously or with human involvement, it is not an isolated act. The actual launching of a weapon onto a target is one step in a sequential process that the military refers to as the "find-fix-track-target-engage-assess" chain.
Each step in this chain is essential to enabling the following one, and each step takes time to complete. The entire chain takes substantial time, and that is why we look for ways to shorten these steps. Autonomy can reduce the total time needed to complete the chain, but let's look more closely to see where autonomy provides significant war-fighting benefits.
The longest times in the chain come from the "find, fix and track" steps—together with the "assess" and, to a lesser extent, "target" steps. By comparison, the "engage" step—in which commanders make the decision to commit a weapon onto a target—is relatively short. As a consequence, shortening the "engage" step even further by making it autonomous does almost nothing to shorten the overall chain.
For that simple reason, there is essentially no disadvantage to keeping humans involved at least in the "engage" step. And that is why there has been virtually no demand from war-fighters to autonomize that critical step.
Simply put, until the "engage" step becomes the longest single part of the chain, there is no benefit to making it fully autonomous and nothing lost by maintaining human supervisory intervention at that step. Thus it's not technology that prevents fully autonomous strike, nor is it our cultural resistance or even the fact that we don't have the legal or policy tools to permit fully autonomous strike. Instead, it is the simple fact that we don't gain anything from it.
Enjoy the science fiction, but expect to see humans "in the loop" for a long, long time to come.
Mr. Dahm, the director of the Security and Defense Systems Initiative at Arizona State University, is a former chief scientist of the U.S. Air Force.