Tuesday, January 25, 2022
Fun Facts to Know and Tell: RPi on an SSD, Microsoft Linux
Informed Delivery
The U.S. Postal Service (USPS) offers a feature called Informed Delivery (ID), free to residential customers. I signed up for it. On most days when mail is delivered - around 7AM local time for me - I receive an email containing black & white scanned images of the paper mail I can expect to receive at my street address in the next few days. Below is an example image of some junk mail (with some editing on my part) from this morning's email.
The USPS has to scan all paper mail. That's the only scalable way to sort and route it. They have to scan it whether I sign up for ID or not. Considering the volume of mail, the variations in address formats, and the support for even handwritten envelopes, it's a remarkable technological achievement.
But now I wonder: who else can get this ID email for my address? Can law enforcement request it? Does it require a subpoena or a search warrant? Or is it considered public information, like the stuff in your trash bin waiting for pickup at your curb? Do the laws restricting domestic surveillance prevent the CIA or NSA from receiving it? What about the FBI or the DHS? Who else might have access to it? Can it be used to construct a vast network of implied communication, much as intelligence organizations do today with social media accounts?
Maybe this is how conspiracy theories get started.
Friday, January 14, 2022
Human-Machine Teaming and Autonomous Lethal Weapons Systems
I've been doing a lot of reading - and thinking - lately about autonomous lethal weapons systems. I've never helped develop one, but certainly the skills required to do so are in my wheelhouse. I'm philosophically opposed to them; I'm a fan of Isaac Asimov's Three Laws of Robotics. Yet I also believe that autonomous weapons systems are inevitable, and probably necessary. Such systems - e.g. armed autonomous flying drones used in land or sea battles - can go-where and do-what humans cannot. Physics is ruthless.
If our adversaries use them, I don’t see that we will have any choice but to do so as well in order to remain competitive on the battlefield. There’s a strong economic (and possibly even humanitarian, if they can reduce human error and danger to civilian populations) incentive to use autonomous lethal weapons. They will be particularly attractive to smaller first-world states, or any organization exploiting asymmetric warfare. Such systems may be fully autonomous, partially autonomous, or optionally autonomous. Combining a human operator with automation (a term I prefer to "artificial intelligence") is a form of human-machine teaming (HMT).
Lots of people in other domains are thinking about this. In its SAE J3016 standard, the Society of Automotive Engineers defines six levels of driving automation for vehicles, ranging from 0 (fully manual) to 5 (fully autonomous).
It occurs to me that a similar scheme might be applied to weapons systems. Here is one way J3016 might be adapted:
- 0 - the human operator has full control over the weapon system at all times.
- 1 - automation may assist the human operator with targeting, stabilization, etc.
- 2 - the human operator may relinquish control to the automation but can override its decisions.
- 3 - the automation may take control if it detects the human operator is impaired.
- 4 - the automation operates the weapon but a human operator is still required to approve a kill decision.
- 5 - the automation makes the kill decision without any human guidance or approval.
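The adapted levels above can be modeled as a simple ordered taxonomy. Here is a minimal sketch in Python; the level names and the approval rule are my own invention for illustration, not part of SAE J3016 or any weapons standard.

```python
from enum import IntEnum


class WeaponAutonomyLevel(IntEnum):
    """Hypothetical adaptation of SAE J3016's six levels to weapons
    systems. Names and semantics are illustrative assumptions."""
    MANUAL = 0          # human operator has full control at all times
    ASSISTED = 1        # automation assists with targeting, stabilization, etc.
    SUPERVISED = 2      # automation may take control; human can override
    IMPAIRMENT = 3      # automation takes control if the operator is impaired
    HUMAN_APPROVED = 4  # automation operates; human must approve a kill decision
    FULL = 5            # automation makes the kill decision unaided


def kill_requires_human(level: WeaponAutonomyLevel) -> bool:
    """At every level below 5, a human remains in (or on) the loop
    for the kill decision itself."""
    return level < WeaponAutonomyLevel.FULL
```

Because `IntEnum` values are ordered, the single comparison captures the key policy boundary in this taxonomy: level 5 is the only level at which no human approves the use of lethal force.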
(H/T to John Stuckey for the inspiration for this.)
Friday, January 07, 2022
Test What You Detonate, Detonate What You Test
Another great article from the national security blog War On The Rocks: "When Software Bugs Go Nuclear: Testing A Digital Arsenal" by Laura Epifanovskaya, a researcher formerly in the U.S. Department of Energy's nuclear weapons program.
Motivated by the fact that the U.S. nuclear weapons stockpile is transitioning from analog to digital control systems - so as to interface with the digital systems in the latest generation of strategic weapons delivery platforms - she writes about the need, in critical reliability systems, for formal methods, Design For Test (a design methodology that applies to software as well as hardware), and NASA's motto "Test what you fly, fly what you test".
There is a lot here that is applicable not just to nuclear weapons, but to any complex high-technology system that absolutely, positively has to work right the first time - and never work when it's not supposed to. Her article also has some interesting tidbits about how nuclear weapons are tested without actually setting off a fusion reaction. Fascinating stuff.
Wednesday, January 05, 2022
Unintended Consequences of the Information Economy IV
Daniel Kim, CTO of Geosite, a company that provides geospatial tools, wrote an eye-opening essay in the national security blog War on the Rocks. In "Startups and the Defense Department's Compliance Labyrinth", he describes what companies have to go through to comply with the enormous, complex, and often redundant, conflicting, and ever-changing requirements for doing business with the U.S. federal government, and especially with its Department of Defense. Total initial compliance cost for Geosite: US$300,000. Compare this with the typical size of a contract that start-ups in the U.S. government's Small Business Innovation Research (SBIR) program receive - about US$1,000,000 - and the cost of compliance could be more than a quarter of the entire budget.
Much of the overhead is in the realm of cybersecurity. No one can fault the DoD for requiring stringent security mechanisms. But it does place contracting with the DoD beyond the reach of many small- to medium-sized companies. And even for large companies, it is an incentive to seek revenue in the commercial sector, where it is more easily made.
Furthermore, the technical work necessary for compliance either takes time away from the core technical team in smaller organizations, or requires hiring (and paying) additional staff with hard-to-come-by (and expensive) skill sets. As I am constantly reminded when I chat with a friend of mine who makes her living as a cybersecurity engineer, there is not a lot of overlap between the skill sets of folks who do the kinds of work I do and the folks who do the kinds of work she does. Kim cites a slew of standards, many from the National Institute of Standards and Technology, that document the processes and infrastructure necessary for compliance. Just being familiar with these tomes would be a significant effort.
I've mentioned before that my tiny one-man corporation has done its share of work over the years in the defense domain, but always as a sub-contractor to a far larger organization that provided all the infrastructure and process required to comply with the customers' requirements. Kim also mentions the U.S. Air Force's Platform One and its concept of a "Software Factory": a kind of ready-made infrastructure surround that temporarily assimilates a start-up and provides it with a much simpler set of requirements. (If you have the kind of LinkedIn network that I have, you have already heard a lot about this.)
Alas, without the kind of support provided by these kinds of organizations, the DoD is not able to innovate in the same way, and at the same speed, as the commercial sector. Nor even easily take operational advantage of new and shiny technology that comes out of successful commercial start-ups. Which means, for the most part, it's another windfall for the handful of existing huge defense prime contractors.