I have written an op-ed on how to overcome the series of systemic failures we have experienced over the past dozen years, and submitted it to a major publication.
In the process I shared the piece with a few people in my network, including an old friend and mentor who was on the founding team of one of the most important technology companies of the past century (ever, actually). In that discussion, I found myself using factory-floor automation during the Industrial Revolution to describe what semantic (AI) technologies can do, and specifically Kyield. My friend suggested that the 'systemic failure' in the recent terrorism incident may be as much a human problem as a data problem, to which I replied:
Of course it's a human problem, just as errors on the factory floor were during the Industrial Revolution, and just as it is now with knowledge workers in the information revolution. That is a big reason why so much of that work was automated and made transparent to others. Similarly now with robotics entering surgery: we don't want incompetent workers hiding their mistakes, covering up, or protecting turf, buddies, legacies, or organizations, when doing so can cost large numbers of lives (or the global economy).
This is not to say that we want to replace large numbers of workers with automation, but rather to put them to work in a less devastating and more productive manner. We really don't need dozens of highly paid terrorism experts deciding which visa to yank when far more complex issues are already fully automated elsewhere in our society. Rather, they should be trained to operate the CKO module in Kyield, for example, so that they can more effectively manage their organizations for continual improvement relative to their mission.
I would add that we also cannot afford, nor should we, to allow large numbers of mistakes, such as a physician's illegible handwriting on prescriptions, to continue to kill people. Similarly, we should not continue, whether through our own apathy or by allowing lobbyists to manipulate the process, to tolerate now-curable diseases such as misinterpretation of data and lack of interoperability, which determine the fate of wars, economies, and large numbers of lives. Moreover, we certainly cannot afford the kind of activism injected into enterprise systems that we observed in the housing bubble and meltdown. We cannot afford the status quo, economically or morally. Fortunately, the status quo is no longer necessary where enterprise networks are concerned, and those networks determine to a large extent how modern organizations, and thus our society, function.