Given the spin surrounding big data, the duopoly deflection campaigns of incumbents, and a culture of entitlement across the enterprise software ecosystem, the following five briefs are offered to provide clarity for improving strategic computing outcomes.
1) Close the Data Competency Gap
Much has been written in recent months about the expanding need for data scientists, which is real at this early stage of automation, yet very little is said in public about the prerequisite learning curve for senior executives, boards, and policy makers.
Data increasingly represents all of the assets of the organization, including intellectual capital, intellectual property, physical property, financials, supply chain, inventory, distribution network, customers, communications, legal, creative, and all relationships between entities. It is therefore imperative to understand how data is structured, created, consumed, analyzed, interpreted, stored, and secured. Data management will substantially impact the organization's ability to achieve and manage its strategic mission.
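To make the idea concrete, here is a minimal sketch, in Python, of how organizational assets and the relationships between entities might be represented as structured data; the entity kinds, attributes, and relation names are hypothetical illustrations, not a prescribed schema.

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    """Any organizational asset: IP, inventory, a customer, a contract."""
    entity_id: str
    kind: str                      # e.g., "customer", "patent", "supplier"
    attributes: dict = field(default_factory=dict)

@dataclass
class Relationship:
    """A typed, directed link between two entities."""
    source: str                    # entity_id of the source entity
    target: str                    # entity_id of the target entity
    relation: str                  # e.g., "supplies", "licenses", "owns"

# Hypothetical example: a supplier relationship within the data model.
acme = Entity("E-001", "supplier", {"region": "NA", "risk_rating": "low"})
widget = Entity("E-002", "product", {"sku": "W-100"})
link = Relationship(acme.entity_id, widget.entity_id, "supplies")
```

How data is then consumed, analyzed, stored, and secured builds on exactly this kind of structure, which is why the learning curve belongs to leadership as well as to data scientists.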
Fortunately, many options exist for rapid advancement in understanding data management, ranging from off-the-shelf published reports to tailored consulting and strategic advisory from individuals, regional firms, and global institutions. A word of caution, however: technology in this area is changing rapidly, and very few analysts have proven able to predict what to expect within 24-48 months.
Understanding Data Competency:
- Data scientists are just as human as computer scientists or any other type of scientist
- A need exists to avoid exchanging software-enabled silos for ontology-enabled silos
- Data structure requires linguistics, analytics requires mathematics, human performance requires psychology, and prediction requires modeling; success requires a mega-disciplinary perspective
2) Adopt Adaptive Enterprise Computing
A networked computing workplace environment that continually adapts to changing conditions based on the specific needs of each entity – MM 6.7.12
While computing has achieved a great deal for the world during the previous half-century, the short-term gain became a long-term challenge: ubiquitous computing was largely a one-time, must-have competitive advantage that every organization had to adopt or be left behind. It turns out that creating and maintaining a competitive advantage through ubiquitous computing within a global network economy is a far greater challenge than initial adoption.
A deep misalignment of interests now exists between customer entities, which need differentiation in the marketplace to survive, and much of the IT industry, which needs to maintain scale by replicating precisely the same hardware and software worldwide.
When competitors all over the world use the same computing tools for communications, operations, transactions, and learning, yet have dramatically different cost bases for everything else, the region or organization with the higher cost basis will indeed be flattened, with economic consequences that can be catastrophic.
This places an especially high burden on companies located in developed countries like the U.S. that compete in hyper-competitive global industries while paying the highest prices for talent, education, and healthcare, which highlights the critical need to achieve a sustainable competitive advantage.
Understanding Adaptive Enterprise Computing:
- Adaptive computing for strategic advantage must encompass the entire enterprise architecture, which requires a holistic perspective
- Adaptive computing is strategic; commoditized computing is not and should instead be viewed as entry-level infrastructure
- The goal should be to optimize intellectual and creative capital while tailoring product differentiation for a durable, sustainable competitive advantage
- Agile computing is largely a software development methodology, while adaptive computing is largely a business strategy that employs technology to manage the entire digital work environment
- The transition to adaptive enterprise computing must be step-by-step to avoid operational disruption, yet bold enough to escape incumbent lock-in (a minimal sketch follows this list)
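As one illustration only, and not a description of any specific product: an adaptive environment might monitor conditions per entity and adjust resources accordingly. The signals, thresholds, and scaling factors below are hypothetical assumptions chosen for brevity.

```python
from dataclasses import dataclass

@dataclass
class EntityNeeds:
    """Hypothetical per-entity signals an adaptive environment might track."""
    entity_id: str
    workload: float       # current demand, in arbitrary units
    latency_ms: float     # observed responsiveness

def adapt(needs: EntityNeeds, capacity: float) -> float:
    """Return adjusted capacity for one entity based on observed conditions.

    A toy policy: scale up when latency degrades, scale down when idle.
    Real adaptive enterprise computing would span the entire architecture,
    not a single resource dial.
    """
    if needs.latency_ms > 200.0:      # hypothetical service threshold
        return capacity * 1.25        # grant more resources
    if needs.workload < 0.2 * capacity:
        return capacity * 0.8         # release unused resources
    return capacity

# Each entity receives a tailored environment rather than one-size-fits-all.
new_capacity = adapt(EntityNeeds("E-001", workload=10.0, latency_ms=250.0), 100.0)
```

The point of the sketch is the continual per-entity loop, in contrast to replicating precisely the same configuration everywhere.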
3) Extend Analytics to the Entire Workforce
Humans represent the largest expense and risk for most organizations, so technologists have had a mandate for decades to automate processes and systems in ways that reduce or replace humans. The economic theory behind this mandate is widely misunderstood, however. The idea is to free up resources for re-investment in more important endeavors, which is where the majority of people have historically found employment, but in practice the theory depends upon long-term, disciplined monetary and fiscal policy that favors investment in new technologies, products, companies, and industries. When global automation is combined with an environment that does not favor re-investment in new areas, as we have seen in recent decades, capital will sit on the sidelines or be employed in speculation that creates destructive bubbles, a combination that produces uncertainty and chronically high unemployment.
However, while strategic computing must consider all areas of cost competitiveness, it is also true that most organizations have become more skilled at cost containment than at human systems and innovation. As we have observed consistently in recent years, the result is that many organizations have failed to prevent serious or even fatal crises, have missed important opportunities, and have failed to remain innovative at competitive levels.
While macroeconomic conditions will hopefully improve over time, the important message for decision makers is that the untapped potential in human performance analytics that can be captured with state-of-the-art systems today is several orders of magnitude greater than what traditional supply chain or marketing analytics alone can deliver (a brief sketch follows the list below).
Understanding Human Performance Systems:
- Improving human performance systems improves everything else
- The highest-potential ROI for organizations today has not changed in a millennium: engaging humans in a more competitive manner than the competition
- The most valuable humans tend to be fiercely protective of their most valuable intellectual capital, which is precisely what organizations need, so system design requires deep knowledge and experience
- Loyalty and morale are low in many organizations due to poor compensation incentives, frequent job changes, and misalignment between individual motivation and employer products, cultures, and business models
- Motivation can be fickle and fluid, varying a great deal between individuals, groups, places, and times
- For those who may have been otherwise engaged: the world went mobile
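As a brief, hypothetical sketch of what extending analytics to the whole workforce might look like in code; the survey dimensions, weights, and threshold are illustrative assumptions, not validated psychology.

```python
def engagement_index(scores: dict) -> float:
    """Composite of hypothetical survey dimensions, each on a 0-1 scale.

    The weights are illustrative only; a real system would derive them
    from validated behavioral models and organizational data.
    """
    weights = {"motivation": 0.4, "alignment": 0.3, "compensation": 0.3}
    return sum(weights[k] * scores.get(k, 0.0) for k in weights)

team = {
    "alice": {"motivation": 0.9, "alignment": 0.7, "compensation": 0.6},
    "bob":   {"motivation": 0.4, "alignment": 0.3, "compensation": 0.5},
}

# Flag individuals whose composite falls below a hypothetical threshold,
# prompting human review rather than any automated action.
at_risk = [name for name, s in team.items() if engagement_index(s) < 0.5]
```

Even a toy like this hints at why motivation's variance across individuals, groups, places, and times makes system design hard.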
4) Employ Predictive Analytics
In our increasingly data-rich world, an organization need not grow much beyond its founders before it requires effective data management designed to achieve a strategic advantage with enterprise computing. Indeed, it has often been the case that success or failure depended upon converting an early agile advantage into a more mature adaptive environment and culture. Among organizations that survive beyond the average life expectancy, many cultures finally change only after a near-death experience triggered by becoming complacent, rigid, or simply entitled to something with which the customer disagreed; reasons enough for almost any company to adopt analytics.
While the need for more accurate predictive abilities is obvious for marketers, it is no less important for risk management, investment, science, medicine, government, and most other areas of society.
Key elements that impact predictive outcomes (tied together in a brief sketch after this list):
- Quality of data, including integrity, scale, timeliness, access, and interoperability
- Quality of algorithms, including design, efficiency, and execution
- Ease of use and interpretation, including visuals, delivery, and devices
- How predictions are managed, including verification, feedback loops, accountability, and the decision chain
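A minimal sketch tying these elements together, assuming a deliberately trivial predictor and synthetic observations; the tolerance value is a hypothetical placeholder for a real accountability policy.

```python
def predict_next(history: list) -> float:
    """Trivial predictor: the average of the last three observations."""
    window = history[-3:]
    return sum(window) / len(window)

observations = [10.0, 12.0, 11.0, 13.0, 12.5, 14.0]   # synthetic data

errors = []
for t in range(3, len(observations)):
    predicted = predict_next(observations[:t])
    actual = observations[t]
    errors.append(abs(predicted - actual))             # verification step

# Feedback loop: rising mean error should trigger review of data quality
# and algorithm design, per the elements listed above.
mean_error = sum(errors) / len(errors)
if mean_error > 1.5:                                   # hypothetical tolerance
    print(f"review model: mean absolute error {mean_error:.2f}")
```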
5) Embrace Independent Standards
Among the most important decisions impacting an organization's future ability to adapt its enterprise computing to fast-changing external forces, which increasingly determine whether the organization succeeds or fails, is whether to embrace independent standards for software development, communications, and data structure.
Key issues to understand about independent standards:
- Organizational sovereignty: it has proven extremely difficult, and often impossible, to maintain control of one's destiny in an economically sustainable manner over the long term when proprietary computing standards dominate the enterprise architecture
- Trade secrets, IP, IC, and differentiation are very difficult to secure when relying on consultants who also represent competitors within large proprietary ecosystems
- Lock-in and high maintenance fees are enabled primarily by proprietary standards and a lack of interoperability (see the sketch following this list)
- Open source is not at all the same as independent standards, nor does it necessarily improve adaptive computing or TCO
- Independent standards bodies are voluntary in most of the world, slow to mature, and influenced by ideology and interests within governments, academia, industry, and IT incumbents
- The commoditization challenge, and the corresponding need for adaptive computing, apply to ubiquitous computing regardless of standards type
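As a small illustration of why standards-based interoperability matters: data serialized to an independent, widely implemented format such as JSON (RFC 8259) can be read by any compliant tool, with no vendor runtime required. The record fields are hypothetical.

```python
import json

# A hypothetical entity record, serialized to JSON, an independent
# standard with implementations in essentially every language.
record = {
    "entity_id": "E-001",
    "kind": "supplier",
    "relations": [{"target": "E-002", "relation": "supplies"}],
}

portable = json.dumps(record, indent=2)

# Any standards-compliant system can round-trip the data without
# proprietary tooling, which is the interoperability point above.
restored = json.loads(portable)
assert restored == record
```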
Most interesting work. Have you carried the concept to causal analysis, via Pearl (2008)?
George: not in terms of applying the math to customer data yet, if that's what you mean. I just pinged and reviewed it; from what I can glean quickly, it looks very similar to our intent way back when testing our second large network in 1997-98, which is where the Kyield concept started to take root. Thanks for the note and interest; appreciated. Mark