A recent article in CACM raised a topic I have been thinking about, off and on, for quite a few years: the relationship between network architecture, security, functionality, and sustainable economics in the neural network economy.
The article in question, "Bufferbloat: What's Wrong with the Internet?", is in discussion format with Vint Cerf (an old acquaintance), Van Jacobson, Nick Weaver, and Jim Gettys. Thanks to Franz Dill, who alerted me to the article on his blog.
For this discussion, let's define the current neural network as encompassing any individual, entity, device, application, or sensor that is either part of, or connected to, the Internet, the World Wide Web, or an extension thereof, such as telecom, cable, or wireless networks. By looking at this problem systemically, in the context of what a neural network requires in order to perform as a sustainable ecosystem, we can perhaps gain enough distance for a perspective that may help address this and other specific problems in the neural network economy.
I am pleased to share a new paper that may be of interest:
Optimizing Knowledge Yield in the Digital Workplace
A new system design for thriving in the data-intensive universe

From the abstract:

The purpose of this paper is threefold. First, it briefly describes how the digital workplace evolved in an incremental manner. Second, it discusses related structural, technical, and economic challenges for individuals and organizations in the digital workplace. Lastly, it summarizes how Kyield's novel approach can serve to provide exponential performance improvement.
I am working on multiple articles relating to the patent I was issued last week, at least one of which should be posted here in the next few days. In the interim, I thought some might be interested in the plain-English portion of the patent. I hadn't revisited this section in some time; it was written in early 2006.
I have argued consistently since the mid-1990s that the global medium (the combined Internet and Web) increasingly reflects the global economy, and that rational, functional regulation is essential. I began this journey with an ideology very similar to Alan Greenspan's before the financial crisis: that self-regulation should be sufficient to prevent systemic crises. In practice, it has failed to do so.
Most actual regulation in computer networking today is accomplished by manipulating architecture in one form or another, while technical standards on the Web remain voluntary, as the Tech Review article "The Web Is Reborn" highlights (apparently in response to Wired's earlier article "The Web Is Dead"). In the U.S., aside from proprietary architecture and voluntary standards, we rely primarily on one other form of regulation on the Web: social regulation.

Social regulation has evolved with the consumer Web, occasionally demonstrating some power, as was recently seen with the Facebook security issues, but it has also proved self-destructive at times, particularly regarding sustainable economics and jobs. Few if any consumers can see how their actions on the Web are affecting their own regional economy or industry, so the blind are often leading the blind toward dangerous hazards, in much the same fashion as the housing crisis. Ignorance is being exploited….
Now that I have responded to a related USPTO restriction/election requirement, I feel a bit more freedom in sharing additional thoughts on the underlying theory that has served as the foundation of Kyield:
Yield Management of Knowledge: The process by which individuals and organizations manage the quality and quantity of information consumption, storage, and retrieval in order to optimize knowledge yield. –(me)
(Please see this related post prior to attempting to categorize)
Background: The cyber lab experiment
The theory emerged gradually over several years of hyper-intensive information management in my small lab on our property in northern Arizona (we are currently based in Santa Fe after a year in the Bay Area). The experimental network, called GWIN (Global Web Interactive Network), was designed after several years of high-intensity work in our incubator, which in turn drew on a decade of consulting. The GWIN product was entirely unique, intentionally designed to test the bleeding edge of what was then possible in the computer and social sciences. We were particularly interested in filtering various forms of digitized intelligence worldwide as quality sources came online, converting it to useful knowledge, weaving it through academic disciplines, and then mixing it with professional networking.
The network was open to anyone, but it soon became something of an online version of the World Economic Forum (photo), with quite a few of the same institutions and people, although our humble network, even in nascent form, was broader, deeper, and larger, with less elitism, and therefore more effective in some ways.