A Different View of Neuralink, Absolute Truth, and Rates of Change

Quite often, recent events seem to trend toward my personal extremes of quietly unnerving and unsettling. One in particular – Neuralink – is something I understand only in a limited way, but also one which brings to mind a chaotic collision between ethics, science, technology, spiritualism, society, individualism, morality, and mortality along the lines of the police car pileup(s) from “The Blues Brothers.”

From their website:

 Our Mission: Create a generalized brain interface to restore autonomy to those with unmet medical needs today and unlock human potential tomorrow.

Our brain-computer interface is fully implantable, cosmetically invisible, and designed to let you control a computer or mobile device anywhere you go.

I get it… I really do – that there is a slew of benefits which would provide access to the world for those with limiting disabilities.

I want so desperately to believe that there is potential for pure benevolence that such a gift of science would bring to the world.

“But you already know what I’m going to tell you.”

[Interesting selection of image quite intentional.]

It is the intangible and the unknown which doesn’t sit well with me, and hasn’t, ever since I learned about this effort years ago. It is a step into the realm of potential and intent which feels so very wrong when I consider how it might be misused – the same way a panacea might be used as a poison, or the way any other tool inevitably can (and often will) be turned into a weapon. It is a feeling that, along with efforts in artificial intelligence (AI), makes me want to go hoarse with one screamed question:

“Have any of these advocates and creators EVER watched one science fiction show/movie or read ANY book from the same genre??”

However…

My stance in a discussion with my wife tonight surprised both of us: that while I think Neuralink is wrong – a possible abomination of technology and a technological Pandora’s Box – I also think Elon Musk’s efforts in this field are headed in the right direction, and I wish him success.

Perhaps similar thoughts were shared by the Manhattan Project’s physicists prior to 16 July 1945 – that the path they were embarking upon was one of both terrifying potential and hopeful conclusion. Similarly, one could speculate that every weapon which became a tool or tool which became a weapon placed a burden of commitment and responsibility upon those who were inspired to create… before someone else came up with the idea and held the advantage over others.

And that is the key: that Neuralink or some other variation of mind-to-machine interface will happen; that such inevitability necessitates primacy – to be the first to fruition. As a natural progression, the first to develop a technology also gets to understand the scope and limitations of said technology… which allows for the identification of vulnerabilities and vectors of exploitation/failure… which, in turn, provides the foundation of observable indicators of others’ efforts/successes/failures/implementation of similar technologies and accompanying countermeasures.

Potential for first use, or potential for first recipient; either way, it is like having a flamethrower duel in a gunpowder factory – just because one has the capability doesn’t mean that it would be a good idea to get too carried away.


This discussion came at the end of a day that started with pre-coffee discussion topics at our house: theoretical physics and the idea of an absolute truth… Euler’s identity… Voltaire… Deism… the factionalization of faith…

…None of which I claim any absolute authority over… but all of them reminders that we truly don’t ever know what we don’t know.

[Cue David Thewlis’ rant as Johnny from “Naked”]:

No matter how many books you read, there is something in this world that you never ever ever ever ever fucking understand.

Look, if you take the whole of time and represent it by one year, we’re only in the first few moments of the first of January. There’s a long way to go. Only now we’re not going to sprout extra limbs and wings and fins because evolution itself is evolving. When it comes, the apocalypse itself will be part of the process of that leap of evolution.

https://youtu.be/N90sl94g7PE?si=RBWpdBOP2ix9J77s

Existence is imperfection. Being a better human… and understanding those limitations and the possibilities beyond… THAT is perfection in progress.


Earlier this week, I was mulling over Peter Turchin’s War and Peace and War: The Rise and Fall of Empires:

Perhaps the issue we might be missing is that not only are major facets of human interaction changing, but the rate of change is condensing into ever more abbreviated timelines.

The demise of the Roman Empire took a while because information traveled slowly. The rise of religions, the spread of epidemics – they all moved as fast as man (and information) could travel.

Later, these speeds increased to the point where something I post here could be read in Kazakhstan just as quickly as it could be in Argentina…

However, does that increased rate of propagation also cause a societal decreased level of direct engagement or capacity to give much of a shit?

It’s too easy to sit on our asses and wait… or get conveniently distracted… or to entrench in confirmation biases… or to just not care.

Events are going to change the world around us faster than we are able to conceive/comprehend. And it will be up to future historians to confirm that which I’ve suspected for a while: rates of change are only noticeable once one is beyond/outside that rate of change.

What would the fall of Rome look like today, with ’round the clock news and social media? I am willing to bet it would take a fraction of the time…


Neuralink.

Absolute Truth.

The acceleration of the rate of change in the history of right now.

There’s somethin’ happenin’ here
But what it is ain’t exactly clear…

…We better stop, now, what’s that sound?
Everybody look what’s going down

“For What It’s Worth” – Buffalo Springfield

The truth behind these words is as applicable today as it was in 1967.

…Shall we see what tomorrow brings?

1 thought on “A Different View of Neuralink, Absolute Truth, and Rates of Change”

  1. Mike –

    Thanks for this. You raise the issue of the fall of Rome (as an example). I have not read Turchin’s book, but I have read Tainter’s 1988 “The Collapse of Complex Societies”. One thing that you did not mention – and I do not know if this is because Turchin did not mention it – was the fall of temperatures from the Roman Warm Period, which led into the Dark Ages, then the rise into the Medieval Warm Period – which culminated in the building of massive cathedrals all over Europe (and, not coincidentally, the rise of the Mongols and the establishment of a more “user friendly” Silk Road). The descent from the Medieval Warm Period into the Little Ice Age saw the rise of stress for food production, even as small nations competed and fought to conquer and amalgamate.

    The division of the world into essentially two major power blocs out of the two world wars sparked a frenzy in the West to conquer the East – probably from an American perspective of Mackinder’s “heartland” thesis. I have come across opinions in the West (not only from Americans, but including Americans) that Russia is on the verge of disintegration, and America intends to conquer that nation and break it up before taking on China, which a number of Americans see as the “true” enemy. This is a curious reading of the situation from where I stand.

    We are witnessing the collapse of Western Civilization, and it is not clear that many people understand this.

    Tainter wrote, among other things, this passage which I think is essential to understand (and which Turchin may have also covered):

    1. Sociopolitical organizations constantly encounter problems that require increased investment merely to preserve the status quo. This investment comes in such forms as increasing size of bureaucracies, increasing specialization of bureaucracies, cumulative organizational solutions, increasing costs of legitimizing activities, and increasing costs of internal control and external defense. All of these must be borne by levying greater costs on the support population, often to no increased advantage. As the number and costliness of organizational investments increases, the proportion of a society’s budget available for investment in future economic growth must decline.

    Thus, while initial investment by a society in growing complexity may be a rational solution to perceived needs, that happy state of affairs cannot last. As the least costly extractive, economic, information-processing, and organizational solutions are progressively exhausted, any further need for increased complexity must be met by more costly responses. As the cost of organizational solutions grows, the point is reached at which continued investment in complexity does not give a proportionate yield, and the marginal return begins to decline. The added benefits per unit of investment start to drop. Ever greater increments of investment yield ever smaller increments of return.

    A society that has reached this point cannot simply rest on its accomplishments, that is, attempt to maintain its marginal return at the status quo, without further deterioration. Complexity is a problem-solving strategy. The problems with which the universe can confront any society are, for practical purposes, infinite in number and endless in variety. As stresses necessarily arise, new organizational and economic solutions must be developed, typically at increasing cost and declining marginal return. The marginal return on investment accordingly deteriorates, at first gradually, then with accelerated force. At this point, a complex society reaches the place where it becomes increasingly vulnerable to collapse.

    Two general factors can make such a society liable to collapse. First, as the marginal returns on investment in complexity declines, a society invests ever more heavily in a strategy that yields proportionately less. Excess productive capacity and accumulated surpluses may be allocated to current operating needs. When major stress surges (major adversities) arise there is little or no reserve with which they may be countered. Stress surges must be dealt with out of the current operating budget. This often proves ineffectual. Where it does not, the society may be economically weakened and made more vulnerable to the next crisis.

    Once a complex society enters a stage of declining marginal returns, collapse becomes a mathematical likelihood, requiring little more than sufficient passage of time to make probable an insurmountable calamity. So if Rome had not been toppled by Germanic tribes, it would have been later by Arabs or Mongols or Turks. A calamity that proves disastrous to an older, established society might have been survivable when the marginal return on investment in complexity was growing. . . .

    Secondly, declining marginal returns make complexity an overall less attractive strategy, so that parts of a society perceive increasing advantage to a policy of separation or disintegration. When the marginal cost of investment in complexity becomes noticeably too high, various segments increase passive or active resistance, or overtly attempt to break away. The insurrections of the Bagaudae in late Roman Gaul are a case in point.

    At some point along the declining portion of a marginal return curve, a society reaches a state where the benefits available for a level of investment are no higher than those available for some lower level. Complexity at such a point is decidedly disadvantageous, and the society is in serious danger of collapse from decomposition or external threat. – Tainter, The Collapse of Complex Societies, p. 195/196.

    He also wrote this:

    Complex societies, it must be emphasized again, are recent to human history. Collapse then is not a fall to some primordial chaos, but a return to the normal human conditions of lower complexity. . . . One ambiguity in this view is the major loss of population that sometimes accompanies collapse. . . . In any event, nothing in the preceding paragraphs implies that human actions always achieve, in the long-term, a desirable outcome.

    From what I see, the United States and its Empire are well into the process of collapse from “decomposition”, but the leadership and military are working overtime to expedite the process by seeking “external threats”.

    Where does AI fit in this? I have no idea. I wish us all luck.

    Jim
