Attention Management, the Mirage of Self, and the Meaning of Life

Definitions

What is your self?

Merriam-Webster proposes this:

Self: One: the entire person of an individual; the realization or embodiment of an abstraction. Two: an individual’s typical character or behavior; an individual’s temporary behavior or character; a person in prime condition. Three: the union of elements (as body, emotions, thoughts, and sensations) that constitute the individuality and identity of a person. Four: personal interest or advantage. Five: material that is part of an individual organism.

“Me,” in other words, defined as my behavior, character, condition, body, emotions, thoughts, sensations, and about $10 worth of assorted organic compounds.

Not very satisfying as a definition, but a good enough map to where “I” live.

What is attention?

Merriam-Webster again:

“One: The act or state of applying the mind to something; a condition of readiness for such attention involving especially a selective narrowing or focusing of consciousness and receptivity. Two: Observation, notice; especially: consideration with a view to action.”

William James (“Principles of Psychology,” 1890): “Everyone knows what attention is. It is the taking possession by the mind, in clear and vivid form, of one out of what seem several simultaneously possible objects or trains of thought. Focalization, concentration, of consciousness are of its essence. It implies withdrawal from some things in order to deal effectively with others, and is a condition which has a real opposite in the confused, dazed, scatterbrained state which in French is called distraction, and Zerstreutheit in German.”

Much more ink has been spilled in pursuit of how it works. This probably began much earlier, since warfare would make an understanding of attention strategically vital, but among early writers was Nicolas Malebranche (1638–1715), who distinguished between our perception of the world and the world itself. He went on to say: “It is therefore necessary to look for means to keep our perceptions from being confused and imperfect. And, because, as everyone knows, there is nothing that makes them clearer and more distinct than attentiveness, we must try to find the means to become more attentive than we are.”

And of course, everyone knows what attention is. In order to study it, it had to be named; then everything that seemed to apply was relentlessly measured, diagrammed and mapped. But none of this tells us what attention is. Worse yet, if we pay attention, or become more attentive, who is paying attention or being attentive?

Attention is all that we have access to among the millions of neural patterns swirling within our brains in any moment. It is the surface of brain activity, the part of which we are aware “now.” Whatever occupies attention has been processed from raw sensory input and patterns of “neuroactivity” (Walter Freeman III coined this term), and it has certain definable characteristics. There is the familiar sight, sound and sensation; but there is also another element, an internal, vocal, running commentary. This usually passes, internally, for a human being’s identity, the Self supposedly inhabiting this body. When we speak of “paying attention,” this is the entity assumed to be doing the paying. However, I propose that this amounts to saying that attention is paying itself, because this internal monologue is a feature of attention, not its source.

Attention Management

Attention is the subject of a vast industry today. With the advent of the Internet this has accelerated enormously, but it was probably first formalized and codified by one Edward Bernays, an early pioneer, if not the founder, of the “Public Relations” industry. He may not be its sole progenitor, but his position in its history may itself be an example of his consummate skill in the new discipline. It was this one man who discovered how to sway public tastes so precisely and predictably that women could be induced to take up smoking tobacco and to change their preferences in fashion. And lest anyone suspect this is a statement about women’s intelligence, one need only glance at the entertainment-industry category of “sports,” in which groups of men attempt to injure each other over an oddly shaped pigskin-covered bladder, to understand that humanity is almost universally vulnerable to the technologies Bernays refined.

Attention Management, under other names, has become a highly refined art and science. Individuals and groups practice it every day. Every corporation has at least one full department devoted to this critical aspect of doing business. Infants may be its greatest masters, and courtship is full of examples: one small sound from a sleeping baby will wake any mother within earshot; a lover’s entire expected future rests on the beloved’s smallest gesture. These are among the human characteristics that have been harnessed with such success to sell products we cannot afford (both for their initial cost and for their subsequent disposal and environmental damage), products that will never provide the hoped-for satisfaction, sex appeal, wealth, health, youthfulness, weight loss, or security. Nor is this superficial: it now forms the foundation of the political process, the criminal justice system, public health, and most organized religious institutions.

There is another aspect of Attention Management that is less discussed (if possible), but vitally important: your own. Can a person control attention at will? Not much of it; as a survival adaptation, attention has evolved for finding food and mates, and avoiding danger. Considering the running commentary in our heads, it should be clear that it would never do to put so vital a function entirely at the disposal of our intellectual faculties. It is often impaired by alcohol and drug use, television, and other electronic media. Whether we can direct attention, and who or what else is managing it, are worthwhile questions.

A word about this notion of Evolution is pertinent here. It is not necessary to have any directing entity steering, as it were, the evolutionary process. It is much more accurate to think of evolution as a sifting or combing process in which any death may end a particular line of development. Thus it does not necessarily aim for the best possible adaptation. The populations that happen to survive the culling of random chance may not be the most perfect version of a pachyderm or a primate. The incredible micro-adaptation we see in some species, such as the parasite Toxoplasma gondii, which acts on mouse brains to suppress their fear of cats, thereby creating a vector for the parasite’s multiple-host life cycle, is merely evidence of how long this has been going on, at how vast a scale, and of how interconnected all life is.

Attention is not required much of the time. One can drive machinery, type legal language at 110 words a minute, make love, perform music, or carry out almost any other complex activity, and have no present awareness while doing so. Something attends to the demands of each moment, but what we are calling Attention is not only unnecessary but can impair performance. In some disciplines there is intense training for the purpose of turning off attention, getting it out of the way.

We approach most of life from assumptions. These are not often examined or chosen with care, but are the product of snap judgments, often made in moments of crisis. These background assumptions shape what we can experience, as the brain externalizes them, projecting meaning onto the world. When we encounter works of art, we tend to ascribe meaning to the work based on our own fundamental ideas, meaning the artist may or may not have intended. Literature is similarly interpreted in the context of the reader’s view of the world. We live in purely subjective assessments that we take for objective reality, assessments that have little to do with inherent qualities. The same is true of our relationships to women, men, money, sex, work, politics, social mores and speed-limit signs.

Among the most troublesome of these assumed realities is that there is a Self somewhere inside a person that experiences everything, evaluates options, makes decisions and choices, and generally regulates individual conduct. The essence of this belief is that this Self is the subject of experience, the arbiter of actions, the formulator of opinions, and the one guilty or innocent of crimes. It is the one that awakens in infancy and is filled with learning, for better or worse, the same one that in later life has regrets for losses and rejections, transgressions and injuries. A large proportion of human beings further believe that this is the Soul, an entity that transcends the body after death, and either returns to a new body and another life, or passes on according to Divine Judgment, ascending to Heaven, where all of its true desires are ultimately fulfilled, or descending into Hell for an eternity of unbearable torture.

Science has arrived at a different view.

The assumption of Self-ness arises by implication from a point of view. We see, hear and feel the world around our person, and this view shifts according to the body’s location. There is the impression of time passing, and of changes over time, from which we infer a sense of the Self’s continuity. Others address us, holding us to account for our actions, whether in praise or blame. The very languages we speak are based on the validity of “Myself,” of “Mine,” “I” and “Me.” But there is a difficulty. While we can verify the existence and location of a tree or a rock, and successfully operate motor vehicles most of the time, and keep track of our health and wealth, we cannot locate a Self anywhere in the human body, except by inference. A Self merely seems implicit in such a rich, deep and unique view of the world in each of us.

This assumed Self eludes even brain science. But to Scientists there is no better reason to pursue a question than its very elusiveness. Microchips and electronic manipulations of magnetism and light reveal new views of the inner workings of our brains. For the first time in history, these revelations have begun to answer some of the questions that such explorations previously only multiplied. For example, it is now recognized that brains do not function at all like digital computers.

Some revelations are more troubling.

We do not choose when choosing. Timing a baseball pitch against the time required for a batter’s brain to process its speed and trajectory, and to decide how best to swing at it, reveals that there is not enough time between windup and strike to perform these calculations. The brain must do the math well before the ball leaves the pitcher’s hand, and direct the motor functions that bring bat to ball. Only after the fact does it provide the batter with an experience of choosing whether and how to hit. Although the batter trains long and hard to judge the oncoming ball, and will swear to the validity of the experience, the event is actually back-dated in the batter’s memory.
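To make the timing argument concrete, here is a rough back-of-the-envelope sketch. Every figure in it (pitch speed, release distance, visual and motor latencies) is an illustrative assumption of mine, not a number from this essay or from any particular study; the point is only that the arithmetic leaves almost no window for deliberate choice.

```python
# Back-of-the-envelope timing for the batter's dilemma described above.
# All figures are assumed, illustrative values.

PITCH_SPEED_MPH = 95.0        # assumed fastball speed
RELEASE_TO_PLATE_FT = 55.0    # assumed distance from release point to home plate
VISUAL_PROCESSING_S = 0.10    # assumed time for the visual system to register the ball
SWING_EXECUTION_S = 0.15      # assumed time to physically complete a swing

speed_fps = PITCH_SPEED_MPH * 5280 / 3600        # miles per hour -> feet per second
flight_time_s = RELEASE_TO_PLATE_FT / speed_fps  # time the ball is in the air

decision_window_s = flight_time_s - VISUAL_PROCESSING_S - SWING_EXECUTION_S

print(f"Ball flight time: {flight_time_s:.3f} s")
print(f"Decision window:  {decision_window_s:.3f} s")

# With these assumptions the ball is in the air for roughly 0.39 seconds, and the
# window left over for any "decision" is roughly 0.14 seconds: far too short for
# conscious deliberation, which is the essay's point. The brain commits to the
# swing before the experience of choosing occurs.
```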

Worse yet, we do not see when seeing. This old Zen admonition turns out to be literal truth. Only a fraction of what we “see,” perhaps twenty percent, is new information arriving from the retina; the rest is supplied by feedback from within the brain itself. There are many entertaining ways to discover this, such as the famous “Gorilla Test” from Harvard, which showed that people often fail to see what is right in front of their eyes. “Seeing is believing” takes on new meaning: neurologically speaking, it may also be true that believing is seeing.

The assumption that a Self lives in each of us is not a bad thing, but it has very serious impacts that went unfelt for most of our history, probably because population levels were never so high. Meaning is generated in brain activity; it is not inherent in life. What anything means is but a flavoring of experience, a modified pattern of firing neurons. The Meaning of Life is an entirely human invention.

Evolutionary adaptation has left us with competitive impulses; at the present scale of population much of this is dangerously obsolete for survival as a species. On the whole humanity has overwhelmed most threats to survival through sheer force of numbers despite the fact that a billion of us lack sufficient food or potable water. For generations the only real threats to our survival have come from ourselves.

Science has made us aware of the deadly changes that will come, already unfolding in new weather patterns, if humanity does not modify its behavior. People who can read and write, who have in their hands the ability to steer the national governments and vast corporate enterprises now extracting wealth from underground, are aware from the reports of scientists that there is rapidly accelerating deterioration of the ecosystem. We are poisoning our world. We will join the myriad creatures that have gone out of existence unless this deterioration is stopped, and reversed. This is as much a certainty as anything Science has revealed in human history. People are now perishing from human ecological damage. Yet the behavioral causes of this unfolding disaster have not changed. If anything, we have increased our desperate flight toward species extinction.

What is the driving force behind these collectively suicidal behaviors? Greed has been a popular answer, with the implicit solution of stopping people from being greedy. And we may substitute any human weakness: fear, love of power, arrogance, and so on, even a noble, if misguided, concern for our children. The obvious solutions, being ignored, do not make the slightest difference. Pollution, violent mass killing and enslavement, and industrial-scale Public Relations campaigns designed to distract public attention, continue unabated. This is not new; it is the history of human civilization. But Scale is rapidly making it incompatible with species survival.

Scale is the key factor that has made our self-destructive behavior institutional. It is far beyond the capacity of a great leader or even a social movement to change the direction of our collective behavior, so organized have our institutions become. We might say that institutionalization is the process by which systems become committed to their own survival, and this is certainly the case with modern corporate entities. These are the real robots that have already turned on us, not the machines we read about in science fiction novels or see in films. Humanity has created a monster that is not answerable to us, and already grows fat on the destruction of human life. It is now leading us in the systematic slaughter of our fellow-beings.

What causes this madness cannot be so simple as the venality of a few men. We need a deeper examination of what drives collective human behavior, and that requires delving into the individual brains of human beings to discover, if we can, what makes us tick. Fortunately most of the technical work of this exploration has been done by now, expounded in academic institutions and even explained in entertainment media. The next logical step is to bring that knowledge before the public effectively in the hope of bringing our human institutions to heel. Public Relations, being best equipped for the task, should be won over to this cause; almost never do human beings deal in actual truth. Religion, too, presents possibilities, the job being so global and of such stupefying scope, and Religion has long purported to occupy the department in human affairs that turns stampedes and manipulates panic in the service of common benefit. But however it is done, even by accident, Science now agrees that we have but a generation to get it accomplished, or human life, and possibly all life anywhere, ever, is at its end.

Art, Literature and Science are not simply amplifiers for individuals hoping to garner large audiences, of course. These human endeavors seek, very often with great success, to resonate with the higher impulses and experiences of humanity, calling forth great benefit for our world and our future. Brains are solipsistically isolated, but incomplete without the deep interconnection found in communities. Individual humans rarely do well in life alone. Language is one evidence of this: it forms the networks through which a voice, a sensibility, emerges that is far greater than the sum of its parts; more than that, language is our very medium of existence. If Life Means Anything, it means a species of creatures with opposable thumbs and an extra helping of curiosity has reached for, and grasped, the means of its own destruction, and like a toddler with a loaded pistol, we face a future that is uncertain to say the least.

Attention Wars

The dominant technology of the times is centered on the computer and the management of data it makes possible, but there is one thing it made possible that has now overtaken even the power of the microchip. I say this even knowing, as I do, the vast reach of this device, which now controls most of the financial activity of the world without much human intervention. Indeed, human intervention in financial matters is far too gross and slow for practical purposes, as networks track and move economic forces that are as tectonic, tidal and chaotic as weather patterns or earthquakes.

The proper name for our present Age is the Age of Attention. Specifically, the computer and the Internet have made possible the commodification of human attention, and this has led to a massive shift in our economic foundations. The language of Public Relations, that arcane magick invented by Freud’s nephew Eddie Bernays, introduced the concept of the Consumer, which is now a term of art in Economics. Where it once referred to a stage in a process that involved Raw Materials and Means of Production, there is now the idea of the Consumer as a kind of mythical beast that can be induced, with the right sequence of incantations, to eat whatever you wish to feed it, thereby converting your Product into Profits.

Now, however, we have passed into a new realm, a realm in which the actual exchange of goods and services and tokens of value no longer matters. It is now sufficient to attract the attention of the largest number of people at a given moment. That volume of attention, colloquially known as “eyeballs” in some circles, is itself a commodity of greater value than industrial metals or pork-belly futures. A quantity of eyeballs, pointed in the right direction at the right time, can make or break fortunes, start or end wars, or topple an empire.

In attempting to understand our times, much less ourselves, we turn to every discipline and read every book on every subject that promises a glimpse into how our world has gone so horribly awry. At least, those of us not already busy concocting new ways to exploit human failings for astronomical profits are doing so.

And horribly awry it most certainly has gone: as I write there are six or seven major shooting wars in progress, many of them running for more than a decade; a growing number of countries are “failed states”; there are famines and genocides on an unprecedented scale; there are even pirates on the high seas. The largest military organization ever created is still growing, with proliferating industries supporting it, now exerting pressure to sustain still more wars in the interest of the most prosperous weapons industry in history. And let us not forget the epidemics: AIDS, cholera, and several antibiotic-resistant strains of extremely deadly germs abound. Breeding grounds for these pestilences include not only prisons in world powers gone bankrupt, but actual laboratories where pathogens are cross-bred and genetically engineered for invincibility as weapons of mass destruction. Controls at such locations are uneven, to say the least.

What has all this to do with the Age of Attention? Simply this: attention is now, like stone in the Stone Age and bronze in the Bronze Age, the most abundant and reliable source of power and profit.

And it is important to understand that the quality of attention does not matter. When the image hits the screen, the freight has already been paid. What the Consumer does with whatever information has flashed across the brain does not matter in the least. It doesn’t even make a difference whether the Consumer is aware of the content of the message or not. What matters is the number of pairs of eyes that have been held in focus by whatever medium is in view. How can this be? Because that number can be quantified, and so it can be traded.

The less the Consumer knows about what is done with its attention, the more power is to be derived from it. If a large enough group of people can be focused on one thing, a great many enterprises are waiting to make use of this strange energy, and they will pay by the nanosecond.

In the ’90s, at an event I was privileged to attend, the great inventor and visionary R. Buckminster Fuller suggested that Democracy could be computerized simply by using weather satellites to read the temperature of a population at the moment when a referendum or a candidate was presented to them through mass media. We would not need to vote; this method would be far more precise than the ballot box, and the results would reflect the true Will of the People. He could not have known how close he had come to describing the world we now inhabit. But the technology has found a darker use: whatever the mechanism, Business has found ways to read our temperature, and it applies them very differently. Humanity is treated like a tremendous stimulus-response machine, prodded and poked in various directions for profit.

A fly in the ointment.

Ages of the kind named by archaeologists are not successive; they are concurrent. The Stone Age is not over; it was only relocated. The Industrial Age too has been uprooted, not supplanted. With the advent of the Information Age, which I place at the world-wide implementation of money, the original storage-and-retrieval system for value, sometime in the last few dozen centuries, it became possible to manage affairs at a great distance. Where an enterprise used to consist of smaller units, such as a band of foragers or a nest of thieves, with specie one could manage anything from a feudal city-state to an entire overseas continent. In a similar way, on a much grander scale, in the Attention Age economic activity can be shaped and molded with impenetrable complexity.

This new human creation has far outstripped the power of mere governments, which are now maintained as a lower-level crowd control tool more or less analogous to herding. The ways this has manifested are numerous and varied, and bear almost no resemblance to one another. Endless permutations of form conceal, even from the operators, the machinery of an economy that is essentially self-parasitic. Human societies have become predatory, and the prey is humanity itself. We are become Ouroboros, the snake that eats its own tail. To say that our own invention has enslaved us is vastly understating the case.

Recently there were demonstrations around the world loosely based on the fact that one percent of humanity controls over half the real wealth on Earth. “We are the 99%!” they chanted, from the financial districts of the world’s major cities. One might think the “one percent” aware of this dichotomy and fighting to maintain their hold on the gears and levers of human endeavor. But it isn’t so; they are as much slaves as the “99%.” They know of no other way to live than the way they are living, even while it is an obviously self-defeating policy of killing the fabled golden goose.

We are stuck, we human beings, in a descending spiral that we are at a loss to control or stop. We have yet to recognize the great mirror that has fascinated our gaze and captured our brains, and if we do not wake up to our situation, we will continue like the proverbial monkey reaching for the moon, not realizing it is only the moon’s reflection on water, until we starve or murder ourselves into a final oblivion.

©Copyright 2023 Peter Barus