Expect Little!

My mission to procure cigarettes and beer from the liquor store nearly ended in disaster one night last summer when a woman in a swell car almost ran me down.  She had responsibly checked for oncoming traffic — in the wrong direction — before executing her turn, and as she passed me our eyes met through the passenger window.  She looked at me as if to say, “oh! how long have you been there?” and I stood in the street, toes mere inches from her rolling tires, grinning back at her in frank amusement.

I should have been outraged.  I should have spit upon the hood of her car.  The thing is, I just didn’t feel any anger toward her at all.  I found it funny that she might have killed me outright and, altogether oblivious of her manslaughter, simply gone on to shop at Target.

Was it remarkable that a person should make such a glaring error among the throngs of humans negotiating the myriad avenues and boulevards of Los Angeles County, thought I?  Oh, hardly.  In fact, only an idiot wouldn’t expect it.

Then suddenly, as I went on my way with a wide smile warming my face, I shrugged, and an epiphany descended upon me as if from heaven.

“Expect little,” I said aloud.

And I’ve been saying it every day since.

The obnoxious behavior of others is normal for human beings. Expect little.

Expect little is a prayer.  It soothes and calms.  It educates.  It’s an unlikely mantra which inculcates a sort of passive humility.

It may be a nice gesture to presume that everyone is endowed with friendliness, elementary skills and common sense, but it’s an unlikely supposition which can only lead to discontent.  One ought rather to expect little of others.  Hate becomes very difficult when people act in accordance with your already-low expectations of them.

It behooves us all to acquaint ourselves with the idea that humanity may not be cut out for greatness, not even in our own hackneyed estimation.

Expect little, friends, because the rude, stupid, and unkempt will always outnumber the well-mannered, intelligent, and hygienic.  This is because exceptional characteristics are by definition above average — which is to say that they are the exception, rather than the rule.  Expecting little from people allows you to be content with the way people actually are, and pleasantly surprised by above-average behavior, which is as it should be.

To expect excellence from people, on the other hand, is silly.  People have never been cool en masse, but mass media has programmed us to expect everyone to be beautiful, polite, and at least somewhat intelligent.  This is (ha, ha!) not the case.

Expecting excellence from people is not even respectful to them.  In fact, it’s condescending.  You aren’t so cool, yourself, you know, particularly from the perspectives of people who don’t live up to your high standards.  We — you and I — are not cool enough to expect good things from others.  We don’t even know what cool is, in the universal sense.

Let people be stupid.  Let them be themselves, for God’s sake (big G).  Let them be stupid today, because you’re probably going to do something stupid tomorrow.

Think you’re especially brilliant?  Wrong.  Each of us is just as gloriously idiotic as the next.  Embrace humanity.

Expect little, because you can quickly become depressed by the number of people who fail to meet your expectations.  That’s not any good.  Discontent with others leads to treating people as though you do not like them around — which tends to convince people that you do not like them around.  Pretty soon, you find yourself without anybody around, and where do you suppose everyone has gone?  Why, into the next room, of course, where everyone is frowning in your direction and calling you an elitist asshole.

Of you, they would do better to expect little.

We don’t only have irrationally high expectations of people, though.  Occasionally, we even find ourselves angry with luck itself, as if it were slacking or something, remiss in its duties, not paying close enough attention to us and producing the wrong kind of random event.  This is perhaps our most common madness.  Why should we expect good fortune from random chance?  Random chance is the one thing from which we shouldn’t expect anything at all!

The world’s smartest computer can’t make accurate predictions of what random chance will produce.  Why bother lamenting an unfortunate mishap as if shocked that it might inconvenience you?  Mishaps happen.  In fact, mishaps happen so regularly — and with such colorful variety — that we ought long ago to have stopped guessing what should or should not transpire within the course of a day.  However, the rusty computers between our ears are always half-dedicated to overestimating their ill-collected data and faulty projections.

You see, then, we even expect too much of ourselves.  We’re only human, friends.  Chase your dreams in earnest, quest valiantly for glory, and by-all-means be the change you wish to see in the world, as the neo-hippies say — but…

Expect little.

Luck of the draw got you down?  Dice come up snake-eyes again?  Take my word for it — expect little.

Expect little!  Expect your neighbor to make too much noise.  Expect your boss to give you too much work.  Expect helicopter parenting, drunk driving, and repeat offending, often by the same culprits.  Expect your favorite band to use too much cowbell.

Expect people from poorly educated states in poorly educated countries to act poorly educated.  Expect people crammed into tight quarters with millions of others to develop hurtful prejudices.  Expect full-grown adults to parrot what they see in movies, in magazines, and in mainstream music, and expect their teenagers (raised likewise by televisions and gangsta rap) to be perfectly disrespectful.

Expect politicians to lie, and cheat, and steal, not to mention fornicate with people you’d rather they wouldn’t.  Expect people with guns (soldiers, cops, and criminals) to shoot people.  Expect druggies to do drugs and go about in public on drugs, and to act just as though they might be high on drugs.  Say to them when you see them shrinking from the demons down aisle nine at Rite-Aid, “Hello, druggie.  How do you do?”

Expect preachers to sin, marriages to fail, and sons and daughters to leave the family religion.  Expect athletes to take steroids, psychiatrists to prescribe poison, and models to mutilate themselves surgically.  Expect wonder.  Expect marvel.  Expect to be astonished at the spectacle in which every one of us plays a humble part.

In other words, expect people to act just as though they were human — but for your own sake as well as that of others, the next time your friend complains that a significant other has forgotten an anniversary, or that some ruthless businessman has destroyed the local economy, or that a hapless driver has run over his or her favorite author (ahem), just shrug your shoulders and smile sympathetically, offer a beer and say to your friend,

“Expect little.”

With a great big smile and my fingers crossed, I remain,

Yours Truly,

-BothEyesShut


Immortality, the Gift That Just Won’t Quit

The definition of death doesn’t hold much water, really, once all the voodoo juju is shaken out of it.  The harebrained doctors have one make-believe definition of it, the self-important scientists have another, and the whimsical believers have yet a third.  When one has faith in the existence of death, though, death can be a gateway, a rebirth, or even a redemption.  Anticipating death makes up the cornerstone of most world religions, while avoiding it remains the focus of most sciences.

— And that’s O.K.  There’s nothing wrong with any of those philosophies in and of themselves, but let’s eschew all that for the sake of conversation.  Let’s look at death without any allusion to typical, traditional beliefs.  What does death resemble, now?  A permanent medical condition?

Never mind.  Let’s just say that death is a simple state of affairs that any doctor can walk up and diagnose, like this:

“Hey, this guy’s dead.”

Why, this guy's dead!

The doctor means that the poor guy’s lungs have stopped breathing and his heart has stopped beating.  That’s clinical death.

Most realists think of death as nothingness, bleak, black, and empty, which is typical of them, because if there’s any way to have less fun and be more boring, the realists will practically kill themselves to show you how.  Even so, most atheists and agnostics think this way about death, too, which is disappointing because, as anyone can tell you, they throw the best parties, and therefore oughta know better.

“What happens when you die?” you may ask one of them.

“Nothing,” they say.  “That’s kind-of the point.”

OK, Mr. Sunshine, but nothing is precisely what never happens.  There’s always something going on.  Besides, lots of things happen when you die.  Clinical death actually mirrors the very early stages of clinical birth, so to speak, which normal people call pregnancy.

In the earliest stages of pregnancy, the fertilized egg (or zygote if we really must) has forty-six chromosomes, as well as its own unique DNA structure.  Anti-abortion terrorists are keen to remind us that this little eggy wegg is alive, and they’re not wrong.  In fact, scientists pretty much have to agree with them, because the zygote exhibits growth, metabolism, reproduction, and reaction to stimuli.

Apparently, the smartypants bigshot scientists have decided that a thing is alive if it’s got those four attributes.

What the zygote does not have, though, is a lung or a heart with which to satisfy the medical doctor’s requirements.  Its respiration has not yet commenced.  Its pulse is nonexistent.

“Why, this guy’s dead.”

“Now, you just hang on a second there, Doc.  We’re picking up growth, reaction, metabolism and reproduction.  This sonofabitch is alive.”

Great.  So the zygote is dead and alive.  Perfect.

Perfectly nonsensical.

Zombie Zygotes of the Living Dead

Why not, though?  When a guy looks at his arm, he thinks of it as a living part of him, right?  If doctors amputate it from him, then no one looks at it quite the same way.  It’s dead now.  The amputation was, as far as his body was concerned, a little death (or, la petite mort in French, which incidentally means orgasm).

Yeah, why not?  After all, when a pregnant woman feels her baby kick, she thinks of it as a living part of her.  If doctors deliver it, and amputate it from her, then no one looks at it quite the same way.  The baby’s alive now — even though the amputation was, as far as the mother’s body is concerned, a little death (or en français, orgasm by baby).

Dead and alive, alive and dead.

The dead aren’t really all that dead, anyhow.  We eat dead things to stay alive, in fact — but only dead things which have recently become dead.  Dead things become more dead over time, and we can’t eat things which have been dead too long.

There’s not enough life in them, you see.

But just wait a damned second.  A little death?  More dead?  Death isn’t supposed to have all these degrees, all these shades of gray.

Silly-headed cynics and so-called realists step in at this point and remind us, “No, jerk.  Death isn’t in degrees or shades, and it’s definitely not gray.  Death is that certain change that happens in the instant that life stops for an organism.  Those four things you mentioned earlier?  Growth, reaction, et cetera?  The body can’t do those things anymore, so it’s dead.”

Yeah, alright, sure, Professor Killjoy, but from the broadest perspective, death doesn’t mark any significant change at all.  It’s just another change in an infinite pattern of changes — or, if you like, it’s another death in an infinite pattern of deaths.  Life, in fact, is what we call this infinite pattern of deaths.  Look:

Human life begins with an ovum and a sperm combining into a zygote.  This means the death of the ovum and the sperm, because they no longer exist as such; their chromosomes have been shared.  The zygote then begins cellular division at an extremely rapid rate, each division a little amputation (orgasm) from the parent cell, and these amputations are what we call growth.  When enough cellular carnage has occurred, the child is amputated from his or her mother, and soon afterward begins to eat dead things because of the life in them.

Dead things taste good.

Food is dead-ish

As the child grows, cells are born, grow old, die; are sloughed off, are excreted, are absorbed as more fresh dead stuff to nourish and prolong life.  Cells divide, and divide, and divide.  The lining of the small intestine is completely replaced over four-to-six days, you know.  The outermost layer of skin, or epidermis, every two weeks.  The hard structure of the human skeleton, every decade.  Even this child’s blood, just like the blood of every living person, is composed of red blood cells which live in the bloodstream for about four months before being replaced.

An elderly man of ninety years, therefore, has lived inside nine skeletons.  He has consisted of two hundred and seventy human bodies’ worth of blood.
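For anyone who wants to check the old man’s arithmetic, a few lines of Python will do it, assuming the replacement rates quoted above (one skeleton per decade, one blood supply every four months):

```python
# Checking the cell-replacement arithmetic above.  The replacement
# rates are the ones quoted in the essay, not medical gospel.
years = 90
skeleton_lifespan_years = 10    # a new skeleton roughly every decade
blood_lifespan_months = 4       # red blood cells last about four months

skeletons = years // skeleton_lifespan_years
blood_supplies = (years * 12) // blood_lifespan_months

print(skeletons)        # 9 skeletons
print(blood_supplies)   # 270 bodies' worth of blood
```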

It’s all dead, though, remember?  We’re, like, hermit crabs or something.

Like our bodies, our minds unfold as a train of deaths and divisions, too.  Ideas grow and gestate, eating new information and transforming cold facts into newborn ideas, ideas which split and branch and grow of their own accord, just like a pride of lions flourishing from the carcasses of a few dead gazelles.  Sometimes ideas sprout from stagnant knowledge so automatically that our minds consider themselves inspired, but every new thought kills off an obsolete idea.

We grow and learn, shedding skin cells and obsolete ideas along the way like scraps of confetti following a parade, and when at the age of ninety we reflect on our adolescent selves, those teenagers seem long gone, long passed away, and the wistful feelings our memories evoke mimic those felt by mourners years after the funeral.

Death and life, life and death.

The thirty-year-old hermit crab and his previous shells

We still have no round definition of death, however.

Death seems no more than change and transition, and since change is an eternal constant, death must be occurring all the time.  If that’s so, then death as a single event does not exist.

If you think you’re going anywhere when you “die,” I’m afraid you’re horribly mistaken, as far as I can tell.  Nobody is going anywhere.  Nobody is going anywhere, and neither are the actions we have set in motion.  That the “dead” human mind no longer orchestrates these actions is inconsequential, since the mind was never orchestrating anything from the broadest perspective, anyhow, regardless of how intimately involved in the processes of the universe it seemed.

This will sound like glorious immortality to some and eternal damnation to others, so I guess that if you really wanted to you could call your opinion on living forever ‘heaven,’ or ‘hell,’ but don’t do that.  That’d be so tacky.

If all this sounds fantastic, consider that everything we are or will become was already here long before we were born.

All the material needed to put our bodies together had long been available before our births.  Our mothers merely needed to ingest some dead stuff and assemble it inside themselves.  The material to put our minds together had been here, too.  The elementary ideas, the deeper concepts, and the inner mysteries all, all, all had been waiting for our minds to ingest them and put them to use.  We were already here, waiting for assembly, just like The Great Gatsby had been when the Old Sport was alive inside Fitzgerald’s head, but not yet written down.

Sure, Dad can stick some spare auto parts together and build a car, but Mom can throw some spare body parts together and grow a person!

Cynics and skeptics will say, “An idea is not a thing, Sir,” and I must retort: well, where, exactly would you like to draw the line?  If Gatsby exists once he has been written down, what happens if the manuscript is destroyed?  — And if Fitzgerald writes him down again, is he birthing the same Gatsby?   What of publishing and printing?  Are all Gatsbys the same man, or different men?

Consider also the differences between brothers of the same family, raised in the same general time, by the same parents, on the same food, in the same area, with the same values, et cetera, et cetera.  One may grow up into a madman and the other a schoolteacher, but from the broadest perspective the difference can only be in human estimation, just like so-called death.  If we are arbitrarily, subjectively deciding what death is, then there really isn’t any such thing we can point to after all, is there?

In order to believe in death, one must think just like the doctors and scientists, coming up with their own willy-nilly criteria by which something can officially be called “dead.”  You may as well say that death is what we call the future, and birth what we call the past.

The Starship Enterprise notwithstanding, we will always be here, extant, just as we have always been here, and the proof and cause of both is that we can’t help but be here now.  There can be no escape.  We are captives of existence.  And why?

— Because the present time, nestled snugly between the past and future, between birth and death, seems very much alive, and it happens also to look very much eternal.

With much pleasure and measured amounts of pain I remain,

Yours Truly,

-BothEyesShut


Oh, Yeah? Prove it!

Every experiment has significance, even the inconclusive ones.  When a team of smartguys at M.I.T. completes a study with inconclusive results, it reaches the ineluctable conclusion that another study is needed and immediately sets to work on it.  This testing can, will, and does continue until significant findings have been produced — er, that is — discovered.

Once significant results appear, the doctors conducting the study become proponents of it and publish these discoveries in remarkably well-respected journals.  These paperback journals are written in tedious, turgid English that is too obscure for the public to read, and have an average cover price of thirty American dollars, ensuring that the general populace gets no chance to join the conversation until it is Mickey Moused by Time Magazine and sold as an impulse buy at the grocery counter.

Hey, whatever.  At least mom’s getting in some string theory.

Journals cost upwards of thirty bucks, but at least they're jam-packed with ten-dollar words

As in all things in this universe, the idea proposed in this new study begets its equal and opposite, a second study which exists to provide an alternate scientific belief for anyone and anything negatively implicated in the first.

The satisfying thing about science is that it loves conflict.

Scientific prejudices appear out of this conflict, and because they are prejudices of science itself, the public presumes them factual.  From the broadest perspective, however, science walks in the well-trod footpaths of religion and theosophy.

When science decides that a certain quantum particle does not exist based on its failure to appear in tests, science is as faith-based as the creation myth of Genesis.  Science and religion have traditionally been rancorous archenemies, but this is a misunderstanding which, if one could get them talking again, could easily fertilize the most affectionate of friendships.

This animosity has been based on little more than a clerical error, anyhow.  Note how science and religion interplay in the following.

Once upon a time, in a faraway land called Berkeley, there lived a doctor of physics.  This doctor believed in a certain particle he called the God Particle, and hypothesized that it existed everywhere and had an effect on everything else.  So the doctor wrote a paper and was granted funding to perform experiments in a very special place with very special equipment, and after three months of rigorous, painstaking trials, the poor doctor was forced to concede that no evidence of his God Particle had surfaced in any tests at all.

To the scientific community, this absence of evidence presents hard, objective proof that Doc’s God Particle does not exist.  Even if they add the word “theoretically” to the conclusion (as they do with the theory of gravity, which they still can’t fucking figure out) they still use the test as a quotable citation in papers arguing that the particle is a fantasy of the doctor’s.

To be perfectly clear: in popular science, the absence of evidence can prove that a thing does not exist.

How’s that for self-satisfied conceit?  They can’t even plumb the depths of our ocean trenches, but they’ve got E.S.P., telekinesis, astral projection, sixth senses, prescient dreams, and automatic writing all figured out.  How?  No evidence, that’s how.

Oh.  Well, shit.

Scientific evidence shows that there is no scientific evidence that scientific evidence is scientifically evident

Now, let’s say that following the most costly failure of his professional career, Doc is forced to return to teaching at a preparatory high school for rich kids, which amazingly enough also happens to inculcate Catholicism.  In this private school, Doc is lecturing about the existence of God during a religious studies class, when suddenly a particularly cynical and sarcastic student raises her hand and demands to know how it is that anyone can feel sure that God (big G) exists at all.

Well, this is the question for which the course entire exists, and so the doctor puffs up with dignity and conviction, and with great certainty informs his students that in all the centuries and centuries of assiduous scientific research, and of all the brilliant, most well-respected minds throughout history, not a single person has been able to prove that God does not exist.

To elucidate: in matters of religion, the absence of evidence to the contrary can prove that a thing does exist.

— And though science and religion may fixate on the same piece of evidence (that nothing has appeared in tests, in this case) they both exit these experiments feeling assured that their hypotheses have been logically supported, because objective reason has its roots in language, and language happens to have more than enough elasticity to correctly describe a single concept with two definitions, each the perfect opposite of the other.

As violent and arbitrary as this arrangement may seem, the truth is: the common person likes it fine.  In fact, practically everyone hates unchallenged assertions, even the people making the assertions, themselves.  Something about our nature causes us to see polar opposites in everything, and something about our minds causes us to invent contrary concepts for every conceivable idea.

Humanity likes nothing until it is contested, enjoys nothing better than a contest

It is this facet of the human personality which affords us such colorful figures as the venerable Flat Earth Society, which still maintains that the globe is flat; the irreproachable Tychonian Society, which avers that the sun orbits the earth; and one mad Dutchman at the University of Amsterdam, Erik Verlinde, who asseverates that gravity is, in fact, fictitious.

If the ever-patient and magnanimous reader finds the Flat Earth Society amusing, then the reader is hereby urged to consider that many contemporary physicists believe Dr. Verlinde’s theory to have intriguing implications, and that gravity may be merely the effect of a universe maximizing its entropy, or disorder.  The concept of gravity as a universal power will probably not exist for our children.

Q: If gravity, of all things, really is a red herring, then how incredible and fantastic are groups like the Flat Earthers and Tychonians, really?

A: Every bit as credible as a science journal, just as veracious as a leading theoretician, and equally as trustworthy as the supposed date and time of the reader’s birth.

Lo, and behold the clerical error of which I spake: if science and religion could leave the protection of their podiums for a second, they might each glean a mutual respect for the irascible plight of the other, which is that they are both sadly, obviously, and pathetically full of shit.  Not one or the other.  Both.

Yes indeed, we like the results of our experiments best when they are disputed.  Should science publish a study which shows conclusive evidence on any topic at all, another scientist immediately sets out to prove the opposite.  The people of the world want every perspective sullied and watered-down, pushed and contested until a ninety-nine percent probability has its back against the fifty-fifty wall, precisely where we want it.

We want it balanced just so, because we like to choose sides as if they were baseball teams.

— And once we arbitrarily pick a team, we commence to argue, and bitch, and dispute for it as though our evidence were, after all, indisputable.

Even incontrovertible evidence meets with reasonable opposition

Evidence is stupid, anyhow.  It’s usually statistical, which as anyone can tell you is the most insidious form of prevarication.  For some reason, intelligent people appeal to the authority of statistics all the time and require the same of others, which is doubly asinine: these egghead hotshots know full well that appealing to any authority is a cardinal logical fallacy, and exponentially more so when the authority in question is an invariably inaccurate numeric representation of an actual, physical chain of events, collected from a sample base which even under the most fastidious methods has no chance whatever of accurately representing some other, similar yet different thing at an entirely different point in time.

As the British statesman Benjamin Disraeli reputedly said, “There are lies, damned lies, and statistics.”

Most experiments require a test group and a control group, too, but like gravity and statistics, there’s no such thing as a dependable control group, either. The very act of including it in a study changes its natural state.

An excellent example of this occurs in quantum mechanics, in which certain particles exist only in patterns of probability — that is to say, they are probably there, or probably not-there, never certainly so — and these patterns of probability change according to whether anyone happens to be measuring them at all.

If one supposes that fifty scientists conduct the same study, their findings will generally have an acceptable margin of error, each doctor achieving his or her own individual result.  The only difference between this margin and a larger one is that we declare the former admissible and the latter inadmissible. Experiments cannot gauge truth in objective reality any more than a preacher can divulge so-called Ultimate Truth (big U, big T) from a holy text.
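The fifty-scientists point lends itself to a toy simulation, with every number invented for illustration: fifty identical studies of the same true value, each with a little noise, each returning its own individual result.

```python
# Toy illustration of the paragraph above: fifty identical studies,
# identical method, fifty different answers.  All numbers are invented.
import random

random.seed(42)                     # reproducible "studies"
true_value = 100.0                  # the quantity every study measures
results = [random.gauss(true_value, 2.0) for _ in range(50)]

# Every doctor achieves his or her own individual result; the spread
# is what we arbitrarily declare an "acceptable margin of error."
print(min(results), max(results))
```

Tighten or loosen the noise and the same fifty studies look either perfectly consistent or hopelessly contradictory; the data never changes, only the box we draw around it.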

Humanity finds evidence-for, and evidence-against, and ultimately judges its (supposedly) objective reality with the subjective whimsy of an adolescent girl deciding between prom dresses.

This, ladies and gentlemen, is what the world calls evaluation by evidence.

Weighing all evidence with the most discerning of eyes, the prom date is an apotheosis of adjudication

So all evidence is meaningless, then? All results, experiments, and hypotheses, nothing but evaporated time and energy?

Not at all. Just because there’s no such thing as True (big T) objectivity doesn’t mean one can’t create it for oneself or support it for others. We arrive at many, many decisions on a regular basis which matter to hundreds, perhaps thousands of people, and we put our faith in evidences in order to do so.  Truth is easy to arrive at in a box.

One has merely to define the box.

Contrary to an extremely annoying popular belief, though, there is no such thing as thinking outside the box, because from the broadest perspective nothing makes any sense.  Logic only happens within defined parameters.  One can exit one set of rules and enter another, more comprehensive set, but there’s always another box containing all the smaller sets to prove that they are infinitely short-sighted and presumptuous.

The important thing is to remember that we’re basing it all on faith.  Nobody knows what’s really going on.  The passionate stupidity of thousands of sheep in innumerable American religious flocks has allowed science license for abject arrogance.  The truth is, though, any honest scientist will tell you that science has no positive idea about the meaning of life, the universe, and everything.

That’s the slippery thing about Ultimate Truth (big U, big T).  It’s only true if it does not conflict with the properties of the universe — and the universe is in constant flux.  In fact, the only known absolute constant is the transitory nature of everything.  This means that even should an Ultimate Truth surface, it could only be ultimately true for an instant before becoming outmoded to newer, emergent properties of existence.

Mr. Jesus may very well have been the way, truth, and life once (or maybe is due up in a few more centuries) but neither he nor anybody nor anything else can be a static ultimate truth in an anti-static reality.  A more likely solution is that universal truth changes for each individual thinker, so that one’s universal truth may indeed be found in Biblical scripture at a certain age — and this is boxed-up objective truth, no less true than death or taxes — but neither before nor afterward.

“When I was a child, I spake as a child, I understood as a child, I thought as a child: but when I became a man, I put away childish things” (I Cor. 13:11).

Yeah, that’s right.  I can quote scripture.  It isn’t blasphemy when it’s true.

So perhaps we all have some real thinking to do, eh?  Perhaps it’s time to grow up.

Where does one stow an outgrown worldview?  Under the bed, next to the Tinker Toys and Legos, obviously.  Right where it belongs.

With glasnost and much cheek I remain,

Yours Truly,

-BothEyes

P.S. — Nowhere in this piece will the magnanimous reader find the word, “ontology.”


The Saintly Altar of the Altered State

I.

The human brain, contrary to what mom told us, is not a miraculously engineered wonder of the Western world.  It’s miswired, misaligned, and mistaken much of the time.  Many charlatans — or psychologists if one prefers — believe that the brain’s first experience, birth, permanently damages it.  Birth is violently traumatic, and both emotionally and physically brutal.  In response to high levels of stress such as this, our brains shoot us up with adrenaline, hydrocortisone, and steroid hormones (glucocorticoids, if you really want to know), which means our first birthday present is that we get to enter the world innocent, healthy, and high as fuck.

— And that’s OK, because if it weren’t for altered states of consciousness, we’d have no genuine experience of this world’s completely random nature at all.

Since we can’t be born every time we want a fresh jolt of reality, we spend the rest of our lives self-medicating.

Holistic medicine the old-fashioned way

The brain operates a crackhouse in our heads, producing such heavy hitters as dopamine, a natural upper which makes us talkative and excitable; endorphin, an anæsthetic which has three times the potency of morphine; and serotonin, a mood enhancer which makes us act and feel like hippies.  Most of the meds recommended by school psy-charlatans for depression or anxiety alter the amount of serotonin produced by the brain.

These mind-altering substances have side effects which can prove worse than the emotional irregularity they medicate, such as violent tendencies, hallucination, depersonalization, derealization, psychosis, phobias, amnesia, and obsessive compulsive disorder — and that’s just for the benzodiazepines.  We don’t hit heart arrhythmia until Eldepryl™.

Sexual dysfunction and gastrointestinal distress commonly affect patients taking Selective Serotonin Reuptake Inhibitors, or SSRIs.  Pop culture knows this hip family of psychomeds well; it boasts such rock stars as Paxil, Prozac, and Zoloft.  Approximately twenty-two million Americans take these drugs every day, or statistically, every fourteenth American one encounters on the street.

So, the next time you’re shocked at the number of complete assholes you meet in a given day, remember that one American in fourteen hasn’t taken a shit in four days and hasn’t had an orgasm in months.

Without sex and regularity, anxiety patients feel much better

II.

If the human brain were able to regulate its own chemicals, nobody would recommend cooking up meds like Prozac and Paxil.  Since science has proven that many brains cannot, though, society accepts these meds and even allows for a margin of error in prescribing them to healthy people.  Many groups in the United States froth at the mouth over the prevalence of drugs such as these — as well as that of other mind-altering substances, both legal and illegal.

One might as well try to place the entire nation on a single diet as try to stem the amount of self-medication engaged in by Americans, though.  Seventy-two million of us diagnosed ourselves and regularly took some sort of alternative medication in 2002.  The rest of us might not consider ourselves to be self-medicating, but we are, of course, and not just with the usual Tylenol, Robitussin, and Pepto-Bismol, either.  We purposefully alter our brain chemistry all the time.

Over half the population of the U.S. drinks coffee on a daily basis to take advantage of its stimulant properties.  Sixty-four percent of us drink alcohol, perhaps to counter the tension from all our coffee.  Twenty-two percent of us smoke cigarettes to relax, especially while drinking alcohol or coffee.  Approximately eighteen percent smoke grass.  That’s without even discussing all the more-inventive drugs, such as LSD-25 and MDMA.

In addition to all this we must consider the oceans of so-called “health nuts.”  Fitness fanatics come in various degrees of seriousness and mental stability, from the casual weight-lifter to the manic Olympic triathlete, and nary a one of them considers himself or herself a drug addict.  Nevertheless, the scientific community established long ago that physical exercise heavily affects hormone, endorphin, and serotonin levels, and also that addiction to these natural substances occurs easily, naturally, and predictably in lab rats.

Since these highly addictive endorphins target all the same opiate receptors, 24 Hr. Fitness can be considered the modern American opium den.

Portrait of the American Addict

III.

We certainly do like to fuck with our brains.  Who can blame us, though?  As aforementioned, we’re the inheritors of broken machinery, the unhappy inhabitants of chaotic mental domains which do not even function in the haphazard, unpredictable way they should.  Humans fix things.  When a shoe comes untied, we tie it.  When a brain comes apart, we glue it together with whatever we happen to have on-hand: coffee for fatigue, whiskey for tension, tobacco for anxiety, what-have-you.

When we tinker with our minds, we’re seizing temporary control of our neurochemistry.  We don’t drink alcohol in spite of its tendency to impair our judgment; we drink it precisely because it impairs our judgment, and unlike other mind-altering addictions such as — oh, I don’t know — television, say, we know exactly how our brains will change when we indulge.

Humans have used mind-altering substances since the dawn of time.  Beer alone has a documented history going back six thousand years before Christ.  When we look at our ancestors from so long ago, though, we can’t help but notice that their uses for beer, wine, tobacco, drugs, et cetera extended far beyond self-medication.  Of course they used them for recreation, but the original use for most of these so-called vices was to create an appropriate environment for religious and spiritual rituals.

The Greeks drank wine to evoke the ancient god, Dionysus.  The Jewish tradition of the Passover Seder requires four glasses of it per person.  Five million Hindu sanyasi sadhus smoke hashish to repress their sexual desires and aid their meditation.  Over fifty American Indian tribes practice Peyotism today, a religion centered around ritual use of natural mescaline, which they use to communicate with the dead and with various deities.

These people aren’t balancing their serotonin — they’re putting gods on speed-dial.

Not seeing angels and demons, yet?  Here, drink some more of this.

They're gateway drugs, alright

IV.

These days religions get a bad rap.  Atheists may say that spirituality’s bad reputation reflects its failure to keep pace with contemporary Western civilization, its sciences, its paradigms, and its increasingly agnostic peoples.  Religions themselves, however, deserve no animosity.  One cannot judge a philosophy by its misuse.

Religions originally appeared because humans became convinced of evidence alerting them to other beings, other worlds.  Rituals appeared because humans wanted to commune with these other beings, other worlds.  Mind-altering substances proliferated in rituals because they provided sufficient evidence of their usefulness to millions of adults with brains the size of cantaloupes.  We no longer use these drinks and drugs to speak with gods, though, because so many people these days seem to think they can do it without spending beer money, and many others don’t think very much of the idea of talking to gods, anyhow.

In other words, lots of boring self-styled “realists” think those other beings, other worlds never existed in the first place.

The funny thing is, everyone on planet Earth believes wholeheartedly in lots of things that don’t exist.  The value of currency, for example, is absolute balderdash.  Currency is valued for its markings and symbols, which invoke the names of people who lived hundreds of years ago and declare mottos and oaths in ancient, dead languages; markings and symbols which cast an enchantment over both buyer and seller, so that in this mutual confusion one can purchase an automobile with nothing but decorated scraps of parchment paper.

There is no difference between the purpose of the markings on a dollar bill and that of the markings inscribed within a sorcerer’s sigil, or those upon an altar, or even those upon a WELCOME mat.  We live in a world of our mind’s creation, and everything real to us has been made real by us.

How did we miraculously make reality real?  Easy.  We simply named it that, like we did the table, the chair, and the dust bunny.  “Reality,” we said, “thou shalt be real,” to which so-called reality said in its easygoing way, “Alright,” and that was that.

The unreal didn’t mind being left out at all, though, because all of a sudden, it didn’t exist.

Wait, did you guys see that -- or am I crazy?

V.

So, here we are, then . . .  Nothing is real, and nothing is unreal.  Quite a mess we’ve gotten ourselves into at this point, and we’re very proud of it.  Naturally, we’ve taken the next step and done what any bipedal, cerebrally cortexed hominid would do in this situation: we’ve become ontological agnostics.  We don’t know what truth is, where to find it or how to prove that it’s there, but we believe in it all the same, bumbling about like the decorated surrealities we are, chasing after decorated scraps of parchment paper, and taking turns chastising one another for having faith in decorations.

What arrogant, blustering bastards we all are.

But how can we escape this cycle of idiocy?  How can we step from delusion and credulity into anything but delusion and credulity, if everything we know seems illusory and incredible?

Beer.

Cold, crisp, clean — beer.  And pills.  And smokes.  And coffees, wines, and liquors; buttons, tabs, and capsules.  Strenuous, extended exercise.  Yoga.  Za-zen meditation.  Brutally sorrowful dramas, uproariously hilarious movies.  Bitter, hate-filled debates.  Violence.  Pain.  Exquisite, sin-soaked and passionate pleasure.  The sweetness of selfless generosity lifetimes long, the glorious splendor of victory in competition, the self-righteousness of upbraiding one’s brother for having fallen from grace.  Mind-altering substances, mind-altering experiences.

In a paradoxical word, we can step away from the illusory by taking a break from reality.

In a life where nothing you think real can possibly exist, a world of erratic change and nebulous phantasms, mind-altering substances and experiences offer the most realistic opportunities available to a human.

— But of course, one could just go on as a believer . . .

With a glazed look and a raised glass I remain,

Yours Truly,

-BothEyesShut



Self-Abasement, Incorporated: an Industrial Revolution

At the U.S. headquarters of Self-Abasement, Incorporated, a boss begins to instruct his underlings in the delicate art of business attire.

I.

Business attire, as we all know, is that particular brand of fashion which obscures one’s personality. Business attire offends people at places of relaxation and amusement, and doesn’t look distinguished in one’s workplace, either, regardless of how much money one has spent on it.

Business attire, though having been designed to look respectable, handsome, and elegant, fails to do so, because while companies can require that one wear a pinstriped skirt, they can neither require that one should own several such skirts, nor that one should daily press the wrinkles out. The boss can force us to wear a tie, but not to tie a fresh knot daily. These are discretions belonging to the wearer, and this is the irony of business attire.

When one’s silk tie has been in the same Windsor knot for six months, it’s insincere to feel elegant.

You'd be amazed at how long a necktie can go knotted, how long a bra can go unwashed

Yet the boss, a college graduate of average ambition, has also a boss, and this chief boss is the one telling him to enforce the company’s dress code. The command strikes little boss as odd because the dress code has always been followed with little trouble.

“But no,” the chief tells him. “Following the code is just acceptable; we can’t have our employees looking acceptable. Our employees represent the company, and the company can’t look just acceptable.”

“No,” says the boss, “of course it can’t, of course the employees can’t,” even though he is thinking of the word acceptable, its definition, and wondering why there ought to be a dress code at all if not to define precisely how employees should dress for work.

So the boss bows out of the presence of the chief and makes his way to his own cubicle. His cubicle has a window overlooking the blacktop of the parking lot below, because he has worked with the company for twenty-one years and has earned this luxury. Once there, he reviews the company’s dress code, then clicks his mouse pointer to create a new document. His creation takes forty minutes. Making copies takes three. He delivers them to his underlings in no time at all.

The cubicle creatures have become wary of the boss’s hardcopy memos, so they wait until his squeaking loafers have rounded the corner to pluck one up and take their medicine.

They grimace at the familiar Arial font, and they sneer at the bullet points. The tone and content of the memo are no different from any that have come before: heartlessness approximates professionalism; condescension masquerades as magnanimity. Tragic, terrible irony seeps from every typo and grammatical error. The cubicle creatures begin to pop up like gophers. They peer over the walls of their little boxes at one another, holding up the memo and pointing.

What bullshit! They can’t do this to us. I’m going to talk to Johnson right now. Can you believe this shit?

They cannot believe this shit.

I Cannot Believe This Shit

ATTN: ALL EMPLOYEES

AS OF 4/25/10 the dress code is being clarified. Some employees arent following company procedure so this should help them dress aproppriately for work. NO EXCUSES! NO EXCEPTIONS!!!!!!!!!!

– Shirt and tie, men

– BLUE or BEIGE blowss, women

– BLACK or NAVY BLUE slacks

TO CLARIFY IN ADDITION!

– Mens slacks must front crease

– NO JEANS on Fri. anymore per Johnson

– Polo shirts are only all right Fri. on floor 3 if they are blue or beige

– EMPLOYEES MUST SHINE/POLLISH THEIR SHOES EVERY WEEKEND BEFORE MON. Mailroom employees must black nylon laces

– No dangly ear rings

– CLEAR or RED only pollished nails

John Johnson wll be reviewing staff Wed. to make sure these rules are being followed.

Thank you for your cooperation,

Gary Melendez

II.

Sometimes when I’m at my job, tappity-tap-tapping on my plastic keyboard and diddling the little touchpad on my laptop from time to time, it occurs to me that I’m accomplishing work which, not so long ago, required hours of painstaking, interminable scrawling on sheaves of expensive parchment.

Thank you, Industrial Revolution.

The underlings of Self-Abasement, Inc. do not feel the benefits of that historic occasion, though. They feel the crushing weight of imaginary duties, instead, because the introduction of technology to the workplace has eliminated most clerical work, leaving employees with more time between tasks than ever before, time which bosses must fill in order to look industrious.

Having long ago mastered the art of making two hours of work look like a two-day job, proletarian underlings manage to keep their jobs, and this explains how American employment competes with technology which would otherwise make human labor obsolete.

Bosses know that their underlings cut corners and screw off for large amounts of time (being themselves guilty of the same thing), so the bosses spend most of their paid hours playing gotcha! with the rest of the staff, ratting out the minimum of underlings necessary to look busy.

Underlings, bosses, and chiefs all have more free time, but the sergeants to whom the chiefs report have no more free time than previously, because sergeants never did any of the clerical work, anyhow.

Sergeants do labor which C.E.O.s need done but cannot do themselves, labor requiring certain talents and educations which computers cannot be programmed to use. In addition, companies need creative, educated humans in virtually every area of their industry, so these sergeants find themselves in high demand, spread thin, overworked and under-appreciated.

The sergeants have meetings, at which they give presentations, with which they sign deals, by which they secure work and money for their employers, which also secures the employees below. They are hard to reach, rarely seen in the office, and have little time for shenanigans. Their private time is taken up with anything and everything that could possibly relax them.

— Drug habits and divorces, for instance.

The big meeting feels like a summer holiday, when your cocaine has gone up both nostrils and your hands have been up both skirts

As a very protracted result of industrialization, then: underlings inflate their jobs in order to look busy and justify their positions; bosses inflate their jobs in order to look busy and justify their positions; sergeants enjoy the odd amphetamine here and there and become extra-marital enthusiasts.

What, the reader may ask, are the chiefs doing during all this self-inflation?

Sergeants have no time to police them and must be content with available evidence that the chiefs are doing their jobs — but just what, exactly, were their jobs? Since the chiefs divided their responsibilities among the bosses, the job of the chief has evaporated into the delegation of labor amongst laborers who are many times more experienced at accomplishing these tasks than the chief ever was. In physical terms, the chief actually does nothing.

However, nothing is a very difficult job to perform, as it turns out.

In order to earn wages for doing nothing, the poor chief must somehow take credit for the work his underlings complete and build hard evidence of having had a hand in it, as well, which proved an intractable challenge until the late-twentieth-century innovation of micromanagement.

III.

Some definitions of micromanagement stretch for whole paragraphs, while others curtly name it in a concise six or seven words. Micromanagement describes more than a mere business philosophy, though. It is an undiscovered culture. It is an esoteric cabal.

Micromanagement is a sorcery woven over North America which upholds the global economy, feeds innumerable hungry mouths, and maintains the eminent prestige of the corporate-American business style.

It shares also the unfortunate distinction of the Faustian pact, however, in that it happens to kill everyone who subscribes to it.

The micromanager, here seen protected from unemployment by his circle of arcane documentation.

When chiefs first aspire to practice micromanagement, they begin by conjuring new requirements to add to existing regulations. This increases the complexity of the rules, and since they must enforce these rules, this inflates the scope of their job, likewise. In the case of the wretched cubicle creatures at Self-Abasement, Inc., for instance, their chief focuses on the company dress code, which had been a perfectly functional dress code except that it was too easy for his employees to follow and therefore did not give the chief anything to do.

By adding a few superficial, superexacting details, chiefs ensure that their cubicle creatures will resist this tyrannical posturing and fail to observe all new regulations. The chiefs then sign a few official documents of reprimand, obtain the signatures of all offending employees, and in this way create a paper connection between themselves and the actual labor performed by the underlings.

Memos, too, serve to solidify a micromanaging chief’s presence in the office. Suggested by the sergeants and articulated through the chief’s invariably horrific grammar, they explode in mass emails like viral outbreaks, or wind up scotch-taped to cabinetry in the staff lounge, stall doors in the restrooms, or any number of surprising locations where one would not expect a memo to lurk, such as inside the silverware drawer in the kitchen:

DO NOT PUT FORKS AND SPOONS IN THE UTENSILS DRAWER!

These officious memos help to prove the indispensability of the micromanager, and also make his or her presence known throughout the cubicle labyrinth, invoking him or her like the summoned incarnation of a corporate Zeitgeist. Without the ostentation of these memos the chiefs would seem incorporeal, because by nature of their work (which does not exist) they toil alone in their offices, leaving them only to use the restroom or drop in on a boss to make certain the chief’s responsibilities are being sufficiently handled.

This, of course, raises the question underlings have pondered since the inception of the micromanager: if we’re out here doing all the work, and all he does is come up with crazy new rules every two weeks — then what the hell is he doing in there all the time?

It is the opinion of many cubicle creatures that copious amounts of auto-eroticism transpire in the office of the chief.

Connectivity. Infrastructure. Masturbation.

IV.

The Industrial Revolution of the nineteenth century put thousands of people out of work, and forced thousands more into new schools instituted to train farmers for life as factory hands. Had those day-laborers developed the sort of industrial sleight-of-hand practiced by micromanagers today, they would have been hailed as geniuses. They would perhaps have spent their working hours in the shade of apple trees, shouting perfunctory instructions to the other hands and winning their contempt, like this:

“Smith, yer gone need ter lift that hoe up t’yer shoulder to keep the furrow nice’n straight, hear?”

“Sho’ is a fine thing we got Johnson ter tellus how ter hoe ‘n sow ‘n plant ‘n scrape. I wonder where he gits his idears from.”

“I reckon those idears o’ Johnson’s come from about the same place as the manure do, but I sho’ wouldn’t mind trading up fer his salary, or fer his shady patch o’ sittin’ over thar, neither!”

That micromanagers work illusory jobs for pay does not seem inherently evil, though, as all the crucial work seems to be getting done, anyhow. Giving people something to do simply because people need something to do hardly appears like the worst thing in the world; mentally handicapped individuals have been employed in this fashion for decades, as have convicts, and even grandchildren (“Do what Nana says and sweep those leaves into a big pile on that side of the yard, and let me know when you’re done so I can show you how to sweep them back again.”). Micromanagers commit but a misdemeanor in duping dimwitted companies into paying them for inventing paltry regulations and decorating the office with memos.

In the innumerable tortures they design for the pathetic, piteous cubicle creatures, though, they betray themselves as the authors of fresh hells, their mass emails sundering the contentment and optimism of scores of people with neither shame nor care. The despair these micromanagers distribute as part of their useless, makeshift jobs horrifies the hapless cubicle creatures slowly, their gaunt faces growing more sallow and lined every day as though forced to watch imperturbable carpet bombs falling over an amusement park in crawling, relentless slow motion. Dress codes, new forms, an additional mite of data entry, an extra stop on the fifth floor to obtain a signature, the straws stack upon the quavering spines of corporate employees all the world over — hourly paid, conveniently quashed like cockroaches.

The proverbial last straw never comes for the cubicle creature, though, because each poisonous favor is only as brutal as the last, and like cuckolded indentured servants, they can only endure the apathy of their superiors through the anæsthetic of mindless subservience.

One is not mistaken also to detest the cubicle creature. Consider that while financial constraints may convince cubicle creatures to demean themselves daily like cowering, obsequious rodents, the shoe polishers of the world, the garbage collectors, sewer scourers, bedpan changers, septic tank adventurers and other dauntless laborers of unseemly occupations go about their business with all the dignity and assurance of a British barrister; the cubicle creature, meanwhile, has sacrificed self-love and self-respect for the sake of a dollar or two per hour above the wage generally paid to teenagers working in fast-food restaurants.

Marty Feldman, having left his position at Self-Abasement, Inc., re-learned how to smile and began an unlikely career in cinema. Seen here in early recovery.

V.

What course of action, then? When I reflect upon the farmhands during the Industrial Revolution, I imagine them going to work in factories with the same resignation and mental fatigue in their faces I see on those of the cubicle creatures, the bosses, and the micromanaging chiefs. This inheritance of misery cannot be tolerated.

However, the solution is not to stamp out micromanagement; that seems implausible. Micromanagers generally possess few marketable talents and so would not know what to do with themselves were it not for micromanaging. They will defend their philosophy to the death. They sink in a quicksand of their own devising, and like Dr. Faustus, they do not believe that it will destroy them.

The micromanagers, themselves, appear doomed.

Readers given to martyrdom may decide to practice the Way of Nice for their respective chiefs, but should one find oneself in the position of the cubicle creature, the boss, the chief, or the sergeant, one would do best to quit the place like a spark leaving the flint.

Corporate offices transform human time and energy into cashola. That is their purpose; they have none other. Unless one could change one’s living days into enough capital to justify such a dark metamorphosis, to take a position in a corporate office is to commit oneself to a sanitarium operated by lunatics.

Most corporate fucks work jobs that they hate in order to feed, clothe, and educate their children, transfusing their very lifetime into that of their offspring. Their personal joy and appreciation for the beauties of life visibly deflate from them with every passing day, and many live in fear of termination like battered housewives clinging to abusive spouses. Self-destruction does not raise healthy children. It were better to live with dignity and pride somewhere in a rent-controlled ghetto and nourish one’s family with ramen.

As the great Al Pacino once said, “There is nothing like the sight of an amputated spirit; there is no prosthesis for that.” No, and there is no salvation for those who commit a daily suicide all their lives, either.

Beware the promise of material happiness or contentment.

Beware the myth of financial security.

Beware the fiscally ambitious and the ones who have it all.

— But most importantly, beware that part of you which dreams of winning lotteries, marrying rich, or retiring in a large, beautiful home.

It’s the part of you the rest of us have most to fear.

With remarkably tenacious optimism I remain,

Yours Truly,

-BothEyesShut


Disinformation and You: a Love Story

Politics offend me.  What is it about government that causes such horrendous emotional amplification?  Whenever someone posits a political opinion at the beer-talk table, others hurry to kill or die for their disagreements.  This rash Friday-night idiocy once disgusted me, but the contempt I’ve felt for such reactionary exchanges has frankly become condescension.  My knee-jerk reaction to deeply concerned, utterly serious political conversations is to make sarcastic, snide remarks against the childish manner in which these discussions are generally conducted.  For “In a Real World. . .” though, this would be too easy, and would say too little.

Rather, let’s have a look at modern society’s treatment of world politics and see what remains to talk about afterward; though I must say I find talking about politics. . .  Really fucking embarrassing.  So, this doesn’t leave the room — OK?

I. Hooray!  Disinformation Is a Way of Life

It is irrational, pompous, and presumptuous to think one holds enough dependable information to come to veracious political conclusions.  For this reason, I’ve always fantasized that a president’s first day happens something like this:

“Would you like some water, Mr. President?”

“Oh, no. . .  I mean — no, I’ll drink it straight.”

“Don’t feel bad.  Clinton passed out when he learned George Washington still secretly headed the executive branch from his empty crypt behind the White House.  It gets everybody, the first time.”

“It — it wasn’t the Washington zombie, so much.  I had anticipated that.  I just hadn’t expected his bionic life-support to look so much like, like — like Angelina Jolie.”

“Yes, well, President Washington picked up cross-dressing in the 1940s.  Who d’you think got J. Edgar Hoover into it?”

OK, so I may be exaggerating.  The basic idea is about right, though.  If there’s anything I feel secure in, it’s government secrecy.  I doubt they give Obama the code to program the White House’s TiVo.  Governments cover up everything, and that really shouldn’t be news to anyone.

George Washington presiding

Washington, D.C.: more secrets than a legion of adolescent girls. Why is President Washington's crypt empty, again?

Since a government’s first priority is to cover its own ass, it may be expected to take measures to cover said ass.  As these measures protect the government best when they’re also least conspicuous, governments hide, obfuscate, and divert attention from these measures.  I will call this activity by its classified codename, Operation Chickenshit.

Civilians interested in Operation Chickenshit must contend with its wily evasive maneuvers.  Working daily to suppress the news are hundreds of wildly clever, obscenely educated, anonymous Chickenshit agents with indescribable power at their disposal.  These suits work long, well-paid hours to shut up all so-called “sensitive” information except that which has been manufactured to obscure or omit the truth.  News sources can always be expected to omit more than they include.

As any half-blind, half-deaf White House attendant can tell you, politics happen in limousines and restrooms, not on the floor of the chamber of the House of Representatives.

So, intrepidly armed with watered-down news influenced heavily (and occasionally outright controlled) by Operation Chickenshit, we form entire political belief systems to wear as fashion accessories, then impose upon one another what we consider informed opinions.  We’re like arrogant little gourmands judging the dishes of a feast by reading the conflicting reviews of food critics, without ourselves having the slightest ability to taste any of the food.

Oh, like your concept of world government is gonna make it past this guy intact. Yeah, right. And there're like, a bazillion-dillion guys like him working in propaganda. Come on. Get real.

For some reason, though, people take it for granted that politics may be wholly grasped and engaged in by any flag-waving prick on the street.  Often, poli-sci hobbyists sneer at religious fanatics who argue over the nature of God (big G) because it seems ridiculous that so inconsequential a being as a human might measure gods.  These same detractors, although reasonable in their scorn, consider it well within their own reach to discern the clandestine movements of governments, governments with the power to order the invention, construction, and deployment of nuclear submarines, stealth bombers, and surveillance satellites orbiting planet Earth.  These same self-important armchair philosophers (yes, I realize I have named myself) pontificate at length about exactly why American troops invaded Iraq.  I contend that, beyond the existence of troops there, very precious-fucking little can be known from a civilian perspective.

The purpose of their (or any military’s) mission will never be understood by any one civilian, agent, or president.  This is because the matter has causes too large, too plentiful, too varied, too far away, too long ago, and too inexplicable for any single person to know at once.  George Herbert Walker Bush may know what Reagan was doing in Nicaragua, but he can’t know which multinational corporations were pulling strings in drug cartels, nor what was motivating the contras to clean and oil their assault rifles, kiss their loved ones goodnight, and go dutifully to work.  That sort of information can’t be garnered through wiretapping any more than the quality of a novel can be ascertained through the study of sales statistics.

Our great-grandchildren will have it fed to them by Operation Chickenshit in high school, though, all tied up in one neat, tidy little paragraph between what transpired in New York one fateful autumn day, and the election of America’s first Afro-American president.  And that, my friends, provokes me to laughter.

Columbus

America protected the Western world by invading the Middle East, does not influence Central American politics, and was first discovered by Christopher Columbus (Great Amer. History textbook, Ch. 1, 5, 15; Questions 3-12, due Thurs.).

To understand the height of conceit one must attain to insist that one comprehends politics, one has only to consider the possibility that momentous events have secretly occurred.  Have people simply disappeared in large numbers?  Of course.  Have foreign governments been hijacked by the surgical placement of agents within?  Of course.  Have technologies been developed, the use of which would horrify the contemporary mind?  Of course.  Have the people of the world been permanently convinced of a lie so egregious in its enormity that dissolving it would result in nationwide rioting?  Of course.

It’s conspiracy theory, one would say — to which I rhetorically remind: have conspiracies transpired in every government since the dawn of civilization?  Of course.

Governments, in fact, are mere conspiracies in full bloom.

II. All the President’s Men

A conversation criticizing political conversation can’t be without mention of political leaders.  An inordinately large portion of such talk orbits the actions of presidents, congressmen, representatives, and governors.  Little talk is made of mayors, though, unless one’s current mayor has become embroiled in a fiasco of some kind or other.  We do not seem as interested in the non-scandalous activities of our mayors as we are in the minute-to-minute business of our president, and that’s strange, because the mayor is a person we can shake hands with if we don’t mind hanging around city hall long enough, someone we can speak to directly at council meetings and press conferences.  The President is someone whose very existence can be verified by only a few people, as few can get close enough to him to collect firsthand evidence.  Most people see a president on television and automatically “know” that he exists, presides, and impacts lives as surely as a sledgehammer affects furniture, even though the vast majority of people see no more of him than the constituents of Oceania saw of Big Brother in Orwell’s 1984.

I do not mean to place the President in the same box as Santa Claus and the Easter Bunny, because that ~~would be way too much fun~~ would sound counter-intuitive; however, a rational, realistic reassessment of his function seems necessary.  Before we begin, let us take stock of some other positions in U.S. government.

Presidents

Who's this, the real President of the United States? Don't be ridiculous. This is the shadow coach and assistant shadow coach of your kid's soccer team. I have no idea who runs the White House.

Of the government of the United States of America, there are: 9 Justices of the Supreme Court, with a total of 37 clerks; 100 people in the Senate; 435 people in the House of Representatives, not to mention 4 delegates and 1 resident hotshot; and 18 current cabinet members, not including the Vice-President and the Speaker of the House.  Also unofficially affecting our government are 12,553 registered lobbyists in Washington and innumerable pressures from Wall St.  To be perfectly textbook about it, there is also a Constitution governing all of this, having 7 articles and 27 amendments which are ostensibly inviolable.

There is exactly 1 President of the United States.

While it would be naïve to say that the Chief has no real power (as there are over 1.6 million veterans of the Middle East conflict who assure us he does) it would be equally silly to consider him anything but a single part of an enormous, plunging political machine with enough gathered inertia to operate without outside instigation for centuries.  The American government is also the result of centuries of social and economic structuring that occurred in Europe and elsewhere.  Small dominoes, then big dominoes, then gigantic dominoes fall in a nigh-endless march through our past, present, and future, and of these most American presidents represent a shockingly small fraction (there have been 43, by the way; considering our nation is only 2.35 centuries old, that’s a notable diffusion of responsibility).

What this means to me — and sometimes I feel the pariah for it — is that the President is no more than an eddy in an immeasurable whirlpool, a momentary breeze on the outskirts of a tornado, a glowing rivulet crawling slowly away from the fiery flow of a massive volcano.  This doesn’t change his relative importance, though.  Recognizing him as such merely places him in perspective, but this perspective is necessary to keep one’s balance when discussing politics, and especially when speaking with one of the countless political zealots who love to talk about presidents the way music fans love to discuss the individual members of a band.  Which of these incessant chatterers sounds more pretentious is anyone’s guess.

On 22 December 2009, Lord Vader and his stormtroopers rang the opening bell of the New York Stock Exchange, thus greatly simplifying 218 years of American politics.

The emphasis many place on the relative success or failure of a certain president cannot be fathomed.  Changing a president amidst all the above influences and excitedly expecting significant change does not show the scope of reason befitting a literate adult.

Swapping out presidents is not like rebuilding the engine of your car — it’s like changing the oil.

The main function of the executive branch is the same as the king’s function in chess, which is to distract attention from the rest of the board.  All the actual action is really taking place among the knights, bishops, rooks, and pawns, but the largest portion of any country’s populace is mostly uneducated and entirely uninitiated, and therefore lacks the tools to correctly appraise its leaders (largely why countries need government in the first place) so the president and his cabinet provide a sort of sitcom for all the uncreative types to cheer for or bicker about, vote for or demonstrate against, and generally spend all their mental energy spinning their wheels on.  Meanwhile, the brains of the constituency watch and laugh, watch and cry, or place a thoughtful finger to their chins and say, “Shit, even I can do that,” and throw their hat in the ring with the other candidates.

III. Election Day Apathy: “Ooooh, Goodie!”

So what’s an American to do in November, roll over?  Oh, hardly.  Some things can be researched to a point of relative certainty on the local level and are worth acting upon, and a few things on the global level are almost certainly true, such as the presence of troops in Iraq at the moment.  If it suits us to vote on this basis, then we can feel fairly certain that someone will tally our ayes and nays.  And there are the civilly disobedient routes of expression, also, lest we forget.

There are ways to understand governments, too, if one rejects the impulse to think in terms of pundits and parties.  A man would do better to examine himself to find hard evidence of what a government is, would do better to examine his neighbors, and if possible, the people in other counties and states.  If one wishes to understand American government, one should begin with the American people, and end somewhere in China, Africa, or the Middle East.

In other words, a worm may understand the tree through a thorough study of the apple.  Should the worm develop an avid interest, the apple’s seed should afford all the wonder and mystery inherent in this universe, more than enough to internalize the complexities of the tree, and, who knows?  Perhaps even a thing or two about what it’s like to be a worm.

And there ain’t nothin’ Operation Chickenshit can do about that.

With Frank Incense and Mirth,

-BothEyes

True, False, Fuchsia!

When it’s done well, conversation’s an art that impresses me more than anything in the world.  Humans learn all sorts of fascinating minutiae while tooling around the world they inhabit, and some of them have a good sense of humor.  There’s nothing like talking with someone who can make you laugh and teach you things at the same time: gossip, trivia, history, world culture, current events, important and unimportant things, inexplicable things, and things as mundane as what happened on last night’s episode of “Whatever.”  Hell, people can even provide an intuitive guess at things they don’t know, which, after some cross-referencing with other people, usually becomes one of our educated guesses, and upon which many of us regularly depend.

In Southern California, however, we have a treasured tradition of attempting to convince one another of our ideas and opinions.  We squabble over the quickest route from A to B, and exhort one another with banners and bumper stickers (especially around election time).  Even our fucking tee-shirts bear the slogans and advertisements of our favorite points-of-view.  In popular gathering places, the usual discussion happens in every color of the rainbow a thousand times over:

“Yes, it is!”

“No!  It isn’t…”

Lou Pinella

Bears don't look like this unless they're going to maul each other. This peaceful show of aggression is a purely human trait.

All this shallow bickering should have stopped in grade school, but our social development is arrested by our earnest desire to help — at least, that’s the noble reason I’m giving for it; pride in our powers of perception fuels arguments at least some of the time.  Note also that there are excellent reasons to argue (see How to Refrain from Being a Dick for some examples) even though the bulk of arguments are bunk, but one must grow accustomed to the presence of contradictions and paradoxes in this life, and our desire to work together for the perpetuation of circular arguments seems to be one of them.  More on paradoxes later.

What reasons exist for giving up the incessant “tastes great / less filling” sort of tennis match and resuming less-combative conversation?  Read on, o’ my fellow friends of the Friday-night beer talk, and we might find a way to shut up our faces long enough to finish a watery American lager.

-Standing by One’s Opinion Is Vain

It’s a strange culture we live in.  We’re expected to be modest yet confident, friendly yet assertive, firm yet yielding, a list of directives that sounds like a good kiss.  It’s a fine balance, and in that balance we’re taught that “Your Opinion Matters!” even while all opinions are “like assholes: everyone’s got one and they all stink.”  I saw the former on a poster in a mall, I think.  My grandfather used to say the latter.  Who would you listen to first?

It’s true that everyone has opinions, though.  There’s little way around that, and if everyone has them, then one person’s idea is worth about as much as another’s, since even a so-called good idea can potentially be had by someone else.  Most people don’t argue other people’s points of view, though, and I find that extremely telling.  We’d often benefit by relating someone else’s point of view, rather than something we cooked up ourselves, because one can’t be accused of arguing out of pride when the argument posed isn’t one’s own.  I’d be happy to give you my stepfather’s opinion of the New York Yankees, for instance, because I’m not a sports fan and there’s little danger that you’ll think me very serious about myself.

Stacks

Zillions of pros, zillions of cons -- dreamed up and written by the most erudite people on earth. I'm sorry, what were you saying?

I’d also be glad to give you Marcus Aurelius’ perspective on willpower, Anselm’s ontological proof of the existence of God, Fuller’s evidence that the world needs communism, and other trite epiphanies, but please don’t hold me accountable for relating them!  They aren’t my fault.  They were written years before I was born.

Appealing to the authority of famous smarty-pantses of the world is a notorious logical fallacy — in other words, quoting Albert Einstein doesn’t make one’s contention correct — but it’s much less vain than presuming others should accept your opinion simply because you’re so fucking cool.

It may be that one’s own opinion is most politely stated as a question, like, “I wonder if Iggy Pop isn’t better than David Bowie?” but we can’t always talk that way, so it’s important to remember the following.

-Everyone Is the Center of the Universe

Nobody can have any point of view but his or her own.  Everyone is the center of their universe.  They don’t know what your universe is like, and you don’t know theirs.  The universes may have commonalities, or they may not.  Regardless, my daily evidence suggests that I am the most important thing to myself so truly and consistently, that even my most heartfelt principles and ideals are only worth dying for because, hey, that’s my opinion.

If I allow that you humans are like me, and that you are centers of universes, too, then there’s no fucking way I’m going to convince any of you of anything (unless of course I say something you almost agree with, anyhow, or simply hadn’t thought of yet).  It’s especially difficult to convince others of something they do not readily believe since the proliferation of Grandpa’s opinions-and-assholes principle, the aforementioned proverb our culture developed to devalue any and every opinion from Kant’s Categorical Imperative to the capitalism of Carl Karcher.

"Egocentric," by Tyler Philips. The question is, does the mind's eye emit, permit, or permute?

We’re partially convinced everyone else is wrong before we even arrive upon a topic of discussion, and that’s not surprising; we were there to witness every time we were right, and we partially doubt (or forget altogether) many of the times we were wrong.  Who, after all, can argue with the center of the universe?  Besides, even if Jack were to convince Jill of a certain idea, Jill would merely be placing her own judgment of Jack’s reasoning before and above his idea in question.

One can’t accept or deny an idea as logical or illogical without presuming a presiding authority.  Parents discipline their children out of an inability to dethrone the god who refuses to recognize Dad or Mom as a sovereign leader.  Obedient children obey only because they have decided for themselves that doing so produces favorable results.

-Everyone’s Opinion Is Justified, and Everyone’s Reason Is Erroneous

It seems certain that all opinions have the same subjective value, but ideas backed by logic or reason have quantifiable, given parameters by which they must be measured.  In the case of the subjective, we are almost always correct and justified (we earnestly do feel that X is Y); in the case of the objective, we are almost always incorrect and unjustified from the largest perspective, because we know too abysmally little to state things as universally, absolutely true, and can only be correct in small, easily defined, easily proven and quantified matters, such as arithmetic — and even then, paradox shows that we are wrong from other perspectives.

Paul

This fool on the hill sees the world spinning 'round. It's not surprising that he's the center of the universe. The surprising thing is, you are.

To illustrate the futility of solid logic on a universal scale, consider some rudimentary arithmetic: good ol’ 1 + 1 = 2.  Given that we will use Arabic numerals and some other laws previously agreed-upon, this is going to seem standard, true, and inarguable.  The answer, however, is vulnerable to alternative interpretation due to the accepted meaning of addition, and of “1,” itself.   If one were literally added to one other, then the result should be a unification of these two separate entities.  In math alone, we agree to call this unification “2,” but linguistically, philosophically, or metaphysically the logic falls apart, because a uni-fication must result in uni, the Latin word for one.  From these points of view, anything added to anything else will result in exactly one new thing, and we happily operate in a world where these two conflicting perspectives are both true at the same time, never questioning either of them.

The desert is hot?  It’s an icebox compared to the sun.  Your OJ too sweet?  It’s entirely sour when opposed to honey, and honey’s still bland compared to a mouthful of refined sugar.  The validity or value of everything depends on scale, context, and relativity, and for this reason everything is true, and everything is false.  Proving anything one way is the silliest pursuit in the world, because there are an infinite number of other perspectives, each of which may equally prove or disprove it depending on what you’d like to accomplish.

From the broadest perspective, in other words, absolute truth is as arbitrary as the selection of a crayon.  And I want a fuchsia dinosaur.

The coup de grâce is really brutal, though: even upon reaching a stable, static, universal truth, we find that the entire universe is in constant flux, rapid change, turmoil, decay, permitting, emitting, transforming, creating, destroying, and “moving on,” as Stephen King put it once, so that any true answer was only true for that universe at that time — and that was some time ago.

Forest Entropy

Nothing is static, everything is evolving, everything is falling apart. -Tyler Durden (Chuck Palahniuk)

-A Somewhat Oriental Alternative

My opinion is that contradiction and paradox are the bread and butter of the cosmos.  If I may be allowed to appeal to some authorities, quantum physics (and several other religions) agree with me, not to mention Ken Wilber, who looks so cool you just know he’s hip to his shit.  There’s nothing wrong with being wrong.  We’re all wrong.  I’m wrong right now; just ask all the people who stopped reading halfway through.  When people respond to a question, “Hmm, well yes and no,” I hear the warm laughter of oblivion and smile inwardly, but when I hear people insist that they know what they’re talking about, I have to laugh at myself for having absolute confidence that they should not be so confident.

Habitually trying to convince others to change their opinions is not only futile in the long run, it’s also genocide against the opinions you don’t hold.  Who wants everyone to agree with one another?  That kind of peace and harmony sounds fucking beige.  The only reason I have no contempt of those oh-so-cooperative insects, ants, is that deep down inside, I really believe they’re all arguing over the appropriate size food granules should be for carrying back to the nest.  It’s one of the interesting things about them.

There’s nothing wrong with letting the Rolling Stones be better than the Beatles, so long as the Beatles are also better than the ‘Stones.  I prefer Fitzgerald to Hemingway because I’m fairly certain that Hemingway is better.  Sometimes I wonder, perhaps I have never been very enthusiastic about sports teams because both sides of any game are sharing a fleeting moment in a fleeting century, a single strike of the paddle in a ping-pong tournament played on a cruise ship which rounds the peninsula and disappears into the mist, forever.  Hell, sports teams don’t even have the same members game-to-game, let alone season-to-season!  People will wear the same logo on their cap from kindergarten to their own funeral, though, and some of them will be buried in it.

That, dear fellows, is what it means to try to convince people of things.  It’s insisting that the fluffy, domed cloud up there really is a turtle because that’s the way you see it, and because you’ve seen a helluvalot of clouds, by god.  My favorite part of all, though, is the hypocrisy involved in writing this piece, hypocrisy which will also be necessary to criticize or pontificate about it.  Hypocrisy is the sudden realization that one is the person whom one has chastised.  We define ourselves by standing out in contrast to others, and that makes us all identical in our hypocrisy.  How cute.

The trick, then, is to simply avoid the hypocrites who really seem passionate about their-slash-your point of view, right?  We can do that, can’t we?  Then we’ll have peace, a much more apathetic, blasé temperament all-around, and that’s something for us all to work toward.

So, until someone comes up with a better idea, I remain

Yours Truly,

-BothEyesShut

*Note: The artist featured this week, Tyler Philips, may also be found at his design company, Circuit 26 Design.

  • Copyright Info.

    - All works are Copyright 2007, 2008, 2009, 2010 by BothEyesShut

    "In a Real World, This Would Be Happening"

    All rights reserved.

    - Header concept, photography, and design by Ruben R. Martinez (www.RubenRMartinez.com)
