Immortality, the Gift That Just Won’t Quit

The definition of death doesn’t hold much water, really, once all the voodoo juju is shaken out of it.  The harebrained doctors have one make-believe definition of it, the self-important scientists have another, and the whimsical believers have yet a third.  When one has faith in the existence of death, though, death can be a gateway, a rebirth, or even a redemption.  Anticipating death makes up the cornerstone of most world religions, while avoiding it remains the focus of most sciences.

— And that’s O.K.  There’s nothing wrong with any of those philosophies in and of themselves, but let’s eschew all that for the sake of conversation.  Let’s look at death without any allusion to typical, traditional beliefs.  What does death resemble, now?  A permanent medical condition?

Never mind.  Let’s just say that death is a simple state of affairs that any doctor can walk up and diagnose, like this:

“Hey, this guy’s dead.”

Why, this guy's dead!

The doctor means that the poor guy’s lungs have stopped breathing and his heart has stopped beating.  That’s clinical death.

Most realists think of death as nothingness, bleak, black, and empty, which is typical of them, because if there’s any way to have less fun and be more boring, the realists will practically kill themselves to show you how.  Even so, most atheists and agnostics think this way about death, too, which is disappointing, because as anyone can tell you, they throw the best parties and therefore oughta know better.

“What happens when you die?” you may ask one of them.

“Nothing,” they say.  “That’s kind of the point.”

OK, Mr. Sunshine, but nothing is precisely what never happens.  There’s always something going on.  Besides, lots of things happen when you die.  When you look at clinical death, it actually mirrors the very early stages of clinical birth, so to speak, which normal people call pregnancy.

In the earliest stages of pregnancy, the fertilized egg (or zygote if we really must) has forty-six chromosomes, as well as its own unique DNA structure.  Anti-abortion terrorists are keen to remind us that this little eggy wegg is alive, and they’re not wrong.  In fact, scientists pretty much have to agree with them, because the zygote exhibits growth, metabolism, reproduction, and reaction to stimuli.

Apparently, the smartypants bigshot scientists have decided that a thing is alive if it’s got those four attributes.

What the zygote does not have, though, is a lung or a heart with which to satisfy the medical doctor’s requirements.  Its respiration has not yet commenced.  Its pulse is nonexistent.

“Why, this guy’s dead.”

“Now, you just hang on a second there, Doc.  We’re picking up growth, reaction, metabolism and reproduction.  This sonofabitch is alive.”

Great.  So the zygote is dead and alive.   Perfect.

Perfectly nonsensical.

Zombie Zygotes of the Living Dead

Why not, though?  When a guy looks at his arm, he thinks of it as a living part of him, right?  If doctors amputate it from him, then no one looks at it quite the same way.  It’s dead now.  The amputation was, as far as his body was concerned, a little death (or, la petite mort in French, which incidentally means orgasm).

Yeah, why not?  After all, when a pregnant woman feels her baby kick, she thinks of it as a living part of her.  If doctors deliver it, and amputate it from her, then no one looks at it quite the same way.  The baby’s alive now — even though the amputation was, as far as the mother’s body is concerned, a little death (or, en français, orgasm by baby).

Dead and alive, alive and dead.

The dead aren’t really all that dead, anyhow.  We eat dead things to stay alive, in fact — but only dead things which have recently become dead.  Dead things become more dead over time, and we can’t eat things which have been dead too long.

There’s not enough life in them, you see.

But just wait a damned second.  A little death?  More dead?  Death isn’t supposed to have all these degrees, all these shades of gray.

Silly-headed cynics and so-called realists step in at this point and remind us, “No, jerk.  Death isn’t in degrees or shades, and it’s definitely not gray.  Death is that certain change that happens in the instant that life stops for an organism.  Those four things you mentioned earlier?  Growth, reaction, et cetera?  The body can’t do those things anymore, so it’s dead.”

Yeah, alright, sure, Professor Killjoy, but from the broadest perspective, death doesn’t mark any significant change at all.  It’s just another change in an infinite pattern of changes — or, if you like, it’s another death in an infinite pattern of deaths.  Life, in fact, is what we call this infinite pattern of deaths.  Look:

Human life begins with an ovum and a sperm combining into a zygote.   This means the death of the ovum and the sperm, because they no longer exist as such; their chromosomes have been shared.  The zygote then begins cellular division at an extremely rapid rate, each division a little amputation (orgasm) from the parent cell, and these amputations are what we call growth.  When enough cellular carnage has occurred, the child is amputated from his or her mother, and soon afterward begins to eat dead things because of the life in them.

Dead things taste good.

Food is dead-ish

As the child grows, cells are born, grow old, die; are sloughed off, are excreted, are absorbed as more fresh dead stuff to nourish and prolong life.  Cells divide, and divide, and divide.  The lining of the small intestine is completely replaced every four to six days, you know.  The outermost layer of skin, or epidermis, every two weeks.  The hard structure of the human skeleton, every decade.  Even this child’s blood, just like the blood of every living person, is composed of red blood cells which live in the bloodstream for about four months before being replaced.

An elderly man of ninety years, therefore, has lived inside nine skeletons.  He has consisted of two hundred seventy human bodies’ worth of blood.
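
For the arithmetically suspicious, the figures above check out.  Here is a quick back-of-the-envelope sketch in Python, using the rough replacement intervals cited above (ballpark biology, not gospel):

```python
# Back-of-the-envelope turnover arithmetic for a ninety-year-old,
# using the ballpark replacement intervals cited above.

years_lived = 90

skeleton_replacement_years = 10   # skeleton rebuilt roughly every decade
red_cell_lifespan_months = 4      # red blood cells live about four months

skeletons = years_lived // skeleton_replacement_years
blood_volumes = (years_lived * 12) // red_cell_lifespan_months

print(skeletons)      # 9 skeletons
print(blood_volumes)  # 270 bodies' worth of blood
```

Nine skeletons and two hundred seventy bloods, just as advertised.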

It’s all dead, though, remember?  We’re, like, hermit crabs or something.

Like our bodies, our minds unfold as a train of deaths and divisions, too.  Ideas grow and gestate, eating new information and transforming cold facts into newborn ideas, ideas which split and branch and grow of their own accord, just like a pride of lions flourishing from the carcasses of a few dead gazelles.  Sometimes ideas sprout from stagnant knowledge so automatically that our minds consider themselves inspired, but every new thought kills off an obsolete idea.

We grow and learn, shedding skin cells and obsolete ideas along the way like scraps of confetti following a parade, and when at the age of ninety we reflect on our adolescent selves, those teenagers seem long gone, long passed away, and the wistful feelings our memories evoke mimic those felt by mourners years after the funeral.

Death and life, life and death.

The thirty-year-old hermit crab and his previous shells

We still have no round definition of death, however.

Death seems no more than change and transition, and since change is an eternal constant, death must be occurring all the time.  If that’s so, then death as a single event does not exist.

If you think you’re going anywhere when you “die,” I’m afraid you’re horribly mistaken, as far as I can tell.  Nobody is going anywhere.  Nobody is going anywhere, and neither are the actions we are still making.  That the “dead” human mind no longer orchestrates these actions is inconsequential, since the mind was never orchestrating anything from the broadest perspective, anyhow, regardless of how intimately involved in the processes of the universe it seemed.

This will sound like glorious immortality to some and eternal damnation to others, so I guess that if you really wanted to you could call your opinion on living forever ‘heaven,’ or ‘hell,’ but don’t do that.  That’d be so tacky.

If all this sounds fantastic, consider that everything we are or will become was already here long before we were born.

All the material needed to put our bodies together had long been available before our births.  Our mothers merely needed to ingest some dead stuff and assemble it inside them.  The material to put our minds together had been here, too.  The elementary ideas, the deeper concepts, and the inner mysteries all, all, all had been waiting for our minds to ingest them and put them to use.  We were already here, waiting for assembly, just like The Great Gatsby had been when the Old Sport was alive inside Fitzgerald’s head, but not yet written down.

Sure, Dad can stick some spare auto parts together and build a car, but Mom can throw some spare body parts together and grow a person!

Cynics and skeptics will say, “An idea is not a thing, Sir,” and I must retort: well, where, exactly would you like to draw the line?  If Gatsby exists once he has been written down, what happens if the manuscript is destroyed?  — And if Fitzgerald writes him down again, is he birthing the same Gatsby?   What of publishing and printing?  Are all Gatsbys the same man, or different men?

Consider also the differences between brothers of the same family, raised in the same general time, by the same parents, on the same food, in the same area, with the same values, et cetera, et cetera.  One may grow up into a madman and the other a schoolteacher, but from the broadest perspective the difference can only be in human estimation, just like so-called death.  If we are arbitrarily, subjectively deciding what death is, then there really isn’t any such thing we can point to after all, is there?

In order to believe in death, one must think just like the doctors and scientists, coming up with their own willy-nilly criteria by which something can officially be called “dead.”  You may as well say that death is what we call the future, and birth what we call the past.

The Starship Enterprise notwithstanding, we will always be here, extant, just as we have always been here, and the proof and cause of both is that we can’t help but be here now.  There can be no escape.  We are captives of existence.  And why?

— Because the present time, nestled snugly between the past and future, between birth and death, seems very much alive, and it happens also to look very much eternal.

With much pleasure and measured amounts of pain I remain,

Yours Truly,

-BothEyesShut


Oh, Yeah? Prove it!

Every experiment has significance, even the inconclusive ones.  When a team of smartguys at M.I.T. completes a study with inconclusive results, it reaches the ineluctable conclusion that another study is needed and immediately sets to work on it.  This testing can, will, and does continue until significant findings have been produced — er, that is — discovered.

Once significant results appear, the doctors conducting the study become proponents of it and publish these discoveries in remarkably well-respected journals.  These paperback journals are written in tedious, turgid English that is too obscure for the public to read, and have an average cover price of thirty American dollars, ensuring that the general populace gets no chance to join the conversation until it is Mickey Moused by Time Magazine and sold as an impulse buy at the grocery counter.

Hey, whatever.  At least mom’s getting in some string theory.

Journals cost upwards of thirty bucks, but at least they're jam-packed with ten-dollar words

As in all things in this universe, the idea proposed in this new study begets its equal and opposite, a second study which exists to provide an alternate scientific belief for anyone and anything negatively implicated in the first.

The satisfying thing about science is that it loves conflict.

Scientific prejudices appear out of this conflict, and because they are prejudices of science itself, the public presumes them factual.   From the broadest perspective, however, science walks in the well-trod footpaths of religion and theosophy.

When science decides that a certain quantum particle does not exist based on its failure to appear in tests, science is as faith-based as the creation myth of Genesis.  Science and religion have traditionally been rancorous archenemies, but this is a misunderstanding which, if one could get them talking again, could easily fertilize the most affectionate of friendships.

This animosity has been based on little more than a clerical error, anyhow.  Note how science and religion interplay in the following.

Once upon a time, in a faraway land called Berkeley, there lived a doctor of physics.  This doctor believed in a certain particle he called the God Particle, and hypothesized that it existed everywhere and had an effect on everything else.  So the doctor wrote a paper and was granted funding to perform experiments in a very special place with very special equipment, and after three months of rigorous, painstaking trials, the poor doctor was forced to concede that no evidence of his God Particle had surfaced in any tests at all.

To the scientific community, this absence of evidence presents hard, objective proof that Doc’s God Particle does not exist.  Even if they add the word “theoretically” to the conclusion (as they do with the theory of gravity, which they still can’t fucking figure out) they still use the test as a quotable citation in papers arguing that the particle is a fantasy of the doctor’s.

To be perfectly clear: in popular science, the absence of evidence can prove that a thing does not exist.

How’s that for self-satisfied conceit?  They can’t even plumb the depths of our ocean trenches, but they’ve got E.S.P., telekinesis, astral projection, sixth senses, prescient dreams, and automatic writing all figured out.  How?  No evidence, that’s how.

Oh.  Well, shit.

Scientific evidence shows that there is no scientific evidence that scientific evidence is scientifically evident

Now, let’s say that following the most costly failure of his professional career, Doc is forced to return to teaching at a preparatory high school for rich kids, which amazingly enough also happens to inculcate Catholicism.  In this private school, Doc is lecturing about the existence of God during a religious studies class, when suddenly a particularly cynical and sarcastic student raises her hand and demands to know how it is that anyone can feel sure that God (big G) exists at all.

Well, this is the question for which the course entire exists, and so the doctor puffs up with dignity and conviction, and with great certainty informs his students that in all the centuries and centuries of assiduous scientific research, and of all the brilliant, most well-respected minds throughout history, not a single person has been able to prove that God does not exist.

To elucidate: in matters of religion, the absence of evidence to the contrary can prove that a thing does exist.

— And though science and religion may fixate on the same piece of evidence (that nothing has appeared in tests, in this case) they both exit these experiments feeling assured that their hypotheses have been logically supported, because objective reason has its roots in language, and language happens to have more than enough elasticity to correctly describe a single concept with two definitions, each the perfect opposite of the other.

As violent and arbitrary as this arrangement may seem, the truth is: the common person likes it fine.  In fact, practically everyone hates unchallenged assertions, even the people making the assertions, themselves.  Something about our nature causes us to see polar opposites in everything, and something about our minds causes us to invent contrary concepts for every conceivable idea.

Humanity likes nothing until it is contested, enjoys nothing better than a contest

It is this facet of the human personality which affords us such colorful figures as the venerable Flat Earth Society, which still maintains that the globe is flat; the irreproachable Tychonian Society, which avers that the sun orbits the earth; and one mad Dutchman at the University of Amsterdam, Erik Verlinde, who asseverates that gravity is, in fact, fictitious.

If the ever-patient and magnanimous reader finds the Flat Earth Society amusing, then the reader is hereby urged to consider that a number of contemporary physicists take Dr. Verlinde’s theory quite seriously, and that gravity may be merely the effect of a universe maximizing its entropy, or disorder.  The concept of gravity as a universal power may well not exist for our children.

Q: If gravity, of all things, really is a red herring, then how incredible and fantastic are groups like the Flat Earthers and Tychonians, really?

A: Every bit as credible as a science journal, just as veracious as a leading theoretician, and equally as trustworthy as the supposed date and time of the reader’s birth.

Lo, and behold the clerical error of which I spake: if science and religion could leave the protection of their podiums for a second, they might each glean a mutual respect for the irascible plight of the other, which is that they are both sadly, obviously, and pathetically full of shit.  Not one or the other.  Both.

Yes indeed, we like the results of our experiments best when they are disputed.  Should science publish a study which shows conclusive evidence on any topic at all, another science immediately sets out to prove the opposite.  The people of the world want every perspective sullied and watered-down, pushed and contested until a ninety-nine percent probability has its back against the fifty-fifty wall, precisely where we want it.

We want it balanced just so, because we like to choose sides as if they were baseball teams.

— And once we arbitrarily pick a team, we commence to argue, and bitch, and dispute for it as though our evidence were, after all, indisputable.

Even incontrovertible evidence meets with reasonable opposition

Evidence is stupid, anyhow.  It’s usually statistical, which as anyone can tell you is the most insidious form of prevarication.  For some reason, intelligent people appeal to the authority of statistics all the time and require the same of others, which is doubly asinine, as these egghead hotshots know full-well that appealing to any authority is a cardinal logical fallacy, and exponentially more so when the authority in question is an invariably inaccurate numeric representation of an actual, physical chain of events, collected from a sample base which even under the most fastidious methods has no chance whatever of accurately representing some other, similar yet different thing at an entirely different point in time.

As the saying attributed to the British statesman Benjamin Disraeli goes, “There are lies, damned lies, and statistics.”

Most experiments require a test group and a control group, too, but like gravity and statistics, there’s no such thing as a dependable control group, either. The very act of including it in a study changes its natural state.

An excellent example of this occurs in quantum mechanics, in which certain particles exist only in patterns of probability — that is to say, they are probably there, or probably not-there, never certainly so — and these patterns of probability change according to whether and how the researchers happen to be measuring them.

If one supposes that fifty scientists conduct the same study, their findings will generally have an acceptable margin of error, each doctor achieving his or her own individual result.  The only difference between this margin and a larger one is that we declare the former admissible and the latter inadmissible. Experiments cannot gauge truth in objective reality any more than a preacher can divulge so-called Ultimate Truth (big U, big T) from a holy text.

Humanity finds evidence-for, and evidence-against, and ultimately judges its (supposedly) objective reality with the subjective whimsy of an adolescent girl deciding between prom dresses.

This, ladies and gentlemen, is what the world calls evaluation by evidence.

Weighing all evidence with the most discerning of eyes, the prom date is an apotheosis of adjudication

So all evidence is meaningless, then? All results, experiments, and hypotheses, nothing but evaporated time and energy?

Not at all. Just because there’s no such thing as True (big T) objectivity doesn’t mean one can’t create it for oneself or support it for others. We arrive at many, many decisions on a regular basis which matter to hundreds, perhaps thousands of people, and we put our faith in evidences in order to do so.  Truth is easy to arrive at in a box.

One has merely to define the box.

Contrary to an extremely annoying popular belief, though, there is no such thing as thinking outside the box, because from the broadest perspective nothing makes any sense.  Logic only happens within defined parameters.  One can exit one set of rules and enter another, more comprehensive set, but there’s always another box containing all the smaller sets to prove that they are infinitely short-sighted and presumptuous.

The important thing is to remember that we’re basing it all on faith.  Nobody knows what’s really going on.  The passionate stupidity of thousands of sheep in innumerable American religious flocks has allowed science license for abject arrogance.  The truth is, though, any honest scientist will tell you that science has no positive idea about the meaning of life, the universe, and everything.

That’s the slippery thing about Ultimate Truth (big U, big T).  It’s only true if it does not conflict with the properties of the universe — and the universe is in constant flux.  In fact, the only known absolute constant is the transitory nature of everything.  This means that even should an Ultimate Truth surface, it could only be ultimately true for an instant before becoming outmoded to newer, emergent properties of existence.

Mr. Jesus may very well have been the way, truth, and life once (or maybe is due up in a few more centuries) but neither he nor anybody nor anything else can be a static ultimate truth in an anti-static reality.  A more likely solution is that universal truth changes for each individual thinker, so that one’s universal truth may indeed be found in Biblical scripture at a certain age — and this is boxed-up objective truth, no less true than death or taxes — but neither before nor afterward.

“When I was a child, I spake as a child, I understood as a child, I thought as a child: but when I became a man, I put away childish things” (I Cor. 13:11).

Yeah, that’s right.  I can quote scripture.  It isn’t blasphemy when it’s true.

So perhaps we all have some real thinking to do, eh?  Perhaps it’s time to grow up.

Where does one stow an outgrown worldview?  Under the bed, next to the Tinker Toys and Legos, obviously.  Right where it belongs.

With glasnost and much cheek I remain,

Yours Truly,

-BothEyes

P.S. — Nowhere in this piece will the magnanimous reader find the word, “ontology.”


The Saintly Altar of the Altered State

I.

The human brain, contrary to what mom told us, is not a miraculously engineered wonder of the Western world.  It’s miswired, misaligned, and mistaken much of the time.  Many charlatans — or psychologists if one prefers — believe that the brain’s first experience, birth, permanently damages it.  Birth is violently traumatic, and both emotionally and physically brutal.  In response to high levels of stress such as this, our brains shoot us up with adrenaline, hydrocortisone, and steroid hormones (glucocorticoids, if you really want to know) which means our first birthday present is that we get to enter the world innocent, healthy, and high as fuck.

— And that’s OK, because if it weren’t for altered states of consciousness, we’d have no genuine experience of this world’s completely random nature at all.

Since we can’t be born every time we want a fresh jolt of reality, we spend the rest of our lives self-medicating.

Holistic medicine the old-fashioned way

The brain operates a crackhouse in our heads, producing such heavy hitters as dopamine, a natural upper which makes us talkative and excitable, endorphin, an anæsthetic which has three times the potency of morphine, and serotonin, a mood enhancer which makes us act and feel like hippies.  Most of the meds recommended by school psy-charlatans for depression or anxiety alter the amount of serotonin produced by the brain.

These mind-altering substances have side effects which can prove worse than the emotional irregularity they medicate, such as violent tendencies, hallucination, depersonalization, derealization, psychosis, phobias, amnesia, and obsessive-compulsive disorder — and that’s just for the benzodiazepines.  We don’t hit heart arrhythmia until Eldepryl™.

Sexual dysfunction and gastrointestinal distress commonly affect patients taking Selective Serotonin Reuptake Inhibitors, or SSRIs.  Pop culture knows this hip family of psychomeds well; it boasts such rock stars as Paxil, Prozac, and Zoloft.  Approximately twenty-two million Americans take these drugs every day; statistically, that’s every fourteenth American one encounters on the street.

So, the next time you’re shocked at the number of complete assholes you meet in a given day, remember that one in every fourteen Americans hasn’t taken a shit in four days and hasn’t had an orgasm in months.
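
Anyone who doubts the “every fourteenth American” figure can check the division themselves.  Fair warning: the population number below is my own rough assumption for the U.S. at the time, not something from the study:

```python
# Sanity check: twenty-two million SSRI takers out of an assumed
# three hundred eight million Americans (a rough population estimate).
ssri_takers = 22_000_000
us_population = 308_000_000

print(round(ssri_takers / us_population * 100, 1))  # 7.1, i.e. about seven percent
print(round(us_population / ssri_takers))           # 14, i.e. one in every fourteen
```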

Without sex and regularity, anxiety patients feel much better

II.

If the human brain were able to regulate its chemicals, nobody would recommend cooking up meds like Prozac and Paxil.  Since science has shown that many brains cannot, though, society accepts these meds and also allows for a margin of error in prescribing them to healthy people.  Many groups in the United States froth at the mouth over the prevalence of drugs such as these — as well as that of other mind-altering substances, both legal and illegal.

One might as well try to place the entire nation on a single diet as try to stem the amount of self-medication engaged in by Americans, though.  Seventy-two million of us diagnosed ourselves and regularly took some sort of alternative medication in 2002.  The rest of us might not consider ourselves medicating, but we do, of course, and not just the usual Tylenol, Robitussin, and Pepto-Bismol, either.  We purposefully alter our brain chemistry all the time.

Over half the population of the U.S. drinks coffee on a daily basis to take advantage of its stimulant properties.  Sixty-four percent of us drink alcohol, perhaps to counter the tension from all our coffee.  Twenty-two percent of us smoke cigarettes to relax, especially while drinking alcohol or coffee.  Approximately eighteen percent smoke grass.  That’s without even discussing all the more-inventive drugs, such as LSD-25 and MDMA.

In addition to all this we must consider the oceans of so-called “health nuts.”  Fitness fanatics come in various degrees of seriousness and mental stability, from the casual weight-lifter to the manic Olympic triathlete, and nary a one of them considers himself or herself a drug addict.  Nevertheless, the scientific community established long ago that physical exercise heavily affects hormone, endorphin, and serotonin levels, and also that addiction to these natural substances occurs easily, naturally, and predictably in lab rats.

Since these highly addictive endorphins target all the same opiate receptors, 24 Hr. Fitness can be considered the modern American opium den.

Portrait of the American Addict

III.

We certainly do like to fuck with our brains.  Who can blame us, though?  As aforementioned, we’re the inheritors of broken machinery, the unhappy inhabitants of chaotic mental domains which do not even function in the haphazard, unpredictable way they should.  Humans fix things.  When a shoe comes untied, we tie it.  When a brain comes apart, we glue it together with whatever we happen to have on-hand: coffee for fatigue, whiskey for tension, tobacco for anxiety, what-have-you.

When we tinker with our minds, we’re seizing temporary control of our neurochemistry.  We don’t drink alcohol in spite of its tendency to impair our judgment; we drink it precisely because it impairs our judgment, and unlike other mind-altering addictions such as — oh, I don’t know — television, say, we know exactly how our brains will change when we indulge.

Humans have used mind-altering substances since the dawn of time.  Beer alone has a documented history going back six thousand years before Christ.  When we look at our ancestors from so long ago, though, we can’t help but notice that their uses for beer, wine, tobacco, drugs, et cetera extend far beyond self-medication.  Of course, they were used for recreation, but the original use for most of these so-called vices was to create an appropriate environment for religious and spiritual rituals.

The Greeks drank wine to evoke the ancient god Dionysus.  The Jewish tradition of the Passover Seder requires four glasses of it per person.  Five million Hindu sannyasi sadhus smoke hashish to repress their sexual desires and aid their meditation.  Over fifty American Indian tribes practice Peyotism today, a religion centered around ritual use of natural mescaline, which they use to communicate with the dead and with various deities.

These people aren’t balancing their serotonin — they’re putting gods on speed-dial.

Not seeing angels and demons, yet?  Here, drink some more of this.

They're gateway drugs, alright

IV.

These days religions get a bad rap.  Atheists can say the bad reputation of spirituality reflects its failure to cooperate with contemporary Western civilization, sciences, paradigms, and increasingly agnostic peoples.  Religions themselves, however, deserve no animosity.  One cannot judge a philosophy by its misuse.

Religions originally appeared because humans became convinced of evidence alerting them to other beings, other worlds.  Rituals appeared because humans wanted to commune with these other beings, other worlds.  Mind-altering substances proliferated in rituals because they provided sufficient evidence of their usefulness to millions of adults with brains the size of cantaloupes.  We no longer use these drinks and drugs to speak with gods, though, because so many people these days seem to think they can do it without spending beer money, and many others don’t think very much of the idea of talking to gods, anyhow.

In other words, lots of boring self-styled “realists” think those other beings, other worlds never existed in the first place.

The funny thing is, everyone on planet Earth believes wholeheartedly in lots of things that don’t exist.  The value of currency, for example, is absolute balderdash.  It is valued for its various markings and symbols which invoke the names of people who lived hundreds of years ago, and which declare mottos and oaths in ancient, dead languages, markings and symbols which cast an enchantment over both buyer and seller, and in this mutual confusion one can purchase an automobile with nothing but decorated scraps of parchment paper.

There is no difference between the purpose of the markings on a dollar bill and that of the markings inscribed within a sorcerer’s sigil, or those upon an altar, or even those upon a WELCOME mat.  We live in a world of our mind’s creation, and everything real to us has been made real by us.

How did we miraculously make reality real?  Easy.  We simply named it that, like we did the table, the chair, and the dust bunny.  “Reality,” we said, “thou shalt be real,” to which so-called reality said in its easygoing way, “Alright,” and that was that.

The unreal didn’t mind being left out at all, though, because all of a sudden, it didn’t exist.

Wait, did you guys see that -- or am I crazy?

V.

So, here we are, then . . .  Nothing is real, and nothing is unreal.  Quite a mess we’ve gotten ourselves into at this point, and we’re very proud of it.  Naturally, we’ve taken the next step and done what any bipedal, cerebrally cortexed hominid would do in this situation: we’ve become ontological agnostics.  We don’t know what truth is, where to find it or how to prove that it’s there, but we believe in it all the same, bumbling about like the decorated surrealities we are, chasing after decorated scraps of parchment paper, and taking turns chastising one another for having faith in decorations.

What arrogant, blustering bastards we all are.

But how can we escape this cycle of idiocy?  How can we step from delusion and credulity into anything but delusion and credulity, if everything we know seems illusory and incredible?

Beer.

Cold, crisp, clean — beer.  And pills.  And smokes.  And coffees, wines, and liquors; buttons, tabs, and capsules.  Strenuous, extended exercise.  Yoga.  Za-zen meditation.  Brutally sorrowful dramas, uproariously hilarious movies.  Bitter, hate-filled debates.  Violence.  Pain.  Exquisite, sin-soaked and passionate pleasure.  The sweetness of selfless generosity lifetimes long, the glorious splendor of victory in competition, the self-righteousness of upbraiding one’s brother for having fallen from grace.  Mind-altering substances, mind-altering experiences.

In a paradoxical word, we can step away from the illusory by taking a break from reality.

In a life where nothing you think real can possibly exist, a world of erratic change and nebulous phantasms, mind-altering substances and experiences offer the most realistic opportunities available to a human.

— But of course, one could just go on as a believer . . .

With a glazed look and a raised glass I remain,

Yours Truly,

-BothEyesShut


Self-Abasement, Incorporated: an Industrial Revolution

At the U.S. headquarters of Self-Abasement, Incorporated, a boss begins to instruct his underlings in the delicate art of business attire.

I.

Business attire, as we all know, is that particular brand of fashion which obscures one’s personality. Business attire offends people at places of relaxation and amusement, and doesn’t look distinguished in one’s workplace, either, regardless of how much money one has spent on it.

Business attire, though having been designed to look respectable, handsome, and elegant, fails to do so, because while companies can require that one wear a pinstriped skirt, they can neither require that one should own several such skirts, nor that one should daily press the wrinkles out. The boss can force us to wear a tie, but not to tie a fresh knot daily. These are discretions belonging to the wearer, and this is the irony of business attire.

When one’s silk tie has been in the same Windsor knot for six months, it’s insincere to feel elegant.

You'd be amazed at how long a necktie can go knotted, how long a bra can go unwashed

Yet the boss, a college graduate of average ambition, has also a boss, and this chief boss is the one telling him to enforce the company’s dress code. The command strikes little boss as odd because the dress code has always been followed with little trouble.

“But no,” the chief tells him. “Following the code is just acceptable; we can’t have our employees looking acceptable. Our employees represent the company, and the company can’t look just acceptable.”

“No,” says the boss, “of course it can’t, of course the employees can’t,” even though he is thinking of the word acceptable, its definition, and wondering why there ought to be a dress code at all if not to define precisely how employees should dress for work.

So the boss bows out of the presence of the chief and makes his way to his own cubicle. His cubicle has a window overlooking the blacktop of the parking lot below, because he has worked with the company for twenty-one years and has earned this luxury. Once there, he reviews the company’s dress code, then clicks his mouse pointer to create a new document. His creation takes forty minutes. Making copies takes three. He delivers them to his underlings in no time at all.

The cubicle creatures have become wary of the boss’s hardcopy memos, so they wait until his squeaking loafers have rounded the corner before plucking the latest one up and taking their medicine.

They grimace at the familiar Arial font, and they sneer at the bullet points. The tone and content of the memo are no different from any that have come before: heartlessness approximates professionalism; condescension masquerades as magnanimity. Tragic, terrible irony seeps from every typo and grammatical error. The cubicle creatures begin to pop up like gophers. They peer over the walls of their little boxes at one another, holding up the memo and pointing.

What bullshit! They can’t do this to us. I’m going to talk to Johnson right now. Can you believe this shit?

They cannot believe this shit.

I Cannot Believe This Shit

ATTN: ALL EMPLOYEES

AS OF 4/25/10 the dress code is being clarified. Some employees arent following company procedure so this should help them dress aproppriately for work. NO EXCUSES! NO EXCEPTIONS!!!!!!!!!!

– Shirt and tie, men

– BLUE or BEIGE blowss, women

– BLACK or NAVY BLUE slacks

TO CLARIFY IN ADDITION!

– Mens slacks must front crease

– NO JEANS on Fri. anymore per Johnson

– Polo shirts are only all right Fri. on floor 3 if they are blue or beige

– EMPLOYEES MUST SHINE/POLLISH THEIR SHOES EVERY WEEKEND BEFORE MON. Mailroom employees must black nylon laces

– No dangly ear rings

– CLEAR or RED only pollished nails

John Johnson wll be reviewing staff Wed. to make sure these rules are being followed.

Thank you for your cooperation,

Gary Melendez

II.

Sometimes when I’m at my job, tappity-tap-tapping on my plastic keyboard and diddling the little touchpad on my laptop from time to time, it occurs to me that I’m accomplishing work which required hours of painstaking, interminable scrawling on sheaves of expensive parchment not so long ago.

Thank you, Industrial Revolution.

The underlings of Self-Abasement, Inc. do not feel the benefits of that historic occasion, though. They feel the crushing weight of imaginary duties, instead, because the introduction of technology to the workplace has eliminated most clerical work, leaving employees with more time between tasks than ever before, time which bosses must fill in order to look industrious.

Having long ago mastered the art of making two hours of work look like a two-day job, proletarian underlings manage to keep their jobs, and this explains how American employment competes with technology which would otherwise make human labor obsolete.

Bosses know that their underlings cut corners and screw off for large amounts of time, though (because they are very guilty of the same thing), so the bosses spend most of their paid hours playing gotcha! with the rest of the staff, ratting out the minimum of underlings necessary to look busy.

Underlings, bosses, and chiefs all have more free time, but the sergeants to whom the chiefs report have no more free time than previously, because sergeants never did any of the clerical work, anyhow.

Sergeants do labor which C.E.O.s need done but cannot do themselves, labor requiring certain talents and educations which computers cannot be programmed to use. In addition, companies need creative, educated humans in virtually every area of their industry, so these sergeants find themselves in high demand, spread thin, overworked and under-appreciated.

The sergeants have meetings, at which they give presentations, with which they sign deals, by which they secure work and money for their employers, which also secures the employees below. They are hard to reach, rarely seen in the office, and have little time for shenanigans. Their private time is taken up with anything and everything that could possibly relax them.

— Drug habits and divorces, for instance.

The big meeting feels like a summer holiday, when your cocaine has gone up both nostrils and your hands have been up both skirts

As a very protracted result of industrialization, then: underlings inflate their jobs in order to look busy and justify their positions; bosses inflate their jobs in order to look busy and justify their positions; sergeants enjoy the odd amphetamine here and there and become extra-marital enthusiasts.

What, the reader may ask, are the chiefs doing during all this self-inflation?

Sergeants have no time to police them and must be content with available evidence that the chiefs are doing their jobs — but just what, exactly, is the chief’s job? Since the chiefs divided their responsibilities among the bosses, the job of the chief has evaporated into the delegation of labor amongst laborers who are many times more experienced at accomplishing these tasks than the chief ever was. In physical terms, the chief actually does nothing.

However, nothing is a very difficult job to perform, as it turns out.

In order to earn wages for doing nothing, the poor chief must somehow take credit for the work his underlings complete and build hard evidence of having had a hand in it, as well, which proved an intractable challenge until the late-twentieth-century innovation of micromanagement.

III.

Some definitions of micromanagement stretch for whole paragraphs, while others curtly name it in a concise six or seven words. Micromanagement describes more than a mere business philosophy, though. It is an undiscovered culture. It is an esoteric cabal.

Micromanagement is a sorcery woven over North America which upholds the global economy, feeds innumerable hungry mouths, and maintains the eminent prestige of the corporate-American business style.

It shares also the unfortunate distinction of the Faustian pact, however, in that it happens to kill everyone who subscribes to it.

The micromanager, here seen protected from unemployment by his circle of arcane documentation.

When chiefs first aspire to practice micromanagement, they begin by conjuring new requirements to add to existing regulations. This increases the complexity of the rules, and since they must enforce these rules, it inflates the scope of their jobs, likewise. In the case of the wretched cubicle creatures at Self-Abasement, Inc., for instance, their chief focuses on the company dress code, which had been perfectly functional except that it was too easy for his employees to follow and therefore did not give the chief anything to do.

By adding a few superficial, superexacting details, chiefs ensure that their cubicle creatures will resist this tyrannical posturing and fail to observe all new regulations. The chiefs then sign a few official documents of reprimand, obtain the signatures of all offending employees, and in this way create a paper connection between themselves and the actual labor performed by the underlings.

Memos, too, serve to solidify a micromanaging chief’s presence in the office. Suggested by the sergeants and articulated through the chief’s invariably horrific grammar, they explode in mass emails like viral outbreaks, or wind up scotch-taped to cabinetry in the staff lounge, stall doors in the restrooms, or any number of surprising locations where one would not expect a memo to lurk, such as inside the silverware drawer in the kitchen:

DO NOT PUT FORKS AND SPOONS IN THE UTENSILS DRAWER!

These officious memos help to prove the indispensability of the micromanager, and also make his or her presence known throughout the cubicle labyrinth, invoking him or her like the summoned incarnation of a corporate Zeitgeist. Without the ostentation of these memos the chiefs would seem incorporeal, because by nature of their work (which does not exist) they toil alone in their offices, leaving them only to use the restroom or drop in on a boss to make certain the chief’s responsibilities are being sufficiently handled.

This, of course, raises the question underlings have pondered since the inception of the micromanager: if we’re out here doing all the work, and all he does is come up with crazy new rules every two weeks — then what the hell is he doing in there all the time?

It is the opinion of many cubicle creatures that copious amounts of auto-eroticism transpire in the office of the chief.

Connectivity. Infrastructure. Masturbation.

IV.

The Industrial Revolution of the nineteenth century put thousands of people out of work, and forced thousands more into new schools instituted to train farmers for life as factory hands. Had those day-laborers developed the sort of industrial sleight-of-hand practiced by micromanagers today, they would have been hailed as geniuses. They would perhaps have spent their working hours in the shade of apple trees, shouting perfunctory instructions to the other hands and winning their contempt, like this:

“Smith, yer gone need ter lift that hoe up t’yer shoulder to keep the furrow nice’n straight, hear?”

“Sho’ is a fine thing we got Johnson ter tellus how ter hoe ‘n sow ‘n plant ‘n scrape. I wonder where he gits his idears from.”

“I reckon those idears o’ Johnson’s come from about the same place as the manure do, but I sho’ wouldn’t mind trading up fer his salary, or fer his shady patch o’ sittin’ over thar, neither!”

That micromanagers work illusory jobs for pay does not seem inherently evil, though, as all the crucial work seems to be getting done, anyhow. Giving people something to do simply because people need something to do hardly appears like the worst thing in the world; mentally handicapped individuals have been employed in this fashion for decades, as have convicts, and even grandchildren (“Do what Nana says and sweep those leaves into a big pile on that side of the yard, and let me know when you’re done so I can show you how to sweep them back again.”). Micromanagers commit but a misdemeanor in duping dimwitted companies into paying them for inventing paltry regulations and decorating the office with memos.

In the innumerable tortures they design for the pathetic, piteous cubicle creatures, though, they betray themselves as the authors of fresh hells, their mass emails sundering the contentment and optimism of scores of people with neither shame nor care. The despair these micromanagers distribute as part of their useless, makeshift jobs horrifies the hapless cubicle creatures slowly, their gaunt faces growing more sallow and lined every day as though forced to watch imperturbable carpet bombs falling over an amusement park in crawling, relentless slow motion. Dress codes, new forms, an additional mite of data entry, an extra stop on the fifth floor to obtain a signature: the straws stack upon the quavering spines of corporate employees all the world over — hourly paid, conveniently quashed like cockroaches.

The proverbial last straw never comes for the cubicle creature, though, because each poisonous favor is only as brutal as the last, and like a cuckolding indentured servitude, they can only endure the apathy of their superiors by the anæsthetic of mindless subservience.

One is not mistaken to also detest the cubicle creature. One must consider that while their financial constraints may convince them to daily demean themselves like cowering, obsequious rodents, the shoe polishers of the world, garbage collectors, sewer scourers, bedpan changers, septic tank adventurers and other dauntless laborers of unseemly occupations go about their business with all the dignity and assurance of a British barrister, the cubicle creature having sacrificed self-love and self-respect for the sake of a dollar or two per hour above the wage that is generally paid to teenagers working in fast-food restaurants.

Marty Feldman, having left his position at Self-Abasement, Inc., re-learned how to smile and began an unlikely career in cinema. Seen here in early recovery.

V.

What course of action, then? When I reflect upon the farmhands during the Industrial Revolution, I imagine them going to work in factories with the same resignation and mental fatigue in their faces I see on those of the cubicle creatures, the bosses, and the micromanaging chiefs. This inheritance of misery cannot be tolerated.

However, the solution is not to stamp out micromanagement; that seems implausible. Micromanagers generally possess few marketable talents and so would not know what to do with themselves were it not for micromanaging. They will defend their philosophy to death. They sink in a quicksand of their own devising, and like Dr. Faustus, they do not believe that it will destroy them.

The micromanagers, themselves, appear doomed.

Readers given to martyrdom may decide to practice the Way of Nice for their respective chiefs, but should one find oneself in the position of the cubicle creature, the boss, the chief, or the sergeant, one would do best to quit the place like a spark leaving the flint.

Corporate offices transform human time and energy into cashola. That is their purpose; they have none other. Unless one could change one’s living days into enough capital to justify such a dark metamorphosis, to take a position in a corporate office is to commit oneself to a sanitarium operated by lunatics.

Most corporate fucks work jobs that they hate in order to feed, clothe, and educate their children, transfusing their very lifetime into that of their offspring. Their personal joy and appreciation for the beauties of life visibly deflate from them with every passing day, and many live in fear of termination like battered housewives clinging to abusive spouses. Self-destruction does not raise healthy children. It were better to live with dignity and pride somewhere in a rent-controlled ghetto and nourish one’s family with ramen.

As the great Al Pacino once said, “There is nothing like the sight of an amputated spirit; there is no prosthesis for that.” No, and there is no salvation for those who commit a daily suicide all their lives, either.

Beware the promise of material happiness or contentment.

Beware the myth of financial security.

Beware the fiscally ambitious and the ones who have it all.

— But most importantly, beware that part of you which dreams of winning lotteries, marrying rich, or retiring in a large, beautiful home.

It’s the part of you the rest of us have most to fear.

With remarkably tenacious optimism I remain,

Yours Truly,

-BothEyesShut


Books, Part of This Nutritious Breakfast

There isn’t a community in this nation which doesn’t prefer an hour at the gym to thirty minutes in a library.

Americans would admit it freely, too. If Gallup polled them, Americans would say, “Well, yeah. Wouldn’t anyone rather look like Marilyn Monroe than think like Ben Franklin? I mean, come on, that’s easy.”

These priorities are, as Robert Frost once called them, sincerely fucked-up.  It’s sound reason for health-conscious people to concern themselves with their intellectual diets and exercises at least as much as their physical ones, and for health nuts to care about their educations as much as their caloric intake.

The dangers of an unhealthy lifestyle worry many Southern Californians, as well as other folks both domestic and foreign, but those concerns present nothing like the Faustian hellscape that is an intellectually malnourished way of life.  The mind deserves at least as much attention as we pay to our diets.

No amount of crunches will give them that, "I drive responsibly" look.

I.

The mind is easily dismissed, because the mind is hard to describe.  It’s an abstract concept.  Nevertheless, people have cause to worry about exercising their minds just as they fret over diets and exercise, because the mind and body are degrees of the same thing. Neither mind nor body means anything without the other, just like hot and cold, or far and near.  Degrees of the same.

This is not a Taoist argument about balance, though. This is a statement rooted in thousands of years of philosophy, thousands of years of brilliant thought.

These thoughts began when someone tried to find the mind. Where it was he or she could not say, and neither could anyone else. No one, in fact, has ever been able to pinpoint the mind satisfactorily, beyond the assertion that it is inextricably braided into the physical brain.

Now, if there is no known location for the mind, why not presume it made of the same energy and matter as everything else in the world? If the mind were nothing more than an effect (and cause) of physical activities of the brain and body, nothing would remain unexplained, nobody would need to wonder where the mind were located, anymore. More fun than that, though, and much more amusing, it would destroy the divide between body and mind, and anyone in search of health food would need to consider whether chamomile tea might not be healthy.

Chamomile promotes drowsiness, you know, and thereby hinders the mind’s ability to concentrate.

Chamomile, the thinker's devil weed.

Consider the relationship between physical actions and abstract ideas.

Should one decide to flip the bird, one begins by willing the fist to extend the middle finger. The question is, at what point does the incorporeal idea become flesh hard enough to physically move the finger? To answer, one must arbitrarily choose a point along the path from thought to action, and the transformation, so-called, happens too rapidly for anyone to discern a difference. It is as though the action itself contained all the desire, will, and thought that had ever been involved. Ideas and actions are, in the end, inextricable from one another.

It remains possible that our thoughts may be nothing more than the streaming recognition of all our potential actions.

For Batman once said, “Of what use is a dream, if not a blueprint for courageous action?” (Batman, the feature film, 1966).

Modern science has corroborated the mind’s influence as part of the body. Harvard’s Dr. Langer showed that housekeepers who merely considered their jobs differently as they went about their duties lost significant weight and shaved ten percent off their blood pressure. Studies at the Cleveland Clinic recorded participants increasing their muscle strength by thirteen percent in three months, not by weight training, but by simply imagining themselves doing the exercise for fifteen minutes per day, five days per week. Not to be left out, Drs. Yue and Cole of the Department of Exercise Science at the University of Iowa say, “Strength increases can be achieved without repeated muscle activation. . . The results of these experiments add to existing evidence for the neural origin of strength increases that occur before muscle hypertrophy” (J Neurophysiol. 1992 May; 67 (5): 1114-23).

If this evidence for the intimate body-mind relationship does not weigh enough, consider also: thinking has not only the power to slim us down, but also to fatten us up.  How many calories are in a Snickers bar craving?  Some.

I'm training for the Olympics.

II.

A key reason to favor intellectual exercise over physical exercise is that the earnest pursuit of reason results in the adoption of healthy practices. Active mental lives regularly lead to active physical lives, provided that one’s studies are not allowed to grow too narrow or repetitive. Students of geology or ecology soon deposit themselves on a hike outdoors. Fans of philosophy and poetry, likewise, soon learn to disdain the confines of buildings. Sociology and anthropology fanatics soon find themselves moving about society with the lithe grace of a politician.

No true student of philosophy needs to be told that natural foods nourish better than artificial or chemically treated foods, and indeed, it is likely the growing illiteracy rate among American adults that has allowed such rudimentary lapses of judgment to begin with.

The pursuit of athleticism, however, hardly ever leads to intellectual pursuits. One has merely to look at the workout habits of the typical American to conclude as much: one hour on a treadmill daily, staring at a mounted television while listening to Lady Gaga on headphones, followed by three hours of television in bed to reward ourselves.

As a society, we once walked for miles on a regular basis. We used to read while we walked, too.  We did so often enough to form a cliché now long forgotten: the careless reader blindly turning a corner, running headlong into someone important, someone attractive, or someone dangerous-looking.

“Oh! Excuse me!” the clumsy reader would say, to which the bulldozed pedestrian would reply,

“Why don’t you watch where you’re going?”

This peeve of society no longer exists to interrupt our afternoon strolls, however, because hardly anyone walks anymore — and even fewer people read. The readers and walkers who do exist certainly do not do both at once anymore, and this is yet another example of how our intellectual divorce from the physical realm has affected our daily lives.

Abraham Lincoln's clumsy travel habits earned him the nickname, "Absent-Minded Abe," which he lamented until stumbling into John Wilkes Booth in 1865. "Why don't you watch where you're going?" Booth reportedly said.

People interested in their health would do better to start a reading habit than a calisthenics regime or a dieting plan. Calisthenics have little to do with one’s quality of life outside of physical fitness. Granted, staying fit and feeling healthy present one with many important benefits, not the least of which is longevity, but the benefits of intellectual fitness far outstrip those of a merely athletic lifestyle.

An educated autodidact takes interest in more of the world around him or her, experiences epiphany on a regular basis, and usually gets the joke (even when the joke is not funny).  What good would it do to live for a hundred and fifty years, if all the intellectual stimulation one managed in all that time were counting reps and calories?  Physical exercise for its own sake feels good — right up to when it hurts — but also involves hours of terribly boring repetition.

Boredom is the agony of an intelligent mind beginning to atrophy, just as aches are the pain of muscles gone unused.

Dull people suffer from boredom like victims of bone cancer.  Note the torture children experience at the mall, shopping for school clothes with their mothers.  They don’t know enough to entertain themselves with what little stimulation exists for them in that environment, so the effect is like that of sensory deprivation.  One might as well blindfold them and bind their hands; they’ve absolutely no idea what to do with themselves.

We secretly replaced two of these bored athletes with two complete morons. Can you tell them apart?

The difference between a child and an adult is that the child will gladly, eagerly find a way to entertain himself or herself.  A dimwitted adult, too far gone and having lapsed into perpetual complacency, grows so comfortable with boredom that he or she can tolerate hours upon hours of commercial television programming, finding no irony in laugh tracks and APPLAUSE signs, predicting the plots of show after show, and repressing occasional surges of distaste, disgust, and boredom with the nimble dexterity of a catatonic ninja.

Stupidity is a vicious circle this way.  The dumber one becomes, the more content one is to become an even bigger idiot.  As the light goes out from behind the eyes, the person in the dark back there has a dwindling chance to figure out what makes life so generally unbearable for them.  If one believes that this ignorance is truly bliss, one is horrifically mistaken.

To return to comparing mental health to physical health, an intelligent yet obese person will need to contend with deadly health concerns — and also might suffer from nightmarish, excruciating social repercussions — but he or she will have logic to resist exacerbating these issues, at least.

An uneducated, unintelligent person (even if blessed with good looks, adequate finances, and loving friends and family) still must survive and endure the following hazards of their fatuousness: irresponsible contraction and communication of disease, increased likelihood of incarceration, hampered ability to communicate, vulnerability to cons and scams, misinformed belief systems, haphazard parenting skills, higher unemployment rate, drastically lower income, poor decision making, higher risk of mental illness, self-inflicted illness, incautiousness and injury, unchecked emotion, social ineptitude, confusion, malnutrition, bewilderment, laziness, haplessness, recklessness, and a nigh-infinite parade of other easily supposed hazards.

Sure, obesity can cause pulmonary hypertension, but stupidity can cause a lawn chair to resemble an aircraft.

Cancer is cause for concern. Idiocy is the bane of our existence.

III.

It will occur to the casual reader that since intellectual exercise seems so much more crucial to a long, healthy life than physical exercise, some logic must exist behind the preference of Southern Californians and others for the latter.  How in the world could so many people ignore the obvious dangers of watching too much television and reading too few books?  Granted, the media does everything it can to keep people out of libraries and in strip malls and reclining armchairs, but other reasons exist.

Studying up after a long period of laziness remains many, many times more difficult than losing weight or getting ripped.  A little willpower allows access to a proper diet and calisthenics routine — but willpower alone will not help an illiterate person put the Meditations of Marcus Aurelius to use.  Rob Cooper lost three hundred pounds in two-and-a-half years, but the ability to read influential works of literature or contemporary science journals takes years of formal education, followed by several more years of diligent bookworming.

These studies aren’t just difficult for an aspiring collegian, though; they’re also dizzyingly excruciating for a dullard to endure.  Dieters have hunger pangs to contend with, and joggers must overcome both pain and fatigue, but neither of these agonies can match the psychological horror of limitless boredom.  Eighty percent of U.S. families did not buy or read any books last year, which means they found no joy in turning the pages of Harry Potter, Salem’s Lot, or even their Bibles or Qur’ans.  For an uninitiated thinker, completing even so accessible a text as Twain’s The Adventures of Tom Sawyer is as difficult as scaling a sheer wall, and less enjoyable than staring at one.

Perhaps this accounts for the dull look in their eyes.  Maybe staring at walls is the secret addiction of idiots all the world over.

Melissa's bedroom wall has been on the N.Y. Times bestseller list for eight weeks running.

There’s also no immediate monetary profit in engaging new intellectual pursuits.  Most people need to clothe their children and put gas in their car.  They don’t have time to sit and read Chaucer unless someone will pay them for their time.  Even if college educations result in lucrative jobs, they neither put food on the table at home, nor pay for themselves, in the meantime.

This makes reading very unfashionable in places like Southern California, where a successful, meaningful life is measured in terms of material wealth.  Everyone in Southern California knows or has met a member of the modern aristocracy who accumulated the entirety of his or her wealth without even a high school diploma.  Few remark that these people may lead valueless, colorless lives fraught with confusion, disinterest, and despair.  Few question whether raising children, attending churches, or advancing careers can supplant an earnest search for one’s own meaning in life.

An intelligent, educated person with debts to pay, has debts to pay, as well as an appreciation for the horrors and beauties of the world we live in.

An unintelligent, uneducated person with money — has money.

Perhaps the most pervasive cause for the preference of physical health over intellectual health, though, is a social divide between jocks and geeks which prevents a natural exchange of information, information jocks desperately need about the use of books, and information plenty of geeks could use about the use of barbells.  Idiots don’t hang out with intellectuals, because educated types make them feel stupid and insecure.  This aversion suits educated conversationalists just fine, too, because they’re tired of having to explain to drunk people in basketball jerseys that comparing political figures to Hitler doesn’t facilitate a mutually beneficial discussion.

With a social disparity this extensive, it’s hard to imagine anyone over thirty spending some hard-earned Monday Night Football time learning to play chess, instead.  With great hope and trepidation, though, one must presume that it’s happened somewhere, sometime, and that it just might happen somewhere again.

Having pounded a creatine shake, a Monster energy drink, and three shots of wheatgrass, this valiant bro opened his game with a variation on the Ruy Lopez.

IV.

Today, universal health care stands out as Washington’s most ambitious undertaking in decades.  In time, the White House might be able to pull it off, too — but what about universal education?

The so-called public option for educating our citizens doesn’t even bother to hide its own shame and self-loathing, anymore.  What if one’s intellect really does matter at least as much as one’s biological health?  That would make the problem of national obesity look like a pebble beside the Himalayan catastrophe that is our national stupidity.

It’s amusing to consider that universal health care could make an effective political smoke screen, if the categorical failure of Bush’s No Child Left Behind legislation were ever to draw unwanted attention.  In the years to come, our life expectancy may exceed all expectations, affording every uneducated American an additional ten, twenty, or even thirty years of bad decision making.

Feeling fit and staying active isn’t a silly prospect; it’s an important part of being human, but a healthy physique alone does not a fulfilling, rewarding life make.  It behooves us all to balance our time in spin class with our time between the pages of something thought-provoking.  It is childish to pretend that looking good and feeling good supplants the need for imagination, contemplation, and meaningful dialogue.  Flexed guns and a washboard six-pack can’t govern anyone’s life.  They help a black v-neck tee-shirt fit more fashionably, but what has good fashion sense done for us, lately?  We’re sexy enough, for Chrissakes.

Were there research available on the subject today, it’s likely that stupidity would prove more responsible for a shitty sex life than outmoded fashion sense ever was.  Decent fucking requires a modicum of know-how.  No amount of salon time can make up for a person’s inability to locate a clitoris.

MacGyver. You can bet he never needed a paper clip and a ballpoint pen to find a clitoris.

Put your bullet-shaped helmet away, o’ legion of spandex-clad bicycle enthusiasts, and pluck up a volume of Bukowski.  He’ll keep your interest for an hour or two, I swear.  And roll up your spongy L.A. Fitness brand yoga mat, o’ acolytes of spirituality through weight-loss programs, and fetch a copy of Huston Smith.  Everything you ever wanted to know about humanity’s search for its soul is there.

It’s time to stop overpaying our athletes and underpaying our teachers, overvaluing our blockbuster hits and underestimating our modern classics.  People of great intellect aren’t having a hard time getting laid; they’re having a hard time finding other intellects.  It’s time to re-evaluate the amount of attention we pay to our physiques when we pay so little attention to our minds, and it doesn’t take a Mensa award-winner to see the American reasoning faculty drying up like a desiccated chunk of cactus on a cracked stretch of desert highway.

Evolution’s the great equalizer, though.  If there’s any truth to it at all, then it won’t take long for all the athletic ignoramuses to jog, hike, and bike straight into traffic or off of cliffs, and the rest of us will have more than enough time to take up aerobics.

With total amazement and utter stupefaction, I remain,

Yours Truly,

-BothEyesShut


O’ War! War! O’ Elegant, Heavenly War!

Reason and intelligence lead thoughtful people to reach the same conclusions when those conclusions seem most obvious, and that’s a shame.  We intellectual sorts daily nod and smile at one another, agreeing on many momentous topics of discussion, differing on only the tiniest of distinctions.  Too many discussions terminate with these knee-jerk conclusions, really, and one of these universally agreed-upon topics happens to be the matter of war.

War, says the sage scholar, is a base, savage, corrupt, unworthy use of our time and resources.  War, he spits, defiles our dignity and pollutes our minds, denounces our integrity and poisons our innocence.  War, he decries, is hell.

However, this perspective does not lend itself to a round, fair judgment of martial practices.  War is too ancient a human institution to be flippantly dismissed out-of-hand.  We owe too much of our bounteous, idyllic lifestyle to war for such a hasty expulsion of it.  War is too human to be deemed inhumane.

War, the heart of so much civilization, cannot be immoral, unjust, or depraved. War is not loathsome, nor is it an abomination. War is not iniquity.

War, in fact — is a really, really good time.

War is not hell. Come now, does this look like hell to you?

I. War Brings People Together

“[The most awesomest party ever] grows out of the barrel of a gun.”

— Mao Tse-Tung

Nothing thrills the soul like a good explosion, except maybe a good explosion with body parts flying out of it. Rather than blowing people up solo, though, one can make the minutest bang a resounding ka-boom! by inviting one’s friends and neighbors along. An armed skirmish inspires conviviality, and any reason to hold a shin-dig is a good one.

Many Southern Californians live in apathy toward their neighbors, ignorant of their neighbors’ names, ignorant of their neighbors’ proclivities, ignorant of their neighbors altogether except for the kind of car they drive and which households make the most noise.  We repeatedly prove ourselves too proud to love, too haughty to give a heartfelt hug when we need it most. Drop a few cluster bombs on the local strip mall, though, and people cling to one another like infant monkeys.

Never mind the block party; Mrs. Dilweed’s acclaimed potato salad isn’t going to make any friends. It’s suppression fire from a machine gun nest at the end of a suburban cul-de-sac that softens the hardest of hearts. Until cowering in a muddy shell crater with them, one never knows one’s true brothers and sisters. Camaraderie springs from warmth, and the root word of warmth is war (little-known fact). This is why most ordnance produces heat, flame and conflagration, and why even cold bullets, once in merry flight, are called fire.

Don’t stay out in the cold. Choose warmth. Choose war.

Did you see that buzzbomb clip Ralph as it whizzed by? Bang! Zoom! What a gas!

II. War Inspires Art

“The object of war is not to [party hard] for your country but to make the other bastard [party hard] for his.”

— General George S. Patton, Jr.

What pastoral oils graced canvases during Earth’s peaceful centuries? What poetry dripped honeylike from the tongues of minstrels during the Great Pacific Period? What music resounded through the halls of humanity during the Time of Tranquility?

Aha! But there were never any such occasions, of course. Do not be silly.

All great art is the result of a vicious, mindless, self-consuming, bullet-tossing, bomb-fumbling world hell-bent on blending hell into every fine thing produced by man. Without the bang of guns, there would be no onomatopœia. Without the need for camouflage, there would be no paint. Without the need for morale, there would be no music, no comedy, no burlesque.

Without war, the Beatles would have been a boy band. Without war, Hemingway’s For Whom the Bell Tolls would have been about schoolchildren dismissed for summer. Without war, Leutze’s painting of Washington crossing the Delaware, boot at the prow, would have featured that great general having his shoes shined.

No art exists but that which came from the fertile, menstruating womb of war. What possible inspiration could there otherwise be? God (big G)? Please. We have a Sistine Chapel already, thank you.

Without war, we'd not have pretty paintings like "2,000-Yard Stare," by Tom Lea

III. War Improves the Humans-to-Resources Ratio

“The death of one man is [smart shopping]. The death of millions is a [hot deal].”

— Josef Stalin, comment to Churchill at Potsdam, 1945

Limited resources! cry the teachers of social studies. Limited resources! cry the pundits of the mass media. Limited resources! cry the politicians of every country throughout time. All these persons devoutly believe to have spotted the obvious reason for war, when all along they’ve had it backwards. War is not a battle over limited resources. War is the simple solution by which humanity divides limited resources amongst fewer peoples.

What difference does it make if seventy percent of all the oil in the world exists in the Middle East and North Africa, if there are so few people in said world that they couldn’t possibly consume it all in seventy-seven generations? War isn’t a contest of tug-o’-war with natural resources as the prize. War is a game of musical chairs which begins with someone left standing, and ends with everyone seated comfortably.

Every human death brings humanity closer to feeding itself. The practice of warfare puts palatable provisions on everyone’s plate.

Always enough to go around when "around" is less round

IV. War Spurs Science

“You can’t say that civilization don’t advance, however, for in every war they kill you in a new way [that is consistent with the scientific method].”

– Will Rogers

Dehydrated foods, microwave technology, and countless other advances sprang from the American war machine, yet detractors still picket and march and gripe and whine, saying, “Make love, not war!” and, “Draft beer, not people!” as though these pithy proverbs were the pinnacle of wit and political consciousness. These naysayers have conviction — one can tell by the limitless cash they spend on verbose bumper stickers for their hybrid automobiles, verbose little slogans such as, “Why do people bomb people who bomb people to show that bombing people is wrong?” and “It will be a great day when schools have all the money they need and the air force has to hold a bake sale to construct a bomber” — but their hypocrisy outshines their passion every time they stir water into their Carnation Instant Breakfast (™) or nuke their breakfast burritos for thirty seconds on High.

War motivates our sharpest knives and brightest bulbs to design ever-more-efficient blenders in which to purée people, without which the interminable process of old-fashioned battle would positively bore the soldiers to death. Who wants a war without robotic drone fighter planes firing laser-guided ordnance while threading the needle through phased-array radar sites? Nobody, that’s who. Night vision goggles with infrared target-acquisition-sharing capability! Electromagnetic silent supersonic Gauss rifles! Nuclear submarines playing hide n’ seek beneath polar ice caps, with bionic remote-controlled spy sharks to follow them!

Let’s face it, war makes a technological wonderland out of an otherwise unremarkable world, and though it may seem somewhat more destructive, we’d all probably die of boredom without it, anyway.

The hi-tech miracles of war bring delightful conveniences into every home. Every boy and girl will want a civilian version of BigDog under the tree this Christmas!

V. War Brings the Rich and Poor Together

“When the rich wage war, it’s the poor who [benefit greatly].”

– Jean-Paul Sartre

Of the many struggles plaguing mankind, class warfare remains one of the most deleterious. The working class has always been exploited by people with money and power, and has always outnumbered its rich slave-owners by a ratio too imbalanced to ignore. In 2006, the top one percent of the population of the United States owned more than twenty percent of the wealth. This is the same as if the rich had stolen every single possession from nineteen percent of American citizens, not to mention everything these unfortunate nineteen percent are currently earning, and everything they will earn until the day they fall over and die — until the statistic changes again, that is.

What to do for this social sickness? Depose the rich and give their stuff to the poor, à la Robin Hood? That only works in movies. Once again we find that war, that old internecine pastime, is the answer.

The problem is not economic disparity. The crisis is that aristocrats are an alarmingly endangered species, their numbers falling faster than those of the black rhino, the giant panda, or the beluga sturgeon. In order to save this grievously assailed caste, the opposing herd must be thinned. What better use for the poor, than war? War is not only useful for inciting art, science, conservation, and brotherly love; it’s also humanity’s best method of lessening the huddled masses of impoverished paupers to match the dwindling and endangered populations of aristocrats.

Eat your heart out, Franklin Delano Roosevelt.

Why not? Ancient Romans coined their money and forged their swords from the same metal, and in the same fire.

VI. War Spurs Philosophy

“We make war that we may live in [wine-induced philosophical contemplation].”

-Aristotle

Humanity once needed to laze in order to store up energy for the hunt. Now that our prey comes to us through drive-thru take-out windows, we no longer require such lazing, but shaking the habit has proven too difficult for most of us and as a result, we’re lazy.

Philosophers are no different, and in fact often constitute the laziest portion of society (armchairs redounding). For this indolence the fault falls but partially on them, however. Having explained away the meaning of life with eighteen answers to choose from (and this before even touching upon world religions) philosophers peaked rather young, and the resulting malaise keeps them from coming up with new material for our amusement on a regular basis, lazy bastards that they are.

With the threat and promise of war, though, philosophers and thinkers from every corner of the globe clamber over one another to pose their perspectives to the world. War is detestable! say some, and War is inevitable! say others, and War is glorious! say still more, all of them having worked out valid, logical reasoning to support their point of view.

Without war, whatever would we do for philosophy? Where would we find our bathroom reading? Like it or not, the world has war to thank for the musings of Confucius, Gandhi, Lao Tze, Kant, Martin Luther King, Jr., and the rest of the simpering peaceniks.

No war, no philosophy.

Socrates preferred the M4A1 for its close spread at medium range.

VII. War Holds Religions Accountable

“An eye for an eye makes the whole world [see eye-to-eye].”

– Mohandas Karamchand Gandhi

Perhaps most importantly, war keeps the world’s major religions on their toes. Any religious leader can jaw non-stop about how one ought to live one’s life, but when hundreds of weeping mothers pour in on Sunday begging for a divine promise to bring their sons home from war unscathed, even the most wretched charlatan must turn his gaze inward and ask himself, “Do I really know what the hell I’m talking about? Do I really think there’s an ultimate source of love and wisdom and fairness who could let a war like this happen, simply because people are born imperfect and grow up stupid enough to fire projectiles at each other?”

Mark 13:7 says that wars must happen.  Judaism and Islam have been hurling grenades at one another for centuries.  Hinduism even has a goddess, Kali, dedicated to destruction, and Taoism doesn’t really care one way or the other.  It should surprise no one, therefore, that most of the people recruiting for war, speaking in favor of war, and doing the actual killing practice religion.  War benefits religions by holding them accountable, and by accomplishing the following:

War eliminates the fighters from religious congregations, leaving only the lovers.

War forces religious leaders to answer in detail the most treacherous, and imperative, mysteries of life.

War allows believers to emphasize their belief in heaven by martyring themselves, an otherwise impossible task in the modern era.

“‘There are no atheists in foxholes’ is not an argument against atheism; it’s an argument against foxholes,” says James Morrow.  Indeed, nobody wants a godless heathen in the trenches defending America.  What would that say about us here at home?

Warriors of anti-aircraft fire and theosophical debate, may your barbs fly true!

VIII. War Destroys Warfarers

“We have to face the fact that either we are going to die together or live together and if we are going to live together then we are going to have to [die together anyway].”

— Eleanor Roosevelt

Having covered all the aforementioned benefits of war, it remains to note that even if war could be disparaged (not bloody likely) enemies of this most honorable practice would have nothing to fear, because war primarily destroys warfarers. Collateral damages aside, and the odd woman-and-child combination notwithstanding, most victims of war who die with bullets in their chests die also with guns in their hands.

War, then, is a cancer-eating cancer. Who can fear an innocuous thing like that?

Like Romeo and Juliet, war loves war, and war kills war.

IX. War Expedites Evolution

“Violence is the last refuge of the [guy who should have tried violence sooner].”

— Isaac Asimov

The human race has war to thank for much of its enduring success and happiness, but natural selection continues. Having developed foresight, as well as a prototypical reasoning faculty, humans owe it to themselves to help speed evolution along, rather than sluggishly floating through stages of development like flotsam on a wave.

Since evolution depends on the deaths of as many would-be parents as possible, war hurries genetic development exponentially. Millions of heroic, conscientious warmongers with an earnest desire to kill opt out of parenthood, and thereby hurry the filtration process. In addition to these purposeful patriots, millions more eject themselves from the gene pool by enlisting under dubious pretenses, including (though fortunately not limited to) the overemotional, the desperate, the directionless, the uneducated, the unassuming, the weak-willed, and the easily-convinced. With all these excellent specimens volunteering their progeny for oblivion, Homo sapiens version 2.0 might just be released millions of years ahead of schedule.

One never knows which genetic mutation will prove most useful to the next line of humans, but one thing is certain: war finds those beneficial mutations quickly — much faster than waiting for rest homes to empty does.

Evolution at the speed of boom

With so much to thank war for, how can we continue to castigate this most-precious of traditions? There’s so little the world can agree on! And yet, everyone admires the silent nobility of a rusted, burned-out tank half-hidden in tall, green grass. Everybody can appreciate the natural beauty of an antiquated minefield, the subtle majesty of barbed wire silhouetted against the sunrise, its coils spiraling along the horizon like glittering ivy.

Why must we as a civilized people rebel against our most fundamental natures? Let us enjoin our destinies hand-in-hand, staring boldly, proudly down the rifled barrels of our mutual obliteration. Let us not come to regard our beatific invasions as clumsy mistakes, but as the measured, artful strokes of a virtuoso violinist crafting a concerto.

There’s nothing sick or evil about death. Death, so-called, does not even truly exist except as the briefest juncture between shapes of life, a nurturing moment in the infinite infancy of existence. Let us not stay the hand of the reaper, but take up our plows and sow our seeds in preparation for Death’s gentle harvest.

We did not invent war. We are war.

So stand down the picket signs and snatch up the weaponry, salute the Commander In-Chief and strut stolidly to doom. Our splendor and sublimity await!

With Much Love and Many Rockets,

-BothEyesShut


American Unoriginal, 501 Blues

The United States of America has always embraced its individuality.  Our land, after all, represents an award for having proven our independence from the European imperialists, and for having developed our own voice, our own style, our own civilization.

After that, we developed blue jeans.  We had been rebels, and having won our independence, we no longer had a cause.  Now we celebrate our independence on Independence Day, then spend the rest of the year discouraging various dependencies exhibited by our children and the so-called co-dependent relationships engaged in by our friends.  We like our independence so much that we invented baseball, basketball, and football to avoid playing soccer with the other countries.  ‘Cause, you know; like, fuck those guys.

We do work together in our 501 blues as a begrudgingly unified American people, too, but this is not the side of ourselves we wish to emphasize.  We want to stand triumphantly alone on mountaintops, shaking our fists in defiance of the global status quo — and why not?  Seems more fun than following others on a well-traveled rail all our lives.  Our rails have naturally (or unnaturally) converged in some ways, however, and some leaders have admonished us to retain our differences and revolt against pressures to homogenize.

Those leaders who champion our individuality become cultural heroes, such as Henry David Thoreau (Mr. March-to-the-Beat-of-a-Different-Drummer, himself) and Thomas Jefferson (“The pillars of our prosperity are most thriving when most free to individual enterprise”).  The punk rock movement, led by iconoclasts like Jello Biafra and Iggy Pop, embodied the Western youth’s violent rejection of the mainstream.  Mr. Paul, who wrote that we ought not conform, happens to represent America’s favorite enthusiast of America’s favorite religion (Romans 12:2).

Mr. Paul, Henry David Thoreau, Jello Biafra

For a while it seemed we might make these leaders of ours proud, proud of our ambitious creativity, proud of our cultural accomplishments, and proud of our devil-may-care disregard for the world’s opinion of us, but look at us now: our disregard for global opinion has alienated us, our cultural accomplishments have been largely surpassed, and our red-blooded creativity, once symbolized by riveted, indigo, serge de Nimes overalls, has become a sad, poorly-manufactured-in-Indonesia parody of itself.

American Individualism, look upon the blue face of your stillborn spirit, and despair.

There was a time not so long ago when a fella could dress as colorfully as he liked.  Plenty of guys wore blue jeans, sure, but could also step into bell-bottoms, plaid pants, coveralls, or any manner of matched slacks.  Trousers were high-waisted, waist-high, hip-hugging or standard, and could be held up with a belt or suspenders.  Even during times of extremely prevalent trends (trends, plural, mind you) we managed to assert our own personalities through the clever juxtaposition of numerous possible garments.  Look at the variety expressed in this typical ad from thirty years ago:

Bells and whistles. The former garnered the latter, I imagine.

It may be surmised that these clothes came from the same season of the same line, and that the fashion designer had intended the outfits to somewhat coordinate with one another.  These similarities notwithstanding, the variety of colors and fabrics and styles makes modern America look as uniquely fashionable as dental-office wallpaper.

I mean, look at that bad-ass motherfucker on the right.  Have you seen anything like that pilgrim-style collar in your life?  More pertinent to our conversation about American creativity, though, are their pants: endlessly more fun and imaginative than those merely acceptable blue jeans.  The bell-bottoms apparently came checkered, plaid, or plain with cuffs, and you can bet there were more colors than those offered here.  I’m guessing these fabrics were wool, polyester, cotton, and corduroy respectively, far beyond today’s usual variety of cotton, nylon, or cotton-nylon.  The fedoras are a nice touch, too, but I’m focusing on trousers, here.  And why, you ask?

Because — if modern American creativity could be measured in trousers, my friends, it would look like this:

What color were the socialist overalls in Orwell's 1984, again?

This was merely one of a score of images I could have chosen from (I selected this for the flag waving, which I consider a bonus).

Hypothesis: the American public does not exhibit the level of independent thought of which it seems so proud.

Conclusion: for all our independence and rebellion, we can’t even choose our pants uniquely, anymore.

One respondent to BothEyesShut’s American Trousers Study reported, “Hell yes, we’re independent.  We think fer ourselves, sure do, and if a pair of blue jeans just happens to be the most American piece of clothing we own, don’t y’all blame us for looking uniform.  Just because we wear the same style pants as everyone else, don’t you go thinkin’ you’ve got some sorta creative edge on us, or nuthin’.  Blue jeans were good ’nuff fer my pappy, and they were good ’nuff fer his pappy, and by God (big G) they’ll be good ’nuff for me, my son, his son, and the dog, too, if’n we decide to haul off ‘n buy him a pair!”

Cletus has a point.  As a nation, our creativity does capture the globe’s attention with our radical, unpredictable, freedom-waving manner of dress.  We’re just as edgy and innovative as any of those other countries, like Japan. . .

Gomen nasai.

or France. . .

Frenim-Clad

Or the United England Kingdom. . .

The United England Kingdom

So, OK, I admit it — I admit that we denizens of the United States are not the only ones who forgot how to sew fabrics other than denim, but as anyone can see, we aren’t becoming more interesting by learning from the innovations of other countries.  We aren’t trying to decide whether we’ll wear our awesome Scottish kilts to the party or our dashing Spanish sailor’s slacks.  Rather, we’re destroying whatever cool fashions may have existed in these places before the stonewashed blue plague set in.  We’re not doing it on purpose, though.  Like carriers of a cultural disease, we became victims ourselves before spreading it around.

Levi Strauss, pragmatic inventor of what he insisted on calling “Levi’s overalls,” did not advertise his way to the top of the fashion charts, however; his product had undeniable merit.  The machine-spun fabric withstood months of laborious mining, and the copper-riveted pockets did not tear out at the corners when laden with rocks, bolts, and other detritus toted by the miners.  In 1890, Strauss added a watch pocket for pocket watches (that little rectangular one at the right hip) because men generally carried their watches on chains in vest pockets, and vests, of course, could not be worn in the mines without becoming torn and soiled.

So we non-miners bought them, too.  Our wives were tired of patching and darning our trousers just as much as Mrs. Strauss had been, and what do you know?  By the time James Dean wore them in “Rebel Without a Cause,” the United States Navy had been issuing them to sailors for over fifty years.  Then theatres, schools, and churches banned them in a last-ditch effort to contain adolescent interest in rebellion, an effort which backfired, of course, and by the sixties they had become commonplace.  Then stonewashed.  Then cut-off.  Then ripped.  By 2004, the average American owned seven pairs of blue jeans.

Seven pairs.  Seven.

Forty years ago, guys could go ladykilling on Main St. on a beautiful Saturday afternoon and expect prospective marks to decorate themselves from the waist down, rather than default to the best-fitting of their seven pairs of blue jeans.

Liberated elegance, from a time when people had to know how to match their clothes.

Yeah, so old Levi isn’t at fault.  Jeans are ubiquitous because indolence is human.  We’re too damned lazy to exercise our character, and fuck, jeans “go with” everything.  They really do look nice, too; I like mine boot-cut with a dark, royal bleu de Gênes color, and always wear ankle boots with them to look less casual.  There’s nothing wrong with them — they aren’t the problem.  If it were up to our jeans, I bet they’d rather not be worn as a matter of course, either.

We don’t have complete control over our fashion proclivities.  Marketing and thought control are synonymous, and even more commonplace than the clothes sold thereby.  In spite of this assault on the American freedom of choice, few high schools in the United States still teach media literacy, leaving teens (and their hard-won pocket cashola) defenseless, unaware that they are always someone’s target audience, victims of omnipresent psychographic advertising.

These mind vipers love us all dressing alike, eating the same foods, listening to the same bands (who all sound alike now, anyway) because it’s child’s play to advertise in generalities when the general public is generally going to like anything that fits the general description of what they generally want to buy.  How can a budding fashion designer build a name for himself?  Why, advertise a logo on magazines and bumper stickers, then slap it on a pair of blue jeans and charge enough money to ensure only affluent people can afford to flaunt them.  Sold.

Do people purchase things they might regret as a result of mass marketing? Oh -- sometimes, I suppose.

Many entities benefit from transmogrifying a free-thinking, unpredictable people into a cowed and colorless one.  Politicians, far from pandering to liberals or conservatives, have aimed at median voters for decades.  We owe this trend to the tendency of most Americans to contradict themselves on the ballot.  Most Americans, for example, call the torture of terrorists justifiable, yet insist on federal investigations into the torturing of terrorists.  Most Americans back abortion rights, so long as women do not abort their pregnancies for certain reasons — gender selection, for instance.  This tendency lets interested parties market to the broadest, largest group of people with a single advertisement, and for this reason interested parties work to make us as similar to one another as possible.

It is, of course, human nature to prefer what does not surprise us, as well, so we shirk the shocking and reject the revolutionizing.  Hippies dressed differently, so they were terrorized.  Punk rockers dressed differently, so they were terrorized.  Women who wear burkas in the U.S. dress differently, so they are terrorized.  The most dangerous thing to a way of life is a new, fresh idea, and many people can’t help but hate the guy with the wacky hat.

The wacky hat is distracting.  It isn’t simply fear that causes us to attack everything creative and unique in our midst.  High school administrations that adopt a “No distracting hairstyles” clause for their dress code know well what independent thought can do to a “sit down, shut up” curriculum (more on this in Part I of “How to Refrain From Being a Dick”).  When we stop worrying about our hair, we also free time from our mind’s busy schedule to think about something else — like how we’re going to afford a three-hundred-dollar pair of Sevens brand blue jeans.  We’ll need the trousers if we want to attract that blonde who makes us hard by packaging her ass in a three-hundred-dollar pair of Sevens brand blue jeans.

Creativity: securing seats in the gene pool since the dawn of time.

Originality is powerful.  Unique traits fuel evolution, command attention, and map uncharted territories in any given scenario.  Best of all, exercising one’s individuality today is easier than ever.  One could, for instance, boycott blue jeans.  The last American Levi’s factory closed in 2003, anyhow.

Levi’s blue jeans: Not Made in U.S.A.

So, go ahead!  Have waffles for dinner and ride a pogo stick to work.  Go apeshit, America!  Take the plunge.  Spend an hour looking for trousers at the mall; look for pants that are neither denim, beige, nor black.  Good fucking luck!  It’s far harder than you think, and if you’re anything like me, it’s going to piss you off to see how few possibilities the market allows you.

There’s nothing wrong with national trends.  Trends become traditions and traditions become culture, and culture’s one of the few things differentiating us from dust mites.  When trends control our thoughts and curb our options, though, it’s time to trim them back.  When everyone loves Twilight, it’s time to take a second look at Dracula.  When everyone has a pair of those retro Ray-Ban Wayfarer sunglasses, it’s time to switch up to neon blade-style Oakleys.  Do it.  Let’s see your face behind a K-rad pair of those fuckers.

I’m not kidding myself, by the way.  I know there’s no escape.  But there’s an important difference between the guy who goes gently into that good night and the guy who spits and cusses and brawls all the way down.

Or — I’m imagining that, and we’re all just as boring as everyone else.

No way.  I once saw a forty-year-old man in a swell black tuxedo and pink bow tie slam-dancing at a Vandals show.

And there was nothing boring about that.

With Great Reprobation, Condemnation and Fulmination,

-BothEyesShut

All works are Copyright 2007, 2008, 2009, 2010 by BothEyesShut, "In a Real World, This Would Be Happening."  All rights reserved.

Header concept, photography, and design by Ruben R. Martinez (www.RubenRMartinez.com)
