Immortality, the Gift That Just Won’t Quit

The definition of death doesn’t hold much water, really, once all the voodoo juju is shaken out of it.  The harebrained doctors have one make-believe definition of it, the self-important scientists have another, and the whimsical believers have yet a third.  When one has faith in the existence of death, though, death can be a gateway, a rebirth, or even a redemption.  Anticipating death makes up the cornerstone of most world religions, while avoiding it remains the focus of most sciences.

— And that’s O.K.  There’s nothing wrong with any of those philosophies in and of themselves, but let’s eschew all that for the sake of conversation.  Let’s look at death without any allusion to typical, traditional beliefs.  What does death resemble, now?  A permanent medical condition?

Never mind.  Let’s just say that death is a simple state of affairs that any doctor can walk up and diagnose, like this:

“Hey, this guy’s dead.”

Why, this guy's dead!

The doctor means that the poor guy’s lungs have stopped breathing and his heart has stopped beating.  That’s clinical death.

Most realists think of death as nothingness, bleak, black, and empty, which is typical of them; because if there’s any way to have less fun and be more boring, the realists will practically kill themselves to show you how.  Even so, most atheists and agnostics think this way about death, too, which is disappointing because as anyone can tell you, they throw the best parties, and therefore oughta know better.

“What happens when you die?” you may ask one of them.

“Nothing,” they say.  “That’s kind of the point.”

OK, Mr. Sunshine, but nothing is precisely what never happens.  There’s always something going on.  Besides, lots of things happen when you die.  When you look at clinical death, it actually mirrors the very early stages of clinical birth, so to speak, which normal people call pregnancy.

In the earliest stages of pregnancy, the fertilized egg (or zygote if we really must) has forty-six chromosomes, as well as its own unique DNA structure.  Anti-abortion terrorists are keen to remind us that this little eggy wegg is alive, and they’re not wrong.  In fact, scientists pretty much have to agree with them, because the zygote exhibits growth, metabolism, reproduction, and reaction to stimuli.

Apparently, the smartypants bigshot scientists have decided that a thing is alive if it’s got those four attributes.

What the zygote does not have, though, is a lung or a heart with which to satisfy the medical doctor’s requirements.  Its respiration has not yet commenced.  Its pulse is nonexistent.

“Why, this guy’s dead.”

“Now, you just hang on a second there, Doc.  We’re picking up growth, reaction, metabolism and reproduction.  This sonofabitch is alive.”

Great.  So the zygote is dead and alive.   Perfect.

Perfectly nonsensical.

Zombie Zygotes of the Living Dead

Why not, though?  When a guy looks at his arm, he thinks of it as a living part of him, right?  If doctors amputate it from him, then no one looks at it quite the same way.  It’s dead now.  The amputation was, as far as his body was concerned, a little death (or, in French, la petite mort, which incidentally also means orgasm).

Yeah, why not?  After all, when a pregnant woman feels her baby kick, she thinks of it as a living part of her.  If doctors deliver it, and amputate it from her, then no one looks at it quite the same way.  The baby’s alive now — even though the amputation was, as far as the mother’s body is concerned, a little death (or, en français, orgasm by baby).

Dead and alive, alive and dead.

The dead aren’t really all that dead, anyhow.  We eat dead things to stay alive, in fact — but only dead things which have recently become dead.  Dead things become more dead over time, and we can’t eat things which have been dead too long.

There’s not enough life in them, you see.

But just wait a damned second.  A little death?  More dead?  Death isn’t supposed to have all these degrees, all these shades of gray.

Silly-headed cynics and so-called realists step in at this point and remind us, “No, jerk.  Death isn’t in degrees or shades, and it’s definitely not gray.  Death is that certain change that happens in the instant that life stops for an organism.  Those four things you mentioned earlier?  Growth, reaction, et cetera?  The body can’t do those things anymore, so it’s dead.”

Yeah, alright, sure, Professor Killjoy, but from the broadest perspective, death doesn’t mark any significant change at all.  It’s just another change in an infinite pattern of changes — or, if you like, it’s another death in an infinite pattern of deaths.  Life, in fact, is what we call this infinite pattern of deaths.  Look:

Human life begins with an ovum and a sperm combining into a zygote.   This means the death of the ovum and the sperm, because they no longer exist as such; their chromosomes have been shared.  The zygote then begins cellular division at an extremely rapid rate, each division a little amputation (orgasm) from the parent cell, and these amputations are what we call growth.  When enough cellular carnage has occurred, the child is amputated from his or her mother, and soon afterward begins to eat dead things because of the life in them.

Dead things taste good.

Food is dead-ish

As the child grows, cells are born, grow old, die; are sloughed off, are excreted, are absorbed as more fresh dead stuff to nourish and prolong life.  Cells divide, and divide, and divide.  The lining of the small intestine is completely replaced over four to six days, you know.  The outermost layer of skin, or epidermis, every two weeks.  The hard structure of the human skeleton, every decade.  Even this child’s blood, just like the blood of every living person, is composed of red blood cells which live in the bloodstream for about four months before being replaced.

An elderly man of ninety years, therefore, has lived inside nine skeletons.  He has consisted of two hundred and seventy human bodies’ worth of blood.
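For the skeptical, the back-of-the-envelope arithmetic above can be checked in a few lines of Python.  This is a toy sketch only; the renewal rates are the rounded figures quoted in the text, not precise biology:

```python
# Check the renewal arithmetic: how many skeletons and how many
# bodies' worth of blood does a ninety-year-old run through?
# Rates are the rounded figures from the text, not exact biology.
AGE_YEARS = 90
SKELETON_RENEWAL_YEARS = 10   # "the hard structure of the human skeleton, every decade"
RBC_LIFESPAN_MONTHS = 4       # red blood cells live about four months

skeletons = AGE_YEARS // SKELETON_RENEWAL_YEARS
bloods = (AGE_YEARS * 12) // RBC_LIFESPAN_MONTHS

print(skeletons)  # 9 skeletons
print(bloods)     # 270 bodies' worth of blood
```

Nine skeletons and two hundred seventy bloodstreams, just as claimed.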

It’s all dead, though, remember?  We’re, like, hermit crabs or something.

Like our bodies, our minds unfold as a train of deaths and divisions, too.  Ideas grow and gestate, eating new information and transforming cold facts into newborn ideas, ideas which split and branch and grow of their own accord, just like a pride of lions flourishing from the carcasses of a few dead gazelles.  Sometimes ideas sprout from stagnant knowledge so automatically that our minds consider themselves inspired, but every new thought kills off an obsolete idea.

We grow and learn, shedding skin cells and obsolete ideas along the way like scraps of confetti following a parade, and when at the age of ninety we reflect on our adolescent selves, those teenagers seem long gone, long passed away, and the wistful feelings our memories evoke mimic those felt by mourners years after the funeral.

Death and life, life and death.

The thirty-year-old hermit crab and his previous shells

We still have no round definition of death, however.

Death seems no more than change and transition, and since change is an eternal constant, death must be occurring all the time.  If that’s so, then death as a single event does not exist.

If you think you’re going anywhere when you “die,” I’m afraid you’re horribly mistaken, as far as I can tell.  Nobody is going anywhere.  Nobody is going anywhere, and neither are the actions we are still making.  That the “dead” human mind no longer orchestrates these actions is inconsequential, since the mind was never orchestrating anything from the broadest perspective, anyhow, regardless of how intimately involved in the processes of the universe it seemed.

This will sound like glorious immortality to some and eternal damnation to others, so I guess that if you really wanted to you could call your opinion on living forever ‘heaven,’ or ‘hell,’ but don’t do that.  That’d be so tacky.

If all this sounds fantastic, consider that everything we are or will become was already here long before we were born.

All the material needed to put our bodies together had long been available before our births.  Our mothers merely needed to ingest some dead stuff and assemble it inside them.  The material to put our minds together had been here, too.  The elementary ideas, the deeper concepts, and the inner mysteries all, all, all had been waiting for our minds to ingest them and put them to use.  We were already here, waiting for assembly, just like The Great Gatsby had been when the Old Sport was alive inside Fitzgerald’s head, but not yet written down.

Sure, Dad can stick some spare auto parts together and build a car, but Mom can throw some spare body parts together and grow a person!

Cynics and skeptics will say, “An idea is not a thing, Sir,” and I must retort: well, where, exactly, would you like to draw the line?  If Gatsby exists once he has been written down, what happens if the manuscript is destroyed?  — And if Fitzgerald writes him down again, is he birthing the same Gatsby?  What of publishing and printing?  Are all Gatsbys the same man, or different men?

Consider also the differences between brothers of the same family, raised in the same general time, by the same parents, on the same food, in the same area, with the same values, et cetera, et cetera.  One may grow up into a madman and the other a schoolteacher, but from the broadest perspective the difference can only be in human estimation, just like so-called death.  If we are arbitrarily, subjectively deciding what death is, then there really isn’t any such thing we can point to after all, is there?

In order to believe in death, one must think just like the doctors and scientists, coming up with their own willy-nilly criteria by which something can officially be called “dead.”  You may as well say that death is what we call the future, and birth what we call the past.

The Starship Enterprise notwithstanding, we will always be here, extant, just as we have always been here, and the proof and cause of both is that we can’t help but be here now.  There can be no escape.  We are captives of existence.  And why?

— Because the present time, nestled snugly between the past and future, between birth and death, seems very much alive, and it happens also to look very much eternal.

With much pleasure and measured amounts of pain I remain,

Yours Truly,

-BothEyesShut



Oh, Yeah? Prove it!

Every experiment has significance, even the inconclusive ones.  When a team of smartguys at M.I.T. completes a study with inconclusive results, it reaches the ineluctable conclusion that another study is needed and immediately sets to work on it.  This testing can, will, and does continue until significant findings have been produced — er, that is — discovered.

Once significant results appear, the doctors conducting the study become proponents of it and publish these discoveries in remarkably well-respected journals.  These paperback journals are written in tedious, turgid English that is too obscure for the public to read, and have an average cover price of thirty American dollars, ensuring that the general populace gets no chance to join the conversation until it is Mickey Moused by Time Magazine and sold as an impulse buy at the grocery counter.

Hey, whatever.  At least mom’s getting in some string theory.

Journals cost upwards of thirty bucks, but at least they're jam-packed with ten-dollar words

As in all things in this universe, the idea proposed in this new study begets its equal and opposite, a second study which exists to provide an alternate scientific belief for anyone and anything negatively implicated in the first.

The satisfying thing about science is that it loves conflict.

Scientific prejudices appear out of this conflict, and because they are prejudices of science itself, the public presumes them factual.   From the broadest perspective, however, science walks in the well-trod footpaths of religion and theosophy.

When science decides that a certain quantum particle does not exist based on its failure to appear in tests, science is as faith-based as the creation myth of Genesis.  Science and religion have traditionally been rancorous archenemies, but this is a misunderstanding which, if one could get them talking again, could easily fertilize the most affectionate of friendships.

This animosity has been based on little more than a clerical error, anyhow.  Note how science and religion interplay in the following.

Once upon a time, in a faraway land called Berkeley, there lived a doctor of physics.  This doctor believed in a certain particle he called the God Particle, and hypothesized that it existed everywhere and had an effect on everything else.  So the doctor wrote a paper and was granted funding to perform experiments in a very special place with very special equipment, and after three months of rigorous, painstaking trials, the poor doctor was forced to concede that no evidence of his God Particle had surfaced in any tests at all.

To the scientific community, this absence of evidence presents hard, objective proof that Doc’s God Particle does not exist.  Even if they add the word “theoretically” to the conclusion (as they do with the theory of gravity, which they still can’t fucking figure out) they still use the test as a quotable citation in papers arguing that the particle is a fantasy of the doctor’s.

To be perfectly clear: in popular science, the absence of evidence can prove that a thing does not exist.

How’s that for self-satisfied conceit?  They can’t even plumb the depths of our ocean trenches, but they’ve got E.S.P., telekinesis, astral projection, sixth senses, prescient dreams, and automatic writing all figured out.  How?  No evidence, that’s how.

Oh.  Well, shit.

Scientific evidence shows that there is no scientific evidence that scientific evidence is scientifically evident

Now, let’s say that following the most costly failure of his professional career, Doc is forced to return to teaching at a preparatory high school for rich kids, which amazingly enough also happens to inculcate Catholicism.  In this private school, Doc is lecturing about the existence of God during a religious studies class, when suddenly a particularly cynical and sarcastic student raises her hand and demands to know how it is that anyone can feel sure that God (big G) exists at all.

Well, this is the question for which the course entire exists, and so the doctor puffs up with dignity and conviction, and with great certainty informs his students that in all the centuries and centuries of assiduous scientific research, and of all the brilliant, most well-respected minds throughout history, not a single person has been able to prove that God does not exist.

To elucidate: in matters of religion, the absence of evidence to the contrary can prove that a thing does exist.

— And though science and religion may fixate on the same piece of evidence (that nothing has appeared in tests, in this case) they both exit these experiments feeling assured that their hypotheses have been logically supported, because objective reason has its roots in language, and language happens to have more than enough elasticity to correctly describe a single concept with two definitions, each the perfect opposite of the other.

As violent and arbitrary as this arrangement may seem, the truth is: the common person likes it fine.  In fact, practically everyone hates unchallenged assertions, even the people making the assertions, themselves.  Something about our nature causes us to see polar opposites in everything, and something about our minds causes us to invent contrary concepts for every conceivable idea.

Humanity likes nothing until it is contested, enjoys nothing better than a contest

It is this facet of the human personality which affords us such colorful figures as the venerable Flat Earth Society, which still maintains that the globe is flat; the irreproachable Tychonian Society, which avers that the sun orbits the earth; and one mad Dutchman at the University of Amsterdam, Erik Verlinde, who asseverates that gravity is, in fact, fictitious.

If the ever-patient and magnanimous reader finds the Flat Earth Society amusing, then the reader is hereby urged to consider that most contemporary physicists believe Dr. Verlinde’s theory to have very convincing implications, and that gravity is merely the effect of a universe maximizing its entropy, or disorder.  The concept of gravity as a universal power will probably not exist for our children.

Q: If gravity, of all things, really is a red herring, then how incredible and fantastic are groups like the Flat Earthers and Tychonians, really?

A: Every bit as credible as a science journal, just as veracious as a leading theoretician, and equally as trustworthy as the supposed date and time of the reader’s birth.

Lo, and behold the clerical error of which I spake: if science and religion could leave the protection of their podiums for a second, they might each glean a mutual respect for the irascible plight of the other, which is that they are both sadly, obviously, and pathetically full of shit.  Not one or the other.  Both.

Yes indeed, we like the results of our experiments best when they are disputed.  Should science publish a study which shows conclusive evidence on any topic at all, another science immediately sets out to prove the opposite.  The people of the world want every perspective sullied and watered-down, pushed and contested until a ninety-nine percent probability has its back against the fifty-fifty wall, precisely where we want it.

We want it balanced just so, because we like to choose sides as if they were baseball teams.

— And once we arbitrarily pick a team, we commence to argue, and bitch, and dispute for it as though our evidence were, after all, indisputable.

Even incontrovertible evidence meets with reasonable opposition

Evidence is stupid, anyhow.  It’s usually statistical, which as anyone can tell you is the most insidious form of prevarication.  For some reason, intelligent people appeal to the authority of statistics all the time and require the same of others, which is doubly asinine, as these egghead hotshots know full-well that appealing to any authority is a cardinal logical fallacy, and exponentially more so when the authority in question is an invariably inaccurate numeric representation of an actual, physical chain of events, collected from a sample base which even under the most fastidious methods has no chance whatever of accurately representing some other, similar yet different thing at an entirely different point in time.

As the British statesman Benjamin Disraeli once said, “There are lies, damned lies, and statistics.”

Most experiments require a test group and a control group, too, but like gravity and statistics, there’s no such thing as a dependable control group, either. The very act of including it in a study changes its natural state.

An excellent example of this occurs in quantum mechanics, in which certain particles exist only in patterns of probability — that is to say, they are probably there, or probably not-there, never certainly so — and these patterns of probability change according to which researcher happens to be recording the data.

If one supposes that fifty scientists conduct the same study, their findings will generally have an acceptable margin of error, each doctor achieving his or her own individual result.  The only difference between this margin and a larger one is that we declare the former admissible and the latter inadmissible. Experiments cannot gauge truth in objective reality any more than a preacher can divulge so-called Ultimate Truth (big U, big T) from a holy text.

Humanity finds evidence-for, and evidence-against, and ultimately judges its (supposedly) objective reality with the subjective whimsy of an adolescent girl deciding between prom dresses.

This, ladies and gentlemen, is what the world calls evaluation by evidence.

Weighing all evidence with the most discerning of eyes, the prom date is an apotheosis of adjudication

So all evidence is meaningless, then? All results, experiments, and hypotheses, nothing but evaporated time and energy?

Not at all. Just because there’s no such thing as True (big T) objectivity doesn’t mean one can’t create it for oneself or support it for others. We arrive at many, many decisions on a regular basis which matter to hundreds, perhaps thousands of people, and we put our faith in evidences in order to do so.  Truth is easy to arrive at in a box.

One has merely to define the box.

Contrary to an extremely annoying popular belief, though, there is no such thing as thinking outside the box, because from the broadest perspective nothing makes any sense.  Logic only happens within defined parameters.  One can exit one set of rules and enter another, more comprehensive set, but there’s always another box containing all the smaller sets to prove that they are infinitely short-sighted and presumptuous.

The important thing is to remember that we’re basing it all on faith.  Nobody knows what’s really going on.  The passionate stupidity of thousands of sheep in innumerable American religious flocks has allowed science license for abject arrogance.  The truth is, though, any honest scientist will tell you that science has no positive idea about the meaning of life, the universe, and everything.

That’s the slippery thing about Ultimate Truth (big U, big T).  It’s only true if it does not conflict with the properties of the universe — and the universe is in constant flux.  In fact, the only known absolute constant is the transitory nature of everything.  This means that even should an Ultimate Truth surface, it could only be ultimately true for an instant before becoming outmoded to newer, emergent properties of existence.

Mr. Jesus may very well have been the way, truth, and life once (or maybe is due up in a few more centuries) but neither he nor anybody nor anything else can be a static ultimate truth in an anti-static reality.  A more likely solution is that universal truth changes for each individual thinker, so that one’s universal truth may indeed be found in Biblical scripture at a certain age — and this is boxed-up objective truth, no less true than death or taxes — but neither before nor afterward.

“When I was a child, I spake as a child, I understood as a child, I thought as a child: but when I became a man, I put away childish things” (I Cor. 13:11).

Yeah, that’s right.  I can quote scripture.  It isn’t blasphemy when it’s true.

So perhaps we all have some real thinking to do, eh?  Perhaps it’s time to grow up.

Where does one stow an outgrown worldview?  Under the bed, next to the Tinker Toys and Legos, obviously.  Right where it belongs.

With glasnost and much cheek I remain,

Yours Truly,

-BothEyes

P.S. — Nowhere in this piece will the magnanimous reader find the word, “ontology.”


American Unoriginal, 501 Blues

The United States of America has always embraced its individuality.  Our land, after all, represents an award for having proven our independence from the European imperialists, and for having developed our own voice, our own style, our own civilization.

After that, we developed blue jeans.  We had been rebels, and having won our independence, we no longer had a cause.  Now we celebrate our independence on Independence Day, then spend the rest of the year discouraging various dependencies exhibited by our children and the so-called co-dependent relationships engaged in by our friends.  We like our independence so much that we invented baseball, basketball, and football to avoid playing soccer with the other countries.  ‘Cause, you know; like, fuck those guys.

We do work together in our 501 blues as a begrudgingly unified American people, too, but this is not the side of ourselves we wish to emphasize.  We want to stand triumphantly alone on mountaintops, shaking our fists in defiance of the global status quo — and why not?  Seems more fun than following others on a well-traveled rail all our lives.  Our rails have naturally (or unnaturally) converged in some ways, however, and some leaders have admonished us to retain our differences and revolt against pressures to homogenize.

Those leaders who champion our individuality become cultural heroes, such as Henry David Thoreau (Mr. March-to-the-Beat-of-a-Different-Drummer, himself) and Thomas Jefferson (“The pillars of our prosperity are most thriving when most free to individual enterprise”).  The punk rock movement, led by iconoclasts like Jello Biafra and Iggy Pop, embodied the Western youth’s violent rejection of the mainstream.  Mr. Paul, who wrote that we ought not conform, happens to represent America’s favorite enthusiast of America’s favorite religion (Romans 12:2).

Mr. Paul, Henry David Thoreau, Jello Biafra

For a while it seemed we might make these leaders of ours proud, proud of our ambitious creativity, proud of our cultural accomplishments, and proud of our devil-may-care disregard for the world’s opinion of us, but look at us now: our disregard for global opinion has alienated us, our cultural accomplishments have been largely surpassed, and our red-blooded creativity, once symbolized by riveted, indigo, serge de Nîmes overalls, has become a sad, poorly-manufactured-in-Indonesia parody of itself.

American Individualism, look upon the blue face of your stillborn spirit, and despair.

There was a time not so long ago when a fella could dress as colorfully as he liked.  Plenty of guys wore blue jeans, sure, but could also step into bell-bottoms, plaid pants, coveralls, or any manner of matched slacks.  Trousers were high-waisted, waist-high, hip-hugging or standard, and could be held up with a belt or suspenders.  Even during times of extremely prevalent trends (trends, plural, mind you) we managed to assert our own personalities through the clever juxtaposition of numerous possible garments.  Look at the variety expressed in this typical ad from thirty years ago:

Bells and whistles. The former garnered the latter, I imagine.

It may be surmised that these clothes came from the same season of the same line, and that the fashion designer had intended the outfits to somewhat coordinate with one another.  These similarities notwithstanding, the variety of colors and fabrics and styles makes modern America look as uniquely fashionable as dental-office wallpaper.

I mean, look at that bad-ass motherfucker on the right.  Have you seen anything like that pilgrim-style collar in your life?  More pertinent to our conversation about American creativity, though, are their pants: endlessly more fun and imaginative than those merely acceptable blue jeans.  The bell-bottoms apparently came checkered, plaid, or plain with cuffs, and you can bet there were more colors than those offered here.  I’m guessing these fabrics were wool, polyester, cotton, and corduroy respectively, far beyond today’s usual variety of cotton, nylon, or cotton-nylon.  The fedoras are a nice touch, too, but I’m focusing on trousers, here.  And why, you ask?

Because — if modern American creativity could be measured in trousers, my friends, it would look like this:

What color were the socialist overalls in Orwell's 1984, again?

This was merely one of a score of images I could have chosen from (I selected this for the flag waving, which I consider a bonus).

Hypothesis: the American public does not exhibit the level of independent thought of which it seems so proud.

Conclusion: for all our independence and rebellion, we can’t even choose our pants uniquely, anymore.

One respondent to BothEyesShut’s American Trousers Study reported, “Hell yes, we’re independent.  We think fer ourselves, sure do, and if a pair of blue jeans just happens to be the most American piece of clothing we own, don’t y’all blame us for looking uniform.  Just because we wear the same style pants as everyone else, don’t you go thinkin’ you’ve got some sorta creative edge on us, or nuthin’.  Blue jeans were good ’nuff fer my pappy, and they were good ’nuff fer his pappy, and by God (big G) they’ll be good ’nuff for me, my son, his son, and the dog, too, if’n we decide to haul off ‘n buy him a pair!”

Cletus has a point.  As a nation, our creativity does capture the globe’s attention with our radical, unpredictable, freedom-waving manner of dress.  We’re just as edgy and innovative as any of those other countries, like Japan. . .

Gomen nasai.

or France. . .

Frenim-Clad

Or the United England Kingdom. . .

The United England Kingdom

So, OK, I admit it — I admit that we denizens of the United States are not the only ones who forgot how to sew fabrics other than denim, but as anyone can see, we aren’t becoming more interesting by learning from the innovations of other countries.  We aren’t trying to decide whether we’ll wear our awesome Scottish kilts to the party or our dashing Spanish sailor’s slacks.  Rather, we’re destroying whatever cool fashions may have existed in these places before the stonewashed blue plague set in.  We’re not doing it on purpose, though.  Like carriers of a cultural disease, we became victims ourselves before spreading it around.

Levi Strauss, pragmatic inventor of what he insisted on calling “Levi’s overalls,” did not advertise his way to the top of the fashion charts, however; his product had undeniable merit.  The machine-spun fabric withstood months of laborious mining, and the copper-riveted pockets did not tear out at the corners when laden with rocks, bolts, and other detritus toted by the miners.  In 1890, Strauss added a watch pocket for pocket watches (that little rectangular one at the right hip) because men generally carried their watches on chains in vest pockets, and vests, of course, could not be worn in the mines without becoming torn and soiled.

So we non-miners bought them, too.  Our wives were tired of patching and darning our trousers just as much as Mrs. Strauss had been, and what do you know?  By the time James Dean wore them in “Rebel Without a Cause,” the United States Navy had been issuing them to sailors for over fifty years.  Then theatres, schools, and churches banned them in a last-ditch effort to contain adolescent interest in rebellion, an effort which backfired, of course, and by the sixties they had become commonplace.  Then stonewashed.  Then cut-off.  Then ripped.  By 2004, the average American owned seven pairs of blue jeans.

Seven pairs.  Seven.

Forty years ago, guys could go ladykilling on Main St. on a beautiful Saturday afternoon and expect prospective marks to decorate themselves from the waist down, rather than default to the best-fitting of their seven pairs of blue jeans.

Liberated elegance, from a time when people had to know how to match their clothes.

Yeah, so old Levi isn’t at fault.  Jeans are ubiquitous because indolence is human.  We’re too damned lazy to exercise our character, and fuck, jeans “go with” everything.  They really do look nice, too; I like mine boot-cut with a dark, royal bleu de Gênes color, and always wear ankle boots with them to look less casual.  There’s nothing wrong with them — they aren’t the problem.  If it were up to our jeans, I bet they’d rather not be worn as a matter of course, either.

We don’t have complete control over our fashion proclivities.  Marketing and thought control are synonymous, and even more commonplace than the clothes sold thereby.  In spite of this assault on the American freedom of choice, few high schools in the United States still teach media literacy, leaving teens (and their hard-won pocket cashola) defenseless, unaware that they are always someone’s target audience, victims of omnipresent psychographic advertising.

These mind vipers love us all dressing alike, eating the same foods, listening to the same bands (who all sound alike now, anyway) because it’s child’s play to advertise in generalities when the general public is generally going to like anything that fits the general description of what they generally want to buy.  How can a budding fashion designer build a name for himself?  Why, advertise a logo on magazines and bumper stickers, then slap it on a pair of blue jeans and charge enough money to ensure only affluent people can afford to flaunt them.  Sold.

Do people purchase things they might regret as a result of mass marketing?  Oh — sometimes, I suppose.

Many entities benefit from transmogrifying a free-thinking, unpredictable people into a cowed and colorless one.  Politicians, far from pandering to liberals or conservatives, have aimed at median voters for decades.  We owe this trend to the tendency of most Americans to contradict themselves on the ballot.  Most Americans, for example, call the torture of terrorists justifiable, yet insist on federal investigations into the torturing of terrorists.  Most Americans back abortion rights, so long as women do not abort their pregnancies for certain reasons — gender selection, for instance.  This tendency lets interested parties market to the broadest, largest group of people with a single advertisement, and for this reason interested parties work to make us as similar to one another as possible.

It is, of course, human nature to prefer what does not surprise us, as well, so we shirk the shocking and reject the revolutionizing.  Hippies dressed differently, so they were terrorized.  Punk rockers dressed differently, so they were terrorized.  Women who wear burkas in the U.S. dress differently, so they are terrorized.  The most dangerous thing to a way of life is a new, fresh idea, and many people can’t help but hate the guy with the wacky hat.

The wacky hat is distracting.  It isn’t simply fear that causes us to attack everything creative and unique in our midst.  High school administrations that adopt a “No distracting hairstyles” clause for their dress code know well what independent thought can do to a “sit down, shut up” curriculum (more on this in Part I of “How to Refrain From Being a Dick”).  When we stop worrying about our hair, we also free time from our mind’s busy schedule to think about something else — like how we’re going to afford a three-hundred-dollar pair of Sevens brand blue jeans.  We’ll need the trousers if we want to attract that blonde who makes us hard by packaging her ass in a three-hundred-dollar pair of Sevens brand blue jeans.

Creativity: securing seats in the gene pool since the dawn of time.

Originality is powerful.  Unique traits fuel evolution, command attention, and map uncharted territories in any given scenario.  Best of all, exercising one’s individuality today is easier than ever.  One could, for instance, boycott blue jeans.  The last American Levi’s factory closed in 2003, anyhow.

Levi’s blue jeans: Not Made in U.S.A.

So, go ahead!  Have waffles for dinner and ride a pogo stick to work.  Go apeshit, America!  Take the plunge.  Spend an hour looking for trousers at the mall; look for pants that are neither denim, beige, nor black.  Good fucking luck!  It’s far harder than you think, and if you’re anything like me, it’s going to piss you off to see how few possibilities the market allows you.

There’s nothing wrong with national trends.  Trends become traditions and traditions become culture, and culture’s one of few things differentiating us from dust mites.  When trends control our thoughts and curb our options, though, it’s time to trim them back.  When everyone loves Twilight, it’s time to take a second look at Dracula.  When everyone has a pair of those retro Ray-Ban Wayfarer sunglasses, it’s time to switch up to neon blade-style Oakleys.  Do it.  Let’s see your face behind a K-rad pair of those fuckers.

I’m not kidding myself, bytheway.  I know there’s no escape.  But there’s an important difference between the guy who goes gently into that good night and the guy who spits and cusses and brawls all the way down.

Or — I’m imagining that, and we’re all just as boring as everyone else.

No way.  I saw a forty-year-old man in a swell black tuxedo and pink bow tie slam dancing at a Vandals show, once.

And there was nothing boring about that.

With Great Reprobation, Condemnation and Fulmination,

-BothEyesShut

True, False, Fuchsia!

When it’s done well, conversation’s an art that impresses me more than anything in the world.  Humans learn all sorts of fascinating minutiae while tooling around the world they inhabit, and some of them have a good sense of humor.  There’s nothing like talking with someone who can make you laugh and teach you things at the same time: gossip, trivia, history, world culture, current events, important and unimportant things, inexplicable things, and things as mundane as what happened on last night’s episode of “Whatever.”  Hell, people can even provide an intuitive guess at things they don’t know, which, after some cross-referencing with other people, usually becomes one of the educated guesses upon which many of us regularly depend.

In Southern California, however, we have a treasured tradition of attempting to convince one another of our ideas and opinions.  We squabble over the quickest route from A to B, and exhort one another with banners and bumper stickers (especially around election time).  Even our fucking tee-shirts bear the slogans and advertisements of our favorite points-of-view.  In popular gathering places, the usual discussion happens in every color of the rainbow a thousand times over:

“Yes, it is!”

“No!  It isn’t…”

Lou Piniella

Bears don't look like this unless they're going to maul each other. This peaceful show of aggression is a purely human trait.

All this shallow bickering should have stopped in grade school, but our social development is arrested by our earnest desire to help — at least, that’s the noble reason I’m giving for it; pride in our powers of perception fuels arguments at least some of the time.  Note also that there are excellent reasons to argue (see How to Refrain From Being a Dick for some examples) even though the bulk of arguments are bunk.  One must grow accustomed to the presence of contradictions and paradoxes in this life, though, and our desire to work together for the perpetuation of circular arguments seems to be one of them.  More on paradoxes later.

What reasons exist for giving up the incessant “tastes great / less filling” sort of tennis match and resuming less-combative conversation?  Read on, o’ my fellow friends of the Friday-night beer talk, and we might find a way to shut up our faces long enough to finish a watery American lager.

-Standing by One’s Opinion Is Vain

It’s a strange culture we live in.  We’re expected to be modest yet confident, friendly yet assertive, firm yet yielding, a list of directives that sounds like a good kiss.  It’s a fine balance, and in that balance we’re taught that “Your Opinion Matters!” even while all opinions are “like assholes: everyone’s got one and they all stink.”  I saw the former on a poster in a mall, I think.  My grandfather used to say the latter.  Who would you listen to first?

It’s true that everyone has opinions, though.  There’s little way around that, and if everyone has them, then one person’s idea is worth about as much as another’s, since even a so-called good idea can potentially be had by someone else.  Most people don’t argue other people’s points of view in an argument, though, and I find that extremely telling.  We’d often benefit by relating someone else’s point of view, rather than something we cooked up ourselves, because one can’t be accused of arguing out of pride when the argument posed isn’t one’s own.  I’d be happy to give you my stepfather’s opinion of the New York Yankees, for instance, because I’m not a sports fan and there’s little danger that you’ll think me very serious about myself.

Stacks

Zillions of pros, zillions of cons -- dreamed up and written by the most erudite people on earth. I'm sorry, what were you saying?

I’d also be glad to give you Marcus Aurelius’ perspective on willpower, Anselm’s ontological proof of the existence of God, Fuller’s evidence that the world needs communism, and other trite epiphanies, but please don’t hold me accountable for relating them!  They aren’t my fault.  They were written years before I was born.

Appealing to the authority of famous smarty-pantses of the world is a notorious logical fallacy — in other words, quoting Albert Einstein doesn’t make one’s contention correct — but it’s much less vain than presuming others should accept your opinion simply because you’re so fucking cool.

It may be that one’s own opinion is most politely stated as a question, like, “I wonder if Iggy Pop isn’t better than David Bowie?” but we can’t always talk that way, so it’s important to remember the following.

-Everyone Is the Center of the Universe

Nobody can have any point of view but his or her own.  Everyone is the center of their universe.  They don’t know what your universe is like, and you don’t know theirs.  The universes may have commonalities, or they may not.  Regardless, my daily evidence suggests that I am the most important thing to myself so truly and consistently, that even my most heartfelt principles and ideals are only worth dying for because, hey, that’s my opinion.

If I allow that you humans are like me, and that you are centers of universes, too, then there’s no fucking way I’m going to convince any of you of anything (unless of course I say something you almost agree with, anyhow, or simply hadn’t thought of yet).  It’s especially difficult to convince others of something they do not readily believe since the proliferation of Grandpa’s opinions-and-assholes principle, the aforementioned proverb our culture developed to devalue any and every opinion from Kant’s Categorical Imperative to the capitalism of Carl Karcher.

"Egocentric," by Tyler Philips. The question is, does the mind's eye emit, permit, or permute?

We’re partially convinced everyone else is wrong before we even arrive upon a topic of discussion, and that’s not surprising; we were there to witness every time we were right, and we partially doubt (or forget altogether) many of the times we were wrong.  Who, after all, can argue with the center of the universe?  Besides, even if Jack were to convince Jill of a certain idea, Jill would merely be placing her own judgment of Jack’s reasoning before and above his idea in question.

One can’t accept or deny an idea as logical or illogical without presuming a presiding authority.  Parents discipline their children out of an inability to dethrone the god who refuses to recognize Dad or Mom as a sovereign leader.  Obedient children obey only because they have decided that obedience produces favorable results.

-Everyone’s Opinion Is Justified, and Everyone’s Reason Is Erroneous

It seems certain that all opinions have the same subjective value, but ideas backed by logic or reason have quantifiable, given parameters by which they must be measured.  In the case of the subjective, we are almost always correct and justified (we earnestly do feel that X is Y); in the case of the objective, we are almost always incorrect and unjustified from the largest perspective, because we know too abysmally little to state things as universally, absolutely true, and can only be correct in small, easily defined, easily proven and quantified matters, such as arithmetic — and even then, paradox shows that we are wrong from other perspectives.

Paul

This fool on the hill sees the world spinning 'round. It's not surprising that he's the center of the universe. The surprising thing is, you are.

To illustrate the futility of solid logic on a universal scale, consider some rudimentary arithmetic: good ol’ 1 + 1 = 2.  Given that we will use Arabic numerals and some other laws previously agreed-upon, this is going to seem standard, true, and inarguable.  The answer, however, is vulnerable to alternative interpretation due to the accepted meaning of addition, and of “1,” itself.  If one thing were literally added to another, then the result should be a unification of these two separate entities.  In math alone, we agree to call this unification “2,” but linguistically, philosophically, or metaphysically the logic falls apart, because a unification must, by definition, result in unity: unus, the Latin for one.  From these points of view, anything added to anything else will result in exactly one new thing, and we happily operate in a world where these two conflicting perspectives are both true at the same time, never questioning either of them.

The desert is hot?  It’s an icebox compared to the sun.  Your OJ too sweet?  It’s entirely sour when opposed to honey, and honey’s still bland compared to a mouthful of refined sugar.  Everything’s validity or value depends on scale, context, and relativity, and for this reason everything is true, and everything is false.  Proving anything one way is the silliest thing in the world to achieve, because there are an infinite number of other perspectives, each of which may equally prove or disprove it depending on what you’d like to accomplish.

From the broadest perspective, in other words, absolute truth is as arbitrary as the selection of a crayon.  And I want a fuchsia dinosaur.

The coup de grâce is really brutal, though: even upon reaching a stable, static, universal truth, we find that the entire universe is in constant flux, rapid change, turmoil, decay, permitting, emitting, transforming, creating, destroying, and “moving on,” as Stephen King put it once, so that any true answer was only true for that universe at that time — and that was some time ago.

Forest Entropy

Nothing is static, everything is evolving, everything is falling apart. -Tyler Durden (Chuck Palahniuk)

-A Somewhat Oriental Alternative

My opinion is that contradiction and paradox are the bread and butter of the cosmos.  If I may be allowed to appeal to some authorities, quantum physics (and several other religions) agree with me, not to mention Ken Wilber, who looks so cool you just know he’s hip to his shit.  There’s nothing wrong with being wrong.  We’re all wrong.  I’m wrong right now; just ask all the people who stopped reading halfway through.  When people respond to a question, “Hmm, well yes and no,” I hear the warm laughter of oblivion and smile inwardly, but when I hear people insist that they know what they’re talking about, I have to laugh at myself for having absolute confidence that they should not be so confident.

Habitually trying to convince others to change their opinions is not only futile in the long run, it’s also genocide against the opinions you don’t hold.  Who wants everyone to agree with one another?  That kind of peace and harmony sounds fucking beige.  The only reason I have no contempt for those oh-so-cooperative insects, ants, is that deep down inside, I really believe they’re all arguing over the appropriate size food granules should be for carrying back to the nest.  It’s one of the interesting things about them.

There’s nothing wrong with letting the Rolling Stones be better than the Beatles, so long as the Beatles are also better than the ‘Stones.  I prefer Fitzgerald to Hemingway because I’m fairly certain that Hemingway is better.  Sometimes I wonder, perhaps I have never been very enthusiastic about sports teams because either side of any game is a fleeting moment in a fleeting century, a single strike of the paddle in a ping-pong tournament played on a cruise ship which rounds the peninsula and disappears into the mist, forever.  Hell, sports teams don’t even have the same members game-to-game, let alone season-to-season!  People will wear the same logo on their cap from kindergarten to their own funeral, though, and some of them will be buried in it.

That, dear fellows, is what it means to try to convince people of things.  It’s insisting that the fluffy, domed cloud up there really is a turtle because that’s the way you see it, and because you’ve seen a helluvalot of clouds, by god.  My favorite part of all, though, is the hypocrisy involved in writing this piece, hypocrisy which will also be necessary to criticize or pontificate about it.  Hypocrisy is the sudden realization that one is the person whom one has chastised.  We define ourselves by standing out in contrast to others, and that makes us all identical in our hypocrisy.  How cute.

The trick, then, is to simply avoid the hypocrites who really seem passionate about their-slash-your point of view, right?  We can do that, can’t we?  Then we’ll have peace, a much more apathetic, blasé temperament all-around, and that’s something for us all to work toward.

So, until someone comes up with a better idea, I remain

Yours Truly,

-BothEyesShut

*Note: The artist featured this week, Tyler Philips, may also be found at his design company, Circuit 26 Design.

How to Refrain From Being a Dick

“Judging people” has a very unfashionable connotation these days.  Nevertheless, it’s not only something everyone does, but an important part of life, a tool with which we sculpt our own personalities to best reflect our ideal persona.  Being “judgmental” is considered ugly and rude, but we’re constantly asked to judge whether someone else’s behavior was appropriate or not, and if we’re expected to do that, then how can judgment be ugly?  Well, it’s ugly whenever we unfairly conclude something, of course.  That’s precisely when judging the behavior of others makes us an asshole.  It therefore becomes a man to consistently check his judgment of others for inconsistencies such as hypocrisy, unfairness, or just plain ol’ meanness.

I try to catch myself when I think something uncool about another person — preferably before I say it — and this self-censorship is part of how I try to be cool to people.  However, I also feel that I owe society a little vigilance in holding my friends and neighbors accountable when I calculate their behavior to be assholey or dickish, and in mentioning similar judgments to other people when appropriate — you know, in order to spread the word: “Excuse me, but I couldn’t help but notice that what you just did makes you look like a huge asshole.  You should knock that shit off.”  Being nice to everyone all the time seems injudicious, seems to perpetuate unwanted, uncool behavior in society [see “A Hurried History of Pagans and Pulpits,” if’n you’re inclined].

Moe was such a dick, they even circumcised his hair.

The difficult part is knowing which behaviors to encourage and which to discourage, or, colloquially speaking, what makes a guy become a dick.  This week’s “In a Real World…” attempts to provide a lenient and justifiable guide to judging other people (but is really only a 10-dollar-word version of a shit list).  I apologize for electing myself to the position of Grand Inquisitor, bytheway, but we agnostics and atheists need a little guidance, too.  Besides, I didn’t come up with this stuff; it’s not my idea; it’s merely my best synthesis of things society seems to think at-large.

I.  Behaviors, Knowledge, and What People Have Commonly Known

The most important thing to understand — and this is crucial — is that nobody is or is not a dick.  People merely do, or refrain from doing, dickish things.  In order to know which activities are cool and which are uncool, we must know to what extent we may hold people accountable, for people live their lives according to what they know (or think they know) and it’s not reasonable to hold everyone accountable for not knowing everything.  It becomes our job, then, to make a reasonable list of things people can be expected to know and understand.  I won’t do this arbitrarily.  Let’s look at what people have commonly known through history, and we’ll call it common knowledge.

Makes sense to me to start from the late nineteenth century.  That’s when public schools started in the United States and institutionalized knowledge, making it “common,” so-called.  Before public schools, most people knew some folk medicine, some folk music, some recent history, how to mostly practice their religion, and the appropriate ways to perform their mostly menial jobs.  That’s most people, mind.  News came word-of-mouth, superstitions were widely believed and practiced, and one’s reputation in the community meant more then than one’s criminal record or credit score does today.  Note that rudimentary logic, reason, and rationality are not here featured, yet people somehow managed to treat their favorite same-skinned neighbors civilly.  If pretty much everyone can be expected to know how to be nice, therefore, then the question becomes: to how many people?

The Industrial Revolution needed workers who could sit down, shut up, and ask permission to urinate. Open for business.

It makes sense to me that we can expect a completely uneducated person in America to operate on a level consistent with the above knowledges.  When I’m in a particularly rural area with little technological industry (the same industries which necessitated American public schooling to begin with, you see) I do not call a man a dick for failing to understand that we civilized folk don’t use one word or another anymore, or that his neighbor is probably not going to a hell when he dies, or that going slow in the fast lane hinders my progress and wastes my precious city-boy time.  He doesn’t know the things I know, therefore I consider him cool by what I can only approximate to be a standard typical of his culture.

That’s very important, too: understanding that our culture (and its subcultures) is not the arbiter of cool will keep us from being a dick from alien perspectives.  Ethnocentric people don’t usually give a fuck about outside perspectives.  That’s why they end up getting caned in Singapore and cussed at in Paris.  Of course, it’s also why Westerners are occasionally beheaded in the Middle East: from our perspective, religious fanatics are total dicks.

II.  The Least a Person Can Be Expected to Know

Literacy, history, the arts, science, math, and other high school subjects were not commonly understood until roughly the nineteen-forties, and even then we must confine this newfound understanding to centers of urban civilization.  Today in Southern California, however, all of these subjects are taught in public schools, and we can expect even the worst student to grok the most-basic gist of them: science says, causes have effects; math says, everything affects everything else; history says, people really like to hurt each other.  Understanding any one of these can teach someone to stop being such an asshole.

Gandhi stated simple, logical reasons for not acting like a dick. His goofy smile is a side-effect of enlightenment.

Is even this basic knowledge needed to escape being a dick, though?  That wouldn’t follow; it’s very easy to imagine a friendly ignoramus, after all.  They abound in literature as gentle giants and wizened, elderly farmers.  The knowledge base requisite to avoid being a dick must be smaller than this, and I suggest the following standards:

A. Colloquial etiquette and civility

B. Simple Logic (“if this, then that”)

A working knowledge of etiquette and niceties typical of one’s region seems a good place to start, but we’re going to need civility, too, which I’ll define as a simple deference toward one’s fellow man.  Without a little etiquette and civility, one is certain to act like a dick sooner or later (and probably sooner).

To exemplify a lack of etiquette, if someone rockets the snot out his nostril with a firm blow while in line at Starbucks, he or she is going to disgust everyone, and that’s going to hurt his or her reputation, especially if snot spatters the top of some little kid’s head.  As for civility, double-parking makes a good for-instance: there’s nothing more dickish than for some cocknose to take up two parking stalls in a traffic-choked part of town because he’s just too busy to take ten seconds repositioning his car so that someone else can use the other space.

It can be argued that etiquette is ancillary to civility and need not be mentioned.  Upon broad examination, though, one finds that these cousins are too independently important to be combined.  Following etiquette usually keeps one from acting uncivilly, even when one’s inborn civility tends to be found wanting.  For example, my dad drops the occasional racist joke — but that doesn’t mean he double-parks in Little Saigon.  He’d never be able to live with himself.

A logical badass doesn't double-park.

The knowledge which ultimately decides how big a dick someone will be from day-to-day, though, is that of simple logic.  A man without logical aptitude is incapable of seeing that, if he shits on his next-door neighbor’s welcome mat, he’s likely to smell it through his own open window.  A person lacking basic “if this, then that” understanding may become enraged and violent at the ravings of a transient hobo, or pick a fight with someone over an ex-lover (emphasis on the prefix: ex), or fail to see that volume and repetition rarely aid one’s argument in a debate.  On the other hand, the possession of active logic can and most-often will lead to polite, friendly, and decidedly less-dickish decision making.  Logic probably won’t teach you to open doors for ladies (an arguably outdated custom, anyhow) but it may lead you to smile more, use tissues when blowing your nose, and refrain from double-parking, lest your paint job get keyed.

That’s all I think a person needs to know in order to keep from being a dick: civility with a dash of etiquette, and simple logic.  Nothing more.

Now, I’m not the sort to state a bunch of principles only to finish without having left a handy tip or two, so what follows is intended to aid the reader in his or her quest for coolness.  You’re welcome!

III. What You Have No Business Expecting People to Know

Some may find themselves getting carried away in their estimations of others.  This condition is rampant among certain groups of people (especially the youth, as well as many of my dearest friends) and is an understandable side-effect of the assessments all humans must naturally make of their conditions, including the actions of fellow humans in close proximity.  Many branches of knowledge are unnecessary to keep from being a dick, though, yet are consistently roped-in with the exceedingly small number of things one could rationally expect to be “common knowledge.”  A few examples follow.

The (once) popular game which made (not) knowing everything totally (un-) fashionable!

A. Fine Art, Appreciation or Execution of

Amidst the constant din of advertisement and pop-culture, it’s unrealistic to imagine anyone could form a cogent idea of what constitutes real quality in the arts, be they musical, visual, literary, or otherwise.  If you call someone a dick for enjoying the homely, modest pleasures of Taylor Swift’s melodic country tunes because, fuck, you’re surprised anyone could find pleasure in such utter simplicity, I’ve got news for you: that innocent music fan is not the asshole in this example.

B. Fashion

Don’t like her shoes?  Well, where were you to help her out when she blew sixty bucks on them?  You dick.

C. Much of What Is Taught In High School

As far as I can tell, most people spend much of high school trying to escape the slings and barbs of all the immature assholes on campus.  That, combined with a few years of disuse and intellectual decay, is more than enough to obliterate many of the details gleaned from an American high-school education.  If someone forgets who fought the War of Eighteen-Twelve, it’s OK to inform them, but don’t call them an asshole.  You might just become one someday, some day very soon, in fact.

D. Philosophy, Government, Religious Studies

Almost everyone has a favorite one of these.  Each is much, much, much more closely related to the others than may appear at the outset — even across the branches — and their worshipers have a habit of accidentally becoming assholes in their righteous quest to vanquish those whom they view to be assholes.  Regardless of the potential veracity of your particular passion, it’s just a few interesting ideas.  Don’t be a dick.

E. Any Knowledge One Could Pay Someone Else to Use

So, what if your friend can’t change the oil in his car?  Can you sew a fucking blanket?  Can you catch a stupid rabbit?  Good luck surviving your first natural disaster.  Asshole.

IV. How to Pinpoint, Within Reason, a Dick

The only definitive way to note dickish behavior is by observing selfish, uncool activity.  However, certain traits which often accompany a terrific want of reasoning faculty are visible even from across the street, and it behooves the reader to acquaint himself or herself with them in order to decrease the likelihood of abetting, encouraging, or becoming the victim of, an asshole.  If this sounds to you too much like judging a book by its cover, as the cliché goes, please imagine any important piece of literature, The Diary of Anne Frank, for example, with the cover of a typical dimestore romance novel, and explain to me why it doesn’t have one — the publisher would almost certainly sell more copies with all that cleavage and flowing hair, after all.

By Kurt Vonnegut

A. The Eyes

You’ve probably noticed a certain dullness in the eyes of your slower-witted neighbors.  While a lack of education alone does not make a person a dick (see above) it does increase the likelihood.  Slowness or laziness in the eyes may denote a lack of purposeful seeing and searching, a sort of disinterested passivity about life and the surrounding world.  If passive thinkers act like dicks, it’s probably not because they mean to be, it’s just that it hasn’t occurred to them to give a fuck about you.  Besides, who the hell are you, anyway?  You think you know me?  You don’t even know me!

B. Gaping Mouth, Poor Posture, Other Signs of Habitual Relaxation

It takes energy to be cool.  One may argue that in the long run it takes less energy than is needed to be a real dick, but on the battlefield of life, some people just can’t be bothered with courtesy.  This is the guy who’ll casually drop his litter on your front lawn, keep your misdelivered mail, block your car in the driveway, blab sensitive information, or “forget” to return borrowed items.  We all do stupid, inconsiderate shit like this sometimes — but some people do stupid, inconsiderate shit like this as a matter of course.  Avoid like the plague, son.

C. Failure to Produce Supporting Information On Proposed Points of View

The most remarkable aspect of the true dick is an uncanny suspension of disbelief.  A real cocksucker can hold any point of view he likes without feeling any obligation to find reason in it whatsoever.  These winners say things like, “It just is,” and, “See?  I’m right, huh.  Ask this guy.  Aren’t I right?  See!”  Ask them why they think their team is gonna go all the way this year, though, and you’ll hear more cutting-edge statistics than the WTO has compiled over the last decade.  Even this is not enough to convict them of being assholes, though.  You have to wait until they say something really offensive.  It usually takes ten minutes, half a beer, or one unit of patience less than you have — whichever system of measurement works for you.  Play it safe, I say, and politely withdraw at the first sign of unsubstantiated bullshit, or you might get some on you.

*        *        *

I do hope that this little tutorial has elucidated some of the complexities involved in not being a dick.  It’s one of the most important skills to hone as a human being, and a difficult one for many people, impossible given certain situations.  Perhaps humility comes with increasing effort as one achieves more throughout one’s life, but somehow I don’t think so.  With a little consideration, anyone can act as nice as he or she would like to act.

I like the word consideration: it has the denotation of purposeful thought and the connotation of politeness, a perfect marriage between logic and civility.  Even if we’re pretty cool to one another, we can always do a little bit better, and personally, I like doing better than usual.  People like that, and people smile when they like things, and as far as I’m concerned, anything like a smile to pretty-up this overdeveloped parking lot is a good idea.  So be cool!  Be considerate.  But above all, don’t be a dick.

We friendly bastards are ever-vigilant.

Earnestly and Bemusedly Yours,

-BothEyesShut


Rookie Religious, Selfish Spiritualist

In talking about various lifestyles, it’s hard not to see commonalities between fashion and thought.  The twentieth century may be easily divided into its prevailing Western philosophies, each decade pigeonholed for its own flavor-of-the-month philosophical fad, such as Bertrand Russell in the Roaring Twenties, Friedrich Nietzsche in the nineteen-fifties, or Jean-Paul Sartre in the nineteen-sixties, though others could suffice as well.  People tend to take their philosophical fads about life, the universe, and everything very seriously, and I can’t abide “seriously.”  I regard seriousness as an intellectual plague of the modern day.

The hardest people to prove wrong are usually laughing — and they’re usually laughing at themselves.  Even Shakespeare’s wise men were all court jesters, and I for one don’t blame them.  The funniest thing about humanity is its nearsighted self-importance, and laughing at people when they’re passionately convinced of themselves amuses hell out of me, like turning a vicious, snapping turtle on its back.

What follows amused me thoroughly to write, an indictment of three sorts whom I no longer naively expect to present consistent logic in casual conversation.  Each of them easily deserves its own post, but I like to examine a variety of topics, so this will have to do.  It should be noted (and I say this with an uncharacteristic twinge of tenderness in my voice) that I consider the following social groups fragile in one or more crucial ways, and I wouldn’t say these things to them unless they asked for it — or had the ability to stop reading.

New converts: more faith in their tee-shirts than you've got in gravity.

I. Socially Ambitious Spiritual Leaders

If there’s anything atheists and agnostics seem exceptionally good at, it’s automatic distrust.  The secular paradigm does not depend on faith as immediately as most religious perspectives do.  It’s not surprising, therefore, that when spiritual leaders run for office or hold massive conventions in sports arenas, atheists and agnostics refuse them “the benefit of the doubt.”  Since typical spiritualism and religion are against fame, large-scale material gain, and power over one’s fellow man, it is often difficult for the secular world to trust spiritual leaders who appear on television, magazine covers, or the jumbotron digital screen at Anaheim Stadium.  Non-believers have no patience for spiritual leaders who ignore their own religious tenets.  Go figure.

Believers, though, they have no problem practicing George Orwell’s concept, doublethink.  Pat Robertson’s a great big jackass because he said horribly racist things to the media recently, but Jerry Falwell’s memory will remain untainted by his own shortcomings because they’ve been conveniently forgotten by people who desperately want to believe in their representatives.  Jimmy Swaggart’s biography, “Thrice-Born: the Rhetorical Comeback of Jimmy Swaggart,” says his public applauded the reasons he gave for his moral failings.  How’s that for accountability?  Spiritual leaders, it would appear, can abuse the public trust as much as they like without serious, lasting repercussions.  The only people who remember when they lie or steal or otherwise transgress their own moral standards are the same people who thought these leaders were crooks to begin with.

The historic Jimmy Swaggart apology. It's OK, big guy, we never really believed in hell, either.

There’s much paradox in large-scale spiritual leaders, anyhow.  Throughout history, hardly any of their burgeoning number have been founders of their particular brand of faith.  The majority have been little more than charismatic persons with evocative ideas and perspectives regarding preordained doctrine, which would be fine if that were all these leaders had to offer.  Once they’ve garnered sufficient attention, though, they tend to inflate their office like a wartime American president and commence making changes of all sorts and sizes, great and small, changes to the traditions of their faith, their practices, their creed, even their holy texts or ultimate doctrines themselves.  If the reader fails to see paradox in this, he or she will be kind to note that it is only upon these traditions, creeds, texts, and doctrines that the leaders attained their positions.  Fine joke, that.

If this argument seems dubious, one has only to consider the lists and lists, branches on branches of religious schisms and sects, denominations and cults.  Each of these represents an example of the above paragraph in action.  For instance, Martin Luther was not Christ, and yet. . .  Sai Baba was not Swami Vivekananda (let alone Gandhi, let alone Ramakrishna) and yet. . .  All that remains to be said is: don’t read Josh McDowell to understand the philosophy of Jesus Christ, read Jesus — and don’t read Alan Watts to grok Taoism, read the Tao Te Ching.  Socially ambitious religious leaders all either attain to power or have it heaped upon them, and anyone can tell you what effect power has on people.

II. Golden Years Relapse and AA Christians

Anyone can tell you that many elderly humans return to God (big G).  Alcoholics and druggies do, too, and in fact are ushered to it by groups like Alcoholics Anonymous and Narcotics Anonymous.  It isn’t bad math or inconsistent logic, if one looks at it.  Many religions offer amnesty in the form of baptisms, confessionals, or amoralism, and promise eternal life and love for virtually nothing in return; when faced with oblivion — well, one almost has to err on the side of a possible paradise rather than risk eternal suffering.  Of course, many non-believers see no risks or possibilities whatever, so they go about their business and simply snuff it at some point or other, tilled and ready to fertilize the daffodils.  Golden-years converts and addict converts, they revert to what took some of them decades of soul-searching and introspection to escape, namely, the same damned worldview they had when they were still being punished by their parents.  What a fucking way to go.

Rev. Oh Beng Khee, a friendly pastor who converts 25-40 seniors over lunch every weekend. Would you like fries with that?

The main frustration comes from their immediate desire to proselytize and witness to non-believers or believers of other faiths.  There’s nothing for one’s confidence in a doubtful matter like convincing someone else that it’s true.  Try it!  You’ll like it.  It’s a sad shame that so many of the world’s most beautiful systems of thought have no standard at all governing the earnestness of their converts, because there’s nary a congregation in the world without a solid percentage of confused persons, people having no business at all swearing oaths, speaking prayers, and outwardly worshiping symbols and icons with serious doubt in their minds all the while.  That sort of thing is definitely not good for everyone else in the congregation who stakes his or her own faith on the support of so many other steadfast believers.  If a fella learns to operate Windows XP on Monday, ought he to be given a job in information technology on Friday?  Do your beliefs a favor, you golden-years and A.A. converts: keep your gods to yourself until your faith outlives your reputation.

III. Spiritualists and Neo-Hippies

So-called spiritual people do not call themselves religious, and do not abide anyone else calling them religious, kind of like a Frenchman insisting that he be called a Parisian.  Religions control people, they say; spiritualism, though, frees minds like in a Bob Marley song.  Self-proclaimed spiritual people say that religions siphon money from believers, and that offerings and donations do not reach the poor and disadvantaged when they come from churches.  Of course, if the money were given to Hare Krishna dancers, “Save Tibet,” or the aforementioned Sai Baba, it’s global change in pocket change.  This is one example of dualistic spiritualist thought, but a mere one of hundreds, and the differences betwixt spiritualism and mainstream religion have more to do with the size of the congregation than with anything else.  But you knew that, already.

Sai Baba. You have no idea how globally popular this motherfucker is -- but if you've ever purchased a box of incense sticks, it was probably Sai Baba brand. Not kidding.

One annoying difference (or similarity) is the spiritual persons’ habit of maintaining a salad-bowl paradigm.  Today’s new-age and spiritual believers do not have a consistent set of beliefs, but rather pick and choose as though the fundamental principles of the universe were a produce section in the local supermarket.  While this may well be true of the universe and its principles, little effort is taken on the part of many spiritualists to reconcile one belief with another, so that the tarot cards might predict a fine day, the I Ching sticks proclaim tumult, and astrology declare perfect balance throughout the cosmos, and the modern spiritualist will find a way to accept the resulting conclusion — an admittedly shallow example, but a suitable one for our purposes.

Perhaps worst of all, few spiritualists really give a fuck about the authenticity of their beliefs.  The easiest American instance of this is the widespread abuse of the Hindu concept of karma.  Since spiritualism’s rise to flower-child popularity, the word karma has been used to describe a sort of cosmic vengeance which, were one to drop a brick on someone else’s head, would bring ten bricks down on one’s own.  This is a gross misinterpretation likely born from the Western need for a holy fist of heavenly justice.  Karma in the Hindu traditions is the effect of this life on the next life.  It is inextricable from the concept of reincarnation.  The effects of this life on this life are called dharma, and are much closer to the scientific concept of cause-and-effect than anything else, which pretty much takes all the magic and mysticism out of it — much like a large portion of other twisted metaphysical and hermetic philosophies.  The closest spiritualists in America typically get to understanding (or caring to understand) this crucial distinction, however, is a giggly aha! moment when the title of the mediocre sitcom, “Dharma and Greg,” comes to mind.

"When we go green, we go all the way," because, you know, you have to sit in full-lotus position to recycle a fucking can. Makes me want to burn a mound of styrofoam in the nearest Whole Foods supermarket, right next to the flax seed and patchouli oil.

That’s a fantastic image of modern spiritualists, in fact: imagine a group of people dressed like fashionable, anachronistic hippies, smiling at their recognition of a word they don’t have any compunction to really comprehend.  Oh, also?  Also make them shake their heads ‘no’ while smiling.  That’s the spiritualist version of disagreement.  They’re as peaceful as Gandhi and as passive as apple pie, so they have to wait until their detractors have left the conversation to agree with one another about how much they disagreed with what that last guy said.  They might have to engage in real conflict otherwise, and that just wouldn’t be natural.

*     *     *

What leads people to spontaneously become acolytes of a new system of beliefs?  Is it an immediate and crucial yearning for not just one dire answer, but a network of interlacing answers?  Is it a need to belong, a desire for a ready-made society of comrades united toward a common cause?  Nobody can say without being equally presumptuous, but there is a thread of similarity that connects these tenderfoot believers which is hard to ignore, one which their members would likely not bother to refute, and that is the vulnerability present in the spirit of each, the meek, affrighted lamb attempting to appear a lion by proudly waving its humble timidity like a glorious banner of courage.  Terror must abate — it must — and everyone hides under the covers under certain circumstances.

I’m afraid, too, afraid of men with guns, afraid of car accidents, afraid of both heights and spiders. . .  But I try to remind myself that these fears are usually irrational and childish.  I try not to massage my oft-bruised ego with salves of irrational, childish behavior like bandwagon belief systems, not that there’s anything wrong with being childish — so long as one doesn’t take oneself seriously, of course.  So what’s my problem?  What the fuck is it I want from people?

Consistency, sucka.  I want some goddamn consistency.

I want loudmouthed Christians to study their fucking books — all of them, apocryphal or not.  I want neo-hippies to study a little Hinduism if they’re going to start talking pop-culture reincarnation, want them to show genuine interest in all the yogas (karma, raja, jnana, and bhakti, too) rather than presuming they learned all there is to know from a hatha yoga session at 24-Hr Fitness.  The cults of Kali, for instance.  There’s a side of Hinduism I doubt the Beatles would have endorsed.  I want grandparents who find God again (big G) to keep their dignity and pass on the altar call for the first few visits on Sunday.

But most of all, I just want people to re-evaluate their silly confidence in their best guesses at the secret of life.  We don’t know.  We don’t know.

Hey.  We don’t know.

With All My Cheerful Tidings,

-BothEyes


Hippocrates and Dionysus Fistfight In Heaven

I lost faith in doctors when I was thirteen or fourteen.  I had gone to my physician for a checkup I needed in order to compete on my swim team, and he looked up at me with consternation upon seeing a scab on my leg where, a couple nights before, I had heated a knife and sliced out a dime-sized cluster of warts, these ugly little growths I’d felt self-conscious of all my life.

“What is this?” he asked.

I told him.

“Ho!” he said.  I’d never heard a Vietnamese man say ‘ho!’ before. “That was very heroic of you — but we have stuff for that.”

His tone gave me the impression that he was insulted, but I didn’t understand it at the time.  I bet he thought about how contaminated my surgical conditions had likely been, and how lucky I was to not have gotten an infection.  I bet he thought about how clean and modern and painless and simple it would have been for me to just dial up the hospital, wait on hold, make my appointment, clear my schedule, keep the appointment, wait in the lobby, get the bastards removed with dry ice or nitrogen or whatever they’re using now, and finally pay for whatever fees and salves I would then have to pay for.

Or, I could heat my pocketknife over my bedside candle one night while I was reading in bed, slice the fuckers off, and go to sleep.

Yeah.  I stand by my decision.  Well done, teen me.  In case you’re wondering, there’s no scar and they never came back.

Doctors: can't live without 'em; can't shoot 'em.

In assessing modern medicine, I keep Andre Gide in mind.  Gide wrote a book called The Immoralist (among other great works) in which he castigates the relevance and inherent paradoxes of morality, and I quite liked it in my early twenties.  He used to hang around with Oscar Wilde, too, whom I greatly revere, and they used to exchange axioms and epigrams together.  Most readers will recognize his most famous one, the witticism which I use to remind myself of my principles in judging many things, and particularly medicine: “Believe those who are seeking the truth.  Doubt those who find it.”  The suitability will not be immediately clear, so allow me to explain.

In math, science, architecture, the arts et cetera, no respectable professional will say that the absolute answer has been found to any question in his or her field.  This reluctance results from the extremely rational fear of the next expert proving the assertion incorrect by the simple application of a different perspective.  Until recently, physical science considered material things solid; now these things are known to be particles in a mostly empty void like stars in space.  This new understanding changed many things in math and natural sciences, and many people were made fools overnight.

What allowed some scientists to keep their dignity was their reticence in assuming an answer.  Answers are funny things.  They presume to be splinters of an overarching, all-inclusive and ultimate truth, but while a so-called ultimate truth would need to be timeless, answers are too specific for elasticity or longevity and are changed out for more-contemporary answers all the time.

Ha ha, yeah. . .  Answers.

Andre Gide, cavalier of righteousness and earnest living. I swear, sometimes I wonder if all the cool people during the Victorian era were homosexual.

Gide’s “truth” is the same thing.  He knew that the answer, truth, or whatever you like could be perfectly suitable to one question or another, but understood also that a truth transcending all problems, for all circumstances, from every perspective, and for all time would be very, very unlikely.  I see medicine as modernity’s various truths or answers to its questions about health.  I see medical problems changing with the times, morphing in response to ever-changing atmospheres and causes, answers transforming to match reincarnating problems.  Illnesses have always come from different causes but with the same symptoms, or different symptoms from the same causes, and medicine has done its best to keep abreast of it all, by god.  Medicine’s best is an educated guess in many, many cases however, and it’s healthy to remember so.

Another healthy thing to remember is, you can trust yourself.  I believe that if you’re not too lazy or too proud to do a little research, the best doctor in any sufficiently common case is you.  Very nearly all people are on some regimen, diet, or health plan of their own device at all times already, anyhow.  Self-medication is ubiquitous.  Everybody’s prescribed a massage for themselves to relieve a cramped leg, or applied pressure to stop bleeding, or performed oral surgery on a loose tooth (yanked it) and everyone who reads this has everything they need to do the research necessary to cross-reference different medications and treatments.  All the same, some people go to the doctor when they get a headache; others refrain from visiting a trained professional so obstinately that, once the ambulance arrives, EMTs chastise them for having brought themselves to the brink of death.  There must be a balance, one would think.

Me?  I don’t suppose there ought to be a balance, myself — at least, not if “balance” means going to the doctor half the time.   The body is not a feeble thing by nature.  It can be made weak, but it isn’t designed to be weak.  Humanity has survived for approximately four-hundred-thousand years, says archeology, which means that global climate shifts, meteor bombardment, diseases, predators, war, famine, and even poor judgment have failed to kill us thus far.  If there’s anything I like to have faith in, it’s the magnificence of the human body.

The mind, however. . .  I’m not sure I’d trust it to babysit my child — but I digress.  The body is built to last, anyhow.

The human body: 400,000 years and still kicking, kicking our way across the bubbling surface of our glorified primordial soup. Michelangelo, by the way.

Still, some people go to the doctor for “checkups,” which apparently are intended to catch problems before they become serious, an idea which sounds pretty good at the outset.  Consider though the obvious relationship between checkups and hypochondria: hypochondriacs believe they have symptoms of an ailment which a doctor says does not exist; people who go for checkups believe they may have phantom symptoms only a doctor can say do exist.  A good doctor will be incensed at this comparison, and he or she should be, because he or she knows how much training and studying goes into a medical degree, how much diligence and care goes into his or her work.  An unscrupulous, lazy, or perhaps merely money-minded doctor will scoff, too, but for very different reasons which are easily surmised.  Checkups are not harmful if one has a good doctor, though.  It’s just like taking your car to the garage for a professional once-around — if the professional knows the car well, then it should come out at least as healthy as it went in.

This is where the trail divides in the wood.  It may be said that these concerns about doctors are neurotic or phobic, but it cannot be said that I’ve indicted the medical industry unfairly, or even to the extent to which it has already been convicted.  I choose to cross-examine the patients, rather.  After all, they alone made the choice to pay someone else to do the research and suggest a mode of treatment.  Much unnecessary woe can be avoided with responsibility and a little self-reliance.

Consider the question of pills again, please.  How many people are on an anti-anxiety medication?  Anti-anxiety meds have side effects ranging from awful to magnificently horrendous, with nausea, depression, impaired thinking and impaired judgment on the mild side, and mania, rage, vivid hallucinations and violent behavior on the other.  How many of these people tried working out daily first?  How many tried meditation or California-style yoga?  How many prescribed a nice quiet read on the beach for themselves?  Or — and sorry if this is blasphemous — but how about a cigarette and a beer?

Hippocrates is wanted for questioning in connection with the unexplained disappearance of several shipments of drugs. Authorities suspect the ancient philosopher may be attempting to offset his responsibility for having authored the oath ironically uttered by their inventors.

The history of humans using beer to relax cannot be overstated.  Benzodiazepines (the most widely used anti-anxiety medications) were introduced in the year nineteen-sixty.  That’s fifty years ago at the time of this writing.  The reader will smile to understand that beer has been used to combat anxiety for six-thousand years.  It’s practically the first thing mankind invented after the drum.  And tobacco?  Hell, even psychiatrists will prescribe cigarettes under certain circumstances.  It wouldn’t do to die of cancer, of course, but it’s worth mentioning that a shorter, calmer life is presumably preferable to a longer one spent in nausea and depression and punctuated with bouts of impaired judgment and thinking.  If some doctor were to walk up to me and offer me a red pill in one hand and a blue pill in the other, saying, “Here, son, you look a little upset,” I’d say, “Thanks but no thanks, doc.  I’d be much obliged if you could draw me a pint and spare a smoke, though.”

What can I say?  I like my medicines tested — very thoroughly tested.  I don’t trust lab coats much, certainly not like I trust a six-thousand-year-old track record.  Beer may not be the healthiest remedy, but it’s clearly the most honest one.

The reader will understand that I’m talking with my tongue in my cheek, sure, but what I intend to say should be agreeable enough: one ought to take a moment or two to reflect on older, more obvious solutions to problems for which people automatically seek doctors.  Many of today’s treatments are tomorrow’s charlatanery.  If the colloquial terms charlatan or snake oil mean nothing to you, you probably know one and have purchased the other from him.  Beer and cigarettes might sound sinful, but even the pharmaceutical companies responsible for producing “bennies” admit that the judgment and thinking of their patients are hindered.  How many of these people are there?  Fifty years’ worth of patients!  And how many people with anxiety, exactly?  Only forty million.  Makes you think about the actions of the American public a little differently, doesn’t it?  Imagine forty million people walking around, nauseous, depressed, unable to think or make proper decisions — oh, and let’s not forget the rage and violence, either.

What’s sinful now?

Dionysus has a taste of wine with his good friend, Pan, in attendance. These gods were not known to suffer from anxiety.

Pharmaceutical companies say they’ve got a pill to fix “X” problem so long as one can afford to take it daily, but the pill won’t work forever unless its function is a basic one like thinning blood (aspirin) or killing bacteria (penicillin).  Even these two time-honored medical traditions will be impotent against the issues of the year 3000, I’d wager.  Beer and cigarettes?  You bet they’ll be comforting the overworked and underpaid until the sun marries the moon and all the cows actually come home.

So the next time you’re feeling as though a metropolis of responsibility and woe were pressing you into the dirt, and you forget that you’re a surviving hero of four-hundred-thousand years’ worth of genetic and social fine-tuning, consider putting your feet up with a cold one and a cool smoke before you reach for the psych meds.  Humanity still seems alright after all those suds and tobaccy.  Ain’t nobody can tell you what the benzodiazepine eaters are going to look like in six-thousand years.

Cheers, and Happy New Year, Friends!

-Both

  • Copyright Info.

    - All works are Copyright 2007, 2008, 2009, 2010 by BothEyesShut

    "In a Real World, This Would Be Happening"

    All rights reserved.

    - Header concept, photography, and design by Ruben R. Martinez (www.RubenRMartinez.com)
