If we are to examine the idea of choice, then we should begin by defining our terms. I will define choice in the mathematical sense, according to the axiom of choice. The actual statement of the axiom is a bit technical and requires infinity, so we can content ourselves with the following: choice is the ability to select one item from a set of indistinguishable items. Distinguishability has important implications in mathematics, thermodynamics, and quantum mechanics, so it is reasonable to include it in our definition. If the items are distinguishable, then selecting one is a trivial matter, as some factor will favor one choice over the others. It is only when the choices are identical that we truly “choose.”
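
For readers who want the technical statement anyway, here is one standard formulation (a textbook statement, not my own notation): every collection of nonempty sets admits a choice function, i.e., a function that picks one element out of each member set:

$$\forall X \,\Bigl[\, \varnothing \notin X \;\Longrightarrow\; \exists f\colon X \to \bigcup X \;\; \forall A \in X \,\bigl(f(A) \in A\bigr) \Bigr]$$

Nothing in the formula says how $f$ does the picking, and that is precisely the point: when the elements of a set are indistinguishable, no describable rule can make the selection for you.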

Philosophers may debate whether choice as such exists, but we may also ask a practical question about how this ability might arise. For this purpose, let us assume that humans do possess the ability to choose and that most other species do not (or at least not at our level). It would necessarily have evolved, and therefore be of enough benefit to survival to warrant the extreme cost of the human brain. As it must have evolved before the advent of agriculture, it would have been beneficial in the hunter-gatherer lifestyle. When hunting, selecting the weakest, fattest, or otherwise most desirable animal is a fairly simple application of the decision-making faculty, which weighs information and finds the optimum. If a unique optimum can be found, then there is no need for choice. Distinguishability on the basis of some optimizable factor is the difference between a choice and a decision.

Before we proceed, I should describe a defensive behavior of prey animals called flocking. There are many benefits to traveling in a large group, especially for protection. One non-obvious benefit is that a predator faced with many similar targets has difficulty singling out any one of them to attack. This is known as the predator confusion effect. The effect can be heightened if all the individuals look and act identically (flocking behavior). A good example is the zebra: its stripes make a lone zebra highly visible, but they make a herd of zebras a mass of shifting lines, stopping a predator from focusing on a target long enough to kill it. In systems where predator confusion is in effect, an individual can be singled out for greatly increased risk simply by marking it as different with colored dye. How might hunters counter this strategy of indistinguishability? If choice is an ability specifically evolved, and not some freak accident of nature, then I think it possible that it is a counter-strategy to flocking behavior in our prey.

It is sometimes said that when we are presented with many similar options, choice is paralyzing, but in fact the opposite is true: choice is what allows us to select an option despite its not differing from its fellows in any way. The paralysis comes from attempting to make a decision among distinguishable objects without enough information to know definitively which is optimal. I have had much success explaining the difference between choices and decisions to friends asking about selecting colleges, careers, romantic partners, etc. When they accept that it is impossible to decide what is “best,” they are free to make an arbitrary choice and get on with their lives.

Racial discrimination is an awful thing. At this point, I think most of the enlightened world would agree that society would benefit from the elimination of stereotypes. But while they exist, it may not be so bad to take advantage of the ones that work in your favor.

Although “positive” stereotypes are detrimental to the racial group as a whole because they do not reflect real variation among individuals, they can be used to your advantage when you’re the beneficiary. If you’re a model minority and someone assumes that you are smart and industrious, roll with it. If you’re not actually intelligent and hard-working, this just makes it easier to fake it like everyone else does. This is especially true of the assumption that anything non-white is more “authentic.” This can apply to any aspect of culture, but if you have cultural authority based on the color of your skin, then use that empowerment to better spread your message.

Much has been written about 12 Years a Slave winning the Oscar for Best Picture because of white guilt. Two Academy voters even went as far as voting for it without seeing it because of its “social relevance.” Ellen DeGeneres may have joked that there were only two scenarios (either 12 Years a Slave would win, or the Academy is made up of racists), but she touched on the real issue of how much race played into the decision. “Social relevance” may just be a euphemism for white guilt.

Now here I’ll make a distinction between white guilt and affirmative action. Ostensibly, affirmative action is now driven by a goal of diversity, and less by a desire to rectify past wrongdoing. White guilt is a response that people of European descent feel towards minorities because of a history of imperialism, including institutionalized slavery. Although a college admissions officer may be inclined to admit the black candidate because her ancestors were slaveholders and she is correcting for some cross-generational moral deviance, her official line would be that the black candidate would increase the college’s “diversity.”

I won’t proffer an opinion on affirmative action here; that would take up an entire post, if not the entire blog. However, I will say that white guilt is completely fair game. As a minority, you don’t know what kinds of unspoken biases are held against you. It is safe to say that most of those stereotypes are more detrimental than beneficial. You can work to subvert that racism because you know that it is wrong on a societal level, but you can also smartly use it to your advantage when necessary. So when a white person offers you some sort of concession because of some historical event that likely didn’t directly affect either of you, go ahead and grab it!

Lastly, if you are a white reader, don’t let white privilege blind you any more than pernicious racial biases do. Sometimes an Asian person is just not that smart; sometimes a black person is just not that athletic. And sometimes a movie about slavery is just not that great.

*NB: This also applies to Jews, of course.

These days, you step away from the Internet for a moment and suddenly you’re behind on current events. Most recently, I went dark for twenty minutes, and when I came back on Twitter, Nelson Mandela had died. Pair this with the tragic death of Paul Walker just days earlier, and the Internet cried.

I know vaguely who Nelson Mandela was. I never studied apartheid or much of South African history in school. I didn’t even watch Invictus. What I do know about Mandela is cursory. He fought against racial discrimination and was willing to go to jail for 27 years for his beliefs. He was President of South Africa from 1994 to 1999. In short, most of his accomplishments happened either before I was born or during my formative years. While I can’t say that I didn’t benefit from him making the world a better place, his impact on my life is relatively indirect and minimal.

I have seen every Fast and Furious movie with Paul Walker. Those movies are likely Walker’s greatest legacy. That alone has meant that Paul Walker has had a more direct impact on my life than Mandela. Comparing the legacies of these two individuals is an idiotic task, but if I were to post a RIP to either of them, it would be about Walker. Mandela led an extraordinary life with huge accomplishments, but he died peacefully at the age of 95 and had been mostly out of the public eye since 2004. Walker died at 40, and while he didn’t leave us with many meaningful roles, he still had a potentially long career ahead of him.

Looking at my Twitter and Facebook feeds, I see plenty of my peers posting messages of mourning for Mandela. I suppose it would look rather shallow to grieve over the loss of Walker but say nothing of Mandela, who, by almost any definition, had a greater impact on the world. However, your Facebook friends and Twitter followers aren’t relying on you for world news updates. Your social media presence should be personal and relate to how the things you post affect you (although please, we don’t care what you ate for lunch).

I don’t doubt that for some people Mandela was a source of personal inspiration, and that his life was meaningful to them on a personal level. But for the majority of people expressing their sorrow, I hesitate to assume that they even know what apartheid was. This is reflected in the awful tributes to Morgan Freeman that people have put up, and in the tasteless comparisons of Walker and Mandela, like the picture at the start of this post. Just because someone important died doesn’t mean the rest of the world needs to know that you were vaguely aware of that person’s significance.

A few days ago the question arose: who was the smartest person in history? I immediately offered the obvious answer, “the Earl of Sandwich.” To my surprise, some people disagreed, and there was a heated argument. Let me lay out my case here in full.

Aristotle said, “a man is defined by his actions.” Under this definition the smartest person would be the one who had done the smartest thing. The former holder of the Guinness world record for highest tested IQ is Marilyn vos Savant, who is a role model to all aspiring young girls for doing a very smart thing indeed: marrying a rich man. Has she done anything else? Not really, so let’s move on. One might give a stock answer such as “Einstein” or “Newton,” but remember that they were standing on the shoulders of giants, so their accomplishments were not so much a matter of genius as of good balance. The Earl of Sandwich, on the other hand, invented the sandwich. I am tempted to end the article right here.

However, there are those who, full of contentious urges, would vehemently deny the contribution of this grand patron to all humanity. To silence such blathering fools I will present incontrovertible proof of his surpassing intellect. The first objection they will raise is that the sandwich was invented by some rabbi thousands of years ago. Wrong! The rabbi used unleavened bread and bitter herbs, and everyone choked it down politely while secretly swearing never to eat anything even remotely similar again. Then they will scoffingly impugn his legacy, saying the sandwich is not important enough. Wrong! A common idiom for an important invention is “the best thing since sliced bread.” For what purpose would one need sliced bread except to make sandwiches? On the topic of world-changing inventions we might well mention Gutenberg’s printing press. But what is a printing press, really, if not a lead-and-paper sandwich? So we see that the sandwich is the foremost invention, after which all others are patterned, and it was the Earl of Sandwich who brought the idea into the modern perfection so beloved today.

The idea has spread to other cultures, which have developed their own particular styles of sandwich. The Japanese, not wanting to discriminate between top and bottom, encased the filling uniformly in breading to form the sushi roll. The Mexicans made a grilled cheese out of tortillas and called it a quesadilla. The people of Philadelphia created possibly the greatest sandwich ever conceived and then proceeded to ruin it by using Cheez Whiz instead of the white cheddar and provolone that a sane person would use. I have received marriage proposals from men and women alike merely by describing a cheesesteak I had concocted. For devising a thing whose making and eating have given purpose to the lives of both halves of our species, I hereby declare John Montagu, Fourth Earl of Sandwich, the smartest person in human history.

This was another thing I thought was obvious, but when I searched for it, I came up with a lot of crap about changing gender roles, which is utterly false. I will begin with a generalization which, like all generalizations, is completely true and fair to all parties. Men are biologically, not culturally, straightforward and direct. We see things in black and white; there are never any shades of grey (the primary reason we don’t read that book). This mindset has its advantages in dealing with things that follow a law of the excluded middle, such as: this rocket will fly, or it will crash into the ground in a fiery explosion that looks awesome. We understand a situation of that kind and are able to plan to have our cameras ready. Women, on the other hand, are inherently better at situations where the antecedent and the event lie somewhere on a continuum, the exact position differing in the opinion of each observer. These are often the circumstances of human interaction, making men, with their on/off perspective, look like bumbling idiots in various social relationships.

The topic in question is cleaning duties and how they are split according to gender. Let us consider the task of cleaning a bathroom. First one must determine that the bathroom is in need of cleaning, then what should be done to clean it, and finally when it can be considered clean again. It’s clearly one of those sliding-scale things that men are uneasy with and would prefer to avoid whenever possible. “If it’s dirty today, was it clean yesterday? If so, can I wipe a paper towel around at random, bring it back to the cleanliness of yesterday, and be done? If not, then it was dirty yesterday, but she didn’t make a fuss about it then, so why the fuss now? It’s PMS again, isn’t it . . .” This is how a man thinks about the clean/unclean dichotomy. So the woman ends up cleaning the bathroom and grumbling. She grumbles louder if it’s that time of the month, and even louder if you try to imply that it’s because it’s that time of the month. If grumbling could scrub grout, they might not have such a tough time of it. I’m just saying. Men dodge chores not because they are lazy or chauvinistic, but because an ill-defined impetus and goal give them a vague sense of malaise they’d rather ignore.

So what does that leave the men in a household to do? Consider the chore of taking out the trash. There is a clear threshold for action (can full), a clear course of action (empty can), and a clear metric for completion (can empty). Few people would think a half-full garbage can needs emptying, so the potential ambiguity is eliminated. The trash is taken out regularly and the man feels that he has contributed in some significant way to the running of the home, though it’s mostly a lie. This also works with changing lightbulbs (dark/light), microwaving burritos (frozen/edible), and the buying of basic necessities (no beer/beer). Watching the kids poses something of a problem, as the alive/dead metric lacks the sort of nuanced refinement most women desire in the childcare realm.
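
Since the male decision procedure is allegedly pure Boolean logic, it can be written down in full. Here is a tongue-in-cheek minimal sketch; every chore, trigger, and action in it is hypothetical, invented purely to illustrate the trigger/action/completion model described above.

```python
# Hypothetical chore model, purely for illustration: a chore is actionable
# only when it exposes a clean binary trigger; sliding-scale chores have
# no trigger at all (None) and produce only a vague sense of malaise.

CHORES = {
    # chore: (binary trigger observed?, course of action)
    "take out trash": (True, "empty can"),          # can full -> can empty
    "change lightbulb": (True, "insert new bulb"),  # dark -> light
    "buy necessities": (True, "buy beer"),          # no beer -> beer
    "clean bathroom": (None, "???"),                # dirty-ish? no threshold
}

for chore, (trigger, action) in CHORES.items():
    if trigger is None:
        print(f"{chore}: ill-defined impetus, ignoring with vague malaise")
    elif trigger:
        print(f"{chore}: threshold met -> {action} -> done, feel heroic")
```

Note that the bathroom never gets cleaned. That is the bug, and also the point.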

Sadly, this leaves women doing the majority of housework, which is patently unfair. Here’s my advice to women for getting their male housemates (boyfriend, roommate, husband, son) to do more. Have the need for a task to be accomplished be triggered by an event calling for immediate action, like reminding him of laundry day by using his socks to set off the smoke detector. You could also toss a match into the three months of accumulated stubble decorating the vanity. And the smoke detector would go off, reminding him to change his socks too. That’s efficiency! Telling him the kitchen floor is dirty is pointless because being dirty is not an event. Refrigerator mold growing eye stalks and growling at the dog is an event, but you won’t elicit a response from him before then, no matter how meaningfully you roll your eyes. Women, you’ll never get men to see the world in anything other than binary opposites; it’s not in their chromosomes, so stop trying. If all else fails you can fall back on the age-old paradigm: chores/no sex.

Humans, unlike all other species before us, evolved a well-developed mental ability to think in abstract concepts. This most powerful of our evolved traits is the basis of our intelligence. However, it can go awry if in practice we deal in symbols and forget what it is they represent. The logical fallacy resulting from this mistake is called reification: treating an abstract symbol as the real thing it represents. The use of abstract symbols condenses ideas and removes them from the concrete reality from which they were derived. The allure of the simplified ideal, and the lengthening distance between a repeatedly used symbol and its object, draw humans into the trap. Here are three instances where this occurs, and their negative consequences.

Money and Value

The most obvious instance is the symbol/object pair of money/value. Of course, reification of money has existed for all of civilization, but it has become increasingly destructive with the advent of electronic banking and trading, as money moves from something in your wallet to pixels on a screen. With funds recorded electronically and trades made by computers in microseconds, it is inevitable that we have phenomena like flash crashes in the stock market. Traders are not buying stock based on the real value of the company, but on the speculated monetary value the useless slip of paper will have a few seconds later; in that interval no new information regarding the company is introduced, and yet the price changes. This can only be explained by the reification of money, which can be changed almost instantly despite the enduring and stable value it represents. Believing the price up and believing the price down has no cause in reality, but it certainly has an effect. The scariest part is that this arbitrary number, which we have believed into existence, we have also believed into being important enough that millions of people lose their homes when a few people believe it down. It is no wonder our economy operates on “fiat currency.”

Credential and Qualification

Speaking of the economy, economic efficiency depends on matching qualified workers with jobs appropriate to their skill sets. Employers need a symbol representing the fact that an applicant possesses the desired skill set; that symbol is a credential. However, when employers place too much weight on credentials and fail to check that the credential faithfully indicates the presence of a body of knowledge, they commit the fallacy of reification. When the employer can no longer distinguish between the symbol and the object, the consumer no longer cares to acquire the skill set, only the credential; and the accrediting institution, following the demands of employer and consumer, focuses on churning out credentials rather than knowledge. Eventually such an economic system falls apart, and the employers, recognizing their mistake, demand a different kind of symbol, and the process repeats ad infinitum.

Age and Wisdom

Most cultures (thankfully not modern American culture) impose a hierarchy based on age, with the eldest at the top. The logic behind this practice is that a person who is older is likely to have more life experience and therefore be wiser. But wisdom is difficult to detect, and so, like the credential, society looks for an easily recognizable symbol to stand for the complex idea. Because of the correlation between the two, age was chosen as the symbol of wisdom. Once it is encoded into a culture, the origin is forgotten, and all members are required to “respect their elders” even if those elders have done nothing to deserve respect. Their long lives may be primarily a symptom of luck or of other factors outside their control.

The most common example of this fallacy in history has been religious people mistaking metaphorical statements for their literal meaning. When the Christian Bible says that Jesus took a few loaves and fishes and made a meal to feed thousands, it is not referring to a petty magic trick. The passage is a reference to the doctrine that “man does not live by bread alone, but by every word issuing from the mouth of God.” And so Jesus rebuked his disciples, who had thought that by “beware the leaven of the Pharisees” he meant the Pharisees might try to poison their food. Those who fall into the same error as the twelve disciples have, like them, earned the biting criticism, “get behind me Satan, for you are not mindful of the things of God but of the things of man.” Perhaps reification is not worthy of eternal punishment; still, a little caution is due so that the machine of our brains does not overpower the rational consciousness of our minds.

With an eye towards appreciating humor, we are going to analyze a few choice XKCD comics. Why? Because researching this article involved reading them for four hours. (A complete aside: because of its bilateral symmetry, I for years thought XKCD was a sophisticated emoticon, not an unphonetic string.) As you can imagine, the humor may be lost once the analysis begins. First, we need to briefly treat an elementary theory of humor.

Excluding slapstick, most humor is based on irony. Some types of irony are: verbal irony, language whose literal meaning is contrary to what is intended; situational irony, events proceeding in a manner opposite to what was expected; and dramatic irony, a situation where the audience knows more than the characters. All three have in common the notion of a disparity between what is stated and what is implied. The space between what is shown to the audience and what is known by the audience we will refer to as “the knowledge gap.” It can be either positive (the audience knows more than is shown) or negative (the audience knows less than is shown). Humor can include both, though it usually relies on the positive kind. Negative knowledge gaps, as in mystery novels, horror films, and magic tricks, tend to make people uneasy, which can be enjoyable but is rarely funny.

A layperson might say that what makes a gag funny is the punchline. An amateur comedian might say that it is the setup. A master comedian would say that it is neither the setup nor the punchline, but the space between them, called “the beat,” that creates humor. Crossing that space, the hang-time of the mental leap, is at the heart of humor, and the larger the gap that can be crossed, the funnier the gag. Like jumping a canyon on a motorcycle: if the gap is too narrow, it’s unimpressive; if too wide, you slam into the side of a cliff. A cogent analogy is the bump-set-spike process of volleyball, wherein the flashy, visible spike is merely the follow-through of a well-executed set. The setup and punchline mark out the gap to be crossed and in what manner. Now that we have a primer, let’s jump into the comics.

The Double Meaning

http://xkcd.com/101/

The joke is based on a pun involving the word “miss.” The humor is enhanced in a number of ways by widening the knowledge gap. The phrasing of the pun mimics that found in sappy advertising, to contrast with the morbid nature of the implied meaning. Using “loved ones” creates a much wider disparity between the two meanings of “miss” by directly invoking the extremes of love and murder, as a weaker phrase such as “friends,” “husband,” or “neighbors” would not. The use of a laser scope, which is ancillary to the rifle, adds to the gap by making the reader take an extra mental leap to connect laser sights to improved accuracy for killing. Had the box had a picture of a rifle, the smaller gap would not be as funny.

The Disconnect

http://xkcd.com/1111/

This joke type is simple: start in one direction, then veer off to the side. The gap is between what is expected, which is set up by a recognizable pattern, and what actually happens. It is intensified by having one character, often the straight man, follow the expected train of thought along with the reader. Sometimes the straight man continues oblivious to the change of direction (“Who’s on First?”), and sometimes he mirrors the reader’s reaction, as in this comic. We see here a variant where the straight man catches on before the end and tries to steer the premise back, while the funny man maintains the diverted course, oblivious to the original direction.

The Reveal

http://xkcd.com/655/

This joke technique employs a negative knowledge gap until the very end, when it becomes positive, making for a very wide gap even with soft punchlines. XKCD employs a weak punchline with the visual reveal in the third panel, followed by a soft punchline in the fourth panel, which uses dialogue to increase what is revealed, and therefore the gap. Usually the reveal works best as a hard punchline, making the reversal from negative to positive gap as intense as possible. In this case, however, the visual reveal still leaves a negative gap, because we don’t know the relevance of what is revealed until the last panel, and so the partial reveal in the second panel adds to rather than detracts from the humor.

That’s So True!

http://xkcd.com/770/

This common technique in stand-up comedy comes from a knowledge gap that is both negative and positive. It usually involves describing something which is accepted as normal or correct, but about which there is something strange or wrong. The negative gap is how we usually think of the thing, and the positive gap is how the comedian gets us to think of it, with some incisive wisdom or careful observation. As in the reveal, combining negative and positive knowledge makes for a much wider gap and hence more comedy. In this instance the knowledge gap is enhanced by showing not only the gap between ideals and reality, but also the gap between the stereotypical male and female views of relationships. The author has of course chosen the characters’ genders to match the direction of the knowledge gap. Were the joke to be “all the boys,” it would be more confusing than funny because the knowledge gaps would work against each other.

Satire

http://xkcd.com/610/

Satire is unlike a gag in that it does not have a punchline. However, the theory of humor as based on irony still applies. Satire uses the contemporary knowledge of the audience as a setup, contrasting it with a feigned ignorance of that knowledge by the subject of the satire. The implied knowledge is almost never stated openly, which makes it difficult to recognize satire, sarcasm, and parody without context. It differs from the previous example in that instead of presenting a view of the self as foolish, it presents a view of the other as foolish. The knowledge gap is all positive, as the audience feels secure in their superior knowledge while they laugh at the ignorant objects of the satire. The brilliance of this piece of satire is that the subjects are themselves thinking how superior they are to others, and it is this misguided belief that we laugh at. Should we not be careful that as we use satire to feel superior, it is not we who are the objects of the very satire we are laughing at?

Giving a child a normal name is already a shady proposition, because a name is the thing which most identifies a person as an individual and yet is attached to them before they are even born. Woe to the child who has the kind of self-important parents who think that a unique (read: stupid) name reflects how special they believe their child to be. First, chances are the child is not special and can safely be assigned an ordinary name. Second, as noted, the name has nothing to do with the person, and so it is rather a reflection of the parents’ belief in how special they themselves are. Which is to say, they are arrogant pricks. A misspelled name is a form of child abuse that ought to be punished by an 8-10 year prison term. Megyn Kelly could easily press charges against her parents, or be excused for kicking them both squarely in the gonads. Megan is spelled M-E-G-A-N; M-E-G-Y-N is an ad campaign for do-it-yourself pap smears. In extreme cases, the humiliation a person suffers every day for 18 years listening to people stumble over the pronunciation of “Aimee” is grounds for execution.

Now that you have determined to give your child a standard, properly spelled name, you have to give some thought to the inevitable nickname. Avoid names like Richard, particularly if the last name is anything untoward. Dick Armey is not a person; it is the sort of thing that appears during the end credits of Superbad.

Even standard names are occasionally inappropriate because of the drift of language over time. Passing on a name like “Cummings” is a hate crime against the people of English-speaking countries. Do not make us snicker every time we have to introduce you; get it changed like a normal person. I imagine the legal proceedings would go something like this:

Judge: What reason do you have for changing your name?
Mr. Cummings: Because it’s “Cummings.”
Judge: Fair enough. Would you like to have your parents shot while we’re at it?
Mr. Cummings: Yes, yes I would. You’ll have to exhume my father, but I’d like him shot all the same.

It’s true that certain last names are horrid and need not be perpetuated, but no last name really need be. If children always take a last name from one of their parents, the only possibility is that the number of unique last names decreases or stays constant each generation (the little simulation below illustrates how quickly the pool shrinks). The common practice of children taking their father’s surname (the sire’s name) ensures that eventually every person on the planet will be Chan, with a few Smiths thrown in for good measure, requiring a fourth name, which some parents have already begun abusing for their own egos’ sake. The children, I mean; they are abusing the children. The politically correct alternative of the hyphenated name is, let’s face it, stupid. Even for one generation it looks stupid; after four generations it’s a family tree hanging off the end of your name like a dingleberry. Instead, let us begin the tradition of moving the middle name to the child’s last name. That way we can get some fresh new ones aired out. Or we could bring back the tradition of naming people by their profession. Goodbye Jane Cooper, hello Jane Stripper. Okay, that’s a bad idea actually.
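
To see how fast the name pool shrinks, here is a minimal simulation sketch under toy assumptions of my own invention: a fixed-size population in which every child copies the surname of one randomly chosen member of the previous generation, so names can be lost but never created.

```python
import random

def surviving_surnames(num_people: int = 1000, generations: int = 50) -> int:
    """Count unique surnames left after simulating inheritance.

    Toy model: each generation, every child takes the surname of a
    random member of the previous generation. A name disappears forever
    when nobody happens to pass it on; no new names ever enter the pool.
    """
    surnames = list(range(num_people))  # start with all-unique surnames
    for _ in range(generations):
        surnames = [random.choice(surnames) for _ in range(num_people)]
    return len(set(surnames))

random.seed(0)
print(surviving_surnames())  # far fewer than the original 1000 remain
```

Crank up the number of generations and the count keeps drifting toward one, which is the all-Chan endgame lamented above.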

I’m typically not a fan of gender-neutral pronouns like “s/he” because they are awkward to use. I’m fine with using male pronouns by default. That is because I use “he” as just a default gender-ambiguous pronoun. I ascribe no gender biases when I say “he.” I can just as easily refer to a hypothetical corporate CEO as “he” as I do a hypothetical emotionally unavailable spouse.

Once upon a time, “he” was the acceptable gender-unknown pronoun by default. With the rise of the women’s movement, “he” fell by the wayside in favor of “she,” to be more inclusive of women. This is especially the case when referring to professions and other formerly male-dominated roles. For example, it seems more common these days to refer to a gender-unknown judge, police officer, accountant, or astronaut as “she.” Even in industries in which men are still statistically overrepresented, gender politics have seeped through. I acknowledge the arguments that a default masculine assignation might perpetuate glass ceilings and stereotypes of female versus male gender roles, especially in the workplace. If we always refer to positions of power in male terms, we associate men with those roles.

Therefore, I accept using “she” as the gender-unknown pronoun, since we don’t objectify people as “it” (even though that would solve all these problems) and “they” is grammatically incorrect. But if it is now more politically correct to always use female pronouns when the gender is unknown, that should always be the case, even when using “she” perpetuates negative female stereotypes. Case in point, from a New York family law outline regarding divorce for constructive abandonment:

“The willful, continued, and unjustified refusal to engage in intimate relations with a spouse for one year or more may constitute constructive abandonment [as grounds for unilateral divorce]…. In cases in which the parties have not engaged in intimate acts for a period of one year or more, the plaintiff must establish that he repeatedly requested resumption of the marital relations.”

The poor pleading husband who doesn’t get laid often enough has grounds for divorce. I haven’t looked at the actual statute, but I’m assuming that a wife who has not had sex in more than a year can also seek divorce. Would it have been confusing to use “she” in that instance? I’m guessing that some students would’ve done a double-take if it had read “she.” It wouldn’t have been as clear precisely because people assume it’s the husband who has “repeatedly requested resumption of the marital relations.”

After establishing that it is men who aren’t sexually fulfilled in marriage, the outline continues to paint them as deadbeats:

“Spousal support is the obligation of one party to provide the other support….It is awarded in a divorce if one spouse cannot provide for his own needs with employment.”

In this example, the former husband is the one who cannot provide for himself. So we’re not going to perpetuate the stereotype that men are the breadwinners, but we are going to assume that it’s men who beg their wives for sex? This is the problem with picking and choosing which gender-unknown pronoun you want to use. The point of a default is that it is supposed to be free from any bias. If we’re all going to use “she” from now on, then that should apply across the board, even when it reinforces negative stereotypes of women, e.g., the divorced housewife. Simply reversing the male and female, reserving “she” for positive associations and using “he” for negative ones, only reflects the biases of the speaker. Guy Kawasaki, in The Art of the Start, puts it best: “If only defeating sexism were as simple as throwing in an occasional he/she, she, her, or hers. I use the masculine pronouns merely as a shortcut. Successful entrepreneurship is blind to gender. Don’t look for sexism where none exists.”

The presidential election is gearing up already, and as always the blackish candidate is starting to look a lot like the executive-type candidate when you examine his policies and actions. Technically, Ron Paul is still out there gnoming it up with a platform that is half sensible, realistic ideas and half plans to eliminate the Department of Education, possibly in an effort to stop the Queen from guessing his real name. Every election we are promised change, that something substantial will be different, and every time the reform looks so much like the thing it’s reforming that if you squint hard enough you can almost make out the change fairies with a can of bullshit, repainting the same damn thing for the hundredth time. It got me thinking about why substantial change is so hard to come by, why there is such an unbridgeable gap between American ideals and American government. It was like a dream: ephemeral, illusory, transcendent. It began with “if.”

If you believe . . .

. . . a true American does not bow to a king, much less a king of kings . . .

. . . an organization is designated “non-profit” when it meets the standards of a non-profit organization as set forth by law and files the appropriate paperwork, not automatically because it is a religion . . .

. . . the Eighth Amendment’s ban on the infliction of cruel and unusual punishment would naturally apply to being burned alive in a lake of fire for all eternity . . .

. . . a person has exceptionally poor judgment if they do not recognize as an obvious fraud a cult leader who declares himself a god then tells everyone to sell their property and give him the money . . .

. . . lawmakers are elected by the people and subject to the same laws . . .

. . . questioning the ruling authority is properly called dissent, not heresy or blasphemy, and is protected under the First Amendment, so one cannot be punished for it . . .

. . . the majority of Christian churches took the wrong side on women’s rights, civil rights, gay rights, and evolution and so have lost all credibility . . .

. . . an adult who talks to an invisible dead man with magic powers that lives in a kingdom in the sky is partially insane . . .

. . . a person should love their children, their spouse, and their friends, more than Jesus . . .

. . . “acts of god” is as ridiculously inappropriate a phrase as “alien invasion” to put in a legal document . . .

. . . swearing on a bible didn’t help Nixon tell the truth . . .

. . . Muslims have a right under the First Amendment to freely practice their religion and cannot be tortured for refusing to confess Jesus as lord . . .

. . . Gandalf is not your best friend despite reading about him in a book of magic and mythical creatures . . .

. . . freedom of religion does not mean churches can violate the Americans with Disabilities Act by firing a teacher for having narcolepsy . . .

. . . the Earth was formed 4.5 billion years ago from stellar debris . . .

. . . obedience through fear is the coward’s path of least resistance . . .

. . . dying and coming alive again three days later is more like taking a long nap than sacrificing your life . . .

. . . love does not mean agreeing to not hurt someone if they worship you . . .

. . . The Constitution is more important than a 2,000-year-old collection of fairy tales . . .

. . . then maybe you don’t want to elect another Christian.