If we are to examine the idea of choice, then we should begin by defining our terms. I will define choice in the mathematical sense, according to the axiom of choice. The actual statement of the axiom is a bit technical and requires infinity, so we can content ourselves with the following: choice is the ability to select one item from a set of indistinguishable items. Distinguishability has important implications in mathematics, thermodynamics, and quantum mechanics, so it is reasonable to include it in our definition. If the items are distinguishable, then selecting one is a trivial matter, as some factor will favor one choice over the others. It is only when the choices are identical that we truly “choose.”
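For the curious, one standard set-theoretic formulation (there are several equivalent phrasings; this one is included only as a reference) is:

    \forall X \, \Big[ \varnothing \notin X \implies \exists f \colon X \to \bigcup X \ \text{ such that } \ \forall A \in X,\ f(A) \in A \Big]

In words: for any collection X of nonempty sets, some function f picks out exactly one element from each member, even when nothing distinguishes the elements from one another.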

Philosophers may debate whether choice as such exists or not, but we may also ask a practical question about how this ability might arise. For this purpose, let us assume that humans do possess the ability to choose and that most other species do not (or at least not at our level). It would necessarily have evolved, and therefore be of enough benefit to survival to warrant the extreme cost of the human brain. As it must have evolved before the advent of agriculture, it would have been beneficial in the hunter-gatherer lifestyle. When hunting, selecting the weakest, or fattest, or most desirable animal is a fairly simple application of the decision-making power which weighs information and finds the optimum. If a unique optimum can be found, then there is no need for choice. Distinguishability on the basis of some optimizable factor is the difference between a choice and a decision.

Before we proceed I should describe a defensive behavior of prey animals called flocking. There are many benefits to travelling in a large group, especially for protection. One non-obvious benefit is that predators are presented with many nearly interchangeable targets and struggle to commit to any single one. This is known as the predator confusion effect. The effect is heightened when all the individuals look and act identically (flocking behavior). A good example is the zebra, whose stripes make a lone zebra highly visible but turn a herd into a mass of shifting lines, stopping a predator from focusing on one target long enough to kill it. In systems where predator confusion is in effect, prey can be singled out for greatly increased risk by marking them as different with colored dye. How might hunters counter this strategy of indistinguishability? If choice is an ability specifically evolved, and not some freak accident of nature, then I think it possible that it is a counter-strategy to flocking behavior in our prey.

It is sometimes said that when presented with many similar options, choice is paralyzing, but in fact the opposite is true; choice is what allows us to select an option despite it not differing from its fellows in any way. The paralysis comes from attempting to make a decision among distinguishable objects without enough information to know definitively which is optimal. I have had much success clarifying the difference between choices and decisions for friends asking about selecting colleges, careers, romantic partners, etc. When they accept that it is impossible to decide what is “best,” they are free to make an arbitrary choice and continue with their lives.

A few days ago the question arose: who was the smartest person in history? I immediately offered the obvious answer, “the Earl of Sandwich.” To my surprise, some of the people disagreed and there was a heated argument. Let me lay out my case here in full.

Aristotle said, “a man is defined by his actions.” Under this definition the smartest person would be the one who had done the smartest thing. The former holder of the Guinness world record for highest tested IQ is Marilyn vos Savant, who is a role model to all aspiring young girls for doing a very smart thing indeed–marrying a rich man. Has she done anything else? Not really, so let’s move on. One might give a stock answer such as “Einstein” or “Newton,” but remember that they were standing on the shoulders of giants, so their accomplishments were not so much a matter of genius as of good balance. The Earl of Sandwich, on the other hand, invented the sandwich. I am tempted to end the article right here.

However, there are those who, full of contentious urges, would vehemently deny the contribution to all humanity of this grand patron. To silence such blathering fools I will present incontrovertible proof of his surpassing intellect. The first objection they will raise is that the sandwich was invented by some rabbi thousands of years ago. Wrong! The rabbi used unleavened bread and bitter herbs, and everyone choked it down politely while secretly swearing never to eat anything even remotely similar again. Then they will scoffingly disparage his legacy, saying it is not important enough. Wrong! A common idiom for an important invention is “the best thing since sliced bread.” For what purpose would one need sliced bread except to make sandwiches? On the topic of world-changing inventions we might well mention Gutenberg’s printing press. But what is a printing press, really, if not a lead-and-paper sandwich? So we see that the sandwich is the foremost invention after which all others are patterned, and it was the Earl of Sandwich who brought the idea to the modern perfection so beloved today.

The idea has spread to other cultures, which have developed their own particular styles of sandwich. The Japanese, not wanting to discriminate between top and bottom, encased the filling uniformly in breading to form a sushi roll. The Mexicans made a grilled cheese out of tortillas and called it a quesadilla. The people of Philadelphia created possibly the greatest sandwich ever conceived and then proceeded to ruin it by using Cheez Whiz instead of the white cheddar and provolone that a sane person would use. I have received marriage proposals from men and women alike merely by describing a cheesesteak I had concocted. For devising a thing that in its making and eating has given purpose to the lives of both halves of our species, I hereby declare John Montagu, Fourth Earl of Sandwich, the smartest person in human history.

This was another thing I thought was obvious, but when I searched for it I came up with a lot of crap about changing gender roles, which is utterly false. I will begin with a generalization which, like all generalizations, is completely true and fair to all parties. Men are biologically, not culturally, straightforward and direct. We see things in black-and-white; there are never any shades of grey–the primary reason we don’t read that book. The mindset has its advantages in dealing with things that follow a law of excluded middle, such as: this rocket will fly, or it will crash into the ground in a fiery explosion that looks awesome. We understand a situation of that kind and are able to plan to have our cameras ready. Women, on the other hand, are inherently better at situations where the antecedent and the event lie somewhere on a continuum, the exact position differing in the opinion of each observer. These are often the circumstances of human interaction, making men, with their on/off perspective, look like bumbling idiots in various social relationships.

The topic in question is cleaning duties and how they are split according to gender. Let us consider the task of cleaning a bathroom. First one must determine that the bathroom is in need of cleaning, then what should be done to clean it, and when it can be considered clean again. It’s clearly one of those sliding-scale things that men are uneasy with and would prefer to avoid whenever possible. “If it’s dirty today, was it clean yesterday? If so, can I wipe a paper towel around at random and bring it back to the cleanliness of yesterday and be done? If not, then it was dirty yesterday, but she didn’t make a fuss about it then, so why the fuss now? It’s PMS again isn’t it . . .” This is how a man thinks about the clean/unclean dichotomy. So the female ends up cleaning the bathroom and grumbling. She grumbles louder if it’s that time of the month, and even louder if you try to imply that it’s because it’s that time of the month. If grumbling could scrub grout she might not have such a tough time of it. I’m just saying. Men dodge chores not because they are lazy or chauvinistic, but because an ill-defined impetus and goal give them a vague sense of malaise they’d rather ignore.

So what does that leave the men in a household to do? Consider the chore of taking out the trash. There is a clear threshold for action (can full), a clear course of action (empty can), and a clear metric for completion (can empty). Few people would think a half-full garbage can needs emptying, so the potential ambiguity is eliminated. The trash is taken out regularly and the man feels that he has contributed in some significant way to the running of the home, though it’s mostly a lie. This also works with changing lightbulbs (dark/light), microwaving burritos (frozen/edible), and the buying of basic necessities (no beer/beer). Watching the kids poses something of a problem, as the alive/dead metric lacks the sort of nuanced refinement most women desire in the childcare realm.

Sadly this leaves women doing the majority of housework, which is patently unfair. Here’s my advice to women to get their male housemates–boyfriend, roommate, husband, son–to do more. Tie the task to an event calling for immediate action, like reminding him of laundry day by using his socks to set off the smoke detector. You could also toss a match into the three months of accumulated stubble decorating the vanity. And the smoke detector would go off, reminding him to change his socks too. That’s efficiency! Telling him the kitchen floor is dirty is pointless because being dirty is not an event. Refrigerator mold growing eye stalks and growling at the dog is an event, but you won’t elicit a response from him before then no matter how meaningfully you roll your eyes. Women, you’ll never get men to see the world in anything other than binary opposites; it’s not in their chromosomes, so stop trying. If all else fails you can fall back on the age-old paradigm: chores/no sex.

Humans, unlike all species before them, evolved a well-developed mental ability to think in abstract concepts. This most powerful of our evolved traits is the basis of our intelligence. However, it can go awry if, in practice, we deal in symbols and forget what it is they represent. The logical fallacy resulting from this mistake is called reification: treating an abstract symbol as the real thing it represents. The use of abstract symbols condenses ideas and removes them from the concrete reality from which they were derived. The allure of the simplified ideal, and the distance that grows between a repeatedly used symbol and its object, draw humans into the trap. Here are three instances where this occurs and their negative consequences.

Money and Value

The most obvious instance is the symbol/object pair of money/value. Of course reification of money has existed for all of civilization, but it has become increasingly destructive with the advent of electronic banking and trading, as money moves from something in your wallet to pixels on a screen. With funds being recorded electronically and trades being made by computers in microseconds, it is inevitable that we have phenomena like flash crashes in the stock market. Traders are not buying stock based on the real value of the company, but on the speculated monetary value the useless slip of paper will have a few seconds later, seconds during which no new information regarding the company was introduced and yet the price changed. This can only be explained by the reification of money, which can change almost instantly despite the enduring and stable value it supposedly represents. Believing the price up and believing the price down has no cause in reality, but it certainly has an effect. The scariest part is that this arbitrary number we have believed into existence we have also believed into being important enough that millions of people lose their homes when a few people believe it down. It is no wonder our economy operates on “fiat currency.”

Credential and Qualification

Speaking of the economy, economic efficiency depends on matching qualified workers with jobs appropriate to their skill sets. Employers need a symbol which represents the fact that an applicant possesses the desired skill set, that symbol being a credential. However, when employers place too much weight on credentials and fail to check that the credential faithfully indicates the presence of a body of knowledge, they commit the fallacy of reification. When the employer can no longer distinguish between the symbol and the object, the consumer no longer cares to acquire the skill set, only the credential; and the accrediting institution, following the demands of employer and consumer, focuses on churning out credentials rather than knowledge. Eventually such an economic system will fall apart, and the employers, recognizing their mistake, will demand a different kind of symbol, and the process will repeat ad infinitum.

Age and Wisdom

Most cultures (thankfully not modern American culture) impose a hierarchy based on age with the eldest at the top. The logic behind this practice is that a person who is older is likely to have more life experience and therefore be wiser. But wisdom is difficult to detect and so, like the credential, society looks for an easily recognizable symbol to stand for the complex idea. Because of a correlation between the two, age was chosen as the symbol of wisdom. Once it is encoded into a culture, the origin is forgotten and all members are required to “respect their elders” even if those elders have done nothing to deserve respect. Their long lives may be primarily a symptom of luck or other factors out of their control.

The most common example of this fallacy in history has been religious people mistaking metaphorical statements for their literal meaning. When the Christian Bible says that Jesus took a few loaves and fishes and made a meal to feed thousands, it is not referring to a petty magic trick. The passage is a reference to the doctrine that “man does not live by bread alone, but by every word issuing from the mouth of God.” And so Jesus rebuked his disciples who had thought that by “beware the leaven of the Pharisees” he meant the Pharisees might try to poison their food. Those who fall into the same error as the twelve disciples have, like them, earned the biting criticism, “get behind me Satan, for you are not mindful of the things of God but of the things of man.” Perhaps reification is not worthy of eternal punishment; still, a little caution is due so that the machine of our brains does not overpower the rational consciousness of our minds.

With an eye towards appreciating humor, we are going to analyze a few choice XKCD comics. Why? Because researching this article involved reading them for four hours. (A complete aside: because of its bilateral symmetry, I thought for years that XKCD was a sophisticated emoticon, not an unpronounceable string.) As you can imagine, the humor may be lost once the analysis begins. First, we need to briefly treat an elementary theory of humor.

Excluding slapstick, most humor is based on irony. Some types of irony are: verbal irony, language whose literal meaning is contrary to what is intended; situational irony, events proceeding in the opposite manner to what was expected; and dramatic irony, a situation where the audience knows more than the characters. All three have in common the notion of a disparity between what is stated and what is implied. The space between what is shown to the audience and what is known by the audience we will refer to as “the knowledge gap.” It can be either positive (the audience knows more than is shown) or negative (the audience knows less than is shown). Humor can include both, though it is usually positive. Negative knowledge gaps in mystery novels, horror films, and magic tricks tend to make people uneasy, which can be enjoyable, but rarely funny.

A layperson might say that what makes a gag funny is the punchline. An amateur comedian might say that it is the setup. A master comedian would say that it is neither the setup nor the punchline, but the space between them, called “the beat,” which creates humor. Crossing that space, the hang-time of the mental leap, is at the heart of humor, and the larger the gap that can still be crossed, the funnier the gag. Like jumping a canyon on a motorcycle, if the gap is too narrow, it’s unimpressive; if too wide, you slam into the side of the cliff. A cogent analogy would be the bump-set-spike process of volleyball, wherein the flashy, visible spike is merely the follow-through of a well-executed set. The setup and punchline mark out the gap to be crossed and in what manner. Now that we have a primer, let’s jump into the comics.

The Double Meaning

http://xkcd.com/101/

The joke is based on a pun involving the word “miss.” The humor is enhanced in a number of ways by widening the knowledge gap. The phrasing of the pun mimics that found in sappy advertising, to contrast with the morbid nature of the implied meaning. Using “loved ones” creates a much wider disparity between the two meanings of “miss” than a weaker phrase such as “friends,” “husband,” or “neighbors” might, by directly invoking the extremes of love and murder. The use of a laser scope, which is ancillary to the rifle, adds to the gap by making the reader take an extra mental leap to connect laser sights to improved accuracy for killing. Had the box shown a picture of a rifle, the smaller gap would not have been as funny.

The Disconnect

http://xkcd.com/1111/

This joke type is simple: start in one direction, then veer off to the side. The gap is between what is expected, which is set up by a recognizable pattern, and what actually happens. It is intensified by having one character, often the straight man, follow the same train of thought that the reader follows. Sometimes they continue oblivious to the change of direction (“Who’s on First?”), and sometimes they mirror the reader’s reaction, as in this comic. We see here a variant where the straight man catches on before the end and tries to steer the premise back, while the funny man maintains the diverted course, oblivious to the original direction.

The Reveal


http://xkcd.com/655/

This joke technique employs a negative knowledge gap until the very end, when it becomes positive, making for a very wide gap even with soft punchlines. XKCD employs a weak punchline with the visual reveal in the third panel, followed by a soft punchline in the fourth panel which increases what is revealed through dialogue, and therefore widens the gap. Usually the reveal works best as a hard punchline, to make the reversal from negative to positive gap as intense as possible. However, in this case the visual reveal still leaves a negative gap, because we don’t know the relevance of what is revealed until the last panel, and so the partial reveal in the second panel adds to rather than detracts from the humor.

That’s So True!

http://xkcd.com/770/

This common technique in stand-up comedy comes from a knowledge gap that is both negative and positive. It usually involves describing something which is accepted as normal or correct, but about which there is something strange or wrong. The negative gap is how we usually think of the thing, and the positive gap is how the comedian gets us to think of it, with some incisive wisdom or careful observation. As in the reveal, combining negative and positive knowledge makes for a much wider gap and hence more comedy. In this instance the knowledge gap is enhanced by showing not only the gap between ideals and reality, but between the stereotypical male and female views of relationships, respectively. The author has of course chosen the characters’ genders to match the direction of the knowledge gap. Were the joke to be “all the boys,” it would be more confusing than funny because the knowledge gaps would work against each other.

Satire


http://xkcd.com/610/

Satire is unlike a gag in that it does not have a punchline. However, the theory of humor as based on irony still applies. Satire uses the contemporary knowledge of the audience as a setup, to contrast with a feigned ignorance of that knowledge by the subject of the satire. The implied knowledge is almost never stated openly, which makes it difficult to recognize satire, sarcasm, and parody without context. It differs from the previous example in that instead of presenting a view of the self as foolish, it presents a view of the other as foolish. The knowledge gap is all positive, as the audience feels superior in their knowledge while they laugh at the ignorant objects of the satire. The brilliance of this piece of satire is that the subjects are themselves thinking how superior they are to others, and it is this misguided belief that we laugh at. Should we not be careful that as we use satire to feel superior, it is not we who are the objects of the very satire we are laughing at?

Giving a child even a normal name is already a shady proposition, because a name is the thing which most identifies him or her as an individual and yet is attached before he or she is even born. Woe to the child who has the kind of self-important parents who think that a unique (read: stupid) name reflects how special they believe their child to be. First, chances are the child is not special, so an ordinary name can be safely assigned. Second, as noted, the name has nothing to do with the person, and so it is a reflection rather of the parents’ belief in how special they themselves are. Which is to say they are arrogant pricks. A misspelled name is a form of child abuse that ought to be punished by an 8-10 year prison term. Megyn Kelly could easily press charges against her parents, or be excused for kicking them both squarely in the gonads. Megan is spelled M-E-G-A-N; M-E-G-Y-N is an ad campaign for do-it-yourself pap smears. In extreme cases, the humiliation a person suffers every day for 18 years listening to people stumble over the pronunciation of “Aimee” is grounds for execution.

Now that you have determined to give your child a standard, properly spelled name, you have to give some thought to their inevitable nickname. Avoid names like Richard, particularly if the last name is anything untoward. Dick Armey is not a person; it is the sort of thing that appears during the end credits of Superbad.

Even standard names are occasionally inappropriate because of the drift of language over time. Passing on a name like “Cummings” is a hate crime against the people of English-speaking countries. Do not make us snicker every time we have to introduce you; get it changed like a normal person. I imagine the legal proceedings would go something like the following:

Judge: What reason do you have for changing your name?
Mr. Cummings: Because it’s “Cummings.”
Judge: Fair enough. Would you like to have your parents shot while we’re at it?
Mr. Cummings: Yes, yes I would. You’ll have to exhume my father, but I’d like him shot all the same.

It’s true that certain last names are horrid and need not be perpetuated, but really no last name need be. If children always take the last name of one of their parents, the only possibility is that the number of unique last names decreases or stays constant each generation. The common practice of children taking their father’s surname (the sire’s name) ensures that eventually every person on the planet will be Chan, with a few Smiths thrown in for good measure, requiring a fourth name, which some parents have already begun abusing for their own egos’ sake. The children, I mean, they are abusing the children. The politically correct alternative of the hyphenated name is, let’s face it, stupid. Even for one generation it looks stupid; after four generations it’s a family tree hanging off the end of your name like a dingleberry. Instead, let us begin the tradition of moving the middle name to the child’s last name. That way we can get some fresh new ones aired out. Or we could bring back the tradition of naming people by their profession. Goodbye Jane Cooper, hello Jane Stripper. Okay, that’s a bad idea actually.
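For the skeptical, the only-downhill claim about surnames is easy to sanity-check with a toy simulation. The sketch below is purely illustrative; the population model, the function name, and the fake surnames are all made up for the example.

    import random

    def simulate_surnames(population=1000, generations=30, seed=0):
        """Toy model: every child takes the surname of one randomly chosen
        member of the previous generation, and no new surnames are ever
        coined. The count of distinct surnames can therefore only shrink
        or stay the same from one generation to the next."""
        random.seed(seed)
        surnames = ["Name%d" % i for i in range(population)]  # everyone starts unique
        history = [len(set(surnames))]
        for _ in range(generations):
            surnames = [random.choice(surnames) for _ in range(population)]
            history.append(len(set(surnames)))
        return history

    if __name__ == "__main__":
        print(simulate_surnames())  # prints a non-increasing sequence of counts

Run long enough, the toy population ends up sharing a handful of names, which is the everyone-will-be-Chan endgame in miniature.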

The presidential election is gearing up already, and like always the blackish candidate is starting to look a lot like the executive-type candidate when you examine his policies and actions. Technically, Ron Paul is still out there gnoming it up with a platform that is half sensible, realistic ideas and half plans to eliminate the Department of Education, possibly in an effort to stop the Queen from guessing his real name. Every election we are promised change, that something substantial will be different, and every time the reform looks so much like the thing it’s reforming that if you squint hard enough you can almost make out the change fairies with a can of bullshit repainting the same damn thing for the hundredth time. It got me thinking about why substantial change is so hard to come by, why there is such an unbridgeable gap between American ideals and American government. It was like a dream: ephemeral, illusory, transcendent. It began with “if.”

If you believe . . .

. . . a true American does not bow to a king, much less a king of kings . . .

. . . an organization is designated “non-profit” when it meets the standards of a non-profit organization as set forth by law and files the appropriate paperwork, not automatically because it is a religion . . .

. . . the Eighth Amendment banning cruel and unusual punishment from being inflicted would naturally apply to being burned alive in a lake of fire for all eternity . . .

. . . a person has exceptionally poor judgment if they do not recognize as an obvious fraud a cult leader who declares himself a god then tells everyone to sell their property and give him the money . . .

. . . lawmakers are elected by the people and subject to the same laws . . .

. . . questioning the ruling authority is properly called dissent, not heresy or blasphemy, which is protected under the First Amendment and therefore one cannot be punished for it . . .

. . . the majority of Christian churches took the wrong side on women’s rights, civil rights, gay rights, and evolution and so have lost all credibility . . .

. . . an adult who talks to an invisible dead man with magic powers that lives in a kingdom in the sky is partially insane . . .

. . . a person should love their children, their spouse, and their friends, more than Jesus . . .

. . . “acts of god” is as ridiculously inappropriate a phrase as “alien invasion” to put in a legal document . . .

. . . swearing on a bible didn’t help Nixon tell the truth . . .

. . . Muslims have a right under the First Amendment to freely practice their religion and cannot be tortured for refusing to confess Jesus as lord . . .

. . . Gandalf is not your best friend despite reading about him in a book of magic and mythical creatures . . .

. . . freedom of religion does not mean churches can violate the Americans with Disabilities Act by firing a teacher for having narcolepsy . . .

. . . the Earth was formed 4.5 billion years ago from stellar debris . . .

. . . obedience through fear is the coward’s path of least resistance . . .

. . . dying and coming alive again three days later is more like taking a long nap than sacrificing your life . . .

. . . love does not mean agreeing to not hurt someone if they worship you . . .

. . . The Constitution is more important than a 2000 year old collection of fairytales . . .

 

. . . then maybe you don’t want to elect another Christian.

The hippies would have us believe the answer is ‘absolutely nothing,’ and that war is a characteristic of primitive cultures which have not progressed beyond the need for violent conflict. But then, some people believe hippies should be kept away from the public in bong-equipped zoos, drumming for our amusement. Let us pretend that I am one of them. I will attempt to make a positive case for war in both the past and the future.

Historically war has been an agent of change, and there are numerous instances of that change being for the better as a direct result of the war. Wars for independence are an undeniable example; non-violent resistance rarely works against an autocrat. Wars to defend a people from an aggressor are justified. Offensive wars to eliminate repressive regimes can have a beneficial impact on a nation’s inhabitants far outweighing the damage caused by the war. We can agree that war has been useful and necessary in the past. The case for war in the future is more difficult to make, because one must argue that there will be problems impossible to solve any other way. To form this argument I will go further into the past, to the time of proto-human evolution.

In two million years, hominid evolution produced a species whose intelligence far exceeds that of any other. Not only this, it far exceeds what is necessary for its survival. Seemingly necessary. Our bodies are not so different from those of chimpanzees that we could not survive in approximately their mode of living, which requires only chimpanzee-level intelligence. The human brain is immensely expensive in oxygen and calories. It is also highly dangerous, as can be seen from the number of women who died in childbirth prior to the 20th century; nothing comparable happens in other species. There must be an evolutionary benefit to something so costly. What in this world could a proto-human need such intelligence for, unless it were to outsmart other proto-humans? There is a multiplicity of side branches in hominid evolution; none of them still exists. What are the chances that they all died out of natural causes?

The following is pure supposition: I posit that humans evolved intelligence to compete with other humans, and that the ultimate competition is war. Any increase in intelligence, and any product of that intelligence, contributes to the military capability of the population that has it. When this war-born intelligence has no conflict to spend itself on, it creates an internal conflict manifesting in a desire to create art, pursue science, build, invent, produce; all the things which distinguish humans from other species. The hunting instinct that became the war instinct is what drove our evolution towards intelligence and still drives us to be unsatisfied with our current achievements. Even if this feeling does not culminate in war, it is important that it could. To take away the possibility of war it would be necessary to take away the willingness to die or kill, and it is a cheap existence which contains in it nothing worth more than life itself. But I’ll let Steinbeck say it: “fear the time when the bombs stop falling while the bombers live, for every bomb is proof that the spirit has not died . . . fear the time when manself will not suffer and die for a concept, for this one property is the foundation of manself, and this one property is man, distinctive in the universe.” War is the result of this fundamental piece of our character. To remove war we would need to remove the cause. Again, Steinbeck: “Results, not causes; results, not causes.” (ch. 14, The Grapes of Wrath)

However, war is quite destructive, and while we should retain the will to war, we should restrain ourselves in most circumstances. We can keep the war-like character alive without a full war by engaging in mock battles, that is, sports. It is in competitive sports that we exercise, hone, and learn to control our desire to fight.

In a collection of similar species, the predatory ones tend to be the most intelligent. As Larry Niven writes of herbivores, “how much intelligence does it take to sneak up on a leaf?” Alien species that contact our world are often portrayed as war-like for the sake of entertainment. If we assume evolution works similarly on all planets, it is quite likely that this portrayal will be true.


The time has come to celebrate the Winter Solstice and the New (calendar) Year as the Earth passes the same point in its orbit around the Sun. We can well understand and accept that days tied to orbit-specific events should be recognized annually, but what about other events, in particular birthdays and anniversaries? Astrology aside, the date of one’s birth has no meaning and, taken literally, will never happen again, so that a birthday is a once-in-a-lifetime event. Certain popular birthdays could stand to be ignored altogether. The bias in the preceding statement may be due to being stranded in Tucson on December 24th, hungry, with no food and a disturbing paucity of Chinese restaurants. The point is not that we should be done with birthdays for imaginary people only, but for real ones as well.

Similarly, historical events do not recur annually, yet holidays based on them do. For a society of a few hundred years this is no impediment, but consider a society spanning millennia. I believe the chief downfall of the Roman Empire was that by 476 AD every third day remembered some important battle or popular religion from a thousand years previous, and no one could get any work done. When should a holiday be retired? For battles and suchlike things with no further significance, the lifespan of those who lived through them will suffice. Certain things like wars for independence, especially those that end by founding a country, are more enduring in their impact and deserve a longer remembrance; they should be retired when the country or people established by them have become unrecognizable to the original patriots. The Thanksgiving holiday, whose sole purpose is to skip two days of work and eat good food, could be celebrated monthly without protest.

There are precedents for events observed every four years, like the Olympics or the World Cup. If the Olympics were held every year they would lose their significance. Leap years also occur approximately every four years, and half the population would not object to moving Valentine’s Day to February 29th in recognition of Pope Gregory’s revised Julian calendar. Or in recognition of buying chocolate, or pretending not to hate dancing awkwardly, or whatever that holiday’s supposed to be about. Point is, making every holiday annual by default makes no sense.

Upon first meeting someone, and after exchanging monikers, the conversation turns to my occupation. When I say “mathematician” the response is invariably either “wow, that must be hard” or “I’ve always hated math.” Not surprisingly, this was also the case when I was studying English, and my friends in the fields of physics and chemistry have confirmed similar experiences. The effect drops off towards the fields that are perceived as less intellectual (Sociology major? Don’t hurt yourself now . . .), which is to be expected; however, I would like to delve a little deeper into why the reaction is so strong, and in particular, so strongly negative.

We begin with a related phenomenon, that of educated people who admit they don’t like to read. Actually, they won’t admit it outright; they try to phrase it as not having time to read. It would be easy to say that most people are lazy and reading requires mental activity, as opposed to watching television or playing video games. This is simply not true, especially as video games become increasingly involved and are no less intellectually challenging and engaging than books. Couple these facts with the backlash against science and academics and a pattern emerges: it is not thinking folks are averse to–it is learning. Especially the kind of learning that seems like school. Blame rests entirely with our current education system.

Preschool and kindergarten are intended to engender a desire for education, but they clearly fail. The why of their failure rests on two foundations of the current system: patronization and boring instruction. How can learning be boring when humans are naturally curious? To avoid children ever feeling disappointed for not understanding immediately, the curriculum has all challenging material removed and replaced with repetition, so that we are left with a single cause: patronization. Even young children can tell when they are being talked down to, and the condescension continues through to high school, where it is met with resentment and hostility. This is aggravated by the policy of promoting self-esteem over achievement, leaving students emotionally unable to deal with criticism or failure and disgruntled because they feel they have been short-changed. Condescension would be accepted if the teachers demonstrated an expertise in their subject which deserved respect. Unfortunately, even when the teacher is well-qualified, the material and its presentation earn contempt for banality, irrelevance, and unnecessary complications caused by dumbing down the truth. Unlike a fit, disciplined drill instructor who can motivate through abusive language, ill-informed teachers delivering repetitious lessons in their best Mr. Mackey intonation (mmm-kay?) cannot hope to succeed with condescension.

We may now conclude that people avoid scholars and academic pursuits because they have an emotional memory of being patronized instead of informed. Or, in other words, schools have killed any interest in reading.