Racial discrimination is an awful thing. At this point, I think most of the enlightened world would agree that society would benefit from the elimination of stereotypes. But while they exist, it may not be so bad to take advantage of the ones that work in your favor.

Although “positive” stereotypes are detrimental to the racial group as a whole because they do not reflect real differences among individuals, they can be used to your advantage when you’re the beneficiary. If you’re a model minority and someone assumes that you are smart and industrious, roll with it. If you’re not actually intelligent and hard-working, this just makes it easier to fake it like everyone else does. The same goes for the assumption that anything non-white is more “authentic.” This can apply to any aspect of culture, but if you have cultural authority based on the color of your skin, then use that empowerment to better spread your message.

Much has been written about Twelve Years a Slave winning the Oscar for Best Picture because of white guilt. Two Academy voters even went as far as voting for it without seeing it because of its “social relevance.” Ellen DeGeneres may have joked that there were two scenarios–either Twelve Years a Slave would win or the Academy is made up of racists–but she touched on the real issue of how much race played into the decision. “Social relevance” may just be a euphemism for white guilt.

Now here I’ll make a distinction between white guilt and affirmative action. Ostensibly, affirmative action is driven by a goal of diversity, and less so these days by a desire to rectify past wrongdoings. White guilt is a response that European descendants feel towards minorities because of a history of imperialism, including institutionalized slavery. Although a college admissions officer may be inclined to admit a black candidate because her own ancestors were slaveholders and she is correcting for some cross-generational moral deviance, her official line would be that the black candidate would increase the college’s “diversity.”

I won’t proffer an opinion on affirmative action here; that would take up an entire post, if not the entire blog. However, I will say that white guilt is completely fair game. As a minority, you don’t know what kinds of unspoken biases are held against you. It’s safe to say that most of those stereotypes are more detrimental than beneficial. You can work to subvert that racism because you know that it is wrong on a societal level, but you can also smartly use it to your advantage when necessary. So when a white person offers you some sort of concession because of some historical event that likely didn’t directly affect either of you, go ahead and grab it!

Lastly, if you are a white reader, don’t let white privilege blind you any more than pernicious racial biases do. Sometimes an Asian person is just not that smart; sometimes a black person is just not that athletic. And sometimes a movie about slavery is just not that great.

*NB: This also applies to Jews, of course.

These days, you step away from the Internet for a moment and suddenly you’re behind on current events. Most recently, I went dark for twenty minutes, and when I came back to Twitter, Nelson Mandela had died. Pair this with the tragic death of Paul Walker just days earlier, and the Internet cried.

I know vaguely who Nelson Mandela was. I never studied apartheid or much of South African history in school. I didn’t even watch Invictus. What I do know about Mandela is cursory. He fought against racial discrimination and was willing to go to jail for 27 years for his beliefs. He was President of South Africa from 1994-1999. In short, most of his accomplishments happened either before I was born or during my formative years. While I can’t say that I didn’t benefit from him making the world a better place, his impact on my life is relatively indirect and minimal.

I have seen every Fast and Furious movie with Paul Walker. Those movies are likely Walker’s greatest legacy. That alone has meant that Paul Walker has had a more direct impact on my life than Mandela. Comparing the legacies of these two individuals is an idiotic task, but if I were to post a RIP to either of them, it would be about Walker. Mandela led an extraordinary life with huge accomplishments, but he died peacefully at the age of 95 and had been mostly out of the public light since 2004. Walker died at 40, and while he didn’t leave us with many meaningful roles, he still had a potentially long career ahead of him.

Looking at my Twitter and Facebook feeds, I see plenty of my peers posting messages of mourning for Mandela. I suppose it would look rather shallow if you grieved over the loss of Walker but said nothing of Mandela, who, by almost any definition, had a greater impact on the world. However, your Facebook friends and Twitter followers aren’t relying on you for world news updates. Your social media presence should be personal and relate to how the things you post affect you (although please, we don’t care what you ate for lunch).

I don’t doubt that for some people, Mandela was a source of personal inspiration and his life was meaningful on a personal level. But for the majority of people expressing their sorrow, I hesitate to assume that they even know what apartheid was. This is reflected by the awful tributes to Morgan Freeman that people have put up, or the tasteless comparisons of Walker and Mandela, like the picture at the start of this post. Just because someone important died doesn’t mean the rest of the world needs to know that you were vaguely aware of that person’s significance.

Humans, unlike all other species before us, evolved a well-developed mental ability to think in abstract concepts. This most powerful of our evolved traits is the basis of our intelligence. However, it can go awry if we deal in practice with symbols and forget what they represent. The logical fallacy resulting from this mistake is called reification: treating an abstract symbol as the real thing it represents. The use of abstract symbols condenses ideas and removes them from the concrete reality from which they were derived. The allure of the simplified ideal, and the growing distance between a repeatedly used symbol and its object, draw humans into the trap. Here are three instances where this occurs and their negative consequences.

Money and Value

The most obvious instance is the symbol/object pair of money/value. Reification of money has existed for all of civilization, of course, but it has become increasingly destructive with the advent of electronic banking and trading, as money moves from something in your wallet to pixels on a screen. With funds recorded electronically and trades made by computers in microseconds, it is inevitable that we have phenomena like flash crashes in the stock market. Traders are not buying stock based on the real value of the company, but on the speculated monetary value the useless slip of paper will have a few seconds later, seconds during which no new information regarding the company was introduced and yet the price changed. This can only be explained by the reification of money, which can be changed almost instantly despite the enduring and stable value it represents. Believing the price up and believing the price down has no cause in reality, but it certainly has an effect. The scariest part is that this arbitrary number that we have believed into existence, we have also believed into being important enough for millions of people to lose their homes when a few people believe it down. It is no wonder our economy operates on “fiat currency.”

Credential and Qualification

Speaking of the economy: economic efficiency depends on matching qualified workers with jobs appropriate to their skill set. Employers need a symbol representing the fact that an applicant possesses the desired skill set, that symbol being a credential. However, when employers place too much weight on credentials and fail to check that the credential faithfully indicates the presence of a body of knowledge, they commit the fallacy of reification. When the employer can no longer distinguish between the symbol and the object, the consumer no longer cares to acquire the skill set, only the credential, and the accrediting institution, following the demands of the employer and consumer, focuses on churning out credentials rather than knowledge. Eventually such an economic system will fall apart; the employers, recognizing their mistake, will demand a different kind of symbol, and the process will repeat ad infinitum.

Age and Wisdom

Most cultures (thankfully not modern American culture) impose a hierarchy based on age with the eldest at the top. The logic behind this practice is that a person who is older is likely to have more life experience and therefore be wiser. But wisdom is difficult to detect and so, like the credential, society looks for an easily recognizable symbol to stand for the complex idea. Because of a correlation between the two, age was chosen as the symbol of wisdom. Once it is encoded into a culture, the origin is forgotten and all members are required to “respect their elders” even if those elders have done nothing to deserve respect. Their long lives may be primarily a symptom of luck or other factors out of their control.

The most common example of this fallacy in history has been religious people mistaking metaphorical statements for their literal meaning. When the Christian bible says that Jesus took a few loaves and fishes and made a meal to feed thousands, it is not referring to a petty magic trick. The passage is a reference to the doctrine that “man does not live by bread alone, but by every word issuing from the mouth of God.” And so Jesus rebuked his disciples who had thought that by “beware the leaven of the Pharisees” he meant the Pharisees might try to poison their food. Those who fall into the same error as the twelve disciples have, like them, earned the biting criticism, “get behind me Satan, for you are not mindful of the things of God but of the things of man.” Perhaps reification is not worthy of eternal punishment; still, a little caution is due so that the machine of our brains does not overpower the rational consciousness of our minds.

Giving a child a normal name is already a shady proposition because a name is the thing that most identifies a person as an individual, and yet it is attached to a child before he or she is even born. Woe to the child who has the kind of self-important parents who think that a unique (read: stupid) name reflects how special they believe their child to be. First, chances are the child is not special, so he or she can be safely assigned an ordinary name. Second, as noted, the name has nothing to do with the person, and so it is rather a reflection of the parents’ belief in how special they themselves are. Which is to say, they are arrogant pricks. A misspelled name is a form of child abuse that ought to be punished by an 8-10 year prison term. Megyn Kelly could easily press charges against her parents or be excused for kicking them both squarely in the gonads. Megan is spelled M-E-G-A-N; M-E-G-Y-N is an ad campaign for do-it-yourself pap smears. In extreme cases, the humiliation a person suffers every day for 18 years listening to people stumble over the pronunciation of “Aimee” is grounds for execution.

Now that you have determined to give your child a standard, properly spelled name, you have to give some thought to the inevitable nickname. Avoid names like Richard, particularly if the last name is anything untoward. Dick Armey is not a person; it is the sort of thing that appears during the end credits of Superbad.

Even standard names are occasionally inappropriate because of the drift of language over time. Passing on a name like “Cummings” is a hate crime against the people of English-speaking countries. Do not make us snicker every time we have to introduce you; get it changed like a normal person. I imagine the legal proceedings would go something like this:

Judge: What reason do you have for changing your name?
Mr. Cummings: Because it’s “Cummings.”
Judge: Fair enough. Would you like to have your parents shot while we’re at it?
Mr. Cummings: Yes, yes I would. You’ll have to exhume my father, but I’d like him shot all the same.

It’s true that certain last names are horrid and need not be perpetuated, but no last name really need be. If children always take a last name from one of their parents, the number of unique last names can only decrease or stay constant each generation. The common practice of children taking their father’s surname ensures that eventually every person on the planet will be Chan, with a few Smiths thrown in for good measure, requiring a fourth name, which some parents have already begun abusing for their own ego’s sake. The children, I mean, they are abusing the children. The politically correct alternative of the hyphenated name is, let’s face it, stupid. Even for one generation it looks stupid; for four generations it’s a family tree hanging off the end of your name like a dingleberry. Instead, let us begin the tradition of moving the middle name to the child’s last name. That way we can get some fresh new ones aired out. Or we could bring back the tradition of naming people by their profession. Goodbye Jane Cooper, hello Jane Stripper. Okay, that’s a bad idea actually.
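If you doubt that surnames can only die out under this system, a toy simulation makes the point concrete. This is just an illustrative sketch, not real demography: the population size, generation count, and the rule that each child inherits a surname drawn at random from the current generation are all my own assumptions. Since no new surnames can ever enter the pool, the count of distinct surnames can only shrink or hold steady.

```python
import random

def simulate_surnames(population=1000, generations=50, seed=42):
    """Toy model of surname inheritance: every child takes the surname
    of one member of the current generation, so no new surnames ever
    appear and the set of surnames in each generation is a subset of
    the previous one."""
    rng = random.Random(seed)
    surnames = list(range(population))  # start with all-unique surnames
    counts = [len(set(surnames))]
    for _ in range(generations):
        # each child draws a surname from the parent generation
        surnames = [rng.choice(surnames) for _ in range(population)]
        counts.append(len(set(surnames)))
    return counts

counts = simulate_surnames()
# the number of distinct surnames never increases from one generation to the next
assert all(a >= b for a, b in zip(counts, counts[1:]))
print(counts[0], counts[-1])
```

Run it with any seed: the distinct-surname count never ticks upward, which is the whole point. Without new names entering, drift alone guarantees that the rare ones eventually go extinct, pulling everyone toward Chan.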

I’m typically not a fan of gender-neutral pronouns like “s/he” because they are awkward to use. I’m fine with using male pronouns by default. That is because I use “he” as just a default gender-ambiguous pronoun. I ascribe no gender biases when I say “he.” I can just as easily refer to a hypothetical corporate CEO as “he” as I do a hypothetical emotionally unavailable spouse.

Once upon a time, “he” was the acceptable gender-unknown pronoun by default. With the rise of the women’s movement, “he” fell by the wayside in favor of “she” to be more inclusive of women. This is especially the case when referring to professions and other formerly male-dominated roles. For example, it seems more common these days to refer to a gender-unknown judge, police officer, accountant, or astronaut as “she.” Even in industries in which men are still statistically overrepresented, gender politics have seeped through. I acknowledge the arguments that a default masculine assignation might perpetuate glass ceilings and stereotypes of female versus male gender roles, especially in the workplace. If we always refer to positions of power by male terms, we associate men with those roles.

Therefore, I accept using “she” as the gender-unknown pronoun, since we don’t objectify people as “it,” even though it would solve all these problems, and “they” is grammatically incorrect. If it is now more politically correct to always use female pronouns when the gender is unknown, that should always be the case, even when using “she” perpetuates negative female stereotypes. Case in point, from a New York family law outline regarding divorce for constructive abandonment:

“The willfull, continued, and unjustified refusal to engage in intimate relations with a spouse for one year or more may constitute constructive abandonment [as grounds for unilateral divorce]….In cases in which the parties have not engaged in intimate acts for a period of one year or more, the plaintiff must establish that he repeatedly requested resumption of the marital relations.”

The poor pleading husband who doesn’t get laid often enough has grounds for divorce. I haven’t looked at the actual statute, but I’m assuming that a wife who has not had sex in more than a year can also seek divorce. Would it have been confusing to use “she” in that instance? I’m guessing that some students would’ve done a double-take if it did read “she.” It wouldn’t have been as clear precisely because people assume it’s the husband who has “repeatedly requested resumption of the marital relations.”

After establishing that it is men who aren’t sexually fulfilled in marriage, the outline continues to paint them as deadbeats:

“Spousal support is the obligation of one party to provide the other support….It is awarded in a divorce if one spouse cannot provide for his own needs with employment.”

In this example, the former husband is the one who cannot provide for himself. So we’re not going to perpetuate the stereotype that men are the breadwinners, but we’re going to assume that it’s men who beg their wives for sex? This is the problem with picking and choosing which gender-unknown pronoun you want to use. The point of a default is that it is supposed to be free from any bias. If we’re all going to use “she” from now on, then that should apply across the board, even when it reinforces negative stereotypes of women, e.g. the divorced housewife. Simply reversing the male and female, reserving “she” for positive associations and using “he” for negative ones, only reflects the biases of the speaker. Guy Kawasaki, in The Art of the Start, puts it best: “If only defeating sexism were as simple as throwing in an occasional he/she, she, her, or hers. I use the masculine pronouns merely as a shortcut. Successful entrepreneurship is blind to gender. Don’t look for sexism where none exists.”

Race relations is a sensitive subject for obvious reasons. It’s difficult to speak about the issues without running into walls of political correctness on one end or accusations of racism on the other. Being a minority, I’ve weathered charges of racism pretty easily. It seems that calling a white person racist is one of the worst insults imaginable to that person. Given this country’s history of discrimination by Caucasians, I can see how whites would be especially offended by any insinuation of racism. However, as a child of immigrants from a country where racism is not a forefront issue like it is here, it’s just never seemed like a big deal. As I’ve gotten older, however, I’ve become even more sensitive to subtle racism, whose effect is too easily downplayed.

Whatever my feelings about racism, I hate the term “person of color.” At its simplest level, it splits people into a dichotomy–either you’re white or you’re not, as if that distillation is all that’s required for intelligent discussion of race. At least when you use the word “minority,” there could be instances where that minority, within a given population, is white. But “person of color” can only mean nonwhite. I don’t buy the charge that “minority” has connotations of subjugation. Not only that, but the idea of categorizing people based on skin color is archaic and should not be perpetuated.

I suppose “person of color” is a necessary term when you’re talking about diversity programs, but that overlooks the problem that diversity programs seeking only to ensure a significant “person of color” population are inherently flawed. It is easy, living in a diverse city, to forget that most of the country is predominantly white, and that in many communities minorities are so few and far between that they might as well be grouped together for a coherent antimajoritarian agenda. However, this parochial outlook should not represent the country any longer. Sure, there are still parts of the country where being Asian means you’re perceived as either Chinese or Japanese, but the country as a whole is a pluralistic society.

As alluded to, the cloud hanging over this whole discussion is affirmative action. My stance is that affirmative action programs that simply seek to achieve non-white diversity are not respectful of the diversity within the nonwhite community. These days, most affirmative action programs likely group minorities into broad categories and seek adequate representation of those groups. Yet any time you set an arbitrary group, there will always be underrepresented subgroups. This has been a big problem for Asians and Pacific Islanders, commonly grouped together as one but actually representing very diverse cultures. When the public perceives too many overachieving East Asians and proceeds to pass judgment on the achievement of Southeast Asians, it unfairly discriminates against the very group that should benefit from affirmative action.

Image credit: http://vitaminsea.typepad.com

What’s the point in applauding after watching a movie? If it were a live performance, I could understand showing your appreciation to the performers. But when a movie is over, no one involved in the production will have any idea of your applause, so it’s clearly not directed at them. The possible exception is a screening with cast and crew in attendance. Applause was appropriate when performances were in person; otherwise, it’s unnecessary.

So then, who is the applause for? It seems to me that people clap after movies they like to signify to the other people sitting in the theater that they collectively witnessed a triumphant film. To those people, I ask if it’s necessary to project your opinion. Do we need validation from you that it was a good movie? I wasn’t quite sure I could express my satisfaction until my personal opinion of the movie was validated. Thank you for telling me that I’m allowed to like this movie.

If the applause isn’t for the benefit of everyone else in the theater, perhaps it’s some sort of spontaneous reaction to watching something entertaining? I loved it so much that I just had to clap! I fail to believe that applause is as involuntary as laughing at a pratfall or tearing up during an emotional farewell. Perhaps people clap because they think that’s an appropriate response no matter the number in the audience. But then I ask, how many of you clap when you watch television by yourself?

What do you do when you encounter a bathroom sign like this? At the very least, I could make out that WC meant restroom, but for which gender was this designated?

Now you may ask whether I could determine gender by comparing the image of the androgynous child to the image on the other door, but there is no other door. The two restrooms at this restaurant were on separate sides of the building. With no reference point, I could only try to translate “Vomini.” I was at an Italian restaurant, so I figured maybe “men” would be something like the Spanish hombre, but my meager experience in Romance languages proved unhelpful. A Google search for “vomini” only turned up oddly gender-neutral results.

Well, when in doubt, just do a visual spot check. I peeked inside, looking for the familiar urinal for confirmation. Instead, I found another unique feature. There was one enclosed stall with a toilet, and next to it was another toilet with no stall or door at all. What do you make of that? This didn’t look like a single-occupancy restroom; after all, there were two toilets. But who would sit on the doorless toilet with a clear view to the sink?

I acknowledge that classier places don’t like to have the clear and unambiguous stick figure with the caption “MEN.” But there are plenty of ways to indicate which gender belongs in a bathroom without the familiar blue sign. If it’s a unisex bathroom, then you might as well invite everyone: either label it plainly “RESTROOM” or put up one of those amusing signs that welcome the whole family.

And as for the poor quality of the photo: the restaurant was also exceedingly dark, which made the gender confusion all the more prominent.


Edit: My co-editor has brought to my attention that I had an Indiana Jones moment. “Dammit, in Latin, Jehovah starts with an ‘I’.”
Uomini means men, but they spelled it with a ‘v’ because, well, it’s Latin.


Everyone should know by now the joke “that’s what she said,” used in retort to any type of sexual double entendre. Importantly, “that’s what she said” is used when the phrase immediately prior could be construed as something a woman would say in a sexual context. The following are some examples of statements that can properly be followed with “that’s what she said.”

  • You really think you could go all day long?
  • Why is this so hard?
  • You already did me.
  • I can’t stay on top of you twenty-four seven.

What these have in common is that each phrase can conceivably be said by a woman in a typical heterosexual context. They are double entendres: phrases that could be construed as having more than one meaning, often with a risque, inappropriate, or ironic secondary meaning.

Given its spread through popular culture, everyone thinks they are Steve Carell and overuses the joke. “That’s what she said” certainly lends itself to many situations, considering the numerous euphemisms for sex and sexual situations. However, this doesn’t mean “that’s what she said” can be used in every instance of something sexual. Keep in mind what it actually means: it has to be a spoken phrase that a female partner could conceivably say in a sexual encounter. Therefore, not every semi-sexual phrase can be followed by “that’s what she said.” Sometimes it just doesn’t make sense. Examples of improper use of “that’s what she said” are as follows:

  • This sucks.
  • They did it on the roof.
  • You’re all a bunch of dicks.

You get the picture. They just aren’t things that “she” would say. So don’t use an already played-out joke improperly.

Any time a product is named for its intended function, it should be able to satisfactorily perform said function. That’s a simple concept, one I feel few people would reject. Case in point: the toothpick. Granted, I did not do any research into the history of toothpicks beyond a cursory glance at Wikipedia, but I think it’s fairly safe to say that toothpicks were invented with one specific function. At the very least, the toothpicks marketed now are sold with some expectation that they will be used to pick teeth. Hence my frustration at having to grab a handful of these toothpicks every time I want to remove one bit of detritus from my teeth.

I grew up believing all toothpicks came in those shakers they have at every table in Chinese restaurants. Until I moved to New York, I had never bought a box of toothpicks that wasn’t Chinese-made. Strangely enough, Chinese-made toothpicks seem to be of higher quality than the American Penley brand I bought this time. Maybe it’s because these toothpicks are flat instead of round, but the quality of the manufacturing is truly sub-par. The ones pictured above are just a random sample I pulled from the box. Splinters, broken tips, and uneven shaping were common. Worse yet, not one of the toothpicks has the structural integrity to remove anything from my teeth.

I acknowledge that there are alternative uses for toothpicks. But at the very least, a toothpick should be able to do the one thing it was meant to do.