I'm going to be taking a wee break from the blog until the new year.
I'll see you again in 2005. Take care.
Thursday, December 23, 2004
Tuesday, December 21, 2004
The experimental method is the cornerstone of modern science. As such, most people find it extremely boring. Generally, the most you can hope for, when attempting to discuss it, is that you might provoke fond memories of a cute lab partner in the mind of the person you are attempting to discuss it with.
Fortunately, such things as The T.W.I.N.K.I.E.S. Project exist to demonstrate that there is a fun side to The Method. This is a site dedicated to various experiments inflicted upon innocent snack cakes by a bunch of (I must presume) bored college students.
My personal favorite is the Turing test, where they attempt to determine whether or not Twinkies are sentient.
Sunday, December 19, 2004
As parents are wont to do, mine would occasionally tell people embarrassing anecdotes about me. One of the more humiliating stories pertains to an event that happened when I was three years old (and yes, I know that I shouldn't be embarrassed by something that happened then; nevertheless, I am). Apparently, they invited a friend of theirs over to our house. When I saw this man, I started screaming my little head off. When asked why I was so upset, I pointed at him and said, "He's a monster!"
The man was black.
Putting aside my own residual (and irrational) shame, there are a few things about this incident that strike me as curious. The first and foremost is that my father was clearly humiliated by my reaction. To understand why this is a curiosity, you need to understand that this is the same person who told me that he would disown me if I "ever brought a black girl home" and who had dozens of racist jokes that he'd tell at the drop of a hat (his favorite being about a black astronaut with the punch line being "too late, the jig is up"). Just to add another twist to the story, my mother is Spanish (from a colony of Spaniards who've been living in the Southwestern United States since the 1500's, but that's another story). My own complexion is best described as a kind of light olive tone and not really white – and I'm lily white compared to some of my nieces and nephews. I should hardly seem the child of a racist and, yet, quite a few of my father's attitudes (the same father, I remind you, who was mortified by my juvenile reaction to his black friend) were patently racist.
You may well ask why I'm bringing up these tangled family memories. In my own convoluted way, I am trying to demonstrate that the subject of racism is complex. This should be an obvious statement, but it is clear that many people do have a simplistic view of the subject. The simplifications break down into two mutually incompatible perceptions of race relations in America.
- Racism is endemic and ubiquitous.
- Racism is negligible and isolated.
The view that racism is negligible is driven by the perception that the majority of Americans do understand that racism is a bad thing. In this view, racism only persists in isolated pockets (the stereotype is of rednecks in the Deep South going to Klan meetings) and in the presence of a few isolated individuals who tarnish those around them by association. In this view, accusations of racism are typically the result of oversensitivity, and much of what is called racism is actually an overreaction. Let us call this a Type 1 stance.
The converse view, that racism is prevalent, is driven by the perception that the majority of Americans (in particular, the white majority) secretly harbor racist sentiments and that this persistent (albeit submerged) racism exposes itself in terms of hiring practices, housing opportunities, police harassment, and so on. It is believed that many comments and statements that are passed off as being innocent are, in fact, laden with racist implications which further expose the bigotry of those who say them. Even many of those who try to go out of their way to deny being racists are, in fact, only overcompensating for their core racist views. Let us call this a Type 2 stance.
The sharp disjunction between these stances can best be illustrated by the issue of affirmative action legislation. The policy of affirmative action grew out of the civil rights movements of the 1960's as a direct reaction to manifestly unfair hiring and enrollment practices. At the time, it was considered to be a necessary antidote to a pervasive social poison. Almost immediately, it became the subject of controversy as charges of reverse discrimination were levied at the policy.
Things came very much to a head with the advent of California's Proposition 209 which effectively ended the state's participation in affirmative action programs. Those who opposed the measure considered it to be anti-progressive while those who supported it argued that it was egalitarian since the text of the proposition specifically stated that "[t]he state shall not discriminate against, or grant preferential treatment to, any individual or group on the basis of race, sex, color, ethnicity, or national origin in the operation of public employment, public education, or public contracting."
The real root of the controversy was an issue of perception. Those who believed that racial bias had become a rare anomaly felt that affirmative action was an unnecessary and harmful legacy meant to address a problem that, essentially, no longer existed while those who believed that racism continued to be a prevalent and persistent factor in American life felt that affirmative action remained a necessary counterweight to a culture that was still rife with bigotry and discrimination.
There is no question that America has been a racist culture and that the fact of racism did not end with the signing of the Emancipation Proclamation (a document, it must be noted, that was largely motivated by political considerations and not humanitarian ones). So-called "separate but equal" apartheidistic policies were living institutions a mere four decades ago. I would contend that the fear of racism in American culture is neither unreasonable nor irrational. What, however, is the reality? That is a much more difficult question to answer.
Last year the University of Chicago and MIT conducted a joint study in which nearly identical resumes were sent to prospective employers in the Boston and Chicago areas. The one variation between the resumes was in the names put on them. The researchers culled black and white birth records to find names that were statistically more "white" and "black" sounding. What they found was that resumes with white-sounding names got call-backs at an average rate of 1 in 10 while those with black-sounding names got call-backs at a rate of 1 in 15.
I think that this and similar studies suffice to demonstrate that there is a bias in racial perceptions in America and that it's not limited to in-bred yahoos who dress up in bed sheets to burn crosses. At the same time, however, I do not believe that these studies support the contention that the typical American is either overtly or even intentionally racist. One may well wonder why the researchers focused on resumes as opposed to the results of face-to-face interviews. I suspect that the answer is that face-to-face interviews would not have yielded the same dramatic levels of racial disparity in hiring practices. The study is well designed to ferret out subconscious racial biases. The very fact that the researchers felt the need to go after subconscious motivations may well indicate an ironic sort of progress against the days when deliberate racism was the norm. Certainly it is a fact that one of the most harmful accusations that can be leveled against a person in our culture is a charge of racism. If racism were, in fact, embraced by our culture, such accusations would not be harmful; they would be laudatory.
I think that one of the major problems in discussing race in our culture is that we tend to engage in a false dichotomy: either one is a racist or one is not. The implication is that someone who occasionally tells a mildly offensive joke that employs stereotyping (bearing in mind that even blonde jokes utilize stereotypes) is in the same epistemological category as someone who advocates racial genocide. I would contend that racism is not a binary state but rather that it is something that exists on a graduated scale. I would further argue that many political and socially active groups have systematically ignored that very gradation in an attempt to advance progressive policies – a methodology that I believe is now backfiring and likely to cause more harm than good.
I would like to propose the existence of at least three different types of racial bias. In order of increasing severity, I would label them as biases of racial stereotype, racial hierarchy and, finally, racial antipathy. I would describe a racial stereotype bias as one where a person subscribes to any number of generalizations regarding the characteristics of a given race. It is important to note that racial stereotype biases come in both negative and positive flavors. Believing that all members of race X are thieves may be more hurtful than believing that all members of race X are law abiding, but both cases remain examples of racial bias.
A bias of racial hierarchy, by contrast, is a view where one believes that different races exist in a hierarchy of relative virtue. A racial hierarchy bias will almost certainly include a set of racial stereotype biases but not necessarily vice versa. It should also be emphasized that racial hierarchies don't necessarily preclude limited egalitarianisms – one may admit that certain races are equal while nevertheless believing that races, in general, exist on an ordered scale.
Finally, we have the bias of racial antipathy, where the holders of the bias actively hate members of certain races (with the most extreme form being the hatred of all other racial categories). Again, one may have a hierarchical bias without having an antipathetic bias, but it would be rare to have an antipathetic bias without a corresponding belief in racial hierarchies and, in turn, a set of racial stereotypes. As such, I would suggest that racial bias has a pyramidal structure, with the more extreme forms of racism being supported on a foundation of less virulent forms of racial bias. In addition to distinctions of kind there are also distinctions of degree within a type. A person who dislikes Asians would qualify as having an antipathetic bias, as would someone who advocated the internment and extermination of Asians. Nevertheless, the first person, while agreeing that Asians are undesirable, may be appalled at suggestions of genocide.
At this point, we may well ask why this should matter. Isn't any sort of racial bias a bad thing? Yes. Indeed, the interlocking nature of the pyramid gives us cause to be concerned that any type of bias can enable worse types. The more stereotypes one has regarding other racial groups, the easier it becomes to categorize races on a hierarchical scale. Once one starts employing a stratified racial perspective, it becomes easier to believe that some races deserve contempt. As with any sort of slippery slope argument, we must caution ourselves not to presume that the path down the slope is inevitable; however, I think that it is clear that such a path does exist, at least in potential, which ought to be worrying enough. So why ought we not consider these divisions to be nothing more than an academic distinction without any practical value when it comes to addressing the realities of racism? I think that these divisions are important precisely because ignoring them not only gives a distorted view of the degree of the problem but also gives us a distorted perspective on how to solve it.
One common tactic that has been employed by minority advocacy groups has been to diligently and publicly ferret out anything that hints of racism and to expose the perpetrators of racist sentiments in a very public manner. For a long time this has been an effective policy precisely because all forms of racism have been treated as though they were biases of antipathy. Given that antipathetic racism is strongly associated with such organizations as the Ku Klux Klan and the Nazi genocides, people have had a natural aversion to being associated with anything that smacks of them. In modern American culture, there is an immense stigma to being branded a racist. In recent years, however, the effectiveness of this tactic has shown signs of weakening. Indeed, there has been a quiet (but growing) backlash against those groups that have employed this technique. The reason for this is that people who may hold biases of stereotype justifiably don't see themselves as being like people who have biases of antipathy. Since the common implication is that anyone who holds racist views must hate members of other races, it is reasonable for someone who does not actually hate other races to conclude that they aren't racist, to come to resent charges of racism, and, furthermore, to conclude that any charge of racism that doesn't involve overt statements or acts of racial hostility is suspect. In short, by lumping all racially biased conduct and belief under one banner, advocacy groups eventually end up looking like the boy who cried wolf even though their general position – that any sort of racism is a bad thing – is, in fact, defensible!
I will return to my father as an example. It is my belief that he was subject to a bias of hierarchy. He did, in fact, believe that some races were objectively better than other races. This is why he didn't want me to "bring home" a black mate. In his worldview, this would have represented an inappropriate mixing of racial categories. This view of the world was, in turn, built upon an elaborate foundation of racial stereotypes that he wholeheartedly embraced. He did not, however, actually hate the members of other races. He was genuinely mortified by my "monster" comment and was not being at all disingenuous when he said that he had friends in other racial groups. Most people would properly consider his hierarchical perspective to be racist, but he, himself, rejected the claims of racism precisely on the grounds that he did not harbor any racial antipathies. I won't deny that there was more than a little cognitive dissonance in his stance, but I think that it is nevertheless a fact that lumping all racist perspectives under a single umbrella of equivalency made it much easier for him to deny that he, himself, belonged under that umbrella.
How much easier is it for someone who does not, in truth, feel that any race is superior to any other to excuse themselves for nevertheless believing that there are differences between the races? So long as we insist that any racism is an act of hatred, such a person will be able to excuse themselves from thinking of their views as racist because we have foisted a definition of racism that does, in fact, exclude them. Given that there is a path up the pyramid, this implied exclusion is not one that I believe we can afford to make as a culture.
Because there are different types of racism, I believe that there must be different approaches to the issues of racial bias. Insisting that every racist statement is an act of hatred may be morally satisfying, but the ultimate result is that we end up diluting the effectiveness of the charge while leaving the basic problem unresolved. I would suggest that biases of stereotype are the most common sort of racial prejudice in modern American culture and that the bulk of our efforts should be directed at resolving it. I think that such efforts should eschew implications of hatred and, instead, resort to implications of ignorance and naïveté. I would also suggest that while some measure of shaming could still be utilized (no one wants to be ignorant), it may be better to positively emphasize the aspects of education and self-improvement. By appealing to a higher standard, I think that people can be motivated to voluntarily purge themselves of the urge to stereotype.
Biases of hierarchy will be more difficult to deal with, however, the problem is also less severe. I suspect that the best way to contradict a perception of hierarchical differences between the races is by example. As more and more members of minorities obtain the status of cultural icons, the view that the races are naturally distributed along a hierarchy will be harder to support. I do, also, believe that there may be some worth in indicating that hierarchical biases are closer to antipathetic bigotry, but I think that such efforts should be employed selectively and not as a cudgel lest they suffer from the eventual effects of backlash.
I would contend that biases of antipathy are, in fact, in the extreme minority and that efforts directed against them should be narrow albeit intensely focused. Once a person has crossed into the realm of actual racial hatred, I suspect that there's very little that can be done to convince them that they are in the wrong. Calling them racists won't do any good for the simple reason that they would tend to wear the appellation as a badge of honor. The best that can be done, I fear, is to use the force of public shame and the rule of law to minimize the actual amount of harm that they can do with their prejudices. Grass roots efforts can keep them from public office and the weight of state and federal law can be used to circumvent the material harm that they can do to others.
These are merely my suggestions, of course. Whether or not you think that my proposed approach has merit, though, I think that it is undeniable that any approach based on an oversimplification of the problem is doomed to failure. I believe that it is beyond doubt that the issue has been oversimplified and that our methods of trying to combat the problem, while yielding initial successes, are reaching a point of diminishing returns. I think that we must admit that the problem has dimensions that go well beyond the simple binary dichotomies that have traditionally framed the issue. Even if you aren't convinced that my particular distinctions of bias are accurate, I think that the contention that all racial prejudices are alike is simply untenable, and that we can agree a better framework is needed all the same.
Thursday, December 16, 2004
It wasn't as though
We didn't see
What was happening,
Nor that we were
Desensitized to the brutality.
It turned our stomachs
And it made us squirm.
You couldn't be human
And not care.
But we didn't know
What we could do about it.
We felt that it was hopeless:
A thing beyond our control.
It gave us a sick feeling,
An impotent feeling.
That's why we turned away.
That's why we're talking
So very, very loudly
About things that
Just don't matter.
Tuesday, December 14, 2004
It is often said that God is in the details. Of course, it is also said that so is the Devil.
Whatever the case may be, The Zoom Quilt is an interesting study in detail. You'll need a Flash player to be able to see it, but it's well worth it. It takes the form of a "painting" that you can zoom into or out of, indefinitely (ultimately, it does loop around) revealing more and more detail.
Be warned that some of the content is mildly disturbing and may not be appropriate for a work environment or for young children.
Sunday, December 12, 2004
In this essay I'm going to discuss Copyright and Trademark laws and their relation to the First Amendment. In particular, I am going to address what I see as a looming threat to the doctrine of Fair Use.
Contrary to what many Americans would like to believe, the protections of the First Amendment are not and have never been absolute — many people are familiar with Supreme Court Justice Oliver Wendell Holmes' example of yelling fire in a crowded theater. Over the years, the Court has provided a list of types of speech and expression that are not considered protected.
Among the forms of speech that have not been considered to be protected are slander (meaning verbal defamation), libel (meaning written defamation), obscenity (which is distinguished from pornography), sedition, incitement to riot and so forth.
It should be observed that none of these exemptions have gone unchallenged and that considerable effort has been expended to precisely define them and to determine whether or not they should be Constitutionally exempted. There are libertarian absolutists who insist that a plain reading of the First Amendment does not brook any exceptions to its protections and that, furthermore, any restrictions of the speech of a free populace must, by definition, compromise its freedoms. Most ethicists, however, accept that such an absolute stance is untenable in the face of the manifest demands of society.
Copyright law is a particularly thorny issue. Just as patents are intended to drive innovation by giving an inventor a temporary exclusive license to his invention, so is copyright intended to drive creative expression by giving an author or artist exclusive license over his intellectual creations. Originally, United States copyright ran for a fixed term of years (fourteen years, renewable once, under the original 1790 statute). The Copyright Act of 1976 extended the term to the lifetime of the author plus fifty years in order to provide for the estate of a work's creator. Recently, the Sonny Bono Copyright Term Extension Act has extended this duration an additional twenty years for a sum total of the life of the author plus seventy years (a term that many feel exceeds the intent of copyright).
Trademarks are similar to copyrights except that where a copyright is a work unto itself a trademark is used to specifically identify a company or a product. A trademark can be almost anything, including any word, phrase, symbol, design, sound, smell (!), color, product configuration, group of letters or numbers, or combination thereof, so long as they are used by a company to identify or distinguish its products or services. Although trademarks, in the United States, are granted for an initial period of ten years, so long as they are consistently used for a minimal period of five years, they can effectively be extended forever.
Copyright law, trademark law and the First Amendment have an uneasy relationship with one another. A naïve reading of the First Amendment would suggest that the very notion of copyright legislation is patently unconstitutional (again, a perspective that certain Libertarians embrace). As I've noted, however, the First Amendment has never been understood to be absolute, and copyright law is a perfect example of its limitations. That said, a copyright or a trademark is not, itself, absolute. Although I may not profit, outright, from another person's work or utilize another entity's trademark without authorization and compensation, there are circumstances where I can legally reference a representative part of a person's work or an entity's trademark without violating the terms of copyright or trademark. The terms that allow me to do so are called the terms of Fair Use.
There are two basic forms of fair use that are recognized under the law. The first form is that of allowable citations of another agent's work or mark. If I am discussing McDonald's marketing campaigns, I am perfectly within my rights to reproduce their logo and to quote various slogans that they have employed over time (e.g., "You deserve a break today") in spite of the fact that their logos and their slogans are protected trademarks. In like measure, if I am reviewing a book, I may freely quote representative samples of the text of the book even though the whole of the text is protected by copyright law. Although, as with any law, there are grey areas in determining the precise boundaries between representative citation and unlawful plagiarism, the general distinction between legal citation and unlawful reproduction is fairly well understood.
The second form of fair use takes the form of artistic representations. It should be emphasized that this protection specifically includes works of satire and parody. If I'm making a collage for a work titled "Globo-Corporate Christ", I am allowed to depict Jesus crucified to the McDonald's arches (the issue of my good taste is another matter). Likewise, if I were to write a satirical picture-book that had Dorothy and crew from the Wizard of Oz going to see the Wizard of Gluttony, I would be free to portray them passing through the famed arches.
Artistic and satiric depictions are exempted by Fair Use for a good reason. Art and especially satire can be a potent form of political and social commentary. Jonathan Swift proved this when he wrote "A Modest Proposal", a scathing essay directed against English treatment of "the Irish problem" which proposed that Englishmen should cannibalize Irish children (many people did not realize that it was a satire and condemned Swift as a monster). The ability to depict and even mock public and private institutions and symbols is an important part of the public dialog that contributes to the functioning of a free society.
Naturally corporations are loath to see their trademarks misappropriated and authors (and other artists) are reluctant to allow their works to be mishandled. Fair Use is not an absolute protection. I can't slap the McDonald's logo on my restaurant and claim that I'm simply making an ironic comment on the decline of mom and pop business in the United States. However, because art is a very subjective thing, it can be difficult to distinguish the line.
Recently, Mattel has launched several lawsuits in attempts to defend its Barbie brand name. One suit was against the Danish pop music group Aqua for their song "Barbie Girl". The song has the refrain, "I'm a Barbie girl in a Barbie world — life in plastic, it's fantastic" as well as the line "You can brush my hair, undress me anywhere". Mattel contended that the song was doing harm to the Barbie name and that its use of the name constituted trademark violation. The courts ruled that the song was, indeed, a legitimate social commentary and that its use of the Barbie name was protected. Likewise, the courts have ruled in favor of artist Tom Forsythe, who posed Barbie dolls in provocative stances, as well as artist and vendor Paul Hansen for such creations as "Transvestite Ken" and "Big Dyke Barbie". On the copyright side of the fence, the estate of Margaret Mitchell, author of Gone with the Wind, sued Alice Randall to prevent the publication of The Wind Done Gone, a parody of the original book (the courts ultimately allowed its publication).
By and large, when such cases have reached court, the courts have been sympathetic to artists and satirists, which is not terribly surprising. When it comes to First Amendment issues, the courts have tended to err on the side of caution. Plaintiffs bringing suit against artists have to demonstrate not only that their properties are being used without authorization but that the value of their properties is being diluted and that the artists in question are not engaging in valid artistic expression. These are very high hurdles to clear, and one might wonder why organizations such as Mattel don't simply look the other way when it comes to such cases.
The directors of a corporation have a legal obligation to do everything in their legal power to maximize the worth of their corporation. This is known as the rule of due diligence. If the shareholders of a company believe that the officers of a corporation are not exercising due diligence, they can launch lawsuits of their own. A corporation that fails to aggressively defend its brands against dilution of value can easily be charged with a failure of diligence. This is especially true given that the potential cost of failing to defend a brand name is not merely some abstract loss of value but, in fact, the loss of the brand name altogether. If a trademarked name enters the common vernacular, the courts can rule that the name has become generic and is no longer owned by the company that originated it. Aspirin is the classic case of a brand that has become a generic term (other examples include cellophane, linoleum, dry ice, and spandex). As such, it is both rational and prudent for a company to aggressively pursue infringement cases.
So, you may ask, what's the problem? Corporations do their duty to protect their properties and the courts act to clarify where the boundaries lie, generally favoring artists over corporate interests. It would seem like the system works. The problem is that the system works only if you can afford to participate in the system. It costs money to launch a lawsuit, but it also costs money to defend yourself against one. When the artist in question has the backing of a major music label or can afford the out-of-pocket expenses on his own, this may not be a problem, but many artists don't have the funds to defend themselves even if they are clearly in the right.
A few months ago, the on-line comic strip Penny Arcade ran a cartoon involving the character of Strawberry Shortcake. The cartoon was actually a commentary on the game developer American McGee, who is known for a successful game based upon the premise that Alice, from Alice in Wonderland, had grown up and ended up in a mental institution. In the game, she was freed from the institution and had to fight her way through a nightmarishly transformed Wonderland. Recently, American McGee announced that he was going to provide a similar treatment for The Wizard of Oz (the development of the game has since been put on hold). The authors of Penny Arcade thought that this was a bit creepy and decided to satirize it with a comic titled "American McGee's Strawberry Shortcake". The comic depicted a very adult and provocative version of Strawberry Shortcake holding a whip and sitting astride another character, with text indicating her new, sinister nature.
The clear intent of the comic was to mock American McGee for using icons of childhood innocence and distorting them into something with a violent adult orientation. The very next day, the authors of the comic got a cease and desist letter from American Greetings, the holders of the Strawberry Shortcake trademark, that warned them that if they did not immediately remove the offending image they would be served with a lawsuit. Although Penny Arcade is a relatively successful web comic, the owners of the site had neither the funds nor the time to contest such a suit so they complied with the request for removal. It is likely that a court would have ruled in their favor but the consideration is moot. The simple threat of a lawsuit was enough to stifle them.
The courts have consistently ruled that a law can fall afoul of the First Amendment even if it doesn't directly restrict speech. So long as it has a "chilling effect" on speech, it can be construed to be a violation. As such, the government cannot simply tap the phone lines of citizens at random or set up microphones in a park in an attempt to capture illicit speech. What happens, however, when the very threat of lawful litigation has such a chilling effect? Clearly we can't forbid companies from trying to protect their legitimate interests. To do that would open the door to scoundrels who would deliberately and maliciously steal the intellectual properties of others.
Proposals have been put forth to legislate a solution to this problem with such ideas as "loser pays" schemes and punitive actions against suits deemed frivolous. I think that it is, in fact, important that such cases do reach the courts for the simple reason that every case that goes before the court helps to define where, exactly, the boundaries between legitimate artistic expression and malicious intellectual theft lie. The courts are an important part of the Constitutional process and attempting to circumvent them through preemptive legislation strikes me as an ill advised concept.
I feel that a better result can be found by appealing to the private sector. The ACLU is famed for defending the Constitutional interests of the poor, the unpopular, and the disenfranchised. I think that the ACLU's mission is too broad for this particular task but I think that it would be in the interests of artists, everywhere, to set up a legal defense fund that would be able to specifically address such cases as these. One might propose a dues system where artists interested in having protection could pay into the fund. I think that a better system would be to have a philanthropic trust set up which would defend all meritorious cases regardless of any membership considerations. In such a way, the interests of free artistic expression would be preserved uniformly and upon a level playing field while corporations would still be able to exercise due diligence in the protection of their properties. Such a thing would, I believe, be the best of both worlds.
Thursday, December 09, 2004
They call to us
To strike camp,
Pegs to be pulled,
Tents to be folded,
Mules to be burdened.
We passed the dawn
Doing this and making plans.
Was swamp or mountain;
Delirium or travail.
No garden like brooks
To ease our trail
To soothe our weary feet.
We must choose
Between these choiceless options
Destined by our destination.
Tuesday, December 07, 2004
As every true geek knows, geeks come in all sorts of varieties. There are computer geeks, political geeks, movie geeks, weather geeks, and so forth. The diversity of geekdom is, indeed, deep and wide.
As for myself, I am very much a science geek. I've been deeply interested in science since I was five. One of the great disappointments of my life was the dawning realization that I would make a lousy scientist. Fortunately, science is much like sports in that even if you aren't a player, you can still have a lot of fun being a spectator. So for today's fun, I thought that I'd list out some of my favorite science sites:
Slashdot is a site that is largely dedicated to computer geekery (Linux is very much a holy word among those people); however, it does have a great science sub-section that has the additional advantage of being accessible via an RSS feed.
The Loom is an excellent science blog run by Carl Zimmer, who is the author of some marvelous science popularizations. The focus of the site is mainly on the biological sciences with special emphasis on evolutionary biology. It also has an RSS feed.
ScienceDaily.com is, as the site's name would imply, a site dedicated to breaking science news. One of the things that I particularly like is the fact that they are very good about linking to the original news releases for their articles.
Finally, we come to my favorite site: the arXiv.org e-Print archive. Be warned, this is not a site for the faint of heart. It is a repository of original papers in the fields of physics, mathematics, nonlinear science, computer science and quantitative biology. If you are serious about science, this is where you can see the cutting edge of research and theory (many papers appear here before they are officially published in scientific journals), but be warned that these are undiluted publications that tend to be filled with a lot of technical jargon and hardcore mathematics.
Sunday, December 05, 2004
I have, since the age of five, been a science wonk. For many years I dreamt of being a scientist when I grew up but, alas, that never came to pass (I’ve always been weak at applied math). Nevertheless, I’ve tried my best to be as educated as a layman can be when it comes to matters of science. In many ways this has been an ideal arrangement for me.
Professional scientists need to focus on their particular domains. Although cross-disciplinary research does occur and one does come across the occasional polymath, the norm is that biologists study biology, physicists study physics, and so on. Indeed, the level of specialization tends to be far narrower. In the realm of physics, a person may know everything about optics and next to nothing about cosmology or, conversely, be an expert on condensed matter physics while being frankly ignorant when it comes to string theory (and vice versa).
Since I’m just an ordinary guy looking in from the outside, I can afford to be a dilettante. This allows me to take a broad perspective and to follow my whims. One week I might be reading up on genetic algorithms and the next week I might be learning about gamma ray bursts. I do try to approach my lay studies with an appropriate sense of humility, however. Science is defined by its methodology, which is a very rigorous and demanding one. I get to enjoy the efforts of other people’s labors while not having to go to the effort of doing any of my own research nor of subjecting my conclusions to the grinding mill of peer review and publication. Scientists are like gardeners cultivating an orchard on very challenging terrain whereas I am like a child who comes by and plucks the juicy fruits that they have grown. It is this realization, above all others, that keeps me from becoming a kook.
Every scientist who achieves the least degree of fame (as well as quite a few science fiction authors who get lumped in by association) attracts the attentions of people who are uncharitably called kooks. Some kooks are, clearly, suffering from some form of mental illness. Having a brother who suffers from paranoid schizophrenia, I’ve seen what mental illness looks like up close and can well understand how such a distorted state of mind could compel someone to believe that he has the answers to the mysteries of the universe as well as an overwhelming urgency to share those answers with the world. Most people have an instinctual aversion to the mentally ill but the truth is that such people are blameless. They can no more help how they think than a person with cerebral palsy can help how they walk. Many kooks, however, are not mentally ill, per se.
We are a very creative species. You can see that sense of creativity in our effortless ability to craft improvised tools. Even young children can arrive at ad hoc solutions to challenging problems, when the appropriate tool is not at hand, with a degree of facility that would put even the most ingenious chimp in the world to shame. Our big brains are rarely idle. We seek solutions even when the problems facing us are abstract and even when we lack the cognitive tools to address an issue.
One of the defining traits of humans is that when we encounter a barrier or a limitation, we have a deep-rooted urge to surmount it. We can’t look at a fence or a wall without wondering what’s on the other side and nothing provokes our curiosity like the challenge of a locked box. This is every bit as true of natural barriers such as the so-called "sound barrier". As much as strategic considerations drove us to strive for the creation of supersonic aircraft, it is unquestionable that a major part of what motivated the engineers and test pilots who challenged the barrier was the simple fact that it was something that no one had been able to break.
This instinct to face and overcome obstacles has served us well over the millennia. A mere fifteen thousand years ago, we were slaves to the elements and the natural forces of the world. Weather, predatory animals, famine and disease were mysterious and largely implacable foes. The natural world seemed vast and inimical. Animistic religions may well have sprung from an urge to anthropomorphize the world so that we could, at least, negotiate with it. To be sure, people still get struck by lightning, mauled by bears, succumb to starvation and, of course, perish from illness but our ability to deal with these ancient woes is much closer to a state of parity if not outright mastery (the predators of the world have much more to fear from us than we do from them).
If we were not a species of problem solvers we’d still be in a state where thirty years was considered a long time to live. The fact that we feel cheated if we can’t make it at least to seventy speaks well of the indomitability of our wills. Unfortunately, this natural sense of challenge that we have can lead us down false paths. As much as it may gall us, there are some problems that literally can’t be solved.
For thousands of years people have been performing elaborate geometrical feats with nothing more than a straight edge and a compass. You can construct all kinds of polygons, such as triangles and octagons, as well as perform such geometric operations as bisecting a line or an angle and creating perpendiculars. Some operations, however, that seem like they ought to be possible have proved elusive. One such operation is the trisection of an angle. It’s very easy to bisect an angle, by which I mean creating a line that precisely divides any arbitrary angle in half. However, creating a pair of lines that precisely divides an arbitrary angle into three equal parts is more difficult. In fact, it’s impossible to do using only an ideal compass and an unmarked straight edge. This has been proven mathematically.
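For the curious, the standard impossibility argument (due to Pierre Wantzel, in the 1830s) can be sketched in a few lines. This is my own gloss on the textbook proof, compressed rather drastically:

```latex
% Every length constructible with compass and unmarked straightedge lies in
% a field extension of \mathbb{Q} whose degree over \mathbb{Q} is a power of 2.
% Trisecting a 60-degree angle would require constructing x = \cos 20^\circ.
% The triple-angle identity
\cos 3\theta = 4\cos^3\theta - 3\cos\theta
% with 3\theta = 60^\circ and \cos 60^\circ = \tfrac{1}{2} gives
8x^3 - 6x - 1 = 0 .
% This cubic has no rational roots and is therefore irreducible over
% \mathbb{Q}, so [\,\mathbb{Q}(x) : \mathbb{Q}\,] = 3, which is not a power
% of 2. Hence \cos 20^\circ is not constructible, and a general angle
% cannot be trisected with these tools.
```

Since one specific angle (60°) can’t be trisected, no general method can exist, which is all the proof needs to show.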
A mathematical proof is an absolute statement. There is no room for arbitration or subjectivity. Theologians have often developed a deep fascination with mathematics because it seemed to be a window into a realm of pure and abstract truth such as one might find in the very mind of God. When I say that it’s impossible to trisect an angle with these tools I mean just that. This hasn’t stopped hundreds, if not thousands, of people from trying anyway.
There is no phrase so provocative to the human psyche as any phrase that begins with the words “you can’t”. Such phrases immediately compel us to say, “Oh yeah?” and “Why not?”
We celebrate people who say that nothing is impossible. We cheer for those who dare to do what cannot be done. We scorn those meek and petty souls who would tell us that there are things that we can’t do. Who are they to tell us what our limits are? The sky is the limit and not even then! Right? Wrong. Some things really are impossible and trisecting the angle is one of them. People who submit solutions to the problem either don’t understand the problem or don’t have the necessary skills to understand why their solutions are flawed. Most of the people who make the attempt are merely ignorant and, with a little effort, can be made to understand why their methods are flawed. Some, however, persist in their insistence that their solutions are correct and that it is the entire community of mathematicians that is in error. These are the people who have crossed the line into kookdom.
A classic example of a subject that attracts kooks is the issue of perpetual motion. A perpetual motion machine is a hypothetical device which would generate more energy than was required to run it or, at least, the same amount of energy (a state known as unity). Because such a device would produce a surplus of energy, it could, in principle, use that energy to power itself thus creating a machine that would run indefinitely (hence, perpetual motion). There’s only one problem: the laws of thermodynamics prevent any such device from ever being constructed. In the real world, a machine would always lose energy to friction. Once the machine was out of fuel (or disconnected from its power source) it would invariably run down over time. The thermodynamic principles that mandate this have been well understood for centuries. Unfortunately, this is a case where mere physics gets in the way of sublime hopefulness.
The people who try to build perpetual motion machines are typically not stupid. Many of them clearly have a talent for engineering. What they lack is a basic understanding of physics. Most people who dabble in attempts to build perpetual motion devices quickly become discouraged. Perhaps they learn a little about physics or, perhaps, they simply become frustrated and turn their attentions elsewhere. Some become charlatans who claim that they have succeeded in order to bilk the gullible. A small minority, however, consider the very contention that such devices are impossible to be a slap in the face. Motivated by the highest human instincts they find themselves engaging in the greatest folly. Some of these people waste their entire lives in this pursuit. Although their spirit is laudable their actions, alas, are pitiable. There is no way to calculate how much raw talent has been squandered in the quest to achieve this mirage but there is no doubt that it is vast.
Some barriers offend us not simply because they represent abstract challenges but because they seem to impose direct limitations upon us. The speed of light is one such barrier. Our science fiction is filled with faster than light (FTL) engines for a reason. The alternative is the depressing thought that getting to even the nearest star would take an absolute minimum of 4.3 years. The very term “light barrier” implies that there’s something there to be broken. The term implies that it’s analogous to the sound barrier, only faster. It is, however, a misleading term.
The sound barrier was always understood to be an engineering challenge. No educated person actually believed that it was impossible to go faster than the speed of sound. Plenty of things in the natural world did so all the time (e.g., meteorites). Indeed, every time a bullet was fired or a whip was cracked, the sound “barrier” was broken. The only serious question was whether an aircraft could be designed that would exceed Mach 1 and maintain its structural integrity. Few doubted that, eventually, such would be done. It was a challenge but not an impossibility.
The speed of light is not like the speed of sound. The speed of sound is a local phenomenon that can vary according to the medium it’s traveling through (it’s faster through water, for instance) and such variables as atmospheric pressure. It is an engineering challenge because subsonic flight and supersonic flight have different aerodynamic characteristics. Creating a craft that can manage both environments is tricky.
The speed of light, by contrast, is a very different sort of phenomenon. Many people ask the naïve question of what would happen if someone were going ten kilometers per hour under the speed of light and they sped up by another twenty kilometers per hour. Surely they’d be going ten KPH faster than the speed of light. This is an intuitive thought but this is also a case where the physical world behaves very non-intuitively. Let’s say that I’m traveling at half the speed of sound and you fire a bullet parallel to me that’s traveling at Mach 1. From your perspective, the bullet is speeding away from you at Mach 1. If I measure its velocity, relative to me, as it passes by, though, it’ll only seem like it’s going half that fast. If we try the experiment with me going at fifty percent of the speed of light (relative to you) and you fire a laser beam, the results will be very different. You’ll measure the laser beam zipping away from you at approximately 300,000 kilometers per second. As it passes by me, I will also measure it as going 300,000 kilometers per second. It gets weirder. If someone else fires a laser in the opposite direction I will still measure its velocity as being 300,000 KPS.
This is strange but also true. A century’s worth of experimentation has demonstrated beyond any shadow of a doubt that this is precisely what is observed. To account for this weirdness, Einstein came up with the Theory of Relativity. In the framework of his theory, distance and time are not absolute quantities but ones that can be distorted by velocity and acceleration (as well as mass, but let’s not go off on a tangent). The speed of light remains constant for all observers. Although I may accelerate to my heart’s content, the photons streaming past me will always be traveling at a constant velocity. An independent observer would see me approaching the speed of light (relative to him) but continuously slowing down as I approached it, never reaching it. Like I said, non-intuitive but, nevertheless, well verified.
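The contrast between everyday intuition and relativity can be captured in a few lines of code using the relativistic velocity-addition formula. This is my own illustrative sketch (the function names and sample speeds are mine, not anything from the physics literature or this post):

```python
# Compare everyday (Galilean) velocity addition with the relativistic
# velocity-addition formula w = (u + v) / (1 + u*v/c^2).

C = 299_792_458.0  # speed of light in a vacuum, metres per second

def galilean_add(u: float, v: float) -> float:
    """Everyday intuition: velocities simply add."""
    return u + v

def relativistic_add(u: float, v: float) -> float:
    """Special relativity's composition of parallel velocities."""
    return (u + v) / (1.0 + (u * v) / C**2)

if __name__ == "__main__":
    # At everyday speeds the correction term u*v/c^2 is around 1e-15,
    # so the two rules agree to within measurement error:
    print(galilean_add(30.0, 10.0))       # 40.0
    print(relativistic_add(30.0, 10.0))   # ~40.0, indistinguishable in practice

    # Near light speed they diverge sharply: composing 0.5c with c
    # still yields exactly c, and no composition ever exceeds c.
    print(relativistic_add(0.5 * C, C) / C)        # 1.0
    print(relativistic_add(0.9 * C, 0.9 * C) / C)  # ~0.9945, still below 1
```

Notice that when u and v are tiny compared to c, the denominator is effectively 1 and the formula collapses back to simple addition, which is why our everyday intuition works so well right up until it suddenly doesn’t.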
This is precisely the sort of problem that attracts kooks.
“Sure, they say you can’t. Well, I say you can, and who’s to say otherwise, huh?”
In all fairness, there are respected physicists who have looked for ways to circumvent the limitations of the speed of light via such mechanisms as wormholes, space warps and quantum entanglement. Unfortunately, such methods tend to fall prey to practical limitations (proposals for space warps and wormholes involve the use of “exotic matter” that, unfortunately, only exists in our imaginations, as well as energies on the order of galaxies or, even, the entire universe) as well as more fundamental issues (traversable wormholes, for instance, lead directly to temporal paradoxes). My own suspicion is that the closer we look at such efforts, the more likely it is that we’ll find that they are disallowed. Nevertheless, it is conceivable that some such method might get us around the limitations of the speed of light. What cannot be done, however, is to simply accelerate something past the speed of light.
There are a lot of people who insist that this is just plain wrong. Most of them simply misunderstand the nature of the problem. One common objection to the speed of light limit is to suppose that you have a pair of scissors with very long blades. If the blades were long enough, simply closing the scissors would cause the ends of the blades to snap together faster than the speed of light – or so you might think. In reality, you couldn’t bring the blades together fast enough. As you tried to bring them together, the apparent mass of the ends of the scissors would increase, requiring you to expend more and more energy to accelerate them further. Ultimately the blades would either snap from the strain or you’d simply run out of additional energy (no matter how much you had). Nothing could cause them to go faster.
The other tack that kooks tend to take with respect to the issue of FTL is to simply insist that Einstein was wrong and that Relativity is in error. One of the hallmarks of kooks is that they tend to think of themselves as mavericks – lone voices of dissent standing against wrongheaded academic elitists. They will typically compare themselves to Galileo or Edison or some other paragon of invention and iconoclasm. There is something about such a person that appeals to us. Who is to say that some smart fellow won’t prove that Einstein was wrong? What do those Ivory Tower academics know anyway?
The sad fact is that science is an elite undertaking. There was a time, early in its history, when an intelligent amateur could make important contributions to the work of science. Although amateurs do still, occasionally, make some contribution to the venture, the typical contribution is on the order of a lay astronomer discovering a new comet or a gifted teen working out some novel but largely inconsequential mathematical proof. The days when a gentleman scientist could make significant discoveries using only an air pump are long gone. The major reason for this is precisely because it is much easier to make new discoveries when forging across virgin territory. It is also a truism that easy questions tend to be resolved before complex questions. As a result of these two factors, science has become more and more of a specialized and complex realm.
None of this is to say that laymen are forbidden from practicing science. There are no science police that are going to come and raid your garage/laboratory in search of illicit Bunsen burners. However, if you want to be taken seriously you’ll need to demonstrate an understanding of both existing theory and the methodology of science. Science isn’t just people in white coats peering at beakers and occasionally shouting “Eureka!” Science is a system of inquiry and exploration that has been proven, over time, to yield cumulatively more reliable models of the universe. If you challenge a theory without demonstrating that you first of all understand it, you aren’t going to be taken seriously. If you aggrandize yourself without bothering to go through the process of peer review, you’re going to be ignored and rightfully so. In the sciences, having a Big Idea is only the very first step. Demonstrating that your idea is true using the tools of the practice is the real challenge.
I do understand the temptations that turn otherwise sane people into obsessed kooks. A few years ago I was contemplating mathematical constructs that I called “urchins”. They were basically line segments intersecting a common point and having certain other characteristics. I was particularly interested in the forms that they would take when embedded in what’s known as a Hilbert space (loosely speaking, a space that can have an infinite number of dimensions). As I was doing this, I had what seemed to me to be a critical realization that I thought would lead to a solution to the Continuum Hypothesis. I was very, very excited about this since this was one of the really big problems in mathematics. For nearly a whole day I could barely contain my excitement and had to stifle the temptation to call a good friend of mine who is a professor of mathematics.
Fortunately I had the good sense to do some reading on the problem. I learned that I had, in fact, badly misunderstood what the problem was in the first place and that, moreover, it was one of a class of problems that could not be solved using any of the standard axiom sets. That moment of forbearance saved me from causing my friend to think of me as another annoying kook without a clue. Nevertheless, it was exciting. The thrill of thinking, even momentarily, that I had outsmarted all of those big-headed mathematicians was absolutely intoxicating. I can understand how someone could become so enamored of such a pet theory that they would refuse to believe that they could be in error. It’s an addictive proposition.
I think, ultimately, that the fact that science attracts so many kooks is, in an odd way, a validation of the potency of science. People appreciate that it’s an important venture and they desperately want to be a part of it. Too often this can lead people down a path of self-delusion where they believe that they have demonstrated their personal worth by making a significant contribution to that grand adventure. It’s very easy to laugh at kooks, especially when the alternative is to be overcome by a sense of exasperation, but most kooks are, I believe, ordinary people who have succumbed to an ordinary urge to be more than they really are. The tragedy of kookiness is that many kooks really do seem to have above-ordinary intelligence. I can’t help but wonder how many of them, with only the slightest shift in perspective and a little training, could actually become the sort of person they dream of being.
Saturday, December 04, 2004
In the United States, one of the traditional sounds of Christmas is the ringing of the bells of Salvation Army workers. These folks generally position themselves in front of major department stores and restaurants, next to a small, suspended pot, and ring their bells in the hopes that passers-by will deposit some change in the pots.
Recently, some retailers have threatened to deny these workers the right to do so in front of their stores under the theory that the ringing distracts their customers from the more important task of shopping. Generally, such efforts lend themselves to very bad publicity with the typical result being that the retailers eventually retreat from their position.
I saw an interesting permutation of this ongoing issue while shopping yesterday. A Salvation Army worker had set himself up in front of a Kmart; however, instead of ringing a bell, he had a small sign on a stick with the word "DING" on one side and "DONG" on the other. Holding the stick between his hands, he would rub his hands back and forth causing the sign to alternate between its two sides.
I am reminded of the aphorism that compromise is the act whereby neither of the two sides in a dispute gets what it wants, leaving both equally unhappy.
As a follow up, it gets stranger. According to an article I found on Yahoo! News, in some places, the Salvation Army hasn't merely replaced the bells, but has replaced the actual bell ringers with cardboard cut-outs.
Thursday, December 02, 2004
Far beyond the Singularity
The world will commence again, again:
Smart matter seeds
Strewn across a field of asteroids;
Collected and collated as much as harvested,
And filed away by certain old minds
With inexhaustible memories.
We can not understand why
They have chosen to resurrect the old world —
It is something they create and destroy
Every few tens of millennia.
Some say the variations,
Between one iteration and the next,
Suggest a sequence of science experiments,
Using worlds like cosmic test tubes:
No lesser tools could fit the hands
Of those posthuman gods.
Some say they grow nostalgic
For their once and future world.
Some say that it is a game, played
With rules more complex then the universe.
Maybe it's just art?
The world glows again, again;
Gaia lives, a rejuvenated virgin.
The Earth cools and cracks,
Breaking a new set of supercontinental plates.
Ecce aqua! Ecce vitae! Ecce homo!
The brutish beast, man
Is made the child of his own children.
And then it happens.
God begotten men once again
A new Singularity blossoms into the universe
And the new posthumans scatter from their world
Like diamond sand caught in a universal wind.
We envy them.
We are old.
We are tired.
We are human.
Into living storage
Half a billion years ago.
Is this punishment
For some nameless transgression,
Or are we merely here
Because of some forgotten whim?
Tuesday, November 30, 2004
Slang has been with us since the origin of language (I imagine our proto-sapien ancestors complaining about the way kids grunt these days). The internet, being a meme engine, has added a lot of fertilizer to the environment of niche phrases and words.
Fortunately, for those of us baffled by such expressions as "My hed is pasted on, yay!" and "Ba weat grana weep mini bon", as well as more mainstream offerings such as "fo' shizzle my nizzle" and "bling bling", UrbanDictionary.com gives us a way to keep up with all these strange linguistic permutations.
Monday, November 29, 2004
One of the first things I do, in the morning, is scan the headlines on Yahoo news. One, in particular, caught my eye:
IAEA Approves 'Non-Binding' Iran Nuclear Freeze
This refers to the International Atomic Energy Agency; however, my morning-bleared eyes read that as "IKEA Approves 'Non-Binding' Iran Nuclear Freeze".
I knew the Swedes made great, affordable furniture, but I had no idea that IKEA was so engaged in international affairs.
Sunday, November 28, 2004
Wired Magazine recently had an article on "Blogger Burnout". The article's focus is on why people who show initial enthusiasm for blogging often give it up not because they've lost interest in blogging, per se, but because they've simply reached a point where they're mentally and emotionally exhausted by the effort of posting entries.
Certainly I've seen signs of the phenomenon often enough. Some blogs start out at a furious pace, posting articles daily or even multiple times a day. Eventually daily posts give way to weekly posts, then monthly posts, then posts on an entirely irregular schedule eventually followed by a persistent silence. Alternatively, some blogs start out on a regular posting schedule and keep it up for a fairly long amount of time and then just stop, one day, for no apparent reason.
I think that one of the problems of blogging is that it's very easy to come to think of it as an obligation. Once that perspective creeps in, it goes from being a hobby to being a task. Essentially, it becomes an unpaid job. One might well continue to post out of a sense of duty to one's readers but that can only go on for so long especially since such an attempt to post out of duty becomes, itself, progressively more of a burden. After a while, it's inevitable that you're going to simply say "screw it" and walk away.
By like measure, a blog can be killed by a creeping sense of apathy. Most people who start up blogs come to the venture with a head full of ideas. Eventually, however, that initial stock of ideas becomes depleted and a person is left feeling that they either have to write about something that they don't really care about or, alternatively, not write anything at all. Many blogs die with a shrug.
When I started Unstructured Musings I was mindful of both of these tendencies. The first and foremost thing that I decided was that I would write to an audience but that I would also write for myself. I want this to be an interesting and informative place for anyone who decides to look in here, of course, but even more important than that is that this remains something fun for me to do. My blogging is very much a hobby and, as such, ought to be a source of enjoyment (and believe me, it has been).
I deliberately started out slow with the goal of publishing an essay once a week on Sunday along with scattered missives, as they struck me, over the remainder of the week. As I grew more comfortable with the weekly posts I eventually added regular "fun" posts on Tuesday as well as samples of my poetry on Thursday. I decided to add those in because finding cool things on the web is easy enough and I also have quite a large stock of poetry (and an utter lack of shame when it comes to subjecting the rest of the world to it).
My biggest concern has been the essays. I enjoy writing essays but essay writing can be demanding and I also recognize that every person goes through dry spells. My solution to this problem is two pronged. I publish my essays on a weekly basis but I write them as they come to me. Some weeks I'll only have a single essay come to mind, others I'll have three, and some I'll have none at all. The overall result of this is that I have a pool of pending essays to draw from. As I'm writing this particular essay, I have a further eleven essays in the pool "before" it (I don't always publish them in the order that I wrote them). As such, if I did hit a dry spell, I could go without writing anything for nearly a full three months before I reached a point where I wouldn't have anything to publish on Sunday. This also means that I can relax and deliberately take time off from blogging in order to allow my mental batteries to recharge, without any interruption to my readers, as I have already done at several points.
But what happens if I exhaust the queue? Well, if I run out of general topics, I can always resort to publishing book and film reviews (since I have an avid interest in both). If that grows boring, I can resort to the second prong which is, quite simply, to put the essays on hiatus.
I would still publish the Tuesday Fun and the Thursday Poetry Slam posts (giving my modest readership something to look forward to). In the meanwhile, I would continue to write essays as they came to me. If that meant going a year or two, so be it. Once I had another stock of essays stored up, I'd start publishing them again.
The bottom line, I think, is that a good blogger needs to be in control of their blog and not the other way around. If a blog starts to intrude upon family or work or starts to become a point of stress then, I think, the point of blogging has been lost.
I think that the very best blog strategy is simply to keep a sense of perspective. Being a blogger, there is a temptation to want to be a latter day Samuel Pepys but the truth of the matter is that the majority of blogs (and certainly my own) are only ever going to have a small audience. Some may find that disappointing but I, for one, consider it a liberation. I don't have a vast audience and I am not being critically judged by forces of history; therefore, I can write to my own interests and at my own pace.
I love blogging. I would encourage anyone who's interested in it to give it a whirl. All I would suggest is that you don't lose sight of your motivations for trying it out. Keep it fun and make it an enjoyment unto yourself.
Thursday, November 25, 2004
It's Thanksgiving here in The States. In honor of this day, I offer you this repeat. I hope that you enjoy it and, if you are one of the celebrants, may your turkeys be juicy and your stuffing moist.
A bloated thing
Blistered and blundering
To its sacrifice
Long hairs wicking
The scent of kerosene
Through wide nostrils
Of long knives
Carve the air
Fallen to its knees
Waiting for the blade
Neck raised high
Fires Burning as
The priests exchange
Blood for rain
Wednesday, November 24, 2004
We've all seen the standard maps of the latest presidential election that show a very "red" country with fringes of "blue" in the northeast and along the west coast. The implication is that only a few heavily populated Democratic states voted for Kerry and that the rest of the nation was solidly for Bush.
Michael Gastner, Cosma Shalizi, and Mark Newman of the University of Michigan have produced some interesting maps showing that the actual breakdown, once you look past the winner-take-all results of the electoral college, is more complex and interesting, and that the geographical political divisions are not so apparent.
Their results can be found at Maps and cartograms of the 2004 US presidential election results.
Tuesday, November 23, 2004
I've been a lexophile since before I can remember. If you love words, too, there are few places as conducive to that form of enjoyment as World Wide Words.
Every week, the site's author, Michael Quinion, posts fresh articles. His regular entries include topical words such as gravitas, weird words such as dactylonomy, a Q&A section where he attempts to trace the history of a word, interesting turns of phrase such as "echo boomer", and reviews of books pertaining to language, such as The Power of Babel.
The weekly content is also available via a mailing list. I recommend this if only to get access to "sic!", a collection of samples of mangled English supplied by his readers. One such recent entry was an advertisement for a "Canoe and wine tasting" with the snarky commentary, "Mmm, Cedar, fiberglass and a hint of mud, with an impudent brackish undernote."
My stock of fun sites is starting to run a bit low. If you have a site, or several sites, that you particularly enjoy, please email them to me at firstname.lastname@example.org for consideration. Any future Tuesday Fun entries that use your suggestion will credit you.
Sunday, November 21, 2004
Way back ago, this strip of beach was where the rich folk lived. Me and my kin lived inward to the slums. But that was before the Old Ones came back and all sorts of things came crawlin' out of the waters.
Ain't nowhere that it was good, after that, but the rich folk, as they do, moved to where things were, at least, a bit better. They moved in and we got pushed out to live next to the Deep Ones.
In the beginning there were some troubles. They raped some of our women and we hacked up some of them and ambushed a few more with shot guns. Eight months later some ugly babes were born and we bashed their heads in 'cause no one should be kin to things like that. After that, we had an understanding. They kept to them and we kept to us.
Ain't to say we haven't had an eye or two on them. When I was a boy, I'd sneak over by their camps and have a look in. Sometimes they'd be doin' these crazy rituals. Singin', kind of, and dancin' 'round this ugly statue like. Some times they'd go to the water and pull things out that'd make my eyes hurt. Most of the time they just fished and cooked and had chats with one to another as most folk will, though I don't really know what they'd say.
My grand daddy remembers from afore. Back then, he says, people could go where they wanted to. You could go up into the mountains and not get ate nor otherwise. You could go inland and come back the same person and not some sort of thing. You could walk from coast to coast, would you would. You could even fly, he says, but I think he was just havin' my leg.
I looked to the sky a once. There was something far way away. I couldn't rightly make it out except that it had a color that I couldn't put a name to. I looked to it and couldn't look away. I don't right remember what happened afterwards exceptin' that my folk say I was laid up for nine days with a fever or some such. I guess it was so 'cause they say it was so but I don't rightly remember except for the nightmares I sometimes have.
When I made twenty, my grand daddy gave me a thing he calls a talisman. It's a bit like a stick and a bit like a bone that's been bent funny. It feels sometimes warm and sometimes cold but always kind a sticky though it don't never stick to nothin'. He said I should always keep it and I always do.
No one goes to talk to the Deep Ones. Not never before, at least, though someone's just now come through who wanted to see them. He was tall and he had a robe like thing around him everywhere but his face. I think there was something in there with him but he looked most like a man, though you never know. I pointed him their way and gave my bye to him and wished him well 'cause there ain't no wrong in being nice... even if.
That was some hours ago and I don't know what happened to him. The Deep Ones are singin' loud and slow and there's a cold wind blowin' in from the sea. I don't expect it's nothing to fuss over, but my grand daddy's talisman is twitchin' which it ain't never done before.
I reckon I'll find out soon enough.
This story is part of the Cthulhu Mythos of H.P. Lovecraft. The idea of a story set after the return of the Great Old Ones was inspired by J.B. Lee's story For Here They Shall Reign Again... although this is entirely my own take on the concept.
Thursday, November 18, 2004
I'm still in the process of moving, so no poem for today. In lieu of that, here's a rather charming rejection letter that I got a while back from Tomorrow SF.
Thank you for showing me "The Buddah in the Bathroom." It had many virtues, but selling to our audience is oftimes a matter of being clever in exactly the right way.
Sunday, November 14, 2004
I am in the process of moving, this week, so I'm afraid that the blog will have to go without updates until we're done with that. I expect to have fresh updates by no later than next Sunday.
I appreciate your patience.
Thursday, November 11, 2004
To your dog, You are God.
You are the Bringer of Food,
The Opener of Doors,
The Summoner of Strange Lights
You giveth him pleasure
When You stroketh his fur,
You taketh him to far places
Where he may frolick and run,
You maketh him to play fetch
In the yard and in the park.
You are the Lord of All Things
That Squeak, Roll and Can Be Chewed,
It is You for whom he wags his tail.
It is You for whom he waits,
In lamentation and vigil,
When You leave in the morning.
It is You that he loves
To the brim of his canine heart.
When you are a beast to him,
It is damnation.
Tuesday, November 09, 2004
There are a lot of "weird news" sites out there. What sets This is True apart is not only the wit of the site's owner, Randy Cassingham, but his passion over such issues as the foolishness of zero-tolerance laws, his advocacy for tort reform, his efforts to help people fight spam, and his dedication to acknowledging the lives of the unacknowledged who, nevertheless, made a difference.
He also sells "Get Out of Hell Free" cards.
Sunday, November 07, 2004
One of the questions that atheists get asked a lot is the question of how we deal with death. One of the most common replies is that not existing after you die is no different from not existing before you were born. I think that this is a valid answer but I must admit that it also seems a bit glib.
I think that there is something disturbing about the notion of simply not being anymore. Part of that is purely psychosomatic. The brain wants to imagine an impossibility; it wants to imagine what it’s like to be dead. If death is, indeed, simply the terminus of being, then there is, quite literally, nothing to imagine, but it tries to do so anyway. What we end up with is the notion that we’ll be suspended in an endless darkness. Indeed, many of the early conceptions of the afterlife had precisely that kind of imagery, often coupled with the notion of being in an underworld (after all, dead bodies are often put into the ground, so an underworld is a logical place for the souls of the dead to go).
It is here that the comparison with our state before birth does help. Before we were born, we didn’t experience a grueling wait of billions of years to be born. We simply didn’t exist. As an atheist, I strongly suspect that this is the state of affairs that will follow my life. That, in itself, is a terrible notion. The idea that all of my experiences and memories, all of my beliefs and hopes, all of the things that make me a unique thing in the universe will perish is tragic. The fact that it’s a common tragedy that is repeated hundreds of thousands of times a day does not diminish its magnitude. Death is unfair, and we are right to think of it as life’s greatest enemy.
I have heard arguments that try to frame death as the complement of life. I can go so far as to admit that I would not desire immortality. A life that could not end would ultimately extend beyond the point where every possible state of mind had been experienced. Beyond that point, existence would become a vast redundancy. The quality of life would be reduced in an endless cycle of recapitulation. Even the most joyous of possible lives would become a kind of hell in the long run. The problem with death, then, is not that we die, but that we die too soon.
The average person gets about eight decades worth of life (discounting those who die young). The first decade is spent growing out of childhood. The second is spent growing into adulthood. The last decade is often spent in a state of mental and physical decline. This leaves us with a mere fifty years worth of time to live as qualified adults. In that time, we usually have the chance to pursue one or, maybe, two careers. We may take up a half-dozen hobbies. Most people only ever become experts in a single field with a few prodigies managing expertise in up to three or four subjects. In our lives, most of us only come to be good friends with, at most, a few dozen people.
I like to imagine a world where any person who was so inclined could study every single subject known to humankind to a PhD level of proficiency, where every craft and hobby could be tried and mastered, and where we had the time to know and love every single worthwhile person. Against this supposition, the span of our real lives seems paltry indeed.
This grim limit is compounded by the fact that death isn’t simply a point that we reach. We die by degrees. There’s a profound moment in the animated adaptation of Peter S. Beagle’s The Last Unicorn where an immortal unicorn is transformed into a mortal woman in order to save her from peril. She takes the form of a young, beautiful woman, but her reaction is to hold herself and to cry that she can feel herself dying.
She’s right. By the time we become adults, our bodies have already started on the long, slow slide to ultimate failure. As we live, we accumulate irreparable injuries that place mounting stress on the complex systems that keep us alive. It is, in fact, remarkable that we can go as long as we do without dying. Most organisms have life spans that are measured in years, not decades. Our only competitors for longevity, in the animal kingdom, are some species of parrot, great tortoises and, perhaps, certain species of whale. On the scales of the natural world we do live long, just not long enough.
In many senses, we live too long. Outside of accidents, death rarely comes easy. Dying is often a cruel affair as one faculty after another succumbs to failure. Too often we are reduced to a nearly infantile state where we can’t even manage the basic necessities of personal hygiene. We joke about adult diapers because they are a humiliating, awful reality, and humor is one of the few tools we have to cope with their terrible necessity. Worst of all is that many of us will have to experience the gradual loss of our very minds, the very things that make us who we are. The notion that our very self can be slowly ground to dust is almost beyond contemplation.
It is for these reasons that I can consistently say that I don’t believe we live long enough while, at the same time, asserting that I am more afraid of living too long than of not living long enough. It is my sincere hope that when death does come for me, I will be one of the lucky few who will be fortunate enough to have a death that is quick, painless and dignified.
I do understand why people consider death to be a challenge for atheists. Indeed, I would admit that, for me at least, death is a challenge. I view it, however, as a challenge to accept the world as it is. I do not want to believe something because it is comfortable and soothing to believe it. I have long asserted that reality has no obligation to match our expectations. The notion of an afterlife free from the depredations of aging and the limitations of premature death is certainly a tempting notion. I do not believe that afterlives are part of the real world, however.
I believe that death is unavoidable. I have confidence that our technologies will continue to improve and that our capacity to put off the end will increase over time. I hope that, eventually, we may be able to grant ourselves truly full lives and that, in the meanwhile, we will do what we can to make the process of dying less painful, less frightening and more dignified. I also think, more than anything, that the fact of death should motivate us to cherish our lives, and the lives of those around us, while we have them. Our lives are, quite literally, irreplaceable. Squandering a life seems, to me, to be very nearly an act of criminal irresponsibility. It is for this reason that I think we should strive to live them to the fullest of our abilities and to help our fellow human beings to do the same.
Thursday, November 04, 2004
There were five Zoroastrian gentlemen
Waiting to talk to you today.
They were going to tell you all about
Ahriman and Ahura-Mazda:
Acting like opposing plates
On an infinitely long scale.
They were going to explain
That every person,
By their acts and words,
Is like a tiny weight
Placed to one side or the other
Of this great device,
And that the future may well fall
To either side,
So do well and beware.
They were going to tell you —
Well, many things.
I was taking notes,
But the ink got smudged.
You would have seen them,
But you were so wound around
Your own worldview,
Not unlike a certain serpent
Around a certain forbidden tree,
That you missed your chance
To meet these five gentlemen.
They would still like to see you.
Should I take a message,
Or would you rather
I send them away?
Tuesday, November 02, 2004
In the United States, today is Election Day. In addition to making choices concerning propositions, state amendments, and sundry political offices, this is also the quadrennial election of the presidency.
Unless you've been living in a state of sedation for the last year (and lucky you if you have!), you are doubtlessly aware that this is an intensely close election. So close indeed that the infamous undecided block will play a critical role in swinging the vote one way or the other.
If you do happen to be one of those undecideds, I thought that I'd help you out a bit by pointing you to the Political Compass. The Compass won't tell you how to vote, but, if you take its test, it will give you a clearer idea of where you stand politically. (In the interest of full disclosure, my coordinates are (-1,-5) — which will make a whole lot more sense to you once you get your own results.)
If this still doesn't help you decide, maybe this latest cartoon from JibJab will do the trick.
Sunday, October 31, 2004
In many of the circles I've associated with, there has been a certain cachet to embracing a stance of cynicism. The thought behind this seems to be that cynicism is an emblem that proves that one is wise in the ways of the world and not prone to naiveté. Idealism, by this view, is a kind of hopeful foolishness where one embraces futile wishes in the face of the evidence of the real world. Cynics, by contrast, are imagined to be level-headed people who realistically accept the harsh realities of a cruel and indifferent world.
I will freely confess that for many years I held just such a perspective on life. In my own experience, good intentions rarely amounted to anything and I found that when I expected the worst I was rarely disappointed. My cynical stance certainly seemed to be a better representation of the state of the world as a whole than that found in Utopian fantasies. I had a much easier time believing in dystopias.
It has only been in the last few years that I have come to believe that cynicism is, in its own way, every bit as naïve as the worst sorts of idealism. Moreover, I have come to a new appreciation of what idealism can offer when tempered with the proper perspective.
It is said that every cynic is a disappointed idealist. Certainly that was true of my case. When I was young, I had any number of political and social notions that were exceedingly optimistic. I liked to believe that people were, at heart, good and that most people, given the chance, would behave altruistically towards their fellows. By and by I came to see that such a hope for an altruistic human nature was not supported by the evidence. Humans are not angelic beings and, given every opportunity, we do, indeed, behave in a self-interested manner. History has proven, again and again, that attempts to form utopian societies founder against the rocks of human behavior. They may work for a time but, ultimately, they collapse due to the basic selfishness of the individuals who comprise them.
How is it, then, that I have come to reject cynicism and reestablish myself as an idealist?
One of the truisms of cynicism is that people, being base entities, never change. How is it, then, that societies change? More to the point, how is it that societies ever progress? A little less than one hundred and fifty years ago, slavery, in my nation, was a legal institution. A bit over eighty years ago, there were still states where women did not have the right to vote. Forty years ago, black Americans were subject to a form of apartheid. How did these evils ever come to an end? I found that a truly cynical worldview simply could not account for them without all sorts of ad hoc justifications supposing that the proper confluence of wrongs could, sometimes, generate a right. The reality, however, is that a great many people working from a set of convictions that could only be described as idealistic fought long and hard battles to bring these events about. The path to improvement was rarely straightforward and often required a descent into the worst realms of human behavior (as evidenced by the church bombings that punctuated the civil rights movement's struggle), but that only serves to underscore the depths of conviction that were required to bring about these changes.
There is such a thing as naïve idealism. Any view that thinks that people are going to be good for the sake of being good is bound to fail. Attempts to build systems around such hopes are not only unbearably optimistic but, often, dangerous. One of the basic failings of Communism was the assumption that people could be motivated to work for the good of the community without any compensation beyond the knowledge that they were helping their fellow human beings, and without any desire for personal status or material reward. There are many reasons that Communism doesn't work well in the real world, but central to the majority of its failures is the simple fact that people aren't like that. Communism turned out to be a Utopian dream and a real-world Hell. Too often idealisms turn into ideologies which, in turn, lead to all sorts of evil. Ideologies tend to become perversions of themselves precisely because they enshrine ideals above mere reality. Once an individual or a group severs ties with reality, it becomes very easy to justify evil in the name of a cause.
But just as there is such a thing as a naïve idealism, so is there a naïve cynicism. The notion that human beings are uniformly bad and ultimately selfish is every bit as wrong as the notion that humans are uniformly good and ultimately selfless. Cynicism, taken to its logical ends, prevents us from striving to be anything more than we are because it denies that we have a better nature to aspire to. Too often, cynical anticipations become self-fulfilling prophecies. A cynic may well feel smug when good deeds come to naught, but a world of cynics would be a world trapped by its own expectations.
I have come to believe that the major failure of idealistic philosophies is the perspective that Utopia is a place that can be reached. A naïve idealist who thinks that perfect justice can be obtained must either sink himself into a state of permanent self-delusion or succumb to an admission of error, which can easily lead down a slippery slope towards a state of abject cynicism. An informed idealism, however, would see that the notion of perfect justice is no more obtainable than being able to reach the place called "up". However, taking justice as an ideal, one can move in the direction of it, just as one can move upward without ever actually reaching "up".
Ideals are not real things. One does not have to be a cynic to appreciate this fact. This does not make ideals worthless as ideas. Even if we cannot have perfect racial harmony, we can hope to achieve a minimal amount of racial disharmony. Although sexual equality may well be beyond our human capacities, we can strive to diminish the sexual inequalities that face us down to a negligible insignificance. All that is required to be an idealist is the belief that we can be better and that we should be better, along with the willingness to strive to become better. Although the cynics of the world may well insist that such a task is perfectly futile, history is on the side of the idealists.