Tuesday, December 2, 2008

Structure projects

Anthony's presentation was beautiful - a multi-layered mixture of old and new, and in my attempts to avoid working on my paper for a few minutes, I thought I'd expound on playing cards/tarot cards. Indeed, I need a place to dump this information so that I can clear the space for all the projects we have due in the next 7-8 days. So, with your permission, I will give further explanation of the deck. Tarot cards probably originated in Italy in the 14th century, since their earliest recorded usage dates to 1391. One story is that the Catholic Church, which forbade their usage, considered divination a criminal act (Thou shalt have no other gods before me) but, ironically, kept the cards locked away. The actual deck of playing cards is said to be the tarot deck, minus the pictures - which kept the cards alive while subverting the wishes of the Catholic Church. Indeed, there are 13 cards in each suit of 4, totalling 52 cards, just as in the original tarot deck (once the deck was taken to France, the number of cards increased to 78). Another theory is that the tarot deck was in fact too powerful for the lay person and the Catholic Church felt the information needed to be controlled. One has to wonder, though, since they did not exactly burn the cards.

The deck of cards originated as a tool to make calculations and estimations of time and the movements of the Earth and other planets around the sun (which ties in again with the structure of a calendar in Anthony's project), and because they have been mentioned in every major civilization, it is possible they existed even in Atlantis - though Atlantis itself is quite mythical. Plato first mentioned the island in his dialogues Timaeus and Critias, but its existence has never been proved. And with that, I bid adieu to the blog! I have to go anyway - The Magnificent Ambersons is on and I can only do three things at once. Thanks to everyone for your projects - I've learned so much.


Tuesday, November 18, 2008

Lights...Camera...Conference!

I love peeking behind the scenes.  Seeing the construction.    The studio visits in 804 were one recent opportunity.  Cynthia's re-re-presentation of "Avatar Nation" was another—although this might be less behind-the-scenes and more behind-the-process.  The revisions, for me, were the main attraction.  The most substantial remix involved the images, video, and machinimas.  They were much more integrated into the presentation and yet much more autonomous from the oral presentation.  What I mean is that they were not illustrations of written examples.  They were not iterations of talking points.  They were not set up.  They were not explained (at least not explicitly until the Q&A).  They were allowed to communicate as images.  Fluid ruptures, in a sense, from the spoken word.

In some of my work—my manifesto, papers for 801, blogging in 804—I've been a little obsessed with the relationship between word and image.  They're starting to seem like siblings now: sometimes at play together, sometimes punching each other in the eyes and the mouth.  Image would be the elder sibling.  Word the bigger sibling.  If I've become one of their countless parents, then I've been sending Word to his room a lot.  Word needs a time out.

As an academic, how do I give Word a time out at a conference?  Everyone expects a reading or, if image is at all considered, a PowerPoint (Image's recent best friend, popular with all the popular people, but starting to seem, on good days, like an underachiever, and, on bad days, like an abusive boyfriend).

Maybe the World of Warcraft virtual conference is the answer.  Here the image, and the game, will always keep the Word in line.  The drawback, though, is that for all its many wonders, WOW is terribly austere in the wonders it allows its players to create (here too is my major critique of claims that the game is anti-establishment; it may be, if we define the establishment as everything except the world the game establishes; within that world, the designers have made it very difficult to rebel; carnevale only occurs when Blizzard hangs the decorations and hands out the masks).

Considering another, more open world like Second Life reminds us of the allure of WOW and makes its loss almost intolerable.  In Second Life anything goes, so much so that nothing goes.  The world thrives on user-generated content, which flourishes outside any narrative.  There is no game.  There is no frame.  There is no structure to the aesthetic.

If we want to reground the conference presentation, virtual worlds may not necessarily be the new dirt, but they certainly offer a valuable vantage point.  If image and word are to cooperate, narrative, and therefore performance, must return to the stage.

Tuesday, November 11, 2008

Structure Project

I'm thinking about ways to look at the Myers-Briggs personality type indicators for my structure project. I worked in an office where people were defined by their Myers-Briggs indicators and their actions were scrutinized based on their four-letter diagnosis. Like Wendy, I am also considering a game format...perhaps a maze...made of cubicles of course:)

Did I miss the point or did he lose it??? The point, that is

Wendy stated in our class with Christina, while we were viewing black and white pictures taken in (I forget where), that the pictures seemed sad and devoid of life (forgive me...this is not a verbatim account). Upon reading Ulmer, I share the same sentiments to a degree. Ulmer's emonuments seem sad and devoid of life in that the examples he brings to us time and again focus on disasters and the questions/desires that stem from them. Car accidents, child murders, 9/11...oh my! He states that emonuments must have a punctum...that they must create within a viewer a piercing that opens them to the experience of what they are taking in. I have no doubt that Ulmer privileges death as the ultimate punctum. Though he allows us to use emonuments for our own purposes, it does not offset his example of tragedy as an exemplar: a model/original/archetype.

What of color, sublime life, and dancing?

Though he does not state that this should not be the subject matter, neither does he explicitly imply that it should be.

Given the restriction of the form as a monument - which entails a looking back - I wonder if it doesn't have the capacity (which I think it does) to witness, through a pleasant punctum, that which is all around but, due to its encompassing presence, not always acknowledged. Perhaps he believes that should be relegated to the arts: poetry and the like - all forms that he taps into to inform his conception of emonuments. Perhaps it is not his intention to deal with the dead...though given that a part (I hesitate to say most) of the theory informing his emonuments comes from dead people, then maybe it was.

Though he states that emonuments are for the living - a dialogue between the living - I can't help but see the parallels between this conversation he wishes us as egents to have (supposedly new and radical) and the conversations that have gone on before. Essentially, it is a superimposition of traditional exhibitions onto an electrate medium/media.

It seems that Ulmer might have gotten away from himself in over-theorizing simple traditional tenets that have been dictating the creation of alternative forms of communication, as witnessed in the arts through the ages. I wonder what Christina or Varnette Honeywood (artist), or Picasso, or Gertrude Stein, or Wanda Coleman (poet, screenwriter) would have to say about his work?
Would their collective response be:
Why / Y you've stated what we've been about all along...(aside from all the death and dying that is)

Monday, November 10, 2008

Where do I begin? Ulmer has given us so much to think about. I'll begin by confessing that every time I sit down to read EM and consider the ramifications of MEmorials, I am forced to think about a tragedy that occurred in my family just over two years ago. It was the Friday before Labor Day and time for the annual family reunion on my mother's side in celebration of the Roseboro clan. My four year old cousin, Assadi (Arabic for Isaiah) was playing around in the driveway as my mom's nephew, Assadi's dad, Isey (as we call him) was cleaning out his car and getting ready for the drive from Roanoke, VA to Baltimore, MD where the family reunion was being hosted that year. Somehow, Assadi found my cousin Isey's loaded gun under the driver side seat and shot himself in the head. He was killed instantly. The whole situation was tragic and pointless. Everyone was sickened by the news. Why did Isey have a loaded pistol in his car? How did Assadi get to it so quickly? Suddenly, my grandmother's house, where the shooting took place, was the site of breaking action news. Family and friends forwarded online newspaper articles for details. It was surreal. Because Assadi's mom is a Muslim, he was buried within 48 hours of his death, so no one from our side of the family had a chance to say goodbye. We were all so shocked and trying to pull it together. Before we knew it, it was done. That weekend the family reunion went on as planned in Baltimore without those family members immediately touched by the tragedy. To this day, I'm not even sure if there was an official mention of it at the actual reunion that year.

The following year, another family reunion was held in Charlotte, NC and I decided to attend. Strangely, eerily, there was no mention of little Assadi. Not one. I mentioned the omission to my mother, but that was it. It was like it had never even happened. I wondered about this to myself and believe I might have a clue as to why our family could not bear to utter a word about the tragedy. That year, the reunion committee produced a play about the origins of the Roseboros, which is traced back to Rubin Roseboro. While Rubin was not the first Roseboro out of slavery in our family, he was the first to own land. For that reason, he's noted as the founding patriarch. It is that piece of paper signifying Rubin Roseboro's ownership of land which makes him memor[i]able. Rubin's father, on the other hand, was the first member of the family to be emancipated, though he remained landless his entire life. As a result, Rubin's father has been deleted from the official family narrative. I think this connection between Rubin's landless father and Assadi is significant.

Like Ulmer, I see an opportunity for interface between these two forgotten Roseboros and believe their ultimate obscurity is connected. I think the [black] family [reunion], if it is to survive as an institution, has to move away from the old Y formation of the "family tree" and toward a rhizomatic structure that can systematically situate these "sporadic" figures like Assadi and Rubin's father as entities that are lasting and consequential in their own right. Annual conventions will no longer do the work needed to keep large extended families together. Shameful episodes, such as the senseless death of Assadi or the obscuring disenfranchisement of Rubin's father, cannot be fully developed, and therefore adequately assigned meaning, through the old mapping structures associated with familial connections. How many times have you looked at a family tree to see boxes filled with the names of those prematurely dead family members who never lived fully enough to reproduce (either through sex or acreage)? Think of how the eye lingers on those stunted branches. What will future generations make of the Bradly McGee box that never branched out?

The vantage point of the screen allows a space for us to make sense of this lingering look. Through the electronic sphere we can construct the worlds that might have been inhabited by Assadi, Bradly, and Rubin Roseboro's father and, as a result, possibly make this one better.

Tuesday, October 28, 2008

Ulmer's Ch. 5

In Ch. 5, Ulmer discusses the need to register an American national identity using the internet as a "prosthetic unconscious of a virtual America." Here the internet can function as a living, thinking, feeling monument. By the end of the chapter, he observes that there are multiple instances of loss that could be commemorated but are not, and that by recognizing them, we bear witness to the sacrifice. I thought about all of the multiple sacrifices that we "write off," such as the Amber Alerts, which are so upsetting and impact everyone who sees them on the highway, but which often leave us hanging - was the child found? Or house fires where parents cannot save all of their children, or hurricanes, where residents are often told to leave their animals.

Electracy helps us revisit our connection to disaster. And it is interesting that some losses are recognized as sacrifices (those of war, for example) and others are not. All life should be valued to the extent that it is memorialized, even outside of the family network - how much would we have to wake up if we realized that child abuse, which Ulmer acknowledges, is more heinous and insidious than we currently acknowledge? Could we then make changes?

Structure project: ?? Probably a game. Don't know any programs so need to research it.

Tuesday, October 21, 2008

Rethinking Partial Sight

I had never considered "figurative blindness" before, or a rethinking of partial sight, but I felt like I learned something I should have known, or at least considered, before today. The notion of a space situated between vision and no vision seems murky - how does one build a Design Research Project on such a space? And yet I now understand that the lack of resources to address the aspects of partial sight, including the fact that such sight may deteriorate over time, isolates these individuals and can bring with it depression. The need for designers to have empathy, to contribute rather than react, to create graphic and textile communication seems imperative, but the one aspect I could not find in the reading was the specific action they took to address it - what, other than books on tape and corrective devices, does the partially blind person use to bridge this gap? I kept reading for the exact specifications of what they had used to design for multiple communication devices, and didn't find it. That was the only aspect of this piece that disappointed me, because it did seem that they addressed the issue - but I am still not quite sure how....

Tuesday, October 14, 2008

In language, poetry has the quality of the punctum. In other words, the reader is rewarded for his/her savoring of language by allowing that language to intrude into one's sense of timing.  The breath and the heartbeat are invited to participate in the word structure, thus allowing language itself to penetrate the body. When that resolution moves the lingering reader into dangerous territory, poetry is all the more gripping. To really read poetry, one must commit her/himself to a luxuriant, almost decadent expenditure of attention to language.  Poetry requires a slowing down and a lingering gaze before accepting some conclusion -- of course Barbara Maria Stafford spoke of the notions of "slow looking" and the "lingering look" when she was here last week, so I also have her thoughts in mind.

In this poem, Campo performs a stretched sonnet that forces us into the psyche of his persona narrator, or Jerry's friend. I think this is why poetry has such a unique capacity to memorialize. 

Revulsion by Rafael Campo

I think her name was Carly - no, Charlene.
So fucking beautiful, the way she laughed,
a hardness in her face that seemed so soft.
She picked up Jerry real quick - I mean,
without his knowing it - they dated three,
four months. He kissed her in the parking lot
one night in front of all our friends. We thought
she was a woman, too. Eventually,
he wanted more than just a kiss; she played
miss frightened innocent until he forced
his hand inside her dress. Her bloody face
was in the local newspapers next day,
beneath the one-inch headline MALE PROSTITUTE
FOUND DEAD. I recognized her, sure I did,
but I would say she got what she deserved -
I mean, she was a guy, a fucking fruit.

I am currently using my Digital Remix project as a demonstration of media's/images' ability to numb the mind to imagery and ideologies that--perhaps even 20 years prior--seemed repulsive. This idea is particularly interesting in instances like news media (as Ulmer points out) or even in "reality" TV (as Nicole S. points out) where the audience is supposed to see something "real" in what is presented. Most audiences recognize the concoctions (often somewhat ridiculous ones) formulated in reality TV--the human sensorium that Ulmer suggests probably plays a role in this--but many, even educated, audiences often forget to critically reflect on the news media and its relevance to "reality." Interestingly, this "reality" becomes spectaclized in order to "maximize [the audience's] desire" (83), rather than to present a Truth. This spectaclization then becomes the norm. Just as we see Hollywood expanding the boundaries of what is acceptable to view in terms of nudity, profanity, drug use, or any other previously-deemed unacceptable behaviors and images, we also see the news media reaching further to aggrandize the dimensions of reality, perhaps stretching the truth even further in order to present a spectacle that we--society--are entertained by.

self "becoming image"

Ulmer speaks of "becoming image" and the difference between one's self and one's image much along the same lines as Lacan and his notion of the child looking into the mirror. Ulmer, though, takes this thought further and begins a conversation about "perfect self-presence" through the self "becoming image." This chain of thought amounts to autocommunication that allows one to see one's self seeing one's self, which is essentially the meta-awareness of the act of seeing oneself.

This type of autocommunication I believe has been continually exploited by those who create and produce reality shows.

Through creating the image of reality by the lack of scripts and the casting of people who the target audience can relate to in one way or another, reality shows work to allow people to "watch themselves" in a role of "perfect self-presence" personified by the cast. This phenomenon then becomes a type of autocommunication...a narcissistic endeavour that allows one to participate in one's own self "becoming image."

Death might not make WOW a world

Death on a screen, whether it be a computer or a silver one, often works to represent death as consequence. Essentially, death itself is not the focus of the viewer/player. It is the effects of that death on the current situation which are the point. And so death is seen as an agent of causation and not an effect.

I came to this notion in thinking on the aesthetics of death as described in Klastrup's article on WOW and in thinking about how death is displayed in the various movies/TV shows that we watch. These thoughts led me further back to plays such as "Death of a Salesman," but here I have to stop b/c in thinking on this play in particular I am forced to look back toward "American Beauty," which deals with death in an altogether different fashion. Death in "American Beauty" is the end...the effect...and even though we briefly see its effects on the other characters, the emphasis lies not at that point but on all the other points, or causes, that lead up to it. This is important and perhaps why the movie was such a hit, b/c it demonstrates death acting as it does in the "real" world. What is death but the ending point, the summation of all of the causes that we experience leading up to that point?

So why is it that death in WOW and other videogames acts as a cause? Because, perhaps, as Klastrup points out, there has to be an incentive for a player to play better. If this is so, can we then state that death is part of what makes WOW a world? I ask this last question because death is positioned in these games as counter to its function in the real world.

Permission

So I have some ideas with regards to my digital remix project and my final emonument that entail the use of my fellow class members' identities and their actions. Essentially, I'd like to have your permission to use you in my two projects that have to do with reality TV and reality. Your role in the projects would be to represent reality and would be used to enact a stark contrast to those entities that comprise reality on TV. So please respond to this post with a yes or no so that I know what I'm working with. Gracias!

Tuesday, September 30, 2008

World of Deathcraft

The "death penalty" of games: yes, that certainly makes sense.  Virtual worlds must punish.  Virtual worlds built around the concept of war must punish with death.  A few alternatives:

-surrender of land
-occupation
-embargoes
-sanctions
-broken families
-incineration of cities
-torture
-slavery
-inflation
-physical disability
-PTSD
-imprisonment
-exile
-scorched earth

But these are much more messy.  Death is easily inflicted and overturned in a virtual world.

Klastrup's "Note on Death and Dying" in WOW calls attention to the construction of mortality and its interface (so to speak) with players.  Her most provocative findings/conclusions are: 1) it can teach through incentive; 2) it is an event (she makes this claim indirectly through Van Gennep's idea of a "liminal phase" and her rebuttal of the player who described battleground death as a non-event).

From this I'd like to offer one re-rebuttal and two questions:

Re-rebuttal: Death is an event in WOW but not a death event.  I will ignore PVE and PVP and Battlegrounds and get right to the most narrative-worthy material: the Leeroy Jenkins death--dying under the high-stakes conditions of a group run/instance/quest.   As Klastrup indicates, this kind of death does not only produce fools.  It makes heroes too (although I don't agree that these "valorous" deaths parallel WWI accounts--think of Wilfred Owen and the trenches, and the comparison falls apart).  My point is that this kind of death focuses much more on shifts in social standing than on a bodily event.  The group, if wiped, re-spawns and retries.  Nothing physical (besides the durability of armor) transforms.  Death, on the other hand, is a fleshy event.  Without any permanent material rupture, death be not found.

The game should be approached--as should most games--as a combat simulator without killing.  "Permadeath" should not describe "death."  "Death" should.  What happens in WOW is more along the lines of involuntary teleportation--a non-non-death.

Question 1): Does this really matter?

Question 2): If it (death or the non-non-death) is a powerful incentive to learn, should we bring it into the classroom?

Tuesday, September 23, 2008

Need for Consensus/Poynor

I remember watching a brief documentary several years back where business owners were bemoaning (aren't they always?) that the most recent college graduates had yet to learn how to think for themselves - that they were too used to working in teams. As a result of the way they had not been weaned, businesses were taking it upon themselves to teach the graduates how to be individuals. When I read Rick Poynor's article, in which he quotes a writer who stated "the time for being against is over," and "I do not want to separate. I have no interest in being against..." I immediately thought of those businesses in the '90s who were complaining about the lack of individualization. Then I realized how often we, as teachers, had been forced to "team teach" ideas and to show our own growth in teaching by putting students into groups and forcing them to get along - encouraging it as good behavior. Look what we've done! We've created a generation that literally clings to each other and doesn't wish to make waves.

Poynor nails it when he states that the critic must practice more than being a supporter and advocate, or there is nothing over which to find fault - and therefore no criticism. And yes, this world he grew up in, that had engaging writers who read across a range of cultural fields, who sought a broader audience of intelligent, thinking individuals - the "public intellectual" as he puts it - has dwindled to a pittance in the mainstream world, which is one of the reasons I love being in a college setting. I still see the individual in education, but mostly at the university. Perhaps, and I would love to be wrong on this one, it does exist in the workplace for others in our group? Were you lucky enough to be with strong, intelligent people willing to criticize the design, or culture, or pop art, etc...? That has to be a gift one can give others - the ability to value yourself enough to stay mentally alive, and to encourage it in others.

As for our next project, I'm still thinking about it. I would like to do a poster because of the strong visual impact, and am willing to learn a new program. I just hope I've picked a program (Illustrator) that will actually be what I need after I spend several hours learning it. If not, then I'll make whatever it will let me make.

Wednesday, September 17, 2008

Religious War

After the Dow dived 449 points today, I found myself wondering how we got into this mess. I've read the articles and there are some strong opinions, but what I haven't read, what seems to be eclipsed in all of this turmoil, is that on 9/11, we lost some of the brightest financial minds on Wall Street. (And then, I see I've used the word "lost" and I'm annoyed by it - it isn't that we lost them, they may have suffocated, or tried to find their way down stairs when the building collapsed on itself, or thrown themselves out of a window with a computer in their hands as buttress, but they weren't lost). We're aware of the immediate impact of their death, and of the sheer strength many used to pull us out of it, but can it be that the poor decisions we are now faced with are a direct result not just of sub-prime mortgages and CEO over-compensation, but of not having these people, their expertise, sitting in the chairs that had once guided our investments?

I found myself grieving today for these victims, for all the victims, and I realized that seven years ago, I knew we had just lost a fabric of our society that was irreplaceable, and that the damage would be severe. I can't say in all certainty, but is it possible that the investors who succeeded them just weren't ready? And I wonder, are other people feeling these emotions too, and in a sense, is the agony of having a financial market that is at the moment "unprecedented" now compounded by the grief we feel as a nation for this event, if only subconsciously?

Ulmer discusses how a religious war over the status of icons inaugurated both the era of print and now the digital era. What I think we've been left with is fear - and, as he mentions, this feeling of having a bullseye over our icons, of being the target. If, as he states, there is an interdependence of technology, social institutions and individual identity, then a site that allows us to grieve, the electronic Rushmore that can't be a physical bullseye, may help lessen the fear.

I went today to the Pentagon Memorial (http://www.pentagonmemorial.org) and the images were subdued, a place of respite, and even though it is an icon, and a symbol of the devastation from a religious war, I felt it had no energy as a target. Perhaps being a target goes hand in hand with bravado? I'm not sure, but I understand our need for memorials/MEmorials.

Tuesday, September 16, 2008

A sifting through of society's trash

Ulmer connects his argument in the section on sacrifice (pg 41-42) to Georges Bataille and his thoughts with regards to sacrifice and forms of unrequited expenditure. His choice of Bataille as reference for this section implies that his idea of car crashes is "ostentation squander" (<--Bataille's word choice in "Gift of Rivalry: Potlach") and hence insightful when discerning the values of the entity (the US) that engages in it. Before moving on to its application to emonuments, I want to take a closer look at Bataille's idea and Ulmer's notion of sacrifice. Below is a brief summary of their thoughts explicitly present in the article, informed by Bataille's article "Gift of Rivalry: Potlach" and his short text The Eye.

Sacrifice (the continuity of life through the witnessing of death and still being) - as a way of understanding society
Sacrifice as performance - ritual - rhythm - which invokes then the sense of pattern
Sacrifice as related to practices of production/consumption and unproductive expenditure
Sacrifice as having no end beyond itself
Sacrifice as ostentatious squander - an act afforded solely through one's ability to do so.

Sacrifice as a symptom related to a larger site of schematics. A symptom that may be reduced to its part of a social cycle that is: give and receive - man to man / man to society / society to man. A power shifting through life and death - a vengeful consumption of individual life at the expense of a society's witnessing - which is needed to acknowledge and reify its own existence. But more so a wasteful consumption of life simply because a society has the goods (individuals) to consume. A behaviour based on a sort of capitalistic circadian rhythmic impulsivity (which implies a lack of uncorrupt conscious reason) which can be directly linked to ownership.

The power to ingest/invest, which thereby concludes in ownership, which in turn offers the power to destroy/expel/make abject. i.e.: I buy a flag...I can burn a flag / I ingest food which will eventually become abject <-- This idea as applied to society. An emonument - bearing witness, calling attention to this. A sifting through of a society's trash as a means to discern its values, and then laying claim to and projecting such to the mass at lightspeed.

Tuesday, September 9, 2008

Google Sketchup

Folks...here is the link to Google Sketchup. It's a fun 3-D modeling program, and you can download the basic version for free from this site. Click on the Learn More link to see it in action, and to read about the 3D Warehouse of objects you can import directly into your design.

Well here I go trying to make sense of Nietzsche. During class I thought aloud that his ideas of shirking old traditions and habits, thus making the world anew, seem, at this time in the 21st century, almost mundane and old hat - that in my reading I encounter a perfectly acceptable contemporary ethos that is rather taken for granted and is not at all revolutionary. In homage to Nietzsche, I wish to amend that thought with a new one.

I think the metaphors surrounding his ideas have become firmly embedded in our culture, which is clearly a testament to his influence wide and far. We constantly hear talk of people reinventing themselves, for example. However, the action of giving up what's handed to you and creating something new is what's actually radical, because it's so rarely done in practice. Most talk a good game about trying the untried, but as the colloquialism goes, "leopards rarely change their spots, they merely hide their flaws and pull in their claws." So, in this respect, Nietzsche really does challenge us to accomplish the supernatural.

No matter what we learn from science and the humanities, as a species we have a remarkable way of returning to familiar pastures. As a general rule, we continue to develop our most fundamental connections in the "old ways" based on the age-old dicta and resist recognizing communities that define themselves by different structures. It's clear what can be re[con]tained, but I wonder how humanity is to evolve with these prohibitions so firmly in place?

Manifestering

Go manifest. Give yourself some time to manifest. Reflect. Think about something that inspires you. Think about something you're passionate about. Something will manifest.

My own voice is annoying me, reminding me to manifest...after all, this is it: this is the pronouncement that sets into motion the rest of 805's semester.

It's not for lack of an issue, or lack of passion, or lack of commitment. I think the grand spoiler, the fly that invaded my meat, was Dogme 95. And that is not to say that I didn't or don't enjoy Dogme movies. In fact, The Celebration and Kira's Reason were those rare films that changed my mood for weeks after seeing them. I like Dogme's "Vow of Chastity." I support its mission. But the quality, obviously, is not consistent. And it's all those not-so-successful Dogme films--there are currently 279--which have left me a little wary about the whole manifesting enterprise.

Dogme's "Rescue Action" is to save film from "superficiality" and "bourgeois romanticism" by "disciplining" the avant-garde and the new democratizing technologies into a counter-conformity cinematic movement. The discipline, though, can sometimes be too tempting. The weakest members of the Dogme society accept too much, agreeing both to the manifesto's rules and (consciously or unconsciously) to the ways in which the more successful members of the group have creatively conformed to those rules. What manifests then in these weaker films is only conformity.

There is a benefit: first-time filmmakers working in this vein tend to avoid mucking up galactically; their errors are usually limited to making the derivativeness too transparent. I prefer the galactic muck-ups. They bring on an inevitable identity crisis much sooner. They are the stuff revolutions are made of.

We need manifestos. We need revolutions (like the one Dogme championed in the 90s). But perhaps we need to kill our own manifestos too. Maybe we need to stamp them with expiration dates--"Best used by summer 2010." "Freshness guaranteed until December 2009." Or maybe the key is to continue to manifest. Manifest against our previous manifesting--turn the revolution topsy-turvy.

This seems to be a revolving pillar of the RCID program, one on which the Third Sophistic finds its third position. And that makes the whole affair of manifesting much more exciting.

Nz's Wit

Nz states that the most knowledgeable philosopher is one "who has traversed many kinds of health and keeps traversing them." He also discusses how one who has undergone great pain no longer has trust in life, because in that state of pain "life" itself has become the problem. These things he conjectures as being a means of shaping an individual into a more profound, farther-reaching entity.
With this in mind, I thought back to one of my favorite movies, "Wit," an account of an Ivy League scholar (who specializes in metaphysical poetry) and her bout with insidious ovarian cancer.

The connection I see between the two is the protagonist's state near her death, in the throes of pain. Essentially, the protagonist exemplifies what Nz is commenting on with regard to what illness/pain fosters in individuals. In her most painful state, the protagonist throws aside education, theories, and inflated language and is interested only in simple human kindness and being. Below is a tiny bit of her monologue (most of the movie is monologue) as she passes into a state of immense pain that will end in her expiration:
(after eating a popsicle with her nurse)
I can't believe my life has become so corny. But it can't be helped, I don't see any other way. We are discussing life and death, and...not in the abstract, either. We are discussing my life and my death. And I can't conceive of any other tone. Now is not the time for verbal swordplay. Nothing would be worse than a detailed scholarly analysis and...erudition, interpretation, complication. No. Now is the time for simplicity. Now is the time for...dare I say it...kindness. And I thought being extremely smart...would take care of it. But I see that I have been found out. I'm scared. Oh, God. I want... I want to... No. I want to hide. I just want to curl up in a little ball. (hides under bedsheet like child)
I want to tell you...(gasp for breath) how it feels. (gasp for breath) I want to explain it. To use my words. It's just as if I can't. There aren't...(groans) I'm in terrible pain. Susie says...I need to be in aggressive pain management...if I'm going to stand it. "It." Such a little word. I think in this case..."it"...signifies being alive.

In the end, Nz's work can be seen as arguing for simplification: a return to, and thus acknowledgement of, the human condition and its dependence on things outside of our control. A bow, so to speak, to the non-interrogatory appreciation of life's mysteries.
A noble gesture, one could argue...however, one rife with embedded ego-stroking. But that is another discussion altogether.

Character, health, and religion

Nietzsche is new to me. Philosophy in general is new to me. I studied technical writing for the last six years--how much philosophizing need we engage in for that?? :) Of course, I recognize obvious (and very important) connections, particularly when we pursue discussions about language, histories, dichotomies (and paralogies), and, of course, rhetoric. What is particularly interesting to me, after deliberating over The Gay Science (or Joyous Wisdom, as it was apparently originally translated), is how we define ethos and what it means as we *attempt* to interpret others' texts.

I can't say I know Nietzsche--certainly not personally, but not even at a literary level. But from what I gather, The Gay Science was written shortly after (or towards the end of) a particularly depressing time for Nietzsche. Previous writings were quite dark, and his philosophy--his understanding of the world--took a different shape. Part of this was Nietzsche dealing with quite severe health problems, which, no doubt, would shape many persons' perspectives. The Gay Science seems to reflect a more hopeful approach to life and humanity. Nietzsche appears to have had an awakening of sorts.

I don't know what Nietzsche considered as his religion. I would venture to guess he believed that philosophy and science have a way of killing God. In his preface, he states that the Greeks knew how to live--they stopped courageously at the surfaces. They were superficial. Nietzsche apparently didn't see himself as superficial. He had gone too far, in essence, in search of his own truth to be able to comprehend God or science--for both of those tend to have a final say, a limit, or an omniscience. Reality, at least for him, didn't seem to have that.

So, the questions I raise are these: how does (or should) Nietzsche's illness affect the way we interpret what he meant? Should religion play a factor in interpreting ethos? When does a philosopher's writing take a dramatic turn from influential discovery to psychotic, sickly rambling? I'm not saying that Nietzsche was ever in a state of delirium, but certainly depression (and later hope) shaped his discoveries or personal "truth." This is likely the case for any human being. Thus, how much can we be allowed to consider in regard to ethos?

Tuesday, September 2, 2008

YouTube Manifesto

Check out the YouTube Manifesto.

Rick Poynor interview

Folks...here's the link to the Poynor interview clip.

girl, you better work it

blurring boundaries where they need to be blurred and containing them where they need to be contained...

just now as i was creating my account and signing in for my first blog post, i made a selection about which email account i would use for this blog. i chose my clemson.edu account after saying to myself, "self, this is a part of your professionalization at clemson, so you better use your professional domain address -- just to be safe." i started to use my netzero account, which is my general email address, but decided against it because of my hesitation to invite the professional so fully into my personal. alas, any such separation is but a pipe dream because in the interest of saving time and energy (thus increasing my opportunities for leisure pursuit), i always end up having all my clemson email forwarded to my personal netzero account anyway.

in reality, what is this thing i'm calling leisure anyway? i can't remember the last time i had any meaningful social interaction that didn't involve some professional connection for me or the other person. for instance, the last half dozen or so parties i've attended were all given by professional acquaintances. i first met my significant other while he was at the courthouse practicing law. my best friend and i first set up our connection during last call at my favorite restaurant, after he finished waiting tables for the night.

basically, my questions involve the authenticity and effectiveness of professional boundaries. do they work? should they? and is there any safety in these constructions? why must i perceive danger through these blurrings? moreover, i can't figure out whether my impulse to devise these walls emanates from my carefully cultivated social epistemic approach or whether it is the remnant of a deeply internalized indoctrination into the protestant work ethic. or maybe it's something else.

well i guess i'll devote this semester (and i'm sure many subsequent ones) to working out this and many other problems.

p.s. the manifesto i incorrectly cited in class as the "maker's manifesto" is actually called the "owner's manifesto."

Recess in the classroom

For Moulthrop: experience = play;  reflection = writing/thinking.  The problem we face, he says, is that we disassociate them.  Play is for playing, writing is for learning.  Thoughts and thinking are language and writing, so why bother with modes that don't engage half of that?  His call is to move away from that premise, to open up the classroom onto and into the playground. 

This is accomplished, the fusion of word and game, Moulthrop says, mainly through programming, the text of digital play, the language that empowers us to both experience and reflect.

Cynthia raised the point that programming may be frightening--and it is horrifying to me, in part because the last line of code I wrote was in third grade to make my name scroll and flash on an Apple II, in part because programming appears to be a close cousin to calculus, and  in large part because I know that I will lose weeks, maybe even months, (I will not stop once I start) trying to figure out how to make my name scroll and  flash again.

There is hope. Software tools exist that put more emphasis on designing than coding. Clicks, drags, and icons replace numbers, letters, and symbols--good for us, but bad, I suppose, for Moulthrop. It kicks his glyphs out of that new space he was hoping to create for them in the realm of experience/play. But I'll leave his argument to him and instead wonder, here, who will give us those interfaces? Who will give educators and students the friendly tools we need to re-associate play and reflection in the classroom?

Riffing on existing software will only get us so far. Do we go to the software giants to have them create our new interfaces? Do they then become the new textbook publishers? Or do we go to the textbook publishers and call for them to hire teams of new designers and programmers to produce these new transgressive tools for us? And how much will they cost?

Theory and Practice

With regard to our class discussion on academia and its (dis)respect for scholarship in new medias and technologies...

Do you think this division is a reflection of what industry and academia value...practice vs. theory? Perhaps individuals who are frustrated with the lack of academic support for their research areas take both their theory and practice into the industrial realm.

Sunday, August 24, 2008

Welcome to Designarchy

Design and anarchy are distant cousins of rhetorical style. In this course I propose they become less distant and more directly related to technologies that enable participatory anarchy. Let's participate. Let's design. Let's BE rhetorical.