Thursday, October 26, 2017

Spectacle

As we've seen, theorists such as Benjamin and Adorno had already begun to describe what they called "spectacle"; their anxiety about the potential of mass media to produce and reproduce a docile, distracted public is palpable. And yet it's not until Guy Debord that the "spectacle" reaches its definable, almost tangible apotheosis: "The spectacle grasped in its totality is both the result and the project of the existing mode of production. It is not a supplement to the real world, an additional decoration. It is the heart of the unrealism of the real society. In all its specific forms, as information or propaganda, as advertisement or direct entertainment consumption, the spectacle is the present model of socially dominant life."

'The heart of the unrealism of the real society': that phrase says (or seems to say) it all. Clearly, this is an ideological claim, related to Louis Althusser's nearly contemporary dictum that "ideology is a representation of the imaginary relationship of individuals to their real conditions of existence." And yet Debord has hold of something quite different here: in the 'society of the spectacle,' there is no one central or all-dominating ideology, no progressive sense of a newly empowered proletariat or enlightened bourgeoisie. The spectacle exists for itself; its ideological "work" (if that's the right word for it) is to reinforce the idea that we have moved beyond ideology, that for every view there is an equal and opposite view, and that, within this existential stasis, we might as well put on our 3D glasses, sit back, and relax.

Which highlights another problem: what, stripped of its spectacular raiment and ideological armor, is the "real"? Ideological theorists would be loath to allow that there is anything before or outside ideology, and yet they posit the function of ideology as being to hide this "real" from us, to distort or mask it in some way. So how would we know when -- if such were possible -- the mask were to be taken off? Such an act would seem impossible, and yet without it, what could be the point of a critique of the distortion?

It's a reasonable question, but to my mind, it's the result of seeing the situation as a "problem" in search of a "solution," rather than as a dynamic, a pattern, in search of an understanding. The society of the spectacle may well be unavoidable, but how we will position ourselves within it, how we will seek to correct for its distortions, how we will account for the ways in which the spectacle has or will be deployed, are essential, and do admit of considerable self-agency. In the case of the 'moral panic' over Hip-hop in the mid-1990's, I saw it this way:
In this 'society of the spectacle,' as Guy Debord has dubbed it, cultural myths rise and fall in an almost operatic struggle upon the electronic stages of television, radio, and compact disks. The myth of the 'concerned' liberal white goes toe-to-toe with hip-hop's carnivalesque mirroring of his/her own stereotypes; the Goats' "Uncle Scam" runs drug cartels, wars, and drive-by shootings like booths at an amusement park. If images of Willie Horton scared middle-class Americans into voting for George Bush, the images and words of Ice Cube, putting his gat in the mouth of Uncle Sam and shooting "'til his brains hang out," will scare them more, and this fear in its turn will inspire laughter (as when Cube, on Predator, samples the voice of a young white girl in a talkshow audience and loops the result: 'I'm scared . . . I'm scared . . . I'm scared').
In this scenario, there are tactical ways in which the viewer/consumer/citizen can indeed both evade and come to critical grips with the logic of the Spectacle: he/she can shake off the spell of fear, see the sound-bite politics and crassly manipulative adverts for what they are, and tune in to an alternative channel where music and lyrics such as those of the Goats and Ice Cube are audible. "Tuning in" and "tuning out," "filtering," and "remixing" the purported news and views on the big screen, a citizen in a spectacular society can, by sheer mobility and agility, find a way to be a resisting subject.

These days, alas, another old set of by-words from the '90's has come back to haunt us. When I see people wearing those ear-mounted cellphones, I can't help thinking of these phrases, and wondering whether they have just as much -- or even more -- currency now as they did then: "You will be assimilated" and "Resistance is futile."

Wednesday, October 18, 2017

The Culture Industry: Enlightenment as Mass Deception

This is one of the oldest of the essays we'll read this semester -- only Benjamin's writings precede it -- and yet it's also one of the most prescient and influential ones. All kinds of work on the questions of media, ideology, and culture have been shaped by it, at least in part, including the work of Barthes, Foucault, Althusser, Jameson, and Bhabha. Of course, the times have changed, but the best test of a polemical analysis such as this may not be whether it is still an accurate representation of the problem at hand, but whether, as we read it, we are able to see not only its flaws but the kinds of fresh connections to our times which its authors could not have anticipated.

Some phrases hold true, and even have fresh resonance today: "Even the aesthetic activities of political opposites are one in their enthusiastic obedience to the rhythm of the iron system" -- or how about "the universal criterion of merit is the amount of 'conspicuous production' of blatant cash investment" or "real life is becoming indistinguishable from the movies." There certainly is an insightful analysis of the seemingly uninterruptible hegemony of capitalist production, whether of cars or refrigerators or motion-picture soundtracks.

Yet there's also a good deal that has become a bit dated; Adorno and Horkheimer can conceive of capitalism only in terms of centralized production of essentially identical products, which are then sent out from these centers to the consumer to fulfill the pre-manufactured desire created by advertising and exposure to images of these same products. The post-industrial landscape, in which production has moved away from most developed nations, and consumption alone keeps them going, is one they don't anticipate; neither could they foresee that new media technologies, while certainly capable of expanding the same old centralized systems, would also fragment them, and move the production of "culture" from centralized facilities (the Hollywood studio, the newspaper plant, the recording studio) to a remarkably diverse array of millions of sites, with the majority of content created by those who, in the old system, would only have been the "end users" of such products.

The other issue with this, as with others of Adorno's works, is his deep embrace of the cultural hierarchies of the old, classic, European world. Beethoven is self-evidently brilliant; a Hollywood soundtrack that quotes some of his melodic lines is self-evidently garbage. Adorno also inveighed against "Jazz" as mindless music for the masses (something has apparently been lost in translation; he uses this word not to reference Ellington or Mingus, but as a catch-all for American popular music of the 1940's, what we today would probably call "big band" music).

Yet despite these shortcomings, the analysis can often be crystal clear, as in this passage:
"The culture industry perpetually cheats its consumers of what it perpetually promises. The promissory note which, with its plots and staging, it draws upon pleasure is endlessly prolonged; the promise, which is actually all that the Spectacle consists of, is illusory: all it actually confirms is that the real point will never be reached, that the diner must be satisfied with the menu."
Or, as Ed Valenti, who wrote the immortal words of the original Ginsu Knife commercial, might put it: "But wait, there's more!" (Interestingly, I found on Wikipedia that the Ginsu Knife was in fact a Rhode Island product!)

We all have the sense that, no matter how much slicing or dicing we can do, we're still likely to wake up the next day with a need to go shopping again -- or will we? The current recession may be a deeper challenge than those in the past; deprived of the possibility of conspicuous consumption, what can we do? What matter?

Thursday, October 12, 2017

Globalism

In Arjun Appadurai's essay, “Disjuncture and Difference in the Global Cultural Economy,” he declares that "The central problem of today's global interactions is the tension between cultural homogenization and cultural heterogenization." I would agree, and add that, more often than not, "globalism" is a euphemism for the export of cultural commodities from the U.S. and other developed countries to other parts of the world. If there were a regularly published "cultural balance of trade," the U.S. would have a surplus of monumental proportions, and this isn't necessarily because our cultural commodities are better -- in large part, it's been fueled by the economic inequities of the world economy, in which an hour of labor, skilled or otherwise, is simply not worth the same around the world -- and neither is an hour of leisure.

From the point of view of the celebration of a "multicultural" future, this poses a problem: in the U.S., U.K., and other developed nations, the kinds of world cultural imports available are scattered, watered-down, and homogenized -- think of the contrast between, say, "American Chinese Food," a Chinese restaurant that caters to Chinese diners, and the actual daily diet of a person in China. The first offers items which in fact are food hybrids, if in fact they are Chinese at all, such as enormous breaded wads of "sweet and sour chicken"; spices are toned down, and a spicy Szechuan dish that would set your head on fire at a "Chinese Chinese" restaurant is barely enough to warm your tongue at a "Chinese American" joint. But American food and cultural exports, and the local responses they evoke, are quite often the opposite: more flavorful away from home than at home. Cultural hybridity is alive and well outside the boundaries of the developed world, and there are numerous cultural forms -- from Ska and Reggae to Afro-Pop or Taarab -- which, born in these hybridities, have since exported themselves around the region and the world. There is nothing necessarily "multicultural" about a McDonald's restaurant in India, or an Apple Computer assembly plant in Singapore -- and yet their economic significance can't be excluded from any attempt to understand something we could call "globalization."

New Media have, of course, played an enormous role in globalization, by providing a common "playing field" -- although a very uneven and in some places highly censored one -- as well as by enabling communication across previously inaccessible pathways. And yet, even here, the unevenness takes a strange toll, as with Chinese (and now, Romanian) World of Warcraft sweatshops, where young men play the game 16 or more hours a day to accumulate game currency for their employers, who then sell it at enormous profit via eBay or other outlets back to U.S. customers. Today, you can place a free video "call" on Skype to people all over the planet, but workers in the U.S. who want to "wire" money back to their families in, say, Venezuela, pay enormous fees for this "service." So before we can celebrate -- if we should -- the rise of globalization, we have to see how, both within new media and without, its pathways are functioning, and closely examine their real costs.

Wednesday, October 4, 2017

Sound

When it comes to media, all of our human senses are related to the perception of frequency: the visible spectrum is that which we sense via sight, and the audio spectrum that which we sense via hearing. Taken together, they represent only two tiny patches of the total frequency spectrum, and yet it's remarkable to consider that it's actually sound waves which we can hear with the greatest precision, distinguishing a wide variety of characteristics beyond frequency itself; the difficulty of programming a computer to recognize human voices using natural language is testimony to that (though software such as Dragon promises to change that soon).

Sound is also the very first medium to become technically recordable, as well as the first medium to be broadcast. Sound recordings had been around for nearly 50 years before the first sound films were released, and voice radio predates television broadcasts by twenty years or more (twenty if you count any broadcast, nearly forty if you count only commercial, regular broadcasts).

Sound has been a pioneer in the digital and Internet revolutions as well, and that's easily understandable. The CD audio sampling rate of 44.1 kHz is less than 1% of the NTSC television bandwidth of 5.75 MHz -- a rough comparison, since it sets a digital sampling rate against an analog bandwidth, but it suggests why audio was compressed, stored, and shared long before even standard-res video (HD has more than 3 times the data density of the old NTSC standard).
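The arithmetic behind that comparison is easy enough to check. Here's a quick back-of-envelope sketch in Python, using the figures cited above together with the standard "Red Book" CD format (16 bits per sample, 2 channels):

```python
# Rough comparison of the audio and video figures cited above.
# These are nominal standard values, not measurements.

cd_sample_rate_hz = 44_100        # CD audio sampling rate (44.1 kHz)
ntsc_bandwidth_hz = 5_750_000     # NTSC video bandwidth (5.75 MHz)

# Ratio of the two frequencies: well under one percent
ratio = cd_sample_rate_hz / ntsc_bandwidth_hz
print(f"44.1 kHz is {ratio:.2%} of 5.75 MHz")   # prints "44.1 kHz is 0.77% of 5.75 MHz"

# Uncompressed CD audio data rate: 16 bits per sample x 2 channels
cd_bits_per_second = cd_sample_rate_hz * 16 * 2
print(f"CD audio: {cd_bits_per_second / 1e6:.2f} Mbit/s")  # prints "CD audio: 1.41 Mbit/s"
```

Even uncompressed, a second of CD audio is about 1.4 megabits; MP3 compression shrinks that by roughly a factor of ten, which is what made audio small enough to travel over dial-up connections.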

Sound, of course, was originally free and totally ephemeral; once spoken, sung, or plucked, it was gone. It then became a physical object, the cylinder and then the disc, sold to a mass public, and channeled through later forms such as the 8-track tape, cassette, and digital CD. But then, thanks to compression paradigms such as MPEG-1 Audio Layer III -- better known as MP3, and originally designed to compress the sound elements of a video signal -- sound became the first thing to fit through the narrow tube that was the pre-highspeed Internet; the rest, as they say, is history.

And yet, in most Media Studies programs and New Media books, sound seems relegated to a very small, supporting role. It's been naturalized, made invisible, save in its absence, as when talking about "silent" movies (which of course were never silent; before soundtracks they were nearly always accompanied by music).  And so I ask (dropping into a KRS-One tone), "Why is that?" And what should we do about it?

-----•-----

p.s. for a brief history of sound recording technology, see this page on the Media Culture I blog.
p.p.s. and have a look here at the history of synthesized voice.

Tuesday, October 3, 2017

Chaplin and Film

The role of Charlie Chaplin as the first iconic figure of the cinema is nearly impossible to overstate. No other person in the history of film has had the instant, international, and lasting appeal of "The Little Tramp," or Charlot as his European admirers have dubbed him. Recognizable, even at a distance, in a few frames or as a silhouette, his signature in movement as much a trademark as his profile, his essential appeal is as strong today as it was a hundred years ago. And at the same time as the "Tramp" embodied the poor, downtrodden luckless soul with a heart of gold, Chaplin the actor turned director soon became the mightiest of the moguls, a man who thought nothing of building enormous sets for a single shot, or of shooting 750,000 feet of film to get 30,000 he approved.

Walter Benjamin wrote only a few relatively brief accounts of Chaplin's films and impact, centered on 1928's The Circus. But his fascination is evident; Chaplin collapses the dyad and the triad established in "The Work of Art in the Age of its Mechanical Reproduction" -- he is both the actor in the "test" that is a cinema performance, and the director who will weave the film itself out of a myriad of possibilities -- and yet he remains entirely "natural," in the sense that he is also in the place of the viewer, the lone soul whose craving for sense in a senseless world is great enough to break the very frame of film itself.

There really hasn't been another figure like Chaplin, nor is there likely to be -- and that's not simply due to a lack of talent, luck, or the desire of mass audiences for iconic figures. The icons of the later twentieth century have tended to come from the area of music and performance, and although they have made forays into film (the Beatles with A Hard Day's Night, or Michael Jackson with Thriller), it has been only one of many outlets. Now, in the new media era's house of YouTubed mirrors, it seems easier to create a dozen Lady Gagas than one figure of enormous stature and appeal. Artists who, Chaplin-like, have endeavored to set up their own media empire, have had (generally) brief efflorescences followed by slow (or sudden) declines -- where are Prince's Paisley Park Records, the Beatles' Apple Corps, or Madonna's Maverick label today? Even when giants combine, as with DreamWorks Entertainment, they rarely stay large for long.

Chaplin's own legacy seems secure -- news that his films were going to be re-released by the Criterion Collection, replacing the flawed digital transfers of MK2, has lit up the film world, and his name is still a "household" word -- most recently used to brand a line of luggage in South America, from the website of which the digital image at the head of this entry derives!

Monday, September 25, 2017

Detouring the New City

Détournement -- to turn aside, turn against, to go without going somewhere -- was, along with its companion practice the dérive or "drift," a key element of the practices of the Situationist International under the guidance of Guy Debord. He and his minions would branch out and wander the streets of Paris, deliberately going about their routeless routes, leaving only footprints in the dust and the occasional graffito "Ne Travaillez Jamais!" (Never Work!).

Latter-day theorists of the quotidian -- the imperceptible realm of behavior that, because it was so ordinary, had not previously been thought to count for much -- such as Michel de Certeau took the Situationists' idea one step further: rather than claiming a practice for the intellectual few who consciously chose the path of diversion, de Certeau regarded all walking, all movement, as performative, within which the purposeful ("I am leaving the factory to head home"), the purposeless ("I will stop and gawk at this building"), and even the perverse ("After petting the cat, I will drop it in the wheelie bin") are all, fundamentally, acts. Such seemingly non-productive acts, however small and "capillary" in their action, constitute another kind of work, work diverted from the production and consumption of capitalism, an anti-work of sorts that de Certeau called "bricolage." Alongside the detoured walk, there could be detoured objects: the paper clip twisted into a sculpture, the office xerox machine used to xerox one's own face, or poetry, or shaped into music such as that of the "Xerophonics." Even the pigeons are getting in on the game.

So how might one view -- or apply -- these concepts in one's own life? Being conscious of the effort is not a disqualification; neither is it a requirement. Here are some potential instructions, which of course themselves might be further detoured:

1. Drive in circles for ten minutes on the way home. Then, don't go home.
2. Wear an unnecessary item of clothing (a scuba mask, party hat, clown shoes) to work.
3. Throw your cell phone into the ocean.
4. Tear out a page from a book you love, insert it into a sandwich, and eat it.
5. Learn to speak backwards (here are instructions).
6. Acquire a dead language, such as Old English or ancient Greek, and speak to everyone in it.
7. Wear only clothing manufactured prior to December 31, 1985.
8. The next time you see an advertisement, buy the product immediately.
9. Spend 10 hours clipping coupons, then put them through a shredder.
10. Send a child to work in your place, and spend the day in a sandbox, or reading children's books.

Monday, September 18, 2017

Reading as technology

We are accustomed to think of books, and print in general, as old and familiar things. To us, books are the "real" which may or may not be supplanted by the "virtual" -- Kindles, Nooks, and Google e-books. This makes it a bit difficult for us to recover the sense that the book, like the scroll before it, and the clay tablet before that, is a technical development, one which initially seemed strange to a world which had not known any means of preserving words and keeping them "stored" for another day. There's a video, which I like to call "Book 1.0" on YouTube that illustrates this perfectly. The book is no more a "natural" object than is a smartphone or an automobile; it has simply been around so long that we have gotten used to it, and now begin to fear that we may "miss" it.

Walter J. Ong, the brilliant Jesuit scholar and pupil of Marshall McLuhan, was one of the first scholars to realize and emphasize the technological status of writing. For Ong, writing not only changes our practical lives, it actually restructures our consciousness. This happens in a number of ways; our tendency to think of knowledge as persistent, as capable of being stored elsewhere -- and with it our sense that we ourselves don't have to precisely remember anything -- is one key effect. Beyond this, though, our whole sense that by naming, cataloging, and finding form in things we are in fact re-figuring the world; that our mental abstractions seem to have shape and permanence; that there can even be a thing such as "capitalism," "Marxism," or "psychology" -- these too are after-effects of writing and print. Print, by making massive amounts of text cheap to make, distribute, and preserve, accelerated these changes; with the dawn of the internet, this process has taken another enormous leap. The disappearance of objects -- the book, the music CD, the videocassette or DVD -- and their replacement by the mere making available of media streamed from somewhere else, is one notable result of this accelerating process.

At the same time, Ong emphasized the complexity and sophistication of the non-literate mind (he disliked the term "pre-literate," as it presumes a progression toward writing as inevitable). The ancient Irish bards had to memorize hundreds of lengthy poems; in the 1930's in Yugoslavia, the scholars Milman Parry and Albert Lord, on whose fieldwork Ong drew, found singers who could reproduce epic poems of tens of thousands of lines. Such poems are as ancient as speech itself, and a few -- the Elder Edda, Beowulf, the Kalevala, and Homer's Iliad and Odyssey -- survived into the manuscript era, the print era, and are now downloadable as e-books. And yet, in this disposable era, when computers and cellphones complete the circuit from shiny new tech devices to e-rubbish in a landfill in a few short years, the old belief -- that writing something down preserves it -- may yet be reversed.

Some say that e-books aren't proper books at all. Some point to events such as Amazon's silent deletion of copies of George Orwell's Animal Farm from Kindle readers as a cautionary tale. The Pew Research Center recently completed a survey of books and readers, and some of its findings are quite unexpected.

So where do we go from here? Will e-readers be the death of the book? Will a dusty old paperback become a sort of weird antique, joining 78 rpm records, 16 mm film, and Betamax cassettes in the dead media junkpile? Or will we always, whatever else we have with them, have books?

Sunday, September 17, 2017

On Photography

The arrival of photography was not, as we conceive of it now, the arrival of the possibility of accurate representations of reality. The eyes of the times were as yet untrained to decipher the "real" within photographic realism. The Duke of Wellington complained that his nose looked too big (never mind that his nickname in the Army had been "Old Nosey"), and many public figures avoided photography as though it were the plague. In the cartoon shown here, Punch magazine satirized the "Interesting and Valuable Result" of a family photograph; to many at the time, the camera's eye seemed a lie, almost an instant caricature of the sitter's worst qualities.

The Daguerreotype, the very first commercial process, was expensive and time-consuming; early sitters had to remain still for at least three minutes, assisted in this task by a metal neck-brace. The cost of the photo was based on the size of the copper plate from which it was made; a "sixteenth plate" was the smallest, and cost the modern equivalent of more than $100; a quarter- or half-plate such as was ideal for a family portrait could cost well over $500. Daguerreotypes were also "one-offs," in that the plate was itself the positive and (because opaque) could not be printed from. Yet at around the same time, William Henry Fox Talbot developed his "Talbotype" process (also known as a Calotype), using sensitized paper to produce a negative image. From this paper negative, any number of positives could be made, although since their medium was paper, the outlines were far less sharp than with Daguerreotypes. Finally, the invention of the wet-collodion glass-plate negative process created a medium in which many excellent copies could easily be made from a single negative (its one-off positive variant, mounted against a dark backing, was known as the Ambrotype). By the era of the American Civil War, inexpensive photographic processes such as the Tintype meant that very few soldiers went off to battle without leaving a photo behind, and quite often took a family photo with them. The final step in cheapness and availability was George Eastman's invention of flexible celluloid film, which was used in both still and moving picture cameras; with its inexpensive "Brownie" box cameras and rolled film that could be processed anywhere, Eastman's Kodak company made the "snapshot" a part of the American, and the world, landscape.

In the 1880's, photomechanical processes such as the halftone and the "photogravure" made it easy to reproduce photographs in magazines and newspapers, and the era of the mass-produced image was in full swing.

Thursday, September 7, 2017

Walter Benjamin

Walter Benjamin's influence on the fields of media and cultural studies is so fundamental that it risks being a truism. Many times at academic conferences, I've had to suppress the urge to strangle some theory geek who starts every other sentence with "As Benjamin says ... " But well beyond this sort of slavish ipse dixit, there's a good reason Benjamin is so often cited: his analysis of modern culture is not only a perceptive one for his time but -- because his time was such a seedbed for technologies to come -- a prescient one. Nevertheless, we must work to read his text not to ask "What would Benjamin say?" but rather to ask what, proceeding as he did, we would say.

His essay, usually translated as "The Work of Art in the Age of Mechanical Reproduction," is a case in point. In Benjamin's day, the new technologies of reproduction were things like colored lithographs of paintings, photographs, phonographs, and engravings. He could hardly have imagined the plethora of reproduction that is the Internet, where an image search for "Mona Lisa" produces not only thousands of copies of Da Vinci's painting, but hundreds of thousands of variations, ranging from Lego Mona Lisa to Osama Bin Laden Mona Lisa.

For Benjamin, reproducing an artwork changes it fundamentally: he identifies as the "aura" that ineffable quality of presence that an original work of art possesses, and which -- for him -- is missing in any mere copy. The democratization of art -- the ability of anyone, even the most humble, to own a reproduction of a great work of art -- does not, for Benjamin, entirely compensate for this loss; indeed, by pretending that a copy can supply the meaning of the original, it actually harms and cheapens the experience.

If that's actually so, then images of art would seem to be irreparably cheapened -- and yet large art exhibitions continue to sell tickets, and most museums are thriving. It would seem that the Internet has succeeded in making the question of reproduction moot -- which of many mathematically identical digital images is the "original" anyway? In other areas, such as music and film, the original -- by its very nature -- is lost already. The "actual" voice of Bessie Smith or the guitar solos of George Harrison disappeared the moment they were recorded, and all that the spinning discs can give us is a captive reproduction. Similarly, in film, the actors' performances vanish into a machine -- for us, the Marx Brothers, Lauren Bacall, and Charlie Chaplin will always be alive -- but that's because, in the form we knew them, they were already copies.

And such copies have an aura of their own, perhaps -- the smell of warm popcorn, the squeak of a theater seat, the beam of light as it cuts through the air over our heads and materializes into magic -- all the attendant joys of going to 'the movies' are associated with what is, in its essence, a copy.

Wednesday, August 2, 2017

Welcome to Media Culture II

Welcome to our course blog for Media Culture II here at Rhode Island College for Fall of 2017!

This course will tackle some of the theoretical and philosophical debates over the consequences and potentials of new (and old) technologies of text, image, sound, and other media. We will use a series of theoretical and cultural “nodes” as a means of addressing issues of both form and content in terms of new media streams and objects, and the social contexts and consequences of each. Most of the readings for this course are available online via links on the blog. There is just one required book, the Cultural Studies Reader, 3rd edition, edited by Simon During, though other books, such as Walter J. Ong's Orality and Literacy, are recommended. Each week, we'll have a new reading and a new blog posting with links to related materials, with the opportunity to start and continue our conversations about them here on the blog.

This fall, I'll be away in the Arctic on a lecture and research trip, and won't be on campus until the second week of classes (I will have limited e-mail contact). There will still be readings, and you'll be able to get the updated syllabus and everything you need here on our class blog. I look forward to seeing all of you in September!