TIMES, TIME, AND HALF A TIME. A HISTORY OF THE NEW MILLENNIUM.

Comments on a cultural reality between past and future.

This blog describes Metatime in the Posthuman experience, drawn from Sir Isaac Newton's secret work on the future end of times, a tract in which he described Histories of Things to Come. His hidden papers on the occult were auctioned to two private buyers in 1936 at Sotheby's, but were not available for public research until the 1990s.



Showing posts with label Post-Postmodernism. Show all posts

Thursday, December 19, 2013

Cryptic Messages


Image Source: Live Science.

Live Science reports on a crypt which has been uncovered in the Sudan, in the old Christian Kingdom of Makuria, which reached its golden age from 750 to 1150 CE. The crypt is located in what was the capital city, Old Dongola, once an important centre in medieval Nubia (see a brief history of the region here). The Live Science report is based on a research publication from 2009. The archaeologists from the University of Warsaw who excavated the crypt and environs in 2009 have a Website here; you can see their 2012 excavation of the city's royal palace, here.

From the Kingdom of Alodia, south of the Kingdom of Makuria, north of Khartoum: "Bishop Marianos (1005-1039) and Virgin with Child, after 1005 [CE]." Image Source: Early African Christianity.

Ruins of a Coptic Christian church in Old Dongola, Sudan. Image Source: SuperStock.

Friday, December 13, 2013

The Millennial Background Soundtrack



Intrusive, ultra-groomed media so dominate perception now that everyday life has a soundtrack. Whether it is a private iPod playlist, or the general racket of Web and TV news, marketing and incessant communications, there is a constant thrum of noise behind everything. In today's post, hear a few samples from composer Joshua Baker, who writes soundtracks for films, TV, the Web and video games. His work is disarming because it fades so perfectly into the meta-background.

Wednesday, December 11, 2013

A Millennial Artist: Jee Young Lee's Stage of Mind


Resurrection. Image Source: My Modern Met.

Caption for the above photograph: "Inspired by the Story of Shim Cheong, a Korean folktale, as well as by Shakespeare’s Ophelia, Lee JeeYoung made this installation by painting paper lotus and flooding the room with fog and carbonic ice in order to create a mystic atmosphere.
 
Lotus flowers grow from the impure mud to reach for the light and bloom to the rise and fall of the sun; in Asia, the lotus bears various cultural symbolisms, such as prospects and rebirth. It is also known for its purifying function. The presence of the artist in the heart of such a flower is meant to convey her personal experience. 'I was born again by overcoming negative elements that had dragged me down and cleansed myself emotionally. The figure within a lotus blooming implies a stronger self who was just born again and is facing a new world'. It is this very moment, when one reaches maturity and full potential, that Lee illustrates in 'Resurrection', and, more generally speaking, throughout the entirety of her corpus."

My Modern Met reports on a South Korean Gen Y artist, Jee Young Lee, who creates beautiful interiors (hat tip: Ken Kaminesky):
Jee Young Lee creates highly elaborate scenes that require an incredible amount of patience and absolutely no photo manipulation. For weeks and sometimes months, the young Korean artist works in the confines of her small 360 x 410 x 240 cm studio bringing to life worlds that defy all logic. In the middle of the sets you can always find the artist herself, as these are self-portraits but of the unconventional kind. Inspired by either her personal life or old Korean fables, they each have their own backstory, which of course, only adds to the intense drama. From February 7 to March 7, 2014, OPIOM Gallery in Opio, France ... present[s] a selection of Lee's ongoing body of work called Stage of Mind.
Further from Brain Factory:
This exhibition introduces seven new photographic works ... a project on which the artist has been working continuously since 2007. Jee Young Lee “constructs” scenes for her camera rather than employing the traditional method of “taking” images such as still lifes, figures, or landscapes. ...
Lee's artistic motivation derives from her quest for personal identity. In each of Lee's stories, the artist is the protagonist. At times facing away from us, at other times showing only part of her body or reclining, she quietly and mysteriously inhabits her dream-like realms. Through their bold materials and patterns, dramatic colors, and intriguing narratives, Lee's new works signal maturity, coherence, and sophistication. The legends of East and West, Korean proverbs, personal childhood experiences, and immediate realities provide the motifs for her creations. ...
Lee's constructed realities belong to the “directorial mode,” employed since the 1980's by Postmodernist photographers in repudiation of the Modernist practice that sought truth in the everyday world. Lee's “constructed image photography” may be compared to the works of German sculptor and photographer Thomas Demand, who builds life-sized models he intends to demolish after photographing them. Her “staged photography” brings to mind tableaux vivant not unlike U.S. installation artist and photographer Sandy Skoglund's orchestrated room-size installations. But in contrast to these earlier artists, Lee's subjects are deeply personal and intensely psychological.
See more of Jee Young Lee's works below the jump and at this site. All works are copyrighted by the artist and are reproduced here under Fair Use.

Childhood. Image Source: Bored Panda.

Tuesday, November 26, 2013

The Other People With Your Name


Still from The Double Life of Véronique (1991). Image Source: Wonders in the Dark.

I have a friend who wants to streamline his online image for professional reasons. He wants to deal with the Facebook photos which other people obligingly took, posted, and tagged with his name without asking. He wants to make sure that nothing weird comes up when you google him, either correctly (some detail he would rather did not remain permanently public) or incorrectly (some detail that is falsely associated with him).

Depending on how common your name is, you will be familiar with the experience: you google yourself, and up come the other people with your name. If you have a common name, then you have the comfort of the crowd; but then you have the problem of having any Web presence at all (assuming you want one).

But since my friend has an unusual name, for a long time googling him only brought up results about him. As the Web's reach deepened, another person appeared online with his name.

Tuesday, November 12, 2013

The Paleo Diet


"The idea that eating like our Stone Age ancestors is good for you is growing in popularity, and it has become the latest health fad from Hollywood to Berlin. Shown, a museum diorama of hunter gatherers." Image Source: Der Spiegel.

The fashionable interest in prehistoric humans includes replicating their presumed Paleolithic Diet:
The paleolithic diet (abbreviated paleo diet or paleodiet), also popularly referred to as the caveman diet, Stone Age diet and hunter-gatherer diet, is a modern nutritional plan based on the presumed ancient diet of wild plants and animals that various hominid species habitually consumed during the Paleolithic era—a period of about 2.5 million years which ended around 10,000 years ago with the development of agriculture and grain-based diets.  ... Centered on commonly available modern foods, the contemporary "Paleolithic diet" consists mainly of fish, grass-fed pasture raised meats, eggs, vegetables, fruit, fungi, roots, and nuts, and excludes grains, legumes, dairy products, potatoes, refined salt, refined sugar, and processed oils.
The Paleo Diet does not actually date to 2.5 million years ago, but rather originated in 1975, when gastroenterologist Walter L. Voegtlin published The Stone Age Diet: Based on In-depth Studies of Human Ecology and the Diet of Man. The diet has become increasingly popular through the 2000s, especially in light of anti-bread movements. Curiously, the last time there was a widespread popular rejection of bread (amylophobia, the fear of starch) was during another boom and subsequent economic downturn: the 1920s and the Great Depression.

Image Source: GEICO ad via OpenTable.

Tuesday, September 24, 2013

The Post-Postmodern Side to Local Government


Gesté St. Pierre church 'deconstruction.' Image Source: Yahoo.

The recession has consumed another of its children. A report from Yahoo describes the demolition of Saint-Pierre-aux-Liens church in Gesté, western France on 7 September 2013; the church was built in the 19th century and is being demolished due to economic concerns over upkeep and renovation:
The Saint-Pierre-aux-Liens, a 19th century neo-gothic parish church, was scheduled for demolition in 2007 by the city council. With only the bell tower and crypt remaining, the current church will be replaced in 2016, [with a building] which will be cheaper to maintain. This destruction is part of a large debate across France where villages are faced with churches in deteriorating condition, diminishing populations, fewer parishioners and priests, and increasing upkeep costs.
Is this how mercenary materialism plays out in the end? Perhaps. But oddly enough, the trend started before the recession, in 2007, when mayors in the area of Anjou began proposing the demolition of churches. Local authorities are responsible for expensive upkeep of these buildings under early 20th century secularization laws. Nearly 3,000 churches were reported in 2007 by The Telegraph as being "in peril" because of the municipal policy of 'deconstruction'; if ever there was a definitive example of Post-Postmodernism, it is here:
A wave of threatened demolitions in the Anjou area has sparked fears of a wider phenomenon as mayors struggle to pay for the upkeep of the buildings, a responsibility they have held since secularisation laws passed in 1905.
No exhaustive list of the nation's rural churches exists. However, of its 15,000 protected rural religious buildings, 2,800 are "in peril", according to l'Observatoire du Patrimoine religieux, a religious heritage watchdog.
For some villages it is already too late. In Saint-Georges-des-Gardes, population 1,500, a wall with rusty metal spikes is all that remains of the parish church, built in 1870.
The edifice is the latest victim of what mayors call "deconstruction" - a term more usually associated with the French [Postmodern] philosopher Jacques Derrida than knocking down churches.
The mayor said a full renovation would have cost more than a million euros. It was far cheaper to build a new one. His move has emboldened several other mayors to consider following suit.
In nearby Gesté, a pretty neo-gothic church stands majestically atop the hillside.
However, the mayor, Michel Baron, has announced his intention to "deconstruct".
Alain Guinberteau, who runs a website, 40000clochers ... celebrating France's churches, was the first to sound the alarm. Without its church, the landscape of Gesté will be disfigured, he said. "It will be cut off from its past. A part of our heritage and history is wiped away with one bulldozer. France is viscerally attached to its rural churches, which represent the very soul of French villages."
Mr Baron said a renovation would have cost three million euros, while "deconstruction" will cost less than half that amount. "We tried to have the church listed to receive public funds, to no avail. In its place we will put a modern hall of 500 seats, easy to heat. Villagers have taken it well," he claimed.
However, Jean Leclerc, 62, a villager, said the decision was shocking. "Lots of people are unhappy about it," he said. "We are not against modernity but the village will lose a lot of its charm and history."
The local priest, Pierre Pouplart, refused to take sides. "These are matters for the local authorities," he said.
In Valanjou, 45km away, where another church faces its last rites, locals have formed a defence association.
"There is a great deal of resistance," said René Cottenceau, 67, its president. "About 80 per cent of the villagers are against the move. The safeguard of our heritage is being overlooked. Just at a time when we are coming to appreciate the value of such buildings, ours is to be knocked down."
The church's plight has attracted the attention of the art historian Didier Rykner, who runs the art review, La Tribune de l'Art.
"Not since the Second World War have we seen churches reduced to rubble," he said. "These mayors are the new enemy. They are in the process of destroying the very soul of their own villages. Churches don't just have historic and artistic value, they have intimate personal histories attached to them - christenings, weddings, funerals."
I wonder if this is less a story about the collapse of older values and economic problems, and more a story about local government. The 20th century - with its 'Hemoclysm' that saw the deaths of some 180 million people - was an object lesson in the perils of states overcentralizing according to various ideologies. Practically universally, the lesson came back that overcentralizing, no matter how you do it, is a bad idea.

Despite fears from the political right wing and libertarians during the Great Recession that Keynesian-styled government spending would create bloated centralized states, perhaps the larger Millennial trend - spurred by technology - is that governments are essentially decentralizing.

Local governments were supposed to offer the antidote to Big States when the latter were crippled (to take the American example) by special interests, lobbyists, corporate sponsorship and political stalemates. Municipal and county authorities were supposed to be apolitical loci of practical, hands-on action: these officials work at the level of government where problems really get solved. The Center for Strategic and International Studies argued in 2011 that this was the classic, yet troubling, rationale for decentralization:
The implicit rationale for decentralization goes something like this. If a government can perform closer to the people it is meant to serve, the people will get more out of government and, in turn, will be more willing to accept that government’s authority. The rationale is compelling, and despite the potential pitfalls associated with implementation, most scholars agree that a decentralized system of government is more likely to result in enhanced efficiency and accountability than its centralized counterpart. Still, disparities between the theoretical rationale for decentralization and what is actually gained in practice are gaping.
I am only guessing that local governments are becoming stronger in unusual ways. In the French case, it might be tempting to attribute these stories to the decline of Christianity and eroding communities in France. But many citizens are very upset about France's demolished churches. Maybe deconstruction of churches is just the French 'local flavour' of a much larger pattern in the developed world. The salient question is: why are local people so helpless against the very local officials whom they elected? There is something else at work here besides changing values.

Other odd examples come to mind. The Swedish police have just confirmed that they kept illegal records and family trees of thousands of Roma people, based not on criminal activity, but on biological data. A host of American cities have gone bankrupt, most famously Detroit.

There are many reports in Canada of municipal authorities acquiring ever more power, even as they slash services and their budget demands continue to skyrocket on fiscally unsustainable paths. This non-transparent, mysterious trend has emerged across the country over the past 20 years; it is evident in property taxes, municipal debt, local corruption, municipalities' insurance premiums and policing - although this view is challenged by critics.

Central governments, seeking to cut costs and deficits, fob off their expensive policies and programs on local authorities. Strangely, the economic pressures on town councils offer them a 'get out of jail free card' to exercise greater autonomy and to enforce strange policies in the name of cutting costs.  In other words, localities are going broke, but they're becoming more powerful because they're going broke. Another facet of this trend is the rise of municipal politicians to national levels of power. Something tells me this pattern is at work in France. Will the 21st century provide history's cautionary tales about the dangers of overdecentralization?

See related stories on demolitions of churches across France, here, here, here, here and here. Below the jump, a video of the demolition of the church of Saint Jacques in Abbeville, France in April, 2013.


Monday, September 16, 2013

The Coming Siege Against Cognitive Liberties


Image Source: Nature.

The Chronicle of Higher Education reports on how researchers are debating the legal implications of technological advances in neuroscanning:
Imagine that psychologists are scanning a patient's brain, for some basic research purpose. As they do so, they stumble across a fleeting thought that their equipment is able to decode: The patient has committed a murder, or is thinking of committing one soon. What would the researchers be obliged to do with that information?

That hypothetical was floated a few weeks ago at the first meeting of the Presidential Commission for the Study of Bioethical Issues devoted to exploring societal and ethical issues raised by the government's Brain initiative (Brain Research through Advancing Innovative Neurotechnologies), which will put some $100-million in 2014 alone into the goal of mapping the brain. ...

One commissioner ... has been exploring precisely those sorts of far-out scenarios. Will brain scans undermine traditional notions of privacy? Are existing constitutional protections sufficient to guard our freedom of thought, or are new laws required as fMRI scanners and EEG detectors grow ever more precise?

Asking those questions is the Duke University associate professor of law Nita A. Farahany ... . "We have this idea of privacy that includes the space around our thoughts, which we only share with people we want to ... . Neuroscience shows that what we thought of as this zone of privacy can be breached." In one recent law-review article, she warned against a "coming siege against cognitive liberties."

Her particular interest is in how brain scans reshape our understanding of, or are checked by, the Fourth and Fifth Amendments of the Constitution. Respectively, they protect against "unreasonable searches and seizures" and self-incrimination, which forbids the state to turn any citizen into "a witness against himself." Will "taking the Fifth," a time-honored tactic in American courtrooms, mean anything in a world where the government can scan your brain? The answer may depend a lot on how the law comes down on another question: Is a brain scan more like an interview or a blood test? ...

Berkeley's [Jack] Gallant says that although it will take an unforeseen breakthrough, "assuming that science keeps marching on, there will eventually be a radar detector that you can point at somebody and it will read their brain cognitions remotely." ...

Moving roughly from less protected to more protected ... [Farahany's] categories [for reading the brain in legal terms] are: identifying information, automatic information (produced by the brain or body without effort or conscious thought), memorialized information (that is, memories), and uttered information. (Contrary to idiomatic usage, her "uttered" information can include information uttered only in the mind. At the least, she observes, we may need stronger Miranda warnings, specifying that what you say, even silently to yourself, can be used against you.) ...

In a book to be published next month, Mind, Brains, and Law: The Conceptual Foundations of Law and Neuroscience (Oxford University Press), Michael S. Pardo and Dennis Patterson directly confront Farahany's work. They argue that her evidence categories do not necessarily track people's moral intuitions—that physical evidence can be even more personal than thought can. "We assume," they write, "that many people would expect a greater privacy interest in the content of information about their blood"—identifying or automatic information, like HIV status—"than in the content of their memories or evoked utterances on a variety of nonpersonal matters."

On the Fifth Amendment question, the two authors "resist" the notion that a memory could ever be considered analogous to a book or an MP3 file and be unprotected, the idea Farahany flirts with. And where the Fourth Amendment is concerned, Pardo, a professor of law at the University of Alabama, writes in an e-mail, "I do think that lie-detection brain scans would be treated like blood draws." ...

[Farahany] says ... her critics are overly concerned with the "bright line" of physical testimony: "All of them are just grappling with current doctrine. What I'm trying to do is reimagine and newly conceive of how we think of doctrine."

"The bright line has never worked," she continues. "Truthfully, there are things that fall in between, and a better thing to do is to describe the levels of in-betweenness than to inappropriately and with great difficulty assign them to one category or another."

Among those staking out the brightest line is Paul Root Wolpe, a professor of bioethics at Emory University. "The skull," he says, "should be an absolute zone of privacy." He maintains that position even for the scenario of the suspected terrorist and the ticking time bomb, which is invariably raised against his position.
"As Sartre said, the ultimate power or right of a person is to say, 'No,'" Wolpe observes. "What happens if that right is taken away—if I say 'No' and they strap me down and get the information anyway? I want to say the state never has a right to use those technologies." ... Farahany stakes out more of a middle ground, arguing that, as with most legal issues, the interests of the state need to be balanced against those of the individual. ...

The nonprotection of automatic information, she writes, amounts to "a disturbing secret lurking beneath the surface of existing doctrine." Telephone metadata, another kind of automatic information, can, after all, be as revealing as GPS tracking.

Farahany starts by showing how the secrets in our brains are threatened by technology. She winds up getting us to ponder all the secrets that a digitally savvy state can gain access to, with silky and ominous ease.
Much of the discussion among these legal researchers involves thinking about cognitive legal issues (motivations, actions, memories) in a way that is strongly influenced by computer-based metaphors. This is part of the new transhuman bias, evident in many branches of research. This confirms a surprising point: posthumanism is not some future hypothetical reality, where we all have chips in our brains and are cybernetically enhanced. It is an often-unconsidered way of life for people who are still 100 per cent human; it is a way of seeing the world.

This is the 'soft' impact of high technology, where there is an automatic assumption that we, our brains, or the living world around us, are like computers, with data which can be manipulated and downloaded.

In other words, it is not just the hard gadgetry of technological advances that initiates these insidious changes in law and society. If we really want to worry about the advent of a surveillance state, we must question the general mindset of tech industry designers, and people in general, who are unconsciously mimicking computers in the way they try to understand the world. From this unconscious mimicry comes changes to society for which computers are not technically responsible.

A false metaphorical correlation between human and machine - the expectation that organic lives must be artificially automated - is corrosive to the assumptions upon which functioning societies currently still rest. These assumptions are what Farahany would call "current doctrine." We take 'current doctrine' for granted. But at the same time, we now take for granted ideas that make 'current doctrine' increasingly impossible to maintain.

This is not to say that change is unnecessary or that technology has not brought vast improvements.

But is it really necessary for everything to go faster and faster? Do we need to be accessible to everyone and everything, day and night? Should our bosses, the police, the government, corporations, the media, let alone other citizens, know everything we do and think in private? Do we really need more powerful technology every six months? Why is it necessary that our gadgets increasingly become smaller, more intimate, and physically attached to us? Why is it not compulsory for all of us to learn (by this point) to a standard level how computers are built and how they are programmed?

We are accepting technology as a given, without second guessing the core assumptions driving that acceptance. Because we are not questioning what seems 'obvious,' researchers press on, in an equally unconsidered fashion, with their expectations that a total surveillance state is inevitable.

Thursday, September 5, 2013

Retro-Futurism 25: Tozo - Empire of the Spider

Tozo (4 September 2013) © By David O'Connell.

You may recall this post and this post, in which I described a great Web comic that is a perfect example of Millennial retro-futuristic style. The cartoonist, David O'Connell, combines steampunk-ish early-modern-to-nineteenth-century costumes and imagery with futuristic tropes. He sets his hero's story in a world that looks like a cross between Renaissance Venice, the fin-de-siècle Ottoman empire, and 1970s science fiction, all at the same time. One minute, the characters have Elizabethan lace collars, the next minute they are interacting with Star-Wars-type robots. It's just great. O'Connell finished his first odyssey with this character, Tozo the Public Servant, in 2012. Yesterday, after a long hiatus, he started a new story, Tozo - Empire of the Spider (see the beginning here).

See all my posts on Retro-Futurism.

Tuesday, June 11, 2013

Ellen Ripley Meets Therapeutic Nihilism


Still from Alien3 (1992) © Brandywine Productions / 20th Century Fox. Image Source: Alien Explorations.

In 1991, David Fincher directed the Alien sequel, Alien3, which was a decade and a half ahead of its time. The film was nearly ruined by studio interference and production problems. It had previously gone through versions to which science fiction author William Gibson, Eric Red (writer of the cult horror films The Hitcher and Near Dark), future Riddick director David Twohy, and New Zealand director Vincent Ward all separately contributed.

What audiences and critics found more difficult was the gloomy, apocalyptic plot. Alien3 marked the new era of the compromised protagonist. It was a narrative fraught with despair, difficult for audiences accustomed to triumphant cinematic conclusions. The heroine, Ellen Ripley, is even more heroic because she is not going to win.

Tuesday, May 14, 2013

The Problem with Memory 7: Space Museums

"The collection of images included on EchoStar XVI may be easier for any extraterrestrial intelligences to find than the plaques and records flown on the Pioneer and Voyager missions." Image Source: Creative Time via Space Review.

Space, the final archive. Some may remember this post on the Voyager spacecraft time capsules. A commercial communications satellite, launched from the Baikonur Cosmodrome in Kazakhstan on 20 November 2012, continued the tradition and carried a collection of photographs and images of artwork designed to outlast humanity. Objects in geosynchronous orbit (aka the Clarke Belt, named for writer Arthur C. Clarke) could stay in space for over four billion years; the new satellite's payload is meant to survive for this duration, to offer a potential time capsule for alien intelligences to discover.

Saturday, May 11, 2013

Interview: Colin Hall's Real Life X-File



In the past few years of blogging, I have seen some bizarre things on the Internet. Conspiracy theories have always been around, but since the turn of the Millennium, they have proliferated online to create a new kind of post-Postmodern folklore. Along with users' feverish circulation and misalignment of data, the Internet erodes the line between real and virtual, between fact and fiction.

Irrationalizations pose as quasi-rationalizations. The ideas which have sprung out of this mindset have become increasingly counter-intuitive and counter-factual: you have, to name a few, 9/11 truthers and associated chatter around Osama bin Laden's death and the purported deaths of the Navy SEAL team members who invaded his compound (this, despite the fact that the man who shot bin Laden was recently interviewed by Esquire); Illuminati New World Order fear mongers; and Bigfoot hunters. There are people who do not believe the moon landings took place, or that there are sinister reasons why we never went back to the moon (besides money and politics?). There are even people who seriously think that Queen Elizabeth II is descended from a race of lizard aliens!

In a way, what Web cultists fear is less important than the fact that their off-kilter belief systems foster new communities online. The Web turns social alienation on its head, so that the marginalized come together in interesting ways, as with the case of Preppers and computer hackers.

At the same time, we would be naive if we did not acknowledge that governments, corporations and practically every major organization have appreciated the value of the Web for propagandistic, marketing and political purposes over the past fifteen years. The 'Web' might become just that: a tight knot of social control. Part of that control may stem from our willingness to believe the unverifiable, the fantastic, the strange - even though right now, weirder online beliefs are associated with anti-establishment attitudes.

Most outlandish Web myths are just surreal popular entertainment. However, the more unsettling stories occupy grey areas and test our ability to verify fact and fiction.

Friday, January 25, 2013

Reservoir Gods


Image Source: Armenian History.

I was thinking about my posts on the collapse of heroism in Postmodern and post-Postmodern fiction and cinema, especially the Revolving Door of Death, in which heroes in pop culture over the past 25-odd years have been killed off for thrills and then brought back to life. Peter Chung explored that cat-came-back trope with his animated character, Aeon Flux, around 1991, and used it to comment on its moral nihilism.

Each age in the Great Year brings new standards of heroism as the Precession turns.

In the Age of Aries, classical heroes were not immortal. Whether they were warriors or prophets, their deaths in myths and religious legends were a huge breaking point in the heroic story.

In the Age of Pisces, the classical hero became a religious saviour through sacrifice, transcendence and immortality. He could come back from the dead, which meant the human hero had to become divine.

In the Age of Aquarius, heroes can come back from the dead; they are immortals, like vampires or zombies, but they still bear the daily drudgery, weaknesses and flaws of real human life. Humans don't become gods; gods become humans.

Thursday, October 25, 2012

Countdown to Hallowe'en 7: Millennial Romantic Gothic


Image Source: C0untess Bathory.

It is sometimes difficult for those who skate the surface of the Web to grasp how profoundly and rapidly the Internet is altering global cultures. The tidal wave of data and engaged masses create an exponentially-growing multimedia jumble. That hot mess is infinitely anachronistic and rudely disconnected from knowledge and context.

Today, an example of how social media have moved past the Hallowe'en themes in this post and this post, to generate a rumbling new Millennial subculture.

Saturday, October 13, 2012

Countdown to Hallowe'en 19: Return of the Dead

Image Source: Byte Size Biology.

For this month's Countdown to Hallowe'en blogathon, I am writing horror posts which relate to themes on this blog. However, some Millennial Bad Ideas I cover are so horrific in their own right that they are a horror story or film waiting to happen. No one has fictionalized them yet.

Some archaeologists and Prehistory theorists imagine that the history of advanced humans runs back tens of thousands of years earlier than thought, with environmental disasters such as Ice Ages and Atlantean flood events periodically wiping humankind's collective memory of what came before. (See my regular posts on Prehistory here.)

Others, however, look to a future in which the secrets of the deep past may simply be brought back to life and studied. io9 recently dismissed the cloning of Woolly Mammoths, but with new specimens turning up in Russian permafrost, that possibility persists. The horrific Millennial reality for today is the debate on the DNA research on, and potential cloning of, the Neanderthal. Apparently, the feat could be accomplished for about $30 million. By contrast, sending astronauts to Mars would cost somewhere between $40 and $80 billion.

This is a Millennial take on the already popular themes of zombies and the immortality of the resurrected: a separate human species became extinct, but could be revived by modern science. While there is nothing wrong with the work to decode the Neanderthal genome, the misapplication of that knowledge would be another matter.

Monday, October 1, 2012

Countdown to Hallowe'en 31: The Blair Witch Generation


Blair Witch Project Still. Image Source: BluRay Definition.

This year, Histories of Things to Come is one of the cryptkeepers in the Countdown to Hallowe'en blogathon. Every day this month, I will highlight different themes on this blog when skewed through a horror lens, from transhumanism to anniversaries; from generations to comics; from time-keeping to anti-ageing.

First up, The Blair Witch Project (1999), which together with its accompanying faux-documentary, Curse of the Blair Witch, proved that before the Internet hit full force, people still believed that something fake was real if its producers said it was real. The film has been extensively parodied and diminished by a poor sequel that turned the tropes of the original into clichés. But for a tiny pre-Millennial niche in time, this film owed its astronomical success to a perfect balance between 90s' grunge and a high-tech future. Although it had clear precedents, this was the beginning of Reality Horror.

Made for an initial (later expanded) budget of $20,000, Blair Witch made almost $250 million. It was the first film to be mainly marketed on the Internet. The film-makers exploited the public's pre- and early Web credulity. Wiki: "The film's official website featured fake police reports and 'newsreel-style' interviews. Due to this, audiences and critics initially thought it was an actual documentary about the 'missing' teenagers. These augmented the film's convincing found footage style to spark heated debates across the internet over whether the film was a real-life documentary or a work of fiction." You can see some of that fake supporting information at the film's official site here.

Thus, the bulk of the film's profits came from the technological innocence of a departing century. Because it walked the line between past and future so exactly, it is also a quintessential Generation X film. With that precedent set, this generation has continued to push the envelope in the grey area between reality and virtual reality.

The film also depended on a pared-down, classic horror plot. Its believability depended on a real historical resonance still lingering from Hawthorne-esque memories of Colonial America. Ironically, as the Web-savvy public have become much more cynical about found footage and other fake-reality new media gimmicks (Postmodern and post-Postmodern), it is Blair Witch's purely modern take on colonial witchery - its grounding in the past - that makes it hold up over time.

Thursday, August 30, 2012

Carl Jung's Millennial Time Bomb


Image from Jung's Red Book. Image Source: Amazon.

In this post, see the founder of analytical psychology, Carl Jung, in an interview conducted in 1957 by Dr. Richard I. Evans, a Presidential Medal of Freedom nominee. Wiki on Jung:
Jung created some of the best known psychological concepts, including the archetype, the collective unconscious, the complex, and synchronicity. The Myers-Briggs Type Indicator (MBTI), a popular psychometric instrument, has been developed from Jung's theories.

Jung saw the human psyche as "by nature religious" and made this religiousness the focus of his explorations. Jung is one of the best known contemporary contributors to dream analysis and symbolization.

Though he was a practicing clinician and considered himself to be a scientist, much of his life's work was spent exploring tangential areas, including Eastern and Western philosophy, alchemy, astrology, and sociology, as well as literature and the arts. His interest in philosophy and the occult led many to view him as a mystic.
These ideas reflect many aspects of the Millennial experience. Jung tried to discern the indiscernible; he sought to detect what exists in the unconscious. Jung contended that attention, concentration and memory failed at points where the unconscious was asserting itself. His hypothesis hints that ephemeral Millennial attention spans and Internet procrastination may mark the Web as a pool of the collective unconscious. In fact, many of Jung's ideas correspond to the growing divide between real life and virtual life on the Web.

Tuesday, August 28, 2012

Liminal Promise

Three-quarter moon over Logan, Utah (2009). Image © Ted Pease / Peezpix.

"According to the known laws of physics, the past and the future should be exactly the same."

"The future of the universe may be traveling back in time to meet us."

"Time may not be what we think it is. And all of eternity might already exist."
-Through the Wormhole (The Science Channel).

In storybooks and fairy-tales, myths and movies, we expect the highlights of a narrative to follow ups and downs, conflicts and catharses, resolutions and denouements. There are recognized patterns of fiction: almost every story, everywhere in the world, written in any time period, follows the thirty-six models listed here. Similarly, Christopher Booker won attention in 2005 for a book entitled The Seven Basic Plots, in which he argued that there are only seven stories: overcoming the monster; rags to riches; the quest; voyage and return; comedy; tragedy; and rebirth. These formulas imply that the human mind wants the world to follow a predetermined sequence of plot devices. In the modern myths of comic books and other story-telling media, these anticipated and often mandatory plot points are called 'beats.' These patterns are part of what the Berlin School of Experimental Psychology defined as the human urge to complete a fragmented or fractured picture of reality: Gestalt psychology.

When the psychological pressure of those narrative demands is applied to reality, we get into trouble. In cinema, 70 years of real life beats are routinely condensed into two fictional hours. On television, the time frame is likely 30 minutes to one hour. Maybe we can in truth only bear to have things work according to plan for short durations.

All the same, movies, television, comics, novels and other modern narrative forms have socialized the bulk of the modern public. Do people expect the beats of real life to arrive faster than they should? Or do they wrongly expect them to arrive at all? If one's life does not follow the expected 'story,' there can be tremendous pressure to make real life conform to fictional structures. When real life does not follow the prescribed story, a great deal of social anxiety and individual stress can result.

Friday, July 20, 2012

Chrono-Displacement Disorder: The New Normal

Image Source: Cauldrons and Cupcakes.

It used to be that the only ways our perception of the regular flow of time might change significantly were through grief, meditation, madness or trauma. At Ghost Hunting Theories, Sharon Day describes how stress makes people change their view of time, slowing the flow of time down, and transforming perception:
if you have recently lost a loved one -- that seems to make you more open to the other side, and if you have undergone major surgery or life-threatening illness. There are moments in our lives, where time slows down for us and we are out of sync with the mainstream, such as depression, illness, grief. Those times are the most often periods we are plagued with transients. Perhaps its our change in perspective of what is important, or changing our routine so we slow down and notice things.
In periods of altered temporal perception, Sharon believes that people see ghosts. This chimes with the notion that ghosts are electro-magnetic shades trapped in time pockets. Psychics and mediums, so the theory goes, perceive the underworld because they read time backwards, against its normal direction.

Even intensively studying history requires one to spend many present-day hours in a past time period, understanding the past on its own terms. The experience of suspending one's hindsight and accepting historical actors' lack of foreknowledge is not always congruous with life in the 'now.'

Now, though, the jarring experience once confined to superstition at worst, and to historians' archives at best, is available to anyone with a video camera and a computer. Popular gadgetry has increased people's awareness of time and their ability to manipulate it. Tech affects the temporal progression of language; it enables the mistreatment, misplacement and misunderstanding of historical events on the Internet; it takes cultural phenomena out of their temporal contexts. Computers grant access to whole bodies of historical knowledge, events and images and allow them to be treated like so much random data. Anyone can jumble historical details together in a post-Postmodern collage. This mish-mash is ordered according to nothing but presentist impulses, with no regard for historical context or respect for forward time flow. This trend has been appearing in a lot of Millennial dramas, fiction and cinema. Tech makes the fictional illness, chrono-displacement disorder, real. Moreover, tech makes chrono-displacement disorder the new normal.

The most notable examples of individuals' personal tech-enabled time-plays have so far been fairly benign. People have been more intuitively respectful of the temporal direction of history when they are talking about their own lives. Thus, their plays on time are historically conventional: there are time lapses, wherein people photograph themselves every day over several years, and then put the photos together to make a personal time lapse video (compression of time). Sometimes people revisit a famous event which recurs, decades later (time cycles). Similarly, people retain old footage of themselves and mesh it with new videos, to create a conversation between the past and present (a fairly eternal metaphor of the older man talking to his younger self, which is classic chronal wish fulfillment). For the latest example of the latter, go to the viral video below the jump, made by Jeremiah McDonald (see his sites here and here), in which the Gen Y actor and film-maker, now 32 years old, talks to his 12-year-old self (via Mike Lynch Cartoons - thanks to -T.).

Thursday, June 21, 2012

Recession, Apocalypse and Hipster Futures

"The photo was taken December 7, 1978 in Albuquerque, New Mexico before the company moved its offices to Washington. The people in the photo are (from left to right, starting at the top) Steve Wood, Bob Wallace, Jim Lane, Bob O' Rear, Bob Greenberg, Marc McDonald, Gordon Letwin, Bill Gates, Andrea Lewis, Marla Wood, and Paul Allen." Image/Text Source: Museum of Hoaxes.

Remember the poster of the original members of Microsoft Corp. in 1978, which challenges investors' ability to recognize future trends? Museum of Hoaxes comments on the Baby Boomer board's future worth:
If you had chosen to invest your money with this bunch of scruffy looking characters back in 1978, you'd be quite rich now. But how rich did the people in the photo become? Here's their estimated wealth, listed in descending order:

Bill Gates: Still with Microsoft as its chairman and chief software architect. His fortune is somewhere in the range of $50 billion.

Paul Allen: Left Microsoft in 1983 but remains a senior strategy advisor to the company. Worth around $25 billion.

Bob O'Rear: Left Microsoft in 1983. Is now a cattle rancher and is worth around $100 million.

Bob Greenberg: Left Microsoft in 1981 and then helped launch those Cabbage Patch Dolls that were so popular in the 1980s. Last time anyone checked, he was worth around $20 million.

Jim Lane: Left Microsoft in 1985. Now has his own software company and is worth around $20 million.

Gordon Letwin: Left Microsoft in 1993 and now devotes himself to environmental causes. Is worth around $20 million.

Steve and Marla Wood: They both left Microsoft in 1980 and Marla then sued the company for sex discrimination. They're worth around $15 million.

Bob Wallace: Left Microsoft in 1983. Worth around $5 million.

Andrea Lewis: Was Microsoft's first technical writer. Left the company in 1983. Worth around $2 million.

Marc McDonald: Was Microsoft's first employee. Left the company in 1984, but recently rejoined the company when Microsoft bought Design Intelligence, the company he was working for. Has the honor of getting to wear badge number 00001. Probably worth at least $1 million.
Nothing has changed since 1978. It is equally difficult to recognize today's counterparts of Microsoft Corp.'s scruffy characters.

2012: Would you invest in this man? Image Source: Bike Co.

This post is about two things: the tanking economy and the Internet-based underclass who will be architects of a future society in the post-recession wreckage, if they are lucky.

First, the tanking economy. Yesterday, the market plunged in another general global loss of confidence. The Wall Street Journal summarized the rat-a-tat-tat of headlines: Moody's downgraded the ratings of fifteen financial firms with global capital markets operations; Asian markets sank over fears of a "deepening global economic slowdown"; US stocks suffered the second-worst day of 2012; for the EU, the advice was to dump Greece, and save Italy and Spain; and tech stocks slumped in a broad sector retreat. Following yesterday's Wall Street nosedive, Aussie stocks opened down today. Business Spectator: "The Australian stock market extended its losses at noon on the back of disappointing economic data from China and Europe and a move from the US Federal Reserve to slash its growth forecast for the United States." The psychological atmosphere in the financial sector yesterday was one of anxiety, worry, frustration and thwarted premature optimism.

In other words, the recent economic recovery was only a middle part of a double-dip recession; the economy passed through the calm eye of a storm and is now hitting the wall on the other side. This argument demolishes the fiction that the recession ended in 2009, when it is in fact still ongoing. Today's drop was predicted by Charles Nenner, a former Goldmanite who remarked in March 2012 that the stock market rally in the first quarter of this year - the best since 1998 - would peak in April. He promised (here) that the rally would be followed by a plunge, which is now occurring.

Wednesday, June 20, 2012

Shopping for Other People's Dreams

Crania Anatomica Filigre: the third-most funded arts project ever supported on the site, Kickstarter. Image and sculpture (2011) © by Joshua Harker.

On Kickstarter, the crowd-funding site, if you can dream up just about anything and successfully pitch it, you will likely get enough financial support to do it. Despite worries about fragmented societies becoming more violent and corrupt, the Internet fosters new connections. Kickstarter donors display remarkable generosity when it comes to helping other people achieve their personal dreams. The notion that anyone can be a celebrity or can take a stab at greatness is partly a lifestyles trope developed by post-World War II mass media and marketing minds. Nonetheless, the enormous appeal of this self-serving formula ironically ensures the rebirth of collective well-being in online communities. Shopping for and supporting other people's dreams with small donations is its own reward (although Kickstarter project managers typically offer project-based rewards at each donation level) because it perpetuates the post-war mantra that you can be anything you want to be.

Video Source: Kickstarter.

This democratization of the Self via marketed universal egotism is as problematic as it is revolutionary. Contrary to all appearances, it was a delayed revolution. Although the Baby Boomers, popularly known in the media as the 'Me Generation,' would appear to embody the credo of self-discovery through capitalism, they in fact profited from establishment structures which pre-existed them and did not fully collapse until the late 2000s. In short, they were still rebellious children of an earlier economic system and era.

As we climb through financial wreckage, we are only now starting to see how economies will evolve. It's rough, but not all gloomy. As traditional charities struggle through the recession just when they are most needed, enterprising individuals are readily raising cash for their personal projects. The great strain of the recession inspires new kinds of generosity, a reappraisal of values, a return to the drawing board. Priorities once implemented via capitalistic vanities are being expressed through different types of donation and consumption. Novel modes of survival create innovative trades with altered rules.