TIMES, TIME, AND HALF A TIME. A HISTORY OF THE NEW MILLENNIUM.

Comments on a cultural reality between past and future.

This blog describes Metatime in the Posthuman experience, drawn from Sir Isaac Newton's secret work on the end of times, a tract in which he described Histories of Things to Come. His hidden papers on the occult were auctioned to two private buyers at Sotheby's in 1936, but were not available for public research until the 1990s.



Showing posts with label Postmodernism.

Thursday, June 6, 2019

What's Left Over? The Rationalist-Materialists


A quotation from the 2014 collection The Blooming of Madness 51, by Florida poet Christopher Poindexter. Image Source: pinterest.

A simple way to understand the philosophical crisis raised by technology is to ask yourself the question: 'What's left over?' This is a shorthand I devised, partly borrowed from the sci-fi writer Philip K. Dick (1928-1982).

Dick predicted the impact of simulated realities on our consciousness. Aware that simulations would soon be indistinguishable from organic beings and authentic objects, he kept trying to hit bedrock and finally concluded in his 1978 essay, 'How to Build a Universe That Doesn't Fall Apart Two Days Later' (collected in I Hope I Shall Arrive Soon): "Reality is that which, when you stop believing in it, doesn't go away." This can be a maxim for testing your own views and those of others regarding the mind and its relationship to reality, especially when it comes to the meaning of creation (whether one sees God as Creator or humans as creators) and created objects like technology.

My previous post introduced the hypothesis that how people view technology may be grounded in rationalist-materialism, materialism, or in anti-materialism. Today, I will start with the rationalist-materialists; two subsequent posts will discuss the materialists and the anti-materialists.

To define what I mean by those systems of thought, I asked 'What's left over?' after one removes complex narratives and beliefs about reality in each case. That is, what core attitudes are we talking about when everything else is stripped away? My answers vastly oversimplify different philosophies about the mind and matter, and avoid formal academic definitions; nevertheless, I hope they will clarify our current conundrum as technological simulacra become harder to control.

Monday, October 22, 2018

Hallowe'en Countdown 2018: CERN is Wonderland


The ALICE project at CERN: "ALICE (A Large Ion Collider Experiment) is a heavy-ion detector on the Large Hadron Collider (LHC) ring." Image Source: The Royal Society.

Today's post reveals how online researchers can take two terrifying ideas and combine them to create a new, counter-factual alt-history that is more terrifying because it seems more true, when it is actually less true.

Magic and Science, A Post-Truth Horror Mash-Up

These two frightening rumours are:

Theory 1: Evil Science
Those developing technology mean to enslave us with mind-bending and reality-altering devices that will get out of control.

Theory 2: Evil Magic
The CIA's historically-documented MKUltra program is a still-operational global scheme for brainwashing and controlling select individuals. The program's Nazi roots lead to murky accounts of ritualistic cults, black magic mayhem and murder, Satanic ritual abuse (SRA updates the ritual murder myth), prostitution and child torture. The hypothesis runs that MKUltra's malevolent techniques are now being applied wholesale via entertainment and news media to mainstream culture. A typical testimony in this genre is here.

Theory 1 seems more credible and tamer in terms of content than Theory 2. It isn't.

Theory 1: TLDR
  • The double-slit experiment showed that observation alters the behaviour of the system being observed. Scientists maintain that this occurs only at the quantum level.
  • Terrifying online theory: this leads to the question of whether any serious research projects have set out to understand and control that process on a large scale, thus potentially gaining the ability to shape general reality and change history at will.
  • To do this beyond the quantum level (if that is possible), researchers would have to control masses of observers in the first place; they would have to learn how to control human perception generally; and they would then have to begin to conduct experiments on that energetic field.
  • There is a hypothesis on the Internet that there may be a mathematical formula which defines how human perception affects reality. That is, this is a mathematical formula for the physics of human consciousness as it interacts with, and impresses itself upon, the material world.
  • A further hypothesis considers that the means to control human perception was already known in some cryptic way in occult or magical practices, but that means was not scientifically understood or instrumentalized in a reliable, industrial way. Thus far, the bending of space-time has supposedly been rooted in organic manifestations generated by highly-intuitive, spiritually-gifted, and esoterically-trained individuals. Some of these occult secrets have been revealed publicly through the arts.
  • Or they have been revealed through politics. Another fringe theory maintains that the murderous politics of Nazi Germany was an occult experiment in altering mass perception, thereby forcing reality and history to change, with genocidal consequences.
  • Yet another fringe theory asserts that after the fall of Nazi Germany, the Nazis' research into mass mind control was transferred to, and continued in, the USA and the UK at Project MKUltra and the Tavistock Institute, respectively.
  • Conspiracy theorists think all that work is now bearing terrible fruit and that CERN is the big scientific experiment dedicated to orchestrating human perception, thereby building a tool that will create new realities through the manipulation of mass consciousness.
  • One alt-researcher, Daniel Liszt, thinks that the map for CERN's research was originally presented in mathematical formulas embedded in the Alice in Wonderland books by British author Lewis Carroll. The Annotated Alice shows how Carroll did indeed weave mathematical ideas and puzzles into his fiction.
  • The assumed CERN-Wonderland connection was fostered by the fact that CERN actually has a heavy ion detector called ALICE.

Sunday, April 22, 2018

Time and Politics 26: The Age of Divergence


Image Source: 8 tracks radio.

Last week, Michael Morgenstern intellectualized disinformation at The Chronicle of Higher Education in 'Dear Humanities Profs: We Are the Problem. Dismayed about American politics? Look in the mirror.'

Morgenstern stated that postmodern theorists attacked the canon in the second half of the 20th century. According to these theorists, the western canon in arts and literature (and science) was a story of oppression, often intended, sometimes unconscious. This dominant account was written by Dead White Males. The period from the 1970s to 2010s was spent dismantling that canon, attacking power and privilege in the name of liberal civil rights and equality.

This approach extolled cultural relativism: there was no objective truth, no higher class of knowledge, no text, no vision of cultural superiority which could be offered as a mode of control (subtext: unless it was the new, divergent, relativist, postmodernist canon).

The Age of Divergence

Morgenstern has realized that this attack on cultural and intellectual convergence brought about our present circumstances. Postmodern literary critics had stated that every text was equal. Every text had its own 'civil rights.' Boomer intellectuals and their Gen X students recovered silenced voices, women writers, slave poets, indigenous histories, minority views. This was understandable and justified, because so many people had been mistreated and oppressed for decades, centuries, millennia. Without their voices, our histories were incomplete and our whole understanding of reality would be based on injustice and immorality.

However, this recovery of lost texts was also done in the name of undermining established experts and authorities. This was really a generational power struggle inside the academic profession, but it was dressed up in and justified with theory. Old tenured professors were unseated, early-retired, or pushed out. The aim was to supplant the older generation of intellectuals (viewed as 'the 1950s') with a radical new generation (defined as '1968'). But time has finally caught up with the 1968ers, who are now retired or retiring.

It's not as simple as this, but broadly speaking, this is Morgenstern's summary of how radical liberals attacked conservative authorities of the 1950s and built new intellectual value systems from the 1970s onward.

Morgenstern concluded that the liberals were successful. As a result, we now live in a world where no text is taken as true or accepted solely on the basis of the authority of its author or publisher. Unfortunately for this stratum of intellectuals, they now claim expertise, and by their own logic find themselves attacked, just as they once dismantled the institutional and cultural structures which came before them. Only one Chronicle reader, rebek13, pinned down Morgenstern's idea:
"I'm confused by other commenters who seem to have missed the main points I took from this, which were not so much about 'the canon' at all, which the author admits had exclusionary tendencies (though it need not).

What I see here is a critique of our abdication of the very idea of expertise, excellence, and beauty--literary studies serving as a prime example, but only that. We have in a postmodern haze suggested that tweets are just as good as texts, and that anyone's opinion on literature, philosophy, or history is really probably just as good as the expert who has spent years studying these fields. In doing so, we have made ourselves absolutely pointless and suggested the uselessness of our very fields.

Isn't it very odd that professors of literature have such poor defenses of the study of long, dense literary texts? Isn't it odd that many of my colleagues have turned to facebook comments and recipe books as objects of study (not scrutiny, ever, but study) as though these articles were just as precious as a novel? Why should students understand the value of a long text if we are saying 'everything is literature; nothing is any more worthwhile than anything else'?

It's funny. Colleagues with creative writing backgrounds seem to have a much more profound and certain appreciation of literature than those who spend their lives studying it. They would never suggest that a tweet could achieve the same things as a book. They are not that far gone.

The solution as I see it is not a return to canon as much as a return to the idea that some modes of thought and expression ARE better than others, and that experts are in a pretty good position to tell people how and why. Then we need to be those experts."
In short, due to the Internet, we are awash in oceans of information; and the very people who were supposed to decide what information was authoritative staked their own authority, over the past fifty-odd years, on the deconstruction of privilege around information. This overhaul was supposed to offer new freedom for disempowered liberals in the old, conservative system. It was not supposed to spell freedom for disempowered conservatives in the new, liberal system.

Friday, September 16, 2016

The Brontë Effect


Image Source: Opheliac Madness.

At the great blog, Trans-D, Dia Sobin finds artistic connections between layers of time and dimensional existence. Recently, she dug through a trove of old books - with initial posts here and here - and settled on a 1943 edition of Emily Brontë's Wuthering Heights (1847). She wrote an incredible post on how Catherine's and Heathcliff's love reveals the blurred boundaries of reality. I commented, because she described something one might call 'the Brontë Effect'; below, the passages quoted from Dia's post appear in single quotation marks, followed by my comments:
"'And, there is also the transdimensional aspect of the story: the odd way in which Emily presented her narratives, from several different points of view, intertwining numerous points in time, thereby, creating a weird, reverberating gestalt as opposed to a linear chronicle.' ... [I responded:] I felt that there was an indistinctness, especially because the characters give their kids the same names. Past, present and future are jumbled together. ...

I wonder if Emily Bronte was exposed via her father to Scottish freemasonry? Because when you look at the story in the sense of two souls in an alchemical marriage, the story becomes much more clear. Maybe she intuitively 'reached for' alchemical concepts without knowing them. I am sure someone has researched it. A lot of the primal gothic takes on the trans-dimensional or multi-dimensional aspects ... if you consider the alchemical. Across time, space, in new incarnations, like the two lovers embody a conflicting spirit of humans on the moors, but [also on] Jacob's Ladder ... ."
First, regarding Dia's observation that Wuthering Heights is trans-dimensional and multi-temporal, one senses this less in reading the novel, and more in the lingering impression after one reads it. The story leaves one with a feeling of time smashed together through characters' blurred and overlapping identities; their names and roles repeat, and generational tweaks are permitted over decades. The novel goes on forever, but Catherine is only about 18 years old when she dies at Thrushcross Grange. The 2009 dramatization had her die at age 25; either way, she remains eternally young and a persistent force.

ITV 2009 adaptation of Wuthering Heights, starring Tom Hardy as Heathcliff and Charlotte Riley as Catherine. Image Source: Elementary.

Friday, June 24, 2016

Time and Politics 20: Brexit


The statue of Winston Churchill at Westminster. Image Sources: The Atlantic and The Telegraph.

Although the blog is on a break, Brexit is a momentous historical event. It made me think of a quotation* from the Younger Pitt: "Depend on it, Mr. Burke ... we shall go on as we are to the Day of Judgement."

Perhaps. Although the UK will not leave the EU for two years, Irish and Scottish support for the European Union may lead to the reunification of Ireland, the separation of Scotland, and the break-up of the United Kingdom. Because the campaign became so dark, ugly and tragic, culminating with MP Jo Cox's murder, I will not comment at length on the arguments for one side or the other. I can see both points of view, because the Brexit debate confirms trends I have observed here while researching posts on the economy and the cultural impact of technological change.

Sunday, May 31, 2015

No Dislike Button: Social Media's Utopian Judgements and Misjudgements


Image Source: RLBPhotoart via Ghost Hunting Theories.

The blog is back! You know that gradual sense of erosion, the haunting of a Millennial mind as it over-surfs through a day that starts with optimism and ends with futility? How do social media contribute to a day's drift toward despair? In a New Yorker article from October 2014, Joshua Rothman criticized Facebook's fake optimism, its missing 'dislike' button, its relentless insistence that we like everything and constantly cough up happy thoughts and accomplishments to build a smiley online community (Hat tip: Daniel Neville). Rothman sees Facebook as an arena in which participants compete to be the greatest contributors to collective happiness, a happiness equated with a complex of good attitudes and real outputs offered as proof that good attitudes work. Beneath that lies a misjudgement of those who are not sharing enough good-attitude tidbits, or enough evidence of personal success. Rothman thus concludes that Facebook is one of the Web's Kafkaesque lower courts of judgement:
Facebook, like much of the Web, is officially designed to encourage positivity; there is no “dislike” button, and the stated goal is to facilitate affiliation and belonging. But, over time, the site’s utopian social bureaucracy has been overwhelmed by the Kafkaesque churn of punishment. ... Facebook has become a dream space of judgment—a place where people you may know only in the most casual way suddenly reveal themselves to be players in a pervasive system of discipline. The site is an accusation aggregator, and the news feed is the docket—full of opportunities to publicly admire the good or publicly denigrate the bad, to judge others for their mistakes or to be judged for doing it wrong.

Not all of Facebook is devoted to overt judgment and punishment, of course; there are plenty of cute family photos and fun listicles floating around. But even superficially innocuous posts can have a hearing-like, evidentiary aspect. (Paranoia, unfortunately, is inevitable in a Kafkaesque world.) The omnipresent “challenge”—one recent version, the “gratitude challenge,” asks you to post three things you’re grateful for every day for five days—is typically Kafkaesque: it’s punishment beneath a veneer of positivity, an accusation of ingratitude against which you must prove your innocence. ... Occasionally, if you post a selfie after your 10K or announce a new job, you might be congratulated for “doing it right.” But what feels great in your feed takes on, in others’ feeds, the character of what evolutionary psychologists call “altruistic punishment”—that is, punishment meted out to those who aren’t contributing to the good of the community.
Social media's stick-wielding positivity is divorced from human experience, while constantly appealing to experience as proof of its viability. You had better build the happiness of your online community, little Boot-camper. Or else. Positive cultural motivation supposedly drives productivity; except it doesn't. In this fake positive culture, dominated by Facebook's small egotists, success becomes meta-performance, which does not mirror the protracted work and grit needed to accomplish anything substantial. Anyone remotely sensitive to actual positives and negatives is left enervated, isolated, alienated, depressed.

Monday, April 27, 2015

Rewrite the History of the 1960s


Unpacking the head of the Statue of Liberty (1885). Image Source.

This week, my post on Johnny Depp and Winona Ryder was highlighted on one of Gen X's best blogs, Are You There, God? It's Me, Generation X. While thanking Jennifer James and checking the other links she listed, I was struck by the way Generation X remains ensnared as an echo generation, its identity projected upon from the outside by a narcissistic Boomer narrative. Jen writes: "Part of my intention is to maintain a tiny space on the Internet where evidence of Gen X society can be preserved." Why is her mission such a struggle?

Head of the Statue of Liberty, displayed in 1878 after completion at the Third Universal Exhibition, or World's Fair, in Paris. It was exhibited in Paris for several years before being shipped to the United States. Image Source: Albert Fernique (born c. 1841, died 1898) / LOC via pinterest. Published in 1883 in Frédéric Bartholdi's Album des Travaux de Construction de la Statue Colossale de la Liberté destinée au Port de New-York (Paris).

The Boomer narrative, that blinkered, one-track view of history, started as a story about 1960s' youth counterculture and fighting for liberty from the establishment. Boomers' liberties made them into libertines, who erected a monument to their own whims and pleasures around the period from the 1967 Summer of Love through 1969. The entire world now seemingly turns around that temporal pivot. Sometimes, it is also treated as a historical bottleneck. For the post-war period, 1967-1969 becomes a shorthand for the social, political and economic history of what happened in developed countries, and everything must go through that chokepoint, as it relates to the Boomer story. But what if everything didn't go through that chokepoint? What if that is not the way things happened? What if other histories ran concurrently from the 1940s through to the present that go unacknowledged because they don't fit the generational story? What if this is not a story about generations at all? What if you can find millions of individuals who don't fit the generational idea, and what if constructing a whole new social order around social alignments based on horizontal categories like 'age' is fake? If any of these suggestions are possible, then the history of the 1960s needs to be radically overturned and rewritten.

Saturday, February 28, 2015

Memespace Hyperventilation


Palaces in the sky: Dark Roasted Blend recently celebrated the incredible visions of French science fiction comics from the 1970s, which American and British directors mimicked in comics and cinema in the 1980s and 1990s. Image Source: Jean-Claude Mézières via Dark Roasted Blend (Hat tip: Me and You and a Blog Named Boo).

On 27 February 2015, Richard Branson encouraged entrepreneurs to come forward to share and expand new ideas. That's great, although some of the big biz riffing around the future celebrates the new idea of the new idea. One never actually gets to a new idea. The out-of-control lingo-about-lingo about the newness-of-newness reminds me of the explosion of Postmodern Expert Speak in the 1990s, which constructed new foundations of intellectual cultural authority.

The Valérian and Laureline "series focuses on the adventures of the dark-haired Valérian, a spatio-temporal agent, and his redheaded female companion, Laureline, as they travel the universe through space and time." Above, "Baroque spaceships (complete with ghost-ridden halls and gargoyles sticking out into the void of space)." Image Source: Dark Roasted Blend.

Mr. Branson quoted commenter Jason Silva, a photogenic Gen Y guru, who is a one-man meme generator and Singularity freestyle philosophical poet. He is compelling and makes good points, but there is something weird about the way he takes the Tech Revolution so literally and with such breathless utopian fervour. His clever rants reach height after height against IMAX effects. His videos are fantastic, if you like the Singularity Themepark Channel. His Youtube commentaries are part of the TestTube Network, which shares an unreflective undergraduate confidence that its contributors can fix the world, or at least understand it, if they edit it and add a soundtrack to it.

Silva's enthusiasm reminds me of the glassy-eyed idealism around the founding of America, or the Revolution in France. He joyously accepts the demolition of temporal boundaries and celebrates breaches of physical and cognitive limitations. He lacks a sense of Techno-Irony about the separate virtual lives enjoyed by his Online Language and Online Ego. To illustrate how Silva can be pithy yet simultaneously hollow, compare his Existential Bummer (the last video below) about death and a life beyond with another writer on similar topics. See Kate Sherrod's Story Sonnets: Infected (24 February 2015) and Who's the Real Crook Here? (23 February 2015).

Monday, February 16, 2015

Modernity, Myth and the Scapegoat: Martin Heidegger, J. R. R. Tolkien and ISIL


Heidegger, at the centre of the photo, in the era of Nazi academia. Image Source: Le phiblogZophe.

Two roads diverged in a wood; I took the one less traveled by, and that has made all the difference. In 2014, the private notebooks of German philosopher Martin Heidegger (1889-1976) - muse of Jean-Paul Sartre, Jacques Derrida and Hannah Arendt - saw print. The publication of the so-called Black Notebooks confirmed that Heidegger's philosophy grew out of support for the Nazis and an essential anti-Semitism. Oceans of ink have been spilt over what Heidegger meant by Dasein, or Being-in-the-World (his union of subjective, objective and conscious perspectives with the world at large), but this elaborate existential debate completely misses the historical context which informed Heidegger's thought. Heidegger associated his cherished idea of Authentic Existence with the values of agrarian Europe. For the German philosopher, rootless Jews were part of a new, supranational world of corporate industry, banking and trade. Jewish precursors of globalization contributed to an inauthenticity of being, a life whereby everyday people, distanced from the soil, became phantom slaves in a technology-driven world that destroyed traditional culture.

The Scapegoat by William Holman Hunt (1854-1856). Image Source: Wiki.

It is too simplistic to dismiss Heidegger's thoughts on being and time as aspects of the Nazi narrative. But it is also wrong to say that his ideas can be read separately from their Nazi context. Heidegger was in the same ballpark, and that demands a serious reappraisal of his ideas.

In building their Aryan mythology against the Jews, the Nazis ironically appropriated the Hebraic concept of scapegoating. The scapegoat was originally an early Archaic, pre-Classical improvement (dating from around the seventh century BCE) on the sacrificial rites of other ancient societies. Scapegoating, a mental gambit which is alive and well today, occurs when one projects one's sins onto a goat and sends it off into the desert to die; this leaves one free from blame and responsibility, and able to get on with life without feeling guilty for one's wrongdoings.

Saturday, March 22, 2014

Gilgamesh Amateur


Meanwhile, on Amazon.com ... Image Source: Assyrian International News Agency.

On Amazon, someone gave the Epic of Gilgamesh, the oldest known epic, one star. This was not one star for the quality of the translation in the Penguin Classics edition. It was one star for the story, because, as the reviewer put it, the epic was too "clichéd." The reviewer had seen it all before! And the epic's author, whoever he was, was trying to "do too much." Amazon:
I found it cliched and with many references that have not aged well. It is very derivative and employs most of the generic stock events from B-grade paperbacks. Also, most of the jokes will be lost on readers, as they refer to events that long ago passed into the historical record. However, probably the novel's biggest problem is its attempt to do 'too much'. A lot like the recent film Clash of the Titans, the author appears to have relied on introducing sensational mythical creatures in place of a well-fleshed-out storyline that engages with the modern reader.
That's the turn of the Millennium for you, when people who don't understand the direction of history think that original source works of world culture are "derivative." The reviewer is dimly aware that the epic is an 'old' work, but doesn't seem to understand that the Clash of the Titans movie, released in 2010, is not on par with The Epic of Gilgamesh, either artistically or chronologically. This is not about a Jungian eternal repetition of mythical archetypes. This is not about the theory of time, where history is an artificially-conceived flow and all times are equal. This is about a genuine inability to recognize the origins of culture as we have experienced it.

The reviewer wrote the review right under the Penguin blurb, which states:
Originally the work of an anonymous Babylonian poet who lived more than 3,700 years ago, The Epic of Gilgamesh tells of the heroic exploits of the ruler of the walled city of Uruk. Not content with the immortality conveyed by the renown of his great deeds, Gilgamesh journeys to the ends of the earth and beyond in his search for eternal life, encountering the wise man Uta-napishti, who relates the story of a great flood that swept the earth. This episode and several others in the epic anticipate stories in the Bible and in Homer, to the great interest of biblical and classical scholars. Told with intense feeling and imagination, this masterful tale of love and friendship, duty and death, is more than an object of scholarly concern; it is a vital rendering of universal themes that resonate across the ages and is considered the world's first truly great work of literature.
The epic dates from the 18th century BCE and its source stories are probably older. You can read it online for free here.

The review is one of the Internet's unexpected fruits: the surprise of vast ignorance in a sea of knowledge, information and pure data. And this is not the ignorance of someone who could not be bothered to look up the Wikipedia entry. The information was right there, on the same page.

Wednesday, December 11, 2013

A Millennial Artist: Jee Young Lee's Stage of Mind


Resurrection. Image Source: My Modern Met.

Caption for the above photograph: "Inspired by the Story of Shim Cheong, a Korean folktale, as well as by Shakespeare's Ophelia, Lee JeeYoung made this installation by painting paper lotus and flooding the room with fog and carbonic ice in order to create a mystic atmosphere.
 
Lotus flowers grow from the impure mud to reach for the light and bloom to the rise and fall of the sun; in Asia, it bears various cultural symbolisms such as prospects and rebirth. It is also known for its purifying function. The presence of the artist in the heart of such a flower is meant to convey her personal experience. 'I was born again by overcoming negative elements that had dragged me down and cleansed myself emotionally. The figure within a lotus blooming implies a stronger self who was just born again and is facing a new world'. It is this very moment, when one reaches maturity and full potential, that Lee illustrates in 'Resurrection', and, more generally speaking, throughout the entirety of her corpus."

My Modern Met reports on a South Korean Gen Y artist, Jee Young Lee, who creates beautiful interiors (hat tip: Ken Kaminesky):
Jee Young Lee creates highly elaborate scenes that require an incredible amount of patience and absolutely no photo manipulation. For weeks and sometimes months, the young Korean artist works in the confines of her small 360 x 410 x 240 cm studio bringing to life worlds that defy all logic. In the middle of the sets you can always find the artist herself, as these are self-portraits but of the unconventional kind. Inspired by either her personal life or old Korean fables, they each have their own backstory, which of course, only adds to the intense drama. From February 7 to March 7, 2014, OPIOM Gallery in Opio, France ... present[s] a selection of Lee's ongoing body of work called Stage of Mind.
Further from Brain Factory:
This exhibition introduces seven new photographic works ... a project on which the artist has been working continuously since 2007. Jee Young Lee “constructs” scenes for her camera rather than employing the traditional method of “taking” images such as still lifes, figures, or landscapes. ...
Lee's artistic motivation derives from her quest for personal identity. In each of Lee's stories, the artist is the protagonist. At times facing away from us, at other times showing only part of her body or reclining, she quietly and mysteriously inhabits her dream-like realms. Through their bold materials and patterns, dramatic colors, and intriguing narratives, Lee's new works signal maturity, coherence, and sophistication. The legends of East and West, Korean proverbs, personal childhood experiences, and immediate realities provide the motifs for her creations. ...
Lee's constructed realities belong to the “directorial mode,” employed since the 1980's by Postmodernist photographers in repudiation of the Modernist practice that sought truth in the everyday world. Lee's “constructed image photography” may be compared to the works of German sculptor and photographer Thomas Demand, who builds life-sized models he intends to demolish after photographing them. Her “staged photography” brings to mind tableaux vivant not unlike U.S. installation artist and photographer Sandy Skoglund's orchestrated room-size installations. But in contrast to these earlier artists, Lee's subjects are deeply personal and intensely psychological.
See more of Jee Young Lee's works below the jump and at this site. All works are copyrighted by the artist and are reproduced here under Fair Use.

Childhood. Image Source: Bored Panda.

Monday, August 26, 2013

Humanities, Arts, Critical Thinking? There's an App for That


Image Source: S. Gross via Living Life Forward.

It has become popular to label the arts and humanities as useless subjects whose graduates will never find jobs and will become a burden on society. In an arrogant and wrong-headed article, The New Republic blamed non-science specialists for the decline of their own disciplines, asserting that they display a "philistine indifference to science" and a dated devotion to dead-end Postmodernism:
The humanities are the domain in which the intrusion of science has produced the strongest recoil. Yet it is just that domain that would seem to be most in need of an infusion of new ideas. By most accounts, the humanities are in trouble. University programs are downsizing, the next generation of scholars is un- or underemployed, morale is sinking, students are staying away in droves. No thinking person should be indifferent to our society’s disinvestment from the humanities, which are indispensable to a civilized democracy.
Diagnoses of the malaise of the humanities rightly point to anti-intellectual trends in our culture and to the commercialization of our universities. But an honest appraisal would have to acknowledge that some of the damage is self-inflicted. The humanities have yet to recover from the disaster of postmodernism, with its defiant obscurantism, dogmatic relativism, and suffocating political correctness. And they have failed to define a progressive agenda. Several university presidents and provosts have lamented to me that when a scientist comes into their office, it’s to announce some exciting new research opportunity and demand the resources to pursue it. When a humanities scholar drops by, it’s to plead for respect for the way things have always been done.
The article also suggested that scientific principles should be applied to the humanities in order to help humanities specialists learn how to find 'real' and 'objective' truths in human affairs:
History nerds can adduce examples that support either answer, but that does not mean the questions are irresolvable. Political events are buffeted by many forces, so it’s possible that a given force is potent in general but submerged in a particular instance. With the advent of data science—the analysis of large, open-access data sets of numbers or text—signals can be extracted from the noise and debates in history and political science resolved more objectively.
Right, that's just what we need: to equate algorithms with keys to objective truth and to associate Big Data crunching with solutions to human problems. One could just as easily suggest that Big Data crunching is a source of many human problems.

Calls to eliminate the humanities extend to the arts and liberal arts, although the arts are deemed so insignificant that pundits indulging this anti-cultural trend mention them only rarely. One glance at the number of articles at the bottom of this post suggests that perhaps this is more than a trend - it is a new movement.

These calls are coming from both sides of the political fence, along with people working in two sectors: high tech and finance. These two sectors are presently climbing to a pinnacle of power. Yet they were arguably the source of much distress over the past few years. They, along with their political guardians, have failed in their core assumptions about economic productivity, while focusing excessively on marketing and consumption. And that is only the start of what is wrong with them. As the failures of these sectors have become more obvious, their commentators have gone on the offensive against the only preserves in society which offer any genuine alternative voice. What they cannot commandeer, they attack.

The frightening message to young students is, "Join the technocracy, or starve." If you can't be an engineer, a technocrat, a financial drone, or a marketer, what good are you? An article from Forbes on the 10 "least valuable" degrees reads like a checklist for building an unhealthy, plugged-in, spiritually impoverished and blindly unconscious police state, with no connection to the past, no understanding of cultural context, and no way of recognizing or combating political oppression. The 10 "worst" college majors listed are: (1) Anthropology and Archaeology; (2) Film, Video and Photographic Arts; (3) Fine Arts; (4) Philosophy and Religious Studies; (5) Liberal Arts; (6) Music; (7) Physical Fitness and Parks Recreation; (8) Commercial Art and Graphic Design; (9) History; (10) English Language and Literature.

This is a serious attack on arts and humanities disciplines. This is not about the so-called realities of economics. It is not about common sense. It is not about what is wrong with teaching in the arts and humanities. It is not even about the fact that the arts and humanities supposedly cultivate navel-gazing, lazy hipsters.

This trend against the arts and humanities is a vanguard bid for control of the establishment, from two sectors which (given their records so far) do not deserve even to make a bid for control at all. It is a battle for the right to communicate, to broadcast dominant social messages and to shape cultural values. It is a battle over the immense profits to be won through technologically-driven social control. It stems (no pun intended) from a larger awareness that politics, government and the economy are on the verge of massive transformation via technological advancements. Normally, the humanities would dominate this transformation because the change is occurring in the area of communications. Instead, people who work in these areas are being devalued, ignored, criminalized. The move to channel undergraduates away from these subjects reveals a need to diminish the numbers of those who understand particular branches of knowledge. It would be foolish and dangerous to see this as simple philistinism.

In the wake of the Great Recession and with the slow birth of a surveillance state, suddenly there are calls for the erasure of subjects which foster critical thinking about human behaviour, and meaningful understanding of grey areas in human affairs? Because they don't pay in the current economy? Well, what does pay in the current economy? Superhighways through archaeological sites and virgin rain forests? Global contracts to build nuclear power plants, based on faulty engineering? Construction bubbles in China which have built gigantic, empty cities? Giant dams to control water resources for one country at the expense of security in an entire region? Facial recognition software which puts the 'face' into Facebook? Where is the common sense in any of these projects? The argument against the arts and humanities reflects the consolidation of power away from, and isolating attack upon, any part of global culture that has not already been safely packaged, industrialized, appropriated and monopolized in the service of the fledgling technocracy.

Anti-humanities pundits argue that humanities specialists only have themselves to blame. The latter don't deliver value for money. They are mired in Postmodernism. They are politically skewed. They dispensed with standards. Critics are using these nitty-gritty problems in the humanities and arts fields to obscure much larger developments here: a failed economy and an incipient surveillance society. The Great Recession did not transpire because there are Philosophy majors in the world. If anything, we need more Philosophy majors: it would be nice if we still had people around who could distinguish fact from fiction in the Information Age.

Friday, August 23, 2013

Positive Thinking and Negative Capability


Image Source: Thee Online.

Yesterday's post notwithstanding, this post highlights a Millennial search for positives. In his 2009 book, Pronoia, Californian astrologer and Baby Boomer New Age thinker Rob Brezsny asserts that negative reporting (like this story) and toxic entertainment are rampant in the new Millennium's global society. Brezsny suggests an antidote in the opposing coined term "pronoia," which John Perry Barlow defined as "the suspicion the Universe is a conspiracy on your behalf":
[P]ronoia is ... utterly at odds with conventional wisdom. The 19th-century poet John Keats [1795-1821] said that if something is not beautiful, it is probably not true. But the vast majority of modern storytellers - journalists, filmmakers, novelists, talk-show hosts, and poets - assert the opposite: If something is not ugly, it is probably not true.

In a world that equates pessimism with acumen and regards stories about things falling apart as having the highest entertainment value, pronoia is deviant. It is a taboo so taboo that it's not even recognized as a taboo.

The average American child sees 20,000 simulated murders before reaching age 18. This is considered normal. There are thousands of films, television shows, and electronic games that depict people doing terrible things to each other. If you read newspapers and news sites on the Internet, you have every right to believe that Bad Nasty Things compose 90 percent of the human experience. The authors of thousands of books published this year will hope to lure you in through the glamour of killing, addiction, self-hatred, sexual pathology, shame, betrayal, extortion, robbery, cancer, arson, and torture.

But you will be hard-pressed to find more than a few novels, films, news stories, and TV shows that dare to depict life as a gift whose purpose is to enrich the human soul.

If you cultivate an affinity for pronoia, people you respect may wonder if you have not lost your way. You might appear to them as naive, eccentric, unrealistic, misguided, or even stupid. Your reputation could suffer and your social status could decline.

But that may be relatively easy to deal with compared to your struggle to create a new relationship with yourself. For starters, you will have to acknowledge that what you previously considered a strong-willed faculty - the ability to discern the weakness in everything - might actually be a mark of cowardice and laziness. Far from being evidence of your power and uniqueness, your drive to produce hard-edged opinions stoked by hostility is likely a sign that you've been brainwashed by the pedestrian influences of pop nihilism.
Does Keats's assertion that 'if something is not beautiful, it is probably not true' imply that widespread Millennial nihilism and negativity are lies? - Or do they initiate searches for a new baseline, for new values and new truths? Is negativity symptomatic of larger, positive growth? In some ways, we can view the push and pull between negativity and positivity in our time as a conflict between surviving strains of Enlightenment and Romanticism; in the Millennium, these strains trend between externally-imposed, alienating, hyper-rationalized mechanization and inward-looking, self-involved hyper-naturalism.


When Brezsny positively invokes Keats, he also points to Keats's famous idea of negative capability, a primal Romantic reaction against Enlightenment rationality (see comments here and here). Negative capability involves a Romantic immersion in imagination, the anti-rational, the legendary. It concerns an intuitive jump in the apprehension of the natural world at its most mysterious, which treats nature as something transcendent, not as a series of secrets unlocked by science. Keats's 1819 poem Ode to a Nightingale (hear it here) expressed a Romantic search for natural beauty transformed by imagination into a healing, immortal myth; this imaginative process eases the daily sufferings and ultimately mortal troubles upon which reason fixates. But negative capability also embraces uncertainty and strife; it refers to the self-doubt one experiences when one is pushed past one's limits and beyond one's expectations by extreme experiences or emotions. In the negative realm, one must exist beyond the conventional, the labeled, beyond the boundaries of settled norms. Negative capability enables survival through a period of unknowing.

Out of the same Romantic movement to which Keats adhered came the Byronic hero. The Byronic hero is a predecessor of the Postmodern, broken anti-hero. Whether he is a criminal acting as hero, or a flawed, fallen hero, the anti-hero is the standard protagonist in Millennial fiction and entertainment. In today's stories, the drama hinges on whether the broken hero can become heroic. In other words, our epics explore how we may transform our negative world into a positive one. Our favoured tropes imply that only alienated people have moved past moribund limitations to attain the broader view necessary to achieve that transformation.

Often, broken norms or normlessness are taken today as signs of cultural collapse, political failure and societal doom. But Keats suggested that the ability to cope in a realm of social and cultural uncertainty is a negative art, which ultimately rewards with positive beauty and regeneration.

Source: Citation is © Brezsny, Rob (2009). Pronoia Is the Antidote for Paranoia, Revised and Expanded: How the Whole World Is Conspiring to Shower You with Blessings. North Atlantic Books and Televisionary Publishing. ISBN 1-55643-818-4, pp. 61-62.

Thursday, August 22, 2013

Where Are We Going? No Really, Where Are We Going?

Google Glass: 2012 preview, for release to consumers in 2014. Image Source: Extreme Tech.

The first twenty years of the Internet involved playing mental catch-up as the industry excitedly released each new application, operating system, or gadget. Except for think pieces at Wired, which launched in 1993 as a glossy magazine, few tried to grasp the implications as the sites and services rolled out - AOL (1991); Amazon (1994); eBay (1995); Yahoo! (1995); Craigslist (1995); Netflix (1997); PayPal (1998); Google (1998); Blogger (1999); Wikipedia (2001); Second Life (2003); LinkedIn (2003); Skype (2003); Facebook (2004); Digg (2004); YouTube (2005); Reddit (2005); Twitter (2006); Tumblr (2007); Pixlr (2008); Kickstarter (2009); Pinterest (2010); Instagram (2010). These are just the giants, with no mention of the porn sites, which do join the giants in the top rankings for traffic. See the Alexa Top 500 Global Sites for hundreds more of the world's most popular Web hubs. There are also thousands more Web apps and services which you will never have heard of, unless they meet your particular needs.

The book reader of the future, from Everyday Science and Mechanics magazine (April 1935). Image Source: Paleofuture.

As great as these sites, services and devices are, if you are lucky, you can remember what life was like before they came along. It was far from perfect. But all someone had to do to become inaccessible was not answer the telephone. Now it takes a lot of willpower, excuses and effort to disconnect.

Wireless Emergency Alert System: "'Many people do not realize that they carry a potentially life-saving tool with them in their pockets or purses every day,' said W. Craig Fugate, administrator of FEMA." Image Source: NYT.

On the night of 5-6 August, a friend who lives in California was wakened in the middle of the night by cell phones in the house ringing an alarm he had never heard before - the state Amber Alert for a child abduction:
California issued its first cellphone Amber Alert late Monday, as phones in Southern California received an alert of two missing children in San Diego.

The timing differed from phone to phone but sometime between late Monday and early Tuesday many mobile phones across Southern California received an alert regarding James Lee DiMaggio, suspected of killing Christina Anderson, 44, and kidnapping one or both of her children, Hannah, 16, and Ethan, 8, the Los Angeles Times reported. ...

Some cellphones received only a text message, others buzzed and beeped as part of the Wireless Emergency Alert program, a cellphone equivalent of the Emergency Alert System that creates a high-pitched test tone on television.
The amber alert frightened many people when their mobile phones began ringing strangely (listen here). The system also warns the public about any other kind of major threat:
When you get an Amber Alert on your phone, you will definitely know. The sound is somewhere between a squeal, a siren and a series of tones. Even if you have your phone on silent or vibrate, or have enabled a "Do Not Disturb" or "Sleep" setting, your device may make this sound. The alert will appear as a text message including all pertinent information. ...
At the end of 2012, CTIA-The Wireless Association announced the transition from a Wireless Amber Alert program to a Wireless Emergency Alerts (WEA) program. ... Now, the WEA program sends messages to users within the area of the suspected abduction. For example, if a child in Orlando is abducted, all eligible devices within that area will broadcast the alert. A representative from the California Highway Patrol told HLN that Amber Alerts have previously been issued through wireless carriers regionally, but Monday's alert was the first to be broadcast statewide. It is of note that the WEA system also broadcasts other types of emergency alerts, such as severe weather warnings and imminent threat alerts.
To my friend, the alert brought home the point that mobile phones have erased privacy and are just "personal tracking devices that we also use as telephones." Smartphones are good for tracking criminals. They're also good for tracking everyone else.

A system like this can be a very powerful tool, as Orson Welles discovered in 1938. The Emergency Alert even entered the English language: 'This is only a test' - or - 'This is not a test.' In February 2013, hackers broke into a Montana TV station's Emergency Alert System and aired a fake zombie apocalypse warning to demonstrate the system's vulnerabilities. Ars Technica reported in June 2013 that the TV and radio Emergency Alert System is generally hackable. I could not find comment online about whether the Wireless Emergency Alerts program is also hackable, but presumably it is.

Some would argue that worrying about the future is pointless and unhealthy. In a July post, Maria Popova noted that anxiety is often associated with contemplation of the future; also, recent psychological research links the suicidal mind with an over-contemplation of the future:
In Time Warped: Unlocking the Mysteries of Time Perception ... BBC’s Claudia Hammond explores the psychology of mitigating our worries:

Ad Kerkhof is a Dutch clinical psychologist who has worked in the field of suicide prevention for 30 years. He has observed that before attempting suicide people often experience a period of extreme rumination about the future. They sometimes reported that these obsessive thoughts had become so overwhelming that they felt death was the only way to escape. Kerkhof has developed techniques which help suicidal people to reduce this rumination and is now applying the same methods to people who worry on a more everyday basis. He has found that people worry about one topic more than any other — the future, often believing that the more hours they spend contemplating it, the more likely they are to find a solution to their problems. But this isn’t the case.
But what happens when the future becomes the present? As the technological future approached over the past 20 years, there seemed barely time to digest what was happening. It was enough to just keep up with the changes. There is a need to stand back, to see the big picture, to contemplate how we are changing as human beings, to understand what is happening to society, politics, the economy.

Devin Coldewey, a Seattle-based writer and photographer, has a number of interesting articles for TechCrunch (here) in which he tries to make sense of the impact of the Technological Revolution with reference to the past. In 2009, he compared Google and its many services to the construction of Roman roads (here). It was a metaphor-laden piece and pretty clumsy in its historical analogy. Nevertheless, Coldewey's comparison - between Google's messy-but-often-cool labs projects and the Roman road system - was intriguing. But Coldewey misunderstood the potential parallel in his historical comparison. The Roman road system was technologically revolutionary, but the purpose the roads served was not revolutionary at all. The Romans were building an empire. And so is Google.

Monday, May 27, 2013

Establishing the New Establishment


Title card to the opening of Episode 1 of the BBC series, Civilisation (1969). Image Source: BBC via Wiki.

Over the past generation, the word 'civilization,' especially as it relates to 'Western Civilization' (in capital letters), has become a contested subject. A politicized view in academic circles inverted the concept once taken for granted in the 1950s and 1960s. Scholars have challenged the idea of civilization as a source of racism, blind arrogance and violent imperial domination of other societies. Sometimes the critique looks at the Christian religion as a source of benighted oppression. Sometimes the fatal flaws of 'civilization' are colonialism, discrimination and power imbalances around race, class or gender. This post describes that Western/post-Western debate. It also considers how that debate has distracted from, and obscured, the evolution of new institutions and social conditions which constitute an emerging new establishment.

Wednesday, May 8, 2013

Time Management


Ozymandias in Watchmen. Image Source: Comic Vine.
 
From Comic Vine, on time, ego and morality:
"You don’t often get a character who is both the ultimate hero and villain of his piece. Ozymandias saves his world but, in doing so, becomes a terrible monster. In many ways this makes him the perfect statement about superheroes in the Post-Modern world. We don’t believe you can save the day without doing something horrible. Some will argue that the man has no personality, but his superiority complex, arrogance, and the weight he carries his decision with make him very real to me. Like Alexander the Great, he tries to unite the world with violence."

The Problem with Memory 6: Rashomon's Truth, Ego and Cyberethics

Rashomon (1950) film still. Image Source: A Potpourri of Vestiges.

If there is one mantra, for good or ill, of the new Millennium, it is that truth is relative and subjective. There is no more prescient illustration of this point than Akira Kurosawa's great Japanese film, Rashomon (1950). The movie features different characters giving contradictory accounts of a rape and murder.

A Potpourri of Vestiges explains Rashomon's underlying revelation about the human psyche - that any recollected 'truth' is a function of ego; the film concerns:
the unending desire of humans to placate their insatiable egos. This manipulation of facts has no limits and entirely depends upon the skill of imaginative improvisation of the individual along with his level of comfort at trickery. The ability to misinterpret comes naturally to the humans as an obvious tool to counter the adversities of life, and perhaps that's what makes it indispensable. As a direct consequence of contrivance, the concept of truth no longer remains universal but becomes rather subjective and a matter of individualistic perception. ... Rashomon is ingenious as its actual motive has nothing to do with the revelation of truth for verity is merely a matter of perception. On the contrary, Rashomon propounds to highlight the discrepancies among the different versions as a medium to depict the irrational complexities associated with the human psyche. Vintage Rashomon, such effect of the subjectivity of perception on recollection in the modern day parlance is more commonly known as The Rashomon Effect.
With the film's bleak conclusion, no objective moral standpoint is possible. Any perception of an objective moral standpoint is in fact subjective - and nearly infinitely variable and exploitable. The only boundaries are those imposed by ego, when it cannot bear to fabricate perception of the truth past certain points.

In other words, truth (and any correlation truth may have to morality) jointly depend on the functioning spectrums of private egotisms.  Perhaps that observation applies to the evolution of Cyberethics. The expansion of social networks, and our engagement with the Web in general, depend largely on how much we are willing to let the Internet build our social identities and realities.

See my earlier posts on Kurosawa here and here.

Friday, January 25, 2013

Reservoir Gods


Image Source: Armenian History.

I was thinking about my posts on the collapse of heroism in Postmodern and post-Postmodern fiction and cinema, especially the Revolving Door of Death, in which heroes in pop culture over the past 25-odd years have been killed off for thrills and then brought back to life. Peter Chung explored that cat-came-back trope with his animated character Aeon Flux around 1991, and used it to comment on its moral nihilism.

Each age in the Great Year brings new standards of heroism as the Precession turns.

In the Age of Aries, classical heroes were not immortal. Whether they were warriors or prophets, their deaths in myths and religious legends were a huge breaking point in the heroic story.

In the Age of Pisces, the classical hero became a religious saviour through sacrifice, transcendence and immortality. He could come back from the dead, which meant the human hero had to become divine.

In the Age of Aquarius, heroes can come back from the dead; they are immortals, like vampires or zombies, but they still bear the daily drudgery, weaknesses and flaws of real human life. Humans don't become gods, gods become humans.

Tuesday, November 6, 2012

Clawing Back to Recovery


Image Source: Media Bistro. 

Rarely have the domestic elections of one country meant so much to so many people inside that country and across the world. After the economic nightmare of the past four years (ironically delayed until now in some places), the United States stands at a fork in the road. Whatever choice the Americans make at the polls today, their collective decision may determine historic events in the 21st century.

Wednesday, October 24, 2012

Countdown to Hallowe'en 8: The Last Haunt of Living Memory

Image Source: Topic Sites.

One of the topics on this blog is memory, the tricks it plays, how it can be altered and co-opted, and above all, the moment at which it gives way to history. Below the jump, see a 9 February 1956 American game show television broadcast, which featured an interview with the only living witness of the assassination of President Abraham Lincoln on 14 April 1865 at Ford's Theatre in Washington D.C. (Hat tip: The Atlantic.)