Comments on a cultural reality between past and future.

This blog describes Metatime in the Posthuman experience, drawn from Sir Isaac Newton's secret work on the future end of times, a tract in which he described Histories of Things to Come. His hidden papers on the occult were auctioned to two private buyers in 1936 at Sotheby's, but were not available for public research until the 1990s.

Showing posts with label Empiricism. Show all posts

Tuesday, October 22, 2019

What's Left Over? The Materialist Algorithm for Cognition

Image Source: Morten Tolboll.

This post continues my investigation of materialism and anti-materialism as competing responses to technology. My central argument is that right-wing and left-wing descriptions of politics and the economy are misleading and obsolete. Politics and economics are evolving to mirror tech-oriented materialism and anti-materialism. The worst of the former is leading to tyrannical political oppression. The worst of the latter is leading to an alienation from the mainstream consensus about reality.

Thus far in this blog series, I have focussed on materialism as a way of seeing the world, which is grounded in empiricism, scientific exploration, rationalism, secularism, and the associated economic mode of capitalist consumption. All of these aspects concentrate on humankind's five senses and how they can measure and experience the physical realm.

More radical forms of materialism reveal where this stance is headed. In 2013, Stephen J. Cowley and Frédéric Vallée-Tourangeau edited a volume of scholarly essays entitled Cognition beyond the Brain: Computation, Interactivity and Human Artifice. The editors explain that the notion that we are free to think inside our heads is a fairy tale. They argue that the thoughts we have inside our own heads, as private expressions of personal existence and as ego-controlled responses to the outside world, constitute a bedtime story we tell ourselves about our independence as individual beings.

'Thinking,' for these academics, is not an internalized activity expressing the cognitive power and freedom of a single, rational creature. Rather, as I suggested in my post, Who Writes Your Reality?, 'thinking' is a culturally-modulated experience. It may even be constructed from the outside in. That is, your brain from this materialist standpoint is like a tabula rasa, upon which the outside world may write its programs as it wishes. The editors of Cognition beyond the Brain call old-fashioned notions of subjective 'thinking' a 'folk concept,' a culturally-shaped story we tell ourselves about what we are doing:
"Like all folk concepts, ‘thinking’ is a second-order construct used to ‘explain’ observations or, specifically, how action is—and should be—integrated with perception."
This is a radical departure from the earlier Postmodern deification of the subjective mind, wherein social objectivities were demolished and everyone's personal truth was considered sacrosanct.

Thursday, June 6, 2019

What's Left Over? The Rationalist-Materialists

A quotation from the 2014 collection The Blooming of Madness 51, by Florida poet Christopher Poindexter. Image Source: Pinterest.

A simple way to understand the philosophical crisis raised by technology is to ask yourself the question: 'What's left over?' This is a shorthand I devised, and partly borrowed from the sci-fi writer, Philip K. Dick (1928-1982).

Dick predicted the impact of simulated realities on our consciousness. Aware that simulations would soon be indistinguishable from organic beings and authentic objects, he kept trying to hit bedrock and finally concluded in his 1980 short story, I Hope I Shall Arrive Soon: "Reality is that which, when you stop believing in it, doesn't go away." This can be a maxim for testing your own views and those of others regarding the mind and its relationship to reality, especially when it comes to the meaning of creation (whether one sees God as Creator or humans as creators) and created objects like technology.

My previous post introduced the hypothesis that how people view technology may be grounded in rationalist-materialism, materialism, or anti-materialism. Today, I will start with the rationalist-materialists; two subsequent posts will discuss the materialists and the anti-materialists.

To define what I mean by those systems of thought, I asked 'What's left over?' after one removes complex narratives and beliefs about reality in each case. That is, what core attitudes are we talking about when everything else is stripped away? My answers vastly oversimplify different philosophies about the mind and matter, and avoid formal academic definitions; nevertheless, I hope they will clarify our current conundrum as technological simulacra become harder to control.

Tuesday, June 4, 2019

Believe Them or Not: The Red-Pilled Anti-Materialists

Notre-Dame Cathedral burned in Paris on 15 April 2019. Image Source: Ian Langsdon, EPA-EFE via MSN.

There are people on the Internet who are so red-pilled that they will shock today's greatest skeptics. They are more red-pilled than your average Black Pill. They are profoundly alienated from the mainstream and most would call them crazy and paranoid. Yet these rogues are significant because they could develop a future mainstream paradigm, if it isn't suppressed first. Red Pill leaders herald an online movement which will change the way we understand everything.

Red Pills are sometimes associated with right-wing politics, but are not mainstream conservatives. In 2015, The Telegraph dismissed red-pilled rumours that the late Tory PM of the UK, Edward Heath, was a child abuser. Image Source: The Telegraph.

What do they stand for? The mainstream condemns Red Pills as anti-social and anti-political: hackers, bloggers, vloggers, anarchists, libertarians, conservatives, the deplorables, the downtrodden, the impoverished, the religious, the deluded, the uneducated, the unintelligent. But are these anti-rationalists really anti-rational? Are they stupid, uncultured consumers of 'Fake News' and Deep Fakes, who will ruin democracy with their racist populist fascism? I would argue that scaremongering and stereotyping obscure the trend. What is actually unfolding is a battle between materialism (the mainstream) and anti-materialism (the red-pilled Underground). 

It is odd that Oxford, a hub in the world's power structure, is also producing leaders of the Underground. Two such figures are Dr. Joseph P. Farrell and Dr. Katherine Horton. Highly intelligent, they produce nuanced conspiracy theories that are as deeply researched as they are terrifying. If you look at them as advanced anti-materialists rather than as conspiracy theorists, their Oxford background makes more sense.

That said, their views are so incredible that they could be paid agents of disinformation, dispatched by the establishment to discredit Internet dissenters. And if not Farrell and Horton, then others with similar fringe opinions may be controlled opposition.

My post here offers only opening remarks on the shift which Horton and Farrell represent, to be analyzed in future posts. My aim in discussing these matters is not to support or deride Red Pills or their opponents. Rather, I mean to analyze the mainstream as a materialist movement and the Underground in terms of anti-materialism, without being sidetracked by inflammatory language which distorts how we usually consider these questions.

Stop 007 - Who I am (7 December 2016). Video Source: Youtube.

Thursday, December 15, 2016

Awaken the Amnesiacs 7: The Economist Predicts the Year 2017

The Economist predictions for 2017, released November 2016. Image Source: The Economist.

When I saw The Economist's tarot-spread cover for its special issue predicting the coming year, my heart sank. After the American election, #proofoflife at WikiLeaks, and the impeachment of the South Korean president, I did not want to think about any more political weirdness. In case you have had enough too, here is my TL;DR on The Economist's message:

The cover may or may not predict a terror attack in Europe or a nuclear threat in Asia in the first quarter of the year, cementing Trump's presidency in the second quarter. Despite the fact that Trump will backtrack on promises, his ability to mesmerize the populist masses will grow. A huge upheaval is expected in the third quarter, with Merkel likely to lose the German election and Wilders and Le Pen gaining in earlier elections in the Netherlands and France. A divided body politic must find middle ground. After watching this unfold, the globalist liberal establishment will renew itself by December through the debut of exciting new technology, increased connectivity in developing countries, and a redoubled effort to control the public space, both online and off, with figures from government, banking, entertainment and academia entering online discourse in earnest. They will absorb formerly independent Internet enterprises, recapture lost ground, go on the attack, and flip the populist script.

On Youtube, I sensed the alt-right, alt-media types didn't want to look at The Economist. They are weary. "They do this to us every year," sighed one Youtuber after making three videos on the subject. Even the conspiracy theorists are sick of conspiracy theories. They only reviewed this Planet Trump thing out of a dogged sense of obligation to keep up their end - New World Order, Illuminati, Trump, Pizzagate, yadda yadda. They were all sure about one thing. The cover contains a hidden message.

I envy the flat rationalist, who would scoff before imagining anything arcane here. This is merely an illustrator's whimsical reference, as though the editors had chosen a picture of a woman reading the future in a crystal ball. I hope it is true.

Perhaps the editors did this as a tongue-in-cheek layout to bait crazy conspiracy theorists. Actually, I think the editors at The Economist do not care how the cover is received by the alt-media, who are beneath their contempt. If so, then this is another tone-deaf, top-down misconception of the impact of the Internet on the public debate. On the Internet, memetics, semiotics, virality and truth excavations rule the day, not inside jokes and polished arguments to a select audience. A tarot spread confirms the worst conspiracy theories about magic infecting high politics. To give evidence of this - straight from the horse's mouth - strengthens the alt-media, which cannot be good for The Economist.

In an earlier post, Magic, Numerology and the IMF, I observed that Christine Lagarde did not realize that she now had to speak to two audiences. Every speech she made would be channeled into the old MSM theatre, as well as the alt-media. She plainly did not understand the latter at all. Her clever asides would be taken the usual way by the old establishment, but on Youtube, they would feed the independent machine of conspiracy theorizing and populism.

To make matters worse, she seemed to confirm the conspiracy theorists' fears about a magic-obsessed establishment. Lagarde was making predictions for the year 2014, peppering her predictions for the coming year with numerological nudge-nudge-wink-wink-hint-hints, which promised (precisely, down to the day) the manipulation of oil prices to hurt Russia. To the alt-media, this made it look like her policy speech had a magical subtext and potency. The liberal democracies are supposed to be secular, humanist, and rationalist; they should not be run by the rules of parlour games, or by people who believe that parlour games are real. I argued that parlour games should stay in the parlour.

Thursday, July 28, 2016

Mind and Government, Terror and Ideology: Reframed

1907 photograph of an 1872 Leon Berger model guillotine, stored with its body basket. The photograph was reproduced by someone who currently makes historic replicas of guillotines. There had to be someone out there doing this. Oddly, there is more than one. Some people make mini-guillotines as a side hobby. The 1792 French Revolution guillotine mini-model plans are offered to aspiring carpenters on the Internet for USD $38, here. The finished mini-model (perfect for your back yard?) is here; the full-sized 1792 model, five times larger, built from the same plans for a Belgian museum, is here. Image Source: Bois de Justice.

This post was written before the terrorist attacks in Nice (14 July 2016) and Saint-Etienne-du-Rouvray (26 July 2016). With regard to those attacks, no disrespect is intended in discussing today's anniversary of the end of the Terror during the French Revolution. To be clear, although this analysis runs up to the present, it does not source radical Islamic terrorism in the western political system. I would argue that jihadism has its own specific origins, although it ironically mirrors as nemesis a western concern with the relationship between fear and control in psychology and politics.

This post on politics is the second of three on how perceived understanding or framing of reality diverges from hard facts, and creates problems in the historical narrative. I have a theory that when human beings build governments and devise theories of government, they project outwardly their awareness of the inner structure of the human psyche. That is, when we build and control society in the outer world, we embed how we think, perceive and feel into those constructions. And if there are parts of ourselves we would rather not face, we embed the suppression, too.

On a basic level, it makes sense. We fear our capacity for savagery and bloodshed, and know that the hell-pit at the dark end of the behavioural spectrum is something we ought to avoid. That is why the idea of climbing toward something higher through renewed social order is so appealing. The initial drive begins with a justified fear of the demons inside us and a moral journey to find the "better angels of our nature."

The French Revolution presents a powerful example of that journey and its challenges. Today marks the 222nd anniversary of the end of the Terror (6 September 1793 - 28 July 1794), a period of mass execution of enemies of the Revolution. It is ironic that 'terror' - described today as the greatest nemesis of global civilization - played a critical part in the establishment of modern western politics. Although there were revolutionary precursors in England and America, the founding moment began with the French Revolution. Everything we take for granted, from left-wing and right-wing politics, to the basic rights of human beings, was most clearly expressed there.

Today's post reconsiders the circumstances in which the west's current political ideologies developed, to see how the story of rational modern politics diverged from its reality. The French Revolution came dressed in the rhetoric of liberty, equality and fraternity, the respective sources of liberalism, socialism and nationalism. Revolutionaries changed how we measure time: months, hours, days. 18th century perceptions of time were different from post-revolutionary modern ones. The revolutionaries standardized weights and measures - previously a privilege of the nobility - with the creation of the metric system. They developed the modern media in their propaganda. They overturned a corrupt and bankrupt absolutist monarchical system, a privileged nobility and aristocracy, and a dominant clergy.

They did it through a commitment to rationalism. 1789's Tennis Court Oath was a pledge to develop a constitution, made in the spirit of earlier writings from the empiricist political philosopher and father of modern liberalism, John Locke (1632-1704). Locke's plan for government derived from his view of psychology. With his certainty that the mind was a tabula rasa, Locke insisted on experiential and logical systems of governance. He espoused the natural rights of man: life, liberty, and property. He protected those innate values through the social contract, imposed from outside upon the consenting individual in an embrace of nurture over nature. But starting with man's natural rights, he maintained that no one is innately superior to anyone else. He removed God and superstition from human politics, government and law, by stating that all men were divinely appointed to their state in nature. There was no divine right of kings: all people are equal.

From that natural and secular socialist equality, Locke derived fraternity and liberty as human beings left the pure state of nature and entered the body politic. As far as fraternity was concerned, toleration depended on having sufficiently enlightened, educated and morally informed citizens, who understood that some surrender of liberty was necessary to maintain a commonwealth. That social contract, if properly ordered, would clearly broadcast the principles and preconditions of mutual tolerance inside a nation. Within those non-totalitarian bounds, liberal citizens were free.

Locke influenced the French philosophes, notably Voltaire (1694-1778) and Jean-Jacques Rousseau (1712-1778). Further principles of liberty and separate powers came from other Enlightenment thinkers such as Montesquieu (1689-1755) to form the familiar 18th century values of the American constitution and the French Revolution. These thinkers drew the line between a divine source for the unified Church and State in absolutist monarchical systems and enlightened, secular, humanist, rationalist, democratic republics, with a separated Church and State. According to Montesquieu, there were underlying collective psychological trends in political development toward victory or defeat. Different types of government used varying core principles to drive those trends. The transition from monarchy to republic marked a shift in principles from honour to public virtue. But what must be avoided above all was a loss of liberty through fear. Wiki:
"[T]here were three main forms of government, each supported by a social 'principle': monarchies (free governments headed by a hereditary figure, e.g. king, queen, emperor), which rely on the principle of honor; republics (free governments headed by popularly elected leaders), which rely on the principle of virtue; and despotisms (enslaved governments headed by dictators), which rely on fear."
Thus, removing God from everyday government had created an interesting philosophical gap in the conception of modern politics. The unknown and unknowable had to be understood in new rational ways, or they would give rise to fear and dictatorship. In Some Thoughts Concerning Education (1693), Locke cautioned against raising children by intimidating them with fear. He warned against servants filling children's heads with fear of the dark, or goblins and monsters. Infantile superstition and threats bred subjection in grown men:
"Such bug-bear thoughts once got into the tender minds of children, and being set on with a strong impression from the dread that accompanies such apprehensions, sink deep, and fasten themselves so as not easily, if ever, to be got out again; and whilst they are there, frequently haunt them with strange visions, making children dastards when alone, and afraid of their shadows and darkness all their lives after. I have had those complain to me, when men, who had been thus used when young; that though their reason corrected the wrong ideas they had taken in, and they were satisfied that there was no cause to fear invisible beings more in the dark than in the light, yet that these notions were apt still upon any occasion to start up first in their prepossessed fancies, and not to be removed without some pains. ...

And to let you see how lasting and frightful images are, that take place in the mind early, I shall here tell you a pretty remarkable but true story. There was in a town in the west a man of a disturbed brain, whom the boys used to teaze when he came in their way: this fellow one day seeing in the street one of those lads, that used to vex him, stepped into a cutler’s shop he was near, and there seizing on a naked sword, made after the boy; who seeing him coming so armed, betook himself to his feet, and ran for his life, and by good luck had strength and heels enough to reach his father’s house before the mad-man could get up to him. The door was only latch’d; and when he had the latch in his hand, he turn’d about his head, to see how near his pursuer was, who was at the entrance of the porch, with his sword up ready to strike; and he had just time to get in, and clap to the door to avoid the blow, which, though his body escaped, his mind did not. This frightening idea made so deep an impression there, that it lasted many years, if not all his life after. For, telling this story when he was a man, he said, that after that time till then, he never went in at that door (that he could remember) at any time without looking back, whatever business he had in his head, or how little soever before he came thither he thought of this mad-man."
Locke's rational suppression, denial and dismissal of fear remained a weak alternative to the absolutist monarch's God. Given his denial of a priori knowledge and insistence on a posteriori knowledge, Locke faced the dilemmas of the rationalist, locked inside his own mind, guided only by his sense impressions of the world. Locke did consider what lay beyond empirical experience. In chapter 27 of An Essay Concerning Human Understanding (1689), he argued that worldly identity depended on an eternal, immaterial soul, incarnated in a physical body in the real world. In one example, that notion led him to suggest that a human being's worldly personal identity was distinct from the soul's consciousness. Worldly personality did not extend beyond the individual's rational thoughts, memories and life experiences. An eternal soul would have had past human lives, but a temporal individual personality housing that soul would have no memory of those past lives. In other words, Locke admitted that there were things beyond a posteriori awareness, but we have no rational access to them. Our only access to consciousness when building our personal identities would be through real life experiences and the memory of real life experiences. And that was the rock on which modern political order must be built.

However, when it came time to build the rational project during the French Revolution, to bring down the absolutist monarchy and remove God from government, the unknown manifested in the undertaking, in the form of the irrational element of fear. The rationalization of western politics depended on the Terror, on force as an instrument of fear to compel conformity to those ideals. Modern politics sealed a commitment to high intentions, rejected superstition and hereditary inequality; but it did so through mass intimidation and mass killing. From a psychological point of view, this means that when we strive toward highest purpose, we are still enmeshed in lowest impulses. The history of the French Revolution reflects a conscious-unconscious duality, as western political ideals emerged from bloodshed. The complete formula of the French Revolution would have been: liberty, equality, fraternity - and terror.

Wednesday, May 29, 2013

Millennial Mysteries: Bizarre Twists, the Lost, the Missing

Plaque at the gate entrance to Disneyland. Image Source: Wiki.

Many people, at some point in their lives, enter a realm bounded by mystery. This is a famous theme in noir and horror movies. David Lynch's Blue Velvet (1986) explored what would happen if two 'normal,' 'everyday,' 'rational' people veered off into mystery.

Wednesday, June 13, 2012

Dim Prospects

How not to get a job: study English. Image Source: Forbes.

Forbes just did a piece on the 'Best and Worst Master's Degrees for Jobs.' The top 10 degrees for getting a job fell in the computer sciences, physics, mathematics, the medical professions, business and economics. And the absolute worst Master's degrees for getting a job, according to descending pay and projected prospects, are:
  1. Library and Information Sciences
  2. English
  3. Music
  4. Education
  5. Biology
  6. Chemistry
  7. Counselling
  8. History
  9. Architecture
  10. Human Resources Management
Yes, at the turn of a Millennium, we see history and architecture down at the bottom. And at the very time when the technological revolution has caused global literacy and written communication to explode at a scale never before seen in human history, all the major disciplines which teach written expression and analysis are relegated to the bottom of the employment and pay scales? And in a global economy, the study of foreign languages, rhetoric and grammar also has no prospects? Really? Did politicians, economists, jobs analysts, bankers and financiers learn nothing in 2008? Why are they still allowed to control the balance of power in our societies?

Forbes did not mention other subjects in the traditional humanities - law, philosophy, classics, linguistics, fine arts, theatre, dance, theology or the applied arts - presumably because these fields did not even offer numbers high enough to enter their sample statistics. Nor did Forbes touch on social sciences beyond economics, except for psychology, which is listed via the underrated discipline of counselling.

No wonder there is a terrible recession on, when this unimaginative, blinkered, conventional view still predominates. This view places supreme value on activities which generate a communications revolution and the basic infrastructure of international trade. Forbes hands us a world of middlemen and technicians who apply knowledge rather than discovering it: administrators, marketers, managers, industrial designers, economists, engineers, medical personnel and computer scientists. Empiricism is king. Positivism and neo-positivism consume all mysteries.

If there are any twinges of uncertainty, there are mass marketing machines and media cultures which create the false impression that the human aspects of the Millennial tech revolution have indeed been addressed. But they haven't. In a recent interview, indie author Craig Stone attacked the cult of celebrity, a myth of paramount creativity drummed up by marketing and industrial business concerns, which are no longer the popular commercial model:
I think with social media and Kindle we finally have the fairest way to find the world’s best writers chosen from a pool of millions – rather than what we have had traditionally – the best writers presented to us by the publishing industry who choose for us from a limited pool.

Is Stephen King the best horror writer in the world? No – but he is the best horror writer from a small pool printed by the publishing industry who resist other writers to maintain the reputation of writers that are household names.

In this new dawn, those previously thought of as writing gods are going to be revealed as just writers. Good writers, perhaps, but not the best because for the one Stephen King published there are hundreds ignored to sustain his reputation as the best because it’s easier and more profitable to publish a terrible Stephen King book than a great new book by an unknown writer.
Ironically, wildly popular Millennial reality talent shows seek to combine the two creative realities Craig Stone identifies. But these efforts are slickly produced and do not really solve the problem. Faux creativity is draped gaudily over every new gadget and applied software suite. Social networks build the Big Lie of the Individual, made 'special' by 'friends,' 'connections' and personal preferences. At some point in the 1980s and onwards, the word 'club' was used constantly in marketing lingo to confer special membership and privilege, when in fact it denoted that one had been absorbed into another featureless herd.

Friday, June 3, 2011

Teenage Mutant Nazi Pilots

Roswell Daily Record (8 July 1947). Image Source: Wiki.

The Space Review has just published a scathing review of a new book by Annie Jacobsen.  The book, entitled Area 51, claims to explain the 1947 UFO mysteries and conspiracy theories about aliens at Roswell.  The Space Review is having none of that though, and angrily dismisses the book:
Although many reviewers claim that her research on atomic weapons tests and classified aircraft projects in the Nevada desert is well-researched and informative (it isn’t), the part of her book that the critics end up tripping over comes near the end of the 523-page book. That is where Jacobsen claims that children, perhaps as young as 13 years old and genetically or surgically altered by Nazi doctor Joseph Mengele, flew a Nazi “flying disk” into the United States as part of a plan by Joseph Stalin to cause mass panic of an alien invasion. The plane crashed in Roswell, New Mexico in 1947. The United States then engaged in unethical Mengele-like research at Area 51.

Read that paragraph again. I’ll wait.

Now a normal, sane person would read that claim and conclude that Jacobsen is a nut, or, at the very least, pretty gullible. Unfortunately, we don’t live in normal, sane times, and various reporters have been eating this up with Cool Whip and a cherry on top and acting as if the book is so well-researched that Jacobsen at least deserves the benefit of the doubt about the mutant Nazi teenage big-headed pilots, no matter how crazy that story seems. The New York Times even referred to the book as “levelheaded.”
I suppose once you're in Fox Mulder territory, you should go with the flow.  If you're going to get into UFOs, you have to be prepared for things to get - strange.  The Space Review is not willing to look at this book in the way the NYT has; the NYT made tongue-in-cheek references to Jacobsen's "dogged devotion to her research," which involved her talking to "a security guard — who, it turned out, had worked at Area 51 and became one of her most valuable sources."  I find it hard to believe that that is a serious statement.  Perhaps the NYT is standing back from this a bit, and taking the book as, say, a summer read that is an exercise in pop surrealism - or a metaphor for the Cold War?  What would a serious investigation into Roswell look like, given that it is a cultural phenomenon in and of itself? It's the ultimate Millennial gnostic puzzle box. You can't get to the bottom of it, as this BBC reporter attempted to, without acknowledging the conspiracy theories. It's impossible to get to the 'real truth' - nor does anyone really want to.  Jacobsen's book is out just in time for Roswell's 64th anniversary; the story still reflects a captured imagination about the incident, whatever it was. The most credible explanations I have seen point to some kernel of truth involving experimental aircraft and espionage.  UFO reports conceivably prevent accurate information from circulating about new aviation and weapons systems. There may be some basis to this, considering the recent furor in the press about the downed helicopter in the bin Laden raid, which revealed stealth technology. But that notion just opens up a new maze of conspiracy theories. Along those lines, there has been an upsurge in UFO reports and sightings in the past few years.  See here, here, here, here, here, here, here, here, here, here and here for a few of many examples.

But with this review, The Space Review touched on something far more important about information sourcing, journalism and copyrights in the Internet age, where 500+ page books can be written based on the testimony of agile sources and a Web full of wild material. The Space Review criticizes the professional credibility of the NYT reporter Janet Maslin, asserting that journalists should have held Jacobsen to account for her sourcing.  One reader of The Space Review agrees: "The reception for this book - good reviews in the [former] mainstreet media despite its obvious flaws - echos the similar reception that Craig Nelson's train wreck of a book, Rocket Men, got in 2009. One can only conclude that journalism standards simply no longer exist and the NYT, WaPo, NPR and the like, only are interested in selling papers or pumping up ratings."  The review concludes that Jacobsen swiped her idea from a James Blish sci-fi short story, Tomb Tapper, in the December 1956 issue of Astounding Science Fiction.

What struck me was what the review said about Jacobsen's sources: "When pressed about it on Nightline by Bill Weir, the only reporter who seems to remember what his job is, Jacobsen stated that as long as a source is credible to her, anything that source says is automatically credible and doesn’t require questioning—even if it defies common sense."  Her subject matter and the degree of seriousness it merits notwithstanding, Jacobsen's statement reveals the erosion of previous standards for discerning reality beyond ourselves. This is the ultimate twisting of the purpose of citation, which is supposed to introduce some measure of historical objectivity into a study by acknowledging a variety of opinions on a given question.  In this case, Jacobsen's statement betrays a credo that expresses the opposite value.  This is Postmodern subjectivity taken to a new extreme.  It's a latter-day, Millennial version of the Boomer creed, "I'm OK, you're OK."  If the source is 'believable' to Jacobsen personally as an individual, then it's 'good enough.'  This makes me think of the hazards of egotism on the Internet, where narcissism is so often rewarded as an end in and of itself.  Distinguishing truth from falsehood can be a tricky business at the best of times.  But if the only modality for sorting out the difference is what any given random Internet egotist 'feels is true,' then we might as well say that in the Information Age, when data has triumphed, the means for recognizing truth and fiction in that data, as we once knew them, no longer exist.

Addendum: CIA documents that outline some of the work done at Area 51 have just been declassified: see reports here and here; other CIA information released today is here.

See all my posts on Aliens.
See all my posts on Copyright.

Tuesday, May 31, 2011

Time Lapses: Ageing Backwards

Sam Klemke recording his life. Image Source: Hypervocal.

Yahoo News has a special interest story running about Sam Klemke, a 55-year-old caricature artist who has been filming himself every year for 35 years.  He has put together a (now viral) YouTube video in which he ages backwards over that time. As he moves backwards, the technology gets worse and worse too; given the connection he draws between video and time, the degradation of the tech constitutes an implicit subplot in this man's life.  It's another illustration of the weird confluence in our era between empiricism and gnosticism.  We believe that our technology can somehow function as an objective observer, and capture the real essence of ourselves, freed from the confines of time - thereby showing us a higher truth.  Good luck with that one. From the Yahoo report:
Sam Klemke began filming himself 35 years ago, long before the term vlogger even existed, using Super 8 film and a separate audio recorder because his camera couldn't record sound.

He hoped that one day he would have the technology to merge the video and audio. He has been recording himself every year for the last 35 years, and last month he posted the edited clips in a single video that shows his life backwards.

The video has gone viral, racking up nearly 700,000 views online. Klemke now finds himself doing interviews with stations all over the world. He is even being called the first vlogger by some.

"I've always had this fascination that you could capture time with a movie camera," he says to KVAL News. "I remember being fascinated that you could look at a film with Jimmy Stewart in the 70s. And then you could also look at one of him in the 30s and you could see, oh my God, you could see a 40-year span of his life right before your eyes with two separate movies."
See the video below the jump.

Sunday, May 29, 2011

How Historical Events Change Language

A traumatic event can spawn a host of new words; the people on the other side of the event emerge speaking a new vocabulary.

How do historical events - especially traumatic ones - change language?  One way is through the coining of neologisms. For example, while the 2008-2012? Great Recession persists, Time has done a little online piece about 'Post-Recession Lingo':
Adding to the list of post-recession terms such as "unbanked" (individuals without checking or savings accounts), "anti-dowry" (student loan debt holding you back from getting married or buying a house), and "Groupon remorse" (regret felt upon buying a daily deal you can't use or never really wanted), here's a roundup of zeitgeist-y phrases, including "squatter's rent," "light bulb anxiety," and "not retiring." 
"Financially Fragile"
If an emergency occurred and you needed to come up with $2,000 within 30 days, could you do it? (Legally, hopefully?) If not, then you'd be categorized as "financially fragile," and researchers say that nearly half of Americans fit the description. ...

"Light Bulb Anxiety"
This fear, based on oft-misunderstood legislation intended to phase out usage of traditional incandescent light bulbs, has caused business owners and everyday consumers to stock up the old-fashioned bulbs by the thousands, according to the NY Times. Why all the hoarding? Many people just prefer the light given off by incandescent bulbs over LED or compact fluorescent bulbs. Also, there are plenty of people who aren't sold on the idea that the new-fangled bulbs really save all that much money or energy: In one survey, one-third of homeowners who paid for energy-efficiency upgrades (including switching to CFL bulbs) hadn't seen the decrease in energy bills that they expected.

"Squatter's Rent"
Also referred to as "free rent," it's the money a homeowner—soon to be ex-homeowner, most likely—gets to keep each month when he stops paying the mortgage and has yet to be kicked out of the home. "Squatter's rent" around the nation is estimated to come to a total of $50 billion this year.
I have suffered from Light Bulb Anxiety, so I guess I'm glad there's a term for it.

New techniques in expression and new terms are needed to think about that which was previously unthinkable.  New words are signposts, showing us where the 'before' and 'after' of history are.  The removal of words indicates a break with the past.  But what happens to these linguistic reactions over the long term?  Do neologisms survive?  Do obliterated words, once forbidden by historical memories or historical shame, ever make a big return?  Sometimes, a population does away with its language altogether and switches to another one, apparently better suited to the aftermath.  Finally, traumatic histories tend to produce new forms of language focussed on changing our understanding of time.

Sunday, May 8, 2011

Blog's First Birthday: Notes on Newton

Godfrey Kneller's portrait of Sir Isaac Newton, aged 46 (1689). Image Source: Wiki.

This blog is one year old today.  It started with this post about Sir Isaac Newton - astronomer, mathematician, physicist and natural philosopher - who hid his analyses of the Bible and esoteric speculations on the end of the world for fear that he would lose face as a scientist. Despite his secrecy, he likely took these arcane musings as seriously as his scientific work.  His speculations on religious symbols and the occult, which he called Histories of Things to Come, were hidden away in a time capsule of sorts.  They did not become publicly available until our own time, that is, ironically, until the era which Newton believed would be affected by his predictions. 

What characterizes an era?  Newton's was a great mind that straddled a time of belief, faith, magic and mysticism on the one hand - and a time of rationalism, empiricism and scientific method on the other.  Newton addressed both traditions with equal attention.  That divided condition resembles our own period, when the Information and Tech Revolutions represent the ultimate triumph of the Age of Rationalism.  It is such a momentous triumph that it threatens to tip us again into a period of mystical awareness.  The uncomfortable tension and overlap between sense and sensibility are everywhere.  The turn of the Millennium is a hybrid between Enlightenment and Romanticism.

Thank you to everyone who has taken the time to read posts here on these subjects, whether they deal with one side of this polarity or the other.

Friday, April 8, 2011

Nuclear Culture 4: Worlds within Worlds

Plutonium Abraxsis by Judson Huss. Image Source: Snippits and Snappits.

Middle Eastern and Japanese news stories are offshoots of the same problem. The past 230 years of modernization, running hand-in-hand with liberal democratization, have meant that the great mass of people in developed societies gained standards of living far beyond anything they had known before. At the core of this nexus between industrial, technological and scientific advances, the rise in quality of life, and competing left and right wing political ideologies is one problem: energy.

Raising the bulk of the human population to this extent requires vast amounts of energy. Yet the sources we use bring many problems: ozone depletion; global warming; strategic conflicts over oil; controversy over natural gas drilling; pollution; terrorism; despots and popular revolutions in the Middle East; and fears about nuclear safety and the weaponization of civilian nuclear materials. One of the most prescient science fiction novels of the 1960s was Frank Herbert's Dune. In a way, it's even more accurate than Orwell's Nineteen Eighty-Four, because it moved beyond the political world to the deeper problems of technological determinism. Herbert saw that we would biologically and genetically contort ourselves to match our primary energy source. He made it clear: we will do anything for energy. That is because with energy, we have the raw force to accomplish whatever we can imagine.

Yet we face a deeper quandary.  It comes from precisely that - from 'whatever we can imagine.' Critics dismiss the Atomic Age with three words.  Hiroshima. Nagasaki. Chernobyl. But as terrible as nuclear weapons and accidents are, they are inseparably part of the wellspring of Millennial creativity.  The discovery of radioactive elements in the 19th and 20th centuries opened the door to atomic theory and quantum physics; these fields fundamentally altered our vision of reality, which had previously remained essentially unchanged since the Ancient Greeks. It's a sea change in perspective that no one can escape.

Monday, November 8, 2010

The Problem with Memory 2: The Science of Memory

Memory chip. Image Source: Venture Beat.

How do we remember?  What does the brain do, exactly, to create memories? What are we to make of a report like this one at Live Science, which states that memory is not just a product of brain cells forming connections - wherein neurons reorganize and signal one another to establish a memory - but that individual nerve cells can also hold short-term memories?  There's a piece here from October 25 at Phys.org which further explains how memories are born.  A memory is created when our brain makes groups of its cells "fire in unison" - each memory has a different pattern.  Scientists are trying to find treatments or prevention for Alzheimer's and dementia by administering drugs to older rats which stimulate their neurotransmitters.  This research has been headed by Professor Etan Markus at the University of Connecticut.  An earlier report from 2006 on memory creation in the brain is here.

Aside from the obvious fears of aging Baby Boomers, why is there pressure to figure out how memory works?  Consider that those who understand exactly how neurobiology and neuropsychology combine to give us our sense of time will conceivably be able to control, manufacture and bend memories - in advertising, in cinema, in public life, on the internet, in the military.  Phys.org just came out with a report that scientists have discovered how to erase memory: "Researchers working with mice have discovered that by removing a protein from the region of the brain responsible for recalling fear, they can permanently delete traumatic memories."

Wednesday, July 14, 2010

Black Holes and the Birth of Our Universe? Making the Infinite Finite

There's a remarkable new theory out about the origins of our universe, space and time. Nikodem Poplawski of Indiana University has just presented a paper in which he concludes: "Accordingly, our own Universe may be the interior of a black hole existing in another universe." For the original MIT blog piece go here.  For the July 4 abstract go here, which states:
"The Einstein-Cartan-Kibble-Sciama theory of gravity provides a simple scenario in early cosmology which is alternative to standard cosmic inflation and does not require scalar fields. The torsion of spacetime prevents the appearance of the cosmological singularity in the early Universe filled with Dirac particles averaged as a spin fluid. Instead, its expansion starts from a state at which the Universe has a minimum but finite radius. We show that the dynamics of the closed Universe immediately after this state naturally solves the flatness and horizon problems in cosmology because of an extremely small and negative torsion density parameter, $\Omega_S\approx -10^{-69}$. This scenario also suggests that the contraction of our Universe preceding the state of minimum radius could correspond to the dynamics of matter inside the event horizon of a newly formed black hole existing in another universe."

Sunday, May 30, 2010

Lovecraftian Time Slip

H. P. Lovecraft saw time as a well, a living thing, a key to dark, terrible secrets and alternate realities. Brand new material is unearthed at Miskatonic University's Department of Ancient Manuscripts: Cinema Suicide reports that The Whisperer in Darkness, produced by the H. P. Lovecraft Historical Society, will be released in October 2010 (see the trailer below in this post). Eldritch Animation has a new animated version of The Statement of Randolph Carter (see the video link below in this post).

Lovecraft, like M. R. James, relied on a mood of historical authenticity. Lovecraft established his historical mood by setting his stories in the well-guarded world of early twentieth-century academia and indulging his politically incorrect love of the WASP middle classes of New England, now targets of parody.