TIMES, TIME, AND HALF A TIME. A HISTORY OF THE NEW MILLENNIUM.

Comments on a cultural reality between past and future.

This blog describes Metatime in the Posthuman experience, drawn from Sir Isaac Newton's secret work on the future end of times, a tract in which he described Histories of Things to Come. His hidden papers on the occult were auctioned to two private buyers in 1936 at Sotheby's, but were not available for public research until the 1990s.



Showing posts with label Globalization. Show all posts

Sunday, November 4, 2018

Death in Panama


Image Source: The Daily Beast.

There is an Internet sub-genre devoted to investigating what has happened to missing people. One of the most disturbing recent cases of this kind captured the attention of bloggers, Redditors, Youtubers, and forum members.

"Lisanne Froon, aged 22. Picture taken on April 1, [2014] the day of her disappearance. Cellphone data, found in their backpack, revealed that attempts to dial 911 were made some 2 hours after this photo was taken. There was no reception. The last attempt to dial 911 was 10 days after their disappearance." Image Source: imgur.

This was the case of Kris Kremers and Lisanne Froon, two Dutch college graduates, aged 21 and 22. In March 2014, they traveled to Boquete, Panama for six weeks to learn Spanish and do volunteer work with local children. Then they disappeared during a day hike in the forest near the town.

At the time of writing this post, the debate on this case at the German forum, Allmystery, was nearly 600 pages long. One of the related Reddit threads is here. The police investigation focused on the girls' disappearance and its effect on the local tourist industry. This post and a subsequent post will ask about the much larger context of this case, including the Panama Papers.

Image Source: Kris Kremers/Facebook/imgur.

Monday, October 24, 2016

Countdown to Hallowe'en 2016: The Zombie Drug


Gif Source: giphy.

Among the world's scariest drugs, first there was krokodil in Russia, then Devil's Breath in Colombia. Less scary but equally devastating was the 2016 epidemic of synthetic marijuana, known as Spice or K2, in New York City, which provoked some Youtubers to imagine that these drugs are concocted and released to the general population by the Illuminati. Other conspiracy theorists argue that synthetic drugs are part of evil CIA or Masonic plots to destroy the public. In fact, they are made in labs in China. In 2015 and early 2016, a variant of synthetic bath salts, flakka, swept through South Florida, USA. Bad flakka trips turn people into deranged, rambling zombies. Far-right Christians called it the demon possession drug. In early 2015, when flakka reached Europe, the UK, Australia, and Canada, VICE told people to ignore flakka scare reports. However, VICE admitted:
"Nobody buys a substitute when the real thing is available. Now, anyone can get their hands on thousands of substances that otherwise wouldn't have been available and that there is next-to-no info about. New drugs like 'flakka' kill more than the old school ones ever did, but it's all done legally. What a breakthrough for society."
The fear was justified. Scary videos of flakka users flooded social media and reports became desperate as the drug spread to other states.

Saturday, September 3, 2016

Next Gen Prophet


Next Gen/Prophet by Office courtesy of DMT Tapes FL (2015). Video Source: Youtube.

Yesterday, I went into a shop where there was a vinyl LP record player used as a prop; it was playing an early Van Morrison album that I have not heard in a long time. The saleswoman told me her daughter, who is in her twenties, had never seen a vinyl record player and couldn't figure out how to turn it on. The record player was brand new, because vinyl LPs from the 1960s to the 1980s are back in fashion. From DMT Tapes FL, here is a track from the digital album, Compositions for Abandoned Shopping Malls (16 May 2015; Hat tip: Dan Bell). This retro-1980s electronic music is tagged alternative vaporwave / florida ambient / future funk / outsider ambient. Wiki:
"Vaporwave (or vapourwave) is a music genre and art movement that emerged in the early 2010s among Internet communities. It is characterized by a nostalgic or surrealist fascination with retro cultural aesthetics (typically of the 1980s, 1990s, and early-mid 2000s), entertainment technology, consumer culture and advertising, and styles of corporate and popular music such as lounge, smooth jazz and elevator music. Musical sampling is prevalent within the genre, with samples often pitched, layered or altered in classic chopped and screwed style. Central to the style is often a critical or satirical preoccupation with consumer capitalism, popular culture, and new-age tropes. ...

Music educator Grafton Tanner argued in his 2016 book Babbling Corpse: Vaporwave and the Commodification of Ghosts that 'Vaporwave is one artistic style that seeks to rearrange our relationship with electronic media by forcing us to recognize the unfamiliarity of ubiquitous technology.' He goes on in saying: 'Vaporwave is the music of non-times and non-places because it is skeptical of what consumer culture has done to time and space.' In his 2016 review of Hologram Plaza by Disconscious, an album in the mallsoft subgenre of vaporwave, Dylan Kilby of Sunbleach Media stated that '[t]he origins of mallsoft lie in the earliest explorations of vaporwave, where the concept of malls as large, soulless spaces of consumerism were evoked in some practitioner's utilization of vaporwave as a means for exploring the social ramifications of capitalism and globalization,' but that such an approach 'has largely petered out in the last few years in favor of pure sonic exploration/expression.'"
See my earlier posts on ambient music:

Image Source: reddit.

Image Source: Youtube.

Image Source: We Heart It.

Image Source: Phoenix 2772.

Vaporwave Wallpaper (2015). Image Source: Wallpaper Vortex.

Vaporwave Wallpaper (2016). Image Source: Wallpaper Vortex.

Thursday, March 5, 2015

Before the Ides of March


Street art by PichiAvo in Almería, Spain. Image Source: Knihovna Chrášťany.

Do you remember the year you first connected to the Internet? I have friends who participated in primitive discussion forums in 1992. I got my first email address in early 1995. At the time, it felt as though I had held out as long as possible. Happy anniversary: I have been online for twenty years.

Street art by PichiAvo at Hip Hop Street in Vicar, Almeria, Spain. Image Source: Street Art Hub.

On 7 May 2014, the L. A. Times reported how many people in the world have Internet access and how many do not:
60% of world's population still won't have Internet by the end of 2014. A report this week by the United Nations says nearly 3 billion people around the world will have access to the Internet by the end of 2014. But 4.2 billion will remain unconnected. ...
A commenter responded that Internet access should not be weighed against the basic necessities of survival - but should it?
While 60% of the world doesn't have internet, 69% won't have clean water, full medical services, low risk of war, incomes above $6900 a year, or above average infant mortality rates. You think the 900 million Chinese who are picking rice or working in sweatshops care about internet...
This dichotomy between the developed, connected world and developing, unconnected worlds, between being globally plugged in and anchored in a local reality, repeats in personal microcosm. We have connected lives online and distinct lives in meatspace. Constantly shifting from one's sea legs to one's land legs is stressful. How essential - or detrimental - is online activity to our basic survival, development and growth as individuals in the real world?

Street art by PichiAvo from Mislatas representan 2014 in Valencia, Spain. Image Source: Art the System.

It feels as though virtual life is growing at the expense of real life. When Karl Marx wrote that religion was the opiate of the masses, he could not have imagined this most potent drug, which keeps over 3 billion people pacified (you can watch them joining the Internet, one by one, here). Forty per cent of the world's population is connected. The push to get the remaining 60 per cent connected made me think about enormous budding economies and nascent power groups. With all that potential, the Internet could be a seat of freedom or the foundation of tyranny. What it will become depends on how one manages time online and off.

Saturday, January 3, 2015

Colour of the Year: Marsala


Images Source: Pantone.

Global cultural uniformity derives from marketing decisions which dictate the aesthetics of our lives. One such decision is the colour of the year, determined by New Jersey corporation Pantone, which tells all interior, events, makeup and clothing designers what palettes to use in the coming months. 2015's hue sits somewhere between cacao nibs and oxblood and is called 'Marsala.' Pantone has been choosing the colour of the year since 2000. You can see a video (here) of Pantone executive director Leatrice Eiseman describing why Marsala was chosen, mainly to answer a 'public need for nurturing and earthy rootedness.'

2014's colour: Radiant Orchid. Image Source: Pantone.

2014's Radiant Orchid palette supposedly drew from Indian influences. In fact, this purple palette is not typical of many Indian regional traditional palettes: see here, here, here, here, here, here, here and here. Image Source: Indian Wedding Site.

Image Source: Lamps Plus.

Last year's colour was 'Radiant Orchid.' In December 2013, polls greeted Radiant Orchid with mixed results. One poll respondent commented: "Let's just say subtlety is not Pantone's strong suit. No interest in using either of [Radiant Orchid or the 2013 colour of the year, Emerald Green]." And another: "How do they determine the color of the year? It never seems to be that popular in the Midwest. I remember a few years ago honeysuckle was the color of the year. So I stocked my store with honey colored accessories. Had to clearance them out. :)" Yet another: "Not a nice color. Can't imagine where I would use it." Another: "I'd like to be part of the panel that picks these 'color(s) of the year'. It appears the criteria include ... 'what is the craziest thing we can come up to see how many sheep follow us over the cliff.' In my mind, the color of the year should be one that is either (1) a color most used in the paint industry or (2) an innovative but usable combination for the majority of the populace. I realize that home décor colors typically follow fashion trends but I'm not quite ready to 'wear' my clothes on my walls."

Sunday, August 31, 2014

Time and Politics 12: College Days Before the War


 
Click to enlarge: application form to Elon College (1913). Image Source: Elon University via the Chronicle of Higher Education.

For today, as classes start at universities across North America this week, see a college application from 1913 to Elon University, North Carolina, USA. It is only four pages long! The source is a report from the Chronicle of Higher Education. Those were the days when liberal arts were both progressive and considered a solid background for doing just about anything (including, unfortunately, dying in World War I). Post-2008-recession, critics consider the liberal arts to be politicized breeding grounds of the hopelessly underemployed, unemployed and unemployable. Today's defenders of the liberal arts insist that the arts and humanities teach their students critical thinking. Part of that critical thinking can extend to considering how progressive the new Millennium really is.

A friend, J., observed that at Elon University, "They didn't mention that, until 1963, only white people need apply!" He suggested that the progressive view now recognizes that the western-centric view of history - which this application embodies with its emphasis on classics - has given way to a broader, enlightened world history.

I agreed that this is the current prevailing view, although I feel it contains an anachronism. We now assume automatically that the western-centric vision is causally bound to racism, inequality, slavery, oppression, patriarchy. The notion that today's discipline of world history is more advanced than the previously western-centric, classics-focused liberal arts curriculum includes its own hubris-laden, anachronistic assumption about contemporary progress.

In 1913, people could not travel or communicate the way we can now. So why would we automatically expect people from that time to have the same broad global vision we do? Yes, it was an oppressive, unequal, patriarchal system. But at the time, wasn't the classics curriculum the founding source of liberal arts education? Wasn't that curriculum considered the epitome of progress in 1913?

One hundred years from now, what will people say about late 20th century and early 21st century liberal views of inclusion? Probably they will say that those views were woefully benighted and reflective of their own time and place. We could equally say that today's world history discipline derives from perspectives informed by economic and political globalization, not the expansion of tolerance - even though it looks that way. Isn't it true that in today's globalized world, whose official creed is advanced progressive, tech-driven liberalism, there are more slaves now than at any time in history? And beyond that conventional definition of slavery, isn't technology not-so-quietly enslaving the entire plugged-in population? Bondage happens. That brutal reality - namely, that inequality, loss of freedom, vicious hatred, and violence are integral to the shiny, ultra-advanced globalized Millennium - breaks through heady tech dreams in unpleasant surprises and shocks.

Saturday, May 4, 2013

Boomer Legacies: The Employee-less Society


James Altucher: "People with jobs are like 'the walking dead.'" (2013) Video Source: Youtube.

The Baby Boomers inspire a long list of social and economic generational legacies which they do not care to acknowledge. As mentioned in this post, On Declaring Moral Bankruptcy, one of these legacies is the gutting of the liberal professions and the move to an employee-less society, in which employers feel no moral obligation or loyalty to their workers. The old concept of professional decency never worked very well: it was often patriarchal or sexist, and it encouraged nepotism and other forms of patronage. Despite these deep flaws, there was at least a culture of professional behaviour which suggested loyalty ran up and down the workplace hierarchy. Increasingly, we see the opposite: workers are held to ever higher and more demanding standards, while employers wash their hands of responsibility to their underlings.

From Yahoo: the Daily Ticker's Aaron Task talks to James Altucher about why you should quit your job. See Altucher's site here, and his view in the video above that across all sectors in the economy, employers are moving toward temporary employees with no job security and no benefits. When employment numbers finally improve significantly, what you will not see in the stats is that we will be living in a new economy. In that new economy, there will be no job security.

The Great Recession, Altucher maintains, was just an excuse to get rid of expensive permanent employees whom employers wanted to remove anyway, due to outsourcing, technology and globalization. Your permanent job (if you still have one), Altucher insists, is going to disappear anyway, so start looking at what your parallel alternatives are as a solo entrepreneur.

For the Baby Boomers who are seeking to cut costs, this is a remarkably short-sighted policy. It grounds the entire economy in the bottom dollar and ruthless, soulless efficiency. But anyone could tell you that money is just an idea, and as such it reflects a philosophy of existence. Remove the humane aspects from the very human act of work, and society overall will become less humane. Could we not have made the transition to a more mobile and globalized economy without pulling the temple down upon our heads? That choice, which Baby Boomer employers made and are still making in the 2000s and 2010s, will have long-term implications.

Given this treatment, I really wonder how eager Generations X, Y and Z will be to support Baby Boomers' exploding pension, health and other social welfare costs over the coming decades. Isn't there a chance that these generations might remember the 2000s-2010s, when they were casually fired and the culture of workplace concern for their welfare and futures was destroyed without a second thought? See my posts on Boomer workplace legacies with regard to Gen X here and here; Gen Y here and here and here; and Gen Z here.

For all my posts on 60s Legacies, go here.

Monday, August 8, 2011

The Worst Famine in 60 Years

Image Source: Dag Blog.

Ana the Imp has a penetrating comment on the current famine ravaging Somalia, the worst in 60 years:
Commenting in the Times on the present famine in Somalia, Tristan McConnell said that that the situation is set to get worse – “There are no rains expected for months and there will be no harvests until the end of the year at the earliest.”

Wherever the rain falls it does not fall on Somalia. It’s a disaster, an act of God, a theme taken up by the various aid agencies last month, who were blaming it on the “worst drought in sixty years.” Yes, it’s the worst drought in sixty years apart, that is, from the heavy downpours that have been flooding refugee camps in Mogadishu, the capital of this state that is not a state, over the last few days.

The simple fact is that the whole thing is a lie: God is not responsible for the starvation, people are, and the people I have in mind are the Islamists of al-Shabaab, who, as Aidan Hartley reports in the Spectator, control large parts of the south, ruling “over the population in a style reminiscent of Pol Pot’s Cambodia crossed with the Taleban.” Somalia is suffering from a famine alright, but it has political rather than natural causes.
I take her point.  But I can just see the assessment of what is happening in Somalia furnishing much material for political talking heads.  Radical Islam versus Global Warming.  A disaster is a disaster is a disaster.  How do we see beyond interpretations on which contending bloated political establishments have grown fat; and beyond the cloud of meta-information encouraged by television and the Internet?

Politics should be sidelined in the name of tackling reality. In this case, the bottom line is that "more than 29,000 children under the age of five have died in just the last 90 days in the southern part of the country." Yet how to aid Somalia's starving people gets lost behind the arcane process by which that reality is labeled. Time - the long view (60 years) and the short view (90 days) - becomes just another rhetorical tool. The interpretations of why the event is happening become more important than the fact that it's actually happening. And meanwhile, reality worsens, percolating beneath the hype.

Wednesday, July 27, 2011

A Lament for Strawberry Ice Cream


Image Source: Evernew Recipes.

Sic Transit Gloria Mundi - thus passes the glory of the world. Worldly things are fleeting. I was going to do this post on goods that are made to break (all of today's so-called 'consumer durables') so we'll buy more of them. It's amazing that there is such hype around our environmentally-conscious recycling culture, yet we are completely surrounded by throwaway planned obsolescence. Fleeting fashions. Problems with backwards compatibility. I remember Michael Franti's commentary on how technology inspired cognitive dissonance back in 1991. I keep pondering oxymoronic language that persists in this atmosphere, like 'disposable income,' 'approach avoidance,' 'arrested development,' and 'accelerated decrepitude.' I don't wonder what Franti thinks now of the Internet, a drug more potent than the "methadone metronome": you can see the video for his song, Television, the Drug of the Nation, here.

However, rather than talk about planned obsolescence, I thought I would talk about the planned eradication of some products that were perfectly good. These products have been eliminated for no reason other than that they 'don't fit' with the mythologies relentlessly churned out by up-to-the-second marketing minds. Another, related, phenomenon is Millennial rebranding. Examples:
  • I started thinking about planned eradication when DC Comics began hemming and hawing about whether it would erase its Titans pop culture canon because the comics medium is going digital. Gen X heroes 'don't fit' the visions demanded by new media, apparently.
  • Another giant 'rebranding for the Millennium' exercise took place at the Canadian Broadcasting Corporation, wherein the public broadcaster disposed of its classical music radio offerings and replaced them with alternative rock.  There was a national outcry against this in 2008, but the rebranding went ahead.
  • The Washington Post reported in late 2010 that Steven Spielberg is advising Nancy Pelosi on how to rebrand the Democratic party in the US to make the Democrats more appealing to Gen Y and Gen Z (Hat tip: Elite Trader): "Lawmakers say she is consulting marketing experts about building a stronger brand. The most prominent of her new whisperers is Steven Spielberg, the Hollywood director whose films have been works of branding genius."  Spielberg's rep denied this report.
  • There is an article here by Charlotte Werther about Britain's Millennial rebranding as 'Cool Britannia.'
Wiki on Millennial rebranding:
Rebranding has become something of a fad at the turn of the millennium, with some companies rebranding several times. The rebranding of Philip Morris to Altria was done to help the company shed its negative image. Other rebrandings, such as the British Post Office's attempt to rebrand itself as Consignia, have proved such a failure that millions more had to be spent going back to square one.

In a study of 165 cases of rebranding, Muzellec and Lambkin (2006) found that, whether a rebranding follows from corporate strategy (e.g., M&A) or constitutes the actual marketing strategy (change the corporate reputation), it aims at enhancing, regaining, transferring, and/or recreating the corporate brand equity.

According to Sinclair (1999:13), business the world over acknowledges the value of brands. “Brands, it seems, alongside ownership of copyright and trademarks, computer software and specialist know-how, are now at the heart of the intangible value investors place on companies.” As such, companies in the 21st century may find it necessary to relook their brand in terms of its relevancy to consumers and the changing marketplace. Successful rebranding projects can yield a brand better off than before.
Corporate Rebranding: Impact on Brand Equity (2006) © Laurent Muzellec and Mary Lambkin.

Rebranding often leads to the discontinuation of products, which has been going on for decades.  There are many sites devoted to discontinued products that consumers want to be able to buy again: see here, here, here, here and here. What strikes me is the helplessness of consumers in this instance - there is even a lobby group devoted to the problem. Marketers and managers have decided what consumers 'want,' even if that's not what consumers want. They ignore real demand and replace it with fake demand.

But what happens when planned eradication transcends specific products and even whole brands? Never mind publishers, corporations, power brokers and governments. Here is a much more common product: Strawberry Ice Cream. Try finding it in your local grocery store. Yes, you can get yuppified flavours like Strawberry Cheesecake Low Fat Frozen Yogurt; or Strawberry Cheesecake Ice Cream; Strawberry Red Bean Vanilla Ice; Strawberry Rose Chocolate Ice Cream; or Strawberry Azuki Beans Ice Cream. You can buy Black Cherry Ice Cream or Raspberry Sorbet. But for some reason, regular strawberry ice cream 'doesn't fit' in the new Millennium. I find this very strange. Of course, you can still get it somewhere in a decent gelateria. I'm talking about mass consumption. There are hundreds of recipes online so that you can make it yourself, meaning that a high demand for it is still there. When did it disappear from stores?

There is a Facebook group devoted to the loss of strawberry ice cream.  Why can't you buy JUST strawberry ice cream?? The group founder: "I love all types of ice cream it's true, and ... I love strawberry ice cream the best. After many late night discussions over this creamy dessert I have come to the shocking conclusion that there is an ice cream conspiracy - for no matter how hard I search, I can never find simply strawberry ice cream." Group members suspect the Neapolitan ice cream makers. There's another Facebook group on the disappearance: Strawberry Ice Cream: The Great Conspiracy.

It's not like strawberry ice cream was a one-hit wonder. It existed for a long time before it vanished from grocery store freezers in the recent past. It made it into the top ten ice cream flavours, squeezed in behind the likes of Cookies 'n' Cream ice cream and Chocolate Chip Cookie Dough ice cream.

Frivolous?  Yes.  But the larger point is valid.  What criteria do marketing consultants use to decide which products 'fit' the current moment and which ones do not? And why?  Perhaps in this case, a basic classic disappeared under too much choice, too much complexity, too much competing noise; marketing gimmicks fancied up the source product until there was no source product left.

Breathless ADDENDUM: I found some strawberry ice cream yesterday at a gas station convenience store. But as far as I can tell, it is still far from ubiquitous in grocery stores. Please write me if you find otherwise - I'd love to be proven wrong!

NOTES FOR READERS OF MY POSTS.
If you're not reading this post on Histories of Things to Come, the content has been scraped and republished without the original author's permission. Please let me know by following this link and leaving me a comment. Thank you.

Tuesday, July 26, 2011

Subliminal Slavery of the Subconscious Self


Revlon ad altered to highlight seeming subliminal death imagery. Image Source: Subliminal Manipulation.

The other day, I caught a telltale momentary flicker on the television and wondered - with some uneasiness - what subliminal message I had just seen. The history of subliminal messages is bound up with the rise of mass democracy. The manipulation of the hidden depths of individual psychology, the cult of the self, the obsession with subjectivity, moral relativism and self-love that intensified through the 20th century and crested with the movements of the Baby Boomers, all involved techniques to manipulate, control, and profit from the masses. Now, we should wonder how those techniques and ideas are carrying over into the virtual reality that the Web is becoming.

Thursday, July 21, 2011

Time and Politics 5: The Decline of Left and Right

Image Source: Dark Roasted Blend.

This is an apolitical blog, and for good reason: the Tech and Information Revolutions have transformed our societies almost overnight, and have swept away the standards and values that anchored us. This holds true wherever you are in the world, and whatever your values are. We can't know what new values will arise. Presumably, some will change global cultures and economies for the better, some for the worse. That 'normative entropy' also extends to the world of politics, where the platforms of left and right have crumbled. Politics has degenerated into name-calling, crisis-mongering, and a total breakdown of meaningful debate between left and right.  Any consensus that could be the basis of practical policy is also gone: voters vacated the political centre after 9/11.  Weirdly, political rhetoric has never been clearer about its left-right delineations.  But this fact belies the reality that actual partisan policies have disappeared, with politicians sifting through the wreckage, searching for anything that will get them elected.

Sunday, June 5, 2011

Things That Make You Go Hmm

Banksy, CCTV. Image Source: High Snobiety.

Every week, some headlines make you wonder, even more than usual.  These are some, new and old, that fall in that category:
Even for conspiracy theorists, it must be difficult to find any larger meaning in the daily flood of information.  I like a comment @paleofuture aka Matt Novak recently made: "Infinite narcissists strumming infinite e minor chords will eventually score the complete works of mumblecore."

Sunday, May 29, 2011

How Historical Events Change Language

A traumatic event can spawn a whole bunch of new words.  People on the other side of the event have a new vocabulary.

How do historical events - especially traumatic ones - change language?  One way is through the coining of neologisms. For example, while the 2008-2012? Great Recession persists, Time has done a little online piece about 'Post-Recession Lingo':
Adding to the list of post-recession terms such as "unbanked" (individuals without checking or savings accounts), "anti-dowry" (student loan debt holding you back from getting married or buying a house), and "Groupon remorse" (regret felt upon buying a daily deal you can't use or never really wanted), here's a roundup of zeitgeist-y phrases, including "squatter's rent," "light bulb anxiety," and "not retiring." 
"Financially Fragile"
If an emergency occurred and you needed to come up with $2,000 within 30 days, could you do it? (Legally, hopefully?) If not, then you'd be categorized as "financially fragile," and researchers say that nearly half of Americans fit the description. ...

"Light Bulb Anxiety"
This fear, based on oft-misunderstood legislation intended to phase out usage of traditional incandescent light bulbs, has caused business owners and everyday consumers to stock up the old-fashioned bulbs by the thousands, according to the NY Times. Why all the hoarding? Many people just prefer the light given off by incandescent bulbs over LED or compact fluorescent bulbs. Also, there are plenty of people who aren't sold on the idea that the new-fangled bulbs really save all that much money or energy: In one survey, one-third of homeowners who paid for energy-efficiency upgrades (including switching to CFL bulbs) hadn't seen the decrease in energy bills that they expected.

"Squatter's Rent"
Also referred to as "free rent," it's the money a homeowner—soon to be ex-homeowner, most likely—gets to keep each month when he stops paying the mortgage and has yet to be kicked out of the home. "Squatter's rent" around the nation is estimated to come to a total of $50 billion this year.
I have suffered from Light Bulb Anxiety, so I guess I'm glad there's a term for it.

New techniques in expression and new terms are needed to think about that which was previously unthinkable. New words are signposts, showing us where the 'before' and 'after' of history are. The removal of words indicates a break with the past. But what happens to these linguistic reactions over the long term? Do neologisms survive? Do obliterated words, once forbidden by historical memories or historical shame, ever make a big return? Sometimes, a population does away with its whole language altogether and switches to another one, apparently better suited to the aftermath. Finally, traumatic histories tend to produce new forms of language focussed on changing our understanding of time.

Sunday, May 22, 2011

The More We Want, The More We Suffer

Buddha teaching Kisa Gotami how to bear the death of her child. Image Source: Inspirations & Lessons In Life.

The Tech Revolution, combined with the ageing of much of the world's population in the West, China and Japan, leads us to an uneasy and pressured relationship with time, a globalized trade in youthful beauty, a troubling obsession with anti-ageing, and a collective compulsion to avoid the realities of hardship, illness and death. When I see Website after Website breathlessly anticipating the end of the world, I wonder if we have collectively lost the plot. The People of the Book and the Muslims are obsessed with end times. It's a death culture, jostling up against a seniors' youth culture.

Meanwhile, alarming events deepen our sense of unease. A dire news report of a meltdown at the Japanese Fukushima plant stated that the dreaded China Syndrome had occurred in Reactor #1. Last week, a worker died there; and the Japanese have widened the evacuation zone around the plant. If Armageddon does arrive, we will face it popping vitamins, buying the most expensive anti-wrinkle creams in the history of humankind, injecting Botox, dieting, going under the knife - and eating potassium iodide.

In this atmosphere, it's rare to see a meditation on death that is compassionate, brave and honest. Charles Wong's excellent Inspirations & Lessons In Life blog recently posted a poignant and personal comment that relates the vanities of advanced societies to the growing awareness of mortality, because the more we have, the more we have to lose - and we will always lose:
"Buddha said being human will sure have sufferings from birth, old age, sickness and dying. This is because we all have craving, therefore we exist in this cycle of rebirth. Death comes to everybody ... .  Once The Buddha asked this woman if she wants to have 10 children in her village and she said yes that would make me very happy. The Buddha then said if the 10 children died, you would suffer 10 times ... the more we want, the more we suffer."

Thursday, May 19, 2011

Technology and Society - Living Organisms

Twitter predicts the stock market (18 October 2010). Image: physics arXiv blog.

Two big Cyberpunk movie franchises from the turn of the century, The Terminator and The Matrix, depended on a cataclysmic moment when computer networks turned sentient. There is no danger of that yet, although there are signs of networks taking on an organic momentum of their own. The Internet may reflect the confluence of human thought and action in such a way as to make it representative of the collective senses of society. In the minds of some, like society itself, the Internet is almost a living creature. This idea lay at the heart of the excellent Darren Aronofsky movie, Pi. The concept clearly drives Google's seminal research into Artificial Intelligence.

Recent research along these lines by Johan Bollen and colleagues at Indiana University shows that Twitter is exhibiting the characteristics of an organic entity. This result was reported last October at the Technology Review: "An analysis of almost 10 million tweets from 2008 shows how they can be used to predict stock market movements up to 6 days in advance."  This is because Twitter is apparently a good index of the emotional state of the Twitterverse at large and that emotional state can be gauged:
Numerous studies show that stock market prices are not random and this implies that they ought to be predictable. The question is how to do it consistently. Today, Johan Bollen at Indiana University and a couple of pals say they've found just such a predictor buried in the seemingly mindless stream of words that emanates from the Twitterverse. For some time now, researchers have attempted to extract useful information from this firehose. One idea is that the stream of thought is representative of the mental state of humankind at any instant. Various groups have devised algorithms to analyse this datastream hoping to use it to take the temperature of various human states.
One algorithm, called the Google-Profile of Mood States (GPOMS), records the level of six states: happiness, kindness, alertness, sureness, vitality and calmness. 
The question that Bollen and co ask is whether any of these states correlates with stock market prices. After all, they say, it is not entirely beyond credence that the rise and fall of stock market prices is influenced by the public mood. So these guys took 9.7 million tweets posted by 2.7 million tweeters between March and December 2008 and looked for correlations between the GPOMS indices and whether the Dow Jones Industrial Average rose or fell each day. Their extraordinary conclusion is that there really is a correlation between the Dow Jones Industrial Average and one of the GPOMS indices--calmness. In fact, the calmness index appears to be a good predictor of whether the Dow Jones Industrial Average goes up or down between 2 and 6 days later. "We find an accuracy of 87.6% in predicting the daily up and down changes in the closing values of the Dow Jones Industrial Average," say Bollen and co[.] That's an incredible result--that a Twitter mood can predict the stock market--but the figures appear to point that way. ...

But there are at least two good reasons to suspect that this result may not be all it seems. The first is the lack of plausible mechanism: how could the Twitter mood measured by the calmness index actually affect the Dow Jones Industrial Average up to six days later? Nobody knows.

The second is that the Twitter feeds Bollen and co used were not just from the US but from around the globe. Although it's probably a fair assumption that a good proportion of these tweeters were based in the US in 2008, there's no way of knowing what proportion. By this reckoning, tweeters in Timbuktu somehow help predict the Dow Jones Industrial Average.
Bollen's publication is listed at Cornell University Library here.  The possibility that Twitter could be used to predict the stock market caused a fair amount of buzz.  But it really has much broader implications. Bollen and his co-authors speak of the collective emotional state of society, as assessed by evident patterns of behaviour on the Internet:
Behavioral economics tells us that emotions can profoundly affect individual behavior and decision-making. Does this also apply to societies at large, i.e., can societies experience mood states that affect their collective decision making? By extension is the public mood correlated or even predictive of economic indicators? ... We analyze the text content of daily Twitter feeds by two mood tracking tools, namely OpinionFinder that measures positive vs. negative mood and Google-Profile of Mood States (GPOMS) that measures mood in terms of 6 dimensions (Calm, Alert, Sure, Vital, Kind, and Happy).
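Out of curiosity, here is a rough Python sketch of the kind of test described above: take a daily mood series, lag it by a few days, and see how often it predicts whether the Dow closes up or down. This is my own toy reconstruction run on synthetic data, not Bollen's code; the three-day lag, the column names, and the median-threshold rule are illustrative assumptions.

    # Toy illustration (not Bollen et al.'s method or data): does a lagged daily
    # "calmness" score line up with whether the Dow closes up or down?
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    days = pd.date_range("2008-03-03", "2008-12-19", freq="B")  # business days, roughly the study period

    # Stand-ins for the real inputs: a daily mood index and daily closing prices.
    calmness = pd.Series(rng.normal(size=len(days)), index=days)
    djia_close = pd.Series(10_000 + 50 * rng.normal(size=len(days)).cumsum(), index=days)

    df = pd.DataFrame({
        "calmness_lag3": calmness.shift(3),             # mood measured three days earlier
        "up_day": (djia_close.diff() > 0).astype(int),  # 1 if the index closed higher than yesterday
    }).dropna()

    # Crude rule: predict an "up" day whenever the lagged calmness sits above its median,
    # then measure how often that guess matches the actual direction.
    guess = (df["calmness_lag3"] > df["calmness_lag3"].median()).astype(int)
    print(f"directional accuracy on synthetic data: {(guess == df['up_day']).mean():.1%}")

On random synthetic data this hovers around 50 per cent, which is the point: the reported 87.6% accuracy, if it holds, is a property of the real tweets and prices, not of the arithmetic.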
Separate research at Princeton has examined how ideas proliferate (Hat tip: Erekalert via Lee Hamilton's blog). Again, the researchers' focus is identifying the next big idea before it happens. In other words, it is not just a question of viewing the Internet as a system in which ideas hatch and grow. This vision extrapolates those patterns so that the Internet becomes a tool of prognostication:
Princeton computer scientists have developed a new way of tracing the origins and spread of ideas, a technique that could make it easier to gauge the influence of notable scholarly papers, buzz-generating news stories and other information sources.

The method relies on computer algorithms to analyze how language morphs over time within a group of documents -- whether they are research papers on quantum physics or blog posts about politics -- and to determine which documents were the most influential.

"The point is being able to manage the explosion of information made possible by computers and the Internet," said David Blei, an assistant professor of computer science at Princeton and the lead researcher on the project. "We're trying to make sense of how concepts move around. Maybe you want to know who coined a certain term like 'quark,' or search old news stories to find out where the first 1960s antiwar protest took place."

Blei said the new search technique might one day be used by historians, political scientists and other scholars to study how ideas arise and spread. ... Blei said their model was not meant as a replacement for citation counts but as an alternative method for measuring influence that might be extended to finding influential news stories, websites, and legal and historical documents.

"We are also exploring the idea that you can find patterns in how language changes over time," he said. "Once you've identified the shapes of those patterns, you might be able to recognize something important as it develops, to predict the next big idea before it's gotten big."
This desire to find the 'big picture' by following the paths of information coursing through our culture reappears in the study of 'transmedia.' Transmedia, according to Hukilau, is: "interactive storytelling across multiple media and platforms. The story actually cuts across, loops between and re-enforces the different strands that tie together the different platform exploitations. In other words, each component part is relevant to a whole – the bigger picture."

The suspicion that there is a process embedded within the Internet that makes it an entity unto itself fuels Millennial conspiracy theories. At no other time in the history of the world could you have so many innocuous bits of data juxtaposed, seemingly granting them meaning. The resultant mentality responds well to stories that associate different agents in the global economy, granting that association a malevolent aspect. In late 2010, a typical example appeared in Pravda (via Before It's News), which reported that Monsanto had recently purchased Blackwater (now called Xe). According to the report, Monsanto has developed an "intelligence wing" that conducts industrial intelligence operations and also keeps an eye on the biotech giant's many critics.

Without a doubt, a team-up between Monsanto and Blackwater is fairly alarming. However, another aspect to this story is our basic, knee-jerk response to it.  Our automatic credulousness is a major problem. We want to believe in deep dark secrets everywhere, as helter skelter blobs of information suddenly end up next to each other, seemingly creating a whole new dimension of menace. Where are the sources for some of these brooding news bytes? Who manufactures these news items - and to what end? There is always another wizard working the levers behind another velvet curtain.  To be clear: it is part of the nature of the Information Revolution to mash data together mechanically without any rhyme or reason.  Yet in these mash-ups, we inevitably perceive sense and order where there really isn't any.

Informatics specialists' research into a 'bigger picture' inside the Internet is the sober aspect of this credulousness. This research retains the assumption that the Internet can be tamed and harnessed. These researchers accept the fundamental premise that there is a demi-god in the machine, an internal order, that can be comprehended and released, Phoenix-like, to hatch a golden egg called the Singularity.

The Singularity is perhaps the most concrete expression of this type of magical thinking. It is a view of the Tech boom that also reveals generational rifts. In general, the core proponents of the Singularity - the point at which technology exponentially transforms us and our societies beyond all recognition - are Baby Boomers. They betray the same rapturous idealism when speaking about the potential of the Information Revolution, the same iconoclastic convictions, that they manufactured forty years ago in their collective youth. io9's top editor, Gen Xer Annalee Newitz, pours cold water on it:
I don't believe in the Singularity for the same reason I don't believe in Heaven. Once I met a Singularity zealot who claimed that eating potato chips after the Singularity would induce sublime ecstasy. Our senses would be so heightened that we could completely focus our whole attention on the ultimate chippiness of the chip. For him, the Singularity was just like Sunday school Heaven, full of turbo versions of everything we love down here on Earth. But instead of an all-powerful God zotting angel puppies into existence for our pleasure, we would be using the supposed tools of the Singularity like nanotech and A.I. to conjure up the tastiest junk food ever.

That is not a vision of social progress; it is, in fact, a complete failure to imagine how technology might change society in the future. Though it's easy to parody the poor guy who talked about potato chips after the Singularity, his faith is emblematic of Singulatarian beliefs. Many scientifically-minded people believe the Singularity is a time in the future when human civilization will be completely transformed by technologies, specifically A.I. and machines that can control matter at an atomic level (for a full definition of what I mean by the Singularity, read my backgrounder on it). The problem with this idea is that it's a completely unrealistic view of how technology changes everyday life.

Case in point: Penicillin. Discovered because of advances in biology, and refined through advances in biotechnology, this drug cured many diseases that had been killing people for centuries. It was in every sense of the term a Singularity-level technology. It extended life, and revolutionized medical treatment. And yet in the long term, it wound up leaving us just as vulnerable to disease. Bacteria mutated, creating nastier infections than we've ever seen before. Now we're turning to pro-biotics rather than anti-biotics; we're investigating gene therapies to surmount the troubles we've created by massively deploying penicillin and its derivatives.

That is how Singularity-level technologies work in real life. They solve dire problems, sure. They save lives. But they also create problems we'd never imagined - problems that might have been inconceivable before that Singularity tech was invented.
Newitz is correct: there is no proof that globalized, overlapping computer systems and algorithms will produce some ultimate rationalized process. The Internet offers something else: cultural entropy, disintegration and degradation. For example, look at how the Internet shreds time and consumes personal lives. I like to imagine someone from the 18th century alighting in our society, and witnessing millions of people, staring for hours into grey boxes and little black gadgets. To the time traveller, it would be a picture of hell.

That is not to say that we will descend into chaos because of the Tech Revolution. Neither good nor bad outcomes are certain. The assumption that we are all on a road up into the light of progress colours researchers' inquiries into the internal mysteries of the Internet. It would be just as easy to argue and prove that we are heading toward fractured social oblivion and total tech-driven war - as it would be to say that sites like Twitter can explain the collective emotions of online users and predict the performance of the stock market. Moreover, this research comes out during the worst recession since the 1930s.

In short, algorithms may correlate blobs of information in a way that suggests causal connections between them. Treating those correlations as causes is a logical fallacy. Some predictions garnered from deep study of online structures may produce results that make sense. But it is still magical thinking. The notion that the Internet can predict the future and unlock our ultimate mysteries is an illusion, which will sometimes work, and sometimes will not.

Saturday, April 2, 2011

Nuclear Leaks 1: Hanford


The GammaMaster: Japanese wristwatch and Geiger counter in one. Image Source: MedGadget. Currently sold out, the watch sells for US $250 and was previously available out of Hong Kong.

Caption for the above photo: Leave it to the Japanese to integrate both a digital timepiece and a fully functional Geiger counter, all of it crammed into a standard sized wristwatch. ... Feature-wise there are limits to what you can do in a wristwatch format. There's the time display in traditional dial analog format - it is a watch after all. For displaying radiation dosage, in addition to a digital display, the GammaMaster has an LCD analog display, which provides a visual indication of the current dose rate and cumulative dose. Both can be run in dosimeter mode or in survey meter mode.

Warning alarms can be set to indicate when a specified dose rate or cumulative dose is exceeded. Alarm settings are always displayed on the analog scales. Some minor negatives with the GammaMaster. For one, the radiation units are in metric µSv/h and don't include mRems/hr, more commonly used in the U.S.
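A note on those units for North American readers (my aside, not part of the MedGadget caption): 1 rem is 0.01 sievert, so 1 mrem/h works out to 10 µSv/h. A GammaMaster reading of, say, 0.2 µSv/h is therefore 0.02 mrem/h, roughly the range of ordinary natural background radiation.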

This month, I am doing a countdown to the anniversary of the Chernobyl disaster by highlighting nuclear accidents, leaks, tests and similar incidents that have occurred elsewhere in the world. If oil is the fuel of our present, nuclear power is - or seemed to be - synonymous with the future. Yet it is a future not of clean-burning, high-tech scientific efficiency, but one of apocalyptic consequences if mismanaged, damaged or disrupted. The operating systems of nuclear facilities are also vulnerable to computer hacker attacks (which I have blogged about here and here).

In the wake of the Japanese crisis at the Fukushima plants, the Internet public is getting very jumpy. One of my friends described the slow burn that is wearing down our sense of well-being as an "apocalyptic pressure on the metaphorical inner ear." (Thanks, J.) On Twitter, I notice that nerves are raw. On 17 March, @ozcanyuksek tweeted: "travellers from Japan have triggered the radiation detectors at Chicago's airport. Though the report does stress the levels were very low." Amid assurances that Japanese radiation fallout levels were low, @MorrisonMD remarked: "Worried about radiation from Japan coming to US? Vast majority will be dispersed, but take antioxidants & Vit C." On 23 March, Tomoko Hosaka commented: "More on Tokyo tap water. Expert: Stress people get from radioactivity is more dangerous than radioactivity itself." On 28 March, @leashless commented: "Insomnia and flashes of ill temper. I am subtly, but deeply, stressed." On 1 April, after Tokyo Electric had repeatedly made alarming comments about radiation levels, then retracted them, the company announced it was sending in a robot to find out the source of an unknown leak, which was later isolated on 2 April. By 1 April, there were alarms raised by "experts" - despite all previous claims that levels of radiation currently spreading globally from Fukushima were harmless - that the world food supply was threatened by Japanese fallout. There is a low-grade, growing background fear that reflects our helplessness and global connections back at us.

Radiation from radioactive elements - an invisible, little understood force that can kill you at high or prolonged doses - is, like petroleum, one of the millstones hanging around our collective necks from the twentieth century. Nuclear power may not survive very far into the Millennium if public opinion turns sharply against the industry. Yet it has overcome serious incidents before, so it may continue, no matter how people feel about it. This is the central, all-consuming problem of our times - how to create enough energy to fuel our global economy without destroying ourselves in the process.

Sunday, March 13, 2011

A Future in Glass


Image Source: Corning via Waylou.

Corning has put out a corporate video (here and here) envisioning a future filled with its glass products (Hat tip: Dump.com via @LillyLyle).  Like my recent comments on Audi commercials that suggest the future is here, or almost here, these products are paired with values associated with the rise of the tech-oriented social classes.  It's a vision of ease, convenience, accessibility, cleanliness - and streamlined wealth:
Can you imagine organizing your daily schedule with a few touches on your bathroom mirror? Chatting with far-away relatives through interactive video on your kitchen counter? Reading a classic novel on a whisper-thin piece of flexible glass?
Corning is not only imagining those scenarios – the company is engaged in research that could bring them alive in the not-too-distant future. You can get a glimpse of Corning’s vision in the new video, “A Day Made of Glass.”
Corning Chairman and CEO Wendell Weeks says Corning’s vision for the future includes a world in which myriad ordinary surfaces transform “from one-dimensional utility into sophisticated electronic devices.” ...
Glass is the essential enabling material of this new world. “This is a visual world – so transparency is a must,” explains Wendell. But that’s just the beginning. Ubiquitous displays require materials that are flexible, durable, stable under the toughest of environmental conditions, and have a cool, touch-friendly aesthetic. And not just any glass will do. This world requires materials that are strong, yet thin and lightweight; that can enable complex electronic circuits and nano functionality; that can scale for very large applications, and that are also environmentally friendly. ...
Does the world showcased in “A Day Made of Glass” seem like something out of a fantasy movie? Jim Clappin, president of Corning Glass Technologies, reminds us that, just a decade ago, pay phones, VCRs, and film cameras were also commonplace. Today, we’re accustomed to movies streaming on demand to a 60-inch television hanging on the wall and to video calls on notebook computers, essentially for free.
“The consumer trend driving our vision for tomorrow is very clear,” Jim says. “We all want to be connected with what we want…when we want…anywhere…and with great ease. Corning’s innovations in glass will enable this journey to continue.”
See the video and popular reactions to it below the jump.  The reactions remind me of one of my earlier posts, in which I talked about an unanticipated anti-tech backlash in the future.