TIMES, TIME, AND HALF A TIME. A HISTORY OF THE NEW MILLENNIUM.

Comments on a cultural reality between past and future.

This blog describes Metatime in the Posthuman experience, drawn from Sir Isaac Newton's secret work on the end of times, a tract in which he described Histories of Things to Come. His hidden papers on the occult were auctioned at Sotheby's in 1936 to two private buyers, but were not available for public research until the 1990s.



Saturday, June 2, 2012

Biotechnological Magnetic Computer Memory

Magnetospirillum magneticum, a magnetotactic bacterium that could form the basis for growing computer memory organically. Image Source: Eff Yeah Microbiology!

The way computers become biological begins with expanded memory. The Economist reports that nanotechnologists have found a way to grow computer memories by using naturally occurring proteins to assemble tiny magnetic elements, mimicking the way cells build their own miniature structures.
[T]here may be a cheaper option [for manufacturing tiny computer memory components]—namely to mimic Mother Nature, who has been building tiny devices, in the form of living cells and their components, for billions of years, and has thus got rather good at it. A paper published in Small, a nanotechnology journal, sets out the latest example of the technique. In it, a group of researchers led by Sarah Staniland at the University of Leeds, in Britain, describe using naturally occurring proteins to make arrays of tiny magnets, similar to those employed to store information in disk drives.

The researchers took their inspiration from Magnetospirillum magneticum, a bacterium that is sensitive to the Earth’s magnetic field thanks to the presence within its cells of flecks of magnetite, a form of iron oxide. Previous work has isolated the protein that makes these miniature compasses. Using genetic engineering, the team managed to persuade a different bacterium—Escherichia coli, a ubiquitous critter that is a workhorse of biotechnology—to manufacture this protein in bulk.

Next, they imprinted a block of gold with a microscopic chessboard pattern of chemicals. Half the squares contained anchoring points for the protein. The other half were left untreated as controls. They then dipped the gold into a solution containing the protein, allowing it to bind to the treated squares, and dunked the whole lot into a heated solution of iron salts. After that, they examined the results with an electron microscope. Sure enough, groups of magnetite grains had materialised on the treated squares, shepherded into place by the bacterial protein. In principle, each of these magnetic domains could store the “one” or the “zero” of a bit of information, according to how it was polarised.
See the original research here.
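The storage principle in the quoted passage — one bit per magnetite domain, read off from how it is polarised — can be sketched in a few lines. This is a toy illustration of the encoding idea only, not the researchers' actual read-out method, and the 'up'/'down' labels are my own assumption:

```python
def domains_to_bits(polarisations):
    """Map each magnetic domain's polarisation to a bit: 'up' -> 1, 'down' -> 0."""
    return [1 if p == "up" else 0 for p in polarisations]

def bits_to_int(bits):
    """Interpret the bit list as a binary number, most significant bit first."""
    value = 0
    for bit in bits:
        value = value * 2 + bit
    return value

# Four domains on the protein-patterned chessboard, read out as binary 1011.
print(bits_to_int(domains_to_bits(["up", "down", "up", "up"])))  # 11
```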

Friday, June 1, 2012

Prometheus Evolution

"Rock art in Wonderwerk cave 40 km from Kuruman Northern Cape South Africa: Ash found in a layer dated at a million years old hints that inhabitants of the cave were using fire a million years ago." Image Source: Daily Mail.

Gurus, theorists and scholars of the turn-of-the-Millennium keep revisiting the deep past, and discovering that prehistoric human civilization runs back many thousands of years further than previously believed. The Daily Mail (drawing on a University of Toronto report) covers research findings from April 2012, which confirm that human ancestors used fire one million years ago, 300,000 years earlier than previously assumed:
Traces of ash mixed with million-year-old bones and tools have been uncovered in the Wonderwerk Cave in South Africa. Burned plants and bones were found in the cave, suggesting that its inhabitants cooked and perhaps even socialised around camp fires.

The huge cave near the edge of the Kalahari Desert has been the scene of previous excavations which have uncovered an extensive record of human occupation.

A team led by the University of Toronto and Hebrew University of Jerusalem has identified the earliest known evidence of the use of fire by human ancestors.

Microscopic traces of wood ash alongside animal bones and stone tools were found in a layer dated to one million years ago. ... University of Toronto anthropologist Michael Chazan said: "The analysis pushes the timing for the human use of fire back by 300,000 years, suggesting that human ancestors as early as Homo erectus may have begun using fire as part of their way of life.

The control of fire would have been a major turning point in human evolution. The impact of cooking food is well documented, but the impact of control over fire would have touched all elements of human society. Socialising around a camp fire might actually be an essential aspect of what makes us human."

The research is published in the Proceedings of the National Academy of Sciences.
The PNAS article's authors state: "To the best of our knowledge, this is the earliest secure evidence for burning in an archaeological context."

Journal reference: Francesco Berna, Paul Goldberg, Liora Kolska Horwitz, James Brink, Sharon Holt, Marion Bamford, and Michael Chazan. "Microstratigraphic evidence of in situ fire in the Acheulean strata of Wonderwerk Cave, Northern Cape province, South Africa." Proceedings of the National Academy of Sciences, April 2, 2012. DOI: 10.1073/pnas.1117620109.

See all my posts on Prehistory.

Thursday, May 31, 2012

Workplaces in the 2020s

Brian David Johnson. Image Source: WSJ via Intel.

The Wall Street Journal has interviewed Intel's Chief Futurist, Brian David Johnson, on how robots will take over certain professions by the 2020s, and on which human jobs and skills will become prominent as a result:
“As we approach the year 2020, the size of meaningful computational intelligence begins to approach zero. In other words, computers are shrinking. Soon they will get so small that they will be nearly invisible. This means that in 10 years we will be able to turn almost anything into a computer. We will be living and working in an intelligent environment and in many cases we will essentially be working inside a computer.

“Every day we create a massive amount of data. Google’s Eric Schmidt has famously said that every two days we create as much information as we did from the dawn of civilization up until 2003. As computing gets more powerful, it will seem like data has a life of its own. And it will. You’ll have machines talking to machines, computers talking to computers, all processing this data. But what will it feel like to work in this coming age of big data?

“Data will be our colleagues and our employees. And, like all employees, they will need a good manager – an algorithm. An algorithm is really just a sequential series of steps that processes data. We will need to “train” our algorithms to have a better understanding of humans and how to make human lives better. After all, we are their bosses.

“As we look to 2020 it might appear that computers and data will overrun the workplace. But remember, a computer will never clean a bathroom sink. At least in the near future, computers and data won’t replace the paper towels in the bathroom. But a computer will write up your local Little League scores and a computer will operate on your spine. (Hint: this is already happening.)

“This has broad implications for what we think of as valuable skills and employees in the workplace. Today we value journalists and surgeons much more than janitors, but in 2022 we may think very differently. We will need to understand what humans are really good at and foster those skills, outsourcing the rest to the brilliant intelligence and efficiency of the future.”
WSJ held a live chat with Johnson here. In the chat, he remarks that brain surgeons will be replaced by robots. But we will still need human beings to scrub bathroom sinks? I am not sure how this spells progress. WSJ: "You actually suggested in your guest column that janitors might be more valued than neurosurgeons in the workplaces of the future."

Johnson emphasizes the need for human skills and analysis in the future: "Emotional intelligence will be important in a world where we are bombarded by data. Understanding how to talk to and interact wi[th] people will be an important skill [presumably the sink scrubbing comes into play here]. Having a deep understanding of programming and computers will be a great base but critical thinking will be even more impor[t]ant ... understanding how to process all the BIG Data will be key." I suppose it will, until computers evolve to the point where they are better at analyzing Big Data than we are.

Wednesday, May 30, 2012

Curios: The Best of Modern Horology at Christie's



Video Source: Christie's via World Auction.

Curios is my blog series on interesting things that pop up at auction houses. Today, Christie's in Hong Kong is selling some of the most sophisticated watches on Earth. From the Christie's catalogue:
In 2011, Christie's international auction sales in Geneva, Hong Kong and New York of important watches surpassed the US$ 100 million barrier for the first time and achieved an unprecedented total of more than US$116.3 million, reconfirming the firm’s global supremacy for this category for the 5th year in a row. Underlined by consistent selling rates of over 90% achieved at each Christie's watch sale this year, such an outstanding result is a consequence of the fierce competition generated by a growing number of collectors coming from the Eastern and Western hemispheres. No less than 1,000 bidders from 40 or more different countries participated at every Christie's watch sale in 2011, demonstrating how this category knows no bounds.

Today’s watch market has never been so international, nor has it seen such high levels of scholarship and demand, and Christie's leads it at every level, especially when it comes to vintage collector’s wristwatches manufactured by the most renowned manufacturers. Throughout the year, Christie's Watch Specialists scouted and offered the finest and most important watches, mostly fresh to the market, perfectly presented and realistically estimated, and the reaction of the watch community was enthusiastic.
Technological advances have made watches incredibly complex, and these timepieces can cost several million dollars each. World Auction compares one of the watches on the block today (the 2010 Franck Muller Aeternitas Mega 4, now billed as the most complicated wristwatch ever made) with the 1932 Patek Philippe Supercomplication commissioned by Henry Graves, and with the 1989 Patek Philippe Calibre 89, to indicate how far horologists have come:
The sale of watches organized by Christie's in Hong Kong on May 30 is a festival of high complication. The technical advances of the latest watches have certainly been enabled by computers, making them a new category in the market for luxury goods.

Franck Muller [the great Swiss watchmaker based] ... in Geneva, positions himself as the master of complications. Number 1 of his Aeternitas Mega 4 model from 2010 is estimated HK $ 4.8 M.

This tonneau-shaped watch is large and thick, at 61 x 42 x 19 mm, but it nevertheless falls into the category of wristwatches. Its 36 complications are driven by 1,483 components.

A comparison with two prestigious watches makes it possible to appreciate the evolution of the technology.

The masterpiece of the inter-war period, the Patek Philippe commissioned by Graves, had 24 complications. Too big to be worn on the wrist, Patek Philippe's Calibre 89, at 89 mm in diameter and 41 mm thick, totaled 33 complications and 1,728 components.

Tuesday, May 29, 2012

Photo of the Day

Image (11 May 2012) © André Kuipers; Image Source: ESA/NASA via Flickr.

This is the Atlantic Ocean from the International Space Station on 11 May 2012. The photo was taken by Dutch astronaut André Kuipers: "We fly into the night over the Atlantic Ocean. Looking back at the sunset over Rio de la Plata and Buenos Aires."

Floating Shiny Blob Thing Heralds the Future



At his blog, The Blind Giant, Nick Harkaway has drawn attention to something called ZeroN, a so-called 'levitated interaction element' built at MIT, which promises to become the user interface of the future:
This is the ZeroN.

There’s an article about it here. It’s a prototype. It wobbles a bit. But if you don’t find it amazing, you need to step back and refresh your sense of wonder.

Done that? Okay.

Floating ball. Gravity nullified by electromagnetism. You are living in the future now, yes?

Monday, May 28, 2012

Circadian Misalignments

Image Source: Impactlab.

You may recall this post, about scientists who plan to control the body's internal clock. io9 (via Brain Pickings) discusses a new book by German chronobiologist Till Roenneberg, who argues that our social clocks and bodily clocks are not lining up, and that this misalignment is causing obesity, psychological stress and physical illness. Roenneberg's "social jet lag" is induced by the conflicts between natural light, technology, business working hours and time spent socializing. From io9:
Just because you sleep later than your early rising friends doesn't mean you sleep longer than they do; nor does it make you lazier. And yet, the association between the time of day that a person wakes up and how proactive or driven they are is just one example of the many preconceptions that society upholds regarding sleep and productivity.

But here's the problem: these expectations might actually be working against us.

In his recently published book, Internal Time: Chronotypes, Social Jet Lag and Why You're So Tired, German chronobiologist Till Roenneberg provides numerous examples of how social expectations surrounding time may be having a detrimental effect on large sections of the human population. ...

Evidence continues to pile up that the late-night schedules of shift workers clash so violently with their internal biological clocks that they actually increase their risk of obesity, diabetes, and a long list of other nasty health effects. Researchers have linked these adverse effects to discordance between the timekeeping mechanisms within our own bodies (the molecules that control the daily cycle of fat production and storage in your liver, for example) and our odd-hour work schedules.

Researchers who study metabolism call this "circadian misalignment." Roenneberg calls it "social jet lag" (a concept he explains quite succinctly in the video featured [below]). Whatever you call it, a growing body of evidence suggests that the disconnect between our internal clocks and societal clocks could be informing aspects of our daily lives ranging from metabolic disorders, to suicide rates, to alcohol consumption, to why older men marry younger women.

You can check your 'chronotype' here at Roenneberg's site CLOCKWORK.org and his contribution to the EU project to train the bodily clock here. Roenneberg's work at Munich University's chronobiology project is here.
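Roenneberg quantifies social jet lag as the gap between mid-sleep on free days and mid-sleep on work days, where mid-sleep is sleep onset plus half the sleep duration. A minimal sketch of that arithmetic (the example schedule is my own assumption, not data from the book):

```python
def midsleep(onset_hours, duration_hours):
    """Mid-sleep point in hours after midnight: sleep onset plus half the sleep duration."""
    return onset_hours + duration_hours / 2.0

def social_jet_lag(onset_work, duration_work, onset_free, duration_free):
    """Roenneberg's social jet lag: |mid-sleep on free days - mid-sleep on work days|."""
    return abs(midsleep(onset_free, duration_free) - midsleep(onset_work, duration_work))

# Example: asleep 23:30-06:30 on workdays (onset 23.5 h, 7 h of sleep),
# asleep 01:00-10:00 on free days (onset 25.0 h, 9 h of sleep).
# Workday mid-sleep is 03:00, free-day mid-sleep is 05:30.
print(social_jet_lag(23.5, 7, 25.0, 9))  # 2.5 hours of social jet lag
```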

Sunday, May 27, 2012

The Unabomber's Trojan Horse


Most of the people today who have integrated high technology into their lives without question, and who indeed may even be unwittingly enslaved by it, were born around the time the turn-of-the-Millennium's original neo-Luddite was caught in 1996 (he was convicted in 1998). Thus, they would not necessarily grasp that a brilliant young Harvard-educated mathematician foresaw their intimate attachment to high tech and violently opposed it in the name of protecting this younger generation. In the 1970s and 1980s, Ted Kaczynski veered from the then-popular back-to-the-land movement towards anti-tech terrorism. Incidentally, as a teenaged Harvard prodigy, he had volunteered for psychological tests at the university to earn pocket money; these turned out to be traumatic mind-control experiments. The experiences likely caused deep psychological damage and, not unreasonably, fostered in Kaczynski a high level of paranoia and a distrust of authorities.

Kaczynski's story became a bizarre, real-life version of the Terminator sci-fi film franchise. In our fantasies, a time-tossed neo-Luddite nuclear family like Kyle Reese and Sarah and John Connor are heroes, misunderstood saviours who are humanity's last hope against sentient machines. But in our reality, someone who acted as these fictional heroes do is dismissed as crazy and murderous, marginalized, isolated, and incarcerated.