TIMES, TIME, AND HALF A TIME. A HISTORY OF THE NEW MILLENNIUM.

Comments on a cultural reality between past and future.

This blog describes Metatime in the Posthuman experience, drawn from Sir Isaac Newton's secret work on the future end of times, a tract in which he described Histories of Things to Come. His hidden papers on the occult were auctioned to two private buyers in 1936 at Sotheby's, but were not available for public research until the 1990s.



Showing posts with label Neuroscience.

Tuesday, October 22, 2019

What's Left Over? The Materialist Algorithm for Cognition


Image Source: Morten Tolboll.

This post continues my investigation of materialism and anti-materialism as competing responses to technology. My central argument is that right-wing and left-wing descriptions of politics and the economy are misleading and obsolete. Politics and economics are evolving to mirror tech-oriented materialism and anti-materialism. The worst of the former is leading to tyrannical political oppression. The worst of the latter is leading to an alienation from the mainstream consensus about reality.

Thus far in this blog series, I have focussed on materialism as a way of seeing the world, which is grounded in empiricism, scientific exploration, rationalism, secularism, and the associated economic mode of capitalist consumption. All of these aspects concentrate on humankind's five senses and how they can measure and experience the physical realm.

More radical forms of materialism reveal where this stance is headed. In 2013, Stephen J. Cowley and Frédéric Vallée-Tourangeau edited a collection of scholarly essays entitled Cognition beyond the Brain: Computation, Interactivity and Human Artifice. The editors explain that the notion that we are free to think inside our heads is a fairy tale. They argue that the thoughts we have inside our own heads as private expressions of personal existence, and as ego-controlled responses to the outside world, constitute a bedtime story we tell ourselves about our independence as individual beings.

'Thinking,' for these academics, is not an internalized activity expressing the cognitive power and freedom of a single, rational creature. Rather, as I suggested in my post, Who Writes Your Reality?, 'thinking' is a culturally-modulated experience. It is even possibly constructed from the outside in. That is, from this materialist standpoint, your brain is like a tabula rasa, upon which the outside world may write its programs as it wishes. The editors of Cognition beyond the Brain call old-fashioned notions of subjective 'thinking' a 'folk concept,' a culturally-shaped story we tell ourselves about what we are doing:
"Like all folk concepts, ‘thinking’ is a second-order construct used to ‘explain’ observations or, specifically, how action is—and should be—integrated with perception."
This is a radical departure from the earlier Postmodern deification of the subjective mind, wherein social objectivities were demolished and everyone's personal truth was considered sacrosanct.

Monday, September 16, 2019

What's Left Over? Merge with the Demon


Beware the implantation of a synthetic conscience, crafted by others. Image Source: pinterest.

I have a theory that our religions mirror our levels of technology. This post continues my What's Left Over? series, which describes two contending spiritual stances - the materialist and the anti-materialist - which are appearing in response to the Technological Revolution.

Science and technology refer to ways of doing, while the arts and theology describe ways of being. Neither can exist without the other. Ideally, they are used in balance. But the burgeoning technocracy thrives on doing; it pretends to support the arts and organic freedoms, while actually discouraging them. As a result, technologists have to build simulated versions of being to make up for the shortfall. Their simulacra of moral sensibilities are poor substitutes for the true human love of nature and mystery.

Left unfettered, technocrats will construct your personality and life from the outside in. And while their creed is superficially moral and progressive, it is grossly materialist, anti-human and anti-humane.

If you thought corporations and institutions were already corrupt, rewarding the worst aspects of your nature and exploiting your best instincts against you, imagine what it will be like inside the corporatist technocracy, where your externalized conscience will come from a chip embedded in your brain, communicating with your cerebral cortex by way of a neural lace.

This fake, mechanized, embedded chip conscience is the foundation stone of what the people at The Boiler Room podcast call the "technocratic dictatorship." Elon Musk is its leading proponent.

Musk was once Silicon Valley's space cowboy, and he had a romantic heart. In a 2010 interview, his first wife, Justine, struggled with the loss of the original, visionary Musk, whom she had seen as her own "private Alexander the Great." In 2016, that man wasn't completely gone: Musk named one of his drone ships 'Of Course I Still Love You.'

In 2014, Musk warned against the advent of artificial intelligence as "summoning the demon." Now a shadow of his former self, in July of this year he stammered out his new agenda, a brain-chip interface called Neuralink. Neuralink will allow us to merge with the demon and to "stay one step ahead" of it. This is supposedly the only way to survive the coming Singularity.

Putting a Chip in Your Brain Will Not Make You a Superhero (or a god) (7 August 2019). Video Source: Youtube.


See all posts in the What's Left Over? series on materialism and anti-materialism in technological advancement.

Friday, May 11, 2018

The Artificial Intelligence Nemesis


Image Source: thebodhitrees.

The creation of AI is a story of humanity. It will end where it begins, with a nemesis that will test humankind. This is because human beings grapple with inner knowing on ever more profound levels, driven by self-engineered crises.

Artificial Intelligence: The Nemesis in the Mirror

Anonymous - This Shocking Footage Should Worry You! (2018-2019) (13 January 2018). Video Source: Youtube.

AI is a big mirror. As Google Cloud's Chief Scientist for AI, Dr. Fei-Fei Li, has stated, AI began with the question, "Can machines think?" Engineers began building machines to mimic human thinking: to reason, see, hear, move around, and manipulate objects. That was AI's foundational dream. In the 1980s, machine learning was born, followed by deep learning, which is rooted in neuroscience. This young discipline is set to explode, due to the exploitation of big data harvested from around the globe. Thus, no matter how the machines end up evolving, it is worth asking now what we are doing with AI and why we are doing it. Unconscious human impulses are informing AI design.
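To make the neuron-mimicry behind deep learning a little more concrete, here is a minimal, hypothetical sketch (my own illustration, not any specific system's code; the inputs, weights, and threshold are invented): a single artificial neuron of the kind such networks stack and train.

import numpy as np

def artificial_neuron(inputs, weights, bias):
    # A crude imitation of a biological neuron: weigh the inputs, sum them,
    # and 'fire' (output 1) only if the total activation crosses zero.
    activation = np.dot(inputs, weights) + bias
    return 1 if activation > 0 else 0

# Illustrative values only: three 'sensory' inputs with hand-picked weights.
print(artificial_neuron(np.array([0.9, 0.1, 0.4]),
                        np.array([0.5, -0.2, 0.3]),
                        bias=-0.3))  # prints 1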

Find your museum Doppelgänger: some people have found themselves in paintings at art museums. Image Source: Kottke / My Modern Met / Davidurbon.

This mirror will test a psychological mode which human beings have used to build, change, create: the obsession with the nemesis, the other, the twin, the Doppelgänger.

The nemesis psychological complex works by externalizing something we cannot manage inside our own natures. Once the thing is externalized, we interact with it to create new ways of understanding and operating in the world. One of my posts, I Will Teach You Infinities, described how the nemesis complex informed the structure of language, because language progressively builds away from the starting point of selfhood, or 'I.'

Saturday, October 28, 2017

Countdown to Hallowe'en 2017: Latent Feline Infection


This meme dates from 2010. Image Source: Know Your Meme.

Today's post concerns how cats influence human health, based on the article: Influence of latent Toxoplasma infection on human personality, physiology and morphology: pros and cons of the Toxoplasma–human model in studying the manipulation hypothesis. That article concerns the often asymptomatic or latent disease, toxoplasmosis, caused by an intracellular parasite which is passed from cats to humans and is related to human depression, autism, cerebral calcification, sex addiction, and schizophrenia. It is a disease that affects rodents and makes them lose all fear of cats; this allows the parasitic protozoan to return to feline systems, where it can complete its life cycle. The parasite also affects the rodent brain's sexual impulses and reward centres.

Toxoplasma gondii parasitic protozoa. Image Source: Ke Hu and John Murray via Business Insider.

Scientists believe that toxoplasmosis infection similarly makes humans attracted to cats. The disease affects an enormous number of people, which may incidentally ensure the survival of the domestic cat, while also safeguarding the protozoan. Wiki:
"Up to half of the world's population are infected by toxoplasmosis but have no symptoms. In the United States about 23% are affected and in some areas of the world this is up to 95%. About 200,000 cases of congenital toxoplasmosis occur a year. Charles Nicolle and Louis Manceaux first described the organism in 1908. In 1941 transmission during pregnancy from a mother to a child was confirmed."
Perhaps those Walking Dead zombie stories are telling you the truth about a reality you suspect, but cannot see. If you think everyone around you is infected by some mysterious bug that makes them crazy, you may not be paranoid. You may be right. From Florence Robert-Gangneux and Marie-Laure Dardé in American Society for Microbiology (2012):
"Low seroprevalences (10 to 30%) have been observed in North America, in South East Asia, in Northern Europe, and in Sahelian countries of Africa. Moderate prevalences (30 to 50%) have been found in countries of Central and Southern Europe, and high prevalences have been found Latin America and in tropical African countries."
Approximate rates of infection by country below are almost all based on tests of pregnant women. Some studies only focus on particular regions, and the numbers change from year to year. Follow the links for years and details; not all countries conduct studies or keep statistics. Note that these statistics tend to show infection passed from mother to child in the womb. They do not include post-natal infection, in which people are infected directly by handling cats or eating raw or rare meat:
  • Argentina: sources for Buenos Aires 45.6%-57.2% infected
  • Australia: 28% infected; 23% found infected in pregnant women from Melbourne
  • Bahrain: 22.3% tested positive, from pregnant women in Manama only
  • Belgium: 48.7% infected
  • Brazil: 66.9% infected; other sources range 55.6%-77.8%, depending on region
  • Canada: unknown; extrapolated estimate 14%-22% infected
  • Chile: 40%-90% of the population, depending on region
  • China: Changchun only, 10.6% infected among pregnant women
  • Colombia: 43.1%-66.7%, different sources, depending on region
  • Costa Rica: approximately 55%, increases in rural areas and with poverty
  • Croatia: 38.1% infected
  • Cuba: different studies from Havana areas, 44%-66.3%
  • Czech Republic: 30%-40% infected; in Prague, 19.8% infected
  • Denmark: statistics for Copenhagen, 27.8% infected
  • Egypt: measured in women only, 46.5% in urban areas, 57.6% in rural areas
  • France: 45% infected
  • Germany: Western Pomeranian statistics only, 63.2% infected
  • Greece: depending on different studies and different regions, 20%-36.4% infected
  • Grenada: 57% infected, nationwide estimate
  • India: 11.6%-45%, depending on class and region
  • Iran: 29.4%-63.9%, depending on class and region
  • Iraq: 49.2% infected, statistics for pregnant women tested in Basra
  • Italy: 17.5%-34.4%, a range of averages, from different studies and different regions
  • Ivory Coast: 60% infected among pregnant women from Abidjan
  • Jordan: 47.1% infected in Amman
  • Kuwait: 45.7% infected among pregnant women
  • Malaysia: 49% infected among pregnant women from Kuala Lumpur
  • Mexico: extrapolated estimate 22% infected; another source found 6.1% of pregnant women infected in Durango
  • Morocco: Rabat statistics only, among pregnant women, 50.6% infected
  • Netherlands: 35.2% nationwide
  • New Zealand: statistics for pregnant women from Auckland, 35.4% infected
  • Poland: 35.8%-43.7% infected, based on pregnant women only in three cities, Warsaw, Lodz, Poznan
  • Romania: 57.6% infected in women of child-bearing age, Timisoara only
  • Serbia: 33% infected nationwide
  • Singapore: 17.2% infected among pregnant women
  • Slovakia: 22.1% among pregnant women, Bratislava data only
  • Slovenia: 34% infected nationwide
  • South Korea: 4.3% infected; other sources range 0.8%-3.7% infected in Daejeon and Suwon
  • Spain: 18.8%-43.8% infected, averages from certain regions only
  • Sudan: Khartoum and Omdurman only, pregnant women tested, 34.1% infected
  • Sweden: 18% infected from Stockholm and Skane only
  • Switzerland: 8.2%-35% infected, based on data from Lausanne, Geneva, Basel only
  • Thailand: 5.3%-21.5% infected among pregnant women from different regions
  • Turkey: 30.1%-60.4% average infected among pregnant women, from different classes and regions, impacted by regional culinary traditions (consumption of raw meat)
  • United Kingdom: 31% infected, 20 million people in 2012, with 80% of these asymptomatic; East Kent statistics for pregnant women showed 9.1% infected
  • USA: 15%-22.5% of the population, depending on demographic, 60 million+ infected; another source states 11% nationwide, with 7.76% for US-born and 28.1% for foreign born
  • Venezuela: 38% infected among pregnant women from Lara State
  • Vietnam: from Nha Trang only, 11.2% of pregnant women infected
In humans, the disease can remain latent until adulthood, or until the body experiences weakened immunity due to other causes. The non-latent form of the disease can feel like the 'flu' and spreads to the brain, eyes, and other organs; it creates cysts in the amygdala and nucleus accumbens, and changes the connections inside the brain which deal with fear responses, sexual attraction, decision-making and memory. The parasite also increases dopamine production, altering brain chemistry in the same ways cocaine - and schizophrenia - do. Those who are infected post-natally, directly through handling cats, are affected more severely than those who are infected congenitally.

Image Source: Twitter.

Sunday, October 1, 2017

Countdown to Hallowe'en 2017: Psychic Spies and Time Cross Predictions


Image Source: BackPackerVerse.

Welcome to another countdown to Hallowe'en! This year, October 31st will mark the 500th anniversary of the start of the Protestant Reformation: Martin Luther (1483-1546) nailed his Ninety-Five Theses to the door of All Saints' Church in Wittenberg, Germany in 1517. On October 31st, I will publish a special interview about Martin Luther.

The Hallowe'en countdown is an online event, a mass blogathon in which dozens of blogs count down to the end of October every year. The countdown is led by this blog (where you can see the other participants), run by the American comic book and graphic novel writer John Rozum and by blogger Shawn Robare, who runs the Gen X website, Branded in the 80s.

My contributions to the Hallowe'en countdown, to be published this year every three days, tend to cover odd subjects which nonetheless shed light on mainstream Millennial technology and culture. Today's post deals with the precognition and temporal psi protocol developed by the US Army in 1978, known as remote viewing. This was one of the primary psychic weapons cultivated in the Stargate Project; you can read the table of contents of the Stargate manual at the CIA's online reading room, here.

During the Cold War, the US military used remote viewing to view its enemies psychically and to foresee the future. A film dramatized this work, The Men Who Stare at Goats (2009); it was based on the 2004 book of the same name and starred George Clooney. On 12 August 2017, Leonard (Lyn) Buchanan, one of the US government's former remote viewers (and the person upon whom Clooney's character was based), gave an interview on Youtube, below.

Original Govt. Psychic Spy Reveals Remote Viewing Secrets (12 August 2017). Video Source: Youtube.

Buchanan stated that at the end of World War II, the Nazis' secret research was shared among the Allies. Due to lack of interest from the USA and UK, the Russians took the Germans' psychic and esoteric experiments, including remote viewing. According to Buchanan, the Russians developed this body of knowledge, and achieved apparent successes, to the alarm of the Americans. The most famous psychic in the Russian program was Nina Kulagina (1926-1990), who was later accused of fraud.

This was what led the US Army to develop the Stargate Project, which ran officially until 1995, although the CIA may then have taken it over under another name. You can read a 1985 Master's thesis, Psychokinesis and Its Possible Implication to Warfare Strategy by W. G. Norton, here. There is a 2004 report by Eric W. Davis, Teleportation Physics Study, sponsored by the Air Force Research Laboratory at Edwards Air Force Base, here.

Lyn Buchanan claimed that remote viewers can see the past and present with high degrees of accuracy. They can view the future, but the future (unlike the past and present) is changeable. Buchanan maintained that he had been tasked to view as far into the future as the year 2080, whereupon he got into strange, conspiratorial territory:
"An agrarian society with very few people. Large cities, mainly deserted, and most people very self sufficient on their little farms. ... A low population count. That's what I found. ... [Between now and 2080] there's going to come some rough times. ... The population will be greatly affected. ... Intentional changes, from the tasking - remember there's 30 per cent inaccuracy here - from what I've found, the first step of all that is what is called chemtrails. The chemtrails are a selective thing that will damage the health of, or even kill off, certain types of people, while leaving others not so badly affected. And in the process, a sort of genetic selection will be made. But that's doom and gloom, and one of the rocks in the pond of time, I don't think there's very much we can do about that."
Other weirdness associated with Buchanan's remote viewing involves psychic police work; gambling; space exploration; space aliens; cosmology; and consciousness. He said he does not "kill things" using remote viewing, but "it's scarily easy to do." Another figure associated with this project is Joseph McMoneagle, who was the first recruit in the US Army's remote viewing program. He was designated as 'Psychic Spy 001.' You can hear an interview with him below. He spoke of visiting Russia, touring their remote viewing facilities and meeting his Russian psychic spy counterparts, who did remote viewing in the conflict in Chechnya. He remarked that the Russian capacity for remote viewing is "substantial."

Joe McMoneagle US Army Remote Viewer aka Psychic Spy 001 (2 March 2014). Video Source: Youtube.

The CIA is also rumoured to have used this parapsychological tool in psychic espionage. The US government's remote viewing training manual is here and here. I have previously written about psychic military projects here.

Excerpt from a CIA document on remote viewing. Click to enlarge. Image Source: 4chan.

Remote viewing is considered pseudoscience, but it has understandably attracted a lot of attention. After all, what government - or civilian group - wouldn't want to be able to predict coming events, re-examine historical events, or send spies to learn closely guarded secrets at almost no expense or risk?  The Chinese have studied remote viewing. The UK's Ministry of Defence secretly ran a remote viewing experiment in 2001 and 2002. Some members of the public are also attempting to conduct organized investigations of this protocol. For example, German remote viewers are listed here.

Proponents of remote viewing claim that the technique seems to work best under blind conditions, that is, viewers should not know what target the project leader has assigned. Some remote viewers, like this one trying to foresee the possibility of war with Russia, use Associative Remote Viewing (ARV), which involves knowing the target.

Friday, July 7, 2017

Wonders of the Millennial World 9: The Speed of Consciousness


Image Source: pinterest.

One of the questions which I pursue on this blog is whether or not human consciousness is manifesting on the Internet: in the psycho-anthropological myth-space of virtual reality, as an alternate way of being, in zeroes and ones, or in the physics of quantum computing. If human consciousness operates collectively on the Internet, a subsequent question arises: can that online consciousness alter physical, real-world reality?

The Big Bang Theory: Schrödinger's cat from Season 1, Episode 17, The Tangerine Factor (19 May 2008) © Warner. Reproduced under Fair Use. Video Source: Youtube.

This sounds bizarre, but the relationship between mind and matter remains one of the great unsolved mysteries of modern science. In my 2015 post, A Quantum Christmas, I outlined the origins of the mystery: when we measure matter, matter changes. And quantum entangled matter shows that we can measure an entangled particle, and its unmeasured twin will change too.

This means that matter has some other way of existing beyond our perception, and that we have an effect on matter which we do not understand. At the quantum level, matter exists in a previous, indefinite state before we look at it, as though it were in all places at once. After we look at it, it acquires a definite, measurable state. Entanglement experiments suggest that the impact our observations have on reality is exerted at a speed of at least three trillion metres per second. This is how fast our observations travel to transform reality into fixed states. We might call it the speed of consciousness.
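As a back-of-the-envelope check on that figure (my own arithmetic, assuming the commonly cited experimental lower bound that entanglement's influence propagates at least ten thousand times faster than light), the number works out as follows:

v \gtrsim 10^{4} \times c = 10^{4} \times (3 \times 10^{8}\ \mathrm{m/s}) = 3 \times 10^{12}\ \mathrm{m/s},

that is, three trillion metres per second, and it is a lower bound rather than a measured speed.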

The double slit experiment lets you test this and go further: before we measure matter, while it exists in an indefinite state, it acts like a wave of energy. Measure the matter, and it becomes definite and fixed; the wave 'collapses,' and it behaves like particles.
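In standard textbook notation (a generic sketch, not tied to any particular experiment), the indefinite state is a superposition over the two slits, and measurement 'collapses' it to one definite outcome with probabilities fixed by the wave:

|\psi\rangle = \alpha\,|\mathrm{slit\ A}\rangle + \beta\,|\mathrm{slit\ B}\rangle, \qquad |\alpha|^{2} + |\beta|^{2} = 1;

after measurement, the particle is found at slit A with probability |\alpha|^{2} or at slit B with probability |\beta|^{2}, and the wave-like interference pattern disappears.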

Sunday, April 10, 2016

Talismans


Image Source: pinterest.

"How does a man come to know the unknowable?" He can do it through pushing the boundaries, or through some philosophical bridge. Maybe he does it through a woman, or a leap of faith, or a contemplation of the order of the universe that he cannot see. In these respects, I want to thank Dia Sobin at Trans-D Digital blog for permitting me to quote her 20 March 2016 post, The Language of Birds & the Alchemy of Love: The Music Box. She wrote a beautiful passage about the way in which girls keep talismans from their pasts to preserve memories and conjure up love. Women,
"have a peculiar predilection for keeping memorable items in special boxes, especially as young girls. Our little magic boxes ... full of talismanic detritus we've collected over the years ... a coin, jewelry, a shred of hair, a crumbling flower head, a photo, a signature, stones, bones ... whatever. Generally the tokens are kept to remind us of lovers or loved ones ... small trophies for experiences that may eventually retreat into a mental shadowland in the same way the objects themselves have retreated into the shadowy recesses of the box. But, no matter. The box becomes a sort of artificial memory bank... a collection of three-dimensional objects representing transdimensional events in the same way a collection of symbols do. In the end, whether we're talking about musical codes, alchemical codes, or the enigmatic chemistry of love and attraction, some type of hidden language is involved ... as is some kind of communication that lies outside the bounds of what is consciously understood."
Studies confirm that women remember events, especially emotional ones, better than men. Not only is the part of the brain which deals with memory larger in women, but that brain difference prompts female behaviour dedicated to maintaining memory through the organization of material objects. This tendency to tuck away bits of sacred junk in drawers and boxes demonstrates women's semi-conscious need to connect the emotional world and past memories to the tangible world in the present and future in direct ways. Women habitually manipulate time to turn the unreal side of life into something real. With these little anchors, they navigate the course of their lives. If you remember who you were, you don't lose track of who you are, and of the person you will become.

Tuesday, March 18, 2014

Smartphone Brain Scanners


Image Source: emotiv.

In the never-ending quest to quantify reality and generate dubious data sets, we come to the invention of smartphone brain scanners. A real-time, brain-mapping smartphone system built around the Emotiv EEG headset was created in 2011 by researchers at the Technical University of Denmark. Billed as "holding your brain in the palm of your hand," this app demonstrates how Millennial technology puts the cart before the horse, curiously reversing the normal understanding of how we function. Computers are creating the illusion that everything can be understood by being measured and instrumentalized before we consider any other factors: the hand drives the mind, rather than the other way around. This makes us unreflective puppets of concepts such as 'usability.' Make something or do something because we can; build apps around that capability; worry later about what it all means or what it will do to us.

The current applications of this technology relate to medical research. But the tech's inventors are confident that the app will be widely applied for other reasons. From Science Nordic:
Initially, the researchers will use the mobile system for research purposes, but one day this type of small brain scanner will perhaps be something that everyone has.
“There’s a trend at the moment to measure oneself more and more,” says Larsen. “An everyday example is the Runkeeper app for the iPhone, which measures the user’s running and walking trips. There will be more and more of this type of sensor, which you can wear on your body and connect to your mobile phone, so you can see the data that’s collected.”
Jakob Eg Larsen suggests where a mobile brain scanner can be useful: “If you’re about to doze off, you can actually see this from an EEG signal. If you’re driving a car or if you’re a long-distance lorry driver, then you could have this mobile equipment with you and you could have a system that warns you if you’re about to fall asleep.”
The tech is open source. The system first ran on the Nokia N900; a subsequent variation of the design was made for the Samsung Galaxy Note 2. An iPhone app, Mynd, uses similar technology. Think of the potential for marketing! Below the jump, a demo video shows that several headsets can be worn in social situations, so that people can observe each other's brains as they interact with one another.
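For readers curious what a drowsiness warning of the kind Larsen describes might look like in practice, here is a minimal, hypothetical sketch, not the DTU system's actual code: it estimates EEG band power from a single channel and flags a window as 'drowsy' when slow theta activity swamps faster beta activity. The sampling rate, frequency bands, and threshold are all invented for illustration.

import numpy as np

def band_power(signal, fs, lo, hi):
    # Approximate power of a 1-D signal in the [lo, hi] Hz band via the FFT.
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].sum()

def looks_drowsy(eeg_window, fs=128, ratio_threshold=2.0):
    # Flag a window as 'drowsy' if theta (4-8 Hz) power dwarfs beta (13-30 Hz) power.
    theta = band_power(eeg_window, fs, 4.0, 8.0)
    beta = band_power(eeg_window, fs, 13.0, 30.0)
    return theta / (beta + 1e-12) > ratio_threshold

# Synthetic test: a noisy 6 Hz slow-wave signal should trigger the warning.
fs = 128
t = np.arange(0, 4, 1.0 / fs)
window = np.sin(2 * np.pi * 6 * t) + 0.1 * np.random.randn(t.size)
print(looks_drowsy(window, fs))  # True

A real system would of course need per-wearer calibration and artifact rejection; the point is only that the warning Larsen imagines amounts to a simple threshold on measured brain activity.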

Tuesday, January 7, 2014

The Problem with Memory 9: Remembering to Predict the Future


Image Source and © Traer Scott Photography.

Researchers at the Donders Institute for Brain, Cognition and Behaviour at Radboud University Nijmegen in the Netherlands are studying how to erase the painful memories which are the main symptom of Post-Traumatic Stress Disorder. On 22 December 2013, Time reported that the Dutch researchers found that specific and recent bad memories could be targeted and erased with shock treatments. But they have not established that entrenched negative memories, typical in PTSD sufferers, could be treated the same way.

Image Source and © Traer Scott Photography.

This post and this post noted similar memory-erasing research currently undertaken in California and Massachusetts. All of these concepts recall the grotesque treatment dramatized in the 2004 sci-fi film, Eternal Sunshine of the Spotless Mind. Some researchers find the notion of erasing memory to be "too invasive"; they are instead trying to decouple memory from associated negative emotions. And they acknowledge that erasing negative memories of important events is akin to erasing the primary sources of history:
Elizabeth Phelps, professor of psychology and neural science at New York University ... and other researchers have previously used far less invasive techniques to reduce the emotional charge attached to a memory— rather than eliminating the memory itself. For example, one study exposed participants to smells paired with shocks and then wafted the same scents into their noses as they slept.  The volunteers didn’t forget which scent was linked with the shock— but they no longer had a fear response to it. “If you could take away the fear associated with the memory and keep the memory, that would be more optimal,” she says.

[T]he potential uses of a technique that erases personal memories raises profound ethical questions. Our memories are deeply related to our selves and many survivors of trauma get a sense of meaning and purpose from knowing what they have conquered. If negative or challenging memories are selectively removed, what would they leave behind?

“What if we wiped out all of the memories of the Holocaust?” asks Greely, “That would be terrible.  On the other hand, the suffering caused by some memories is really powerful and I would want to prioritize letting people who want to relieve their suffering, as a general matter, relieve their suffering.”

Friday, October 4, 2013

All Hallows' Eve Countdown: Facial Asymmetry and the Other Self


20th century prophet and clairvoyant Saint Slava Sevrukova saw Jesus Christ in a religious time travel vision and claimed that his facial features were mesmerizingly symmetrical. Image Source: The New Oxonian.

Welcome to a new installment in this year's Hallowe'en Countdown! Today, I follow up on this post about Mark Twain's speculation that Satan was a metaphor for our dream selves - a manifestation of our deepest, most unruly instincts. Today's post is about the hidden sides of ourselves which lie in plain sight, quietly expressed through body language, posture, a nervous tic. Good tailors and seamstresses know that people's whole bodies are asymmetrical, and part of the artistry of their craft lies in making clothes which make a body appear balanced, when it really isn't.

Gwyneth Paltrow has highly unusual facial symmetry. Her regular appearance (center); her right side is mirrored on itself in the far left photo; her left side is mirrored on itself in the far right photo. Princess Diana also had a very symmetrical face (see here). Image Source: right reading.

The same goes for faces. Very few people have nearly perfectly symmetrical faces. Facial asymmetry is a feature known to artists, photographers and hairstylists. Half their battle is finding the flattering angle in an imperfect subject. You can see a professional photographer discussing how she lights her own face to make it look balanced, 'normal' and beautiful, here. But there are even deeper dimensions to the implicit metaphysical messages our unbalanced faces give. With every crooked smile, we supposedly betray the ghosts which lurk within us.

Monday, September 16, 2013

The Coming Siege Against Cognitive Liberties


Image Source: Nature.

The Chronicle of Higher Education reports on how researchers are debating the legal implications of technological advances in neuroscanning:
Imagine that psychologists are scanning a patient's brain, for some basic research purpose. As they do so, they stumble across a fleeting thought that their equipment is able to decode: The patient has committed a murder, or is thinking of committing one soon. What would the researchers be obliged to do with that information?

That hypothetical was floated a few weeks ago at the first meeting of the Presidential Commission for the Study of Bioethical Issues devoted to exploring societal and ethical issues raised by the government's Brain initiative (Brain Research through Advancing Innovative Neurotechnologies), which will put some $100-million in 2014 alone into the goal of mapping the brain. ...

One commissioner ... has been exploring precisely those sorts of far-out scenarios. Will brain scans undermine traditional notions of privacy? Are existing constitutional protections sufficient to guard our freedom of thought, or are new laws required as fMRI scanners and EEG detectors grow evermore precise?

Asking those questions is the Duke University associate professor of law Nita A. Farahany ... . "We have this idea of privacy that includes the space around our thoughts, which we only share with people we want to ... . Neuroscience shows that what we thought of as this zone of privacy can be breached." In one recent law-review article, she warned against a "coming siege against cognitive liberties."

Her particular interest is in how brain scans reshape our understanding of, or are checked by, the Fourth and Fifth Amendments of the Constitution. Respectively, they protect against "unreasonable searches and seizures" and self-incrimination, which forbids the state to turn any citizen into "a witness against himself." Will "taking the Fifth," a time-honored tactic in American courtrooms, mean anything in a world where the government can scan your brain? The answer may depend a lot on how the law comes down on another question: Is a brain scan more like an interview or a blood test? ...

Berkeley's [Jack] Gallant says that although it will take an unforeseen breakthrough, "assuming that science keeps marching on, there will eventually be a radar detector that you can point at somebody and it will read their brain cognitions remotely." ...

Moving roughly from less protected to more protected ... [Farahany's] categories [for reading the brain in legal terms] are: identifying information, automatic information (produced by the brain or body without effort or conscious thought), memorialized information (that is, memories), and uttered information. (Contrary to idiomatic usage, her "uttered" information can include information uttered only in the mind. At the least, she observes, we may need stronger Miranda warnings, specifying that what you say, even silently to yourself, can be used against you.) ...

In a book to be published next month, Mind, Brains, and Law: The Conceptual Foundations of Law and Neuroscience (Oxford University Press), Michael S. Pardo and Dennis Patterson directly confront Farahany's work. They argue that her evidence categories do not necessarily track people's moral intuitions—that physical evidence can be even more personal than thought can. "We assume," they write, "that many people would expect a greater privacy interest in the content of information about their blood"—identifying or automatic information, like HIV status—"than in the content of their memories or evoked utterances on a variety of nonpersonal matters."

On the Fifth Amendment question, the two authors "resist" the notion that a memory could ever be considered analogous to a book or an MP3 file and be unprotected, the idea Farahany flirts with. And where the Fourth Amendment is concerned, Pardo, a professor of law at the University of Alabama, writes in an e-mail, "I do think that lie-detection brain scans would be treated like blood draws." ...

[Farahany] says ... her critics are overly concerned with the "bright line" of physical testimony: "All of them are just grappling with current doctrine. What I'm trying to do is reimagine and newly conceive of how we think of doctrine."

"The bright line has never worked," she continues. "Truthfully, there are things that fall in between, and a better thing to do is to describe the levels of in-betweenness than to inappropriately and with great difficulty assign them to one category or another."

Among those staking out the brightest line is Paul Root Wolpe, a professor of bioethics at Emory University. "The skull," he says, "should be an absolute zone of privacy." He maintains that position even for the scenario of the suspected terrorist and the ticking time bomb, which is invariably raised against his position.
"As Sartre said, the ultimate power or right of a person is to say, 'No,'" Wolpe observes. "What happens if that right is taken away—if I say 'No' and they strap me down and get the information anyway? I want to say the state never has a right to use those technologies." ... Farahany stakes out more of a middle ground, arguing that, as with most legal issues, the interests of the state need to be balanced against those of the individual. ...

The nonprotection of automatic information, she writes, amounts to "a disturbing secret lurking beneath the surface of existing doctrine." Telephone metadata, another kind of automatic information, can, after all, be as revealing as GPS tracking.

Farahany starts by showing how the secrets in our brains are threatened by technology. She winds up getting us to ponder all the secrets that a digitally savvy state can gain access to, with silky and ominous ease.
Much of the discussion among these legal researchers involves thinking about cognitive legal issues (motivations, actions, memories) in a way that is strongly influenced by computer-based metaphors. This is part of the new transhuman bias, evident in many branches of research. This confirms a surprising point: posthumanism is not some future hypothetical reality, where we all have chips in our brains and are cybernetically enhanced. It is an often-unconsidered way of life for people who are still 100 per cent human; it is a way that they are seeing the world.

This is the 'soft' impact of high technology, where there is an automatic assumption that we, our brains, or the living world around us, are like computers, with data which can be manipulated and downloaded.

In other words, it is not just the hard gadgetry of technological advances that initiates these insidious changes in law and society. If we really want to worry about the advent of a surveillance state, we must question the general mindset of tech industry designers, and of people in general, who are unconsciously mimicking computers in the way they try to understand the world. From this unconscious mimicry come changes to society for which computers are not technically responsible.

A false metaphorical correlation between human and machine - the expectation that organic lives must be artificially automated - is corrosive to the assumptions upon which functioning societies currently still rest. These assumptions are what Farahany would call 'current doctrine.' We take 'current doctrine' for granted. But at the same time, we now take for granted ideas that make 'current doctrine' increasingly impossible to maintain.

This is not to say that change is unnecessary or that technology has not brought vast improvements.

But is it really necessary for everything to go faster and faster? Do we need to be accessible to everyone and everything, day and night? Should our bosses, the police, the government, corporations, the media, let alone other citizens, know everything we do and think in private? Do we really need more powerful technology every six months? Why is it necessary that our gadgets increasingly become smaller, more intimate, and physically attached to us? Why is it not compulsory for all of us to learn (by this point) to a standard level how computers are built and how they are programmed?

We are accepting technology as a given, without second guessing the core assumptions driving that acceptance. Because we are not questioning what seems 'obvious,' researchers press on, in an equally unconsidered fashion, with their expectations that a total surveillance state is inevitable.

Tuesday, October 23, 2012

Countdown to Hallowe'en 9: Who Waits for the Setting Sun?

Image © Sharon Day via Ghost Hunting Theories.

I have written before (here and here) about radical anti-ageing techniques currently being researched, funded and pursued, primarily by the Baby Boomers. One of those treatments involves blood transfusions. Thank you to Dia (check out her blogs here and here) for sending a link about the new vampiric Blood Countess version of said treatment, now in development: scientists have found that 'young blood can reverse some effects of ageing.'

Tuesday, July 24, 2012

Nuclear Culture 14: Crossroads between the Virtual and Real in the Nuclear Quiet

Turning points: in the radioactive evacuation zone near Fukushima, a weed called Common Mullein reclaims Japanese highways. Image Source: Kotaku via ENE News.

The Classical Greeks had two concepts of time, one quantitative, one qualitative. The latter was something they called kairos:
Kairos (καιρός) is an ancient Greek word meaning the right or opportune moment (the supreme moment). The ancient Greeks had two words for time, chronos and kairos. While the former refers to chronological or sequential time, the latter signifies a time in between, a moment of indeterminate time in which something special happens. What the special something is depends on who is using the word. While chronos is quantitative, kairos has a qualitative nature. Kairos (καιρός) also means weather in both ancient and modern Greek. The plural, καιροι (kairoi or keri) means the times.
Kairos was, for Aristotle, the contextual meaning of a time; in the New Testament, it is "the appointed time in the purpose of God," the turning point when the divine apparently intersects with human affairs. That is likely a concept with long, pre-Christian roots. Paul Tillich interpreted Kairoi as moments of crisis when the word of god becomes literal reality. For the non-religious, this is merely a metaphor, but the idea - of the fictional, the fanciful, the imaginative themes of private emotional worlds of faith and introspection suddenly becoming reality - remains sadly familiar. The terrible and shocking transition when the virtual becomes real is a Millennial concern, and applies even more in non-religious terms.

Saturday, July 21, 2012

DC's Batman Shooter: The Day Evil Won


Cover art for DC Comics' Final Crisis (2008) by J. G. Jones © DC Comics. Image Source: Wiki.

In 2008, DC Comics, publisher of Batman, continued a pattern of pumping up dwindling sales by publishing a crossover multi-title event called Final Crisis. The publicity motto for that series was: the day evil won. Top editor and now Co-Publisher of DC Comics Dan DiDio commented that the series examined the question: "What happens when evil wins?" It is a good question, and an ironic one for Mr. DiDio to ask. The answer appears to be: evil wins the day that DC's Millennial virtual fantasies become a reality. What happens on the pulp pages and the movie screen now happens in the cinema itself. Reality has become just like a graphic novel.

In a shooting in Aurora, Colorado on 20 July 2012, at the midnight première of the Batman film The Dark Knight Rises, 12 people died and 70 were tragically injured. Predictably, America's media have launched into a heavily politicized and polarized debate about the right to bear arms under the Second Amendment to the US Constitution.

But this election-related argument will take public discussion far off track from the meaning and origins of this tragedy. Guns were not the only weapons used here: the shooter lobbed tear gas grenades at the crowd, and his apartment is still sealed and under investigation by bomb experts. The apartment is booby-trapped and full of jars of liquid, mortar rounds, trip wires, bombs and incendiary devices, which he likely learned how to make by searching for information on the Web. He also purchased his ammunition over the Internet. Thus, some commentators might begin to ask if we should censor the Internet as we control guns. In this crime, the guns, the bombs and the information on the Web were not the purpose, but the means to an end.

That end is a social malaise which saw the suspected shooter, James Holmes, tell police that he was "the Joker." And in fact, everything, from the gas lobbed into the cinema prior to the shooting, to Holmes's booby-trapped apartment, is very Joker-like.

The governor of Colorado, John Hickenlooper, sees this crime as an act of "senseless violence." But labeling 24-year-old Holmes, a graduate student who was in the process of abandoning his PhD in Neuroscience at the University of Colorado, as 'insane' does not help to explain this crime. How did someone who was described by his old California neighbours as "clean cut, responsible and studied hard," and who graduated at the top of his undergraduate class in Neuroscience at the University of California, Riverside, become someone who said he was "the Joker"?

Thursday, July 19, 2012

Fountain of Youth 14: Embrace Your Immortality

Get me outta here. Image Source: Digital Journal.

People are so literal-minded these days. The staunchly faithful believe the end is nigh. The staunchly un-faithful believe the end is not nigh. Either way, the new Millennium's opposing camps of the very religious and the very atheistic seek exactly the same goal: immortality. The irony in this fact - that those who go in for the apocalypse are on the same page as those who go in for the technological singularity - derives from an excess of literal-mindedness.

Here is an example of Millennial literal-mindedness, from a Gen X neuroscientist at Harvard who is attempting to figure out how to download his consciousness onto a computer interface so that he can live forever. From a Chronicle of Higher Education report:
In the basement of the Northwest Science Building here at Harvard University, a locked door is marked with a pink and yellow sign: "Caution: Radioactive Material." Inside researchers buzz around wearing dour expressions and plastic gloves. Among them is Kenneth Hayworth. ...

Hayworth has spent much of the past few years in a windowless room carving brains into very thin slices. He is by all accounts a curious man, known for casually saying things like, "The human race is on a beeline to mind uploading: We will preserve a brain, slice it up, simulate it on a computer, and hook it up to a robot body." He wants that brain to be his brain. He wants his 100 billion neurons and more than 100 trillion synapses to be encased in a block of transparent, amber-colored resin—before he dies of natural causes.

Why? Ken Hayworth believes that he can live forever.

But first he has to die.

"If your body stops functioning, it starts to eat itself," he explains to me one drab morning this spring, "so you have to shut down the enzymes that destroy the tissue." If all goes according to plan, he says cheerfully, "I'll be a perfect fossil." Then one day, not too long from now, his consciousness will be revived on a computer. By 2110, Hayworth predicts, mind uploading—the transfer of a biological brain to a silicon-based operating system—will be as common as laser eye surgery is today.

It's the kind of scheme you expect to encounter in science fiction, not an Ivy League laboratory. But little is conventional about Hayworth, 41, a veteran of NASA's Jet Propulsion Laboratory and a self-described "outlandishly futuristic thinker." While a graduate student at the University of Southern California, he built a machine in his garage that changed the way brain tissue is cut and imaged in electron microscopes. The combination of technical smarts and entrepreneurial gumption earned him a grant from the McKnight Endowment Fund for Neuroscience, a subsidiary of the McKnight Foundation, and an invitation to Harvard, where he stayed, on a postdoctoral fellowship, until April.

To understand why Hayworth wants to plastinate his own brain you have to understand his field—connectomics, a new branch of neuroscience. A connectome is a complete map of a brain's neural circuitry. Some scientists believe that human connectomes will one day explain consciousness, memory, emotion, even diseases like autism, schizophrenia, and Alzheimer's—the cures for which might be akin to repairing a wiring error. In 2010 the National Institutes of Health established the Human Connectome Project, a $40-million, multi-institution effort to study the field's medical potential.

Among some connectomics scholars, there is a grand theory: We are our connectomes. Our unique selves—the way we think, act, feel—is etched into the wiring of our brains. Unlike genomes, which never change, connectomes are forever being molded and remolded by life experience. Sebastian Seung, a professor of computational neuroscience at the Massachusetts Institute of Technology and a prominent proponent of the grand theory, describes the connectome as the place where "nature meets nurture."

Hayworth takes this theory a few steps further. He looks at the growth of connectomics—especially advances in brain preservation, tissue imaging, and computer simulations of neural networks—and sees something else: a cure for death. In a new paper in the International Journal of Machine Consciousness, he argues that mind uploading is an "enormous engineering challenge" but one that can be accomplished without "radically new science and technologies."
Extreme literal-mindedness boils immortality down to an "enormous engineering challenge." The connectome idea also has a metaphysical side. The connectome curiously reworks the concept of fate. This is a really seductive concept in a troubled (or if one prefers, fallen) world: a proposal to alter destiny on the cellular, genetic, atomic, and sub-atomic levels. Change destiny, whether it comes from nature or nurture, like changing a spark plug.
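To make the "we are our connectomes" claim a little more tangible, here is a toy sketch of a connectome as a data structure, purely my own illustration rather than anything drawn from Hayworth's or Seung's work: a directed, weighted graph whose nodes are neurons and whose edges are synapses.

from collections import defaultdict

class ToyConnectome:
    # Minimal wiring map: neuron -> {downstream neuron: synaptic weight}.
    def __init__(self):
        self.synapses = defaultdict(dict)

    def connect(self, pre, post, weight):
        # Record a synapse from neuron `pre` to neuron `post`.
        self.synapses[pre][post] = weight

    def downstream(self, neuron):
        # Every neuron this one projects to, with connection strengths.
        return dict(self.synapses.get(neuron, {}))

# Three neurons and two synapses stand in for the roughly 100 billion neurons
# and 100 trillion synapses mentioned in the article.
c = ToyConnectome()
c.connect("sensory_1", "inter_1", 0.8)
c.connect("inter_1", "motor_1", 0.5)
print(c.downstream("inter_1"))  # {'motor_1': 0.5}

On the "grand theory" quoted above, the self simply is this wiring diagram, which is why Hayworth treats preserving and scanning it as equivalent to preserving the person.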

The article paints Hayworth as a dedicated figure, a futurist ahead of his time. And in that regard, he is a great visionary. His work may inadvertently cure terrible diseases or vastly expand our grasp of neural, or even cerebral, processes. But these would be incidental to his primary aim to 'cure' us of death. There is no moment where Hayworth stops and asks: should we be immortal? If death is hard-wired into every living thing on the planet, and even non-living things die, then maybe death exists for a good reason? Maybe it is the lynchpin in the order of the universe? Aside from the possibility that an immortal human could be horrifying, perhaps conquering death would destroy the balance of nature? I am not talking about hocus-pocus. Nor am I talking about cells and synapses, genes and enzymes. I am talking about the purpose of death, which we do not understand. There is a purpose for death in the universe, because even galaxies die. For Hayworth, these are non-issues:
One hundred years from now, he believes, our descendants will not understand how so many of us failed for so long to embrace immortality. In an unpublished essay, "Killed by Bad Philosophy," he writes, "Our grandchildren will say that we died not because of heart disease, cancer, or stroke, but instead that we died pathetically out of ignorance and superstition"—by which he means the belief that there is something fundamentally unknowable about consciousness, and that therefore it can never be replicated on a computer. ...

My [The Chronicle reporter's] conversations with Hayworth took place over several months, and I was struck by how his optimism often gave way to despair. "I've become jaded about whether brain preservation will happen in my lifetime," he told me at one point. "I see how much pushback I get. Even most neuroscientists seem to believe that there is something magical about consciousness—that if the brain stops, the magic leaves, and if the magic leaves, you can't bring the magic back."

I asked him if the scope of his ambitions ever gives him pause. If he could achieve immortality, might it usher in a new set of problems, problems that we can't even imagine? "Here's what could happen," he said. "We're going to understand how the brain works like we now understand how a computer works. At some point, we might realize that the stuff we hold onto as human beings—the idea of the self, the role of mortality, the meaning of existence—is fundamentally wrong."

Lowering his voice, he continued. "It may be that we learn so much that we lose part of our humanity because we know too much." He thought this over for a long moment. "I try not to go too far down that road," he said at last, "because I feel that I would go mad."
Yes. Madness arrives, on schedule, when science heads too far into the outer reaches without support. Hayworth may see philosophers as part of the problem. But it looks like he needs one or two of them to watch his back. It does not matter whether you think the world was created by a divine being (or beings) who set in motion a grand battle between the forces of good and evil - or that the universe (or multiverses) exploded in the Big Bang and can be rationally dissected. The real cultural and historically relevant Millennial phenomenon across the board is a literal-mindedness about everything that formerly belonged to the realm of mystery.

Saturday, December 3, 2011

Love in the New Millennium 8: What Women Want


Aishwarya Rai: Often considered the most beautiful woman in the world.

What do women want? Forgive the rhetorical generalizations, but lots of people would like to know, including many women. This post is a companion piece to my post on men and love and the Internet, here. In that post, I concluded that one of the reasons men invented the Internet was that it was a great democratic equalizer when it comes to pursuing women. Also, the Internet allows men to resolve discrepancies between dissatisfaction in their current situations and their desired situations. They can do this fairly seamlessly - until they have to make their online virtual reality match up with their everyday reality. Despite these wrinkles, the masculine desires for freedom and equality when searching for a mate appear to be two of the driving forces behind the Tech Boom.

But how do the Internet, and technology in general, reflect and reveal women's greatest desires?

Back in high school, my insane English teacher said: "Don't be fooled, Boys! Women say they want love! But what they really want is powerrrrr." He said he figured this out one day when he found himself peeling a grape for his granddaughter.

Friday, December 2, 2011

Neuroeconomics

Image Source: Psychology Today.

A couple of years ago, I went to a conference where some new-fangled techniques in economic studies were discussed in one of the sessions. The novel methods involved presenting test subjects with economic choices, and then administering blood tests or brain scans to observe changes in hormonal levels and brain activity; it is an idea I touched on in an earlier post. At the time, I had a laugh; this bizarre hybrid of economics and psychobiology was chilling, yet funny, because of its dull literal-mindedness. 

When the economy tanks and everyone finds economists without answers, economic analysts sometimes poach on other disciplines' territories to find new methods. Occasionally, they move right into another field and make themselves at home. For example, they have been camping in the field of history, with an area of research they call Cliometrics. They've also expressed some interest in economics as an applied philosophy. As one of my friends who works in philosophy said, if the economists were to stop by, he would go down to the front gate and say: "Nothing to see here, Boys, move on, move on."

And move on they have, with zero sense of irony - to neuroscience, psychology and biology. It's a sign of how desperate economic researchers are.  They need to find solutions to serious problems that their theories partly engendered. So far, they have come up empty-handed. Rather than containing the recession, their methods have been politicized; their ideas have been appropriated by practitioners in politics; and the economy is not improving. And so since 2008, with pressing urgency, the economists have been moving on, in a bid to develop whole new ways of economic thinking and remake the world economy.

Wednesday, July 13, 2011

Amazonians Challenge the Universal Time-Space Hypothesis

The Amondawa were first "discovered" by anthropologists in 1986. Image Source: V. da Silva/Sinha/BBC.

In an earlier post, I described the South American Aymara people who think backwards when they conceive of time.  The Telegraph recently reported on the Amondawa people of the Amazon, who apparently have no concept of time. This challenges some of the core theories of linguistics and human psychology, which assume that time is innate to humans, their cultures and societies. In fact, time is not universal. And those who live without it are strangely free:
The Amondawa people who live deep in the Amazonian rainforests of Brazil have no watches or calendars and live their lives to the patterns of day and night and the rainy and dry seasons.

They also have no age – and mark the transition from childhood to adulthood to old age by changing their name.

The team of researchers, led by University of Portsmouth, said that it is the first time they have been able to prove time is not a deeply entrenched universal human concept, as previously thought.

Professor Chris Sinha said: "We can now say without doubt that there is at least one language and culture which does not have a concept of time as something that can be measured, counted or talked about in the abstract."

"This doesn't mean that the Amondawa are "people outside time", but they live in a world governed by events rather than the passing of time."

Only discovered in 1986, the Amondawa, about 150 strong, continue their traditional way of life, hunting, fishing and farming.

They also have their own language, which has a number system, but it only goes up to four.

Prof Sinha and his team, including a linguist and anthropologist, spent eight weeks with the Amondawa researching how their language conveys concepts like "next week" or "last year".

There were no words for such concepts, only divisions of day and night and rainy and dry seasons.

They also found nobody in the community had an age.

Instead, they change their names to reflect their life-stage and position within their society.

A little child will give up their name to a newborn sibling and take on a new one.

Prof Sinha said: "We have so many metaphors for time and its passing – we think of time as a 'thing' – we say 'the weekend is nearly gone', 'she's coming up to her exams', 'I haven't got the time', and so on, and we think such statements are objective, but they aren't.

"We've created these metaphors and they have become the way we think. The Amondawa don't talk like this and don't think like this, unless they learn another language.

"For these fortunate people time isn't money, they aren't racing against the clock to complete anything, and nobody is discussing next week or next year; they don't even have words for 'week', 'month' or 'year'. "You could say they enjoy a certain freedom." 
This research was published in Language and Cognition (see the abstract here). As a result of this research, the team proposes "a Mediated Mapping Hypothesis, which accords causal importance to the numerical and artefact-based construction of time-based (as opposed to event-based) time interval systems." In other words, according to a BBC report, the team assumes that the lack of a concept of time comes from a lack of technology to measure time.

The BBC also reports on other researchers who have criticized these findings. Another academic comments that these people may in fact experience time in the way we do; however, this similar experience may not be reflected in their language, which is how researchers generally gauge human apprehension of time in different societies:
These arguments do not convince Pierre Pica, a theoretical linguist at France's National Centre for Scientific Research (CNRS), who focuses on a related Amazonian language known as Mundurucu.

"To link number, time, tense, mood and space by a single causal relationship seems to me hopeless, based on the linguistic diversity that I know of," he told BBC News. ...

Small societies like the Amondawa tend to use absolute terms for normal, spatial relations - for example, referring to a particular river location that everyone in the culture will know intimately rather than using generic words for river or riverbank.

These, Dr Pica argued, do not readily lend themselves to being co-opted in the description of time.

"When you have an absolute vocabulary - 'at the water', 'upstream', 'downstream' and so on, you just cannot use it for other domains, you cannot use the mapping hypothesis in this way," he said.

In other words, while the Amondawa may perceive themselves moving through time and spatial arrangements of events in time, the language may not necessarily reflect it in an obvious way.
Citation: Language and Cognition, Volume 3, Issue 1 (May 2011), pp. 137-169. ISSN (Online) 1866-9859; ISSN (Print) 1866-9808. DOI: 10.1515/LANGCOG.2011.006.