TIMES, TIME, AND HALF A TIME. A HISTORY OF THE NEW MILLENNIUM.

Comments on a cultural reality between past and future.

This blog describes Metatime in the Posthuman experience, drawing on Sir Isaac Newton's secret work on the future end of times, a tract in which he described Histories of Things to Come. His hidden papers on the occult were auctioned at Sotheby's in 1936 to two private buyers, but were not available for public research until the 1990s.



Showing posts with label Neurolaw.

Monday, September 16, 2019

What's Left Over? Merge with the Demon


Beware the implantation of a synthetic conscience, crafted by others. Image Source: Pinterest.

I have a theory that our religions mirror our levels of technology. This post continues my What's Left Over? series, which describes two contending spiritual stances - the materialist and the anti-materialist - which are appearing in response to the Technological Revolution.

Science and technology refer to ways of doing, while the arts and theology describe ways of being. Neither can exist without the other. Ideally, they are used in balance. But the burgeoning technocracy thrives on doing; it pretends to support the arts and organic freedoms, while actually discouraging them. As a result, technologists have to build simulated versions of being to make up for the shortfall. Their simulacra of moral sensibilities are poor substitutes for the true human love of nature and mystery.

Left unfettered, technocrats will construct your personality and life from the outside in. And while their creed is superficially moral and progressive, it is grossly materialist, anti-human and anti-humane.

If you thought corporations and institutions were already corrupt, rewarding the worst aspects of your nature and exploiting your best instincts against you, imagine what it will be like inside the corporatist technocracy, where an externalized conscience will come from a chip embedded in your brain, communicating with your cerebral cortex by way of a neural interlace.

This fake, mechanized, embedded chip conscience is the foundation stone of what the people at The Boiler Room podcast call the "technocratic dictatorship." Elon Musk is its leading proponent.

Musk was once Silicon Valley's space cowboy, and he had a romantic heart. In a 2010 interview, his first wife, Justine, struggled with the loss of Musk's original visionary, the man who had become her own "private Alexander the Great." By 2016, that man wasn't completely gone: Musk named one of his droneships 'Of Course I Still Love You.'

In 2014, Musk warned that developing artificial intelligence would be like "summoning the demon." Now a shadow of his former self, in July of this year he stammered out his new agenda, a brain-chip interface called Neuralink. Neuralink, he claims, will allow us to merge with the demon and to "stay one step ahead" of it. This is supposedly the only way to survive the coming Singularity.

Putting a Chip in Your Brain Will Not Make You a Superhero (or a god) (7 August 2019). Video Source: YouTube.


See all posts in the What's Left Over? series on materialism and anti-materialism in technological advancement.

Wednesday, March 26, 2014

Amazon Delivers What You Will Buy Before You Order It


Coming soon. Image Source: Amazon via the Daily Mail.

On Christmas Eve of last year, Amazon was granted a patent for anticipatory shipping. You can read it here. The Irish Times explains how microdata and personal branding have intersected so exactly that Amazon expects to know what you want before you even know you want it; the company will start moving the item toward you before it occurs to you to order it, shipping it pre-emptively:
The online retail giant now wants to start shipping items to customers before they have even ordered them. Taking the buying process to the next level, Amazon has developed a system that pre-emptively delivers goods to customers based on their previous purchases.
The company believes it can predict shoppers’ needs so precisely, that it wants to have a package in transit to them, before they have placed the order.
The e-commerce juggernaut has obtained a patent for what it calls “anticipatory package shipping”, a system whereby the company anticipates buying habits and sends potential purchases to the closest delivery hub, waiting for the order to arrive, or, in some cases, even shipping directly to a customer’s door.
According to the patent filing, items would be moved from Amazon’s fulfilment centre to a shipping hub close to the customer in anticipation of an eventual purchase.
“In some instances, the package may be delivered to a potentially interested customer as a gift rather than incurring the cost of returning or redirecting the package,” the patent reads.
“For example, if a given customer is particularly valued (according to past ordering history, appealing demographic profile, etc), delivering the package to the given customer as a promotional gift may be used to build goodwill.”
Amazon says the system is designed to cut down on shipping delays, which “may dissuade customers from buying items from online merchants.”
In deciding what to ship, Amazon said it may consider previous orders, product searches, wish lists, shopping-cart contents, returns and even how long an Internet user’s cursor hovers over an item.
See other reports here and here. As Amazon walks an insidious line between psychological intrusion and arrogant clairvoyance, not one word is said about how manipulative this is. The patent doesn't say it, but presumably the anticipatory ordering function will also draw on Amazon's processing of previous buyers' personal information from social networks. Imagine if this concept, or something like it, were used by other businesses. Is a convenient service really worth so much? Why are all these initiatives accepted so passively by the public, with no regard for their larger implications?
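The signals named in the patent excerpt above - previous orders, product searches, wish lists, shopping-cart contents, returns, and cursor hover time - amount to a scoring problem. The Python sketch below is purely illustrative: the field names, weights, and threshold are invented for this post and are not taken from Amazon's patent, but they show roughly how an "anticipation score" could decide whether to move a package toward a customer's nearest shipping hub.

```python
# Hypothetical sketch of "anticipatory shipping" scoring.
# All field names, weights, and thresholds are invented for illustration;
# none of them come from Amazon's patent.

from dataclasses import dataclass


@dataclass
class CustomerItemSignals:
    previous_orders_of_similar: int  # past purchases in the same category
    recent_searches: int             # searches matching the item
    on_wish_list: bool
    in_shopping_cart: bool
    past_returns_of_similar: int     # returns count against the prediction
    cursor_hover_seconds: float      # how long the cursor lingered on the item


def anticipation_score(s: CustomerItemSignals) -> float:
    """Combine the signals into a single score; higher means ship speculatively."""
    score = 0.0
    score += 0.5 * s.previous_orders_of_similar
    score += 0.3 * s.recent_searches
    score += 1.0 if s.on_wish_list else 0.0
    score += 1.5 if s.in_shopping_cart else 0.0
    score += 0.1 * min(s.cursor_hover_seconds, 30.0)  # cap the hover signal
    score -= 0.8 * s.past_returns_of_similar
    return score


def pre_ship_to_nearest_hub(score: float, threshold: float = 2.0) -> bool:
    """Move the package toward the customer's regional hub if the score clears the threshold."""
    return score >= threshold


if __name__ == "__main__":
    signals = CustomerItemSignals(
        previous_orders_of_similar=2,
        recent_searches=3,
        on_wish_list=True,
        in_shopping_cart=False,
        past_returns_of_similar=0,
        cursor_hover_seconds=12.0,
    )
    s = anticipation_score(signals)
    print(f"score={s:.2f}, pre-ship={pre_ship_to_nearest_hub(s)}")
```

A real system would presumably learn such weights from millions of purchase histories rather than hard-code them; the point of the sketch is only how few behavioural signals are needed to make speculative shipping look workable.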

Amazon additionally seeks to undercut the competition in shipping by setting up drone deliveries that would bring your order in 30 minutes or less. The drone program is called Amazon Prime Air. The program has a few years of R&D to go; it also has to pass Federal Aviation Administration (FAA) regulations in the US. But on 7 March 2014, the Daily Mail reported that a transportation safety judge had recently dismissed a fine enforcing the FAA's ban on commercial unmanned aircraft in the US.

One friend, C., on hearing this, predicted that people would shoot the drones down, steal their packages, and sell the contents on eBay, thereby completing a near-perfect Internet Circle of Life. In the meantime, sit tight and await the drones. Below the jump, see the drone package delivery promo video.

Monday, September 16, 2013

The Coming Siege Against Cognitive Liberties


Image Source: Nature.

The Chronicle of Higher Education reports on how researchers are debating the legal implications of technological advances in neuroscanning:
Imagine that psychologists are scanning a patient's brain, for some basic research purpose. As they do so, they stumble across a fleeting thought that their equipment is able to decode: The patient has committed a murder, or is thinking of committing one soon. What would the researchers be obliged to do with that information?

That hypothetical was floated a few weeks ago at the first meeting of the Presidential Commission for the Study of Bioethical Issues devoted to exploring societal and ethical issues raised by the government's Brain initiative (Brain Research through Advancing Innovative Neurotechnologies), which will put some $100-million in 2014 alone into the goal of mapping the brain. ...

One commissioner ... has been exploring precisely those sorts of far-out scenarios. Will brain scans undermine traditional notions of privacy? Are existing constitutional protections sufficient to guard our freedom of thought, or are new laws required as fMRI scanners and EEG detectors grow ever more precise?

Asking those questions is the Duke University associate professor of law Nita A. Farahany ... . "We have this idea of privacy that includes the space around our thoughts, which we only share with people we want to ... . Neuroscience shows that what we thought of as this zone of privacy can be breached." In one recent law-review article, she warned against a "coming siege against cognitive liberties."

Her particular interest is in how brain scans reshape our understanding of, or are checked by, the Fourth and Fifth Amendments of the Constitution. Respectively, they protect against "unreasonable searches and seizures" and self-incrimination, which forbids the state to turn any citizen into "a witness against himself." Will "taking the Fifth," a time-honored tactic in American courtrooms, mean anything in a world where the government can scan your brain? The answer may depend a lot on how the law comes down on another question: Is a brain scan more like an interview or a blood test? ...

Berkeley's [Jack] Gallant says that although it will take an unforeseen breakthrough, "assuming that science keeps marching on, there will eventually be a radar detector that you can point at somebody and it will read their brain cognitions remotely." ...

Moving roughly from less protected to more protected ... [Farahany's] categories [for reading the brain in legal terms] are: identifying information, automatic information (produced by the brain or body without effort or conscious thought), memorialized information (that is, memories), and uttered information. (Contrary to idiomatic usage, her "uttered" information can include information uttered only in the mind. At the least, she observes, we may need stronger Miranda warnings, specifying that what you say, even silently to yourself, can be used against you.) ...

In a book to be published next month, Minds, Brains, and Law: The Conceptual Foundations of Law and Neuroscience (Oxford University Press), Michael S. Pardo and Dennis Patterson directly confront Farahany's work. They argue that her evidence categories do not necessarily track people's moral intuitions—that physical evidence can be even more personal than thought can. "We assume," they write, "that many people would expect a greater privacy interest in the content of information about their blood"—identifying or automatic information, like HIV status—"than in the content of their memories or evoked utterances on a variety of nonpersonal matters."

On the Fifth Amendment question, the two authors "resist" the notion that a memory could ever be considered analogous to a book or an MP3 file and be unprotected, the idea Farahany flirts with. And where the Fourth Amendment is concerned, Pardo, a professor of law at the University of Alabama, writes in an e-mail, "I do think that lie-detection brain scans would be treated like blood draws." ...

[Farahany] says ... her critics are overly concerned with the "bright line" of physical testimony: "All of them are just grappling with current doctrine. What I'm trying to do is reimagine and newly conceive of how we think of doctrine."

"The bright line has never worked," she continues. "Truthfully, there are things that fall in between, and a better thing to do is to describe the levels of in-betweenness than to inappropriately and with great difficulty assign them to one category or another."

Among those staking out the brightest line is Paul Root Wolpe, a professor of bioethics at Emory University. "The skull," he says, "should be an absolute zone of privacy." He maintains that position even for the scenario of the suspected terrorist and the ticking time bomb, which is invariably raised against his position.
"As Sartre said, the ultimate power or right of a person is to say, 'No,'" Wolpe observes. "What happens if that right is taken away—if I say 'No' and they strap me down and get the information anyway? I want to say the state never has a right to use those technologies." ... Farahany stakes out more of a middle ground, arguing that, as with most legal issues, the interests of the state need to be balanced against those of the individual. ...

The nonprotection of automatic information, she writes, amounts to "a disturbing secret lurking beneath the surface of existing doctrine." Telephone metadata, another kind of automatic information, can, after all, be as revealing as GPS tracking.

Farahany starts by showing how the secrets in our brains are threatened by technology. She winds up getting us to ponder all the secrets that a digitally savvy state can gain access to, with silky and ominous ease.
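Farahany's four categories, as reported in the excerpt above, form an ordered scale from least to most protected. The toy Python sketch below merely restates that ordering - the numeric ranks and the comparison function are invented shorthand, not part of her framework - but it makes the scale easy to see at a glance.

```python
# Illustrative restatement of Farahany's four categories of brain-derived
# evidence, ordered roughly from least to most protected (per the Chronicle
# excerpt above). The numeric ranks are invented shorthand.

from enum import IntEnum


class BrainEvidence(IntEnum):
    IDENTIFYING = 1   # information that identifies the person
    AUTOMATIC = 2     # produced by brain or body without effort or conscious thought
    MEMORIALIZED = 3  # memories
    UTTERED = 4       # utterances, including those made only in the mind


def more_protected(a: BrainEvidence, b: BrainEvidence) -> BrainEvidence:
    """Return whichever category sits higher on the (rough) protection scale."""
    return max(a, b)


if __name__ == "__main__":
    print(more_protected(BrainEvidence.AUTOMATIC, BrainEvidence.MEMORIALIZED).name)
    # -> MEMORIALIZED
```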
Much of the discussion among these legal researchers involves thinking about cognitive legal issues (motivations, actions, memories) in a way that is strongly influenced by computer-based metaphors. This is part of the new transhuman bias, evident in many branches of research. It confirms a surprising point: posthumanism is not some future hypothetical reality in which we all have chips in our brains and are cybernetically enhanced. It is an often-unconsidered way of life for people who are still 100 per cent human; it is a way of seeing the world.

This is the 'soft' impact of high technology, where there is an automatic assumption that we, our brains, or the living world around us, are like computers, with data which can be manipulated and downloaded.

In other words, it is not just the hard gadgetry of technological advances that initiates these insidious changes in law and society. If we really want to worry about the advent of a surveillance state, we must question the general mindset of tech industry designers, and of people in general, who unconsciously mimic computers in the way they try to understand the world. From this unconscious mimicry come changes to society for which computers are not technically responsible.

A false metaphorical correlation between human and machine - the expectation that organic lives must be artificially automated - is corrosive to the assumptions upon which functioning societies currently still rest. These assumptions are what Farahany would call 'current doctrine.' We take 'current doctrine' for granted. But at the same time, we now take for granted ideas that make 'current doctrine' increasingly impossible to maintain.

This is not to say that change is unnecessary or that technology has not brought vast improvements.

But is it really necessary for everything to go faster and faster? Do we need to be accessible to everyone and everything, day and night? Should our bosses, the police, the government, corporations, the media, let alone other citizens, know everything we do and think in private? Do we really need more powerful technology every six months? Why is it necessary that our gadgets become ever smaller, more intimate, and physically attached to us? Why is it not compulsory, by this point, for all of us to learn to a standard level how computers are built and how they are programmed?

We are accepting technology as a given, without second-guessing the core assumptions driving that acceptance. Because we are not questioning what seems 'obvious,' researchers press on, in an equally unconsidered fashion, with their expectation that a total surveillance state is inevitable.

Thursday, June 17, 2010

Fountain of Youth 3: David Eagleman at The Museum of Curiosity


 Sum, by David Eagleman. Pantheon, 2009.

On June 14, BBC Radio 4 interviewed Neil Gaiman, David Eagleman, and Sarah Millican on the last episode of the current series of The Museum of Curiosity. You can listen to the show between June 14 and June 20 here; after that, it goes offline. David Eagleman, best-selling author of Sum: Forty Tales from the Afterlives and neuroscientist at Baylor College of Medicine in Texas, described his perception of time and the anticipated merger of science and religion.