TIMES, TIME, AND HALF A TIME. A HISTORY OF THE NEW MILLENNIUM.

Comments on a cultural reality between past and future.

This blog describes Metatime in the Posthuman experience, drawn from Sir Isaac Newton's secret work on the end of times, a tract in which he described Histories of Things to Come. His hidden papers on the occult were auctioned to two private buyers at Sotheby's in 1936, but were not available for public research until the 1990s.



Showing posts with label Education. Show all posts

Friday, August 4, 2017

In Millennial Eyes 7: Edinburgh in 1544


Virtual Time Binoculars - Edinburgh 1544 from Smart History on Vimeo.

A new academic project has begun to explore the historical teaching potential of virtual reality environments. I have previously blogged on similar projects concerning history and video game technology with regard to the 17th century Netherlands in The Golden Age, and colonial Quebec in I Remember the French Revolution. This is the teaser trailer for Virtual Time Binoculars - Edinburgh 1544:
"PREVIEW - TEASER TRAILER

New technology reveals old Edinburgh.

St Andrews research brings 16th Century capital back to life.

The lost townscape of sixteenth-century Edinburgh is being brought back to life by researchers at the University of St Andrews.

The reconstruction is still under construction and will be available on a number of digital platforms (including a mobile app, a 3D virtual experience, and more traditional web-based resources) from 1 May 2017.

'The new digital reconstruction is the first to be created of the sixteenth-century city, and is based on a drawing from 1544 (thought to be the earliest accurate depiction of the capital).'

The technology will be shown for the first time today (16 March 2017) at a special event showcasing the University’s research to industry partners.

The virtual time travel technology – which will be released as an app in May – provides a unique window into the capital around the time of the birth of Mary Queen of Scots.

The technology is the result of a collaboration between St Andrews historians, art historians, computer scientists and University spinout company Smart History. The result is an interactive tour of the capital as it appeared in 1544, just before the city was sacked and burned by an English army led by Edward Seymour, Earl of Hertford.

Dr Bess Rhodes, an expert on sixteenth-century Scottish history who collaborated on the reconstruction, said, 'For the first time visitors and residents can compare the city they know with the capital of James V and Mary Queen of Scots.'

'It has been amazing seeing the recreation of a lost townscape. I hope this project makes the public more aware of the layers in the capital’s history, and furthers understanding of the complex way in which Edinburgh evolved.'

The reconstruction is inspired by a sixteenth-century drawing of Edinburgh made by Richard Lee, an English military engineer who later designed the massive artillery defences at Berwick-upon-Tweed. Lee accompanied the Earl of Hertford’s forces to Edinburgh in 1544, and his drawing is thought to be the first realistic depiction of Scotland’s capital.

The interdisciplinary team of St Andrews researchers supplemented the information from Lee’s plan with archaeological evidence, sixteenth-century written sources, and information about the geography of the modern city, to create an updated reconstruction of Edinburgh.

Dr Rhodes continued, 'The 1540s were a tumultuous period in Edinburgh’s history. In December 1542 King James V of Scotland died, leaving his baby daughter Mary as monarch. Not long after the English King Henry VIII ordered an invasion of Scotland, with the aim of forcing the Scots to accept a proposed betrothal between the infant Mary and his young son (the future Edward VI of England).'

'One of the first major actions in the conflict later known as the "Rough Wooing" was the Earl of Hertford’s attack on Edinburgh in May 1544. Hertford’s forces failed to capture Edinburgh Castle, but set fire to the city, damaging the medieval townscape, before they retreated. Our reconstruction is the first digital representation of Edinburgh at this eventful moment in the capital’s past.'

The new reconstruction gives an overview of the townscape of the entire sixteenth-century city, with a particular focus on the Royal Mile – the historic spine of Edinburgh. The digital development was largely financed by a grant from Innovate UK."

See all my posts in the series, In Millennial Eyes, in which I explore how history and historic themes are viewed from 21st century perspectives.

Wednesday, June 14, 2017

Inequality, A Function of Virtual Reality


Still from Videodrome (1983) © Universal. Reproduced under Fair Use. Image Source: The Dissolve.

Today, academics at the London School of Economics are discussing socio-economic inequality and the rise of a dual-track system between haves and have-nots. The scholars at LSE recognize that whoever understands and commandeers the narrative of inequality will control new Millennial politics. They think there is a need to reframe the terms of debate, which means Millennial inequality is a partly-unknown quantity which is up for grabs. They consider inequality as a form of identity, that is, a subset of populist nationalism. Unfortunately, by using recognized terms from the 20th century political lexicon, these researchers are missing dimensions of the problem.

This blog has been exploring inequality rather differently. I maintain that technology and the Internet intensify the conflict between the establishment and the precariat. Globalization would be impossible without technology; the global economy's most basic motivating source is not political and is instead a function of the tools we are using. Technology has also fragmented 20th century economies and polities. The Internet compounds the breakdown through intensified media experiences, which lead to contrasting views of reality. That is, inequality is related in cryptic ways to the corresponding expansion and deepening of virtual reality.

Still from Videodrome (1983) © Universal. Reproduced under Fair Use. Image Source: The Dissolve.

I consider the influence of technology to be poorly understood. Philosophically, it implies that there are disturbing new ways of existing, the ripple effects of which are unprecedented and unknown. Of course, the current president of the United States did not come to the narrative of inequality through conventional politics, but through years of experience with the modality of reality television.

If the philosophical implications are difficult to grasp, consider the physical effects of technology for a start. A recent conspiratorial video from Truthstream Media concerns a 2003 patent, here, entitled, Nervous System Manipulation by Electromagnetic Fields from Monitors. The patent concerned the ways multimedia gadgets could be used to alter the functions of the human nervous system to change mass behaviour. It sounds like something for the tin foil hat crowd, except that the author of the patent expressed misgivings about potential abuses of the technology he designed. He recognized that his invention could damage collective psychology and physiology, but he only considered this outcome as a kind of regretful afterthought.

This Creepy Patent Proves They Can Remotely Hijack Your Nervous System (7 June 2017). Video Source: Youtube.


Sunday, April 30, 2017

Google's Infogate


Google was incorporated on 4 September 1998; the search engine was initially hosted at Stanford.edu, having been developed by two Gen X PhD students, Larry Page and Sergey Brin. Image Source: Business Insider.

This post is the first of several on decentralized information dynamics and real-fake, mainstream-alt news. I will start with the beast behind it all. On 4 September 1998, the search engine Google was incorporated as a company. It is hard to believe, because it feels like a short time ago, but it was a pivotal moment between the way the world was and what it is now.

It used to be that if you wanted to find information you had to amass a personal book collection, and visit libraries and archives. You built a professional or amateur reputation as a researcher. You had to get to know people - contacts, translators - who knew things, and travel around the world to see sequestered books, diaries and sealed files. Everything had to be taken down in pen and paper notes.

Orson Welles's sarcastic scene from Citizen Kane (1941), in which a journalist is granted permission to read in a private library which holds the unpublished memoirs of a deceased banker. Video Source: Youtube.

Once photocopiers existed, if the archivist was inclined, you would be permitted to photocopy small amounts of material at exorbitant rates. An average person might find one or two books at the bookstore or library, or have saved newspaper or magazine articles in a personal, physical, metal filing cabinet. That was it.

Before the Internet, information was hard won, and knowledge, wisdom and judgement about information even more so. The latter required time, contemplation, and reflection in total silence. It also demanded discussions with others who had done the same. It was a fundamentally different way of thinking.

Wednesday, July 20, 2016

In Millennial Eyes 3: Abandoned Buildings, Left to Rot

Abandoned federal gold exchange bank, Part 1 (29 June 2015). Video Source: Youtube.

This post is the first of three - respectively on the economy, politics, and war - which describe how a negative Millennial history is emerging from disconnections between perception and reality. For today, see urbex explorer Josh wander through an abandoned bank, somewhere in America. It appears that this bank closed in the 1980s. To protect properties from vandals, Josh does not reveal names or locations of many of the places he visits. He does not remove anything from the sites, he only films and photographs them. You can see Josh and his friends explore dozens of abandoned sites in the USA and abroad here.

Urban explorers are now poking through the wreckage of a transformed economy. That transformation depended on the 'virtualization' of property. Before 2007-2008, the economic value of property lay more in its assessed worth as a tangible historical object. During and after the Great Recession, the temporal perception of property changed to become a fleeting and mutable virtual investment, divorced from its actual physical condition and connection to society. The process started before that, but the recession was the hard turning point. In order to understand this change in terms of its long term consequences, it is important to separate the official story of the recession from the post-recession reality which urban explorers have uncovered.

Abandoned federal gold exchange bank, Part 2. Video Source: Youtube.

Although urbex is the new Millennium's historical pastime, the perspective is based on experience, unmediated by historical knowledge, except for a Google search or two. Information on abandoned properties is suppressed on the Internet to discourage vandals and scavengers. Urban explorers seek history out, independently of the way it has been presented to them in the system. Josh thanks all his viewers, "even the haters," who jeer at his lack of knowledge. While urban explorers may not always know the historical context through which they move, they discover many things their viewers do not yet know.

Urban exploration reveals how rapidly the present is becoming the past. For some, the late 20th century and early 21st century are too recent to be considered historical. Urbex videos indicate how time is accelerating in everyday life, and why the past is being discarded at an alarming rate. It is not a pretty picture. Urban explorers document a secret history of incredible losses, shameful waste, and a throwaway culture which appeared over the past thirty years. Abandoned buildings and infrastructure are monuments to materialism, property bubbles, recessions and bankruptcies. The economic shocks are one thing. But the wreckage also confirms a deeper anti-historical malaise. Urbex confirms the need to revitalize historical awareness.

Wednesday, July 13, 2016

In Millennial Eyes 1: I Remember the French Revolution


Screenshot from Assassin's Creed Unity (2014). Image Source: ABC News.

In Millennial Eyes is a new series on this blog which explores how historic events are depicted and discussed from Millennial perspectives. 14 July is Bastille Day in France. One Millennial depiction of the French Revolution is the 2014 video game, Assassin's Creed Unity, developed by Ubisoft Montreal. Neo-history emerges from the game's virtual reality combat.

The location of the Assassin's Creed Unity developer, Ubisoft, in Montreal is interesting, because if you ever wanted to know what France would have been like had there been no French Revolution - or at least, a different kind of revolution - you need to go to Quebec. Like many former colonies, Quebec, originally known as New France or Canada, and later Lower Canada, followed a real path of 'alternate history' compared to that of her mother country. One glance at the map of New France in 1750 (below), compared to the map of Lower Canada in 1791 (below the jump), tells you what a devastating loss France and the French people suffered in North America in this period. Voltaire (1694-1778) famously quipped that losing New France was no great loss. In Candide (1759), he asked what use France had for a "few acres of snow (quelques arpents de neige)?" It was a lot more than that! New France once extended west to Saskatchewan, and south through the American Great Lakes states, down to Texas and Louisiana. French Canada struggles with that loss to this day.

New France in 1750. Image Source: J. F. Lepage/Wiki. After the Battle of the Plains of Abraham and other British victories over the French during the Annus Mirabilis of 1759, New France became British under the Treaty of Paris in 1763.

After the Treaty of Paris in 1763, Canada (light pink) came under British control. Under the 1774 Quebec Act, the British placated the French population by maintaining their civil code laws and Catholic religion; the Quebec Act angered settlers in the Thirteen Colonies who had moved into Canadian territory over the Appalachians. This was one of the causes of the American War of Independence (1775-1783). Image Source: US Department of State via Wiki.

Image Source: pinterest.

Quebec's provincial motto, present on all post-1978 automobile licence plates, is Je me souviens, which means, I remember. The motto carries a mixed message. The architect of Quebec's provincial parliament building, Eugène-Étienne Taché (1836-1912), invented the motto in 1883 to express the greatness of New France and Lower Canada's founding role in Canada. Taché's provincial legislature displays the motto alongside statues of figures whose work collectively came to express a dual French-English historical meaning. Wiki:
"[The original statues] included founders (Jacques Cartier, Samuel de Champlain and de Maisonneuve), clerics (de Laval, de Brébeuf, Marquette and Olier), military men (de Frontenac, Wolfe, de Montcalm and de Lévis), Amerindians, French governors (D'Argenson, de Tracy, de Callières, de Montmagny, d'Aillesbout, de Vaudreuil) and, in the words of Taché, 'some English governors the most sympathetic to our nationality' (Murray, Dorchester, Prevost and Bagot) and Lord Elgin, who was given a special place for he was seen as an important player in obtaining 'responsible government.'"
The Amerindian statue group by Louis-Philippe Hébert (1850-1917) outside Quebec's National Assembly (provincial parliament) building commemorates Quebec's native peoples in the establishment of New France and Quebec as a Canadian province (click to enlarge). Image Source (August 2013) © Paul Gorbould via flickr.

In 1978, when the motto was revived by Quebec separatists, Taché's granddaughter controversially informed the Montreal Star that the motto's whole verse confirmed a dual memory: Je me souviens/ Que né sous le lys/ Je croîs sous la rose. ("I remember/ That born under the lily/ I grow under the rose.") In this view, Quebec is the child of France and England.

Wednesday, April 20, 2016

Rickrolling ISIS


Image Source: Academia Obscura.

In 2007, a bait-and-switch meme started on the Internet that tricked users into watching Rick Astley's 1987 hit Never Gonna Give You Up (hear it here). The meme is called 'Rickrolling' and while it has brought Astley back into the spotlight, by August 2015 he had earned only $12 from the prank because he holds only performer's rights to the song; the songwriting royalties from Rickrolling went to the very fortunate trio, Stock Aitken Waterman. Astley remains sanguine: "Listen, I just think it’s bizarre and funny. My main consideration is that my daughter doesn’t get embarrassed about it." Above, from Academia Obscura, a student's physics paper on Niels Bohr. Bohr, the 1922 Nobel Prize winner in Physics, modeled the atom in the 1920s, helped refugees flee the Nazis in the 1930s, worked at the Manhattan Project in the 1940s, and helped establish CERN in the 1950s.

Anonymous cyber-revenge campaign after the 13 November 2015 Paris attacks. Video Source: RT via Youtube.

The Young Turks opinion on Anonymous campaign against ISIS (16 November 2015). Video Source: Youtube.

The Rickrolling meme is resilient. A day after the Paris attacks, on 14 November 2015, Anonymous began to spam and troll Twitter users of pro-ISIS terrorist hashtags by diverting their traffic to Rick Astley's video. This Rickrolling doubles as a type of data-mining, in which Anonymous hackers keep track of those diverted to the video and mark them for cyber-attacks. The hackers use social media information to steal ISIS Bitcoin cryptocurrency holdings, and they attack ISIS operatives on the Dark Net. They renewed this effort after the Brussels attacks on 22 March 2016. This is the hackers' reverse humour against ISIS operatives and sympathizers: never gonna give you up.

However, as I have commented before on this blog, it would be naïve to imagine Anonymous as purely heroic actors, after one has had a taste of their New World Order and World War III conspiracy theories, here, here, here and here. The campaigns against ISIS are related to Anonymous cyber-attacks on the Belgian government under the hashtag #DownSecBelgium. When they announced that they would rally at the Place de la Concorde in Paris, France on 10 June 2016, one Youtuber was skeptical: "rien à voire avec Anonymous, c'est un fake" ("nothing to do with Anonymous, it's a fake").

Anonymous hacker campaign announced in French against ISIS one day after the Paris attacks (14 November 2015). Video Source: Youtube.

Anonymous hacker campaign announced in French against ISIS on the same day as the Brussels attacks (22 March 2016). Video Source: Youtube.

Sunday, April 3, 2016

Time and Politics 18: Quid Pro Quo


Follow Your Dreams/Cancelled by Banksy (2010) on a wall in Boston's Chinatown. Image Source: Lifehack.

Do not ask if the middle class is dead and where the political blame lies. Ask how much time you have, now that it is dying or already dead. Ask what has happened in the past in other societies after a middle class has died. Most people in the middle classes are waiting for things to improve. If that does not happen, there are two modern roads out of extreme social inequality and economic disparity: revolution or a police state. This was the message, on 20 March 2016, when BBC World News broadcast a programme on the post-recession destruction of the middle classes, entitled The Super Rich and Us, hosted by Jacques Peretti.

For a time after the Second World War, the social contract became quid pro quo - meaning, 'this for that' or 'something for something.' In English-speaking countries, it is a contractual concept under the Common Law, "an item or service traded in return for something of value." The Latin expression is the source for the British slang 'quid' for the pound sterling. One would work for a certain amount of time and gain money and a livelihood in return. Now however, the social contract is increasingly just - quo.

Saturday, February 20, 2016

The Permanent Web 1: History, IPFS, and Ethereum


Nineteen Eighty-Four (Signet Books ed., 1954). Image Source: Flavor Wire.

This blog asks how we can record and write history in the new Millennium, given the volume of information, the ephemeral nature of the World Wide Web (evident in link rot and dead Websites), and the problem that data and authorship can be changed. There is an additional problem of interpretation of data. From my blog's statement of intention:
"This blog is an experiment in writing real-time history, which deals with the impact of digital communications on our understanding of history. There are serious problems emerging in social media, in which opinions are confused with facts, data and sources are available but potentially manipulated, and historical interpretive authority derives from hit counts and technical algorithms determining audience traffic. These problems demand an ongoing and increasingly rigorous reappraisal of historical method and historiography, to take into account the impacts of online behaviour and virtual perception."
As the Internet evolves, a disturbing trend has emerged which confirms the worst predictions of author George Orwell. The impermanent Web, as it now exists, enables manipulations of centralized information which effectively destroy history, common values, societal stability and our whole grasp of reality. With Nineteen Eighty-Four, Orwell understood that the stability of history, and of our ability to write verified history, depends on - and further shapes - a society's underlying structural organization, its politics, and the degree to which an individual within that system can form autonomous thoughts and emotions. Nineteen Eighty-Four's political horror hinges on a love story and asks whether love can blossom and endure against a pure, anti-historical tyranny, or whether it will be crushed as sexcrime.

Peter Cushing played Winston and Yvonne Mitchell played Julia in the 1953-1954 BBC dramatization of Nineteen Eighty-Four. Watch it here. Image Source: Speaker to Animals.

Orwell warned that the destruction of history, information impermanence and propaganda manipulation rely on and reinforce two conditions: first, psychological authoritarianism, a precondition for real-world totalitarianism; and second, the centralization of information. Orwell's protagonist, Winston Smith, addresses his secret diary to any free reader in the past or the future: "To the future or to the past, to a time when thought is free, when men are different from one another and do not live alone — to a time when truth exists and what is done cannot be undone."

If there is no correction in the current state of affairs, our impermanent Web threatens to become aggressively anti-historical and ensure frightening Orwellian outcomes. This is why open source coders, hackers and independent designers are trying to think about the Web in a new way. They want to redevelop it to create two results: first, they want to build a permanent Web, where all files and information are perpetually recorded, maintained, encrypted, public, and protected; second, they want to decentralize, and then distribute, big data power. That would allow future generations to question any information presented as historical facts. With a permanent, decentralized Web, anyone in the future will be able to check the archived data for themselves, no matter who they are, and no matter where they are.

This new possibility is appearing in the cryptocurrency and peer-to-peer technology spheres, through mergers of cryptocurrency technology with a collaborative information protocol called the InterPlanetary File System, also known by its acronym IPFS. (IPFS is not to be confused with the Interplanetary Internet protocols being designed at NASA for Mars colonization; I first blogged about InterPlaNet in 2011, here.)

I want to thank Chris Ellis for bringing IPFS to my attention. Ellis is a great, original thinker in the Bitcoin industry and founder of ProTip and the Fullnode Project. His technical and philosophical meditations on the combination of a new Web network protocol with cryptocurrencies enabled me to think of their broader implications in historical terms. You can see my 2014 interview with Ellis here; my earlier discussion on Bitcoin crowdfunding of creative and intellectual projects with Protip is here.

After talking to Ellis, I wondered if a distributed, permanent historic archive heralds as big a shift as did the historic separation of church and state. Over the past decade, the Internet has shaken nation-states to their foundations. There are many examples of mass protests sparked by social media, notably in 2011 in Tunisia and Egypt (see related posts here and here). Anti-corruption activists such as Alexey Navalny test the weak spots in the authority and apparatus of the Russian state with live online broadcasts to a global community of real-time witnesses. For a recent example, see Navalny's crowdfunding campaign to develop live camera feeds at intersections and prevent roadside police intimidation, here. Global communications naturally erode nation-state divisions.

However, we have entered a new stage in this trend. It is fascinating to consider that the manner in which information is organized and communicated determines its durability, and its political and governmental impact, over time. Our current Web is impermanent and builds location-based addresses controlled by nation-states or centralized organizations. IPFS builds cryptographic addresses derived from the information itself. It is 'content-addressed,' not nationally-, corporately- or institutionally-addressed. That change builds a distributed information network, which might decouple the nation from the state, and remake states into entities with no geopolitical identities. The alternative is 'techno-community-states' of various types.
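The difference between location addressing and content addressing can be sketched in a few lines. This is only a simplified illustration, not the real IPFS scheme (IPFS actually encodes multihash content identifiers, or CIDs); here a plain SHA-256 digest stands in for the address:

```python
import hashlib

def content_address(data: bytes) -> str:
    """Derive an address from the data itself (a stand-in for an IPFS CID)."""
    return hashlib.sha256(data).hexdigest()

page_v1 = b"Edinburgh in 1544: the townscape before Hertford's raid."
page_v2 = b"Edinburgh in 1544: the townscape after Hertford's raid."

addr_v1 = content_address(page_v1)
addr_v2 = content_address(page_v2)

# The address depends only on the content: identical bytes always yield
# the same address, and any edit yields a completely different one.
assert addr_v1 == content_address(page_v1)
assert addr_v1 != addr_v2
```

Because the address is a function of the bytes themselves, anyone holding a copy of the data can verify it against its address, no matter which server, country or institution supplied the copy.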


Images Source: Altcoin Today.

For an example, see the Twitter hashtag #BlockchainsNotBorders, dominated by the project, BitNation: "Welcome to Governance 2.0: Borderless Decentralized Voluntary." On 19 February 2016, BitNation released the world’s first 'virtual nation' constitution on top of the blockchain of the cryptocurrency Ethereum. Altcoin Today sampled the anarcho-capitalist libertarian rhetoric associated with this project's separation of nation and state:
By using borderless technology in the form of smart contracts, BitNation wants to remove geographical apartheid of individual nation states. Doing so will lead to governance services that are more transparent, cheaper, and overall better for the end user.

The concept of do-it-yourself governance may seem strange to some people, but the BitNation team is very confident in what they are trying to accomplish. While there is still a lot of work to be done before DIY governance becomes a mainstream trend, the team is currently focusing on security and dispute resolution. Ethereum smart contracts will play a big role in this department, according to BitNation founder Susanne Tarkowski Tempelhof.

What makes these Ethereum smart contracts useful is how they bring a new layer of decentralization to the world. Unlike most services being used today, such as social media and email, BitNation’s governance 2.0 will not involve central authorities or governments. Needless to say, this model of governance is completely different to the centralized system of representative democracy most of the Western world is accustomed to.

Tempelhof feels that democracy is a massive failure, and the concept is – according to her – fundamentally flawed. Democracy as most people know it means governments give residents a variety of choices to vote on, without them having a say in the matter of what choices are offered. BitNation wants to change this by introducing democratic autonomous organizations (DAOs).

This concept needs some further explaining, however, as these Democratic Autonomous Organizations exist entirely on the Ethereum blockchain. From a governance point of view, decisions will be made by a group of parties who can decide on contracting people to do work, distributing assets, and even appointing shares. Two significant examples of a DAO can be found in Slock.it and DigixDAO. ... Tempelhof explained it as follows:
‘Decentralized’ means there is no single point of failure, like there is no central bank. If you look at Uber for example, they have got into trouble in Europe, they have been banned in France – but if Uber had been a DAO there would have been no bank accounts to freeze, nothing for regulators to ban. Decentralized also means personal autonomy. We decide what we do. Nobody tells us. And being borderless means that we are not confined by a passport to live in an area of war or famine. That’s as wrong as judging people on the color of their skin or sexual preference.
Another 'Governance 2.0' report from last year was entitled, Bitnation, Horizon and Blocknet join forces to deliver the world’s first platform for do-it-yourself governance services to 2.5 billion people across developing markets, using Bitcoin 2.0 technology (13 March 2015):
Bitnation, which is known mostly for its groundbreaking pilots including organizing the world’s first Blockchain Marriage and the world’s first World Citizenship ID on the blockchain, which has drawn attention from, amongst others, Wall Street Journal, Forbes, TechCrunch, Wired, and The New York Times. Bitnation aims to provide governance services in frontier and emerging markets, where it is needed the most. There are 2.5 billion unbanked in the world, the “System D” economy – the grey, unregulated markets – is a 10 trillion dollar economy, and 80% of the world’s population currently live in developing markets. Bitnation’s Founder and CEO, Susanne Tarkowski Tempelhof, worked for 7 years in challenging frontier environments, like Afghanistan, Egypt, Libya and other countries, primarily with assessing people’s perception and experience of governance.
After the BitNation announcement that you can 'Create Your Own Nation in 140 Lines of Code,' performed in an online live event from an Irish pub in Rio de Janeiro, Brazil on 15 February 2016, one could step back and consider Orwell again.

Nineteen Eighty-Four asked how the individual's heart and soul could survive inside a totalitarian state. Orwell also pondered how information systems helped to construct, and were thereby reflected in, the governmental structure employed by political authority. In Nineteen Eighty-Four, the two questions are intimately related. It is through systems of emotions, morality, thought and communications that we bind mentalities to power. This is why, where religion once underpinned the God-sourced monarchical state, the disciplines of history, philosophy and politics replaced it in the era of the secular nation-state.

Of those three modern disciplines, aspiring technocrats have fixated on politics to develop their visions of 'techno-community-states.' But when history and philosophy are absent or only superficially considered, politics fares poorly in shotgun weddings with technology. We need all three disciplines together, and the help of other fields, to understand how tech-communities might be defined and connected to statehood, if at all. Chris Ellis, in conversation, preferred not to label the new permanent Web politically. His priority is to put "the technology into people's hands so that they get to have a say. Everyone gets their own narrative now. [And] it will be irrefutable that someone said [what they said]." He believes that it is better to allow time to tell whose narrative ends up being correct, rather than forcing older political theories onto the current evolution of the technosphere and its unprecedented circumstances and possibilities.

That is only partly true, because future truths don't spontaneously manifest themselves. There are always present-day conflicts over which narrative will dominate, and future corrections can be very difficult. Future historians will be actors who help to decide which narrative ends up being 'correct' - in the sense of being the closest reflection of our Millennial reality, while also being relevant to future considerations - and for how long. But what is at least promised now is that there will actually be future historians to make those analyses, as opposed to Winston Smiths, working in the Records Department of a future Ministry of Truth. These are the overarching questions behind the technical overview that follows; they will be explored in subsequent posts in this series on the Permanent Web.

The explanation of Ethereum starts at 2:25; the discussion of IPFS starts at 4:26; from The Daily Decrypt's interview with Ethereum Director of Integration Taylor Gerring (26 January 2016). Gerring emphasized that Ethereum allows users to develop applications which sit on the cryptocurrency's blockchain. Video Source: YouTube.

In the January 2016 video above, The Daily Decrypt interviewed Ethereum developer Taylor Gerring to explain what Ethereum is and how it might be combined with IPFS to build and incentivize a permanent Web. All data on IPFS are perpetually recorded online by means of peer-to-peer network distribution; and every subsequent change to a given file is also permanently recorded, because the amended file acquires a new cryptographic address, which can be indexed by a blockchain. This combination of IPFS with cryptocurrency technology would establish a standing historical archive of online information. The cryptographic address of any piece of information ensures that the information cannot be manipulated without the manipulation being noted in another encrypted process. Proponents in the open source coding community claim that this combination will build a new Web which functions as the Web should have from the start.
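The content-addressing principle described above can be illustrated in a short sketch. This is a simplified, hypothetical illustration only: real IPFS derives multihash-encoded content identifiers (CIDs) from chunked Merkle DAGs, not a flat SHA-256 digest as shown here, but the core idea - that an address is computed from the data itself, so any edit produces a new address - is the same.

```python
import hashlib

def content_address(data: bytes) -> str:
    # A content address is derived from the data itself,
    # so any change to the file yields a different address.
    return hashlib.sha256(data).hexdigest()

original = b"The lost townscape of sixteenth-century Edinburgh."
edited   = b"The lost townscape of sixteenth-century Edinburgh!"

addr_v1 = content_address(original)
addr_v2 = content_address(edited)

# Both versions remain retrievable under their own addresses; a blockchain
# could index the succession (addr_v1 -> addr_v2) as a tamper-evident record.
print(addr_v1 != addr_v2)  # True: even a one-byte edit changes the address
```

Because addressing is deterministic, anyone can re-hash the retrieved data and confirm it matches its advertised address, which is why manipulation cannot pass unnoticed.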

Tuesday, November 3, 2015

Crowdfund the Creative Life


Image Source: The Oatmeal (29 October 2015).

In 2014, Pierre-Michel Menger published a fantastic book, The Economics of Creativity: Art and Achievement under Uncertainty, which describes the strange social psychology that governs how we assign value to creative products. Artists and other creative thinkers have never been more desperately needed to understand the changes of the new Millennium; but they face industrialized work conditions, corporate business models, and commercialized distribution systems. Over-competition and over-supply create professional hierarchies in which obedience trumps innovation. Worst of all, creative disciplines - from the amateur arts to academia - depend on an idealized belief in genius achievement, which is supposed to ignore money to maintain purity of intention.

Menger argued that the problem of evaluating and supporting creativity should not be framed in terms of employment and work conditions. Rather, the focus should shift to understanding the nature of human invention and how to sustain it. We must rethink how creative people live and work and how they are compensated, because true creativity already depends on what Menger called "self-realization," a non-chaotic engagement with uncertainty. Move to the edges of any society, and you will find people analyzing and radically rethinking how our world works, and how we fit in the world. Thus, imaginative work is, by definition, not a fully programmable activity. It is unpredictable, both in terms of how long it takes and in terms of results. Yet that uncertainty must be managed, or creative people cannot survive, and their cutting-edge visions will be lost.

Sunday, July 19, 2015

Unhacking the Reality of Russia's Troll Farms


Image Source: ritabrezkolin.

Recent posts on this blog have pointed to debates about how to verify memory, truth and reality. These are pressing questions. Mistaken arguments based on online information lead to outrageous conspiracy theories. These theories masquerade as products of verified data-gathering. Many people online are travelling further and further from truth, rationality and reality, while thinking they are doing exactly the opposite. Can we map the philosophical distinctions and debates made between subjective perception and objective reality onto the distinctions between virtual reality and meatspace? How do we determine reality, based on evidence and experience from online communication and relationships? How do we write news and real-time history, and confirm events in reality, when all information, sources and historical authorities are suspect?

Image Source: Buzzfeed.

Later posts will seek to answer these questions. Today's post explores how dangerous continued confusion on these questions can be, starting with Russian leader Vladimir Putin's quest to rewrite post-Cold War history. One of the primary tools used in this regard is the establishment of so-called 'troll farms,' groups of social media actors employed by the Russian state to spread anti-American real and faux news that appears to come from domestic American man-on-the-street sources. This is an example of Stalinist disinformation techniques revived by the Russian state-media apparatus, targeting a rival national audience rather than its own.

Of course, in the Communications Revolution, everyone in governments, corporations, hackerspace and social media circles is gathering and playing with information. Today's focus on the Russian case does not deny the existence of trolls and hackers working for other powers and businesses. The point is to understand the dangers of Millennial propaganda - which is always cloaked in the pretense of its targets' commonly-held virtues - with this example. Trolls and troll farms can alter public opinion; worse: they can change their targets' perceptions of reality. They can change memory. They can change history. A great informational power struggle is taking place. The winners will define reality. On those foundations, anything can be built.

Vigilance in this regard means not just finding the evidence online which confirms the message with which you agree politically, or hearing the message you want to hear. It means hearing the message you do not want to hear, and considering it to be plausible. Unfortunately, healthy skepticism is now a breeding ground for anti-reality trolls, so caution is required on that front too. Algorithms on search engines now tailor our results based on earlier search histories, narrowing our worlds into smaller and smaller solipsistic bubbles, exposing us only to those with whom we easily agree. Rationalists believe that logically-aligned data provide solid conclusions to support their arguments. They do not, because all data are malleable. Hackers and whistleblowers supposedly topple evil authoritarian power structures by blowing them away with the cold, hard truth. Well, even if they do, does that mean that these actors are automatically 'good'? To what service do they put their revelations? And as with the links listed below about Russia troll farm whistleblowers, how can we be sure they are telling the truth?


The recent Pluto flyby revealed how instantaneous media have confused people about the real accomplishments of Big Science and space exploration. Some thought the flyby was faked, echoing the moon landing conspiracy theory and indicating a lack of confidence in information authorities and in photographs and videos as evidence. Some complained that there was no live feed from Pluto, showing zero understanding of the immense distances involved and the related scientific challenges. The vast supply of online information on these subjects has provided no online education. Instead, it has provided a false sense of confidence about being 'right' in quasi-rationalized, self-righteous and communal ways.

Image Source: CStar.

Sunday, May 31, 2015

No Dislike Button: Social Media's Utopian Judgements and Misjudgements


Image Source: RLBPhotoart via Ghost Hunting Theories.

The blog is back! You know that gradual sense of erosion, the haunting of a Millennial mind as it over-surfs through a day that starts with optimism and ends in futility? How do social media contribute to a day's drift toward despair? In a New Yorker article from October 2014, Joshua Rothman criticized Facebook's fake optimism, its missing 'dislike' button, its relentless insistence that we like everything and constantly cough up happy thoughts and accomplishments to build a smiley online community (Hat tip: Daniel Neville). Rothman sees Facebook as an arena where participants compete to be the greatest contributors to collective happiness, which is equated with a complex of good attitudes and real outputs offered as proof that good attitudes work. Beneath that lies a misjudgement of those who are not sharing enough good-attitude tidbits, or enough evidence of personal success. Rothman thus concludes that Facebook is one of the Web's Kafkaesque lower courts of judgement:
Facebook, like much of the Web, is officially designed to encourage positivity; there is no “dislike” button, and the stated goal is to facilitate affiliation and belonging. But, over time, the site’s utopian social bureaucracy has been overwhelmed by the Kafkaesque churn of punishment. ... Facebook has become a dream space of judgment—a place where people you may know only in the most casual way suddenly reveal themselves to be players in a pervasive system of discipline. The site is an accusation aggregator, and the news feed is the docket—full of opportunities to publicly admire the good or publicly denigrate the bad, to judge others for their mistakes or to be judged for doing it wrong.

Not all of Facebook is devoted to overt judgment and punishment, of course; there are plenty of cute family photos and fun listicles floating around. But even superficially innocuous posts can have a hearing-like, evidentiary aspect. (Paranoia, unfortunately, is inevitable in a Kafkaesque world.) The omnipresent “challenge”—one recent version, the “gratitude challenge,” asks you to post three things you’re grateful for every day for five days—is typically Kafkaesque: it’s punishment beneath a veneer of positivity, an accusation of ingratitude against which you must prove your innocence. ... Occasionally, if you post a selfie after your 10K or announce a new job, you might be congratulated for “doing it right.” But what feels great in your feed takes on, in others’ feeds, the character of what evolutionary psychologists call “altruistic punishment”—that is, punishment meted out to those who aren’t contributing to the good of the community.
Social media's stick-wielding positivity is divorced from human experience, while constantly appealing to experience as proof of its viability. You had better build the happiness of your online community, little Boot-camper. Or else. Positive cultural motivation supposedly drives productivity; except it doesn't. In this fake positive culture, dominated by Facebook's small egotists, success becomes meta-performance, which does not mirror the protracted work and grit needed to accomplish anything substantial. Anyone remotely sensitive to actual positives and negatives is left enervated, isolated, alienated, depressed.

Wednesday, April 8, 2015

Perspectives and Milestones


Paul Stankard's handblown glass pieces look impossible to create. In Beauty Beyond Nature, he discusses the craft. Image Source: Paul Stankard via boing boing.

This blog has just passed 2 million hits! Welcome to the 1359th post and the end of the blog's fifth year. Thank you to everyone who has stopped here. Your time is precious and comments are always appreciated. Thank you to every interviewee who has been kind enough to discuss your work. And thank you to other bloggers I have met along the way, who have shown me the value of real-time publishing. These bloggers are all amazing people, intelligent and gifted mavericks (you know who you are), who know what to read on a desert island, and how to walk the line.

This blog was partly inspired by a site called The Strip, and partly by the late Mac Tonnies, whose blog (now in paperback on Amazon here) was published from 2003 until his untimely death in 2009. Tonnies' Posthuman Blues remains a landmark. Sometimes I revisit his blog, and I am still amazed by his ideas and vision, his uncanny ability to pin down the Zeitgeist, to channel the new Millennium's collective unconscious, to decrypt and encrypt a cultural environment of changing symbols, to describe the future.

Blogs remain relevant because some are still independent. Media independence? Political neutrality? What's that? Although media outlets co-opt blogs to make them branded social media products, some blogs remain artistic life statements and authentic testimonies. Readers follow a blogger on a personal journey as he or she tries to make sense of the exploding world of communications. So, Histories of Things to Come is based on a true story.

Monday, February 16, 2015

Modernity, Myth and the Scapegoat: Martin Heidegger, J. R. R. Tolkien and ISIL


Heidegger, at the centre of the photo, in the era of Nazi academia. Image Source: Le phiblogZophe.

Two roads diverged in a wood; I took the one less traveled by, and that has made all the difference. In 2014, the private notebooks of German philosopher Martin Heidegger (1889-1976) - muse of Jean-Paul Sartre, Jacques Derrida and Hannah Arendt - saw print. The publication of the so-called Black Notebooks confirmed that Heidegger's philosophy grew out of support for the Nazis and an essential anti-Semitism. Oceans of ink have been spilt over what Heidegger meant by Dasein, or Being-in-the-World (his union of subjective, objective and conscious perspectives with the world at large), but this elaborate existential debate completely misses the historical context which informed Heidegger's thought. Heidegger associated his cherished idea of Authentic Existence with the values of agrarian Europe. For the German philosopher, rootless Jews were part of a new, supranational world of corporate industry, banking and trade. Jewish precursors of globalization contributed to an inauthenticity of being, a life whereby everyday people, distanced from the soil, became phantom slaves in a technology-driven world that destroyed traditional culture.

The Scapegoat by William Holman Hunt (1854-1856). Image Source: Wiki.

It is too simplistic to dismiss Heidegger's thoughts on being and time as aspects of the Nazi narrative. But it is also wrong to say that his ideas can be read separately from their Nazi context. Heidegger was in the same ballpark, and that demands a serious reappraisal of his ideas.

In building their Aryan mythology against the Jews, the Nazis ironically appropriated the Hebraic concept of scapegoating. The scapegoat was originally an early Archaic, pre-Classical improvement (dating from around the seventh century BCE) on the sacrificial rites of other ancient societies. Scapegoating, a mental gambit which is alive and well today, occurs when one projects one's sins onto a goat and sends it off into the desert to die; this leaves one free from blame and responsibility, and able to get on with life without feeling guilty for one's wrongdoings.

Sunday, August 31, 2014

Time and Politics 12: College Days Before the War


Click to enlarge: application form to Elon College (1913). Image Source: Elon University via Chronicle of Higher Education.

For today, as classes start at universities across North America this week, see a college application from 1913 to Elon University, North Carolina, USA. It is only four pages long! The source is a report from the Chronicle of Higher Education. Those were the days when liberal arts were both progressive and considered a solid background for doing just about anything (including, unfortunately, dying in World War I). Post-2008-recession, critics consider the liberal arts to be politicized breeding grounds of the hopelessly underemployed, unemployed and unemployable. Today's defenders of the liberal arts insist that the arts and humanities teach their students critical thinking. Part of that critical thinking can extend to considering how progressive the new Millennium really is.

A friend, J., observed that at Elon University, "They didn't mention that, until 1963, only white people need apply!" He suggested that the progressive view now recognizes that the western-centric view of history - which this application embodies with its emphasis on classics - has given way to a broader, enlightened world history.

I agreed that this is the current prevailing view, although I feel it contains an anachronism. We now assume automatically that the western-centric vision is causally bound to racism, inequality, slavery, oppression, patriarchy. The notion that today's discipline of world history is more advanced than the previously western-centric, classics-focused liberal arts curriculum includes its own hubris-laden, anachronistic assumption about contemporary progress.

In 1913, people could not travel or communicate the way we can now. So why would we automatically expect people from that time to have the same broad global vision we do? Yes, it was an oppressive, unequal, patriarchal system. But at the time, wasn't the classics curriculum the founding source of liberal arts education? Wasn't that curriculum considered the epitome of progress in 1913?

One hundred years from now, what will people say about late 20th century and early 21st century liberal views of inclusion? Probably they will say that it was woefully benighted and reflective of its own time and place. We could equally say that today's world history discipline derives from perspectives informed by economic and political globalization, not the expansion of tolerance - even though it looks that way. Isn't it true that in today's globalized world, whose official creed is advanced progressive, tech-driven liberalism, there are more slaves now than at any time in history? And beyond that conventional definition of slavery, isn't technology not-so-quietly enslaving the entire plugged-in population? Bondage happens. That brutal reality - namely, that inequality, loss of freedom, vicious hatred, and violence are integral to the shiny, ultra-advanced globalized Millennium - breaks through heady tech dreams in unpleasant surprises and shocks.

Monday, March 3, 2014

Nuclear Technology Course Online



This blog often covers nuclear topics. For those who would like to learn more, the University of Pittsburgh is offering a free online course in Nuclear Science and Technology (Hat tip: Nuke Pro). The course starts on 3 March 2014 (there are future rounds as well) and runs for eight weeks, in English with subtitles. You can sign up here. All it will cost you is the textbook, which is here. It is also available at many different libraries, listed here (get the 1992 second edition or its 2008 reprint). And you might need to brush up on your basic physics and differential equations.

See all my posts on Nuclear topics.

Sunday, January 26, 2014

Craigslist's Shades of Grey


The new thing: 50 Shades of Grey Birthday Cake (thanks to -A.). Image Source: Inspire, Design, and Create.

This blog is always in search of the Millennial New Normal, so I thought, recalling this post, that I'd swing by Craigslist and have a look at their best ads. Craigslist is known for veering off the Internet's beaten track, and several MSM outlets, for example here, here, here, and here, have reported on the list's craziest ads. Posted on 15 December 2013, under the Pittsburgh, Pennsylvania listings, there is the following disturbing advertisement; it would make great fiction, but not-great Millennial reality:
$40k a year to attend Harvard University as me


You must have either a 4.0 GPA in high school, or a 3.5 or higher GPA from a university to get hired for this.

Your age does not matter, but you must be a male since I have a male name.

I am looking for someone to attend Harvard University pretending to be me for four years, starting August 2014. I will pay for your tuition, books, housing, transportation, and living expenses and pay $40,000 a year with a $10,000 bonus after graduation. All you have to do is attend all classes, pass all tests, and finish all assigned work, while pretending you are me.

You do not need to worry about being accepted, I have already taken care of that.

If interested please email me a little info about yourself, and we can meet in person to discuss further.

When we meet you will be asked to sign a non disclosure agreement, so you can not reveal who I am or any further information, whether you're selected or not.
  • Location: Cambridge, MA
  • it's ok to contact this poster with services or other commercial interests
  • Compensation: $40,000 a year

Tuesday, September 17, 2013

Class, Work and Time


Image Source: Work Mommy Work.

Danish researchers have determined that social class (and its associated jobs) determines how parents perceive time. Parents' perceptions of time tend to influence what subjects their children study. The parents' class distinction is shaped by the pay difference between a professional salary and an hourly wage. The Parent Herald reports:
The study "The Educational Strategies of Danish University Students from Professional and Working-Class Backgrounds" is based on 60 interviews with Danish students from six different university level study programmes: Medicine, architecture, sociology, economy, pharmacy and business studies.
Young people of parents with university degrees choose a 24-hour culture
... "For young people whose parents are university educated, factors such as prestige and a strong sense of professional identity are important. They are attracted by an educational culture in which you are a student 24/7, and where leisure activities are tied to the identity that lies within your studies. These young people have also grown up with topical discussions around the dinner table which also prepares them for their lives as students," says [co-author] Jens Peter Thomsen.
Young people from working class backgrounds choose '9 to 5' studies
When young people from working class homes with good grades in their A-level exams choose other paths than the prestigious studies, it is, among other things, due to the fact that they want a clearly defined aim of their studies.

"The young people who are first-generation university students often choose studies that are more '9 to 5' and less tied up to a sense of identity. They have lower academic expectations of themselves, and they choose studies with a clearly defined goal for their professional lives," in sectors where jobs are easily found.

They do not choose to study, e.g. sociology because it can be difficult to know what it might lead to jobwise, says the education sociologist. ... [Jens Peter Thomsen also] mentions that medical students from families of doctors may have a different view of the patient than a young person with a working class background, who also chooses to study medicine.
Who knows how generally true these findings are, given the study's limited numbers? At any rate, the researchers suggest that middle class students associate a broader understanding of time and work with their identity; and their job choices are secondary to that identity. That is, their work may be their life first, and their livelihood second.

On the other hand, students from working class families tend to take a pragmatic view, with a more constricted association between work and time; nor do they associate their personal identity with that more limited view. In other words, they may seek to shape personal identity outside their pragmatic job choices.

The researchers also found that the class-determined perception of time often persists across a generation, regardless of whether the graduates demonstrate upward or downward mobility. Finally, the study indicates that these class distinctions hold true, whether the university education in question is public and subsidized (and access to all fields of education is more equal) - or private and very expensive (and access to all fields of education is unequal).