ANEEB H AHMED WRITING / TWITTER / PGP

freedom advocate + humanist. interested in the open source philosophy and in generative tech that empowers individuals.

now: 21e6. then: uber advanced technologies group, eight sleep, IEM.



THE NETWORKED SPECIES & INEXTRICABILITY THEORY / UPCOMING

The most fascinating insight from the 2016 US presidential election was not just that Donald Trump 'defied all odds' and won, but that, even more surprisingly, our brave new world still has little idea how it could have happened. What we're in for is a rude awakening to the sheer inevitability of the Internet and the dramatic consequences that will inherently unfold. [... new post soon]


MOVE FAST AND BREAK THE FABRIC OF SOCIETY / 30 MAY 2018

As eternal creatures descended just yesterday from primates, we are running on an increasingly ancient operating system crafted organically by the forces of evolution. Our brains developed to respond to relatively simple stimuli with relatively simple reactions tailored for survival, a rudimentary trial-and-error process honed over millennia to ensure the long-term success of our species. This innate animal phenomenon, a system of self-awareness and subsequent improvement, serves as a model in computer science for what experts in machine learning call reinforcement learning. Given enough compute hardware, a well-fitted algorithm, and the proper pipelines of rich, accurate data flowing in to recursively improve that algorithm, engineers can train software to 'learn' (i.e. find the patterns in) a given dataset and figure out its underlying systems in less and less time, with more and more profound insights. In pop culture, the phrase of choice for these technologies is one you may have heard of: artificial intelligence.
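To make the trial-and-error analogy concrete, here is a minimal sketch of reinforcement learning (tabular Q-learning on a toy five-state corridor) in plain Python. The environment, reward, and hyperparameters are invented purely for illustration; this is a sketch of the general technique, not a description of any production system.

    import random

    # Toy corridor: states 0..4, agent starts at 0, reward of +1 for reaching state 4.
    # Actions: 0 = step left, 1 = step right.
    N_STATES, GOAL = 5, 4
    ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1  # learning rate, discount factor, exploration rate

    q = [[0.0, 0.0] for _ in range(N_STATES)]  # one row of action-values per state

    def step(state, action):
        """Apply an action and return (next_state, reward, done)."""
        nxt = max(0, min(N_STATES - 1, state + (1 if action == 1 else -1)))
        return nxt, (1.0 if nxt == GOAL else 0.0), nxt == GOAL

    for episode in range(500):
        state, done = 0, False
        while not done:
            # Epsilon-greedy: mostly exploit what has worked so far, occasionally explore.
            if random.random() < EPSILON:
                action = random.randrange(2)
            else:
                action = 0 if q[state][0] > q[state][1] else 1
            nxt, reward, done = step(state, action)
            # The reinforcement signal: nudge the estimate toward reward plus discounted future value.
            q[state][action] += ALPHA * (reward + GAMMA * max(q[nxt]) - q[state][action])
            state = nxt

    print(q)  # after training, 'step right' has the higher value in every state

After enough episodes, the table encodes the lesson that stepping right pays off: the same loop of act, observe, adjust that the paragraph above attributes to our own evolutionary wiring.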

The parallel between human cognition, reinforcement learning, and artificial intelligence lies in the definition of intelligence, most simply described as orienting oneself in the present, determining a future goal, and optimizing the actions required to close the gap between the two. As you would expect, when you abstract intelligence into an algorithm that can be continually optimized, computers can quickly gain expertise in specific domains, while the programmers gain expertise in how to better shape cognitive systems. And as we learn more about the foundational cognitive attributes and mechanics of the human brain, how the levers of neurons and synapses work in synchrony, strengthened and pruned by stimuli, we become dramatically more susceptible to the manipulation of those mechanics. This will become one of the greatest challenges facing humanity moving forward.
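As a rough illustration of that 'close the gap' framing (my sketch, not anything from the original post), the same idea fits in a few lines of Python: measure the distance between the current state and the goal, then repeatedly act to shrink it. The numbers are arbitrary.

    # Intelligence as gap reduction: orient, compare against a goal, act to shrink the difference.
    goal = 10.0          # the desired future state
    position = 0.0       # where the system currently is
    step_size = 0.2      # how aggressively each action closes the gap

    for i in range(30):
        gap = goal - position            # orient: how far from the goal are we?
        position += step_size * gap      # act: move a fraction of the way there
        print(f"step {i:2d}  position={position:.3f}  remaining gap={goal - position:.3f}")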

George Orwell predicted many gloomy outcomes for how an intelligent, dystopian future would coalesce as the second millennium came to a close. Governments would spy on their citizens (correct - the Snowden revelations). Citizens' locations could be tracked covertly in real time, at any time, by certain privileged entities (correct - both state-level and private actors). Propaganda machines would run rampant, within almost every person's grasp and with no escape (mostly correct - the Internet is now pervasive, with ever-increasing saturation). Listening devices would be everywhere (correct, though little did he know they would actually be made by corporations, not the government, and placed voluntarily in homes by giddy consumers themselves). In just over 10 years since the introduction of the smartphone in 2007, the device has reached over 77% saturation among US adults, every unit preloaded with state-of-the-art GPS and WiFi sensors and continuous Internet access.

Another product born in the mobile Internet age, Facebook, transmogrified from an obscure Harvard attractiveness-rating website into a ubiquitous, monolithic megacorporation with direct daily reach to over 1.5bn people, bearing responsibility for tilting the 2016 US presidential election, for manipulation around the 2016 Brexit decision, for acting as a deadly political tool in the 2017 Rohingya genocide in Myanmar, and for evaporating at least one billion collective hours per day from users conditioned to scroll through their empty-calorie Facebook feeds, among many other egregious moral and ethical violations. To keep users continually interested in the product, and therefore seeing more ads, Facebook and other advertising-based digital products build dark patterns into their user interfaces, designed to directly manipulate subconscious human levers such as emotion, novelty, achievement, and reward. In the grand quest to sell you better ads, Facebook has created a dopamine-abusing surveillance panopticon with widespread (and not yet fully understood) externalities, where negative emotions are literally packaged and sold to you, where people are developing symptoms such as phantom vibration syndrome, and where Internet addiction is seeing dramatic upticks. Don't take my word for it: early Facebook executive Chamath Palihapitiya has described Facebook as 'creating tools that are ripping apart the social fabric of how society works... No civil discourse, no cooperation; misinformation, mistruth. You are being programmed.' Facebook invests massive amounts of capital into machine learning: for products as simple as a news feed, photos, and messaging, it seems peculiar for Facebook to spend about $8bn yearly on 'research and development'.

Therefore, the fundamental purpose of Facebook is not to develop a product for 'fostering relationships' or 'gathering together communities', as the PR goes, but rather to deploy a powerful surveillance tool that collects as much raw data on users as possible in order to micro-target advertisements to them (over 98% of revenue, at >$40bn in 2017) and, overtly or covertly, to serve as a vehicle for omnipresent observation. Increasing user retention is simply shorthand for increasing user addiction. And as Francois Chollet of Google notes, that's not even the worst of it: the most worrisome issue is in fact Facebook's 'use of digital information consumption as a psychological control vector'. With increased use of algorithmic filtering, crystal-clear insight into every individual user's interests and habits, and the ability to stratify audiences with surgical precision, Facebook and other advertising-based digital products can gaslight users, deliberately or incidentally, by slowly shaping the polarity of the content they consume and calibrate their worldviews against.
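To make 'algorithmic filtering' slightly more tangible, here is a deliberately naive sketch of engagement-ranked filtering: when predicted engagement is the only objective, emotionally charged items crowd out neutral ones. The posts, scores, and function names below are all invented for illustration and bear no relation to Facebook's actual systems.

    # Toy engagement-ranked feed: rank purely by predicted engagement and keep the top items.
    posts = [
        {"title": "Local library extends weekend hours", "predicted_engagement": 0.02},
        {"title": "You won't BELIEVE what this politician said", "predicted_engagement": 0.31},
        {"title": "City council publishes budget report", "predicted_engagement": 0.01},
        {"title": "Outrage erupts over viral video", "predicted_engagement": 0.27},
    ]

    def rank_feed(posts, limit=2):
        """Return the items with the highest predicted engagement (the sole optimization target)."""
        return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)[:limit]

    for post in rank_feed(posts):
        print(post["title"])

The neutral civic items never surface, not because anyone chose to bury them, but because the objective function never asked about anything except engagement.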

This is all powered by machine learning algorithms and dark design patterns, and I want to reiterate the importance of this 'digital' characteristic. Our primate brains would never have anticipated the supernormal stimuli promulgated by computer screens or the Internet. And again, this is not happening at a pitiful scale; this is over 1.5bn people (and counting) consuming 'curated' content daily inside a private company's walled garden, where the company knows its users' interests and habits better than they know them themselves. The only network in human history with a more expansive reach than Facebook, and a two-thousand-year head start, is Christianity. And when seeking comparisons for such a dramatic shift in the collective behavior of humanity, Glenn Reynolds notes that the last such tectonic shift came with the spread of diseases in early city-based civilizations: then physical, now mental. Maybe we don't know the disease vectors that we're inadvertently unleashing. This is simply unprecedented.