This is the seventh and last post of Spectrum’s 2018 Summer Reading Group. Each post was drawn from chapters of the book Stand out of our Light by James Williams. You can view the reading/posting schedule here.
In the last few pages of Stand out of our Light, James Williams gives a final account of the value technology brings, why readers should struggle alongside him toward the best expression of the social web, and why we should work, with technology, to save humankind's abilities to attend, choose, and become.
"At its best," he writes, "technology opens our doors of perception, inspires awe and wonder in us, and creates sympathy between us" (124).
It's a wistful testimony partly inspired by the Overview Effect, a transnational, ecological perspective that astronauts brought back to Earth during the glory days of NASA. Like those astronauts, Williams is reaching for "a common goal, a common purpose, a common story" that can guide humankind out of the shallow internet of adverts and clicks, and the troll caves otherwise known as comment sections.
In short: Williams wants a way for people in contemporary American culture to recover a basis for empathy across difference, to reboot skewed individual and collective priorities, and to remind ourselves of what's worthy enough to want.
I want that too.
I spend some of my best hours each week developing digital content for faith-rooted justice movements. The people I work with — most of them faith leaders, advocates, and policy specialists — aim to inform, encourage, and act with others. These are folks who want to pass on a better world to their descendants, take care of vulnerable people in the meantime, and participate today in a counter-structure of people living as though that better world is possible.
They don't do what they do to prime audiences' moral outrage pump or to "exploit our psychological vulnerabilities." I see them as far more thoughtful about whether to share shocking videos of state violence on social media or use scarcity and urgency to prompt sales or donations than Williams acknowledges. (Perhaps Williams is writing from outside of those user communities, and not from inside them?)
But the app-and-advertising social web has indeed been built to behaviorist specs. BJ Fogg's Persuasive Tech Lab at Stanford is just one behaviorist source common to a generation of self-advancement "life design" coaches who blend psychology, game theory, programming, and user experience techniques to induce new habits of action and thought in their readers and clients. As these students and researchers influence fields from counseling and finance to business development and app design, behaviorism has become one of the implicit contexts that we're all endlessly scrolling in.
So, like other members of this year's Summer Reading Group and Williams' teammates at the Center for Humane Technology, I'm both enmeshed with and skeptical of the manipulations of the modern web. I want us to be well-positioned to resolve the tension Jesus mentioned in his prayer for his followers: not that we would be taken out of the world but that God might keep us from evil (John 17:15). I can't quite say that we are yet.
Williams spends the last few chapters of the book wishing that professional associations could do the heavy moral lifting that the marketers and marks of the "attention economy" need. They're in no way robust enough to do so. Neither are our states or governments, or the corporations that model themselves after them. These structures enforce rule-compliance with the billy clubs of domination and expulsion and only occasionally stop when challenged.1
Williams also rejects religion as an influence on his philosophy of digital progress (102), and John Lennon once asked us to imagine a world without it. But I wonder if we can imagine a world with religion operating at its full, humanizing potential.
Imagine religion that offered common collective identity without effacing individual or group interests, religion that also promoted disciplines that individuals, families, communities, and groups could use life-long to build deeper reflective understanding and greater empathy. These practices might include meditation, moral inventory, peer accountability, common fellowship in common places, and the commonplaces of symbol, ideal, and time that make shared language possible.
Imagine religion that didn't devolve into autocracy; that didn't replace Messiah, Spirit, or avatars with suited or armored strongmen; and that didn't transform religious freedom for minoritized communities into religious regulation of minoritized communities. Imagine spirituality so well-rooted that it regularly transmuted outrage and debate, made space for reflection and trauma-recovery, and promoted empathy, hope, and organizing for meaningful systemic and relational change.
Instead of religion as method, however, Williams offers market economics as philosophy. He uses the market and its assumptions about commodities and scarcity throughout the book to interpret digital engagement and social media design. He's bothered by the products the market has shipped to us and the products the market is making out of us, but he leaves untouched the premises underneath it all.
From economist Herbert Simon forward (13), Williams uses transactional trade to unlock human psychology and relationships. Therefore, market rules — get this good for that trade-off; make profit using scarcity; treat resources and benefits and people as both limited and fungible — appear to govern human nature and interactions as well as the digital worlds technologists have made. Wealth and poverty, abundance and scarcity, an excess of information alongside a paucity of access or attention to appreciate it and use it well: all of these binaries are informed by the market construct that Williams treats as first principle.
But the market is a wobbly frame for human identity and interaction, and profit as the ultimate design goal is unfit ground for futures worth living in. If the trans-Atlantic slave trade left behind any lessons, the list must include these: one, humans aren't commodities, and two, orienting people primarily in terms of market sales dehumanizes both the traded and the traders.
Mayhem follows from conceiving of us as if we're analogous to grains of rice and hands of banana.2 Williams correctly perceives that the software and app market has incentivized design goals that don't serve actual people. But his perception doesn't take him much further because the assumptions of economics aren't in fact sound enough to drive humane design and ethical criticism.
People aren't things, and allowing economic metaphors to dominate how we understand people means allowing the organic world to be disordered by the logic of commodity. The major key for organisms should be community, not commodity.
Functional religion teaches us this.
So do cultural ethics like ubuntu, a South African ethic that Williams mentions during his epistemological chapter, "The Daylight." Contrary to the book's description of it, ubuntu is not about either individual personality or the moral binary of "good" and "bad." It's rather about the collective self, a village large enough to contain the entire world without requiring uniformity from the world. It's about an ethic of belonging, a relational ecosystem of interdependence,3 and not a network of pocket-sized malls that are user-centered in the same way that Thanksgiving dinner is turkey-centered.
Ubuntu and the moral concepts like universal human dignity that are implied by it may be new to human history,4 but our collective future depends on our taking them as foundational instead of the "dominant business model and design logic of the internet." Without that commitment, we'll keep running the explicit and implicit rules of the current system in the name of social good, even when its rules produce what Michelle Alexander calls "procedural injustices" and society tags the negative human impacts as acceptable loss.5
"Ultimately," computer scientist Jaron Lanier explained in a talk on social networks and digital engagement earlier this year, "you cannot have trust [between individuals] without having built a society of human trust."6 He was arguing that structure sets the stage for how we relate one-to-self, one-to-one, and one-to-others.
Williams wrestles with this when he writes, "Whether irresistible or not, if our technologies are not on our side," that is, if they are "adversarial" and fundamentally inhumane, "then they have no place in our lives" (100).
If our technologies are not on our side, then they have no place in our lives.
"Our technologies" don't only include our pocket computers and the social web. They also include our ecclesiastic and theological technologies: our doctrines, ritual stories and shared texts, interpretation methods, accountability processes, use of resources, and administrative structures.
Inasmuch as people created all of these technologies, people maintain them, and people revise them — often far too slowly for the good of the collective — it makes no sense to kick responsibility for them up or down the metaphysical chain. It's disempowering to blame monsters for the socio-technological systems we've designed and keep running, and it's worth considering instead what individual and collective agency we do have as we live in this world, seeking to be kept from evil.
Williams wants technological tools to track with human values and ideals and our sense of who we are, but we don't share consensus on what those values, ideals, or identities are or should be (cf. 118). Social polarization and fragmentation limit our ability to build consensus, but in the absence of agreement, it's the certain who inherit the Earth: those who are sure end up designing society for those who aren't sure, and for those our systems haven't granted the standing to co-design.
Villagers of the cosmos in and beyond Adventism would benefit from our revisiting our approaches to advertising, evangelism, and other outbound content in the light of enthusiastic, ongoing, prior consent, or what marketers like Seth Godin call permission. The question of whether we're obliged as consumers to block online ads to preserve cognitive attention and moral clarity is akin to the question of whether we're obliged as creators to stop littering cities with books and pamphlets that nobody asked for, that aren't anchored to community concerns, and that superimpose a very narrow moral model on people outside our group.
Being villagers of value would require us to offer our neighbors a different quality of listening and a whole new level of humility. People can already use privacy plugins, ad blockers, and social network restrictions to counter the attention economy and content that's out of alignment with how they understand themselves and what's important to them. What would it mean to secure their permission in this context? As content makers and recipients, how can we withdraw consent, attention, or time from other makers without severing our underlying relationships with them?
Governments that conduct mass surveillance and data collection without consent have no moral standing to evaluate "attentional harm" or regulate "inner pollution." And requiring people to pay to opt out of marketing they didn't ask for while designers work to "optimize" their consent, another of Williams' alarming proposals, might fit in an adversarial market, but is a total nonstarter in the light of ubuntu. We can and should expect more of one another than entrapment or force or self-censorship through compliance oaths, even oaths as sympathetic as Williams' (119-121) or the updated baptismal covenant I drafted on a lark a few years ago.
"None of these interventions — greater transparency of persuasive design goals, the development of new commitment devices, or advancements in measurement — is enough to create deep, lasting change in the absence of new mechanisms to make users' voices heard in the design process... None of this should surprise us at all, because it's exactly what the system so far has been designed to do." (123)
Where we are is based on what our systems have so far been designed to do.
And what we'll design together next will be based on the quality of our moral imagination and our willingness to get perhaps the only social technologies Williams didn't think could help — religious technologies — ready to support the common good, not just private .org interests.
An electronic version of Williams’ book is available free/open access through the Cambridge University Press website. (For those who prefer a physical copy, Amazon has it in paperback for about $15 at the time of this writing.)
Notes & References:
1. Adventist publications like Adventist Today and Spectrum are monitoring the Adventist General Conference's assertions of authority to command global compliance with policies and creeds, across organizational levels. See Adventist Today's September 19 report on newly formed compliance committees.
2. Morrison, T. (1975, May 30). A humanist view (transcript). Portland State University’s Oregon Public Speakers Collection: “Black Studies Center public dialogue. Pt. 2.” Public Dialogue on the American Dream Theme. Portland State University Library.
3. Abrahams, Y. (2016, July). Thank you for making me strong: Sexuality, gender, and environmental spirituality. Journal of Theology for Southern Africa, 155: 70-87. Via the Global Faith and Justice Project. See also Kaoma, K. (2013). God’s family, God’s earth: Christian ecological ethics of ubuntu. University of Malawi Kachere Press; and Flunder, Y. (2005). Where the edge gathers: Building a community of radical inclusion. Pilgrim Press.
4. Williams, J. (2018). Stand out of our light: Freedom and resistance in the attention economy. Cambridge, p. 88. Compare Debes, R. (2018). Human dignity is an ideal with remarkably shallow roots. Aeon. Retrieved September 20, 2018.
5. Alexander, M. & Brown Douglas, K. (2018, September 11). Live stream: Community read and book discussion of Just mercy: A story of justice and redemption. Union Theological Seminary. Retrieved September 12, 2018.
6. Lanier, J. (2018, June 19). Ten arguments for deleting your social media accounts right now. C-SPAN. Retrieved September 16, 2018. See also Cherry, S. (2013, July 16). Jaron Lanier: We're being enslaved by free information. [Podcast, transcript]. IEEE Spectrum.
Keisha E. McKenzie is director of digital strategy at Auburn Seminary, NY, and principal at McKenzie Consulting Group, MD.
Image courtesy of Cambridge University Press.
This is a companion discussion topic for the original entry at http://spectrummagazine.org/node/9050