Richard Allan – regulate.tech
Social media gives voice to aggressive extremists, provides powerful tools for like-minded people to find each other and reinforce the thinking of the group, and allows lies and disinformation to be propagated at speed. Social media companies come under pressure to do something about all that, and are not widely regarded as being sufficiently focused in their intent or sufficiently successful in their achievement.
This is an insider’s view of why that is harder than it looks and especially hard to scale, setting out clearly and logically how this can work and why it can’t. It’s very much worth reading for the clarity with which it does that. But it also aims to support the assertion that those working on this within the social media platforms are “good people making hard decisions as best they can.” The question for the rest of us is whether their doing the best they can is good enough – and the reassurance that Facebook knows best is perhaps not quite as reassuring as its supporters might hope.
Strategic Reading – and indeed strategy – tends to the lofty, the grand scale and the dispassionate. So at first sight, this personal and emotional reflection by Terence Eden might seem out of place here. But it is precisely because of the lofty perspective that his point is so important. Thinking about and, even more so, making decisions about issues which affect thousands or millions of people can never be about each of them as individuals. And few real world complex problems have a reassuring Pareto-optimal solution where we can sleep easy knowing that we have made things better for some and worse for none.
Abstracting from the individual can be a very necessary thing to do. But that’s not at all the same as forgetting that there are individuals, real people with real lives which can be made better or worse by distant decisions. To lose sight of that is to become less human. The first step to treating people badly is to strip them of their individuality; and even where no bad treatment is intended, stripping people of their individuality is, more insidiously, a step towards the risk of it. And yet as Terence says,
I simply cannot think about them as individuals. No one’s brain has room to contemplate the pain and joy and heartbreak and elation of so many people. It is unfair of me to care about any one person more than another.
The dilemma is inescapable. Being aware of it is the very least we should expect of those whose work forces that issue upon them.
There is an almost universal belief, no less strong for being almost as universally unspoken, that the UK political system is an exemplar of stability and moderation. There is a related belief, near universal among those most affected by it, that being a non-political civil servant is unproblematic, precisely because of those characteristics of the wider political system.
Those beliefs have been pretty resistant to evidence. Reflections on civil service ethics generate little interest. The remarkable resignation letter of a British diplomat in the USA cracked the facade, but the crack is already healing. This post takes that resignation as its starting point for a deeper examination of the fragility of civil society. It is short and pointed; alarmed but not alarmist. It sets a challenge. It is not clear where an effective response to that challenge will come from.
Maciej Cegłowski – Idle Words
This is another piece which isn’t new but which provides some good provocative food for thought on how applying a computer programming perspective to problems which are fundamentally social can – and does – lead to unfortunate results. It’s written by Maciej Cegłowski, who brings elegant erudition to an unlikely range of subjects, in this case how an approach based on controlling closed systems breaks down when confronted with messily indeterminate ones, with a scattering of one-liners which combine challenge and simplicity, such as
Machine learning is like money laundering for bias.
But his conclusion is much broader – and an even greater challenge – than that single line suggests:
We have to stop treating computer technology as something unprecedented in human history. Not every year is Year Zero. This is not the first time an enthusiastic group of nerds has decided to treat the rest of the world as a science experiment. Earlier attempts to create a rationalist Utopia failed for interesting reasons, and since we bought those lessons at a great price, it would be a shame not to learn them.
Helen Margetts and Cosmina Dorobantu – Nature
Much of what is written about the use of new and emerging technologies in government fails the faster horse test. It is based on the tacit assumption that technology can be different, but that the structure of the problems, services and organisations it is applied to remains fundamentally the same.
This article avoids that trap and is clear that the opportunities (and risks) from AI in particular look rather different – and of course that they are about policy and organisations, not just about technology. But arguably even this is just scratching the surface of the longer term potential. Individualisation of services, identification of patterns, and modelling of alternative realities all introduce new power and potential to governments and public services. Beyond that, though, it becomes possible to discern the questions that those developments will prompt in turn. The institutions of government and representative democracy are shaped by the information and communications technologies of past centuries and the more those technologies change, the greater the challenge to the institutional structures they support. That’s beyond the scope of this article, but it does start to show why those bigger questions will need to be answered.
Ellen Broad – Inside Story
Ellen Broad is a technologist who is expert on policy, an ethicist who is an incisive thinker, a writer who has published an important and well argued book. She also, as it happens, is a woman. We all know that that last statement should be irrelevant to the preceding three. We all know that it isn’t.
But we can all too easily persuade ourselves that everything is more or less all right, that blatant discrimination is a thing of the past, that even though there are rotten apples, the barrel is sound. This article calmly and comprehensively demolishes that easy optimism. It is a very deliberate breaking of self-imposed silence, a discarding of the assumption that quiet compliance will somehow make things better:
If there is no “right” way as a woman to speak about gender issues — if there is no “right” way for a woman to take up space, to take credit — then silence won’t serve me or save me either. The only way forward from here is to start speaking.
In this essay, Ellen has spoken. She needs to be heard.
Rachel Coldicutt – Doteveryone
The subtitle of this post lays down a challenge:
Why a social scientist could be the most important person on your product team
Leaving aside the point that it might be an even better challenge if ‘philosopher’ were substituted for ‘social scientist’, this is an important issue. There is much talk (and much writing) about the need for ethics in data and software – though curiously rather less so in service design, where it is no less important.
But ethics is not some esoteric form of quality assurance added as a final overlay to activities otherwise devoid of any moral compass. It is perhaps better understood (in this context) as the encapsulation of a deep and pervasive view that technology should work for humanity, not the other way round.
What would computer science look like if it included the perspectives of the humanities and social sciences from the outset? And what if those perspectives came not from some thinker in residence, but from people who brought a fusion of perspectives and understanding to problem solving?
And whatever the answers to those questions might be, there is a wider one still: where does that fusion not have a place? The Amalgamated Union of Philosophers, Sages, Luminaries, and Other Professional Thinking Persons may be due for a resurgence.