The Impossibility of Apolitical Technology
A friend (with whom I’ve had some great email threads and IRC chats) is woke to the corruption of social media and discusses it here. While I agree with his accusations of politics, I disagree that any such service can ever exist apolitically. I assert that any technology that embeds a value judgement is political. It follows from this definition that any technology which, whether algorithmically or through operator intervention, judges the rightness or wrongness of its product is political. Twitter clearly falls into this category, as do the small printing presses of colonial America and the large presses of two hundred years later.
Some would say that it is possible to create apolitical software. How can, for example, a compiler be even remotely political? How can the considerations around its construction be anything but technical? Here is an example. GCC is the compiler used to build most Linux binaries, and glibc is the C library they link against. Years ago, a “technical” decision by glibc’s maintainer, Drepper, broke static linking. This means that no useful binary can execute on Linux without dynamically linking to certain libraries, making the proposition of distributing signed binaries futile, making the proposition of secure software futile, making the proposition of Bitcoin futile, making the proposition of sound money futile, making the proposition of free trade futile. Whether or not Drepper is aware of the political implications of his technical decision is irrelevant to the fact of their existence. Nevertheless, there is a belief among technologists “educated” at ITT and its public equivalents that software can exist outside of politics. As a result the US has a legal system that runs on Word, a financial system that runs on Excel, and a voting system that runs on Windows.
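For the skeptical, the breakage is easy to reproduce. Below is a minimal sketch (a hypothetical resolve.c; the exact warning text varies by glibc version and distribution) that calls getaddrinfo(), one of the name-service functions glibc will not fully resolve at static link time. Even a binary built with -static this way still expects glibc’s shared objects to be present on the machine that runs it.

    /* resolve.c -- minimal program exercising getaddrinfo(), an NSS
     * entry point that glibc cannot fully statically link. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <netdb.h>
    #include <sys/types.h>
    #include <sys/socket.h>

    int main(void) {
        struct addrinfo *res = NULL;
        /* Resolve a hostname; this is where the NSS machinery kicks in. */
        int rc = getaddrinfo("example.com", "80", NULL, &res);
        if (rc != 0) {
            fprintf(stderr, "getaddrinfo: %s\n", gai_strerror(rc));
            return EXIT_FAILURE;
        }
        freeaddrinfo(res);
        puts("resolved");
        return EXIT_SUCCESS;
    }

    /* Attempting a static build typically produces a linker warning
     * along these lines (wording differs across glibc releases):
     *
     *   $ gcc -static resolve.c -o resolve
     *   warning: Using 'getaddrinfo' in statically linked applications
     *   requires at runtime the shared libraries from the glibc version
     *   used for linking
     */

In other words, the “statically linked” binary is not self-contained after all, which is the whole point of the complaint above.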
And so I argue not for companies like Twitter and Facespace to renounce politics (that would be impossible) but for their opponents on the right to stop using collectivist, totalitarian technology and to start making moral decisions about the software they make and use.