The first page of David Auerbach’s memoir, which also functions as a selective history of the relation between the human being and computer programming, begins at the very beginning: “Like so many software engineers, I was a shy and awkward child, and I understood computers before I understood people.” As a child without social talents, Auerbach was drawn to coding in the 1980s. He was a kid messing about with computers for fun, at a time when such activities were themselves in their infancy. He wrote his first line of code, he recalls, to draw a square on a screen, in a language called Logo: “repeat 4 [forward 50 right 90].”
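For readers who have never seen Logo, the behavior of that one-liner can be sketched with a few lines of Python (an illustrative simulation, not Auerbach’s code): four moves of fifty units, each followed by a ninety-degree right turn, trace a closed square.

```python
# A minimal sketch of the Logo program "repeat 4 [forward 50 right 90]",
# simulated with plain coordinate arithmetic rather than a graphics library.
import math

def run_logo_square(steps=4, distance=50, turn_deg=90):
    x, y, heading = 0.0, 0.0, 0.0  # start at the origin, facing "north"
    points = [(x, y)]
    for _ in range(steps):
        # "forward": move `distance` units along the current heading
        x += distance * math.sin(math.radians(heading))
        y += distance * math.cos(math.radians(heading))
        points.append((round(x, 6), round(y, 6)))
        # "right": turn the turtle clockwise by `turn_deg` degrees
        heading = (heading + turn_deg) % 360
    return points

path = run_logo_square()
```

Four right-angle turns bring the turtle back to where it started, which is why the path closes into a square.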
Auerbach went on to work as a software engineer at Microsoft and Google. He also wrote a column on tech for Slate between 2013 and 2016. And, notoriously, he was one of several prominent journalists who, according to a 2017 BuzzFeed report, fed tips to the right-wing provocateur Milo Yiannopoulos. His experience makes Bitwise an informative work, and his past as a columnist makes it an opinionated one. Since it is also a survey of Auerbach’s life, however, Bitwise is inevitably shaded by the personality of its author, who came up through a masculine tech milieu that could also be antisocial.
Bitwise is a wide-ranging survey of computer languages and coding and their effect on people. A computer, Auerbach writes, is “a series of abstraction layers, one on top of the other.” In the old Apple layer cake, for example, BASIC sits over the DOS operating system, which sits over the hardware, and each layer abstracts the one beneath it. By the same token, Auerbach shows, algorithms today abstract human existence, particularly complex phenomena like gender and psychology, into computerized systems that are shaping our experience of life in frightening, clumsy ways.
In particular, Bitwise examines the dubious magic algorithms perform on large datasets. Auerbach’s best examples are the American Psychiatric Association’s Diagnostic and Statistical Manual of Mental Disorders, and the Myers-Briggs Type Indicator (MBTI) personality test. The first is widely acknowledged to be a guide to psychiatric illnesses and their symptoms, rather than a hard-and-fast rulebook. But Auerbach explains that some researchers have tried to use its criteria to collect, then crunch, patient data—a purpose for which the DSM was never intended.
Relatedly, the MBTI is a quiz that assigns its taker a four-letter acronym defining their personality. ISTJ, for example, stands for “introverted, sensing, thinking, judging.” The test was developed by amateur psychologists in the 1940s, Auerbach writes, and its blunt approach to self-knowledge has appealed to all sorts of casual users. (Auerbach describes himself as an “INTJ,” the arrogant, detached type.)
It also appeals, however, to computers. The MBTI sorts its takers into sixteen categories, two options for each of its four letters. Of course, sixteen is not a magical number that reflects a truth about humanity. It’s just a crude tool. But, Auerbach writes, “the MBTI publisher Consulting Psychologists Press, which markets and administers the personality tests to corporations, universities, and governments, claims that over 80 percent of Fortune 500 companies assess their employees with the MBTI.”
Because the MBTI segments people into such a delightfully simple grid of types, it has taken on a new identity. “The combination of its simplicity and its popularity makes it the ideal case study for how and why people classify themselves,” Auerbach writes. This “becomes a great deal more important once those classifications get fed into computer algorithms.” Tests like these “that offer strict division and regimentation of our self-identifications are one way in which we make ourselves more comprehensible to computers.” The ease of the numbers involved—the four letters, the sixteen categories—transmutes the infinite variety of human temperament into a corporate tool.
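The grid’s appeal to software is easy to demonstrate (an illustrative sketch, not drawn from the book): the sixteen types are nothing more than the Cartesian product of four binary letter choices, trivially enumerable by a machine.

```python
# Illustrative only: the MBTI's sixteen types are just every combination
# of two options across four letter positions, i.e. 2 ** 4 = 16 labels.
from itertools import product

AXES = [("E", "I"), ("S", "N"), ("T", "F"), ("J", "P")]
types = ["".join(combo) for combo in product(*AXES)]
```

Each label, including Auerbach’s own “INTJ,” is one cell in a grid a computer can index without effort, which is part of what makes such typologies so legible to algorithms.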
Auerbach is on solid ground in his analysis, but his logic can sometimes be crude and impersonal. In one chapter he presents a critique of Facebook’s many options for a user’s gender identity. If each user is required to choose among male, female, and neutral when they sign up, how ingenuous is the company being by letting that user change their label later? The data has already been collected under the original, now-outdated typology. He has a point here, but part of Auerbach’s critique is that language itself changes over time, which may render these typologies defunct in the future. “People may not be using terms like ‘pangender’ or ‘biracial’ in fifty years, much less two thousand,” he writes, either unaware of or indifferent to the fact that this terminology was hard-won.
I asked Auerbach on the phone what alternative he would offer to Facebook’s system. He explained that he was trying to point out a moment of hypocrisy and “false nuance” in Facebook’s practice. “You don’t want any single entity prescribing these taxonomies in such a degree as Facebook does today,” he said. “It’s not about what taxonomy so much as having one overpowering entity that is not giving people the opportunity to opt out, choose their own, or simply add the degree that they want.”
His overall point about Facebook’s taxonomy is that the world is a dynamic place, but computers “paradoxically enable us to revise and refine our categorizations even as they insist that we continue to make those classifications.” What is the relation between the user and the typology that turns the user into data, and to what extent is the typology speaking back to us to tell us who we are?
The point stands, but Auerbach’s slightly unfeeling tone also raises a further question about his role in public life. In the BuzzFeed report, Joseph Bernstein included screenshots showing emails from Auerbach to Yiannopoulos concerning the love life of Anita Sarkeesian, a primary target of the online harassment campaign known as GamerGate. I asked Auerbach whether he had written the emails; he denied it, though he told me that he had been in contact with Yiannopoulos over “a Wikipedia-related scandal.”
When we spoke, Auerbach called Yiannopoulos a “harmful troll.” He has said the accusations against him are “categorically false,” and he told me that he had checked the screenshots against his own account, and that they don’t match. His concern with BuzzFeed’s fact-checking process is that the screenshots were not sufficient for him to figure out where the emails came from. He does not, however, accuse BuzzFeed of manufacturing the material.
Whatever the case, Bitwise bears out the impression of Auerbach as an intelligent translator of the digital world with an insensitive streak. As he grew out of programming in his teens (“the web did not exist in any accessible form, nor were computers part of most people’s daily lives”) he moved towards the kinds of literature that soothe a troubled adolescent soul. Vonnegut, Joyce, Woolf, and the Oulipo giants Georges Perec and Italo Calvino loom large. “The difficulty and obscurity of Ulysses intrigued me as a teenager,” he writes, “much as that Logo ... program had as a child.” Auerbach developed a literary sensibility that responded to constraint and deduction.
As in literature and in code, so in relationships. One section of Bitwise explains how seven “software engineering maxims that saved my code have also saved my marriage.” For example, a couple must “Beta-test” (he and his wife waited ten years to marry), and practice “Fault tolerance” (remember that your understanding of your partner is only an approximation).
When I spoke to Auerbach about the personal element of his book, he told me that he wished he didn’t have to put himself in it at all—or wished, rather, that it could have been somebody else. The memoir form makes the book work, however, because it focuses a very broad subject through the microhistorical lens of a single person. In Auerbach’s case, the first-person voice allows him to sidestep the problem of technological determinism, a grand theory of the effect technology has on our minds. It doesn’t make this INTJ likable, necessarily. But Bitwise is a valuable resource for readers seeking to understand themselves in this new universe of algorithms, as data points and as human beings.