Esther & Eric chatting it up on the BT Big Thinkers Series
Posted by direwolff on October 15, 2006
Just got done watching a really interesting conversation/interview that Esther Dyson had with Eric Schmidt, CEO of Google. I’m always interested in hearing Esther’s perspectives, as she had been a lecturer at Theseus Institute, the MBA programme I attended in the south of France back in ’93/’94. She’s someone whose generally astute insights and punditry I have followed since my early days in the PC industry (1983). I was acquainted with Eric when he was still at Sun Microsystems, during my days at First Virtual Holdings, Inc. (FVHI), the first Internet-based payment system, back in ’95/’96. He was well acquainted with the founders of FVHI — Nathaniel Borenstein, Marshall Rose, Einar Stefferud (who ran the first mailing list, in 1975) and Lee Stein — the first three being designers of some of the early protocols of the Internet, which is how our paths ended up crossing. From the limited interactions I’ve had with Eric, he has always been a mindful communicator.
Three related topics from their interview struck me as worth exploring further. The first came when Esther brought up personalization, and Eric explained that “[a certain degree of] ambiguity is found in every query” (especially when dealing with the one or two words typically typed into a query box). He went on to say that Google believes that knowing more about a person (with their permission) will help address this issue, which seemed to support Esther’s thoughts on the matter. However, I disagree with this perspective, because even in our human interactions we require hints of context that have nothing to do with personalization in order to understand what’s being discussed or requested. One of the examples Eric used was “Brazil”, which can refer either to the country or to the play (there’s also a movie with this title). If only that word were entered into a search query, I don’t quite see how my clickstream, even if it were available for analysis, would be helpful. If at the moment I’m looking for Brazil I start at the query box, then there’s no real way of guessing what’s on my mind. Providing tools that can help me disambiguate the meaning may go much further than trying to divine my intention. Even if yesterday I had spent all day on travel sites, that should give the country no more weight as the correct response than the play.
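To make the distinction concrete, here is a minimal sketch (hypothetical sense inventory, not any real search engine’s API) of the “disambiguation tool” alternative: instead of guessing intent from a clickstream, the engine surfaces the known senses of an ambiguous query and lets the user choose.

```python
# Hypothetical inventory of senses for ambiguous queries; a real system
# would draw these from something like an encyclopedia or ontology.
SENSES = {
    "brazil": ["Brazil (country)", "Brazil (1985 film)", "Brazil (play)"],
    "jaguar": ["Jaguar (animal)", "Jaguar (car maker)"],
}


def disambiguate(query):
    """Return the candidate meanings of a query so the user can pick one,
    rather than having the engine weight them by past behavior."""
    return SENSES.get(query.strip().lower(), [query])


if __name__ == "__main__":
    # The user, not a clickstream model, resolves the ambiguity.
    for sense in disambiguate("Brazil"):
        print(sense)
```

The design choice this illustrates is the one argued above: the ambiguity is resolved explicitly by the person asking, so yesterday’s browsing history never tips the scale toward one sense over another.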
My hope for search technology is that it help us reach our intentions through tools that let us express our requests more precisely, not that it guess those intentions for us. I don’t want to interact with people who are mind readers, any more than with machines that try to achieve such a thing. I also feel this way because there’s a false sense of security built into these guesses: as people see one or two correct answers, they begin to believe that far more complex processes can be automated without their oversight. That’s when the problems begin. Note the false positives that came up in the Transportation Security Administration’s “no fly” list. The fact is that search technology makes for excellent tools to help us sort through lots of information more efficiently, but it should not be depended upon to supply the answer automatically, even if that answer is buried within the data. Time should be spent on tools that help us disambiguate our requests, keeping personalization out of it. Why keeping my personal clickstream out of this matters becomes clearer in a subsequent point I’ll discuss below. Sorry if I seem a bit of a Luddite here, but I believe that people generally know what they’re looking for and need the freedom to see the results of their requests, since much can be learned from those results even when the need for an answer is immediate.
Another personalization point that Esther pushed on was its importance in reducing click fraud (since we would know whether a click came from a machine or a person), or in enabling services to know that a person, rather than a machine, read something. Personalization as she described it here seemed to serve the interests of the party being interacted with (whether an advertiser, a service or a person) rather than those of the interactor, which is the perspective laid out in the AttentionTrust Principles, where the question being addressed is, “how does managing my [information] clickstream serve me?”.
The third point, which made me sit up and pay closer attention, was when Esther raised issues around healthcare data and the role that Google could play given Gmail’s architecture of private silos. Eric’s responses about the international legal issues were dead on, and it’s a relief that he has this understanding, but Esther’s questions seemed naively set in the mindset that often leads our government to say things like, “the people of this country have to give up their privacy in order to obtain greater security in these dangerous times”. While this may play in Mississippi, the framers of the Constitution foresaw and rejected this sort of argument. Specific to this interview, Esther suggested that if Google or another company could aggregate people’s healthcare information anonymously, this could be a great way to diagnose and address the spread of epidemics and disease. While she’s absolutely correct on that point, the other issue at hand, which Eric raised, is that there are ways to identify the people behind anonymized information, not to mention the temptation for future governments to seek access to it. Hence, reducing our privacy on the promise of a better future is not the sort of argument we should succumb to, however tempting it might seem. My concern was that she seemed to be insisting on this model of aggregating information across private data stores because of the greater good it could offer. That is a dangerous argument, one used to strip our constitutional rights from us in terrorism-related laws, and I’m concerned that her perspective could influence others committed to seeing such systems become reality.
All in all, though, this interview was a fun hour spent listening to two smart people converse about important issues of our times.