Last month ITHAKA hosted The Next Wave conference. We brought together people from both inside and outside the academy to discuss issues important to the future of education. Our broad theme was data, value, and privacy. As is always the case with ITHAKA meetings, we spent as much time projecting technology’s impact on the future as we did reflecting on how it is affecting us today. In this post I will share a few of the highlights and thought-provoking ideas from speakers at the meeting from outside the academy.

The Next Wave


Hilary Mason, CEO and co-founder of Fast Forward Labs, kicked off the meeting by describing how data has been used to improve a range of services. She highlighted Google Maps, which uses data to show traffic, as a great example of an application where the technology remains behind the scenes. She also described how New York City used data from 911 calls to position ambulances better and shorten response times, potentially saving lives. Mason demonstrated how innovative companies use a multi-step product development process: collecting data, building models to test hypotheses, delivering changes to users, and then monitoring effectiveness. This approach leads to products that are improved continuously. As Mason noted, this kind of product development calls for a new kind of employee: people who are great at using data to inform products but who also understand the ethical implications of their decisions.

Zeynep Tufekci, assistant professor in the School of Information and Library Science at the University of North Carolina and contributing writer to the New York Times, picked up on the more challenging ethical questions raised by the predictive capacities of data and their broader impact. Tufekci made an impassioned argument for transparency, arguing that the use of sophisticated data analytics means we are all “being watched.” Computers can increasingly predict who is likely to be depressed, who is likely to have a baby, and all manner of deeply personal behaviors and activities. We need to know when we are being evaluated in these ways.

Several of our speakers described the data being collected as falling into one of three categories. Conrad Rushing, director of engineering at Tumblr, talked about public data, private data, and secret data. Helen Cullyer, program officer for scholarly communications at the Mellon Foundation, talked about my data, our data, and transformed data. Implicit in these categories is a hierarchy of privacy that must be carefully managed by the data aggregator: not all data, and not all types of data, should be handled equally. It is easy for a service provider to find itself handling data more sensitive than it expected or is prepared for. For this reason, Rushing said that, whenever possible, organizations must design internal systems that prevent the mishandling of data. A first principle is not to save the data! (Libraries have adhered to this principle for many years by not saving circulation data.) As an example, he pointed out that responsible companies do not store passwords; rather, they store hashed conversions of passwords.
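Rushing’s point about hashed passwords can be sketched in a few lines. This is a minimal illustration using Python’s standard library, not a description of any particular company’s system; the scrypt parameters are illustrative rather than a production recommendation.

```python
# Minimal sketch: store only a salt and a hash, never the password itself.
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Return (salt, digest); only these two values are ever stored."""
    salt = salt or os.urandom(16)  # fresh random salt for each user
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password, salt, stored):
    """Re-hash the attempt with the stored salt; compare in constant time."""
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, stored)

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("wrong guess", salt, stored))                   # False
```

Because only the salt and digest are kept, a breach of the database does not directly expose users’ passwords, which is exactly the “don’t save the data” principle applied to credentials.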

Speaking of designing systems that use data effectively, both LaShawn Richberg-Hayes, director of research at MDRC, and Elizabeth Green, co-founder and CEO of Chalkbeat, talked about the importance of the big picture in designing research and product experiments. Richberg-Hayes emphasized thinking about evaluation up front and forcing oneself to monitor leading measures, not just lagging measures. Green explained that Chalkbeat relies on its mission to guide its theory of measurement.

Jaron Lanier, scholar, technologist, and author, offered an expansive review of technological trends and their likely impact on society. He commented on the emergence of “data monopolies” and compared the situation to the monopolies that emerged during previous economic revolutions in natural resources and transportation. In those cases, government intervened to break up the oil and railroad monopolies. He wondered, however: how do you break up Facebook? Isn’t its utility derived precisely from its being a monopoly?

Lanier expressed concern about these monopolies and the impact of the internet on society. Initially, the network seems to make things better for everyone, but in every case he is aware of, the longer-term outcome has been far less positive. He cited the Arab Spring, where the web enabled revolutions led by enthusiastic, internet-connected individuals but offered no substitute government. Similarly, in the music industry, it initially looked as if the web would make things better for musicians, but it has not turned out that way. Instead, the market has shrunk overall, and value (and money) has concentrated in a smaller number of artists. Rather than the benefits of the market being spread over many musicians in something like a normal distribution, huge returns are concentrated in a few (a Zipf curve). More important, from his perspective, the majority of the overall value goes to the enabling platform. On the web, musicians (or writers, or artists, or photographers) contribute their work yet do not receive commensurate value; most of it goes to the “platform.” In that context, Lanier compared online platforms to casinos: they are the “house” collecting the rewards. In the near term this does not worry him too much, because in his view the innovators currently responsible for these platforms have primarily altruistic motives; longer term, however, he is concerned, because there is no way to know the values of those who will control these platforms in the future.
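The concentration Lanier described can be made concrete with a toy calculation (my illustration, not his; the artist count and the Zipf exponent are arbitrary assumptions). If the k-th ranked artist earns in proportion to 1/k, a small fraction of artists captures a large share of the total value.

```python
# Toy illustration of Zipf-style concentration in a hypothetical market.
N = 10_000  # assumed number of musicians

# Zipf-like earnings: the k-th ranked artist earns proportionally to 1/k.
zipf = [1 / k for k in range(1, N + 1)]
total = sum(zipf)

# Share of all earnings captured by the top 1% of artists.
top_1_percent = sum(zipf[: N // 100]) / total
print(f"Top 1% share under Zipf: {top_1_percent:.0%}")

# Under a flat (equal) distribution, the top 1% would hold exactly 1%.
```

Under these assumptions, the top 1% of artists end up with roughly half of all earnings, which is the shape of the inequality Lanier was pointing to.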

Lanier then directed his thinking toward education, and more specifically online learning. Speculating on the value of adaptive learning systems and the ability of computers to replace faculty, he said that any knowledge or skill that can be learned exclusively on a computer will not have value in the future. The implication is that if it can be taught by a computer, it can be done by a computer. Everyone paused to think about that!

Mr. Lanier’s comments encapsulated much of what was discussed over the course of the two days. New technologies delivered over the internet are having many positive effects; however, we need to think very carefully about the potential negative ramifications. The conversations reminded us all not to focus solely on the direct, short-term benefits of technology but to remain aware both of the personal trade-offs, in the form of lost privacy, and of the wider societal impact, in the form of an unequal distribution of wealth.

Interested in attending The Next Wave next year? Let us know.