What’s on your wrist right now? For many of us it isn’t a watch or a simple bracelet; it is a piece of wearable technology that probably tracks the number of steps you take, the distance those steps have taken you and the calories those steps have helped you shed. Welcome to the fringe of the self-tracking era. Whether it is a Fitbit or Nike FuelBand or Jawbone Up or even the new Misfit Shine around your wrist or clipped somewhere, you are part of the quantified self movement, and there are more joining every single day.
For the die-hard self-trackers out there, this is not a new phenomenon but for many, new and old, the pervasive nature of mobile has made it easier. When I was a kid, I used to count steps: I knew how many there were from my house to my school, from the bus stop to my work, from the bedroom to the bathroom. Distance was measured that way, charted in my head and calculated as a daily running total. Today those numbers and calculations are relegated to my wrist and tracked through my smartphone.
This is a fascinating time of discovery in the mobile/wearable industry as companies big and small try to figure out the right strategy for entering this market, and why. The race is on, and this episode gives context to it through its history, with a glimpse of where it’s going.
Key takeaways from this episode. Click on a link and the video will take you to that clip.
Rob: Hello everybody, and welcome to UNTETHER.tv. I’m your host Rob Woodbridge. Look, for 400 and something episodes, all my life it seems that we’ve been driving towards this thing around mobile. And it’s a crazy, crazy revolution. And now I believe that you believe that this is actual reality.
Now the question is, what is the implication of us carrying these devices, of you carrying this device everywhere you go, doing everything that you do, and eventually paying for things, keeping all of your loyalties and credit cards inside of your wallet and maybe your key and your identification? These are big topics that I don’t believe we know the answer to yet. The implications of which haven’t been felt.
So what do we do? We’re going to talk to Nora Young, who is the author of “The Virtual Self” and a host on CBC, the host of the very popular technology podcast and show called Spark. She is in Toronto, I am in Ottawa, we will drop the tension between the two cities and focus on the things that are needed to understand what the implication of these mobile devices are on the virtual self, big data, and how it’s going to impact us going into the future.
Nora, thank you so much for being a part of this. Really appreciate it.
Nora: My pleasure, a truce for our virtual rivalry.
Rob: It is, it’s not hockey season, so it doesn’t matter to Toronto, right? We can love each other today, but as soon as October 1st rolls around, it’s Toronto and Ottawa, we cannot speak to each other. But I just want to bring a little screen up, this is Nora’s book. It’s called “The Virtual Self: How Our Digital Lives Are Altering the World Around Us.” Fascinating read, I loved it. Made me think about a ton of things. And I want to surface a lot of that as we go through this episode today. But before we do that, I talked about you, gave a quick overview, but who are you? What do you do? CBC, Spark, the book?
Nora: Well, I’m a journalist and I focus really on the intersection of technology and culture. My show Spark is going into its seventh season this weekend, in fact.
Nora: Amazingly, the seventh season. It really is about technology and culture. We don’t tend to do product reviews so much; we do things like, how is this changing the shape of our society? How is this changing the way we work, live, go to school, date, all that kind of stuff? Not that there’s anything wrong with product review shows, it’s just not what we do. And I come to that with a background in arts and culture, but a lot of new media stuff. So I’ve been kind of swimming in this sea for a long time, and I came to CBC with the idea for Spark about seven years ago, and I love it.
Rob: Spark is fascinating. It’s where I heard about Raspberry Pi. It’s where you surface this, what I would think of as the fringe of technology, and how it intersects with the way that we live. When Raspberry Pi all of a sudden became part of everybody’s lingo, I thought back to the Spark episode. I thought, “Well, you know, I heard that first on Spark.” Was that the goal?
Nora: Yeah, I think our thinking is that you can look at sort of the fringes, almost sometimes the odd edges where people are doing something that seems a little bit freaky, but that is actually where the new normal is going to be in, I would say, a year to five years. That’s sort of the time frame we’re going for.
When you think of things like Raspberry Pi or something like that, when I hear from people, I think for a lot of people, maybe people who aren’t as technically savvy as you are, there’s a lot of anxiety about technology and how quickly things are changing. What I would love is for people to be able to say exactly that kind of thing, when the topic comes up at work or in conversation, that they are like, “Yeah, I’m comfortable, because I know that from Spark; I heard that six months ago or a year ago.”
Rob: It must be hard. I do this, I try to surface some of these new companies, these new emerging trends in technology, especially the mobile space. But it’s very difficult to find those companies that you think what they’re doing is monumental. And then sometimes when I sit down and have a conversation on UNTETHER, I have the conversation with somebody where I’m not really sure where they fit in, but by the end of it, we’ve surfaced something that is incredible. This is life changing or altering. They’re a small company in Lincoln, Nebraska, and they’re never going to materialize in the market, but that market will materialize. How do you do it? How do you find these guys? Tell me your secrets, Nora.
Nora: Sometimes it’s the middle of the week and you’re looking at this space on the board where, you know, it’s like, “Well, there’s eight minutes that’s got to be filled with something.” So you do have these little panicky moments. And I mean, obviously sometimes you’re wrong. We’re a small group of people, but we all work really hard at trying to find that spot where maybe it’s not going to be the next Apple, but it tells you something about the way we live now, or the way we’ll live five years from now. So you just keep looking and keep looking.
Rob: Yeah. You do a great service demystifying this world of technology, because it is confusing. And I don’t think there is a more confusing thing going on right now in our world than this convergence of self and data, what that means, and the implications of that. I mean, we’re starting to see it with all that’s happening in the States, and we’re starting to see this with all of the tracking software that’s out there. There is a balance between good and evil, and it’s about demystifying that and making sure that people are clear on the implications of what they do.
Rob: Like an 18-year-old being too liberal on Facebook, right, and the implications 18 years down the road when they might be running for office. So there is this challenge, we have to educate very quickly, don’t we, in this world?
Nora: We do, and I think one of the challenges like in the journalistic space anyway, is that things really do tend to get broken down into technophiles and technophobes, and left and right, right? But to me, technology is, I mean you can’t really break it down to this is great or this is terrible, right, because it’s both.
Nora: And that, I think, is where the most intelligent conversations happen, but unfortunately a lot of it really does get broken down into either this kind of hysterical, ah, Facebook kind of side, or yay, everything new is fantastic, and if you say otherwise, then you’re embroiled in a moral panic, whatever, right?
So I think that we have to be a lot more mature about it, and think about things, as you say, and a lot of the surveillance stuff is pointing us to the darker side, I think, and I think part of the challenge is things are changing so quickly. And one of the things I felt when I was writing the book was that it’s almost like these little pieces of this jigsaw puzzle have come together, sort of from the bottom up. It wasn’t like suddenly there was this top-down decision, hey, we’re going to live in this always-on world; it just kind of started to happen, and then social networking took off, and cellphones took off, and all of this other kind of stuff happened.
And it’s only now that we’re starting to see, wait a minute, we’re starting to live in a radically different kind of way, and we have all these social norms and mores and standards and laws that were designed around a different time, and we are still in the process of playing catch up.
Rob: Yeah, I’d like to say, I think it’s inadvertently. We got here inadvertently, right, and now we’re dealing with the consequences. I just want to clear something up here, is that you will find neither a mature nor intelligent conversation here, okay, so I just want to clear that up. I want to set expectations here. I will be taking lots of notes, and you can bring the maturity and the intelligence into this conversation, because I’m lost.
All right, so the idea for the book, obviously it stems from what you get exposed to, such great stories, through Spark and through being a journalist, and inquisitive, and the things that you’ve surfaced. I mean, I want to touch on a number of those things, but what was the idea behind the book? Why write a book in this world that we are in right now? Start there.
Nora: Yeah, I . .
Rob: Attention deficit, right?
Nora: It is a huge thing, why even write a book when things are changing so quickly. First of all I wanted to say, let’s take a step back, and maybe the particulars, or the particular examples that I’m using, might change, but the underlying currents have a longer shelf life, I certainly hope so anyway.
The idea that I started with was, and I actually pitched the idea way back in the fall of 2009, so at that time you weren’t really talking that much about mobile, as much as you were just talking about things I’m starting to see, again at the fringes.
Like we did this story, my colleague Dan [Meisner] brought to the table about this guy Nicholas Felton who later went on to work for Facebook, but at that time was, we interviewed him because he was involved in this really extreme example of self-tracking, he was basically tracking everything he did in his life, like how many pops he drank, how many times he played poker in a year, and every year he would put out this “annual report” of sort of a summary of his life in numbers.
And at the time we thought, well, that’s interesting. Probably that’s saying something about the culture, even though nobody was doing things that extreme at that time, or almost nobody. I started to feel, as I thought about it more, that you could see the beginnings of this kind of thing, of the desire of people to kind of keep a statistical record of what they are doing.
And I have to say that, as I was researching the book, it started to build and build, and smartphones kept becoming more ubiquitous, and even since the hardback came out, I feel like nobody would blink an eye now if you say, boy, we’re all living in the quantified self world, because every second person is wearing a Nike FuelBand or tracking something. There you go, right. It seems like almost everyone’s doing that.
So it just seemed like we were at this sort of point where, what started as a fringe activity was about to become very normal, and where you could see the pieces of the puzzle coming together so that self tracking is not just this individual habit, but that it’s something that, culturally and societally we’re starting to do. Particularly as our analog practices are replaced with digital practices, right. And as we move from paper and pencil, metaphorically, to digital, we’re starting to use devices that, in quotes, know how they’re being used, so that part of the future of being digital is that the technologies we use know how they’re being used, right?
And so once you start thinking about that, and once you start thinking about, it’s not only the thing I carry in my hand, it’s all the little bits of pervasive computing power out there in the world around me that are picking up information, from air quality to where people are going and all this other kind of stuff, you start to see this is a statistical world. It’s not just a statistical self. It’s the society.
Rob: It strikes me. One of the great things in the book, among other things, is kind of the realization that I’m a part of this. We’re all a part of this. We’re carrying these devices. Anyone who’s carrying these devices understands. And anybody who uses, like, Waze and contributes to the greater good of other people in a car kind of has an inkling of this.
But the way you described it is so astute. And I’m not gonna reread your words, I’m just going to paraphrase a little bit: in that transition from paper and pencil to these devices and the smartphone, we’re kind of losing our self, aren’t we? Because we’re not quantifying our emotions and our feelings the way maybe we used to before technology allowed you to do this.
What we’re doing is we’re quantifying the number of steps and the number of calories we consume, if it’s convenient and easy for us. And if it’s not, if I have to think about how I’m feeling and write it down, I shove that aside. So there is a real, I think there’s a societal impact here when all of a sudden, you’re not quantifying the self, you’re quantifying the step.
Nora: Yeah, yeah. Well this is, just to get back to where we started out, talking about technology being neither good nor bad, right. I think that it’s very much a mixed bag, what we’re starting to do. And to me, some of the dangers of it are, exactly as you say, that we sort of risk losing what we can’t quantify, you know? I think that’s a problem, right. In a way, as a culture, we’re a bit drunk on the numbers. And if we kind of reduce our lives to what turns up on a beautiful graph or on a spreadsheet, then we’re missing what about me is not turning up there. That, I think, is a bit of a danger.
And I do think that there’s a tendency to be a bit surfacey or, to use a five dollar word, performative about it, you know? I mean, on that sort of comical negative side, we’ve all seen those things that people like putting out on Twitter like, yay, I just ran five kilometers, or whatever, right. That sort of humble bragging that people do on social media.
But even not in that sense, I think there’s a sense in which we want to kind of look at, and I think that’s a very fundamental human emotion, this desire to kind of look at ourselves and say, that’s me. Hey, that’s who I am. To tell a story about ourselves. But of course, the story of ourselves isn’t just the numbers.
The story of ourselves is all those highly subjective things, the way we understand ourselves, the way we reflect on what happened to us during our lives. That can’t actually be quantified. And one of the people that I talk about in the book, the academic Viktor Mayer-Schönberger, talks about this idea of memory. And I think in some ways we’re losing this distinction between memory as capturing the facts of what happened, right, when you can go back and revisit the conversation that you had with somebody or the argument you had with your boyfriend or girlfriend, you know, ten years ago.
And you can capture the facts of it versus the subjectivity of how you feel about it and how that changes over time, right? And I think that’s a real danger when you start recording everything, that you start to miss that distinction between recording and remembering, which is a really important difference.
Rob: I love that. Your statements just around the perception of who you want to be; ultimately that is what you’re pushing out there, you know? You’re not gonna put out the negative, right? Nobody puts out the fact that they’re in Cornwall. If I’m in Rome, you’re going to get a thousand photos. If I’m in Cornwall, you’re not going to get anything. You kind of push out who you want to be perceived as. The way I look at this is: look, Facebook is who you want your friends to think you are.
And, all those social networks, the same thing with Instagram and with Twitter, are who your friends and business partners and associates, who you want them to think you are. And, then your true self is what you type into the Google Search bar, right.
And ultimately that’s the distinction, I think these devices allow us to lend credence to that side where it is the perception of who I am versus actually who I am. Except for the self trackers, the guys with cameras mounted on them that are following them wherever they go. They can’t hide.
And, throughout your research for the book and what you’ve seen since the book was published, obviously like you said earlier there’s been a [pull] from mobile, right. And during the Internet days it was a push. You had to convince somebody that they needed a website.
You had to convince somebody that they needed to get online. But with mobile we kind of woke up one day and there were five billion connected devices around the world, and smartphones accelerating, and everybody using and experimenting.
Obviously this had a huge influence on the way that we use these to self track and to create our virtual self. How big is this phenomenon? How far does this stretch? Is this like a North American thing? Or are we seeing this around the world?
Nora: Well, I mean, one of the things that I found most interesting in the book is uncovering research about using cell phone data, in particular in the developing world. And in this case we’re generally not talking about smartphones. We’re talking ordinary feature phones.
But there’s been some really interesting research being done by people like Nathan Eagle and people at MIT into the idea of, if you can anonymize the data about where people are, what are some of the things that you can learn. And so they used it to do things like track, not even track, but predict where malaria outbreaks are going to happen based on the movement in anonymized cellphone data. And this is a feature of how quickly cellphones have spread through the developing world.
So, you take a country like Kenya, for instance, and cell phone penetration rates among people, I think between the ages of 15 and 16, are very close to a hundred percent. So you’re talking suddenly about this location data being potentially very useful, precisely in contexts where more formal information might not be available. There has been an incredible use of cellphone data in Haiti, after the crisis there, to track things like the spread of cholera and where people are going and that kind of thing.
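A rough sketch of the kind of analysis Nora describes, predicting which regions an outbreak may reach by aggregating anonymized movement data, might look like this. All region names, trip records and case data below are invented for illustration; the real research is far more sophisticated.

```python
# Toy sketch of outbreak prediction from anonymized movement data:
# count trips from regions with known cases to other regions, and rank
# destinations by how much traffic they receive from affected areas.
# All data here is invented for illustration.

from collections import Counter

# (origin, destination) pairs derived from anonymized call records
trips = [
    ("lakeside", "hilltown"), ("lakeside", "hilltown"),
    ("lakeside", "port"), ("inland", "port"),
]

affected = {"lakeside"}  # regions with confirmed cases

def at_risk(trips, affected):
    """Rank unaffected destination regions by incoming traffic from affected regions."""
    inflow = Counter(
        dst for src, dst in trips
        if src in affected and dst not in affected
    )
    return [region for region, _ in inflow.most_common()]

print(at_risk(trips, affected))  # ['hilltown', 'port']
```

The point is only that simple aggregate movement counts, with no individual identified, can already suggest where to direct attention.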
And obviously it raises huge questions about privacy, and how you protect people’s data, and whether anonymous data will really and truly be anonymous, or whether there are risk factors there. And I think the research suggests that there are risk factors there. But I think you’re looking at something that in fact might be really transformative, once you start to think about what you can use this information for.
Rob: We’re seeing this from a commercial standpoint. A lot of people are trying to turn around and take the location data and use it to sell goods, or to geofence, or to get trends, traffic trends, and people trends.
When you look at what happens in developing nations versus what we’re doing in North America, do you feel kind of embarrassed like I do about how we’re kind of superficially trying to use this data to line our pockets versus what they’re actually doing in developing nations?
Beyond tracking the spread of malaria, there’s also education and training, and learning where people have never learned before. Do you feel kind of superficial being here?
Nora: I’m very excited about the altruistic things. I mean, as long as those commercial relationships are transparent and on the up and up, I don’t see a problem with it. And I think the really interesting space, from a commercial point of view, is when you look at things like Waze. If you look at the potential to actually combine a useful service to individuals with something that can potentially make money for the people who design it, with something that actually has social benefits, that I think is a really interesting space.
Because what I think is going to happen culturally is that, as people start to track their data more, they’re going to start to become a little bit more proprietary about the data, right? I mean, once you realize, “oh, I can learn all these interesting things about how I read, and my exercise patterns, and how much I weigh, and what I eat, and all this other kind of stuff,” I think people are going to start to think, if it’s worth something to me, what is it worth to other people? What is our data collectively worth?
And I think the really great space is if you can think, a lot of really interesting examples of this, of taking something that has an individual value, a commercial value, and a social value, and that I think it’s really going to be the key spot, as people start to think about the value that their data has, that I think has a huge potential from a commercial point of view.
Rob: Do you think that, I mean, I’m one of these guys that when they ask me for my postal code at IKEA, I always say, okay, what’s in it for me, right. I’m going to give you some data that’s very valuable to you, which is my postal code, it might be just one of the one billion you collect every year, but I constantly think about it, right? I think about my data, that information is worth something.
So how do you think this plays out when a lot of the stuff is collected anonymously, through the carriers, through the browsers that you use, through the location tools that you use? I mean, I use an app called Moves, I rave about this app. Moves tracks places and locations, and steps as well. I use it to remember where I’ve been, but that data sits somewhere in Sweden in the cloud, and at some point it’s going to be valuable to them, and I constantly think, am I going to be able to make a living off my data at some point?
Rob: Like do you think that will happen?
Nora: Well, there are people who are arguing for that kind of thing. Jaron Lanier’s latest book, what the heck is it called, “Who Owns the Future?” I think it’s called. He’s arguing exactly that: maybe it’s the case that that’s gone out the window, but maybe we can re-jig the system so that we have a series of micropayments, for instance, that can allow people, if not to make a living, at least to supplement their living doing that.
So people are making that kind of argument. My feeling about it is that there is nothing more inherently problematic in gathering data in exchange for service than in being a television broadcaster, and saying, I agree to provide programming, and the quid pro quo is you agree to be exposed to advertisements, right? There’s an implicit relationship there. And I think as long as those relationships are transparent, it’s not problematic.
The challenge, I think, and I talk about this in the book, is that we’re not yet in that world where we have transparent, clear rules about who’s doing what with the data and so on, and I think there are a number of reasons for that. I think one reason is that it’s sort of in the nature of startup culture to think, okay, let’s think of a great idea and try to get users, and then we’ll figure out how to monetize it afterwards, and if you don’t know what you want to do with the data, then it’s pretty hard to have a transparent relationship. I think that’s one problem.
I think another problem is just, you know, and one of the people I talk to in the book, [inaudible 00:23:16], talks about this, is that when we’re relying only on the standard form contract, the “I agree” thing, that’s not really sufficient. We need to recalibrate that relationship, and what the nature of those contracts can be and should be.
But the other piece of the puzzle, I think, is that it’s almost like a math problem. I don’t think we quite know what the mathematical implications of all this stuff are. When you start to put together, not just one data set, but 18, 25, a thousand data sets, what is anonymous and what is not? And there’s been some really interesting research done precisely about this.
I mean you probably follow the Netflix example . . .
Nora: . . . where they had a contest to improve the algorithms that offer people movie recommendations, and they released this huge “anonymous,” in quotes, data set of users, and researchers, I think at the University of Texas, were able to pair that with another data set and identify, not everybody, but some people.
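The linkage attack Nora is describing can be sketched in a few lines: join an “anonymized” data set against a public one on shared quasi-identifiers until an anonymous ID matches a name often enough to be confident. All the records, names and the matching threshold below are invented for illustration, not drawn from the actual Netflix research.

```python
# Toy linkage attack: re-identify users in an "anonymized" ratings dump
# by matching rows against a public profile list on shared quasi-identifiers
# (movie, rating, date). All data here is invented for illustration.

anonymized = [
    {"user_id": "u1", "movie": "Heat", "rating": 5, "date": "2005-03-01"},
    {"user_id": "u1", "movie": "Big", "rating": 3, "date": "2005-04-11"},
    {"user_id": "u2", "movie": "Heat", "rating": 2, "date": "2005-03-02"},
]

public = [
    {"name": "Alice", "movie": "Heat", "rating": 5, "date": "2005-03-01"},
    {"name": "Alice", "movie": "Big", "rating": 3, "date": "2005-04-11"},
]

def link(anonymized, public, threshold=2):
    """Map anonymous IDs to names when enough (movie, rating, date) rows match."""
    matches = {}
    for row in anonymized:
        for pub in public:
            if (row["movie"], row["rating"], row["date"]) == (
                pub["movie"], pub["rating"], pub["date"]
            ):
                key = (row["user_id"], pub["name"])
                matches[key] = matches.get(key, 0) + 1
    return {uid: name for (uid, name), n in matches.items() if n >= threshold}

print(link(anonymized, public))  # {'u1': 'Alice'}
```

Two matching rows are enough here to tie the anonymous ID back to a name, which is essentially why combining “anonymous” data sets is so much riskier than any one of them alone.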
So I think there are multiple layers that have to get sorted out here, but first of all, the genie’s out of the bottle, as it were, right. It’s pretty hard to sort of roll things back and say, hey, we’re not going to do mobile, we’re not going to track anything anymore, because we’re doing it. But also I think we just have to think more clearly about it, because that information can be very useful. It can make our communities run better. It can help us as individuals too.
So I think we’re just not, it’s like I said before, the situation is changing so quickly that the framework around it in some ways, I think, hasn’t quite caught up with it.
Rob: And it takes a company like Path or a company like Apple to infringe on something, to push the limit, even if it’s for the betterment of the product. Right? Apple’s fiasco was that they stored the location in an unencrypted file on your phone. And the reason they did that, if you believe this, and it takes a little bit of faith, and I believe it, is that it enabled the location aspect, the GPS aspect, much faster than if it had to go and find a signal and a connection. And it drained your battery less.
And Path, they just thought it was a cool service to be able to enable you to invite more people into your circle in Path. Just turns out that, well, don’t do those things. And that’s how we find the boundaries.
But do you think that that adage that Facebook has kind of realized and woken up to, and actually in not so many words admitted to, is that you are the product? If it’s a free thing that you are engaged in, which is most mobile apps now, freemium, the adage is that if you’re not paying for it, you’re not buying a product. You are the product. And do you think that can be sustained in this world, where we’re going to get much more concerned about the data that is out there about us?
Nora: Yeah, I think there is a little bit of a push back. I mean, you’re starting to see things like data lockers and some data activism. And again, it is happening at the fringes. So there’s a guy, I believe he’s in California, who had a pacemaker and defibrillator. And what he has been pushing for is access to his pacemaker data. And when he first approached the company about this, they said, “Oh, that’s our data. Our data.” Right?
And so now he’s like, “What do you mean? That’s my heart rate.” And that’s a bleeding edge case, but I think you’re going to start to see more and more things like that. What seems weird now is not going to be weird in two years, five years. So there is push back, I think, coming as people feel those kind of uncanny, creepy moments. Like, “Ugh, I didn’t know you knew that about me.” And we’ve all had those moments. Right? Even if you know the information is out there, suddenly it appears in a context that you weren’t expecting. And it freaks you out. Right? It gives you the creeps, and the hair stands up on the back of your neck.
Rob: You look behind you and see okay, someone’s here. Right? Yes.
Nora: Exactly. Exactly, right? And I think the thing is that the mores around that are changing so quickly that we don’t even necessarily know where those points are. You can have a company that thinks, “Hey, that’d be a cool feature if we did that,” and not realize that people are going to flip out about it, because who knows what people are going to flip out about? Right? Because things that creeped us out five years ago are not creepy to us anymore. Right?
So I think we’re in a very interesting-, I don’t know what the answer to that is. I think we’re in a very interesting space in terms of negotiating the relationship. And again, to talk about it from the commercial point of view, I think there is a space that’s available and open, at the very least, for being people who are trusted sources for keeping your data. Right?
Especially if you think about areas where there’s bound to be huge growth in self tracking, like health. Right? Being able to say, “Okay. You’re storing your data with us about your exercise and your health and all the stuff that your insurance company would probably love to get its hands on. This is the nature of the relationship you have with us. It’s completely transparent. This is exactly what we’re doing with your data. This is how we’re protecting your data.” All that kind of stuff.
And I think it’s going to be a mixed bag of solutions. I think in some cases, maybe people would be willing to spend a reasonable amount of money, like whatever it is. I don’t know $20 a year or something for services that they think are-, it’s really important to keep private. Like healthcare information. And then a lot of stuff that is probably somewhat more trivial information that people are pretty comfortable just-, well really if you want to track my stats, that’s fine. Go ahead. I don’t care if you keep that data or not. Right? So I think we’re going to see a multiplicity of solutions to that as this kind of shakes out.
Rob: Health is very interesting. I interviewed the founder of something called Ubiqui Health. And I think it’s just an example of a small startup that, that focused on one thing which was migraine sufferers. So they created an application that allowed you as a migraine sufferer to document the things . . . your life basically. Track your life. So that they could look for . . . and they got 40,000 people to sign up for this field trial, which is a huge number when you think about it.
They committed to doing this for 90 days. And they tracked the food that they ate. It took in the barometric pressure, the temperature, and the humidity, and everything that triggers migraines, like sleep as well, and diet and exercise. And then they combined all that, and they did it on behalf of big pharma. Right? So that they could look for triggers, a swath of triggers, that they could then look at to see if they could get into some preventative medicine.
And I think that, as long as you have an open relationship and an understanding that that’s how the data is going to be used, that is a tremendous value to migraine sufferers, if they can stop eating parsley four days before the humidity reaches 88 percent. And then I think the thing that’s missing is some kind of notification that rings in your house that says, “Don’t eat that, because you’re going to get…” There’s a finish here that we don’t have.
But you can see that that’s a good use. And then I think of the things you mentioned around insurance, and the information that could be collected about your driving and your speeds and all of those things. Basically, your life’s black box, which is what we’re talking about here, and the implications of that.
I think the average consumer, the average person in the center, thinks, “Oh my god, that’s going to be terrible”; the person who smokes over here thinks, “No, I don’t want my insurance company to know”; and the person who’s very athletic over here says, “Yeah, I want a discount.”
But that stuff starts on the fringe. At some point, five years from now, do you think, Nora, it will be in the center and everybody will accept it as a norm, like we did with Foursquare? Because five years ago, the idea of checking in and telling people where I was going to be wasn’t even on the radar. So do you think we’re heading in that direction?
Nora: Yes I do, I do, and I think you’re already starting to see that with these experiments offering people discounts on their insurance if they’re willing to have devices in their cars that track the way they drive. And again, I’ll be interested to see how this shakes out. Does it end up being a benefit to the consumer, or does it end up being more of a punitive thing, like “You’re not going to get insurance unless you agree to have this device in your car,” right?
I mean, there are all kinds of ways this could potentially shake out. I think your migraine-tracking example is a really good example of exactly what I’m talking about, where the potential is there for something that’s personally beneficial to individuals and beneficial to the broader community. And yeah, I absolutely think that’s going to be totally, totally normal in, I don’t even think five years. I think two years.
Rob: We’re moving ahead that quickly. And imagine the scenario when everyone has a smartphone, or something like it, with your wallet attached to it. This is where you start to think of the implications of these things. Say you want to eradicate drunk driving, at least in North America, where the smartphone population is large enough. We’re over 50 percent, and still growing.
Canada is a perfect example. We could do it in Canada. As soon as we have whatever that mobile wallet looks like, and whatever mobile payment system you’re on, and your location, and all the data feeds into one spot, you could essentially be prevented from starting your car because you’ve bought four drinks at this bar within the last three hours, right?
So you sit in your car, you try to turn the key that’s powered by your phone, and it looks up where you spent your money and says, “No way, Jose, we’re not going to let you in the car. We’re not going to let you start it.” It’s Big Brother, but it’s also societal benefit, and at some point, thank god, those clash in a big way, and something breaks through, obviously.
Nora: Yeah, and I think it’s a really interesting question. If you’re interested in following this up, Evgeny Morozov’s latest book is about a lot of this exact question: yes, there’s a lot of social benefit, but at what point does it become like rats in a Skinner box, right? You’re doing social engineering to produce certain outcomes that are beneficial, but in some ways you’re denying people’s subjectivity. You’re denying their ability to make their own kinds of choices. I think that’s, in a way, a profound change that all of this stuff points to.
And it does get to, as we start to get to this big data world where we can make all sorts of correlations about people’s behavior, it does raise a lot of what are almost spiritual questions, really, about how much we want to encroach on people’s own ability to make choices about their own lives. And the other thing, too, is that we have to be careful we don’t overstep the bounds of what we think the data tells us, right?
Rob: Does that mean we don’t want to assume too much?
Nora: Well, I mean, I think humans are very good at reading things into things. Data can suggest things, but maybe there are alternate explanations for it, right? I was thinking of this the other day, when Google prompted me, saying, “Oh, your trip to the office is going to be delayed by a minute because the traffic has changed.” But I walk to work, and I thought, “That’s right, it’s making assumptions about how it thinks I get to work,” right?
You can imagine a scenario where it’s not just one thing; it’s all of these smart tools in your life making assumptions about what you want to do. What happens if they’re not right, and how frustrating could that be, right? That’s a danger, too, whether it’s the level of personal annoyance or the level of social harm of making wrong assumptions about what people are doing, based on what the data appears to be showing.
Rob: Yeah. Sometimes I think, you know, we’re in this industry, so we look at big data and think of the wondrous world we’re about to embark on. But there’s something called serendipity, isn’t there? Like all the dating services about compatibility, bringing together people who are deemed compatible by data. It takes that serendipity out of a bar meetup, right? Where you work through the differences and you learn over time to work with somebody, to be with somebody for a lifetime. Are we going to lose that serendipity if we go down this path too far?
Nora: Yeah, and it’s something that’s come up with self-tracking for sure, but even with search: that idea that if you’re always being given what you want, how are you going to find the thing you didn’t know you wanted until you found it, right? That’s a beautiful thing, particularly about living in a big city and walking down a street you’ve never been down before. I was thinking of this the other day with travel, you know? You go to a different city, maybe you don’t speak the language that well, and you get lost and have to stumble around and ask people for directions. That can be frustrating and it can be scary, but it can also be a really great thing, and you can learn a lot from those accidental moments. You can make new friends because you had to actually talk to somebody instead of looking at the directions on your iPad, right?
So there are downsides to that. And I think you have to remember to leave yourself open to turning the device off, to wandering through a city instead of asking what Urbanspoon tells you about where you should eat in this particular neighborhood, you know? There’s a real value in that. And humans, we’re all pretty much control freaks, and if we can control for all contingencies, I think a lot of us want to. But in some ways, the most exciting things are in chance.
Rob: How do you replicate that in a mobile, or you know, in a data environment?
Nora: There could be an app that gives you bad information, so I could go wandering off on my own.
Rob: That would get me lost. And I think that’s what Apple was trying to do with the first Apple Maps, right? “Hey, we just want to bring serendipity back. You might end up where you’re supposed to be, or you might not, and it’s going to be an adventure.”
Rob: They’re smarter than us. That’s the key. I often wonder about that: when you jam so much data into something, is there just going to be an app that, one day, doesn’t set off your alarm because it’s made this determination about the world around you, the traffic, the smog, the temperature, and your mood based on how much REM sleep you got the night before? It just says, “We’re not going to set your alarm, because you shouldn’t get out of bed today. That’s just our advice. All this data tells us not to get out of bed today. Just enjoy your sleep.”
And part of me thinks that’s the perfect, clean world, and the other side of it is that there is no dirt, right? Hand sanitizer and no kids playing in mud is what it means in my head. So I’m stuck in the middle here, because I see the future coming and there are so many benefits to it all. But on the flip side, I don’t want my kids to be that sterile when it comes to making decisions, right?
Nora: Well, and there’s also this question of how you learn and become an adult if you’re not making mistakes. I mean, the analogy with kids is really true, right? That’s how kids become adults with good judgment: they make mistakes and realize, “Oh, I shouldn’t have had ten candy bars, because now I feel sick,” you know? The analogy is kind of with parenting.
If you have a parent who never lets their kid do anything, or make a mistake, or fall in the mud or whatever, that kid isn’t going to mature into a responsible adult, right? By analogy, if you let your apps make your decisions for you, how are you going to exercise good judgment, and what are you going to do when there’s a power failure and you don’t have access to your apps, right?
Rob: There’s nothing, exactly. How do I live?
Nora: What am I supposed to do?
Rob: You have to look up from your screen. It is very interesting, but I think there’s a balance, right? There’s a generational thing here. I’m 43, and I look at 19- and 20-year-olds and the way they interact with their screens and do their things. And I mean, I use this phone as a lifeline. It’s how I gather my information. It’s my data. And I don’t really mind that it tracks me, to an extent, because what I do digitally, I do in the open. Right? Very, very, very much in the open.
And I never want to be hindered by that, but I definitely worry about these moments where the next generation just has unfettered, unfiltered feelings, pushing all this stuff out into the ether. I wonder what it means for them in 20 years, when they’re getting to my age: will this be the norm, or will they have regrets about how much information they’ve shared throughout their lifetime? Because it can be used against you at some point, and it will be.
Nora: Yeah. I think you and I are both getting into that “You kids get off my lawn” territory.
Nora: But I think that, in a way, there is the potential for that kind of problem. I mean, my gut tells me that unless you’ve done something really egregious, we’re all going to be in the same boat. And I’m hardly the first person to say this: at some point, everyone is going to have some embarrassing things out there about them. So maybe we all just kind of learn to live with it.
The other thing, not to get too wool-gathery about it, is that the idea of privacy you and I grew up with, which is now changing and morphing, is not something that’s been with us since the beginning of time. It’s itself a fairly new invention, a few hundred years old, and over human history we’ve had different notions of privacy.
If you were living in a village in, I don’t know, the 1500s, your concept of personal privacy probably didn’t exist, right? Or it existed in a very different kind of way. So in some sense, I think it’s because we’re at this exact moment, seeing both worlds, that we’re kind of flipped out about it. But private and public are both pretty fluid concepts.
Rob: And I wonder if we’re at the point where, a year from now, we’ll have turned one way or another and we’ll feel a little more comfortable, or we’ll still be floating, I suppose, between the two. I think that’s where I’m sitting right now. It’s the same thing I try to teach my kids: anticipation, because there isn’t any left in this world. Right? When Bruce Springsteen was about to release an album, they would put it on a sheet at HMV, and I would have to go into the store, and it was an experience to go in and find out the release date for that . . .
Rob: . . . CD or that album, and then I would go home and wait for it. Right? There were no release announcements, no Twitter, no updates. And when it came, I would go back to the store and buy it, and there was the experience of unpackaging it and playing the music for the first time. I try to explain that to my kids, who want the Teen Beach Movie music right now, all the time, and I can give it to them. It’s hard to create anticipation. It’s the same thing with privacy. It’s a big, big, big challenge to instill that in a child, and even harder in an adult.
Nora: Yeah. No, for sure.
Rob: My last question. As we go into the future, I have this theory about censuses and taxes, things that are onerous for humans to do, and I wonder if at some point all of this lands somewhere, in an automated census machine or an automated tax machine, where everything is filed for you, everything is done for you through big data, with or without your permission. Right?
When taxes moved from the honor system of paying at the end of the year to being prepaid, deducted right from your paycheck, obviously collection improved significantly. But I start to think about the census, and the challenge we have with two- or three-year-old information, and taxes, for example. All that information is out there. Do you think at some point we’ll never have to fill out a census form or file our taxes again as a result of this? Metaphorically, is that where we’re going?
Nora: Yes. I was just going to say if tomorrow you could promise me I wouldn’t have to do my taxes any more, I would be completely on board.
Nora: Yeah, I think the trend is certainly toward things being automatic. Right? I mean, we’ve talked mostly today about the conscious level at which we choose to track things, but as more and more stuff is digital, I think the trend is toward data being gathered automatically by default. That’s certainly the overwhelming direction things seem to be moving in, I think.
I think the question will be where the pressure points are, where people balk. I suspect taxes will probably be one of them, where people say, “I’m not comfortable with that. I’m not comfortable with that level of things being automatic.” But culturally we see this with privacy, too, in the argument of “If you’re not doing anything wrong, what are you worried about?” Right? Which I find troubling. I think we should be able to say: these are the things I want to take responsibility for, or that I want to keep private, not because I’m doing anything wrong, but just because it’s none of your damn business.
Rob: My choice.
Nora: I want to exercise agency in this area, right? Human agency is a really important thing, human responsibility is a really important thing, and at what point does it become offloading too much of what individuals should be taking responsibility for? But it will be interesting to see where those pressure points are, where people get up on their hind legs and say, “Hey, no, I’m not doing that,” because I think the technology is certainly taking us in the other direction, toward automatic.
Rob: So you’ve been doing this with Spark for seven years, obviously, and you’ve been a journalist much longer. Have you been surprised by anything you’ve seen over these years? Something that evolved where, when you saw it for the first time, you thought, “There’s no way this will take off, there’s no way I would do that,” and seven years later you’re thinking back, “How did I get involved in this? How did this start?”
Nora: Yeah. I think the thing that has been most surprising for me, and probably less surprising to your audience, because they’re people who are really engaged in the mobile space, is the extent to which people have gone on board with linking place and behavior. It surprised me that people chose to use it; Foursquare is one example. I remember when Foursquare launched, thinking, “Really? Like, really?”
But people have taken to it, and you see it with all the activity in the [inaudible 00:46:43] space now, how much energy there is around annotating our geographical world, linking what we’re doing and what we think about things to real geographical spaces. That’s surprising to me, how quickly people have adopted it. And I think the reason they’ve adopted it is partly because we have our phones with us, but also because it’s clearly very powerful to do that.
Rob: I’m with you on that. I don’t know what the lifespan of Foursquare is, or of all of that software, whether it evolves or just becomes part of the operating system, and the operating system becomes the Internet of Things as well. But these seven years have been fascinating to watch, right?
Since the very first iPhone emerged. And I’ve been doing mobile since 1999, long before there was such a thing as a smartphone. We were doing feature-phone games for a local company called [inaudible 00:47:42], then moved into BlackBerry, then into applications and enterprise apps, and the evolution has been astounding. It almost seems like every turn surprises me. I’m not shocked that it’s happening, but it surprises me what sticks and what doesn’t stick for people.
Rob: And the valuations of some of these companies are incredible. But it’s also the implications for businesses, the way people use these tools for their businesses. I’m shocked that people aren’t embracing this a little more. My hope is that we’re on the fringe of this stuff being adopted by business, so that they can serve their customers a little better. And shockingly, somewhere from 30 to 40 percent of retailers and merchants just don’t believe that mobile, data, and location are relevant to their business. That’s what I’m watching. It’s going to be significant when this thing happens.
Nora: Let me just say, I know it sounds like I live beside the airport. I think what’s happening is that it’s the tail end of the Canadian National Exhibition, so there are stunt planes flying overhead, which is why, if I’d thought of it, I would have closed these doors before we started our conversation.
Rob: It’s so perfect, I love it, it’s part of Canadian culture right there. Yeah, we all live near airports, and along the border. So Nora, where can we send people? You said Spark resumes this coming weekend . . .
Rob: . . . and if you’re listening to this after, it’s obviously available as a podcast . . .
Rob: . . . on iTunes or through this CBC website.
Rob: The book is just out in paperback as well now . . .
Nora: That’s right.
Rob: . . . so it’s in every format.
Nora: Yeah, it’s available as an e-book as well, and the paperback is available in the U.S. too.
Nora: And if you want to find out more about me and what I do, you can go to my infrequently updated website, Norayoung.ca, or if you want to find out more about Spark or subscribe to the podcast, go to cbc.ca/spark.
Rob: I love it. And you know what, it’s got to be hard to keep up a blog as well as write books and do what you’re doing with Spark. I mean, at some point you’ve just got to detox, right?
Nora: Yeah, I did a lot of detoxing this summer. I lived an almost entirely analog existence, which was welcome for a short while, but my goal is to get back on the blogging path now that I’m finished with the book for the time being.
Rob: I’d like to see what the . . . there’s a book right there: going from being immersed in digital to detoxing for a while, and then coming back to see what the world looks like. But I’ll have to wait for that.
Rob: I’m not quite ready to do that yet. Nora, thank you so much for doing this; I really appreciate your time and insight. I can’t wait to share this with everybody who is listening. And if you are listening to or watching this, do yourself a favor: go and buy “The Virtual Self.” Not only is it a great book, but I’m immersed in this industry, I’ve interviewed thousands of people for this show, and I still had probably an “aha” moment a page as I was going through it. For that alone, I believe you should be reading this book.
I’ve made recommendations to everybody I know, and now I’m making that recommendation to you, the 600,000 strong out there: please go out and buy this book, read it, and then I’d love your feedback on it. Please reach out to me, [email protected]. And you can go to Norayoung.ca and fill in the form; that’s how I reached out to Nora, and she responds. So if you have any comments about her book, or suggestions, please send them. I’m sure she’d want to hear them, right?
Nora: Thanks so much for the conversation. It was really great talking to you.
Rob: Thank you, Nora, and thank you guys for watching and listening, wherever you are, whatever you’re doing, thanks for tuning in, we’ll see you next time on UNTETHER.tv. Thanks, Nora.
Nora: Thanks, bye, bye.
As a journalist, author, and speaker, Nora explores how new technology shapes the way we understand ourselves and the world around us. Her book, The Virtual Self, on the explosion of data about our behaviours, opinions and actions, is currently available in digital and analog formats.