The Future: Is it Utopia, or 1984? Review of The Circle and The Age of Context
George Orwell’s famed 1984 was written in 1948 – 36 years ahead of the title’s date. The movie 2001: A Space Odyssey was written in 1968 – 33 years ahead of its title. With a 30-plus year head start, audiences back then saw both works as way, way out there – far enough away that it was really difficult to seriously see how we’d get from here to there.
So it’s interesting to consider two futurist books published in the last four weeks: Dave Eggers’ Brave New World-ish novel of life in a Google/Facebook-ish company, The Circle; and then The Age of Context: Mobile, Sensors, Data and the Future of Privacy, by the West Coast’s leading in-your-face techno-geek, Robert Scoble, with co-author Shel Israel.
They describe two very different views of the future. But they share one thing: in the world of 2013, there’s no way it’s going to take 30-plus years to envision either future. Each book suggests great changes, but it’ll be here in, relatively speaking, no time at all. The future comes much faster than it used to.
The Circle is an easy read. It’s the tale of an innocent and talented young woman who rises fast in the California internet company of the future, The Circle. A clear blend of Google and Facebook, The Circle is rapidly coming to dominate every aspect of life in the not-too-distant future.
A literary masterpiece it ain’t – it reminds me of Ayn Rand’s style. In other words (with apologies to Dorothy Parker), it runs the gamut from A to B.
But that’s not the point. The plot is what drives the book, and the scenarios feel all too real. Eggers is a journalist, and his strength lies in tightly drawing the tensions that arise from massive access to data, manipulated massively. I’m not claiming The Circle is in the same literary league as Brave New World or 1984, but it shares their well-crafted foreboding of Some Big Evil Stuff coming down the road.
It’s as easy to envision the good that comes from The Circle as it is to envision the bad, even though the Big Brother aspects of it ultimately win out. The biggest clashes, no surprise, come around issues of privacy. But it’s not the usual privacy issues that are front and center.
On the one hand, greater transparency and authenticity allow for a great amount of social good: less crime, better health, more efficient commerce, greater ease of social interactions.
But the biggest price is not lack of privacy – it’s the inauthenticity and insincerity that arise from a society that constantly has to share everything with everyone. To use Erving Goffman’s metaphor, we end up constantly wearing masks, particularly when we insist we have abolished masks.
Eggers makes us see exactly how one gets from here to there: through pumped up Amazon book reviews, reciprocal autobot following, inauthentic LinkedIn recommendations. It’s all pimping out the rules of etiquette and reciprocity in service to self-interest.
The New Rule of Tell the Whole Truth quickly slides into the de facto rule of Say No Evil. And the price we pay is – it quickly turns to Say No Truth as well.
The Age of Context
Robert Scoble, the better known of the two authors, is the self-made techno-geek of the West Coast, most recently famous for his wife’s photo of him proudly wearing his Google Glass in the shower. Scoble’s a ham, but a very smart ham, and this is a very solid book. Not only is he totally up to the minute on his technology, but he and Israel do a very credible job of outlining the dynamics behind the technical world of the future. It’s far more believable than Eggers’ fictional creation, because it’s non-fiction, and very real.
They describe five very concrete forces: mobile devices, social media, big data, sensors, and location-based services, all coming together in the cloud. The Big Theme they envision is contextual computing. That is, the ability to know where we are, what’s going on around us, what we want to be happening, and what we want to do – all contained in technology harnessed to serve man.
Picture super, super-powerful personalized software, integrated with cars, in wearable form, linking to the internet of things and our brain waves, and harnessed to databases that are accumulating at astonishingly exponential rates. It’s really amazing stuff, and it’s all very real; Scoble and Israel walk you through it all.
They are not starry-eyed idealists. They raise the issues of privacy front and center, and are critical of Google in some very precise ways. They don’t claim to have all the answers, but do a service by more sharply defining the question.
Net net, I’m left with a definite sense of optimism. The benefits of the technology to come are huge, if we can manage to deal with the (real) issues. It would be unconscionable of us to forego most of the good because we don’t have perfect solutions to privacy.
A bit of perspective: Scoble and Israel cite a university professor who banned the use of Google in research because it was cheating, compared to “real” research. That was just ten years ago.
Imagine what could happen in 30 years.
Great insight in the line about how the price is not the lack of privacy but the way we share everything, and the “masks” we wear in the process. I see that happening all around with Facebook and the constant “branding” that we all do. It will be interesting to see how our need for congruence plays through with this. Will we become more and more like the “brand” we portray?
I am reminded of something Larry Winget said in a talk: “When I have the ‘Winner’ license plate, I have to act like a winner in my car. Would a winner run the stop sign? Would a winner act like a jerk behind the wheel?”
Well said, John, thank you.
Very interesting and, as always, well written. Both books will make for interesting reading. I was just reading an industry article on how sensors will change how we use energy in the not-too-distant future. These little devices will make our lives ever more public while providing great benefits. The implications for trusting relationships are unknown, and for me more than a little worrisome. How do we keep the fundamentals of the trust formula in place?