2013: A Year of Disruptions

By Dr. Sean Carton, Chief Creative Officer and Professor of the Practice at the University of Baltimore 

Prediction is very difficult, especially if it’s about the future.
Niels Bohr, theoretical physicist

Considering that some folks think that 2012 might just be the last year in history, it may seem a bit pointless to try to prognosticate about tech trends for 2013. After all, if the world’s going to come to an end, who cares? On the other hand, if it doesn’t actually end, we’re going to be kicking ourselves for not sharing our thoughts about the coming year. Either way, we figured it would be a good idea. So here we go: idfive’s technology and trend predictions for 2013!

Social Media Shakeups

There’s no denying that social media has been the thing for a few years now. We’ve seen the meteoric rise of Facebook, the ascension of Twitter as the zeitgeist (and soapbox) of the world, and the acquisition of a company that makes an app that lets people share blurry, discolored pictures for a billion dollars. We’ve all gotten used to a constant stream of wit and wisdom from celebrities, politicians, and creepy “friends” from high school whom we’d rather forget. We’ve gotten so used to sharing that life barely seems “real” anymore unless it’s being posted and commented on. For over a billion people, social media has become a fact of life.

But lately we’ve noticed what seems to be a disturbance in the Force that is social media, especially Facebook. First there was the bust of an IPO, which led to a pretty steady decline in the company’s share price over the course of 2012. Then there’s the grumbling about Facebook’s privacy policies, which only grew louder as the year progressed and drew little or no sympathy from the company. It probably also didn’t help that founder Mark Zuckerberg finally admitted to calling his users “dumb f###s” (rhymes with “fire trucks”), though, as he noted in a New York Times interview, that comment was a “youthful indiscretion,” one he wouldn’t repeat now that he’s all of 28 years old.

But potentially more disturbing (at least to Facebook) is that some recent studies (along with a plethora of websites) suggest that more and more Facebook users admit they’d love to leave but are afraid of what might happen to their real-life social lives if they do. Facebook itself has admitted that it’s losing Web users to mobile, the very platform it has been routinely criticized for “not getting.” Major advertisers have been grumbling about ad performance, and some, like GM, have even pulled their ads. Facebook has responded by raising ad prices and may now even be monkeying around with using the accounts of dead people (yup...DEAD PEOPLE) to boost advertiser “likes.”

Yup, it’s clear that Facebook’s in trouble on a number of fronts. So is Facebook headed down the tubes? We’re not going out on a limb that far yet, but other data, from informal surveys taken at talks we’ve given to observations of local teens who seem to be switching to Google+ in droves, seem to indicate that something’s afoot. All in all, it seems like 2013’s going to be a very bad year for Facebook and a much better year for Google+.

But the trend may be bigger than that. Parents are increasingly leery of their kids’ social media usage. As more and more people participate in social media, more and more of them are finding out that keeping it up is hard work…a fact that’s been known for quite a long time by those of us who’ve been there. As people become more sophisticated users of social media, they also become a lot more sophisticated about the privacy issues. Add the specter of cyberbullying, increasing spam, and the sheer fatigue and information overload that comes with trying to keep up, and it looks to us like the social media bubble has burst.

Will social media go away? Certainly not…but we predict that 2013 will bring some major changes in how we use it and a lot more soul-searching about how necessary it is to our lives. Social media isn’t disappearing, but 2013 looks like the year it gets put in its place.

Desktop Disruption

If we had to name 2012 based on the tech trends that dominated the past 12 months, we’d have to call it The Year of the Tablet. From iPads to Kindles to Androids to the Oprah-hyped (on an iPad, no less) Microsoft Surface, tablets are everywhere and have long surpassed the smartphone as the must-have totem of tech hipness. In fact, analysts now predict that tablet sales will surpass laptop sales by 2016 (we think it’s going to be sooner).

But if it was just about gadgets, the Rise of the Tablet in 2012 wouldn’t be as big a deal. After all, tablet computers have been around for a while, though usually sporting modified desktop interfaces. No, the big story of 2012 was the mass collective move from the keyboard and the mouse to the fingers and the touchscreen, a move first made popular by smartphones but amped up to paradigm-shifting levels by the mass adoption of tablets. The move has been so profound, in fact, that when Microsoft released Windows 8 it built the new interface to work best on a touchscreen, not with a keyboard and mouse! OS X still hasn’t made this leap, but its convergence with iOS (on the iPad, iPhone, and iPod touch) is well underway. One thing is clear: the touchscreen is the new desktop.

But more profound changes are on the way. In early 2013 (provided, of course, we’re all still around), startup Leap Motion will release the Leap: a pack-of-gum-sized, $70 gadget that will let you control your Mac (sorry, Windows users) with hand gestures, Minority Report-style. In the living room, gaming consoles like the Xbox 360 with Kinect and the Wii U are integrating motion and gesture controls into gaming so that players can ditch their controllers and play by simply waving their hands.

But there’s even wackier stuff coming down the pike. Soon, systems like the now-in-beta Brainput will allow us to control our computers with our brains. Advances in affective computing will change the state of our computers based on our emotions. And anyone who has an iPhone 4S or newer (or happens to watch a lot of The Colbert Report) knows how voice-activated assistants like Apple’s Siri have become commonplace conduits for human/machine interaction.

Will 2013 usher in the post-mouse age? Not likely. Keyboards and mice are still better than the alternatives for those of us who have to produce a lot of digital content. However, for those who consume more media than they create, it looks like 2013’s going to be the year that many of us start to talk, poke, gesture, drag, pinch, and maybe even think at our computers more than we drag a mouse around a desk.

Disruption.EDU

One of the most “disruptive” characteristics of the Internet has been its role as an open, global channel for digital information. Since its popularization beginning in the mid-1990s, industries that were built on controlling access to information (travel agencies, automobile sales, and even medicine, to some extent) or controlling how information is distributed (newspapers, film, publishing, and music) have been radically transformed. Newspapers, deprived of their virtual monopoly on information by the explosion of free online news services, have all but disappeared. The book publishing industry has been forced to deliver books electronically or face its inevitable demise. The music and film industries continue to fight tooth-and-nail to regain their past glory, but once their true product (audio and video information) moved from atoms to bits and those bits were transmitted over the Internet, their core business models were as good as dead. Record companies thought they were selling us plastic discs…what they were really selling was the information on those discs. Separate the information from the container and that’s the end of the song.

Education, especially higher education, has existed in a privileged position similar to that of the entertainment industry for a long time now. Education’s “product,” knowledge, was bound to the physical realities of delivering it to its consumers (students). With the exception of highly unidirectional “distance learning” channels such as mail or television, pre-Digital Age students who wanted to learn had to go in person to the places where the teaching took place.

The Internet changed all that by changing the same thing that changed the entertainment industry: the means of information distribution. By providing a two-way global channel for the transmission and reception of information (mostly via the Web), the Internet virtually destroyed the constraints of both time and space in education. Online education meant that students could now learn not only where they wanted to learn but also (within certain parameters) when they wanted to learn.

Over the past decade, the online education sector has exploded, led mainly by large investments in for-profit institutions able to take advantage of the economies of scale, global reach, and efficiencies of the online channel. Many “traditional” colleges have followed suit, offering online, in-person, and hybrid formats as part of their degree programs.

But while the delivery mechanism was fairly quick to change, the business model wasn’t. Online or off, students picked an institution and paid for the content they received. Online courses were run much like in-person courses: mostly limited to a small number of students and designed to encourage the kinds of interaction and participation one would expect from a face-to-face class.

Enter the MOOC, or “Massive Open Online Course.” Designed for large populations of students and usually reusing content created for paying students, MOOCs were created (with the best of intentions) as a way of bringing quality education to a global, online audience for free. The only catch was that participants didn’t receive course credit (except under certain circumstances).

Of course, as soon as people began to hear that they could virtually “take classes” from heavy hitters such as MIT, Harvard, Stanford, or The Wharton School of Business, they wanted in. MOOCs had existed in one form or another for several years already, but in the fall of 2011 Stanford University launched a MOOC featuring a class in artificial intelligence led by two of the biggest names in the discipline. One hundred sixty thousand people signed up.

Obviously this was an idea whose time had come.

Originally MOOCs were run by individual institutions, but in the last year and a half, sites such as Coursera and edX have arisen that aggregate courses from a number of top-tier institutions. At the same time, startups riffing on the MOOC model as a way of delivering low-cost education to large numbers of people have begun to appear; a few (such as Baltimore-based StraighterLine) have even made arrangements so that customers who pay for the service can receive course credit.

Credit has always been the thing that’s kept MOOCs from threatening “traditional” institutions’ hold on education. One can participate in a MOOC, take the tests, do the homework, and ace the exams, but while the end result may be a more educated student, that student has nothing official to show for the effort.

All that’s about to change. And we think the education industry will look back on 2013 the same way the music industry looks back on 2001 (the year before the file-sharing free-for-all service Napster was shut down): as the turning point when the industry changed forever.

The crucial elements are all in place.

  • Near-ubiquitous Internet access.
  • Wide acceptance of online education as a viable alternative to “traditional” education.
  • Economic pressures including higher tuition, higher unemployment, and declining job prospects for graduates.
  • Rapid improvements in content delivery technologies.
  • Consumer acceptance of digital delivery of nearly everything.

All that’s missing is the recognition of MOOC learning with widely accepted credentials.

Perhaps not for long! In late 2012, the American Council on Education received a substantial grant to examine whether certain courses on Coursera should be considered “credit worthy.” As of this writing, the project had just gotten underway, so we can’t report the answer. But it almost doesn’t matter: considering the widespread move toward “non-traditional” alternatives to education and certification (such as the “Badges” movement), we believe it’s inevitable that some system for officially recognizing online learning from non-traditional channels, whether MOOCs or even self-service outlets such as Lynda.com, will gain major traction in 2013. And when that happens…well, how many physical record stores or bookstores have you visited lately?

One thing’s for sure: 2013’s going to be an interesting year! As long as we get past December 21st, 2012, of course.