Thursday, July 6, 2017

The Undead Factoid: Who Decided 65% of the Jobs of the Near Future Don't Exist Today?


An article published this week on singularityhub.com leads with the following breathless description of the future of work:
We live in a world of accelerating change. New industries are constantly being born and old ones are becoming obsolete. A report by the World Economic Forum reveals that almost 65 percent of the jobs elementary school students will be doing in the future do not even exist yet. Both the workforce and our knowledge base are rapidly evolving.
Amid the questionable - do we really live in a world of accelerating change? - and the obvious - things are evolving - we find the very specific claim: 65 percent of the jobs our elementary school students will do don't exist today. Supposedly this startling fact has now been revealed by the World Economic Forum, and thus we should do... something. As if we thought everything was great about education, but now that we have this 65 percent figure, it's a call to action and we need to start changing education right now! Think of the children and their futures!

What follows in the article is a set of mostly non-controversial recommendations - like teaching students to think critically and communicate well - which sound like excellent suggestions. But I (and others) bristled at the 65% - how could you reveal such a thing? I mean, you could predict it, but why 65% and not 55% or 72%? It just feels arbitrary and made up. And I knew I'd seen this number before, so I decided to do a bit of digging.

First of all, the World Economic Forum didn't "reveal" it - they refer to it as a "popular estimate" and attribute it as follows: 
McLeod, Scott and Karl Fisch, "Shift Happens", https://shifthappens.wikispaces.com
This is the YouTube video you've probably seen - the one loaded with facts and figures of varying levels of reliability, designed to convince you that the world you know is changing rapidly, to generate gasps and chuckles from the audience, and perhaps to lead to some good conversation. I admit, I enjoyed it the first 20 times I saw it and even showed it a couple of times. I did have trouble with the statement that "We Live in Exponential Times" - what the heck does that even mean? But it does have a gee-whiz quality that's entertaining if not particularly enlightening.

Anyway, this version from 2006 has more than 5 million hits on YouTube, there have been subsequent versions, and it even seems to have launched a cottage industry for the creators - but I couldn't find any reference to the 65% statistic, either in the presentation or on the website. The website referenced by the World Economic Forum report has links to sources, but most of them are broken (that exponential change at it again, I guess). I suppose the figure may have appeared in one of the versions, but I couldn't find it - so the trail seemed to end at "possibly invented for a YouTube video".

But - Google to the rescue! - I found this article from The Atlantic in 2011, an interview with Cathy Davidson about her then-new book Now You See It: How Technology and Brain Science Will Transform Schools and Business for the 21st Century. The interviewer starts with:
One of the foundational facts of your book comes early on. "By one estimate," you write, "65 percent of children entering grade school this year will end up working in careers that haven't even been invented yet." 
The Atlantic article doesn't mention a source, but googling the title and author brought me to a blog post by Davidson from just over a month ago, responding to a BBC piece that aired on More or Less, a program about statistics. Apparently the 65% stat came up recently in a town hall with a British politician, prompting the BBC journalist (who had a reaction similar to mine) to try to track it down. Go listen to the BBC program; it's fascinating. Davidson says she encountered the stat in a book by Jim Carroll (disappointingly, not this Jim Carroll) and that it may have come originally from an Australian jobs report - which nobody can find. Furthermore, the BBC examines information on changes in the job market over the last 15 years and finds that at most a third of jobs today didn't exist 15 years ago, making the claim highly unlikely. (And, given that the original Jim Carroll book is 10 years old, nearly provably false.)
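How unlikely? Here's the back-of-envelope arithmetic in a few lines of Python - my own illustration, not the BBC's calculation, assuming (generously) that the one-third-per-15-years turnover rate holds steady and that a grade schooler starts a career about 20 years out:

# Back-of-envelope check - my arithmetic, not the BBC's.
# Assumptions (mine): at most 1/3 of today's jobs didn't exist
# 15 years ago, the turnover rate stays constant, and "grade
# school to career" is roughly a 20-year horizon.
turnover_15yr = 1 / 3                                # the BBC's upper bound
survival_per_year = (1 - turnover_15yr) ** (1 / 15)  # yearly survival rate
horizon_years = 20
new_job_fraction = 1 - survival_per_year ** horizon_years
print(f"Jobs that wouldn't exist yet: {new_job_fraction:.0%}")  # ~42%

Even with those generous assumptions, you land around 42 percent - nowhere near 65.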

But just six weeks after the BBC debunked - and Cathy Davidson largely walked away from - the 65% claim, it becomes the lead of a brand new article, attributed to the World Economic Forum. It will no doubt be the reference for a future presentation, be cited in strategic plans, and be quoted at school board meetings. Some factoids live on as zombies, like bits of urban legend, and I'm sure that this blog will do nothing to kill this one off.

By the way, thank you to the many K-12 public school teachers, and the liberal arts faculty members, who educated me pretty well for the job I do, which didn't exist when I was in kindergarten.


Friday, December 16, 2016

The Edtech Curmudgeon's Top Predictions for 2017

photo credit: Viewminder @ Flickr


Well, it's that time again. Time to look back at all the wonder that was the year 2016, and to look forward with excitement and trepidation to 2017.

They say that Alan Kay said that "The best way to predict the future is to invent it." They say that, although I once spent a day with Alan Kay, and not once did he say it. He did tell an interesting parable about Buddhist monks and a basket of croissants, however.

So without further ado (if indeed that WAS "ado"), here are my predictions.

  1. Microsoft buys Prezi and creates HoloLens Prezi, supporting exciting presentations. Their motto is "It's PowerPoint, but it's 3D!" (In a related story, edtech budgets struggle to keep up with the demand for vomit bags.)
  2. The march of data continues, as universities across the country find new and better ways to help their stakeholders find, analyze, and visualize incorrect and incomplete information.
  3. A major company in the education space introduces a new LMS intended to disrupt and revolutionize the LMS market. Meanwhile, Canvas continues to sop up what's left of the market like a piece of bread on a gravy plate.
  4. A new concept, the Flipped Flipped class, dominates 2017 edtech press coverage. The Flipped Flipped class, inspired by Uber's experiments with driverless cars, eliminates the need for hiring expensive and noisy contingent faculty, as students can watch videos IN CLASS without an instructor present.
  5. A panel on diversity in ed tech carries on valiantly even after the one woman invited to join the panel can't attend because her travel funding is pulled at the last minute. 
I had more predictions, but I lost them when I cancelled my Evernote account.

Have a wonderful 2017, and remember, the future is always just up around the bend, past the signpost. If you look carefully, you can see it from here, just like Russia.



Monday, February 15, 2016

Three Laws of Bureaucracy

Basset Hound asleep on a patio in the sun

As I executed one of my important job functions as a senior administrator - scrawling my name in ink on pieces of paper - #2 below came to mind. For completeness, I figured I needed three. Not terribly original, I know, but it made me feel a little better...

The Three Laws of Bureaucracy
  1. An institution at rest tends to stay at rest.
  2. For every action, there's an equal and opposite pile of paper.
  3. The life of any initiative is inversely proportional to its impact.
Got any more?
Photo credit: Don DeBold @ Flickr, CC-BY 2.0

Friday, February 12, 2016

What Happens When Everyone Gets Their Own Logos

One of the hats I wear on campus is oversight of Marketing & Communications. Like most campuses, we have a style guide and a set of web standards, which discourage you from creating your own logo and putting it on the web alongside the campus logo.

Creating a logo is fun, and no matter what your campus logo is, there are some who will hate it and want to use something different. (Usually people want to use the old logo, which of course was once the new logo and was hated, often by the same people.) I get it, I really do, but if you want to see what a web page looks like when everyone gets their own logo, take a look:
screen capture of the home page at https://goes-app.cbp.dhs.gov/pkmslogout illustrating what it looks like when you have seven different logos on one page
Welcome to https://goes-app.cbp.dhs.gov - wait, where am I?
This is not just ugly... it's confusing! Where am I, and what am I supposed to do here? How do the SEVEN DIFFERENT LOGOS relate to one another? What's the difference between GOES and FAST? What the heck is FLUX? SENTRI? (And how did Homeland Security, the parent agency of Customs and Border Protection, miss a chance to get in on the fun?)

You can't build an interface from the inside out, and you can't build a coherent brand from the bottom up. Letting everyone choose their own visual identity results in an ineffective and unappealing mishmash, despite good intentions.

Thursday, February 11, 2016

How Broken is Email?

Email, which began as a transformative chapter in the history of communication, now seems close to complete breakdown. Do any of you not struggle to find the nuggets of important information amidst the noise? How often do you send an important message via email and have to follow up via text or phone to find out whether it got through? I don't know what will come next, but it's depressing to see where we are now.

Just to illustrate - this is ONE DAY'S WORTH of the spam collected for my campus email account. True, the spam filter caught all these messages, but a) occasionally there's a false positive, so I miss something important; b) dozens of crap messages get through every day anyway; and c) my campus spends A LOT OF MONEY on the hardware and software that does the filtering.

So when our students tell us that they don't read email, or read it only under duress because we tell them they have to - maybe they have already figured out something we don't seem to know yet.

Monday, November 2, 2015

Using Technology Alone Doesn't Guarantee Better Outcomes - So Why Reward It?

For 25 years, my colleague Casey Green has surveyed campuses about the use and impact of technology in his Campus Computing survey. I'm a big fan of the survey because it's the largest body of longitudinal data we have on the topic. Just last week, Casey released a summary of the 2015 data, and as usual, there's lots of interesting stuff to chew on. I want to focus here on one particular question, which deals with the incentives campuses use to encourage faculty to use technology. 

The particular question in the survey is worded as follows:
Does your campus/institution have a formal program to recognize and reward the use of information technology as part of the routine faculty review and promotion process?
Now, the whole issue of how faculty are reviewed for promotion (and tenure) is fraught and complex, tied to controversies around tenure, the appropriate relationship between research and teaching, the rising role of contingent faculty, and indeed the very value and purpose of faculty. But even asking the question suggests that this is something our institutions should be doing. As Casey puts it:
For example, even as instructional integration is the top institutional IT priority again this fall, less than a fifth of campuses (17 percent) recognize instructional IT efforts as part of the faculty review and promotion process.
If it's a top priority, why wouldn't we tie it to the way we review and provide incentives for faculty?

Here's Bryan Alexander responding to Casey's findings:
How do colleges and universities support faculty in using technology?  Badly, it turns out, according to one critical measure. A look back at decades of campus computing strategy finds that the majority of American campuses neither recognize nor reward professors who integrate tech in their teaching and research. 
To use Bryan's word from the title of the posting (and I know he chooses words carefully), our institutions "refuse" to recognize the use of technology by not making it part of the criteria for faculty recognition.

I'm an advocate of providing technology to faculty, along with the best training and support my institution's resources can muster, but ultimately technology is a means, not an end. Technology has the potential to transform instruction, improve engagement, and expand access to learning. It constantly opens new avenues for research and enables new exploration and discovery. But faculty should be rewarded for excellent teaching and research. If my campus gave the use of technology some kind of weight or quota in the review process, I'd be concerned that it would become a check box, detached from the meaningful goals at the center of the faculty role.

There's a lot to critique about the way faculty are reviewed and rewarded, but I don't believe that adding "did you use technology" or "how much technology did you use" to the review process will produce the outcomes we want. Those of us who provide technology need to listen carefully to our faculty colleagues, work in partnership with them to incorporate the right technologies (and ditch the wrong ones), and keep the focus on the goal: student learning. That's the outcome that concerns me, not how much technology we use. So it doesn't concern me at all that most institutions refuse to "recognize and reward" the use of technology - except that the 17% (and holding) who do might be more than I would like.

Thursday, August 6, 2015

Virtualizing the Right Part of the Conference


(Note: This posting originally appeared as a guest post at Virtually Connecting - thanks to Rebecca Hogue and Maha Bali for encouraging me to write it.)

I'm so pleased to see the interest and attention that the Virtually Connecting project is getting. Our traditional models of conference attendance are not only dated; they exclude many people - those who can't travel to attend in person because of health, location, family, and, most often, finances. We need to figure out how to offer new options, but how?

We already know how not to do it. About 10 years ago I got a chance to experience HP's "Halo" telepresence system. The promise of Halo, Cisco's TelePresence, and other systems was that low-latency, high-resolution video and audio, combined with careful design of lighting and furniture, could allow widely separated teams or individuals to meet remotely while creating the illusion that they were in the same room. The tech was impressive, but there were big problems with the model - the equipment and connections were prohibitively expensive for all but those with the deepest pockets, and you needed to schedule a time and a place for a formal conference. HP got out of the business a few years later, and while others, including Cisco, still push expensive teleconferencing systems, I haven't found them to be particularly effective for most uses in education.

One insight into the limitations of expensive, fixed telepresence systems came to me about a week after I visited HP's Halo. I was speaking with a colleague who was describing corporate meetings in Japan. He told me about a culture where meetings were a tightly orchestrated and constrained form of communication, in which very little authentic feedback was given. Instead, the real meeting occurred later with alcohol and karaoke - an environment where the individuals were freed to discuss what they REALLY thought. It occurred to me that most models of remote meeting virtualize the least useful portion of the communication.

Now translate this to virtual conferences. What works well? Consider keynotes. Personally, I find that watching an 18-minute TED talk without distraction only happens when I really care about the topic and it's a really excellent presentation. I've seen a few wonderful conference keynotes, but it stretches my attention to sit through the average 60- or 70-minute keynote even when I'm there in person. Online? Not likely to happen. Besides, so many keynoters give the same talk over and over, and you can find a version of it on Vimeo or YouTube anyway.

How about conference sessions? The best sessions are interactive and involve the audience. Set up a camera in the back of the room pointed at the front and broadcast the PowerPoint - that's a recipe for faithfully recreating the virtual experience of a boring lecture. To me, the whole notion of "lecture capture" is so deadly that I like to refer to it as "lecture capture and release - capture the lecture and take it far away to release it where it can do no harm."

I go to a conference for the interactions, the buzz, the sense of what (and who) is new and different, the trends, the issues. A typical virtual conference - a camera in the back of the room and a PowerPoint feed - gets you little of that. If there's a good Q&A session AND the people in the room remember to use the microphones, you might get something - if you can wait through the talk to get to the Q&A. The best sense I get usually comes from the Twitter feed - sometimes you can get a lot, and other times it's just confusing without the context of the conference.

And that brings us to experiments like Virtually Connecting. It's personal - the connection is usually, literally, in someone's hand, rather than mounted in the back of the room. An iPad is not the world's greatest teleconferencing tool, but it's relatively cheap, it's portable, and it's personal. Google Hangouts is free, mostly pretty good, and works fine with the bandwidth at many locations, even overseas. I've also used Zoom (my personal favorite), as well as watching a session via Periscope, which works surprisingly well. There's a big psychological difference between watching a feed from the back of the room - clean, cold, impersonal - and a tablet or a phone propped on a desk against someone's backpack - warm and human-scaled. The audio and video might not be as good, but the immediacy and the sense of connection are heightened. You feel like you're in the midst of the action rather than watching from afar.

I've been on both sides of the conversation and it's a qualitatively different experience. When you're physically present at the conference sharing with someone remote, you're trying to figure out what that person wants to know and what they've already heard. When you're remote, you're trying to decide what would be a good question to ask to get a sense of what's happening there. It's a somewhat different kind of social interaction and I think it will take most people a little bit of time to get used to it.

But the chance to ask a few questions and see the response is just marvelous. It's deeper and richer and more personal, and the slightly underground feeling adds an authenticity and appeal that draw you in. This is worth so much more than high-def video with perfect lighting. People can speak naturally, and it's more like the karaoke bar than the Halo room.

The beauty of Virtually Connecting is that Rebecca and Maha have figured out how to virtualize the right part of the conference - the personal interaction. I'm thrilled by their experiments and I'm sure they will be widely imitated. I'm glad I've had opportunities to experience Virtually Connecting and look forward to more chances to try it out and build connections from afar.