Is Innovation Contextual or Absolute?

When discussing the concept of truth, many people will make the distinction between “truth” (lower case t) and “Truth” (upper case T), where “Truth” refers to ultimate truth that is true for all, and “truth” refers more to contextual truth that may be true for some but not others. Or, to simplify, absolute Truth and relative truth.

In many ways, I see the same need to differentiate between “Innovation” and “innovation” when discussing the overall concept of innovation. Of course, I’m not sure if I really want to make such a problematic connection between innovation and truth. But I think there is something to determining whether someone is referring to absolute innovation or relative innovation. There are ideas and tools that are new to everyone and therefore count as absolute innovation, and then there are ideas and tools that are not new to everyone, but are new to those that are just discovering them.

For example, online learning is a concept that has been around for decades. It is not absolutely Innovative in a general sense. But to schools that have no online courses, their first online courses will be innovative in their context. Or to a person that has avoided going online in general (or didn’t have access to the internet), the ability to take online courses will also be innovative to them.

Of course, even the idea of “absolute innovation” is problematic. Virtual Reality seems like a new, innovative idea to most… but the truth is, the concept of virtual reality has been around for some time. Maybe you can more accurately say that the idea of a more widely-available digitally-created simulation-based computer-run semi-immersive interactive virtual reality is innovative in general to anyone. A lot of hyphens there.

And I have also intentionally not spelled out how I am defining innovation beyond “something new” for this article. Another problematic area.

So why does all this matter? It probably doesn’t for most. I first ran into this issue 6-7 years ago as a chair for a proposal review committee for an “emerging technologies” track at a conference. The track description relied heavily on the term “innovation” to delineate between “emerging technology” and “latest and greatest technology” (because that was another track). We had submissions that ranged from using the (just recently-released at the time) Google Wave in classrooms to teaching with PowerPoint. Where does one draw the line between “current” and “emerging” based on the criteria of “innovation”?

Well, long story short… you don’t if you want to keep everyone happy :) You let people self-define whether they are innovative or not in their context and then let them take the heat if the session attendees don’t agree that their idea was innovative in general.

So it might surprise people that as an “Innovation Coordinator,” I don’t just look at things like virtual reality and learning analytics. I also look at many established instructional design and digital presence ideas. I also look at low tech ideas on how to be a human in a digital age. Even more shocking to some is that I describe throwing a handful of dirt at a poster board on the ground to demonstrate the “Big Bang” to 8th grade students as one of the more innovative ideas I utilized back when I was an 8th Grade Science teacher. Sure, I also created my own online course hub that I hand-coded in HTML in the summer of 2000, long before most were putting K-12 material online. But I also had to find a way to help 8th graders visualize the Big Bang on a $200 a year total budget (classroom material, science equipment, everything – $200). So what did I do? I put a white poster-board on the ground, grabbed a handful of dirt, pebbles, and grass in my hand, and did a two-minute demo on what the Big Bang would look like. It was effective. It was cheap. It was innovative in that context.

I definitely wish there were more focus on looking at innovation beyond the coolest, newest, most expensive gadgets, apps, programs, ideas, etc. How do we innovate when cost is a barrier? When technology access is non-existent? When we need to transfer online lessons to face-to-face classes? We have all kinds of media outlets that look at Innovation the moment “it” happens – any new device, tool, idea, app. But what does innovation look like in a contextual situation, where budgets are small, resources are constrained, and technology access is limited? And not just current situations, but situations that have historically been lacking in these areas? How do we innovate access to technology itself? How do we innovate the cost of technology? There is a much wider and more nuanced conversation about innovation to be had.

Being a Human Shopper in a Digital Online Shopping Age

So with a new year, our research lab is going to focus on writing and setting goals for the upcoming year. Our main question at LINK Lab is “What does it mean to be human in a digital age?” I thought this would be a great place to start with processing what my goals should be, so I began my quest to write goals for the year with that question in mind. Then some national news this week helped bring some clarity to how my personal goals would relate to our main question.

This week was full of news that Sears, Macy’s, JCPenney, and other big name stores are laying off workers and closing stores. Many people have been posting this news on various social media outlets with the general response of “I like to shop online better anyways.” Of course, I do as well. But I have noticed over the last few years that I still make a point to go buy some things in person that I could easily buy online, even while I still buy many things online.

For me, this is one way I am unconsciously pushing back against the increasing loss of control that comes along with living in a digital world. For instance, I know the exact pair of blue jeans that will always fit me from a certain store no matter what. I could easily buy those jeans online, and know that they will be the right pair for me. But I still find myself wandering into the local mall to buy new jeans when I need them. Something in me is pushing back against the digital age to still connect with being a human. Shopping in person is a very human experience. You get to touch and observe the exact product you will buy before buying it.

When you buy something online, you lose control over what you get. It will probably end up being the right thing, but you still lose that control until it arrives at your door. For me, to still be a human shopper in a digital online shopping age means to take control over some things and go do what a human would do. This may be shopping in brick and mortar stores in person, or driving myself somewhere when I could have gotten an Uber, or drawing a picture on a piece of paper instead of blogging about an idea (I have a really interesting idea for a drawing to do about my pathways work – hope I get time to draw that out soon). It’s not that online shopping or Uber or blogging are bad – I just need to do things for me that remind me of what it means to be human. That might be different for different people.

To bring this back to work, for me, the aspect of “what does it mean to be human in a digital age” that interests me the most is the tension between control and agency.

In a learning context for projects to be researched, that interest would manifest itself in a question something along the lines of “What happens when learners have more agency over their learning journey?”

This question is obviously a work in progress that will probably be refined over the next few weeks. I hope to get some decent goals out of this overarching question that would apply to pathways, virtual reality, publications, etc. But it is a starting point for me at least.

Is It Really Learner Agency If The Instructor “Empowers” It?

For a few years now I have been struggling with how to “verb” agency in education (sometimes referred to as learner agency or student agency). When people first become aware of the idea, they tend to use terminology like “I want to allow student agency in my classes.” I guess on some levels that is technically what happens in many cases, as the teacher typically holds the power in the course, and they have to allow agency to happen.

However, once you use that term a bit and get used to the idea, you usually realize that “allowing” agency is kind of a contradiction. People tend to shift towards using the term “empower”… as in, “My goal was to empower learner agency in this lesson.” This is the verb I hear most at conferences the few times that agency in education is touched on.

Of course, saying that the instructor is “empowering” agency is pretty problematic as well. Is a learner’s thought process really independent if the instructor is the one that empowered it? Is the autonomous action that flows from independent thought really all that autonomous if the teacher had to initiate the power to make it happen?

With some twists in logic and semantic word play, I am sure one could say that agency can be empowered, but to be honest – it really can’t. If the teacher is the one that “empowers” it, then it’s not really agency. What many really mean when they refer to “empowering agency” is “tricking learners into doing something that looks like independent thought and action, even if they didn’t really independently decide to think or act that way because at the end of the lesson there was a grade for coming up with something within specific instructor-determined parameters.”

I have started using terms like “unleash” when discussing agency in presentations, because that is probably about all you can really do with agency – remove the barriers that are holding it down, and let it do its own thing. But still, not really the best verb for agency.

Of course, this is probably why we don’t see much true learner agency in formal education settings – you set it loose, and it could go in any direction, or none, sometimes both from the same learner. It becomes something that is difficult to standardize and quantify once it really happens.

However, I am speaking of agency as if it is something that turns on and off at the flick of a switch, when the reality is that there are shades of agency that exist on a spectrum. Even when we unleash it, or just stand back and see what happens (or however you want to “verb” it), it’s not like learners just jump right into agency feet first and swim around in it like a natural. Some need guidance, scaffolding, a hand to hold, etc – whether because they are new to the idea in a system that has never allowed it or because they just need a more experienced hand to point them towards which way to go. Oh sure, there are many that do just launch out with little to no guidance and do just fine. In any one class, you are going to have learners all over the place. They will even switch places from day to day or hour to hour.

Agency in learning is something that takes the predictable linear instructivist narrative and explodes it in all kinds of directions, but then even messes with linear time in that explosion, as some need it to go slower, while others need a guide through the explosion, and others ride the explosion with enthusiasm, wanting it to go faster. Oh, and then they all change their place in that process without a moment’s notice. So how does one come up with a verb to explain this chaos?

(image credit: Blue Chaos 3 by Josh Klute)

Decreasing Design Presence

With the Humanizing Online Learning MOOC in full swing, I wanted to dig more into a topic that I tend to allude to at conference presentations. While educators often talk (rightly so) about increasing teaching, social, and cognitive presence, there is also one form of presence that needs to be decreased when designing and teaching courses: design presence.

I’m using “design presence” here to cover a wide range of user interface, instructional design, and learning theory issues. In my mind, there are at least three areas that are heavy on design presence, and therefore design presence needs to be decreased in these areas:

  1. Technological Design Presence: tool/technology interfaces and instructions
  2. Instructional Design Presence: tool and content instructional design decisions
  3. Epistemological Design Presence: underlying learning theory choices

While some might notice there is some overlap between these areas and teaching, social, and cognitive presence, I have found that there are still some differences. Working to decrease design presence also ends up helping to increase teaching, social, and cognitive presence in the long-run.

Technological Design Presence

This is an area where user interface and instructional design collide, and for many course designers the options are pre-determined by institutional adoptions. However, where choices are allowed, utilizing tools that have the least complex user interface options is ideal. For example, if you really want to use a listserv, but the tool you have to use is complex to sign up for and use, why not use Twitter? The user interface on Twitter is very simple compared to some older mass email tools. If you have to have a really complex set of instructions to use a tool, why not consider using something with fewer instructions and less stress on the learner?

Or if you have a listserv tool that is easier to use than Twitter, why not use that instead of Twitter?

Where there are several options within a tool (like an LMS), why not choose the least confusing, most ready-to-use tool? Newer features in larger LMS tool sets often have a steep learning curve. For example, the blog feature in Blackboard was very confusing when it was first released, and it really worked more as a re-arranged discussion board. If you have to stay within Blackboard, then stick with the tools that take the least amount of time to explain to learners.

Additionally, think about other issues that cause unnecessary technology confusion. Blackboard was infamous for allowing course designers to set up boxes within boxes within boxes. Avoid using tools and content structures just because you can. Avoid using desktop tools that make no sense online (like “folders” inside of online content). Avoid using complex navigational structures just because you can.

Once learners have to click around a half dozen times just to get somewhere, or dig through complex tool instructions, or spend too much time figuring out what you want them to do, they are running into too much technological design presence. Decrease what you can where you can.

Instructional Design Presence

This next facet has many connections to the first one, so there will probably be some overlap. Many times, course designers will make tool and content design decisions that are unnecessarily complex. For example, complex grading schemes that require dense explanations and calculators to figure out. Why go there? Obviously, there is merit to the idea that grades are problematic altogether, but many instructors are stuck with them. So why make them so complex? Why not just base course grades on a 100-point scale (which most people understand already), and make each assignment a straight portion of that grade? Complex structures based on weighted grades and 556-point scales and whatnot are a burden for both the instructor and the learner.
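To make that concrete, here is a minimal sketch of the flat 100-point scheme described above. The assignment names and point values are hypothetical, just for illustration:

```python
# A flat 100-point grading scheme: each assignment is a straight
# portion of the final grade, so learners can add points in their head.
# The assignment names and point values below are made-up examples.
assignments = {
    "Discussion posts": 20,
    "Project draft": 20,
    "Final project": 40,
    "Reflection essay": 20,
}

# Sanity check: the whole scheme should add up to exactly 100 points.
assert sum(assignments.values()) == 100

# A learner's earned points map directly to a percentage grade --
# no weighting formulas or calculators needed.
earned = {
    "Discussion posts": 18,
    "Project draft": 15,
    "Final project": 35,
    "Reflection essay": 20,
}
final_grade = sum(earned.values())
print(final_grade)  # 88 out of 100
```

Contrast that with a weighted scheme, where the learner has to know each category weight, each category total, and the averaging formula just to figure out where they stand.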

Rubrics are also a part of this area. Complex rubrics with too many categories and specific point values are, again, a burden for learners and instructors. Compare the complexity of this rubric with this one. I realize some people like the first one because it has so much detail, but to be honest, it is something most readers aren’t going to read through, because just glancing at it could cause stress.

Or another issue might be design choices that add unnecessary complexity, like having students upload Word docs to discussion forums for class discussion. Why not just use blogs? That is basically what you are doing with Word Docs and discussion forums.

Course designers typically make many choices with tools and content in their courses. Do these choices increase the instructional design presence of those decisions? Or do they decrease the design presence and allow learners to focus on learning rather than figuring out your designs?

Epistemological Design Presence

This area is a bit more difficult to get at, as it probably affects overarching decisions that affect everything in your course. For instance, if you lean more towards instructivism that places yourself at the center of everything in a course, you will probably choose many tools and interfaces that support your instructivist leanings: lecture capture, content heavy videos, long reading assignments, multiple choice tests, etc.

Now, just to point out, I am not a person to bash instructivist lectures across the board no matter what. There are times when learners need a well executed lecture. However, in education, many instructors use lectures too much. They use lectures to fill time when learners should be doing something hands on and/or active. If you are using lectures on video (or textbook readings) when learners should be creating their own knowledge, or applying concepts hands-on, or collaborating in groups, you have increased the epistemological design presence of your preferred learning theory at the expense of what the learners really needed. Time to decrease that facet of design presence.

There are times when learners don’t need to socially connect or listen to lectures, but work on their own. There are times when they need to connect with others rather than work individually. Don’t stick with instructivism or social constructivism or connectivism or any other theory you love just because you like it best. Put the learner first.

But what about the times where learners are at different levels and need different theories? Or, when no one theory fits and it is really up to the learner? I say, give them the choice. Build in multiple pathways for learning in your course. Build in scaffolding for learners to shift between different approaches. But avoid the mistakes I have made in the past and make sure to decrease the design presence of those options and pathways as much as possible. Don’t focus on the differences between the pathways – just focus on the fact that learners can make the choices they need at any given moment, and then show them the choices.

Decreasing Design Presence

If you are a good course designer, you probably already know everything I have touched on here. There is nothing new or different about what I am outlining here – this is solid instructional design methodology taught in most instructional design courses or learned on the job. However, it is seldom examined from the angle of decreasing design presence, and since I am one of the “wayfinders” in a course on the Community of Inquiry framework that covers teaching, social, and cognitive presence, I thought it would be a good idea to have a place to point to every time I mention “decreasing design presence.”

(image credit: Human Presence by Manu Mohan)

Self-Determined Learning: The Lesser-Explored Side of Open Learning

OpenEd 16 is in full swing and I am already kicking myself for not going this year. I seem to miss at least half of the cool conferences. Adam Croom has already provided a fascinating analysis of the abstract topics, which reveals a great list of important topics. However, I do notice something that is (possibly?) missing.

There is a lot about resources, textbooks, pedagogy, etc. Much of this focuses on removing barriers of access to education, which is a topic that we should all support. But what about the design of this education that they are increasing access to?

“Open pedagogy” seems to be the main focus of the design side of the equation. Of course, it is hard to tell from this analysis what people will really present on. When I think of open pedagogy, I think of David Wiley’s important work on the topic. Wiley’s description of open pedagogy is focused on being open about the design and assessment process, as well as allowing learners to remix and create their own open content.

So the question is – where is the learner agency, the self-determined learning, and the heutagogical side of “open learning”? It is probably there, but just not as explicitly named or explored. When you unleash your learners to determine their own pathway, their own context, their own content, and so on – that is also a part of open learning that needs to be specifically mentioned.

Open pedagogy is definitely a scaffolded step into self-determined open learning. Maybe some would argue that self-determined learning is implicitly a form of open pedagogy. I wouldn’t disagree, although I tend to avoid using pedagogy as a catch-all term for all forms of learning design due to the co-opting nature of expanding the use of pedagogy beyond “to guide a child.” But that really isn’t as big a deal to me as it is to the early childhood educators who feel left out of most academic educational discussions, and who usually don’t appreciate the college educators that typically leave them out also stealing the technical term for their design methodology.

Even when looking at the Wikipedia article on open learning, many of the topics touched on get close to self-determined learning, but not quite: self-regulated learning, active learning, life-long learning, etc. Almost there, but not quite.

Again, I know there are people out there that include the topics of learner agency and self-determined learning in the open learning / open education sphere, and that there are some people working in those topics. I just think there should be more. In my opinion, you can offer all the free content you want to and allow people to remix and re-use as much as you want… but if the design still focuses on the instructor (or the pre-determined content) as the center of the course, you have just created an open-licensed “sage on the stage” learning experience. Which I am sure many people will need, but for many others, this falls short of the concepts of learning how to be a learner.

We are the Monster at the End of the Book

I wanted to circle back to a thought I had while reading Maha Bali’s excellent post Reproducing Marginality? The whole post is excellent, but one line made me think more than others. In it, she quotes something that she wrote with Paul Prinsloo and Kate Bowles that says:

…for most of us not in the US (or the UK), this [edtech] vision has often signalled top-down, US-to-world, Anglo-oriented, decontextualized, culturally irrelevant, infrastructure-insensitive, and timezone-ignorant aspirations, even when the invitation for us to join in may be well-intentioned.

Many of us in the Western world of EdTech are trying to figure out how to fix Education and Ed Tech, looking for the evil monsters out there that are causing the problems, and then fixing those monsters with research, technology, design, or methods.

And sometimes we are afraid to see what those monsters are that are damaging education, because they may be too big for us to fix.

This all reminds me of one of my favorite books as a kid: The Monster at the End of the Book.


In this book, Grover notices the title of the book and spends every page trying to stop you, the reader, from reaching the end of the book. He nails pages together, builds brick walls, and pleads with you NOT to get to the end of the book and face the monster lurking there.


Grover is terrified of the monster at the end of the book. But when he gets to the end of the book, he finds that he was the monster all along and that he had nothing to fear.

We (in the western world) are pretty much the monster at the end of the book when it comes to education reform. We are doing everything we can to avoid that possibility – looking to everything but ourselves to fix the problems. But it is our (sometimes) extreme ethnocentrism, socio-cultural centrism, whatever you want to call it, that is the problem all along. I would even go so far as to say that as long as we are the center of the education world, we are always going to be the problem.

Education is about learning. Learners do the learning. Learning needs to be the center of what we do. Learners can live anywhere in the world, in any context. We need to examine the structures that keep the wrong things at the center of education. We need to skip to the end of the book, realize we are the monster at the end of the book, and turn the story around. Learner agency is the only true “innovation” we have left to explore deeply in the education world.

Big (Scary) Education (Retention) Data (Surveillance)

Big data in education might be the savior of our failing learning system or the cement shoes that drag the system to the bottom of the ocean, depending on who you talk to. No matter what your view of big data is, it is here and we need to pay attention to it regardless of our views.

My view? It is a mixture of extreme concern for the glaring problems mixed with hope that we can correct course on those problems and do something useful for the learners with the data.

Yesterday at LINK Lab we had a peek behind the scenes at a data collection tool that UTA is implementing. The people that run the software at UTA are good people with good intentions. I also hope they are aware of the problems already hard coded in the tool (and I suspect they are).

Big Data can definitely look scary for a lot of reasons. What we observed was mostly focused on retention (or “persistence,” which I believe was the friendlier term the software uses). All of the data collected basically turns students into a collection of numbers on hundreds of continuums, and then averages those numbers out to rank them on how likely they are to drop out. To some, this is a scary prospect.

Another scary prospect is the real danger of using that data to see which students to ignore (because they are going to stick around anyways) and which students to focus time and energy on (in order to make the university more money). This would be data as surveillance more than as an educational tool.

Looking at the factors in this data tool that learners are ranked by led to no surprises – we have known from research for a long time what students that “persist” do and what those that don’t “persist” do (or don’t do). The lists of “at risk” students that these factors produce will probably not be much different from the older “at risk” lists that have been around for decades. The main change will be that we will offload the process of producing those lists to the machines, and wash our hands of any bias that has always existed in producing those lists in the first place.

And I don’t want to skip over the irony of spending millions of dollars on big data to find out that “financial difficulties” are the reason that a large number of learners don’t “persist.”

The biggest concern that I see is the amount of bias being programmed into the algorithms. Even the word “persistence” implies certain sociocultural values that are not the same for all learners. Even in our short time looking around in the data collection program, I saw dozens of examples of positivist white male bias hard coded in the design.

For example, when ranking learners based on grades, one measure ranked learners in relation to the class average. Those that fell too far below the class average were seen as having one risk factor for not “persisting.” This is different than looking at just grades as a whole. If the class average is a low B but a learner has a high B, they would be above the class average and in the “okay” zone for “persistence.”
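Here is a rough sketch of that relative-average logic. To be clear, the threshold, names, and grades are all hypothetical; I am not reproducing the actual tool’s algorithm:

```python
# Sketch of a relative-average risk flag: a learner is flagged "at risk"
# only if they fall more than a fixed margin below the class average.
# The margin, names, and grades here are hypothetical illustrations,
# not taken from any real retention tool.
RISK_MARGIN = 10  # points below class average before a flag is raised

def relative_risk_flag(grade, class_average, margin=RISK_MARGIN):
    """Return True if the grade is far enough below the class average."""
    return grade < class_average - margin

grades = {"Amit": 88, "Jo": 70, "Sam": 62}
class_average = sum(grades.values()) / len(grades)  # about 73.3

flags = {name: relative_risk_flag(g, class_average)
         for name, g in grades.items()}

# The high-B student (88) lands in the "okay" zone because they sit
# above the class average -- the algorithm never asks whether that
# learner's own situation treats anything below an A as a crisis.
print(flags)
```

Notice that the only flagged learner is the one far below the average; an absolute-grade measure (or a learner-defined target) would flag differently.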

But that is not how all cultures view grades. My wife is half Indian and half Australian. We have been to India and talked to many people that were under intense stress to get the highest grades possible. It is a huge pressure for many in certain parts of that culture. But even a low A might not register as a troubling signal if the class average is much lower. But to someone that is facing intense pressure to get the best grades or else come home and work in Dad’s business… they need help.

(I am not a fan of grades myself, but this is one area that stuck out to me while poking around in the back end of the data program)

This is an important issue since UTA is designated as a Hispanic-Serving Institution. We have to be careful not to fall into the same traps related to inequalities that education has fallen into for centuries. But as our LINK director Lisa Berry pointed out, this is also why UTA needs to dive into Big Data. If we don’t get in there with our diverse population and start breaking the algorithms to expose where they are biased, who else will? Hopefully there are others, but the point is that we need to get in there and critically ask the hard questions, or else we run the risk of perpetuating educational inequalities (by offloading them to the machines).

For now, a good place to start is by asking the hard questions about privacy and ownership in our big data plan:

Are the students made aware that this kind of data is being collected?

If not, they need to be made aware. Everywhere that data is collected, there should be a notification.

Beyond that, are they given details on what specific data points are being collected?

If not, they need to know that as well. I would suggest a centralized ADA-compliant web page that explains every data point collected in easy to understand detail (with as many translations to other languages as possible).

Can students opt-out of data collection? What about granular control over the data that they do allow to be collected?

Students should be able to opt out of data collection. Each class or point of collection should have permissions. Beyond that, I would say they should be able to say yes or no to specific data points if they want to. Or even beyond that, what about making data collection opt-in?
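As a sketch of what granular, opt-in consent could look like in code (the data-point names and student IDs are made up, not from any real system):

```python
# Sketch of granular, opt-in data-collection consent: every data point
# defaults to "not collected" until the student explicitly opts in.
# Data-point names and student IDs are hypothetical.
from collections import defaultdict

class ConsentRegistry:
    def __init__(self):
        # student_id -> set of data points the student has opted into
        self._opted_in = defaultdict(set)

    def opt_in(self, student_id, data_point):
        self._opted_in[student_id].add(data_point)

    def opt_out(self, student_id, data_point):
        self._opted_in[student_id].discard(data_point)

    def may_collect(self, student_id, data_point):
        # Opt-in by default: the absence of any record means "no."
        return data_point in self._opted_in[student_id]

registry = ConsentRegistry()
registry.opt_in("s001", "lms_logins")

print(registry.may_collect("s001", "lms_logins"))      # student opted in
print(registry.may_collect("s001", "library_visits"))  # no record -> not collected
```

The design choice worth noting is the default: flipping the default from “collected unless you object” to “not collected unless you consent” is exactly the difference between opt-out and opt-in that the question above raises.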

Who owns the students’ data (since it is technically their actions that create the data)?

This may seem radical to some, but shouldn’t the student own their own data? If you say “no,” then they should at least have the right to access it and see what is being collected on them specifically.

Think of it this way: How will the very substantial Muslim population at UTA feel about a public school, tied to the government, collecting all of this data on them? How will our students of color feel about UTA collecting data on them while they are voicing support for Black Lives Matter? How would the child of illegal immigrants feel about each class at UTA collecting data about them that could incriminate their parents?

These issues are some of the hard things we have to wrestle with in the world of Big Data in Education. If we point it towards openness, transparency, student ownership, and helping all learners with their unique sociocultural situations, then it has potential. If not, then we run the risk of turning Big Education Data into Scary Retention Surveillance.

Disruption is No Longer Innovative

How can you tell if an innovator is pulling your leg? Their lips are moving. Or their fingers are typing. I write that knowing fully well that it says a lot about my current title of “learning innovation coordinator.” To come clean about that title: we were allowed to choose our titles to some degree. I chose that one for pure political reasons. I knew that if I wanted to help bring some different ideas to my university (like Domain of One’s Own, Learning Pathways, Wearables, etc), I would need a title beyond something like “instructional technologist” to open doors.

But beyond a few discussions that I have on campus, you will rarely hear me talking about “innovation,” and I reject the title of “innovator” for almost anyone. Really, if you think any technology or idea or group is innovative, put that technology or idea into Google followed by “Audrey Watters” and get ready for the Ed-Tech history lesson the “innovators” tend to forget to tell you about.

In a broad sense, many would say that the concept of “innovation” involves some kind of idea or design or tool or whatever that is new (or at least not previously very “popular”). Within that framework of innovation, disruption is no longer “innovative.” Disruption is really a pretty old idea that gained popularity after the mp3 supposedly “disrupted” the music business and/or the digital camera disrupted the camera industry.

Of course, that is not what happened – mp3s and digital cameras just wrenched some power out of the hands of the gatekeepers of those industries, who then responded by creating the “disruption narrative” (which is what most are referring to when they just say “disruption”). They then proceeded to use that narrative to gain more control over their industries than before (for example, streaming music services). Keep this in mind any time you read someone talking about “disruption” in education. Who is saying it, what do they want it to do, and how much more control do they get over the educational process because of their disruption narrative?

Of course, there is debate over whether disruption is real or not, and both sides have good points. But regardless of whether you believe disruption is real, our current disruption narrative has been around for over two decades now… probably long past the expiration date that gets slapped on any “innovative” idea. If you are still talking disruption, you are not an innovator.

If you want to convince me that you are an innovator, I don’t want to know what cool ideas or toys you have. I want to know who you read and follow. Are you familiar with Audrey Watters? Have you read Gayatri Chakravorty Spivak’s Can the Subaltern Speak? Are you familiar with Adeline Koh’s work on Frantz Fanon? Do you follow Maha Bali on Twitter? If I mention Rafranz Davis and #EdtechBlackout, do I get a blank stare back from you?

If you were to chart the people that influence your thinking – and it ends up being primarily white males… I am not sure how much of an innovator you really are. Education often operates as a “one-size-fits-all” box (or at best, a “one-set-of-ideas-fits-all” box), and that box has mostly been designed by white males. Usually a small set of white males that think all people learn best like they do. How can your idea or technology be that “new” if it is influenced by the same people that influenced all of the previous ones?

So what has this “one-set-of-ideas-fits-all” box created for education? Think tanks and university initiatives that sit around “innovating” things like massive curriculum rethinking, “new” pedagogical approaches, and “creative new applications of a range of pedagogical and social technologies.” They try to come up with the solutions for the learners. Many of these are probably great ideas – but nothing new.

Why not find ways to let the learners set their own curriculum, follow their own pedagogical approaches, or create their own ways of applying technology? Instead of walling ourselves up in instructional design teams, why not talk to the learners themselves and find out what hinders their heutagogical development? Why not look to learners as the instructors, and let them into the design process? Or dump the process and let learners be the designers?

What I am getting at is helping learners create and follow their own learning pathway. Each one will be different, so we need massive epistemological and organizational shifts to empower this diversity. Why not make “diversity” the new “innovative” in education? Diversity could be the future of educational innovation, if it could serve as a way to humanize the learning process. This shift would need people that are already interacting with a diverse range of educators and students to understand how to make that happen.

I would even go as far to say that it is time to enter the “post-innovation” era of Ed-Tech, where any tool or idea is framed based on whether it supports a disruption mindset or a diversity mindset. What does that mean about emerging ideas like big data or wearables? Post-innovation would not be about the tool or the system around it, but the underlying narrative. Does this “thing” support disruption or diversity? Does it keep power with the gatekeepers that already have it, or empower learners to explore what it means for them to be their one unique “human” self in the digital age?

For example, if “big data” is just used to dissect retention rates, and then to find ways to trick students into not dropping out… that is a “disruption” mindset. “We are losing learners/control, so let’s find a way to upend the system to get those learners back!” A diversity mindset looks at how the data can help each individual learner become their own unique, self-determined learner, in their particular sociocultural context: “Based on this data that you gave us permission to collect, we compared it anonymously to other learners, and they were often helped by these suggestions. Do any of these look interesting to you?” Even if the learner looks at these options and rejects all of them, the process of thinking through those options will still help them learn more about their unique learning needs and desires. It will help them celebrate their unique, diverse human self instead of becoming another percentage point in a system designed to trick them into producing better-looking numbers for the powers that be.

This is also a foundational guiding aspect of the dual-layer/learning pathways idea we are working on at the LINK Lab. It is hard to come up with a good name for it, as we are not really looking at it as a “model” but as something that turns the idea of a “model” or “system” inside out, placing each individual learner in the role of creating their own model/pathway/system/etc. In other words, a rejection of “disruption” in favor of “diversity.” We want to embrace how diversity has always been and always will be the true essence of what innovation should have been: each learner defining innovation for themselves.

Personalized Learning Versus Dungeons and Dragons

Personalized learning is popular right now. But is that a good or bad thing? I can buy all kinds of personalized gadgets online, but do I really like or need any of them? If you decided to get me a custom dinner place mat that says “Matt’s Grub” – sure, that is personalized. But it’s also a pretty useless personalized item that I have no interest in.

Many prominent personalized learning programs/tools are a modern educational version of the Choose Your Own Adventure book series from the 1980s. As I have written before, these books offered the promise of a personalized adventure for the reader, which was entertaining for a while. But you were really just choosing from a series of 50 pre-written paths, hoping to pick one of the ones that led to a happy ending. Of course, if you happened to have any physical characteristics that were different from the ones written into the story (I remember a classmate who had shaved his head making fun of one page where the main character did something with his hair – and yes, the stories were sometimes gendered, too), then the “your” in “Choose Your Own Adventure” fell flat.


These eventually evolved into more complex books like the Lone Wolf gamebooks that had you doing your own battles, collecting objects, and other activities that were closer to role playing games.


But let’s face it – the true “Choose Your Own Adventure” scenarios in the 1980s were really role playing games. And few were as personalizable as Dungeons and Dragons.

Now, whether you love or hate D&D, or even still think it is Satanic… please hear me out. D&D, at least in the 80s, was personalizable because it provided different pathways that were scaffolded. New players could start out with the Basic D&D boxset – which came with game rules, pre-designed characters, basic adventures to go on, etc. And that wasn’t even really the starting point. If Basic D&D was too unstructured for you, there were books like the Dragonlance Chronicles or the Shannara series that would give you a completely guided tour of what D&D could look like. Oh, and even a Saturday morning cartoon series if the books were too much for you.

But back to D&D: once you mastered the Basic set, there were more sets (Expert, Companion, Master, and Immortal) – all of which gave you more power and control. Then, when you were ready (or if you found Basic D&D too pre-determined), there was Advanced Dungeons and Dragons. This was a set of books that laid out some basic ideas to create your own characters and worlds and adventures. And you were free to change, modify, add to, or completely re-invent those basics. Many people did, and shared their modifications in national magazines like Dragon Magazine. Oh, and what if you wanted to make your own world but were still unsure? You had a whole range of pre-designed adventures called Dungeon Modules. Just buy one, play, and get inspired to create your own. Or maybe the opposite was true: you were just tired of your creation and wanted to take a break in someone else’s world.


To me, Dungeons and Dragons in the 1980s was a much better metaphor for what personalized learning should look like. You had completely mindless escapism entertainment (aka lectures) when you needed it, like the books and cartoons. You had the structured environment of Basic D&D to guide you through the basics (aka instructivism). You had a series of games and accessories like Dungeon Modules and Companion Sets to guide you (aka scaffold you) to the advanced stage. You had the Advanced books that set a basic structure for creating your own world (aka the Internet). Then you had a network of people sharing ideas and designs to keep new ideas flowing (aka connectivism). Many gamers would go back and forth between these various parts – creating their own world, sharing their ideas in the magazines, playing dungeon modules on occasion, reading the books, and dipping back to basic D&D when the mood hit them.

This scene from The Big Bang Theory shows how players can customize, adapt, and personalize the game experience, even as they play:

Of course, there were problems with the gaming community. It was expensive, and often sexist and/or racist. So I am not painting the Dungeons and Dragons world of the 1980s as some perfect utopia. I am looking at the design of the tools and system here. It is one that in some fashion preceded and informed what we are doing with pathways learning, and one that I think is closer to true “personalization” than what some personalized learning situations offer.

Pokemon Go and the Gimmickification of Education

I almost dread looking at my social media feed today. Pokemon Go (GO? G.O.? (wake me up before you) Go-Go?) received a large bit of media attention this weekend, apparently even already spawning posts about how it will revolutionize education and tweets about how we need what it produces in education:

All I could think about is: how did we get to this point? Every single tech trend turns into a gimmick to sell education mumbo jumbo kitsch tied to every cool, hip trend that pops up on the social media radar. I guess I shouldn’t have been that surprised once blockchain became educational, or Second Life was used to deliver classes, or Twitter replaced LMSs, or MySpace became the University of the future, or DVDs saved public schools, and so on and so forth. I bet at some point thousands of years ago there was a dude in a white toga standing up in an agora somewhere telling Plato how chariots would revolutionize how he taught his students.

I’m all for examining new trends through an educational lens, but every time I just want to say “too far, Ed-Tech®, too far!”

We all know education needs to change. It always has been changing, it always will be, and it will always need to have a critical lens applied to how and why it is changing. But with every new technology trend that gets re-purposed into the next savior of education, I can’t stop this gnawing feeling that our field is becoming a big gimmick to those outside of it.

A gimmick is basically just a trick intended to attract attention. One or two seem harmless enough. Well, not that harmful? But once everything that comes down the pipe starts becoming this trick to get people to look at education, the gimmick gets old. People are still asking what happened to Second Life, to Google Wave, to you name the trend. After a while, they stop buying into the notion that any of us know what we are talking about. Just think of the long-term effect on the larger discourse of so many people declaring so many things to be the savior of education, only to abandon each one after a year or two.

The problem with the hype cycle of Ed-Tech is that it buries the real conversations that have been happening for a long time on whatever the hype du jour is. Do you want the Pokemon Go for education, where students are engaged, active, social, etc.? We already have a thousand projects that have done that to some degree. Those projects just can’t get attention because everyone is saying “Pokemon Go will revolutionize education!” (well, at least those that say that un-ironically – sarcastic commentary that apparently went over many people’s heads not included).

(see also “Pokemon GO is the xMOOC of Augmented Reality“)