
Disruption is No Longer Innovative

How can you tell if an innovator is pulling your leg? Their lips are moving. Or their fingers are typing. I write that knowing full well that it says a lot about my current title of “learning innovation coordinator.” To come clean about that title: we were allowed to choose our titles to some degree, and I chose that one for purely political reasons. I knew that if I wanted to help bring some different ideas to my university (like Domain of One’s Own, Learning Pathways, Wearables, etc.), I would need a title beyond something like “instructional technologist” to open doors.

But beyond a few discussions that I have on campus, you will rarely hear me talking about “innovation,” and I reject the title of “innovator” for almost anyone. Really, if you think any technology or idea or group is innovative, put that technology or idea into Google followed by “Audrey Watters” and get ready for the Ed-Tech history lesson the “innovators” tend to forget to tell you about.

In a broad sense, many would say that the concept of “innovation” involves some kind of idea or design or tool or whatever that is new (or at least not previously very “popular”). Within that framework of innovation, disruption is no longer “innovative.” Disruption is really a pretty old idea that gained popularity after the mp3 supposedly “disrupted” the music business and/or the digital camera disrupted the camera industry.

Of course, that is not what happened – mp3s and digital cameras just wrenched some power out of the hands of the gatekeepers of those industries, who then responded by creating the “disruption narrative” (which is what most are referring to when they just say “disruption”). And then proceeded to use that narrative to gain more control over their industry than before (for example, streaming music services). Keep this in mind any time you read someone talking about “disruption” in education. Who is saying it, what do they want it to do, and how much more control do they get over the educational process because of their disruption narrative?

Of course, there is debate over whether disruption is real or not, and both sides have good points. But regardless of whether you believe disruption is real, our current disruption narrative has been around for over two decades now… probably long past the expiration date that gets slapped on any “innovative” idea. If you are still talking disruption, you are not an innovator.

If you want to convince me that you are an innovator, I don’t want to know what cool ideas or toys you have. I want to know who you read and follow. Are you familiar with Audrey Watters? Have you read Gayatri Chakravorty Spivak’s Can the Subaltern Speak? Are you familiar with Adeline Koh’s work on Frantz Fanon? Do you follow Maha Bali on Twitter? If I mention Rafranz Davis and #EdtechBlackout, do I get a blank stare back from you?

If you were to chart the people that influence your thinking – and it ends up being primarily white males… I am not sure how much of an innovator you really are. Education often operates as a “one-size-fits-all” box (or at best, a “one-set-of-ideas-fits-all” box), and that box has mostly been designed by white males. Usually a small set of white males that think all people learn best like they do. How can your idea or technology be that “new” if it is influenced by the same people that influenced all of the previous ones?

So what has this “one-set-of-ideas-fits-all” box created for education? Think tanks and university initiatives that sit around “innovating” things like massive curriculum rethinking, “new” pedagogical approaches, and “creative new applications of a range of pedagogical and social technologies.” They try to come up with the solutions for the learners. Many of these are probably some great ideas – but nothing new.

Why not find ways to let the learners set their own curriculum, follow their own pedagogical approaches, or create their own ways of applying technology? Instead of walling ourselves up in instructional design teams, why not talk to the learners themselves and find out what hinders their heutagogical development? Why not look to learners as the instructors, and let them into the design process? Or dump the process and let learners be the designers?

What I am getting at is helping learners create and follow their own learning pathway. Each one will be different, so we need massive epistemological and organizational shifts to empower this diversity. Why not make “diversity” the new “innovative” in education? Diversity could be the future of educational innovation, if it could serve as a way to humanize the learning process. This shift would need people that are already interacting with a diverse range of educators and students to understand how to make that happen.

I would even go so far as to say that it is time to enter the “post-innovation” era of Ed-Tech, where any tool or idea is framed based on whether it supports a disruption mindset or a diversity mindset. What does that mean for emerging ideas like big data or wearables? Post-innovation would not be about the tool or the system around it, but the underlying narrative. Does this “thing” support disruption or diversity? Does it keep power with the gatekeepers that already have it, or empower learners to explore what it means for them to be their own unique “human” self in the digital age?

For example, if “big data” is just used to dissect retention rates, and then to find ways to trick students into not dropping out… that is a “disruption” mindset. “We are losing learners/control, so let’s find a way to upend the system to get those learners back!” A diversity mindset looks at how the data can help each individual learner become their own unique, self-determined learner, in their particular sociocultural context: “Based on this data that you gave us permission to collect, we compared it anonymously to other learners, and they were often helped by these suggestions. Do any of these look interesting to you?” Even if the learner looks at these options and rejects all of them, the process of thinking through those options will still help them learn more about their unique learning needs and desires. It will help them celebrate their unique, diverse human self instead of becoming another percentage point in a system designed to trick them into producing better-looking numbers for the powers that be.

This is also a foundational guiding aspect of the dual-layer/learning pathways idea we are working on at the LINK Lab. It is hard to come up with a good name for it, as we are not really looking at it as a “model” but something that turns the idea of a “model” or “system” inside out, placing each individual learner in the role of creating their own model/pathway/system/etc. In other words, a rejection of “disruption” in favor of “diversity.” We want to embrace how diversity has been and always will be the true essence of what innovation should have been: each learner defining innovation for themselves.

[Image: the Dungeons & Dragons Basic Set]

Personalized Learning Versus Dungeons and Dragons

Personalized learning is popular right now. But is that a good or bad thing? I can buy all kinds of personalized gadgets online, but do I really like or need any of them? If you decided to get me a custom dinner place mat that says “Matt’s Grub” – sure, that is personalized. But it’s also a pretty useless personalized item that I have no interest in.

Many prominent personalized learning programs/tools are a modern educational version of the Choose Your Own Adventure book series from the 1980s. As I have written before, these books offered the promise of a personalized adventure for the reader, which was entertaining for a while. But you were really just choosing from a series of 50 pre-written paths, hoping to pick one of the ones that led to a happy ending. Of course, if you happened to have any physical characteristics that were different than the ones written into the story (I remember a classmate that had shaved his head making fun of one page that had the main character doing something with his hair – and yes, they were sometimes gendered stories as well), then the “your” in “Choose Your Own Adventure” fell flat.

[Image: Choose Your Own Adventure books]

These eventually evolved into more complex books like the Lone Wolf gamebooks that had you doing your own battles, collecting objects, and other activities that were closer to role playing games.

[Image: Lone Wolf gamebooks]

But let’s face it – the true “Choose Your Own Adventure” scenarios in the 1980s were really role playing games. And few were as personalizable as Dungeons and Dragons.

Now, whether you love or hate D&D, or even still think it is Satanic… please hear me out. D&D, at least in the 80s, was personalizable because it provided different pathways that were scaffolded. New players could start out with the Basic D&D box set – which came with game rules, pre-designed characters, basic adventures to go on, etc. And that wasn’t even really the starting point. If Basic D&D was too unstructured for you, there were books like the Dragonlance Chronicles or the Shannara series that would give you a completely guided tour of what D&D could look like. Oh, and there was even a Saturday morning cartoon series if the books were too much for you.

But back to D&D: once you mastered the Basic set, there were more sets (Expert, Companion, Master, and Immortal) – all of which gave you more power and control. Then, when you were ready (or if you found Basic D&D too pre-determined), there was Advanced Dungeons and Dragons. This was a set of books that laid out some basic ideas for creating your own characters and worlds and adventures. And you were free to change, modify, add to, or completely re-invent those basics. Many people did, and shared their modifications in national magazines like Dragon Magazine. Oh, and what if you wanted to make your own world but were still unsure? You had a whole range of pre-designed adventures called Dungeon Modules. Just buy one, play, and get inspired to create your own. Or maybe the opposite was true: you were just tired of your creation and wanted to take a break in someone else’s world.

[Image: Advanced Dungeons & Dragons books]

To me, Dungeons and Dragons in the 1980s was a much better metaphor for what personalized learning should look like. You had completely mindless escapism entertainment (aka lectures) when you needed it, like the books and cartoons. You had the structured environment of Basic D&D to guide you through the basics (aka instructivism). You had a series of games and accessories like Dungeon Modules and Companion Sets to guide you (aka scaffold you) to the advanced stage. You had the Advanced books that set a basic structure for creating your own world (aka the Internet). Then you had a network of people sharing ideas and designs to keep new ideas flowing (aka connectivism). Many gamers would go back and forth between these various parts – creating their own world, sharing their ideas in the magazines, playing dungeon modules on occasion, reading the books, and dipping back to basic D&D when the mood hit them.

This scene from The Big Bang Theory shows how players can customize, adapt, and personalize the game experience, even as they play:

Of course, there were problems with the gaming community. It was expensive, and often sexist and/or racist. So I am not painting the Dungeons and Dragons world of the 1980s as some perfect utopia. I am looking at the design of the tools and system here. It is one that in some fashion preceded and informed what we are doing with pathways learning, and one that I think is closer to true “personalization” than what some personalized learning situations offer.


Pokemon Go and the Gimmickification of Education

I almost dread looking at my social media feed today. Pokemon Go (GO? G.O.? (wake me up before you) Go-Go?) received a good bit of media attention this weekend, apparently already spawning posts about how it will revolutionize education and tweets about how we need what it produces in education:

All I could think about is: how did we get to this point? Every single tech trend turns into a gimmick to sell educational mumbo jumbo kitsch tied to every cool, hip trend that pops up on the social media radar. I guess I shouldn’t have been that surprised once blockchain became educational, or Second Life was used to deliver classes, or Twitter replaced LMSs, or MySpace became the University of the future, or DVDs saved public schools, and so on and so forth. I bet at some point thousands of years ago there was a dude in a white toga standing up in an agora somewhere telling Plato how chariots would revolutionize how he taught his students.

I’m all for examining new trends through an educational lens, but every time I just want to say “too far, Ed-Tech®, too far!”

We all know education needs to change. It always has been changing, it always will, and will always need to have a critical lens applied to how and why it is changing. But with every new technology trend that gets re-purposed into the next savior of education, I can’t stop this gnawing feeling that our field is becoming a big gimmick to those outside of it.

A gimmick is basically just a trick intended to attract attention. One or two seem harmless enough – well, mostly harmless. But once everything that comes down the pipe becomes this trick to get people to look at education, the gimmick gets old. People are still asking what happened to Second Life, to Google Wave, to you-name-the-trend. After a while, they stop buying into the notion that any of us know what we are talking about. Just think of the long-term effect on the larger discourse of so many people declaring so many things to be the savior of education, only to abandon each one after a year or two.

The problem with the hype cycle of Ed-Tech is that it buries the real conversations that have been happening for a long time around whatever the hype-du-jour is. Do you want the Pokemon Go for education, where students are engaged, active, social, etc.? We already have a thousand projects that have done that to some degree. Those projects just can’t get attention because everyone is saying “Pokemon Go will revolutionize education!” (well, at least those that say that un-ironically – sarcastic commentary that apparently went over many people’s heads not included).

(see also “Pokemon GO is the xMOOC of Augmented Reality“)


Evolution of the Dual-Layer/Customizable Pathways Design

For the past few weeks, I have been considering what recent research has to say about the evolution of the dual-layer aka customizable pathways design. A lot of this research is, unfortunately, still locked up in my dissertation, so it’s time to get to publishing. But until then, here is a rundown of where it has been and where it is going.

The original idea was called “dual-layer” because the desire was to create two learning possibilities within a course: one that is a structured possibility based on the content and activities that the instructor thinks are a good way to learn the topic, the other an unstructured possibility designed so that learners can create their own learning experience as they see fit. I am saying “possibility” where I used to say “modality.” Modality really serves as the best description of these possibilities, since modality means “a particular mode in which something exists or is experienced or expressed.” The basic idea is to provide an instructivist modality and a connectivist modality. But modality seems to come across as too stuffy, so I am still looking for a cooler term to use there.

The main consideration for these possibilities is that they should be designed as part of the same course in a way that lets learners switch back and forth between them as needed. Many ask: “why not just design two courses?” You don’t want two courses, as that could impede changing modalities, as well as create barriers to social interactions. The main picture that I have in my head to explain why this is so is a large botanical garden, like this:

[Image: a scene at the Fort Worth Botanic Gardens]

There is a path there for those that want to follow it, but you are free to veer off and make your own path to see other things from different angles or contexts. But you don’t just design two gardens, one that is just a pathway and one that is just open fields. You design both in one space.

So in other words, you design a defined path (purple line below) and then connect with opportunities to allow learners to look at the topic from other contexts (gold area below):

[Image: the defined path (purple line) surrounded by other contexts (gold area)]

You have a defined modality (the path), and then open up ways for people to go off the path into other contexts. When allowed to mix the two, the learner would create their own customized pathway, something like this:

[Image: a customized pathway weaving between the defined path and other contexts]

The problem with the image above is that this really shouldn’t only be about going off the walkway in the garden to explore other contexts. Learners should be allowed to dig under the walkway, or fly above it. They should be able to dig deeper or pull back for a bird’s eye view as needed. So that would take the model into a three dimensional view like this:

[Image: a three-dimensional view of the pathways model]

(please forgive my lack of 3-D modeling skills)

Learners can follow the instructor’s suggested content or go off in any direction they choose, and then change to the other modality at any moment. They can go deeper, into different contexts, deeper in different contexts, bigger picture, or bigger picture in different contexts.

The problem that we have uncovered in using this model in DALMOOC and HumanMOOC is that many learners don’t understand this design. Many do understand and appreciate the choice… but some don’t want to get bogged down in the design choices. Some that choose one modality don’t understand why the other modality needs to be in the course (while some that have chosen that “other” modality wonder the same thing in reverse). So really, all of this talk of pathways and possibilities and modalities probably needs to be relegated to an “instructional design” / “learning experience design” / “whatever you like to call it design” method. There are ways to tie this idea together into a cohesive learning experience through goal posts, competencies, open-ended rubrics, assignment banks, and scaffolding. Of course, scaffolding may be a third modality that sits between the other two; I’m not totally sure if it needs to be its own part or a tool in the background. Or both.

The goal of this “design method” would be to create a course that supports learners that want instructor guidance while also getting out of the way of those that want to go at it on their own. All while also recognizing that learners don’t always fall into those two neat categories. They may be different mixtures of both at any given moment, and they could change that mixture at any given point. The goal would be to design in a way that gives the learner what they need at any given point.

Of course, I think the learner needs to know that they have this choice to make. However, going too far into instructivism vs. connectivism or structured vs. unstructured can get some learners bogged down in educational theory that they don’t have time for. We need to work on a way to decrease the design presence so learners can focus on making the choices to map their learning pathway.

So the other piece to the evolution of this pathways idea is creating the tools that allow learners to map their pathway through the topic. What comes to mind is something like Storify, re-imagined as an educational mapping tool in place of an LMS. What I like about Storify is the simple interface, and the way you can pull up a whole range of content on the right side of the page to drag and drop into a flowing story on the left.

[Image: the Storify interface]

I could imagine something like this for learners, but with a wide range of content and tools (prescribed by the instructor, the learner, other learners, and pulled from other places on the Internet) on the right side. Learners would drag and drop the various aspects they want into a “pathway” through the week’s topic on the left side. The parts that they pull together would contain links or interactive parts for them to start utilizing.

For example, the learner decides to look at the week’s content and drags the instructor’s introductory video into their pathway. Then they drag in a widget with a Wikipedia search box, because they want to look at bigger picture on the topic. Then they drag a Twitter hashtag to the pathway to remind themselves to tweet a specific question out to a related hashtag to see what others say. Then they drag a blog box over to create a blog post. Finally, they look in the assignment bank list and drag an assignment onto the end of the pathway that they think will best prove they understand the topic of the week.

The interesting thing about this possible “tool” is that after creating the map, the learner could then create a graph full of artifacts of what they did to complete the map. Say they get into an interesting Twitter conversation. All of those tweets could be pulled into the graph to show what happened. Then, let’s say their Wikipedia search led them to read some interesting blog posts by experts in the field. They could add those links to the graph, as well as point out the comments they made. Then, they may have decided to go back and watch the second video that the instructor created for the topic. They could add that to the graph. Then they could add a link to the blog post they created. Finally, they could link to the assignment bank activity that they modified to fit their needs. Maybe it was a group video they created, or whatever activity they decided on.

In the end, the graph that they create itself could serve as their final artifact to show what they have learned during the course. Instead of a single “gotcha!” test or paper at the end, learners would have a graph that shows their process of learning. And a great addition to their portfolios as well.
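To make the map-then-graph idea above concrete, here is a minimal sketch in Python of what the underlying data model for such a tool might look like. This is purely illustrative – the class names (`PathwayElement`, `LearningPathway`) and their fields are my own invention, not part of Storify or any real product:

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class PathwayElement:
    """One item dragged into the learner's pathway (video, hashtag, assignment, etc.)."""
    kind: str    # e.g. "video", "wikipedia-search", "twitter-hashtag", "blog", "assignment"
    label: str
    artifacts: List[str] = field(default_factory=list)  # evidence added after the fact


@dataclass
class LearningPathway:
    """The learner-built map for one week's topic, plus the artifact graph."""
    topic: str
    elements: List[PathwayElement] = field(default_factory=list)

    def add(self, element: PathwayElement) -> None:
        """Drag an element onto the end of the pathway."""
        self.elements.append(element)

    def attach_artifact(self, label: str, artifact: str) -> None:
        """Attach a record of what actually happened (a tweet thread, a blog link...)."""
        for el in self.elements:
            if el.label == label:
                el.artifacts.append(artifact)

    def graph(self) -> List[Tuple[str, str]]:
        """Flatten map + artifacts into (element, artifact) pairs - the final showcase."""
        return [(el.label, art) for el in self.elements for art in el.artifacts]


# The flow from the text: build a week's map, then collect artifacts as learning happens.
week = LearningPathway(topic="Week 3 topic")
week.add(PathwayElement("video", "Instructor intro video"))
week.add(PathwayElement("twitter-hashtag", "Question to related hashtag"))
week.attach_artifact("Question to related hashtag", "link-to-tweet-conversation")
print(week.graph())  # [('Question to related hashtag', 'link-to-tweet-conversation')]
```

The design choice worth noting is that artifacts attach to map elements after the fact, so the same structure serves first as a plan and later as the process-of-learning showcase described above.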

Ultimately, these maps and graphs would need to be something that resides on each learner’s personal domain, connecting with the school domain as needed to collect map elements.

When education is framed like this, with the learner on top and the system underneath providing support, I also see an interesting context for learning analytics. Just think of what it could look like if, for instance, instead of tricking struggling learners into doing things to stay on “track” (as defined by administrators), learning analytics could provide helpful suggestions for learners (“When other people had problems with _____ like you are, they tried these three options. Interested in any of them?”).
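As a toy illustration of that suggestion-style (rather than intervention-style) analytics, here is a hedged sketch of what such a function might do: compare an opted-in learner’s struggle area against anonymized records of what helped peers, and offer – never impose – a short list of options. The function name, topics, and data here are all hypothetical:

```python
def suggest_options(struggle_topic, peer_history, max_suggestions=3):
    """Return up to max_suggestions strategies that helped anonymized peers
    with the same topic, ranked by how often each one helped."""
    helped = {}
    for record in peer_history:
        if record["topic"] == struggle_topic and record["helped"]:
            helped[record["strategy"]] = helped.get(record["strategy"], 0) + 1
    # Most frequently helpful first; the learner is free to reject all of them.
    ranked = sorted(helped, key=helped.get, reverse=True)
    return ranked[:max_suggestions]


# Hypothetical anonymized records of what other learners tried and whether it helped.
peer_history = [
    {"topic": "citations", "strategy": "library workshop", "helped": True},
    {"topic": "citations", "strategy": "peer review swap", "helped": True},
    {"topic": "citations", "strategy": "library workshop", "helped": True},
    {"topic": "statistics", "strategy": "tutoring center", "helped": True},
    {"topic": "citations", "strategy": "extra lecture", "helped": False},
]

# "When other people had problems with ____ like you are, they tried these options."
print(suggest_options("citations", peer_history))
# ['library workshop', 'peer review swap']
```

The point of the sketch is the framing: the output is a menu presented to the learner, not a trigger for automated nudges to keep them on an administrator-defined “track.”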

Of course, I realize that I am talking about upending an entire system built on “final grades” by instead focusing on “showcasing a learning process.” I can’t say this change will ever occur, but for now this serves as a brief summary of where the idea of customizable pathways has been, where it is now (at least in my head; others probably have different ideas that are pretty interesting as well), and where I would like for it to go (even if only in my dreams).


Depressing Confessions of a “Newly Minted” Ph.D.

I have been struggling with this blog post for much longer today than I probably should admit. Lots of people ask you what you are going to do “now that you have a Ph.D.” And the truth is, I really don’t know. I currently work in a nice position that requires Ph.D.-level work, so it’s not like I am in a hurry to change things. But it is also a position that requires me to determine what I want to research, so staying put or looking elsewhere leaves me with the same confusion over “what’s next?” either way.

But why do I feel so confused over the future? This line from Jim Groom’s recent post seemed to finally clarify my hang-up:

“a bunch of folks who have been, for the most part, marginalized by the ed-tech gold rush for MOOCs, big data, analytics, etc—a push for the last four years that dominated the field’s time, energy, and precious few resources.”

There are interesting things happening in those “gold rush” areas, and also some concerning things. But our field, overall, does have a “cool” crowd and a “not so cool” crowd. If you are not currently into analytics, wearables, and a few other hot topics… you are usually left in the margins. I’m not sure if marginalized is the best word, but maybe… toiling in obscurity? For example, even bad ideas in analytics get more attention, more funding, more awards, etc, than great ideas in more obscure fields like instructional design, learning theory, etc.

That is not to slam analytics or wearables or whatever as a whole. There are some great ideas there. As Vitomir Kovanovic stated today:

The “gold rush” often focuses on the “bad ones” because they can get something out there quicker. As George Siemens wisely pointed out:

So for a lot of these “hot topics,” I don’t hate them as much as see them having a long waiting period to mature into something practical. In the meantime, the instructional designer in me knows of practical ideas that can be used right now to make a dent in things.

But the depressing truth is that these ideas will mostly always be kicking around on the fringes. When people like Mike Caulfield complain about feeling obscure, and his ideas are a hundred times more popular than the ones I am interested in… it doesn’t make one want to sign up for years and years of fringe work.

Personally, I think the idea of “thought leader” is a bit along the lines of “rock star.” Others see it differently, and that is fine with me. But “thought leaders” are still part of the cool crowd, whereas “thought synthesizers” tend to get left out of the conversation frequently. Most of the really interesting things that I like to work on, like customizable pathways design, are not really the result of “thought leaders” as much as “thought synthesizers.”

So the problem is: should I throw my lot in with the cool kids and do things that I am maybe-kind-of interested in, or follow my passions into obscurity?

To be honest, I don’t really know. I am technically already in obscurity, so nowhere to go but up, right? A lot of this is not really about me, but about the ideas that I think have great potential. They are also, unfortunately, ill-defined, poorly worded (too many syllables, which I say in all seriousness and not flippantly), not sexy, not flashy, not cool. I could very easily hitch my wagon to some ideas that are cool-sounding and sexy. Someone sent me a link to a university that was looking for a Professor of Game-Based Learning that they thought I would be a good fit for. Sounds fun, flashy, hip, etc. But it was also in Texas, and let’s face it: Texas is not a great place to live (sorry if you think it is). And they pay academics poorly. I just found out this week that I could get a raise if I went to teach high school down the street. Not interested in that at all, but… ouch.

I also don’t know if I could spend all day teaching game-based learning. Not my passion. You see, I went to get a Ph.D. as a frustrated instructional designer that couldn’t get a foot in the research door because I wasn’t a professor. I wanted to follow my passions into researching ideas that made a practical difference (like many other Ph.D. students, I am sure). That was five years ago, and the general state of academia has declined rapidly since then. I’m hardly enthusiastic to jump on the tenure track when it is such a minefield. That is, if I can even get on the tenure track – difficult at best in the current university climate.

Oh, and now in many states students could be packing heat. So, yay safety.

So now that my pity party has dragged on forever and will probably cost me the 6 readers I get for any post (WordPress stats are depressing as well), I leave anyone still reading this my depressing confession: if you get a Ph.D., you may end up finding yourself at a crossroads, choosing between your passions and what will actually get you somewhere. If your passions line up with the cool crowd, you are lucky; if they don’t, you have a hard choice to make. I can’t tell you which one I will make. Obviously, I will be choosing very soon. But do I really want to push off in the opposite direction of the stream of hip ideas that have “dominated the field’s time, energy, and precious few resources”? It’s hard to say. But it is an important question to ask oneself.

[Image: “The Kids Don’t Stand a Chance”]

Reclaim the Front Page of Your Learning Experience for #IndieEdTech

One of the most contested areas in online learning is what I sometimes call the “front page” – the user interface or splash page or whatever main area learners first see when they start a course/learning experience/etc. (usually also the main area they have to come back to every time). Schools want to control the “front page” learners see first in their class (usually the learning management system they paid big money for). Ed-Tech companies want to control the “front page” learners see when they use their product. Non-educational websites that get used in education, like Twitter or Facebook, also want to control the “front page” of what users see. Of course, the average learner uses many of these services and has to navigate through many tools that are all trying to control what they see while they learn – to control the “front page” of their learning experience.

The “front page” is how companies gather data for analytics so they can monetize users. Think back to the major changes between MySpace and Facebook. As horrible as MySpace could look at times, users could insert CSS and control all manner of aspects of their front page. That control was a good thing, despite the eyesores it created from time to time. How can a company monetize a MySpace user page when users can completely remove portions of the page? How can a company monetize interactions when users rarely have to leave their “space” to interact with others? The changes between MySpace and Twitter/Facebook resolved a lot of those issues, and hence created the battle for the “front page” of users’ internet experience.

This may not seem like a big deal to many, but as we have been researching learner agency by giving learners modality choice in a customizable modality pathway design (aka “dual-layer”), the “front page” becomes a very, very important space that affects learner choice in major ways. The tool that learners begin a learning experience in becomes the place they are comfortable with, and they resist venturing past the “front page” of that tool. You might have run into this problem with, say, introducing Twitter into a course taught in Blackboard. Many learners start to complain that the Bb forums would work just fine. There is a stickiness to the front page that keeps learners in there and away from other tools.

Shouldn’t the learner be in control of this “front page”? Shouldn’t this “front page” display their map of what they want to learn? Shouldn’t the tools and content and things they want to learn with/from support this map, linking from the learner’s “front page” rather than competing with it?

This is pretty much the big problem we run into with the customizable modality pathway design. The “front page” control battle segments the learning process, pulls learners away, makes them comfortable with giving up control of that space, and enforces the status quo of instructor-controlled learning. Up to this point, we have been working on design and structure – all of which is, for better or worse, coalescing into a design theory/method of some kind. However, the technology is simply in the way most of the time, mainly because very few tools actually work to give the learner control. They mostly all attempt to put their tool in control, and by extension, the person (instructor and school admins) behind the tool in control as well.

In many ways, I think this issue connects to the Indie Ed-Tech movement. I’ll be blogging more about that over the next few weeks/months. I’ll need to cover how the technology that allows learners to reclaim the “front page” of their learning experience could look – turning the idea of a “Neutral Zone” into a learning map that learners build (and then connect artifacts to create a “portfolio” map of what they did). This will allow learners to mix and match what tools, services, courses, etc. they learn from, leading to alternative ways to prove/certify that they have specific knowledge and skills, fueled by owning their own domain, APIs, cool stuff, etc. Of course, since any work in Indie Ed-Tech needs a music reference, I will be taking up Adam Croom’s challenge for someone to write about grunge rock (not my favorite style of music – Audrey Watters already used that one – but a good genre to represent the angle I am going for). So, yes, I have a good dozen blog posts in mind already, so time to get cracking.

(image credit: “The Kids Don’t Stand a Chance” by Nina Vital)


Will The “Best” Best Practice Please Step Forward?

Whenever educational discussions turn towards student agency, learner-centered learning, and other less-utilized (non-instructivist) strategies, several common questions/concerns are raised about going this route. One of the more important ones is how do we put learners in control when there are so many learning mediums? How do we pick which one is best?

This is a great question. We should always strive towards what is best for our learners. The problem comes not really from the question itself, but from the assumption behind it: that one or a few mediums are “best” and that we as educators can pick correctly for all learners at all times.

“Best practices” is a term commonly used in this context, and a problematic concept for many reasons. One of the bigger problems being that “best” is not really an objective line in the sand. What is “best” is constantly changing based on context, goals, preferences, and many other factors.

For example, different learning modalities each have their own set of best practices. Do you want a stereotypical instructor-focused course with lectures and quizzes? There are many ways to do that correctly, and many ways to do that incorrectly. Very incorrectly.

Do you want problem-based learning? Our field knows a lot on how to do that correctly, and a lot on how to do that incorrectly. There is also a lot we don’t know. And all of that changes drastically if you want, say, a well-defined contextually specific problem versus an ill-structured problem.

Other modalities (connectivist, cognitivist, social, independent, etc) have their own set of best practices, and each set of best practices changes within each modality depending on what flavor of that modality you are choosing. And even then there are still so many best practices that it really dilutes the term “best practice” down to “do the good stuff and avoid the bad stuff and be cautious with all of the stuff that we aren’t sure where it fits.”

Of course, sometimes when we say “best”, we are referring to choosing the “best” overall modality for a course, or even better, a given module inside a course. Anyone that has taught will know that once you choose a modality, half your learners will like it, and the other half will complain: “Why do we have to do group work? Why can’t you just tell us what to do?” “Why do we have to listen to you tell us what to do? Why can’t we just go do it on our own?” “Why can’t I have a group to help me?” and so on (even if you don’t hear them, you know they are happening in your learners’ heads.)

The truth is that different learners need different modalities for different topics at different times, sometimes even changing from one day to the next based on a whole range of internal and external reasons.

This means that the best device for choosing the best modality for any given learner at any given time is the learner themselves.

This whole post was inspired by a few tweets today that I think sum up nicely what I am really getting at:

The general idea is that our education needs to shift towards teaching learners how to learn, how to adapt, how to choose their own modality as they learn. We need to focus more on how to be learners and not just what facts and skills to learn. You know – teach a person to fish and all that. This is the basis of heutagogy – the process of learning how to learn, how to adapt, how to self-regulate towards self-determined learning.

In other words, how do we get back to putting the human at the center of the educational process instead of our favorite tools and modalities?

One practical way some are working on this idea is the customizable modality pathway learning design (my term du jour for what we used to call dual-layer). Shameless plug warning! Last week I was able to successfully defend my dissertation on this idea (and there was much rejoicing!). So hopefully after a few months of revisions and edits I will soon be able to start publishing the results on how diverse and personalized learners’ pathways are once they are given the choice. The educational field in general so rarely gives much true learner choice or agency that the outcome of enabling that choice is pretty eye-opening.


Can the Students Speak for Themselves?

The answer is, yes, of course students can speak for themselves. The real question is will we listen to them, and even start including them in the conversation about their own educational experiences? This is not just a question for the established educational power systems that we typically associate with ignoring the student voice, but also for the educational reformers that seek to change those entrenched structures.

Recently I have been digging more into the work of the Indian-born philosopher Gayatri Chakravorty Spivak. Possibly one of her best known works is “Can the Subaltern Speak?”, an eye-opening critique of the post-colonial movement. For those that haven’t read Spivak, I would recommend Benjamin Graves’ extended one-paragraph review of “Can the Subaltern Speak?” as a quick introduction.

The basic concern is that those who wish to help the subaltern (the economically dispossessed) gain their voice are still forcing them to adopt one voice for the entire group, ignoring the differences that exist within that group. In other words, the post-colonialists are becoming a different type of colonialist. This leads to two problems: “1) a logocentric assumption of cultural solidarity among a heterogeneous people, and 2) a dependence upon western intellectuals to “speak for” the subaltern condition rather than allowing them to speak for themselves.” Sound familiar?

What if you replace “subaltern” with “student”? How about replacing “cultural solidarity” with “connectivism”? What about the recent claims that scaffolding is colonialist in nature? Pretty much insert any modern educational reformer’s idea that there are absolute good and bad solutions for all learners: “if we can just convince all learners that connectivism is good and that scaffolding is oppressive, we can improve education!”

But what if we are forcing learners to take on epistemological solidarity when they are actually a very heterogeneous group? What would they say about that if we listened to them when they speak for themselves?

We would find out that some learners want to follow the instructor. We would find out that some want to follow their own path. We would find out that many want both, just at the time of their own choosing. We would find that some love connectivism, while others find it inefficient and pointless. We would find that some hate scaffolding, while others think it is necessary. While scaffolding might be oppressive to some, it could be supporting or liberating to others. Or it could be both at different times to the same learner. Contexts shift. People change their minds.

These are not speculations. This is based on what learners have stated in the research for my dissertation. Learners are all over the map once you give them true choice, true personalization.

Which takes me to my problem with what many call personalized learning. Those of a certain age will remember the Choose Your Own Adventure book series. The basic idea of this book series was that the stories were not presented as a singular, linear path. Readers would read a few pages and then be presented with options. They would choose an option and turn to that corresponding page for that option, and so on until the adventure ended. Usually it ended poorly or kind of neutrally, but the goal was to keep trying until you arrived at one of the “good” endings. There were generally about 12-40 full story lines in each book to mix and match.

Most people that read these books developed a strategy of gaming the story lines, usually by bookmarking the last few choices with various fingers. If one choice led to death, just back up a step or two and try again.

The reality was that these were less “Choose Your OWN Adventure,” as much as “Choose One of 40 or so Pre-Determined Pathways to Entertain You With the Illusion of Choice.” This is also the premise of many (but not all) personalized learning systems. The programmers create a pre-determined set of options, and the learner has the illusion of “choice” and “personalization” as they choose various pre-programmed scenarios.
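To make that “illusion of choice” concrete, here is a toy sketch (a hypothetical example of my own, in Python – the page numbers and structure are invented, not from any actual book) of a Choose Your Own Adventure book as a branching graph. However the reader chooses, every possible pathway was already written in advance:

```python
# A hypothetical CYOA book as a branching graph: each page maps to the
# pages a reader can jump to next; an empty list marks an ending.
book = {
    "p1": ["p4", "p7"],
    "p4": ["p10", "p12"],
    "p7": ["p12", "p15"],
    "p10": [], "p12": [], "p15": [],
}

def all_pathways(page, path=()):
    """Enumerate every complete story line reachable from a page."""
    path = path + (page,)
    nexts = book[page]
    if not nexts:  # reached an ending – this path is one full story line
        return [path]
    pathways = []
    for nxt in nexts:
        pathways.extend(all_pathways(nxt, path))
    return pathways

# Every possible "adventure" can be listed before the reader ever opens
# the book – the choices only select among pre-written pathways.
paths = all_pathways("p1")
```

Walking the graph enumerates every story line up front – the reader can only select among them, never create a new one, which is exactly how many “personalized” learning systems pre-program their choices.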

To me, true personalized learning would allow learners to speak for themselves, while not forcing them to follow one person’s view of the “correct” way to learn. True personalized learning would treat learners as an epistemologically heterogeneous group, giving them the ability to speak for their own personal epistemology.

Because the bigger problem is that when the experts come in and say “connectivism is good, scaffolding is bad, here are the ways you are going to connect with others”, they are really just creating a form of neo-instructivism that still forces learners to follow what the expert at the front says to do (even though it may be pre-prescribed connected learning).

These neo-instructivist connected learning activities are not hypothetical – they currently exist in online courses. Learners are told to go write their own blog and then comment on three other blogs in order to pass. Or compose a tweet and then respond to three other tweets. Or post a picture on Instagram and then comment on three other pictures on Instagram.

Sure, that is connected learning and research tells us that learners will retain more because they applied it while connecting to others. But where is the student voice in forcing them to all have a blog and then forcing them to comment and interact (or else don’t pass the course you took out a big loan for)?

Or what of the instructor that doesn’t provide any guidance and just dives into student-centered learning… whether the learners want it or not? Where is the student voice in that pre-determined student-centered design?

Sure, these instructors will win awards and be praised all over the Twitter-sphere for innovative, connectivist learning. For fighting instructivist colonialism. And so on. But what if these post-instructivist crusaders are causing the same damage to learning that Gayatri Chakravorty Spivak saw the post-colonialist crusaders causing? What if we are mistaking a statistically significant research result for the lone “voice” of what works for all learners at all times?


People are Not Generalizable Cogs in a Wheel

One of the issues that we are trying to get at with dual-layer/customizable pathways design is that human beings are individuals with different needs and ever-changing preferences.

That seems to be an obvious statement to many, but a problematic one when looking at educational research. Or more correctly, how we use and discuss research in practical scenarios.

For example, whenever I mention how instructivism and connectivism can also be looked at as personal choices that individual learners prefer at different times, the response from educators is usually to quote research generalizations as if they are facts for all learners at all times:

More advanced learners prefer connectivism.
People that lack technical skills are afraid to try social learning.
Learners with higher levels of self-regulation hate instructivism.
Students that are new to a topic need instructor guidance.
Student-centered learning makes learners think more in depth.

While many of these statements are true for many people, the thing we often skip over in education is that these concepts are actually generalized from research. It is not the case that these concepts are true for all learners, but that they have been generalized from a statistically significant correlation. That distinction is important (and often ignored) – because studies rarely find that these concepts are 100% true for 100% of the learners 100% of the time.

But practitioners typically read these generalizations and then standardize them for all learners. We lose sight of the individual outliers that are not included in those numbers (and even of the fact that in the data there are variations that get smoothed over in the quest for “generalization”).

Then, of course, we repeat those experiments with different groups and rarely check to see if those outliers in the new experiment are different types of people or the same.

We also rarely research courses where learners have true choice in the modality that they engage the course content, so do we ever truly know if we are finding the best options for learning in general, or if we are just finding out what learners will do to make the best out of being forced to do something they would rather not?

Are we losing sight of the individual, the unique person at the center of educational efforts?

My research is finding that, when given the freedom to choose their learning modality (instructivism or connectivism), learners stop falling into the neat categories that often come out of research. For example, those that are advanced learners with high self-regulation and well-developed tech skills will sometimes prefer to follow an instructivist path for a variety of reasons. Or, for another example, sometimes learners have already thought through an issue pretty well, and therefore forcing them through student-centered learning on that topic is a boring chore because they don’t need to be forced to think about it again. Or, for yet another example, some learners with low self-regulation and low tech skills will jump head first into connectivism because they want to interact with others (even though the research says they should have been too afraid to jump in).

When you actually dig into the pathways that individuals would choose to take if one is not forced on them, those individuals tend to defy generalization more often than expected. But when you point this out, the educational establishment tends to argue against those findings in all kinds of ways. We like the comfort of large sample sizes, generalizable statistics, and cut-and-dried boxes to put everyone in. I’m not saying to abandon this kind of research – just put it in a more realistic context in order to make sure we aren’t losing the individual human behind those generalizations.


Instructivism vs Connectivism vs Social Learning

One of the things that I mentioned in the wrap-up hang out for HumanMOOC is getting at how people understand educational theories and their own preferences for learning. This is connected to how many educators will typically choose a theory of learning that they like best, and then assume it is best for all learners at all times. Until, of course, they are forced into learning in another theory that they don’t like by someone else that has decided that that theory is the best for all learners at all times, which is when they realize that maybe we are all different and maybe we should find ways to let people make their own path through learning.

This is, of course, one of the goals with dual-layer/customizable pathways design. We don’t force instructivism or connectivism on learners (or even a single pathway of our own design that is a mix of both). Nor do we treat one modality (like connectivism) like it’s an external thing that we embrace as a “backchannel” to the course if it happens. We create two valid modalities for learners to mix and change (or ignore) as they choose. And then we say that “every choice is awesome!” even if the learners don’t choose the options we would have.

Now, I do have to note that saying that “every choice is awesome!” is not the same as saying “every tool is awesome” and that we should not give feedback to the companies that offer the tools we use. I have given hundreds of points of feedback to all kinds of companies (as you can see in the archives of this blog). In my experience, the companies that ignore you are the ones that are most likely to turn around and use your idea (Blackboard is infamous for this). Those that listen to your ideas typically are just trying to look good on public blogs – they talk like they are listening and then change nothing more often than not. Just a bit of free advice from someone that has (and continues to) give out a lot of critique to ed tech companies.

One of the common problems with designing a course is that you have to use words to communicate what you want people to do. But people already have attached meaning to those words, which may or may not line up with commonly accepted norms. “Social Learning” is a term that I find causes the most confusion with customizable pathways design. Many, many people think that instructivism is not social at all, and that all social learning is connectivism (and connectivism has to be social in order to be connectivist).

The problem is – neither concept is true. Instructivism can be social, and connectivism does not have to be social.

In the literature, instructivism is sometimes connected to closed lectures and multiple-choice tests, but for the most part it is connected with instructor-led content and activities. This can be anything from discussion forums (which can be social) to group assignments to Twitter activities. Yes, a Twitter activity in a course can be instructivist. If an instructor tells learners to go out and create a Twitter account, and then gives them a list of things to Tweet and respond to in order to fulfill an assignment, that is instructivism… and it is social. Social presence is a large field of research that is basically dedicated to figuring out how to improve an instructivist paradigm with social learning designs.

On the other hand, while connectivism is often very social, it doesn’t have to be social to still be connectivist. For example, go back to one of the foundational papers on connectivism (and probably one of the most quoted) and look at what connectivism is. Did you notice the part in there about off-loading learning to non-human agents? What this means is this: a learner can do a Google search on a topic and end up reading a Wikipedia article about the topic and that is still connectivism. They were not social at all, but they connected to the knowledge of others to learn about a topic. The connection occurred with a non-human agent.

Or think of it this way. Connectivism also involves the nurturing of connections for learning. You can follow hundreds of people on Twitter or in an RSS reader and learn all kinds of things from them without ever commenting or responding. You are being connectivist, but not social. Or, you could even be social with people by tweeting “good luck!” when, say, someone tweets about getting a new job. This action is social, and it is building your connections (and therefore part of connectivism), but it is not social learning.

Of course, any connectivist worth their salt in WordPress will tell you that social learning is much, much more robust than independent learning. My point is just that not all connectivist learning is social in nature all the time.

Another part of connectivism is making sense of chaos and complex networks. So of course, being social helps. But at times, you have to wrestle with these things yourself as well. I can tell you for a fact that one of the founders of connectivism does not share all of his sense making socially. He does some, but not all. He wrestles with some of it in his head or while thinking about various things he reads online. Because that is also a part of connectivism – working on your own from time to time. Maybe even connecting with some instructivist content and being guided.

The problem is, we are all at different places at different times when going through the same topics. Forcing (or even encouraging) all students to get out of the LMS and into social learning is ignoring sociocultural differences and contextual needs of the individual students. It is also enforcing an instructor led pathway on all students. So yes, in many ways, forcing all learners to go and do connectivist activities (or even trying to trick them into doing so) is really an instructivist methodology behind the scenes. Which is not bad for the learners that want that, but horrible for those that do not.

In education, we tend to create false dichotomies between two sides that we think are diametrically opposed to each other. In the open learning world, there are many that label connectivism as “always good” and instructivism as “always bad.” Unfortunately, the world is not that simple, that black and white. The data that I have collected after two dual-layer MOOCs reaching tens of thousands of students would indicate learners are not that simplistic. Many learners find extreme value in instructivism… as long as it happens at a point that they choose, not one that is forced on them.

Also to note, this post is talking about course design. We have found that many learners prefer a mix of both modalities. The line between instructivism and connectivism is often a bit mixed, or permeable, or whatever you want to call it, to them – and that is just fine. While we are figuring out this customizable pathways design thing, we have to talk about the design a lot more in order to figure out what works. So understandably, that begins to conflate design considerations with learning experience in many learners’ minds. Someday we can hopefully get through all of that and let the design fade into the background.