Is It Really Learner Agency If The Instructor “Empowers” It?

For a few years now I have been struggling with how to “verb” agency in education (sometimes referred to as learner agency or student agency). When people first become aware of the idea, they tend to use terminology like “I want to allow student agency in my classes.” I guess on some levels that is technically what happens in many cases, as the teacher typically holds the power in the course, and they have to allow agency to happen.

However, once you use that term a bit and get used to the idea, you usually realize that "allowing" agency is kind of a contradiction. People tend to shift towards using the term "empower"… as in, "My goal was to empower learner agency in this lesson." This is the verb I hear most at conferences on the few occasions that agency in education is touched on.

Of course, saying that the instructor is “empowering” agency is pretty problematic as well. Is a learner’s thought process really independent if the instructor is the one that empowered it? Is the autonomous action that flows from independent thought really all that autonomous if the teacher had to initiate the power to make it happen?

With some twists in logic and semantic word play, I am sure one could say that agency can be empowered, but to be honest – it really can't. If the teacher is the one that "empowers" it, then it's not really agency. What many really mean when they refer to "empowering agency" is "tricking learners into doing something that looks like independent thought and action, even if they didn't really independently decide to think or act that way, because at the end of the lesson there was a grade for coming up with something within specific instructor-determined parameters."

I have started using terms like “unleash” when discussing agency in presentations, because that is probably about all you can really do with agency – remove the barriers that are holding it down, and let it do its own thing. But still, not really the best verb for agency.

Of course, this is probably why we don’t see much true learner agency in formal education settings – you set it loose, and it could go in any direction, or none, sometimes both from the same learner. It becomes something that is difficult to standardize and quantify once it really happens.

However, I am speaking of agency as if it is something that turns on and off at the flick of a switch, when the reality is that there are shades of agency that exist on a spectrum. Even when we unleash it, or just stand back and see what happens (or however you want to "verb" it), it's not like learners just jump right into agency feet first and swim around in it like naturals. Some need guidance, scaffolding, a hand to hold, etc. – whether because they are new to the idea in a system that has never allowed it or because they just need a more experienced hand to point them towards which way to go. Oh sure, there are many that do just launch out with little to no guidance and do just fine. In any one class, you are going to have learners all over the place. They will even switch places from day to day or hour to hour.

Agency in learning is something that takes the predictable linear instructivist narrative and explodes it in all kinds of directions, but then even messes with linear time in that explosion, as some need it to go slower while others need a guide through the explosion and others ride the explosion with enthusiasm, wanting it to go faster. Oh, and then they all change their place in that process without a moment's notice. So how does one come up with a verb to explain this chaos?

(image credit: Blue Chaos 3 by Josh Klute)

Decreasing Design Presence

With the Humanizing Online Learning MOOC in full swing, I wanted to dig more into a topic that I tend to allude to at conference presentations. While educators often talk (rightly so) about increasing teaching, social, and cognitive presence, there is also one form of presence that needs to be decreased when designing and teaching courses: design presence.

I'm using "design presence" here to cover a wide range of user interface, instructional design, and learning theory issues. In my mind, there are at least three areas that tend to be heavy on design presence and therefore need it decreased:

  1. Technological Design Presence: tool/technology interfaces and instructions
  2. Instructional Design Presence: tool and content instructional design decisions
  3. Epistemological Design Presence: underlying learning theory choices

While some might notice there is some overlap between these areas and teaching, social, and cognitive presence, I have found that there are still some differences. Working to decrease design presence also ends up helping to increase teaching, social, and cognitive presence in the long run.

Technological Design Presence

This is an area where user interface and instructional design collide, and for many course designers the options are pre-determined by institutional adoptions. However, where choices are allowed, utilizing tools that have the least complex user interface options is ideal. For example, if you really want to use a listserv, but the tool you have to use is complex to sign up for and use, why not use Twitter? The user interface on Twitter is very simple compared to some older mass email tools. If you have to have a really complex set of instructions to use a tool, why not consider using something with fewer instructions and less stress on the learner?

Or if you have a listserv tool that is easier to use than Twitter, why not use that instead of Twitter?

Where there are several options within a tool (like an LMS), why not choose the least confusing, most ready-to-use tool? Newer features in larger LMS tool sets often have a steep learning curve. For example, the blog feature in Blackboard was very confusing when it was first released, and it really worked more as a re-arranged discussion board. If you have to stay within Blackboard, then stick with the tools that take the least amount of time to explain to learners.

Additionally, think about other issues that cause unnecessary technology confusion. Blackboard was infamous for allowing course designers to set up boxes within boxes within boxes. Avoid using tools and content structures just because you can. Avoid using desktop metaphors that make no sense online (like "folders" inside of online content). Avoid using complex navigational structures just because you can.

Once learners have to click around a half dozen times just to get somewhere, or dig through complex tool instructions, or spend too much time figuring out what you want them to do, they are running into too much technological design presence. Decrease what you can where you can.

Instructional Design Presence

This next facet has many connections to the first one, so there will probably be some overlap. Many times, course designers will make tool and content design decisions that are unnecessarily complex – for example, complex grading schemes that require dense explanations and calculators to figure out. Why go there? Obviously, there is merit to the idea that grades are problematic altogether, but many instructors are stuck with them. So why make them so complex? Why not just base course grades on a 100 point scale (which most people understand already), and make each assignment a straight portion of that grade? Complex structures based on weighted grades and 556-point scales and whatnot are a burden for both the instructor and the learner.
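
To make the arithmetic difference concrete, here is a minimal sketch (with made-up assignments, points, and weights, not any particular course's scheme):

```python
# A minimal sketch of the two approaches, with made-up assignments and scores.

# Straight 100-point scale: each assignment is simply worth some slice of the 100 points.
straight_points = {          # assignment: (points earned, points possible)
    "Discussions": (18, 20),
    "Blog posts": (27, 30),
    "Final project": (45, 50),
}
course_grade = sum(earned for earned, _ in straight_points.values())  # already out of 100
print(course_grade)  # 90 -- a learner can do this math in their head

# Weighted scheme: every category has its own scale *and* a weight, so nobody can
# estimate their standing without a calculator (or this script).
weighted = {                 # category: (points earned, points possible, weight)
    "Discussions": (94, 120, 0.25),
    "Blog posts": (160, 186, 0.30),
    "Final project": (225, 250, 0.45),
}
course_grade = sum(e / p * w for e, p, w in weighted.values()) * 100
print(round(course_grade, 1))  # 85.9 -- same idea, far more design presence
```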

Rubrics are also a part of this area. Complex rubrics with too many categories and specific point values are, again, a burden for learners and instructors. Compare the complexity of this rubric with this one. I realize some people like the first one because it has so much detail, but to be honest, it is something most readers aren’t going to read through, because just glancing at it could cause stress.

Or another issue might be design choices that add unnecessary complexity, like having students upload Word docs to discussion forums for class discussion. Why not just use blogs? That is basically what you are doing with Word Docs and discussion forums.

Course designers typically make many choices with tools and content in their courses. Do these choices increase the instructional design presence of those decisions? Or do they decrease the design presence and allow learners to focus on learning rather than figuring out your designs?

Epistemological Design Presence

This area is a bit more difficult to get at, as it probably affects overarching decisions that shape everything in your course. For instance, if you lean more towards an instructivism that places you at the center of everything in a course, you will probably choose many tools and interfaces that support your instructivist leanings: lecture capture, content-heavy videos, long reading assignments, multiple choice tests, etc.

Now, just to be clear, I am not here to bash instructivist lectures across the board no matter what. There are times when learners need a well-executed lecture. However, in education, many instructors use lectures too much. They use lectures to fill time when learners should be doing something hands-on and/or active. If you are using lectures on video (or textbook readings) when learners should be creating their own knowledge, or applying concepts hands-on, or collaborating in groups, you have increased the epistemological design presence of your preferred learning theory at the expense of what the learners really needed. Time to decrease that facet of design presence.

There are times when learners don’t need to socially connect or listen to lectures, but work on their own. There are times when they need to connect with others rather than work individually. Don’t stick with instructivism or social constructivism or connectivism or any other theory you love just because you like it best. Put the learner first.

But what about the times when learners are at different levels and need different theories? Or when no one theory fits and it is really up to the learner? I say, give them the choice. Build in multiple pathways for learning in your course. Build in scaffolding for learners to shift between different theories. But avoid the mistakes I have made in the past and make sure to decrease the design presence of those options and pathways as much as possible. Don't focus on the difference between the pathways – just focus on the fact that learners can make the choices they need at any given moment, and then show them the choices.

Decreasing Design Presence

If you are a good course designer, you probably already know everything I have touched on here. There is nothing new or different about what I am outlining here – this is solid instructional design methodology taught in most instructional design courses or learned on the job. However, it is seldom examined from the angle of decreasing design presence, and since I am one of the "wayfinders" in a course on the Community of Inquiry framework that covers teaching, social, and cognitive presence, I thought it would be a good idea to have a place to point to every time I mention "decreasing design presence."

(image credit: Human Presence by Manu Mohan)

Self-Determined Learning: The Lesser-Explored Side of Open Learning

OpenEd 16 is in full swing and I am already kicking myself for not going this year. I seem to miss at least half of the cool conferences. Adam Croom has already provided a fascinating analysis of the abstract topics, which reveals a great list of important topics. However, I do notice something that is (possibly?) missing.

There is a lot about resources, textbooks, pedagogy, etc. Much of this focuses on removing barriers of access to education, which is a topic that we should all support. But what about the design of this education that they are increasing access to?

“Open pedagogy” seems to be the main focus of the design side of the equation. Of course, it is hard to tell from this analysis what people will really present on. When I think of open pedagogy, I think of David Wiley’s important work on the topic. Wiley’s description of open pedagogy is focused on being open about the design and assessment process, as well as allowing learners to remix and create their own open content.

So the question is – where is the learner agency, the self-determined learning, and the heutagogical side of “open learning”? It is probably there, but just not as explicitly named or explored. When you unleash your learners to determine their own pathway, their own context, their own content, and so on – that is also a part of open learning that needs to be specifically mentioned.

Open pedagogy is definitely a scaffolded step into self-determined open learning. Maybe some would argue that self-determined learning is implicitly a form of open pedagogy. I wouldn't disagree, although I tend to avoid using pedagogy as a catch-all term for all forms of learning design, due to the co-opting nature of expanding the use of pedagogy beyond "to guide a child." But that really isn't as huge a deal to me as it is to the early childhood educators who feel left out of most academic educational discussions and don't usually appreciate that the college educators who typically leave them out are also stealing the technical term for their design methodology.

Even when looking at the Wikipedia article on open learning, many of the topics touched on get close to self-determined learning: self-regulated learning, active learning, life-long learning, etc. Almost there, but not quite.

Again, I know there are people out there that include learner agency and self-determined learning in the open learning / open education sphere, and that there are some people working on those topics. I just think there should be more. In my opinion, you can offer all the free content you want and allow people to remix and re-use as much as you want… but if the design still focuses on the instructor (or the pre-determined content) as the center of the course, you have just created an openly licensed "sage on the stage" learning experience. Which I am sure many people will need, but for many others, this falls short of the concept of learning how to be a learner.

We are the Monster at the End of the Book

I wanted to circle back to a thought I had while reading Maha Bali's excellent post Reproducing Marginality? The whole post is worth reading, but one line made me think more than others. In it, she quotes something that she wrote with Paul Prinsloo and Kate Bowles that says:

…for most of us not in the US (or the UK), this [edtech] vision has often signalled top-down, US-to-world, Anglo-oriented, decontextualized, culturally irrelevant, infrastructure-insensitive, and timezone-ignorant aspirations, even when the invitation for us to join in may be well-intentioned.

Many of us in the Western world of EdTech are trying to figure out how to fix Education and Ed Tech, looking for the evil monsters out there that are causing the problems, and then fixing those monsters with research, technology, design, or methods.

And sometimes we are afraid to see what those monsters are that are damaging education, because they may be too big for us to fix.

This all reminds me of one of my favorite books as a kid: The Monster at the End of the Book.

In this book, Grover notices the title of the book and spends every page trying to stop you, the reader, from reaching the end of the book. He nails pages together, builds brick walls, and pleads with you NOT to get to the end of the book and face the monster lurking there.

Grover is terrified of the monster at the end of the book. But when he gets to the end of the book, he finds that he was the monster all along and that he had nothing to fear.

We (in the Western world) are pretty much the monster at the end of the book when it comes to education reform. We are doing everything we can to avoid that possibility – looking to everything but ourselves to fix the problems. But it is our (sometimes) extreme ethno-centrism, socio-cultural centrism, whatever you want to call it, that has been the problem all along. I would even go so far as to say that as long as we are the center of the education world, we are always going to be the problem.

Education is about learning. Learners do the learning. Learning needs to be the center of what we do. Learners can live anywhere in the world, in any context. We need to examine the structures that keep the wrong things at the center of education. We need to skip to the end of the book, realize we are the monster at the end of the book, and turn the story around. Learner agency is the only true "innovation" we have left to explore deeply in the education world.

Big (Scary) Education (Retention) Data (Surveillance)

Big data in education might be the savior of our failing learning system or the cement shoes that drag the system to the bottom of the ocean, depending on who you talk to. No matter what your view of big data is, it is here, and we need to pay attention to it.

My view? It is a mixture of extreme concern for the glaring problems mixed with hope that we can correct course on those problems and do something useful for the learners with the data.

Yesterday at LINK Lab we had a peek behind the scenes at a data collection tool that UTA is implementing. The people that run the software at UTA are good people with good intentions. I also hope they are aware of the problems already hard coded in the tool (and I suspect they are).

Big Data can definitely look scary for a lot of reasons. What we observed was mostly focused on retention (or "persistence," the friendlier term the software uses, I believe). All of the data collected basically turns students into a collection of numbers on hundreds of continuums, and then averages those numbers out to rank them on how likely they are to drop out. To some, this is a scary prospect.

Another scary prospect is the real danger of using that data to decide which students to ignore (because they are going to stick around anyway) and which students to focus time and energy on (in order to make the university more money). This would be data as surveillance more than as an educational tool.

Looking at the factors that learners are ranked by in this data tool led to no surprises – we have known from research for a long time what students that "persist" do and what those that don't "persist" do (or don't do). The lists of "at risk" students that these factors produce will probably not be much different from the older "at risk" lists that have been around for decades. The main change will be that we will offload the process of producing those lists to the machines, and wash our hands of any bias that has always existed in producing those lists in the first place.

And I don't want to skip over the irony of spending millions of dollars on big data to find out that "financial difficulties" are the reason that a large number of learners don't "persist."

The biggest concern that I see is the amount of bias being programmed into the algorithms. Even the word “persistence” implies certain sociocultural values that are not the same for all learners. Even in our short time looking around in the data collection program, I saw dozens of examples of positivist white male bias hard coded in the design.

For example, when ranking learners based on grades, one measure ranked learners in relation to the class average. Those that fell too far below the class average were seen as having one risk factor for not "persisting." This is different from looking at grades as a whole. If the class average is a low B but a learner has a high B, they would be above the class average and in the "okay" zone for "persistence."

But that is not how all cultures view grades. My wife is half Indian and half Australian. We have been to India and talked to many people who were under intense stress to get the highest grades possible. It is a huge pressure for many in certain parts of that culture. Yet even a low A might not register as a troubling signal if the class average is much lower. For someone facing intense pressure to get the best grades or else come home and work in Dad's business… that is exactly when they need help.

(I am not a fan of grades myself, but this is one area that stuck out to me while poking around in the back end of the data program)
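
To make that concrete, here is a minimal, hypothetical sketch of the kind of "distance from class average" flag described above (the threshold, function name, and grades are my own invented stand-ins, not the vendor's actual logic):

```python
# Hypothetical sketch of a relative-to-class-average risk flag.
# The threshold and grades below are invented for illustration only.

def below_average_flag(grade: float, class_average: float, threshold: float = 10.0) -> bool:
    """Flag a learner as 'at risk' only if they fall well below the class average."""
    return grade < class_average - threshold

class_average = 72.0  # a low C average for the class

print(below_average_flag(58.0, class_average))  # True  -- well below average, so flagged
print(below_average_flag(91.0, class_average))  # False -- a low A is never flagged here,
# even for a learner under intense pressure to earn the highest grade possible,
# which is exactly the situation where they may need support.
```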

This is an important issue since UTA is designated as a Hispanic-Serving Institution. We have to be careful not to get caught in the same traps that education has fallen into for centuries related to inequalities. But as our LINK director Lisa Berry pointed out, this is also why UTA needs to dive into Big Data. If we don't get in there with our diverse population and start breaking the algorithms to expose where they are biased, who else will? Hopefully there are others, but the point is that we need to get in there and critically ask the hard questions, or else we run the risk of perpetuating educational inequalities (by offloading them to the machines).

For now, a good place to start is by asking the hard questions about privacy and ownership in our big data plan:

Are the students made aware that this kind of data is being collected?

If not, they need to be made aware. Everywhere that data is collected, there should be a notification.

Beyond that, are they given details on what specific data points are being collected?

If not, they need to know that as well. I would suggest a centralized, ADA-compliant web page that explains every data point collected in easy-to-understand detail (with as many translations into other languages as possible).

Can students opt-out of data collection? What about granular control over the data that they do allow to be collected?

Students should be able to opt out of data collection. Each class or point of collection should have permissions. Beyond that, I would say they should be able to say yes or no to specific data points if they want to. Or even beyond that, what about making data collection opt-in?
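
As a minimal sketch of what that kind of granular, opt-in control could look like under the hood (the data point names, class structure, and defaults here are hypothetical, not anything UTA or any vendor actually implements):

```python
from dataclasses import dataclass, field

# Hypothetical data points a campus system might want to collect.
DATA_POINTS = {"lms_logins", "library_swipes", "discussion_posts", "advising_visits"}

@dataclass
class ConsentRecord:
    """Per-student, per-data-point consent. Opt-in: nothing is collected by default."""
    student_id: str
    allowed: set = field(default_factory=set)  # data points the student has said yes to

    def grant(self, data_point: str) -> None:
        if data_point in DATA_POINTS:
            self.allowed.add(data_point)

    def revoke(self, data_point: str) -> None:
        self.allowed.discard(data_point)

def may_collect(consent: ConsentRecord, data_point: str) -> bool:
    """The collection pipeline checks this before storing anything."""
    return data_point in consent.allowed

# Example: a student opts in to LMS login data only.
record = ConsentRecord(student_id="s123")
record.grant("lms_logins")
print(may_collect(record, "lms_logins"))      # True
print(may_collect(record, "library_swipes"))  # False -- never collected without a yes
```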

Who owns the students’ data (since it is technically their actions that create the data)?

This may seem radical to some, but shouldn’t the student own their own data? If you say “no,” then they should at least have the right to access it and see what is being collected on them specifically.

Think of it this way: How will the very substantial Muslim population at UTA feel about a public school, tied to the government, collecting all of this data on them? How will our students of color feel about UTA collecting data on them while they are voicing support for Black Lives Matter? How would the child of illegal immigrants feel about each class at UTA collecting data about them that could incriminate their parents?

These issues are some of the hard things we have to wrestle with in the world of Big Data in Education. If we point it towards openness, transparency, student ownership, and helping all learners with their unique sociocultural situations, then it has potential. If not, then we run the risk of turning Big Education Data into Scary Retention Surveillance.

Disruption is No Longer Innovative

How can you tell if an innovator is pulling your leg? Their lips are moving. Or their fingers are typing. I write that knowing full well that it says a lot about my current title of "learning innovation coordinator." To come clean about that title: we were allowed to choose our titles to some degree, and I chose that one for purely political reasons. I knew that if I wanted to help bring some different ideas to my university (like Domain of One's Own, Learning Pathways, Wearables, etc.), I would need a title beyond something like "instructional technologist" to open doors.

But beyond a few discussions that I have on campus, you will rarely hear me talking about "innovation," and I reject the title of "innovator" for almost anyone. Really, if you think any technology or idea or group is innovative, put that technology or idea into Google followed by "Audrey Watters" and get ready for the Ed-Tech history lesson the "innovators" tend to forget to tell you about.

In a broad sense, many would say that the concept of "innovation" involves some kind of idea or design or tool or whatever that is new (or at least not previously very, very "popular"). Within that framework of innovation, disruption is no longer "innovative." Disruption is really a pretty old idea that gained popularity after the mp3 supposedly "disrupted" the music business and/or the digital camera disrupted the camera industry.

Of course, that is not what happened – mp3s and digital cameras just wrenched some power out of the hands of the gatekeepers of those industries, who then responded by creating the "disruption narrative" (which is what most are referring to when they just say "disruption"), and then proceeded to use that narrative to gain more control over their industries than before (for example, streaming music services). Keep this in mind any time you read someone talking about "disruption" in education. Who is saying it, what do they want it to do, and how much more control do they get over the educational process because of their disruption narrative?

Of course, there is debate over whether disruption is real or not, and both sides have good points. Either way, our current disruption narrative has been around for over two decades now… probably long past the expiration date that gets slapped on any "innovative" idea. If you are still talking disruption, you are not an innovator.

If you want to convince me that you are an innovator, I don’t want to know what cool ideas or toys you have. I want to know who you read and follow. Are you familiar with Audrey Watters? Have you read Gayatri Chakravorty Spivak’s Can the Subaltern Speak? Are you familiar with Adeline Koh’s work on Frantz Fanon? Do you follow Maha Bali on Twitter? If I mention Rafranz Davis and #EdtechBlackout, do I get a blank stare back from you?

If you were to chart the people that influence your thinking – and it ends up being primarily white males… I am not sure how much of an innovator you really are. Education often operates as a “one-size-fits-all” box (or at best, a “one-set-of-ideas-fits-all” box), and that box has mostly been designed by white males. Usually a small set of white males that think all people learn best like they do. How can your idea or technology be that “new” if it is influenced by the same people that influenced all of the previous ones?

So what has this “one-set-of-ideas-fits-all” box created for education? Think tanks and university initiatives that sit around “innovating” things like massive curriculum rethinking, “new” pedagogical approaches, and “creative new applications of a range of pedagogical and social technologies.” They try to come up with the solutions for the learners. Many of these are probably some great ideas – but nothing new.

Why not find ways to let the learners set their own curriculum, follow their own pedagogical approaches, or create their own ways of applying technology? Instead of walling ourselves up in instructional design teams, why not talk to the learners themselves and find out what hinders their heutagogical development? Why not look to learners as the instructors, and let them into the design process? Or dump the process and let learners be the designers?

What I am getting at is helping learners create and follow their own learning pathway. Each one will be different, so we need massive epistemological and organizational shifts to empower this diversity. Why not make “diversity” the new “innovative” in education? Diversity could be the future of educational innovation, if it could serve as a way to humanize the learning process. This shift would need people that are already interacting with a diverse range of educators and students to understand how to make that happen.

I would even go as far as to say that it is time to enter the "post-innovation" era of Ed-Tech, where any tool or idea is framed based on whether it supports a disruption mindset or a diversity mindset. What does that mean for emerging ideas like big data or wearables? Post-innovation would not be about the tool or the system around it, but the underlying narrative. Does this "thing" support disruption or diversity? Does it keep power with the gatekeepers that already have it, or empower learners to explore what it means for them to be their own unique "human" self in the digital age?

For example, if "big data" is just used to dissect retention rates, and then to find ways to trick students into not dropping out… that is a "disruption" mindset: "We are losing learners/control, so let's find a way to upend the system to get those learners back!" A diversity mindset looks at how the data can help each individual learner become their own unique, self-determined learner, in their particular sociocultural context: "Based on this data that you gave us permission to collect, we compared it anonymously to other learners, and they were often helped by these suggestions. Do any of these look interesting to you?" Even if the learner looks at these options and rejects all of them, the process of thinking through those options will still help them learn more about their unique learning needs and desires. It will help them celebrate their unique, diverse human self instead of becoming another percentage point in a system designed to trick them into producing better-looking numbers for the powers that be.

This is also a foundational guiding aspect of the dual-layer/learning pathways idea we are working on at the LINK Lab. It is hard to come up with a good name for it, as we are not really looking at it as a "model" but as something that turns the idea of a "model" or "system" inside out, placing each individual learner in the role of creating their own model/pathway/system/etc. In other words, a rejection of "disruption" in favor of "diversity." We want to embrace how diversity has been and always will be the true essence of what innovation should have been: each learner defining innovation for themselves.

Personalized Learning Versus Dungeons and Dragons

Personalized learning is popular right now. But is that a good or bad thing? I can buy all kinds of personalized gadgets online, but do I really like or need any of them? If you decided to get me a custom dinner placemat that says "Matt's Grub" – sure, that is personalized. But it's also a pretty useless personalized item that I have no interest in.

Many prominent personalized learning programs/tools are a modern educational version of the Choose Your Own Adventure book series from the 1980s. As I have written before, these books provided a promise of a personalized adventure for the reader, which was entertaining for a while. But you were really just choosing from a series of 50 pre-written paths, hoping to pick one of the ones that led to a happy ending. Of course, if you happened to have any physical characteristics that were different from the ones written into the story (I remember a classmate that had shaved his head making fun of one page that had the main character doing something with his hair – yes, the stories were sometimes gendered, even), then the "your" in "Choose Your Own Adventure" fell flat.

These eventually evolved into more complex books like the Lone Wolf gamebooks, which had you fighting your own battles, collecting objects, and doing other activities that were closer to role-playing games.

But let's face it – the true "Choose Your Own Adventure" scenarios in the 1980s were really role-playing games. And few were as personalizable as Dungeons and Dragons.

Now, whether you love or hate D&D, or even still think it is Satanic… please hear me out. D&D, at least in the 80s, was personalizable because it provided different pathways that were scaffolded. New players could start out with the Basic D&D boxset – which came with game rules, pre-designed characters, basic adventures to go on, etc. And that wasn't even really the starting point. If Basic D&D was too unstructured for you, there were books like the Dragonlance Chronicles or the Shannara series that would give you a completely guided tour of what D&D could look like. Oh, and even a Saturday morning cartoon series if the books were too much for you.

But back to D&D: once you mastered the Basic set, there were more sets (Expert, Companion, Master, and Immortal) – all of which gave you more power and control. Then, when you were ready (or if you found Basic D&D too pre-determined), there was Advanced Dungeons and Dragons. This was a set of books that laid out some basic ideas for creating your own characters and worlds and adventures. And you were free to change, modify, add to, or completely re-invent those basics. Many people did, and shared their modifications in national magazines like Dragon Magazine. Oh, and what if you wanted to make your own world but were still unsure? You had a whole range of pre-designed adventures called Dungeon Modules. Just buy one, play, and get inspired to create your own. Or maybe the opposite was true: you were just tired of your creation and wanted to take a break in someone else's world.

To me, Dungeons and Dragons in the 1980s was a much better metaphor for what personalized learning should look like. You had completely mindless escapist entertainment (aka lectures) when you needed it, like the books and cartoons. You had the structured environment of Basic D&D to guide you through the basics (aka instructivism). You had a series of games and accessories like Dungeon Modules and Companion Sets to guide you (aka scaffold you) to the advanced stage. You had the Advanced books that set a basic structure for creating your own world (aka the Internet). Then you had a network of people sharing ideas and designs to keep new ideas flowing (aka connectivism). Many gamers would go back and forth between these various parts – creating their own world, sharing their ideas in the magazines, playing Dungeon Modules on occasion, reading the books, and dipping back into Basic D&D when the mood hit them.

This scene from The Big Bang Theory shows how players can customize, adapt, and personalize the game experience, even as they play.

Of course, there were problems with the gaming community. It was expensive, and often sexist and/or racist. So I am not painting the Dungeons and Dragons world of the 1980s as some perfect utopia. I am looking at the design of the tools and system here. It is one that in some fashion preceded and informed what we are doing with pathways learning, and one that I think is closer to true "personalization" than what some personalized learning situations offer.

Pokemon Go and the Gimmickification of Education

I almost dread looking at my social media feed today. Pokemon Go (GO? G.O.? (wake me up before you) Go-Go?) received a large amount of media attention this weekend, apparently already spawning posts about how it will revolutionize education and tweets about how we need what it produces in education.

All I could think was: how did we get to this point? Every single tech trend turns into a gimmick to sell educational mumbo jumbo kitsch tied to every cool, hip trend that pops up on the social media radar. I guess I shouldn't have been that surprised once Block-chain became educational, or Second Life was used to deliver classes, or Twitter replaced LMSs, or MySpace became the University of the future, or DVDs saved public schools, and so on and so forth. I bet at some point thousands of years ago there was a dude in a white toga standing up in an agora somewhere telling Plato how chariots would revolutionize how he taught his students.

I’m all for examining new trends through an educational lens, but every time I just want to say “too far, Ed-Tech®, too far!”

We all know education needs to change. It always has been changing, it always will, and will always need to have a critical lens applied to how and why it is changing. But with every new technology trend that gets re-purposed into the next savior of education, I can’t stop this gnawing feeling that our field is becoming a big gimmick to those outside of it.

A gimmick is basically just a trick intended to attract attention. One or two seem harmless enough. Well, not that harmful? But once everything that comes down the pipe starts becoming a trick to get people to look at education, the gimmick gets old. People are still asking what happened to Second Life, to Google Wave, to you-name-the-trend. After a while, they stop buying into the notion that any of us know what we are talking about. Just think of the long-term effect on the larger discourse of so many people declaring so many things to be the savior of education, only to abandon each one after a year or two.

The problem with the hype cycle of Ed-Tech is that it buries the real conversations that have been happening for a long time around whatever the hype-du-jour is. Do you want the Pokemon Go for education, where students are engaged, active, social, etc.? We already have a thousand projects that have done that to some degree. Those projects just can't get attention because everyone is saying "Pokemon Go will revolutionize education!" (well, at least those that say that un-ironically – sarcastic commentary that apparently went over many people's heads not included).

(see also “Pokemon GO is the xMOOC of Augmented Reality“)

Evolution of the Dual-Layer/Customizable Pathways Design

For the past few weeks, I have been considering what recent research has to say about the evolution of the dual-layer aka customizable pathways design. A lot of this research is, unfortunately, still locked up in my dissertation, so it is time to get to publishing. But until then, here is a rundown of where it has been and where it is going.

The original idea was called "dual-layer" because the desire was to create two learning possibilities within a course: one a structured possibility based on the content and activities that the instructor thinks are a good way to learn the topic, the other an unstructured possibility designed so that learners can create their own learning experience as they see fit. I am saying "possibility" where I used to say "modality." Modality really serves as the best description of these possibilities, since modality means "a particular mode in which something exists or is experienced or expressed." The basic idea is to provide an instructivist modality and a connectivist modality. But modality seems to come across as too stuffy, so I am still looking for a cooler term to use there.

The main consideration for these possibilities is that they should be designed as part of the same course in a way that lets learners switch back and forth between them as needed. Many ask: "Why not just design two courses?" You don't want two courses, as that could impede changing modalities, as well as create barriers to social interactions. The main picture that I have in my head to explain why this is so is a large botanical garden, like this:

[image: scene at the Fort Worth Botanic Garden]

There is a path there for those that want to follow it, but you are free to veer off and make your own path to see other things from different angles or contexts. But you don’t just design two gardens, one that is just a pathway and one that is just open fields. You design both in one space.

So in other words, you design a defined path (purple line below) and then connect it with opportunities for learners to look at the topic from other contexts (gold area below):

[image: diagram of a defined pathway (purple line) surrounded by open contexts (gold area)]

You have a defined modality (the path), and then open up ways for people to go off the path into other contexts. When allowed to mix the two, the learner would create their own customized pathway, something like this:

[image: diagram of a customized pathway weaving between the defined path and the open contexts]

The problem with the image above is that this really shouldn't only be about going off the walkway in the garden to explore other contexts. Learners should be allowed to dig under the walkway, or fly above it. They should be able to dig deeper or pull back for a bird's-eye view as needed. So that would take the model into a three-dimensional view like this:

[image: three-dimensional version of the pathways diagram]

(please forgive my lack of 3-D modeling skills)

Learners can follow the instructor's suggested content or go off in any direction they choose, and then change to the other modality at any moment. They can go deeper, into different contexts, deeper in different contexts, bigger picture, or bigger picture in different contexts.

The problem that we have uncovered in using this model in DALMOOC and HumanMOOC is that many learners don't understand the design. Many do understand and appreciate the choice… but there are some that don't want to get bogged down in the design choices. Some that choose one modality don't understand why the other modality needs to be in the course (while some that have chosen that "other" modality wonder the same thing in reverse). So really, all that I have been discussing so far probably needs to be relegated to an "instructional design" / "learning experience design" / "whatever you like to call it" design method. All of this talk of pathways and possibilities and modalities belongs in the design process. There are ways to tie this idea together into a cohesive learning experience through goal posts, competencies, open-ended rubrics, assignment banks, and scaffolding. Of course, scaffolding may be a third modality that sits between the other two; I'm not totally sure if it needs to be its own part or a tool in the background. Or both.

The goal of this “design method” would be to create a course that supports learners that want instructor guidance while also getting out of the way of those that want to go at it on their own. All while also recognizing that learners don’t always fall into those two neat categories. They may be different mixtures of both at any given moment, and they could change that mixture at any given point. The goal would be to design in a way that gives the learner what they need at any given point.

Of course, I think the learner needs to know that they have this choice to make. However, going too far into instructivism vs. connectivism or structured vs. unstructured can get some learners bogged down in educational theory that they don’t have time for. We need to work on a way to decrease the design presence so learners can focus on making the choices to map their learning pathway.

So the other piece of the evolution of this pathways idea is creating the tools that allow learners to map their pathway through the topic. What comes to mind is something like Storify, re-imagined as an educational mapping tool in place of an LMS. What I like about Storify is the simple interface, and the way you can pull up a whole range of content on the right side of the page to drag and drop into a flowing story on the left.

[image: screenshot of the Storify interface]

I could imagine something like this for learners, but with a wide range of content and tools (prescribed by the instructor, the learner, other learners, and pulled from other places on the Internets) on the right side. Learners would drag and drop the various aspects they want into a "pathway" through the week's topic on the left side. The parts that they pull together would contain links or interactive parts for them to start utilizing.

For example, the learner decides to look at the week’s content and drags the instructor’s introductory video into their pathway. Then they drag in a widget with a Wikipedia search box, because they want to look at bigger picture on the topic. Then they drag a Twitter hashtag to the pathway to remind themselves to tweet a specific question out to a related hashtag to see what others say. Then they drag a blog box over to create a blog post. Finally, they look in the assignment bank list and drag an assignment onto the end of the pathway that they think will best prove they understand the topic of the week.

The interesting thing about this possible “tool” is that after creating the map, the learner could then create a graph full of artifacts of what they did to complete the map. Say they get into an interesting Twitter conversation. All of those tweets could be pulled into the graph to show what happened. Then, let’s say their Wikipedia search led them to read some interesting blog posts by experts in the field. They could add those links to the graph, as well as point out the comments they made. Then, they may have decided to go back and watch the second video that the instructor created for the topic. They could add that to the graph. Then they could add a link to the blog post they created. Finally, they could link to the assignment bank activity that they modified to fit their needs. Maybe it was a group video they created, or whatever activity they decided on.
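
As a rough sketch of the data such a tool would be juggling (every name here is hypothetical, not an actual Storify or LMS API), the map is basically an ordered list of elements, and the graph is the set of artifacts the learner later attaches to each one:

```python
from dataclasses import dataclass, field

@dataclass
class PathwayElement:
    """One piece the learner drags into their map for the week."""
    kind: str    # e.g. "video", "wikipedia_search", "hashtag", "blog", "assignment"
    label: str
    artifacts: list = field(default_factory=list)  # links/embeds attached after the fact

@dataclass
class WeeklyPathway:
    topic: str
    elements: list = field(default_factory=list)

    def add(self, element: PathwayElement) -> None:
        self.elements.append(element)

# Building the example map described above.
week = WeeklyPathway(topic="Week 3 topic")
week.add(PathwayElement("video", "Instructor's introductory video"))
week.add(PathwayElement("wikipedia_search", "Bigger-picture background search"))
week.add(PathwayElement("hashtag", "Question posted to a related hashtag"))
week.add(PathwayElement("blog", "Reflective blog post"))
week.add(PathwayElement("assignment", "Assignment bank choice: group video"))

# Later, the learner turns the map into a graph of what actually happened.
week.elements[2].artifacts.append("link to the interesting Twitter conversation")
week.elements[1].artifacts.append("links to expert blog posts found via the search")
week.elements[3].artifacts.append("link to the finished blog post")
```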

In the end, the graph that they create itself could serve as their final artifact to show what they have learned during the course. Instead of a single “gotcha!” test or paper at the end, learners would have a graph that shows their process of learning. And a great addition to their portfolios as well.

Ultimately, these maps and graphs would need to reside on each learner's personal domain, connecting with the school domain as needed to collect map elements.

When education is framed like this, with the learner on top and the system underneath providing support, I also see an interesting context for learning analytics. Just think of what it could look like if, for instance, instead of tricking struggling learners into doing things to stay on "track" (as defined by administrators), learning analytics provided helpful suggestions for learners ("When other people had problems with _____ like you are, they tried these three options. Interested in any of them?").
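
A minimal sketch of that flipped framing, with entirely made-up data, categories, and function names: the system only offers options drawn from anonymized peer data, and the learner decides what (if anything) to do with them.

```python
# Hypothetical sketch: suggestions are offered, never enforced, and only drawn from
# data that learners have explicitly agreed to share (categories invented for illustration).

SUGGESTIONS_BY_STRUGGLE = {
    "time_management": ["a weekly planning template", "shorter daily check-ins", "a study group"],
    "course_navigation": ["the orientation video", "a one-page course map", "office hours"],
}

def suggest(struggle: str, consented: bool) -> list:
    """Return optional prompts based on what helped anonymized peers, or nothing at all."""
    if not consented:
        return []
    topic = struggle.replace("_", " ")
    return [f"When other learners had trouble with {topic}, this sometimes helped: {option}. Interested?"
            for option in SUGGESTIONS_BY_STRUGGLE.get(struggle, [])]

for prompt in suggest("time_management", consented=True):
    print(prompt)
# The learner is free to reject every prompt; thinking through the options is itself useful.
```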

Of course, I realize that I am talking about upending an entire system built on "final grades" by instead focusing on "showcasing a learning process." I can't say this change will ever occur, but for now this serves as a brief summary of where the idea of customizable pathways has been, where it is now (at least in my head, others probably have different ideas that are pretty interesting as well), and where I would like for it to go (even if only in my dreams).

Depressing Confessions of a “Newly Minted” Ph.D.

I have been struggling with this blog post for much longer today than I probably should admit. Lots of people ask you what you are going to do "now that you have a Ph.D." And the truth is, I really don't know. I currently work in a nice position that requires Ph.D.-level work, so it's not like I am in a hurry to change things. But it is also a position that requires me to determine what I want to research, so staying put or looking elsewhere leaves me with the same confusion over "what's next?" either way.

But why do I feel so confused over the future? This line from Jim Groom’s recent post seemed to finally clarify my hang-up:

“a bunch of folks who have been, for the most part, marginalized by the ed-tech gold rush for MOOCs, big data, analytics, etc—a push for the last four years that dominated the field’s time, energy, and precious few resources.”

There are interesting things happening in those “gold rush” areas, and also some concerning things. But our field, overall, does have a “cool” crowd and a “not so cool” crowd. If you are not currently into analytics, wearables, and a few other hot topics… you are usually left in the margins. I’m not sure if marginalized is the best word, but maybe… toiling in obscurity? For example, even bad ideas in analytics get more attention, more funding, more awards, etc, than great ideas in more obscure fields like instructional design, learning theory, etc.

That is not to slam analytics or wearables or whatever as a whole. There are some great ideas there. As Vitomir Kovanovic stated today:

The "gold rush" often focuses on the "bad ones" because they can get something out there quicker. As George Siemens wisely pointed out:

So for a lot of these “hot topics,” I don’t hate them as much as see them having a long waiting period to mature into something practical. In the meantime, the instructional designer in me knows of practical ideas that can be used right now to make a dent in things.

But the depressing truth is that these ideas will most likely always be kicking around on the fringes. When someone like Mike Caulfield complains about feeling obscure, and his ideas are a hundred times more popular than the ones I am interested in… it doesn't make one want to sign up for years and years of fringe work.

Personally, I think the idea of "thought leader" is a bit along the lines of "rock star." Others see it differently, and that is fine with me. But "thought leaders" are still part of the cool crowd, whereas "thought synthesizers" tend to get left out of the conversation frequently. Most of the really interesting things that I like to work on, like customizable pathways design, are not really the result of "thought leaders" as much as "thought synthesizers."

So the problem is, should I throw my lot in with the cool kids and do things that I am maybe-kind of interested in, or follow my passions into obscurity?

To be honest, I don't really know. I am technically already in obscurity, so nowhere to go but up, right? A lot of this is not really about me, but about the ideas that I think have great potential. They are also, unfortunately, ill-defined, poorly worded (too many syllables, which I say in all seriousness and not flippantly), not sexy, not flashy, not cool. I could very easily hitch my wagon to some ideas that are cool-sounding and sexy. Someone sent me a link to a university that was looking for a Professor of Game-Based Learning that they thought I would be a good fit for. Sounds fun, flashy, hip, etc. But it was also in Texas, and let's face it: Texas is not a great place to live (sorry if you think it is). And they pay academics poorly. I just found out this week I could get a raise if I went to teach high school down the street. Not interested in that at all, but… ouch.

I also don't know if I could spend all day teaching game-based learning. It's not my passion. You see, I went to get a Ph.D. as a frustrated instructional designer that couldn't get a foot in the research door because I wasn't a professor. I wanted to follow my passions into researching ideas that made a practical difference (like many other Ph.D. students, I am sure). That was five years ago, and the general state of academia has declined rapidly since then. I'm hardly enthusiastic to jump on the tenure track when that is such a minefield – if I can even get on the tenure track, which is difficult at best in the current university climate.

Oh, and now in many states students could be packing heat. So, yay safety.

So now that my pity party has been dragging on forever and will probably cost me the 6 readers I get for any post (WordPress stats are depressing as well), I leave anyone still reading this my depressing confession: if you get a Ph.D., you may end up finding yourself at a crossroads to choose between your passions and what will actually get you somewhere. If your passions line up with the cool crowd, you are lucky; if they don't, you have a hard choice to make. I can't tell you which one I will make. Obviously, I will be choosing very soon. But do I really want to push off in the opposite direction of the stream of hip ideas that have "dominated the field's time, energy, and precious few resources"? It's hard to say. But an important question to ask oneself.