
Big (Scary) Education (Retention) Data (Surveillance)

Big data in education might be the savior of our failing learning system or the cement shoes that drag the system to the bottom of the ocean, depending on who you talk to. No matter what your view of big data is, it is here and we need to pay attention to it.

My view? Extreme concern over the glaring problems, mixed with hope that we can correct course on those problems and do something useful for learners with the data.

Yesterday at LINK Lab we had a peek behind the scenes at a data collection tool that UTA is implementing. The people that run the software at UTA are good people with good intentions. I also hope they are aware of the problems already hard-coded into the tool (and I suspect they are).

Big Data can definitely look scary for a lot of reasons. What we observed was mostly focused on retention (or “persistence,” the friendlier term I believe the software uses). All of the data collected basically turns students into a collection of numbers on hundreds of continuums, and then averages those numbers out to rank them on how likely they are to drop out. To some, this is a scary prospect.

Another scary prospect is the real danger of using that data to decide which students to ignore (because they are going to stick around anyway) and which students to focus time and energy on (in order to make the university more money). This would be data as surveillance more than as an educational tool.

Looking at the factors that learners are ranked by in this data tool led to no surprises – we have known from research for a long time what students that “persist” do and what those that don’t “persist” do (or don’t do). The lists of “at risk” students that these factors produce will probably not be much different from the older “at risk” lists that have been around for decades. The main change will be that we offload the process of producing those lists to the machines, and wash our hands of the bias that has always existed in producing those lists in the first place.

And I don’t want to skip over the irony of spending millions of dollars on big data to find out that “financial difficulties” are the reason that a large number of learners don’t “persist.”

The biggest concern that I see is the amount of bias being programmed into the algorithms. Even the word “persistence” implies certain sociocultural values that are not the same for all learners. In our short time looking around in the data collection program, I saw dozens of examples of positivist white male bias hard-coded into the design.

For example, when ranking learners based on grades, one measure ranked learners in relation to the class average. Those that fell too far below the class average were seen as having one risk factor for not “persisting.” This is different from looking at grades as a whole. If the class average is a low B but a learner has a high B, they would be above the class average and in the “okay” zone for “persistence.”
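To make that hard-coded assumption concrete, here is a minimal sketch of a relative-to-average rule of this kind – my own hypothetical reconstruction in Python, not the vendor’s actual code:

```python
# Hypothetical reconstruction of a relative-to-average risk rule.
# The cultural assumption is baked in: "risk" is defined only by
# distance below the class mean, never by what the grade itself
# means to the learner.
def at_risk_by_average(grade: float, class_grades: list[float],
                       threshold: float = 10.0) -> bool:
    """Flag a learner when their grade falls more than `threshold`
    points below the class average."""
    class_avg = sum(class_grades) / len(class_grades)
    return grade < class_avg - threshold

print(at_risk_by_average(84, [95, 96, 93, 97]))  # True: flagged in a high-average class
print(at_risk_by_average(84, [78, 75, 80, 79]))  # False: same grade, no flag
```

Note that the rule never asks what a grade means to the learner – only where it sits relative to the rest of the class.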

But that is not how all cultures view grades. My wife is half Indian and half Australian. We have been to India and talked to many people that were under intense stress to get the highest grades possible. It is a huge pressure for many in certain parts of that culture. But even a low A might not register as a troubling signal if the class average is much lower. But to someone that is facing intense pressure to get the best grades or else come home and work in Dad’s business… they need help.

(I am not a fan of grades myself, but this is one area that stuck out to me while poking around in the back end of the data program)

This is an important issue since UTA is designated as a Hispanic-Serving Institution. We have to be careful not to fall into the same traps that education has fallen into for centuries related to inequalities. But as our LINK director Lisa Berry pointed out, this is also why UTA needs to dive into Big Data. If we don’t get in there with our diverse population and start breaking the algorithms to expose where they are biased, who else will? Hopefully there are others, but the point is that we need to get in there and critically ask the hard questions, or else we run the risk of perpetuating educational inequalities (by offloading them to the machines).

For now, a good place to start is by asking the hard questions about privacy and ownership in our big data plan:

Are the students made aware that this kind of data is being collected?

If not, they need to be made aware. Everywhere that data is collected, there should be a notification.

Beyond that, are they given details on what specific data points are being collected?

If not, they need to know that as well. I would suggest a centralized ADA-compliant web page that explains every data point collected in easy-to-understand detail (with as many translations into other languages as possible).

Can students opt-out of data collection? What about granular control over the data that they do allow to be collected?

Students should be able to opt out of data collection. Each class or point of collection should have permissions. Beyond that, I would say they should be able to say yes or no to specific data points if they want to. Or even beyond that, what about making data collection opt-in?
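As a thought experiment, here is a hypothetical sketch of what granular, opt-in consent could look like – the names and structure are my own illustration, not any existing product’s API – where the default is that nothing is collected:

```python
# Hypothetical sketch of granular, opt-in data collection consent.
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    student_id: str
    opted_in: bool = False  # opt-in by default: nothing is collected until the student says yes
    allowed_points: set[str] = field(default_factory=set)  # specific data points allowed

    def permits(self, data_point: str) -> bool:
        """Collection is allowed only if the student opted in AND
        explicitly allowed this specific data point."""
        return self.opted_in and data_point in self.allowed_points

consent = ConsentRecord("s123", opted_in=True,
                        allowed_points={"lms_logins", "assignment_grades"})
print(consent.permits("assignment_grades"))  # True
print(consent.permits("library_swipes"))     # False: never collected
```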

Who owns the students’ data (since it is technically their actions that create the data)?

This may seem radical to some, but shouldn’t the student own their own data? If you say “no,” then they should at least have the right to access it and see what is being collected on them specifically.

Think of it this way: How will the very substantial Muslim population at UTA feel about a public school, tied to the government, collecting all of this data on them? How will our students of color feel about UTA collecting data on them while they are voicing support for Black Lives Matter? How would the child of illegal immigrants feel about each class at UTA collecting data about them that could incriminate their parents?

These issues are some of the hard things we have to wrestle with in the world of Big Data in Education. If we point it towards openness, transparency, student ownership, and helping all learners with their unique sociocultural situations, then it has potential. If not, then we run the risk of turning Big Education Data into Scary Retention Surveillance.


Disruption is No Longer Innovative

How can you tell if an innovator is pulling your leg? Their lips are moving. Or their fingers are typing. I write that knowing full well that it says a lot about my current title of “learning innovation coordinator.” To come clean about that title: we were allowed to choose our titles to some degree. I chose that one for purely political reasons. I knew that if I wanted to help bring some different ideas to my university (like Domain of One’s Own, Learning Pathways, Wearables, etc.), I would need a title beyond something like “instructional technologist” to open doors.

But beyond a few discussions that I have on campus, you will rarely hear me talking about “innovation,” and I reject the title of “innovator” for almost anyone. Really, if you think any technology or idea or group is innovative, put that technology or idea into Google followed by “Audrey Watters” and get ready for the Ed-Tech history lesson the “innovators” tend to forget to tell you about.

In a broad sense, many would say that the concept of “innovation” involves some kind of idea or design or tool or whatever that is new (or at least not previously very, very “popular”). Within that framework of innovation, disruption is no longer “innovative.” Disruption is really a pretty old idea that gained popularity after the mp3 supposedly “disrupted” the music business and/or the digital camera disrupted the camera industry.

Of course, that is not what happened – mp3s and digital cameras just wrenched some power out of the hands of the gatekeepers of those industries, who then responded by creating the “disruption narrative” (which is what most are referring to when they just say “disruption”) and proceeded to use that narrative to gain more control over their industries than before (for example, streaming music services). Keep this in mind any time you read someone talking about “disruption” in education. Who is saying it, what do they want it to do, and how much more control do they get over the educational process because of their disruption narrative?

Of course, there is debate over whether disruption is real or not. Both sides have good points. Regardless of whether you believe that disruption is real, our current disruption narrative has been around for over two decades now… probably long past the expiration date that gets slapped on any “innovative” idea. If you are still talking disruption, you are not an innovator.

If you want to convince me that you are an innovator, I don’t want to know what cool ideas or toys you have. I want to know who you read and follow. Are you familiar with Audrey Watters? Have you read Gayatri Chakravorty Spivak’s “Can the Subaltern Speak?” Are you familiar with Adeline Koh’s work on Frantz Fanon? Do you follow Maha Bali on Twitter? If I mention Rafranz Davis and #EdtechBlackout, do I get a blank stare back from you?

If you were to chart the people that influence your thinking – and it ends up being primarily white males… I am not sure how much of an innovator you really are. Education often operates as a “one-size-fits-all” box (or at best, a “one-set-of-ideas-fits-all” box), and that box has mostly been designed by white males. Usually a small set of white males that think all people learn best like they do. How can your idea or technology be that “new” if it is influenced by the same people that influenced all of the previous ones?

So what has this “one-set-of-ideas-fits-all” box created for education? Think tanks and university initiatives that sit around “innovating” things like massive curriculum rethinking, “new” pedagogical approaches, and “creative new applications of a range of pedagogical and social technologies.” They try to come up with the solutions for the learners. Many of these are probably some great ideas – but nothing new.

Why not find ways to let the learners set their own curriculum, follow their own pedagogical approaches, or create their own ways of applying technology? Instead of walling ourselves up in instructional design teams, why not talk to the learners themselves and find out what hinders their heutagogical development? Why not look to learners as the instructors, and let them into the design process? Or dump the process and let learners be the designers?

What I am getting at is helping learners create and follow their own learning pathway. Each one will be different, so we need massive epistemological and organizational shifts to empower this diversity. Why not make “diversity” the new “innovative” in education? Diversity could be the future of educational innovation, if it could serve as a way to humanize the learning process. This shift would need people that are already interacting with a diverse range of educators and students to understand how to make that happen.

I would even go as far as to say that it is time to enter the “post-innovation” era of Ed-Tech, where any tool or idea is framed based on whether it supports a disruption mindset or a diversity mindset. What does that mean for emerging ideas like big data or wearables? Post-innovation would not be about the tool or the system around it, but the underlying narrative. Does this “thing” support disruption or diversity? Does it keep power with the gatekeepers that already have it, or empower learners to explore what it means to be their own unique “human” self in the digital age?

For example, if “big data” is just used to dissect retention rates, and then to find ways to trick students into not dropping out… that is a “disruption” mindset. “We are losing learners/control, so let’s find a way to upend the system to get those learners back!” A diversity mindset looks at how the data can help each individual learner become their own unique, self-determined learner in their particular sociocultural context: “Based on this data that you gave us permission to collect, we compared it anonymously to other learners, and they were often helped by these suggestions. Do any of these look interesting to you?” Even if the learner looks at these options and rejects all of them, the process of thinking through those options will still help them learn more about their unique learning needs and desires. It will help them celebrate their unique, diverse human self instead of becoming another percentage point in a system designed to trick them into producing better-looking numbers for the powers that be.

This is also a foundational guiding aspect of the dual-layer/learning pathways idea we are working on at the LINK Lab. It is hard to come up with a good name for it, as we are not really looking at it as a “model” but something that turns the idea of a “model” or “system” inside out, placing each individual learner in the role of creating their own model/pathway/system/etc. In other words, a rejection of “disruption” in favor of “diversity.” We want to embrace how diversity has been and always will be the true essence of what innovation should have been: each learner defining innovation for themselves.


Personalized Learning Versus Dungeons and Dragons

Personalized learning is popular right now. But is that a good or a bad thing? I can buy all kinds of personalized gadgets online, but do I really like or need any of them? If you decided to get me a custom dinner place mat that says “Matt’s Grub” – sure, that is personalized. But it’s also a pretty useless personalized item that I have no interest in.

Many prominent personalized learning programs/tools are a modern educational version of the Choose Your Own Adventure book series from the 1980s. As I have written before, these books promised a personalized adventure for the reader, which was entertaining for a while. But you were really just choosing from a series of 50 pre-written paths, hoping to pick one of the ones that led to a happy ending. Of course, if you happened to have any physical characteristics that were different from the ones written into the story, the “your” in “Choose Your Own Adventure” fell flat (I remember a classmate that had shaved his head making fun of one page that had the main character doing something with his hair – and yes, the stories were sometimes gendered as well).

[Image: Choose Your Own Adventure book cover]

These eventually evolved into more complex books like the Lone Wolf gamebooks that had you doing your own battles, collecting objects, and other activities that were closer to role playing games.

[Image: Lone Wolf gamebook cover]

But let’s face it – the true “Choose Your Own Adventure” scenarios in the 1980s were really role playing games. And few were as personalizable as Dungeons and Dragons.

Now, whether you love or hate D&D, or even still think it is Satanic… please hear me out. D&D, at least in the 80s, was personalizable because it provided different pathways that were scaffolded. New players could start out with the Basic D&D boxset – which came with game rules, pre-designed characters, basic adventures to go on, etc. And that wasn’t even really the starting point. If Basic D&D was too unstructured for you, there were books like the Dragonlance Chronicles or the Shannara series that would give you a completely guided tour of what D&D could look like. Oh, and even a Saturday morning cartoon series if the books were too much for you.

But back to D&D: once you mastered the Basic set, there were more sets (Expert, Companion, Master, and Immortal) – all of which gave you more power and control. Then, when you were ready (or if you found Basic D&D too pre-determined), there was Advanced Dungeons and Dragons. This was a set of books that laid out some basic ideas to create your own characters and worlds and adventures. And you were free to change, modify, add to, or completely re-invent those basics. Many people did, and shared their modifications in national magazines like Dragon Magazine. Oh, and what if you wanted to make your own world but were still unsure? You had a whole range of pre-designed adventures called Dungeon Modules. Just buy one, play, and get inspired to create your own. Or maybe the opposite was true: you were just tired of your creation and wanted to take a break in someone else’s world.

[Image: Advanced Dungeons & Dragons books]

To me, Dungeons and Dragons in the 1980s was a much better metaphor for what personalized learning should look like. You had completely mindless escapist entertainment (aka lectures) when you needed it, like the books and cartoons. You had the structured environment of Basic D&D to guide you through the basics (aka instructivism). You had a series of games and accessories like Dungeon Modules and Companion Sets to guide you (aka scaffold you) to the advanced stage. You had the Advanced books that set a basic structure for creating your own world (aka the Internet). Then you had a network of people sharing ideas and designs to keep new ideas flowing (aka connectivism). Many gamers would go back and forth between these various parts – creating their own world, sharing their ideas in the magazines, playing dungeon modules on occasion, reading the books, and dipping back into Basic D&D when the mood hit them.

This scene from The Big Bang Theory shows how players can customize, adapt, and personalize the game experience, even as they play.

Of course, there were problems with the gaming community. It was expensive, and often sexist and/or racist. So I am not painting the Dungeons and Dragons world of the 1980s as some perfect utopia. I am looking at the design of the tools and system here. It is one that in some fashion preceded and informed what we are doing with pathways learning, and one that I think is closer to true “personalization” than what some personalized learning situations offer.


Pokemon Go and the Gimmickification of Education

I almost dread looking at my social media feed today. Pokemon Go (GO? G.O.? (wake me up before you) Go-Go?) received a huge amount of media attention this weekend, apparently already spawning posts about how it will revolutionize education and tweets about how we need what it produces in education.

All I could think about is: how did we get to this point? Every single tech trend turns into a gimmick to sell education mumbo jumbo kitsch tied to every cool, hip trend that pops up on the social media radar. I guess I shouldn’t have been that surprised once blockchain became educational, or Second Life was used to deliver classes, or Twitter replaced LMSs, or MySpace became the University of the future, or DVDs saved public schools, and so on and so forth. I bet at some point thousands of years ago there was a dude in a white toga standing up in an agora somewhere telling Plato how chariots would revolutionize how he taught his students.

I’m all for examining new trends through an educational lens, but every time I just want to say “too far, Ed-Tech®, too far!”

We all know education needs to change. It always has been changing, it always will, and will always need to have a critical lens applied to how and why it is changing. But with every new technology trend that gets re-purposed into the next savior of education, I can’t stop this gnawing feeling that our field is becoming a big gimmick to those outside of it.

A gimmick is basically just a trick intended to attract attention. One or two seem harmless enough. Well, not that harmful? But once everything that comes down the pipe becomes this trick to get people to look at education, the gimmick gets old. People are still asking what happened to Second Life, to Google Wave, to you-name-the-trend. After a while, they stop buying into the notion that any of us know what we are talking about. Just think of the long-term effect on the larger discourse of so many people declaring so many things to be the savior of education, only to abandon each one after a year or two.

The problem with the hype cycle of Ed-Tech is that it buries the real conversations that have been happening for a long time on whatever the hype-du-jour is. Do you want the Pokemon Go for education, where students are engaged, active, social, etc.? We already have a thousand projects that have done that to some degree. Those projects just can’t get attention because everyone is saying “Pokemon Go will revolutionize education!” (well, at least those that say that un-ironically – sarcastic commentary that apparently went over many people’s heads not included).

(see also “Pokemon GO is the xMOOC of Augmented Reality“)


Evolution of the Dual-Layer/Customizable Pathways Design

For the past few weeks, I have been considering what recent research has to say about the evolution of the dual-layer aka customizable pathways design. A lot of this research is, unfortunately, still locked up in my dissertation, so it’s time to get to publishing. But until then, here is the rundown of where it has been and where it is going.

The original idea was called “dual-layer” because the desire was to create two learning possibilities within a course: one a structured possibility based on the content and activities that the instructor thinks are a good way to learn the topic, the other an unstructured possibility designed so that learners can create their own learning experience as they see fit. I am saying “possibility” where I used to say “modality.” Modality really serves as the best description of these possibilities, since modality means “a particular mode in which something exists or is experienced or expressed.” The basic idea is to provide an instructivist modality and a connectivist modality. But modality seems to come across as too stuffy, so I am still looking for a cooler term to use there.

The main consideration for these possibilities is that they should be designed as part of the same course, in a way that learners can switch back and forth between them as needed. Many ask: “why not just design two courses?” You don’t want two courses, as that could impede changing modalities, as well as create barriers to social interactions. The main picture that I have in my head to explain why this is so is a large botanical garden, like this:

[Image: Scene at Fort Worth Botanic Gardens]

There is a path there for those that want to follow it, but you are free to veer off and make your own path to see other things from different angles or contexts. But you don’t just design two gardens, one that is just a pathway and one that is just open fields. You design both in one space.

So in other words, you design a defined path (purple line below) and then connect with opportunities to allow learners to look at the topic from other contexts (gold area below):

[Image: diagram of a defined path (purple line) surrounded by other contexts (gold area)]

You have a defined modality (the path), and then open up ways for people to go off the path into other contexts. When allowed to mix the two, the learner would create their own customized pathway, something like this:

[Image: diagram of a customized pathway weaving on and off the defined path]

The problem with the image above is that this really shouldn’t only be about going off the walkway in the garden to explore other contexts. Learners should be allowed to dig under the walkway, or fly above it. They should be able to dig deeper or pull back for a bird’s eye view as needed. So that would take the model into a three dimensional view like this:

[Image: three-dimensional version of the pathways diagram]

(please forgive my lack of 3-D modeling skills)

Learners can follow the instructor’s suggested content or go off in any direction they choose, and then change to the other modality at any moment. They can go deeper, into different contexts, deeper in different contexts, bigger picture, or bigger picture in different contexts.

The problem that we have uncovered in using this model in DALMOOC and HumanMOOC is that many learners don’t understand this design. Many do understand and appreciate the choice… but there are some that don’t want to get bogged down in the design choices. Some that choose one modality don’t understand why the other modality needs to be in the course (while some that have chosen that “other” modality wonder the same thing in reverse). So really, all that I have been discussing so far probably needs to be relegated to an “instructional design” / “learning experience design” / “whatever you like to call it design” method. All of this talk of pathways and possibilities and modalities belongs in the design process. There are ways to tie this idea together into a cohesive learning experience through goal posts, competencies, open-ended rubrics, assignment banks, and scaffolding. Of course, scaffolding may be a third modality that sits between the other two; I’m not totally sure if it needs to be its own part or a tool in the background. Or both.

The goal of this “design method” would be to create a course that supports learners that want instructor guidance while also getting out of the way of those that want to go at it on their own. All while also recognizing that learners don’t always fall into those two neat categories. They may be different mixtures of both at any given moment, and they could change that mixture at any given point. The goal would be to design in a way that gives the learner what they need at any given point.

Of course, I think the learner needs to know that they have this choice to make. However, going too far into instructivism vs. connectivism or structured vs. unstructured can get some learners bogged down in educational theory that they don’t have time for. We need to work on a way to decrease the design presence so learners can focus on making the choices to map their learning pathway.

So the other piece of the evolution of this pathways idea is creating the tools that allow learners to map their pathway through the topic. What comes to mind is something like Storify, re-imagined as an educational mapping tool in place of an LMS. What I like about Storify is the simple interface, and the way you can pull up a whole range of content on the right side of the page to drag and drop into a flowing story on the left.

[Image: Storify interface screenshot]

I could imagine something like this for learners, but with a wide range of content and tools (prescribed by the instructor, the learner, other learners, and other places on the Internets) on the right side. Learners would drag and drop the various aspects they want into a “pathway” through the week’s topic on the left side. The parts that they pull together would contain links or interactive parts for them to start utilizing.

For example, the learner decides to look at the week’s content and drags the instructor’s introductory video into their pathway. Then they drag in a widget with a Wikipedia search box, because they want to look at bigger picture on the topic. Then they drag a Twitter hashtag to the pathway to remind themselves to tweet a specific question out to a related hashtag to see what others say. Then they drag a blog box over to create a blog post. Finally, they look in the assignment bank list and drag an assignment onto the end of the pathway that they think will best prove they understand the topic of the week.

The interesting thing about this possible “tool” is that after creating the map, the learner could then create a graph full of artifacts of what they did to complete the map. Say they get into an interesting Twitter conversation. All of those tweets could be pulled into the graph to show what happened. Then, let’s say their Wikipedia search led them to read some interesting blog posts by experts in the field. They could add those links to the graph, as well as point out the comments they made. Then, they may have decided to go back and watch the second video that the instructor created for the topic. They could add that to the graph. Then they could add a link to the blog post they created. Finally, they could link to the assignment bank activity that they modified to fit their needs. Maybe it was a group video they created, or whatever activity they decided on.
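As a rough illustration of what such a tool might store behind the scenes – all names here are hypothetical, not an existing tool’s API – the map and the artifact graph could be as simple as two linked lists of elements:

```python
# A hypothetical sketch of the data a pathway-mapping tool might store:
# a learner-built map of planned elements, plus a graph of artifacts
# linked back to those elements after the fact.
from dataclasses import dataclass, field

@dataclass
class PathwayElement:
    kind: str    # "video", "wikipedia_search", "tweet", "blog_post", ...
    source: str  # who suggested it: "instructor", "learner", "peer", ...
    link: str    # URL or embed reference

@dataclass
class Artifact:
    element_index: int  # which pathway element this evidences
    description: str
    link: str

@dataclass
class LearnerPathway:
    topic: str
    elements: list[PathwayElement] = field(default_factory=list)
    artifacts: list[Artifact] = field(default_factory=list)

week3 = LearnerPathway("Week 3: Learning Analytics")
week3.elements.append(PathwayElement("video", "instructor", "https://example.edu/intro"))
week3.elements.append(PathwayElement("tweet", "learner", "#hypothetical-tag"))
# Later, the learner attaches evidence of what actually happened:
week3.artifacts.append(Artifact(1, "Twitter conversation that followed my question",
                                "https://example.com/thread"))
```

The important design choice is that the learner, not the instructor or the LMS, owns and assembles both lists.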

In the end, the graph that they create itself could serve as their final artifact to show what they have learned during the course. Instead of a single “gotcha!” test or paper at the end, learners would have a graph that shows their process of learning. And a great addition to their portfolios as well.

Ultimately, these maps and graphs would need to be something that resides on each learner’s personal domain, connecting with the school domain as needed to collect map elements.

When education is framed like this, with the learner on top and the system underneath providing support, I also see an interesting context for learning analytics. Just think of what it could look like if, for instance, instead of tricking struggling learners into doing things to stay on “track” (as defined by administrators), learning analytics could provide helpful suggestions for learners (“When other people had problems with _____ like you are, they tried these three options. Interested in any of them?”).
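A minimal sketch of that suggestion-not-surveillance framing, with entirely made-up data and function names:

```python
# Hypothetical sketch of analytics as suggestion rather than surveillance:
# options are offered for the learner to accept or reject, never enforced.
def suggest_options(issue: str, anonymized_outcomes: dict[str, list[str]]) -> list[str]:
    """Return approaches that helped other (anonymized) learners with the
    same issue; the learner is free to reject all of them."""
    return anonymized_outcomes.get(issue, [])

outcomes = {
    "stuck_on_statistics": ["join a peer study group",
                            "try the worked-example videos",
                            "book time with a tutor"],
}
print(suggest_options("stuck_on_statistics", outcomes))
```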

Of course, I realize that I am talking about upending an entire system built on “final grades” by instead focusing on “showcasing a learning process.” Can’t say this change will ever occur, but for now this serves as a brief summary of where the idea of customizable pathways has been, where it is now (at least in my head; others probably have different ideas that are pretty interesting as well), and where I would like for it to go (even if only in my dreams).


Depressing Confessions of a “Newly Minted” Ph.D.

I have been struggling with this blog post for much longer today than I probably should admit. Lots of people ask you what you are going to do “now that you have a Ph.D.” And the truth is, I really don’t know. I currently work in a nice position that requires Ph.D.-level work, so it’s not like I am in a hurry to change things. But it is also a position that requires me to determine what I want to research, so staying put or looking elsewhere leaves me with the same confusion over “what’s next?” either way.

But why do I feel so confused over the future? This line from Jim Groom’s recent post seemed to finally clarify my hang-up:

“a bunch of folks who have been, for the most part, marginalized by the ed-tech gold rush for MOOCs, big data, analytics, etc—a push for the last four years that dominated the field’s time, energy, and precious few resources.”

There are interesting things happening in those “gold rush” areas, and also some concerning things. But our field, overall, does have a “cool” crowd and a “not so cool” crowd. If you are not currently into analytics, wearables, and a few other hot topics… you are usually left in the margins. I’m not sure if marginalized is the best word, but maybe… toiling in obscurity? For example, even bad ideas in analytics get more attention, more funding, more awards, etc, than great ideas in more obscure fields like instructional design, learning theory, etc.

That is not to slam analytics or wearables or whatever as a whole – there are some great ideas there, as Vitomir Kovanovic stated today.

The “gold rush” often focuses on the “bad ones” because they can get something out there quicker, as George Siemens wisely pointed out.

So for a lot of these “hot topics,” I don’t hate them as much as see them having a long waiting period to mature into something practical. In the meantime, the instructional designer in me knows of practical ideas that can be used right now to make a dent in things.

But, the depressing truth is that these ideas will mostly always be kicking around on the fringes. When people like Mike Caulfield complain about feeling obscure, and his ideas are a hundred times more popular than the ones I am interested in… it doesn’t make one want to sign up for years and years of fringe work.

Personally, I think the idea of “thought leader” is a bit along the lines of “rock star.” Others see it differently, and that is fine with me. But “thought leaders” are still part of the cool crowd, whereas “thought synthesizers” tend to get left out of the conversation frequently. Most of the really interesting things that I like to work on, like customizable pathways design, are not really the result of “thought leaders” as much as “thought synthesizers.”

So the problem is: should I throw my lot in with the cool kids and do things that I am maybe kind of interested in, or follow my passions into obscurity?

To be honest, I don’t really know. I am technically already in obscurity, so nowhere to go but up, right? A lot of this is not really about me, but the ideas that I think have great potential. They are also, unfortunately, ill-defined, poorly worded (too many syllables, which I say in all seriousness and not flippantly), not sexy, not flashy, not cool. I could very easily hitch my wagon to some ideas that are cool-sounding and sexy. Someone sent me a link to a university that was looking for a Professor of Game-Based Learning that they thought I would be a good fit for. Sounds fun, flashy, hip, etc. But it was also in Texas, and let’s face it: Texas is not a great place to live (sorry if you think it is). And they pay academics poorly. I just found out this week I could get a raise if I went to teach high school down the street. Not interested in that at all, but… ouch.

Also don’t know if I could spend all day teaching game-based learning. Not my passion. You see, I went to get a Ph.D. as a frustrated instructional designer that couldn’t get a foot in the research door because I wasn’t a professor. I wanted to follow my passions into researching ideas that made a practical difference (like many other Ph.D. students I am sure). That was five years ago, and the general state of academia has declined rapidly since then. I’m hardly enthusiastic to jump on the tenure track when that is such a minefield. If I can even get on the tenure track – that is difficult at best in the current university climate.

Oh, and now in many states students could be packing heat. So, yay safety.

So now that my pity party has dragged on forever and will probably cost me the 6 readers I get for any post (WordPress stats are depressing as well), I leave anyone still reading this my depressing confession: if you get a Ph.D., you may end up finding yourself at a crossroads, choosing between your passions and what will actually get you somewhere. If your passions line up with the cool crowd, you are lucky; if they don’t, you have a hard choice to make. I can’t tell you which one I will make. Obviously, I will be choosing very soon. But do I really want to push off in the opposite direction of the stream of hip ideas that have “dominated the field’s time, energy, and precious few resources”? It’s hard to say. But it is an important question to ask oneself.

[Image: “The Kids Don’t Stand a Chance” by Nina Vital]

Reclaim the Front Page of Your Learning Experience for #IndieEdTech

One of the most contested areas in online learning is what I sometimes call the “front page” – the user interface or splash page or whatever main area learners first see when they start a course/learning experience (usually also the main area they have to come back to every time). Schools want to control the “front page” learners see first in their class (usually the learning management system they paid big money for). Ed-Tech companies want to control the “front page” learners see when they use their product. Non-educational websites that get used in education, like Twitter or Facebook, also want to control the “front page” of what users see. Of course, the average learner uses many of these services and has to navigate through many tools that are trying to control what they see while they learn – to control the “front page” of their learning experience.

The “front page” is how companies gather data for analytics so they can monetize users. Think back to the major changes between MySpace and Facebook. As horrible as MySpace could look at times, users could insert CSS and control all manner of aspects of their front page. That control was a good thing, despite the eyesores it created from time to time. How can a company monetize a MySpace user page when users can completely remove portions of the page? How can a company monetize interactions when users rarely have to leave their “space” to interact with others? The changes between MySpace and Twitter/Facebook resolved a lot of those issues, and hence created the battle for the “front page” of users’ internet experience.

This may not seem to be a big deal to many, but as we have been researching learner agency by giving learners modality choice in a customizable modality pathway design (aka “dual-layer”), the “front page” becomes a very, very important space that affects learner choice in major ways. The tool that learners begin a learning experience in becomes the place they are comfortable with, and they resist venturing past the “front page” of that tool. You might have run into this problem with, say, introducing Twitter into a course taught in Blackboard. Many learners start to complain that the Bb forums would work just fine. There is a stickiness to the front page that keeps learners in there and away from other tools.

Shouldn’t the learner be in control of this “front page”? Shouldn’t this “front page” display their map of what they want to learn? Shouldn’t the tools and content and things they want to learn with/from support this map, linking from the learner’s “front page” rather than competing with it?

This is pretty much the big problem we run into with the customizable modality pathway design. The “front page” control battle segments the learning process, pulls learners away, makes them comfortable with giving up control of that space, and enforces the status quo of instructor-controlled learning. Up to this point, we have been working on design and structure – all of which is, for better or worse, coalescing into a design theory/method of some kind. However, the technology is simply in the way most of the time, mainly because very few tools actually work to give the learner control. They mostly all attempt to put their tool in control, and by extension, the person (instructor and school admins) behind the tool in control as well.

In many ways, I think this issue connects to the Indie Ed-Tech movement. I’ll be blogging more about that over the next few weeks/months. I’ll need to cover how the technology that allows learners to reclaim the “front page” of their learning experience could look – turning the idea of a “Neutral Zone” into a learning map that learners build (and then connect artifacts to create a “portfolio” map of what they did). This will allow learners to mix and match what tools, services, courses, etc. they learn from, leading to alternative ways to prove/certify that they have specific knowledge and skills, fueled by owning their own domain, APIs, cool stuff, etc. Of course, since any work in Indie Ed-Tech needs a music reference, I will be taking up Adam Croom’s challenge for someone to write about grunge rock (not my favorite style of music – Audrey Watters already used that one – but a good genre to represent the angle I am going for). So, yes, I have a good dozen blog posts in mind already; time to get cracking.

(image credit: “The Kids Don’t Stand a Chance” by Nina Vital)


Will The “Best” Best Practice Please Step Forward?

Whenever educational discussions turn towards student agency, learner-centered learning, and other less-utilized (non-instructivist) strategies, several common questions/concerns are raised about going this route. One of the more important ones is: how do we put learners in control when there are so many learning mediums? How do we pick which one is best?

This is a great question. We should always strive towards what is best for our learners. The problem comes not really with the question itself but with the assumption that one or a few mediums are “best” and that we as educators can pick correctly for all learners at all times.

“Best practices” is a term commonly used in this context, and a problematic concept for many reasons. One of the bigger problems being that “best” is not really an objective line in the sand. What is “best” is constantly changing based on context, goals, preferences, and many other factors.

For example, different learning modalities each have their own set of best practices. Do you want a stereotypical instructor-focused course with lectures and quizzes? There are many ways to do that correctly, and many ways to do that incorrectly. Very incorrectly.

Do you want problem-based learning? Our field knows a lot about how to do that correctly, and a lot about how to do it incorrectly. There is also a lot we don’t know. And all of that changes drastically if you want, say, a well-defined, contextually specific problem versus an ill-structured problem.

Other modalities (connectivist, cognitivist, social, independent, etc.) have their own sets of best practices, and each set changes within each modality depending on what flavor of that modality you are choosing. And even then there are still so many best practices that it really dilutes the term “best practice” down to “do the good stuff, avoid the bad stuff, and be cautious with all of the stuff that we aren’t sure where it fits.”

Of course, sometimes when we say “best,” we are referring to choosing the “best” overall modality for a course, or even better, a given module inside a course. Anyone that has taught will know that once you choose a modality, half your learners will like it, and the other half will complain: “Why do we have to do group work? Why can’t you just tell us what to do?” “Why do we have to listen to you tell us what to do? Why can’t we just go do it on our own?” “Why can’t I have a group to help me?” And so on (even if you don’t hear them, you know they are happening in your learners’ heads).

The truth is that different learners need different modalities for different topics at different times, sometimes even changing from one day to the next based on a whole range of internal and external reasons.

This means that the best device for choosing the best modality for any given learner at any given time is the learner themselves.

This whole post was inspired by a few tweets today that I think sum up nicely what I am really getting at.

The general idea is that our education needs to shift towards teaching learners how to learn, how to adapt, how to choose their own modality as they learn. We need to focus more on how to be learners and not just what facts and skills to learn. You know, teach a person to fish and all that. This is the basis of heutagogy – the process of learning how to learn, how to adapt, how to self-regulate towards self-determined learning.

In other words, how do we get back to putting the human at the center of the educational process instead of our favorite tools and modalities?

One practical way some are working on this idea is the customizable modality pathway learning design (my term du jour for what we used to call dual-layer). Shameless plug warning! Last week I was able to successfully defend my dissertation on this idea (and there was much rejoicing!). So hopefully, after a few months of revisions and edits, I will soon be able to start publishing the results on how diverse and personalized learners’ pathways are once they are given the choice. The educational field in general so rarely gives much true learner choice or agency that the outcome of enabling that choice is pretty eye-opening.


Can the Students Speak for Themselves?

The answer is, yes, of course students can speak for themselves. The real question is will we listen to them, and even start including them in the conversation about their own educational experiences? This is not just a question for the established educational power systems that we typically associate with ignoring the student voice, but also for the educational reformers that seek to change those entrenched structures.

Recently I have been digging more into the work of the Indian-born philosopher Gayatri Chakravorty Spivak. Possibly her best-known work is “Can the Subaltern Speak?”, an eye-opening critique of the post-colonial movement. For those that haven’t read Spivak, I would recommend Benjamin Graves’ extended one-paragraph review of “Can the Subaltern Speak?” as a quick introduction.

The basic concern is that those who wish to help the subaltern (the economically dispossessed) gain their voice are still forcing them to adopt one voice for the entire group, ignoring the differences that exist within that group. In other words, the post-colonialists are becoming a different type of colonialist. This leads to two problems: “1) a logocentric assumption of cultural solidarity among a heterogeneous people, and 2) a dependence upon western intellectuals to “speak for” the subaltern condition rather than allowing them to speak for themselves.” Sound familiar?

What if you replace “subaltern” with “student”? How about replacing “cultural solidarity” with “connectivism”? What about the recent claims that scaffolding is colonialist in nature? Pretty much insert any modern educational reformer’s idea that there are absolute good and bad solutions for all learners: “if we can just convince all learners that connectivism is good and that scaffolding is oppressive, we can improve education!”

But what if we are forcing learners to take on epistemological solidarity when they are actually a very heterogeneous group? What would they say about that if we listened to them when they speak for themselves?

We would find out that some learners want to follow the instructor. We would find out that some want to follow their own path. We would find out that many want both, just at the time of their own choosing. We would find that some love connectivism, while others find it inefficient and pointless. We would find that some hate scaffolding, while others think it is necessary. While scaffolding might be oppressive to some, it could be supporting or liberating to others. Or it could be both at different times to the same learner. Contexts shift. People change their minds.

These are not speculations. This is based on what learners have stated in the research for my dissertation. Learners are all over the map once you give them true choice, true personalization.

Which takes me to my problem with what many call personalized learning. Those of a certain age will remember the Choose Your Own Adventure book series. The basic idea of this book series was that the stories were not presented as a singular, linear path. Readers would read a few pages and then be presented with options. They would choose an option and turn to that corresponding page for that option, and so on until the adventure ended. Usually it ended poorly or kind of neutrally, but the goal was to keep trying until you arrived at one of the “good” endings. There were generally about 12-40 full story lines in each book to mix and match.

Most people that read these books developed a strategy of gaming the story lines, usually by bookmarking the last few choices with various fingers. If one choice led to death, just back up a step or two and try again.

The reality was that these were less “Choose Your OWN Adventure” and more “Choose One of 40 or So Pre-Determined Pathways to Entertain You With the Illusion of Choice.” This is also the premise of many (but not all) personalized learning systems. The programmers create a pre-determined set of options, and the learner has the illusion of “choice” and “personalization” as they choose among various pre-programmed scenarios.

To me, true personalized learning would allow learners to speak for themselves, while not forcing them to follow one person’s view of the “correct” way to learn. True personalized learning would treat learners as an epistemologically heterogeneous group, giving them the ability to speak for their own personal epistemology.

Because the bigger problem is that when the experts come in and say “connectivism is good, scaffolding is bad, here are the ways you are going to connect with others,” they are really just creating a form of neo-instructivism that still forces learners to follow what the expert at the front says to do (even though it may be pre-prescribed connected learning).

These neo-instructivist connected learning activities are not theoretical – they currently exist in online courses. Learners are told to write their own blog and then comment on three other blogs in order to pass. Or compose a tweet and then respond to three other tweets. Or post a picture on Instagram and then comment on three other pictures on Instagram.

Sure, that is connected learning and research tells us that learners will retain more because they applied it while connecting to others. But where is the student voice in forcing them to all have a blog and then forcing them to comment and interact (or else don’t pass the course you took out a big loan for)?

Or what of the instructor that doesn’t provide any guidance and just dives into student-centered learning… whether the learners want it or not? Where is the student voice in that pre-determined student-centered design?

Sure, these instructors will win awards and be praised all over the Twitter-sphere for innovative, connectivist learning. For fighting instructivist colonialism. And so on. But what if these post-instructivist crusaders are causing the same damage to learning that Gayatri Chakravorty Spivak saw the post-colonialist crusaders causing? What if we are mistaking a statistically significant research result for the lone “voice” of what works for all learners at all times?


People are Not Generalizable Cogs in a Wheel

One of the issues that we are trying to get at with dual-layer/customizable pathways design is that human beings are individuals with different needs and ever-changing preferences.

That seems to be an obvious statement to many, but a problematic one when looking at educational research. Or more correctly, how we use and discuss research in practical scenarios.

For example, whenever I mention how instructivism and connectivism can also be looked at as personal choices that individual learners prefer at different times, the response from educators is usually to quote research generalizations as if they are facts for all learners at all times:

More advanced learners prefer connectivism.
People that lack technical skills are afraid to try social learning.
Learners with higher levels of self-regulation hate instructivism.
Students that are new to a topic need instructor guidance.
Student-centered learning makes learners think more in depth.

While many of these statements are true for many people, the thing we often skip over in education is that these concepts are actually generalized from research. It is not the case that these concepts are true for all learners, but that they have been generalized from a statistically significant correlation. That distinction is important (and often ignored), because studies rarely find that these concepts are 100% true for 100% of the learners 100% of the time.
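To make that distinction concrete, here is a toy simulation (entirely made-up numbers, not from any study) showing how a group difference that would easily test as statistically significant still leaves a meaningful share of individuals who defy the generalization:

```python
# Toy simulation: a clearly "significant" group difference can coexist
# with plenty of individuals who cross over the other group's average.
import random

random.seed(42)
# Suppose "advanced" learners prefer connectivism more on average (mean 0.7)
# than "novice" learners (mean 0.5), on a 0-1 preference scale.
advanced = [min(max(random.gauss(0.7, 0.15), 0), 1) for _ in range(1000)]
novice   = [min(max(random.gauss(0.5, 0.15), 0), 1) for _ in range(1000)]

mean_a = sum(advanced) / len(advanced)
mean_n = sum(novice) / len(novice)
print(f"group means: advanced={mean_a:.2f}, novice={mean_n:.2f}")

# With n=1000 per group this difference would easily test as significant,
# yet a noticeable share of individuals land below the other group's mean:
crossover = sum(a < mean_n for a in advanced) / len(advanced)
print(f"{crossover:.0%} of 'advanced' learners score below the novice mean")
```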

But practitioners typically read these generalizations and then standardize them for all learners. We lose sight of the individual outliers that are not included in those numbers (and even of the fact that the data contains variations that get smoothed over in the quest for “generalization”).

Then, of course, we repeat those experiments with different groups and rarely check to see if those outliers in the new experiment are different types of people or the same.

We also rarely research courses where learners have true choice in the modality through which they engage the course content. So do we ever truly know if we are finding the best options for learning in general, or if we are just finding out what learners will do to make the best of being forced to do something they would rather not?

Are we losing sight of the individual, the unique person at the center of educational efforts?

My research is finding that, when given the freedom to choose their learning modality (instructivism or connectivism), learners stop falling into the neat categories that often come out of research. For example, those that are advanced learners with high self-regulation and well-developed tech skills will sometimes prefer to follow an instructivist path for a variety of reasons. Or, for another example, sometimes learners have already thought through an issue pretty well, and therefore forcing them to go through student-centered learning on that topic is a boring chore, because they don’t need to be forced to think about it again. Or, for yet another example, some learners with low self-regulation and low tech skills will jump head-first into connectivism because they want to interact with others (even though the research says they should have been too afraid to jump in).

When you actually dig into the pathways that individuals would choose to take if one is not forced on them, those individuals tend to defy generalization more often than expected. But when you point this out, the establishment of education tends to argue against those findings in all kinds of ways. We like the comfort of large sample sizes, generalizable statistics, and cut-and-dried boxes to put everyone in. I’m not saying to abandon this kind of research – just put it in a more realistic context in order to make sure we aren’t losing the individual human behind those generalizations.