Applying ‘Intelligence’ to Data

(Part of the ongoing discussion in the Designing for Data MOOC)

When Competitive Intelligence professionals delineate the needs of a client, they do so with the intent of delivering actionable intelligence.  One of the tools that can be used to define the desired data and analytics results is Jan Herring’s Key Intelligence Topics (KITs).

When we’re engaging in data-driven design, we might not have to survey the competitive landscape, but we can still adapt the KIT framework to better define our goals, strategies, and context.

KITs are broken into three categories: Strategic Decisions, Early Warning, and Key Players; those could probably be translated for learning data as follows:

Strategic Topics  (at the individual, course or organizational level)

What decisions are you hoping to make?

What actions might be influenced by this data?

What time frame are you working under?

How might this data be used at more than one functional level?

Internal/External Forces

What could possibly go wrong?

What could surprise you?

What might force change or disrupt the course of the project?

What might invalidate results?

What else could trip you up or simply force a change? Think of everything from privacy issues to technology limitations, budget cuts, disruptive technology, resources, and available analytics skills.

(hint – think about things that have tripped up projects in the past – start from there)

Key Players

Who needs this data? Why?  Who else?

What do they really need to know – be specific.  Then be even more specific.

What outside factors will influence how they hear and respond to this data?  Again, be specific.

As an example:

For one of my projects, this is a partial outline of some of the topics I need to investigate in detail to create a project/action plan.

Strategic Topics:

Learner self-assessment vs. behaviour in a gamified environment

Sources of red herrings in learning data

Dataset to support development of dashboard design heuristics/rubrics

Internal/External Forces:

Functional limits of analytics tool

Data Modeling expertise

Learner expectations, needs

Sample size sufficiency for certain statistical analyses

Difficulty validating hypotheses (limits to models)

Key Players:

Participants (profiles, needs)

Team members – personal objectives, availability

What data/reports will meet (name)’s needs

How does (name) define success?  Do I clearly know their goals?

 

Once you have a clear picture of project needs and potential influencing factors (whether using this model or any other), it will be much easier to address data needs and prospective analytical approaches.
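For what it’s worth, the outline also maps naturally onto a plain data structure, which makes it easy to revisit and extend as the project evolves. A minimal sketch in Python, reusing the example topics above (the structure itself is just one way to capture a KIT outline, not part of Herring’s framework):

```python
# A KIT outline as plain data: three categories, each holding the
# topics/questions to investigate before collecting any data.
kit_outline = {
    "strategic_topics": [
        "Learner self-assessment vs. behaviour in a gamified environment",
        "Sources of red herrings in learning data",
        "Dataset to support development of dashboard design heuristics/rubrics",
    ],
    "internal_external_forces": [
        "Functional limits of analytics tool",
        "Data modeling expertise",
        "Learner expectations, needs",
        "Sample size sufficiency for certain statistical analyses",
        "Difficulty validating hypotheses (limits to models)",
    ],
    "key_players": [
        "Participants (profiles, needs)",
        "Team members - personal objectives, availability",
        "Stakeholder-specific reports and definitions of success",
    ],
}

# Walking the outline makes a handy pre-collection checklist.
for category, topics in kit_outline.items():
    print(category.replace("_", " ").title())
    for topic in topics:
        print(f"  - {topic}")
```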

Starting with ‘Why’

Leading up to the Designing for Data MOOC, Sean Putman and I have been looking at some of the foundational issues...

 

Torture the data, and it will confess to anything.

~ Ronald Coase, Nobel Laureate in Economics

The continuum:

Data > Information > Knowledge > Wisdom

is a common feature in blog posts about learning and instructional design.

A steady progression from one to the next is great in theory, but what comes out in practice can fall short of the theoretical ideal.

It’s key, in the process of designing for data, to keep in mind what we want to measure and why, and which metrics can validly capture the critical elements.  Unfortunately there is no neat, tidy meter-stick – no ‘base unit of learning’.  As Reuben Tozman put it: “What does a 75% on a math test actually mean?”  (Clearly, it can mean a lot of things – but the common unit of measurement, percent correct, is not enough to narrow it down.)
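To make Tozman’s point concrete, here’s a small, purely hypothetical sketch: two students earn the same 75%, but the item-level patterns tell very different stories (the items, tags, and answers are all invented for illustration):

```python
# Two hypothetical students, same eight-item test: 1 = correct, 0 = wrong.
# Each item is tagged by what it actually exercises.
items = ["recall", "recall", "recall", "concept", "concept",
         "recall", "concept", "recall"]
student_a = [1, 1, 1, 0, 1, 1, 0, 1]   # misses only conceptual items
student_b = [1, 0, 1, 1, 1, 0, 1, 1]   # misses only recall items

def percent_correct(responses):
    return 100 * sum(responses) / len(responses)

def misses_by_tag(responses):
    misses = {}
    for tag, answer in zip(items, responses):
        if answer == 0:
            misses[tag] = misses.get(tag, 0) + 1
    return misses

for name, responses in [("A", student_a), ("B", student_b)]:
    print(f"Student {name}: {percent_correct(responses):.0f}% correct, "
          f"missed: {misses_by_tag(responses)}")
# Both print 75% -- but one is shaky on concepts, the other on recall.
```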

So before data collection begins it’s critical to ask some hard questions.   Otherwise data becomes an invitation to game the system, or a wasted opportunity.  If I don’t know what data I really need – it becomes very easy to simply use the data that’s available.

The key is to start by asking not “what” but “why;  to think like analysts, not like accountants; and to do that before collecting data.  Starting with the foundational reasons behind the learning endeavor to be studied will drive the analytic approaches to use and those, in turn, will inform the selection of data I need.

It’s easy to lie with statistics. It’s hard to tell the truth without statistics.

~ Andrejs Dunkels

Data (and statistics) are awesome for getting an overview of a population, but have to be handled very differently  when we’re looking at an individual.   Using Body Mass Index as a health predictor works well as a general heuristic, but it can break pretty dramatically when applied indiscriminately to individuals (think body-builders).   The context directly impacts which data is useful, and how that data should be interpreted.
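To make that break-down concrete, here’s a quick sketch; the person is hypothetical, and the cutoffs are the standard WHO adult BMI categories:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def who_category(b: float) -> str:
    # Standard WHO adult cutoffs.
    if b < 18.5:
        return "underweight"
    if b < 25:
        return "normal"
    if b < 30:
        return "overweight"
    return "obese"

# A hypothetical body-builder: heavy, but the mass is muscle, not fat.
builder = bmi(100, 1.80)          # ~30.9
print(who_category(builder))      # "obese" -- the heuristic misfires
```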

Even good data, well applied, is not an end unto itself. Information is not action. In the study “Numbers Are Not Enough. Why e-Learning Analytics Failed to Inform an Institutional Strategic Plan”, a university examined LMS usage in relation to student learning across the institution.  The results pointed to a strong correlation between LMS use and student achievement, as well as identifying growth opportunities and needs.  But this data, on its own, did not have a strong impact on the strategic planning processes at the university level, due to inherent cultural and structural issues.

The conclusions of the paper acknowledge the power of the analytics to encourage change over time, but also recognize the need for compelling delivery of results to stakeholders.  Since data and analytics are often used to drive change or improvement, contextual awareness ties not only into designing to collect data, but also into designing to deliver that data in effective ways.
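For a sense of what “a strong correlation between LMS use and student achievement” means mechanically, here’s a toy Pearson-correlation calculation; the numbers are invented for illustration, not drawn from the study:

```python
from math import sqrt

# Invented per-student numbers: weekly LMS sessions vs. final grade (%).
lms_sessions = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
final_grades = [50, 55, 60, 65, 68, 72, 75, 80, 85, 88]

def pearson_r(xs, ys):
    """Pearson's r: covariance divided by the product of std deviations."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sy = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sx * sy)

print(f"r = {pearson_r(lms_sessions, final_grades):.2f}")  # close to 1 here
```

The study’s real lesson sits outside the arithmetic: even a strong r, on its own, didn’t move institutional strategy – delivery and culture mattered as much.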

If you do not know how to ask the right question, you discover nothing.

~ W. Edwards Deming

Even in the best of situations, collecting data on learning cannot guarantee I’ll gain optimal insight into how well a course is working, whether business objectives are being met, or whether individuals are truly attaining mastery.  But designing strategically prior to data collection will improve my odds significantly.

Rock, Paper… Systems

Hopefully not “global thermonuclear war”. Although that did end up being a major learning experience.

If you’ve ever learned to play chess or taught someone to play it, you know how different the game looks to a novice than it does to an expert.  There are a lot of reasons for that, and all the studies about how novices see individual pieces whereas experts see patterns, etc. are pretty familiar.

But think back to learning the game.  If you were lucky, the person who was teaching you had you play some practice games with only a partial set of pieces; maybe just playing out some endgame situations.  This meant that you had a lot fewer details to hold in your mind; the system was simpler, the number of actions limited.

This, to at least a small degree, leveled the playing field.  By having a more constrained system, the differences caused by the human element of the game had a lesser impact on how the game played out.  It also gave you, the novice, a chance to start creating some of the connections and heuristics needed to play the game well.

Systems – disambiguation

Systems as opposed to ‘systematic’. I’m using the term systems to mean something akin to its usage in ‘ecosystem’: all the intersecting realities, forces, influences, constraints, opportunities, and limitations that define a game or learning situation.

The human side is the wild card in how games or learning play out.  If I know how much human (agential) influence I want on the  process and outcome, I can respond accordingly by designing more loosely or tightly constrained systems.

Of course, another response is to augment a loosely constrained system with the human element in the form of support and feedback.  With our novice chess player, this might mean playing the full board, but with an experienced player to help them think through the game.

There’s a lot of learning value to  having systems not be so tightly constrained that they have a chokehold on creativity and innovation, but there’s also value to not having the system be too loose.  Like a novice facing off against an expert in chess, too many choices can actually end up reducing the number of meaningful choices.

Balancing Act – motivation in MOOCs

Last summer I had the chance to be part of the first xAPI Design cohort that was run by the folks at ADL.  It looked to be a good way for people interested in xAPI to start playing with it in a supported environment – and a good way for ADL to get a feel for how the spec might be used out in the wild.  And it was; there were some really impressive projects and results.

A Crash Course in xAPI

In a nutshell, about 50 people signed up for a nine-week project sprint.  They were assigned to teams (based on overlapping interest areas), came up with a project idea, and then worked for nine weeks to build that idea into something usable.  Most of us were working off-the-clock, on teams with people we’d never met; our main common ground being that our interest in xAPI greatly outweighed our knowledge and experience.

Each week the designers, developers, and project managers, joined by ADL staff, would have a virtual meeting, gradually working and adapting our projects into something useful within a fairly tight time frame.  There were weekly check-ins with all the cohort teams, which proved to be a fabulous experience – there’s a kind of camaraderie, collaboration, and generosity that comes to the fore in projects rooted in time-crunched creativity.

But it turned out that what the teams accomplished over those nine weeks wasn’t the only interesting thing to come out of the experience.

A MOOC by Any Other Name

I’d been doing some thinking about MOOCs in the weeks prior to the cohort.  And at some point along the way, it became clear that the Design Project was effectively a MOOC (or at least a mini-MOOC).  More specifically it was a MOOC that worked, one that delivered on the promise of the concept.  So while part of my brain was engaged on my team’s project, another part was looking at what it was that made this (undeclared) MOOC have completion rates that were at least 5 to 7 times higher than what is typical.

Granted, the folks who signed up for the cohort were highly motivated – a chance to play with and understand xAPI was the common ground.  But then again, I’ve signed up for MOOCs on topics I’m motivated to learn about, and I have an excellent track record of non-completion. So I started thinking about why people stayed the course on this one.

As it turns out, the project had several aspects which, in combination, worked heavily in its favor.

Constrained Chaos is a Powerful Thing

First off it was probably the most ill-defined project I’ve ever been involved in.  And that was a key to its success:  ill-defined problems are a compelling motivator, and a great tool to encourage deep learning and exploration.  It’s hard to resist exploring, and along the way new mental models are created, new heuristics, new connections – it’s good, hard fun.

But an ill-defined problem on its own can wind up being a turnoff – wheels spin, the energy dissipates… unless you balance the lack of structure in the system with a healthy dose of human support. Even better: human support coupled with ongoing deadlines.  Knowing there are teammates depending on you to get something done before your Tuesday meeting keeps the momentum driving forward, even in the face of obstacles.  And sometimes those obstacles were removed, or at least lowered, through interaction with the ADL coaches and other teams.

Did I get a lot out of the Design Project?  Absolutely – it exceeded expectations.

But it may be that the main thing I got was a reminder about the power of well balanced design.  Freedom and constraints, autonomy and support – get those right and the payoff in motivation and results is probably something worth sending an xAPI statement about.

Stepping Back – xAPI, Learning Assessment and Pointillism

[Image: Circus, Georges Seurat]

I’m getting to play around with xAPI for a few weeks, and I’m learning a lot.  There’s the obvious learning that comes from picking up something new and working with it in an evolving community of practice.   But it’s the conversations, even more than the practice, that are nudging me to look at some aspects of the intersection of technology and learning that are outside of my usual focus areas.

I tend to spend a lot of time thinking about how technology can be leveraged to support  learning as well as looking critically at situational limitations to its effectiveness.  But these days I’m spending a lot more time looking at another facet of technology and learning – the broader implications of being able to track learner activities.

About half the time I’m really excited about the potential – the possibility of more nuanced, meaningful assessment.  When I think about areas of complex learning in organizations, like emerging business processes, I can see tremendous value in having a full view of the range of practices and activities which support the difficult task of change.   Equally, in education, there can be real benefit in tracking the correlation between activities and mastery – to see what helps each student really grasp the topic, not just what gives them high marks on a test.

But then the other half of the time, I’m not so sanguine.   Standardized testing in schools may have started with the laudable intention of providing equal educational opportunity for public school students across the full socio-economic spectrum, but in practice it evolved into something much less laudable. In the same way, it seems plausible that access to the swathe of data xAPI is able to provide is no guarantee of better assessment or better learning opportunities.

That’s probably the part of xAPI that interests me the most – designing use cases that are more than just a more sophisticated way to track people’s clicks, play big brother, or allow me to game the system to rack up the most ‘success’ points.

Looking at that aspect makes me think of watching people encounter pointillism for the first time.   Walking into an art gallery, one of two things happens.  A person walks around the corner, finds themselves face to face with a canvas full of dots, and thinks, “What’s so great about this?”  Or the person sees the painting from across the gallery, and only upon approaching closely do they see the tiny points that make up the whole picture.

Right now I’m thinking that when designing for tools like xAPI, it’s going to be key to have the whole picture in mind before planning out the “dots”; and then to keep stepping back to make sure the dots are actually creating the desired picture.
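Concretely, each “dot” in xAPI is a statement: an actor, a verb, and an object, with optional result and context. Here’s a minimal sketch of a single dot; the learner, the activity IDs, and the score are placeholders, though the verb ID is one of the published ADL verbs:

```python
import json

# One "dot": a single xAPI statement in its basic actor/verb/object shape.
statement = {
    "actor": {
        "name": "Ada Learner",                        # placeholder learner
        "mbox": "mailto:ada@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/courses/data-design/module-3",
        "definition": {"name": {"en-US": "Module 3: Defining Metrics"}},
    },
    "result": {"completion": True, "score": {"scaled": 0.75}},
}

print(json.dumps(statement, indent=2))
# The design question isn't the JSON -- it's which statements (dots),
# taken together, actually paint the picture you set out to see.
```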

Recall, Re-Creation, and Remixing

This is one of those posts where, as I write it, half of my brain is quite convinced that I’m really onto something, and the other half is equally sure that I’m just rehashing something that everybody else has already worked out.

“What’s the difference between Recall and Re-creation?” – an interesting question that came up last year in a conversation about learning. I’ve played with it, in the back of my mind, ever since.

In one sense it’s a simple distinction.  Back in high school, you might have mechanically solved a system of multi-variable linear equations simply by recognizing the form and recalling the rubrics – that’s “Recall”.  But given an unfamiliar style of word problem, or having to derive a proof, things would have shifted to the arena of Re-Creation.  In over-simplified terms, Recall is about knowledge, and Re-Creation relies on both knowledge and understanding.
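That “mechanical” solving really is just a remembered recipe. As a tiny illustration of Recall-as-algorithm, here’s one such recipe (Cramer’s rule for a 2×2 system) applied blindly; the system itself is just an example:

```python
# Recall in action: apply the remembered recipe (Cramer's rule)
# to a system you recognize on sight:
#   2x + 1y = 5
#   1x - 1y = 1
a, b, c = 2, 1, 5    # first equation:  a*x + b*y = c
d, e, f = 1, -1, 1   # second equation: d*x + e*y = f

det = a * e - b * d             # determinant of the coefficient matrix
x = (c * e - b * f) / det
y = (a * f - c * d) / det
print(x, y)                     # 2.0 1.0

# Re-Creation would be knowing *why* this works -- being able to
# re-derive the rule from elimination when memory fails.
```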

It seems, though, that Recall and Re-Creation are actually part of a continuum.  If you look at how people learn, it seems to go something like this:

[Diagram: the Recall → Re-Creation continuum]
or possibly it is more realistically like this:
[Diagram: the continuum with Remixing and Creation as parallel paths]

where Remixing and Creation are parallel but functionally similar.


Notes

As I was writing this, it hit me that the first stages of the diagrams seem almost akin to the Montessori three-stage lesson (“This is a triangle.”  “Can you find the triangle?”  “What is this shape?” – and by extension, the Re-Creation aspect might be approaching “What makes something a triangle?”).  The similarity actually makes reasonable sense: Montessori is about building knowledge and understanding – giving the precursor mental tools needed to allow meaningful exploration and creation.

When I look at the continuum as a whole, it makes me wonder if I’m just unconsciously revising Bloom’s Taxonomy.  I rather hope not; I have always had a bit of an uneasy relationship with Bloom’s.*   It seems to be easily subject to forced or clunky applications; the Creation bit always seemed to end up being a diorama or a “medieval newspaper” project on some history topic.  I never learned much history from those – my brain was too full of executing the logistical side to worry much about history.

* Coming from the science, and then information science, world, Bloom’s is something I met (formally) much later in life and immediately recognized as the source of many a painful and pointless classroom experience.  Now don’t get me wrong, there are marvelous teachers who integrate Bloom’s seamlessly (they are probably also adept re-mixers), but it is easy to use it to drive unconnected activities, piled one atop the other like a stair-step of cake layers, or so many boxes to tick.

back to the regularly scheduled post, already in progress…

 

From Algorithm to Understanding

Obviously we don’t actually learn things in single incremental steps – there’s a lot of overlap (bleed-through) and cycling back and forth – but on a fundamental level this seems to be a useful mental model (for me, at least).  Recognition and Recall seem to be pretty well established concepts (e.g. Oppenheimer on fluency and priming); where it got interesting to me was when that long-ago conversation looked at Recall vs. Re-Creation.

The difference might have a familiar example in math.  I might know how to solve (algorithmically) a given class of equations, and I might even know pretty well, when presented with a real-world problem, which class of equations applies to it.  That’s all in the realm of Recall.

Where Re-Creation comes in might be in something like deriving the equations from fundamentals.  (I had a friend, an astrophysicist, whose adviser, when approached with any problem, would invariably start out with “Well, let’s see, V=iR…” and then derive from there.  He was modeling Re-Creation – helping the students build a habit of approaching questions from a point of deeper understanding – even if it was probably a bit annoying for the student at that moment.)

[Image: a successful removal of all the “green globs”. A further description is available at http://youtu.be/rdDwoUk4ojY?t=42m2s]

Another fantastic example of Re-Creation is the old math game “Green Globs”, where students were presented with a set of points on a graph and tried to create equations to hit as many “globs” as possible with as few equations as possible.

To be successful you had to understand equations and their behaviour well enough to develop what you needed.  Re-Creation, then, seems to be the point where one has sufficient knowledge, experience, and skill to begin developing some sophisticated heuristics about the topic at hand.
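The game’s core mechanic is easy to sketch, and it makes the Re-Creation point concrete. A toy version in Python (the glob coordinates and tolerance are invented, not taken from the original game):

```python
# Toy "Green Globs": which globs does a candidate equation hit?
globs = [(-3, 9), (-1, 1), (0, 0), (2, 4), (3, 10)]  # invented points

def hits(f, points, tol=0.25):
    """Points whose y lies within tol of f(x) count as 'splatted'."""
    return [(x, y) for x, y in points if abs(f(x) - y) <= tol]

def candidate(x):
    return x ** 2          # the player's guess: y = x^2

print(hits(candidate, globs))   # hits 4 of 5; (3, 10) survives

# The skill being exercised is Re-Creation: you have to understand how
# equations behave to build one that sweeps up the most globs.
```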

In learning, just as in games, as we develop more sophisticated heuristics we can start seeing which avenues will likely be fruitful and which ones should be avoided.  Reaching the point of being able to not just recall, but re-create, is a turning point where the real rewards in learning – that sense of potential and creativity – start to take off.

 

Remixing and beyond

Remixing occupies a really interesting place in the learning continuum.  At its best, it seems to occur at an intersection of knowledge, expertise, and heuristics – all scaffolded by the work(s) being remixed. It’s not merely a space for non-experts, but it is definitely a great approach for continuing to refine knowledge and heuristics, as it allows an individual to focus on a limited array of aspects of creation.

This could be anything from writing fan-fiction (where a writer uses existing worlds and characters and can therefore focus attention on building skills in plotting, dialogue, and description), to musicians taking a familiar song and extracting some new subtle layers from it, or even developers who fork code (having the foundational work in place allows clearer focus on very specific goals).    Being able to focus on a limited number of details is very helpful while a person is still building schema and heuristics; it allows attention and effort to be spent efficiently where there are skills and knowledge to be developed.

As expertise grows, there is less need for the scaffolding aspect of Remixing, but it is still a useful tool.  At a fundamental level, most research papers are an exercise in remixing – the author is distilling the essence of prior information, trying to build new connections, but not building the world from scratch.  The level and quality of distillation will vary greatly with the expertise of users – in research, a new student might make simple connections, while a seasoned researcher might create a detailed and complex synthesis of information – but both would advance in their understanding from the exercise.*

Obviously remixing isn’t new, but the tools we have now make it much easier – allowing more time and attention for the content rather than the mechanics.**  Regardless of the subject area, remixes can be a place to develop choice bits from the foundational work, or to find surprising new connections.  They are a scaffolded starting point for either creation or refinement – both of which are potentially fruitful spaces for learning.

To be useful, since remixing is about building connections, heuristics, schema… it has to be something organic.  That doesn’t mean unstructured, but it does mean that if you weigh it down with too many rubrics the value will get lost in the box checking.  The real beauty of remixing is that it both develops and demonstrates expertise and rigour.  A good remix, like creation, cannot be fudged from the sample problems or highlighted terms in a textbook.

*thanks to Chris Atherton (@finiteattention) for suggesting this concept.

** I remember my brother and a friend doing remix “documentaries” for Literature class assignments – a tedious task involving a cassette recorder, handwritten scripts, and albums on a turntable for sound clips.  The scripts were somewhat Monty Python-esque, and the execution was incredibly laborious in those days.  But I still recall more literary details from those than from any papers I read on the topics.

Only You Can Prevent Standardized Testing

[Image: Smokey the Bear – “Only You”]

I was sweating my way through some tough writing when my phone rang this morning. I almost didn’t bother to answer it, but I’m glad I did.

The call took me away from all sorts of idealistic, theoretical discussions of education and right into the reality of it.

The person on the other end of the line was the parent of a middle school student, and what she had to say was not as surprising as I wish it were. Her child has a class that is (nominally) something along the lines of “Global Studies”, but what she had noticed was that more than 50% of the assignments from the class were worksheets to help prepare for the Math and English sections of the statewide standardized testing. (She actually started to track the assignments on a spreadsheet to verify that this was the case.)

The teacher’s response was that the course had been cut from a full year class to a single semester, and the only way to get it reinstated as a year-long class was to prove that students who took the class had improved standardized test scores.

Obviously there are so many things wrong with the whole situation that it’s difficult to know where to start. I’m working hard to restrain my natural urge to rant about all the reasons why standardized testing is idiotic (you can start from how the standards are established and assessed, and go on from there… it’s a long list).

I don’t necessarily blame the teacher – he/she seems to have been put in an impossible situation of “you have to teach to the test to be allowed to give the students the chance to learn some things of real interest and value.”

Revisiting Big Data

A while ago I wrote a slightly cranky (and oversimplified) post on Big Data and education. Situations like this make me realize two things:

  1. Used well, Big Data (and Long Data) may actually be a way around pointless testing. It is becoming increasingly possible to measure meaningful information about learning. I’m still not a fan of the concept, but I realize that schools, like any large state-run organization, feel the need for objective measures of performance to assure effective use of taxpayer dollars, and so on.  It is arguable whether it is actually possible, or even desirable, to measure learning in the same way that we measure key economic indicators, but I don’t think that the state and national requirements for “objective measures” (for some sort of accountability) are likely to go away any time soon.
  2. Used badly, big data could make this sort of classroom situation even more common.

In the current assessment system, lacking tools to measure meaningful data to satisfy the accountability requirements, we’ve set up an unacceptable situation. We have young people who could be learning amazing things, who could be creating and innovating, or at least stumbling across the spark for future inspiration, instead stuck doing mindless worksheets so they can fill in multiple choice tests.

It’s enough to make me hope for rapid and effective development of Big Data tools for education (TinCan in the classroom would certainly trump multiple choice tests).

But What Can We Do?

It is outrageous to think of how many students may be being cheated out of real learning. And the fact is, while it may not be our fault, it is certainly in our control.

I was on a Board of Education for a number of years. I learned a lot about the layers of state and federal administration that hamstring talented teachers and innovative school administrators. I saw the political nonsense that can end up driving standards and establishing shoddy (and often academically unsound) assessment criteria and methods. And all of that can be pretty demoralizing.

But I also was reminded that, in the end, the power lies in the hands of the consumer. A teacher who I deeply respected, and truly enjoyed working with, also had his qualms about standardized testing (he saw how it ultimately was harmful to students).  He once said to me:

“Parents have the ability to end standardized testing right now. If every parent refused to send their children to school on testing days, the system would collapse.”

Now of course that’s silly; we all know that no one wants to compromise their kid’s future scholarship to (pick one: Harvard, the State University, Community College) by skipping those tests.

But what if one day we all decided that real learning was more important for our children’s futures than test rankings? What if next year, when testing rolled around, no one showed up?

Meaningful Data

(Part of a series of posts from SXSWedu 2013)

All in All…

Sometimes at this sort of event you get an unintentionally defining moment.   There were some chuckles around the room where a largish crowd was waiting for a panel discussion on Big Data and Emerging Tech; as the panelists were settling in, it was hard to ignore the background music.  There’s nothing like a little Pink Floyd to get the conversation started on the right note.

What followed was actually a very useful session.   In general there was a reasonably balanced view of data and analytics on a conceptual level – seeing them as part of the picture but not necessarily an end unto themselves.  Interestingly, a lot of what was being proposed reminded me of discussions around the use of ERPs in the business world; and just as is often the case there, the conversation was solid on the conceptual level but seemed to fall short on the execution side of things.

Good tools and lots of data only take you so far; plenty of organizations fail to make effective use of the information at hand.  Even with excellent, user-friendly technology, it ultimately comes down to the thought process that underpins the tools.

You keep using that word. I do not think it means what you think it means…

When you start talking data, you will pretty rapidly touch on problem words.  Words like: assessments, systems, processes.  At some point you’re drowning in a sea of terms which get tossed around, largely undefined.   I would have found the session more enlightening if there had been a better definition of the specific usage of Big Data in education.   One thing that was lacking was a detailed discussion of what might be measured, why it would be measured, in what context, and how those measurements would produce meaningful information.  It would also have been interesting to have had more discussion of what sorts of data would not be relevant.  The technology available is undeniably powerful, but tools and data are not the same thing as solutions (or even understanding).

There was also an interesting disconnect with respect to definitions when conversations crossed professional spheres. This was especially clear in the Q&A session, when the response to an audience question sometimes clearly did not match the intent of the questioner within their own professional context – an unfortunate missed opportunity.  Fortunately, the back-channel and the informal after-session conversations in the halls provided individual opportunities to bridge that disconnect, and they proved both enlightening and useful.

None of which is to say this was not a good session – it was.  It just left me wishing there was more time to dig deeper.

A Place for Everything…

The educational ecosystem is incredibly complex.   That can make it hard to keep straight the “places for everything” – big data, small data, metrics, local control, national governance, empirical, stochastic, and evidence-based research.  They all belong, but they don’t all necessarily belong everywhere – each one can be used poorly and each one can be used effectively.  I would have really loved to hear a lot more discussion about the intelligent selection of what should be measured, and how, and why – and then how it was proposed that that data would be used in meaningful and useful ways.*   This is a complex set of problems, and businesses struggle with it every bit as much as the education community; basic questions such as “What should we be measuring, and how?” and “What should we do with the data?” are non-trivial.**

but Everything looks like a nail…

As the old saying goes… if you have a hammer, everything looks like a nail…

Having access to large amounts of data doesn’t mean it is the right data to use, or that you are using it correctly.

And just because technology allows us to gather educational data (and provide access) on a massive, national scale doesn’t necessarily mean that we should.  It’s good to take a step back and decide when goals are served through scale, and when they are better served on a more local level; it’s perhaps analogous to the difference between a fast food strawberry milkshake and locally grown strawberries and cream.  They have the same base ingredients, but what a difference in the end product.  And there is a place for each one; it’s a matter of context and goals.***

Just as businesses find that for some functions it makes sense to have global processes, while for others it is better to handle things on a local level, so it is for education.  And it takes some time, hard work, and open-minded analysis to figure out which is which.  The starting point for me is always to step beyond the range of my own narrow professional lenses and get down to the fundamentals: what we want to know, and why, and whether (and how) we can effectively measure it.**   And finally, making sure that you do something useful with the information generated.

Data (and Education) in Context

The session on Big Data was one that kept people talking long after it wrapped up.  I was fortunate to have some lunchtime conversation with a fantastic group, including Aaron Silvers and Megan Bowe.  As a group, they had some of the best grasp of context of anyone I spoke with while in Austin – balancing technology, government/administrative requirements, politics, and how real learning happens.  There are people out there doing cool stuff with analytics and data that has real benefits for the education community – you just may have to sift through a bit of chaff to find them.****

Ultimately, data that is collected and applied in well-considered, meaningful ways can generate some valuable, usable insights.  If it’s not done well, though, we might as well all find our old Pink Floyd t-shirts and join the chorus: “…another brick in the wall.”

* One case where data is used in a meaningful, considered way might be Mozilla’s Open Badges (see this quick write up by Doug Belshaw)

** A fantastic example of having lots of data, but not necessarily the relevant data and/or a sufficiently contextual evaluation of it: Yahoo’s new CEO’s decision to eliminate telecommuting was based, at least in part, on VPN usage data.  But that presupposes a direct correlation between VPN use and productivity.  The decision process seemed to ignore larger contextual issues:  1) for any tech-savvy workforce, if there’s something better for productivity than the VPN (which there is), they will use it; 2) there is an underlying assumption that clocking in at the office naturally equates with productivity; and 3) it is also apparently assumed that all types of work have their highest productivity in the cubicle world.

*** I definitely prefer the strawberries and cream when possible.  But in the realm of technology, I’m just as happy to have a mass-produced smartphone as to build my own with local materials.  Context matters – and context has many, many layers.

**** This holds true in a lot of other areas of the education world.  Gamification and Cognitive Science are prime examples.

 

Coding and Makers

(Part of a series of posts from SXSWedu 2013)

There seemed to be a bit of a nexus around Coding, Web Literacies, and Makers at SXSWedu.  This made me very happy for a number of reasons.

Asking the Right Questions – Creating Opportunity

I started off the week catching Doug Belshaw and Kathleen Stokes’ panel, “Supporting a Generation of Digital Makers”, where they shared some of the work going on at Mozilla as well as more general resources related to Digital Makers.  It was a lovely session – informative, yet informal, interactive, and conversational.  It surfaced some pretty key questions that probably need to be asked in any learning situation: “Why do we need to teach ___?” and/or “Why do we need to know __?”

And right there is where the sweet spot of coding (or other maker-type activities) lies: the value is in the opportunities.   For all the talk of 21st-century learners that’s been chucked around for the last decade or so, this is actually getting into the realm of both “useful” and “actionable”, because coding seems to be a space where you get more than just a “skill”.  And that’s not dismissing the “skill” aspect – it’s a skill with a lot of merit on its own; it just gets more exciting when you view it in a broader context than just IT.

Coding as a Literacy, a Skill and a Gateway

In several sessions at the conference, there was some good discussion (reasonably supported by evidence) of the value of viewing coding as a fundamental literacy – or even as a language (along with world languages and ASL).  This begins to touch on one of the aspects of coding and digital literacy that make it valuable beyond the obvious technical functions.  It is a specific means of communication, and ultimately learning coding also supports the development of thinking and reasoning skills as the student builds the ability to manage complex ideas and projects – and that’s when things get really interesting.

In a panel discussion on Wednesday, “Preparing High Tech Innovators through Game Design”, there was some nice cross-topic discussion, looking at employment data, employer-required skills, and the reality that even though students are device-competent, that does not mean they are tech-literate.  There was some refreshingly multifaceted discussion of the needs and issues involved in supporting the development of true IT/developer skills in the real world of a high school. Interestingly, in the case of one program discussed (developed by WorldWideWorkshop), not only were student achievements impressive, there was also a dramatic drop in disciplinary referrals – the students were doing something real and meaningful and so were invested in their work (engaged, as some would say).

The value of meaningful work to motivate should not be underestimated; there have been some interesting studies on this, including this one, which documents the importance of “meaning” relative to performance in the workplace.

[Image: architectural detail, Austin]

As I walked in downtown Austin, I noticed that every post on an older building had some lovely detailing – those who made the building, more than a century ago, cared about what they were creating; they made it to last and made it beautiful.  It was not something to throw away, and that showed in their care and attention, even in a small touch that was never going to reach above the knees of pedestrians.  Knowing what they made had value was demonstrated in the time and attention they gave every aspect.  I’d hire that builder if they were still around.

The use of technology (or any other tool in learning) to support the development of something real, useful, and meaningful has payoffs that go deeper than discrete job skills, or even student interest – valuable as those are.  So many of the soft skills that we want learning to support (pride of work, self-confidence, innovation, collaboration…) are not going to be fostered by a problem set that will be graded and then pitched into the recycling bin.

A Means to an Even Greater End

When you look at the skills needed in an information economy, the ability to instinctively apply logic and reason, to structure ideas, to ask hard questions, and to manage large amounts of information and complex projects are key.   And coding can be viewed to some degree as a form of embodied logic, reasoning, and possibly even a bit of philosophy.  It is also an arena where the complex is synthesized and structured.  It is a workout for the mind, with value beyond simply being an IT exercise, or even an act of creation.

And when you couple that with the ability to actually make some cool stuff, that’s a very powerful thing, indeed.

Another side of coding that caught my attention ties into a concept I’ve seen tossed around in the organizational learning arena: “Visible Work”.   I couldn’t help but contemplate that there’s a lot to be said for “Visible Learning” – we saw it in live action as we all joined in on the etherpad at Kate & Doug’s session.  It was evident again periodically in various sessions I attended, and was certainly well modeled and illustrated in the NoTosh talk on Thursday.

Like most topics related to learning, the really important thing is to look at the principles – the whys and wherefores that make it meaningful and effective.  Ultimately it’s the fully transferable analytical and creative skills fostered as well as the benefits that can be created in open, visible work that had me truly excited.  I’m pondering the broader implications and applications of the ideas presented.

Building Attention

 

Spend enough time with learning professionals of any sort and you start hearing the same words over and over again.  For decades.

Words like: “engagement”, “attention”… and their counterpoints, “distraction” and “apathy”.

I’ll admit that there are cases where the individual capacity for attention seems nearly infinite (LoTR movie marathons and endless games of Halo 4 are examples that come immediately to mind).  But more often it seems that attention is in limited supply, and of short duration.  And that has to do with a couple of things, the nature of attention and the need to build endurance in attention.

Different Ways to Pay Attention?

A study recently looked at the brain activity of people reading literature (Jane Austen) while undergoing an fMRI.*  Subjects were asked to read casually and then closely.  In the instance of close reading (when, in effect, subjects were instructed to apply attention to the task), there was increased blood flow “to regions of the brain beyond those responsible for ‘executive function,’ areas which would normally be associated with paying close attention to a task, such as reading”.  Casual reading also led to increased blood flow, but in different regions of the brain than those for close reading.

It seems there are different ways of paying attention: the kind that’s, for lack of a better word, relaxed attention (reading for pleasure, gaming, watching movies), and a more intentional attention.   It’s the second kind that comes into play in most learning conversations.

Which Habit am I Building?

Some of the most interesting things I’ve read on attention are from before the days of MRIs and cognitive science; but they are based on observation, and experience, and most importantly focus on “what works”.

One of the keys is getting past the notion that attention is “gained” by an instructor, or speaker, or course.  Attention is “given” by the student or learner or audience.  That’s not to excuse deathly dull instruction, but it does put the responsibility for attention on the only person who actually has any power over it.   Because ultimately attention is a personal effort – an act of will, a habit to be cultivated.

If I frame it in these terms, then as an instructor or instructional designer, my role shifts from “I need to get their attention” to “I need to help people understand how to harness their attention”.  And this is where it gets really interesting, because often how we were taught to do things works against building good skills of attention.**

Think about when you were 8 or 9 years old and you had an (endless!) page of division problems to do, or you had to use each of your spelling words in a sentence, or some other really tedious task.  Probably, about 5-10 problems (or words) into the task you started fidgeting, wanting to go ride your bike or play Lego or do pretty much anything but your work…  and any responsible adult in charge would say “you just stay there and focus on your work until it’s done”.

But at 8 years old, you’d used up your focus, so you’d doodle or stare out the window, or make up an adventure in your mind…   Because you were trying to stay on task beyond the limits of your attention, you wound up creating the opposite habit – the habit of “inattention”.  In reality, the best thing possible would have been to stop the task as soon as your attention flagged. This seemed counterintuitive to me for a long time, but really, if you keep at something when you are no longer able to focus, what you are practicing at that point is exactly the opposite of attention – so that’s the habit you’re building.

Like any other skill, attention is developed a little at a time.  When I was 8 years old, 10 or 15 minutes might have been my limit.  Now, depending on the task, I might be able to pay attention for as long as my day allows me (until the phone rings, or the next meeting comes up).  But that’s on a good day.  Usually the key is to work until I start to lose focus, and not one second longer.  Then catch my breath, stretch my legs, maybe do something different, and come back to the original task later so I’m building the right habit.

Most of the time, I have to work at consciously attending to what I’m doing, and lately I’ve taken to using a timer to rebuild the discipline of focused blocks of attention.  Like a runner getting into condition, I can build endurance over time.
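The timer practice itself is almost trivially simple. A minimal sketch of a “focused block” in Python (the block length is just an assumption – whatever your current endurance supports – and this is a sketch, not any particular app):

```python
import time

def focus_block(minutes: float) -> None:
    """Run one focused block: work until the bell, then stop -- on purpose."""
    print(f"Focus for {minutes} minutes. Stop when the bell rings,")
    print("even mid-task: ending while still focused builds the habit.")
    time.sleep(minutes * 60)
    print("Ding. Step away, stretch, come back fresh.")

focus_block(15)  # start small; lengthen the block as endurance grows
```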

* see also this article
** This is also where game designers probably have things right with the whole concept of Atoms.