Showing posts with label technology.

Friday, February 21, 2014

Why are the computer programmers making the policy decisions?

Like government, higher education is administered (or mis-administered, depending on who you talk to) by teams of bureaucrats.  This post isn't really about criticizing this aspect of the higher education landscape in general, though a decent argument could be made that the growth of our administrative ranks hasn't really led to an overall improvement in learning.  Rather, I want to comment on a particular segment of that bureaucracy that has become increasingly powerful over the last two decades.  No, I'm not talking about the President, her cabinet, the chancellor's office, or any of the other groups we typically see as "high-powered."  The people with the real power are those slightly socially inept, dress-code-flouting teams tucked away in some basement of your campus--the computer programmers.

At BYU, we are fortunate not to face many of the enrollment challenges that confront other institutions.  Because of our unique mission, and the fact that we have a very well-defined target population, we typically don't have any problem hitting enrollment targets for our two major semesters (Sept. - Dec. & Jan. - April).  And, because of the generous tuition subsidies provided by our sponsor, tuition has not risen nearly as fast or as much as it has at the state schools in our area.

Nonetheless, BYU's Registrar's Office has been quite aggressive of late in boosting what we refer to as "spring-summer" enrollments (i.e. enrollment in courses during the "off-season" of May - August).  During these warm-weather months, most students take time off to participate in internships, travel and study abroad, or just return home to be with family.  So, from May to August, campus is quieter, slower, and emptier.  While I rather enjoy these months and the extra time and space they give me to read, research, and get to the things I've been putting off the rest of the year, from a resource management perspective they are a drain.  We have empty classrooms, under-utilized academic advisement centers, and a lonesome-looking library.  So, it only makes sense to take a strategic approach to boosting enrollments during these months to offset the overhead operating expenses associated with staying open all year.

Up until very recently, students who wished to enroll during spring-summer were limited to 8-week courses that ran during either spring or summer "term."  While some students and faculty enjoyed this model because a course was over and done with in less time (can you say superficial learning?), it was not a great fit for some students or some courses.  So, the Registrar's Office has wisely expanded the spring-summer catalog to include more traditional 15-week courses that span both terms.  Not only does this new policy provide more options for students wishing to take courses during spring and summer, it should improve learning in those courses that opt for the 15-week model.  But, as with any new policy, there have been some unintended and unanticipated challenges.  And, this is where the programmers come in.

To ensure that students do not overwhelm themselves with an unrealistic academic load, BYU limits students to an enrollment of 18 credit hours during a traditional 15-week semester.  A student who wants to exceed this limit can do so, but only after approval from his or her academic advisor.  In the past, this policy was also applied during 8-week terms, limiting students to 9 credit hours--the 8-week equivalent--for these enrollment periods (because the courses move twice as fast, the credit limit is half as large).

But, this all got pretty messy when the Registrar's Office decided to offer both 15-week courses that span both terms and 8-week courses that are limited to either spring or summer term.  Now that 15-week courses are on the table for the May - August period, the credit limit has been raised to 18 credits to mirror the policy for the rest of the academic year.  But, what the bureaucracy didn't anticipate was that the online registration system (what we call MyMAP) would not be able to distinguish between credit hours belonging to 15-week courses and those belonging to the shorter 8-week courses.  Essentially, this means that a student can now, in theory, register for up to 18 credit hours to be completed in a single 8-week term.  For those of you keeping score at home (and who are familiar with the Carnegie credit hour system), this would mean 36 hours a week of in-seat class time, plus 72 hours of study time outside of class.  I'm a PE major, so proceed with caution when trusting my math, but that's over 15 hours a day, 7 days a week, participating in some kind of academic activity.  Even if we take a more conservative schedule of 12 credit hours in an 8-week term, that's 24 hours of class and 48 hours of study each week (more than 10 hours a day, 7 days a week).
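If you'd like to check my arithmetic, here's a rough back-of-the-envelope sketch.  It assumes, as I did above, the standard Carnegie rule of thumb (per credit, about 1 hour in class and 2 hours of study each week) and treats an 8-week term as simply running at twice the pace of a full semester:

```python
# Back-of-the-envelope workload check, using the approximations described above:
# per credit, ~1 hour of class + ~2 hours of study per week, doubled for an 8-week term.

def weekly_hours(credits, pace=2):
    """Return (class hours, study hours) per week for a given credit load and pace multiplier."""
    class_hours = credits * 1 * pace
    study_hours = credits * 2 * pace
    return class_hours, study_hours

for credits in (18, 12):
    in_class, studying = weekly_hours(credits)
    total = in_class + studying
    print(f"{credits} credits in an 8-week term: {in_class} class + {studying} study "
          f"= {total} hrs/week (~{total / 7:.1f} hrs/day)")

# 18 credits -> 36 + 72 = 108 hrs/week (~15.4 hrs/day)
# 12 credits -> 24 + 48 = 72 hrs/week (~10.3 hrs/day)
```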

An administrative assistant in our office discovered all of this about two weeks ago when she was reviewing first-year students' schedules.  Since then, we've discovered upwards of 100 first-year students (for whom we have primary responsibility in our department) who are registered for 9+ credit hours during our upcoming summer term.  Because first-year students have no real idea of the accelerated pace and increased workload associated with an 8-week course, we were concerned that these students were setting themselves up for failure.  So, somewhat naively, we called the Registrar's Office to inquire about the situation.  It was rather clear that no one had really thought through the implications of the new spring-summer policy or the fact that it would allow situations like this to arise.  After a "we'll get back to you" and about 30 minutes, we got a return call notifying us that the registration system would not allow any further restrictions on credit-hour limits, and that was the end of the conversation.

The reality is that the "system" can be structured to do just about anything the institution would like it to.  But, only if the programmers agree to make the changes.  So, what ends up happening is that computer programmers become one of the strongest voices in the room when it comes to issues of policy (particularly registration and enrollment policies, because technology is so enmeshed in all of those processes).  Forget about what's best for learning, student well-being, or even common sense.  If a change to the registration system is viewed as too much work or not as important as another project, the policy will reflect the wishes of the technologists.  
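To illustrate what I mean by "structured to do just about anything," here is a hypothetical sketch of a registration check that keeps separate credit caps for 8-week and 15-week enrollments.  To be clear, the course fields, cap values, and function names below are my own invention for illustration--this is not MyMAP's actual data model or code:

```python
# Hypothetical illustration only (not MyMAP): enforce separate spring-summer credit caps
# for 8-week term courses and 15-week courses that span both terms.
from dataclasses import dataclass

@dataclass
class Course:
    name: str
    credits: float
    weeks: int  # 8 for a single-term course, 15 for a course spanning both terms

# The old 8-week cap and the new semester-length cap described in this post.
CREDIT_CAPS = {8: 9.0, 15: 18.0}

def can_register(current_schedule, new_course):
    """Allow the add only if each term-length bucket stays within its own credit cap."""
    proposed = current_schedule + [new_course]
    for weeks, cap in CREDIT_CAPS.items():
        bucket_credits = sum(c.credits for c in proposed if c.weeks == weeks)
        if bucket_credits > cap:
            return False
    return True
```

Even a simple rule like this would have flagged the 9+ credit 8-week schedules our office found, which is why "the system won't allow it" rings hollow.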

While policy has to be feasible and responsive to technological capabilities, the policy that I've described in this post represents negligence on the part of our institution, particularly in the case of first-year students who are unfamiliar with the demands of college-level work.  While institutional policy serves a variety of functions from ensuring economic viability, to managing the allocation of resources, to protecting the quality of the work-life of faculty and administrators, ultimately policy should support and enhance learning.  And, this most recent policy decision doesn't.  

I have no problem with the Registrar's Office failing to anticipate that this change in policy would create such a problematic loophole--it's impossible to anticipate everything that comes with these kinds of transitions.  What is frustrating is the way they've responded: no admission that this is a problematic path for us to be heading down, and no urgency about modifying the registration system to close the loophole.  Our repeated unwillingness to admit mistakes and our failure to use policy strategically to support learning are discouraging.  And, half-hearted "recommendations" like the one below (from the Registrar's Office FAQ page) aren't enough to make up for it:

Though it is possible to take more than 9 credits in a Spring or Summer term, it is not recommended since classes taken in a term cover the same amount of content in only about half the time.

Friday, September 14, 2012

Scaleable "Solutions" vs. Local Responses

Yesterday, I participated in a forum with educators, administrators, and business leaders who are all interested in the use of technology in learning settings (Accelerating Innovation:  Personalizing the learning environment--thanks to k12, BYU's Center for Teaching and Learning, and TD Ameritrade Investools for making it free to attend).  It was clear that everyone there was passionate about learning and excited about what learning might look like in the future, particularly in schools.  In that way, I felt like I was with kindred spirits and appreciated having the opportunity to connect with and dialogue around important issues.  However, I always feel a bit like a fraud in these settings because I tend to be skeptical whenever I hear people talking about technology "revolutionizing" or "transforming" learning.  Further, the conversations at gatherings like the one I attended yesterday often focus on finding "innovative solutions" that can be "scaled up" and adopted on a massive scale (this seemed to be the only thing the representatives from the USDOE wanted to talk about yesterday).

Clearly, there are technological advances that have this impact upon learners and the learning process, but I think the list is much shorter than many technologists would believe.  More typical is the new "tool" that is developed in a particular setting, touted as "transformational," and then adopted only in the setting where it was developed and a few others where the challenges are similar.  That isn't a criticism of these "tools" as much as it is a criticism of the rhetoric of many educational technologists which is, in short, "we're going to change the world with this new idea."

At the core of these issues is an interesting tension that I saw playing out in yesterday's forum.  And, the tension is framed by two fundamental perspectives on educational reform.  The first is what I'll call the grand solution paradigm, which is concerned with finding universal solutions to big problems.  Consequently, its adherents focus on identifying problems that manifest themselves in virtually every educational setting and sector and then developing "solutions" that can be "scaled up" and adopted on a widespread basis.

The second perspective operates from a local response paradigm.  Those who align with this approach are very concerned with context in that they approach problems by, first, understanding the complexities and nuances of particular settings (e.g. local cultures, historical influences, individual personalities, and available resources).  Then, they work alongside local stakeholders to respond to the challenges presented by these unique educational landscapes.

There are stark differences between these perspectives.  The first seems to be concerned with "answers" to questions and believes that finding these answers will solve problems for all educators.  Its proponents seem concerned with what has sometimes been termed in research as the "grand narrative" and aim to provide new ideas and tools that everyone can use, in nearly the same way.  These are the folks who are much more likely to see themselves as "revolutionaries" and "reformers."  In contrast, localists aren't likely to make any claims to developing "solutions" or "answers"; rather, they approach educational policy and practice as a dynamic dance wherein teachers and administrators are in a continual state of responding to the challenges and opportunities that present themselves.  Because this is slow and more "tribal" work, they may not see themselves as "revolutionizing" education, although their collective efforts may have that impact over the long term.

Like most complex problems, the challenges we face in education aren't likely to be overcome if they are approached exclusively from one of these perspectives or the other.  Any sustainable changes are likely to come about in response to a coordinated effort that involves both a search for "scaleable" solutions and an openness to local innovation and responsiveness.  But, when I sit back and listen to the dialogue of the "reformers" I hear too much of the former and not enough of the latter.  Rather than spending inordinate amounts of time "innovating" in search of the holy grail of education (yesterday it was Open Educational Resources), we should be spending just as much time helping local practitioners and stakeholders join the grand dialogue, and then consider what "works" in their own place.  Innovation, while focused on outcomes, products, and ideas, should be just as concerned with processes that allow local innovation to thrive.

Friday, July 9, 2010

What schools could learn from cab companies

I spent the first year of my professional career as a teacher.  At the risk of sounding arrogant, I thought I was pretty good--I had good rapport with students, I tried to align my teaching with learning objectives, and I believed that I applied good pedagogical practices in my classroom.  It's probably pretty easy to spot the problem in this assessment--they are all my own personal perceptions of my ability and performance, and very subject to bias and inaccuracy.  This has bothered me lately and I've wondered how good a teacher I really was during that year.  As I look back on those experiences, two really important things seem to have been missing:  feedback and focused effort to learn from mistakes.  

While I tried to regularly evaluate my own teaching, like most of us I probably overestimated my abilities and was likely unaware of many of the mistakes I was making.  On a few occasions (three, that I can remember) I was observed by another teacher and then given some basic feedback at the end of the class session.  It's fair to assume that both my own evaluations and those of others might have led to some slight improvement in my teaching.  But, in retrospect, I don't think I was much better in June when the school year ended than I was on the first day of school in September.  In fact, I probably developed some bad habits, got too comfortable with the role, and stopped doing some of the little things that make a big difference in one's teaching ability.  In short, I may have even been worse.

At the same time I was having these depressing thoughts, I was reading Traffic, by Tom Vanderbilt.  The book uses driving patterns and habits as a context for exploring human behavior.  He tells a fascinating story about how cab companies and limo services have helped improve their drivers' performance using a technology called DriveCam.  DriveCam installs cameras on the rearview mirrors of cars that continuously buffer images (like TiVo) of what is happening both inside and outside the car.  Sensors monitor various measurable forces, and when a "trigger" is detected (sharp turn of the steering wheel, significant decrease/increase in speed, etc.), the camera records ten seconds of footage both before and after the trigger.  This footage is then sent to a database and may be reviewed with the driver in an attempt to correct mistakes and improve safety.  Although it probably ruffles some feathers of drivers who feel like their privacy is being invaded, the bottom line is that transportation companies have a lot to lose when their drivers don't drive well.  What's more, this is great learning.  Drivers can view actual footage of their driving, spot mistakes and fix them, and focus on small elements of the driving performance that seem to make a big difference in achieving good outcomes, namely safety (this isn't unlike the way elite athletes use taped performances to improve).
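For the technically curious, the buffer-and-trigger idea is simple enough to sketch in a few lines of code.  This is only a rough illustration of the concept--the frame rate, threshold, and class names below are my own simplifications, not DriveCam's actual implementation:

```python
# Rough sketch of a pre/post-trigger recorder: keep a rolling buffer of recent frames,
# and when a sensor reading crosses a threshold, save the seconds before and after it.
from collections import deque

FPS = 30                 # assumed camera frame rate
PRE_SECONDS = 10         # footage kept from before the trigger
POST_SECONDS = 10        # footage kept from after the trigger
G_FORCE_THRESHOLD = 0.5  # assumed trigger threshold (hard braking, sharp swerve, etc.)

class TriggeredRecorder:
    def __init__(self):
        # Ring buffer: always holds the most recent PRE_SECONDS worth of frames.
        self.buffer = deque(maxlen=FPS * PRE_SECONDS)
        self.post_frames_remaining = 0
        self.current_clip = None
        self.clips = []  # finished clips, ready to be sent off for coaching review

    def on_frame(self, frame, g_force):
        self.buffer.append(frame)
        if self.current_clip is None and abs(g_force) > G_FORCE_THRESHOLD:
            # Trigger: snapshot the pre-event buffer and start counting post-event frames.
            self.current_clip = list(self.buffer)
            self.post_frames_remaining = FPS * POST_SECONDS
        elif self.current_clip is not None:
            self.current_clip.append(frame)
            self.post_frames_remaining -= 1
            if self.post_frames_remaining == 0:
                self.clips.append(self.current_clip)
                self.current_clip = None
```

The same pattern would translate directly to a classroom camera; the real design question is what counts as a "trigger" worth reviewing.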

This left me wondering why something similar couldn't happen in classrooms, particularly those classrooms led by novice teachers.  We spend millions of dollars to equip classrooms with new technologies that are touted to improve student learning and increase engagement.  And, while iPods and laptops can help, the core factor influencing student learning still seems to be the teacher.  It seems fair to ask why technology can't be used to improve core teacher practices that would have far-reaching impact upon student learning.  There is some work being done in this area (see the work of Peter Rich and this pilot project at the University of Central Florida), but it seems to be on the periphery.

What would happen if there were cameras installed in classrooms that could capture real footage of teacher performance?  Are there "triggers" that we would want to focus on?  What would they be?  Something like this would seem to fill a gap in current teacher development practices.  Teachers would have the opportunity to really see themselves teaching (as opposed to their perception of their teaching or someone else's interpretation), use mistakes to improve, and focus on critical parts of their performance that have been shown to lead to significant improvements in student learning.

Friday, June 11, 2010

The problem of control in education

I attended TTIX 2010 yesterday at the University of Utah.  TTIX (Teaching with Technology Idea Exchange) brings together instructional designers, technology specialists, and educators interested in the use of technology.  In her keynote address, Nancy White explored the question "Should we use communities in learning?"  While there wasn't much argument that we should not, Nancy did present an interesting paradigm for thinking about the ways in which learning occurs.  She thought about it as an issue of "me, we, and the networks."  Or, more simply, we can learn individually, in small groups, or as part of a much broader network.

We see individual and small-group learning at work every day in higher education.  But, real networked learning that extends outside of campus seems to be missing from most universities.  Of course, students often have their own personal learning networks, but these networks generally live outside of what they perceive as their school experience.  Students do one sort of learning in class, in the library, and with project groups.  They do another type of learning "outside of school," learning that is largely separated from their course work.  This seems problematic to me for at least two reasons.  First, formalized "school learning" should be authentic and connected to students' interests.  Second, if we really believe that higher education should produce life-long learners, campuses should help students begin to build and use a personal learning network that includes people, media, web resources, organizations, etc.  While some of those elements will be available on a college campus, it is either arrogant, naive, or both to think that a single college campus can connect students to all of the resources they will need for a rich learning network.  In short, there are times when we need to get students off-campus and encourage them to do their learning there.  And, this learning needs to have meaningful connections to what they are doing on campus as part of formal university programs.

Our problem in higher ed is that we want to exert control over students.  We want to tell them what they can learn and when they can learn it (the standard course model); we want to create closed learning management systems (e.g. Blackboard & Brain Honey) that allow us to monitor student learning and keep out "intruders;" and we tell them what their learning goals will be (graduation requirements).  Universities, by their very nature, will always have some level of structure and exert some level of control over students--I've come to accept that fact as unavoidable.  However, why couldn't our institutions help students identify their own learning goals, build their own personal learning networks, and then find ways to connect that learning to university coursework (or internships, capstone experiences, field studies, service learning, etc.)?

We often wonder why students aren't motivated to do the sorts of deep learning that we would hope to see at the university level.  But, we can't be too surprised, given the fact that we've removed most of a student's autonomy.  If students don't have some choice in structuring their learning (and selecting from a list of courses to take is a poor excuse for "choice"), they will rarely be motivated to learn deeply (see this TED talk by Dan Pink for more on this idea or this condensed version of his ideas; his newest book, Drive, is also a good read).

What we need in higher education is more boundaryless, fuzzy, relationship-based learning that doesn't begin and end with the semester.  The traditional course model might not ever go away, but why couldn't there be an overarching learning process that is overlaid on top of courses?  The type of learning that is motivating, inspiring, and likely to last well beyond graduation?

Friday, May 21, 2010

How much has technology really helped us in education?

I just finished reading Atul Gawande's latest book, The Checklist Manifesto.  In a nutshell, Gawande argues that the right kind of checklist (it turns out that good checklists aren't easy or quick to produce) can lead to vast improvements in the way we do things, and for very little cost.  It's a great read and one of the most practically useful books I've read in a while.

Gawande is a physician and uses stories from medicine to illustrate the effectiveness of simple checklists (there are also some fascinating examples from building construction and finance).  One of the areas where checklists have made the most difference is in operating rooms.  It turns out that for even the most developed countries--those with great hospitals, state-of-the-art medical technologies, and highly-trained physicians--surgical complications are a fairly significant problem.  Gawande and his team have managed to develop simple checklists that, when used properly, have drastically reduced complication rates.  It is important to note that these have not been modest findings; the results have been startling and hard to argue with.

The interesting thing in all of this is that the vast majority of physicians and hospitals have refused to use the checklist.  Instead, most have opted to invest in $1.7 million remote-controlled surgical robots that have driven up costs massively without producing any significant improvements.  Meanwhile, the low-tech, low-budget checklists are saving lives.

There seems to be a parallel here to education.  More and more, institutions are adopting technology with the hope that it will revolutionize learning.  In my graduate work I spend a fair amount of time with instructional technologists, some of whom tout technology as the savior of schools.  I should also confess that I am sometimes a sucker for cool ed tech gadgets because they seem to make learning fun and engaging.  That's not to mention my reliance upon technology for some of the most basic functions of my job (just last night I made a presentation to a group of parents of incoming students and used the bells and whistles of a fully-mediated auditorium to "enhance" my remarks).

The nagging question I keep having, though, is whether the very expensive technology we use has really improved the learning experience for students.  It likely cost thousands of dollars to outfit the auditorium I spoke in last evening.  And, I would estimate that there are at least 100 other rooms of various sizes just like it across the rest of my campus.  In many ways this is nice.  It means that instructors can use PowerPoint slides, show media clips, play music, etc.  These things are entertaining for students and can deepen engagement.  But, how much more did the parents in my session learn because I used technology?  What if the computer had crashed mid-presentation?  Would I have been prepared enough to make the remaining 15 minutes useful?  Would a low-tech "technology" like a "minute paper" have been just as beneficial as a data slide?

I'm not arguing for the elimination of technology in higher education (or any setting for that matter).  But, I wonder how often we falsely assume that Twitter, tech classrooms, and iClickers will simplify the educational process and prevent educational failures.  Could it be that simple pedagogical tools, processes, or philosophies could be just as impactful and at a much lower cost?  And, what sorts of new failures does technology introduce?  Gawande frames this question well:

"We have most readily turned to the computer as our aid.  Computers hold out the prospect of automation as our bulwark against failure.  Indeed, they can take huge numbers of tasks off our hands, and thankfully already have--tasks of calculation, processing, storage, transmission.  Without question, technology can increase our capabilities.  But there is much that technology cannot do:  deal with the unpredictable, manage uncertainty, construct a soaring building, perform a lifesaving operation.  In many ways, technology has complicated these matters.  It has added yet another element of complexity to the systems we depend on and given us entirely new kinds of failure to contend with."

What are our "educational checklists" (i.e. those simple and frugal things that can make significant differences in learning)?  And, what are the "surgical robots" in education that look cool, but deplete budgets without making any meaningful improvement to the educational landscape?  And, maybe most importantly, how do we know when we're dealing with an expensive failure or something that, while expensive, truly will revolutionize learning?


Friday, September 25, 2009

Why do students hate school, but love learning?


A few weeks ago my wife and I had some friends over for dinner.  Just before they left we got to talking about blogs and I mentioned, somewhat casually, that I have a blog that I post to about once a week or so.  At that point my wife turned and looked at me like I was a stranger she had never seen before and then said, rather emphatically, "No you don't."  You see, I have never been particularly fond of or good at writing (as many of my posts reveal).  But, for the past year or so I have been a regular blogger.  What's more, it is one of the more enjoyable things that I do in my work.  This didn't make sense to my wife and has only recently started to make sense to me.

In high school, and to some extent in college, I cringed any time a teacher mentioned an essay, research project, or even Haiku (I still wince at writing reports, memos, and evaluations).  I hated writing.  I was forced to do it, couldn't write about anything other than what the teacher wanted, and no one other than my teacher (who I didn't really care about anyway) was going to read it.  So, I've always been a little intrigued by the fact that writing for a blog is something that I do willingly.  Last night I got some answers that shed some light on this split personality I've developed.   

I listened to a talk by Clay Shirky ("Where do people find the time?"--for part 2 of the talk click here), a professor of new media at NYU, in which he describes what he calls an "architecture of participation."  This architecture consists of three factors:  an ability for individuals to (1) consume, (2) produce, and (3) share, all of which Shirky argues are critical for meaningful participation.  The more I listened, the more I realized that my blog has allowed me to do all three of those things, and I started to believe what Shirky was saying.  My blogging is different from the academic writing asked of me in school in very important and fundamental ways.  First, when I blog I get to write about what I am interested in and it isn't restricted to a particular discipline.  I am, at heart, an educator, so many of my posts center there.  I've also written about politics, language, design, and health care. . .things I know little about, but am interested in.  Blogging also allows me to produce my own digital footprint.  I enjoy creating a post that links to books I've read, talks I've listened to, and other blogs.  It's also gratifying to google a term like "deep practice" and see one of my postings come up in the search.  I feel like I've created something that matters.  Lastly, because blogs are public I know that what I'm writing may actually be read by someone (I recognize that may not be true, but the theoretical idea sounds good to me).  That, incidentally, makes me care a lot more about what I write about and how I write it.

This all leaves me wondering how we can make school a little more like blogging.  What if schools were more thoughtful about creating an architecture of participation that would support the type of learning we hope happens in our classrooms?  How would assignments be structured differently?  How would the teacher's role in the classroom change?  How would relationships and roles among peers look?

The idealist in me wants to believe this would make a difference and that students would start learning in school in the same ways we see them learning once they leave our classrooms and campuses.  Am I being naive?