Tag Archives: 21st_century

The Two Realms of Curriculum

Since my last post about curriculum was well-received, I figured it might be good to write more on the subject.  A little on my background– I’m working my way through a doctorate in a program that deals with curriculum and instruction, and my job involves thinking about curriculum a great deal.  I created an independent study for myself in general curriculum theory and practice because I found that though I was mostly done with my course work, I still didn’t really know what I wanted to know about the topic– namely the most fundamental question of all: what should a good curriculum have?  I found that a lot of others had the same problem– curriculum studies remained kind of a mysterious, amorphous area.  Everyone knew what it was… kinda, but it was hard to talk about in concrete terms.  So I’d like to share here on Show All Your Work as much as possible of what I’ve learned about curriculum through study and experience.

There’s a lot of debate over “the future of education” everywhere you look.  Look no further than YouTube to find hundreds of videos that show the future of education as a place where students will be surrounded by slick, exciting technologies that make learning fun, instant, entertaining– basically everything that the internet is.   But there’s a whole other camp that calls for a return to “basics” such as reading, writing, and math.  Facts, to this group, are not challengeable, and learning is not “constructed.”  Students are in school to learn about the world as it is, and to learn self-control, discipline, and the things that educated people need to know.

And whichever side of the fence you’re on, there’s research to back you up.  If you’re convinced that technology will open wide the doors of the world’s information for students, and let them become quick-thinking, problem-solving, multitasking, knowledge-creating whizzes who are never more than a mobile device away from instant and continuous learning, you’re not alone.  Don Tapscott believes so, too– his books Growing Up Digital and Grown Up Digital tell us how young people are learning to lead, think, and create in ways that most adults didn’t ever get to do in their lifetimes.  Steven Johnson tells us that everything we think is bad for us— TV, too much internet time, video games– actually makes us smarter by engaging us in cognitive tasks.

Or, you might think that our reliance on technology is the beginning of the end of civilization, and that students and young people are being turned into mush-brained zombies that can only speak in short, dissociated, self-indulgent chirps about what they’re doing at any given moment.  Or that the only thing they learn from YouTube, iLife, and MySpace is that the first person is the first– and only– person that matters (you English teachers and grammar fans will appreciate that one).   You’ll be in good company, too.  E.D. Hirsch’s The Dictionary of Cultural Literacy tries to convince us that the best of art, culture, and history is important for students to know and not just be able to Google.  Mark Bauerlein, whom I’ve written about here before, admonishes us not to trust the insect-like attention spans of anyone under 30, and Nicholas Carr ominously wonders whether the internet is re-wiring our brains to avoid deep, sustained concentration.

At the center of these debates are, of course, the students that expect schools to prepare them for a world that no one– not experts, writers, leaders, politicians, and not their teachers, either– can really predict.  The role that curriculum can play in these debates is to ensure that students experience as much of both sides of the “two realms” as possible.

What that means is that in the first realm, students will need to learn skills and habits of mind.  They will need to have experiences where what they know doesn’t matter so much– it’s what they do with information that matters.  For their math class, they can have all the formulas they want handed to them on a laminated index card.  But how they apply the formulas is what will matter, and what will show their teachers that they’ve learned.  What good is knowing anything if there’s nothing to be done with the knowledge?  As an engineer, I might know trigonometry forward and backward, but without creative vision, the problem-solving capacity to work around obstacles, and the ability to empathize and negotiate with local neighbors and residents, the building isn’t getting built.

It also means that students still need to be expected to learn things by heart.  What I mean is that they need to internalize information, know it, hold it, and keep it ready for use when it’s needed.  Knowing things has taken a hit in the past 10 years or so.  Why bother remembering anything when you can get your answer from Google in .8 seconds?   But the key here is that the knowledge and skill realms don’t work against each other– they work with each other.  A student with a deep, broad knowledge base from which to draw becomes a more skilled problem-solver, critical thinker, or empathizer.  Students who possess knowledge aren’t slowed down and disrupted by having to look up everything– no matter how instantaneous the search.  They are like musicians who are in a state of flow as they play a complex, improvised solo.  Imagine if a musician didn’t know any scales, but instead relied on looking them up as needed.  That solo isn’t getting played, people aren’t being moved, and music isn’t happening.  It’s the knowledge of musical theory, the instrument, even the music’s historical precedents that feeds the creative and technical skills– that’s what makes the music.

When extreme positions are taken, dogmatism stifles the debate and can miss the real point.  To endow students with both the background and the ability, curriculum needs to be a balance of skills and knowledge.  School isn’t a place to come to memorize discrete, unrelated, useless factlets, but it’s not a place to pretend that knowing things doesn’t matter anymore, either.




Some Thoughts and Three Questions About Curricular Experiences

Part of my job involves writing curriculum.  For a long time I found it really hard to wrap my mind around exactly what that means.  Was it just thinking up some neat courses for students to take?  Was it reading through pages of complicated standards, and making sure that some class everyone took covered every one of them?  Was it even something that you could do well— and if so, how?

As I kept teaching, and then became a curriculum director, I saw that deciding on a curriculum meant thinking about two things: first, what was it that students should know or be able to do; and second, what should students have to do in pursuit of that knowledge and those skills?

Depending on your school, the word “curriculum” is tinged with either freedom and excitement or restriction and rules.  It might be about learning, or it might be about coverage.  It might be a flexible, living set of ideas that can grow and shift to include new ideas, or it might be doctrine, a set of laws as immutable as the periodic table– which, maybe some chemist will tell me, actually can be mut-ed.  My hope is that when thinking about where students need to be and what they should do to get there, a few things will happen.

First, get the students’ input.  One thing that looking at 20th-century curriculum history tells us is that using a series of outside agencies, think-tanks, and other groups to decide what students should learn might be a good way to increase rigorous content, but it leads to a lot of back-and-forth.  Any time you have one group saying we need more math and science, there’s another saying that we need more art instead.  It’s the back-and-forth that’s behind the “reform wars” that prevent a whole lot from getting done.  It’s not to say that all outside agencies are bad; the Partnership for 21st Century Skills, for instance, does a great job of figuring out exactly what those skills are.  But an agency that says its agenda is the secret to improving schools is necessarily short-sighted.  Schools don’t need just more math or just more art– they need more everything.   Asking students what they might want to learn about sounds simple enough, but as professionals, it’s easy to get caught up in fighting our own battles and lose sight of what the students in question might want to learn.

Second, involve the teachers in writing curriculum.  Curriculum specialists are all well and good, but since the teachers’ job is to enact the curriculum, they can’t be treated as soldiers who simply carry out orders.  Maybe the most leaderly thing a curriculum director can do, paradoxically, is give leadership away to those who will bring the curriculum to the students directly.  Treat the teachers like professionals, and give them their say.  That’s what leads to buy-in– a whole-school effort and everyone on the same page.

Maybe the most important, and final consideration when thinking about a curriculum is to ask these three questions:
1) Does the experience ensure the best opportunity for mastery of the subject matter?
2) Is the experience provided in a way that fosters the development of 21st-century skills and habits of mind in students?
3) Is the experience grounded in relevance and real-world application?

If a new curricular experience– a new course, an assessment, a trip, a collaboration– fulfills these three requirements, then it’s probably worthwhile.

Thanks to all my readers out there– I wish you all a wonderful holiday and a very happy new year!




Empathy was always one of those words that my students found confusing.  Was it just like “sympathy,” but you sound smarter saying it– kind of like the word “matriculate?” Empathy was a little like sympathy, I told them, but it was a little bit… more.  If sympathy was seeing someone get hurt and saying, “Aw, that’s too bad,” then empathy was about feeling that person’s pain, seeing the world through their eyes, experiencing what they experienced in their minds.  It sounded so lofty, so special, as if you had some kind of superpower.

Jeremy Rifkin’s “The Empathic Civilization” makes it clear that empathy is an essential skill for the 21st century.  It’s something you may have heard in other places, too– Daniel Goleman’s Emotional Intelligence got the ball rolling with the idea that after a long time of prioritizing skills like analytical and computational abilities, orderliness, and linear thinking, technology had gotten to the point where it would always be better than our human brains at such tasks.  The internet emerged, computational power continued to explode according to Moore’s Law, and a computer beat Garry Kasparov at chess.  We were forced to– and are still forced to– fall back on what I call our sense of “human pride”– the things we can do that a computer probably won’t ever be able to do– empathize with someone, break existing rules in the name of creativity, and imagine things that never were, for starters.  Human pride is one of those ideas that escapes definition– it’s hard to articulate, but we know it when we see it.

Rifkin says that empathizing will be an essential ability in the coming years, and I agree.  The confluence of extraordinary technologies and the forces of globalization has made worldwide communication cheap and easy.  This means that we’re running into people who are very different from us, more than ever before.  Just a hundred years ago, it would have been easy to go your whole life and never meet or talk to anyone from another country.  Today, kindergartners can Skype with counterparts in Argentina and practice each other’s languages.  We’ll run into a lot of people who are very different from us in our lifetimes.  Empathy lets us do what’s most important– find the ways in which we’re all very much the same.

Rifkin tells listeners that Facebook, Twitter, YouTube and the rest of the social media rock stars will increase empathy throughout the world– the plight of those in distress broadcast over these communication networks means more attention, more discussion, more action.  But I caution that broadcasting the troubles of the world over instantaneous, pervasive outlets might do the opposite– desensitize us, distance us, keep problems relegated to a click of the mouse.  The more we’re exposed to something, the more we habituate to it.  If you had shown people across the world images from World War I as it was happening, they would have been horrified more than they ever thought possible.  Now, we can watch footage from battle– real battle– in Afghanistan on YouTube, and not miss a bite of our sandwich.  If every tragedy to befall the world comes through on our iPhones and Droids, it will take a tsunami-sized catastrophe to get our attention.

Either way, cultivating empathy in our students is essential if society is to keep being human when technology runs so much of our lives.  We have to keep our focus on technology as a tool to serve our humanity, not the other way around.  If we can teach students to remember the meaning of a living thing, and to be able to mentally put themselves in someone else’s place– to really see, hear, feel, and think like they do– we will be giving them a skill that will help them live a happier, more human life.  Perhaps that will be the best definition of human pride.


More Book, less Face?

There’s a war going on.  You might not be able to name who is fighting whom, or who the allies are, or how it ends.  It’s a war of cultures, a war of this generation of students vs. previous generations of students– now the teachers, commentators, writers, and guardians of the old culture.  It might end a lot like Bob Dylan’s lyrics to “The Times They Are a-Changin’”:

Don’t criticize what you can’t understand
Your sons and your daughters are beyond your command.
Your old road is rapidly aging.
Please get out of the new one if you can’t lend your hand

This war is between the Facebook Culture and the “More Book, Less Face” Culture– a group that should be called the Ironic Culture, since most of them have Facebooks themselves.  But Facebook isn’t the same for these two camps.  For young people, social media are the same as life, as much as walking out the door and going to school.  The constant contact afforded to students by technology means that they’re never more than a Tweet, poke, status update, or text away from their 1051– make that 1052– friends and followers.  For those of us who grew up without social media and smart phones, Facebook is a tool or a plaything, a thing that you can do when you want to– and sometimes you spend a lot of time with it– but in the end, you walk away to your actual life.

And maybe you go read a book.  So, which way of living is better?  Nicholas Carr’s The Shallows gives an exhaustive look at the neurological evidence showing that online habits literally re-wire your brain to “crave” the online click-browse-click-browse rhythm.  It’s the plasticity of your brain at fault– do something enough and your brain rewires itself to be good at it.  And it gets rammy when it can’t have what it craves– anyone else get strangely, freakishly furious when their internet doesn’t work?  Those who quit smoking might know what I mean.  So what?  Maybe we get new brain circuits?  And?  The flip side is that our brains aren’t as infinite as we wish– for all the new circuits we form, others fall away– like the ones used for reading, for instance.

Mark Bauerlein calls today’s students The Dumbest Generation.  Young people are spending fewer and fewer hours reading and more hours online, mostly engaging in the kind of chatting, gossiping, trash-talking, bullying, and mindless meandering that marks a lot of what people– not just young people– already do when they’re together– you know, just plain hanging out.

Social media and constant connectedness, Bauerlein laments, get young people stuck in an adolescent world where they cocoon themselves in a digital bubble with each other, and remain blissfully unaware of the adult world around them.  He disdains the fact that most students will find Facebook, YouTube and MySpace– although MySpace has been sooo meh since about 2007– more compelling than Julius Caesar.  But then why do students even need Caesar?  I try to get my students to like coming to school– won’t forcing them to read Shakespeare derail even my best efforts to make it interesting?  And people do read on Facebook– all that reading of status updates is a new kind of literacy for the 21st century.   So what if kids don’t read books anymore– one in four Americans don’t, and they end up fine, mostly.  Kanye West and Victoria Beckham don’t read, and they both ended up fine– rich, anyway.

But it’s not the same, and we know it.  The best, most trenchant point that Carr and Bauerlein both make is that reading books makes us good at specific things– like sustained, focused concentration and interpreting language.  I’ve heard the argument that when books were brought to mass production in the 15th century, there was noise about the fact that people were losing the ability to memorize huge, classic works, like the ancient bards who could recite The Odyssey in full.  These same people usually say that this is all a dialectic, a historical inevitability– that digital media will destroy the book because better technology always displaces what came before it.  But I don’t think there’s such a thing as historical inevitability– I think there are people who are too afraid of being perceived as old, uncool, and out of touch, like the old-roaders of Bob Dylan’s lyrics.

Here’s what we know:  When books became the norm, yes, we lost the presence of those who could memorize astounding chunks of lyrics.  But other things happened, too, not long after the turn of the 16th century.  Science flourished.  Life expectancy soared.  Diseases were eradicated.  The fundamental laws of the universe were discovered.  Superstition fell away in favor of reason.  More people had more money and were lifted out of a kind of poverty that made life literally unlivable, in some cases.  Democracy became the norm.  Quality of life skyrocketed.  I believe that these things happened because people got good at sustained, focused concentration.  Because they read.  It was causation, not just correlation.

I don’t think the internet is a bad thing– if I did, I think I’d rightfully belong somewhere in the middle of Bob Dylan’s song.  Most people who know me would say I spend a lot of time online.  I manage two websites, and give workshops on Web 2.0 tools for learning.  The internet has been my professional bread and butter.  The internet gives us its own amazing gifts– the ability to multitask and manage complicated networks of tasks; a chance to communicate with anyone in the world cheaply and quickly; the record of all the world’s information at your fingertips, no matter how poor you may be; powerful tools that let creative minds do things never thought possible.  But it doesn’t give you the sustained, focused concentration that has marked the past 500 years.  We take in the online world through quick click-throughs and ephemeral moments, and we lose patience when YouTube videos take more than 30 seconds to load.  I agree with Carr and Bauerlein that our ability to engage in sustained, focused concentration– however we learn how to do it, reading or otherwise– is absolutely essential to our ability to keep flourishing and making progress on the world’s problems.

For teachers, and anyone who serves as a role model for a young person, another part of Bob Dylan’s song comes to mind: “Keep your eyes wide, the chance won’t come again.”  We’re at a critical time in the history of information.  The last 500 years show us the kind of advances that a book-reading culture can make.  We don’t totally know yet what an online culture can do.  But I’m not willing to step aside for some delusion of historical inevitability and bet the next 500 years on hoping for the best.
