Hoo boy. Who could have foreseen that putting a thin-skinned narcissist in charge of the country would cause all sorts of butterfly effects throughout our culture and society? To be fair, the leader of the United States is going to set the tone for discourse regardless of the particular (shall we be generous and say) eccentricities of the person in the Oval Office. One of the reasons that people look at things like gender or race when it comes to a candidate is that they can change how tone comes across. (Look at the way Jacinda Ardern, prime minister of New Zealand, reacted to an earthquake during an interview; also consider, perhaps, the way President Trump chose to behave when warned against looking directly at the sun.) Despite the hack D'Souza claiming he understands the roots of Obama's rage, the forty-fourth president maintained a calm demeanor in almost all circumstances. (This early thinkpiece about the reasons for that might be worth your time; also recall how he was ridiculed for weeping over the staggering loss at Sandy Hook Elementary School.) My point is, what happens on Pennsylvania Avenue tends to have repercussions all over the place, including in the digital sphere. While Barack Obama utilized the nascent social media and digital domains to his advantage in 2008, the tech world morphed immensely during his tenure. By the time 2016 came along, Russian interference via Facebook and other social media platforms only exacerbated what was already the clear trajectory of subscription-free websites: Divisiveness makes money. The businessman-turned-politician whose best skill is exploiting divisiveness rather unsurprisingly became the GOP nominee for the presidency. He lost the popular vote by nearly 3 million votes and became president anyway, highlighting additional problems that I'm not getting into here. 
The point is that without the digital terrain of the mid-2010s, I don't know if we'd have the current political landscape. Trump owes social media his presidency as much as he owes Putin. So it's not surprising to me to learn that Trump, at the time of this writing, is poised to sign an executive order regarding social media sites, in effect regulating what they allow on their private platforms. There is an irony here that I've seen noted in other places, and though it's a qualified irony, it's worth pointing out: To many conservatives, governmental regulation is anathema. We all remember when Rick Perry couldn't remember which regulatory agency he would have scrapped had he gone on to the presidency. Reagan's poison is conservative doctrine now: "The nine most terrifying words in the English language are: I'm from the Government, and I'm here to help." Conservatives have long run on the platform of smaller government (which has its merits) by insisting that they should be put in charge of the entity they have nothing but disdain for (which does not have its merits). Much like having teetotalers in charge of the Department of Alcoholic Beverage Control, there's something to be said about having people in charge who don't believe in the thing they're in charge of--and it isn't a nice thing to be said, either. Hence this irony: A president who has promised to repeal two regulations for every new one instituted is insisting on additional regulation. Now, some people may think his move--an executive order pushing the FTC to force social media platforms to moderate their content according to their own guidelines--is a good one. But the irony isn't in the order's value; it's in the fact that a regulation is being forwarded at all. 
I'm not going to waste time asking which two regulations Trump will strip to offset this new one--they'll probably be environmental or emission regulations--as I don't think there's a one-to-one (or, more accurately, a two-to-one) connection between these things. Here's a shoutout to 2015, when Senator Thom Tillis (R-NC) said that he didn't mind if Starbucks no longer expected its team members to wash their hands after using the restroom. Making a regulation to reduce a regulation is still a regulation. There's more going on here, though: The president's original tweets on the subject are filled with inaccuracies. Not only has mail-in voting been successful, but though mistakes can happen, there's no evidence that it is anything other than the right choice to ensure our democracy has a voice in November as COVID-19 continues to upset almost every aspect of our daily lives. More than that, however, his claim that Twitter putting a "fact-check" link on his tweet is tantamount to violating free speech is honestly, nauseatingly stupid. Not only is it a completely wrong sentiment, it beggars belief that the man who holds the highest office created by the Constitution is so Constitutionally ignorant that he mistakes being corrected for "stifling" free speech. (Cue the Neil deGrasse Tyson gif.) I'm not a Constitutional scholar, but I am a social studies teacher. I have had to spend time thinking about how the Constitution works, teaching the Bill of Rights to my students, reading history books that trace the way the Constitution has been seen, and studying how the country has run itself in the past. I'm not claiming an absolute authority on this--I'll leave that to Agent Orange--but I do claim that I've spent more time considering it than, say, a run-of-the-mill devotee of Sean Hannity or Rush Limbaugh. 
And it's clear that though it may not be a popular way of viewing things, positive and negative rights are a great way of divvying up the Bill of Rights and some of the later amendments. (Recap: Positive rights are those things which the government is obligated to provide, and they tend to be sparse in the US Constitution; negative rights are areas where the government is restrained in its power, and they are more frequent, particularly in the Bill of Rights.) When it comes to the freedom of speech, the Constitution doesn't guarantee it unconditionally. In fact, it's a perfect example of a negative right: It is a restriction on governmental action in the face of the individual's expression. It is not, however, a ban on governmental action. We all know that you can't yell "Fire!" in a crowded movie theater and walk away from any sort of legal prosecution for the action. There are times when speech can be infringed and censored by governments (local, state, or federal). Some of those times, historically speaking, have been abuses of power (consider the influence of the Alien and Sedition Acts early in the country's history, or what happened to Robert Goldstein when he ran afoul of the Espionage Act by making The Spirit of '76 back in 1917). Other cases, however, indicate that greater societal considerations outweigh individual rights. (The yelling-fire example is the quick example of that.) The larger takeaway, however, is that the guarantee of governmental non-interference with speech is not something that exists inside of my house, for example. If someone came into my house so that he could gas on about how Trump really is making America great again, I would be within my rights to tell him to shut up. I could even excuse him from my home. I would not be violating his First Amendment rights because I'm not the government. I'm a private citizen. The First Amendment allows speech to happen, yes, but it does not require anyone to listen. 
And, if my platform doesn't want to embrace that speech, I don't have to. Social media has made this trickier: Is it a public space, or a private one? If it's public, then maybe there are some other considerations to weigh. If it's a private space that everyone is allowed to see, what's the difference? Trump has flouted Twitter's rules regularly, which should have seen him excused from the platform. If it's a private company, making rules about what can be discussed or said on its servers, then that's a digital domain tantamount to my living room: Follow the rules or exit the premises. But if Twitter is a public place, can it do the same? Can public places--parks, libraries, seats of government--be places where abuse, violence, or depravity are enacted without reprisal from the people's representative government? I personally don't see Twitter as a digital version of a public place--not while it makes billions of dollars by selling ad space. It's clearly a for-profit business, and though the service may be something the public benefits from (as I often do by using the product), I'm certainly not seeing any of that profit in my bank account. (I could, I suppose, if I invested in the company.) Are Facebook and Twitter extensions of the digital commons? I would argue no, and Mark Zuckerberg agrees with me to an extent, despite his recent insipid comments about how Facebook isn't an "arbiter of truth" (which is obviously true; that he uses the phrase to try to keep his hands clean when his platform is routinely abused demonstrates that he'd rather not reflect on how perverse his worldview is). There are things that get an account banned from Facebook and Twitter. There are community guidelines. There are lines in the sand (that can be so conveniently erased) that these platforms disallow people from crossing. It is simply more profitable for Facebook specifically (though Twitter is in a similar vein) to allow divisiveness than it is to enforce its own community rules and regulations. 
An untended garden doesn't flourish with flowers; it drowns in weeds. Though this is a nuanced and difficult topic, I don't think it's an impossible-to-understand foray into metaphysical ontology or pandisciplinary exegesis. It's something that takes some time to chew on, disagree with, change one's opinion about, and move around as the idea percolates. It is, in other words, far beyond the grasp of the current ambulatory, toupee-wearing traffic cone that will forever be called the 45th president. Honestly, it's hard for me to adapt to the intellectual whiplash between forty-four and forty-five. While his interpretation of the Constitution could be held up to scrutiny and criticism (and often was), no one could honestly say President Obama hadn't studied the document. (Plenty of people said it dishonestly, obviously.) He did, after all, teach constitutional law at the University of Chicago. President Trump, however, has asserted that, "When somebody is President of the United States, the authority is total." He seems to view criticism as personal attack, with an unwavering expectation of loyalty from those who've allied themselves with him. Early on, his administration was hammered for using the term "alternative facts" to describe the surprisingly belligerent Sean Spicer's assertion that Trump's inauguration was the best attended of all time. (A quick refresher and analysis can be found here.) Gaslighting happens on a regular basis from Trump and his cronies--do I even need to link to the injecting-disinfectant comment and his flimsy "it was sarcastic" excuse? The man is untrustworthy in almost every possible way, and yet he maintains a grip on power and domination over his party. This is not simply because his politics don't align with mine, though that absolutely informs how I view the situation. 
I had very few problems with President Obama--though his failure to close Gitmo and the increase in drone strikes that killed innocent children overseas troubled me, and his educational policies were a train wreck (albeit better--barely--than Bush's). Even when I disagreed with the politics or the decision, I at least was able to view him as a competent, capable leader. His ability to improve how America was viewed by other countries was an important indicator to me that he was on the right track. Now, even when I think Trump might be making the right decision--locking down the country in response to COVID-19 was the right choice…granted he did it far too late, has assumed no responsibility for the negative consequences his policies generated, and doesn't seem to care too much that we had 9/11 levels of dead Americans daily for a week or so--I consider his "good" moves accidental spasms rather than calculated moves. Have I benefited from the slight tax relief that came because of his tax breaks? I…guess? It's been so slight that I didn't really notice. Was the stimulus helpful? I suppose; though I'm still mystified that $2 trillion can be so ineffectively redistributed (another irony of both the Bush and Trump administrations: Redistribution of wealth during times of success is communism; redistribution to the wealthy during times of crisis is "the right thing to do"). Ultimately, this little rant doesn't do much. I've a right to say it*, of course, and you have a right to disagree. If you do, I encourage you to write your own 2,000+ word afternoon diatribe, complete with a footnote and twenty-three links to sundry articles that back up your position. I promise I have the right not to read it.

---

* Though if it somehow violates the terms and conditions of Weebly, the company that hosts my website, I would fully expect this post to be removed. I wouldn't be happy about it, but it couldn't possibly be censorship if I had broken their rules. 
There are a lot of structural fractures that COVID-19 is exposing--flaws in our systems that have long been pointed to, decried, and targeted for change--that are now cracking under the weight of a prolonged shutdown and potentially greater problems down the road. Some are critical--healthcare access, availability, and usage; political programs, including and especially elections; civil rights, understandings of liberties, and repercussions for abuses of power--while others are of a more minor or middling importance.
Grades, I would argue, fall somewhere in the middle. Here's the thing about grades that educators have long groused about: They don't mean anything. Of course, that's only partially true: Money doesn't mean anything either, but fiat intersubjective agreement has given us enough traction with the idea that it is, indeed, worthwhile (even necessary). Grades don't mean anything, except in all the ways that they do. Our American system runs an alphabetical gamut from A to F (skipping E because reasons), with the ostensible meanings centered on C for average work. In many schools (though not mine), the D rank is a type of failing (though some use it as a passing grade) and means "below average." The B range is "good" or slightly above average, and the A range is used as an indication of excellence on the project, assignment, or course. We all know this--we went to school, after all. There has been lamentation about grade inflation for a number of years, with the basic thrust of the argument being that "an A doesn't mean what it used to." And while there is some truth to that, it misses the point: An A in 1969 might have been much rarer, but that in no way expresses a qualitative meaning about the A. Aside from shifting standards in content and delivery (what counts as "good writing" is a mercurial thing at best), what is actually being graded? Syntax? Rhetorical moods? Accuracy? Expression of knowledge? None of that information is recorded in the A. It's become a mark in a gradebook, a note on a transcript. That issue hasn't changed simply because the calendar has advanced. None of the grades that go to a college admissions board has ever explained anything beyond the fact that the student received an A. There's no indication of whether the kid cajoled the teacher into giving him extra credit or petitioned the administration to have a grade changed. It is not (despite what many wish it to be) an indication of meritorious effort and reward. 
It is a highly imperfect and disproportionately regarded attempt at measuring student knowledge. In fact, if you were to ask teachers what they mean by the grades they give, you'd likely be surprised by the diversity of answers. Some view it as a type of communication: "In my class, your level of effort and comprehension is about 75%. Hence the reason you have a C." Others look at it as a type of mastery over the content: "Your skills in this subject are still burgeoning, so you get a D--you have much to learn, young padawan." Yet others conceive of grades as an average of what has transpired in the class: "You wrote a very good essay but your homework was incomplete and incorrect. I'll just average that together and it'll be, say, a B+." If you've ever looked over the sundry disclosure documents of a high school student, you'll see that every teacher expects something different and renders grades based upon their own subjective (though, I hope, clearly articulated) rubric. This is where the "grades are like money" concept breaks down. You don't go to the store and feel uncertain how much your dollar bill will be worth. Sure, the prices may be higher or lower than you anticipate, but no one looks at the dollar bill and says, "That's only worth eighty-five cents here." Yet that's exactly what we do in our different grading systems. Consider the dreaded English essay. If a student writes a paper about, say, the inclusion of feathers on non-avian dinosaurs, how should I grade it? Ought I to remove points for a failure to use commas correctly? What if the student asserts that feathered dinosaurs are a passing fad in the paleontological world? That's as factually incorrect as misusing a comma, but should it matter in the paper (remember, it's an English paper)? And, of course, I know that those factual assertions are incorrect, but that's because I'm an armchair dinosaur aficionado. 
If it were a paper about, say, the correct air pressure of footballs in the NFL, I wouldn't know if the kid had made a mistake there. And this goes for any subject: Should a student's math teacher demerit a paper if there's a spelling error? What if the math teacher doesn't know her grammar well enough to catch it? And while you could argue that an English paper ought to be graded on English-paper standards and a math paper on math-paper standards, you're again invoking separate standards, which in no way demonstrates student comprehension of anything save a very slender sliver of what's being graded. (Additionally, in what way is a false assertion correct, regardless of the class in which it happens?) A point in one class is not the same as a point in another class.* A grade from one teacher of a subject is not the same as a grade from another teacher of the same subject. And yet that's exactly what we pretend our GPAs indicate--a type of standardization that doesn't actually exist. We all likely have memories of a class that was particularly hard for us. In my case, it was the AB Calculus class that I took my senior year. I was never very good at it: I ended up with a 2 (out of 5) on the AP exam and an A- in the course. It required a significantly larger output on my part than the A I earned in my AP English class. Yet, on my transcript, what did it matter? Anyone looking at that transcript would say, "Wow. This kid is solid in both mathematics and language arts." And they would be grossly misinformed. I'm terrible at mathematics. This isn't just false modesty: I'm not just bad with numbers; I don't even know what to do to the numbers to get the answer I'm looking for. A calculator only works when you know how to use it, and I basically don't. I'm not saying I didn't deserve the A- I got in AB Calc…but I'm not saying that I did, either. I don't know what that A- is supposed to say or what it's supposed to mean--and I was there to earn it. 
As problematic linguistic resources go, grades are a doozy.** Of course, there's more to it than just that. Grades have metamorphosed. Now they are also supposed to be barometers of a student's overall self-image. ("She's a good kid: She has a 4.0.") Concepts of self-identity are tied into the idea of "an A student," so much so that even after the student has demonstrated the skills the grade is supposed to measure, post-semester requests about "giving just one more point of credit so that I can get an A" are not uncommon. Enormous amounts of grade-related stress land on teenagers throughout the education system. As cases of depression and attempted suicide increase, the role of grades as a type of canary in the coal mine has also increased. Grades are now tasked with warning about mental health problems while at the same time adding additional stress to a young person's life. Little wonder we focus so much on them: We view grades as both panacea and affliction, the cause of--and solution to--all of our academic problems. When a student's self-image is connected to her GPA, desperation and poor choices often come along with it, to say nothing of existential crises in a mind that is not yet equipped to deal with ontological shocks. Much of what happens in high school is foundational but forgettable, a crucial moment of growth off of which much of the future is built, yet not nearly as significant later on as it feels in the moment. Grades factor into that complex system in all sorts of ways, for both good and ill. This, I think, is another component of our insistence on their use. With the structural blows to education that COVID-19 has given us, it's time to consider what we mean by grades. 
We've inherited this system through endless years of tradition; unfortunately, it wasn't a pure system from the outset, and even if it were, the pressure on grades to do more than they can has made them a vestigial component of education. There are alternatives for getting what we want out of a responsibility system,*** and those would, of course, come with compromises and changes. While I plan on figuring out some of these alternatives, I think the bigger question is this: Do we want to change this system? We have an unheard-of opportunity in the current circumstances to radically and permanently change how we communicate about a student's growth and acquisition of knowledge--or (and this is crazy, I know) maybe something else about the student besides just rote memorization or academic business as usual. Ought we to change what we do? Can the massive lemon that is COVID-19, which has upended grading so fundamentally that the past term is, in my view, a complete waste of time‡, be turned into a lemonade that serves all students better? Are we willing to shift things enough to make education more accessible, equitable, and purposeful? And are we willing to pay the price that such a change will inevitably cost?

---

* Though I've often wondered about a school wherein teachers set a price for a grade, with each assignment acting as a type of "payment" for students' work, which they then would be able to use as currency for their grades. It sounds nightmarish to me.

** Speaking of problematic linguistic resources, there's also the damage that a grade-based "misstatement" can render. A friend of mine was studying Greek in college. He did well in the class--got an A--but learned effectively nothing because of how his professor ran the course. When my buddy went on to the next level of the language, a different professor expected skills that my friend's transcript said he'd attained, but hadn't really mastered. 
To this day, his Greek is weak (better than almost anyone else he knows, of course, but for a classics major, surprisingly shallow), and it stems from that parallax gap between praxis and practice.

*** I use this phrase deliberately, mostly because I didn't explore this other facet of grades in the essay proper: One of the reasons we teachers like grades is that they're a way of generating habits within students so that they grow, learn, and enhance their skills. They're one of the few tools educators have to shape student behavior so that students act in ways designed to help them grow as individuals and as learners. Of all the reasons that grades are beneficial, this is the one that makes the biggest difference to me. I've taught classes where the grade is irrelevant--"Automatic A; I just expect you to work while you're with me"--and I've had mixed results. Highly motivated students tend to do fine with that, but those who might have worked more diligently had there been a higher grade expectation ended up providing middling work at best. That, again, puts pressure on what I mean by an A in my creative writing class--what's "A" about what they did? That they came to class and wrote? That they wrote well? That they demonstrated some form of learning? Should a creative writing class be more prescriptive? All of these sorts of questions spiral out of the concept of grading, even in a low-stakes elective class.

‡ I view the final quarter of this semester as having been a waste of time, as far as grading goes. None of what a grade can communicate is coming through. None of what I'd like a grade to do is worthwhile. And though academic institutions will have to keep in mind that applicants who went through Q4 2020 might need some sort of accommodation, that kind of memory likely won't last long. Besides, who hasn't been affected by COVID-19? My first grader didn't get the same education that he should have. 
How many repercussions will that have through the rest of his schooling? It's all well and good that the class of 2021 might have colleges be more lenient when looking at their transcripts, but what about the class of 2031, whose entire schooling careers have been permanently shifted by what has transpired these past few months?

With the end of the school year whimpering its way toward graduation, I decided to host some low-expectation online offerings for this week between the end of our school's finals and the official ending of the school year. To that end, I set up a couple of Dungeons and Dragons campaigns, a music-sharing get-together, a Random Stuff I Know™ © ® chat session, a Socratic discussion about David Foster Wallace's "This is Water" speech, and a book club on Alice's Adventures in Wonderland. I had diminishing returns as the week went on, with only three or four students attending the Random Stuff I Know™ © ® and Socratic discussions. Still, it was a lot of fun to see some of these students again, and to have an hour or so of chatting about something that wasn't curriculum-based.
Today was the day I hosted the book club, and it was a low-water mark in terms of attendance (only one student came) but a high-water mark in terms of discussion. This is unusual: There's a critical mass of students usually needed for a high-quality discussion, and who makes up that group matters, too. Typically, if a student wants to have a one-on-one discussion, it's because she has some specific problem or question that she wants help working through. As far as a book club goes, however, a one-on-one session doesn't necessarily inspire confidence in the potential of the conversation. So when the only student showed up, I was relieved to see that it was Becca--one of my favorite students from one of my favorite families. She had finished reading Alice's Adventures in Wonderland earlier this morning and was willing to spend an hour talking with her teacher--now former teacher, I suppose--about this piece of children's literature. I'm really glad she did. I won't go into all that Becca and I talked about--though we managed to range from some light religious comments to deep questions about identity and incorporated some Harry Potter and Shakespeare quotes while we were at it--but instead want to focus on the question that is the inspiration for this essay: What is a classic? This is one of the foundational questions that we pose to our students when they come to my school. We're a liberal arts school built on the concept of learning from "the classics," which we use in both its traditional (that is, the great works of Homer and Virgil) and broader (our students read The Scarlet Letter, for example) senses. It makes sense, therefore, that we try to define our terms when we say that we want to study the classics. When I ask my sophomores what they think a classic is on the second day of school, they often give some good, albeit incomplete, answers. 
"Something that's withstood the test of time" is frequently put up there, though it's an easy enough idea to challenge. (Is The Princess Bride a classic of film? Can any film be considered a classic, when the form is barely over a hundred years old?) We talk about a classic being required in school, even though that isn't a required part of the definition…if that makes any sense. There are a lot of other things that they come up with, of course, but the picture should be coming into focus: What makes a classic is hard to pin down. Part of that comes from being able to apply the term to other media, which I think is a crucial component. The Greeks may have invented poetry, but we've other ways of communicating beyond that now. The concept of film, I think, is really helpful, as it's old enough to be a given in our culture, yet new enough to force additional understanding onto the definition of classic. (Can video games fit into this definition? Yes. Do they? Very, very rarely.) As Becca and I talked about why Alice's Adventures in Wonderland is a classic, we pulled on a concept that is partly satirized in the last chapter of the book. In Chapter XII, Alice is brought as a witness in the trial of the Knave who supposedly stole the tarts. The White Rabbit throws in a poem (supposedly a confession written by the accused) that ought to help clear things up. Unfortunately, the poem is so vague that it could be applied to a great many situations. "'I don't believe there's an atom of meaning in it,'" says Alice (114), and she's basically right. It's imprecise and not particularly worth interpreting. The King agrees that it would be better if the poem were meaningless, because then he wouldn't have to interpret it. But he can't help himself, and he starts to "botch the words up fit to [his] own thoughts" (Hamlet 4.5) in an interpretive pretzel that strains to make the poem mean what the King thinks it ought to mean. 
Becca and I noticed that this impulse to interpret a book of nonsense is the same sort of thing the King himself is doing. And that's when we cottoned onto the idea that it is the accretion of additional interpretations that marks a piece of work as a classic. The text itself is comparatively narrow--there are only two epic poems by Homer, and Virgil has but one masterpiece (and Shakespeare, building off what came before, created a dozen masterpieces because Shakespeare is incredible)--but it invites, encourages, and (most importantly) allows additional interpretations. The boundaries of the story do not confine the meaning of the story. A classic, therefore, insists that the ways into it and out of it continue to expand. Time allows us to see which pieces have endured this sort of hermeneutical expansion--which is why we often think of classics as "old"--but age is more of an outgrowth of a classic's richness. Part of how a classic does this, I think, is via a return to the beginning. Sometimes that's through direct invocation--Frankenstein's frame story brings us back to where we started, for example--and sometimes it's a matter of thematic closure and the protagonist's completion of the goal. However it comes about, there's a revolution that returns to its starting point: Alice wakes up next to where she'd fallen asleep; Peter Pan refuses to grow up; Dante leaves the "straightforward path" of true worship and doesn't return until his theophany amongst the stars. This provides closure, but also encouragement: "You saw one thing this time through. Go again, and see what else you discover." Talking it over with Becca, it was this second component that made such a difference. Today marks the last day of her time at my school: She graduates next Friday, and there aren't any more lessons for her to attend. Even my extracurricular get-togethers have ended. 
Much like a classic, she has now returned to where she began, asking (and, I think, perhaps, answering) the question that began her entire educational path at my school: What is a classic? In that sense, her classical education was an interpretive journey through the classics, forming her own classic in her growth as a human and a seeker of truth. Being a part of that journey is why I love being a teacher. With the ramifications of the pandemic so immensely unclear--and with Senate testimony from Dr. Fauci having just wrapped up as I sit to write this--I have some thoughts about schooling, the pandemic, and a bizarre piece I happened across in my browser: an op-ed by Michael Petrilli called "Half-Time High School May Be Just What Students Need".
To begin with, Michael Petrilli is president of the conservative think tank the Thomas B. Fordham Institute, an editor with Education Next (an outlet for corporate education-reform policies), and a proud father. Since I'm not really interested in making any sort of ad-hominem argument about him, I bring this up only to say that he is coming from a different point of view and philosophy about education than I am. Additionally, he might have answers to some of my critiques--but they aren't in the op-ed piece, which is what I'm responding to. Petrilli makes an important and unavoidable point: COVID-19 has fundamentally upset what it means to get an education. He begins his piece lamenting the loss of the non-academic value that schools provide: sports events, dances, musicals, and other group-based events. These are crucial components of an educational experience in America and provide an opportunity for students to learn more about how much humankind has to offer. There's a reason why school is more than the "core classes", and exposure to variety (both in and out of the classroom) is necessary. He then paints a picture that is certainly common, though by no means universal: The tuned-out teenager who's drifting through the day, waiting for the sweet relief of the bell to let them out to their freedom. While there absolutely are those students (and I think everyone, at one point or another, fell into that category), it's also true that there are teenagers sitting in classes that they love, learning eagerly, and anxious to improve their skills and understanding--even for seven hours a day. He claims (and I don't think he's wrong) that students would be happier if "they spent much more of their time reading, writing and completing projects than going through the motions in our industrial-style schools." It's true that our schools have been heavily influenced by industrial-revolution ideas, as well as Cold War expectations for creating a workforce. 
In fact, that's the fundamental question about what education is for in the first place: Is it about making future workers? Improving the lives of the students? Providing opportunities to grow and fail with a safety net still in place? Memorizing facts? Socializing? Gaining experiences they don't know will matter to them later on? Forcing them to do things they don't want to do? Our education system does a lot of things, but answering this question isn't something we do very well, most likely because there are so many different teachers who go into this profession for so many different reasons, seeing different ways that their career affects their students. Where I disagree with Petrilli's sentiment here is the idea that the students would be spending "much more of their time" doing school-related activities. In the past two months, I've seen some of my students almost implode because of the workload--which, of course, is reduced from what it would have been during regular sessions--and struggle to meet even a single deadline. (Yes, I'm working with those students; I haven't left them in the dreary wilderness of Bad Grades…yet.) Online schooling--or, as my principal more accurately describes it, "crisis schooling"--is obviously an abnormal situation. It may be premature to draw any conclusions about what's happened in the last quarter of the 2019-2020 school year. However, one of the things that we as teachers see every single year is that consistency makes an enormous difference in the overall growth of the student. I love my summers off, but I'll be the first to admit that there is a distinctive loss of retention over the long break. Math and language teachers especially see this, but I have full confidence that, even in a normal situation, if I gave a freshly-minted junior her final from her sophomore year on Day One of her new school year, she would fail that final. 
This points to one of the bigger problems with Petrilli's argument (a problem acknowledged only in the article's subtitle): The difference between a senior in high school and a freshman in college is one of age. Teenagers' brains melt during puberty, and there is a lot of stuff that they learn only to forget. That's a natural part of development (and also the reason why they are exposed to the same history multiple times over the course of their education). Petrilli's use of the college paradigm raises a question I've wondered about myself: Why don't we use the Ivory Tower as a model for our more prosaic public schools? As he points out, there are only about three hours of in-person schooling a day in college, so why not do the same for high school? Well, the answer is pretty straightforward: High school isn't college. If you remember your college experiences at all, you'll remember how crucial it was that you manage your time, delicately balancing class schedules, work requirements, and study hours so that you could meet all of your obligations. Often, the on-campus stuff was the easiest part of the day. And though I look back fondly on my college experience, I know that for a lot of people, college was vastly more stressful and difficult to manage than high school. One of the contributing factors was that very thing that Petrilli is exulting over: The freedom to design one's day. I consider myself to have been a pretty committed student during my time as a Wolverine, and even so, the urge to skip a class when only the midterm and the final counted toward the grade was pretty strong. I mean, I was paying for the class and still struggled to find the motivation sometimes. What do you think the result would be of putting a child in charge of what she's supposed to do at any given time? When dealing with younger people (yes, even seniors), the routine of the school day is what allows them to move into the more self-directed areas. 
Almost all educators know of that "one kid" who can't seem to finish his homework, despite having it outlined on the classroom calendar and seeing him write the assignment down in his planner. The ability to plan and manage time is on a spectrum, for sure, so the majority of students tend to do well enough. But if you were to take even a highly organized, highly motivated student and give her a college-level schedule, she would likely struggle to decide what to do. I mean, high school kids (yes, even seniors) are still kids. I would imagine that, by now, most parents who are trying to help their own children with the schoolwork coming through the computers recognize how important it is to provide a lot of structure for growing minds. The educational parlance of "scaffolding" is really important here: Teachers of younger children do a lot of the heavy lifting when it comes to things like scheduling. The training wheels of disclosure documents and parent/teacher conferences are there to help the students move forward so that they can be ready to stand on their own when it's their turn. There's also a very important issue that Petrilli fails to even acknowledge in passing, and that's the fact that schools provide 30 hours a week (or more, depending on the school) of childcare. "Why don't schools start later? Teenagers need more sleep, according to the research," some people (including Petrilli, it seems) ask with a scratch of their heads. Because the work day won't shift correspondingly: If mom has to get to work by 8, she can't drop her kid off at the school at 9. Additionally, shifting the school day back means that academics begin to encroach on extracurriculars and the vital lifeblood of every Prom group, the part-time job. A later start would mean a later end, and I can testify that ending one's day at 3:30pm after starting at 8:00am is really rough. 
Now, obviously, the reason that schools can't collapse the entire schedule (start at 9 and end at 2) is the state-mandated number of seat-hours. With enough political will, this part of the equation could change--though it doesn't change the parental situation. Pretending for a moment that we could go back to normal school in the fall, except that kids aren't in school from 8 to 3, what does that do to a working mother's schedule? Is free daycare available? (No.) Is her work kid-friendly and capable of letting the child come and be entertained/cared for while her mom works? (Unlikely.) Divorcees, single parents, and kids from otherwise "less-than-ideal" homes would not be able to provide what full-time school does. Perhaps a rebuttal would be, "Do we really need to pander to the rare exceptions? Couldn't we make a better system and then figure out what to do with the spares?" Aside from being incredibly heartless, this question assumes something that is going to be increasingly untrue as time goes on--that "normal" kids are the ones coming from a nuclear family with a stay-at-home parent (if we're being generous; "stay-at-home mom" is likely more accurate)--and it ignores that those who would be most disadvantaged by a shift focused on the "normal" kids are the most vulnerable in our society. Schools provide more than education: They provide a safe place for students whose home lives are uncomfortable or dangerous; they give food to kids who may not otherwise eat; they give students tools that the kids' parents don't have when they teach them reading, writing, and online skills, often in a second language. No, schools are pretty far from perfect. However, dismissing those students as collateral damage in the wake of a full-system overhaul is a flawed decision. Another issue that I take with Petrilli's piece is the missing half of the equation: The teachers. 
I really appreciate his focus on students--even if I question who he thinks is supposed to be in school--because that's the most important aspect of the story. But skipping over the implications that a half-time day would have for teachers is a massive misstep. There are lots of reasons that we can't simply flip the switch on what we have now. Here are a couple: The average age of teachers in America in 2016 (I'm sure the numbers have shifted slightly) was 42. And while that may be the answer to life, the universe, and everything, it's also an indication of a demographic that is not likely to be making a TikTok video any time soon. I'm not saying that old dogs can't learn new tricks (I hope to be less clichéd than that); I'm saying that a resistance to change is a real issue. One of my coworkers is old enough to be my grandmother, yet she is keenly interested in using digital tools to help her students learn. Yes, she still makes copies and hands out worksheets (and considering the fact that she's working with 7th graders, that's probably a good policy to have), but she's always trying to use Google Classroom to provide feedback and devise new strategies with the tech. She may not be the only one (though some of my other, older compatriots are a bit less flexible in this area), but she isn't in the majority. Teachers resist all sorts of external changes, from new core curricula to what's allowed in their dress code. It comes, I think, from having a great deal of autonomy and authority in the classroom; when that is challenged in any way, defenses tend to go up. Another reason why radically shifting the educational system requires quite a bit more effort than what Petrilli argues for is a matter of money. 
This is a sore spot for basically everyone--teachers are tired of being used in self-sacrifice porn and held up as martyrs for a greater cause simply because they have to have three jobs just to make ends meet; taxpayers are tired of seeing bureaucratic waste and six-digit salaries going to district puppets; conspiracy theorists are tired of claiming that public education is a usurpation of God-given commandments that a child only be taught by their nuclear parents (just kidding; they never get tired of claiming any- and everything). But it basically boils down to this: A radical restructuring and re-administrating of a century's worth of educational practices cannot be done for free. I last saw all of my students on 12 March 2020. On 13 March, I said goodbye to some of them (we have half-day Fridays), wishing them a good weekend and telling them that I would see them next week. By the time Monday, 16 March had arrived, I was at school, frantically John Henrying the track as the steam engine of "online school" barreled my way. I had two days to redesign a carefully constructed curriculum, restructuring my schedule and my teaching style and excising some of the most important moments of my year because what came next was incompatible with what I wanted to do. Now, I think I did all right, in part because of an ease I have with technology already (a fortunate advantage that not all teachers share), but did I get a bonus for this? Was I paid extra for having to do something so drastically different from my "job description"? No. In fact, there's a very real possibility I won't even get an annual raise. When teachers say that they want more pay, they're not trying to nickel-and-dime taxpayers. First of all, teachers are taxpayers. 
Secondly, there is a lot of flexibility and improvisation that teachers have to go through, and since every teacher is a college graduate and over half of them have a master's degree, it's only fair to feel that such training and expertise deserve pecuniary rewards. Thirdly, now more than ever, teaching is a dangerous job. Quite aside from the nightmare of school shootings, schools are petri dishes for the transmission of disease. Any teacher who is high risk or must care for someone who is (as in my case) is putting her entire family in danger by virtue of her job. I recognize that part of the reason we're even talking about half-time school is the need to maintain social distancing as much as possible. Money doesn't solve every problem, but it can help ameliorate certain situations. Now, obviously, my resistance to Petrilli's argument doesn't mean that I think it's bereft of merit. I see this pandemic as an opportunity to shift education in a way that I've long felt it needed. However, I do think it's folly to assume we can change things into a "new normal" in the course of six months, especially when we have to look at the broader implications for the less-fortunate students in our country. Maybe some day I'll write up my ideas.

I like to think that I'm a pretty easy person to birthday-shop for: Get me a book on something I'm interested in and that goes down well. Still, my family prefers to do things a bit more specific, so I try to keep my Amazon wishlist updated. This birthday, with it being in quaran-times and without the ability to do the annual tradition of going to a movie to celebrate my ageing up, I spent a quiet evening at home with the family, doing essentially the same thing that I've done with them for over two months. Though the party wasn't particularly memorable, the situation was, and I'm grateful that my family and I can have moments like that despite the strangeness of life in the spring of 2020.
One of the things that I put on my wishlist was a Magic: The Gathering book called War of the Spark: Ravnica. My son bought it for me, and I finished reading it yesterday. It is…good? And bad? It's complicated…

What Worked

When I was a teenager, my friend Mark Wyman was big into the Rifts TTRPG. He had a novel set in that world which he liked. I asked him if I could read it, but he didn't recommend it: I wouldn't, he said, know what it was talking about. I figured that I'd be fine--it's a science fiction world, and I'd read quite a bit of science fiction with weird worlds and weird things. I tried reading it anyway, and returned it to him after about twenty or so pages. It was just too hard to deal with how much was assumed of me as a reader. When it comes to these types of spin-off sff novels, there's always a bit of a problem with lore. How much backstory for characters, events, or locations should be provided? How much can the author expect of her readers in terms of preexisting knowledge? What kinds of details are necessary, especially if it's an art-heavy kind of IP? In the case of War of the Spark, Greg Weisman has a lot of ground to cover, as the story's premise is, for lack of a better comparison, the entirety of Endgame. I mean, Endgame doesn't really work that well as a movie qua movie, does it? (I haven't seen it since it came out, so I may be wrong about this.) That is, the emotional stakes, the personal desires, the consequences of the Snappening…all of that is foreground that other movies established. So Endgame has a really strong foundation that assumes a great deal of investment from the audience. (For the record, I think it really did pay off.) So how does this connect to Weisman's book? Well, there was an intricate plan to stop the Big Bad (an Elder Dragon named Nicol Bolas) from attaining god-like power. The book begins with the aftermath of that plan's failure. (See? Kind of a lot like Endgame.) 
The events of the next day, as the heroes of the Magic: The Gathering universe (called Planeswalkers) scramble to fix the situation, fill the rest of the novel. As far as it goes, this worked well…but only because I'm an ardent enough fan to know the mythos, lore, locations, and even abilities of a great many of the characters. (Weisman head-hops from Planeswalker to Planeswalker, going through at least a dozen different ones in the course of the story.) I knew what Jace Beleren was capable of doing, I knew he had a relationship with Liliana Vess (I didn't know about his fling with Vraska, which was a surprise), and I knew his commitment to protecting the Multiverse from destruction. The card game on which the book is based has a rich and complicated lore that comes through all sorts of different avenues, including art books (of which I own five), novels, articles, and more. So there's a lot of information that a reader has to absorb before this story can make sense. And, in a lot of ways, that makes this book an excellent piece of fan-service. Everyone gets a bit of the spotlight, with the ten different guilds of Ravnica participating in one form or another. Planeswalkers galore fill out the ranks, and the stakes are tangible. The action is persistent, but there are still moments of connection and emotional empathy--provided, of course, one already has an understanding of these characters.

What Didn't Work

I have to admit, I felt like I was reading the second book in a series: The web of intrigue and feelings that connected the Planeswalkers was already so advanced that I checked online a couple of different times to make sure that I hadn't picked up the wrong book. (There is a sequel, which I will likely buy at some point.) This sense of not-quite-knowing but being able to pick up enough of the pieces is a testament to Weisman's skill as a storyteller. 
Unfortunately, though I was able to figure out what happened before the book began as I read along, it meant I had to take for granted the reasons for people's behavior throughout much of the story. I didn't know their specifics well enough to understand why everyone felt the way they did about, say, the betrayal of Vraska. By the end, yes, I got it. The result of telling the story this way, though, is that I watched the consequences of choices that I didn't understand until much later. That made the story feel out of order, and the ramifications of the pre-story actions weren't as strongly felt. Additionally, though I think Weisman tells the story well, his sentence-level writing is perfunctory and sometimes even bad. His pacing is cinematic--there are page-long chapters, as well as chapters that sprawl for a dozen pages--and that works well, but his descriptions are consistently inconsistent. This, of course, is part of the problem of adaptations: How much should one describe a character whose face is plastered on thousands of cards? Usually, Weisman will throw in a single sentence--maybe two--about what a character looks like, focusing on the important details. Ajani is a leonin, so his head looks like a lion's. For players of the game, that's all that's needed. So it came as a surprise to see the loving and lengthy descriptions of the Cult of Rakdos. Multiple paragraphs were spent describing the dark, bloody atmosphere as some of the Planeswalkers made their way through it. The criticism isn't that the details of the Cult of Rakdos were expansive; it's that the rest were not. The inconsistency stood out to me. Going along with that, the various problems that the Planeswalkers needed to solve were resolved quickly, often within their own short chapters. I understand the impulse: There's a lot of story here, so a focus on moving the plot forward was probably a good one. Unfortunately, that choice led to the book feeling skimpy. 
There were chapters that should've been an entire third of the novel. Trying to pack such an immense and complicated story into 360 pages with a dozen POV characters in a fantasy world (and fantasy novels are notorious for being longer, as more explanation is needed for how the fantasy world works) is a task that might very well be impossible.

Should You Read It?

Weisman did his best--and it's an enjoyable romp that I'd recommend to Magic: The Gathering players--but that isn't enough to make it a good book. It's good at what it's trying to do, but I think there are enough dings and flaws in it to make it a book for Magic-lovers, rather than for someone who's curious what a Magic: The Gathering book is like. (If you want one that doesn't require a lot of knowledge about the game, check out Arena. It's the first novel set in the game's universe, and though it takes some shortcuts, I found it an enjoyable read.) As far as a flat-out recommendation goes, I'd say your mileage will absolutely vary. My younger brother will probably like it quite a bit (though I'm sure he already knows all the events that the book depicts anyway). I think my middle son will want to read it once he's done a bit more reading of the art books I own and played the game a little longer. But I don't think my mom's going to be interested in this one.

As part of my rereading of Shakespeare, I finally finished Richard III. I've been struggling to get much of the Bard read--a slowness that's my own fault, really. In all actuality, I should be able to read a play in an afternoon, since that's about how long it takes to have one performed and I read faster than actors speak. But I don't read Shakespeare that way: I read with a pencil in hand, cross-referencing other plays when I think of them, and paying careful attention to what I'm reading. The result is that I go through very, very slowly. I finished Richard III at the end of April, despite having started it in January.
Still, I did it, and I've some things to say about this one. Richard III marks the genuine beginning of something Shakespeare flirted with in Titus Andronicus but then set aside until this play: a focus on a single character. All of his plays are filled with characters, of course--as a writer of plays, he had to consider how his fellow actors would be given their jobs, after all. The early comedies and even the first histories that he wrote, however, are ensemble pieces. Two Gentlemen of Verona has, of course, two main characters and their attendant love interests. Because Taming of the Shrew is a comedy, it has to have the A-plot love interest and the B-plot love interest. The history plays are (up to Richard III) split in focus among the different factions and battles. Only in Titus Andronicus do we finally see the intimations of a main character. Unfortunately, that play is pretty gruesome and lands poorly. It's a bit like The Jungle by Upton Sinclair. Sinclair wanted to write a novel that would hit America in the heart (and instill the desire to spread socialism throughout the country, to cease the exploitation of the American worker); instead, he hit it in the stomach, which led to regulations about how slaughterhouses worked. Titus Andronicus might have been intended for a different effect, but the result is that the blood-soaked stage covers anything that might have been happening inside of the characters. Enter Richard of Gloucester. This malevolent Machiavel had already been showcased in 3 Henry VI (where Prince Edward says to Richard "Thou, misshapen Dick", much to the hilarity of future sophomores throughout the centuries), and the foundation of what Richard will become in his own play is set down brilliantly. 
However, Richard's presence in the background of 3 Henry VI is inversely proportional to his presence in its sequel: From the first line he speaks (the famously misunderstood "Now is the winter of our discontent…") until his enduring bargain, "My kingdom for a horse!", he is a force to be reckoned with. With the adroitness of an acrobat, Richard manipulates everyone around him, tugging and cajoling, threatening and promising, nimbly dancing through the many obstacles between him and his goal: The crown. To me, Richard III is the pivot of Shakespeare's genius. (I don't doubt that other Bardolators would disagree with me, by the way: Most hermeneutics are polemics by another name anyway.) It's here that we start to see his mastery of the soliloquy--the unpacking of a character's heart with words* becomes one of the greatest tools within Shakespeare's heady arsenal for the dramatic representation of humankind. Coupled with his unparalleled poetry, it lets Shakespeare move into a new level of expression through his protracted examination of Richard. For Shakespeare to achieve this analysis, he has to do what he always does with his histories: He telescopes events, conflates historical characters, abridges conflicts, and places people in the wrong place at the wrong time.** This is all secondary--or even tertiary--to what he's trying to accomplish. And what is that? Well, it's a theme that seems to preoccupy the Bard: What happens when you give a mortal man too much power? Much of Shakespeare's canon consists of ruminations on power, and he often comes to similar conclusions: Bad things transpire. Indeed, when I read his work in this light, it makes The Tempest an even more powerful story…but that's an analysis for a different day. There is a sense of legitimate power--legitimate use of power, I should say--within some of the plays. I get the sense that he wasn't particularly impressed with the house of York*** and felt that they ended up "getting theirs". 
However, when it comes to Richard III, he documents an ambitious man's obsession with power at any cost, up to and including the seduction of his niece (4.4) and the order that his nephews be killed and buried in the walls of the Tower (4.2). This sort of sustained attention helps to generate two conflicting emotions: Admiration for Richard's tenacity and revulsion at his behavior. Tyrants have long been a part of the makeup of the world. Though we've few historical examples of the Platonic Philosopher-King who rules despite not wanting the job, our drama prefers people of greater drive and motivation. And that's what really makes Richard III (and much of the play Richard III) so compelling: The main character, though we loathe him, actually does what he sets out to do. That's storytelling 101: Give the character a goal, put obstacles in front of that goal, and the pleasure of the story is seeing how the character overcomes those difficulties to achieve the goal. And that leads to the flaws of the piece: Richard III is a bloated play. It is the second-longest play in the canon (Hamlet clocks in at 29,844 words; Richard III has 28,439), and it feels it. Unlike the longer (and superior) Hamlet, Richard III struggles to maintain its full narrative drive the entire time. The reason for this is simple: Both Hamlet and Richard have goals. Hamlet doesn't succeed in achieving his goal until 5.2, the final scene of the play. Richard, however, gets what he's after by 4.2, thus leaving the rest of Act 4 and all of Act 5 to finish off the story. Shakespeare manages to keep Richard's attempts to remain king--his new goal--worthwhile; unfortunately, there's also a lot of cursing going on with Queen Margaret and the other women of the play, plus the machinations of events outside of Richard's control. 
The result is that the play doesn't contain the same intensity in the latter portion as in the earlier acts.‡ But these are quibbles: Richard himself is such a compelling and charismatic character that it's hard not to like him--at least, in the way that one likes horror movies, war stories, or rubbernecking at a bad accident on the freeway. There's a vile charm about him that we can't help but enjoy. We want him to succeed only so that his fall is stronger and more potent for our having seen what he did to attain such heights. In this he's a precursor to Milton's Satan, giving us insights into the darkness of ambition- and pride-gilded minds. This is only possible because of Shakespeare's shift from ensemble to lead. Surely Richard Burbage--the best actor of the company and the man who first voiced all of Shakespeare's most iconic roles--had something to do with it. Perhaps Shakespeare finally understood how well his characters could be expressed and so gave greater attention to the way he represented humanity. Or, maybe, it was happenstance: Perhaps the Bard grew tired of always writing sprawling stories of (comparatively) shallow characters and was ready to try something new. Maybe the structure of history allowed him to expand in interesting ways. (This is something I've found to be true in my own writing: Rewriting an already-told tale takes some of the burden off of the mind, allowing growth in different directions.) Whatever prompted Mr. Shakespeare to do what he did, I'm glad it happened. The arrival of this bad-guy-as-protagonist changed the way Shakespeare wrote, shifting his abilities toward even more powerful representations. Though I highly doubt it was clear to William Shakespeare when he wrote Richard III, the "lump of foul deformity" (1.2.57) ended up becoming the foundation for the apotheosis of dramatic representation: Macbeth, Lear, and--above all--Hamlet. Pretty good for someone not yet even 30 years old. 
---
* I couldn't help myself from making this little allusion to Hamlet 2.2. Nor could I help myself from pointing out the allusion, which is just bad manners.
** This is most obvious when you look at the deposed Queen Margaret, who is still in the English court--basically as a tool to harass the characters--even though in reality she had, for much of the time the play covers, already returned to her native France and/or died (the play covers a fair swath of years).
*** I'm not going to weigh in on the idea that Shakespeare wrote Richard III as propaganda for the Tudors, or that he was secretly persuaded by the White Rose side of the Wars of the Roses; I'm talking about how he portrays the historical characters in the plays.
‡ When Shakespeare addresses this kind of story again in Macbeth, he solves the problem in a couple of ways: One, he shortens the story (Macbeth has only 16,372 words--a full 12,000 fewer than Richard III); and two, he drops a line into Macbeth's mouth that renders the shift in goals so clearly that it's easy to understand Macbeth's intellectual movement: "To be thus is nothing/But to be safely thus…" (3.1.49-50).
Would you like to support my writings? Feel free to buy me a coffee (which I don't drink, but I do drink hot chocolate) at my Ko-Fi page. Thanks!