(All right, so the title is misleading: Technically, Jin is a samurai, not a shinobi. But I thought it was clever, so I went with it. Okay, moving on…) During Fall 2020, a lot of really rotten things hit me and my family, not the least of which was an infection of COVID that narrowly avoided hitting my heart-warrior son. Due to this (and a host of other things), Gayle obliged me by letting me buy a new video game. I wanted Sekiro: Shadows Die Twice or Ghost of Tsushima. As the latter was on sale, I bought that. Christmas was around the corner, so Sekiro arrived on my PlayStation 4 shortly thereafter. Suddenly, I had two games set in feudal Japan that required a lot of sword swinging to get things done. Playing the games concurrently--sometimes switching from one to another in a single evening--led to a unique juxtaposition, an insight into how wildly different developers approach a similar concept.
What's the Same

The setting: Both games take place during a historical moment in Japan--GoT during the Mongol invasions of the late 13th century, S:SDT during the Sengoku period of the late 16th century--and each relies on getting many details right. I'm no expert on this, but my brother (a Japanese teacher and translator) assures me that GoT adheres pretty faithfully to historical accuracy. There are some liberties taken, of course, but on the whole, it's a faithful adaptation. Sekiro takes place in the fictional nation of Ashina, so there's a lot more room for flexibility. Still, the lightning-angled paper streamers known as shide abound in Ashina as much as in Tsushima (perhaps a bit more in Sekiro), and sake features fairly heavily in both games. Pagodas dot the landscapes, miscanthus grass covers the ground, and inspiring vistas of cloud-capped mountains and foggy valleys add depth to both worlds. Obviously, with both games set in Japan, the characters speak Japanese (though there are English tracks) and approach their duties with a strong sense of honor and loyalty. The gameplay: Smacking bad guys with swords, throwing alternative weapons to distract or kill enemies from a distance, hiding in shadows to stealth-kill thoughtless guards, and navigating what ought to be unnavigable terrain feature heavily in both games. There are ways to distract guards, manipulate the environment, and even light enemies on fire, regardless of which title you pick to play. Fast travel, character leveling, and even alternative costumes are available, albeit handled very differently in each. Oh, and they're both third-person action RPGs, so even genre-wise they're playing in the same sandbox. As is typical for video games, there are also a number of mini-bosses to defeat, which helps improve the character's stats, plus a number of larger bosses. In such high-stakes, one-on-one battles, the enemy has a stamina bar in addition to health bars.
Deflecting enough damage--or meting out enough of your own--can break the boss's stamina, letting you deal major health damage. The story: In order to save his part of the world, the hero must embark on a quest to resist the influence of an evil usurper who wishes to harm someone the hero loves. By using his skills with the sword--and a trusty grappling hook--he will traverse a wild and dangerous world, filled with enemies in enclosed fortresses and vicious animals who will attack him at a moment's notice. In the end, the hero must confront the man he always considered his father, the man who trained him in the ways of the warrior. The life of the father will then be decided by the hero. This confrontation comes about because the hero has chosen to betray his family and the demands of tradition. Also, both have ghosts.

What's Different

The setting: Both games are stunning in their executions, albeit in different ways. There's no doubt, though, that Ghost of Tsushima has the superior graphical and visual delivery. Sucker Punch's game is jaw-droppingly beautiful, having taken massive inspiration from Akira Kurosawa's cinematic language to create engaging, powerful cut scenes. Top-notch performance-capture work and subtle facial animations that match the nuances of the acting combine with the eye candy of a late-stage PlayStation 4 game. The world feels almost tangible, with wind whistling through the leaves of grass (and the controller's speaker) and stirring the cloth of the characters. A day/night cycle and weather effects work together to make Tsushima variegated, engaging, and enjoyable to traverse. Not only that, but GoT is an open-world game, allowing the player to explore many nooks and crannies, rivers and streams, mountaintops and valleys. Light platforming mechanics give Jin--the player character--a chance to clamber around, swinging from branches to boulders in well-designed side-missions.
Indeed, discovering the shrines was one of my favorite parts of the game, as I've always reveled in well-made platforming sections (I think the early Prince of Persia titles were superb examples of this). The melding of the strictly linear approach of these mini-missions with the otherwise open-ended options of the main game is seamless and logical. By contrast, Sekiro: Shadows Die Twice is an amalgamation of open-world philosophy and conscientious, deliberate "level" design. As is almost always the case with FromSoftware games, Sekiro has a progression of map areas, each suited to a certain level of skill. At the beginning, Sekiro must fight through a memory at Hirata Estate. When I first played this section, it took a solid hour (or more…probably more) to learn the pathways through the streets, the best order in which to attack enemies, and how to avoid engaging the soldiers in anything less than the ideal situation. While beating my head against the final boss fight of the game, some seventy hours later, I chose to return to Hirata Estate and slew my way through while barely taking any damage. This is what I mean by deliberate design: Ashina has many places to explore, but they're all within the "tracks" of the main pathways. There are shortcuts--crucial to find if you want to play through without going crazy having to fight your way through the same areas three dozen times--and secrets, but the design is recursive, bringing you back to earlier areas. This creates a really cohesive but small world, one that is finely tuned for its purpose. There are hints of a broader world beyond the conflict in Ashina, but that's all they are: Hints. Yet I also mean that it's "open-world" because you don't have to play through the game in any specific, set way. There are some required early-game areas, of course--as is the case with Ghost and most every game--where options are highly limited.
However, once you reach a certain point, you can progress in whatever order you wish. I got stuck on mini-bosses a number of times, so I would go elsewhere and shinobi-stab some fools for a while. Doing so helped me level up, made me better at the game, and sometimes led to other boss/mini-boss fights I could challenge myself with. The freedom to choose how to explore the world is constrained compared to GoT, but it still gave me the impression of being in control of when and where I fought. Graphically, I have to say that it was always a bit jarring to switch from GoT to S:SDT. The former was always rich with color, its HDR10 color palette expansive and crisp. By comparison, the latter always felt a bit dingy, with washed-out colors and a grimy feeling. (This may be a PlayStation 4 issue: I've seen some breathtaking footage of Sekiro on YouTube, which I assume was captured on a high-end PC.) The game is still pretty--mostly in the way that video games are now, with sharp details that look as good close up as they do at a distance--but not the gasp-inducing beauty that GoT pulled off. The gameplay: Of the two, I vastly prefer playing as Sekiro. That isn't to say that Jin wasn't fun; on the contrary, I had a great time playing as the Ghost of Tsushima--especially when I played the online mode with my brother. It was always satisfying to get a fifty-meter headshot with my longbow (Sekiro doesn't use any bows at all) and watch the enemy rag-doll to the ground. And the way I could easily flow from one fighting style to another was a brilliant bit of design on Sucker Punch's part. Part of my preference, though, is that FromSoftware's sense of how to use the controller is so good. It doesn't sound like there'd be a lot of variability in this--there are a limited number of buttons, after all, so how could one game's use of the controller matter so much? Yet it does.
In the case of Sekiro, the shoulder buttons serve as the attack buttons, which means I can run and jump without having to reset my thumb when I need to attack. This game moves quickly (not in terms of story…that's a different thing altogether), so I want every slight advantage I can get. By way of (yet another) comparison, I recently started playing Marvel's Avengers. I remapped the controls as much as I could to resemble a FromSoftware game: I use my right fingers to attack, leaving my thumb free for dodging and jumping. But because the game isn't designed for that level of finesse, it doesn't have the same feeling. Like, at all. In fact, I'm planning on switching back to the defaults, because it simply isn't satisfying. It's sort of like trying to run an HDR10 game on a TV that only outputs 1080i: The higher-quality stuff isn't really doing anything for the experience. Sekiro moves like a shadow, practically gliding over the earth, stealth-killing and slashing his way through Ashina. Because the sound design, animation sequences, and controller interaction are so well welded together, kills feel substantial and satisfying. Flying out of the air to land on an unsuspecting monster's neck is a frequent thrill. And, with stealth kills and deathblows mapped to the same button as my basic attack, I almost never flubbed one. I can't say the same for Ghost of Tsushima. It was always clear when I played Sekiro before Ghost: In the latter game, the R1 button throws a kunai at the bad guys. I can't tell you how many times I thought I was about to chop my opponents down, only to find myself throwing some small knives at them, staggering them backwards. The muscle memory took rewiring each time. More than any of these specific components, the reality is that nobody can touch FromSoftware when it comes to boss fights.
(The closest would be Hideo Kojima during his prime years on Metal Gear Solid, and maybe a couple of moments in Bayonetta and Devil May Cry.) The common refrain on FromSoftware games is that they're punishingly hard. That is true, but it isn't the difficulty that makes the games worth playing; it's how satisfying it is when you finally land that last deathblow and defeat the enemy that has sent you back to the checkpoint countless times. There's a thrill not unlike riding a rollercoaster when you're squaring off against the Blazing Bull for the fifth or sixth time and you've finally got him on the ropes. Finally putting down a boss (or, as happened so much more often with me, a mini-boss) after so many attempts feels so good. It's honestly addictive, and part of the reason that, after beating Bloodborne a few months ago, I've been flirting with the idea of replaying it. (I have a couple of other games to knock out before I do that, however.) And while I was always satisfied when I defeated a difficult boss in Ghost of Tsushima, those victories didn't provide the same level of satisfaction as defeating someone who had given me grief for a solid hour in Sekiro. All that being said, both gameplay styles are good. Not just good, but really top-notch. The designers brought their A-game (I honestly don't know what that phrase is supposed to mean) to their products, and it shows. I thoroughly enjoyed both offerings and had fun while I was there. The story: Despite my earlier, glib way of pointing out plot similarities, the two games are drastically different. And while both have "ghosts," the supernatural is pretty muted in Ghost of Tsushima, while it's crucial to the story of Sekiro. Ghost of Tsushima is a story about revenge and fury, about repelling invaders and unifying a fighting force to stop a great wrong from happening. Its scope is large, yet it remains tightly focused on Jin.
He is an interesting character, one who struggles with what he has to do in order to save his island home--an exploration of what happens when one gives up morality for Machiavellian advantage. More than that, the story really resonates because of the aforementioned performances. Being able to see the characters' faces, their emotional responses to the subjects they discuss, and even the changes in their costumes to match new moments in the story pulled me into Jin's journey much more fully. Video games are unique in their interactivity, but their ability to use cinematic language can't be overlooked. I felt a gentle push-and-pull between being in control of a character and being willing to let him go when the story intervened. Sekiro, on the other hand, has very few cutscenes, and though there are lots of conversations, they feel like puppets delivering dialogue. There isn't any emotion in the body language, as the interlocutors remain stiff while they run through their lines. The camera remains free, allowing me to spin around and try to see Sekiro's face to gauge his emotional reaction. Unfortunately, this tends to distract me, making it hard to pay attention to what's being said, and it fails at the goal of drawing me further into the world. Sekiro takes all information in with the same stoic resolve he'd show if someone pointed out that he has a nose. I know why game designers do this (they're trying to get players to invest themselves more fully in the avatar, and they don't want the character's personality to interfere with that), but I really wish they'd stop trying. It doesn't make sense. It didn't work for Solid Snake, it doesn't work for the Hunter in Bloodborne, and it won't work for Sekiro. Blank-canvas characters aren't interesting (I'm looking at you, Bella Swan), no matter the medium.
Of course, one thing video games can do that no other medium can is tell a non-linear story based on how much the audience wants to hear. Sekiro's story is told through small "remnants" of memories that you find as you explore, as well as item descriptions, notes found in the world, conversations over sake with NPCs, details in the environment, and--occasionally--a cutscene. It's a fantastic way to tell a story, because a player gets as much out of it as she puts into it. For me, this is the great strength of interactive storytelling: Giving the player choice and control, not over narrative trees, but over the quantity and detail of the story. That's the other ingredient in FromSoftware's secret sauce, and it's used to perfection in this game. Except for one thing: Sekiro is an actual character, not solely an avatar. Neither of these games allows for character creation--all people who play Ghost will play as Jin; there is only one Sekiro in Sekiro--and that means the story can be focused on the character qua character, rather than on an inciting incident for the events of the world. In other FromSoftware games, you can decide what your avatar looks like--skin color, gender, height, and more--and pilot that avatar throughout the dark world. And it is that world wherein the story happens. Bloodborne, for example, is about a Hunter who seeks the paleblood. However, it isn't about the Hunter. That is, the player may interact with the world, but her character is caught up in something much bigger than herself. The characters with names, motivations, and backstory are the ones who create the tapestry and world that the player explores. It's highly enjoyable, but it mostly works because it isn't about the way the player character changes through the course of the journey. Sekiro tries to blend the two, and I don't think it fully succeeds.
It tells Sekiro's story competently enough, inasmuch as the plot points are clear(-ish) and give strong motivation for your objectives. However, there isn't a lot of emotional grounding. When it comes time to decide whom to betray, there isn't any backstory to give the choice emotional weight. I could pick one of four options for how to reach the end of the story (and then watch the others on YouTube) without having any character-based reason for choosing the way I thought Sekiro might. Since he's such a stoic character, I wasn't able to "read" him in any significant way. This is, perhaps, the biggest flaw of the game. Ghost is replete with emotional moments. There's genuine pathos when a friend dies horribly, and I really wanted to help Yuna whenever her missions popped up, as I viewed her as a great ally. Jin grows and learns as a person through the course of the story, and with the superior cinematography and editing of the frequent cutscenes, I felt much more connected to him. Sitting and composing haiku in the forest, giving time over to watch his naked self contemplate important thoughts while in a hot spring, listening to him discuss ideas and stratagems with his friends--these are the components of a strong connection with a character. There's an emotional vulnerability to Jin that Sekiro simply doesn't have. There's nothing wrong with a stoic, resolute character--but he certainly isn't one I'd want to watch a movie about. I like Sekiro because I can play as Sekiro; I like Jin because I feel for him and see parts of myself in his struggles.

Final Thoughts

It shouldn't surprise you to know that I don't recommend one game over the other. They're both incredible, and they both do their jobs with stunning aplomb. Neither is perfect, and I think both should be played by anyone interested.
Perhaps the supernatural dive into Japanese mythology (complete with the slaying of a dragon by the end) is more interesting to you: In which case, Sekiro is the better choice. But maybe historical fiction with a bit of ancestor-help-as-gameplay-mechanic intrigues you more: Take Ghost of Tsushima, then. Either way, you'll have an enjoyable experience. Despite how many times I died because I hit the wrong button thanks to the control scheme of the other game, I'm really glad that I played them this way. Where one lacks, the other shines, and vice versa--though I must emphasize again that they are both superb games--and I think anyone interested in spending some more time in the Land of the Rising Sun could do worse than playing one of them. Or why not both?

---

Note: Mormonism is capable of sustaining a lot of different views and attitudes; what I have almost exclusive contact with is the Utah County variety, which is its own unique brand of the religion. Additionally, I'm speaking from personal, lived experience and the perceptions I've formed. Others who've been a part of this religion as long as I have--or longer--may remember and view things differently. Obviously, I'm speaking for myself and not for the Church itself, and there are plenty of people who feel differently than the mainstream Mormonism I'm painting here. Exceptions to what I'm discussing are what give me hope.
I'm a member of the Church of Jesus Christ of Latter-day Saints--a Mormon--and I don't view politics the way the majority of my local congregants do. If I had to peg my personal conception of Mormonism, it'd probably be closer to an LDS liberation theology than to where many might expect a Mormon to land. Like any honest seeker of truth, my understanding of the world shifts and changes as new information comes in. My feelings and ideas also change--there was a time, for example, when I believed that global warming was a hoax, simply because I thought that I was a Republican, and Republicans denied the clear scientific evidence--so I'm writing this not as an endpoint of my thoughts but as a waypoint, one spurred by recent events and disappointments. It's part of my own journey. What I'm doing here is trying to answer a question I found in the comments on one of Pat Bagley's tweets (which is funny and which--to only fuel the irony of this post--I'm linking to but not sharing outright, because it has swears and, as a Mormon, I've issues with that). In it, Pat "translates" Evan McMullin's tweet expressing his disgust at the police brutality against a senior citizen during the recent police riots. Among Bagley's comments is the one I have as the image at the top of the post, from @the_real_scott: "Speaking of Mormonese, I can't understand the Mormon ease in voting for something that is antithetical to everything they say they believe morally. I really don't get how they support Trump's lies, crimes, and overt racism." Good wordplay there, and it shoots straight at my own questions about how Mormons feel about the impeached president. First of all, the majority of Mormons seem to be okay with President Trump. Despite his bragging about sexual assault--revealed before the election happened--his impeachment, and a whole catalogue of other horrors and abuses, Mormons are poised to vote for him again in November, based upon polls taken at the end of May 2020.
And though they've not loved him the way Mormons usually kowtow to Republican presidents, they still abide his presidency by an almost two-thirds majority. (Admittedly, that particular stat comes from 2018, and opinions can change.) In short, the impeached president's brag that he could shoot someone on 5th Avenue and not lose voters has, metaphorically, held true with the majority of members of the Church of Jesus Christ: Despite his clear disdain for religion--using it as a prop to shore up his Evangelical base--as well as his frequent maligning of Mormon-favorite Mitt Romney, President Donald Trump remains popular among the pious. It should be clear, if it weren't yet, that I view the impeached Donald Trump as a danger to our country and a "king of shreds and patches," to quote Shakespeare. He took a position he was not qualified for, was put in office against the wishes of the majority of voters, and has done a worse job as president than I anticipated--which is really saying something. As a human, he's undignified, incapable of coherent thought, and an embarrassment. And, as much as it might pain him to hear it, for Mormons, I don't think it's about him. For some members of the Church, it wasn't about Trump; it was about his competition. To many Mormons, voting for Trump (which both Mormon-heavy Utah and Idaho did in 2016) was more about voting against Hillary Clinton, whom they viewed with suspicion (at best) and outright hostility (at worst…and at more normal levels, from my experience). It feels like much of the AM dial in Utah is dedicated to conservative talk radio, and talk radio notoriously despised Clinton, whom it viewed as an Obama surrogate (among other things). Right or wrong, the perception of Clinton as somehow even worse than President Obama was definitely part of the milieu in Utah County circa 2016.
The case against Clinton was manifold, but the claim I heard a student make that continues to haunt me is that she was "overqualified" to be the President of the United States. And, of course, the sarcastic catchphrase of the election--"But her emails!"--was viewed not as conspiracy-theory bleating but as a coup de grâce for voting red. Abortion is a flashpoint for a lot of members of the Church: The Church is opposed to at-will abortions, so voting for a candidate who embraced the continued legalization of abortion was a non-starter. Marriage, another bastion of Mormonism and an area where the Church feels constantly threatened, was also brought up against Clinton. I saw people deride her for staying with the impeached Bill Clinton, despite his highly public affair. I also heard people argue that Bill was a rapist, and that therefore Hillary should not be president. (I haven't heard whether the sixteen allegations of sexual misconduct against 45 have distressed these same people or changed their opinions on the toupee-wearing jack-o'-lantern.) Trump is on his third wife and has admitted to extramarital affairs--including a large-scale scandal with a paid-off porn star--but I've not heard much from my conservative friends about whether that has changed any feelings. Despite all of this, Clinton is no longer running (though I hear enough about both Clinton and Obama from conservative defenders of the impeached president that I sometimes wonder), and so voters for Trump no longer have to be his supporters, right? Well, this is where it stops being about Donald Trump, at least from what I can understand. It's not his personality but his politics where a lot of Mormons align with him.
Yes, on the whole, Mormons are opposed to Trump's stance on refugees--consider Governor Herbert's request at the end of 2019--and they aren't fans of his blatant sexism (I guess; Mormons have a really strong definition of gender roles, but they don't like it when people are mean about those sorts of things). Really, it's more of a "hate the sinner, love the sin" sort of approach. The death of Antonin Scalia--and the Supreme Court seat McConnell and other Republican senators held unfilled until after the election was over--appeared to me as one of the deciding factors for a number of people: Better to have a spray-tan aficionado in the Oval Office and a conservative Justice than a competent Commander-in-Chief who would put a liberal Justice in place. And so we hit the paydirt of what Mormonism as a political force means. I personally think that the politics of Mormonism is divorced from the theology--as I mentioned before, I lean toward a type of liberation theology, rather than the prosperity theology that has been a part of Mormonic politics/culture for as long as I can remember--and that can, in part, be laid at the feet of President (of the Church) Ezra Taft Benson. His cold-warrior approach to the way the world worked in his time gave a lot of grist to the conservative movement, including his proclamation that the Constitution is a "heavenly banner". (I personally don't know that I want a banner in heaven that enshrines slavery, grants three-fifths personhood to Blacks, or busies itself with letters permitting piracy…but to each his own, I guess.) Don't get me wrong: I'm a fan of the Constitution. But I'm not a fan of treating it as some sort of extracanonical scripture (that's what Shakespeare's for), sacrosanct and above reproach. President Benson wasn't alone in this--we've had a long-standing love affair with conservatism in Mormon history.
Heck, BYU's no-beard policy came in response to counterculture activism in the 1960s and the overall association of hippies and communists with looking less well-groomed, facial hair included. What better way to show we're anti-communist than by keeping our faces clean-shaven? The point is that, since at least the mid-twentieth century, Mormonism and conservatism have been growing together. That, however, doesn't explain all of it… From what I can tell, Mormons really want to be a part of the Christian name brand. I wrote about my own feelings on this (before the Church came out and made it a verbal taboo to use the nickname "Mormon"), and they haven't changed very much. However, part of my argument is that, aside from a superficial dictionary definition of the term Christian, Mormons aren't Christians. And we're definitely different from the evangelical strains of American Christianity. We members of the Church of Jesus Christ of Latter-day Saints won't be accepted as part of the body of Christ. Though old, this article from Michelle Vu at The Christian Post really puts a finger on the issue when she quotes Dr. Richard Land's analysis. We're considered a fourth Abrahamic religion: Judaism, Christianity, Islam, and Mormonism. However, going it alone is hard to do, especially when there are areas of commonality--a love of Jesus, a hope to do good, a desire for divinity and a blissful afterlife--that make Evangelicals appear to be natural allies in a world we've been taught to fear, reject, and help save. The marriage of so-called "conservative values" and the Evangelical Right, along with its fusion to the Republican party, has created a web of loyalties and assumptions that Mormonic politics has embraced almost wholesale. This is, to finally get to the answer to @the_real_scott's original question, why Mormons are at ease with Trump. It isn't Trump that they're at ease with: It's the initial next to his name.
It's the Republican party that Mormons like. Sure, there are plenty of instances of disagreement--after all, Evan McMullin snagged almost 22% of Utah's popular vote in 2016, showing a very strong resistance to picking Trump. In fact, McMullin is an interesting case, because his showing proves that some (quite clearly not all) members did take issue with Trump but still wanted their conservative views intact. They felt that they were presented with two evils, and so decided to choose neither.* Had those who voted for McMullin instead picked Clinton, Utah would have gone to a Democratic candidate for the first time since LBJ.** Of course, they picked McMullin because they wanted an alternative to the personality, not necessarily the politics, of the GOP and Trump. From what I can tell, the reason Mormons will vote for Trump again in 2020--and, since it's 2020 and everything is topsy-turvy, probably in higher numbers than four years ago--is that they have long treated conservatism as a shibboleth for their religion. The broad strokes of Evangelical politics and right-wing thinking have enough religious parallels that members of the Church of Jesus Christ of Latter-day Saints will go along with almost any candidate with an R next to his (almost always his***) name.

---

* I get the idea of voting one's conscience: I would argue that one's conscience should put the moral "Don't vote for fascists" before "smaller government, lower taxes!" But that's just me.

** What's interesting to me isn't the infrequency of Democratic votes, but when they happen. In Utah's whole history, the state has voted for five Democratic nominees across a total of eight elections. The remaining twenty-three elections all went to the Republican. And whom did they vote for?
Well, in the twentieth century, they went with Wilson--who won because he "kept America out of the war" and then sent Americans to war shortly after his second inauguration--before going along with FDR all four times. They even voted for his vice president. Utah didn't even vote for JFK, yet they helped rehire his vice president. I wonder if it had something to do with their perception of how the wars were progressing. I'd have to do more research, but I think that's fascinating. Oh, and did you notice how safe Utah is for Trump? There's no doubt that the Beehive State is securely in the impeached president's pocket. No doubt at all.

*** Obviously, there are plenty of women in the Republican party and in the Utah political system. But there's definitely a preponderance of men. Also, the curious case of Ben McAdams versus Mia Love deserves more digestion than a footnote can handle, but it is absolutely worth mentioning that there is a Democrat from Utah in the House of Representatives. It's also worth pointing out that he ended up there because he had 694 more votes than Mia Love. And, to be honest, I was positively gob-smacked when I heard that McAdams won. The world is filled with all sorts of exceptions and unexpected turns, isn't it?

---

Hoo boy. Who could have foreseen that putting a thin-skinned narcissist in charge of the country would cause all sorts of butterfly effects throughout our culture and society? To be fair, the leader of the United States is going to set the tone for discourse regardless of the particular (shall we be generous and say) eccentricities of the person in the Oval Office. One of the reasons people consider things like gender or race in a candidate is that those factors can change how tone comes across. (Look at the way Jacinda Ardern, prime minister of New Zealand, reacted to an earthquake during an interview; also consider, perhaps, the way President Trump chose to behave when warned against looking directly at the sun.)
Despite the hack D'Souza claiming he understands the roots of Obama's rage, the forty-fourth president maintained a calm demeanor in almost all circumstances. (This early thinkpiece about the reasons for that might be worth your time; also recall how he was ridiculed for weeping over the staggering loss at Sandy Hook Elementary School.) My point is, what happens on Pennsylvania Avenue tends to have repercussions all over the place, including in the digital sphere. While Barack Obama utilized the nascent social media and digital domains to his advantage in 2008, the tech world morphed immensely during his tenure. By the time 2016 came along, Russian interference via Facebook and other social media platforms only exacerbated what was already the clear trajectory of subscription-free websites: Divisiveness makes money. The businessman-turned-politician whose best skills lie in exploiting divisiveness rather unsurprisingly became the GOP nominee for the presidency. He lost the popular vote by over 3 million votes and became president anyway, highlighting additional problems that I'm not getting into here. The point is that without the digital terrain of the mid-2010s, I don't know if we'd have the current political landscape. Trump owes social media his presidency as much as he owes Putin. So it's not surprising to me to learn that Trump, at the time of this writing, is poised to sign an executive order regarding social media sites, in effect regulating what they allow on their private platforms. There is an irony here that I've seen noted in other places, and though it's a qualified irony, it's worth pointing out: To many conservatives, governmental regulation is anathema. We all recall when Rick Perry couldn't remember which regulatory agency he would have scrapped had he gone on to the presidency. 
Reagan's poison is conservative doctrine now: "The nine most terrifying words in the English language are: I'm from the Government, and I'm here to help." Conservatives have long run on the platform of smaller government (which has its merits) by insisting that they should be put in charge of the entity they have nothing but disdain for (which does not have its merits). Much like having teetotalers run the Department of Alcoholic Beverage Control, there's something to be said about putting people in charge of a thing they don't believe in--and it isn't a nice thing to be said, either. Hence this irony: A president who has promised to repeal two regulations for every new one instituted is insisting on additional regulation. Now, some people may agree with his move--may think that his executive order directing the FTC to force social media platforms to moderate their content according to their own guidelines is a good one. But the irony isn't in the order's value; it's in the fact that a regulation is being forwarded at all. I'm not going to waste time asking which two regulations Trump will strip to offset this new one--they'll probably be environmental or emission regulations--as I don't think there's a one-to-one (or, more accurately, a two-to-one) connection between these things. Here's a shoutout to 2015, when Senator Thom Tillis (R-NC) said that he didn't mind it if Starbucks no longer expected its team members to wash their hands after using the restroom. Making a regulation to reduce a regulation is still a regulation. There's more going on here, though: The president's original tweets on the subject are filled with inaccuracies. Not only is mail-in voting successful; though mistakes can happen, there's no evidence that it is anything other than the right choice to ensure our democracy has a voice in November as COVID-19 continues to upset almost every aspect of our daily lives. 
More than that, however: his claim that Twitter putting a "fact-check" link on his tweet is tantamount to violating free speech is honestly, nauseatingly stupid. Not only is it a completely wrong sentiment, it beggars belief that the man in charge of the country--holding the highest office created by the Constitution--is so constitutionally ignorant that he mistakes being corrected for "stifling" free speech. (Cue the Neil deGrasse Tyson gif.) I'm not a Constitutional scholar, but I am a social studies teacher. I have had to spend time thinking about how the Constitution works, teaching the Bill of Rights to my students, reading history books that trace the way the Constitution has been seen, and studying how the country has run itself in the past. I'm not claiming an absolute authority on this--I'll leave that to Agent Orange--but I do claim that I've spent more time considering it than, say, a run-of-the-mill devotee of Sean Hannity or Rush Limbaugh. And it's clear that, though it may not be a popular way of viewing things, positive and negative rights are a great way of divvying up the Bill of Rights and some of the later amendments. (Recap: Positive rights are those things which the government is obligated to provide, and tend to be sparse in the US Constitution; negative rights are areas where the government is restrained in its power, and are more frequent, particularly in the Bill of Rights.) When it comes to the freedom of speech, the Constitution doesn't guarantee it unconditionally. In fact, it's a perfect example of a negative right: It is a restriction on governmental action in the face of the individual's expression. It is not, however, a ban on governmental action. We all know that you can't yell "Fire!" in a crowded movie theater and walk away from any sort of legal prosecution for the action. There are times when speech can be infringed and censored by governments (local, state, or federal). 
Some of them, historically speaking, have been abuses of power (consider the effects of the Alien and Sedition Acts early in the country's history, or what happened to Robert Goldstein when he ran afoul of the Espionage Act by making The Spirit of '76 back in 1917). In other areas, however, greater societal considerations outweigh individual rights. (The yelling-fire example is the quick example of that.) The larger takeaway, however, is that the guarantee of governmental non-interference with speech is not something that exists inside of my house, for example. If someone came into my house so that he could gas on about how Trump really is making America great again, I would be within my rights to tell him to shut up. I could even excuse him from my home. I would not be violating his First Amendment rights because I'm not the government. I'm a private citizen. The First Amendment allows speech to happen, yes, but does not require anyone to listen. And, if my platform doesn't want to embrace that speech, I don't have to. Social media has made this trickier: Is this a public space, or a private one? If it's public, then maybe there are some other considerations to weigh. If it's a private space that everyone is allowed to see, what's the difference? Trump has flouted Twitter's rules regularly, which should have seen him excused from the platform. If it's a private company, making rules about what can be discussed or said on its servers, then that's a digital domain tantamount to my living room: Follow the rules or exit the premises. But if Twitter is a public place, can it do the same? Can public places--parks, libraries, seats of government--be places where abuse, violence, or depravity are enacted without a reprisal from the people's representative government? I personally don't see Twitter as a digital version of a public place--not while it makes billions of dollars by selling ad space. 
It's clearly a for-profit business, and though the service may be something the public benefits from (as I often do by using the product), I'm certainly not seeing any of that profit in my bank account. (I could, I suppose, if I invested in them.) Are Facebook and Twitter extensions of the digital commons? I would argue no, and Mark Zuckerberg agrees with me to an extent, despite his recent insipid comments about how Facebook isn't an "arbiter of truth" (which is obviously true; that he uses it to try to keep his hands clean when his platform is routinely abused demonstrates that he'd rather not reflect on how perverse his worldview is). There are things that get an account banned from Facebook and Twitter. There are community guidelines. There are lines in the sand (that can be so conveniently erased) that these platforms disallow people from crossing. It is more profitable for Facebook specifically (though Twitter is in a similar vein) to allow divisiveness than it is for them to enforce their own community rules and regulations. An untended garden doesn't flourish with flowers; it drowns in weeds. Though this is a nuanced and difficult topic, I don't think it's an impossible-to-understand foray into metaphysical ontology or pandisciplinary exegesis. It's something that takes some time to chew on, disagree with, change one's opinion about, and move around as the idea percolates. It is, in other words, far beyond the grasp of the current ambulatory, toupee-wearing traffic cone that will forever be called the 45th president. Honestly, it's hard for me to adapt to the intellectual whiplash between forty-four and forty-five. While his interpretation of the Constitution could be held up to scrutiny and criticism (and often was), no one could honestly say President Obama hadn't studied the document. (Plenty of people said it dishonestly, obviously.) He was, after all, a professor of constitutional law at the University of Chicago. 
President Trump, however, has asserted that, "When somebody is President of the United States, the authority is total." He seems to view criticism as personal attacks, with an unwavering expectation of loyalty from those who've allied themselves with him. Early on, his administration was hammered for using the term "alternative facts" to describe the surprisingly belligerent Sean Spicer's assertion that Trump's inauguration was the best attended of all time. (A quick refresher and analysis can be found here.) Gaslighting happens on a regular basis from Trump and his cronies--do I even need to link to the injecting-disinfectant comment and his flimsy "it was sarcastic" excuse? The man is untrustworthy in almost every possible way, and yet he maintains a grip on power and domination over his party. This is not simply because his politics don't align with mine, though that absolutely informs how I view the situation. I had very few problems with President Obama--though his failure to close Gitmo and the increase in drone strikes that killed innocent children overseas troubled me, and his educational policies were a train-wreck (albeit better--barely--than Bush's). Even when I disagreed with the politics or the decision, I at least was able to view him as a competent, capable leader. His ability to improve how America was viewed by other countries was an important indicator to me that he was on the right track. Now, even when I think Trump might be making the right decision--locking down the country in response to COVID-19 was the right choice…granted he did it far too late, has assumed no responsibility for the negative consequences his policies generated, and doesn't seem to care too much that we had 9/11-levels of dead Americans daily for a week or so--I consider his "good" moves accidental spasms rather than calculated moves. Have I benefited from the slight tax relief that came because of his tax breaks? I…guess? It's been so slight that I didn't really notice. 
Was the stimulus helpful? I suppose; though I'm still mystified that $2 trillion can be so ineffectively redistributed (another irony of both the Bush and Trump administrations: Redistribution of wealth during times of success is communism; redistribution to the wealthy during times of crisis is "the right thing to do"). Ultimately, this little rant doesn't do much. I've a right to say it*, of course, and you have a right to disagree. If you do, I encourage you to write your own 2,000+ word afternoon diatribe, complete with a footnote and twenty-three links to sundry articles that back up your position. I promise I have the right not to read it. --- * Though if it somehow violates the terms and conditions of Weebly, the company that hosts my website, I would fully expect it to be removed. I wouldn't be happy about it, but it couldn't possibly be censorship if I had broken their rules. Like most people, I've found that the news of the spreading coronavirus has led me to some serious life reflections and considerations. What is essential? What am I prepared for? What do I view my life to be in the short term? How can I keep my family safe? For all of the unanswered questions, there's one that seems to nag at me the most, waiting in the wings: Is this it? Quite some time ago, I abandoned any millenarian theological interpretations of world events. My study of history--especially of the last hundred years--has shown me that as bad as things are, there have been times in the past when things were significantly worse than now. As a Mormon, I'm part of a millenarian church, but one that's been rather cagey about the end of the world, for the most part. After all, plenty of people--inside and, of course, outside the Church of Jesus Christ of Latter-day Saints--have made predictions about the pending apocalypse. My favorite would have to be the Great Fire of London in 1666. 
England, which had long thought of itself as God's Chosen Land™, was on edge about the whole year "666" thing. (I say "England", but really it was the more puritanically-inclined people; those who were less religiously devout/superstitious likely didn't mind it as much.) What better year for Satan to really show off his demonic power than in his own year? Dire warnings about God's judgment were rife, particularly since the monarchy had only been restored six years prior and was still a sore spot for the revolutionaries who had believed in Cromwell's dictatorship. With a plague outbreak the year before, London was feeling like…well, like it was the end of the world. On 2 September 1666, in the king's bakery on Pudding Lane, a fire broke out. Due to a long, hot, dry summer, London was ripe for the roasting, and soon half of the City was on fire. There were attempts to demolish buildings with gunpowder to provide a firebreak (which is, in hindsight, rather an amusing picture), and despite these best efforts, by 4 September 1666, only a fifth of London remained standing. Even St. Paul's Cathedral was destroyed--the one that we all know and love today, the one that survived the Nazi blitz of World War II, was erected on the same spot in the aftermath of the Great Fire--and though only a handful of people died in the blaze, tens of thousands were left homeless and destitute. It was a catastrophe by every mark. (If you want to read more, here's a nifty article.) Who of that time wouldn't look at the great city of London succumbing to flames and think, "This is the end of the world"? On the first day of July 1916, the British launched a bloody and ill-fated attack on German positions near the Somme in France. The battle turned into a lengthy bloodbath, the likes of which have but rarely been seen since. When I think of how we're behaving now, how convinced we are at the prospect of facing the End Times, I think of this footage. 
Filmed at 0720 on 1 July 1916 by Geoffrey Malins, this footage of the explosion at the Hawthorn Redoubt shows 40,000 pounds of explosive detonating underground. Watch this short clip and ask yourself: What does the end of the world look like? Surely an 80-foot-deep crater, longer than a football field, would be part of it? I see an image like this, and I'm reminded of Book 6 of Paradise Lost, when the rebel angels' cannon-fire pushes the loyal angels' ingenuity, and they begin to hurl entire mountains at one another: Forthwith (behold the excellence, the power When I think of the End of Days, I consider how, in the years between Hitler's rise and fall, human beings were turned into purses and riding pants, how Japan's Unit 731 experimented on Chinese prisoners with anthrax and vivisection, how Turkey yet denies having slaughtered a million Armenians…
…if that's not enough to spur Christ's return, why would a twenty-first century flu be sufficient? There's an entire cottage industry of predicting (thus far, wrongly) the end of the world, the Rapture, whatever one wishes to call it, up to and including the creation of a pet-service website for after the apocalypse comes. The Mayans were believed to have predicted the end of the world in 2012, of course, and hardly a Sunday goes by without someone lamenting how much more wicked the world is than in those idyllic yesteryears of yore. But I just don't know if that's true. Yes, the world is different, but it's been in perpetual evolution since Day One. Is it more wicked than the wholesale enslavement of 16 million human beings from Africa? More wicked than systemic exploitations that led to children dying in mines and factories? History is replete with heinous behavior; why should this be it? The Mormon in me wants to believe that the end is nigh because there are many promised blessings. But the humanist in me wants to believe that we could have chosen differently; we could have aimed to save people, save our planet, save our future--that Christ would come not as a deus ex machina to prevent us from self-annihilation, but because we'd made the world safer, kinder, more loving, more caring, less violent, more equal…more heavenly. When I think of all the despicable things I know from my small store of historical knowledge, I can't believe that twenty-first century problems are what St. John the Beloved was looking at in his great uncovering of the end of the world. Maybe what really worries me is that if the Holocaust isn't sufficiently evil to trigger the Second Coming, what will be? For an amateur, armchair paleontologist (I would say dinophile, but that's not actually a word, and, strictly speaking, it means "lover of terrible [things]", which doesn't sound particularly pleasant), now is a great time to be alive and loving dinosaurs. 
There are, according to Steve Brusatte (in his book, The Rise and Fall of Dinosaurs, which you should read, because it's good), about 50 new species discovered every year. This means that, at the rate of about one a week, a fresh dinosaur is described.
Most recently, a bat-winged creature called Ambopteryx longibrachium (see the picture above) has caught some attention. It isn't the first bat-winged dinosaur ever discovered--that happened back in 2015, when scientists described Yi qi. And that's kind of my point: It's really hard to keep up with the past. This isn't just a phenomenon I suffer from with dinosaurs; being a history aficionado has the same peril. I recently learned about Virginia Hall, an American who spied in France during World War II. A book about her life was just released (I haven't read it yet), and, according to the NPR article that let me know about her in the first place, there are three books about her, as well as two movies in the works. This, of course, is wonderful. Far too often the butchers and killers and maniacs of the war are the focus of our stories. And, as most of the soldiers and all of the generals were male, it's particularly nice to get a story about the contributions of women in the war. Moreover, I also have a hard time keeping up with already published (and purchased) books that cover the topics I'm interested in. I have two books about living in Elizabethan England, too many about Shakespeare to even catalogue from memory, and a solid handful of Milton-related works. Most of these were purchased because I thought they'd be interesting and because I believed (as I always do) the lie I tell myself that I will find a way to squeeze in a bit more reading, one more book. Thinking back over my own past, there was a time when what I liked was more niche than it is nowadays. As a kid, I loved reading Spider-Man novels--not just the comics, which were too variegated for me to keep track of--because I could buy them as they came out. In the mid- to late nineties, there wasn't the glut of interest in superheroes that we're enjoying (and I am enjoying it immensely) today. Now, however, there are so many ways of getting into the spider-verse that it's honestly intimidating. 
I don't want to say that this is simply because of nostalgia-glasses, though that certainly is a possibility. I was a pretty oblivious kid (I didn't, for example, know that eighth grade GPA didn't "count" until the third term of that year was over), so there's a good chance that more was happening that I simply wasn't aware of. Nevertheless, I think it's fair to say that there really is just a lot more output of content now than ever before. Clearly, the internet is the conduit for this, but I'm still convinced that part of the reason this feels like the case is that there is now a way for smaller voices to be better heard. I mean, not in the Spider-Man case: Intellectual properties tend to be pretty tightly regulated. But just in general, I'm confident that people were making stuff that they couldn't get into the mainstream, and so their work languished. So, I guess it's actually pretty hard to assert that we have quantifiably more stuff. The difficulty remains, however: Keeping up with the stories of the past, the new ideas of our future, the important aspects of our now is no easy task. It's beyond what a full-time consumer of culture could ever hope to accomplish, like drinking the ocean. Then again, who needs to drink it? We can enjoy it in many other ways. Maybe that's what I should focus on, instead. To say that the fourth track on Dave Matthews Band's Before These Crowded Streets is anti-imperialist is as uncreative as coming up with a band name like "the Dave Matthews Band". In my mind, however, this song's power is not just in its message but also in its delivery--its simplicity is its power; its complexity is its worth. To start off, "Don't Drink the Water" has to be looked at from an African point of view--and by that I mean a Southern African point of view. Dave Matthews was born in Johannesburg, South Africa, and spent time, off and on in his childhood, in that country. 
In other interviews (which I couldn't find right away, so this may be hearsay), Matthews acknowledged that the sonic tapestry of South Africa influenced him throughout his life, and Carter Beauford, the band's drummer, locks into that pulsating rhythm in the song. The drum line--a couple of bass kicks and then some distinct snares--mirrors the guitar line. Matthews's earlier work didn't see a lot of unique tunings--no capos, no open tunings, and until Everyday, he didn't use electric (or baritone) guitars--so this song was, reportedly, called "Drop-D"* during the production of the album, as it's the first to really feature this alternative tuning (though "Crush" is also in drop D). This is how the guitar and the drums end up as the rhythm section: Matthews's striking of the low D is in time with Beauford's kick and Stefan Lessard's bass D. Instead of allowing those three parts of the band to break into lead guitar, bass, and drums (the last two often being the rhythm section), the song pulsates with all three instruments marching along in tandem. Despite this potentially static beat--written in 4/4 time at a scant 84 bpm in the album version--the intricacies of the bass line (freed up to be more melodic and riff-laden than the guitar part, for once), the droning of the violin, and the contributions of both LeRoi Moore's saxophone and guest artist Bela Fleck's banjo all interweave in such a way that the music becomes layered and complex. One could pick a specific instrument and pay exclusive attention to it each time one listened and glean new musical connections. During the third verse, an electric guitar with distortion and a wah hammers on harmonics, again providing texture and variability in what is, for most of the guitar part at least, a one-chord song. In fact, the majority of "Don't Drink the Water" is a D5 - G5 - B minor affair, with the verse running through the D5 until the pre-chorus begins ("So you will lay your arms down" is the first one) by playing the G5. 
The droning effect of this song makes the shift from D5 to G5 striking and refreshing--as if the brooding groove of the verse can only pound on the listener for so long before relief needs to come in. However, it's only two measures before it's back to the D5--this is repeated throughout the pre-chorus--and then the verse returns. It isn't until the chorus (finally dragging in at 2:08) that a new chord is added to the vocabulary, the B minor. Though the guitar brings this in to change that drone, it's only for two measures before it returns to the G5-to-D5 progression. The point of all of this is to say that the guitar is painfully simple throughout almost all of the song, yet it remains captivating despite all of that. The album version of the song (used above; the music video is an interesting, abbreviated version that's worth looking at) goes at a slower, more inexorable pace than the live versions (also worth hearing). This slower pace turns the thudding of the rhythmic triad into a pounding wall of inevitability, one that underscores and enhances the theme of the track. That leads me to the lyrics of "Don't Drink the Water": Come out come out At the beginning, I pointed out that it's clear that the song is anti-imperialist. Phrases like "All I can say to you my new neighbor / Is you must move on or I will bury you" make it pretty clear what's going on. But the way these lyrics are constructed is what fascinates me: Matthews has taken on the persona of a colonizer, of a greedy conquistador. Rather than speaking about imperialism, he's speaking from it. Though I can't be certain, I feel like growing up in apartheid-era South Africa surely gave Matthews a different lens through which this song is being cast. The Dave Matthews Band, at the time of this album's creation, was a five-man band--two white members (Dave Matthews, guitar; and Stefan Lessard, bass) and three Black members (Boyd Tinsley, violin; Leroi Moore, saxophone and others; Carter Beauford, drums). 
Racially and musically diverse, the Dave Matthews Band is, in many ways, a repudiation of the world that Matthews knew growing up. I don't know when I started to view imperialism with skepticism, though I'm certain songs like this were instrumental** in changing my assumption that the course of history was blameless. The music video of "Don't Drink the Water" gives the song an Amazonian flavor, but the song applies to Manifest Destiny--the way I used to take it, when I was younger and bothered to think about anything--as well as to any other example of greed-as-motive-for-atrocities. I feel like the Manifest Destiny interpretation is the one that I, as an American living in the West, am most responsible for and benefit the most from. As I've driven around my state, looking at the scrub oak and the variability of the Wasatch, the acres of farmland and the quiet cold of snow-swept mountains, I have thought back to the earlier inhabitants. As urban sprawl swallows up more miles of "empty" land, I can't help but think of the lines "And here I will spread my wings / Yes, I will call this home." The chilling dismissal of concerns ("What's this you say? You feel a right to remain? / Then stay and I will bury you" and "I have no time to justify to you / Fool, you're blind. Fool, move aside for me" are two quick examples) exemplifies what I hear in the rhetoric of imperial Europe. Progress, of course, is the banner under which these behaviors and beliefs live, and anyone who's blind to progress must be moved aside…or so the story goes. Which pushes me to the outro--the part where, particularly live, Matthews' anger at the injustice he has been satirizing boils over--and the complete dropping of pretense. (I should say that, on occasion, Matthews will play the chorus one extra time, substituting some of the lyrics of "This Land Is Your Land" for his own words, the effect of which is a haunting condemnation because of the context that surrounds it.) 
As the last chorus ends, Matthews sings, "I can breathe my own air / And I can sleep more soundly / Upon these poor souls / I'll build heaven and call it home / 'Cause you're all dead now." Atrocities like the Trail of Tears and recent injustices like Standing Rock are, in my mind, sudden snapshots of would-be ghosts, of a people that had gone nowhere but here and was moved aside for the expansion of the imperialists. For a second time, here are the lyrics of the outro: I live with my justice The rank honesty--the mask of satire has slipped into outright scorn--is shocking. The musical effect here is striking as well: Alanis Morissette sings the melody with Matthews, though an octave higher, to provide an eerie doubling effect. More than that, however, a new chord is introduced, one which jabs at what Matthews is singing here. Instead of a D5 chord (with that 6th string still thumping away), he takes the 5th of the chord (the A) and slides it up a half-step (to a B flat). This discordant chord (try it out on an instrument and hear how grating it is) is the crime of imperialism. It doesn't look like anything is too wrong; it's really close to a resolved chord. But it's completely jarring. It grinds away, creating an antagonistic clash to go along with the naked error that pushed so many millions into forgotten graves. Whose justice reigns? My justice. What's the motive? My frenzied feeding and greedy need. Why are they doing this? Hatred. Jealousy. These dark emotions are spat out, as if we could perhaps excise them if we were only to try hard enough.
The penultimate couplet--"I live with the notion / That I don't need anyone but me"--is such a withering indictment of the "rugged individualism" by which "the West was won" that I have a hard time really saying anything more than what's already there. Our founding as a nation is credited to our founding fathers; our country has been defended by our men and women in uniform--the notion that the individual "I" has created this world is clearly a false one, yet it is one of our more beloved lies. The "self-made man" is, actually, not a thing--John Donne was right: No man is an island. But there's another possibility--faint and unpleasant--that what Matthews' persona is pointing at is the "me"…that the "me" is the only one that even matters. "Me, yeah…" is how he drives toward the end of the song (after the ominous warning "Don't drink the water / There's blood in the water"), turning again to this monstrous concept of personal exceptionalism and Machiavellianism qua truth and justice--the idea that the might of historical pressures and sundry conditions has made the right of the status quo. The cacophony with which the song ends--much like with "The Last Stop"--is a clash of cymbals, drum beats, screams, and warnings. Live, the song will pulse on for another couple of measures, ending where it began but with the B flat/D chord jangling everything else. In the album version, the song winds down and slides into an interlude. However, the deep marks--the menacing history--that the song points us towards shouldn't do anything other than carve a new empathy for others, for what they've lost, for what we've gained. Interlude Almost as if we need something to cleanse our palate, we get a sixteen-measure interlude. Different key, different time signature (the always-peculiar 5/8 time, until the last four measures, which are in 3/4 time). 
It's reminiscent of "#34" from Under the Table and Dreaming, with arpeggiated chords and the entire band weaving their unique brand of music into the shifting chord progression. Of all the interludes on the album, this one is the most necessary (with "The Last Stop" being a close second), if only because its simplicity helps alleviate the weight of the previous song. Indeed, I think the interludes are one of the most crucial aspects of Before These Crowded Streets, giving a logical flow to the order of the songs, as well as emotional breaks from the intensity the music can create. So far as I know, the band never performs these snippets of music--and that's a real loss. Pieces of the songs are audible in other--sometimes earlier, sometimes later--works, but I'm not aware of any other Dave Matthews Band song that draws on the interlude that follows "Don't Drink the Water". --- * Tuning a guitar to drop D is simple: The sixth string--the low E--is detuned a full step so that it's an open D instead of an open E. Because of how a guitar is tuned, this allows power chords (three-note chords: an octave with a fifth in between) to be played more easily and aggressively. ** Pun most definitely intended. Today marks the one hundredth anniversary of the end of the First World War. I have a hard time putting my thoughts together on something like this. Maybe some personal history will help me unpack the volatile and wide-ranging emotions that I'm feeling right now.
Growing up, history was not My Thing™. I didn't mind my history classes, but I pretty much blew them off. English was the only course in which I felt I did well, and the only time I transferred out of a class after it had begun was when I abandoned my Honors History in favor of "regular" history class. High school history is, for me anyway, a complete blur. I can't really recall much of anything that was taught there--sorry, Mrs. Kelsch. I put in minimal effort to get an A in my required history courses in college, again dedicating all of my mental energy to English and the math/science course that was most kicking my butt at that moment. On the whole, I remember only slightly more from my college experience than I do my high school one. Then I got a job teaching World History II and Language Arts 10 at the school where I still work. For state-mandated reasons, I had to go to night school over the course of a couple of years to pick up the equivalent of a minor in history. As part of this endorsement, I had to select two electives. The first one available was a course on the Second World War with a Professor Winkler. I'd taken an ancient history class from Professor Winkler before and I knew I enjoyed his style. I understood what he was after from his students, I liked his lectures (something that I never thought would be the case when I was a kid--liking to listen to lectures), so I figured that, if nothing else, I'd get something from my time with him. My whole world changed. Professor Winkler walked us through an extremely complicated time in the history of the world, keeping us moving through the different battles, with descriptions of the highlights and explanations that helped me to understand the scope and scale of the largest armed conflict in history. Part of what impacted me the most was the passion with which he taught. 
He was furious at the decisions and behaviors of anyone who committed an atrocity--"If we do it, it's necessity; if they do it, it's an atrocity"--and saved his greatest spleen for the architects of such destruction and cruelty. I still remember my surprise when tears leaked from his eyes during his explanation of the Rape of Nanking. In much the way Hamlet muses, shocked and a little ashamed, about an actor's ability to weep for Hecuba ("What's Hecuba to him, or he to Hecuba,/That he should weep for her?") at the end of Act 2, scene 2, I was left stunned. How could the long-silenced cries of those killed under the brutality of Imperial Japan still affect a person in the twenty-first century? While Professor Winkler didn't teach about the Holocaust--he said that it was a semester course on its own, something worth studying separately--I was genuinely impressed and moved by his teaching. The next semester, this time in a course on World War I, saw me, front row, laptop open and ready to take in what he had to say. I was not prepared for the amount of suffering that was to be described to me. As Professor Winkler laid the groundwork for the War to End All Wars, I found myself having a hard time coming to grips with just how bad World War I was. After all, I had "seen" what WWII was all about--understanding, of course, that one could study that conflict for an entire lifetime and still learn something new--and thought that we'd hit the apogee of human misery and suffering. Studying the First World War showed me that suffering can be wrought upon soldiers as well as civilians, and humans qua humans went through the nightmares of the first half of the twentieth century, regardless of whether they were armed, trained, or uniformed. 
To say the misery of the soldiers in World War I was somehow "less than" because they "signed up for it" (ignoring the propaganda and social pressures that essentially eradicated that possibility, and definitely setting aside the enforced enlistment of an entire empire, forcing those who would not be involved otherwise into the conflict) is a diminishment of the sacrifice of the men and women who died during that conflagration. Professor Winkler wept whilst describing the pleas of starving German children whose stomachs had been pinched by the British blockade which effectively starved Germany into submission. He wept at the idea of what the men in Verdun survived. He wept at the cold brutality of a war fought on erroneous assumptions. He wept at the genocide with which the twentieth century began. And he fumed at the waste of soldiers' lives that the generals seemed intent on pursuing. Professor Winkler showed me just how tragic a war can be. "Why study war?" he asked in the opening lecture. Then he answered his own question. "So that you can learn to hate it. Doctors study diseases not so that they can use them, but so that they can defeat them." I've taken few other lessons as deeply to heart as that one. So when I think of what World War I means--what it meant to those millions of men during the bleak years between June 1914 and November 1918--I have almost too much to say…so much, in fact, that it renders me mute. When it comes to the First World War, Americans' cavalier attitude toward the conflict is something that silently infuriates me. The war is old--a century is a long time--and though I've spent the last four years thinking to myself, Today is the centenary of some battle or other in the Great War, I know that very few do the same. I guess there could be some blame assigned to this, but it's a diffused enough blame as to be rather immaterial. 
I do know that, as I have the rare privilege of being a voice for the dead ("We are the dead"*) in that conflict, I take the responsibility to impress on my students' minds the gravity of World War I. In other words, I refuse to let the nearly sixty fifteen- and sixteen-year-olds who pass through my class each year leave without having a taste of the despair and horror that their ancestors survived. So now we get to today. It's both Veterans' Day and the 100th anniversary of the Armistice. Have you taken a full moment to silently contemplate it? Consider the poppies of the field, the blood-red reminders of the blood-letting. What have we done with the future that they fought to give us? A second world war--same people, similar causes, even worse destruction--and the second half of the century under the threat of nuclear annihilation. Cold wars. Genocides. Terrorism. Torture. Rape, rubble, and bones. Grim-visaged war has not smoothed his front**, and peace is a word that is scorned by those with the power to make it happen. The cannon of the war have fallen silent--you can hear that for yourself--but the lands bear scars that five hundred years will not efface. What do we make of it? Like the French Revolution, it's far too soon to see the effects World War I has had on history. What do we know of it? What do we care about it? One of the reasons that I finally went ahead and wrote my War Golem book in a quasi-World War I world was, in part, to try to communicate how that conflict matters. It's a way of me showing how the war affected me. But what do others care? What, to an American, is Armistice Day? These are questions that continue to haunt me. When I consider the basic nothing I know about World War I--with a recognition that it's a lot more than the average American--I can't help but ache with a sadness that I only get when I consider the first half of the twentieth century. I'm grateful that Veterans' Day will help raise awareness of the importance of this day. 
But I don't expect this to increase our cultural sensitivity to just how significant the Great War was and is. And that's a tragedy of a different kind. --- * Taken from the famous World War I poem, "In Flanders Fields" by John McCrae. ** See Richard III 1.1. 14 July 2018
Back in 1789, the French Revolution really took off (some heads) and majorly changed the world. The destruction of the monarchical reign in France was bigger than we Americans sometimes give it credit for, and it's cool to have been in Washington, D.C. (if only for part of the day) on "the French 4th of July". And that goes along with a lot of what I've been rolling over in my mind whilst in D.C. The ideas of liberty and freedom (which have slightly different meanings) are supposedly writ large in D.C., with the French version an even more radical one than what the Founders envisioned. While I didn't get to read the Declaration of Independence, I did get to sit in the Jefferson Memorial and read part of it; what seems clear and obvious to us now had to be, at one time, set down and explained--carved in stone, as the case turned out to be at the Jefferson Memorial. But what do we even mean by American freedom? I saw a quote whilst at the Capitol Building's exhibition, something about how America was the only place where a person can be free. Yet there are plenty of "free" places: Canada is a quick and easy example, as well as many of the European countries. Japan has quite a bit of freedom, too. Some might argue that we have "more" freedom, as if it's quantifiable. And maybe it is. But there are plenty of things to unpack there: Is more better inherently? Do various types of freedom change the measurement? Is it how equitably the freedoms are distributed? What about the praxis of freedoms? As I write this, I'm in the Baltimore Washington International Airport. 
To get into this area, I had to 1) purchase a ticket (without money, my freedom of movement is contracted almost to the point of worthlessness); 2) navigate a fairly complicated privatized system of check-in and baggage tagging; 3) process through the security system where I was not free to leave this laptop in my bag, nor keep the shoes on my feet or my phone in my pocket; 4) purchase a subpar meal without the ability to negotiate or barter (corporate policies most likely being what impinges there); 5) keep my mouth shut about certain topics (bombs, terrorism, hijackings); 6) refrain from loudly proselytizing while standing on a table at the nearby Potbelly's. While the list is far from comprehensive, it shows that there are a lot of things that an unfettered freedom can't really approach. Some of these strictures are at the federal level; others are Maryland's; some are corporatized; some are social norms that we aren't "supposed" to break. It's a normalized type of world, in a lot of ways: These things are taken for granted and we move on with life. I'm not saying these things should necessarily change (except the TSA: Those security lines don't have to be the humiliating process they've become--but since we're addicted to the convenience of airlines, there's little chance of changing it), but instead am noticing the small "erosions" of lost liberties (or freedoms, if you prefer) that we get in our country right now. There are larger issues than the fact that money is what's accepted and not the option to barter, but I think it's illustrative that we've normalized so much. Having just come from the World War I training, I'm reminded of how bad it was for some people in 1917 and 1918 who voiced dismay or disgust or anything other than full-throated support for President Wilson and/or the war. In one instance, a grocer was tarred and feathered because, when a person complained about the food quality, he said, "Don't blame me, blame --- ---- Wilson." 
(The blanks were in the quoted newspaper, so you can fill in whatever you want.) A pastor who preached pacifism was likewise tarred and feathered. There was a genuine paranoia and social expectation during the Great War--one that we saw resurface in the Second World War (most visibly manifest in the Japanese internment camps) and again later during the Red Scare--that caused a much clearer loss of American rights. Fortunately, the Sedition Act is no longer on the books--and other heinous laws that have been implemented have likewise gone away--and one can voice discontent without fear of immediate mob violence. But we can't simply say that "America means freedom," because that's too simplistic for truth to be inside of it. I wrestled with these emotions as I went through the Capitol Building today (not as much when I saw the far too small dinosaur exhibit at the Smithsonian). There are dangers in the world and unfettered freedom exposes a people to danger. I definitely get that. But the Sedition Act is no longer--what about the USA PATRIOT Act? What about the continued surveillance of people? What about the clearer threats to our democracy--any and every attempt to disenfranchise citizens, Russian interference, erosion in confidence in the press, a dismissal of truth in an era of "fake news"--that we aren't addressing? It's easy to blame the GOP (as they have full control in Washington) for not doing something, and the change truly should start there, but there are so many other things--small, seemingly inconsequential things--that aren't necessarily as big but can be just as important or worrying. After all, I'm only in line for the airport security once every couple of years. I'm on my phone every day. What freedoms are being impinged by both governmental and corporate entities that I don't even recognize? And what would Robespierre say about American (or French) freedom today? 
If he and Washington could come to 2018 and look at their respective countries, would they be impressed by the freedoms we have? Dismayed at what we take for granted? Embarrassed by aspects of their legacies? Desperate to explain themselves after having their descendants interpret their words and deeds and governments for the past couple of centuries? On this year's Bastille Day, as I leave the epicenter of American politics, I think about these questions. And I wonder. The trainings that we're getting are really enjoyable. Sometimes (most of the time), professional development is a necessary evil. Often it's a hoop to jump through, sometimes it's got ideas, occasionally you leave with something you can snag. This one is designed to break your carry-on weight with materials (I'm not even joking when I say that I'm thinking I'll have to load up a box and ship it to myself just to get everything home) and let you head back with greater insights into primary source documents and how to use them in a classroom. It's been fantastic, and the week is now only half over.
During our research time, I finally cracked and went with the sole English teacher in the group to the Folger Shakespeare Library. We didn't have a lot of time (sadly, though I plan on returning if I can), but we looked at the First Folio (my fourth), took pictures with a Shax bust, and listened to an old lady talk about Elizabethan theater for a few minutes. I also browsed the bookstore because I needed to scratch that itch: Walked out with a kitchen magnet and a book from my two favorite Shakespeare authors, James Shapiro (Shakespeare in America) and Stephen Greenblatt (Tyrant: Shakespeare on Politics). The second one was a copy that Stephen signed when he was here in May. That made me very happy. I mentioned earlier my disappointment at the lackluster World War I memorials that D.C. has, so you can imagine my appreciation when, after my coworker and I arrived at Arlington Cemetery, I got to see a display of World War I exhibits. Markers that led to Verdun, French helmets, and even some leftover barbed wire were there, as well as forthright facts--some more saddening than others*. We couldn't see the Women in the Armed Forces museum, though we peeked through the glass--I wish I could have spent more time there. I really…I really wish that sexism wasn't a thing. We saw the Tomb of the Unknown Soldier, complete with a sentinel marching his twenty-one paces, clicking his heels, and carefully adjusting his rifle. It was stirring and somber and significant. Once done at Arlington, we decided to try to find an ice cream place Gayle had visited when she was here a couple of years back. We ended up in Georgetown where we sat down at a nice restaurant and I had a croque monsieur--a much fancier one than what I ate in Bayeux--whilst we chatted. Going through Georgetown only cemented the feelings I've been teasing out as I've walked through D.C.: Washington feels like what Disneyland would be if it were a real city. 
It's surprisingly clean (I'm talking about the area around the Capitol, where I'm staying and working), people walk quickly, the Metro is a ride where you sit down and get zipped about, it's hot and muggy, if you see a kid, she's probably crying or tired or both, and there's a sense of deliberate manipulation of emotions. Additionally, both places have a mythical component, a significant piece of Americana that is normally only accessed through television. Both Sleeping Beauty's castle and the Supreme Court are places that are seen on a screen long before they're experienced in real life. Obviously the Nation's Capital is real, in that the buildings have a pragmatic purpose and the landmarks are designed to manipulate the emotion of appreciation for the past and hope for the ideals of the future. Disneyland is designed to make money. Georgetown, on the other hand, feels like a real city: People were bustling on the streets even after 5:00pm, there were colonial buildings on top, buzzing neon signs on the bottom…in short, Georgetown gave me the same vibe as the Latin Quarter in Paris or certain sections of London: A real place with real people and real shops, living in a place that is steeped with history and bursting with modernity. Busking blues came from the alcove of a store. A homeless man bedded down for the night on the street as we walked past. People lived lives, rather than worked work. It was an interesting contrast. Getting into Georgetown (we never got the ice cream that Gayle said was there) was one thing…getting out of it was harder. We'd wandered pretty far west, and getting to a Metro station proved to be a lengthy process. We figured it out, though, and now I'm back, head and feet aching, ready for bed. --- * I knew that the African-American divisions (there were two) were mostly tasked with the horrendous work of burying the dead. Of course, many of the Black battalions fought with the French, were honored by the French, and died with the French. 
Those who died and were buried next to the White soldiers in France were disinterred and their remains were brought here to Arlington. The policy at the time was to segregate the remains: Even in death they were treated unequally. A problem I've come across is that so much happens on these days that I hardly have time to digest or think about them before a new day starts. These entries are helping a little, but I'm tired every night and I can't put in the details that I noticed. I mean, I totally forgot to mention that I saw a gutless pigeon on the sidewalk yesterday, plus there was this lady who, with the patient help and support of her husband (whom she clearly loved, judging by her shirt that said "I <3 My Husband"), made it down the escalators at Arlington Cemetery Metro Station--a first for her.
Like, that was so cool! I saw a transwoman, which was also super cool. And that's what I mean: Too many cool experiences to write down, yet if I don't write them down, I'll never remember them. Ugh. It's frustrating. Today we had more training, which I'm enjoying immensely. The only gripe about the program--seriously, the only gripe--is the fact that there isn't time to visit the great things here in Washington, as I've groused before. In addition to the training, we got an after-hours tour of the Library of Congress' main reading room, where we got a peek into the card catalog (20 million index cards, ending in 1980) and the stacks--which are organized not according to topic, title, or author, but by height--and were even allowed to go stand where the superintendent of the reading room used to sit and glare at misbehaving miscreants. Now it's a lidless eye in the shape of a security camera. One cool fact: the Library has to be careful not to store too many books, as it could cause the building to collapse. Makes me glad that my office is above the garage… There was an "open house" activity in which specialists from different fields of research set up stations inside the Library itself. During that time, I spoke with a handful of them, looking at the cool stuff (including the typed notes of President Wilson as he was trying to talk the American people into joining the Great War and a page from the personal diary of John Pershing), and learning new things. For example: Newspapers have always been asked to send in two copies of each issue for archival at the Library. Now that so many newspapers have gone to online only, they're supposed to send digital files for archival. Additionally, the Library tries to track and archive every "news" website it can. So the BuzzFeed listicles, Drudge Report articles, and more are kept as part of the American record. Crazy. Oh, and did you know they had 3-D images of soldiers at war back in the 1920s? I was given a chance to do some personal research. 
I headed to the Jefferson Building (we're in the Madison Building for our classes) through tunnels beneath the streets. It keeps you from having to go through security a second time. I went to the second floor, where the European Reading Room is, and was promptly told that I needed to go back down to the ground floor and wander down a hallway until I reached a separate set of elevators. Then I could go up onto the second floor there before walking through the Hispanic Reading Room and arriving in the European Reading Room. I spent so much time walking that I didn't get a chance to try to research; instead, the helpful Italian librarian showed me some of the tricks of the website, including the way that you can request a book be brought to you in a reading room and someone will bring it by within an hour. This is basically the best place on the planet. When we finished the classes for the day, we immediately set out to see the Supreme Court. There was a protest there yesterday--which I stupidly missed "because I'm tired"…still kicking myself about that--but today it was calm. That's the picture up above. I had no idea how big that building is, nor how intimidating it is to stand in front of it. The sun was burning hot today, and setting right into my eyes, making the marble building dazzle. Before we left the Court, we asked the security guard if he was always on patrol there or what. "Yes," he said. "I'm part of the Supreme Court Police." "There's more than one kind of police here?" "There are thirty-seven different police forces here. Only two or three are for the city. The rest are all federal. The Senate has one, the Supreme Court…" I did not know that. Before we left, we saw a guy, sitting all by his lonesome, holding up a large protest sign. We had to see what he was there for…it was the least we could do, right? The answer: Circumcision Harms. 
Apparently, his friend made a documentary about circumcision and his bud was trying to raise awareness. He happened to be there yesterday, so if you were watching the news about the new nominee for the Supreme Court, you may have seen his sign. Sadly, he didn't want to talk to us about the movie; instead, he complained that his Bluetooth speakers didn't pair with his iPhone so he had to use a backup speaker to listen to his tunes. That was strange. So was the Taft memorial. The sculptor was super generous with how he depicted that fellow, I tell you what. Anyway, we headed to the Metro, swung down to the Smithsonian depot, and then hiked our way to the MLK memorial. So far, that has been my favorite one. On the way, however, I found it: The World War I memorial in Washington, D.C… …and it was disappointing to say the least. A large, domed, pillared gazebo that, according to a tour guide we overheard, "Is popular for taking wedding pictures in, because it gets such good lighting." It lists the names of 500 Washingtonians who died during the Great War…so it isn't even a memorial to all of them. And, irony of ironies (why are there so many in life?), it was dedicated by none other than Herbert Hoover. Like…ugh. Unbelievable. Anyway, back to Martin Luther King, Jr. The monument only went up in the last few years, and I know it was controversial (what isn't these days?), but I was impressed. Thoughtful memorials--ones that have symbolism and power and a grandeur--always strike me, and the idea of him coming out of the mountain, out of the stone, was so impressive. A tour group of a bunch of southern Black folks was there, and their excitement, their enthusiasm and appreciation…it was palpable. Aside from the World War II memorial, I haven't felt a feeling of gratitude the same way as I did there. Happiness, too--smiles and appreciation abounded. That struck me and is one of my favorite moments thus far. 
We wandered down to the FDR memorial, which was easily wheelchair accessible (which is the law, yes, but I think that is significant for other reasons). The light was almost gone by then (as was the heat, fortunately--much muggier tonight than previous nights*), so we didn't get the full effect. FDR was interesting, because I know that a lot of people still have strong feelings about him. He's also a more modern president and there are people still alive who remember his presidency. That's different than a Jefferson or a Lincoln. The layout was larger than many other memorials, at least in terms of moving you through aspects of his life and presidency. Rough-hewn rock and chiseled quotes interspersed with bronze sculptures of snippets from that time, like a man hunched over a radio, listening to one of the president's broadcasts. Since it was getting late, we turned around and headed "home", having never stopped to get dinner like we had planned. Now that we've visited all of the memorials and monuments, I'm not sure what we're going to do tomorrow after we rush to the Arlington Cemetery. I'm sure we'll figure something out. --- * Which was nice. It was such a familiar feeling to when I lived in Miami. The humidity doesn't bother me, though I was always grateful to step into the air conditioned subway car or Senate Office building, which we did, too. I saw John McCain's office, as well as the Russell Rotunda, which was set up with a striking set of photos of suffering Yemenis; it made me sad.