In the past year and a bit, there have been three notable video game releases--Resident Evil 2 Remake, Resident Evil 3 Remake, and Final Fantasy VII Remake. I wrote about Resident Evil 2 Remake back in January 2019 when I finished it for the first time. I have since replayed it a good three or four times, still enjoying it quite thoroughly. In fact, in anticipation of RE3 coming out at the beginning of April, I replayed RE2 and had a great time blasting my way through the infected of Raccoon City yet again.
But what I was really waiting for was Final Fantasy VII Remake. I have an enormous soft spot in my heart for Cloud and his colorful crew--enough that I should maybe expand on some of what I talked about back in January of 2018--and I have been waiting and hoping for this game for over a decade. Really, ever since Advent Children came out, I wanted to see LEGO-style Cloud remade with newer graphics and video game mechanics. When Square Enix announced that FFVII Remake would be a reality and that we need only wait a bit longer, I was skeptical. After a certain amount of time, anticipation far outstrips what can be delivered. (This is the problem with Half-Life 3, though there are stirrings about that actually coming to pass…) It's hard not to be excited about something that you're, you know, excited about. But the more I focus on wanting a thing, the less impressive it tends to be when I finally get it. So I specifically avoided watching trailers (except for a couple of times, when the temptation was too great), and I did my best to think on other things. However, as the release got closer, the demo dropped, and I was immediately excited--I played through the demo twice the day I downloaded it.

Suffice it to say, I have been a rather pampered gamer in the past little while. In fact, that's what I wanted to talk about (I will try to write a review of both RE3 and FFVII in the near future, while the experience of playing the games is still fresh): the strange way iterations in the video game medium differ from other media.

Make vs. Remake

Films are notorious for this: We have classic films that Hollywood knows contain a lot of quality, and they get remade with modern sensibilities, acting styles, costumes, and special effects. Almost always, they are an inferior product. I'm not a huge film nerd, but I can, off the top of my head, list a handful of movie "reboots" or remakes that failed to make a lasting impression. The Mummy, Godzilla, Ben-Hur, Clash of the Titans, Total Recall, and RoboCop all came and went with hardly a note. In fact, the aborted "Dark Universe" was supposed to pit a cinematic universe of the classic Universal monster movies against Marvel's undisputed juggernaut, but it fell apart at inception for reasons that aren't really relevant here. The point is, with just over a century of film history, we've repeated film ideas constantly.

It isn't like film invented this phenomenon, though. Lost to us now, there is a version of Hamlet from the late 1580s (maybe early 1590s?) that we only know about because people wrote about how bad it was. Maybe it was an early draft of the play that Shakespeare himself wrote (which is what Harold Bloom argues), or maybe it was just a trashy version of a familiar story. What Shakespeare went on to write--the Hamlet that has changed the world--is, on a story level, a reboot of the Ur-Hamlet. (And, yes, I would love to read that play.) But even the Ur-Hamlet is based upon a Danish story about a prince named Amleth (whose name cracks me up…just relocate the last letter to the front and boom! new name). In fact, almost every story that Shakespeare told was actually a retelling--and he did it better than anyone else. Since drama is the forebear of film, that makes sense. But even in poetry--arguably our oldest form of permanent communication--we see retellings and reimaginings. While The Aeneid is more of a spin-off from The Iliad, we see Homeric and Virgilian echoes throughout almost all of history.
New forms take the epics and use their tropes to experience the stories again (think, for example, of the experimental novel Ulysses). Even the Bible isn't free from retellings, as the sublime and unsurpassable Paradise Lost shows.

What's the reason for this? Being a would-be writer, I understand this impulse. Some stories--and, in many ways, the ways those stories are told--have an unexpected influence on a person. A creative person will often take that influential energy and redirect it through their own lens and talents, hoping to glean a piece of the original's power and put it into their own work. I despair of my own writings when I read Steinbeck or It, because I can't reach the level that I see there. I want to try my hand at those influential stories--it's the reason I retold Hamlet for my NaNoWriMo 2019--and see if I can "do what they did". But as a consumer, it's a desire to reclaim the awe the original inspired. I envy anyone who gets to come to Paradise Lost for the first time, or experience It without expectations or prior knowledge. There's something inside these stories that can't be caught anywhere else--but that doesn't mean we don't want to try.

Within the Digital

I understand why people want to retell and rework and reimagine and remake their stories. What's so fascinating to me about this phenomenon in video games, though, is why they want to try again: The technology has improved. Assuming Bloom is right and Shakespeare decided to try the story of Hamlet again, it wasn't because there was a new innovation in the medium of his story. It wasn't like they discovered they could have stereoscopic sound in the Globe Theatre. There wasn't a technological advancement in printing that made Milton think the story of Genesis could now be told in epic poetry. (In fact, his choice of epic poetry was a commercial risk, as nobody read or wrote in that format anymore; he was using an antiquated format for his Bible fanfic.)

Final Fantasy VII was originally released on the PlayStation because that console offered the greatest amount of power available to developers at the time. They crammed as much content as they could onto three CD-ROMs, using every shortcut* they could to tell as much of the story as possible. The limitations of their technology prevented them from doing all that they wished to do. With the continual increase in processing power, photo-realistic graphics, and improvements in performance capture (a level beyond motion capture) technology, video games now have the ability to tell their stories more fully, with greater detail and precision than ever before. The medium itself has changed. So the desire to revisit that which was technologically confined is, I think, understandable. But what surprises me is that these remakes are, from a standpoint unaffected by nostalgia, superior to the originals that inspired them. And that is a controversial statement.

The Power of Nostalgia

There's another form of iteration at play here: As rising generations--in this case, the much-maligned Millennials, of whom I am one--begin to create, they often recreate. It's a call-back to a "simpler" time (simple only because the creator was a child during that time, and most kids have the innocence of childhood to paste over the hard parts of history). I think the best example would have to be Back to the Future, where the modern (1985) clashes with the idyllic (1955). The majority of that film takes place in the fifties, with only the framing concept set in the eighties.
The stuff that was modern to Marty McFly is nostalgic to me now. Stranger Things takes this feeling as the primary part of its appeal (even though it's technically historical urban science fiction--not a particularly large genre, to be honest). It's common for this to happen: Soon enough, early-2000s pop culture will be used in our stories, as creators who have fond memories of a pre-9/11 world take creative control over our television, movies, novels, and video games and use that nostalgia as fuel for interest in their creations. That is the nature of how we tell stories, I think. Originality is simply a combination of two previously uncombined elements, but those elements still exist. We can find the fingerprints of others throughout any story, if we really try.

What's happening now in the video game world, though, is that the power of nostalgia is being coupled with outstanding quality. Resident Evil 2 will always be one of my favorite video games. I played it countless times and could probably knock it out in a single afternoon with minimal saves if I really wanted. My long-standing fascination with zombies comes from that video game. (In fact, I tried writing a zombie story in middle school that involved an evil corporation that accidentally turned people into zombies and had to be stopped by the main character, a gun-toting, ponytailed girl who wasn't afraid of the monsters.) I have a huge amount of nostalgic appreciation for that game…but I don't recommend it. Not because of its violence or gore (which is so much worse in the remake), but because it's a product of its time and its technology. The voice acting is bad, the animations strange, the controls a mess…everything that we now use to judge a game's quality** renders Resident Evil 2 a definite pass. Yes, it was influential and continued the survival horror genre in video games. It's an important game. But it's no longer a "good" game…at least, not without context.

Resident Evil 2 Remake, however, is excellent on almost every front. Again, without the nostalgia-glasses, it deserves the acclaim it's received and could be considered a better game than its original. If you add back in the nostalgia, its power is diminished a bit (since it can't ever be experienced in the same milieu of life in which I experienced the original), but only a bit. Where it fades (the twists and turns of the story aren't a surprise, for example), the nostalgia of being in the Raccoon Police Department, hunting for the Diamond Key, more than makes up for it.

Final Remake

Much of what I said about Resident Evil 2--and, by extension, Resident Evil 3--doesn't apply as much to FFVII. That game is still wonderful, and it even has a retro vibe*** to it now. In fact, I insisted that my son play FFVII on his iPad before he played the remake on the PlayStation 4, as I didn't want him to create nostalgic memories of something that I didn't have. I wanted, in this particular case, his experience with Cloud to be dictated by the original PlayStation version. And I think I made the right choice (though my other boys won't have that experience, since they've watched me play FFVII Remake and have now started formulating their own childhood memories that will one day bloom into nostalgia). My oldest is at the perfect age to allow these types of memories to shape him and go with him. And while I think FFVII Remake is a remarkable game, the power of the connection between the original and me can't really be undone.
I'll never be able to feel about Remake the way I did about the nineties version, because I'm not that person anymore. I'm not in middle school in an America that had been at war since before I was born. I'm no longer living in a world with corded telephones and no home internet. What I made out of that game is contingent on when I encountered it. So of course the remake can't generate the same sort of feelings. Instead, whenever I play FFVII Remake in the future, it will remind me of this time, of the chaos and strangeness of living in quasi-quarantine as a virus ravages the world. The context of now will continue to affect how I feel about that game, just as the context of then affects me now.

Still, it is remarkable to me that the video game industry is able to be iterative in its reiterations. I think there's more to this than happenstance, too, but I won't know for certain until we get remakes of things like Overwatch or Fortnite…and maybe we won't. Perhaps our technology has reached a place where current ideas can be realized fully on the first try (albeit with a patch or two), preventing the necessity of remaking anything. I guess we'll have to wait and see.

---

* FFVII "solved" the problem of limited processing power by making all of the character models simple geometric shapes, imbued with a subtlety of movement within their animations to convey their feelings. The other members of Cloud's party would disappear, walking into his body so that the game didn't have to render three characters at once. When they went on to develop Final Fantasy VIII, the developers at Squaresoft wanted to keep the character models the same in the battle sequences as in the world--no more of that blocky, super-deformed character model idea. That desire nearly prevented the game from being completed, as it was one of the most difficult programming feats the developers had to pull off.

** Not the story, though…I've never seen the caliber of the story listed as one of the graded components of a video game review.

*** If you were curious, I don't much care for the retro aesthetic. I didn't like pixelated video games when that was all I could get. I disliked seeing cover art that looked so dissimilar from the product. Retro gaming doesn't appeal to me because it creates a false impression of nostalgia--it looks like my gaming past, but it's a brand-new game that I didn't actually play. Without nostalgia to smooth over the rough (pixelated) edges, I don't get a lot from the game.

The COVID-19 crisis has sent a lot of people to their homes who would otherwise not spend so much time there. We all know this. But the effect of this extended working staycation on me has been different from what I anticipated.
Here's one unexpected thing: I thought I would have more free time to write. Now, it's true that my schedule has opened up in unexpected ways. I no longer have to worry about ward or neighborhood gatherings, after-school activities, or running errands. Because of my son's status as a high-risk person, we're making sure that we do not go anywhere unless it's absolutely necessary. I dropped off some garbage at the dump last week--that was the last time I drove my car. We're taking the "Stay home, stay safe" order really seriously.

According to the internet (the most reliable source of information) and social media (the most accurate source of what people are doing), people are learning new skills, finishing projects, and generally improving themselves in the ways most convenient to them. I am saving over an hour a day in commute, to say nothing of the fact that I get my schoolwork done during my work time, which means I should have ample time to write a lot. Yet, as of yesterday, I had only about 21,000 words written this month. My grand total for the year is just over 130,000. That may sound impressive, but it isn't. At least, not for me. In comparison, I had over 173,000 words written by the end of March 2018. And March 2019 saw 152,000 words. While my output has been on a downhill trend for a long stretch now, I was hoping to see a change in my writing life this year. It hasn't happened. At all. Yes, I've managed to log over 30,000 words per month (which is my soft goal for the year), but that's in large part due to the LTUE conference (which gives me 17,000 words over three days) and a winter writing retreat in March (which added 13,000 words over a weekend). A couple of unusual experiences are carrying my total word count.

That isn't to say I'm writing nothing these days. A lot of words have made it into my reading journal (I'm almost finished with the one I started back in October 2019). I've done a lot of worldbuilding and a bit of outlining for different projects, which is good. I've started what I think will be another fantasy novel (its shape in my mind is still vague), with an eye toward a different style of writing. I have a comic I'm working on--I've projects, in other words. This should be what keeps me going…

…but I have the same problem now as I did when I felt like a real teacher: By the time I finish with my job, my mental energies are depleted to the point that I just want to read a book, smash the drums, or play a video game. I don't feel like my extra "free" time is going into anything except slightly extended uses of what I previously did with my life. For example, I used to practice drums for half an hour or so when I got home from school. Now, I practice them for almost an hour, sometimes longer. I used to read a little bit; now I get upwards of an hour in a book. I'm not complaining about any of this. These are things that I can do for longer because of the quarantine. It's the writing thing that's really driving me crazy. After four weeks of being at home, shouldn't I have a bit more to show for it?

Maybe not. I mean, writing is hard*. It takes a certain amount of mental preparation and willingness, of energy and ability. It used to be that I could go somewhere else, and that would help kickstart my brain into writing-readiness. That is no longer an option (for obvious reasons), though maybe if the weather would cooperate, I could start writing in my hammock in the backyard or find a bench at an abandoned park where I could work.
Anyway, the point is, just because I'm at home now doesn't mean my brain is ready to write. This has been a wake-up call to me. I've always dreamed of supporting my family via my words. (I always expected that to be through writing; instead, it's been my oratory in teaching that has helped provide.) In that dream, I sit at my desk for hours each day, opening the veins of my imagination and letting the words flow forth. But that isn't happening. I should have known that would be the case, though: I have plenty of downtime in the summer, when I'm not expending energy on other projects. What do I do then? Well, historically, summers have yielded a lot (in 2019 I had 134,000 words in June and July; 162,000 in 2018), but those totals are all thanks to the writing retreats. Day by day, I get about the same output, maybe a few hundred more words, than I do during the school year.

I know, I know: It's a hobby. It's a passion, yes, but it isn't a job. If my ability to help pay for food were contingent on fingers on the keyboard, then I'd probably do things differently. That's fair: I know that I would think about writing differently if that were the case. The hard part with all of this is that I don't have a way of really knowing how well I could do it, as, in the back of my mind, procrastinating my writing now makes me only slightly edgy. Dinner will still be served even if I don't put another word into my current story.

There's not much else to say: I write about this often (because it's something that preoccupies me, which I then turn into writing). It's a worry that I have--a useless worry (as most worries are), but a persistent one. Part of what's so silly about these ruminations is that, unless I get an agent and sell a book, it's all entirely moot. I keep gnawing on the bone when the animal itself has yet to even be taken down. Well, perhaps this will be the last confession I need to write. Maybe this will exorcise the demon of "I don't have the mindset to be a full-time writer" and I'll be able to find other ways of wasting my readers' five to six minutes of reading time. Maybe.

---

* If you don't believe me, I encourage you to sit down and write a short story of at least three thousand words, but only after you've done all of your other chores and responsibilities. If it was easy for you…well, I'm kind of jealous, honestly. You should let me read your story. If it was hard, well, yeah, that's my point.

At the end of February, I decided to do something that was a greater sacrifice for Lent than I normally make: I gave up being on Twitter. I didn't delete my account (though I did ditch the app on my phone), and I had a couple of visits there (sometimes a link from a news article took me to Twitter; I watched a Dave Matthews livestream from his home and tweeted how much I liked it; my website automatically shares a link whenever I publish a new essay), but for the most part, I did exactly what I said.
Here's the thing: I'm not Catholic. I have a few acquaintances, mostly from my quidditch days, who are Catholic. That isn't to say that I have much claim to the tradition. Like much of my understanding of Mormonism and the culture of the Church, I recognize that Protestant--and, sometimes, even Puritan--influences have dictated what my religious experience encapsulates. My choice to participate in Lent had more to do with a desire for a kind of religious solidarity within my own tradition: the safest sort of religious experimentation that a person could do.

The impetus is actually years old: I was talking to Dan Harmon, one of my quidditch buddies, who came to my school to talk to my creative writing students about screenwriting (which he had studied in college). I took him out for lunch once the school day was over, and he readily agreed--though he passed on the pizza, which he'd given up for Lent. In subsequent conversations, it turned out that Dan wasn't Catholic; he just liked participating in these sorts of religious traditions. (I don't know what his current stance is on any of this, as I've lost contact with almost every vestige of my quidditch life.) That inspired me to try the same thing, using my Mormonic upbringing to conceptualize it in a way that made sense to me. To that end, I decided that, if I was going to do something for Lent, I would need to give up something that I would genuinely miss. Dan gave up pizza; I gave up Twitter.

See, I have a hate/tolerate relationship with Facebook, but Twitter is a different animal. On Twitter, I feel as though I'm getting glimpses of other parts of the world. Yes, there's the center of a Venn diagram there: I follow certain people because of mutual interest. Authors, book agents, fellow teachers, dinosaur lovers, and comic book geeks inhabit my Twitter feed. (I also, quite begrudgingly, follow all of my representative legislators, though none really uses the platform for much of substance.) I have also made it a point to include LGBTQ+ people and people of color in my timeline to give me an additional dose of "I didn't know that". In other words, Twitter helps broaden my view of life and living, with a lot of interesting things going on.

And, boy, there are a lot of things going on right now. COVID-19's ravaging of the world is worth talking about, and the solidarity and commiseration that happen on social media are definitely one of the best parts about this crisis happening when it has. We've all had a good laugh at a post shared by a friend, neighbor, or whoever that perfectly captures our own feelings. It's times like this when social media is at its best.

Giving up Twitter, then, was a really hard decision. I made it before the crisis escalated to the point that our country's leadership could no longer deny it, and I think that was a good thing. It meant that I had already made the decision, so I didn't have to try to rationalize whether or not to commit. I'd done so; the only thing left was to stay the course. At first, it was pretty difficult. I'm quite used to Twitter and would jump on during loading screens of video games, when I had a random thought to share, or just because I was bored with the conversation happening around me. Its ubiquity brought me comfort, and I definitely dealt with a type of withdrawal.
What helped--and what, I think, is the point of Lent--was that, during those first few days off the platform, every time I considered what I wanted to do and had to reject the "Go on Twitter" impulse, I had to think about why I was missing it. The end result? Participating in Lent meant that I thought about Jesus a lot more than usual. I'm convinced this is the intent of Lent, as it was a more authentic sacrifice than almost anything else at that moment in my life. I could have given up wearing a man-bun for Lent, but that wouldn't have mattered at all, because I don't normally wear--or even much care for--the man-bun look. And though Twitter can have great value, its largest contribution to my life was burning time trying to learn something new amid the constant stream of thoughts and words, 280 characters at a time, scrolling across my screen. Losing that but replacing it with the thought of "Hey, this reminds me of Jesus and His sacrifice that's coming up" made a difference in my life.

The downside of this, however, is twofold: One, I learned that I still need/want to scroll through social media. Two, that itch wasn't lost so much as transferred…to Facebook.

I'm not a fan of Facebook. At all. Yes, there are some positive things about the website, and it could even be a good tool for improving the world. And, of course, the vast majority of people who read this essay will have become aware of its existence via Facebook. (I get the irony, folks.) Anyway, Facebook (as an entity, not the individuals utilizing it) is not really improving the world, and it likely never will, but hey, at least there was potential at some point. As it stands, I don't like the platform for a number of reasons. Some are petty and nitpicky (I hate that it doesn't default to showing the most recent posts--and the fact that you can switch that setting, only to have it revert depending on the device you're using, makes it worse), while others are larger (Facebook is better at ads, especially the way it harvests posted information to sell more stuff that I don't really need…and, yes, Twitter does this, too; they're just not as good at it).

But there's one thing about Facebook that really grinds my gears: I know (almost) all of these people. That may sound counterintuitive, as that's the entire point of Facebook. But Facebook is like dancing in a car at a red light: You think that you're pretty much doing your dance by yourself, only to realize that everyone you went to high school with is sitting in the car next to you, watching you with mixtures of embarrassment and interest. If a person on Twitter dislikes my hot take on something, I can block them and move on with my life. Detritus is as detritus does. But on Facebook, many of the responses to posts come from "friends" that I've accumulated over the years. Blocking or unfriending them comes with strings; there's a diplomacy, a politics involved with no longer being a part of someone's Facebook life that isn't as apparent on Twitter. If I stop following a celebrity or an author because she says something stupid, then there's no real loss there. Facebook, however, changes the dynamic. If someone I know says something stupid, then it's in my face, again and again (because of that idiotic "Top Stories" default). Under normal circumstances, I can roll my eyes and choose not to engage with Facebook at all. I get my itch to scroll scratched elsewhere.
But this year's timing between Lent and the COVID-19 crisis has meant that I couldn't scroll through Twitter whilst waiting for my video games to load. Instead, I was on Facebook a lot more, which meant that I was exposed to bad ideas more frequently. (And why is it that the worst ideas of your friends are the ones that show up the most often?) It finally became too much--my distaste for the platform reached its zenith--when a friend from my mission posted memes and comments criticizing, downplaying, or entirely dismissing the quarantine.

Now, I am no defender of America's response to the pandemic: We had a lot of warning that was ignored from the top down, and we still have a false-hope narrative that disregards science and history to try to mollify people. Until a vaccine is tested, proven safe and effective, and made ubiquitous, my family--with our half-hearted son--will be endangered by any premature "return to normal". Choosing to let our son out of the house is actually a life-and-death decision that we will have to weigh going forward. America has lost over 20,000 people at the time I'm writing this, and the true toll is probably higher due to underreporting. Our lives permanently changed when 9/11 saw a tenth of that number die--COVID-19 is going to radically alter America and the world.

So when friends--not internet strangers or possible troll/bot accounts, but people I've broken bread with, visited in their homes, taken classes with in high school or college--spread idiocy like, well, a virus, it gets beyond tiresome. It gets dangerous. And it isn't just that someone else might read their meme and think, Hey, the quarantine is stupid! Sure, that might happen, but the danger comes from the further spreading of disinformation that is too easily shared. For example, I heard someone talking about a handful of different COVID-19-related stories: Almost all of them were either false or unproven. It's as if people are unaware that Snopes exists.

Being exposed to that is damaging to my mental health, because the message I hear from falsely optimistic people, or those who don't actually maintain appropriate distances, or who go to the airport to welcome home missionaries in direct defiance of Church and state requests, is a simple one: The life of your family is irrelevant. Living with an at-risk member of the populace means that I can't, in good conscience, head to the store with a mask on and think all will be fine and dandy. Living with an at-risk member of the populace means that I could be a vector of disease. As I told my students, half-hearted people don't get to survive pandemics. The only way to save my son's life--again--is to lock down my home and take every precaution that I can. And as much as I recognize the heartache and sadness that come from not celebrating Easter with a large, rowdy extended family dinner, it also means that we don't have to miss going to the funeral of someone we could have otherwise protected.

So, yeah. I'm not a fan of Facebook. That's where I see the most frequent eye-rolls and yeah-rights about the whole pandemic issue. Is Twitter a better place than its competitor? I honestly have no idea. I haven't been on Twitter in multiple fortnights. I will say this, though: The only way I get through this potentially months-long tragedy-in-waiting is with the help of my friends. And Facebook gives me a view of many of them that tells me that may be a false hope. I hate seeing that. I hate feeling and thinking that. Yet I can't shake the sentiment.
I learned that, while giving up Twitter for Jesus was good for my soul, replacing it with Facebook certainly wasn't. The hard thing is, there's still something that I desire from social media. I want…something that social media provides. If I can find a way to scratch that itch a different way, I'd probably be less stressed and worried.

Maybe I should start an Instagram account…