Post #6: Historical Gaming

When tasked with evaluating a history-themed video game, I could not think of a better place to start than Electronic Arts’ World War II first-person shooter (FPS) Medal of Honor: Frontline. Released in 2002, Frontline was the first FPS I ever owned. At the time, the game was perfectly tailored to my interests. It was set during the Second World War, and as a 13-year-old, receiving my parents’ permission to buy an FPS meant I no longer had to sneak in GoldenEye sessions at my friends’ houses. The Medal of Honor franchise also struck a chord with gamers in the late 1990s and early 2000s because it piggybacked on the success of Saving Private Ryan and Band of Brothers. In fact, the first level of Frontline is taken directly from Saving Private Ryan’s portrayal of the landing at Omaha Beach (perhaps unsurprisingly, Steven Spielberg was heavily involved in creating the Medal of Honor franchise, and DreamWorks Interactive produced the first two games until Electronic Arts purchased the game developer from Spielberg’s DreamWorks Studios).

Admittedly, I was not able to go back and replay Frontline for this blog post because I no longer own a PlayStation 2. However, with the magic of YouTube I was able to sit and watch someone else play through the game. Yes, yes, playing the game is a fundamental part of the gaming experience, but I still fondly remember the technical and cinematic aspects of Frontline. What I did not remember as vividly was the treatment of history, something that YouTube is more than capable of communicating.

From a gameplay perspective, Frontline does a poor job depicting combat during WWII. Certain parts of the game, like the D-Day sequence, are more accurate than others, but overall the game lacks realism. (I am fully aware that it is impossible to capture the brutality of war in a video game, but because Frontline attempts to do so, it is necessary to critique the effort.) The biggest problem is that the game’s protagonist, OSS agent Lt. James Patterson, almost single-handedly dismantles the Third Reich. Not only does this emphasize the role of the individual over the unit, but it also allows the developers to bypass difficult subjects like death and the effects of violence. On occasion, Frontline manages to hint at these aspects of war, but these ideas are communicated less through the images on the screen and more through the soundtrack composed by future Academy Award winner Michael Giacchino. Even here, though, the player recognizes that this is still a heavily romanticized version of war.

While the game fails to capture the brutality of war, it is much more successful at providing an outline, albeit a very general one, of the war’s course. Missions are occasionally separated by film clips contextualizing the gameplay and, while these never stray from boilerplate History Channel material, they adequately frame post-D-Day actions. However, in their attempt to include historical content, the game developers also reinforce the “good war” narrative. By ignoring the complexity of the Second World War, the game fails to ask any questions about the events it depicts.

My feelings about Medal of Honor: Frontline are mixed. On the one hand, it was a technical achievement for its time and remains an exceptionally well-designed game. It makes history accessible and has the potential to stimulate players to investigate the period independently. On the other hand, the brand of history advanced by Frontline lacks any critical thought, and its close connection to the Spielberg empire makes historians like Stephen Ambrose appear to be paragons of historical inquiry.

Ultimately, I believe that triple-A video games are the wrong format for history gaming. As with Hollywood blockbusters, the massive budgets required to publish these games do not lend themselves to historical accuracy. Other formats seem better suited for history-based video games. Browser-based games or strategy titles like the Civilization franchise are more likely to succeed in communicating academic material because there is less emphasis on fast-paced action, allowing for a stronger focus on substance.

Post #5: Spatial History

The data charted for my spatial analysis maps out some of the major sites involved with the Manhattan Project. The locations shown do not represent a comprehensive list of all the sites involved with the creation of the atomic bomb, only the most prominent areas. This information is ideal for spatial analysis because it demonstrates how physical space played a major role in the Manhattan Project. Looking at the map, it is easy to see that the Manhattan Project required civilian and military officials to utilize different regions throughout the US. At its most basic level, this type of analysis raises questions about the infrastructure network connecting all of these locations. How did raw materials move from Colorado to Washington and Tennessee for processing and then down to New Mexico? Similarly, how did people move from site to site? The map also points toward a sophisticated physical and organizational network needed to coordinate the operation. What types of communication were used to facilitate the movement of information between locations? How was the Manhattan Project organized to maximize efficiency? Or, conversely, did bureaucratic structures decrease productivity? If the map is paired with census data, the types of work being done at different sites are more easily understood. Washington, D.C. and New York City – both major population centers and seats of government power – served administrative functions. Rural areas in Washington and Tennessee were used for dangerous and classified activities like plutonium production and uranium enrichment. Finally, the comparatively empty space around Alamogordo, New Mexico witnessed the first atomic test in human history.

The features of the map itself provide some information, but unfortunately Google Maps Engine Lite does not allow base maps to be imported. If this feature were available, it would be possible to use maps of the US’s road, rail, and telecommunications networks during the 1940s. This type of map might further explain the location choices of some of the rural sites supporting the Manhattan Project. Lacking this ability, the next best option is to use a map that includes the current US roadway infrastructure and some terrain data. Even though the modern interstate highway system did not exist in the 1940s, looking at a map of the current network and the locations of Manhattan Project sites allows historians to ask questions about the decisions that went into charting the paths of highways. All of the sites on the map are directly connected to or very near major interstates. Were the interstates built with the intention of reaching these locations, or did the interstate designers simply expand pre-existing roadways already used to connect Manhattan Project sites?

As a footnote, symbols used to mark the locations of Manhattan Project sites can make it easier to process information (e.g., factory-shaped icons where factories were built, a shovel and pickaxe to denote a mine). However, the basic Google Maps Engine does not allow these types of icons to be customized. Without the ability to change their color, the unique icons were used sparingly to avoid color redundancy.
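For anyone who wants to rebuild this kind of map outside Google Maps Engine Lite, the sketch below shows one possible approach using the open-source folium library (a substitution on my part, not a tool used in the original project). It plots a handful of the sites discussed above with color-coded markers; the coordinates are approximate, and the role labels and colors are illustrative assumptions.

```python
# A minimal sketch using the open-source folium library (pip install folium)
# rather than Google Maps Engine Lite. Coordinates are approximate and the
# role/color assignments are illustrative assumptions.
import folium

# (name, latitude, longitude, role, marker color)
sites = [
    ("Los Alamos, NM", 35.88, -106.30, "Weapon design laboratory", "red"),
    ("Oak Ridge, TN", 36.01, -84.27, "Uranium enrichment", "orange"),
    ("Hanford, WA", 46.55, -119.49, "Plutonium production", "orange"),
    ("Trinity Site (near Alamogordo), NM", 33.68, -106.48, "First atomic test", "darkred"),
    ("Washington, D.C.", 38.90, -77.04, "Administration", "blue"),
    ("New York, NY", 40.71, -74.01, "Administration", "blue"),
]

# Base map centered roughly on the continental US; a terrain or historical
# tile layer could be swapped in here if one is available.
m = folium.Map(location=[39.5, -98.35], zoom_start=4, tiles="OpenStreetMap")

for name, lat, lon, role, color in sites:
    folium.Marker(
        location=[lat, lon],
        popup=f"{name}: {role}",
        icon=folium.Icon(color=color, icon="info-sign"),  # color encodes site type
    ).add_to(m)

m.save("manhattan_project_sites.html")  # open the HTML file in a browser
```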

Post #4: Data Visualization and Organization

For my data visualization project, I analyzed two texts that are instrumental to my research area: George Kennan’s “Long Telegram” from 1946 and Kennan’s 1947 article “The Sources of Soviet Conduct.”  The purpose of comparing these two texts is to evaluate the differences between the documents. Kennan wrote the “Long Telegram” as a dispatch to Secretary of State James Byrnes intended only for use within the US government. In 1947, Kennan penned “The Sources of Soviet Conduct” as an internal report for the State Department, but it was published later that same year under the pseudonym “X” in Foreign Affairs magazine. The two documents are not identical – the “Long Telegram” is 5,336 words while “The Sources of Soviet Conduct” comes in at just over 6,850 – but their tone and purpose are similar enough that using a data visualization tool like Voyant helps reveal shifts in Kennan’s policy concerns.

After eliminating stop words, Voyant highlights, both in the word cloud and in the list of word frequencies throughout the corpus, the major continuities and discontinuities that exist across the two documents. “Soviet” is the most frequently used unique word and accounts for an almost identical percentage of the total words used in both articles, slightly over 1%. The words “power” and “world” are both in the top 5 unique words identified in the two articles, though they do not appear in similar percentages of total words. Beyond these three key words, however, a greater variation in usage and percentage is visible. Given the subject matter, comparing the words “communist” and “capitalist” demonstrates differences in the focus of each article. In the “Long Telegram,” Kennan uses “capitalist” 16 times and “communist” only 9 times. In “The Sources of Soviet Conduct,” however, this trend is reversed, with “communist” appearing 18 times compared to only 12 uses of “capitalism.”
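For those who want to check these counts outside Voyant, the same raw frequencies can be reproduced with a few lines of code. The sketch below is my own illustration rather than part of the original workflow; it assumes the two documents have been saved as plain-text files, and the file names and the small stop-word list are placeholders.

```python
# A minimal sketch of the word-frequency comparison described above.
# The file names and the stop-word list are illustrative placeholders.
import re
from collections import Counter

STOP_WORDS = {"the", "of", "and", "to", "in", "a", "is", "that", "it", "as", "be"}

def word_frequencies(path):
    """Return (Counter of non-stop words, total word count) for a plain-text file."""
    with open(path, encoding="utf-8") as f:
        words = re.findall(r"[a-z']+", f.read().lower())
    return Counter(w for w in words if w not in STOP_WORDS), len(words)

telegram, telegram_total = word_frequencies("long_telegram.txt")
sources, sources_total = word_frequencies("sources_of_soviet_conduct.txt")

# Compare a few key terms as raw counts and as a share of each document's total words
for term in ("soviet", "power", "world", "communist", "capitalist"):
    t_pct = 100 * telegram[term] / telegram_total
    s_pct = 100 * sources[term] / sources_total
    print(f"{term:>10}: Long Telegram {telegram[term]:>3} ({t_pct:.2f}%) | "
          f"Sources {sources[term]:>3} ({s_pct:.2f}%)")
```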

By itself, this type of data does not provide a definitive interpretation of Kennan’s writings, but it does provide a new method of accessing the material. Voyant is particularly helpful in accomplishing this task because it creates multiple visualizations, including a word cloud, frequency list, and trend graph. While manipulating these tools is not an entirely straightforward process, once the visualizations are generated they are fairly transparent and do not require a great deal of specialized knowledge to interpret, thereby avoiding one of the major pitfalls of statistical analysis noted by Theibault.

“Long Telegram” visualization.

“The Sources of Soviet Conduct” visualization.

Unfortunately, I could not figure out how to get the URLs to link to the data set with the stop words already locked in place. Individual tools (e.g., the Word Cloud) were capable of providing URLs for manipulated data sets, but I could not figure out how to do it for the corpus as a whole.

Post #3: Digital Storytelling and New Media

New media provides historians – enthusiasts and academics alike – with the ability to track, interpret, and present information using increasingly sophisticated, yet highly accessible, technologies. The wealth of digital resources available today, especially open-source formats, means the amount of technical knowledge required to participate is not tremendously high. Some projects, like Small Town Noir, utilize (mostly) free resources like WordPress to present historical anecdotes with minimal interaction and limited aesthetic ambition. However, depending on the skills of the designer, these projects can become extremely complicated (the National Archives’ Digital Vaults, for example), resulting in visually dazzling, interactive digital stories. The degree of sophistication used to tell a digital story is not indicative of its effectiveness or historical value. In fact, extremely flashy sites like Welcome to Pine Point can wind up offering hyper-specific detail without providing context, thereby failing to do much in the way of advancing the historical field. This is not to say that something like Pine Point is without value, but rather that it illustrates how form can dwarf function. If digital storytellers want to be taken seriously by traditional scholars, they will need to focus on interpretation as much as, if not more than, presentation. Unless new media is used to interrogate and question information, it will likely remain a complementary tool, unable to stand independent of more traditional written sources.

For the average historian, creating a site with the technical refinement of The Hollow is not possible. However, this does not preclude members of the academic community from participating in these twenty-first-century projects. As Daniel Cohen points out, blogs allow academics with extremely limited technical skills to expand their reach and ability to collaborate. Despite the stigma still attached to new media formats, Robert Townsend’s research shows that the academy is growing increasingly receptive to the idea that serious scholarship can take place digitally.

Post #2: Final Project

My final project will expand my original Omeka site. The initial Omeka assignment will digitize select items from the Harold Saunders collection housed at the Wings of Freedom Aviation Museum in Horsham, Pennsylvania. The final project will expand the exhibit to include additional pieces of the Saunders collection, approximately 50 artifacts. The collection contains objects from Saunders’ personal life and military service during World War II, including letters, maps, and aeronautical navigation tools. Though the Saunders collection is not comprehensive, the site will utilize the available materials to examine parts of his life, with special emphasis on Saunders’ draft and his time in the Army Air Forces. Omeka will serve as the primary tool for developing the site, with GIMP functioning as an image manipulator when required. Other tools like Timeline JS and Mapbox might be used as well. Timelines and maps will expand users’ ability to interact with the exhibit and present information in a more accessible, streamlined way. Ideally, the site will help the Wings of Freedom Aviation Museum increase its digital footprint both online and in the museum. The site will be designed to serve multiple audiences, including school groups, aviation enthusiasts, and historians.
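As a rough sketch of how the timeline component might work, Timeline JS can be driven by a JSON feed describing dated events; the snippet below builds such a feed with Python. The events shown are placeholders rather than actual Saunders collection items, and the field names reflect my reading of the Timeline JS JSON format, so they should be checked against the official documentation.

```python
# A hedged sketch of generating a TimelineJS-style JSON feed. The events are
# placeholders, not real Saunders collection items, and the field names follow
# my reading of the TimelineJS JSON format -- verify against the documentation.
import json

timeline = {
    "title": {
        "text": {
            "headline": "Harold Saunders in the Army Air Forces",
            "text": "Selected items from the Wings of Freedom Aviation Museum.",
        }
    },
    "events": [
        {
            "start_date": {"year": "1943"},  # placeholder date
            "text": {
                "headline": "Draft notice",  # placeholder event
                "text": "Saunders is drafted into military service.",
            },
        },
        {
            "start_date": {"year": "1944"},  # placeholder date
            "text": {
                "headline": "Navigation training",  # placeholder event
                "text": "Aeronautical navigation tools from this period appear in the exhibit.",
            },
        },
    ],
}

with open("saunders_timeline.json", "w", encoding="utf-8") as f:
    json.dump(timeline, f, indent=2)
```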

Creating a definition of digital history is a task unlikely to ever be completed. The term is so flexible that attempting to identify specific and widely accepted parameters is an exercise in futility. There is good digital history and bad digital history, but trying to set hard boundaries is useless. A more productive – and necessary – use of time is for each individual historian to sketch their own outline of what digital history means. Writing a personal definition of digital history gives the historian a target to shoot at during the interpretive process, even if that target lacks universal acceptance. The volume of articles emerging from the digital humanities community, as discussed by Matthew Kirschenbaum, indicates that the process of defining remains a vibrant topic of conversation.

My definition of digital history is heavily influenced by Daniel Cohen, Roy Rosenzweig, and Lisa Spiro. While Cohen and Rosenzweig’s qualities of digital media exemplify the benefits of technology-based approaches to historical inquiry, these same qualities also highlight what I believe to be the distinguishing features of digital history. The flexibility to combine mediums, the diversity of participants, the ability to penetrate dense material, unprecedented levels of interaction between historians, and the decentralization of process are all hallmarks of digital history. Finally, Spiro’s idea that “information is not a commodity to be controlled but a social good to be shared and reused” emphasizes the democratic impulse driving not just digital history, but digital studies in general.

I would also argue, however, that in order for something to qualify as digital history it does not need to meet every standard mentioned above. The variety of digital history projects available today means that applying rigid standards of evaluation will only result in intellectual tail-chasing. The difference between passive and active projects demonstrates this point. The Wilson Center Digital Archive is one of the more engaging digital history projects I have encountered. It allows visitors to investigate Cold War foreign policy chronologically, geographically, and thematically. The site offers access to thousands of primary sources, provides relevant document information, and makes it easy to share specific documents. In stark contrast to the Wilson Center Digital Archive is the University of Wisconsin’s digitized collection of Foreign Relations of the United States (FRUS). Wisconsin’s FRUS collection does not provide users with any interpretation or feedback. It is strictly a catalog of documents. The interface is not user-friendly, and the only function available is word searching within documents. Despite these problems, I would argue that the FRUS collection is still an example of good digital history. Could the site be improved? Undoubtedly. Does it look outdated? Most certainly. Does it condense a set of documents that would otherwise require multiple bookcases into one easily accessible location? Absolutely, and that is all it really needs to do. The site does not aspire to be anything other than a warehouse. In the end, the gap between the Wilson Center’s beautifully skinned page and UW’s clunky archive does not matter. Both sites achieve their desired goals and offer historians around the world the opportunity to engage materials previously confined to academic libraries.