Review of "Unverified"

Since this is the internet, I feel the need to post a “spoiler alert.” This blog post is about the documentary film “Unverified,” and may “spoil” things for you if you have not seen the film. Read at your own discretion.

 

Last week, Bradley Bethel, a former University of North Carolina academic adviser in athletics turned documentary filmmaker, released “Unverified: The Untold Story Behind the UNC Scandal.” The film’s website describes the documentary’s purpose as follows:

“Beginning in 2011, the story of UNC’s 'fake classes' made national headlines as a massive athletics scandal. Caught between university deans unwilling to accept responsibility and news media eager to implicate athletics, UNC’s academic counselors for athletes found themselves accused of complicity and without the means to defend themselves. Bradley Bethel was a reading specialist for UNC athletes and was outraged by the way the press portrayed his colleagues. Refusing to remain silent, he set out to defend those falsely accused and give them a platform to tell their side of the story. In the process, he realized the problem was even bigger than the media. Following Bradley over the course of a year, UNVERIFIED challenges the headlines and tells a story more complicated and heartbreaking than the one we’ve heard in the news.”

Bethel approached me several months ago about my willingness to watch the film when it came out and “review” it on my blog. (I feel that calling this a “review” gives me too much authority.) I want to emphasize at the outset that Bethel placed zero qualifications on the type of review I should write or its content. The thoughts in the following post are completely my own.

Previously, I wrote about Bethel and how I was disappointed in a Daily Tar Heel editorial about him and his film. The disclaimer at the beginning of that post is still appropriate here (in short: I am a UNC alumnus, love UNC athletics, and still have not met Bethel outside of the internet, overwhelmingly Twitter).

For most people, Bethel first entered the conversation about the UNC Scandal with a blog post titled “Truth and Literacy at UNC.” In that and many subsequent posts, Bethel attacked the veracity of claims made by Mary Willingham, a former reading specialist in the UNC athletics department, and Jay Smith, a chaired professor of history at UNC. (An independent investigation with external reviewers later demonstrated that Willingham’s claims about athlete illiteracy were false.) Bethel took on other people as well, but Willingham and Smith drew the majority of his fire.

Perhaps the cruelest cut in Bethel’s film is that Willingham plays an exceedingly minor role and Smith, though pictured, is never mentioned at all. By denying Willingham and Smith significant roles, Bethel effectively marginalizes the part the two played in the whole ordeal. Et tu, Brute?

The film was, in all honesty, not what I expected it to be. While I anticipated a film that probed the media’s treatment of the UNC scandal—which it did in many ways—what Bethel produced is actually a much more personal film. If you are expecting a documentary that does nothing but dissect media inaccuracies over and over, this is not your film. (For a primer on the scandal, even if it is one whose facts Bethel somewhat disputes in his film, see here.)

Early on in the documentary, Bethel recounts a story where he failed to stand up to a childhood bully and how he has felt guilty about that his entire life. He vowed as a child that, if presented with an opportunity to defend his friends again, he would not run from the bully.

Much of the film, then, is about following Bethel as he interviews various figures and defends his friends and fellow academic counselors Beth Bridger and Jaimie Lee, and to a lesser degree former senior associate athletic director John Blanchard. Bridger and Lee were terminated for their supposed role in UNC’s paper class scandal, and Blanchard announced his retirement in 2013 during the midst of the scandal. Bethel is, however, most certainly the central character of the film. At one point, he states, "I know of good people within or associated with athletics whose integrity has been questioned and for some whose careers have ended because of being mischaracterized." The film is his chance to tell his friends’ side of the story and defend their integrity. 

For the most part, Bethel handily succeeds at this goal. He skillfully presents a variety of viewpoints from those involved in all aspects of the scandal (including many current and former athletes), excepting the media and UNC’s current administration, all of whom, we are told, refused to be interviewed (more on that later).

The points the film makes over and over again are these: How could those in athletics have known what was actually happening in the African American Studies Department? Moreover, why would those athletics folks have ever thought to question anyone in academics, let alone a department head? (ESPN analyst, lawyer, and former dook** basketball player Jay Bilas especially makes this second point in the film.)

These basic questions and others eluded UNC administrators and Kenneth Wainstein (the former federal prosecutor paid $3.1 million to investigate the scandal), “Unverified” contends, because it was easier to blame athletics and low-level employees and to protect academics. The media played into this by not questioning their sources appropriately (especially Willingham) and by presenting a sensationalist view designed to get clicks on the internet. It is, after all, easier to sell a morality play to the public than to present nuanced stories with fewer clear villains and heroes.

The film thus starts off talking about the media’s sensationalism and ends up being more about questioning whether Kenneth Wainstein and those in power at UNC treated everyone involved fairly and afforded them due process. As UNC journalism professor Adam Hochberg points out in the film, anyone terminated with cause because of the scandal was fired by the university, not by the media.

The documentary’s central point ends up being that Bethel thinks his friends Beth Bridger and Jaimie Lee lost their jobs not because they did anything wrong, but because they were easy targets. Firing low-level support staff who make $40,000 a year is a lot easier, he contends, than asking tenured full professors and UNC deans why they did not have better control over their academic departments. (Former UNC Chancellor James Moeser even says in the film that the AFAM department got a bit of a pass because nobody in the administration wanted to be seen as being harsh to the “black” department—my quotation marks, not his.)

Bethel ends his film with the revelation that the NCAA’s notice of allegations (in which the NCAA laid out its interpretation of UNC’s wrongdoing) did not mention Bridger, Lee, or Blanchard at all. The film is, therefore, really about trying to tell a narrative that gives back power to normal people who had their power and careers wrested away by large bureaucracies and the media. (Bridger claims that she was actually fired without cause simply because her name appeared several times in the Wainstein report.)

In this way, “Unverified” is a complete success. It tells a nuanced story and gives voice to normal people (even if it does not always ask those normal people the hard questions). While media outlets often play only short clips of interviews, the film frequently lets the camera roll, giving aggrieved parties a chance to vent their frustrations and explain how they feel they have been misrepresented. But the documentary deftly avoids becoming a “gripe session,” and instead moves with pace to focus on a larger narrative (though that focus seems to shift halfway through from the media to UNC’s administration and Kenneth Wainstein).

One especially nice moment, and probably the best illustration of Bethel’s point about media sensationalism, comes during his interviews with former football player Deunta Williams. Williams claims that ESPN’s show “Outside the Lines” misrepresented him and his comments. He was especially upset that “Outside the Lines” shot extensive footage of him at home and driving around and said it would use that footage to show how successful he was. When the show aired, however, it merely called Williams a “fast food worker.”

In reality, Williams is a restaurant owner, high school football coach, and real estate investor. He has several employees and, judging from the documentary, seems to be doing well. Unlike ESPN, “Unverified” does show Williams walking around his house and working hard, and does show him driving around town in his new Audi. Where the media “talks the talk,” Bethel shows, he and his movie “walk the walk.”

While in general I was impressed with “Unverified,” I did have some concerns and critiques. Bethel makes it clear that various media outlets sensationalized or reported falsehoods about the UNC scandal, but this is not done in as focused a way as I would have expected or liked. Instead, we get references scattered all over the film. (To his credit, Bethel does make two media inaccuracies clear: (1) the UNC scandal was academic, not athletic, in nature; and (2) his fellow academic support counselors have been misrepresented.) If you were not fairly familiar with the UNC scandal going into the film, you would not necessarily get some of Bethel’s finer points. A short segment at the beginning correcting inaccurate media claims would have been helpful.

Also, at times I wondered whether the film was really about “The Untold Story Behind the UNC Scandal” or instead about Bradley Bethel. I am not at all suggesting that Bethel comes off as arrogant, narcissistic, or self-aggrandizing. But there were times—such as both of his interviews with Hochberg—where it almost felt like the viewer was intruding on Bethel’s therapy session. While focusing the film on Bethel helps tie some of the documentary’s larger narratives together, it also means that viewers get a lot of Bethel and his inner thoughts.

And, finally, one awkward scene occurred when Bethel called Joe Nocera, the New York Times reporter who wrote about the UNC scandal. On speakerphone (and on camera), Bethel asked if Nocera wanted to be interviewed for the film. Nocera was dismissive and somewhat rude, and he emphatically declined. So why was that speakerphone call taped and included in the documentary? If Nocera did not want to be in the film, his wishes should have been honored.

These are, in the end, fairly minor complaints. “Unverified” is a good documentary that is ultimately about how people in power get to make decisions that influence the rest of us. It is about authenticity and narrative in journalism, and about how the media, however nebulous the connection, can affect seemingly innocent people in profound ways. And the movie is about standing up to entrenched power structures that can bury “the little guy,” because standing up to those power structures is the right thing to do.

I think two tweets from Bethel sum up his thoughts and general perspective after completing the film: 

Who could argue with the desire that journalists—and by extension all of us—be more open about the biases and perspectives that we all carry?

My final thought on the film is this: anyone who watches “Unverified” will be happy that Bethel gives voice to those who have been stripped of their dignity and reputation by large, bureaucratic organizations that frequently seem more concerned with protecting their own power and authority than with doing what is right. In that respect, the documentary is a job well done. Moreover, all viewers, no matter their thoughts about the scandal, will finish with a more nuanced view of what happened and a keener eye toward recognizing both media sensationalism and how they fit into the power structures of their own lives.

 

**As an unrepentant Tar Heel, I just cannot bring myself to type out the most commonly accepted spelling of that university in Durham, NC.

Selling Nature: Mountain Valley Water

My latest research project centers on Mountain Valley Water, a premium bottled water company located in Hot Springs, Arkansas. In 1928, Mountain Valley became the first bottled water to be distributed nationally, with its distribution network stretching from California to New York City. And the company proudly proclaims that everyone from presidents to celebrities to racehorses has quaffed the beverage. (Eisenhower once mentioned the company by name in a press conference, and the famed Secretariat was even a patron.)

My newest position at the Arkansas School for Mathematics, Sciences, and the Arts is a great gig, but with a 5/5 teaching load I don’t have an overabundance of time for research. Mountain Valley made an ideal subject: not only is it a local company, which made primary sources fairly easy to find, but its corporate identity also fits well with my research interests.

The longer paper argues that Mountain Valley’s history represents interconnected issues of nature, health, and capitalism. For this shorter blog post, however, I just wanted to share two interesting advertisements I found. They give a brief glance into the company’s fascinating history and these larger themes.

The first one is from the Arkansas Gazette, printed on 6 November 1939. Essentially from the moment of its discovery, Hot Springs has possessed a reputation as a healthy place. The springs that bubbled forth were revered as a natural cure for any number of diseases, especially rheumatism. As can be seen in the advertisement, Mountain Valley emphasized not only that consuming its water could improve human health, but also that the product was the “natural aid” to do so.

In another ad, this one from the Courier-Journal (Louisville, Kentucky), printed on 11 November 1939, Mountain Valley leveraged the same notions. On a basic level, the ad describes how, long before Euro-American settlers conquered the area, American Indians knew of the water’s supposedly curative properties. Through this line of argument, the advertisement augmented previous health claims with a notion of permanence.

But by hearkening back to Indian land usage and environmental understandings, the company drew upon popular notions of Nativeness to emphasize a connection to the natural world. As Shepard Krech argued in The Ecological Indian (1999), popular stereotypes portrayed Indians as both “ecologist and conservationist,” particularly as “noble savages.” (The word “savage” ultimately derives from the Latin silva, meaning woodland.) In this case, advertising that American Indians used the area to cure illnesses bolstered claims of the springs' natural powers and emphasized Mountain Valley’s connection to the environment.

Unsurprisingly, a great many other Mountain Valley Water advertisements exist—brochures, pamphlets, newspaper ads, and more—and the company even ran a Time magazine campaign in 1940. But from its founding in 1871 onward, as I hope to show in longer, published formats, the company developed an identity predicated on connecting a salubrious natural world to wholesome bodies. Healthy environments in this case meant healthy bodies, and hopefully healthy profits.

Images are courtesy of the Garland County Historical Society's archives.

Withering Heights: The Battlefield Geography of Antietam

I’m very pleased that my colleague John Hess has agreed to do a guest blog post to help commemorate the anniversary of the Civil War battle at Antietam/Sharpsburg. Fought near Antietam Creek close to Sharpsburg, Maryland, on 17 September 1862, the battle produced over twenty-two thousand casualties and remains the bloodiest single day in U.S. history.

John visited the historic battlefield site this summer and took a number of photographs that reveal history in a way that written description alone often cannot. Simon Schama helped us realize in Landscape and Memory (1995) that the natural world can profoundly influence popular memories and histories. But a twist on that idea can be true as well: we can forget what landscapes look like to the detriment of how we remember and understand historical events.

Anne Kelly Knowles (Middlebury College) has used GIS technology to change how we view Civil War history, particularly the Battle of Gettysburg. For example, by melding topography and top-down geography she has questioned whether Confederate General Robert E. Lee could actually see much of the battlefield, possibly helping explain some of his militarily ineffective decisions. In a similar way, we now turn to John’s guest post for how his photos can help explain why the battle at Antietam unfolded the way it did.

**note: Obviously landscapes and environments can and do change over time. The authors recognize that, while the battlefield in 1862 likely looked very similar to how it appears now, it did not look exactly as it does today.

**     **     **     **     **

One hundred and fifty-three years ago today, Robert E. Lee’s Army of Northern Virginia met the Union Army of the Potomac, under the command of General George B. McClellan, near Antietam Creek in a rural area of western Maryland. After victories over Union armies during the Peninsula Campaign and at Second Bull Run, Lee embarked on an invasion of the North in the fall of 1862. He hoped that Confederate victories on Northern soil would bring European recognition of the Confederacy, particularly from Great Britain and France.

The invasion began well enough for Lee. Dividing his army into separate columns, he sent General Stonewall Jackson to capture the Union garrison at Harpers Ferry, Virginia, while the rest of the army moved into Maryland. Lee’s invasion may have worked against the ever-cautious McClellan, but fate then seemingly intervened. Two Union soldiers found a copy of Lee’s orders wrapped around some cigars and immediately forwarded them to McClellan. The Union general now possessed the ability to destroy the Army of Northern Virginia, but he once again moved slowly. On September 17, 1862, the armies met east of the small town of Sharpsburg, Maryland, along Antietam Creek. McClellan outnumbered Lee almost two-to-one, as rebel regiments were still marching from Harpers Ferry. Nevertheless, the Confederates held geographic advantages on the battlefield, and what followed was the bloodiest single day in American history.

Above: A panorama of the northern part of the battlefield. This picture looks west towards Sharpsburg.

The engagement began with Union attacks on the northern edge of the battlefield (shown above). Union regiments advanced out of the North Woods southward towards Confederate positions over a broad, flat field. Marching north to south (right to left in the picture), they found the terrain ideal for keeping formation. But that flatness exposed Union regiments to withering fire from rebel artillery and provided no cover from flying shrapnel.

Above: The view northward across the old cornfield.

The Union advance produced one of the bloodiest engagements of the Civil War. In the picture above, which looks northward into where the cornfield once stood, you can see how open the terrain was. Advancing Union soldiers would have made easy targets for Confederate artillery batteries. Now imagine a large cornfield, with stalks nearly as tall as a grown man. Confederate soldiers waited on the southern edge of the cornfield and met Union soldiers with a wall of soft lead as they moved out of the corn. A young Union private, marching in line out of the cornfield, would have had to adjust to the relative brightness after the cornstalks had blocked out much of the sun. Coupled with massed Confederate rifle fire, the battle in the cornfield would have been confusing and chaotic. The infamous cornfield ultimately changed hands several times during the early morning hours, and the fighting produced thousands of casualties on both sides; in some units, 60% of the men were killed or wounded in just a few hours.

Above: Looking west towards the Confederate positions in the West Wood.

After the Union eventually secured the cornfield, a fresh division under the command of General John Sedgwick joined the battle, and the new forces pivoted to the west with hopes of rolling up the Confederate flank. Sedgwick’s division of some 5,000 men advanced into the West Wood, where it met withering fire from three different directions: artillery from the west, along with infantry and cavalry units to the south and southwest. The close confines of the West Wood, as seen above, combined with the smoke of battle to produce utter chaos. Soldiers and commanders would have had little idea what was going on beyond their immediate vicinity. Additionally, the forest broke up unit formations, meaning it was difficult, if not impossible, for Union regiments to attack en masse, negating their numerical superiority. As a result, Northern units suffered horrendous casualties. The Philadelphia Brigade, for example, lost some 500 men in less than twenty minutes of fighting in the West Wood.

The advance into the West Wood also brought the fighting to a small, white-washed church built by a sect of German pacifists: the Dunkers. Sitting so near the forest, the little church became a battlefield unto itself, as Union and Confederate regiments fought to control this small but important geographic landmark just south of the West Wood.

Above: The reconstructed Dunker church. It was a focal point of fighting as the battle shifted to the south and west during the morning hours.

As fighting raged in the West Wood, two Union divisions advanced (east to west) in the center of the battlefield. Some 10,000 Union soldiers attacked approximately 2,500 Confederate soldiers defending the center of the line in the sunken farm road seen below.

Above: Looking south along the sunken farm road. Union regiments advanced from the left.

The sunken road stretched for several hundred yards in the middle of the battlefield and provided a natural trench for Confederate regiments. As a result, rebel soldiers could fire and reload in relative safety, as long as Union soldiers were kept at bay.

And for several hours, the Union divisions were repulsed again and again as they attacked the improvised trench. As you can see below, Confederate soldiers facing east towards the advancing Union regiments had clear fields of fire and, once again, Union riflemen had no real cover against Confederate fire. Silhouetted against the sky and lined up in formation, Union soldiers made easy targets for the defenders. The result was a bloodbath. The 2,500 Confederates held off the 10,000 Union attackers for hours from what became known as “Bloody Lane.”

Above: The open field over which Union regiments advanced towards Bloody Lane.

Above: The clear fields of fire possessed by the Confederate defenders.

Above: The view a young Confederate rifleman would have had at Bloody Lane.

Eventually, however, Union regiments got around the Confederate flank at Bloody Lane and could fire down the gully’s length. The terrain that had so benefited the Confederates throughout the morning now became their undoing. Without cover, the Confederate defenders were nearly annihilated. With the center of their defense broken, the remaining Southern regiments withdrew towards Sharpsburg, and fighting died down in the center.

Well to the south, the battle continued at a small bridge that crossed Antietam Creek, perhaps the most famous landmark of the battle. A Union corps under the command of General Ambrose Burnside tried to force the crossing throughout the morning with small, piecemeal attacks, but was repulsed each time.

Above: The bridge over Antietam Creek, known as “Burnside’s Bridge.”

The Confederates held a major geographical advantage at Burnside’s Bridge, as it was ultimately called. Confederate regiments held the high ground on the western side of the creek. As you can see below, the heights provided them with a commanding defensive position from which they unleashed a deadly hail of fire upon the attacking Union soldiers.

Above: The view from the Confederate positions on the west side of the creek.

To an attacking Union soldier, the Confederate position was undoubtedly daunting. Union riflemen had to assemble on the east side of the creek within range of Confederate rifle fire. They then had to cross the bridge under constant fire and fight their way up the relatively steep heights in the hot sun. The pictures below only begin to capture the challenges faced by Union soldiers. As a result, the outnumbered Confederate regiments managed to hold off the Northern units throughout the morning.

Above (two photos): The perspective of a Union soldier on the east side of Antietam Creek and as he would have advanced across the bridge.

Only in the afternoon did Burnside attack across the bridge in force, and Union regiments finally ousted the Confederate defenders from the heights. The superior terrain of the Confederate position, much like at Bloody Lane, had held up the Union attack for hours and prevented a quick Union victory.

At this point, the Union assault had not gone according to plan, despite the Army of the Potomac outnumbering the Confederates nearly two-to-one. Nevertheless, disaster now faced Lee and the Army of Northern Virginia. Burnside had outflanked Lee’s army and advanced his corps towards Sharpsburg. If he could capture the town, the Confederate army would be trapped in Maryland with little hope of escape. The war might come to a quick end.

The few Confederate units standing in Burnside’s way took up positions on a rise in between Antietam Creek and Sharpsburg. This rise, pictured below, provided a solid defensive position, but by now the Confederates were so heavily outnumbered that the advantage in terrain did not halt the Union advance. Union regiments quickly pushed up the hill and advanced towards Sharpsburg.

Above: The final Confederate positions south of Sharpsburg. This is where Confederate regiments fell back after Union forces finally captured Burnside’s Bridge.

The battle, and perhaps the war, appeared lost for the Confederacy until, in a move straight out of a Hollywood movie, thousands of Confederate reinforcements under the command of General A.P. Hill arrived from Harpers Ferry, some seventeen miles away. They arrived from the northwest at the last moment and blunted the final Union assault in the late afternoon. Reluctant to take more casualties, Burnside withdrew his corps, and the fighting ended for the day.

McClellan, always overly cautious, refused to attack again the next day despite holding a major numerical advantage. Then, two days after the end of the battle, Lee and the Army of Northern Virginia slipped back across the Potomac River, and McClellan, despite prodding from Lincoln, refused to follow.

The clash of the Union and Confederate armies at Antietam provides an excellent demonstration of the importance of terrain in battle. Heavily outnumbered, Confederate regiments took advantage of the chaotic West Wood, the improvised trench at Bloody Lane, and the heights at Burnside’s Bridge to hold back superior Union numbers. Despite being outnumbered almost two-to-one, Lee and the Army of Northern Virginia secured a tactical draw because of an excellent use of defensive terrain and because of the general advantage the defense held during the Civil War (the usual problems of poor Union generalship helped as well). But the result was the single bloodiest day in American history, with some 3,600 killed on the field and another 20,000 injured. More importantly, the battle looked enough like a victory for the Union that President Abraham Lincoln finally issued his preliminary Emancipation Proclamation. The war to save the Union, caused by division over slavery, became a war to end slavery in America.

U.S. gun violence and #blacklivesmatter

Two different incidents on Twitter caught my eye recently, and I have wondered if they are related.

ESPN talking head Bomani Jones tweeted a link to a Guardian article titled “Horror, live for all to see: another week in American gun violence.” The article was specifically about two recent events in the United States: two journalists were shot to death on live television by a disgruntled former employee, and a 14-year-old boy held his class and teacher hostage with a firearm. But, more broadly, the piece was about the culture of gun violence in the U.S. that leads to 88 deaths per day from shootings (about 32,000 a year).

Jones editorialized, “the world now gawks at us like we did south america and the middle east in the ‘80s and ‘90s. and it should.” 

In an incident that superficially seems unrelated, NBA player Kendall Marshall was criticized on Twitter by a fan for using the “#blacklivesmatter” hashtag made popular after Michael Brown was shot and killed by police in Ferguson, Missouri. (I will not name the fan because s(he) is not a public figure.) That fan thought that Marshall should instead use the “#alllivesmatter” hashtag. Moreover, that fan thought Marshall was only using #blacklivesmatter to increase his “street cred.”

Marshall sarcastically replied, “street cred babyyyyy.” 

Both overall U.S. gun deaths and deaths by police shooting have a racial tinge. The Pew Research Center claims that, though blacks represent just 13% of the U.S. population, they comprise 55% of shooting homicide victims (homicides were not quite 2/3 of all gun deaths over the studied time period). In terms of police shootings, CUNY assistant professor Peter Moskos claims that blacks are 3.5 times more likely to be killed by a police officer than whites. However, Moskos did clarify that, when adjusted for homicide and felonious crime rates, whites were more likely to be killed by police than blacks. (Methodological quandaries abound with all of these measurements.)

Drawing meaning from these numbers is difficult at best, but relative to their share of the population, blacks are more likely to die from a firearm in the United States than whites. The potential reasons for that are varied, and a Google search will turn up quite a few of them. Many of those explanations are politically colored, and thus I will not proffer my own.

What does seem obvious to me, however, is that gun violence is a significant problem in this country. I have no idea how to fix that, but we as a nation should want to try. And we must realize that, amidst all those gun deaths, the black community bears a disproportionate amount of the carnage. It is no surprise that #blacklivesmatter became popular.

In the end, I think the Guardian story provided me overall context for the debate about #blacklivesmatter vs. #alllivesmatter. Of course all lives matter—saying otherwise is nonsensical. But, considering that blacks are indeed more likely to die from gun violence, is it any surprise that so many people have found it necessary to insist that black lives do indeed matter?

The #blacklivesmatter campaign is not about saying that only black lives matter, but is instead an insistence that black lives be considered part of all lives. Thus the phrases #blacklivesmatter and #alllivesmatter should be synonymous (even if #alllivesmatter started largely in opposition to #blacklivesmatter, as an attempt to derail that movement). But, political disharmony being what it is, proponents of the two phrases often see themselves as antithetical to each other.

The Guardian article and Bomani Jones’ commentary, combined with Kendall Marshall’s confrontation with a fan, demonstrate several things to me: (1) gun violence in the United States is a serious problem; (2) black bodies disproportionately bear that violence; (3) we need to de-politicize the idea of stopping gun deaths; (4) we need to respect that, no matter the reasons why they are more likely to be shot, the black community is right to be hurt, to demand change, and to insist that their lives matter as much as whites’.

I have no idea how to fix any of these problems, and I fear that venturing a guess as to how to do so would show my ignorance in one way or another. Really, I guess I am just sad that we would let so many of our fellow countrypersons, of all races but especially minorities, die without making an honest attempt, as a nation, to do something to change that.

On LeBron James, Statistics, and the 2015 NBA Finals

Wednesday, 17 June was the saddest day of 2015 for me—no more basketball until the NCAA and NBA seasons start back up in the fall. To get myself out of my post-basketball doldrums, I wanted to do a blog post on the 2015 NBA Finals.

There are probably professionals doing what I am doing here (and doing it better), but I still wanted to crunch the numbers on the Cleveland Cavaliers’ team and individual statistics. Specifically, I wanted to try to put what LeBron James did in context. His performance was, and I am not being hyperbolic, transcendent. (FYI: most of the numbers below, unless stated otherwise, are taken either directly from ESPN’s box scores or calculated by me using those box scores.)

Because his All-Star teammates Kyrie Irving and Kevin Love (not to mention Anderson Varejao) mostly did not play in the Finals due to injuries (Irving played most of game 1 while below 100% healthy and left in overtime with a broken kneecap), James was forced to take on an incredibly high workload. He did his best to “carry” his team to a championship. Even though he lost, I think his play deserves a deeper look.

ESPN noted on its stats Twitter page (@ESPNStatsInfo) that James was the first player in NBA history to lead a Finals series in points, rebounds, and assists. He averaged 35.8 points, 13.3 rebounds, and 8.8 assists—nearly a triple-double average!

Moreover, James put up those numbers against arguably one of the best NBA teams of all time. The numbers gurus at Nate Silver’s 538 Sports have a rating system called Elo (borrowed from chess), and the 2015 Golden State Warriors (the opponent of James’ Cleveland Cavaliers) ended up with an 1822 Elo rating. That’s the second highest team Elo score in NBA history behind Michael Jordan’s record 72-win 1996 Chicago Bulls (team Elo of 1853).

And it is not like top-notch talent surrounded James either. 538 Sports ranked his supporting cast 59th out of the last 60 Finals teams (two teams a year for the last 30 years). Ouch.

Just how bad were LeBron James’ teammates in the Finals? While many have tried to dismiss James’ stat line as inefficient, he was actually arguably more efficient shooting the ball than his teammates over the course of the series.

First off, the obvious question: did James’ low shooting percentage hurt his team? We can turn to game-by-game +/- scores for that. +/- is a statistic borrowed from hockey that simply measures a team’s scoring margin during a given player’s minutes on the court. Outscore your opponents 55-50 during your on-court time in a game? You get a +5 for the game.
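
To make the bookkeeping concrete, here is a minimal sketch of that calculation in Python; the function and the numbers are my own illustration, not anything pulled from ESPN’s data.

```python
# Minimal sketch of the +/- idea (illustrative numbers, not actual Finals data).
def plus_minus(team_points_on_court, opponent_points_on_court):
    """Net scoring margin while a given player was on the floor."""
    return team_points_on_court - opponent_points_on_court

# The example from above: outscore your opponents 55-50 during your minutes -> +5.
print(plus_minus(55, 50))  # prints 5
```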

In all but one game LeBron James had a better +/- than his teammates (higher numbers are better), and for the series the team was 18 points worse with James on the bench. Considering James played 275 out of 298 possible minutes (resting an average of only 3 minutes 50 seconds a game, even including two overtime games), that statistic is meaningful.

James’ 275 minutes on the court? Outscored by 25 points (for the series going down a point every 11 minutes). James’ 23 minutes off the court? Outscored by 18 points (for the series going down a point every 1 minute 17 seconds).

With shooting splits of 40/31/69 James certainly could have shot the ball better. But his teammates combined for 38/29/71 shooting splits, just marginally worse but worse nonetheless.

James’ Effective Field Goal Percentage (which gives extra weight to made three-pointers) was virtually identical to his teammates’: 43.1% vs. 43.2%. The same is true for True Shooting Percentage (which weights points scored against field goals and free throws attempted): 47.7% vs. 47.9%. (All of these numbers are somewhat poor.)
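
For anyone who wants to check those two metrics themselves, here is a small sketch using the standard definitions; the box-score totals in the example are placeholders chosen for illustration, not James’ actual series totals.

```python
# Standard shooting-efficiency formulas; the totals below are placeholders, not real series data.
def effective_fg_pct(fgm, fg3m, fga):
    """eFG% = (FGM + 0.5 * 3PM) / FGA; a made three counts as 1.5 field goals."""
    return (fgm + 0.5 * fg3m) / fga

def true_shooting_pct(points, fga, fta):
    """TS% = PTS / (2 * (FGA + 0.44 * FTA)); folds free throws into the measure."""
    return points / (2 * (fga + 0.44 * fta))

# Placeholder box-score totals, chosen only to illustrate the arithmetic.
fgm, fg3m, fga, fta, points = 80, 15, 200, 60, 215
print(round(effective_fg_pct(fgm, fg3m, fga), 3))     # 0.438
print(round(true_shooting_pct(points, fga, fta), 3))  # 0.475
```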

Let’s say we call all that a wash—LeBron James’ shooting efficiency was about the same as his teammates’. If that is the case, how can we criticize James for being inefficient unless we criticize the whole team? James shooting the ball certainly was not a worse option than his teammates shooting it. (Considering the difficulty of the shots he had to take, it was probably better. More on that below.)

It would not be unfair to quibble about whether specific players should have gotten more shots relative to James, of course. Center Timofey Mozgov shot 55% for the series and power forward Tristan Thompson shot 50% (the guard trio of Matthew Dellavedova, J.R. Smith, and Iman Shumpert put up the bulk of the team's poor shooting numbers, with combined 29/28/69 shooting splits, truly miserable). But Mozgov and Thompson got a great many of their made field goals off of assists (neither can consistently create his own offense) and offensive rebound putbacks (frequently off of shots James missed close to the rim after drawing a second defender). Increasing their workload would have been difficult and likely would have lowered their shooting percentages (fewer easy shots, as described above).

No, we need to recognize that LeBron James shooting the ball typically was the team’s best option, for two reasons:

1.     James’ teammates could pick more optimal times to take shots, but James had to “carry the load” so to speak

2.     James’ teammates got to play with him, and that makes a difference

First, it needs to be pointed out that LeBron James had an astronomically high usage percentage (Usg%) of 46.7% in the NBA Finals. That means that almost half of his team’s possessions while he was on the court ended with him “using” the possession (attempting a field goal or free throws, or turning the ball over). For comparison’s sake, James’ Usg% for the playoffs overall was 37.6%, which led the NBA, and the next highest was 2015 MVP Stephen Curry’s at 31.0%. Anything over 30% is very, very high. Anything over 40% is almost unheard of. James’ 46.7% looks like a number only found in video games.
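
As a rough illustration of what that number measures, here is a simplified sketch: it counts field-goal attempts, an estimate of free-throw trips, and turnovers as “used” possessions and compares a player’s total to his team’s total while he was on the floor. (The published formula on sites like Basketball-Reference also adjusts for minutes played; the totals below are placeholders, not actual Finals numbers.)

```python
# Simplified on-court usage estimate (placeholder totals, not actual Finals numbers).
def possessions_used(fga, fta, tov):
    """Possessions a player 'uses': shot attempts, free-throw trips (~0.44 * FTA), turnovers."""
    return fga + 0.44 * fta + tov

def usage_pct(player, team_on_court):
    """Share of the team's on-court possessions that ended with this player."""
    return 100 * possessions_used(**player) / possessions_used(**team_on_court)

player = {"fga": 200, "fta": 60, "tov": 20}           # placeholder totals
team_on_court = {"fga": 420, "fta": 130, "tov": 70}   # placeholder totals
print(round(usage_pct(player, team_on_court), 1))     # roughly the mid-40s
```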

The Cavaliers’ team game plan depended heavily on James to create scoring opportunities for himself and his teammates. Why? Because his teammates simply could not do so with All-Stars Kyrie Irving and Kevin Love on the bench. Which brings us to the second point.

LeBron James’ teammates got the benefit of playing with him, while he did not. The Cavaliers totaled 95 assists as a team during the NBA Finals, but James had over half of those—53 assists over the six Finals games compared to 42 by his teammates. That means that James assisted on about 45% of all the shots his teammates made.

(The team rate was about 48%. But remember, his teammates were passing to James AND to each other—LeBron James, as good as he is, cannot assist himself. Also, James turned the ball over only 21 times over the course of the series, compared to his teammates’ 52 times. And while his teammates got to pass to James and to other teammates left more open by the defensive attention James received, they still threw the ball to the defense more and came up with fewer assists, even with a combined usage rate 7% higher than James’.)

James is arguably the best passer in the NBA today (I would argue he is among the best in NBA history), and that makes the game easier for everyone around him. In addition, since the Warriors knew James would be the focal point of the Cavaliers’ offensive game plan (again, a 46.7% Usg%), he typically drew much more defensive attention than his teammates—the Warriors’ best defenders, double teams, etc. Not only did James get his teammates more easy, assisted baskets than they got him and each other, but his presence on the court made the game much easier for them than it was for him.

All that to say, LeBron James just completed perhaps the greatest NBA Finals performance in history. Against one of the best teams of all time, playing with one of the worst supporting casts of the last 30 years, James did everything he could. He played an incredible number of minutes each game. And it was not just his minutes that soared—he shouldered the burden of shooting and creating for teammates who, as mentioned, could not really do so themselves. And while some have criticized his efficiency, it made relative sense for James to shoot that much, and he was at least as efficient as the rest of his teammates combined.

We should sit back and appreciate what just happened. While James’ team lost the series four games to two, numbers suggest that without him the Cavaliers would have gotten swept in blowout fashion. No other player in the NBA could do what LeBron James just did, and perhaps no player in NBA history ever has.

LeBron James and the Cavaliers may have lost the Finals, but by getting to watch that performance the fans won.

Guest Post for University of Washington Press

The University of Washington Press recently asked me to do a guest blog post Q&A to promote their new book Proving Grounds: Militarized Landscapes, Weapons Testing, and the Environmental Impact of U.S. Bases, edited by Edwin Martini.

The project's UW Press editor, Ranjit Arab, introduced the blog post as follows:

"The essays in Proving Grounds: Militarized Landscapes, Weapons Testing, and the Environmental Impact of U.S. Bases give us the most comprehensive examination to date of the environmental footprint of U.S. military bases both at home and abroad. Though critical of the military’s presence across the globe, the book does point to a few examples where the armed forces were actually ahead of the curve—at least compared to the private sector—in terms of self-regulation. Still, the majority of cases in Proving Grounds look at the damaging consequences—both intended and unintended—of building bases and testing weapons, from wiping out indigenous plant and wildlife to the contamination resulting from the disposal of Agent Orange after the Vietnam War.

In Chapter 2, historian Neil Oatsvall looks at how deeply policymakers engaged with environmental science at the dawn of the nuclear testing era. Contrary to popular belief, he finds, U.S. leaders actually did take scientific considerations seriously as they tried to take a lead in the burgeoning nuclear arms race. However, though their intentions may have been well-meant, given the limits of their environmental knowledge at the time, they were clearly in over their heads. We asked Neil to elaborate on this contradiction."

Read the whole post here.

When Academic Free Speech Goes Wrong: The Jerry Hough Edition

Duke University chaired professor of political science Jerry Hough recently created discord with his “controversial” remarks about race in the comments section of a New York Times editorial. Responding to that editorial, “How Racism Doomed Baltimore,” Hough attempted to draw a distinction between different ethnicities and explain why some groups were rioting in Baltimore and not others.

In doing so, Hough demonstrated how NOT to use academic free speech. By labeling himself a professor at an elite university, Hough attempted to use his academic standing to lend his ideas a sort of gravitas that was completely undeserved. When speaking as an “expert,” you simply cannot let personal opinion masquerade as well-reasoned, well-argued, and well-supported ideas.

In a six-paragraph mess, Hough claimed that Asian Americans were not rioting as a group in Baltimore because “they didn’t feel sorry for themselves” when confronting racism “at least as [bad]” as what black citizens had faced, and instead “worked doubly hard.” He then made what he thought was a lucid point about how, in his estimation, “Every Asian student has a very simple old American first name that symbolizes their desire for integration. Virtually every black has a strange new name that symbolizes their lack of desire for integration.” To top it all off, when questioned about his noxious string of garbage, Hough drew back his bowstring for one more zinger: “In writing me, no one has said I was wrong, just racist.”

What bullshit, Professor Hough.

I am telling you that you are wrong (as I expect many other people have, had you possessed the sense to listen). Your comments simply do not measure up in the face of history, this country’s demographics, or common sense.

First off, the assertion that “Asians were discriminated against at least as badly as blacks” cannot be supported by U.S. history. Asian Americans certainly have faced considerable racism in the United States. The Chinese Exclusion Act of 1882 did not have a spontaneous genesis, but came from decades of anti-Chinese sentiment and action. And it is no surprise that only Japanese-Americans were interned in large numbers during World War II and not Italian- or German-Americans (can you imagine anyone interning Joe DiMaggio?).

Even considering this considerable racism (just look at a sampling of the visual culture from WWII), anti-Asian sentiment in this country pales in comparison to the triangular slave trade. The forced migration and chattel slavery of millions of blacks is one of the most horrific mass acts human beings have ever perpetrated on each other. Add to it the blatant institutional racism this country maintained for many decades, such as Jim Crow laws, and the claim that “Asians were discriminated against at least as badly as blacks” is simply wrong.

Second, the idea of names marking a desire for “integration” makes no logical sense. Would Hough argue that President Barack Obama and his family never tried to integrate because of his name? Moreover, Hough assumes that a normative U.S. culture can only exist as a white, Euro-American culture. What he really implies with his use of “integration,” deriding “virtually every black” for “a strange new name,” is that only by choosing European-derived names can non-whites ever truly be real Americans.

Moreover, though I have no hard evidence (and neither does he), I suspect that his characterization of names by ethnicity falls flat across the entire population. Perhaps his assertion holds true within the very small, typically wealthy subset of the population that is Duke University students—I would not know, because as a UNC alumnus I try to avoid that campus in Durham. But “every Asian student” in the country, or even a large majority? That seems very unlikely. (And he clearly has not been to a toddler playgroup lately—lots of white people name their kids strange things too, as I know as a stay-at-home parent.)

Finally, he makes demographic claims without a hint of evidence. He asserts that an “enormous” number of interracial relationships exist between Asian Americans and whites while concurrently arguing that “black-white dating is almost non-existent.” If that were true (it is not), we could not be surprised given our country’s history—Martha Hodes’s White Women, Black Men is an excellent work on the subject. However, Hough presents no evidence for his claim. That is because he is flat wrong.

The reality is that while white-Asian marriages are more common than white-black marriages (about 8% of all U.S. marriages are interracial, with white-Hispanic being the most common interracial pairing in the nation), the numbers are relatively close. For example, in 2010, of the more than a quarter-million new interracial marriages, about 15 percent were white-Asian while about 12 percent were white-black. That difference is statistically significant, but it is not the gaping chasm that Hough suggests.

It is the height of irresponsibility to leverage your academic career into the authority to champion unsupportable ideas. This is exactly what Jerry Hough did with his comments. Do I think his ideas are racist? Yes. But that is not the point I am trying to make here.

What I am saying is that if we are going to value academic free speech and defend its merit in a public and higher-ed culture that increasingly devalues academic freedom, then we need to practice responsible free speech with sound arguments, clear logic, and good evidence. Jerry Hough did none of these things, and that is a significant problem.

I am not saying I have to agree with you, but you do have to make even the slightest amount of sense.

 

Addendum: My friend Allie Mullin read this blog post and said that Hough seems to want “to compare races in essentially an Oppression Olympics. From an academic standpoint that's the only quantification he makes, and it lacks factual grounding and ignorance of intersectionality." I agree with her.

In Defense of Lecture... Well, sort of

My former advisor, Matthew Booker, sent me an interesting blog post by Grant Wiggins about high school history teachers and lecturing in class. According to a survey conducted through the site, history teachers were more likely to lecture than teachers in any other discipline, with a majority lecturing for half the period or more (sometimes for the whole period).

Wiggins basically comes down against lecturing, saying that there are better ways to achieve our pedagogical goals as history teachers and that most of what we want to convey in lecture can be conveyed to students via printed materials. He even goes so far as to assert that he “can only see two good reasons for lecturing at length”:

1.     “You have done original research that isn’t written down in a book”

2.     “You have rich and interesting knowledge based on research that can overcome confusions and missing elements in the current course”

 I have mixed thoughts about lecturing. In class, there is no way to convey as much information as quickly as we can through lecture. For sure, students can and should get some (or perhaps all) of that information via their readings. But readings are not interactive. And it is hard for textbooks to model critical thinking and demonstrate how to analyze and use evidence to build historical interpretations—skills highly valued in our discipline.

My thinking on lecture changed dramatically during my last semester of TAing, for a professor in her final semester before retirement. Even in a 70-person lecture class she made it a point to draw students into discussion while lecturing. The mix kept students on their toes but also disseminated lots of information. Part of her success came from being very good at it, but the technique itself was great. Since then, whenever I lecture, I use this interactive approach.

My guess is that many educators find lecture a necessary evil. One defense I will make of high school history teachers lecturing is this: frequently those history courses have 30+ students, and when you have that many high school students in a room, a lot more of your time goes to classroom management than to teaching. If we (as a society) could do a better job of getting that number down to 15-20 (or fewer!) students, then teachers would have more flexibility to do collaborative activities and hold more interactive discussions.

Anyway, all that to say, straight lecture probably is not the best classroom practice, even if it sometimes seems necessary. However, mixing lecture with discussion, Socratic-style teaching, historical role-playing and games, etc. can be effective, in my opinion. Mixed methods help keep students interested and can cater to different learning styles. And, to be honest, it helps if you are good at it. We have all sat in classes with good lecturers whom we wanted to hear talk and in classes with bad lecturers where we probably nodded off.

When lecturing becomes less about conveying information and more about involving students in the creative aspects of the historical discipline, then it is worth including in our classrooms. Please leave supporting or dissenting opinions in the comments! I am very interested in hearing what others have to say.

Whose Scandal?: The UNC scandal and Bradley Bethel

I was disappointed by a recent editorial in the Daily Tar Heel, the award-winning student newspaper of my alma mater, the University of North Carolina. The editorial concerned Bradley Bethel, a former UNC learning specialist who worked to support athletes academically, and his work putting the university’s recent academic-athletic scandal in what he feels is proper context.

A quick note on biases: I am extremely proud of my undergraduate degree from UNC, I worked very hard for that degree (and took none of the aberrant classes), I still am an avid supporter of UNC athletics (all sports, not just the revenue ones), and Bethel and I sometimes communicate on Twitter (we have never communicated outside of that medium).

All that out of the way, the editorial unfairly maligns Bethel’s previous work and his current film project, “Unverified”. That film seeks to challenge what Bethel rightly perceives as a sensationalist media narrative. In the introductory video on the film’s Kickstarter page, Bethel says of the recent UNC scandal, “Now, the true story is not entirely pretty. Some of the facts will be embarrassing for the university. But it is a story much different than the media’s sensationalized narrative.”

Contrast that to the Daily Tar Heel’s editorial, which begins, “A film dedicated to proving that UNC’s athletic-academic scandal was imagined by headline-hungry journalists is difficult to take seriously.” The editorial goes on to call Bethel’s film “delusional” and an “embarrassment.”

These comments are patently unfair.

Much of Bethel’s point of view comes from challenging claims made by Mary Willingham, a former reading specialist at UNC. Many have even called Willingham a “whistleblower” in the UNC scandal even though, as far as I know, she meets none of the criteria for one. (She did not identify the academic misconduct—there had already been several investigations into the misdeeds by the time she became a national name.)

Willingham drew great public attention for releasing a study of 183 UNC athletes that supposedly demonstrated 60% of them read between fourth- and eighth-grade levels, with perhaps as much as a tenth of those athletes reading below a third-grade level.

Bethel has rightly challenged Willingham’s methodology and conclusions. Three external reviewers came to similar conclusions that discredited Willingham’s findings. It later came to light that Willingham had likely plagiarized significant chunks of her MA thesis, which further undermines her academic credibility. Willingham also seems to have violated FERPA.

For his trouble, Bethel has had his mental health questioned by Willingham’s co-author (for their book Cheated), Jay Smith, distinguished professor of history at UNC. Smith is a truly excellent historian (check out his impressive CV here), but writing the provost to express concerns about Bethel’s mental health is, frankly, ghastly.

All that to say, even though prominent media members have used Willingham as a source without questioning the veracity of her findings, she is far from an ideal resource. (Those media members especially include CNN’s Sara Ganim, Pulitzer Prize winner for her work on the Jerry Sandusky scandal, and the Raleigh News & Observer’s Dan Kane.)

And I have yet to see an evidence-based refutation of Bethel’s larger points: Willingham and Smith erred in their claims about athlete literacy at UNC, most media members accepted those claims without scrutiny and sensationalized them, and the UNC academic-athletic scandal as a whole has been largely misunderstood because of that. If you contend Bethel is so embarrassing, please first point out how he is wrong.

I am not trying to say that nothing bad happened at UNC—far from it. Though I love my alma mater dearly, I have been deeply, tremendously embarrassed by the events that took place. It is clear that significant academic improprieties occurred. That misconduct was, to be as charitable as possible, characterized at the very least by institutional capture by athletics personnel (though it should give pause that athletes accounted for less than half of the aberrant enrollments in the fraudulent classes). Like Bethel, I am not trying to deny that substantial wrongdoing occurred.

I do, however, think that the Daily Tar Heel editorial is the most recent unfair attempt to disparage Bethel. At times I disagree with Bethel (who wouldn’t?), but I have found his work to be meticulously researched and argued. He probably goes overboard sometimes. He is not, however, delusional or an embarrassment.

At the end of the movie The Dark Knight, Jim Gordon says of Batman, “He’s the hero Gotham deserves, but not the one it needs right now.” I don’t know if any of that applies to Bethel—hero, as needed or deserved—but I do know that bringing dissenting facts to light in a respectful fashion is always needed. That’s what Bethel has done and intends to do with his film.

Respectfully providing evidence-based refutations of widely held beliefs is at the heart of academic discourse. Who would be embarrassed by that?

Five Reasons Not to Embargo

To embargo your dissertation or not to embargo? The question has been debated within the historical discipline. Recently I had a conversation on Twitter with Michael D. Hattem (@MichaelHattem) on the subject, particularly over his guest AHA blog post. On Twitter he made the perfectly reasonable assertion:

With that in mind, I have written this blog post with a few quick reasons why I think graduate students in history should consider NOT embargoing their dissertations. I do not expect this to end the discussion at all, but I hope to help provide a counterpoint to a strong narrative that asserts any grad student who cares about an academic future should embargo.

(1) Other scholars have a harder time finding and reading embargoed work

My own dissertation is available open access through the University of Kansas ScholarWorks site. (Go read it!) Since I finished it in May 2013, dozens of people have downloaded my dissertation, including, as of this post, eighteen views from seven different countries outside the United States. I do not know eighteen people outside the U.S. who might be interested in my work! Perhaps those folks would have emailed me to ask for a copy of my dissertation, but I am doubtful of that.

Another quick story: I am friendly acquaintances with a prominent scholar in my field. Out of the blue that scholar sent me an email last year asking for more information about a source I cited in my dissertation. Would that scholar have emailed me to ask for my dissertation? Perhaps—it is definitely possible. But the ease of getting a copy of my work made it more likely that the scholar would read it. Put simply, embargoing your dissertation means that fewer people will read it.

(2) Embargoing creates a culture of fear

Every graduate student I have talked to who has said they are embargoing is doing so because they fear that not doing so will somehow make their dissertation unpublishable and thus hurt their job and career opportunities. The fear is palpable in their comments, which are thick with worry at such a difficult and uncertain moment to be seeking employment in our profession.

Graduate school, at times, seems designed to psychically damage young, bright, hardworking people. This discussion plays into that by helping to convince new PhDs that they have not found a job because they are not working hard enough, not publishing enough, not doing something they should be doing, etc. For many fields this is nonsense—as has been stated time and again, there are simply too many applicants for too few positions. This means that many good candidates will not get any position, let alone the dreamed-of tenure-track job. Do not let fear convince you that your difficulty on the job market is purely because of some step you are not taking.

(3) I have real doubts that most editors care

I have never heard of a press or series editor who cared whether a potential author embargoed or not. More than that, I have never heard of one asking me or anyone else, in any context, whether we did. In this discussion I have heard from a number of academics at all stages of their careers that “editors care.” Who are these editors? Why do they care? Because…

(4) No matter the embargo length, it is not forever

This means that any potential book would come out after, or only shortly before, the embargo ended. Most embargoes run one to three years. It is nigh impossible to get a first book out within three years’ time, as the amount of work involved in revising a dissertation (intended to demonstrate to a committee that you are ready to join the profession) into an academic monograph (intended to make a scholarly contribution to your field) is substantial. Moreover, those substantial revisions mean that when the embargo does end, the book will be something very different from what the dissertation ever was. As an example, one friend is starting completely from scratch with the book manuscript, using much of the dissertation’s research but adding to it so substantively with new research and arguments that merely revising would have made a total mess of things.

Even the AHA’s recommendation of six years means that, presuming someone was fortunate enough to secure a tenure-track job immediately out of graduate school and had to have their first monograph out before tenure review in five years, the dissertation embargo would end within a year or two of the book’s printing anyway.

In the historical discipline the embargo simply does not hold the work out of the public realm long enough to make much of a difference. In other disciplines the calculus can be completely different. In the natural sciences, for example, patents and their application processes are frequently involved, and those researchers have very different concerns than a history PhD looking to publish a monograph. Since the embargo will likely end before the book is published, the supposed gains from embargoing seem moot.

(5) Embargoing wastes time

I put this reason last, but I do not think it is entirely inconsequential. Embargoing may only take a few hours of your time to get the appropriate signatures and turn in the correct forms, but why spend that time if you do not have to do so? Spend that time instead reading another book in the historiography, revising an article, or even preparing your book proposal for a press.

Or… spend that time doing something entirely unrelated to work. Graduate school frequently seems to make people feel guilty about not working every minute of their lives. That feeling does not end with PhD conferral, either. Instead of dealing with the embargo, read for pleasure, watch a movie, or have dinner with friends and family. Start practicing now what it will be like to be a professional who balances a career and a personal life, because hopefully that is what you will be soon.

At the end of his post, Hattem notes, “conflicting anecdotal evidence and a lack of metrics exacerbate the problem and call for caution and individual choice.” Absolutely fair in some ways—all I have presented above is anecdotal.

But I would assert this: cowering before fear of the unknown is no way to live your life. Caution is one thing, but there is a difference between reasonable caution, like looking both ways before you cross the street, and unreasonable caution, such as an unwillingness to walk on street grates for fear that you might fall in (one of my wife’s phobias). If you are going to embargo, make sure you are doing it for concrete reasons that directly affect you and not merely because of some career or job market bogeyman you fear.

My Rules for Job Market Sanity

Like a lot of people, I’ve been thinking about the job market a lot lately. Really, I have been thinking about it a lot for the last three years. To that end, I’ve come up with a list of rules that I try to follow to keep myself sane. These may not be for everyone, but they have been a way for me to cope with the big bundle of rejection that is the academic job market.

1) Know your profile

This one is a little more complicated than it will sound, but here goes: You need to know what sort of institution will be most interested in you as a candidate. No matter your dreams of working at an elite institution, if you have no published works to your name, you are extremely unlikely to get an Ivy League gig (even if you went to an excellent, Ivy League-caliber school). No matter your dreams of working at a small liberal arts college with lots of interaction with students, if you have never taught a single course (or have very limited teaching experience), you’re extremely unlikely to get a job at a school that truly values teaching.

These are not happy things to think about, but they are part of being realistic about who you are as a candidate and what the job market is like. Market yourself appropriately.

The flip side of this is that you have to recognize when you are a good candidate. Having a good profile and good application materials, very sadly, does not mean you will get a job (or even interviews!). Do your best not to get discouraged and keep working hard. This is the much harder part—knowing that you can be a good candidate and still not land a position. It really is true that there are too many good candidates and too few jobs.

2) Be honest about how hard you’re willing to work on the job search process

If you’re willing to spend the time, you can probably apply to dozens of jobs (or more). You can write a new cover letter for each position and finely tailor each document to the position. One friend on a search committee told me about teaching philosophy statements so carefully tailored to the school that the candidates had researched not only all the courses on the books but also which courses had been taught recently and by whom. Those candidates then laid out a detailed plan for how their courses would fit into the department. That took a lot of time and effort!

To be blunt, I am not willing to work that hard for almost any application. Everyone has to decide what makes sense for them.

The flip side of this, of course, is that no matter how hard you work, it may not result in you getting a position. Past a certain point, hard work is not the determining factor. Putting 10+ hours into an application does not mean you will get an interview.

3) Know that you probably do not understand the dynamics of a search

The more I deal with search committees (having been on both sides of the process), the more convinced I am that candidates have little sense of what is actually going on with a search committee. Even being part of a search committee and in the room during deliberations is not always enough to have a full sense of everything that is going on.

Without revealing too many details: I was crushed not to get an interview for a particular job because I was very familiar with the institution, department, and faculty. I thought I had tailored my application materials perfectly (I was willing to put in the work for this particular job). It turns out, as a faculty member told me later, that the department and institution had decided to go in a direction that was totally unexpected to me (and to my friend on the faculty, who was tangentially involved in the search), and I never could have fit what they actually wanted.

No matter what the job ad says or what you think you know about the department, faculty, and institution, you do not fully understand the search. Just accept it.

4) Know that the job market is neither 100% merit-based nor completely random

Someone much wiser than me told me this, and I have found it to be so very true. Some of this is related to no. 3 above, some of it is this very (VERY) nebulous idea of “fit,” and some of it is the mere fact that different people value different things out of a colleague. If you are a good candidate and give it enough time you will get opportunities and, if you have a little luck, get a position.

5) Be happy, truly happy, when your friends get interviews and job offers

This is one that I said out loud my first year on the job market but did not mean until halfway through my second year. If you are lucky you will have talented friends who are also on the job market with you, and hopefully they will get interviews and job offers. You will lose out on many, many positions to people you have never met and may never meet. Wouldn’t you be happier if one of your friends got that job offer instead of someone you do not know? It is a tough idea to accept in your heart, but accepting the logic of it is the first step.

6) Finally, know that not getting a position does NOT mean you are a bad scholar or teacher

We live in a rough time to be on the academic job market. As I said in no. 1, even good candidates can miss out on interviews and job opportunities. One of my friends has a good teaching record, multiple publications (one forthcoming in the top journal in his/her field), and multiple prestigious fellowships (including a Fulbright). This friend has gotten no interviews to my knowledge, and it completely baffles me. I would think this person would be one of the first snatched off the market, but it has not happened.

Just because one year (or even two) on the market does not produce a job offer, it does not mean the future will be the same. Staying on the market may mean putting other career or even family opportunities on hold, and you will have to weigh what potentially landing an academic job in the future is worth to you. But you cannot let a few (or even a lot of) strikeouts get you down.

Remember, jobs are like spouses—you only need one to say yes.

We Still Need to Kill the Conference Interview

In my head I’m thinking about some variation of the Eagles’ “Hotel California” here (please kindly ignore that I buggered the number of syllables):

In the conference hotel lobby they are gathered for the feast,

They stab it with blog posts and tweets, but they just can’t kill the beast.

There are a lot of reasons why the conference interview needs to go, and I’m far from the first to write about it. In terms of justice, we’re asking the most financially vulnerable in our profession to shell out big bucks. It’s not just about the candidates, though. One of the most persuasive arguments I’ve read is David Perry’s: that the conference interview format is a financially poor decision for colleges and universities. He has some great links in there to other perspectives on why the conference interview should end.

I’ll be honest that I don’t really have much new to add other than this: a colleague of mine recently received an invitation to interview at the American Historical Association (AHA) next month. The pure joy of receiving an interview invitation quickly gave way to dread when my colleague realized the interview was being held at the AHA job center, necessitating conference registration. The colleague holds a non-tenure-track teaching position, and thus falls into the “employed” registration category of $220. My colleague was not planning on registering before (just go do the interview and leave), but now must find the money for this extra expense.

I initially wrote a much longer post about the financials of attending a conference, but I deleted most of it to focus on the question of registering for a conference where you have an interview. Some may say that you should register anyway because that’s what you do. But unless you’re really excited about lots of particular panels (which doesn’t always happen at the big conferences), what’s the point? To get into the exhibit hall? To pay extra to go on a field trip or to attend a luncheon?

Perhaps some readers have a dissenting viewpoint, and I’d like to hear it if they do. I’m willing to be swayed. In my current thinking, however, it seems that having the interview at the AHA’s job center instead of a hotel suite is just transferring some of the cost from the interviewing institution to the interviewees. And that stinks.

Conference interviews really don’t have much to do with the attendant conferences, and taking those interviews is expensive enough already. Why would adding an extra cost ever be a good idea?

How far does public service go?

Whenever I see a New Yorker article by Jill Lepore, I know it’s a piece I want to read. She’s a Bancroft Prize-winning distinguished professor at Harvard for a reason.

Her latest, “The Great Paper Caper,” is a fascinating chronicle of U.S. Supreme Court Justice Felix Frankfurter (in office 1939-62) that not only recounts his time on the bench but, more importantly, questions what should happen to the papers of Supreme Court justices after they retire. As alluded to in the title, a great chunk of Frankfurter’s papers was stolen from the Library of Congress.

This incredible wrongdoing is compounded by the fact that, unlike other public servants working at public institutions, Supreme Court justices are not compelled to release their official papers to the public. Lepore notes that the Federal Records Act (1950) excludes the Supreme Court, and subsequent additions, like the Presidential Records Act (1978), have not changed that. This means that access to the complete Frankfurter papers has been (and still is) limited, and any lost documents cannot even remotely be replaced for the public.

The “great paper caper,” then, is not only the theft of Frankfurter’s papers from the Library of Congress, but also the potential “theft” from the public by justices who would not release their unedited papers without legal compulsion.

Before delving into the “point” of this blog post, I want to say that there are few things more reprehensible in the scholarly world than stealing documents from an archive. The act is on par with or worse than plagiarism, data falsification, and other research fraud or academic misconduct. I actually got a little sick to my stomach reading the article’s opening vignette about the stolen papers.

After my initial revulsion subsided, I started thinking about the wisdom of allowing Supreme Court justices to decide the manner in which their papers are released to the public or even if those ever will be. Lepore does an admirable job presenting an even-handed account of why the current system may be preferable to one where federal law mandates the release of judicial papers, but I remain entirely unconvinced.

If we find it appropriate not only to release the full papers of every president, but even to go as far as recording every Oval Office conversation, what excuses do we truly have for the Supreme Court? As Richard Nixon showed us, when presidents are allowed to censor the historical record, regrettable things can happen. Lepore’s article gives a few instances where former justices have censored their papers in ways that are detrimental to the public welfare.

In the end, I think that if you’re holding an office as lofty as United States President or Supreme Court Justice, your public service does not end when you leave office. Part of the deal—part of what you owe the country—is a full accounting of your actions in the historical record. As much as is possible, your personal life should remain as personal as you want it to be. Your actions in an official capacity, however, are no longer yours—they belong to the nation and its peoples.

Obviously this opinion is influenced by my training and career as a professional historian. I owe a great debt to many governmental library holdings, without which my scholarship simply would have been impossible.

More than that, however, I would argue my belief is informed by being a civic-minded citizen. If we know anything about U.S. policymakers, it is that their deliberations are often complicated—their decisions are rarely as simple as they seem to the public at the time. We deserve, as members of a democratic republic, to have the capacity to hold our elected officials (and their appointees) to a full reckoning. That is impossible without access to the papers they create in their official capacities.

In addition, I believe that the best way to keep people trustworthy is to make sure that they have no occasion to be untrustworthy. Even great, honorable persons can be tempted to commit wrong when they know they can get away with it. The ability to heavily censor or withhold papers gives members of the Supreme Court that occasion. Perhaps this is an incredibly pessimistic view of justices’ moral fiber, but I believe it is a simple recognition of the human desire to escape punishment when escape can easily be achieved.

Some may argue that justices need to be free from the fear of recrimination so that they can render proper, constitutional verdicts. But if they are scared of the deliberations behind their decisions being made public, shouldn’t they also be afraid of making those sorts of decisions? Justices owe the release of their papers to the public. Perhaps more importantly they owe it to themselves so that they can fulfill their charges at the highest level.

Whether you agree or disagree, feel free to leave a comment.

Farming as a trend

Almost a year ago The New Yorker started appearing in my mailbox. To this day I have no clue who purchased the subscription for me, but I’ve become a bit addicted to the weekly offering. An article by Alec Wilkinson on the new magazine Modern Farmer caught my eye. I will admit, I have never read an issue of the award-winning Modern Farmer, so what follows is more a rumination on Wilkinson’s piece than anything else.

Like many New Yorker pieces it is as much a character study as anything else, this one of Ann Marie Gardner, Modern Farmer’s founder and editor. The article adroitly compares Gardner to the magazine. One person reviewing Modern Farmer said, “I wonder who the ideal reader is. My assumption is that it’s people who will never farm.” Another remarked, “There was not anything actually written by a farmer.” Both descriptions fit Gardner.

Later the reader is presented with a story of Gardner buying chickens for dinner. The local farmer selling her the chickens slaughters the birds on the spot for Gardner, causing her anguish. At the end, Wilkinson writes, “Sniffling, she wrote a check for $84.93, and took the chickens, which I had to carry, because when she touched them she discovered that they were still warm.”

I’m not going to pretend that I grew up on a farm (because I did not), but I did grow up with family friends who were farmers. I remember quite fondly what a barn full of curing tobacco smells like, but I never spent my summers picking it like my mother or grew up on a farm full of it like her mother.

I do know, however, that at its core farming is about killing some beings so that other beings can live. This is obvious when eating meat—children’s author E.B. White of Charlotte’s Web fame once called hog slaughter first-degree murder while simultaneously acknowledging how delicious bacon tastes. The same can be true for plants, however. Wheat cannot be eaten while it is still alive, nor can many other crops. We know this on a visceral level, but why are so many of us still so squeamish when we are reminded that our dinner used to breathe and eat just as we do? (My wife absolutely refuses to handle raw meat.)

Do I have a larger point that Wilkinson’s piece does not make? I am not sure. But I do know that farming is trendy right now. Organic produce is all the rage, and so is eating local, slow food, etc. These are not bad things (even if “organic” becomes commoditized like so much else in this country). Cooking reality shows are too numerous to count these days (I happen to be a big fan of Top Chef). Yet if we, as a nation, increasingly care so much about our food, why is historian Matthew Booker researching why many people have “lost faith” in food? Perhaps this “foodie” trend is just that—a trend.

Perhaps we should be less interested in the idea of farming and more interested in what farming actually is. Farming is death. Farming is tedium. Farming is a business. Farming is being confronted with tough choices. It is also many other things, and none of them are inherently bad. Knowing that, it is more than a little strange that the editor and founder of a magazine titled Modern Farmer would go out of her way to get fresh, local chickens for a dinner party and then get weepy because her dinner was slaughtered while she waited (FYI I recognize the gendered element of this portrayal).

Agriculture has become the latest intersection of culture and environment where a great many people in our society feel that they have a stake or expertise (or both). What wilderness was a century ago, agriculture is today. That is not necessarily a bad thing. I just wonder if we are concerned about our cultural interactions with agroecosystems because doing so is trendy or because we truly appreciate what agriculture is and the impact it has on the world and our lives. Maybe it does not matter either way.

We all have to eat, you know.

Interstellar as anti-environmentalist trope?

**Spoiler alert: This post will contain detailed information about the movie Interstellar. If you don’t want to read about that, please stop reading here.**

I went to see Christopher Nolan’s newest film Interstellar this past week, and one of the first things I did when I got home was to Google whether Nolan is a climate change denier. I’ll explain in this post why the movie made me ask that question and why I think it could be interpreted as an anti-environmentalist, climate-change-denying film.

Don’t get me wrong—I liked the movie a lot. If you want to see an engaging film with beautiful special effects, this is your movie. The use of theoretical astrophysics is fascinating, and at its heart the film causes its audience to question what the value of the human species is. Not bad, even if it takes us almost three hours to get there. Nolan (who directed the film and shares writing credit with his brother) has produced something worth watching.

For all its merits, however, I left Interstellar feeling a bit unsettled. We’re confronted right away with a world in crisis in a distant, but not too distant, future. The world is gripped by a modern Dust Bowl (the film even uses interviews from Ken Burns’ 2012 documentary “The Dust Bowl”).

But how did the Earth get like that?

This was one of my first clues. I was shocked that there isn’t a single line that might hint at how the planet got to such a state, even though there are other lines of dialogue and context clues that help us understand the current situation. First off, it’s farfetched that the whole planet would be gripped by another Dust Bowl. But more than that, you don’t want to tell us it was anthropogenic climate change? Or nuclear war? Or SOMETHING? Instead we’re presented with a “dying Earth” model (in some ways fitting the Gaia hypothesis). The planet is dying just because it is. It’s not necessarily (as far as the viewer knows) because humans did anything wrong.

Color me skeptical.

Well at least humans are trying to fix the planet and that’s a good thing, right?

Wrong. At least in the movie’s worldview. Our ruggedly handsome protagonist, Cooper (played by Matthew McConaughey), lets us know early on that the folks trying to take care of the planet are anti-science and don’t have humanity’s best interests at heart.

Cooper’s daughter gets suspended from school because she brought in one of her father’s old textbooks that showed the moon landing. Her teacher then explains very matter-of-factly that everyone NOW knows that the moon landing was faked to bankrupt the Soviets during the Cold War. Teaching anything other than the new standards is hurtful. The next generation needs to keep its eyes on the ground, not the sky.

Pfft.

Cooper’s daughter Murph (played by three different actors), in many ways the real hero of the movie, rebels against her teachers and studies physics, eventually saving the world. We’re to believe that her calling to study physics and Cooper’s desire to be a pilot again are somehow better and more important than farming.

Don’t get me wrong—there’s absolutely nothing wrong with wanting to be a pilot or study physics. Of course there’s nothing wrong with wanting to farm and take care of the Earth either (in real life, not in Interstellar).

"We're explorers, pioneers; not caretakers."

The textbook incident at school causes Cooper to intone the above line in his honey-thick Southern drawl. It is by far the most powerful line in the movie. It’s also a bit nonsensical.

Cooper presents a false dichotomy. Why can’t people be both pioneers and caretakers? Why can’t some people be pioneers and some caretakers? And what’s wrong with taking care of things?

Perhaps this is some latent (or not so latent) sexism. The movie is overwhelmingly white and male (I disagree with Neil deGrasse Tyson here). Or perhaps Nolan means to vilify an environmentalist or conservationist mindset that sees protecting the Earth above all else as worthwhile and valuable.

The caretakers are also anti-science. In addition to the moon landing chicanery above, we are told that the Lazarus project has to be ultra-secret because if normal people (presumably the caretakers) found out about it, they would be outraged that money was being spent on space exploration (a stand-in for science in general). Agronomy goes more or less unmentioned in the movie, and in the one scene where it is depicted, we are led to believe that the discipline is impotent, unable to counteract the blight in any way. For a movie that glorifies science so much, why can’t agricultural science also be lauded?

Climatologists are the Villains?

When Matt Damon makes his appearance about two-thirds of the way through the film, it is eventually revealed that he is a selfish villain who, though he proclaims to care about the survival of the species, in the end only cares about his own survival. What is his character’s name? Dr. Mann.

While some may question whether Mann is meant to stand in for humanity (“man” or “mankind”), I immediately wondered whether this was an allusion to Dr. Michael Mann, perhaps the world’s most famous climatologist. Mann is best known for his work developing the “hockey stick” curve, likely the most recognizable image associated with anthropogenic climate change.

Could it be a complete coincidence that the biggest threat to the mission to save humanity has the same name as the most influential climatologist? Very possibly. But it would be a heck of a coincidence. I think the name is intentional. Nolan sees Michael Mann as a villain who is hurting humanity’s chances at survival and portrays him as a lunatic. At least he gets played by a good looking actor.

Technocratic worldview

The biggest piece of evidence that the movie works as an anti-environmentalist trope is the technocratic, techno-scientific worldview it espouses. We are led to believe that the only people who can save the species are the scientists (not normal people changing their lifestyles, not political agreements, not taking better care of the planet, etc.).

Moreover, the lampooned “caretakers” are not terribly intelligent. Are we really to believe that some mysterious “blight” has destroyed everything but corn? And only one variety of corn at that? Perhaps this is part of an anti-Monsanto trope. The rest of the movie, however, suggests that the more likely culprit is ineffective management by those trying to take care of the Earth. The caretakers aren’t just misguided; they’re idiots.

While robotics technology has advanced enough to create sentient beings, agricultural science and technology seem to have regressed to the methods available at the end of the twentieth century. If agricultural production were as desperate as the film implies, it seems most plausible that research and development resources would have been poured into food production and soil conservation. That did not happen, however, for some unknown reason.

On the other hand, the physicists are the ones who can really save the planet. By the way, I think it’s no coincidence that the science PhDs who have most vehemently argued against anthropogenic climate change (and used their credentials as “scientists” to do so) have been those with PhDs in physics. Hmm.

I don’t have a problem with showing physicists as heroes. The movie does a marvelous job of showing how interesting and, frankly, cool astronomy and astrophysics are. Kudos. But presenting physicists as the heroic foil to the antagonistic “caretakers” is, at best, problematic.

Conclusion

In the end, Interstellar’s message is simple: it would be really bad if humans destroyed the Earth so thoroughly that the entire planet turned into another Dust Bowl, but if that happens, scientists will help lead the way and find a solution. That solution may even be ditching the planet for something better!

Could all of this be coincidental? Perhaps… but with so many data points that seems unlikely. We could quibble about most of these points, but taken as a collective they are difficult to dismiss. Nolan seems interested in telling a tale in which those interested in taking care of the Earth are holding back the species and in which science and technology can overcome any environmental problem. The film even goes far enough to make me question whether Nolan is a climate change denier.

Ultimately, Interstellar is a wildly entertaining movie. I just wish its message were equally thrilling.

UPDATE (11 Nov 2014): Neil deGrasse Tyson and I completely agree on this: fixing the Earth seems much easier than going through a wormhole to find a new planet.