What about the performance of the media later in the Vietnam War? Is it true, as claimed in a persistent mythology embraced by some military officers and others, that the news media, especially television, turned American opinion against the war?
Television was actually still a relatively new medium in the late 1960s, though TV sets were ubiquitous: there were 10,000 television sets in the United States in 1941, 10 million by the time of the Korean War, and 100 million at the peak of the Vietnam War.46 So Vietnam was essentially America’s first “living room war,” as many media analysts have noted. But the implications of that fact for public opinion are not clear, particularly in the context of mounting casualties, crushing budgetary costs, and years of stalemate. In fact, one study found that of roughly 2,300 stories from South Vietnam that were televised on evening news programs, only seventy-six “showed anything approaching true violence—heavy fighting, incoming small arms and artillery fire, [soldiers who had been] killed and wounded within view.”47
According to John Mueller, author of War, Presidents and Public Opinion, there are no polling data to support the notion that “largely uncensored day-by-day television coverage of the war and its brutalities made a profound impression on public attitudes.”48 There is, however, a direct polling-data relationship between casualties and public opinion: support for the Vietnam War fell fifteen points every time casualties increased by a factor of ten.49
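To make the arithmetic of that relationship concrete, here is a minimal back-of-the-envelope sketch; the log-linear functional form and the baseline values S_0 (initial support) and C_0 (initial casualty count) are illustrative assumptions derived from the “fifteen points per tenfold increase” finding, not Mueller’s published model:

```latex
% Hedged illustration: public support S as a function of casualties C,
% assuming a log-linear relationship of roughly fifteen points lost
% per tenfold increase in casualties (S_0 and C_0 are assumed baselines).
S(C) \approx S_0 - 15\,\log_{10}\!\left(\frac{C}{C_0}\right)
```

Under that assumption, casualties rising from 1,000 to 100,000 (two factors of ten) would predict a drop of roughly thirty points in public support.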
Humans, of course, have a remarkable capacity for self-delusion. Those in power regard the control of information as utterly essential to achieving success, regardless of subject or policy, of administration or even country. The lessons of the Vietnam War for cold-blooded, pragmatic wielders of national power were not that excessive government secrecy was wrong. Or that waging something as gravely consequential as war without the informed consent of the governed is wrong, indeed immoral. Or that the political control of information in times of war, including misstating or overstating the facts, corrupts the government’s internal decision-making ability and circumvents the central tenet of self-government of the people, by the people, and for the people.
In fact, serious power practitioners believe such sensibilities are quaint and naïve do-gooder sentiment, to be disregarded. Realpolitik, they contend, requires the rigorous, sometimes ruthless control of information, and the longer it takes for the public to learn the truth, the better. Strict discipline and careful execution are absolutely essential, the goal being to severely limit internal and external access to information, whether it’s documents and calendars or memoranda, phone logs, and e-mails. This is rarely discussed publicly, but when salient information about significant government decisions disappears inexplicably or is simply never made available, perhaps for decades, the practice connotes a contemptuous disregard for the public and the consent of the governed.
In 1917, Senator Hiram Johnson is famously reported to have said, “The first casualty when war comes is truth.” In an era when the lines between war and peace have long been blurred—first by a half-century-long Cold War between East and West, then by a seemingly endless “war on terror” launched in the aftermath of September 11, 2001—it’s no wonder that truth has become an increasingly rare commodity in our national discourse.
But the Founding Fathers understood this power dynamic nearly 150 years before the detonation of the first atomic bomb. In 1798, James Madison wrote to Thomas Jefferson, “Perhaps it is a universal truth that the loss of liberty at home is to be charged to provisions against danger, real or pretended, from abroad.”56
The problem is that when access to government information is so tightly controlled, with multiple clearances representing many levels of classification, by definition you’ve created carefully delineated layers of truth. In essence, you have formally “institutionalized lying,” as David Wise characterized this phenomenon in his 1972 book, The Politics of Lying. “Policy makers who consider it desirable to mask their decisions or their objectives, or who wish to mislead the public or withhold information, can do so as easily as reaching for the nearest rubber stamp. In short, lying and secrecy are two sides of the same coin.”59
Little did I know, when I visited the grave of Larry LaSalle at Arlington National Cemetery, that the shocks and traumas of the 1960s and 1970s were just beginning. In 1968, Martin Luther King Jr. and Robert F. Kennedy would be assassinated within two months of one another, cutting short the lives of two of the most inspiring progressive leaders of our time. In July 1969, Senator Edward M. Kennedy would crash his black Oldsmobile off a bridge in Massachusetts, causing the drowning death of campaign worker Mary Jo Kopechne and sabotaging his hopes for the White House. In 1972, another presidential candidate, race-baiting, segregationist Alabama governor George Wallace, was shot and paralyzed while campaigning in Maryland. In October 1973, Vice President Spiro Agnew resigned in disgrace over improprieties committed while he was the governor of Maryland. And ten months later, for the first and (so far) only time in US history, a president would resign from office because of his role in a political scandal: in Nixon’s case, the massive Watergate cover-up.
Later that day in federal court, the Pentagon Papers case came before Murray Gurfein, a newly appointed Nixon judge who had served in army intelligence during World War II. In literally his first case on the bench, Gurfein granted the US government a temporary restraining order, ruling that “any temporary harm that may result from not publishing during the pendency of the application for a preliminary injunction is far outweighed by the irreparable harm that could be done to the interests of the United States government if it should ultimately prevail” in the case.16
Never before had the US government obtained a court order barring a newspaper from publishing information. As Daniel Ellsberg described it decades later, the Nixon administration was “asking federal courts to violate or ignore the Constitution or in effect to abrogate the First Amendment. It was the boldest assertion during the cold war that ‘national security’ overrode the constitutional guarantees of the Bill of Rights.”17
On June 30, 1971, the Supreme Court rejected the government’s argument that prior restraint was justified and, by a vote of 6–3, upheld the First Amendment. Justice Potter Stewart wrote in a concurring opinion:
In the absence of the governmental checks and balances present in other areas of our national life, the only effective restraint upon executive policy and power in the areas of national defense and international affairs may lie in an enlightened citizenry—in an informed and critical public opinion which alone can here protect the values of democratic government. For this reason, it is perhaps here that a press that is alert, aware, and free most vitally serves the basic purpose of the First Amendment. For without an informed and free press there cannot be an enlightened people.39
The New York Times and the Washington Post immediately resumed publication, and the following spring the Times won the Pulitzer Prize for meritorious public service. But beyond the euphoria and affirmation of the role of independent reporting in our society, it was sobering how close to the precipice the press had come. As Ben Bradlee later reflected, “For the first time in the history of the American republic, newspapers had been restrained by the government from publishing a story—a black mark in the history of democracy . . . What the hell was going on in this country that this could happen?”40
Within minutes of the dismissal of the charges against Ellsberg, a frustrated Nixon told his former chief of staff, H. R. Haldeman, “that sonofabitching thief is made a national hero and is going to get off on a mistrial. And the New York Times gets a Pulitzer Prize for stealing documents. . . . What in the name of God have we come to?” (italics in the original).57 What we’d come to, of course, were the full throes of the Watergate scandal.
But a stubborn Nixon said that in his “frame of mind,” he was “driven to preserve the government’s ability to conduct foreign policy and to conduct it in the way that I felt would best bring peace. I believed that national security was involved. I still believe it today, and in the same circumstances I would act now as I did then.”58 In other words, the president wanted to preserve the national security of the United States as defined exclusively by himself, waging war in Laos and Cambodia with no cumbersome congressional oversight or public discussion, and acting in defiance of state and federal laws if necessary. As he told television interviewer David Frost in 1977, “When the President does it, that means it is not illegal.”59
With that kind of mindset, all else is incidental, including the parameters of secrecy about the government’s policies and activities. Thus, the truth itself as understood by the public is substantially defined by the commander in chief and the state, through his chosen representatives. And unbeknownst to the American people and most national reporters, from 1969 through early 1972, the Nixon White House was veering out of control, consumed by power, paranoia, and political ambition. Besides Ellsberg, for example, Howard Hunt and others were investigating muckraking journalist Jack Anderson, whose India-Pakistan reporting had also upset the president. The White House put the nationally syndicated columnist under surveillance, tapped his telephone, investigated his staff, and audited his taxes; at one point, Hunt apparently even hatched plans to assassinate him but fortunately did not proceed.60
The record shows that the only public official prosecuted for lying to Congress following Hersh’s revelatory Chile stories was former CIA director Helms. The famously “urbane and dashing spy-master” was represented by the celebrated defense lawyer Edward Bennett Williams—ironically, the same attorney whose telephoned advice to Ben Bradlee had stiffened the spine of the Washington Post and helped ensure the publication of the Pentagon Papers. Having worked out a deal with the Carter administration, Helms pleaded nolo contendere to two misdemeanor counts, for which he was sentenced to two years in prison (suspended) and fined $2,000. Judge Barrington Parker castigated Helms in a tirade culminating in the words, “You now stand before this court in disgrace and shame.” Helms silently accepted the judge’s rebuke, appearing contrite. But outside the courthouse, a smiling Helms proclaimed to reporters, “I wear this conviction like a badge of honor . . . I don’t feel disgraced at all.”52 Later that day, his former CIA colleagues gave Helms “a standing, cheering ovation” and passed a hat around, raising the full amount of his fine.
Helms died in 2002. In his posthumously published memoir, he matter-of-factly acknowledged that in his dramatic meeting with Kissinger and Nixon in the Oval Office on September 15, 1970, which led to the Schneider assassination, “President Nixon had ordered me to instigate a military coup in Chile, a heretofore democratic country. Moreover, the knowledge of this presidential directive was to be kept from the U.S. officials most directly concerned. Within CIA this directive was to be restricted to those with an absolute need to know. And I was to report to the President through Henry Kissinger.”53
As exceptional as these stories were and remain, however, they all exposed controversial covert CIA activities many years after they had occurred. Truth delayed is truth denied. When the facts are bottled up by secrecy and deception, the public and its elected representatives can do nothing to prevent or reverse abuses. And it often means that officials responsible for misconduct are never held accountable for their actions, including their misleading comments to the US Congress and to the public.
Investigative reports by journalists with access to leaked government documents helped bring the facts to the American public. For example, on September 8, 1974, Seymour Hersh wrote a front-page story in the New York Times titled “CIA Chief Tells House of $8 Million Campaign Against Allende in ’70–’73,” using information drawn from a leaked letter by Democratic representative Michael Harrington of Massachusetts. Hersh’s reporting hugely influenced the rest of the news media, including the CBS News Washington Bureau, which assigned Daniel Schorr to “develop what, in effect, would be a television version of Hersh’s stories.”48
And so it might have remained, if not for House and Senate committees controlled by a different political party, in this case the Democrats. The committees were able to pry loose and publish secret government and corporate documents concerning the coup in Chile, setting forth precisely what abuses of power had occurred.
In 1975, the Senate Select Committee to Study Governmental Operations with Respect to Intelligence Activities, chaired by Idaho Democratic senator Frank Church—the so-called Church Committee—issued the first of its fourteen reports about a wide range of covert government intelligence activities “and the extent, if any, to which such activities were ‘illegal, improper or unethical.’” The work of the Church Committee represents one of the most remarkable investigations Congress has ever undertaken, as indicated in particular by its two stunning, starkly detailed reports, “Covert Action in Chile, 1963–73” and “Alleged Assassination Plots Involving Foreign Leaders.” These reports formally concluded that covert US activities in Chile between 1963 and 1973 had been “extensive and continuous.”47
But in the immediate aftermath of Allende’s death and the overthrow of his democratically elected government, the military coup and the apparent US role in instigating it were widely condemned. A few days after Allende’s removal, in the White House Oval Office, Kissinger complained to Nixon, “Of course, the newspapers [are] bleeding because a pro-Communist government has been overthrown.” Nixon replied, “Isn’t that something.” Kissinger continued, “I mean, instead of celebrating—in the Eisenhower period we would be heroes.” Nixon said, “Well, we didn’t—as you know—our hand doesn’t show on this one.” Kissinger agreed: “We didn’t do it. I mean we helped them—created the conditions as great as possible.” “That is right,” Nixon replied. “And that is the way it is going to be played.”46
What intrigued me then and now was this tacit understanding in the diplomatic community that governments speak and act on different levels, with differing levels of truth. In the case of Chile and Allende, “covert” meant, I was appalled to learn, that the president and other top US officials could secretly subvert the elected government of another country without ever acknowledging it publicly. International law and congressional oversight seemed to be less than an afterthought. It seemed there were two realities: one, a soothing bromide for the unknowing public and journalists; and another, the unmitigated, unaccountable truth of dubious legality or outright crime, knowable only to a select few.
Finally, on September 11, 1973, amid rising signs of hardly coincidental “destabilization,” including nationwide strikes by truck, bus, taxi, and shop owners, President Salvador Allende died in a brutal military coup led by General Augusto Pinochet, who became president and dissolved the Congress two days later.41 In the subsequent months and years of the Pinochet regime, 3,197 Chilean citizens and at least four US citizens were tortured, mutilated, and murdered by the new US-supported government.42
From the time of Schneider’s death, Henry Kissinger, in sworn deposition, congressional testimony, and his subsequent memoirs, steadfastly maintained that he had halted Project FUBELT on October 15, 1970, a week before Schneider’s killing, and that he “never received another report on the subject.”33 In fact, there are White House telephone-call transcripts confirming that on the day Kissinger informed Nixon he had ordered the CIA to stop the military coup-plotting in Chile, he said: “That looks hopeless. I turned it off. Nothing could be worse than an abortive coup.” But as respected historian Robert Dallek has noted, according to a once-secret CIA “Memorandum of Conversation” from that very same day, the CIA’s deputy director for plans, Thomas Karamessines, told Kissinger in a White House meeting that he did not think “wide-ranging discussions with numerous people urging a coup could be put back into the bottle.”34 Indeed, the fact remains that the Schneider murder almost certainly would not have occurred without their covert maneuverings, which set certain tragic, inexorable events in motion. Nonetheless, after leaving government, both Kissinger and Nixon maintained a “who, me?” posture, one belied by reams of bellicose internal cables, telephone transcripts, and secret memoranda from that time. But, of course, that was always the plan—the diplomatic, public posture and the much tougher, private reality.35
Meanwhile, over the next three years, the United States privately made the Chilean economy scream, dropping US bilateral aid from $35 million in 1969 to $1.5 million in 1971 and, by 1973, just $800,000. The United States also used its international financial clout to “dry up the flow of new multilateral credit or other financial assistance,” which fell from $76.4 million in 1970 to $8.2 million in 1972.39 Separately, approximately $7 million was spent on covert activities inside Chile during Allende’s presidency, some of it going to opposition political parties, with nearly $2 million funneled to the anti-Allende newspaper El Mercurio. And during this same time, the CIA’s relationships with the Chilean military quietly deepened.40
CIA director Helms and his top deputies sent a cold-blooded, congratulatory cable to Santiago: “The Station has done excellent job of guiding Chileans to point today where a military solution is at least an option for them. COS [Chief of Station] [and others involved] are commended for accomplishing this under extremely difficult and delicate circumstances.”30 Nixon disingenuously wired the former president of Chile, Eduardo Frei: “The shocking attempt on the life of General Schneider is a stain on the pages of contemporary history. I would like you to know of my sorrow that this repugnant event has occurred in your country.”31 Of course, there was widespread public revulsion at the violence, and on October 24, 1970, the Chilean Congress, by a margin of 153–37, ratified Salvador Allende as president. The Schneider assassination had produced exactly the opposite of its intended effect.
That left the CIA scrambling to cover up the incriminating evidence. Wimert was instructed to retrieve the $50,000 bounty, and in so doing he was forced to pistol-whip Brigadier General Camilo Valenzuela, who had not yet paid the kidnappers. And on orders to dispose of the guns, he dumped them in the Pacific Ocean, seventy miles from Santiago. Over the ensuing decades, the CIA attempted to airbrush the history of its involvement in the Schneider debacle. Agency Santiago personnel were ordered to “stonewall all the way,” even to other US officials, about what had transpired. But in late 2000, around the time the CIA was forced to declassify a particularly sensitive 1970 cable alluding to requests for money from some of the Schneider assassination plotters, the Agency finally acknowledged that it had paid money directly to the assassins.32
On September 15, 1970, in a White House meeting with Kissinger, Attorney General John Mitchell, and CIA director Helms, Nixon ordered Helms to stop Allende from being inaugurated on November 3. The following day, Helms told subordinates that Nixon had authorized the CIA to “prevent Allende from coming to power or to unseat him. The President had authorized ten million dollars for this purpose, if needed. Further, The Agency is to carry out this mission without coordination with the Department of State.” One item in Helms’s handwritten notes from that Oval Office meeting was “make economy scream.” And over the ensuing weeks, the administration would do everything conceivable to create a “coup climate” economically in Chile; at one point, Helms cabled Kissinger: “a suddenly disastrous economic situation would be the most logical pretext for a military move.” The name of the overall secret operation to block Allende from becoming president: Project FUBELT.25
However, it soon became clear that a stumbling block to any coup solution was the Chilean commander in chief of the army, General René Schneider, a constitutionalist strongly opposed to military interference in the electoral process. In Santiago, US military attaché Paul Wimert was in contact with “several groups of military plotters,” and by mid-October a single “full-fledged conspiracy” had emerged involving two Chilean generals (one active, one retired), an admiral, and a team of “kidnappers,” who supposedly would abduct Schneider and take him to neighboring Argentina. The plan: the military would announce that Schneider had “disappeared” and blame it on “leftists,” then President Frei would resign and a new military junta would assume power.
Wimert gave $50,000 to the unidentified kidnappers, along with six submachine guns and ammunition that had been sent in the overnight diplomatic pouch from Washington “specially wrapped and falsely labeled to disguise what they were from State Department officials.” Two unsuccessful attempts to intercept Schneider occurred on October 19 and 20, and on October 22 his car was rammed by a Jeep. Five people surrounded the vehicle and started firing; Schneider was shot three times and died days later.29
On September 16, in an off-the-record White House briefing for reporters, Kissinger outlined his “domino theory” regarding Chile: “I have yet to meet somebody who firmly believes that if Allende wins, there is likely to be another free election in Chile . . . There is a good chance that he will establish over a period of years some sort of Communist Government . . . in a major Latin American country . . . [ad]joining . . . Argentina . . . Peru . . . and Bolivia . . . So I don’t think we should delude ourselves that an Allende takeover in Chile would not present massive problems for us, and for democratic forces in Latin America, and indeed to the whole Western Hemisphere.”26 Of course, although US officials were not pleased with the election outcome, there wasn’t a whisper that the president had just authorized the CIA to do something about it, including attempting to instigate an internal coup d’état.
Allende was again the leftist, multiparty candidate, this time atop a Popular Unity coalition; his campaign platform urged wage increases, agrarian reform, and complete nationalization of the copper industry, which was substantially owned and operated by US multinational corporations Anaconda and Kennecott. He also urged closer relationships with Socialist and Communist countries. The conservative National Party candidate was seventy-four-year-old ex-president Jorge Alessandri, and left-leaning Radomiro Tomic was the Christian Democratic nominee. Like Allende, Tomic favored nationalization of the copper industry.22
US multinationals in Chile, not surprisingly, were extremely concerned about the prospects of an Allende presidency. Allende had made it clear, for example, that he also intended to nationalize the country’s telephone company, Chiltelco, which at that time was 70 percent owned by International Telephone and Telegraph, Inc. ITT officials contacted the CIA in both Santiago and Washington. In coordination with the company’s chairman, ITT board member John McCone, who had served as director of central intelligence during the height of the Cold War, called CIA director Richard Helms with a proposal: ITT wanted the agency to secretly launder its $1 million contribution to the Alessandri campaign. The CIA declined, but ITT later found another way to move $350,000 to Alessandri’s campaign (and $100,000 to the conservative newspaper El Mercurio). Other US multinationals did likewise.23
Against that backdrop, the Kennedy and Johnson administrations, with help from the CIA, had also decided to secretly intervene “on a massive scale” in the 1964 presidential election in Chile, approving the expenditure of nearly $4 million (the equivalent of $29 million in 2011 dollars) for fifteen covert-action projects.18 Secretary of State Dean Rusk, in a Top Secret memorandum to President Lyndon Johnson weeks before the election, wrote, “We are making a major covert effort to reduce chances of Chile being the first American country to elect an avowed Marxist president [emphasis in original]. . . .”19 The goal was to “prevent or minimize the influence of Chilean Communists or Marxists” in the government and specifically to thwart the presidential candidacy of Socialist Party candidate Salvador Allende Gossens. Ultimately, the CIA “underwrote slightly more than half of the total cost” of the entire Christian Democratic campaign, whose candidate, Eduardo Frei, received a 57 percent majority in a three-way race.20
In 2004, the National Commission on Political Imprisonment and Torture in Chile concluded in a 1,200-page report that during the Pinochet years, “torture was a state policy, meant to repress and terrorize the population.” The report specifically identified 27,255 people who were tortured at 1,200 sites, and it named the military, political, and intelligence units that inflicted this torture. Shortly afterward, a judge placed the aged Pinochet under house arrest for kidnapping and murder. He died in 2006.15
It’s a terrible story. But also terrible, for me as an American citizen, is the fact that the US government helped to bring about Chile’s decades-long international nightmare. To this day, no American president has ever apologized to Chileans for the violence our government helped to cause.
Throughout his brutal seventeen-year dictatorship, Pinochet and his regime denied any involvement in the murders. But truth eventually caught up with him. In 1987, a member of the Letelier assassination team agreed to plead guilty and provide testimony in exchange for protection in the United States. He directly implicated Pinochet in the cover-up of the crime. In 1988, Pinochet lost a constitutionally required national plebiscite, and on March 11, 1990, Christian Democrat Patricio Aylwin was inaugurated as Chile’s new president. In October 1998, Pinochet was arrested in London in connection with a Spanish government prosecution against him; he was held under house arrest for sixteen months before being allowed to return to Chile. But upon his arrival, facing over seventy judicial cases, he was stripped of his immunity. He was placed under house arrest in January 2001 and interrogated by the authorities, but the Chilean Supreme Court ruled in July 2002 that the eighty-seven-year-old Pinochet was “mentally unfit due to dementia” and could not stand trial.14
Roughly eighteen months after our conversation, in the darkness before dawn just outside his Bethesda home, an assassin sent to Washington by the Chilean secret police—with the personal knowledge of General Pinochet—taped a remote-control bomb to the driver’s-side chassis of Letelier’s Chevrolet Chevelle. On the morning of September 21, 1976, as Letelier drove downtown to work with his young, recently wed IPS colleagues Michael and Ronni Moffitt, one of the assassins trailing them “pressed the button on an electronic paging device,” triggering a massive explosion that was heard at the State Department half a mile away. A piece of shrapnel cut twenty-five-year-old Ronni Moffitt’s jugular vein, and she literally drowned in her own blood. Letelier’s legs were blown off, and he died before the ambulance reached George Washington University Hospital. The backseat passenger, Michael Moffitt, only slightly injured by comparison, tried in vain to help the pair amid the bloody mayhem.12
It was the first time in US history that a foreign government had conducted a political execution on the streets of Washington, DC—at least, so far as we know. Of course, it took years for the truth to seep out, and it’s still not all out yet. But, in time, the world learned that Augusto Pinochet had authorized a series of political assassinations outside Chile, with Letelier just one of the victims. Pinochet’s deadly secret police, the Directorate of National Intelligence (DINA), headed by Colonel Manuel Contreras, who reported directly to him, was responsible for the years of terror after the coup. Contreras had overseen the Letelier hit, along with other “Operation Condor” state-sponsored terrorism. Michael Townley, a US citizen who had emigrated to Chile, had placed the bomb under the car and earlier had recruited as accomplices a trio of anti-Castro Cubans, all of whom were eventually apprehended, tried, and convicted, although their convictions were later reversed on appeal.13
In 2005, in the immediate aftermath of Hurricane Katrina, for a brief, appalling moment, the American people saw poor black people penned for days inside a downtown football stadium without food and water. The federal and state government incompetence and de facto abandonment of the poor in New Orleans and elsewhere in the Gulf region resulted in numerous entirely preventable deaths, severely damaging President George W. Bush’s standing in public opinion polls.
As Seymour Hersh years later wrote in his seminal book The Price of Power: Kissinger in the Nixon White House, “Letelier, with his old-world manners and civility, was no match for Kissinger.”6 Those who might wonder how Kissinger himself would defend his actions will turn to his three volumes of memoirs in vain. Kissinger did not mention Letelier once—not even in a footnote.7
Letelier was also convinced that a May 1972 break-in at the Chilean Embassy in Washington, which he described to me in intricate detail, had been the handiwork of the White House Watergate “plumbers” as part of the US “infernal machine” of covert intervention against Allende. We now know he was correct.8 In 1999, newly available Oval Office recordings from May 1973 revealed President Nixon telling his aide General Alexander Haig, “There are times, you know, when, good God, I’d authorize any means to achieve a goal abroad,” including “the breaking-in of embassies and so forth.”9
Rather than accepting blame, the administration sought to deflect it. Four days after the hurricane, Bush tried to suggest that no one had seen this coming: “I don’t think anybody anticipated the breach of the levees.” Actually, the New Orleans Times-Picayune, in a multipart series three years earlier, had exposed the unreliability of the levees.68 And months after Katrina, the Associated Press released confidential government video footage showing that Bush had been clearly told in a briefing days before the hurricane hit that “the storm could breach levees, put lives at risk in New Orleans’ Superdome and overwhelm rescuers.”69 Worse still, the administration even tried to prevent any media photographs of the injured or dead, a hide-the-truth policy that had proven to be much more feasible in a tightly controlled Iraq war zone than in five Gulf Coast states.70
It was an eloquent and impressive speech. But few people fully imagined that day what would occur over the ensuing twenty-two months: the election of the first African American president of the United States, and then, from the west portico of the US Capitol, his inauguration before a shivering crowd of roughly 2 million jubilant people. That evening of January 20, 2009, the Obama family slept in the White House, which slaves had built two centuries earlier.
In March 2007, as part of our continuing fascination with the American struggle over race, my wife and I, along with our six-year-old son, accompanied a dozen bipartisan members of Congress and others on a three-day pilgrimage to Birmingham, Montgomery, and Selma to revisit the sites of the historic bombings, murders, churches, and marches of 1961 to 1965. This was one of the biannual trips organized by Representative John Lewis, who was beaten or imprisoned forty times in the 1960s and was one of the “Big Six” black leaders in the civil rights movement with King. Lewis spoke at the great 1963 March on Washington at the age of twenty-three, and in 1968, after King’s murder, he served as a presidential campaign aide to Senator Robert Kennedy, with him in Los Angeles just minutes before his assassination.60 As you can imagine, revisiting these hallowed sites for the first time in thirty years in the presence of John Lewis was a remarkable experience for me.
And so today, four-plus decades after Liuzzo’s murder, and despite the massive effort dedicated to the case by reporters and law enforcement officials alike, it remains clear to me that we will never know the full truth about this crime or Rowe’s precise, complicated role in it.59
Personally, I find this conclusion excruciatingly painful. It goes against everything in my investigative reporter’s DNA. But sometimes inexorable circumstances (such as lack of physical evidence or lack of witness credibility) simply preclude ever knowing the full truth. For journalists, as well as judges, juries, prosecutors, and others operating in a postmortem, criminal-justice context, real-life circumstances can sometimes erode certainty “beyond a reasonable doubt.” And there are the investigator’s own individual or organizational limitations, usually related to finite time and resources, including money. Within the journalism milieu, the financial, legal, and corporate sensibilities, usually subtle and unstated, can directly affect what a reporter, editor, or corporate executive is willing and able to even attempt at the outset.
On March 25, ten days after Johnson’s address, Viola Liuzzo, a white mother of five who had come to Selma from Detroit to demonstrate peacefully, was shot and killed while ferrying marchers in her Oldsmobile in “bloody Lowndes” County.
The news of her ambush shocked the country, and in a stunning development, literally the next day President Johnson announced in a televised event that the FBI had already arrested the accused killers: Ku Klux Klansmen William Eaton, Gary Thomas Rowe, Eugene Thomas, and Collie Leroy Wilkins. Days later, however, Jack Nelson of the Los Angeles Times and, soon after, Fred Graham of the New York Times reported that Rowe was in fact a paid FBI informant who had been riding inside the murderers’ car; within hours of the shooting, it was reported, he had provided the names of his fellow Klansmen to the FBI.55
According to John Lewis, who watched the speech in Selma with other movement leaders, Martin Luther King “wiped away a tear at the point where Johnson said the words, ‘We shall overcome.’” Later that evening, King called Johnson to thank and congratulate him. He added: “It is ironic, Mr. President, that after a century, a southern white President would help lead the way toward the salvation of the Negro.” In a subsequent telegram, King told Johnson his speech was “the most moving, eloquent, unequivocal and passionate plea for human rights ever made by any President of the nation.”54
Later that night, ABC interrupted its prime-time airing of Judgment at Nuremberg, which chronicled how the German people had enabled Nazi-era atrocities, with a fifteen-minute special report featuring remarkable footage of “Bloody Sunday” in Selma. Within moments, millions of Americans went from seeing a movie about brutal Nazis to seeing vicious Alabama law enforcement officials, at one point hearing Sheriff Clark order his men to “get those goddamned niggers. And get those goddamned white niggers.”52
Just eight days after Bloody Sunday, with the country fully engaged over the crisis in Alabama, President Johnson proposed voting-rights legislation before a joint session of Congress and 70 million TV viewers. In a powerful speech interrupted by applause forty times, Johnson said:
I recognize that outside this chamber is the outraged conscience of a nation, the grave concern of many nations, and the harsh judgment of history on our acts . . . Even if we pass this bill, the battle will not be over. What happened in Selma is part of a far larger movement which reaches into every section and state of America. It is the effort of American Negroes to secure for themselves the full blessings of American life. Their cause must be our cause too. Because it is not just Negroes, but really it is all of us, who must overcome the crippling legacy of bigotry and injustice.
“It is as old as the Scriptures and is as clear as the American Constitution,” Kennedy said. “The heart of the question is whether all Americans are to be afforded equal rights and equal opportunities, whether we are going to treat our fellow Americans as we want to be treated.”44 Just hours after Kennedy’s speech, Medgar Evers, field secretary of the National Association for the Advancement of Colored People in Mississippi, was assassinated outside his home in Jackson. A week later, after having met with Evers’s grieving family at the White House, the president submitted to Congress what would become the 1964 Civil Rights Act. The legislation passed the US Senate by a vote of 73–27 exactly one year later, adeptly shepherded into law by Lyndon Johnson after Kennedy had also been assassinated.45
King also turned to one of his edgiest, most brilliant Southern Christian Leadership Conference strategists, Reverend James Bevel, who recommended the almost unthinkable: make the Birmingham campaign a “children’s crusade.”38 King agreed, and on May 2, 1963, more than one thousand young black people congregated at Birmingham’s Sixteenth Street Baptist Church, then marched downtown.39 There were six hundred arrests. Four days later, the procession of young blacks swelled to three thousand, and another eight hundred arrests overflowed the Birmingham jail into outdoor pens on the state fairgrounds. By the time it was all over, ten thousand children had been arrested and transported to their outdoor prison in school buses.
Sadly, the failure of the authorities to respond in a timely fashion to the Sixteenth Street bombing was no anomaly. The outrageous truth is that there are still more than one hundred unsolved civil rights–related murders of African Americans from the era between the 1954 Supreme Court Brown decision and the 1968 murder of Dr. Martin Luther King Jr.9
What I learned in the process shocked me. If I’d previously accepted the notion that the march to equality for black Americans had ended in a clear-cut victory for the forces of freedom, my naïveté was swiftly shattered. I quickly discovered that equal justice regardless of color was still an unattained American ideal. And perhaps even more disturbing, I learned that most Americans are blissfully unaware of this shameful reality—and that government, much of the news media, and many Americans themselves have quietly succumbed to this state of willful, comfortable blindness.
Two months before his death, President John F. Kennedy asked FBI director J. Edgar Hoover to aggressively investigate the Sixteenth Street bombing case. More than two hundred FBI agents were assigned to the case, but the federal government didn’t prosecute anyone in the years that immediately followed. Worse, in 1980 it finally emerged that Hoover had unilaterally blocked any prosecution for the heinous civil rights crime, even though his agents had identified the perpetrators within weeks.6
A quarter century later, another piece of that puzzle emerged in a powerful 2005 book, The Informant, by historian Gary May. According to May, the FBI in Birmingham knew ten days before the girls were murdered that the most dangerous local Klansmen—believed responsible for other bombings months earlier—had just obtained a crate of dynamite. Yet the FBI did not question or apprehend the men, or even alert the Birmingham police.7
It should be noted that nearly forty years after the Birmingham church bombing, the Justice Department did finally prosecute sixty-two-year-old Thomas Blanton for the crime. (I interviewed Blanton in Birmingham in 1977 while at ABC News; he steadfastly and unconvincingly denied his involvement in the bombing, then and always.) In 2001, Blanton was convicted of murder in state court in Birmingham.
In his closing argument to the jury, then US attorney Doug Jones said, “It’s never too late for the truth to be told. It’s never too late for wounds to heal. It’s never too late for a man to be held accountable for his crimes.”8 That’s true, of course; and justice delayed by forty years is better than no justice at all. But there’s no credible excuse for the FBI’s willful refusal to pursue the case in the 1960s, when it first had knowledge of the perpetrators of the crime. Swift action would have sent a strong message that violence against the civil rights movement would not be tolerated, perhaps deterring some of the other attacks that followed.
What has perhaps been most surprising is the unexpected national security obsessiveness of President Barack Obama, whose administration has waged protracted prosecutions of leakers like army private Chelsea (formerly Bradley) Manning, who had illegally passed classified documents to WikiLeaks, and has pursued the journalists who work with leakers, for example, James Risen of the New York Times. Moreover, more than 4 million government employees and contractors now have national security clearances. And in 2010, a jaw-dropping 77 million documents were classified, up 44 percent from the year before!88 According to James Goodale, the New York Times’ lead counsel during the historic Pentagon Papers case, “In many respects, President Obama is no better than Nixon. Obama has used the Espionage Act to indict more leakers than any president in the history of this country.” And Goodale wrote those words in his memoir before the Obama Justice Department secretly obtained two months of telephone records for reporters and editors at the Associated Press (AP), an act unprecedented in US history.89
The abuses of power extended far beyond the supposed misdeeds of the media or claimed damage to national security. The White House, joined later by the Committee to Re-elect the President (also known as the Committee for the Re-election of the President), also began conducting illegal political intelligence and surveillance operations and “dirty tricks” to disrupt, embarrass, and smear the likely 1972 Democratic presidential candidates. For example, on September 8, 1971, Nixon said to Ehrlichman in the Oval Office, “We have the power but are we using it to investigate contributors to [Nixon’s vanquished 1968 Democratic presidential opponent] Hubert Humphrey, contributors to [Democratic senator Edmund] Muskie, the Jews, you know, that are stealing every— . . . are we going after their tax returns? Do you know what I mean? . . . Are we looking into Muskie’s return? . . . Teddy [Kennedy]? Who knows about the Kennedys? Shouldn’t they be investigated?”68
Respected author Theodore H. White summed up the whole sorry saga back then: “The true crime of Richard Nixon was simple: he destroyed the myth that binds America together . . . that all men are equal before the law and protected by it; and no matter how the faith may be betrayed elsewhere, at one particular point—the Presidency—justice will be done beyond prejudice, beyond rancor, beyond the possibility of a fix. It was that faith that Richard Nixon broke.”77
And the hapless White House plumbers, who apparently had learned nothing from the botched burglary to steal Ellsberg’s psychiatric files nine months earlier, forged recklessly ahead with another nocturnal black-bag job.
In 1970, White House special counsel Charles Colson began coordinating the creation of an “Enemies List,” which ultimately targeted some two hundred individuals and eighteen organizations to receive “‘special’ government attention.” The roster included actors, Democratic politicians, foundations, university presidents, and more than fifty journalists, including James Reston, Marvin Kalb, and Daniel Schorr, a CBS News correspondent hired in 1953 by Edward R. Murrow.62
Schorr was seen as a “real media enemy” by the Nixon White House. During live network television coverage of the Senate Watergate hearings in 1973, after former White House counsel John Dean first testified and revealed that there had been a White House “Enemies List,” Schorr and his producers quickly obtained the list, and the veteran correspondent began to read the names live, on the air. “I had not seen it. I had not prepared for it . . . I didn’t faint. I didn’t say ‘Wow.’ I didn’t do anything. I read my name with the other names just as though it was another name.”63
Over time, the journalistic exposés spawned criminal prosecutions and congressional investigations, presided over by heroic figures such as Judge John Sirica, a Republican appointee, in his federal courtroom, and Democratic committee chairmen Senator Sam Ervin and Representative Peter Rodino, who in nationally televised hearings insisted on the truth and on government accountability. As a result, Americans learned that President Richard Nixon had violated his oath of office, systematically and illegally abusing power in the most wide-ranging ways ever documented, much of it obvious from the White House audiotapes of meetings and phone calls to and from the Oval Office. To compound matters, Nixon clumsily orchestrated the cover-up within days of the Watergate break-in, and in October 1973, shamelessly ordered his appointees to fire Watergate special prosecutor Archibald Cox, in the “Saturday Night Massacre.” And for more than two years he lied to the American people, straight-faced before the cameras, about his involvement, knowledge, and criminal misconduct.
The Clinton administration, to its great credit, initiated the Chile Declassification Project, which resulted in the release of roughly 24,000 secret documents pertaining to the two-decade US foreign policy disaster. The most stubbornly intransigent agencies, not surprisingly, were the CIA and the National Security Agency, which to this day refuse to declassify hundreds of documents. Kissinger had, for decades, prevented anyone from accessing records related to his tenure as national security adviser and secretary of state to two presidents; when he left government, he literally took with him his recorded telephone conversations from 1970 to 1976. Public officials frequently pull such shenanigans—absconding with public documents and getting others declassified for their high-priced memoirs, while withholding the negative, unflattering, or even potentially criminal material. Kornbluh calls it “holding history hostage.”61 Call it whatever you want; it obscures and distorts the truth as we know it. And that’s wrong.
US involvement in the Chilean coup of 1973 is scarcely the only episode in recent history in which illegal, immoral, or simply politically untenable acts by American governments abroad have been mired in webs of secrecy, obfuscation, and lying. We opened the first chapter of this book with another such episode, the Gulf of Tonkin incident, which seemingly set the pattern for similar escapades, including the one that lends the book its title—the brazen use of 935 lies by the administration of George W. Bush to mislead America into war in Iraq. There certainly have been other examples of foreign policy adventures that were of questionable legality or in which the Congress, the public, and the news media were misled, including the US invasion of the Dominican Republic (1965) and the invasion of Panama (1989), in which hundreds of people were killed and US reporters were kept from the field of action.62
Sadly, in the field of US foreign policy, seriously assessing the morality of government decision making is rare, generally regarded as quaint and foolish by realpolitik policymakers, regardless of party or administration. When it does occur, it often seems to degenerate into self-serving pomposity and self-justifying, in-the-eyes-of-the-beholder rhetoric.63
The “Somoza dynasty” of Nicaraguan presidents had been, in the words of foreign policy historian Walter LaFeber, “a subsidiary of the United States since 1936.” In fact, there had been eleven US military interventions in Nicaragua between 1853 and 1933, before the Somozas. But by 1979, the Somoza dynasty was ready to collapse under pressure from the left-wing Sandinista National Liberation Front. The Carter administration had stopped supporting Somoza and his National Guard in his waning months in power, and in July 1979, when Somoza fled the country he and his family had ruled since 1937, Carter refused to grant him political asylum. Somoza had been responsible for the deaths of thousands of Nicaraguans and left behind a looted national treasury containing only $3 million. He was assassinated in exile in Paraguay just over a year later.
Meanwhile, the Sandinistas had taken over the country and installed a ruling junta, led by future president Daniel Ortega. While the Carter administration had serious misgivings about the leftist Sandinistas’ close ties to Fidel Castro’s Cuba, it had also sent emergency earthquake relief to Nicaragua and sought $75 million from Congress in US economic aid. For nearly a year, however, House conservatives stalled the legislation, and it was finally approved in September 1980.64
After Reagan became president in 1981, his ambassador to the United Nations, former Georgetown University professor Jeane Kirkpatrick, signaled the new administration’s change in foreign-policy thinking by declaring that “Central America is the most important place in the world for the United States today.”66 And in fact, over the next several years, Reagan and his appointees helped to turn the poor and substantially illiterate region into a major Cold War combat zone, orchestrating and bankrolling secret wars “fought in the main by proxy warriors.” In February 1983, the president told a national meeting of the American Legion, “The specter of Marxist-Leninist-controlled governments in Central America with ideological and political loyalties to Cuba and the Soviet Union poses a direct challenge to which we must respond.” Unfortunately, the decades-in-the-making social and political inequality and inevitable turmoil in Nicaragua, El Salvador, and Guatemala had finally overheated into full-fledged armed conflict, and as historian LaFeber put it, “the Reagan administration sought to fight fire by pouring on gasoline.”67 The result was widespread human suffering in those three countries, plus Honduras, the base of operations for the anti-Sandinista Contras in Nicaragua.68
In mid-1993, a special review panel on El Salvador, appointed by Secretary of State Warren Christopher, concluded that the State Department had mishandled the El Mozote investigation in 1982 and had undermined “the Department’s credibility with its critics—and probably with the Salvadorans—in a serious way that has not healed.” Not only had no one from the State Department ever actually gone to El Mozote to eyeball the carnage, department officials then misled the American people about what had happened there. Now, in the spirit of the courageous El Salvador Truth Commission, the record was being set straight: “a massacre had indeed occurred and the U.S. statements on the case were wrong.”80 The report concluded that there was “no effort in Washington to obtain and analyze the numerous photographs that had been taken at the site by the American journalists. The Embassy does not seem to have been inclined to press, and Washington preferred to avoid the issue and protect its policy then under siege.”81
In retrospect, what is most striking about the 1980s Central America secret wars, to a massive extent spurred financially and militarily by the United States through local, “proxy” governments, was the unswerving, righteous arrogance of the Reagan administration—publicly mum and morally obtuse to their repressive comrades-in-arms’ wanton violence against civilians, even children; willing to lie to Congress and the American people, if necessary, about politically inconvenient truths and activities; and undeterred in the slightest by pesky and nettlesome US and international laws that might impede full implementation of the administration’s foreign policy agenda—i.e., aggressively supporting numerous covert, paramilitary wars around the world, a policy that came to be known as the Reagan Doctrine.
It was this arrogance that led the administration to scoff at multiple votes condemning US support for the anti-Sandinista Contras by the International Court of Justice at the Hague, the UN Security Council, and the UN General Assembly—together testifying to the near-unanimity of international opinion. It was this same arrogance that led the administration to defy the US Congress, which had voted in late 1984 to end all US assistance to the Nicaraguan rebels, by quietly creating a “private contra aid network,” which raised millions of dollars from individual donors and such foreign countries as Saudi Arabia, Taiwan, and Brunei.82 The inevitable result was the Iran-Contra scandal, an illegal, Rube Goldberg–like scheme to sell arms to Iran (a pariah nation under an international arms embargo) and, in exchange, to attempt both to secure the release of US hostages in Lebanon and to provide funding to the Contras in Nicaragua. Known as the “Enterprise,” and led by Lieutenant Colonel Oliver North, it had “its own airplanes, pilots, airfield, operatives, ship, secure communications devices, and secret Swiss bank accounts. For 16 months, it served as the secret arm” of the White House National Security Council, carrying out an illegal covert Contra aid program that Congress had prohibited.83
Ultimately, when the embarrassing details emerged, Reagan had to fire several top aides over the scandal. And Iran-Contra independent counsel Lawrence E. Walsh prosecuted fourteen administration officials, nine of whom pleaded guilty—five for withholding information, making false statements, or committing perjury before Congress.84
In retrospect, beyond the particulars of the Latin American wars and the Iran-Contra scandal, something in Washington had seriously changed, undoubtedly for the worse. The will and the ability to hold those in power accountable had perceptibly weakened, in three specific ways.
First, although we had divided government in 1987 just as in the 1973–1974 Watergate period, with a Republican president and a Democratic-controlled Congress, the leaders of the Iran-Contra House and Senate Committees evinced no real interest in seriously investigating Ronald Reagan, the aging, affable, immensely popular president who had survived an assassin’s bullet in 1981. Iran-Contra Committee staffers have complained to me for years about the various ways the scope of their investigative efforts was limited from the outset by senior members of the two committees.
Second, there were only a few examples of excellent Iran-Contra reporting, including work by Robert Parry of the Associated Press and then Newsweek, who won the 1984 George Polk Award, and Alfonso Chardy, who, with his colleagues at the Miami Herald, won the 1987 Pulitzer Prize.86 But perhaps New York Times columnist Anthony Lewis best summed up the news media’s anemic coverage: “Fundamentally, the press lost interest in Iran-Contra because Congress did not develop sustained outrage. In this as in other matters the American press, for all its independence, relies on the official institutions of Washington to legitimize its choice of what is news.”87
And finally, in the attempt to hold accountable those responsible for the Iran-Contra abuses of power, the rule of law in the United States was seriously subverted.
On December 2, 1986, President Reagan called for an independent counsel, and just over two weeks later, a three-judge panel named Lawrence Walsh, a former US district court judge and deputy attorney general appointed by President Eisenhower, to the post. Within a mere two months, and for years afterward, Reagan and Bush attorneys general and their assistants, along with individual Republican lawmakers, kept up a drumbeat of public criticism of Walsh and the expense of the independent counsel investigation. Despite being under siege, Walsh and his staff dutifully attempted to fulfill their very difficult task. Then, on Christmas Eve, 1992, President George H. W. Bush effectively shut down the investigation by granting “full, complete and unconditional” pardons to former secretary of defense Caspar Weinberger, former national security adviser Robert McFarlane, assistant secretary of state Elliott Abrams, and three others. Bush—who had been present in Iran-Contra meetings with Reagan and others while serving as vice president, and who had withheld his own notes from Walsh for six years—cited Weinberger’s health and his own concern about “the criminalization of policy differences” as the reasons for the pardons.
Indeed, the cover-up continues to this day. In 2001, a few months after becoming president, George W. Bush signed the unprecedented Executive Order 13233, which sharply restricted public access to the papers of former presidents, including Ronald Reagan and Bush’s father. The Bush order overrode the post-Watergate Presidential Records Act of 1978, which required that a president’s papers be made available to the public twelve years after he leaves office. Steven L. Hensen, the president of the Society of American Archivists, told the Washington Post, “The order effectively blocks access to information that enables Americans to hold our presidents accountable for their actions.”
On his first full day in office, January 21, 2009, President Barack Obama signed Executive Order 13489, revoking the earlier Bush order and also explicitly stipulating that vice presidential records are considered a part of “Presidential records.” Nonetheless, two decades after the pardons, George H. W. Bush’s Iran-Contra papers have still not been declassified at the Bush presidential library.89
From the manipulations, misrepresentations, and misconduct in Chile, to Guatemala, El Salvador, and Nicaragua, culminating in the Iran-Contra scandal, the United States displayed a callous contempt for transparency, laws, and human lives. It’s not hard to see a direct line leading from this outrageous and unaccountable conduct to an unnecessary war of choice in Iraq in 2003.
When respect for the truth is eroded, the barriers that protect us from official arrogance and, ultimately, tyranny inevitably begin to crumble.
Woodward’s critical self-appraisal may have surprised many in the audience. But it was a candid reflection of the intense and growing pressures faced by reporters and editors when considering investigative projects. Woodward himself, despite his status as a journalistic superstar, had felt those pressures. During an ill-fated stint as the Post’s deputy managing editor for metropolitan news, he had assigned Patrick Tyler, a talented young reporter, to an intricate investigative story about Mobil Oil, alleging major improprieties by its president, William P. Tavoulareas. Woodward edited the piece, whose publication generated a highly publicized $50-million libel lawsuit against the newspaper that dragged on for eight years; by the time the full US Court of Appeals for the District of Columbia Circuit finally affirmed the truth and validity of the published story, the litigation had cost the Washington Post nearly $1 million.3 After it was all over, Ben Bradlee said, “If you come to me and ask me to run that story and say it’s going to cost a million dollars in legal fees and all the back and forth, I wouldn’t run it.” Woodward publicly disagreed with that sentiment.4
In 1962, Mintz wrote a remarkable front-page story for the Post about Dr. Frances Oldham Kelsey, a Food and Drug Administration (FDA) official who had resisted intense pressure from the pharmaceutical industry to approve the sedative thalidomide (trade name Kevadon) for sale to pregnant women suffering from morning sickness. Of course, thalidomide was later found to cause severe deformities, including missing arms or legs, in the children of women who used it during pregnancy. Mintz’s incisive journalism inflamed and emboldened Congress to enact legislation giving the FDA greater authority to require that drug companies scientifically test and prove that their products are safe.8
Within weeks of Mintz’s story, President John F. Kennedy specifically commended Kelsey during a nationally televised press conference and, days later, awarded her the President’s Award for Distinguished Federal Civilian Service.9 Congress unanimously passed and Kennedy signed into law tougher federal controls regulating the safety and effectiveness of pharmaceutical drugs.10 Mintz, a World War II navy veteran who’d begun his journalism career in St. Louis, won the George Polk Award for his reporting and, in 1965, published his first book, The Therapeutic Nightmare: A Report on Prescription Drugs, the Men Who Make Them, and the Agency That Controls Them.11
Given all these factors, it’s no wonder that Morton Mintz was never given carte blanche by the Washington Post to take on the tobacco industry. Instead, he became one of a long series of dedicated truth-seekers to do battle in an arena where powerful economic forces, public health, journalistic honor, and scientific integrity collide. And history tells us that this is a battle in which truth rarely prevails.
Bernays made no bones about his philosophy. In his 1928 book Propaganda, he wrote, “The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society . . . We are governed, our minds molded, our tastes formed, our ideas suggested, largely by men we have never heard of . . . who understand the mental processes and social patterns of the masses. It is they who pull the wires which control the public mind, who harness old social forces and contrive new ways to bind and guide the world.”19
The following year, Bernays had a chance to field-test his theories, in concert with another early PR wizard named Ivy Lee. (In later years, Ivy Lee would serve as an informal adviser on propaganda strategy to Germany’s Nazi regime through a lucrative 1933 contract with industrial giant I.G. Farben—an admittedly extreme example of the readiness of many corporate public relations experts to apply their talents on behalf of virtually any paying customer. Public outrage over Lee’s services to the Nazis eventually led to passage of the 1938 Foreign Agents Registration Act, requiring that anyone in the United States who “acts at the order, request, or under the direction or control of a foreign principal” must report to the Department of Justice.)20
Bernays and Lee were enlisted by the American Tobacco Company to rectify a very costly, commercial problem with the “public mind”: American women were reluctant to smoke cigarettes on the streets of their communities. As Bernays explained it, “A woman seen smoking in public was labeled a hussy or worse.” Change that, he was told, and the company could “double the female market.”21
Bernays retained a psychiatrist, A. A. Brill, who advised him that cigarettes for women represented “a sublimation of oral eroticism; holding a cigarette in the mouth excites the oral zone.” And therefore, he concluded, “Cigarettes, which are equated with men, become torches of freedom.”22 Based on this “motivational research,” Bernays orchestrated, in the name of “equality of the sexes,” a “freedom march” of debutantes who smoked their way up six blocks of New York City’s Fifth Avenue during the Easter Sunday parade. The highly unusual public event became national news. Bernays later wrote, “Age-old customs, I learned, could be broken down by a dramatic appeal, disseminated by the network of media.”23
Despite heavy newspaper coverage, no one apparently realized or reported that the event had been orchestrated by Bernays and American Tobacco.24 Almost two decades later, in an essay entitled “The Engineering of Consent,” Bernays proudly declared that “the engineer of consent must create news [emphasis added] . . . Newsworthy events, involving people, usually do not happen by accident. They are planned deliberately to accomplish a purpose, to influence our ideas and actions.”25
But by 1950, the tobacco industry had to do more than create news. That year, five scientific studies linking smoking to lung cancer were published, none more stunning than one by Dr. Richard Doll, a biostatistician on Britain’s Medical Research Council. He and colleague A. Bradford Hill, in their paper “Smoking and Carcinoma of the Lung,” found that heavy smokers were fifty times as likely as nonsmokers to contract lung cancer.26 That was followed by publication of fourteen similarly dire, smoking-cancer studies, including the largest, most substantive study to date about cigarettes and longevity, released by the American Cancer Society to the American Medical Association in 1954.27 For the first time, there was compelling scientific evidence of what had been suspected, but not officially proven, for centuries: smoking kills.
But with billions of dollars in future profits at stake, the tobacco industry responded by hiring the president of the public relations company Hill and Knowlton, John W. Hill, who ironically had himself quit smoking in the early 1940s for health reasons. Hill developed an aggressive counterstrategy that to this day dominates corporate anti-regulation efforts for a slew of dangerous products: engineer “controversy” instead of consent.29
Hill told his clients they should be perceived as “embracing” scientific research instead of ignoring it, and that as an industry they should support the principle that “public health is paramount to all else.” On January 4, 1954, the companies collectively issued “A Frank Statement to Cigarette Smokers,” written by Hill and Knowlton, which appeared as an advertisement in newspapers nationwide. It announced the creation of the Tobacco Industry Research Committee (TIRC) to explore “all phases of tobacco use and health” and declared, among other things, “We always have and always will cooperate closely with those whose task it is to safeguard the public health.”30
Perhaps the most obvious tip-off was that TIRC’s administrative offices were located at Hill and Knowlton. Another was that the organization’s executive director, Tom Hoyt, had no scientific qualifications and was literally “loaned” by Hill and Knowlton to the tobacco industry. And when respected scientists inquired about funding for their own smoking-cancer research, they soon discovered the industry’s incuriosity about the health effects of its products. Indeed, the TIRC never sponsored direct epidemiological research about cigarettes and disease; the many cancer-related studies trumpeted by the tobacco industry during the early 1960s, for example, were all, according to David Michaels, “motivated by the same principle: Find other causes for disease, find smokers who do not have disease, find new associations of whatever sort, find this, find that, find anything—but the truth.”32
Such defiance of federal attempts to curb the tobacco menace was nothing new for this industry. Two former surgeons general, Dr. C. Everett Koop (who served in the Ronald Reagan administration) and Dr. Richard Carmona (the George W. Bush administration), described being constantly pressured over their anti-tobacco reports and statements. Carmona said he “fought for years” to release his significant surgeon general’s report, The Health Consequences of Involuntary Exposure to Tobacco Smoke.38 And Koop described the human impact of more than half a century of sustained, systematic misrepresentations: “In the course of my years as surgeon general and since, I have often wondered how many people died as a result of the fact that the medical and public health professions were misled by the tobacco industry.”39 All we can say with any precision is that 100 million people around the world died from smoking-related illnesses in the twentieth century, according to the World Health Organization. And that number is expected to soar to an estimated one billion smoking-related deaths in this century, thanks to an aggressive export strategy aided and abetted by trade officials from the Jimmy Carter administration to that of Barack Obama.40
Murrow himself died of lung cancer in April 1965.50 Adding to the irony, William Paley, the man who built and oversaw CBS for half a century, was able to do what he did only because of the personal wealth he’d amassed from his family’s cigar business, whose seven factories produced more than 1 million cigars daily. Soon after he merged two broadcasting companies to create the Columbia Broadcasting System, Paley hired none other than Edward Bernays, the preeminent public relations expert and longtime tobacco-industry flack. Paley exulted, “I thought, my God, to be important enough to have a public relations man. Somebody who could tell you what to do and what not to do.”51
But the See It Now reports about the connection between smoking and lung cancer were the exception for network television news shows, which otherwise “virtually ignored” that inconvenient subject. As Thomas Whiteside of the New Yorker later reported, those programs were “nearly all sponsored by cigarette companies.”52 By the late 1960s, according to the Federal Trade Commission, every US household with a TV set was annually exposed to roughly 800 cigarette commercials, and “four out of every five promotional dollars spent by cigarette makers” in the United States went to television.53 In 1969 alone, that amounted to $230 million in television advertising by the tobacco companies.54
At one point, in response to an inquiry by a Senate subcommittee chairman, the three television network presidents declared they would not voluntarily release the tobacco companies from their multiyear advertising contracts. Leonard Goldenson, the president of ABC, complained that a cigarette advertising ban would be unfair and financially calamitous, and in fact, “it could well mean a substantial cutback in our news and public affairs operations almost immediately . . . We do not believe that the Congress would look with favor on any such forced curtailment of network service to the American public.” In other words, as journalist Whiteside adroitly put it at the time, “ABC owed it to the public to keep the cigarette commercials on the tube.”56 Ultimately, despite the substantial industry pressure and sophistry, not to mention campaign contribution largesse, Congress passed and President Richard Nixon signed into law the Public Health Cigarette Smoking Act of 1970, banning all television and radio cigarette commercials starting January 1, 1971.57
Of course, the new law didn’t end cigarette advertising but merely relocated it. In the first year of the ban, the tobacco industry spent $157.6 million on newspaper and magazine advertising in the United States—up dramatically from $64.2 million in 1970.58 Whiteside, the most prominent tobacco industry muckraker during this time, was livid over the sudden advertising bonanza the print media had no compunction about accepting. “How,” he asked, “can any publisher—anyone—make money out of selling advertisements for a product that is known to cause death on a disastrous scale year after year?”59 But the coffin cash kept flowing. Newspapers and magazines cumulatively took in $10.9 billion worth of cigarette advertising between 1976 and 2008, with $649 million in 1981 alone.60 During those same three-plus decades, approximately 12 million people in the United States died prematurely from smoking-related illnesses.61
With the exception of Whiteside and a few others, there was little noteworthy print or broadcast investigative reporting—no Pulitzer, Peabody, or National Magazine Award winners—about the tobacco industry in the 1960s and 1970s. The news media’s anemic coverage of the nation’s deadliest industry, while simultaneously reaping billions of dollars in revenues from it, is particularly ironic considering that it coincided with the historic apogee of high-quality, public-service journalism in America, namely, publication of the Pentagon Papers in 1971 and the Watergate scandal coverage from 1972 through 1974.
Political forces played a role. The election of conservative Republican Ronald Reagan in 1980 ushered in a period when the GOP would control the White House for twenty out of twenty-eight years, moving the nation’s political agenda and outlook to the right as well. The nation’s leading newspapers followed suit. From the 1960s to the late 1990s, many of the nation’s most prominent newspaper companies went public, among them the Wall Street Journal, owned by Dow Jones and Company (1963), the Times-Mirror Company (1964), Gannett Company (1967), the New York Times (1967), Knight Ridder (1969), Washington Post (1971), the Tribune Company (1983), the Pulitzer Company (1986), and E.W. Scripps Company (1998).63 The shift from private to public ownership made Wall Street a player in the newspaper business, making corporate concerns even more influential in media circles.
That same year, Richard Bonin and I began co-producing a 60 Minutes segment entitled “Tobacco on Trial” for senior correspondent Mike Wallace about the increasingly aggressive civil litigation against the cigarette companies, and their muscular and very expensive efforts to fight back. No smoker had ever successfully sued a cigarette company in the United States, despite more than three hundred civil lawsuits filed against the industry since 1954, after their product’s lethal qualities had become scientifically well established. In an attempt to ward off more lawsuits, and to help ensure victory in the courtroom, the tobacco industry hired eighty-seven of the premier law firms in the United States, made aggressive use of private investigators, and conducted brutal interrogations of plaintiffs and related witnesses that lasted for days—all entirely legal, of course.
Koughan, who had never before had a documentary killed, refused to turn over those materials, even though they legally belonged to ABC. In short order, inconvenient news coverage about the internal spiking of his documentary appeared in the Washington Post, American Journalism Review, Village Voice, New York Daily News, Project Censored, and elsewhere, and things got uglier. Koughan told the Washington Post in August 1995, “A half-million dollars was flushed down the toilet, and a tough look at the tobacco industry was snuffed out.” With its lawsuit, he said, Philip Morris had shown that “for a paltry $10 million or $20 million in legal fees . . . you can effectively silence the criticism.”80
ABC issued a formal apology to Philip Morris and, as part of the settlement, Diane Sawyer read it during halftime of Monday Night Football.94 Bogdanich and correspondent John Martin refused to sign the apology. One of the terms of the settlement was that neither side would talk publicly. But days later, Philip Morris bought full-page ads in roughly seven hundred newspapers, with the banner headline “Apology Accepted.”95 ABC was widely attacked as having sold out journalism. Under the headline “The Cave on Tobacco Road,” Jonathan Alter wrote in Newsweek, “ABC caved—not entirely, but enough to send a true chill through the entire news business.” And Murphy was heavily criticized as a “corporate sellout.”96
ABC News employees were stunned by what appeared to be a crass corporate capitulation at the expense of their own credibility as journalists. Bogdanich told me, “There was no doubt in my mind, none, that this was an attempt to clean the record, hand off a company that didn’t have a $10 billion potential liability on its books, even though . . . our lawyers kept assuring us that we are going to win this case.”97
Even Walter Cronkite, the venerable anchor of the CBS Evening News during the turbulent 1960s and 1970s, was disgusted, telling the Public Broadcasting Service (PBS) program Frontline:
[T]he management of 60 Minutes has the power there, quite clearly, to say, “I’m sorry. We’re doing this because we must do it. This is a journalistic imperative. We have this story and we’re going with it. We’ve got to take whatever the legal chances are on it.” Well, they didn’t. They felt it was necessary to buckle under the legal pressures and that must send a message to every station across the country where they might have any ambitions to do investigative reporting. “Hey, look, if 60 Minutes can’t stand the pressure, then none of us ought to get in the kitchen at all.”102
The CBS capitulation completed the trifecta of the 1990s Network Television “Corporate Mergers and Abdications from Journalism” sweepstakes. It demonstrated unmistakably that something had changed throughout network television. Under questioning by Frontline correspondent Daniel Schorr, Mike Wallace acknowledged that it had never occurred to him to quit in protest, and, though it was never publicly stated, the same was obviously true of Hewitt.
ABC’s settlement of the Philip Morris lawsuit, along with a Justice Department criminal inquiry into possible perjury committed by the seven tobacco CEOs testifying before Congress (including Laurence Tisch’s son, Andrew), had, according to Bergman, “put a chill into the general counsel and the corporation.” What’s more, as Bergman points out, “CBS was also up for sale.” In fact, one day after the ballyhooed Disney-Capital Cities-ABC merger, in August 1995, CBS announced that it too was looking for suitors; in the end, the company accepted a $5.4 billion offer from Westinghouse Electric Corporation, which reportedly earned more than $500 million for Larry (and brother Preston) Tisch’s Loews Corporation, the largest shareholder of CBS. The sale, which was approved a few days after the embarrassing, truncated 60 Minutes interview with Wigand, also made millions for CBS general counsel Ellen Kaden and CBS News president Eric Ober, the very executives who told Bergman, Wallace, Hewitt, and their cohorts to halt their work on the tobacco project.105
In this specific case, what was truth? From the moment that lawsuit went on, truth then had to be, “Wait a minute, we need to reshape this now.” They didn’t say change the facts, we had to just “reframe” how we were doing this because of new events or evolving situations. So truth now gets a little murkier. Within a couple of months after that, when it becomes clear, on the corporate level, that this merger was going to happen and therefore in order to make it happen the Philip Morris lawsuit had to go away, the only way it could go away was to cut a deal with Philip Morris. So the humiliating apology on Walt Bogdanich’s piece, and the disappearance of my hour—and the truth then became “Oh, there’s no news here.” From out of the mouths of the very people who a year before said, “This is terrific stuff.”
. . . The new truth was defined by the economic realities of the corporation. Truth became, “Well, this is something that is not worthy of airing. . . .”
Tragically, the cigarette companies’ cold-blooded, calculated cover-up over many decades—blowing smoke about their lethal, addictive products while knowing full well they were causing death and suffering—also has been the modus operandi of numerous other industries. Manufacturing uncertainty for years and often decades and attempting to thwart regulation while raking in the cash from their mortally dangerous products is what the asbestos, coal, lead paint, dye (beta-naphthylamine [BNA], benzidine), metal (beryllium), pesticide (DDT), plastics (vinyl chloride), pharmaceutical (ephedra, Vioxx, Rezulin), and other industries have done throughout the past century.111
In response to Silent Spring, the chemical industry mounted a coordinated public relations counterattack. At the time, E. Bruce Harrison worked as manager of environmental information for the Manufacturing Chemists Association, his precise task to help orchestrate the industry’s campaign against the book, working closely with public relations professionals at DuPont, Dow, Monsanto, Shell Chemical, Goodrich-Gulf, and W.R. Grace. According to public relations industry watchdogs John Stauber and Sheldon Rampton, Harrison’s “crisis management” techniques included “emotional appeals, scientific misinformation, front groups, extensive mailings to the media and opinion leaders and the recruitment of doctors and scientists as ‘objective’ third party defenders of agrichemicals.”113
In fact, a two-year analysis by the Investigative Reporting Workshop at American University in Washington, DC, found that Koch Industries has “developed what may be the best funded, multifaceted, public policy, political and education presence in the nation today.” From 2007 through 2011 alone, while Koch Industries spent $53.9 million lobbying for its federal and state policy agenda, Koch private foundations were giving $41.2 million to eighty-nine nonprofit public-policy-related organizations and $30.5 million to 221 US colleges and universities. Koch Industries and the Koch brothers contributed $8.7 million to congressional or presidential candidates and the Republican Party. In 2011, these mutually reinforcing, private and public “deregulation” efforts helped the oil and gas industry kill climate change legislation in the House of Representatives.118 The Kochs and their foundations have also founded or substantially underwritten the organizing entities behind the conservative, antigovernment Tea Party movement, the Americans for Prosperity Foundation, and FreedomWorks.119
Doubt is their product. And their enemy? The unpalatable truth.
But Murrow had misgivings about television, which were widely shared by “Murrow’s Boys,” the team of remarkable young reporters (Eric Sevareid, Charles Collingwood, Howard K. Smith, among others) widely considered the “giants of broadcast journalism.” As CBS News historian Gary Paul Gates noted, they wanted no part of television: “Sevareid more or less spoke for the entire Murrow team when, to a friend, he lamented, ‘That damn picture box may ruin us all.’”11
Perhaps for this reason, although Murrow had done some on-camera reporting at the 1948 political conventions, his weekly television news program, See It Now, didn’t debut until November 18, 1951. Sig Mickelson, the first director of CBS television news, describes See It Now as the first successful long-form, news-related program that was pure television rather than “a feeble copy of techniques used in either radio or newsreels, though it borrowed from both.”12 The program was an immediate hit with the public and the print press. The New York Times gushed, “Edward R. Murrow’s program ‘See It Now’ . . . [is] a striking and compelling demonstration of the power of television as a journalistic tool, lifting the medium to a new high in maturity and usefulness.”13 As author Bob Edwards has observed, “Finally, educated people would admit without shame that they owned a TV set. For the second time, Edward R. Murrow had introduced a broadcasting medium to in-depth news.”14
In retrospect, there were ominous signs from the beginning. With the advent of the Cold War, the embryonic CBS and NBC networks found themselves under public pressure to adhere to acceptable political standards. In 1946 alone, they fired two dozen “left-leaning correspondents” amid demands “to ‘tone down’ news which is sympathetic to organized labor and to Russia.”16 In 1947, a popular newsletter called Counterattack, founded by three former FBI agents, and another publication called Red Channels, which listed 151 broadcast journalists and entertainers “who allegedly had Communist leanings,” began turning up the heat, criticism to which Paley and others at CBS were very sensitive. After the House Un-American Activities Committee began hearings later that year into “subversives” in the US film industry, the Red Scare also deepened within broadcasting. According to Variety, “any actor, writer or producer who has been even remotely identified with leftist tendencies is shunned.”17
Things got even more precarious in February 1950, when Senator Joseph McCarthy declared that the State Department had been substantially infiltrated by Communists. Murrow soon questioned McCarthy’s claim on his radio newscast, drawing many letters of protest to Murrow’s corporate sponsor, Campbell Soup, which withdrew its sponsorship in June. Indeed, Edward R. Murrow with the News never again had a national sponsor, only regional sponsors.18
Within six months, CBS required all of its 2,500 employees—including Murrow—to sign a loyalty oath affirming that they were not, and never had been, members of the Communist Party or any other group advocating the overthrow of the US government.19 And according to author Edward Alwood, between 1952 and 1957, HUAC, the Senate Internal Security Subcommittee, and Senator McCarthy’s Subcommittee on Government Operations subpoenaed over one hundred newspaper journalists to testify, some of them publicly. Fourteen who refused to do so, including four New York Times reporters, were fired.20
The specter of political intimidation from Washington hung over the entire industry as Congress held a series of highly publicized investigative hearings about the motion picture industry and broadcast television. Broadcasters feared increasing scrutiny and government regulation of the TV industry, worrying, for example, that the Federal Communications Commission might refuse to renew broadcast licenses of those TV station owners supposedly linked to the Communist Party. In the words of broadcasting historians Robert L. Hilliard and Michael C. Keith, “The FCC and the rest of the government in the 1950s—like the attitudes and behavior of most of the United States, business and public alike—were being held hostage by McCarthyism.”21
The response of the networks to this intimidation was less than heroic. Both CBS and NBC unabashedly blacklisted employees, and network executives also quietly shared personnel-related security information. Equally disturbing, during this grim time, CBS founder Paley was allowing “CIA operatives to screen CBS News film, to eavesdrop on conversations between CBS news officials in New York and the field and to debrief CBS correspondents on their return from overseas assignments.” And he was also aware that “CIA agents from time to time operated as part-time CBS correspondents.”22 It was obviously one of the darkest periods in US history for the “due process” clauses of the US Constitution, freedom of the press, and the ideal of an independent media serving as a watchdog over government power.
Murrow’s report was one of the first national broadcasts to challenge McCarthy, using film clips to highlight some of the many self-contradictions in the senator’s public statements. At the end of the dramatic program, Murrow closed his commentary this way:
We proclaim ourselves—as indeed we are—the defenders of freedom, what’s left of it. But we cannot defend freedom abroad by deserting it at home. The actions of the junior Senator from Wisconsin have caused alarm and dismay among our allies and given considerable comfort to our enemies. And whose fault is that? Not really his. He didn’t create this situation of fear. He merely exploited it, rather successfully. Cassius was right: “The fault, dear Brutus, is not in our stars, but in ourselves.”
Good night and good luck.28
CBS and Murrow offered McCarthy a full half-hour in which to respond, which he did on April 6, angrily calling Murrow “the leader and the cleverest of the jackal pack which is always found at the throat of anyone who dares to expose individual Communists and traitors.”30 The nation soon saw much more of the erratic, bombastic senator in the Army-McCarthy hearings, gavel-to-gavel televised coverage of which began on April 22. The hearings filled 187 hours over thirty-six days and culminated in the dramatic confrontation when army counsel Joseph Welch pointedly asked McCarthy, “Have you no sense of decency, sir?”
Months later, on December 2, 1954, the US Senate voted 67–22 to “condemn”—although not to expel—McCarthy. Rejected and bitter, McCarthy drank himself to death, dying two and a half years later. In the end, not a single proven act of espionage or subversion was ever uncovered by his subcommittee.31
Half a century later, the entire Murrow and Friendly saga at CBS still epitomizes the core conundrum of commercial television news. TV is an immensely powerful medium, but its potential to make astonishing sums of money is typically realized only by appealing to the lowest-common-denominator instincts of viewers. As a result, serious journalism—particularly investigative or other expensive-to-produce, in-depth reporting—will necessarily be undertaken by commercial TV news executives with great caution, inevitably taking a back seat to more crowd-pleasing, less costly, and thus more lucrative programming.
Yet in the early years of television, the potential for a different path existed. Few people realize that, as author Chad Raphael has pointed out, the “high point of the (network television) documentary boom” in America was in 1962, when ABC, NBC, and CBS collectively produced 447 news reports (not all investigative in nature, of course).35 Raphael has argued that “television’s contribution to the mainstream media’s first sustained period of muckraking since the Progressive era compares more favorably to print than is usually thought.” But he has also documented how the most aggressive and best-known television documentaries of that period—such as Murrow’s “Harvest of Shame” in 1960, Charles Kuralt’s “Hunger in America” in 1968, Roger Mudd’s “The Selling of the Pentagon” in 1971—all engendered substantial controversy and even government investigations.36
But what, exactly, created the move to shorter snippets of serious information? Was it television’s apparently natural impulse to simplify everything for its millions of viewers? Or did a complex combination of factors in our increasingly frenetic, mass-media world make the change inevitable? Whatever the answer, in the decades since Murrow and Friendly plied their trade, I’ve seen more than a dozen investigative-reporting units at the TV networks come and then go, usually for all the wrong reasons. Ultimately, well-meaning, immensely talented on-air and off-air journalists were figuratively ground up and spit out, departing their posts in bitterness and frustration, and leaving truths our citizenry desperately needs to know unexplored.
Other trends in television news are equally disheartening. With the advent of cable television in the United States, not to mention over-the-air TV channels, the Internet, mobile phones, and social networks, roughly 30 percent of the population “stopped watching any news at all.”38 For me, the most troubling statistic of all is that as additional sources of online news, entertainment, and other information have proliferated, consumer “use of newspapers, news magazines, and television is at a 50-year low” in the United States, according to Robert Picard, the director of research at Oxford University’s Reuters Institute.39
In the words of Paul Starr, a two-time Pulitzer Prize–winning author and a longtime Princeton University professor: “The digital revolution has been good for freedom of expression because it has increased the diversity of voices in the public sphere. The digital revolution has been good for freedom of information because it has made government documents and data directly accessible to more people and has fostered a culture that demands transparency from powerful institutions. But the digital revolution has both revitalized and weakened freedom of the press.” Starr has also noted what I have certainly discerned in the past decades: studies throughout the world indicate “that corruption flourishes where journalism does not . . . The less news coverage, the more entrenched political leaders become and the more likely they are to abuse power.”40
Meanwhile, “most Americans still get their news from their favorite local TV news team,” according to an authoritative government study, “The Information Needs of Communities.”41 Unfortunately, network television news and newsmagazine staffs are roughly half what they were in the 1980s, whereas most local TV stations have increased the amount of news programming while also reducing their editorial positions.42 As Matthew Zelkind, the news director of WKRN-TV in Nashville, Tennessee, told the Federal Communications Commission, “Long-form stories are dying because they’re not financially feasible . . . It’s all economically driven.”43 Throughout the United States today, only 20 to 30 percent of the population has access to a local all-news cable channel, and one-third of commercial TV stations offer “little or no” news.44 Sadly, far too many local stations today still rely on slick, “happy talk” anchor banter, as well as the familiar 1970s-era formula of “Action News” in notorious “if it bleeds, it leads” form, followed by weather, sports, and such “human interest” video fare as a water-skiing squirrel or a snowboarding opossum.45
The toll on America’s newspapers has been profound. Between 1992 and 2009, the number of commercial newspaper editorial employees in the United States dropped by 33 percent, to 40,000 from more than 60,000, according to a Columbia University study written by Leonard Downie Jr. and Michael Schudson.47 Most of those cuts have occurred recently (13,400 jobs lost between 2007 and 2010),48 thanks in large part to a 47 percent decline in newspaper advertising revenue between 2005 and 2009. The fact is, according to the Federal Communications Commission, even though the country’s population grew to 308 million in 2010 from 203 million in 1970 (a gain of roughly 50 percent), today we have approximately the same number of journalists watching those in power as we did then. What’s more, we have half as many television network news staffers as we had in the 1980s.49
This means that newspapers in twenty-seven states now have no reporters in Washington covering their members of Congress and myriad other state-relevant subjects. Moreover, the number of reporters covering state governments in the United States fell by one-third between 2003 and 2008 alone, their thinning ranks clearly not up to the task of following some 22,000 laws passed each year in state capitals. And coverage of global events has been similarly affected: between 2003 and 2011, the number of American foreign correspondents dropped by 24 percent, to 234 from 307, according to the American Journalism Review.50
Against this backdrop of drastically reduced coverage, the number of entries for the Pulitzer Prize for in-depth, nondaily reporting, the most prestigious award for print journalism, has also dramatically dropped. From 1985 to 2010, applications for the public service “Gold Medal” category fell to 70 from 122, for the investigative reporting category to 81 from 103, and for explanatory journalism to 104 from 181.51 The organization Investigative Reporters and Editors, of which I have been a proud member since 1982, saw its dues-paying membership drop to 4,000 in 2010 from 5,391 in 2003.52
The sharp decline in the number of professional journalists and Pulitzer entries is not occurring in a vacuum. In a society increasingly beset by public relations, advertising, and other artificial sweeteners manufactured by message consultants and communications flacks, how does an ordinary citizen decipher truth amid the “pseudo-events” and the vast “thicket of unreality which stands between us and the facts” so aptly described by Daniel Boorstin in his classic 1962 book, The Image: A Guide to Pseudo-Events in America? Boorstin, a Librarian of Congress Emeritus and one of our most distinguished historians, who died in 2004, called for Americans to “disillusion ourselves. What ails us most is not what we have done with America, but what we have substituted for America. We suffer primarily not from our vices or our weaknesses, but from our illusions. We are haunted, not by reality, but by those images we have put in place of reality.”53
Other research, both in the United States and elsewhere, confirms this basic trend. A study of ten newspapers in 2009 by the Australian Centre for Independent Journalism, at the University of Technology in Sydney, found that “55 percent of the stories analyzed were driven by some form of public relations.”58 The Columbia Journalism Review analyzed one edition of the Wall Street Journal and found that more than half of the news stories “were based solely on press releases” reprinted “almost verbatim or in paraphrase.”59
By the mid-1990s, Roberts had become so concerned about what he called the “corporatization of newspapers” that he launched the Project on the State of the American Newspaper, funded by the Pew Charitable Trusts, which produced two books and twenty articles in the American Journalism Review. That important work illuminated, among other things, the extraordinary extent of newspaper consolidation that had occurred. Between 1994 and mid-2000 alone, roughly 40 percent of America’s daily newspapers were sold at least once, with small papers (circulation under 13,000) accounting for 70 percent of that total.67 According to journalist Mary Walton, “the frenzy of buying and selling has produced a new breed of ‘financial owner’ for whom small newspapers are just another business . . . Because the bulk sales from old to new chains typically have involved a large number of leanly run ‘mature papers’ in no- to slow-growth communities, the only way the new owners can increase revenue is to find fresh properties and cut costs.” Buyers were especially interested in finding groups of newspapers that could be “clustered” around one printing plant, with newsroom staffs in some instances also being shared and presumably made more “efficient.”68
Today, in the publicly traded newspaper company, the business of news is being transformed into the business of business. News is not its product, upon which the enterprise depends for its long-term survival. News is instead increasingly an instrument by which advertisers are lured, customers are efficiently reached, advertising rates are increased, news staff is cut, and margins are increased, and increased, and increased. At its worst, the publicly traded newspaper company, its energy entirely drawn to the financial market’s unrealistic and greedy expectations, can become indifferent to news and, thus, ultimately to the fundamental purposes served by news and the press. Some of the publicly traded companies are today acting as if . . . they see themselves as simply a channel for consumption, a broker and distributor of commerce.70
Indeed, as Jack Fuller, the former editor and publisher of the Chicago Tribune, painfully acknowledged in his 2010 book, What Is Happening to News, “every newspaper company, even those led by people totally committed to striking a proper balance between the financial and social missions of journalism, has been beaten down.”71 So beaten down, in fact, that in 2011, print advertising revenues dropped for the sixth consecutive year, and Gannett—the nation’s largest newspaper publisher as measured by daily circulation—purged the word “newspapers” from its home page. And according to the Newspaper Association of America, although online advertising in 2011 increased by $207 million compared to 2010, print advertising over that same period dropped by $2.1 billion. “So the print losses were greater than the digital gains by 10 to 1.”72
To understand a little better what propels some souls toward careers in enterprise journalism—a term that embraces investigative journalism as well as other forms of in-depth, original reporting on topics whose importance transcends that of the usual flow of daily events—let’s consider the formative years of three of the best.
Journalist Florence Graves decided that investigative journalism was for her at a very early age. “I would read biographies of women,” she told me. “And I remember reading biographies of Nellie Bly and Ida Tarbell when I was a kid in elementary school and thinking, ‘Wow. That’s what I want to do.’” Their compelling personal stories of exposing outrageous abuses of power in settings ranging from a mental asylum to the most powerful corporation in America provided Graves with a “picture that this was possible. Because in the ’50s and ’60s Texas that I grew up in, that was not possible.”4
“I saw that [Bly and Tarbell] were able to use their brains,” Graves says. “They were able to use their curiosity, their determination to actually go out and do stories that were important, that would make a difference that in many cases could lead to changes in society. Because even as a kid, I could see that there were things that needed to be changed in the world. And journalism was a way to bring truth to people, or a form of the truth.”5
Bill Kovach grew up poor in the Appalachian region of Johnson City, Tennessee, where his Albanian parents ran a little coffee shop in the Trailways bus station. “My dad was an illegal immigrant. He worked his way over on a steamer as a fourteen-year-old boy. And my mother was brought over by her older brother . . . she and her sisters and her brother were orphaned in the first war when the Italians bombed their home, killed their parents.”7
Kovach’s father died when he was twelve, and young Bill developed into an observant and angry adolescent:
I grew up in the streets, because my mother was working all the time . . . And in the streets of that town . . . there was a veteran’s hospital that was one of the biggest industries. And that veteran’s hospital had a lot of World War I gas victims, both Canadian and U.S., whose lungs were destroyed by the gas attacks. They were incapacitated for life and living at the hospital. And they were emaciated, sad people who just needed a drink to forget about things. So I was always, as a kid, watching cops kicking these guys around, you know, people taking advantage of them. And it infuriated me, in part, because, you know, we were outsiders. In East Tennessee there weren’t a lot of Albanians.8
Daniel Schorr realized his calling earlier than most. He grew up in a ground-floor tenement apartment in the Bronx in the 1920s, fatherless when he was five and with a younger brother afflicted with polio. “On one hot summer’s day with the windows open, I heard a big plop outside. And I went to the window to see what it was. And it was a man lying on the stone, dead. He had jumped or fallen.” The twelve-year-old Schorr went outside with paper and pencil, asked the police who the man was and why they thought he had fallen, and then called in the story to the local newspaper. His natural instinct, even then, was that “my job is to report it and explain what it’s about. It’s very strange.” Schorr was paid $5.00 for that first scoop.10
From breaking major Watergate scandal stories to exposing secret assassinations of foreign leaders by the United States and so many other significant scoops, Daniel Schorr had an extraordinary career over six momentous decades, winning every major US broadcast journalism award. Hired by Edward R. Murrow at CBS News in 1953 and getting the first televised interview with Soviet premier Nikita Khrushchev in 1957, Schorr developed a reputation as a tenacious reporter who, according to the Washington Post, “broke major national stories while also provoking presidents, foreign leaders, the KGB, the CIA and his bosses at CBS and CNN.” He spent the last twenty-five years of his life as a senior news analyst for National Public Radio.11
It became painfully apparent over time that network television news was not especially interested in investigative reporting, certainly not to the extent or the depth of the best national print outlets. In fact, the most trusted man in America around this time, CBS News anchor Walter Cronkite, had told Time magazine something in 1966 that still rang true more than a decade later: that “the networks, including my own, do a first-rate job of disseminating the news, but all of them have third-rate news-gathering organizations. We are still basically dependent on the wire services. We have barely dipped our toe into investigative reporting.”12
Our first five years of investigating corruption in state legislatures culminated in a national investigation of conflicts of interest by state lawmakers that, in 2000, was discreetly disseminated in embargoed fashion to a consortium of fifty leading newspapers in fifty states.8 We analyzed and posted online the annual financial disclosure filings of more than 5,700 state lawmakers, exposing literally hundreds of apparent conflicts of interest related to their conduct of official business. That scrutiny continued for the next decade with a series of major reports that generated significant local media coverage and prompted twenty-one states to change their financial disclosure laws, forms, or rules pertaining to lawmakers. Similarly, after the Center exposed the lax disclosure systems regarding lobbying, twenty-four states improved their lobbyist transparency requirements.9
Thanks to these exhaustive investigations of dubious government activities, the Center grew steadily in reputation, funding, and size. But one subject we hadn’t yet tackled was the media’s power and influence, which citizens were increasingly asking me about. So in 2001, we created the “Well-Connected” project, which tracked information about media, technology, and telecommunications corporations. Our watchdog reporting included cumulative, detailed, political-influence information gleaned from government documents about media ownership, as well as media and telecom companies’ federal and state lobbying and campaign contribution activities. The project team made news in 2003 when it uncovered that Federal Communications Commission officials had been taken on 2,500 all-expense-paid trips, over an eight-year period, by the media companies they were entrusted to regulate.10 Within months, Congress curbed all such privately funded travel at the agency.
Other projects that earned widespread public attention included the online publication, in February 2003, of the secret “Patriot II” draft legislation, which the Center revealed against the explicit wishes of the Justice Department.11 The story caused a bipartisan uproar, as Congress had been told for six months that there was no Bush administration intention to propose sequel legislation to the 2001 Patriot Act. Weeks later, mere days after the US invasion of Iraq, the Center published a new report disclosing that at least nine of the thirty members of the Defense Policy Board, the government-appointed group that advises the secretary of defense, had ties to companies with more than $76 billion in defense contracts in 2001 and 2002.12
We like to think that the Center for Public Integrity’s publication of thoughtful, in-depth investigative reporting—and the public’s growing embrace of that work—provides a potent counterweight to some of the disturbing trends unfolding in American media: the rise of the cable TV shout-fests; talk radio’s derisive invective; and the flight to shorter, lighter local and network television news stories, sometimes augmented with cartoonish graphics to make sure the audience actually understands the point of the reporting. At times, it feels as though too many people are becoming benumbed by what Carl Bernstein calls “the spectacle, and the triumph, of the idiot culture.”
I once asked Bernstein what precisely had prompted his eloquent 1992 denunciation of that “idiot culture” in our media. He described to me precisely when, for him, the downslide had all begun:
[It started when] not just the New York Post and the New York Daily News tabloids, but Newsday, which was then owned by the Los Angeles Times and was probably the best tabloid newspaper in America, did the ‘whole front page this day’ [in 1990] about the breakup of Donald Trump’s first marriage to Ivana Trump and Trump’s new relationship with this woman named Marla Maples. And on that same day, the allies of World War II agreed to the reunification of Germany, and Nelson Mandela was released from the South African gulag that he had been in for all of those years. Those stories were inside Newsday, the two other tabloids here, and many other newspapers in America. And [in] every local news broadcast in this town [New York City], television, the Mandela and reunification stories followed the Trump marriage stories.
To me, that became a kind of allegory . . . for what was happening. And meanwhile, the following week or two, ABC News, where I had gone to work, premiered its news magazine show called Primetime Live with Sam Donaldson, who is a great reporter, and with Diane Sawyer, who is remarkably talented. And Diane was sent not to the Brandenburg Gate and not to Robben Island in South Africa. She was sent to Marla Maples’ apartment. And I wrote, “That is the triumph of idiot culture.”13
In October 2003, for example, the Center published “Windfalls of War,” which examined the major US government contracts in Afghanistan and Iraq, definitively revealing Halliburton and its subsidiary, Kellogg, Brown and Root, to be the overwhelmingly largest financial beneficiary of our invasions of those countries. For six months, twenty researchers, writers, and editors had worked on the project, filing seventy-three Freedom of Information Act (FOIA) requests and even suing the US Army and the State Department (and ultimately winning the release of key, no-bid contract documents).14 That report, which won the first George Polk online investigative reporting award, was prepared by the Washington staff of the Center’s International Consortium of Investigative Journalists, which I began in 1997 and which is helping to fill the void for aggressive reporting left by the contraction of commercial media.
ICIJ is the first working network of some of the world’s preeminent investigative reporters collaborating to produce original international enterprise journalism, its ranks now comprising 175 people in over sixty countries on six continents. I’ve already mentioned “Windfalls of War,” which ICIJ helped produce in 2003; that report had been preceded in 2002 by “Making a Killing: The Business of War,” which used contributions from thirty-two reporters globally to identify ninety private military companies working for governments, corporations, and even criminal groups around the world.15
In July 1999, I asked a recent University of Delaware grad, newly arrived in our offices, to help me explore a new way of monitoring and reporting on corruption, government accountability, and openness around the world. That effort culminated in a 750,000-word Center report, published online in 2004, entitled “Global Integrity.” The unprecedented undertaking, by far the Center’s largest-ever effort, was prepared by two hundred paid social scientists, journalists, and peer review editors in twenty-five countries on six continents.
This massive project spawned Global Integrity, a new nonprofit organization with an academic, social-science orientation and quantitative methodological component, and with greater and more diverse funding and capacity needs than the Center for Public Integrity. To address this situation, in December 2004, I recommended to the board of directors that this new entity be spun off as a global nongovernmental organization, completely separate and independent from the Center for Public Integrity. It was one of my last official acts as executive director of the Center.
But weeks after we had published a story about Dick Cheney’s years as the CEO of Halliburton, in August 2000, we found ourselves being sued in US district court in Washington by two Russian billionaire oligarchs represented by the powerful DC law firm of Akin Gump Strauss Hauer & Feld, LLP. Published within days of the Republican National Convention at which Cheney was nominated as the party’s candidate for vice president of the United States, the story showed that Halliburton had doubled its lobbying, campaign contributions, and federal government contracts/loans during Cheney’s five years as CEO, compared to the five years preceding his tenure there—and that among the apparent beneficiaries of Cheney’s influence-peddling were the powerful owners of one of Russia’s biggest banks.21
This David versus Goliath struggle cost each side millions of dollars. My major goal was to shield the Center newsroom and allow our ongoing investigative reporting to continue relatively undeterred and undistracted. It worked: during the five-year siege of litigation, we still managed to publish one hundred investigative projects, including four books.22 The case was finally dismissed in September 2005, upholding the vital principle that public figures cannot prevail in a libel suit against journalists unless they can demonstrate “actual malice,” meaning knowledge of falsity or reckless disregard for the truth—an appropriately high standard that clearly exonerated the Center.
Of course, we were delighted that the suit was unsuccessful. But it illuminated the immediate need for an institutional bulkhead protecting the organization from future storms of litigation. To provide such a shield, the Fund for Independence in Journalism, a 509(a)(3) endowment and legal defense support organization, was created. Initial foundation contributions totaled $4 million, and, in addition, five of the most prestigious law firms in America pledged, on a case-by-case basis, to defend the Center for Public Integrity in any future actions, pro bono.23
The Center for Public Integrity isn’t the only nonprofit organization leading a rebirth of the practice of investigative journalism. Excluding the partially government-funded National Public Radio, the Public Broadcasting Service, and their hundreds of affiliate stations and local-news websites, today there are roughly one hundred professional nonprofit news organizations operating throughout the United States. More than two-thirds of those operations were created since 2004.26 Eighteen of them operate from university campuses—an attempt to inculcate core journalistic values and technical know-how in new generations of reporters and editors so that they can continue this essential work. The largest university-based reporting center, and the only one in the nation’s capital, is the Investigative Reporting Workshop, which I proposed to the American University School of Communication in late 2007 and have since led. Its purpose is twofold: to create “significant, original investigative reporting on subjects of national and international importance, combining the talents and energies of preeminent journalists working with graduate students,” and to analyze and experiment “with new economic models for creating and delivering investigative reporting” and, in general, to find ways to enlarge the public space for this vital work.27
In late 2011, when my researchers and I at the Investigative Reporting Workshop examined the relevant IRS tax documents and other materials, we found that the seventy-five journalistic organizations profiled collectively boasted annual funding of $135 million and employed 1,300 full-time, paid staff members.28 In addition, nonprofit investigative news organizations are now operating from England to Italy, from Peru to South Africa, from Jordan to Southeast Asia, where, in 1990, nine female journalists founded the Philippine Center for Investigative Journalism, the first known such nonprofit outside the United States.29
One of the distinguishing characteristics of this new phenomenon has been the migration of top major media editors to this nonprofit environment. The abbreviated roster includes former vice president of news for NPR, Bill Buzenberg, now executive director of the Center for Public Integrity; former assistant managing editor of the Hartford Courant, Lynne DeLucia, who cofounded the Connecticut Health Investigative Team; Florence Graves, founder and editor of Common Cause Magazine, who started the Schuster Institute for Investigative Journalism at Brandeis University; former senior editor for Metro and Watchdog Journalism at the San Diego Union-Tribune, Lorie Hearn, now editor of Investigative Newsource; Joel Kramer, who separately served as editor and publisher of the Minneapolis Star-Tribune, and launched MinnPost; former executive editor of the Philadelphia Inquirer, Robert Rosenthal, who now directs the Center for Investigative Reporting; Paul Steiger, former managing editor of the Wall Street Journal, who became founding executive editor of ProPublica; and Margaret Wolf Freivogel, former assistant managing editor of the St. Louis Post-Dispatch, who is the founding editor of the St. Louis Beacon.30
Another tremendously exciting development is a move by journalism departments in the nation’s colleges and universities to embrace a “teaching hospital” model, creating daily and investigative news stories for mainstream outlets—in some cases year-round, rather than just during academic semesters—through the reporting and writing of upper-level undergraduates and graduate students. One notable example is the Carnegie-Knight Initiative on the Future of Journalism Education, whose News21 program has, since 2005, annually recruited university journalism students to produce in-depth reporting on a single subject. Their multipart, multimedia series, edited and coordinated by Arizona State University’s Walter Cronkite School of Journalism and Mass Communication, have been co-published with major news media outlets.33 In 2010, for instance, the Washington Post and MSNBC.com were among those that published parts of a twenty-three-story series, produced by eleven student reporters from eleven universities, on shortcomings in US transportation safety.
Through a partnership between the Post and American University’s School of Communication (SOC), the annual Washington Post Fellow Program was created: the newspaper pays for the master’s degree of an incoming journalism student, and the school names five “Dean’s Fellows” for annual internships inside the Post newsroom. By the end of the 2011–2012 academic year, seven School of Communication students had written more than two hundred bylined articles for the Post.37
Then, in early 2013, with help from a Ford Foundation grant, the Post and SOC announced they were jointly hiring highly respected investigative reporter John Sullivan.38 Previously, Sullivan had led a five-person team at the Philadelphia Inquirer that explored violence in the city’s schools, including crimes committed by children against other children. In 2012, the seven-part, multimedia exposé, which prompted reforms to improve safety for both teachers and their students, won for the Inquirer the Joseph Pulitzer Gold Medal for Public Service, the most prestigious Pulitzer Prize for journalism.39
In the meantime, the sharing of values, resources, and even content has also been taking place within the new nonprofit journalism ecosystem itself. For example, in July 2009, the founders of roughly twenty nonprofit news organizations gathered outside New York City and created the Investigative News Network. Leading philanthropic foundations got behind the historic effort, INN was incorporated, the IRS approved its tax-exempt status, and an executive director was hired. I am proud to say I was present at the creation, proposed the concept and name to the group, and serve as a founding board member and officer. The enterprise is still in its infancy, although membership in this nonprofit news publishers association has already reached nearly one hundred. I hope this is just the beginning and that INN will pioneer the collection and syndication of the best nonprofit investigative reporting content in the United States, and perhaps throughout the world.40
But a dose of perspective is also in order. Paul Starr, a Pulitzer Prize–winning author and a longtime Princeton University professor, recently pointed out the discrepancy between the growth of the nonprofit journalism sector and the massive decline in its for-profit counterpart: “It is hard to see how philanthropy can match the resources that are being lost. Since 2000, the [US] newspaper industry alone has lost an estimated ‘$1.6 billion in annual reporting and editing capacity . . . or roughly 30 percent,’ but the new nonprofit money coming into journalism has made up less than one-tenth that amount.”42
Unfortunately, the situation has since gotten discernibly worse, which makes whistleblowers and leakers, technologically ingenious hackers, and other “idealists, anarchists [and] extremists” seem somewhat heroic, welcome antidotes to our worsening affliction.48 Consider that in 2001, 8.6 million US government documents were classified; by 2008, 23.8 million were classified; and by 2010, “despite [President] Barack Obama’s promises of a more transparent government,” 76.7 million documents were classified!49 Today, 4.2 million Americans have some form of security clearance to read classified documents—and of those, 1.2 million have Top Secret clearances.50 In effect, our citizenry is divided into two tiers—a small elite with access to inside knowledge about our government, and a vast lower echelon that is kept in the dark. There’s no reason to believe the situation is better in most other countries.
Today there is a robust global “right to know” information movement and a related anti-corruption community, which together have launched various cross-border collaborations, including conferences. Some of the organizations involved, from newest to oldest, are:
• The Sunlight Foundation, founded in Washington, DC, in 2006, which “uses the power of the Internet to catalyze greater government openness and transparency”
• The Open Democracy Advice Centre, based in Cape Town, South Africa, created in 2000 to help with the implementation of the recently enacted access-to-information laws there
• Transparency International, created in 1993 and based in Berlin, best known for publishing the annual Corruption Perceptions Index, which uses polling data to rank the world’s countries by their perceived levels of corruption
• Article 19, founded in 1987 and based in London, which promotes “access to information needed to hold the corrupters and the corrupted to account”
• The National Security Archive, located at George Washington University in Washington, DC, founded in 1985 by journalists and scholars “to check rising government secrecy,” which, as the foremost nonprofit user of the US Freedom of Information Act, has submitted 40,000 Freedom of Information and declassification requests to over two hundred US government offices and agencies, in the process prying loose more than 10 million pages of documents
• The Center for Effective Government (formerly OMB Watch), also based in Washington, DC, one of the nation’s leading transparency advocacy organizations, established in 1983 to “lift the veil of secrecy shrouding the White House Office of Management and Budget”
• The Center for Responsive Politics, the preeminent research group tracking private campaign money in US elections, started in Washington in 1983
• The National Institute on Money in State Politics, based in Helena, Montana, which maintains the only comprehensive and searchable online record of political donations in all fifty states
• The Carter Center, created by former president Jimmy Carter in 1982, in partnership with Emory University, in Atlanta, which seeks to “enhance freedom and information” in the Americas and elsewhere
• The Project on Government Oversight, founded under a different name in 1981, which states that its “investigations into corruption, misconduct, and conflicts of interest achieve a more effective, accountable, open, and ethical federal government”54
Alongside these advocacy and research groups, there are now 6,603 think tanks operating in 182 countries, nearly 60 percent of them in North America and Western Europe, many based at or affiliated with colleges and universities.
For example, there are tens of millions of Americans who are environmentally concerned or curious, and many more like them throughout the world, but there is no international broadcast, cable TV, or global video source focused exclusively on this theme.62 There should be. Global multimedia attention to a topic like the environment—or human rights, education, government corruption, international security, corporate governance, health care, or other similar topics—would provide cross-cultural coherence, as well as vitally important context and information, to interested communities everywhere. One hopes that someday there will be such programming for all these important areas of interest and many more.
Indeed, I believe we are already moving toward creating what I informally call online knowledge clusters. Why can’t there be places where citizens from anywhere can go to find the most authoritative, up-to-date, searchable knowledge about vital issues, combining the best documented information from multiple crowd sources, including academia, journalism, government, and the private and nonprofit sectors? The nearest equivalents today are Wikipedia and perhaps Encyclopedia Britannica and other such knowledge compendiums.63 These online sites are remarkable achievements, but they are often inadequate, inaccurate, or outdated.
We need new ways to amass knowledge across borders and cultures based on documented, reliable sources. Imagine combining the most authoritative information from disparate sectors, including journalism and such academic subject areas as investigative history, forensic accounting, computer science and statistics, political science, economics, public anthropology, human rights, and public interest and other law-related fields, along with court proceedings and their related, unsealed case materials. Imagine an online portal that provides access to all these materials and more—a central clearinghouse for data about the world’s worst corporate, financial, and environmental scofflaws; the worst violators of health, safety, and labor rights regulations; companies that have been decertified by one of the world’s stock exchanges for fraud or other misbehavior; government agencies that have squandered public funds; and political organizations and lobbying groups that mislead and deceive the citizenry. Creating such a source is mostly a matter of time and money, and I believe it is inevitable.64
We journalists need to become less arrogant about our status as filters of information; we need to be more ready to acknowledge the value of authoritative investigative information unearthed by others, from expert specialists to ordinary citizens; and we need to learn to collaborate more closely with one another and with professionals from many fields in our collective search for truth.
For this reason, I have proposed the creation of a new multi-disciplinary academic field called Accountability Studies.66 Ideally, it would involve professors with different types of accountability knowledge and expertise from throughout the university, and it would enable students to earn a specialized degree in the field. I believe that an array of courses addressing these topics should be offered at every major university. My years of teaching and mentoring hundreds of research interns at three reporting/research centers have shown me that students from widely different academic backgrounds are excited about the prospect of learning exactly how to investigate those in power and hold them accountable. Many thousands would be eager to devote their careers to such work—and not necessarily in traditional journalism.
In his 2007 book, The Meaning of the Twenty-first Century, Martin wrote about the most serious social, economic, environmental, and political challenges facing our world. He concluded:
Today there are major roadblocks preventing the actions that are needed. There are huge vested interests with massive financial reasons for not changing course . . . There is widespread ignorance . . . For the powerful people who control events, the desire for short-term benefits overwhelms the desire to solve long-term problems. If these roadblocks are not removed, we will steadily head down paths that lead to catastrophe: famines, violence, wars over water, pollution, global pandemics, runaway climate change, terrorism with new types of mass-destruction weapons.68
It would be hard for anyone to claim that the problems we face have become any more tractable in the years since Martin wrote those words—or that the “huge vested interests” he described have become any less shortsighted and irresponsible. But who, exactly, is going to hold the powers that be accountable? Who is going to shine the light on the acts of corruption, abuse, despoliation, greed, and oppression that continue to darken our world?