The Declassification Engine: What History Reveals About America's Top Secrets

Matthew Connelly, 1967-

Book - 2023

"A captivating study of US state secrecy that examines how officials use it to hoard power and prevent meaningful public oversight. The United States was founded on the promise of a transparent government, but time and again we have abandoned the ideals of our open republic. In recent history, we have permitted ourselves to engage in costly wars, opened ourselves to preventable attacks, and ceded unaccountable power to officials both elected and unelected. Secrecy may now be an integral policy to preserving the American way of life, but its true costs have gone unacknowledged for too long. Using the latest techniques in data science, historian Matthew Connelly analyzes the millions of state documents both accessible to the public and still under review to unearth not only what the government does not want us to know, but what it says about the very authority we bequeath to our leaders. By culling this research and carefully studying a series of pivotal moments in recent history from Pearl Harbor to drone strikes, Connelly sheds light on the drivers of state secrecy--especially consolidating power or hiding incompetence--and how the classification of documents has become untenable. What results is an astonishing study of power: of the greed that develops out of its possession, of the negligence that it protects, and of what we lose as citizens when it remains unchecked. A crucial examination of the self-defeating nature of secrecy and the dire state of our nation's archives, The Declassification Engine is a powerful reminder of the importance of preserving the past so that we may secure our future"--

Location: 2nd Floor
Call Number: 323.445/Connelly
Status: Checked In (1 / 1 copies available)
Published
New York : Pantheon Books [2023]
Language
English
Main Author
Matthew Connelly, 1967- (author)
Edition
First edition
Physical Description
xvii, 540 pages : illustrations ; 25 cm
Bibliography
Includes bibliographical references (pages 407-502) and index.
ISBN
9781101871577
  • Preface: Should this book be legal?
  • The radical transparency of the American republic: a re-introduction
  • Pearl Harbor: the original secret
  • The bomb: born secret
  • Code making and code breaking: the secret of secrets
  • The military-industrial complex: the dirty secret of civil-military relations
  • Surveillance: other people's secrets
  • Weird science: secrets that are stranger than fiction
  • Following the money: trade secrets
  • Spin: the flipside of secrecy
  • There is no there there: the best kept secret
  • Deleting the archive: the ultimate secret
  • Conclusion: The end of history as we know it.
Review by Choice Review

This is a disturbing but important account of the danger the current US classification system poses to American democracy and to history. Begun during WW II, this system caused a backlog of millions of documents relating to foreign policy and other matters, kept at a cost of millions of dollars. It will take decades for that material to be declassified if it is not destroyed in the interim. Historians of foreign policy, including this reviewer, will be unable to write a full account of any presidency after the 1980s. After analyzing the history of classification, Connelly (international and global history, Columbia Univ.) asserts that much of the material is kept secret to protect policymakers. The resulting "dark state" denies citizens the ability to hold their government accountable (p. 6). Connelly and his colleagues at Columbia's History Lab urge that data science, their "declassification engine," be used to rationally ascertain what should be kept secret and what can be revealed (p. xvi). The government declined to adopt his system, but Connelly continues to argue for its efficacy. The larger point he makes, however, is that if unchanged, the system has the potential to end the writing of history as we know it. Summing Up: Highly recommended. All readers. --Lorraine M. Lees, emerita, Old Dominion University

Copyright American Library Association, used with permission.
Review by Booklist Review

Connelly teamed up with data scientists at Columbia University, where he is a professor of history, to analyze the U.S. government's system of document classification and storage in order to determine how it might be streamlined to release needlessly classified information while protecting genuinely sensitive material. What they discovered was unnerving: a highly fallible, exorbitantly expensive (over $18 billion annually, by Connelly's estimate), virtually uncontrollable system that ultimately renders its administrators unaccountable to the American taxpayers funding it. The problems revealed range from lack of civilian oversight of the U.S. military (especially the nuclear arsenal) to needlessly massive U.S. arms expenditures based on Soviet disinformation and competition among America's military branches, to administrators drawn largely from a self-interested business community, and, now, to surveillance that can reach deep into the lives of everyday Americans. "We need to start asking ourselves a different question," Connelly writes. "What do we, the people, need to know to do our job as citizens to keep our government accountable?" One hopes this book will generate serious discussion of the issue.

From Booklist, Copyright (c) American Library Association. Used with permission.
Review by Publisher's Weekly Review

Columbia University historian Connelly (Fatal Misconception) forcefully critiques the "exponential growth in government secrecy" since WWII. Drawing on his work at the History Lab, which uses advanced data mining techniques to "sift and sort through" millions of declassified documents for insights into "what the government did not want us to know, and why they did not want us to know it," Connelly argues that the "relentless" and "massive" accumulation of secret information has "served the interests of people who wanted to avoid democratic accountability." Examining declassified documents and metadata related to nuclear weapons, cryptography, UFO sightings, battle plans, the 1954 Guatemala coup (long believed to have been coordinated by the Eisenhower administration at the behest of the United Fruit Company), and more, Connelly contends that the rise of state secrets has undermined government efficiency, buttressed the military-industrial complex, and fostered conspiracy thinking. He also contends that the more information is classified, the harder it is to track and protect, making it vulnerable to exploitation, and highlights arbitrary and ineffective policies, including the classification of material after it's already entered the public domain. Though the data analysis and history lessons can be dense, Connelly enlivens the narrative with sharp character sketches and acerbic wit. It's an impassioned and well-informed wake-up call. (Feb.)

(c) Copyright PWxyz, LLC. All rights reserved
Review by Library Journal Review

Connelly (global history, Columbia Univ.; Fatal Misconception) advocates for the necessity of a declassification engine to tame the U.S. government's vast amount of secret, classified information. Knowing that this is a tall order, the author meticulously makes his case, while also outlining the history of classified information and deftly illustrating the deep symbiosis between capitalism and national security strategy. Connelly states this emerged during World War II, with Pearl Harbor, the Manhattan Project, and the inception of the Cold War acting as catalysts. Reasons for keeping parts of the public record classified include protecting sensitive information about valuable allies and hiding governmental incompetence. The mutual enmity between some civilian leaders--including presidents--and the military brass directly led to the Vietnam quagmire. The global war on terror, with its nebulous focus on national security, gave the government broad, unprecedented powers to surveil citizens. The information age has added to the glut of data captured and parsed by agencies such as the National Security Agency. Connelly also considers the banality of secrecy, making clear that much of the government's classified information is mundane and unproductive. VERDICT Perfect for readers intrigued by the intersection of politics and history.--Barrie Olmstead

(c) Copyright Library Journals LLC, a wholly owned subsidiary of Media Source, Inc. No redistribution permitted.
Review by Kirkus Book Review

The U.S. government is hopelessly awash in secret information, and this gripping history describes how we got that way and lays out the dismal consequences. Connelly, a professor of international history at Columbia, writes that more than 28 million cubic feet of secret files rest in archives across the country, with far more in digital server farms and black sites. Nonetheless, government secrets are not secure. "Washington has been shattered by security breaches and inundated with leaks," writes the author. Global hackers often access classified files, and dissenters (Julian Assange, Edward Snowden, Chelsea Manning, et al.) regularly extract material. Readers may be surprised when Connelly points out that the first 150 years of American history were essentially secret-free. Even diplomats often avoided encoding their communication. A new era began in 1931 with the groundbreaking for a national archive, and Franklin Roosevelt appointed the first archivist three years later. At this point, the "dark state" began its epic growth, which Connelly recounts in 10 unsettling chapters and the traditional yet still dispiriting how-to-fix-it conclusion. The author delivers a wild, page-turning ride packed with intelligence mistakes, embarrassing decisions, expensive failed weapons programs, and bizarre research that has ranged from the silly to the murderous. A large percentage of classified information, including the famous WikiLeaks revelations, isn't secret but available in old newspapers. Everyone agrees that democracy requires transparent government. Congress has passed many laws restricting unnecessary classification and requiring declassification after a long period, but they are often dead letters. Officials occasionally required to review records for "automatic" declassification almost always keep them secret. Plus, the bloated archives are so underfunded that staff members have insufficient technical capacity to recover historical records. Destroying them en masse is cheaper, and this is being done. Interestingly, Connelly points out that historians are more likely to study World War II and the early Cold War because 1970s and later material is largely locked away. Yet more evidence, brilliantly delivered, of the extent of the U.S. government's dysfunction.

Copyright (c) Kirkus Reviews, used with permission.

PREFACE: Should This Book Be Legal?

There I was, sitting at a massive conference table inside a multibillion-dollar foundation, staring at the wood-paneled walls. I was facing a battery of high-powered attorneys, including the former general counsel to the National Security Agency, and another who had been chief of the Major Crimes Unit at the U.S. Attorney's Office in the Southern District of New York. The foundation was paying each of them about a thousand dollars an hour to determine whether I could be prosecuted under the Espionage Act. I am a history professor, and my only offense had been to apply for a research grant. I proposed to team up with data scientists at Columbia University to investigate the exponential growth in government secrecy. Earlier that year, in 2013, officials reported that they had classified information more than ninety-five million times over the preceding twelve months, or three times every second. Every time one of these officials decided that some transcript, or e-mail, or PowerPoint presentation was "confidential," "secret," or "top secret," it became subject to elaborate protocols to ensure safe handling. No one without a security clearance would see these records until, decades from now, other government officials decided disclosure no longer endangered national security. The cost of keeping all these secrets was growing year by year, covering everything from retinal scanners to barbed-wire fencing to personnel training programs, and already totaled well over eleven billion dollars. But so, too, were the number and size of data breaches and leaks. At the same time, archivists were overwhelmed by the challenge of managing just the first generation of classified electronic records, dating to the 1970s.
Charged with identifying and preserving the subset of public records with enduring historical significance but with no increase in staff or any new technology, they were recommending the deletion of hundreds of thousands of State Department cables, memoranda, and reports, sight unseen. The costs in terms of democratic accountability were incalculable and included the loss of public confidence in political institutions, the proliferation of conspiracy theories, and the increasing difficulty historians would have in reconstructing what our leaders do under the cloak of secrecy. We wanted to assemble a database of declassified documents and use algorithms to reveal patterns and anomalies in the way bureaucrats decide what information must be kept secret and what information can be released. To what extent were these decisions balanced and rule-based, as official spokesmen have long claimed? Were they consistent with federal laws and executive orders requiring the preservation of public records, and prompt disclosure when possible? Were the exceptions so numerous as to prove the existence of unwritten rules that really served the interests of a "deep state"? Or was the whole system so dysfunctional as to be random and inexplicable, as other critics insist? We were trying to determine whether we could reverse engineer these processes, and develop technology that could help identify truly sensitive information. If we assembled millions of documents in databases, and harnessed the power of high-performance computing clusters, it might be possible to train algorithms to look for sensitive records requiring the closest scrutiny and accelerate the release of everything else. The promise was to make the crucial but dysfunctional declassification process more equitable and far more efficient. 
We had begun to call it a "declassification engine," and if someone did not start building and testing prototypes, the exponential increase in government secrets--more and more of them consisting of data rather than paper documents--might make it impossible for public officials to meet their own legal responsibilities to maximize transparency. Even if we failed to get the government to adopt this kind of technology, testing these tools and techniques would reveal gaps and distortions in the public record, whether from official secrecy or archival destruction. The lawyers in front of me started to discuss the worst-case scenarios, and the officers of the foundation grew visibly uncomfortable. What if my team was able to reveal the identity of covert operatives? What if we uncovered information that would help someone build a nuclear weapon? If the foundation gave us the money, their lawyers warned that the foundation staff might be prosecuted for aiding and abetting a criminal conspiracy. Why, the most senior program officer asked, should they help us build "a tool that is purpose-built to break the law"? The only one who did not seem nervous was the former ACLU lawyer whom Columbia had hired to represent us. He had argued cases before the Supreme Court. He had defended people who published schematics of nuclear weapons--and won. He had shown how any successful prosecution required proving that someone had possession of actual classified information. How could the government go after scholars doing research on declassified documents? The ex-government lawyers pointed out that we were not just academics making educated guesses about state secrets--not when we were using high-performance computers and sophisticated algorithms. True, no journalist, no historian, can absorb hundreds of thousands of documents, analyze all of the words in them, instantly recall every one, and rank each according to one or multiple criteria. 
But scientists and engineers can turn millions of documents into billions of data points and use machine learning--or teaching a computer to teach itself--to detect patterns and make predictions. We agree with these predictions every time we watch a movie Netflix recommends, or buy a book that Amazon suggests. If we threw enough data at the problem of parsing redacted documents--the ones in which government officials have covered up the parts they do not want us to see--couldn't these techniques "recommend" the words most likely to be hiding behind the black boxes, which presumably were hidden for good reason?

Excerpted from The Declassification Engine: What History Reveals about America's Top Secrets by Matthew Connelly. All rights reserved by the original copyright owners. Excerpts are provided for display purposes only and may not be reproduced, reprinted or distributed without the written permission of the publisher.