What has the government been up to?

Is our historical record being destroyed?

The government is destroying our history.  Matthew Connelly did not say this in his recent presentation as part of IU’s Framing the Global Project.  But those who heard him couldn’t be faulted if they left with that impression.  In his lecture on the history and future of official secrecy, Connelly, a professor of history at Columbia, ranged across sovereignty, diplomacy, government secrecy, the habits of archivists, and the power of computing to foil the best efforts to hide and destroy.

The statistics are daunting.   The government produces about 270 million pages of classified documents each year.  Changes in regulations over the past decade—more documents tagged secret, fewer individuals with the authority to change the tags—have led to about 200 million documents a year being withheld from public scrutiny.  The result is that by 2010, the government had accumulated 9 billion pages of classified material.   

One immediate impression is that the government is trying to cover its tracks and hide its mistakes, and that may be true.  Since we cannot see the documents, we can’t know for sure.  But other motives are at work as well.  Some matters we are better off not knowing: how to create a deadly flu virus, for instance, or how to build a nuclear weapon.  Connelly has also concluded, from reviewing the metadata of classified documents, that much has been kept classified to create noise and to make it difficult for enemies to ascertain the motives for classification.  And the Supreme Court has made it more difficult for private citizens to win a declassification review.

Other, more venial motives are at work as well.  The pile of paper and electronic material has not escaped the notice of government agencies.  The State Department, among others, has set goals for processing the declassification backlog.  The targets for declassification are ambitious, but the agencies have not secured sufficient funding to meet them.  They spend one twentieth as much on declassification as they spend on classifying documents.  “Archivists are completely overwhelmed,” Connelly explained.  To get the job done, they have had to resort to sampling methods—methods not random and often misguided, Connelly concluded.  To manage the workload, agencies are destroying whole classes of records.  Historians of immigration have thus lost much potential history as immigration and visa application records are thrown away.  Historians interested in sports diplomacy may never know all they need to about the national role in the Olympics because documents tagged to sports may be discarded.  “Only 3-5% of the government’s documents are retained,” Connelly said. “I don’t think the agencies know what has been lost.”

“People who have been trying to push back have been completely outgunned,” Connelly said.  Still, 1.4 billion pages have been declassified.  With so much material, no historian could ever work through it all, so isn’t the destruction a moot point?  Outdated thinking, Connelly would reply.

First, the habit of working in isolation, fostered by academe’s methods of graduate education, will not be the only way historians work in the future.  Just as scientists assemble big, worldwide teams to solve their biggest problems, historians now have tools that make it possible to work in teams.  And those tools work best the more material they have to operate on.  The application of computers to natural language processing, latent semantic analysis, and machine learning has changed the face of linguistic and literary research.  Connelly sees it doing the same to history.  Feed all of these documents into a computer and, as the data set grows ever more massive, the computer will find patterns: clues that reveal not only what is in the available historical record, but also what might be in the destroyed or redacted documents.
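As a rough illustration of the kind of analysis Connelly describes, here is a minimal sketch, not his actual pipeline, that runs latent semantic analysis over a tiny, invented set of declassified texts using the scikit-learn library.  Applied to millions of real pages, the same projection surfaces thematic patterns no single reader working alone could find.

```python
# A minimal sketch (assumptions: scikit-learn is installed; the corpus below
# is invented, not a real archive) of latent semantic analysis over
# declassified document texts.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

documents = [
    "Cable from the embassy on visa applications and immigration quotas.",
    "Memorandum on the Olympic delegation, travel funding, and sports diplomacy.",
    "Briefing paper on nonproliferation talks and export control policy.",
    "Second cable on immigration backlogs and consular staffing.",
    # ...in practice, millions of pages from the declassified record.
]

# Weight each term by how distinctive it is across the collection.
tfidf = TfidfVectorizer(stop_words="english")
term_matrix = tfidf.fit_transform(documents)

# Project the documents into a small number of latent themes (the LSA step).
# With a real corpus the number of components would be in the hundreds.
lsa = TruncatedSVD(n_components=2, random_state=0)
doc_topics = lsa.fit_transform(term_matrix)

# Documents that land near one another in this reduced space discuss similar
# themes even when they share few exact words, the kind of pattern that only
# gets sharper as the collection grows.
for text, coords in zip(documents, doc_topics):
    print(coords.round(2), text[:50])
```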

“These records are now valuable in a way they were not in the past.  In data mining research, we don’t always know what kind of data is going to be useful.  We need to ask different questions about what is worth preserving.”  By aligning databases representing the work of large numbers of historians, scholars can find unredacted versions of redacted text.  With enough of these samples, computers can begin to predict what lies under the large chunks of text blacked out in documents that are released.  That will make it easier to know whether the redacted material contradicts the apparent direction of the text.  The same processes can analyze the classification data that accompanies most government documents to identify linguistic styles, and so make it possible to identify authors.
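To make the redaction-recovery idea concrete, here is a minimal sketch, using invented strings rather than real documents, of how a redacted release might be aligned against a later, fuller release of the same text to learn what a black box hid.  Collected at scale, pairs like these become training data for models that predict the content of redactions that are never lifted.

```python
# A minimal sketch (the two releases below are invented) of aligning a
# redacted release of a document against a later, fuller release to recover
# what the redaction hid.
import difflib
import re

redacted_release = (
    "The ambassador reported that [REDACTED] and requested further guidance."
)
later_full_release = (
    "The ambassador reported that the foreign minister had rejected the "
    "proposal and requested further guidance."
)

# Collapse the redaction marker into a single token before aligning.
tokens_redacted = re.sub(r"\[REDACTED\]", "<REDACTION>", redacted_release).split()
tokens_full = later_full_release.split()

recovered = []
matcher = difflib.SequenceMatcher(None, tokens_redacted, tokens_full)
for tag, i1, i2, j1, j2 in matcher.get_opcodes():
    # Wherever the redacted version swapped text for the marker, the aligned
    # span in the full release is the previously hidden content.
    if tag == "replace" and "<REDACTION>" in tokens_redacted[i1:i2]:
        recovered.append(" ".join(tokens_full[j1:j2]))

print(recovered)  # ['the foreign minister had rejected the proposal']
```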

“How do you have democracy if you don’t know what your government did 40 years ago?” Connelly asked.  With these new tools, and with a new attitude and new procedures for handling the massive documentary materials the government has accumulated, historians “can begin to restore the integrity of the historical record.”

Click here for more information and sponsors of the Framing the Global Project.
