the general public needs to be alerted that the threatened loss of our collective memories has at least as commanding a claim to its attention and its tax dollars as the deterioration of historic buildings or the natural environment.
Child is at pains, however, to emphasize that reformatting is not a “universal panacea”:
Microfilming is only one of the treatments at our command for dealing with the plague of paper deterioration, just as radiation therapy is but one of the options to be considered by an oncologist confronted by a malignancy.
Like radiation therapy, microfilming isn’t “an ideal or very pleasant method of treatment,” Child writes, but “in the last analysis, its value as a treatment is indisputable in those cases in which the patient would die.” The weakness of this analogy is that your typical doctor believes that when he prescribes radiation therapy he has a reasonable chance of keeping a patient alive, while your typical late-eighties preservation-reformatter disposed of the patient after a last afternoon on the X-ray table. In fact, it was better if you dismembered the patient first, because you could get higher quality X rays that way for less.
As the metaphorized stridency intensified, Haas, Child, Sparks, Welsh, and the other brittle-bookers began to see results. In 1985, Haas met with William (Book of Virtues) Bennett at the National Endowment for the Humanities, and he “talked passionately,” as he describes it, on behalf of preservation. Bennett acted; a week before he was to leave for his new job in Ronald Reagan’s cabinet as secretary of education, he founded an Office of Preservation at the NEH. That gave the lobbyists something to fix on: entreat Congress to give the Office of Preservation more money. “That was the beginning of real effort,” Haas recalls, “as opposed to Library of Congress effort.”
Now — how much money would the effort require?
CHAPTER 21. 3.3 Million Books, 358 Million Dollars
To answer the extremely important money question, Haas hired (circa November 1984) Robert M. Hayes, distinguished dean of UCLA’s School of Library and Information Science, to write, against a six-week deadline, a report entitled “Analysis of the Magnitude, Costs, and Benefits of the Preservation of Research Library Books.” Hayes was a network consultant for libraries and an early computer-connectivity expert; his digital career went back to the vacuum-tube days, when he used the National Bureau of Standards’ very fast fifties machine, the SWAC, now esteemed by historians of computer evolution. At Hughes Aircraft, at Magnavox, and then as head of a venture called Advanced Information Systems, Hayes helped design data-management systems for the National Security Agency, the Air Force’s Ballistic Missiles Division, the National Science Foundation, and Douglas Aircraft. But in the early sixties, Hayes decided to make library automation his life; he and the CIA’s Joseph Becker, who became his business partner, developed the library-of-the-future exhibition in Seattle, and Verner Clapp gave the two of them a grant to write Information Storage and Retrieval (1963), a rich compendium of hardware and methods.
Like Clapp and Fremont Rider, Hayes was troubled by the problem of growth. He, too, had an answer, and it wasn’t more shelves. “The most far-reaching solution to the problems posed by library growth,” he and Becker wrote in the Handbook of Data Processing for Libraries (1970), “is the creation of cooperative library networks.” Hayes might seem to be an unlikely person to write a manifesto for a gigantic microfilming program, since he had spent a good three decades as a database frontiersman, and yet once Hayes got going, and the macroeconomic numbers started rolling like flatcars through his brain, he demonstrated why he was the ideal choice for the job.
What Hayes did was sift through all the statistical deterioration surveys — Yale’s, Stanford’s, the Library of Congress’s, and others — pulling percentages from dozens of places, cleaving to the ideal of consistency wherever possible and, where it wasn’t, saying so and plunging ahead anyway. And he did some arithmetic.
Assume, he began, that the nation’s libraries hold 305 million books — or volumes, rather, since the figure includes periodicals. If you apply the Library of Congress’s MIT Fold Test results to that figure, and call twenty-five percent of them brittle, you have 76 million volumes currently “at risk.” Assume that in another twenty years, 38 million more will attain “at-risk” status. Assume that of those 114 million volumes, nine out of ten are duplicates. That leaves you with 11.4 million at-risk volumes that would be available over the next twenty years for microfilming. In most cases (as Hayes points out in a supplemental report), the work would require guillotining, “effectively destroying the original as a book.”
Not all 11.4 million would have to be filmed, though. To estimate how many would, Hayes relied on a 1984 “Preservation Plan for Textual (Paper) Records for the National Archives of the United States,” which recommends that the archives repair or otherwise conserve seventeen percent of its holdings, mass-deacidify twenty-eight percent, “dispose” of six percent, leave twenty-four percent alone for now (as part of their planned deterioration), and microfilm the remaining thirty-three percent. (Why these numbers add up to more than a hundred Hayes does not explain.)
Now we’re almost there. If we apply the National Archives’ number, and assume that one third of the national at-risk population is microfilm-worthy, that brings us to 3.8 million volumes. But some of those have already been microfilmed — the Library of Congress and the New York Public Library, to name but two, have not been idle. Hayes invoked a percentage from an American Theological Libraries study, where 13.3 percent of the sample had existing microfilm.
Now, 13.3 percent of 3.8 million is about five hundred thousand, subtract that — okay, that brings us down to our final number: 3.3 million brittle volumes to microfilm in the next twenty years.
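For readers who want to check the chain of assumptions, here is a minimal sketch, in Python, of Hayes’s volume arithmetic exactly as quoted above; the rounding is mine, not his.

```python
# Back-of-the-envelope check of Hayes's volume arithmetic, using only the
# figures quoted in the text (his report's exact rounding may differ).

total_volumes = 305_000_000          # books and bound periodicals in U.S. research libraries
brittle_now = total_volumes * 0.25   # Library of Congress fold-test rate -> about 76 million
brittle_next_20_years = 38_000_000   # additional volumes expected to become brittle

at_risk = brittle_now + brittle_next_20_years   # about 114 million
unique_at_risk = at_risk * 0.10                 # nine of every ten assumed duplicates -> ~11.4 million
film_worthy = unique_at_risk / 3                # National Archives' one-third microfilm share -> ~3.8 million
already_filmed = film_worthy * 0.133            # American Theological Libraries sample: 13.3 percent
to_film = film_worthy - already_filmed

print(f"Volumes left to microfilm: {to_film / 1e6:.1f} million")   # about 3.3 million
```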
And now to figure out how much it’s going to cost. Assume a rate of twenty cents a page, sixty dollars a book, to film the chosen ones, and assume another twenty-two dollars in library overhead to do the choosing and cataloging and processing necessary before and after the work is done. Slap on another $88 million for things like training, research, program management, and communications, and you end up with a Brittle Books Program cost of $358 million.
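The cost figure can be reconstructed the same way. One caveat: the three-hundred-page book length below is my assumption, chosen only because it makes twenty cents a page come out to Hayes’s sixty dollars a book.

```python
# Rough reconstruction of the Brittle Books Program cost from the
# per-book figures in the text.

volumes_to_film = 3_300_000
pages_per_book = 300                       # assumed, so that $0.20/page yields $60/book
filming_per_book = 0.20 * pages_per_book   # $60
overhead_per_book = 22                     # selection, cataloging, processing
program_overhead = 88_000_000              # training, research, management, communications

total_cost = volumes_to_film * (filming_per_book + overhead_per_book) + program_overhead
print(f"Estimated program cost: ${total_cost / 1e6:.1f} million")
# about $358.6 million, which Hayes rounds to $358 million
```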
That’s a lot of money, true, but Dean Hayes is ready in chapter 6 of his report to point out the advantages, as well. At the top of his list is “Saving of Storage Costs.” If each of the 3.3 million books that get microfilmed induces five out of ten of the libraries which hold duplicates to get rid of their own copies of those books, replacing them with “some other means of access to [their] content” (at an undisclosed cost), then all libraries, considered together, will be the net beneficiaries of liberated stack space. One must assign a dollar value to that space. It costs, Hayes estimates, only $1.25 per year to store a book, but if you do some quick things with present values and interest rates, you come up with a present value of the future cost of storage of $12.50 per book. Since five books are being dumped for every one microfilmed, you can multiply that number by five: $62.50 per volume — that is, slightly more than the estimated basic cost of microfilming. To put it another way: Yes, we’ll be spending $358 million on the program, but since as a result of the program, libraries will be relieving themselves of 16.5 million books, we’re going to be saving $206 million in storage costs—3.3 million books times the present value of the storage cost for five books at $62.50 per book. The savings that result from this spectacular book blowout are a “societal benefit” rather than a local benefit, Hayes points out: “The bulk of the savings will be experienced by other libraries that replace their duplicates with the converted form.”
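And the claimed savings, taking Hayes’s present-value multiplier (from $1.25 a year to $12.50 per book) as given, can be restated in the same sketch form:

```python
# Hayes's "Saving of Storage Costs" argument, restated with the figures
# quoted in the text.

volumes_to_film = 3_300_000
present_value_per_book = 12.50      # Hayes's present value of future storage per book
duplicates_discarded_per_film = 5   # five of the ten duplicate-holding libraries discard their copies

books_discarded = volumes_to_film * duplicates_discarded_per_film
savings_per_filmed_book = present_value_per_book * duplicates_discarded_per_film   # $62.50
total_savings = volumes_to_film * savings_per_filmed_book

print(f"Books discarded: {books_discarded / 1e6:.1f} million")          # 16.5 million
print(f"Claimed storage savings: ${total_savings / 1e6:.0f} million")   # about $206 million
```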