NDSR Boston 2015 Begins

Hello all!

These are your NDSR Boston 2015 residents taking over. We’re only in week two, but are each feeling excited about the direction of our projects and eager to learn from our participation in the NDSR program over the next nine months. We’ll use this blog to discuss facets of our projects, professional development opportunities, successes, challenges, and lessons learned. Stay tuned for more about what we’ve experienced so far, and about what’s ahead!


NDSR Boston Welcomes New Cohort for September 2015

June 3, 2015 – The National Digital Stewardship Residency Boston (NDSR-Boston) program, a program to develop professionals in digital stewardship through post-graduate residencies, welcomed a new group of residents and hosts for the 2015/16 cohort. Each resident will have the opportunity to work on an exciting and challenging project at one of the Boston area institutions beginning in September 2015.

Andrea Goethals, NDSR Boston Project Director and Harvard Library Mentor welcomed the new group, saying, “We are fortunate to be in the position of having another really great group of residents.  All of us are looking forward to welcoming them when they start in September, and getting to know them as they work with us on projects within our institutions.”


Alexandra Curran – MIT Libraries

Alexandra Curran’s interests include audio/visual archives, digital curation, and the preservation of multimedia collections in libraries and cultural heritage institutions. While earning her B.S. in Digital Cinema from DePaul University, she explored editing, compositing, data management and the long-term management of digital assets. Her M.A. in Library and Information Science and Graduate Certificate in Museum Studies, both from the University of South Florida (USF), prepared her for curatorial duties with interviews in the USF Tampa Library’s Oral History Program, including the significant Speaking Out Against Genocide digital oral history collection. While interning at the National Archives and Records Administration’s Motion Picture Preservation Lab she prepared materials for long-term storage and digital conversion, and also learned the principles and techniques of photochemical and digital restoration. As an intern for the City of Tarpon Springs, Florida, she created a community-based website for the Greek Community Documentation Project. Current volunteer activities include assisting a local historical society with preservation issues for video oral histories in older formats and ingesting content into an institutional repository. A fan of tea, cinema, and spy novels, Alexandra currently works in a tea store. She is looking forward to her residency with MIT Libraries.

Jeffrey Erickson – University of Massachusetts at Boston

Jeffrey Erickson is a 2015 graduate of the School of Library and Information Science at Simmons College, with a focus on Archives and Cultural Heritage Informatics. Jeff was attracted to the Cultural Heritage concentration because of its emphasis on technology and digital preservation. Building on his extensive professional career in IT, Jeff decided to become an archivist so that he could work on issues related to digital materials. Jeff studied digital preservation, preservation management, digital asset management, XML and metadata at Simmons, and applied his newly acquired knowledge to solve issues related to digital materials and technology at the Massachusetts Historical Society, the Forbes House Museum, and the John F. Kennedy Presidential Library and Museum. Jeff has a personal interest in preservation: he has restored an antique building, has collected artifacts from his family’s dairy farm in New Bedford, MA, and is building a family archive to pass along his family’s heritage to his children and to future generations of his family. Jeff enjoys working on DIY projects, reading non-fiction, traveling with his family, playing basketball and working on his golf game.

Alice Sara Prael – John F. Kennedy Presidential Library & Museum

 Alice Sara Prael recently graduated with an MLS from the University of Maryland, College Park, specializing in the curation and management of digital assets. During that time she served as the Digital Programs and Initiatives Graduate Assistant at University of Maryland Libraries.  In this position she discovered a passion for digital preservation and partnered with the Special Collections and University Archives to create a workflow for processing born digital content.  She spent the past summer interning with the National Archives and Records Administration where she oversaw the digitization and description of a series of records from the John F. Kennedy Assassination Records Collection.  Alice currently lives in Maryland where she hikes, crafts and plays roller derby with the DC Rollergirls.  She is thrilled to continue work with digital preservation at the John F. Kennedy Presidential Library through NDSR.

Julie Seifert – Harvard Library

Julie Seifert graduated from the MLS program at the University of North Carolina at Chapel Hill, with a concentration in archives and records management, and a certificate in Digital Curation. While at UNC, she focused on digital preservation and completed a Master’s project on digital forensics. She also received a scholarship to go to the Czech Republic, where she met Czech colleagues and studied digital libraries in the Czech Republic. In her free time, she enjoys paddle boarding, reading, and traveling. She completed an internship in Boston during graduate school and is looking forward to being back! She will carry out her residency at Harvard University.

 

Stefanie Ramsay – State Library of Massachusetts

Stefanie Ramsay is a recent MLIS graduate from the University of Washington. Drawing upon her background in American history, Stefanie’s focus in graduate school has been on the preservation of historical materials for long-term access and use by diverse user groups. She’s worked in digital collections for the University of Washington Special Collections Library, performed archival processing for the Washington State Jewish Archives, and implemented a pilot digitization program for architectural drawings at the Seattle architecture firm NBBJ. She believes digital preservation is a necessity for modern information organizations, and is eager to learn more about best practices as well as how libraries and archives adopt the appropriate workflows to ensure greater access to and preservation of their important materials. Having previously lived in Los Angeles, New York City, and Seattle, Stefanie is looking forward to walking all over Boston, eating lots of lobster, and cheering on the Red Sox at Fenway Park. Stefanie will carry out her residency with the State Library of Massachusetts.

NDSR Boston 2014 Residents Conclude Residency in Boston

June 2015 – The National Digital Stewardship Residency Boston (NDSR-Boston) program, a program to develop professionals in digital stewardship through post-graduate residencies, has concluded its first residency year, marking the occasion with a Capstone Event held at Harvard on May 13, 2015. The residents, who were embedded within five Boston-area host institutions (Harvard, MIT, Northeastern, WGBH and Tufts), each presented their work in a poster created as the final project of the residency.

The event was well attended and included representatives from each host institution as well as local and regional professionals. Among the hosts, it was widely agreed that the experience of mentoring a resident was a rewarding one and that each resident contributed valuable work to their institution.

With an eye towards sustaining and expanding the regional digital stewardship network, the program plans to include the first year cohort of residents in training and social events during the second residency year which starts in September 2015.

QC vs. QA, IMHO

As my project moves along, our first batch of digitized audiocassettes from the Herb Pomeroy collection has just arrived back at MIT from the vendor. Leading up to this exciting arrival, a smaller group of representatives from the project has been meeting to determine exactly what the Quality Control (QC) steps will be, as distinguished from our Quality Assurance (QA) steps.

Semantics can be frustrating sometimes, but nuanced distinctions between words and phrases can carry major implications. For our project – in defining Quality Control and Quality Assurance – the implications affect who is responsible for executing the actions and who needs to be trained on which tools: the Collections Preservation and Reformatting (CPR) department executes the Quality Control actions, while the Content Curators (CC) handle the Quality Assurance methods. Below you can see the isolated high-level “Transform Analog to Digital” workflow and the “Manage Digital” workflow, and where the QC and QA steps fit.

A and D pipeline

The way I’ve been conceptualizing the distinction we made between the two is: Quality Control is more like quantitative data checking, while Quality Assurance is qualitative content assessment. So I wanted to discuss these distinctions, the methods that fall under them, and the tools we are currently employing to execute these processes.

Quality Control (QC)

As mentioned above, the Quality Control processes relevant for this project are really geared towards determining whether the data and files requested from the vendor are present and complete. Questions involved in our QC measures include: Is the expected number of recordings on the hard drive? Are all of the metadata fields we need filled in? Are the digital audio surrogates the length and stereo specification we requested? Here are some of the QC actions and the tools we are enlisting – though they will certainly evolve as the workflow matures:

  • Virus check: Sophos
  • Checksum validation: Karen’s Hasher or MDFiveCheck
  • Was reformatting done correctly? BWF MetaEdit, to check channels and stereo/mono
  • Are all metadata types we requested present? BWF MetaEdit
  • Correct file formats for preservation and access copies? DROID
  • Appropriate number and size of files? N/A – eyeball the directory
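Several of these QC checks lend themselves to scripting. As a rough illustration only – Karen’s Hasher, MDFiveCheck, and BWF MetaEdit are the actual tools in our workflow, and the function names, file names, and expected values below are hypothetical – a Python sketch of the checksum, channel-count, and file-count checks might look like this:

```python
import hashlib
import wave
from pathlib import Path

def md5sum(path, chunk_size=1 << 20):
    """Compute an MD5 checksum, reading in chunks so large audio files fit in memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def validate_delivery(directory, manifest, expected_count, expected_channels=2):
    """Run basic QC checks against a vendor delivery.

    manifest maps file name -> the MD5 the vendor reported.
    Returns a list of human-readable problems (an empty list means all checks passed).
    """
    problems = []
    wav_files = sorted(Path(directory).glob("*.wav"))

    # Appropriate number of files?
    if len(wav_files) != expected_count:
        problems.append(f"expected {expected_count} files, found {len(wav_files)}")

    for path in wav_files:
        # Checksum validation against the vendor's manifest.
        reported = manifest.get(path.name)
        if reported is None:
            problems.append(f"{path.name}: not in vendor manifest")
        elif md5sum(path) != reported.lower():
            problems.append(f"{path.name}: checksum mismatch")

        # Was reformatting done correctly? (channel count, as BWF MetaEdit would show)
        with wave.open(str(path), "rb") as w:
            if w.getnchannels() != expected_channels:
                problems.append(
                    f"{path.name}: {w.getnchannels()} channel(s), expected {expected_channels}"
                )

    return problems
```

The streaming read in md5sum matters here: preservation-quality WAV files can run to gigabytes, so hashing in chunks avoids loading a whole file into memory.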

Quality Assurance (QA)

The Content Curators are then in charge of the QA, listening to the actual audio (content) and determining:

  • Is this what we expected off of that analog audio material?
  • Can we use this after all? Does it sound good?
  • The metadata fields are present, but are they correct?

It’s been a very useful conversation for our group, especially instructive for other members as to the ways digital archival content needs to be treated in order to maintain integrity, reliability, and authenticity. As this has all been part of the group hashing out the workflow, it is another example of how documenting detailed, lower-level pieces of the workflow has clarified responsibilities and shown where specific tools can be leveraged. I’m very close to finishing documentation of the next portion of the workflow (the D sequence above), and I’m excited to show everyone the work soon!

By JHOVE, we’ve done it –
Tricia P

Money Talks: Part I, Government Funders


Samantha DeWitt
NDSR Resident | Tufts University Tisch Library & Digital Collections and Archives

What can be done to encourage data preservation among university researchers? U.S. science and technology research agencies and the offices that oversee them have had strong ideas on the subject and have been making pronouncements for over a decade. Twelve years ago, the National Institutes of Health (NIH) began asking grant-seekers to plan for the management of their research data. The National Science Foundation (NSF) followed in 2011. In 2013, the White House superseded both agencies with a memorandum from the Office of Science and Technology Policy (OSTP) announcing the government’s commitment to “increase access to federally funded published research and digital scientific data…”

Money talks. The NSF invests about $7 billion in American research annually and the NIH allocates about $30 billion to medical research. To put these numbers in perspective for Tufts, that translated into about $8 million from the NSF and just under $62 million from the NIH last year.*

Because the agencies that disburse money in the form of federal R&D funds have been mandating data management plans from their applicants, the research universities that rely on those federal funds have responded. Many, including Tufts, have looked to their libraries to provide support in:

  • Assisting researchers in the creation and implementation of data management plans
  • Helping researchers find the right data repository
  • Dataset metadata creation
  • Encouraging best practices in data management

Universities have taken other steps as well. Some have created new data repositories, or have augmented their existing institutional repositories in order to accommodate and support the long-term preservation of research data.

Have government data access directives had their intended effect? So far the data are sparse. A 2014 Drexel University study did find NIH mandates, along with those implemented by scientific journals, seemed to be “meeting the goal of increasing the sharing of scientific resources among life science investigators.” But the point I want to make here is that, at the very least, these mandates have served to publicize the issue of data management to a degree that has encouraged debate and discussion among researchers and others within the university.

While the government still provides the greatest portion of funding to U.S. research universities, the automatic spending cuts of the 2013 budget sequestration have reduced the flow of money enough to make grant-seekers nervous. Researchers are increasingly appealing to foundations, corporations and philanthropic organizations to fill in the gaps. I hope you’ll join me in April for Part II, as we “follow the money” and look at which non-government funders are advocating for data management as well!

‘Til then,

Sam

* In research project grants

http://projectreporter.nih.gov/reporter.cfm
http://dellweb.bfa.nsf.gov/Top50Inst2/default.asp

That Workflow’s A Monster: Updating Ingest Procedures at WGBH

[Diagram: the current WGBH accessioning workflow]

This was our first effort at charting out the current accessioning workflow for all the different material collected by WGBH, step-by-step – from its arrival (in boxes carried over from production departments, or on hard drives dropped off by production assistants, or copied to a server or sent to a dedicated inbox or – you get the idea), through the various complex data and rights management processes designed to make sure that every tape, shot, transcript, and contract is appropriately accounted for, all the way through to physical storage and the burgeoning digital preservation program the department is now putting into place. Peter Higgins (my partner in workflow documentation crime) and I spent a long time trying to make it as clear and easy to follow as possible, but it can’t be denied that it’s kind of a beast.

Don’t bother trying to read the tiny text in the boxes; for now, just relax and enjoy the wash of incomprehensible geometry. After all, as of the writing of this blog post, thirty days after Peter and I presented this document to the rest of the WGBH team, it’s already outdated. WGBH, as I’ve mentioned before on this blog, is an archive in transition, which is one of the reasons working on this project now was so crucial. The current workflow involves a lot of emailing back and forth, entering information into Excel spreadsheets (unless someone else is using them first), moving folders around on a shared drive, and assigning files color-coded labels. As a newbie myself, I can vouch for the fact that this is a difficult system to explain to newcomers, which is a problem for an archive that hosts several new interns every year. It also tends to result in a lot of this:

[Screenshot]

Everyone at WGBH wants to streamline the current workflow, get rid of unnecessary steps and outdated practices, and figure out better tracking – having an easy way to tell who’s doing what when is key for ensuring that work isn’t duplicated and material doesn’t slip through the cracks. However, two major changes are coming down the pipeline, which may also alter the workflow significantly.

The first is that the Media Archive Research System, aka MARS – the complex FileMaker database which currently stores and links together all of WGBH’s data about its programs, physical media assets, original video content, and licensed media – is being replaced. In the long run, the new database should be friendlier for WGBH production departments, making it easier for them to enter and store metadata for the use of the rights department and the archives, and to retrieve the data they need on the other end. In the short term, however, there are still a lot of question marks about how exactly the new database is going to link up with the current archival workflows around metadata management.

The second factor is the adoption of the HydraDAM system for digital preservation, which I talked about in a previous blog post. HydraDAM isn’t intended to be the primary source for content metadata, but again, in theory, it should be able to automate a lot of the processes that archivists are currently doing manually, such as generating and comparing checksums to ensure safe file transfers. But until HydraDAM is ready to kick into full gear, we won’t know for sure how nicely it’s going to play with the rest of the systems that are already in place in the archives.
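The checksum comparison HydraDAM would automate is simple in principle. Purely as a sketch of the idea (this is not HydraDAM’s code, and the flat directory layout is an assumption), verifying a transfer means hashing every file on both sides and flagging any mismatch:

```python
import hashlib
from pathlib import Path

def sha256sum(path, chunk_size=1 << 20):
    """Hash a file in chunks so large video files don't have to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_transfer(source_dir, dest_dir):
    """Compare each source file's checksum against its copy after transfer.

    Returns the names of files that are missing or altered at the destination;
    an empty list means the transfer was clean.
    """
    failures = []
    for src in Path(source_dir).iterdir():
        if not src.is_file():
            continue
        dst = Path(dest_dir) / src.name
        if not dst.exists() or sha256sum(src) != sha256sum(dst):
            failures.append(src.name)
    return failures
```

The point of doing this mechanically rather than by hand is exactly what the archivists are after: the check runs the same way on file one and file ten thousand.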

We want to make sure these transitions go as smoothly as possible, and as we’re cleaning up the workflow to make it easier for the staff, we also want to make sure we’re not making any changes we’ll regret when the new systems come online. That’s why we let workflows eat our brains for most of December, throwing ourselves into the task of creating the most monstrously in-depth diagram of the current workflow that we possibly could.

Then we used the original beast to build this:

[Diagram: the proposed updated workflow]

OK, yes, it still looks pretty beastly, but in theory it’s the first step on the road to a better, stronger, faster beast. The key focus here is all those gold-colored boxes on the chart. They represent elements of the workflow that we know we’re going to need, but don’t yet know exactly how they’re going to happen in the oncoming Hydra-PYM future – whether that’s because we haven’t finished developing the tool that will do it, or because we need to research new tools that we don’t yet have. We’re using this proposed workflow itself as a tool for discussion and planning, to help target the areas to focus on as we move forward in implementing new systems, and to make sure that we’re not leaving out any crucial functionality that we’re going to need later down the line.

Although there’s still a long timeframe for the WGBH workflow overhaul, the work we’ve done has already had some immediate results – including the adoption of a new project management system to help the department keep better track of our ingest process. We’ve decided to use Trello, a flexible online application that allows us to create ‘cards’ representing each delivery of production material that comes into the archive, and add lists of tasks that need to be accomplished before the accession process is complete.
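Trello also exposes a REST API, which could eventually let the archive create a card automatically whenever a delivery is logged, instead of clicking one into existence. This is only a hedged sketch, not something we have actually wired up: the list ID, API key, and token below are placeholders for values from a real Trello account, and the function names are mine.

```python
from urllib.parse import urlencode
from urllib.request import Request, urlopen

# Trello's "create card" endpoint.
TRELLO_CARDS_URL = "https://api.trello.com/1/cards"

def card_params(delivery_name, list_id, api_key, token, description=""):
    """Assemble the query parameters Trello expects for a new card."""
    return {
        "name": delivery_name,   # card title, e.g. the delivery's date and source
        "idList": list_id,       # the list (column) the card starts in
        "desc": description,     # free-text notes about the delivery
        "key": api_key,          # credentials from your Trello account
        "token": token,
    }

def create_card(delivery_name, list_id, api_key, token, description=""):
    """POST the new card to Trello and return the raw JSON response body."""
    params = card_params(delivery_name, list_id, api_key, token, description)
    data = urlencode(params).encode()
    with urlopen(Request(TRELLO_CARDS_URL, data=data)) as resp:
        return resp.read()
```

Each card would then get its checklist of accessioning tasks added the same way the team does now, by hand or via further API calls.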

[Screenshot: the Trello board]

When we presented our Trello test space to the MLA team at WGBH, we thought we would have to build in some time for testing and for people to get used to the idea of tracking their work in an entirely different system. However, everyone we spoke to was so excited about the change that the decision to jump on board right away was pretty much unanimous.

Inspired by the activity on the main MLA Trello, I’ve also created a Trello board to keep track of my own next phase of NDSR — which will be a topic for my next post, in which I solemnly swear to have minimal geometry.

– Rebecca