Evaluation evolution

As I referenced in my post about World Day for Audiovisual Heritage, MIT’s Lewis Music Library is looking forward not only to preserving the audio content we are digitizing, but also to finally expanding access to it and awareness of it. For this, we need to identify a suitable access platform – no small feat, I have discovered! Large feat, in fact. Size 11s, at least. Since hearing about the myriad considerations that inform a software evaluation during one of my MSLIS courses, I’ve been harboring a desire to be part of one – and I haven’t been disappointed! So I thought I would share some of our evaluation process so far.

About the time I began my residency, a team was established to evaluate Avalon Media System as our potential dissemination platform. We gravitated toward Avalon because there hasn’t been much open source competition in the area of audio/audiovisual access, and the music library had previously looked into using Variations – Avalon’s predecessor – before Variations fell by the wayside. Avalon also has classroom integration functionality, which is important for an academic music library. As we began plotting out the evaluation, however, it became clear that it might be a better idea to establish our requirements first and then measure a few platform options against them. We aren’t sure yet which other options we might evaluate, but we will find out as part of the process.

Our evaluation was initiated by digital curation and preservation on behalf of the Lewis Music Library. It takes a collaborative approach, actively merging organizational and technological perspectives. The team consists of:
• a couple of representatives from IT,
• our metadata archivist,
• a curatorial representative (someone from the music library who can speak to what their users need),
• and a couple of representatives from digital curation and preservation.

Together we have established most of our organizational and technological requirements, and our Digital Curation Analyst Helen (who was previously MIT’s Fellow for Digital Curation and Preservation) has compiled them into a spreadsheet. They include any TRAC requirements that pertain specifically to access and dissemination, as well as the forward-looking requirement that the system probably ought to be extensible to video. From here, we are ranking them from 0 (meh, it might be nice) to 3 (showstopper, must-have). Once the requirements are ranked, we can evaluate Avalon – and probably two other access systems – against them and figure out which one will work best for us.
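To make the ranking concrete, here is a minimal sketch (in Python) of how a weighted comparison like ours might be tallied. The requirement names, weights, and candidate scores below are invented for illustration – they are not our actual spreadsheet data – and the real decision will of course weigh plenty of qualitative factors a script can’t capture.

```python
# Hypothetical sketch (not our actual requirements or scores) of tallying a
# weighted platform comparison. Weights run 0 ("might be nice") to 3 ("must-have").
requirements = {
    "streaming audio playback": 3,
    "classroom integration": 3,
    "section bookmarking": 2,
    "variable playback speed": 1,
    "extensible to video": 2,
}

# How well each candidate meets each requirement (0 = not at all, 1 = fully).
# These scores are made up for the example.
candidates = {
    "Avalon": {
        "streaming audio playback": 1, "classroom integration": 1,
        "section bookmarking": 1, "variable playback speed": 0,
        "extensible to video": 1,
    },
    "Platform B": {
        "streaming audio playback": 1, "classroom integration": 0,
        "section bookmarking": 1, "variable playback speed": 1,
        "extensible to video": 0,
    },
}

max_score = sum(requirements.values())
for name, scores in candidates.items():
    # A candidate that misses any weight-3 ("showstopper") requirement is out.
    if any(w == 3 and scores.get(req, 0) == 0 for req, w in requirements.items()):
        print(f"{name}: disqualified (missing a must-have)")
        continue
    total = sum(w * scores.get(req, 0) for req, w in requirements.items())
    print(f"{name}: {total} / {max_score}")
```

The tally mostly keeps the must-haves honest: a platform that aces every nice-to-have but lacks a showstopper requirement still drops out of the running.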

[Images: Streaming Audio System Evaluation Requirements spreadsheet, pages 1 and 2]

Similar to workflow design, this kind of work gives you real insight into many different facets of the library, archive, and institution itself: the users, rights issues, technical capabilities, content varieties, &c. I love the a-ha moments, like: “Ohhhh, of course it would be useful for the curator to have the ability to bookmark sections of the audio! If an item consists of an entire concert, and it’s better for us to retain the context rather than split up the different pieces performed, we’ll want to be able to demarcate them.” Or: “Hmmm, I never realized how useful the ability to speed up or slow down audio could be!” I’m excited to keep you all abreast of how it pans out – and especially eager to see the music library finally have a good streaming option for their special collections.

If you Avalon, mow it!
Tricia P

Soup’s on!

[Image: Bit Soup]
November is the perfect time to enjoy a hot, steaming bowl of digital content, isn’t it?

Samantha DeWitt
NDSR Resident | Tufts University Tisch Library & Digital Collections and Archives

Since I will be spending my residency here at Tufts considering new ways for users to access university-associated research data, today’s meal comes fresh from the kitchen of research data management. Data sharing among researchers is certainly not a new topic of discussion, but it has gained notable fervor in light of expanding digital technologies over the past decade. Scholarly journals, advocating for improved access to research data, have become increasingly stringent in their data submission policies (Dryad has compiled a good list), and U.S. funding agencies such as the National Institutes of Health (NIH) and the National Science Foundation (NSF) have required data management plans from grant recipients for several years now. Non-government funders have followed suit: like the NIH, the Bill and Melinda Gates Foundation requires a data management plan for grants over $500,000, and the Gordon and Betty Moore Foundation has published an extensive “data sharing philosophy” that encourages all funded researchers to share their data as soon as possible (in a manner consistent with applicable laws).

What effect policy and discourse are having on the research community is still open for debate, however. Last spring, Patrick Andreoli-Versbach and Frank Mueller-Langer published a study showing that out of 82 empirical researchers, only 12 shared research data regularly. Conversely, a study by Genevieve Pham-Kanter, Darren Zinner and Eric Campbell published in PLOS ONE last month suggested that data sharing among life scientists has been boosted by changes in data management policy and by the growth of third-party data repositories and online data supplements.

It is generally accepted that some fields – physics, for example – are better at curating and sharing data than others. Last month, The Signal (the terrific digital preservation and access blog of the Library of Congress) posted an interview with Elizabeth Griffin, an astrophysicist at the Canada-based Dominion Astrophysical Observatory. Griffin describes the astronomical community as generally more advanced in data sharing and management for reasons that include a smaller community size compared to other natural sciences, an “attendant international nature [that]… also requires careful attention to systems that have no borders,” and a history of making sure analog data was curated and accessible.

A multicolored mosaic of topics and issues emanates from the subject of preserving and sharing research data. (I would make a “like a forest of autumn leaves” analogy, but I don’t want to push it.) I have touched upon only a few notes in my first post, but I am looking forward to considering a great many more. I hope you will chime in for discussion; there is hot cider waiting and plenty of room at the table!

Samantha