NDSA Member Profile: Matt Schultz of Grand Valley State University Libraries

NDSA Member Profiles is a new collaborative series from the Interest Groups of the NDSA. The series is inspired by past NDSA traditions such as Insights Interviews, and aims to build on and expand these types of interviews with featured NDSA members to allow for better shared communication and collaboration around the work of digital stewardship and preservation. Topics range from member questions and insights for the NDSA community to sharing failures, discoveries, and anything else in between.

If you or your institution is interested in being featured, please contact Lauren Work (lw2cd@virginia.edu) or Sibyl Schaefer (sschaefer@ucsd.edu). The interview below was conducted by Lauren Work and lightly edited for length and content.

Matt Schultz is the Metadata & Digital Curation Librarian for Grand Valley State University (GVSU) Libraries, where he helps advance preservation and discovery strategies for the university’s unique and distinctive digital materials. GVSU Libraries joined the NDSA in Summer 2017 and Matt began engaging with the NDSA community, primarily through the Content Interest Group.


Describe your position, and how you spend most of your working time.

As the Metadata & Digital Curation Librarian, I work regularly with a team of functional specialist faculty and professional support staff to ensure and improve discovery and use of the Libraries’ cataloged collections. I engage in technical curation for all of our uniquely created/acquired digital materials. I also work closely with our teaching faculty to put good data management solutions in place for their sponsored research projects. In some ways I see these as three separate hats to wear, but there is also a lot of overlap in terms of focus and concern for the digital materials. That being said, it is a genuine challenge to balance and deepen the technical expertise needed to do each of these areas justice. Thankfully, GVSU Libraries is an incredibly collaborative, flexible and supportive environment. So I get a lot of support.

Do you have an ongoing or finished digital stewardship project that you are particularly proud of that you would like to share?

Our Libraries has so many projects that we are engaged in regularly and at any given time, many of which involve external partners. I’m really proud of the collaborations. But I would say one of the projects I am most proud of involved rescuing a faculty member’s data from a couple of aging legacy Macintosh computers. We didn’t have a green light to bust into the encasings to extract the hard drives so we had to get very creative to scare up the peripherals we needed and install the associated software drivers.

The drivers were the trickiest because we had to get them on to the computers using ZIP disks that were appropriately formatted for the Macs and their flavors of OS. It was a lot of stop and go, a lot of trial and error. Eventually we were able to connect old ZIP drives to the Macs and copy the data off. In the end it was not the professional digital forensics project that I would have liked to have seen happen, using write-blockers and imaging the drives with something like Bitcurator, but it was a bit of a sleuth job nonetheless. What keeps me in this job are the puzzles and challenges and the thrill you get when you move old digital information forward in time.

What are your current challenges working in digital preservation?

I wrote our Libraries’ first digital preservation policy. As with many good digital preservation policies you can find online, ours sets aside some space to inventory challenges: it includes a full list of those we currently experience and those we anticipate encountering as time goes by.

To just call out a few, I would say that we would like to have 1) a better layer of support for versioning changes to our digital materials over time; 2) more curatorial control over off-site replications of our data; and 3) sustainable strategies and tools for auditing the integrity of our locally stored digital materials. These are really infrastructure challenges. Our Libraries’ current 2016-2021 strategic plan includes boosting our local capacity and infrastructure, which I am excited about. We’ve already made some great strides towards launching our own locally-hosted access platforms running on Omeka and Digital Commons. It may take some time, but I would eventually love to see our library and campus supporting a more scalable all-purpose repository. We’re excited by things like Hydra and Hyku, and our short- to mid-term goal is to deepen our technical professional development so that we can manage open source implementations like those (or others) on some savvier levels.

What have you found most beneficial from the NDSA community, and where do you think the NDSA has room to improve?

This is not an easy question to answer only because I am still on-ramping to some extent. But my engagement with the Content Interest Group was immediate and really welcoming. The facilitators and leaders for that group have not put up any barriers for someone to just dive right in and start contributing.

So, within the Interest Group I’ve proposed a case study to document strategies for acquiring cloud-hosted data from donors and organizations, and I am looking forward to getting started on that activity. I was also affiliated with the previous iteration of NDSA as hosted by the Library of Congress, and I think the only thing I would say is that the production of new community resources sometimes felt a little over-driven by a small number of institutions and was often a little imbalanced in terms of spreading the work out across any given interest group’s members. But I wouldn’t assume that the current interest groups are struggling with the same issues. Maybe just something to keep an eye on.

What recent digital stewardship discovery have you made? (This could be a tool, article, website, etc. that has helped you in your work.)

I recently learned about the efforts to form a persistent identifier registry geared towards disambiguating common researcher affiliation and citation use cases with respect to organizations/institutions. This is growing out of recent collaborations between Crossref, DataCite and ORCID. A working group has been established and has already articulated framing principles, and a request for information (RFI) has been issued.

Though much more geared to scholarly publication and citation management, building an ecosystem that enlists the architecture of the web and other infrastructures, with a focus on ensuring ongoing discovery and tracking provenance, is critical for preserving online information. I’d love to see more community-driven effort put into DOI and persistent identifier registries for a range of different published digital collection use cases and repository object management. I am intrigued by the business and service model for this newly proposed project.

Do you have an example of a digital preservation or stewardship failure you would like to share?

I have a couple of areas I could highlight. The first is that, having recently gone through a major migration from platforms like CONTENTdm and Preservica to new platforms like Omeka and even Digital Commons, we could have used some tidier data models for our individual digital collections as we worked through that process. We had the models formulated; there was simply not enough time to apply them as we moved from collection transfer to collection transfer. So it meant we were tidying on the fly and are now needing to go back and do some further cleanup and implementation. But migrations are always messy, so I don’t beat myself or our team up too heavily.

The other shortcoming we are experiencing is in the area of awareness-raising about the nature and importance of the work we do within the Libraries with respect to digital preservation. Our fellow liaison librarians, for example, might not fully understand what goes on within our Technology & Information Services (TIS) unit with respect to this work—simply because they spend the majority of their time on their own outreach work. It can be a lot to keep up with. But we do need to do a better job of pushing the importance of our digital collections management and preservation to the center of our services and finding ways to better engage that side of our house. We now have an Archivist for Public Outreach & Engagement that embeds with our liaison librarians and through my data management role I am also performing outreach and training to our liaisons. So I think we are moving in some positive directions.

What topics or issues do you wish the digital preservation community offered more expert guidance or robust documentation for?

This might come across as a little anathema within the community, but I honestly think we need to start fostering more conversations around the values and assumptions that are driving our priorities and our use/development of technologies for preserving digital information. I don’t think we step outside of the community and look at the bigger picture frequently enough and critically ask ourselves what sort of impacts we are making, or what the implications of some of our technology approaches are for this work on more societal and organizational levels.

What does the next year of digital stewardship hold for you and your institution? What are you working on next?

At our new Dean’s direction we are working through a really interesting set of facilitation frameworks that pull on methodologies such as appreciative inquiry and divergent thinking. This is in order to tighten up the collaborations and streamline operations across our different divisions when it comes to our digital stewardship work. We are about three months in and will probably be at it through the end of the year. It is already yielding some really good signs for reaching a new “ideal state,” and all of the stakeholders (including our Special Collections, University Archives, Scholarly Communications, and Technology & Information Services) are getting excited about the new changes on the horizon. I think it is going to help us step up even further into areas of research data management, digital scholarship, and digital humanities.

Call for Nominees: NDSA Coordinating Committee

Members of the National Digital Stewardship Alliance join together to form a consortium of over 220 partnering organizations, including universities, professional associations, businesses, government agencies, and nonprofit organizations, all committed to the long-term preservation of digital information. Committed to preserving access to our national digital heritage, we each offer our diverse skills, perspectives, experiences, cultures and orientations to achieve what we could not do alone.

NDSA’s Coordinating Committee (CC) provides strategic leadership to the community in coordination with working group co-chairs. Working on the CC is an opportunity to contribute your leadership for the community as a whole while collaborating with a wonderful group of dynamic and motivated professionals. NDSA is a diverse community, working on a critical mission and we seek candidates to join the CC that bring their diverse skills, perspectives, experiences, cultures and orientations to bear on leadership initiatives.

The CC is dedicated to ensuring a strategic direction for NDSA, advancing NDSA activities to achieve strategic goals, and furthering communication. One example of collaborative work within the community to further communication is the production of the National Agenda for Digital Stewardship. The CC is responsible for reviewing and approving NDSA membership applications and publications; updating eligibility standards for membership in the alliance, and other bylaws; engaging with stakeholders in the community; and working to enroll new members committed to our core mission. The CC commitment is for three years. NDSA has an annual membership meeting coordinated with the DLF Forum each fall. The CC meets at the annual meeting and via a monthly conference call.

If you are interested in joining the CC yourself, or want to nominate another member, please send the name, e-mail address, and NDSA-affiliated institution of the nominee to ndsa@diglib.org by November 30.  We particularly encourage and welcome nominations of people from underrepresented groups and sectors.

Announcing the 2017 NDSA Award Winners

We are delighted to announce the recipients of the National Digital Stewardship Alliance’s (NDSA) annual Innovation Awards!

Individual Awards: Rebecca Guenther & Karen Cariani
Organization Award: The Digital Preservation Network
Project Award: The ePADD Project
Educator Awards: George Coulbourne & Dorothea Salo
Future Steward Award: Elizabeth England

These awards highlight and commend creative individuals, projects, organizations, educators, and future stewards demonstrating originality and excellence in their contributions to the field of digital preservation.

The awardees will be recognized publicly at NDSA’s Digital Preservation 2017 during the Opening Plenary on Wednesday, October 25. Please join us in congratulating them for their hard work! Each of the winners will be interviewed later this year, so stay tuned to learn more about their work on our blog.


Individual Awards

Rebecca Guenther spent most of her career at the Library of Congress in the Network Development and MARC Standards Office developing national and international metadata standards, including MARC 21, MODS, and PREMIS among others. She was co-chair of the original PREMIS Working Group, which developed the PREMIS Data Dictionary for Preservation Metadata, and, after its release, chair of the PREMIS Editorial Committee from 2005-2015. She is currently a consultant on metadata issues and teaches at NYU’s Moving Image Archiving and Preservation Program. She remains active on various metadata committees. Rebecca is recognized for nurturing conversations among international library stakeholders, conversations that led to the development of MODS, MADS and the enhancement of MARC, helping to build a solid foundation for discovery metadata. All of us in the preservation community have benefited from her leadership in the development of PREMIS metadata for digital preservation.

Karen Cariani is the Senior Director of the WGBH Media Library and Archives (MLA) and Project Director for the American Archive of Public Broadcasting (AAPB). Karen has more than 20 years of television production and project management experience, including as project director for WGBH’s Teachers’ Domain (now PBS Learning Media), WGBH Open Vault, and the Boston Local TV News Digital Library project, as well as for the development of a digital media preservation system utilizing the Hydra technology in partnership with Indiana University. She served two terms (2001-2005) on the Board of Directors of the Association of Moving Image Archivists (AMIA), was co-chair of the AMIA Local Television Task Force, was Project Director of the guidebook “Local Television: A Guide To Saving Our Heritage,” funded by the NHPRC, and was co-chair of the AMIA Copyright and AMIA Open Source Committees.

Karen was co-chair of the Infrastructure Working Group of the National Digital Stewardship Alliance during its time at the Library of Congress, and served as president of Digital Commonwealth. Recent projects include serving as WGBH Project Director for the American Archive of Public Broadcasting in partnership with the Library of Congress.

Karen is recognized for leadership that was crucial in forging a collaboration with the Library of Congress to steward the American Archive of Public Broadcasting, an initiative that has digitized nearly 50,000 hours of historic programming from more than 100 stations.  She has been an advocate for collaboration in the development and adoption of shared open-source solutions for digital stewardship, a leader in the NDSA and Hydra/Samvera community, as well as in the project to enhance the Avalon project with support for PBCore, and in developing open-source solutions for speech-to-text and audio waveform analysis.


Organization Award

The Digital Preservation Network (DPN) is a membership organization focused on developing solutions to meet the challenges of long-term preservation of academic and cultural heritage digital assets. The large-scale digital preservation services developed by DPN are built to last beyond the life spans of individuals, technological systems, and organizations. DPN provides members of the academy and their successors with assurance that future access to their scholarly resources will be available in the event of disruptive change in administrative or physical institutional environments. By working together, DPN members are collaboratively solving problems that would be difficult to solve separately. Beyond DPN’s core mission of ensuring the secure preservation of stored content by leveraging a heterogeneous network that spans diverse geographic, technical, and institutional environments, the organization is recognized for its creation, with partner AVPreserve, of its Digital Preservation Workflow Curriculum. The curriculum/workshop series provides a bridge between the desire to participate in digital preservation projects and the capacity to connect local content and resources to that aim, providing a flexible framework for guiding an organization through the necessary decision-making processes for establishing a sustainable digital preservation workflow. Mary Molinaro, Chief Operating Officer & Service Manager, will accept this award on behalf of DPN.


Project Award

The ePADD project is an undertaking to develop free and open-source computational analysis software that facilitates screening, browsing, and access for historically and culturally significant email collections. The software incorporates techniques from computer science and computational linguistics, including natural language processing, named entity recognition, and other statistical machine-learning associated processes. ePADD Phase 1 was developed from 2013-2015 by staff of the Department of Special Collections and University Archives, Stanford University Libraries (SUL), Stanford University. The software was developed with grant funding provided through the National Historical Publications and Records Commission (NHPRC). Additional funding was provided through SUL’s Payson J. Treat Fund for Library Program Development and Research.

ePADD Phase 2 is being developed from 2015-2018 by staff of the Department of Special Collections and University Archives, Stanford University Libraries (SUL), Stanford University, in collaboration with partners at Harvard University, the Metropolitan New York Library Council (METRO), University of Illinois at Urbana-Champaign, and University of California, Irvine. Funding for ePADD Phase 2 is provided through an Institute of Museum and Library Services (IMLS) National Leadership Grant (NLG) for Libraries, which supports projects that address challenges faced by the library and archive fields and that have the potential to advance practice in those fields.

The ePADD project is recognized for developing an effective, useful, accessible tool that has significantly lowered technical and other resource barriers to appraising, acquiring, processing, and making accessible large email collections for individuals and for museums, archives, and libraries, both large and small. It has also served as an effective demonstration of the concrete possibilities of working with born-digital textual collections. The project team has also been noted for fostering open collaboration, community-building, and support. Glynn Edwards, Assistant Director, Department of Special Collections at SUL, and Project Director, will accept this award on behalf of the project.


Educator Awards

George Coulbourne served as Chief of Internship and Fellowship Programs at the Library of Congress in Washington, D.C., prior to his retirement on August 31, 2017. Among his responsibilities at the Library, George founded and led the international Digital Preservation Outreach and Education (DPOE) Program. In collaboration with the IMLS, he designed and launched the pilot program for the National Digital Stewardship Residency (NDSR). He also led and expanded the Hispanic Association of Colleges and Universities National Internship Program (HNIP) for the Library, which significantly increased diversity enterprise-wide.

During his tenure, George established a number of collaborative fellowships, residencies and internships in collaboration with universities nationwide to promote leadership and skills development for future digital stewards. In addition, he spearheaded the design and implementation of the first public-facing internship, residency and fellowship portal for the Library’s 80+ experiential programs. George currently serves on the Advisory Committees for NDSR Art (Philadelphia Museum of Art), the NEH-NEDCC Digital Assessment Training Project, and The Washington Consortium of Universities, and currently works as an independent consultant.

George is recognized for having established, while at the LOC, the Digital Preservation Outreach and Education (DPOE) program (now a network of 217 digital preservation topical trainers across 33 states in the U.S., the District of Columbia, Australia, and New Zealand) to advance the practice of digital preservation through professional development opportunities such as the three-day DPOE Train-the-Trainer Workshop. The community recognizes his accomplishment in pioneering the National Digital Stewardship Residency (NDSR) program in collaboration with IMLS to cultivate nationwide talent in digital stewardship through a year-long residency that pairs emerging information professionals with cultural heritage institutions facing an array of digital preservation challenges.

Dorothea Salo is a Faculty Associate in the Information School at the University of Wisconsin at Madison. She teaches courses on metadata, linked data, coding and society, open movements, and digital libraries. She has written and presented internationally on privacy, scholarly publishing, copyright, institutional repositories, linked data, and data curation. She holds an MA in Library and Information Studies and another in Spanish from UW-Madison. In addition to her teaching at the University of Wisconsin-Madison’s iSchool, Dorothea is recognized for partnering with Wisconsin libraries — the Cedarburg Public Library, the Wisconsin School for the Deaf, Mineral Point Library and Archives, and the WYOU community television station in Madison — to provide resources and assist in the digitization of at-risk materials. Her development projects, RADD (Recovering Analog and Digital Data), PROUD (Portable Recovery of Unique Data), and PRAVDA (Portably Reformat Audio and Video to Digital from Analog), have extended the reach of digitization and preservation tools to those without the resources of large-scale memory institutions.


Future Steward Award

Elizabeth England is the Digital Processing Archivist at the Johns Hopkins University Sheridan Libraries, where her responsibilities include acquiring, processing, and preserving born-digital materials. She previously was a National Digital Stewardship Resident, and her residency project focused on the preservation of Johns Hopkins’ born-digital visual history. Elizabeth holds an MLIS with a concentration in archives from the University of Pittsburgh.

Elizabeth is recognized for her work as a National Digital Stewardship Residency (NDSR) resident at JHU’s Sheridan Libraries, which has significantly streamlined and advanced the JHU archives’ born-digital processing workflow and has motivated the Libraries to pursue grant funding to develop technology that will allow the community to appraise digital visual content in a more sophisticated way, using computer vision techniques such as perceptual hashing. She also organized and chaired a recent panel discussion at the Society of American Archivists’ annual meeting on appraisal of digital content. And, in response to concerns about the loss of crucial climate datasets from government websites, she was one of the organizers of “Data Rescue DC,” garnering international attention.
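As a loose illustration of the perceptual-hashing idea mentioned above (this is a generic "average hash" sketch, not the JHU project's implementation, and a real pipeline would use an image library such as Pillow with ImageHash), the technique reduces an image to a short bit string so that visually similar images hash to nearby values. Here a small grayscale matrix stands in for a downscaled image:

```python
def average_hash(pixels):
    """Hash an 8x8 grayscale matrix: one bit per pixel, set if above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(a, b):
    """Count differing bits; a small distance suggests visually similar images."""
    return sum(x != y for x, y in zip(a, b))

# A synthetic gradient "image", a near-duplicate, and a very different image.
img = [[10 * (r + c) for c in range(8)] for r in range(8)]
tweaked = [row[:] for row in img]
tweaked[0][0] += 3                      # tiny change, e.g. recompression noise
inverted = [[140 - p for p in row] for row in img]

assert hamming(average_hash(img), average_hash(tweaked)) <= 2   # near-duplicate
assert hamming(average_hash(img), average_hash(inverted)) > 20  # clearly different
```

Unlike a cryptographic hash, where one flipped pixel changes the digest completely, this makes "the same photo, slightly recompressed" detectable during appraisal.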


The annual Innovation Awards were established by the NDSA to recognize and encourage innovation in the field of digital stewardship. The program is administered by a committee drawn from members of the NDSA Innovation Working Group. Learn more about the 2012, 2013, 2014, 2015, and 2016 Award recipients.

Announcing Publication of the NDSA Digital Preservation Staffing Survey Report

National Digital Stewardship Alliance

Earlier this year NDSA requested your participation in a survey about how organizations worldwide were addressing digital preservation staffing and related issues. The NDSA Staffing Survey Working Group is happy to announce the publication of the 2017 Digital Preservation Staffing Survey Report. The report summarizes the results of the 2017 survey, including information on participants’ organization type and size, how organizations view their digital preservation organization and staffing situation, and ideas about staffing qualifications and training needs. The report also compares the survey results to the results of the 2012 staffing survey.

The report and survey data are available on the NDSA 2017 Staffing Survey’s OSF page.

General Statistics: 133 organizations completed this survey, 78% from the United States (with 13 other countries represented). Although many respondents represented academic libraries and archives (46%), we received responses from thirteen categories of institutions, including governmental entities, museums, historical societies, public libraries, and for-profit corporations. Some 58% are managing between 1 and 50 TB of content. Review the report for all of the details and for the comparison with the 2012 survey.

The Working Group would like to thank the 133 organizations who took the time to complete the survey and provide the community with information around these issues.   

Questions or comments can be sent to the Working Group at NDSAStaffingSurvey2016@googlegroups.com.  

The NDSA Staffing Survey Working Group

Interested in activities like this, or in joining with other organizations committed to the long-term preservation of digital information?  Get involved with NDSA!

NDSA Survey on Fixity Practices

Does your organization manage and preserve digital content? If so, the National Digital Stewardship Alliance (NDSA) is interested in hearing from you!

Building on the NDSA publication Checking Your Digital Content, the NDSA Fixity Working Group is conducting a survey of institutions with digital preservation responsibilities to gain insight into how organizations worldwide use various fixity methods to ensure the stability of their digital content and to learn how real-world capacity and best practices differ.
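For readers new to the topic, fixity checking typically means recording a cryptographic digest for each stored object and periodically recomputing it to detect silent corruption. A minimal sketch of that idea follows; the path-to-digest manifest shape is hypothetical (real practice often uses standardized formats such as BagIt manifests), and the demo file is a throwaway:

```python
import hashlib
import os
import tempfile

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 so large objects never load into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_manifest(manifest):
    """Recompute digests against stored ones; return the paths that changed."""
    return [path for path, stored in manifest.items()
            if sha256_of(path) != stored]

# Demo: record fixity for a throwaway file, then simulate a silent change.
with tempfile.TemporaryDirectory() as d:
    p = os.path.join(d, "object.bin")
    with open(p, "wb") as f:
        f.write(b"archival payload")
    manifest = {p: sha256_of(p)}
    assert verify_manifest(manifest) == []   # fixity intact
    with open(p, "ab") as f:
        f.write(b"!")                        # one appended byte
    assert verify_manifest(manifest) == [p]  # drift detected
```

The survey's questions about "real-world capacity" reflect the practical cost hiding in this loop: rehashing every stored object takes I/O time that grows with collection size, so institutions differ in how often they can afford to run it.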

The survey is available at https://msu.co1.qualtrics.com/jfe/form/SV_8IzGON3GTMlPlIx until September 15, 2017.

All of the survey questions can be viewed in advance as a PDF on Google Drive:

https://drive.google.com/file/d/0B3EOrsvu2mMsR2U0b1VIZGViMFk/view?usp=sharing

If you have questions or concerns about this survey, please contact the NDSA Fixity Working Group at NDSA-FIXITY@lists.clir.org.

Thank you for helping NDSA and our community define and advance digital preservation!

The NDSA Fixity Survey Working Group

NDSA Innovation Award Nominations Now Open!

Nominations are now being accepted for the 2017 Innovation Awards for the National Digital Stewardship Alliance (NDSA)! The NDSA established the Innovation Awards in 2012 to recognize and encourage innovation in the field of digital stewardship.

These awards focus on recognizing excellence in the following areas:

  •  Individuals making a significant, innovative contribution to the digital preservation community.
  •  Projects whose goals or outcomes represent an inventive, meaningful addition to the understanding or processes required for successful, sustainable digital stewardship.
  •  Organizations taking an innovative approach to providing support and guidance to the digital preservation community.
  •  Future stewards, especially students, but including educators, trainers, or curricular programs taking a creative approach to advancing knowledge of digital preservation issues and practices.

As a diverse membership group with a shared commitment to digital preservation, the NDSA understands the importance of innovation and risk-taking in developing and supporting a broad range of successful digital preservation activities. Acknowledging that innovative digital stewardship can take many forms, eligibility for these awards has been left purposely broad. Nominations are open to anyone or anything that falls into the above categories and any entity can be nominated for one of the four awards. Nominees should be US-based people and projects or collaborative international projects that contain a US-based partner. This is your chance to help us highlight and reward novel, risk-taking, and inventive approaches to the challenges of digital preservation.

You can submit a nomination via this quick, easy online submission form:

https://www.surveymonkey.com/r/NDSAinnovation

Nominations will be accepted until August 31, 2017. The prizes will be presented to the winners at the NDSA annual conference, Digital Preservation 2017: “Preservation is Political,” in Pittsburgh, Pennsylvania, October 25-26, 2017. Winners will be asked to deliver a very brief talk about their activities as part of the awards ceremony.

Help us recognize and reward innovation in digital stewardship and submit a nomination!

We encourage all NDSA members to submit nominations. We will be hitting electronic mailing lists, but also please promote the awards throughout your community.

For more information on awards from previous years, please see: http://ndsa.org/awards/

Your Participation is Requested in the NDSA Digital Preservation Staffing Survey

Curious about how other institutions staff their digital repositories? Or how they hope to align staff responsibilities as they grow? Members of the National Digital Stewardship Alliance have formed the Staffing Survey Working Group to survey repositories internationally for insights into these questions. Our survey will explore how organizations worldwide are addressing digital preservation staffing, scoping, and structural questions. The results of this survey will build on the knowledge gained through a similar 2012 survey, the results of which were published in the 2013 NDSA Report: Staffing for Effective Digital Preservation. A new report will analyze the new data both on its own and in comparison with the 2013 report.

This survey is available for any organization that manages and preserves digital content and is available at https://duke.qualtrics.com/SE/?SID=SV_3dWhOx9jWCIZsNv until April 10, 2017. Please coordinate one response per institution. You will find a link to a Microsoft Word version of the questionnaire on the first screen if you would like to preview the questions.

If you have any questions please contact the Staffing Survey Working Group at: ndsastaffingsurvey2016@googlegroups.com  

Thank you for helping NDSA and our community define and advance digital preservation!

The NDSA Staffing Survey Working Group


Interested in activities like this, or in joining with other organizations committed to the long-term preservation of digital information?  Get involved with NDSA yourself at: http://ndsa.org/get-involved/

Digital Preservation, Ethical Care, and the Tribal Stewardship Cohort Program: An NDSA interview with Kimberly Christen

This interview comes to us from Jefferson Bailey & Maria Praetzellis, Internet Archive & NDSA Innovation Working Group. 

Kim Christen

We are very excited to talk with Kimberly Christen, Associate Professor and Director, Digital Technology and Culture Program, Director of Digital Projects, Native American Programs, and Co-Director, Center for Digital Scholarship and Curation at Washington State University. Kim speaks on behalf of the Tribal Stewardship Cohort Program, which was awarded an Innovation Award from the National Digital Stewardship Alliance (NDSA) in 2016 and is recognized for its work in providing long-term educational opportunities in digital heritage management and preservation, as well as its dedication to culturally responsive and ethically minded practices. You can find all of our interviews with the NDSA Innovation Award winners here.

Tell us how the Tribal Stewardship Cohort Program (TSCP) came about. What prior work or experiences informed the conceptualization and development of the program?

The TSCP was a direct result of two other projects I lead at WSU’s Center for Digital Scholarship and Curation: the Sustainable Heritage Network (SHN) and Mukurtu CMS. It was from these projects that we saw the need for a longer-term, tribally specific set of educational resources and training initiatives. While Mukurtu CMS provides a culturally responsive and ethically minded platform for providing access to digital cultural heritage, and the SHN provides both online and face-to-face instruction in the lifecycle of digital stewardship, what was missing was a program that could meet the needs of tribal communities to provide training to their staff that fell between a standard MLIS program and a short-term workshop. The SHN taught us that hands-on workshops were crucial not just for training but, importantly, for creating networks between tribes. Because so many of the issues tribal librarians, archivists, and museum professionals face are unique to the history of collecting Native materials and to tribal laws and policies, the participants in our SHN and Mukurtu workshops made invaluable connections with each other and learned from their shared challenges, opportunities, and experiences.

For readers that may not know, tell us the ways in which tribal stewardship is unique and how this program organizes or frames its work to address these specific issues?

Maureen Wacondo and Arlan Sando from the Pueblo of Jemez in New Mexico

The Tribal Stewardship Cohort Program fills a crucial need articulated both by tribal museum specialists, archivists, and librarians and by the funding institutions they seek to collaborate with to support their vital digitization and preservation work. One of the greatest needs of the TALM (tribal archives, libraries, and museums) community is continuing education and training for current staff. It is not uncommon for tribal archivists and librarians to lack formal training in their fields and/or to be asked to fulfill several roles at their institutions. While post-secondary education for Tribal members through Master’s degree programs has increased, the literature shows that distance to programs, family obligations, cultural needs, and financial difficulties remain obstacles for local tribal members. There is a crucial need to train existing staff through short courses and hands-on, tuition-free educational opportunities. The ATALM report Sustaining Indigenous Cultures found that, “According to the survey data, the best ways to train current staff are through local, state, and regional programs that are topic-specific and use hands-on or how-to teaching methods.” Next to hands-on training, the most effective methods for local TALMs are “Brief distance learning programs like webinars or short web-based modular courses.” By combining these hands-on and web-based educational opportunities with a core emphasis on tribal content and cultural digitization needs, the Tribal Stewardship Cohort Program aims to train local TALM staff to work in their communities, with their collections, to meet their specific community-defined needs.

What lessons have you learned thus far in terms of working with tribal communities and culturally sensitive materials? Any insights that might apply to working within other communities or with other types of special collections?

The biggest lesson is to recognize that the lens or filter we may bring to a set of materials is always partial and situated. Even when we think that a collection is not culturally sensitive, it may be that we don’t have the right point of view. For example, we had a set of lantern slides from a Native boarding school in our Special Collections. Many of these slides were images of buildings with no people in them. At first glance, our collections manager thought there would be no issue with putting them online: without individuals in the images, privacy seemed not to be a concern, and because there were no cultural materials, artifacts, etc., the assumption was that these were “harmless.” However, our protocol and workflow for all materials digitized and made available online through the Plateau Peoples’ Web Portal means that all materials are vetted by our tribal representatives on the project. In this case, many of those slides brought back painful memories of places that were filled with physical and cultural violence. Only after more than a year of consultation with tribal members across several Native nations in the region did we digitize the materials and make them available, following community curation, so that tribal members’ stories could be told. In another instance, a community using Mukurtu CMS used the internal protocol function to limit circulation of a similar set of disturbing materials to community members only while they worked through the community process of vetting, remembering, and re-narrating their histories through place.

The lesson, as I see it, is twofold. The first part is recognizing the limits of our own understanding of how events, places, people, and material culture can cause harm. The second is putting in place mechanisms to ensure that we reduce or limit the harm we do as archivists, librarians, or museum specialists when we make collections available. To do this, we need to provide guidelines at the institutional level that define collaborations, consultation practices, and digitization policies that meaningfully engage with source communities, whoever they may be. For many professionals this rubs up against perceived notions of professional standards to make information open and available. I have written about this on several occasions; it is enough to say here that these professional standards are built from very particular histories and points of view, and they eschew the knowledge circulation practices and ethics of others. I suggest that Special Collections, in particular, have guidelines for vetting and reviewing collections that include community consultation. Yes, this is sometimes hard. Yes, this sometimes takes a while. Yes, you may well make a misstep. However, taking this time and doing the outreach work will move institutions closer to honoring the views of all of our constituents, not just the ones whose names are listed on the donor forms.

Working with tribal collections is an honor; it ushers in responsibilities to communities who have often been marginalized in and by collecting institutions. First, we need to recognize the long history of colonial collecting and the perversions of Western law, specifically (but not only) copyright law, that resulted in the collections we have at our institutions. Next, growing from this recognition, we need to be willing to work government to government with the sovereign nations whose materials we hold. A practical result of this could be a Memorandum of Understanding (MOU) with the tribal nations in your region or whose collections you maintain (we have example templates on the SHN site). An MOU starts a conversation and defines responsibilities and obligations to act in mutually beneficial ways. Third, we need to act to implement ethical procedures in all aspects of our workflows, from accessioning collections and creating metadata to digitizing and providing access. We need to recognize that open is only one form of access, and it often repeats the original colonial violence of dispossession. Metadata is deeply political: who is in the author field, and who is relegated to “notes”? These are not trivial questions; I would suggest they are at the heart of reframing the colonial legacy of archives. Look at all stages of your institutional workflow and approach them with the questions of ethical care: does this erase other voices, does this assert ownership based on colonial legacies, does this relegate whole communities to “other”? Fourth, at an institutional level, create a space, following from those MOUs, for long-term partnerships that include training, sharing ideas and collections, providing access in varied ways, and making room for multiple voices in your collections records and reading rooms.
Standard practices of white gloves and silence or hushed tones deny the relationships with the materials on our shelves, and they oftentimes unwittingly reinscribe the power relations and erasures that built many of our collections. Recognize that handling tribal belongings (not “objects”), singing, smudging, crying, laughing, and joking may be part of a collections survey or documentation practice. Preservation may not mean the same thing to all parties; some material culture was never meant to be preserved. Don’t let preservation be the new paternalism.

Not many CMS tools have a founding story as interesting as that of Mukurtu. How do you think this early development shaped the type of tool it is today? What features does Mukurtu provide that are missing in other systems?

The early development of Mukurtu with the Warumungu Aboriginal community in Central Australia shaped all of Mukurtu, including where we are today as an open source platform. Mukurtu started as, and remains, a grassroots project. We have taken that first set of development work as our guide and instituted what we call a “community software development” model. Combining the best of open source agile software development with the best of community participant engagement models, all the features and functions we add to the core Mukurtu CMS codebase are a direct result of community needs.

Core features that Mukurtu CMS has that aren’t in other content management systems include:

  • cultural protocol-based access at the item and collection level
  • sharing protocols at the media item level
  • expanded metadata that includes traditional knowledge and cultural narratives
  • multiple records for items
  • roundtrip sharing import/export that allows sharing content and metadata with exclusions included (i.e., you may want to share all your metadata except the location)
  • embedded Traditional Knowledge Labels in addition to standard copyright and Creative Commons licensing options
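To make the first of those features concrete, here is a minimal sketch of item-level, protocol-based access in Python. This is purely illustrative: the `Item`, `User`, and `can_view` names are hypothetical and do not reflect Mukurtu’s actual code or data model, which is built on Drupal.

```python
# Illustrative sketch only -- not Mukurtu's actual implementation.
from dataclasses import dataclass, field


@dataclass
class Item:
    title: str
    protocols: set[str]  # cultural protocols governing this item


@dataclass
class User:
    name: str
    memberships: set[str] = field(default_factory=set)


def can_view(user: User, item: Item) -> bool:
    """An item is visible only to users who belong to at least one of its
    cultural protocols; items with no protocols are treated as open."""
    if not item.protocols:
        return True
    return bool(user.memberships & item.protocols)


song = Item("Winter Song", {"community-only"})
photo = Item("Public Photo", set())
member = User("community member", {"community-only"})
visitor = User("site visitor")
```

In this toy model, `can_view(member, song)` is `True` while `can_view(visitor, song)` is `False`; the open `photo` is visible to both. The point is that access is decided per item by community-defined protocols rather than by a single site-wide permission level.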

All of our workshops have a component for feedback, and much of our daily work with communities using Mukurtu results in updates, new feature requests, and “bug reports.” Our workflow for this has grown over the years as Mukurtu has developed, and as we now look to creating our Mukurtu Hubs—regional centers for Mukurtu support and development—we will be expanding this even more with added communication channels for input. While GitHub works well in some instances, we have to be mindful of the limitations of seemingly “low-barrier” technological platforms and tools. If we depended only on GitHub for feedback from the Mukurtu community of users, we would miss a tremendous amount of invaluable input. In addition, our focus on community needs and use means that we create new features and functions that will work for a range of users without imposing new obstacles or barriers. This is sometimes a tightrope walk, as we are dealing with a robust platform like Mukurtu CMS, but for us the mission and philosophy of Mukurtu come first: create a safe keeping place for communities (however defined) to provide ethical, stable, and secure access to their digital cultural heritage materials. So we don’t always fall on the side of academic, scholarly, or professional standards or needs. For example, one of our newest features, the Mukurtu Dictionary, resulted from two years of community feedback about the needs of language teaching, learning, preservation, and revitalization. The dictionary has many functions that align with those needs (multiple audio files for one word, multiple translations for one word, linkages between words, multiple dialects in one site, etc.). However, this is not a tool that a linguist might use for documenting a language. The feature grew out of direct community needs; we then had to address the range of stakeholders we serve (Indigenous communities globally) to create this first iteration (now in beta) and ensure that the functionality is as flexible as it can be while meeting the baseline needs of our core community.

The Tribal Stewardship Cohort Program is built upon a cohort-based educational model. What have you learned thus far in terms of new methods for archival education and partnership building? What role has ethics and collaboration played in this model?

The cohort model has been, hands down, the most significant learning experience for all of us involved: staff, instructors, and participants. Our exit interviews from our first cohort group (2015-16) showed that the relationships formed with fellow cohort members, the time and space to share ideas and experiences, and the chance to learn together in a non-competitive environment were invaluable to the group. Now halfway through our second cohort, we see this even more. The biggest lesson is that providing time and space for co-learning, and balancing policy with hands-on learning, provides a foundation for success. We have also seen how collaboration, not just between our participants but also with non-Native repositories and other institutions, like universities, plays a crucial role in archival education, specifically when dealing with cultural heritage materials. We have a module in the program on “digital return”—getting digital materials back from repositories—and we start with how to craft an MOU with an institution that holds a Native community’s materials, to lay the groundwork for a mutual relationship between Native and non-Native institutions. Over the last 15 years there has been a steady shift in the way non-Native institutions have thought about their collections, ownership, and the ethics of maintaining collections that have dubious histories. Although we discuss ethical issues throughout the whole program, this section brings home the necessity of creating ways for non-Native institutions to be involved in the ethical return, curation, and circulation of Native heritage. For me, it highlights what those of us in non-Native institutions need to continue working towards: relationship building, creating ethical workflows, providing pathways for meaningful and long-term collaborations, and ridding ourselves of the vestiges of colonial collecting practices.

What are new areas of development, research, or broader points of discussion within the TALM community that the larger professional community should be aware of?

Technological protocols like those that Mukurtu offers work hand in hand with other forms of non-technical, analog, relational protocols for handling, sharing, managing, and preserving materials (whatever their format). In the TSCP we work at all layers of digital stewardship, from creating collections and digitization policies, to defining workflows and project plans, to hands-on digitization skills. The key insight, I think, is that technological protocols have to be informed by social and cultural protocols and then filtered up and down at every stage. Tribal and Indigenous institutions around the world have policies for collections handling, information sharing, and documentation that can and should inform the broader conversations in the library and archives field.

The creation of standards that work from and maintain the situated and very particular nature of information is very exciting. Reimagining our own library and archival standards will be, I believe, the only way to truly decolonize our work. I think immediately of the Brian Deer Classification System, which provides a new way of thinking not just about a specific area of classification, but about how we can work towards these regional and local systems and structures. The management and organization of knowledge using Indigenous systems of knowledge sharing and social networks has been underway for some time now, and it is only starting to get the wider recognition it deserves. For a good start, see the recent special issue of Cataloging & Classification Quarterly on Indigenous Knowledge Organization.

Tribal Stewardship Cohort Program 2016-2017 Cohort Class. Row 3: Lupe Pecos, Lotus Norton-Wisla, Jeannette Garcia, Jason Russell, Sarah Dybdahl, Trevor Bond, Michael Wynne, Alex Merrill, Steve Bingo; Row 2: Brooke Bauer, Amelia Wilson, Kim Christen, Ashley Sexton, Josiah Black Eagle Pinkham, Arlan Sando; Row 1: Matthew Lewis, Marilyn Decker, Maureen Wacondo

2016 NDSA Web Archiving Survey Report is now available

The National Digital Stewardship Alliance is pleased to announce the release of the 2016 Web Archiving Survey Report (PDF).

From January 20 to February 16, 2016, a team representing multiple NDSA member institutions and interest groups conducted a survey of organizations in the United States actively involved in, or planning to start, programs to archive content from the Web. This effort built upon a similar survey undertaken by NDSA in late 2011 and published online in June 2012, and a second survey undertaken in late 2013 and published online in September 2014.

The reports and survey instruments are available on the NDSA Web site at http://ndsa.org/publications/.

The survey data is also available for review and reuse.

The goal of these surveys is to better understand the landscape of Web archiving activities in the United States by investigating the organizations involved, the history and scope of their Web archiving programs, the types of Web content being preserved, the tools and services being used, access and discovery services being provided, and overall policies related to Web archiving programs. While this survey documents the current state of US Web archiving initiatives, comparison with the results of the 2011 and 2013 surveys enables an analysis of emerging trends. This report therefore describes the current state of the field, tracks the evolution of the field over the last few years, and points to future opportunities and developments.

A few major takeaways from the report include:

  • More programs are moving from pilot to production (79% of respondents classified their status as production; only 5% as pilot)
  • There are increased perceptions of progress over the past two years (77% of respondents reported that their program had made either significant or some progress over the past two years)
  • The top 3 areas where organizations have made the most progress are: data capture, appraisal and selection, and vision and objectives
  • The top 3 areas where organizations have made the least progress are: access/use/reuse, metadata/description, and quality assurance and analysis
  • The number of respondents who are transferring their Web archive data from an external service (such as Archive-It) remains low at ~20%
  • Staffing levels for web archiving remain low: 76% (64 of 84) of organizations are devoting less than the equivalent of one full-time employee’s (FTE) time to web archiving
  • The top three staff skills essential to development and success of programs: facility with archiving tools, skills for appraisal and selection, skills for performing quality assurance

And finally, the report reveals a growing trend, seen across all three surveys, of significant interest in collaboration on a number of fronts. Survey respondents desire collaboration on topics ranging from quality assurance techniques to best practices for policy and management of web archives, yet many institutions feel they have neither the time nor the resources to lead or participate in collaborative activities. With this finding, the 2016 report authors encourage the community and stakeholders to invest in research and development efforts to create sustainable frameworks that can facilitate practical, meaningful, and effective collaboration.

Collaboration, Openness, and Preservation: An NDSA Interview with Dave Rice

We are very excited to talk with Dave Rice. He was awarded an Innovation Award from the National Digital Stewardship Alliance (NDSA) in 2016 for his creative work in bringing together people and organizations from different communities to produce useful standards and practices. Follow Dave’s work at dericed.com, and you can find all of our interviews with the NDSA Innovation Award winners here.

We learned more about Dave and his work in the following interview:

You were selected for your work in advocating for a new working group and for the FFV1 and Matroska standards with the Internet Engineering Task Force (IETF). The IETF is sometimes ascribed nearly magical abilities for its success in working on the standards that hold the internet infrastructure together. It is not common for members of the digital preservation community to work directly with the IETF and its related groups. How did you go down that path, and did you find anything surprising about it?

Firstly, thank you for selecting me for the Innovation Award. I feel honored to be associated with the NDSA community in this way.

I should explain a bit of background to the work on FFV1 and Matroska. The Library of Congress’s Sustainability of Digital Formats site defines several sustainability factors for digital formats and lists “disclosure” as the first consideration, which includes the “degree to which complete specifications and tools for validating technical integrity exist and are accessible.” In 2014 the PREFORMA project began to develop conformance checkers for a select list of open formats and included FFV1 and Matroska as its audiovisual selections. I worked within the PREFORMA audiovisual team on a conformance checker for audiovisual files, called MediaConch, and as part of the planning we saw that the state of the specification work for FFV1 and Matroska would hinder development of a conformance checker for those formats. Thus, at that point, it made sense to coordinate users and developers of FFV1 and Matroska to approach an open standards organization about further development of the specifications, building on the existing progress made by the communities working on these formats.

The IETF (Internet Engineering Task Force) appeared to be the most suitable standards organization to work with. Developers from both Matroska and FFmpeg (where FFV1 was initially developed) were already familiar with the IETF from their work on other open media formats such as Ogg, VP8, Opus, and Daala. Furthermore, the values of the IETF align particularly well with Free Software principles, the LOC’s sustainability factors, and other digital preservation guidelines, in that the IETF’s work and procedures are open, transparent, and participatory. In addition to the resulting specification documents being open, credible, and clear, the entire process is open to view, including listserv discussions, chatroom transcripts, and recordings of meetings. Thus not only are the specifications of the IETF open, but we can also study the process, discussions, and debate that formed them.

For us, the path involved a lot of collaboration and learning along the way. We discussed the idea within the FFmpeg and Matroska communities to determine willingness, interest, and method to proceed with standardization. Initially we collaborated with the IETF Dispatch working group to determine the best way to proceed. We also met with IETF members at conferences, such as FOSDEM and VDD, to seek advice and refine our methodology. With this assistance, we drafted a charter for a working group and Tessa Fallon presented a proposal at an IETF conference. The charter was debated and adjusted, put to a vote, and ultimately approved. At this point, the CELLAR working group became active.

Can you explain how the IETF working group functions? Can you explain what you think will happen in this area in the next few years?

This is a link to the data tracker of the IETF’s CELLAR working group. There you will find links to our mailing list, historical information about the group, its charter, and its active documents. Although active documents are listed on the working group’s website, the efforts to revise the specifications happen in GitHub, with related conversation on the listserv. Presently there is a GitHub repository for Matroska, one for EBML (a binary XML format on which Matroska is based), and one for FFV1.
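To give a taste of what the EBML specification work involves, every EBML element begins with variable-length integers: the number of leading zero bits in the first byte (plus one) gives the integer’s total length in bytes, and for element sizes the marker bit is stripped to obtain the value. A simplified parser sketch (not the working group’s reference code) looks like this:

```python
def parse_vint(data: bytes, offset: int = 0) -> tuple[int, int]:
    """Parse one EBML variable-length integer, as used for element sizes.

    The count of leading zero bits in the first byte, plus one, gives the
    total length in bytes (up to 8); the marker bit itself is stripped
    from the value. Returns (value, bytes_consumed).
    """
    first = data[offset]
    length = 1
    mask = 0x80
    while length <= 8 and not (first & mask):
        length += 1
        mask >>= 1
    if length > 8:
        raise ValueError("invalid VINT: no marker bit in first byte")
    value = first & (mask - 1)  # keep only the bits after the marker bit
    for i in range(1, length):
        value = (value << 8) | data[offset + i]
    return value, length
```

For example, the single byte `0x81` decodes to the value 1, and the two-byte sequence `0x40 0x02` decodes to the value 2. Pinning down details like these, which earlier informal specifications left ambiguous, is exactly the kind of work the CELLAR group does.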

The working group charter includes a timeline; though the working group is currently behind schedule, we are making active progress. The objective of the working group is to achieve the goals outlined in the charter, namely to submit specifications for FFV1, Matroska, and FLAC to the IESG (Internet Engineering Steering Group) for approval. Anyone is welcome to join the working group’s listserv and participate.

What do you think the digital preservation community can learn from what you did with the IETF?

Although it is not common for members of the digital preservation community to work directly with the IETF, it is in the interest of the digital preservation field to foster more involvement in related standardization work. Rather than waiting for the creation of standards to adopt in preservation, it is in the interest of the preservation community to represent and advocate within standardization efforts to ensure that adequate attention is given to sustainability qualities required in long term format preservation. Often this is the case with the development of metadata standards (such as PBCore and PREMIS), but there is opportunity for more involvement from the community in the standardization of the formats that we will eventually steward. I consider the work produced by the Library of Congress on AS-07 and by the DPF Manager on TIFF as good examples of the digital preservation community’s active involvement in standardization efforts for file formats.

The world of libraries, archives and museums has many standards groups and standards of their own. Do you have any thoughts about how standards are most effectively developed within and across communities? How do you think innovation relates to standards?

Engaging with numerous stakeholder communities is critical to the sustainability of standards. As an example, last year I helped organize the No Time to Wait symposium in Berlin, which focused on the standardization efforts for FFV1 and Matroska. During the symposium, Reto Kromer and Kieran O’Leary presented on using those formats in film scanning and provided proposals and research on storing color data from the film scanning process in those formats. Additionally, Michael Bradshaw from Google presented on YouTube’s ongoing efforts to support the vast technical variety of incoming media in order to document and render color data effectively. It was very revealing to see that those managing the newest audiovisual media (YouTube uploads) and those managing the oldest audiovisual media (film formats) shared a common interest in standardizing management of comprehensive colorspace data within these formats and could collaborate on proposals.

Additionally, since the IETF working groups operate in open online spaces where those interested may join to watch or participate, the environment is welcoming to collaboration between communities with shared interests. Frequently in the working group I’ve seen contributions of expertise in areas where I wasn’t aware such expertise existed. Some standards organizations are closed or require subscription membership, limiting participation to a targeted community. These closed systems might stymie the potential for more diverse and innovative contributions.

In the preservation community, the existence or presumption of a ‘standard’ may sometimes discourage innovation in that potential participants view the work as already complete or are concerned that additional development might compete with an established standard in a way that compromises its adoption. For example, the recommended practices for the storage of analog media moved from one format to another as technological advancement offered new opportunities; however, in the migration from analog to digital formats there is sometimes less acceptance in the adoption and integration of these new formats. Best practices based upon technology should be considered to have expiration dates and be approached more skeptically as they age. A best practice should not be considered as the edge to our innovation, but a reference point from which improvements can be made. Working within the context of a standards organization ensures that the work to improve a standard develops in a controlled environment thus protecting standards from tumultuous changes while simultaneously maintaining an environment of transparency and consensus.

On your website you say your work has been focused on “independent media”. Can you talk about that term?

I learned this term at my first full-time job as an archivist at Democracy Now!, a daily, independent news program. Over the last few decades, media consolidation has led to fewer and fewer companies controlling significant parts of the media, and those companies often have financial stakes in sectors beyond media. For instance, a company may own a news network that covers climate change while also profiting from environmental deregulation, or own a news network that covers threats of war while also profiting from the sale of military hardware. Independent media is freer from the pressure to maximize profits or distort reality. For independent media organizations, the focus is more wholly on providing media as an offering to the public, as opposed to providing the public as an offering to advertisers.

On your website (dericed.com), you label yourself as an “archivist” and “technologist”. How do you use the term “technologist”? Is “technologist” a term we should be expanding in the digital preservation community?

I think my use of this term comes from my education at the Selznick School of Film Preservation. That education gave a strong impression that meeting preservation challenges depends not only on following practices but also on understanding and controlling the technology involved. This becomes particularly important when the technology we require for preservation is obsolete and debugging becomes a central part of the process. There are many areas where the technology available to us is not sufficient, and those in preservation need to discover and create their own technologies. So I think I use the term to mean someone who both knows how to use certain technologies as tools and knows how or when to create such tools.

In your first blog post on your website, you say (in bold) “Unplayable and broken digital media may be fixed just as an unplayable film print may be fixed.” Why did you put this in bold? Can you talk about the parallels between analog and digital in your work? Can you talk about the challenges of perceptions in this area?

This is one area where I wish my education had been different, as it focused on the differences between analog and digital formats, whereas I now think there are more parallels than commonly realized. At the time I was in school, I was more in tune with film preservation communities than digital preservation communities, and there was a lot of skepticism towards digital formats and a feeling of security in analog formats. I think perspectives like this slowed the progress of the community, as so much time was spent trying to avoid or stall digital workflows rather than innovating in digital environments.

I had gotten a side job doing audiovisual restoration for a producer who recorded video on a camera that wrote digital files onto a solid state card. He had accidentally erased the card, and the data recovery services he tried could only recover portions of QuickTime files but none of the file headers needed to decode them. I worked to discover the encoded contents (MPEG-2 video and PCM audio in this case) and developed a process to chisel the audio and video out of these broken files so that I could recover the recording. It was thrilling to take a pile of malformed data and recover a presentation from it. This seemed very similar to my work at school prying through decomposing nitrate film and repairing edge damage and splices. Although the tools are different, there are more analogous opportunities in audiovisual preservation, whether analog or digital, than I think many realize.
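The kind of recovery described above can be sketched as a simple data-carving pass: scan the raw bytes for MPEG-2 start codes and carve the stream from the first one found. This is a minimal, hypothetical illustration, not the interviewee's actual tooling; real recovery would also need to handle the interleaved PCM audio and hand the carved stream to a decoder (such as ffmpeg) that can resynchronize on it.

```python
# Minimal sketch of carving MPEG-2 video out of a broken file by
# scanning for sequence header start codes. Function names and the
# carving strategy are hypothetical illustrations.

SEQ_HEADER = b"\x00\x00\x01\xb3"  # MPEG-2 sequence header start code

def find_sequence_headers(data: bytes) -> list:
    """Return byte offsets of every MPEG-2 sequence header in `data`."""
    offsets = []
    pos = data.find(SEQ_HEADER)
    while pos != -1:
        offsets.append(pos)
        pos = data.find(SEQ_HEADER, pos + 1)
    return offsets

def carve_video(data: bytes) -> bytes:
    """Carve from the first sequence header onward; a tolerant decoder
    can often resynchronize on the resulting elementary stream."""
    offsets = find_sequence_headers(data)
    return data[offsets[0]:] if offsets else b""
```

In practice the carved bytes would be written to disk and probed with a decoder to confirm that playable frames survive.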

Can you explain what you see as the relationship between computer hacking and archiving, and how this has benefited organizations such as AMIA (Association of Moving Image Archivists)?

Audiovisual archiving is so dependent on obsolete, unsupported hardware (video machines, etc.) that we must hack to support it ourselves. I think the need is clearer with analog formats, and our field is accustomed to opening video decks and tinkering in order to improve preservation possibilities. I worked with an engineer who modified a U-matic video player to add an option to disable the sensor that detects the end of the tape, so that tapes with extreme shedding could cautiously be played back without triggering the machine to rewind. I’ve seen projects to convert video decks into cleaners or to sand down sprocket wheels in film transfers to accommodate shrunken film. Analog audiovisual hardware was not created with the expectation of handling media in a highly deteriorated state, and I think we should celebrate the analog tinkering and hacking done to better preserve media. On the other hand, sometimes the digital equivalent can be regarded as problematic or not credible, but I consider it essential that the community support its own hackers, working with analog and digital forms alike, so that we aren’t unnecessarily hindered by our own technology.

Do you think the challenges and problems you are working on will be different in 5 years or 10 years? In the next generation?

Yes, our challenges and problems change as technology progresses. Perhaps 15 years ago an archivist may have felt that copying audio onto Gold CD-R discs was in the interest of long-term preservation, but nowadays an archivist may examine a collection of Gold CD-Rs and determine that it’s a priority to migrate them to more suitable storage. I remember trying to make long term plans in my early days as an archivist and in retrospect much of the intent of those plans goes in the right direction, but the details and priorities obviously change. Furthermore, many of the challenges felt in archiving 10 years ago are different because we’ve improved solutions for them. Online collaborative technology spaces such as GitHub have really helped archivists collaborate and support each other to address challenges.

I find that acknowledging that our systems are temporary helps in long term planning. The collections and the metadata about the collections should be the core of what we work to sustain, describe, and make accessible. The systems that we use to manage those collections and metadata should be replaced or improved upon as needed. Although the collections may need to be permanent, the systems do not need to be.

Based on your work and areas of interest, what kinds of work would you like to see the digital preservation and stewardship community take on?

I would like to see more adoption of and support for open source tools within preservation workflows, particularly digitization. I know that the digital preservation community has worked to contribute to, sponsor, or integrate open source tools to facilitate access to digital collections; however, in many areas we still use proprietary or closed digitization systems over which we have limited control or understanding. I’d like to see more advocacy from the community for open software development kits for digitization hardware (such as scanners and audiovisual digitization cards) and support for open source digitization software that accounts for preservation principles.

For audiovisual digitization in particular, the community has often adopted production tools for videotape digitization such as Final Cut 7 and Live Capture Plus. Since videotape digitization is no longer part of most production workflows, the communities that supported such software have dropped support and moved on, all while the preservation community is ever more urgently in need of such tools.

On another note, I’m glad to see more and more digital archives taking microservice approaches to designing and implementing workflows for processing digital collections, rather than building workflows wholly from the options provided by a monolithic system. I’d like to encourage more discussion describing how archival packages are organized and how we may better create microservices that are interoperable rather than system-specific. Dinah Handel wrote an excellent blog post on this topic at http://ndsr.nycdigital.org/check-your-aip-before-you-wreck-your-aip/.
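As a minimal sketch of the microservice idea, each stage of a processing workflow can be a small, self-contained tool that reads a package directory and emits output the next stage can consume, independent of any monolithic system. The fixity step below is a hypothetical illustration of one such stage, not any particular system's interface:

```python
# One microservice-style stage: compute a fixity manifest for every
# file in a package directory. Its output (a mapping of relative path
# to SHA-256 digest) could feed a later stage such as format
# identification or packaging. A hypothetical sketch, not a real API.

import hashlib
from pathlib import Path

def fixity_manifest(package_dir: str) -> dict:
    """Hash every file under `package_dir`, keyed by relative path."""
    manifest = {}
    for path in sorted(Path(package_dir).rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            manifest[str(path.relative_to(package_dir))] = digest
    return manifest
```

Because the stage only depends on a directory layout and produces plain data, it can be swapped, rerun, or reused across systems, which is the interoperability the passage above argues for.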

Can you suggest other people who are doing interesting or innovative work that you think might be of interest to the digital preservation community?

Overall, I think interesting and innovative work is becoming much more accessible to the digital preservation community, in that more people are working in environments that encourage collaboration or working in open, online spaces. There are several active projects in the AMIA Open Source GitHub account that reflect the innovative work of the audiovisual archiving community, such as ffmprovisr, vrecord, and open-workflows.

I’d also recommend following the NDSR program. The project’s mission is to “build a dedicated community of professionals who will advance our nation’s capabilities in managing, preserving, and making accessible the digital record of human achievement.” I think the focus on developing “capabilities” is an urgent need and the program is doing well to support and empower residents to focus on preservation challenges and to research and innovate accordingly.
