The first platform evaluation is fast approaching, and the WebTech Subcommittee is in search of volunteers. The first platform to be evaluated will be Access to Memory (AtoM). If you're interested in volunteering, we'd love your help! You can find out more by reading this blog post. ArchivesSpace and XTF are slated for evaluation in the spring.
In preparation for the upcoming platform evaluations, we have gathered some resources on each one. Additionally, we’ve gathered some basic resources on EAD.
Our approach to finding resources began with the most easily located pages: the main websites for each of the platforms. The WebTech team also did some brief brainstorming on resources we were already aware of, such as Yale's blog on ArchivesSpace and SAA's EAD documentation page. From these we mined links to wikis, GitHub repositories, and other blogs. Additionally, we performed Google keyword searches to locate more blog posts related to specific platforms, which led to further link mining and the discovery of additional front-end interface examples.
We hoped to build a list that would help us and others learn more about each of the platforms and bolster preexisting EAD knowledge. However, the list is not comprehensive! Please feel free to share additional resources in the comments so that we can add them to the list. You can view the list of resources along with brief annotations on the Annotated Bibliography page.
As you all know, TARO will undergo some big changes in the next couple of years. We are looking into moving to a new archival description platform.
But which one? This is where your help is vital to the success of the new TARO.
If you agree to volunteer to test the following archival platforms, you will be contributing to the improvement of a valuable resource for the larger regional archives community.
And you may be helping yourself as well: the research you do can inform your own institution's descriptive practices.
The WebTech committee is looking at the following platforms:

- Access to Memory (AtoM)
- ArchivesSpace
- XTF

Over the next year we need 10 volunteers: we will test AtoM this fall, and ArchivesSpace and XTF in the spring.
Volunteers will be given access to an instance of the platform and will use a set of prompts similar to a usability test to help us determine which platform best addresses the core needs as we see them:
- Finding aid discovery
- Finding aid creation
Should you agree to test AtoM, you will receive an electronic packet of links covering three testing sections:
- Evaluation Matrix: The matrix is divided into sections mapped roughly to the user stories provided in your evaluation packet. For each criterion, you will:
  - Indicate how you would prioritize it (High / Medium / Low)
  - Indicate whether the platform you are evaluating supports it (Yes / No / N/A)
- Follow-up Questions: Short-answer questions covering topics related to the new TARO platform that the other evaluation tools cannot capture.
- Comments: The comments section is entirely free-form. We ask that you provide feedback about the evaluation process, the TARO planning grant, your institution's orientation toward TARO, and anything else you'd like to share.
From November 2–13, volunteers will evaluate AtoM using these materials, which will be e-mailed out next week.
We anticipate that the time required will be no more than two hours over the two-week evaluation period, and the work does not need to be done in one sitting.
Volunteers can sign up by filling out the Google Form here: https://docs.google.com/forms/d/1f8ubEINFbMRGboqaWhPjUBsi2jzD-PEmvYHswu8avNs/viewform?usp=send_form
Contact Daniel Alonzo or Jessica Meyerson if you have any questions about the volunteer process.
We are looking forward to hearing from you!