Abstract
Purpose: Researchers and institutions are faced with a growing number of tools to help maximize and track alternative metrics, or altmetrics. Unlike bibliographic databases, the coverage, functionality, and underlying search algorithms of these tools are often opaque. This poster describes an assessment of these unique resources.
Setting/Participants/Resources: Seven tools were examined: Altmetric Explorer, F1000, ImpactStory, Kudos, Mendeley Stats, Newsflo, and PlumX.
Brief Description: We investigated seven altmetrics-related tools to determine their utility for both the Libraries and our users, considering each tool's functionality, intended purpose and audience, business model, transparency, accuracy, and flexibility, both for the Libraries as a service and resource delivery unit and for the individual researcher.
Results/Outcomes: Although no single tool met all of the articulated and anticipated needs of our Libraries or our users, we developed an overarching rubric that allows us to clearly communicate the benefits and potential challenges of each of these diverse tools. The scope of the tools was often limited, for example focusing only on social media engagement rather than a more robust picture of impact, and they consistently lacked functionality such as the ability to download search results. Associated costs were often ambiguous, as were the search algorithms and data sources included, and tools frequently failed to address both individual and institutional needs simultaneously.
Evaluation Method: A consensus-based model was used to develop an assessment rubric of altmetric tools.
| Original language | English (US) |
|---|---|
| State | Published - Oct 8 2018 |
| Event | Medical Library Association Midwest Chapter Annual Meeting - Cleveland, United States |
| Duration | Oct 5 2018 → Oct 9 2018 |
Conference

| Conference | Medical Library Association Midwest Chapter Annual Meeting |
|---|---|
| Country/Territory | United States |
| City | Cleveland |
| Period | 10/5/18 → 10/9/18 |