Tool Test Lab – Standard Feedback Form

User Experience and Practical Assessment

About This Form

Tool Test Lab is a community-based initiative of REACH, the digital magazine of the Science Integrity Alliance. We invite researchers, editors, librarians, developers, and research-support professionals to test open-source, freely available tools used in research workflows and to share structured feedback based on practical use.


This form captures user experience and practical impressions. It does not constitute formal benchmarking, validation, endorsement, or certification of any tool.


Ratings and open-text responses from multiple testers may be combined and reported as group-level summaries in REACH (for example, "Most testers found the interface intuitive"). Open-text responses may be summarized in anonymized thematic form. Individual responses will not be attributed without explicit permission.


Please complete this form only after actively testing the tool.

(Estimated time: 5 minutes)

  1. Tester Context

Primary role:
Field or discipline:
Prior experience with similar tools:
None
Limited
Moderate
Extensive
  2. Tool and Material

Tool name and link (e.g., website, GitHub link):

Access method:
Web-based
R package
Python package
Command line
Other (please specify)

Type of material tested (e.g., reference list, manuscript draft, image figures, dataset, preprint, code, peer review report):

Amount of material tested (please enter a number and specify the unit; e.g., 25 documents, 3 reference lists, 400 references):

Task(s) performed with the tool (e.g., detecting duplicate references, screening for AI-generated text, checking image integrity, verifying statistical reporting):

  3. Ease of Use

    Please indicate your level of agreement with the following statements:

The tool was easy to access or set up.
The technical requirements were reasonable for this type of tool.
The instructions and documentation were clear.
The interface was intuitive.
The results were easy to interpret.
  4. Perceived Performance

    (User impression only. Not formal validation.)


    Please indicate your level of agreement with the following statements:

The results appeared accurate.
The tool only flagged content that genuinely warranted attention.
The tool detected the issues I expected it to detect.
I have confidence in the reliability of the tool's output.
The tool clearly explains how its results are generated.
  5. Comparison Against a Reference Source

    (Optional)

Did you compare the tool's output against any reference source (e.g., a manually verified list, an established dataset, or another tool)?
Yes
No
  6. Practical Value

    Please indicate your level of agreement with the following statements:

The tool fits well into my workflow.
The time required to use this tool was acceptable.
The tool is worth the effort required to use it.
I would use this tool in practice.
I would recommend this tool to others.
Overall, I am satisfied with this tool.
  7. Screenshots and Supporting Material

    (Optional)


    You may upload up to 5 images that illustrate the testing process, interface, or output results. Please ensure that the uploaded material does not contain confidential, proprietary, or identifiable information.

  8. Open Feedback

    (Optional)

For example: Are you aware of other tools with a similar aim? Do you have any other thoughts about this tool?

  9. Consent and Attribution

    Your responses may contribute to published content in REACH. Please indicate your preferences below.

Use of aggregated data

Ratings and open-text responses from multiple testers may be combined and reported as group-level summaries. Open-text responses may be summarized in anonymized thematic form.

I agree to my responses being included in aggregated summaries published in REACH.
Yes
No

Use of individual responses

Your responses may be presented individually to illustrate specific experiences or perspectives.

I agree to my responses being presented individually in REACH.
Yes
No

The following question applies only if you agreed above to allow individual responses to be presented.

If your responses are presented individually, how would you like to be identified?

Terms and Privacy

Checking the box below confirms your agreement and enables form submission.
