Report from
"Building the Reviewed Collection" Focus Group
at the DLESE Leadership Conference.

Bozeman, MT, June 2000.

 

The charge to this focus group was to contribute to the early design of a procedure for selecting resources for the DLESE "Reviewed Collection," that is, for identifying the "best" resources within the larger "Open Collection."

The focus group began with a presentation by Kim Kastens. First, she reviewed decisions made at the Portals to the Future Workshop, including the rationale for having both reviewed and open collections and the seven selection criteria (see http://www.dlese.org/documents/reports/panelreports/panel_3.html). She then presented a straw-man plan for a review procedure that includes input from the user community via a web-mediated community-review recommendation engine, from library staff, and from specialists. The plan is designed to scale to the large number of resources projected for inclusion in the DLESE collection, and it supports the DLESE vision of a library built by and for the community.
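As a rough illustration only, the following Python sketch models the three input channels of the straw-man plan (community reviewers, library staff, specialists) feeding into a single review package per resource. All class and field names here are hypothetical and are not part of any DLESE specification.

from dataclasses import dataclass, field
from enum import Enum
from typing import List

class ReviewSource(Enum):
    # The three input channels named in the straw-man plan (hypothetical labels).
    COMMUNITY = "community reviewer"      # via the web-mediated recommendation engine
    LIBRARY_STAFF = "library staff"
    SPECIALIST = "specialist"

@dataclass
class ReviewInput:
    source: ReviewSource
    resource_id: str
    comments: str

@dataclass
class ReviewPackage:
    # Everything gathered about one resource before a reviewed-collection decision.
    resource_id: str
    inputs: List[ReviewInput] = field(default_factory=list)

    def add(self, item: ReviewInput) -> None:
        if item.resource_id != self.resource_id:
            raise ValueError("input refers to a different resource")
        self.inputs.append(item)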

In the main activity of the focus group, participants filled out paper mock-ups of a first design of two web-submitted forms to be used by community-reviewers. Under the proposed plan, only members registered with DLESE as educators would be able to access the evaluation forms. Within the group of people listed as "educators" in the DLESE membership database, however, community-reviewers would step forward on their own rather than being recruited by an editor.

Focus group participants were first asked to recall a specific situation in which they had looked at a web resource but decided not to use it in their teaching, and to fill out a form appropriate to that situation. The information from that form will go to the creator of the resource, and a digested version will go to the Discovery System staff to catch situations in which the resource is misdirected rather than poor.
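A minimal sketch, in Python, of how submissions of this first form might be handled under the proposed plan: access limited to members registered with DLESE as educators, the full response routed to the resource creator, and a digested version routed to the Discovery System staff so that misdirected (rather than poor) resources can be caught. The function and field names are assumptions made for illustration, not part of the actual design.

from dataclasses import dataclass

@dataclass
class Member:
    name: str
    registered_as_educator: bool

@dataclass
class DidNotUseForm:
    # Mock-up #1: an educator looked at a resource but chose not to use it.
    resource_id: str
    reason: str          # free-text explanation of why the resource was not used
    topic_sought: str    # what the educator was actually looking for

def can_review(member: Member) -> bool:
    # Only members registered with DLESE as educators may open the evaluation forms;
    # within that group, reviewers step forward on their own.
    return member.registered_as_educator

def route_did_not_use(form: DidNotUseForm) -> dict:
    # Full text goes to the creator; a digest goes to the Discovery System staff.
    to_creator = {"resource": form.resource_id, "reason": form.reason}
    # The digest keeps only what is needed to spot a misdirected (rather than poor) resource.
    to_discovery_staff = {"resource": form.resource_id, "topic_sought": form.topic_sought}
    return {"creator": to_creator, "discovery_system_staff": to_discovery_staff}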

In the subsequent discussion, the following suggestions came forward concerning the form for educators who looked at the resource but decided not to use it:

The second form was a more extensive questionnaire for educators who did use the resource under review to help students learn. The information from this form will go to the creator (all of it, anonymously), will go to the editor/gatekeeper of the reviewed collection (once a threshold in the quality and quantity of reviews has been passed), and, in the case of the teaching tips only, will be posted with the resource. Again, focus group participants were asked to recall a specific circumstance in which they had used a web-based educational resource, and to fill out a paper mock-up of a web form with that situation in mind.
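The fan-out for this second form is more conditional, and a hedged Python sketch may make it concrete: everything goes anonymously to the creator, the editor/gatekeeper sees a summary only after a quality-and-quantity threshold of reviews has been passed, and only the teaching tips are posted with the resource. The threshold values and field names below are invented for illustration; the focus group did not set them.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class UsedWithLearnersForm:
    # Mock-up #2: an educator used the resource to help students learn.
    resource_id: str
    reviewer_name: str      # stripped before anything leaves this routing step
    quality_score: int      # e.g. 1-5
    teaching_tips: str

# Illustrative thresholds only; the actual values were left open.
MIN_REVIEWS = 5
MIN_MEAN_SCORE = 4.0

def route_used_form(new_form: UsedWithLearnersForm,
                    earlier_forms: List[UsedWithLearnersForm]) -> dict:
    all_forms = earlier_forms + [new_form]
    mean_score = sum(f.quality_score for f in all_forms) / len(all_forms)

    # 1. The creator always gets the substance, with the reviewer's identity removed.
    to_creator = {"resource": new_form.resource_id,
                  "score": new_form.quality_score,
                  "teaching_tips": new_form.teaching_tips}

    # 2. The editor/gatekeeper sees a summary only after both thresholds are passed.
    to_editor: Optional[dict] = None
    if len(all_forms) >= MIN_REVIEWS and mean_score >= MIN_MEAN_SCORE:
        to_editor = {"resource": new_form.resource_id,
                     "review_count": len(all_forms),
                     "mean_score": mean_score}

    # 3. Only the teaching tips are posted publicly alongside the resource.
    posted_with_resource = new_form.teaching_tips

    return {"creator": to_creator,
            "editor_gatekeeper": to_editor,
            "posted_with_resource": posted_with_resource}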

The following suggestions were offered about the web form for educators who tested a resource with real learners:

The participant-annotated paper mock-ups of the review forms have turned out to be a valuable resource. We plan to revise the forms, taking into account comments from the Bozeman conference, and to repeat the focus group, possibly asynchronously over the Collections list-server.

The following suggestions were offered as to what should happen next, after the community-review results are in hand:

Other issues raised in the focus group:

The next day, at the town meeting, there was a discussion of diversity and the digital divide. During this discussion, it occurred to me that the proposed DLESE review process could help with this problem. At the end of the review form for educators who have used the resource, we could add an optional question along the lines of: "[optional] DLESE is especially seeking resources that have worked well in challenging learning/teaching situations. Please estimate what percentage of the learners with whom you used this resource are: ( ) learning disabled ( ) physically disabled ( ) economically disadvantaged ( ) English as a second language ( ) minority underrepresented in science ( ) urban setting, limited access to Nature."

Then, whenever a resource scored above a certain threshold in a situation with more than a certain percentage of students in any of the harder-to-teach categories, the review package would automatically be sent to a member of the Editorial Board designated as the diversity watchdog. Picking from the resources brought to his/her attention in this way, and perhaps communicating with the reviewers/testers for additional input, the diversity watchdog would assemble web pages of resources "Recommended for students with limited English proficiency," "Recommended for urban students," "Recommended for place-bound students," and whatever other categories seem useful.
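To make the proposed trigger concrete, here is a small Python sketch of the rule described above: when a resource scores above some threshold in a setting where more than a given percentage of the learners fall into one of the harder-to-teach categories, the review package is forwarded to the diversity watchdog. The numeric thresholds and the abbreviated category labels are placeholders, not decided values.

# Categories from the proposed optional question, abbreviated for illustration.
CATEGORIES = ["learning disabled", "physically disabled", "economically disadvantaged",
              "English as a second language", "minority underrepresented in science",
              "urban setting, limited access to Nature"]

SCORE_THRESHOLD = 4.0       # placeholder for "scores above a certain threshold"
PERCENT_THRESHOLD = 50.0    # placeholder for "more than a certain percentage of students"

def flag_for_diversity_watchdog(mean_score: float,
                                category_percentages: dict) -> bool:
    # Return True if this review package should be sent to the diversity watchdog.
    if mean_score <= SCORE_THRESHOLD:
        return False
    return any(category_percentages.get(cat, 0.0) > PERCENT_THRESHOLD
               for cat in CATEGORIES)

# Example: a well-rated resource tested with a class that is 80% ESL learners.
print(flag_for_diversity_watchdog(4.6, {"English as a second language": 80.0}))  # True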