DLESE Collections: Building the Reviewed Collection

Kim Kastens
Lamont-Doherty Earth Observatory
Rte. 9W
Palisades NY 10964
kastens@ldeo.columbia.edu

 

Session Description:

At the Portals to the Future workshop, participants agreed on a two-part collection: a broad unreviewed collection of resources relevant to Earth System Science, and a smaller reviewed collection of the highest-quality resources. In this focus group, we seek to refine the criteria and procedures by which resources will be selected for the reviewed collection.

Goals and Objectives (for users and creators):

Activities:

  1. Review of decisions to date (Kastens): Rationale for reviewed collection, rationale for unreviewed collection, seven selection criteria, community review plan
  2. Try out the form for unused resources: Recall a specific circumstance when you examined a teaching or learning resource and decided NOT to use it. A paper mock-up of the form for resources examined but not used will be distributed. Fill out the form with respect to the specific resource which you recalled.
  3. Discussion:
  4. Try out the form for used resources: Recall a specific circumstance when you found a teaching or learning resource that someone else had developed, and decided to adapt it to your own classroom. A paper mock-up of the form for resources used will be distributed. Fill out the form with respect to the specific resource which you recalled.
  5. Discussion:

Background Reading:

(The following is extracted from the pending proposal "Collaborative Project: To Gather, Document, Filter and Assess the Broad and Deep Collection of the Digital Library for Earth System Education.")

Our review procedure is built around the following premises:

  1. In order to be useful, DLESE has to contain a critical mass of resources.
    1. In the growth of DLESE, teachers, classrooms and students will be abundant resources.
    2. The rate-limiting resources in DLESE's growth will be money, and the time of paid librarians, editors, and computer professionals.
  2. We adopt the seven selection criteria recommended by the Portals to the Future workshop: scientific accuracy, importance/significance, pedagogical effectiveness, well-documented, ease of use, power to inspire or motivate students, robustness/sustainability.
  3. The materials in the DLESE reviewed collection must be classroom-tested. However:
    1. Testimony from the creator of a resource that learning occurred in his or her classroom is insufficient. The presence of an expert in the classroom can bridge gaps or blurry spots in the material that would greatly diminish the resource's effectiveness in the hands of another teacher.
    2. It is not realistic to pay for professional evaluators to go into classrooms to evaluate whether learning has occurred for every potential DLESE resource.
    3. Experienced educators can tell whether or not their own students are learning effectively from an educational resource. Their experience and intuition, in the aggregate, are valuable and valid indicators of pedagogical effectiveness.
    4. It is easier to answer: "Did your students learn?" than "Do you think students would learn?"

With these premises in mind, we have designed what we call a "community review" filtering system. This system taps into the power of the World Wide Web and the strength of numbers of the DLESE community so as to minimize the money and staff time expended per item evaluated, and maximize the quality of the resources in the reviewed collection and the sense of community ownership of the library. A summary of the process is provided in Table 1.

 

Table 1: Overview of the review process

Step | Portals to the Future selection criterion | How to implement
---- | ----------------------------------------- | ----------------
1 | Well documented | Review by library staff
2 | Importance/significance | More than N (threshold number to be chosen) educators from the DLESE community tried this resource in their classroom
3 | Pedagogical effectiveness; ease of use for students and faculty; inspirational or motivational for students | On-line questionnaire filled out by educators who used the resource in their classroom
4 | Accuracy, as evaluated by scientists | Invited review by a scientist, recruited by an editor
5 | Robustness/sustainability | QA testing (functionality and configuration testing)

Step 1: The first step in the review procedure will occur in parallel with the metadata tagging step. As the library staff person reviews the resource to attach the metadata, he/she will check to make sure that the resource is "well-documented." The meaning of "well-documented" for each resource type is still being defined by the DLESE Collections Committee, but we anticipate that for assessment tools the criteria will include an answer key or scoring rubric; for data the criteria will include documentation of how, when, and where the data were acquired; for field trips the documentation will include maps; and so on. For all kinds of materials, the documentation will include appropriate literature references and other attributions, and, of course, metadata. If the resource is incompletely documented, the creator will be invited to upgrade it.
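As a concrete illustration, here is a minimal sketch of how the "well-documented" check might be encoded. The resource types and field names below are assumptions drawn from the examples in this paragraph, not the Collections Committee's finalized criteria:

```python
# Sketch: a per-resource-type documentation checklist. The resource
# types and field names are assumptions drawn from the examples above;
# the real criteria are still being defined by the Collections Committee.
REQUIRED_BY_TYPE = {
    "assessment_tool": {"answer_key_or_scoring_rubric"},
    "data": {"acquisition_how", "acquisition_when", "acquisition_where"},
    "field_trip": {"maps"},
}

# Documentation required of every resource, whatever its type.
REQUIRED_ALWAYS = {"literature_references", "attributions", "metadata"}

def missing_documentation(resource_type, supplied):
    """Return the documentation fields still needed; an empty set means
    the resource passes the 'well-documented' check."""
    required = REQUIRED_BY_TYPE.get(resource_type, set()) | REQUIRED_ALWAYS
    return required - set(supplied)

# Example: an incompletely documented data set, whose creator would be
# invited to upgrade it.
print(sorted(missing_documentation("data", {"metadata", "acquisition_when"})))
```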

Step 2: As each user accesses a resource, DLESE will determine if he or she is an educator. Ideally, this will occur through an automated user-authentication and user-authorization infrastructure such as that proposed in Kate Wittenberg's companion proposal to the NSDL Core Integration Track. If such a system is not funded or not adopted by DLESE, we will simply ask, as our first question on the community review survey: "Are you an educator?" If the answer is "yes", the user will be asked to fill out an evaluation questionnaire (figure 1). The questionnaire will first ask if you have used the resource in your classroom. The "importance/significance" criterion will be judged to have been passed when a certain number of members of the community attest that they have used the resource in their classroom. Even if they didn't like the resource, the very fact that they tried it will be considered as de facto evidence that the content or understanding or skill addressed by the resource is important to the community.
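A minimal sketch of this importance/significance tally follows; the value of N is a placeholder, since the proposal leaves the threshold number to be chosen:

```python
# Sketch: the importance/significance tally. N is a placeholder for
# the threshold number, which is still to be chosen.
N = 10  # placeholder, not a value fixed by DLESE

classroom_uses = {}  # resource_id -> set of educator ids who tried it

def record_classroom_use(resource_id, educator_id):
    classroom_uses.setdefault(resource_id, set()).add(educator_id)

def passes_importance(resource_id):
    """True once more than N distinct educators have tried the resource
    in their classrooms -- whether or not they liked it."""
    return len(classroom_uses.get(resource_id, set())) > N
```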

For educators who respond that they did not use the resource in their classroom, the questionnaire is simple (figure 1, left) and seeks merely to determine why the resource was not used. The information from educators who looked at the resource but decided not to use it will be forwarded automatically but anonymously to the resource creator. In addition, this information will be aggregated and forwarded to the keepers of the Discover system, for use in refining the search procedures to cut down on the number of not-useful resources explored by DLESE educators.
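A small sketch of how the "why not used" responses might be aggregated anonymously for the Discover team; the reason categories are hypothetical, since the form is still a paper mock-up:

```python
# Sketch: anonymous aggregation of "why not used" responses for the
# keepers of the Discover system. The reason categories are
# hypothetical, since the form is still a paper mock-up.
from collections import Counter

not_used_reasons = Counter()

def record_not_used(reason):
    # Only the reason is kept; the responder's identity is not stored.
    not_used_reasons[reason] += 1

record_not_used("wrong grade level")
record_not_used("wrong grade level")
record_not_used("too much setup time")
print(not_used_reasons.most_common())  # tallies go to the search team
```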

Step 3: For educators who respond that they did use the resource in their classroom, the questionnaire will dig more deeply (figure 1, right). Educators will be asked to provide three numerical ratings, evaluating the resource on the criteria of pedagogical effectiveness, ease of use for faculty and students, and power to motivate or inspire students. Separate text fields will be provided for an open-ended review of the resource and for teaching tips that other educators might find useful in using the resource. Finally, the questionnaire will gather contextual information about the setting in which the resource was used.
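The shape of a completed "used it" response might look like the following sketch; the field names and the 1-5 rating scale are illustrative assumptions, not the final form design:

```python
# Sketch: the shape of a completed "used it" response. The field names
# and the 1-5 rating scale are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class UsedResourceReview:
    pedagogical_effectiveness: int   # numerical rating, e.g. 1-5
    ease_of_use: int                 # for both faculty and students
    motivational_power: int          # power to motivate or inspire
    open_review: str = ""            # open-ended review of the resource
    teaching_tips: str = ""          # posted with the resource
    context: dict = field(default_factory=dict)  # setting where used
```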

The responses to the write-in questions and the numerical scores will be sent, anonymously and automatically, to the creator of the resource. The creator may choose to modify the resource to take the reviewers' comments into account. After revision, the creator may choose either to wait for better reviews to accrue or to "wipe the slate clean" and erase all the pre-revision reviews.
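A sketch of that post-revision choice, assuming reviews are stored as simple per-resource lists; archiving (rather than destroying) the pre-revision reviews is an implementation assumption:

```python
# Sketch: the creator's post-revision choice, assuming reviews are kept
# as simple per-resource lists. Archiving (rather than destroying) the
# pre-revision reviews is an implementation assumption.
reviews = {}           # resource_id -> list of community reviews
archived_reviews = {}  # pre-revision reviews, kept out of the tallies

def wipe_slate_clean(resource_id):
    """Erase all pre-revision reviews so that post-revision scores
    accrue from zero; the alternative is simply to wait for better
    reviews to outweigh the old ones."""
    old = reviews.pop(resource_id, [])
    archived_reviews.setdefault(resource_id, []).extend(old)
```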

Educators' responses to the teaching tips question will be posted with the resource. Examples include: suggestions on how to adapt the resource for certain categories of students (e.g., students with disabilities, or students older or younger than the original target audience); references to print or web-based material pertinent to some aspect of the resource; warnings of difficulties that students have encountered.

When a given resource has accumulated more than a threshold number of reviews from educators who have used the resource in their classroom, and the numerical scores have risen above a threshold value, the entire packet of reviews will be sent automatically to the "editor" of the reviewed collection (Kastens). The threshold test could be a sum of the three scores, a weighted sum that counts pedagogical effectiveness more heavily than the others, or a separate threshold for each criterion. We will experiment with different threshold values.
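The three candidate threshold rules could be compared experimentally along the lines of this sketch; every weight and cutoff below is a placeholder to be tuned, not a value fixed by the proposal:

```python
# Sketch: three candidate threshold rules from the paragraph above.
# Every weight and cutoff is a placeholder to be tuned by experiment.
CRITERIA = ("pedagogical_effectiveness", "ease_of_use", "motivational_power")

def passes_sum(means, cutoff=11.0):
    """Rule 1: a single cutoff on the sum of the three mean scores."""
    return sum(means[c] for c in CRITERIA) >= cutoff

def passes_weighted(means, cutoff=4.0):
    """Rule 2: a weighted mean that counts pedagogical effectiveness
    more heavily than the other two criteria."""
    weights = {"pedagogical_effectiveness": 2.0,
               "ease_of_use": 1.0,
               "motivational_power": 1.0}
    total = sum(means[c] * w for c, w in weights.items())
    return total / sum(weights.values()) >= cutoff

def passes_per_criterion(means, cutoffs={"pedagogical_effectiveness": 4.0,
                                         "ease_of_use": 3.0,
                                         "motivational_power": 3.0}):
    """Rule 3: a separate threshold for each criterion."""
    return all(means[c] >= cutoffs[c] for c in CRITERIA)

def ready_for_editor(reviews, min_reviews, rule=passes_weighted):
    """Send the packet to the editor only when the resource has more
    than the threshold number of classroom-tested reviews AND its mean
    scores pass the chosen threshold rule."""
    if len(reviews) <= min_reviews:
        return False
    means = {c: sum(r[c] for r in reviews) / len(reviews) for c in CRITERIA}
    return rule(means)
```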

The editor will review the packet and trigger the more staff-intensive steps in the evaluation process: the evaluations for "scientifically accurate" and "robustness/sustainability."

Figure 1: The evaluation questionnaire for educators who considered using a DLESE Collections resource.

Step 4: For "scientifically accurate," the editor will recruit a scientist-reviewer, who will review the resource, explicitly searching for errors or weaknesses in the science presented. This process will be similar to the familiar peer-review process for scientific manuscripts.

Step 5: For "robustness/sustainability," the resource will be run through a quality-assurance process. For a simple resource such as an image collection, this will be an automated check for broken hyperlinks, and a verification that the images display and print successfully on a range of browsers and hardware. For a resource with embedded software, such as a simulation or animation, the robustness check will include a check for compatibility with different hardware and operating system configurations, and an attempt to break the software by exercising it with a wide range of inputs, conditions and parameters. For the simpler resources, we will do the QA testing in-house; for the most complicated resources, we will contract with a commercial vendor of QA services.
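The automated broken-hyperlink portion of this check might look like the following sketch, using only Python's standard library; the browser/hardware display tests and the stress testing of embedded software would remain separate, partly manual steps:

```python
# Sketch: the automated broken-hyperlink portion of the QA check,
# using only the Python standard library. Display, printing, and
# configuration tests would be separate, partly manual steps.
import urllib.error
import urllib.request

def broken_links(urls, timeout=10):
    """Return the subset of urls that cannot be fetched successfully."""
    bad = []
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                if resp.status != 200:
                    bad.append(url)
        except (urllib.error.URLError, ValueError):
            # HTTPError (e.g. a 404) is a subclass of URLError;
            # ValueError catches malformed URLs.
            bad.append(url)
    return bad

# Example: check the hyperlinks harvested from an image-collection page.
print(broken_links(["http://www.dlese.org/"]))
```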

The reviews and bug list from these staff-mediated review steps will be sent to the creator of the resource, who will then modify the resource as needed. A staff member will check to make sure that the revisions accommodate the changes suggested by the reviews and tests, and will certify that the resource is ready to move into the reviewed collection.

The community review mechanism will remain in place even after an item has moved into the reviewed collection. Teaching tips sent in by educator-evaluators will continue to accrue and be posted with the resource. The numerical scores on the pedagogical effectiveness, ease of use, and inspirational/motivational criteria for items in the reviewed collection will be used for two additional purposes. First, those resources that continue to receive the highest scores on the community review evaluation, and that are also accessed by a significant number of users, will be recognized with "best of DLESE" awards. These awards will be given annually, in a number of categories by topic, resource type, and level. Awards are one of the mechanisms by which the DLESE Academic Recognition Task Force plans to encourage appropriate academic career recognition for the creators of excellent educational resources. At the other end of the spectrum, those resources which begin to receive subpar numerical scores in the community review evaluation will be re-examined. If appropriate, the creator will be offered an opportunity to update the resource; if the resource is not updated, it may be pruned from the reviewed collection.
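Both ongoing uses of the post-acceptance scores reduce to simple filters over the accumulating statistics; the cutoff values in this sketch are illustrative assumptions:

```python
# Sketch: the two ongoing uses of post-acceptance scores. The cutoff
# values are illustrative assumptions.
def best_of_dlese_candidates(resources, min_score=4.5, min_accesses=1000):
    """Resources with sustained top community scores that are also
    accessed by a significant number of users become candidates for
    the annual 'best of DLESE' awards."""
    return [r for r in resources
            if r["mean_score"] >= min_score and r["accesses"] >= min_accesses]

def flag_for_reexamination(resources, floor=3.0):
    """Resources whose community scores fall below a floor are
    re-examined and, if not updated, may be pruned."""
    return [r for r in resources if r["mean_score"] < floor]
```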