DLESE Collections Committee

Thoughts about how to build a DLESE collection.

By Kim Kastens 2/14/00.

With advice from Martin Rusek, John Butler and others.

 

....including a vision of some of what might go into a DLESE NSDL/Collections Track proposal, and some thoughts about what the Collections Committee's activities might be over the next year or so...

....all is negotiable...see especially "Issues and Potential Problems" at the end of the document before you start launching your criticisms...

....The main change from the Coolfont document has been that the burden of proof of pedagogical effectiveness is shifted from the creator of the resource to the community. This change is because both the NEEDS paper and a conversation with a colleague well versed in educational research said that "proof that student learning has occurred" is too high a bar to set, and that the evaluation effort required to provide such evidence would be impossible for most resource-creators to deliver. Also, this community-review step makes the library more of a community construct than a run-of-the-mill collection of hotlinks....

 

Premises:

Premise (1): This proposal and this committee will not commission the creation of new material, but rather will seek to build a collection from existing materials, perhaps with slight modifications.

Premise (2): The materials in the "inner-circle" of peer-reviewed, DLESE-stamp-of-approval-bearing resources must be classroom tested.

  1. However, testimony (no matter how well documented) from the creator of a resource that learning has occurred in his or her classroom is insufficient. The presence in the classroom of an expert on the material can bridge over gaps or blurry spots in the material that could greatly diminish its effectiveness in the hands of another teacher.
  2. There will never be enough money in the system to pay for professional evaluators to go into classrooms to evaluate whether student learning has occurred for every potential DLESE resource.
  3. Experienced educators can tell whether or not their students are learning effectively from an educational resource. Their experience and intuition, in the aggregate, are valuable and valid indicators of pedagogical effectiveness.

Premise (3): Materials for the DLESE collections will be selected according to the selection criteria articulated by the Coolfont Collections Committee:

    1. Accuracy, as evaluated by scientists.
    2. Importance/Significance.
    3. Pedagogical effectiveness.
    4. Well documented.
    5. Ease of use for students and faculty.
    6. Inspirational or motivational for students.
    7. Robustness/sustainability.

Premise (4): In order to be useful, DLESE has to contain lots of resources. In order to review lots of resources, the selection/evaluation process has to be economical in terms of money and staff time expended per item evaluated, by finding clever ways to tap into contributed expertise from the community.

 

Overview of Collection Building Process:

Step 1: Flesh out and publicize the DLESE selection criteria.

Step 2: Build a filter/review procedure to test submitted resources against the selection criteria. A summary of the criteria and how they will be tested for is as follows:

Coolfont Selection Criteria and how to implement them:

Importance/Significance: more than N (threshold number to be chosen) educators from the DLESE community have tried this resource in their classroom.

Pedagogical effectiveness; ease of use for students and faculty; inspirational or motivational for students: on-line questionnaire filled out by educators who used the resource in their classroom.

Accuracy, as evaluated by scientists: invited review by a scientist, recruited by an editor.

Well documented: review by library staff.

Robustness/sustainability: testing by a quality-assurance professional.
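
Purely as an illustration, once the review system is implemented in software, the routing of each criterion to its review mechanism might be represented roughly as follows; the names and the threshold value N are placeholders, not decided policy.

```python
# Illustrative sketch only: routes each Coolfont selection criterion to the
# review mechanism proposed above. Names and the threshold N are placeholders.
CLASSROOM_USE_THRESHOLD_N = 5  # "more than N educators" -- actual value still to be chosen

REVIEW_MECHANISMS = {
    "importance/significance": "classroom-use count exceeds threshold N",
    "pedagogical effectiveness": "on-line educator questionnaire",
    "ease of use for students and faculty": "on-line educator questionnaire",
    "inspirational or motivational for students": "on-line educator questionnaire",
    "accuracy": "invited review by a scientist, recruited by an editor",
    "well documented": "review by library staff",
    "robustness/sustainability": "testing by a quality-assurance professional",
}
```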

Step 3: Invite submissions from people with known resources, such as the "good practices" creators on the Virtual Geosciences Professor Website.

Step 1: Flesh out the DLESE selection criteria.

This was one of the Coolfont action items, and will be undertaken by the DLESE Collections Committee as one of the 2000-'01 activities under the Geoscience Education proposal. A few thoughts on a few criteria follow:

The most complex and time-consuming of the criteria to articulate and specify will be "well-documented." The meaning of "well-documented" will differ for different resource types. For data (including both digitally-acquired data and photographs), metadata must document where, when, and how the data were acquired. For teaching/learning materials with embedded student activities, "well-documented" means that the teacher is provided with an answer key, a scoring rubric, possibly examples of good/medium/poor student answers, and possibly a discussion of common pitfalls and misconceptions. Every resource, regardless of type, will need to have tags to permit it to be found by users of the DLESE Discovery System.

At Coolfont we spent some time discussing what constitutes "important or significant." We recognized that importance or significance could derive from different factors for different resources: one item bears on a concept that is central to a core geoscience discipline, another item bears on a topic important to the future of humanity. We decided to leave the burden on the creator to persuade the viewers that the resource was important, just as an NSF proposal writer has to persuade the proposal reviewers that the topic is important. In the procedure outlined in this document, a very pragmatic working definition of "important" is adopted: the topic/skill/insight/etc. learnable from a resource is sufficiently important to include in the DLESE core collection if a threshold number (five?) of educators in the DLESE community have thought it important enough to teach in their classroom.

Step 2: The Filter/Review Procedure.

For the first step of the proposed review procedure to work, each DLESE item needs a "Teachers' Section," which will be password-protected and accessible only by educators who have registered with DLESE. In this section, teachers will find an array of useful resources, including answers to questions, a suggested scoring rubric, links to tutorials about pedagogy, excellent/medium/poor examples of student work, etc. -- and, crucially for the purpose of evaluation, an on-line questionnaire asking about their use and opinion of the DLESE resource itself.

Everyone who accesses the Teachers' Section of a DLESE item will be asked to fill out this evaluation questionnaire. The questionnaire will first ask whether they have used the resource in their classroom.

DLESE Collections criterion #2 (importance/significance) will be judged to have been passed when a certain number of members of the community (5? 10?) attest that they have used the resource in their classroom. Even if they didn't like the resource, the very fact that they tried it will be considered de facto evidence that the material is important to the community.

For people who respond that, No, they didn't use the resource in their classroom, the questionnaire is very short. Why not?

( ) too difficult for my students.
( ) too elementary for my students.
( ) decided not to teach this topic.
( ) this approach required more time than I had available.
( ) found a better resource, which was ___________.
( ) I found errors in the resource, which were ___________.
( ) Other: _______________________________.

The information from the people who looked at the resource but decided not to use it will be forwarded to the keepers of the Discovery System for possible use in refining the search procedures, to cut down on the number of not-useful resources explored by DLESE educators.

For people who respond that, Yes, they did use the resource in their classroom, the questionnaire would dig more deeply. The first section would collect contextual information: What was the grade level of the course in which you used the resource? What was the title of the course? With how many students did you use the resource? How much class time did you spend on the resource?

Next, the questionnaire would provide text fields for educators to make two sorts of comments. The first text field would be for a review of the resource, its strengths and weaknesses, and its suitability for inclusion in the DLESE peer-reviewed section. The second text field would be for teaching tips that other educators might find useful in using the resource.

Finally, the questionnaire would ask educators to provide a numerical score, perhaps on a scale of 1-5, of the resource according to three separate criteria:

Pedagogical Effectiveness: (1) (2) (3) (4) (5)
Ease of Use: (1) (2) (3) (4) (5)
Inspirational or motivational: (1) (2) (3) (4) (5)
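
To make the data flow concrete, a minimal sketch of what one completed classroom-use questionnaire might look like as a record follows; the field names and types are hypothetical, not a settled DLESE schema.

```python
# Hypothetical sketch of one completed classroom-use questionnaire record.
# Field names and types are illustrative, not a settled DLESE schema.
from dataclasses import dataclass

@dataclass
class ClassroomUseReview:
    resource_id: str                # which DLESE resource was reviewed
    used_in_classroom: bool         # first question: did you use it in your classroom?
    grade_level: str                # contextual information
    course_title: str
    num_students: int
    class_time_hours: float
    review_text: str                # strengths, weaknesses, suitability for peer review
    teaching_tips: str              # tips to attach to the Teachers' Section
    pedagogical_effectiveness: int  # 1-5
    ease_of_use: int                # 1-5
    inspirational: int              # 1-5
```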

The responses to both of the write-in questions and the numerical scores would be sent, anonymously and automatically, to the creator of the resource. The creator could choose to modify the resource to take the reviewers' comments into account. After revision, the creator could choose either to just wait for better reviews to accrue, or to "wipe the slate clean" and erase all the pre-revision reviews.

Educators' responses to the teaching tips question would be attached to the resource in the library, accessible from the Teachers' Section of the resource. Examples include: suggestions on how to adapt the resource for certain categories of students (e.g. disabled, older or younger than original target); references to print or web-based material pertinent to some aspect of the resource; warnings of difficulties that students have encountered with suggestions about how to overcome the difficulties.

When a given resource has accumulated more than a threshold number of reviews from educators who used the resource in their classroom, and the average numerical scores have risen above a threshold value, the entire packet of reviews would be sent automatically to an editor or library staff person. The threshold numerical score could be a sum of the three scores, or weighted more heavily for pedagogical effectiveness than for the others, or each criterion could have a separate threshold. Maybe low scores from educators who used the resource at a grade level other than that for which it was designed should be thrown out. There is room here for experimentation.
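
As one way of experimenting with the options above, the trigger logic might look roughly like the following sketch; the weights, thresholds, and the optional grade-level exclusion are placeholders for exactly the experimentation the paragraph calls for.

```python
# Illustrative sketch of the trigger that forwards a review packet to an editor.
# Threshold values, weights, and the grade-level exclusion are placeholders.
MIN_CLASSROOM_REVIEWS = 5
WEIGHTS = {"pedagogy": 2.0, "ease_of_use": 1.0, "inspiration": 1.0}  # pedagogy weighted most
SCORE_THRESHOLD = 4.0  # weighted-average score (on the 1-5 scale) that triggers staff review

def ready_for_staff_review(reviews, designed_grade_level=None):
    """reviews: list of dicts with 1-5 scores under 'pedagogy', 'ease_of_use',
    'inspiration', plus 'grade_level'. Returns True when the packet should be
    sent on to an editor or library staff person."""
    if designed_grade_level is not None:
        # Optionally discard scores from classes outside the intended grade level.
        reviews = [r for r in reviews if r.get("grade_level") == designed_grade_level]
    if len(reviews) < MIN_CLASSROOM_REVIEWS:
        return False
    total_weight = sum(WEIGHTS.values())
    per_review = [sum(WEIGHTS[k] * r[k] for k in WEIGHTS) / total_weight for r in reviews]
    return sum(per_review) / len(per_review) >= SCORE_THRESHOLD
```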

The editor or library staff person would review the packet and trigger the more staff-intensive steps in the evaluation process: the evaluations for "scientifically-accurate," "well-documented," and "robustness/sustainability."

For "scientifically-accurate," an editor or librarian would recruit a scientist-reviewer, who would review the material explicitly searching for errors or weaknesses in the science presented. This process would be similar to the familiar peer-review process for scientific manuscripts.

For "well-documented," a library staff person would compare the resource against a well-thought-out, articulated, and publicized list of what kinds of documentation are needed for each resource type. Of particular importance would be ensuring that the appropriate tags or index terms would be included so that the Discovery System could properly find and index the resource.

For "robustness/sustainability," the resource would be run through an industrial-strength quality-assurance process. For a simple resource such as an image collection, this might be as simple as an automated check for broken hyperlinks and a verification that the images display and print successfully on a range of browsers and hardware. For a resource with embedded software, such as a simulation or animation, the robustness check should include a check for compatibility with different hardware and operating system configurations, and an attempt to break the software by exercising it with a wide range of inputs, conditions, and parameters.
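
For the simplest case mentioned above, an automated check for broken hyperlinks might look something like this sketch; the example URL is a placeholder, and a real quality-assurance pass would also cover the rendering, printing, and platform tests described above.

```python
# Minimal sketch of an automated broken-hyperlink check. A real QA pass would
# also cover the browser/hardware rendering and printing tests described above.
import urllib.error
import urllib.request

def find_broken_links(urls, timeout=10):
    """Return the subset of urls that cannot be fetched successfully."""
    broken = []
    for url in urls:
        try:
            request = urllib.request.Request(url, method="HEAD")
            urllib.request.urlopen(request, timeout=timeout)
        except (urllib.error.URLError, ValueError):
            broken.append(url)
    return broken

# Example (placeholder URL): find_broken_links(["https://www.dlese.org/"])
```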

The reviews from these three staff-mediated review steps would be sent to the creator of the resource, who would then polish the resource as needed. An editor or library staff member would check to make sure that the revisions accommodated the changes suggested by the reviews, and would certify that the resource was ready to move into the peer-reviewed system.

Step 3. Recruit Contributions.

I think that the first twenty contributions to enter the DLESE selection procedure can be recruited easily from among the DLESE true-believers through personal contacts. Once the DLESE peer-reviewed collection exceeds one thousand resources, I think it will be self-sustaining. By then, most potential resource creators will be familiar with DLESE, its assets and its requirements, and creators will automatically keep the DLESE criteria in mind as they design and build new resources.

The trick will be to get from the 20th resource to the thousandth resource. It's extra effort to take a teaching resource that works in your own classroom and document it well enough so that it can work in another classroom -- and potential contributors may need some persuading. I'd suggest pausing after the first 20 or so contributions to work the bugs out of the review procedure, especially the uncharted territory of the community-review system. Recruiting contributors beyond those first twenty can be done through hotlink-lists such as the Virtual Geoscience Professor, listservers such as GeoEd, and scientific and educational professional societies. A fertile starting point will be agencies or centers which have outreach as a staffed part of their mission, such as the US Geological Survey, natural history museums, the International Research Institute for Climate Prediction, IRIS, or CIESIN.

 

Issues and Potential Problems.

 

How do we get educators to send in their reviews?

Peer pressure is our strongest weapon here. It needs to become part of the DLESE community ethic that part of the price you pay for having access to all of these wonderful resources is that you give back some of your professional knowledge and experience by completing the DLESE review questionnaire for any resource that you use.

Beyond peer pressure, however, DLESE needs a follow-up nagging procedure. If an educator has accessed the Teachers' Section of a resource and three months later hasn't sent in a review, an email follow-up with a few short questions would be sent: Did you use the following DLESE resource in your classroom? Yes/No.

If No, why not?:

( ) not at the appropriate grade level.
( ) not on the appropriate topic.
( ) decided not to teach that topic.
( ) found a better resource which was __________.
( ) other: _________.

If "Yes", please complete the questionnaire at ________ URL.

The answers from the "No" responders would be useful feedback for refining the Discovery System.
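
A rough sketch of the three-month reminder check described above follows; the 90-day window and the access-record fields are assumptions made purely for illustration.

```python
# Rough sketch of the follow-up ("nagging") check. The 90-day window and the
# access-record fields are illustrative assumptions.
from datetime import datetime, timedelta

REMINDER_DELAY = timedelta(days=90)  # roughly three months

def educators_to_remind(access_log, reviews_received, now=None):
    """access_log: list of (educator_email, resource_id, access_date) tuples
    recording visits to a Teachers' Section. reviews_received: set of
    (educator_email, resource_id) pairs for which a review has arrived.
    Returns the pairs that should receive a follow-up email."""
    now = now or datetime.now()
    return [
        (educator, resource)
        for educator, resource, accessed_on in access_log
        if (educator, resource) not in reviews_received
        and now - accessed_on >= REMINDER_DELAY
    ]
```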

As a last recourse, it could also be possible to cut off access to the Teachers' Section of DLESE resources for educators who have failed to send in a large number of reviews.

 

How do we encourage creators/authors to contribute their materials?

John Butler is organizing an "academic recognition task force" to tackle the problem of making sure that DLESE resource creators are adequately recognized and rewarded by their peers and employers. Some ideas to start with:

  1. "Digital merit badges": a small icon, to be displayed on the resource, which identifies it as part of the DLESE peer-reviewed collection.
  2. A tracking system that allows the creator and the library to assess the characteristics of the community that uses each resource. These summaries (location of user, frequency of access, etc.) will be regularly reported to the resource creator (see the sketch after this list), and can function as electronic citation indices in hiring, promotion, and salary reviews. Note that these are the same data that will need to be collected for the community-review portion of the resource evaluation procedure.
  3. Notification: when a resource is accepted for DLESE, the creator will be asked to provide the names and addresses of individuals who should be notified. Each named individual will receive a letter from DLESE that includes a link to the page on which the criteria and procedures for selection are spelled out.
  4. Citation format: There needs to be an accepted way of citing DLESE resources in a bibliography or list of publications. That citation format should be prominently featured on DLESE resources, in the same way that the desired citation format is spelled out conspicuously on one of the first pages of every volume of Ocean Drilling Program Proceedings.
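
As a minimal illustration of the tracking summaries mentioned in item 2, usage reports for creators might be assembled roughly like this; the access-log fields are assumptions, not an existing DLESE facility.

```python
# Minimal sketch of per-resource usage summaries for reporting to creators.
# The access-log fields are illustrative assumptions.
from collections import Counter

def summarize_usage(access_log):
    """access_log: list of dicts with keys 'resource_id' and 'user_location'.
    Returns per-resource access counts broken down by user location."""
    summaries = {}
    for entry in access_log:
        resource = summaries.setdefault(
            entry["resource_id"], {"total_accesses": 0, "by_location": Counter()}
        )
        resource["total_accesses"] += 1
        resource["by_location"][entry["user_location"]] += 1
    return summaries
```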

 

Does the inclusion of a fraction of the DLESE materials in a non-open "Teacher's Section" within each resource make the collection significantly less valuable for learners who are outside the formal education system?

Such learners might include journalists, congressional staff members, and home-schoolers. Perhaps one might argue that the journalists and congressional staff members aren't going to do the assessment parts of DLESE resources anyway; they will be more interested in the didactic parts of the resources which will be in the sections accessible to everyone.

 

Are there kinds of resources for which a Teachers' Section would not be useful, other than for the evaluation questionnaire?

This system will work only if educators actually go into the Teachers' Section of the resource, and they will only go there if there is interesting stuff to be found there. Are there resources which are so straightforward that there is nothing to put into a Teachers' Section other than the evaluation questionnaire? A collection of images, such as photomicrographs or field photographs, might be such a resource. I can still think of things that might be well-placed in a Teachers' Section, such as a link to a tutorial about pedagogy of visual literacy, or a link to a web-page about how to help color-blind students with microscope work, or bibliographic references to the background material about the field areas in the photographs.

 

Would clever students find a way to get into the "Teachers' Sections" of DLESE and web-publish the answers to DLESE activities, sell them, or pass them around their fraternity?

A careful registration system would be needed to keep track of educators authorized to get into the Teachers' section. I would think that such a registration system would require paper-based registration with letterhead and signature. And then a web-based security system would be required to ensure that only registered educators could get into the Teachers' section. Obviously, password-controlled web resources do exist. Are these systems good enough for this purpose?

 

Do we need to worry about resource creators spinning or loading the peer review system by getting their friends to fill out the questionnaire with high scores?

I hope we don't need to worry about this, but maybe we do. I think it is OK for a creator to call up a colleague at another university and say "I've made this great new teaching resource for this course that you and I both teach; it's in the DLESE peer-review process; could you try it out?" That might accelerate the acceptance process a bit, but I think most colleagues approached in this way would give reasonably honest reviews. I don't think there are very many people in our community who would just zoom into the questionnaire and check off 5, 5, 5 and attest that they had used the resource in their classroom without actually doing so. If the threshold number of educators who have to have used a resource in their classroom is five or more, I think that is more colleagues than most people could co-opt to produce artificially strong reviews. In a way, this is similar to the practice of many authors of journal articles and proposals of suggesting in the cover letter the names of a few possible reviewers to whom the manuscript or proposal might be sent.

Do we have to worry about the merely-good keeping out the truly-excellent?

A resource on a given topic which gains entry into the DLESE peer-reviewed collection early on will have a privileged position relative to later arrivals. An educator who knows little about a given topic, but who is about to teach it for the first time, is more likely to use a resource which already has the DLESE stamp of approval, even if there is a much more suitable resource available in the grey area of materials not yet through the review system.

Both DLESE and the resource creator can help get past this snag through appropriate publicity for new resources. DLESE should maintain an automatic email notification service that tells registered DLESE educators whenever a new resource arrives that falls within a profile of grade level and topic the educator has specified. The resource creator can "prime the pump" and rouse interest in the new resource by informal contacts, by presenting in an education session at AGU, GSA or NSTA, or by publishing a description of the resource in JGE or ______________ (K-12 science teachers' magazine).
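
A sketch of how the matching between newly accepted resources and educator-specified profiles might work follows; the profile fields (grade levels and topics) are placeholders for whatever profile the notification service actually collects.

```python
# Illustrative sketch of matching a newly accepted resource against
# educator-specified notification profiles. Field names are placeholders.

def educators_to_notify(resource, profiles):
    """resource: dict with 'grade_levels' and 'topics' as sets.
    profiles: list of dicts with 'email', 'grade_levels', and 'topics'.
    Returns the addresses whose profile overlaps the new resource."""
    return [
        profile["email"]
        for profile in profiles
        if resource["grade_levels"] & profile["grade_levels"]
        and resource["topics"] & profile["topics"]
    ]
```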

 

The "inspirational/motivational to students" criterion seems problematic.

Some topics are just more exciting to students than others. A glamorous topic, or one with a direct connection to human life and welfare, is always likely to score better on this criterion than even the most clear, lucid, and effective resource for learning about a more mundane topic. Nonetheless, it should be a goal of DLESE to encourage the development of resources which motivate and inspire students, so I would not favor dropping this criterion.

Maybe the threshold for this criterion needs to be set lower, with the goal of simply filtering out the most boring stuff: everything scoring lower than 3 on a scale like this.

  1. A total bore; I couldn't bring myself to finish the activity.
  2. It was OK, but I wouldn't have done it if it wasn't required.
  3. Truly inspiring; motivated me to go out and learn more about the topic on my own.

Up until now, I had been envisioning the "inspirational/motivating" criterion to be checked off by the teacher. But maybe we should consider having the students provide the numerical ratings that go into the "inspirational/motivating" evaluation.

 

At what step in the process should the evaluation for "well-documented" occur?

I have written this sequence of events such that the library staff review of whether the resource is "well-documented" occurs in the latter half of the process, in parallel with the other labor-intensive reviews for scientific accuracy and robustness/sustainability.

Another approach would be to have the library staff review for "well-documented" occur as the first step in the procedure, even before the community review step. An advantage of this sequence would be that the librarians could ensure that the proper index terms or tags are attached so that the resource can be properly pigeon-holed by the Discovery System from the moment it enters the DLESE outer-circle.

 

Could there be fast-tracks or pipelines to the peer-reviewed inner circle?

It could be that DLESE would choose to accept evaluation procedures from other organizations for some of its selection criteria. For example, DLESE could have a list of commercial software quality-assurance companies whose positive reports would be accepted as evidence that the robustness/sustainability criterion had been met. Or DLESE could recognize the internal peer-review procedure of an agency such as the US Geological Survey as taking care of the "scientific accuracy" criterion. Or an educationally-oriented organization such as JESSE or NSTA could develop review procedures that DLESE would accept as sufficient to document "pedagogical effectiveness" or "ease of use."

I personally would be reluctant to see the development of ways to skip over the community-review step of the procedure outlined above, because I think that step is important for cultivating the sense of DLESE as a community and a community-built collection, rather than a hodge-podge of files on some web servers.