DLESE Collections Committee

How to Build the DLESE Reviewed Collection.

Archive of discussion of DLESE "Scientific Accuracy" Criterion.

March 7, 2002: Kim Kastens wrote:

Dear Collections Listserver & Collections Committee,

Way back at the Coolfont meeting, in summer of 1999, the proto-Collections Committee proposed a set of seven criteria by which educational resources should be selected for the DLESE Reviewed Collection. These criteria were discussed and approved at several levels in the DLESE advisory structure, and are now incorporated into various policy documents.

The seven selection criteria are:

We are now getting to a stage where several groups are actually trying to implement pathways into the Reviewed Collection, and are finding that they need more specific guidance about what exactly these selection criteria mean in practice.

I would like to catalyze a discussion on this listserver of each of these seven criteria, one by one. From this discussion, the Collections Committee will aspire to create a best-practices document, providing advice to resource creators, resource reviewers, and editors/gatekeepers on what is needed for a resource to be eligible for the DLESE Reviewed Collection.

Let's start with "Scientific Accuracy." Here is a sacrificial draft for your editing pleasure.

Kim


1. SCIENTIFIC ACCURACY

  1. A DLESE Reviewed Collection resource should have no known errors of fact.

  2. When using a DLESE Reviewed Collection resource, the user should be able to tell what is observation/fact and what is interpretation/opinion/hypothesis.

  3. DLESE best practice is that the review for scientific accuracy should be made by a scientist who is a specialist in this field.


March 8, 2002: Michelle Hall-Wallace wrote:

Kim,

Two thoughts cross my mind when I think about the Scientific Accuracy criterion.

Is it possible for a document to have no known errors but to actually be inaccurate because information was omitted? How will that case be handled?

The statement that the user should be able to tell fact from interpretation implies that we would have some review by the end user, presumably students, to verify this is true. That is a scary thought in itself, but it becomes even scarier when you realize that most of our undergraduates struggle with this issue all the time.

My two cents - and not worth much more,

Michelle


March 11, 2002: Cheryl Peach replied:

I agree with Michelle that undergraduates struggle with the distinction between observation and interpretation. However, I think once the best practices are articulated, the creator of a reviewed DLESE resource will (in principle) pay close attention to making the difference clear. An end-user review is indeed a scary prospect, although it would clearly be valuable. We will certainly face the issue of student/user understanding when asking about pedagogical effectiveness. Should we address this specific issue in that part of the best-practices document?

In addition to the user being able to distinguish between observation/fact and interpretation/opinion/hypothesis, I think they should be able to identify any assumptions that have been made in generating an interpretation. Maybe this adds a level of complexity that you didn't intend to incorporate into best practices, but I have found that students often don't understand that there are assumptions that underlie many scientific assertions.

My 1 1/2 cents worth!

Cheryl


March 9, 2002: Robert Stewart wrote:

Dear Colleagues,

The draft text is a good start, but needs work. First, most web pages or sites contain a great deal of material, so it will be impossible to state that there are no known errors of fact. I suggest we change the text to read "the site was written or reviewed by a scientist who is a specialist in the field discussed at the site." Further criteria (see http://oceanworld.tamu.edu/ocean401/Class%20discussions/ocng401_discussion_1.html) are:

  1. Does the material at the site describe the author's credentials or experience?
    Is the writer anonymous?
  2. Who uses the material?

    Is it cited by others?
    Is it linked from trustworthy sites?
    Has the site won awards?

  3. Has the material been reviewed by peers?

    Journal articles on the web from respected journals are peer reviewed.
    Some data sets and information may have been described in published articles cited by the site.

  4. Who hosts the page?

    College, university, government, grammar school, commercial, or personal website?
    Does the hosting organization have strong opinions? Greenpeace and the US National Marine Fisheries Service may have differing viewpoints.

  5. When was the web page last updated?

    Some material may be many years old.

Regards, Bob Stewart


March 11, 2002: Priscilla Minotti wrote:

Problems do arise when a specialist in one particular field is not a specialist on a region or area. Thus, although the review may seem accurate, it could be biased by limited knowledge of other places, landforms, ecosystems, etc. This seems particularly common for global and continental vegetation databases.

Priscilla Minotti

DRNF-UMSEF-Proyecto Inventario Nacional de Bosques Nativos - Argentina
Universidad CAECE - Dpto. Biologia


March 14, 2002: Kim Kastens wrote:

Robert Stewart suggested as a point to consider under Scientific Accuracy:

>1. Does the material at the site describe the author's credentials or experience?
> Is the writer anonymous?

We have another criterion called "Well documented."

The discussions I have had about the well-documented criterion so far have included "Scientific Documentation", which would cover the credentials of the authors. I think that this point could be handled under the "well-documented" criterion. I guess I'll make that the next criterion for listserver discussion.

Concerning Bob's question about anonymity, I personally don't think that anonymous contributions should ordinarily be included in the Reviewed Collection. I suppose there might be some exceptions, which could be dealt with on an individual basis, but generally speaking if no one is willing to stand behind the resource, then I think its credibility is in question.

Regards,

Kim

[note, this discussion of anonymous contributions then moved over into the discussion of the "well-documented" criterion.]


March 31, 2002: Kim Kastens wrote:

Since our discussion of the 7 selection criteria for the DLESE reviewed collection seems to be segueing from "scientific accuracy" into the fringes of "well-documented," I'm going to try to wrap up the "scientific accuracy" discussion at least temporarily. Remember that we will have another pass through all of these criteria together after we've passed through all seven individually.

My initial sacrificial draft said:

1. SCIENTIFIC ACCURACY

  1. A DLESE Reviewed Collection resource should have no known errors of fact.

  2. When using a DLESE Reviewed Collection resource, the user should be able to tell what is observation/fact and what is interpretation/opinion/hypothesis.

  3. DLESE best practice is that the review for scientific accuracy should be made by a scientist who is a specialist in this field.

Comments received included: the concern that a resource with no outright errors of fact could still be inaccurate because information was omitted; the observation that users, especially undergraduates, struggle to distinguish observation from interpretation; the suggestion that resources should also identify the assumptions underlying an interpretation; and the caution that a specialist in a field may not be a specialist in a particular region.

What do you think of this as a possible revision, taking into account previous comments:

1. SCIENTIFIC ACCURACY

  1. DLESE best practice is that the review for scientific accuracy should be made by a scientist who is a specialist in this field.

  2. Any errors of fact identified in the science review process must be corrected before the resource can be admitted to the Reviewed Collection.

  3. To help learners and teachers understand the reliability of the various parts of the resource, the resource should distinguish between what is observation/fact and what is interpretation/hypothesis, and should identify assumptions.