At the end of the "scientific accuracy" conversation, on March 14, 2002, Kim Kastens wrote:
I personally don't think that anonymous contributions should ordinarily be included in the Reviewed Collection. I suppose there might be some exceptions, which could be dealt with on an individual basis, but generally speaking if no one is willing to stand behind the resource, then I think its credibility is in question.
On March 13, 2002, Robert Stewart replied:
Kim and Colleagues,
I often find very good sites, especially data and government sites, with material we wish to have in the collection, yet the person responsible for the site is not identified. For example, the NASA/JPL/PODAAC/Kids site at http://podaac.jpl.nasa.gov/kids/ does not identify the author. Neither does the NASA/JPL/PODAAC/Navoceano site for the well-documented MCSST (multichannel sea-surface temperature) Pathfinder data, which has much useful data: http://podaac.jpl.nasa.gov/navoceano_mcsst/ . If I dig through the documentation for the data set, I eventually find a PI. But there is no author.
So I think we ought to consider the issue. I suggest the scientific reviewer(s) contact the site and ask that the author be identified. But what about team-produced documents such as data sets, educational CD-ROMs, etc.? Is it sufficient to identify the team? Or is it sufficient to identify the laboratory that supported the team?
On March 18, 2002, Charlotte Schreiber replied:
'Science' without the good name of the person who is responsible is rather like a top government 'leak'. Do you believe it is sustainable, documented science? I think not. At the other extreme, I do not feel that data, such as ODP group core and site descriptions and reports, should be junked because one responsible person is not named: the group membership is named, and the data have already been cross-argued by the working group (named in the ODP report).
On March 31, 2002, Kim Kastens wrote:
No reputable scientific journal would publish a paper without the authors' names. I think we should hold to the same standard.
I do like Bob's suggestion that for a resource which is really good but whose author is not identified, the editors for one or more of the various pathways into the Reviewed Collection should consider contacting the site and asking that the author/creator be identified.
In addition to the credibility of the resource issue, there is an entirely separate reason why I think the Reviewed Collection should insist that the author(s)/creator(s) be identified, and it has to do with appropriately rewarding/crediting the creators:
Behind every great educational resource, there is a creative mind, or perhaps a pair or small group of creative minds. One of the things that DLESE is supposed to be accomplishing is raising the credibility and respect accorded to the creators of great educational resources. Submerging the identities of these creative individuals behind an anonymous logo helps to perpetuate a situation in which these people are not recognized or rewarded.
With respect to the anonymous but excellent data set, since DLESE is supposed to be a collection of stuff for education, and since "pedagogically effective" is another of the selection criteria, I wouldn't imagine that very many pure data sets would be considered for the Reviewed Collection--although learning activities that use data would be extremely appropriate.
On Feb 1, 2003, Kim Kastens wrote:
Dear DLESE Collections folks,
As you may recall, last spring I launched a listserver discussion of DLESE's seven selection criteria for resources being considered for the DLESE Reviewed Collection.
The goal of the discussion was to flesh out these criteria, and to develop a "best practices" document that could guide both resource developers and the groups working to develop review pathways into the Reviewed Collection.
Last year's discussion covered "Scientific Accuracy" pretty well, and began on "Well-documented."
I'd like to pick up the discussion again, starting with "well-documented."
Following is a sacrificial draft, for your editing pleasure, suggesting what a "well-documented" DLESE educational resource might be expected to contain, both in the metadata and in the resource itself. This is a tough one, because of the range of kinds of resources in DLESE.
Please send comments to the Collections listserver email@example.com.
(The "Well-documented" criterion covers both information in the DLESE metadata catalog and information in the resource itself.)
2a. Metadata (in the DLESE catalog):
All of the DLESE "Required" metadata fields must be complete in the DLESE metadata catalog: Title, Description, Subject, URL, Technical Information, Resource Type, Audience, Copyright, Name and contact information for the resource creator, publisher and/or contact person.
Best practice is that the "Robust" metadata fields should also be complete in the DLESE metadata catalog. As of February 2003, these include: spatial and temporal coverage, science standards, geography standards, and relationship between resources.
- For further information see the DLESE metadata website: http://www.dlese.org/Metadata/main/levels.htm
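The required-fields rule above amounts to a simple completeness check on each catalog record. The following sketch illustrates the idea; the field names are taken from the list above, but the record format and function are invented for illustration and are not the actual DLESE/ADN schema or tooling.

```python
# Hypothetical sketch of a required-metadata completeness check.
# Field names follow the "Required" list above; the dict-based record
# format is an assumption for illustration, not the real DLESE catalog.

REQUIRED_FIELDS = [
    "title", "description", "subject", "url", "technical_information",
    "resource_type", "audience", "copyright", "contact",
]

def missing_required_fields(record: dict) -> list[str]:
    """Return the names of required fields that are absent or empty."""
    return [f for f in REQUIRED_FIELDS if not record.get(f)]

# Example record (all values invented) with no creator/contact identified,
# i.e. the anonymous-resource case discussed in this thread.
record = {
    "title": "Sea-Surface Temperature Explorer",
    "description": "A classroom activity using MCSST Pathfinder data.",
    "subject": "Oceanography",
    "url": "http://example.edu/sst-activity",
    "technical_information": "Web browser with image support",
    "resource_type": "Learning activity",
    "audience": "Grades 9-12",
    "copyright": "Example University, 2003",
    # "contact" deliberately omitted
}

print(missing_required_fields(record))  # -> ['contact']
```

A record like this would fail the "well-documented" bar until a creator or contact person is named, which is exactly the sticking point raised for the anonymous government and data sites above.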
2b: Technical documentation (in the resource):
- Hardware needed to use this resource
- Software, including plug-ins, needed to use this resource; links or contact information to acquire such software
- Any known hardware or software incompatibilities
- (for software or complex resources) A clear and comprehensive user's guide.
2c: Practical documentation (in the resource):
- Materials list; links or contact information to acquire uncommon materials
- Safety precautions
- (for labs, activities, etc.) Estimate of time required
2d: Pedagogical Documentation (in the resource):
- Skills and understandings needed prior to doing activity
- Learning objectives
- Commonly observed mistakes or misconceptions
- Notes on instructional strategies
2e: Scientific Documentation (in the resource):
- Reference list or Bibliography
- Links to related websites
2f: Data Documentation (for resources containing data) (in the resource):
- Basic information about the instrument which collected the data and how it works (or links to such information)
- (for individual data sets) When and where the data were collected
- (for derived data products or merged data sets) Link to data archive or other source of description about how data were processed or merged
On Feb 6, Bob Downs wrote:
We might consider including the reviews or evaluations of the resource that have been conducted. The reviews might fit within category 2d: Pedagogical Documentation (in the resource).
On Feb 6, 2003, Katy Ginger wrote:
Just a couple of comments on your document. Please add 'cost' as part of the required metadata. Please note that the part of Audience that is actually required is a grade range.
If a collection builder does not catalog resources at the DLESE website (i.e., they download and install the DCS, or are making metadata records from information they have in a database), they will need to make sure these additional required metadata fields are completed.
I leave it to you if you want to add these additional required metadata fields to this document. But I thought you should know about them. These additional required metadata fields are included in the new DLESE Scope Statement (not yet public). The additional fields come into play when a resource is not cataloged on the DLESE website.
Robust metadata is also more extensive than you list, because of new fields that the ADN metadata framework will have, such as 'tool for', 'beneficiary', 'teaching method', and some others.
On Feb 8, 2003, Kim Kastens wrote:
Hi Katy et al,
Definitely, we should add cost.
I'm inclined to leave out the required metadata fields that are invisible to the run-of-the-mill resource creator cataloging through the DLESE cataloger. Perhaps we could add a note that if you are not cataloging on the DLESE cataloger, there are additional required metadata fields, and link to the DLESE metadata site.
Are the additional "robust" fields active yet? I.e., can they be entered via the DLESE cataloger?
On Feb 8, 2003, Kim Kastens wrote:
I think that we should add "Age or educational level for which resource is recommended" into the "Pedagogical Documentation."
On Feb 13, 2003, Floyd McCoy wrote:
I do not see mention of the date of acquisition, an important note, as is the date of resource development (the two dates could be widely separated).
Concur with Robt. Downs on reviews. The model used by Amazon for reviewing books seems appropriate, with either quick or involved comments.
A date of the last review by DLESE would be valuable as well. Links, and the sites linked, seem to go down so often and quickly that such a note would alert one to just when the resource was last checked. Certainly some periodic check of resources might be called for in the absence of reviews.
On February 25, 2003, Kim Kastens wrote:
Dear DLESE Collections Colleagues,
I had an opportunity to try out some of the results of our discussion of the DLESE Reviewed Collection selection criteria on a workshop of interested, knowledgeable people: the NAGT/DLESE-sponsored Cutting Edge workshop on "Design Principles for Creating Effective Web-based Learning Resources in the Geosciences" in Ann Arbor last week.
With respect to our "well-documented" criterion, the workshop participants suggested the following:
Name and contact info should be in the resource itself, not just in the metadata; this is useful to the potential user.
Under "Scientific documentation":
Technical terms should be defined
In addition to commenting on the positive features that the selection criteria should be looking for, the workshop participants suggested two things that it would be best practice to avoid.
On 26 Feb 2003, Holly Devaul wrote:
A note on Floyd's points:
On 2 March 2003, Kim Kastens wrote:
Holly Devaul and Floyd McCoy have raised the point that best practice for DLESE Reviewed Collection items should be to include DATES.
I think that the date that the resource was last updated, or dates on which each segment of the resource was last updated, would be the crucial date, rather than the date that the resource was accessioned into DLESE. The revision date can help the potential user know whether the resource is being actively maintained or is being allowed to grow stale.