DLESE Collections Committee.

Collaborative Project:

To Gather, Document, Filter and Assess the Broad and Deep Collection
of the Digital Library for Earth System Education.

PIs: Kim Kastens, Barbara DeFelice, Christopher DiLeonardo, Sharon Tahirkheli.

 

What is DLESE and Why is it Needed?

The Digital Library for Earth System Education (DLESE) is envisioned as a facility that provides: (a) easy access to high-quality educational materials about the Earth for use by educators and learners at all levels; (b) information, tools, and services that maximize the usefulness of those materials; and (c) a community center that fosters interaction, collaboration, and sharing among Earth science educators and learners. This vision was developed at the Portal to the Future: A Digital Library for Earth System Education workshop held at Coolfont Resort, West Virginia, on August 8-11, 1999. The DLESE Community Plan was developed in response to recommendations from working panels at this workshop, as well as subsequent input from the community via e-communications, "town meetings," and theme sessions at professional society meetings. Reports from the workshop, the Community Plan, and archives of related documents are available at http://www.dlese.org/. From the outset, the central philosophy of DLESE has emphasized community involvement in designing, building, and evaluating the library.

Education about the Earth and environment is well poised to benefit from recent advances in information technology (AGU, 1997; NSF, 98-82). The nature of our science, the historical moment in the evolution of our educational enterprise, and recent developments in digital library technology all favor the success of a digital library for Earth system education.

Work Completed to Date

A Vision of the DLESE Collection: A Consensus of the "Portal to the Future" Workshop:

At the Portal to the Future workshop, a vision for the DLESE collection was articulated by a Collections Committee (chaired by PI Kim Kastens). The Collections Committee report <http://geo_digital_library.ou.edu/workshop/report3.html> has been disseminated for comment via the workshop and DLESE web-sites, town meetings, and other community fora. The vision was well received and has been incorporated into the DLESE Community Plan.

Our Collections plan begins with a Collections Policy, as follows:

"The DLESE shall collect material that facilitate learning about the Earth system. The collection shall favor learning resources that bring the Earth into the classroom and connect the general with the specific, theory with evidence, and the global with the local. The scope of the collection is Earth system education, with particular emphasis on interdisciplinary areas. Materials are selected to support education by giving both educators and students access to tools and resources for instruction and research. Initial priority in the building of the collection shall be given to materials for undergraduate education."

We envision a diverse collection of resource types, including laboratories, virtual field trips, problem sets, images, animations, assessment tools, and large data sets; a fuller enumeration appears in the description of the testbed collection below.

Through discussion of the attributes of exemplary digital educational resources, we have converged on a set of seven selection criteria by which we propose to judge resources for inclusion in DLESE: scientific accuracy, importance/significance, pedagogical effectiveness, thoroughness of documentation ("well-documented"), ease of use, power to inspire or motivate students, and robustness/sustainability.

These criteria are largely congruent with selection criteria being developed for other digital library efforts (e.g. Eibeck, 1996).

There is a tension between wanting to provide resources of every sort, on every topic, suitable for every learner, and wanting to be able to certify that DLESE resources are of the highest quality. To meet both needs, we plan a two-level collection: a broad unreviewed collection of content that is relevant to Earth system education and meets minimum quality and technical standards, and a sub-collection of high-quality teaching and learning materials that have been rigorously evaluated against the selection criteria above.

The rationale for establishing the reviewed sub-collection, from the user’s perspective, is that students and educators are in search of quality teaching and learning materials, but may not have the background to evaluate those materials, nor the time to experiment with numerous alternatives. The reviewed section of DLESE will be a reliable, efficient source for high-quality resources. From the creator’s perspective, the rationale for the reviewed sub-collection is that inclusion in the reviewed section of DLESE can become a recognized stamp of professional approval akin to publication in a peer-reviewed journal.

The rationale for maintaining a broader unreviewed collection is that there are few existing materials which would meet all of our selection criteria if applied rigorously, and yet users are seeking materials on a huge range of topics. DLESE provides added value by being inclusive, while providing powerful search and classification capability. The library will identify these resources as relevant but unreviewed, applying the philosophy of caveat emptor.

On-going efforts of the DLESE Collections Committee:

Following the Portal to the Future workshop, the DLESE Steering Committee established a Collections Subcommittee. PI Kastens is the interim chair of this committee, which has developed an active list-server, an action plan, and a web-site <http://www.ldeo.columbia.edu/DLESE/collections/>. The action plan includes defining what "well-documented" means for each category of materials, drafting guidelines for what "pedagogical effectiveness" means, establishing mechanisms for providing academic career recognition to creators of DLESE materials, and developing scenarios for how users will interact with DLESE. The Collections Committee works with the other DLESE committees (Users, Services, and Technology) and working groups (large data sets and metadata standards). The interests of all of these groups are interrelated, and the products of their work will ultimately inform the form and function of the DLESE collections and services. Funding for the work of the DLESE Collections Committee is requested in a companion proposal submitted to the Geoscience Education solicitation (Marlino/Manduca/Mogk, PIs).

The Geoscience Digital Library collection and metadata effort:

Since September 1999, the Geosciences Directorate of NSF has funded a consortium effort to build a prototype Geoscience Digital Library (GDL; more information at http://www.page.ucar.edu/gdl/). The consortium is headed by the University Corporation for Atmospheric Research (UCAR), assisted by the Incorporated Research Institutions for Seismology (IRIS), the Universities Space Research Association (USRA), the Keck Geology Consortium, the University of Colorado, and the Alexandria Digital Library at the University of California, Santa Barbara. The GDL group has subscribed to the recommendations of the Portal to the Future workshop, and has placed itself under the direction of the DLESE Steering Committee. Thus the GDL has become, in effect, the first-stage implementation of DLESE.

One of the main thrusts of the GDL effort is to build a flexible, powerful, easy-to-use "Discovery System" that will allow users to find what they want efficiently. Educators and learners will be able to search the collection by one or more of the following criteria: topic, geographic place, time, concepts, learning style, educational resource type, or educational level.

To exercise the infrastructure and services they are creating, GDL is building a small but representative testbed collection. This miniature collection will initially contain approximately 100 resources suitable for teaching Earth system science at the high school and undergraduate levels. These resources will span the concepts, themes, and processes addressed in such courses, and will include the following resource types: laboratories, full courses, calculators, images of real phenomena, conceptual images, scientific papers, lesson plans, assessment tools, pedagogy tutorials, professional opportunities, animations of observed data, animations of models or simulations, virtual field trips, problem sets, large datasets, tools to analyze large datasets, student portfolios, geoscience theme pages or portal sites, history of geoscience, and lecture materials. The testbed collection is being used to guide GDL's development of metadata procedures and vocabularies. In the fall of 2000, GDL will begin enlarging the testbed into an "operational" collection, which will be used in a formative way to test the various DLESE functionalities.

The key to an effective Discovery System is to have resources that are accurately and appropriately described by means of bibliographic metadata. Bibliographic metadata provide a structured way to describe the resources in a library using predefined fields that may only be assigned certain values; metadata are used to index learning resources within a local library, and potentially across libraries. A number of metadata standards for digital libraries are emerging. The GDL metadata group has examined the Dublin Core and IMS metadata standards, and has selected IMS for use in DLESE: IMS fields tend to be more specific ("granular") than the Dublin Core fields, IMS provides a better description of educational learning objects, and IMS metadata are more suitable for portable exchange. At present, GDL is manually associating IMS metadata with the first items in the testbed collection, working out how to cluster the IMS data fields into a manageable number of "search buckets" on which users may search, and designing a user interface for the efficient and accurate entry of metadata. More substantive information about the GDL collections and metadata efforts can be found at <http://www.dlese.org/Metadata/index.htm>.
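To make the metadata and search-bucket ideas concrete, the Python sketch below shows how a simplified IMS-style record might be grouped into user-facing search buckets. The field names and bucket groupings are illustrative stand-ins, not the actual IMS binding nor the clusters the GDL metadata group will ultimately choose.

```python
# Illustrative sketch only: field names are simplified stand-ins for the
# IMS metadata binding, and the bucket groupings are hypothetical.

# A metadata record for one learning resource, IMS-style (simplified).
record = {
    "general.title": "Plate Tectonics Virtual Field Trip",
    "general.description": "Interactive tour of mid-ocean ridge features.",
    "educational.learningresourcetype": "virtual field trip",
    "educational.context": "undergraduate",
    "technical.format": "text/html",
    "coverage.spatial": "Mid-Atlantic Ridge",
    "coverage.temporal": "present day",
}

# Clustering many granular fields into a few user-facing "search buckets",
# matching the search criteria listed above (topic, place, time, etc.).
SEARCH_BUCKETS = {
    "topic": ["general.title", "general.description"],
    "resource type": ["educational.learningresourcetype"],
    "educational level": ["educational.context"],
    "place": ["coverage.spatial"],
    "time": ["coverage.temporal"],
}

def bucket_text(record, bucket):
    """Concatenate the field values that feed one search bucket."""
    return " ".join(record.get(field, "") for field in SEARCH_BUCKETS[bucket])

print(bucket_text(record, "topic"))
```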

Proposed Work

We propose to: (1) gather a broad and deep collection of resources relevant to Earth System education, (2) apply appropriate metadata to each resource, (3) implement a "filtering system" to select resources for inclusion in the reviewed collection, and (4) assess the growing collection for balance and completeness.

Task 1: Gather the broad and deep collection:

Our first task will be to cast a wide net in search of resources which are pertinent to Earth System Education. We will solicit volunteered contributions, and examine collections of resources that have had some level of peer review. As the collection begins to take shape, we will be guided by the assessment results (see Task 4 below) to actively recruit contributions to the library that fill voids in content, pedagogy, level of delivery, etc. Our gathering process will be closely coordinated with GDL’s expansion from their 100 item "testbed collection" to their larger "operational" collection. "Ponds" into which we will cast our net include:

  1. Existing search engines and hotlink lists, e.g. The Virtual Geology Professor.

  2. Mission Agencies, e.g. NASA, USGS, EPA, NOAA, NWS, etc.; including inter-agency projects such as Digital Earth and Project GLOBE.

  3. Professional societies with interests in Earth system education (AGI, AGU, GSA, NAGT, NESTA, NSTA, AMS, etc).

  4. DL-I and DL-II recipients, e.g. Alexandria Digital Library (UC Santa Barbara), The Virtual Paleontologist (U-Texas, Austin), Earthscape (Columbia University Press).

  5. Science education journal publications that could be developed as digital resources: Journal of Geoscience Education, Journal of College Science Teaching.

  6. Links to sister disciplines-- chemistry (Amer. Chem. Society), physics (Amer. Physical Society), life sciences, social sciences (Amer. Association of Geographers), mathematics, engineering (National Engineering Education Delivery System).

  7. Education projects funded through NSF: DUE--CCLI (formerly CCD and ILI), ATE, teacher preparation programs; ESIE-- Instructional Materials, Teacher Enhancement programs; GEO--Awards for Geoscience Education (AFGE), REU sites and projects, Geo-sponsored facilities (e.g. UCAR, IRIS), Earth Scope.

  8. Education projects developed at academic institutions or supported by other funding sources.

Each candidate resource will be screened for relevance to Earth system education, and given a quick (5-10 minute) review to rule out items that clearly fail several of our seven selection criteria.

The creators of each relevant resource will be contacted by email and asked whether they want their resource included in DLESE. In the same exchange, we will explain the procedures by which resources are selected for the reviewed collection, and gather feedback to help us structure the review process and the academic career recognition procedures in a way that creators are comfortable with. These early personal contacts will seed the creator community with knowledge about DLESE and NSDL, build credibility for DLESE as a facility responsive to the needs of its various constituencies, and give us yet another source of leads to high-quality resources.

If the creator agrees to inclusion, the resource will be passed on to the metadata process and thence into the unreviewed DLESE collection. We anticipate gathering approximately 5000 resources into the unreviewed collection during the course of the two-year grant.
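For scale (a back-of-the-envelope estimate, not a budget figure): screening roughly 5000 resources at the 5-10 minutes apiece described above amounts to some 400-800 staff-hours of quick review alone over the two years, and the pool of candidates screened will be larger than the pool ultimately accepted.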

Task 2: Associate metadata with the incoming items:

Before an item can enter the DLESE Collection, it must be associated with appropriate metadata to permit it to be found by the Discovery System. The importance of this task cannot be overemphasized; it is only a slight exaggeration to say that metadata is what separates a digital library collection from a hot-link list.

Our team will be responsible for assigning metadata to the incoming items, using tools and procedures developed by the GDL metadata group. Metadata procedures and tools developed for GDL's smaller testbed collection will meet their first independent challenge with our larger collection. Both groups anticipate that this will require modifications: for example, enlarging the controlled vocabulary allowed in particular metadata fields, converting existing metadata for some resources from other formats into the IMS standard, or developing new vocabularies to describe attributes outside the standard IMS schema, such as level of review or classroom testing procedure. The ambitious mandate that the DLESE Community Plan has placed on the Discovery System, coupled with the nature of Earth systems information, poses many metadata challenges, including geospatial referencing; temporal referencing (absolute and relative geologic time, evolution of systems, rates); concepts that transcend disciplinary boundaries (e.g. convection); thematic approaches (e.g. environmental justice); human cognition and learning parameters (e.g. visual learning, quantitative skills, materials for disabled students); and links to sister science and engineering disciplines. Our team includes an expert in cataloging, indexing, and metadata, who will work with the GDL metadata group to resolve these issues.
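As one concrete illustration of the conversion work, the Python sketch below crosswalks a simplified Dublin Core-style record into IMS-style fields and flags values that fall outside a controlled vocabulary. Every field name and vocabulary entry here is a hypothetical simplification, not the official Dublin Core or IMS binding; the real mappings will be worked out with the GDL metadata group.

```python
# Hypothetical crosswalk from simplified Dublin Core-style fields to
# simplified IMS-style fields. Names on both sides are illustrative.
DC_TO_IMS = {
    "title": "general.title",
    "description": "general.description",
    "format": "technical.format",
    "coverage": "coverage.spatial",
    "type": "educational.learningresourcetype",
}

# Example controlled vocabulary for one IMS-style field; enlarging lists
# like this one is among the modifications we anticipate.
RESOURCE_TYPE_VOCAB = {"lesson plan", "problem set", "virtual field trip"}

def crosswalk(dc_record):
    """Convert a Dublin Core-style record into IMS-style fields,
    reporting unmapped fields and out-of-vocabulary values."""
    ims, problems = {}, []
    for dc_field, value in dc_record.items():
        ims_field = DC_TO_IMS.get(dc_field)
        if ims_field is None:
            problems.append(f"no IMS mapping for '{dc_field}'")
            continue
        if (ims_field == "educational.learningresourcetype"
                and value not in RESOURCE_TYPE_VOCAB):
            problems.append(f"'{value}' not in controlled vocabulary")
        ims[ims_field] = value
    return ims, problems

ims, problems = crosswalk({"title": "Tide Simulator", "type": "simulation"})
print(problems)   # ["'simulation' not in controlled vocabulary"]
```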

During the period when our collection is growing, GDL and the University of Colorado will be building a tool ("COMMENT," for COMmunity Metadata ENtry Tool) that provides active support for resource cataloging using the IMS schema. The intent is that eventually resource creators themselves will use this tool to apply metadata to their own resources. As an interim step, the metadata specialists in our group will work with GDL to test and refine COMMENT over the full breadth and depth of the unreviewed DLESE collection. Using COMMENT will facilitate our own metadata task while we serve as testers for the tool.

Task 3: Implement a filtering system to select items for inclusion in the reviewed collection:

The NSDL RFP requires "evidence that the proposed aggregation of resources will support the very best SMET education at all levels–education that is inquiry-driven, active and engaging." Thus an important aspect of our proposed work is to create a filtering system that will find the best materials in the broad and deep collection, and gather them into a reviewed collection of highest quality materials.

Our review procedure is built around the following premises:

  1. In order to be useful, DLESE has to contain a critical mass of resources.

    a. In the growth of DLESE, teachers, classrooms, and students will be abundant resources.
    b. The rate-limiting resources in DLESE's growth will be money and the time of paid librarians, editors, and computer professionals.

  2. We adopt the seven selection criteria recommended by the Portal to the Future workshop: scientific accuracy, importance/significance, pedagogical effectiveness, well-documented, ease of use, power to inspire or motivate students, and robustness/sustainability.

  3. The materials in the DLESE reviewed collection must be classroom-tested. However:

    a. Testimony from the creator of a resource that learning occurred in his or her classroom is insufficient. The presence of an expert in the classroom can bridge gaps or blurry spots in the material that would greatly diminish the resource's effectiveness in the hands of another teacher.
    b. It is not realistic to pay for professional evaluators to go into classrooms to evaluate whether learning has occurred for every potential DLESE resource.
    c. Experienced educators can tell whether or not their own students are learning effectively from an educational resource. Their experience and intuition, in the aggregate, are valuable and valid indicators of pedagogical effectiveness.
    d. It is easier to answer "Did your students learn?" than "Do you think students would learn?"

With these premises in mind, we have designed what we call a "community review" filtering system. This system taps into the power of the World Wide Web and the strength of numbers of the DLESE community so as to minimize the money and staff time expended per item evaluated, and maximize the quality of the resources in the reviewed collection and the sense of community ownership of the library. A summary of the process is provided in Table 1.

Table 1: Overview of the review process

Step | Portal to the Future selection criterion | How to implement
-----|-------------------------------------------|------------------
1    | Well-documented | Review by library staff
2    | Importance/significance | More than N (threshold number to be chosen) educators from the DLESE community tried this resource in their classroom
3    | Pedagogical effectiveness; ease of use for students and faculty; inspirational or motivational for students | On-line questionnaire filled out by educators who used the resource in their classroom
4    | Accuracy, as evaluated by scientists | Invited review by a scientist, recruited by an editor
5    | Robustness/sustainability | QA testing (functionality and configuration testing)

Step 1: The first step in the review procedure will occur in parallel with the metadata tagging step. As the library staff person reviews the resource to attach the metadata, he or she will check that the resource is "well-documented." The meaning of "well-documented" for each resource type is still being defined by the DLESE Collections Committee, but we anticipate that for assessment tools the criteria will include an answer key or scoring rubric; for data, documentation of how, when, and where the data were acquired; for field trips, maps; and so on. For all kinds of materials, the documentation will include appropriate literature references and other attributions, and, of course, metadata. If the resource is incompletely documented, the creator will be invited to upgrade it.

Step 2: As each user accesses a resource, DLESE will determine whether he or she is an educator. Ideally, this will occur through an automated user-authentication and user-authorization infrastructure such as that proposed in Kate Wittenberg's companion proposal to the NSDL Core Integration Track. If such a system is not funded or not adopted by DLESE, we will simply ask, as the first question on the community review survey: "Are you an educator?" If the answer is "yes," the user will be asked to fill out an evaluation questionnaire (figure 1). The questionnaire will first ask whether the respondent has used the resource in his or her classroom. The "importance/significance" criterion will be judged to have been passed when a certain number of members of the community (5? 10?) attest that they have used the resource in their classroom. Even if they did not like the resource, the very fact that they tried it will be considered de facto evidence that the content, understanding, or skill addressed by the resource is important to the community.

For respondents who did not use the resource in their classroom, the questionnaire is simple (figure 1, left) and seeks merely to determine why the resource was not used. The information from educators who looked at the resource but decided not to use it will be forwarded automatically, but anonymously, to the resource creator. In addition, this information will be aggregated and forwarded to the keepers of the Discovery System for use in refining the search procedures, to cut down on the number of not-useful resources retrieved by DLESE educators.

Step 3: For respondents who did use the resource in their classroom, the questionnaire digs more deeply (figure 1, right). Educators will be asked to provide three numerical ratings, evaluating the resource on the criteria of pedagogical effectiveness, ease of use for faculty and students, and power to motivate or inspire students. Separate text fields will be provided for an open-ended review of the resource, and for teaching tips that other educators might find useful. Finally, the questionnaire will gather contextual information about the setting in which the resource was used.

The responses to the write-in questions and the numerical scores will be sent, anonymously and automatically, to the creator of the resource. The creator may choose to modify the resource to take the reviewers' comments into account. After revision, the creator may either wait for better reviews to accrue, or "wipe the slate clean" and erase all the pre-revision reviews.

Educators’ responses to the teaching tips question will be posted with the resource. Examples include: suggestions on how to adapt the resource for certain categories of students (e.g. disabled, older or younger than original target); references to print or web-based material pertinent to some aspect of the resource; warnings of difficulties that students have encountered.

When a given resource has accumulated more than a threshold number of reviews from educators who have used it in their classroom, and the numerical scores have risen above a threshold value, the entire packet of reviews will be sent automatically to the "editor" of the reviewed collection (Kastens). The threshold numerical score could be a simple sum of the three ratings, a sum weighted more heavily toward pedagogical effectiveness, or a separate threshold for each criterion. We will experiment with different threshold values.
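A minimal sketch of this trigger logic, in Python, with placeholder thresholds and one possible weighting scheme, is shown below; the actual numbers will be set by the experimentation just described.

```python
# Sketch of the trigger logic described above. All thresholds and weights
# are placeholders to be tuned by experiment, not decided values.

MIN_CLASSROOM_REVIEWS = 5          # the "more than N educators" threshold
WEIGHTS = {"pedagogy": 2.0,        # weighted more heavily, as one option
           "ease_of_use": 1.0,
           "motivation": 1.0}
SCORE_THRESHOLD = 14.0             # placeholder weighted-sum cutoff

def ready_for_editor(reviews):
    """Decide whether a resource's review packet should be forwarded to
    the editor of the reviewed collection. `reviews` is a list of dicts
    holding the three 1-5 ratings from educators who used the resource
    in their classroom."""
    if len(reviews) < MIN_CLASSROOM_REVIEWS:
        return False
    # Average weighted sum across all classroom reviews.
    total = sum(sum(WEIGHTS[k] * r[k] for k in WEIGHTS) for r in reviews)
    return total / len(reviews) >= SCORE_THRESHOLD

reviews = [{"pedagogy": 4, "ease_of_use": 5, "motivation": 4}] * 6
print(ready_for_editor(reviews))   # True with these placeholder numbers
```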

The editor will review the packet and trigger the more staff-intensive steps in the evaluation process: the evaluations for "scientifically-accurate," and "robustness/sustainability."

Figure 1: Evaluation questionnaire for educators, illustrating the plan for information flow in the "Community Review" system.

Step 4: For "scientifically-accurate," the editor will recruit a scientist-reviewer, who will review the resource, explicitly searching for errors or weaknesses in the science presented. This process would be similar to the familiar peer-review process for scientific manuscripts.

Step 5: For "robustness/sustainability," the resource will be run through a quality-assurance process. For a simple resource such as an image collection, this will be an automated check for broken hyperlinks, and a verification that the images display and print successfully on a range of browsers and hardware. For a resource with embedded software, such as a simulation or animation, the robustness check will include a check for compatibility with different hardware and operating system configurations, and an attempt to break the software by exercising it with a wide range of inputs, conditions and parameters. For the simpler resources, we will do the QA testing in-house; for the most complicated resources, we will contract with a commercial vendor of QA services.

The reviews and bug list from these staff-mediated review steps will be sent to the creator of the resource, who will then modify the resource as needed. A staff member will check that the revisions address the changes suggested by the reviews and tests, and certify that the resource is ready to move into the reviewed collection.

The community review mechanism will remain in place even after an item has moved into the reviewed collection. Teaching tips sent in by educator-evaluators will continue to accrue and be posted with the resource. The numerical scores on the pedagogical effectiveness, ease of use, and inspirational/motivational criteria for items in the reviewed collection will be used for two additional purposes. First, those resources that continue to receive the highest scores on the community review evaluation, and that are also accessed by a significant number of users, will be recognized with "Best of DLESE" awards. These awards will be given annually, in a number of categories by topic, resource type, and level. Awards are one of the mechanisms by which the DLESE Academic Recognition Task Force <http://www.uh.edu/~jbutler/anon/artf.html> plans to encourage appropriate academic career recognition for the creators of excellent educational resources. At the other end of the spectrum, those resources that begin to receive subpar numerical scores in the community review evaluation will be re-examined. If appropriate, the creator will be offered an opportunity to update the resource; if that does not happen, the resource may be pruned from the reviewed collection.

Task 4: Assess the growing Collections

Our final task will be to assess the scope and balance of the collection. Collection assessment provides a thorough and systematic comparison between an actual collection and a desired collection. With a library built from contributed resources, there is a danger that the library could grow in a lopsided way. For example, it could become rich in materials based on data from a well-organized and well-funded government agency, while remaining poor in resources pertaining to a more diffuse and less organized discipline; or it could grow rich in materials for undergraduates while remaining poor in materials for K-5. The goal of collection assessment is to detect such imbalances and to guide the recruitment of resources that fill the identified gaps (see Task 1).

Collection assessment for a new library occurs in two phases, first as part of the initial screening of the early collection based on anticipated use, and then as part of an on-going assessment, as patterns of actual use emerge. We propose an initial and an on-going assessment.

The assessment effort will proceed in five steps:

Step 1. Develop assessment procedures and guidelines: The procedures must be usable for both the reviewed and unreviewed DLESE collections. We must chart the subject areas to be covered, the levels to be included, and other key characteristics of interest to the DLESE community. We will work closely with the GDL metadata group to ensure that the characteristics we wish to assess are captured in the metadata tags. Useful characteristics to assess include topic, resource type, grade level, originating body, and review status (see Table 2).

Step 2: Perform a methodical assessment of the initial collection. This effort will test the assessment procedures on the GDL testbed collection and on the earliest version of the broad and deep collection. This initial assessment will be "collection-centered," comparing the existing DLESE collection to the known universe of possible materials, to the written collections policy, and to the type and range of materials identified as most important by DLESE planners. The outcome of this step will be an indication of depth in each area of the initial collection. A simple count of items is useful for reports, but it is not the most important measure of depth; a more useful approach is to assign a relative, qualitative "depth of collection" rating, based on knowledge of existing collections. An example of the outcome of this step is given in Table 2.

 

Table 2: Example of collection assessment, with actions needed

TOPICS | Lab. Activity | Field Trips | Images | Problem Sets | Data Sets | ...etc., more resource types...
-------|---------------|-------------|--------|--------------|-----------|--------------------------------
Conservation biology; endangered species; biodiversity | 3; 14-16; E; 1 R; 2 N. Need to add grade 5-12 materials. | 0. Area of serious lack in collection; good material available at x institution. | 12; 5-5-8; 7-12-16; E; N. Need to encourage review of materials. | 3; 13-16; E; N. Need to add 5-12; need to encourage review of materials. | 5; 14-16; R; N. Many more sets available; links to geospatial data sets needed. | ...
Ecosystem analysis; ecological modeling; ecosystem dynamics | 3; 5-8; C; N. Need to add 8-12; encourage review of materials. | etc., etc. | | | |
...etc., more topics... | | | | | |

Cell format, in order: number of items; grade level (a grade number or range of grades); originating body (E = educational, C = commercial, R = research, G = government); number reviewed (R) or not reviewed (N); followed by the depth-of-collection evaluation and actions needed as a free-form note.
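To show how the compact cell notation of Table 2 could be captured in an assessment database, the Python sketch below parses entries of the form given in the legend. The notation is still informal, so this parser is illustrative rather than definitive.

```python
# Illustrative parser for the Table 2 cell notation described above.
ORIGINS = {"E": "educational", "C": "commercial",
           "R": "research", "G": "government"}

def parse_cell(cell):
    """Parse a Table 2 entry such as '3; 14-16; E; 1 R; 2 N'."""
    parts = [p.strip() for p in cell.split(";")]
    entry = {"items": int(parts[0]), "reviewed": 0, "unreviewed": 0}
    if len(parts) == 1:                 # bare count, e.g. "0"
        return entry
    entry["grades"] = parts[1]
    entry["origin"] = ORIGINS.get(parts[2], parts[2])
    for part in parts[3:]:              # review-status fields, "1 R" or "N"
        tokens = part.split()
        count = int(tokens[0]) if len(tokens) == 2 else entry["items"]
        key = "reviewed" if tokens[-1] == "R" else "unreviewed"
        entry[key] = count
    return entry

print(parse_cell("3; 14-16; E; 1 R; 2 N"))
# {'items': 3, 'reviewed': 1, 'unreviewed': 2,
#  'grades': '14-16', 'origin': 'educational'}
```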

Step 3: Develop and implement on-going assessment mechanisms. Once the library is open for business, assessment becomes both collection-centered and user-centered, incorporating feedback from users about what they think should be in the library. At present, we envision three useful feedback pathways. First, the Discovery System will capture data about what requests have been made, including requests for which the Discovery System found no matching resources. These data will be aggregated and forwarded automatically to the Collections Assessment team, where they will be analyzed to determine which subjects or titles are most used and most requested. Second, the user feedback portion of DLESE will have a feedback link specifically for suggesting resources, or kinds of resources, that users would like to see in the library. Third, at user meetings sponsored by the DLESE User Subcommittee, users will be asked to suggest additional resources or kinds of resources they would like to see added. Information from the feedback link and from the user group meetings will be fed into the on-going collections assessment effort. To increase the efficiency with which we can examine and document the scope and balance of the existing collection, we will develop a librarian-friendly tool, a Discovery System "autopilot," which will automatically exercise the Discovery System and report how many items exist in the collection in each category, or combination of categories, of interest.
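A minimal sketch of the autopilot idea follows. The `search` function stands in for whatever query interface the Discovery System ultimately exposes, and the category vocabularies are small illustrative samples, not the full controlled lists.

```python
# Sketch of the "Discovery System autopilot". The `search` callable and
# the category vocabularies below are illustrative assumptions.

from itertools import product

TOPICS = ["plate tectonics", "climate", "biodiversity"]
RESOURCE_TYPES = ["lab activity", "field trip", "data set"]
LEVELS = ["5-8", "9-12", "undergraduate"]

def autopilot(search):
    """Exercise the Discovery System over every category combination,
    report the item count for each, and flag empty cells for the
    Collections Assessment team."""
    report = {}
    for topic, rtype, level in product(TOPICS, RESOURCE_TYPES, LEVELS):
        count = len(search(topic=topic, resource_type=rtype, level=level))
        report[(topic, rtype, level)] = count
    gaps = [combo for combo, n in report.items() if n == 0]
    return report, gaps

# Stub search for demonstration; the real call would query the library.
def fake_search(**criteria):
    return ["item"] if criteria["topic"] == "climate" else []

report, gaps = autopilot(fake_search)
print(f"{len(gaps)} empty cells out of {len(report)} combinations")
```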

Step 4. Report to the community and funding agencies. This report will describe the balance and quality of the collections, offer recommendations about what areas need to be better developed, and suggest changes in the collections policy growing out of the on-going assessment.

Step 5: Provide documentation of the assessment tools, techniques and guidelines for doing on-going assessments. Publish the findings of the assessment project. These actions will help ensure that collections assessment can be carried out routinely by others when the library is fully operational.

Although applied here to collections for educators in the earth sciences, our assessment tools and techniques can be extended to the even-more-diverse collection of the NSDL. No such model exists from other digital library projects; projects such as the Alexandria Digital Library and the Berkeley Digital Environmental Library have more narrowly focused collections.

Management Issues and Sustainability

Within our group, Kastens has responsibility for overall coordination of the project. Each PI will take responsibility for one of the four tasks: DiLeonardo for Task 1, Tahirkheli for Task 2, Kastens for Task 3, and DeFelice for Task 4. Additional discussion of the division of labor is included in the Budget Discussion. To facilitate interactions between tasks, and between our efforts and the DPG effort, we have scheduled and budgeted twice-yearly PI meetings in Boulder.

Developing a structure for the long-term stability of DLESE is the responsibility of the DLESE Steering Committee. This must include a financially sustainable business plan, a long-term governance model, and a policy addressing the intellectual property rights of users and contributors. Our collections-building group will be an integral part of this planning effort, with DiLeonardo and DeFelice serving on the Steering Committee, and Kastens attending Steering Committee meetings as a committee chair. In anticipation that our procedures for resource review and collections assessment will need to be transferred to the long-term DLESE operational arm at the end of this award period, we have designed our work plan to include frequent interactions with the DPG, and abundant documentation of our tools and techniques.

Concluding Notes

DLESE is envisioned as a facility built by and for, and with deep and broad input from, the community. The Collections plan embodied in this proposal is faithful to that vision.

This proposal will serve the larger NSDL community by building a deep and broad collection that spans an important domain of science education. In addition, our project will develop, test and document new tools and procedures with strong potential for use in other NSDL collections, including: