DLESE Collections Committee

Academic/Career Recognition Task Force Survey.

John C. Butler.

 

Response Rate.

              Received Survey   Accessed Web Page   Responded
Users         ~1,000            175                 74 (43%)
Contributors  ~200              133                 48 (36%)
Chairs        ~200              90                  35 (39%)
Total         ~1,400            398                 157 (39%)

Each of the nearly 1,400 individuals (identified at http://www.uh.edu/~jbutler/anon/artf.html) received an e-mail message with a brief description of the Digital Library for Earth System Education and the URL of one of the three survey instruments. Although some individuals may have accessed the survey web page more than once, it seems reasonable to judge the response rate as the percentage of those who actually looked at the survey and returned a response. Approximately 40% of the accesses to the survey web pages resulted in a submitted response. Approximately 80% of the responses arrived within 36 hours of notification. This may be a characteristic of e-mail surveys: unless the recipient makes a hard copy or a "yellow sticky", the announcement of opportunity is quickly buried in the inbox.

 

Availability of Information.

A detailed summary of each survey and all of the submitted comments are available upon request (from the ARTF page listed above). I believe that the real value lies in the comments, although a few interesting generalizations can be drawn from the responses.

I believe that DLESE has "kept the faith" with the geosciences/Earth system sciences communities in that a good-faith effort has been made to solicit opinions and give the community a chance to respond in written form. More than half of the respondents elected to provide written commentary.

 

Generalizations.

About 90% of all respondents to all three surveys agreed that there is a need for some form of peer review process for materials housed in a digital library. About 75% of the users and contributors said that they would consider serving as a reviewer of material that might be housed in a digital library. These results may not be surprising, but they are a healthy measure of where the community stands.

 

Users.

The potential users of DLESE are looking for a wide range of topical resources that have been evaluated by a peer review process. They place a high value on an assessment of both accuracy and pedagogical effectiveness. To a lesser degree, the potential users value ease of use and whether the resource is inspirational or motivational.

 

Contributors.

Most of the contributors (and chairs as well) teach at a university in which the expectations for research, teaching, and service are roughly 50%-35%-15%.

Slightly less than half of the contributors would encourage their non-tenured colleagues to devote time to producing learning resources, but slightly more than half feel that they have been fairly evaluated. Fewer than 33% are required to submit a teaching portfolio for a mandatory review. Nearly 90% reported that lack of time was the first (54%) or second (35%) biggest obstacle to their efforts to create resources.

Nearly half reported that Accuracy was their highest-rated criterion, and more than 80% listed Accuracy among their three highest rated. Ease of Use was the second highest-rated criterion, and nearly 80% ranked it among their three highest rated. Importance and Pedagogical Effectiveness tied as the third highest-rated criteria, and about 40% mentioned one or the other in their top three. This is somewhat at odds with the responses of the Chairs.

Perhaps most of the developers are coming at this (I know that I am) from an interest in the technology and not from an interest (at least initially) in learning. This clearly is the case at UH. Can DLESE foster an environment in which "teams" are created?

86% agreed that they would consider submitting their materials for DLESE review, 77% agreed that they would prepare a "user's manual" as part of the review process, and more than 80% agreed that they would allow DLESE to maintain a "pristine version" of their resource if it were accepted for the collection. This seems to be a healthy set of indicators of how DLESE is viewed by a group representative of those who would initially be available to create resources for the collection.

 

Chairs.

About half of the chairs agree that they encourage their non-tenured colleagues to develop e-resources for their courses, and 88% agree that their colleagues are fairly evaluated with respect to their teaching efforts.

Nearly 90% reported that lack of time was the first (68%) or second (22%) biggest obstacle to their colleagues' efforts to create resources. The other potential responses drew few selections, although 40% listed lack of skills as the second biggest obstacle.

Pedagogical Effectiveness and Accuracy together received about 67% of the responses for the criterion with the greatest impact on chairs. 65% of the respondents listed Pedagogical Effectiveness as having the first or second greatest impact. Ease of Use ranked third.

80% agreed that they would encourage faculty who produce superior resources to submit them to DLESE for review.