Thursday 25 July 2013

Forwarded message: The Survey of Best Practices in Developing Online Information Literacy Tutorials

-----Original Message-----
From: Jim Moses [mailto:primarydat@aol.com]
Sent: 24 July 2013 15:52
To: infolit@ala.org
Subject: [INFOLIT] Primary Research Group has published The Survey of Best Practices in Developing Online Information Literacy Tutorials, ISBN 978-157440-247-6

Primary Research Group has published The Survey of Best Practices in Developing Online Information Literacy Tutorials, ISBN 978-157440-247-6. The study looks closely at how academic libraries are developing and deploying online information literacy tutorials, exploring issues such as spending and budgets; staffing, including the range and qualifications of staff used for tutorial development; software use; the time frame for tutorial development; conceptions of what constitutes a quality tutorial; assessment of library efforts; marketing to students and faculty; cooperation with other institutions; frequency of tutorial revision; measurement of student outcomes; and other issues in the development and use of online information literacy tutorials.

The study was devised with the assistance of Jennifer Holland and Yvonne Mery of the University of Arizona Libraries, and Erica DeFrain of the University of Vermont Library, and the summary of main findings was written by Holland and DeFrain.

Just a few of the main findings from this exhaustive 285-page study:

•       The mean number of information literacy tutorials per library in the sample was 27.92; the median was 10.50.
•       The library homepage was the most popular access point for online information literacy tutorials, followed by subject guides, course guides, and YouTube.
•       Nearly 69% of the tutorials used by the libraries in the sample were created in-house.
•       A third of the libraries sampled reported using the tutorials of other libraries.
•       The following institutions were cited by survey participants for excellence in tutorial development and as a source of imitation or inspiration: Cardiff University, Clark College, Coastal Carolina University, Cooperative Library Instruction Project, Glasgow Caledonian (UK), Kent State, Manor College, Michigan State University, North Carolina State University, Open University (UK), Penn State, Rutgers, South African Universities, TILT, University of Arizona, University of California-Irvine, University of Illinois-Chicago, University of Pittsburgh, University of Sydney, University of Texas-Austin, University of Texas-Houston, Vanderbilt, Wayne State University, West Chester University, and Western Oregon University.
•       About a quarter of the libraries sampled assigned only one person to the task of developing information literacy tutorials for the library.
•       Only a third of librarians sampled felt that their institutions provided adequate support for tutorial development.
•       43.75% of respondents from community colleges indicated that it took less than 10 hours to develop an information literacy tutorial.
•       2.56% of the libraries sampled used their own in-house-developed software to create tutorials.

Data is broken out by size and type of library, for US and foreign libraries, and for public and private colleges. For further information, visit our website at www.PrimaryResearch.com.

Wednesday 10 July 2013

RIDL Criteria

I am reposting the following message:

The Research Information and Digital Literacies Coalition (RIDLs) has formulated a set of criteria to help training practitioners in higher education describe, review and evaluate their training and development interventions and resources, intended primarily for researchers but also for students and teaching staff; see http://www.researchinfonet.org/infolit/ridls/strand2/ for details. These criteria relate to all interventions aimed at developing information-handling knowledge, skills and competencies, whether delivered as face-to-face sessions/courses or as digital/online resources. They serve three broad purposes:

(i)     Helping institutional staff who design and deliver such courses and resources to describe and review them; the aim being to provide a structured and recognized way of presenting such interventions in online resources and demonstrating their value.
(ii)    Providing a simple means of assessing courses and resources, for use within or outside the institutions in which the interventions have been compiled; the aim being to evaluate their suitability and usefulness as transferable resources.
(iii)   Serving as a prompt for a dialogue between training practitioners and learners, and providing a structure for such a dialogue.

However, the criteria are not intended as a prescriptive or rigid tool, nor as a means of assessing the performance of training practitioners: they are very much about providing practitioners with a logical, common-sense self-help framework to assist them in formulating and delivering their resources.

If any readers of this list would like to try out the criteria within their own institutions, please do not hesitate to do so. For further information, feel free to contact Stéphane Goldstein at stephane.goldstein@researchinfonet.org.