The current status

To keep up with the progress of redefining IL, Barbie E. Keiser’s detailed report on ACRL’s work is quite helpful. (Keiser is an information resources management consultant based in the metropolitan Washington, D.C., area.)

Reimagining Information Literacy Competencies

http://newsbreaks.infotoday.com/NewsBreaks/Reimagining-Information-Literacy-Competencies-98406.asp



“Change Literacy” and the future of libraries

Brian Mathews of Virginia Tech suggests putting “change literacy” (change as a noun) into consideration for the ongoing revision of the definition of information literacy. Change literacy, Mathews explains, is “the ability to anticipate, create, adapt, and deal with change (in the broadest since [sense]) as a vital fluency for people today.” The rationale: “If we treat change as a literary [literacy] then we can better prepare students for the challenges they will face tomorrow.” Despite the somewhat awkward term, Mathews’ view of “change literacy” reflects the evolving concept of literacy. His blog post can be viewed at

http://chronicle.com/blognetwork/theubiquitouslibrarian/2014/03/10/acrl-if-we-are-putting-everything-on-the-table-how-about-change-literacy-too/

On a separate topic, a recent essay by the same author, “Librarian as Futurist: Changing the Way Libraries Think about the Future,” appears in the July 2014 issue of portal. He writes: “What will libraries be in the future? They will become whatever their users need.” His statement, while inspiring, raises questions: how do we determine users’ actual needs (in what scope and at what levels)? Who decides those needs (the user, the librarian, or both)? These are issues that deserve discussion.

Citation: portal: Libraries and the Academy, Volume 14, Number 3, July 2014, pp. 453-462.


Intentional Informationists

Among the 2013 top twenty articles recommended by the ALA Library Instruction Round Table, <http://www.ala.org/lirt/sites/ala.org.lirt/files/content/archive/2014jun.pdf> Hoffmann and Wallace’s “Intentional Informationists” is of particular interest. [See citation below.] The case study depicts IL practice at California State University-Channel Islands, a young institution with a ten-year history as of the time the article was written. The goal is to shift “the emphasis from literate to informed, from passive receptors of information to intentional users and consumers of information.” The authors define an “intentional informationist” as a person with “the contextual, reflective and informational skills to identify information opportunities, tackle complex information problems and pitfalls, and provide solutions or considerations that do not just meet her individual needs.” (Full text can be retrieved from ScienceDirect.)

Hoffmann, Debra, and Amy Wallace. “Intentional informationists: Re-envisioning information literacy and re-designing instructional programs around faculty librarians’ strengths as campus connectors, information professionals, and course designers.” The Journal of Academic Librarianship 39.6 (2013): 546-551.



What the leaders think about IL

Released today, the Ithaka S+R US Library Survey 2013 reports that library directors (chief librarians, in CUNY’s terms) were nearly unanimous in saying that teaching research skills and information literacy to undergraduates is a very important part of their mission.

One of the issues is practical: staffing, which we all face and deal with. Libraries with more human resources cope better; we at York have had to reschedule or even cancel some IL sessions due to staff shortages. The survey also reveals an encouraging trend: “Forty-two percent of respondents at baccalaureate colleges said they planned to expand staffing in instruction, instructional design, and information-literacy services over the next five years, as did 44 percent at doctoral universities and 53 percent at master’s-level institutions” (quoted from The Chronicle’s report on the survey).

Jennifer Howard of The Chronicle of Higher Education summarizes the survey results on the Chronicle’s Wired Campus blog:

What Matters to Academic-Library Directors? Information Literacy: http://chronicle.com/blogs/wiredcampus/what-matters-to-academic-library-directors-information-literacy/51005?cid=at&utm_source=at&utm_medium=en



Transliteracy for Next Generation Students: Academic and Everyday

My colleagues Anamika Megwalu and Christina Miller have been accepted to present one of the four breakout sessions at the Information Literacy Summit, sponsored by DePaul University Library and Moraine Valley Community College Library, on April 25, 2014, at Moraine Valley Community College (near Chicago). Here is the description of their presentation.

Title of Workshop: Next Generation Literacy: Connecting the Everyday to the Academic

Description: New technologies and ideologies, and the deconstruction of traditional boundaries in learning, have led to the confluence of ‘everyday’ and academic learning and the need for a re-conceptualization of what it means to be information literate. The presenters design their information literacy sessions, for college and high school students, with an eye toward helping students acquire transliteracy – that is, the ability to derive value and create transferable knowledge through the use of a multitude of digital platforms and information sources.

Attendees of this interactive workshop will participate in two exercises designed to foster transliteracy and change learning dispositions. Prof. Megwalu will present an activity based on Analogical Reasoning that encourages college students to begin their research work with familiar web sources such as Wikipedia, blogs, and social networking and file sharing sites, before they use academic databases. Prof. Miller will demonstrate a standards (AASL/CCSS)-based exercise used in a high school science research class; students learn about scientific research by reading about studies in the popular media before they use the library’s databases. Such activities encourage next generation students to exploit everyday information sources for their academic work.



Thinking and re-thinking

James M. Lang, an associate professor of English and director of the Center for Teaching Excellence at Assumption College, questions the use of the popular term “lifelong learning” in The Chronicle of Higher Education’s Advice section.

The author believes all human beings with working brains are lifelong learners and takes issue with the overuse of the term “lifelong learning,” which, in his words, “accomplishes little and means less.”

Posting this does not mean I fully endorse the author’s opinion; after all, motivating and educating lifelong learners is our ultimate goal. But we ought to be open-minded: reading different viewpoints helps us think, rethink, and act upon our own mission.

One of the comments, presumably from a librarian, views our current practice in library instruction as “anti-lifelong-learning” due to its passive, course-driven nature, e.g., teaching the database that the faculty insists on. The commenter goes on to suggest that we teach true information literacy content, such as how to search Google and Google Scholar. (I would add open access databases for the same reason.) With this, I fully agree.

Here is the article link:

Enough with the ‘Lifelong Learning’ Already

by James M. Lang

http://chronicle.com/article/Enough-With-the-Lifelong/144137/?cid=at&utm_source=at&utm_medium=en


ACRL’s IL standards to be updated

In June 2013, Steven J. Bell (ACRL’s immediate past president, 2012-13) reported in ACRL’s blog that because the current standards “are showing their age,” a special task force was established to update the standards and expand the definition of information literacy. The revision, according to the co-chairs of the task force, will be less overwhelming and more flexible (no library jargon, and thus understandable to other disciplines). “Information fluency” was mentioned in the task force’s charge; multiple literacies (transliteracy, media literacy, digital literacy, visual literacy, etc.) are to be included; and students’ role as content creators will be addressed.

For more information:

Rethinking ACRL’s Information Literacy Standards: The Process Begins

Task Force Prospectus on Work Plan

ACRL Board of Directors’ Response to Task Force Prospectus

Report of the Standards Review Task Force


The White Paper on Facilitating Discovery of Free Online Content

More and more free web resources are available. As we would expect, some are not so good, some are good, and some are very good. The availability of free web resources has become an important issue that librarians have to deal with. A recent white paper commissioned by Taylor & Francis surveys the current state of affairs. Key findings, quoted from Taylor & Francis, are:

—————————————————————–

Research Headlines

  • 92% of librarians agree that free online resources are ‘very important’
  • Librarians feel they are well-placed to provide expertise in free content selection and discovery
  • 84% of respondents said that 10% or less of their time was currently devoted to indexing free online content
  • Key challenges for librarians relating to making free resources more discoverable within their institutions are: volume growth, unknown permanence, and difficulties relating to quality assessment
  • The most important criteria for selection of free online content was relevance of that content to the institution’s activities, but brand and reputation factors were also key
  • Librarians are already investing in understanding their user community needs and in developing their catalogue interfaces accordingly

———————————————————————

To read the White Paper:

http://www.tandf.co.uk/libsite/pdf/TF-whitepaper-free-resources.pdf


The Evolution of Library Instruction

My colleague John Drobnicki posted this on our library’s list, and I thought I would share it here. Andy Burkhardt outlines three stages in the evolution of library instruction: bibliographic instruction, information literacy, and information sophistication. Read more at: http://andyburkhardt.com/2013/06/18/the-evolution-of-library-instruction/


Computers and Crowds: Unexpected Authors and Their Impact on Scholarly Research

On Friday, May 17, nearly 50 librarians from CUNY and other New York City libraries gathered at the CUNY Graduate School of Journalism to participate in a program about new models for content production. This exciting program was jointly organized by the LACUNY Emerging Technologies Committee, the LACUNY Scholarly Communications Roundtable, LILAC, and the Office of Library Services.

The morning began with a lively presentation from Kate Peterson, Information Literacy Librarian at the University of Minnesota-Twin Cities, and Paul Zenke, DesignLab/Digital Humanities Initiative Project Assistant at the University of Wisconsin-Madison. In their presentation, titled “Hats, Farms, and Bubbles: How Emerging Marketing & Content Production Models are Making Research More Difficult (And What You and Your Students Can Do About It),” Kate and Paul discussed five initiatives that currently affect content creation and propagation on the internet: search engine optimization (SEO), filter bubbles, content farms, algorithm-created content, and crowdsourcing (see their slides from this talk in the Program Materials section below).

The session began with an active poll in which attendees were asked to walk to labeled parts of the room to indicate their familiarity with each of the five concepts. The range of prior knowledge this revealed made clear that everyone had something to learn from the presentation.

The first item discussed was SEO: techniques used to increase the visibility of a website to search engines. Paul noted that while all website owners want their sites to be found, practitioners of “black hat” SEO typically use content spam (hiding or manipulating text) or link spam (increasing the number of links to a website) to try to trick search engine ranking algorithms into ranking their sites highly. Some search engines have tried to mitigate the effects of such SEO: in 2012 Google launched Penguin, which applies a ranking penalty to websites that violate its webmaster guidelines.
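To make the content-spam idea concrete, here is a purely illustrative sketch (a toy heuristic invented for this post, not how Penguin or any real search engine actually works) that flags keyword stuffing inside text hidden from readers:

```python
import re
from collections import Counter

def keyword_stuffing_score(html, keyword):
    """Toy heuristic: count how often a keyword repeats inside invisible elements."""
    # Grab the text of any element styled with display:none (one crude form of content spam).
    hidden_blocks = re.findall(
        r'<[^>]*style="[^"]*display:\s*none[^"]*"[^>]*>(.*?)</',
        html, flags=re.IGNORECASE | re.DOTALL)
    words = Counter(w.lower() for block in hidden_blocks for w in block.split())
    return words[keyword.lower()]

page = '<p>Great shoes!</p><div style="display:none">cheap shoes cheap shoes cheap shoes</div>'
print(keyword_stuffing_score(page, "shoes"))  # 3: suspiciously repetitive hidden text
```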

Next, Kate explained the concept of a filter bubble, a term describing the potential for two identical searches performed on two different computers to return different results (remember those Google ads that highlighted personalized searching for a beetle – the bug vs. the car?). The term filter bubble was coined by Eli Pariser in his book of the same name; we watched a brief clip of Pariser’s TED talk in which he explained the dangers of filter bubbles. As search engine algorithms increasingly tailor results to our interests – which they equate with whatever content we click on while web surfing – we aren’t seeing the full range of information available on the internet. Facebook uses similar techniques to display content based on our friends’ interests. By creating these filter bubbles, internet corporations restrict our opportunities to encounter information that is new or challenging to us, or that presents a point of view different from our own.
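To illustrate the mechanism, here is a minimal, hypothetical sketch (not any search engine’s actual algorithm; the results, topics, and function are invented for this post) of how click-history-based personalization can give two users different rankings for the identical query:

```python
from collections import Counter

# Toy result set for the query "jaguar": each result is tagged with a topic.
RESULTS = [
    ("Jaguar XF review", "cars"),
    ("Jaguar habitat and conservation", "animals"),
    ("Jaguar dealership near you", "cars"),
    ("Jaguar predation in the Pantanal", "animals"),
]

def personalized_ranking(results, click_history):
    """Re-rank results, boosting topics the user has clicked on before."""
    topic_counts = Counter(click_history)  # e.g. {"cars": 5}
    return sorted(results, key=lambda r: topic_counts[r[1]], reverse=True)

# Two users issue the identical query but have different click histories.
car_fan = personalized_ranking(RESULTS, ["cars"] * 5)
wildlife_fan = personalized_ranking(RESULTS, ["animals"] * 5)

print([title for title, _ in car_fan])       # car pages float to the top
print([title for title, _ in wildlife_fan])  # animal pages float to the top
```

Each user sees a different slice of the same web, and neither is shown what the other is missing.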

Most academic librarians are familiar with content farms: websites like About.com, eHow.com, and others that pay freelancers very low wages to write large volumes of low-quality articles. Often the article topics are drawn from algorithmic analysis of search data that suggests the titles and keywords most profitable for advertisers – unlike journalism, this model of content creation starts with consumer demand. Paul noted that Google has also rolled out a strategy to attempt to stem the tide of low-quality content from content farm websites; in 2011 it debuted Google Panda and downgraded 11% of the content it indexed that year. While it’s useful to us, as librarians, when Google addresses the content farm problem, it’s also somewhat troubling to realize that Google is developing the algorithms for evaluating information sources.

Perhaps one of the most surprising topics discussed was content created by machine, or algorithm-generated content. Algorithms have already been implemented to synthesize large data sets into an accessible narrative. They are popular in areas like sports writing or business news, where there is an emphasis on statistics and on identifying trends or patterns. But algorithms are also already being used to write content such as restaurant reviews or haikus. These algorithms can even be programmed to generate a certain tone within the article, or to produce different types of articles for different situations using the same data. In academic settings they have also been used to give students feedback on their preparation for tests like the SAT or ACT. One point of discussion during the event was the labor issue that algorithms raise: the incentive to use algorithms to create content eliminates the need to pay any person (no author is paid even a small amount, as with content farms, because essentially there is no author). A question from the audience brought up the dying art of fact-checking in journalism today. Kate pointed out, interestingly, that although these articles are not written by a person, they need very little fact-checking, since they rely so heavily on the direct import of factual data.
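As a toy illustration of this data-to-text idea (a deliberately simplified sketch invented for this post, not any vendor’s actual system), an algorithm can fill a narrative template directly from structured statistics, with a crude knob for tone:

```python
# A toy data-to-text generator: it turns a box score into a one-sentence recap.
game = {
    "winner": "Mets", "loser": "Braves",
    "winner_runs": 7, "loser_runs": 2,
    "star_player": "Smith", "star_hits": 3,
}

def recap(g, tone="neutral"):
    """Render a short game story from structured data, with a simple 'tone' setting."""
    verb = {"neutral": "defeated", "upbeat": "cruised past", "downbeat": "edged out"}[tone]
    return (f"The {g['winner']} {verb} the {g['loser']} {g['winner_runs']}-{g['loser_runs']}, "
            f"led by {g['star_player']} with {g['star_hits']} hits.")

print(recap(game))
print(recap(game, tone="upbeat"))
```

Because the sentence is assembled directly from the imported data, there is little left to fact-check; the accuracy question simply moves upstream to the data source.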

Crowdsourcing was also discussed as an emerging way that content is created or supported through the work of the masses. Paul briefly discussed content created through crowdsourcing on websites like Carnegie Mellon’s Eterna and Foldit, where contributors play a game involving protein folding. He also focused on crowdsourcing for fundraising through websites like Kickstarter and Indiegogo. There are implications in what people decide to fund and not to fund: what does this mean, especially in these times of federal austerity?

During a following breakout session the crowdsourcing topic was explored further. Examples of user-supplied content included Wikipedia; Wikipedia edit-a-thons, such as the one held at NYPL to increase access to its theater and performing arts collection, were noted. MOOCs became part of the discussion on crowdsourcing, with examples of student solutions to problems that have been integrated as illustrations in a course. Readersourcing.org, though it was never launched, was an attempt to crowdsource peer review. There was also an extended discussion about crowdsourcing as a news-gathering technique: Twitter has surfaced as a way to gather information about events as they happen, and of concern for librarians, always interested in the accuracy of information, is whether or not information gathered through Twitter can be trusted. Additionally, Daren C. Brabham’s research on the ethics of using the crowd as a source of free labor was discussed. According to Brabham, a myth is perpetuated about the amateur nature of crowd contributors, when in reality many who contribute as anything from citizen scientist to citizen graphic designer are professionals who deserve compensation.

Kate and Paul ended by suggesting strategies that we can use to mitigate potentially negative effects of new content production, both for us — as librarians and as internet users — and for the students and faculty with whom we work. And indeed, academic librarians are well-placed to implement these recommendations as we work to educate ourselves and our patrons. We must continue to teach students to evaluate their sources, and perhaps expand to evaluating the possible filters they experience as well. Looking for opportunities to create more chance encounters with information could help burst those bubbles. Many of us already clear our web browser history and cookies regularly; can we also demand more transparency from our vendors about the information they collect from users? Finally, Kate and Paul challenged us to think about ways that we can put students into the role of creator — rather than simply consumer —  to raise their awareness about these issues surrounding content production and increase their data literacy and information literacy.

After Kate and Paul’s presentation, participants broke up into three discussion groups: content farms (led by Paul), algorithms (led by Kate) and crowdsourcing (led by Prof. Beth Evans). Participants explored the implications of each of these topics for work in the library, and also discussed other issues surrounding research and the internet.

Awareness of all of these issues might help to ensure that librarians and researchers (and the students we teach at the reference desk and in the classroom) don’t get stuck in the filter bubble, surrounded by thin information written by bots!

– by Maura Smale (City Tech), Alycia Sellie (Brooklyn College), and Beth Evans (Brooklyn College)

Program Materials:

Hats, Farms, and Bubbles slides

Videos shown during the presentation:

Epipheo. (2013). Why the News Isn’t Really the News. YouTube. http://www.youtube.com/watch?v=YoZNJsp3Kik

ExLibrisLtd. (2011). Primo ScholarRank plain and simple. YouTube. http://www.youtube.com/watch?v=YDly9qPpPYQ

TED. (2011). Eli Pariser: Beware online “filter bubbles.” http://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles.html

Additional materials mentioned during the presentation:

On The Media. (2013). Ads vs. Ad-Blockers. http://www.onthemedia.org/2013/may/10/ads-vs-ad-blockers

  • Shared in response to a question about how modifying your web browser through extensions like ad blockers can have unintended consequences, such as hurting independent publishers.

This American Life. (2012). Forgive us our press passes. 468: Switcheroo. http://www.thisamericanlife.org/radio-archives/episode/468/switcheroo?act=2

  • Although we didn’t mention Journatic.com during our presentation, it is another version of a content farm, but instead of using SEO techniques to attract web traffic from a general audience, Journatic.com works with newspapers to outsource hyper-local articles to writers abroad who often publish under fake bylines.

Recommended Readings:

1) Content Farms

NOTE: Notice the use of SEO in the web address; the article is NOT about ESPN.

2) Algorithm-written Content

3) Crowdsourcing

Crowdsourcing Site Screenshots, by Beth Evans (Brooklyn): http://www.slideshare.net/myspacelibrarian/crowdsourcing-site-screenshots
