Two recent reports provide some insight into the higher education landscape:
The 2014 Inside Higher Ed Survey of Faculty Attitudes on Technology reported the results of a survey of 2,799 faculty members and 288 campus technology administrators about online education. Among the findings: only 9 percent of faculty members strongly believe that online education can be equivalent to in-person courses. Respondents also questioned the value of MOOCs and expressed some doubt about the usefulness of the digital humanities. There was no mention of the role or impact of the library in online education. Nevertheless, it is a very interesting report.
The Leadership Board for CIOs has conducted an annual survey of CIOs for the past several years. The 2014 Survey of Chief Information Officers provides a comprehensive picture of technology in higher education institutions. The survey was sent to nearly 1,000 CIOs in April and May 2014, and the response rate was 23 percent. Sixty percent of the respondents were from public institutions; 37 percent from private non-profits; and 2 percent from for-profit institutions. Information collected in the survey included institutional and CIO characteristics; financial and budget planning; IT organization and governance; consumerization of IT; administrative computing; academic technologies, MOOCs, and innovation; infrastructure, networking, and security; cloud computing and big data; institutional standards; and new and emerging technologies. One item of interest was that none of the responding institutions (0 percent) rely on open source products for administrative computing needs. As the report states, "[m]ost institutions prefer [enterprise resource planning] vendor-provided systems that are tightly integrated and under the control of their institution, as opposed to outsourced solutions" (p. 18). Libraries were mentioned twice: centralized management of library services takes place in 17 percent of responding institutions, and 38 percent of them reported some library applications that were cloud-based.
Wednesday, December 31, 2014
Friday, December 26, 2014
"Using GTD to Get Things Done at Your Library," by Robin Hastings
Robin Hastings. "Using GTD to Get Things Done at Your Library." Computers in Libraries 31:8 (2011): 23-26.
I recommend this article as an introduction to David Allen's Getting Things Done: The Art of Stress-Free Productivity (GTD). Originally published in 2001, GTD has remained a bible of sorts for productivity junkies. Allen offers reams of helpful advice about how to get and stay organized and keep on top of everything you have to do at work and in the rest of your life. Hastings provides a nice introduction to the book and applies its lessons to our particular library environment. I haven't had a chance to review GTD on this blog yet (it's on the list!), but in short, I agree completely with Hastings's assessment of the high value of GTD applied at work.
Thursday, December 25, 2014
The Value and Impact of Data Sharing and Curation, by Neil Beagrie and John Houghton
Neil Beagrie and John Houghton. The Value and Impact of Data Sharing and Curation: A Synthesis of Three Recent Studies of UK Research Data Centres. (Jisc, 2014).
This was a very interesting study of three data centers to determine their value and impact relative to their cost. The authors identified data centers in three disciplines: the humanities, the social sciences, and the sciences. They were the Economic and Social Data Service, the Archaeology Data Service, and the British Atmospheric Data Centre. Key findings include that use of the data centers significantly increased the efficiency of research, teaching, and study; that the value of the data centers exceeded the investments made in them; and that the data centers increased the measurable returns on investment.
Their recommendations include (p. 23):
- Continue support for the data centers
- Develop the methods used to study them further
- Promote the standardization of usage statistics
- Study the value and impact of other aspects of the research data curation infrastructure
- Conduct more granular analysis
- Track changes longitudinally
- Study the broader value and impact of data collections
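The headline finding that value exceeded investment amounts to a benefit-cost ratio greater than one. As a minimal sketch of that arithmetic (the figures below are hypothetical, not from the report):

```python
def benefit_cost_ratio(estimated_benefits: float, operating_costs: float) -> float:
    """A ratio greater than 1 means a data centre returns more
    estimated value to its users than it costs to operate."""
    return estimated_benefits / operating_costs

# Hypothetical figures (millions of GBP per year) for one data centre.
ratio = benefit_cost_ratio(estimated_benefits=12.0, operating_costs=3.0)
# A ratio of 4.0 would mean each pound invested yields an estimated
# four pounds of value to the research community.
```

The actual studies estimate benefits through user surveys and economic modeling, which is far subtler than this one-line division, but the ratio is the quantity being compared.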
Wednesday, December 24, 2014
"Five Years of Empirical Research in the Area of Technical Services," by Natalia Gelber
Natalia Gelber. "Five Years of Empirical Research in the Area of Technical Services: An Examination of Selected Peer-Reviewed Journals, 2007-2011." Technical Services Quarterly 30:2 (2013): 166-186.
Author Gelber analyzed 256 articles from 21 peer-reviewed journals to determine the types of empirical research conducted, the subjects of the research, and whether the topic influenced the research methods used. The journals most heavily represented in the study were Cataloging & Classification Quarterly, Library Resources & Technical Services, Technical Services Quarterly, Journal of Electronic Resources Librarianship, Serials Librarian, Library Collections, Acquisitions, and Technical Services, and Serials Review, ranging from 17 to 65 articles each. The remaining journals each contributed fewer than 10 articles.
Findings revealed that 53.5 percent of the articles were sole-authored, 28.5 percent had two authors, 11.7 percent had three, and 4.3 percent had four; only a tiny portion had more than four authors. The great majority of authors were practitioners (94.5 percent); only 3.1 percent were academics. U.S. authors predominated, representing 85.5 percent of the total, and 70.7 percent of the research settings were academic libraries. Only 3.1 percent of the studies incorporated end-users into the research.
The most commonly used type of research was the case study, at 61.7 percent of the total; another 21.9 percent consisted of surveys. Qualitative data analysis was more common than quantitative (54.7 percent vs. 27.3 percent), and only 18 percent mixed qualitative and quantitative methods. The top topics were electronic resources management, discovery of materials in the online catalog, cataloging policies and practices, and acquisitions. (Collection development as a topic was excluded from the study.) The third research question could not be answered because there were not enough examples of each research method to yield statistically significant results.
I find it interesting but not surprising that the two most common types of research are the case study and the survey. (Literature reviews and historical reviews were eliminated from this study.) As the researchers are most likely to be practitioners, it seems to me that the most meaningful research they conduct and share with others is 1) how we did something and what we learned from it, and 2) how is everyone else doing this and what can I learn from their experiences? In fact, whenever I'm faced with a new challenge at work and I want to read up on the issue, I do a literature review to see what others have done and how it worked for them.
I found Ms. Gelber's research very interesting. I was surprised at some of the journal titles that ended up on her list (see the article, p. 173, for the full list). I had to look up one of the research methods she listed: the Delphi Method, which consists of two or more rounds of experts responding to questionnaires. After each round, summaries of the responses are shared with the panel to see whether individual answers change; after several rounds, the conclusions are expected to be more reliable. It's used mostly in business forecasting, although I can see that it would be interesting to apply to technical services or more general library futures forecasting.
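To make the Delphi mechanics concrete, here is a toy simulation of how the spread of a panel's estimates narrows over rounds. Everything here is hypothetical: real panels revise answers using expert judgment, not a fixed pull toward the group median.

```python
import statistics

def delphi_rounds(initial_estimates, rounds=3, pull=0.5):
    """Toy Delphi simulation: after each round, every expert revises
    their estimate partway (by `pull`) toward the group median."""
    estimates = list(initial_estimates)
    history = [estimates[:]]
    for _ in range(rounds):
        median = statistics.median(estimates)
        estimates = [e + pull * (median - e) for e in estimates]
        history.append(estimates[:])
    return history

# Four experts start far apart; the spread halves each round.
history = delphi_rounds([10, 30, 50, 70], rounds=3)
spreads = [max(h) - min(h) for h in history]  # [60, 30, 15, 7.5]
```

The convergence is the point: repeated feedback rounds are expected to pull outliers toward a more defensible consensus.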
Tuesday, December 23, 2014
Stylish Academic Writing, by Helen Sword
Helen Sword. Stylish Academic Writing. Cambridge, MA: Harvard University Press, 2012. 220 pages. ISBN 9780674064485.
Author Helen Sword read and analyzed 1,000 articles published in academic journals in 10 disciplines to determine what constitutes stylish academic writing. She also studied 100 recently published style guides to see where they agreed and differed on points of academic writing style. In Stylish Academic Writing she shares what she's learned about what makes a good article. In fourteen chapters she discusses voice, sentence construction, titles, hooks, jargon, article structure, citation style, creative academic writing, and more.
Two of the chapters speak most to me: the one on voice and the one on citation style. Both address pet peeves of mine. The first is when an author has to mangle their writing to avoid using the first person. Much of the writing in library science reports on a project or case study, in which the author is simply telling the story of how a project was launched, carried out, or completed. It makes little sense to bar the first person when telling this story. Yet if you look at much of the library science literature, you'll see many of these stories told in a way that puts distance between the reader and what's being shared, which makes the article harder to read and less interesting. Articles should convey all of the important information the author wants to share, but in a way that increases readership, and writing in the first person can help with that goal. Sword advocates using the first person when possible.
My second pet peeve is citation styles that require the author to put names, dates, and sometimes page numbers in parentheses right in the text. When I read an article with many citations, I sometimes find it difficult to follow the thread of a sentence or paragraph through all the parenthetical citations. The simple use of endnotes, identified by a superscript number, avoids this problem: sentences and paragraphs with a numbered endnote marker are much easier to read and comprehend than ones with parenthetical citations interrupting the flow. Again, the goal is to share information and increase the readership of each article, and a simpler citation style does that. Sword supports citation styles that don't interrupt the flow of the article.
While I'm only highlighting two issues in this review, Sword's book is full of good advice. She illustrates all of her chapters with both good and bad examples so readers can understand what makes good writing, and what hinders comprehension. I believe this book would be useful to all academics who want to improve their writing.
Thursday, September 11, 2014
"The 'Just Do It' Approach to Customer Service Development," by Erika L. Gavillet
Erika L. Gavillet. "The 'Just Do It' Approach to Customer Service Development: A Case Study." College & Research Libraries News 72:4 (2011): 229-236.
Author Erika Gavillet presents a new approach to surveying library patrons in this short case study. She points out that academic libraries in particular are evaluating customer service in order to make improvements or offer new services to their users. In an effort to avoid "survey fatigue," her unit, the Customer Services Group (CSG) at Newcastle University Library, developed a light approach to gathering user feedback. The CSG conducted a quick survey of students, asking three questions (p. 230):
- What five activities do you do most in the library? (with a list of services with tick boxes)
- What else do you use in the library? (with a list of services with tick boxes)
- If money were no object, what single change would improve your library experience the most?
In response to the survey, the CSG came up with a plan to implement during the examination period in May. All library staff would participate in this plan which included regular sweeps of each floor of the library. Staff checked for "noise, trailing wires from laptops, litter and unattended belongings" (p. 230). It would be promoted as a partnership with the students to keep the library quiet and tidy. They conducted 558 sweeps over the next month, leaving calling cards on unattended belongings warning of theft and asking students to be quiet and remove any trailing wires.
A follow-up survey found that students considered the campaign worthwhile and positive, but that they still saw too much litter accumulating and didn't like the calling cards library staff left on their unattended belongings.
This article shows that it's possible to gather feedback and implement changes in a relatively short period of time, and library staff and students both found the activity worthwhile. One question I have is how the students' other concerns were addressed. They originally asked for more textbooks, PCs, and study space, along with longer opening hours and less noise. It seems from the article that the only major concern addressed by the library's examination-period campaign was noise; how the library plans to address the other issues was not explained.
Nevertheless, what I like about this article is that it demonstrates that every library survey or study doesn't have to be statistically meaningful or significant to be useful. Much of library research is about improving services or managing better. Surveys can be used to gather feedback and collect information that can help us run the library in a more effective and efficient manner, providing better services to our users. In this case study, a short survey got a lot of responses which helped library managers develop better services during a hectic time of the year.
Wednesday, September 10, 2014
"Marketing Mindset," by Karen Wallace
Karen Wallace. "Marketing Mindset: Focusing on the Customer, from Technical Services to Circulation." Feliciter 3 (2007): 126-129.
In this short newsletter article, Karen Wallace makes the point that all library staff should think about their work in terms of the user and how the user's experience is affected by what they do. She states: "All library employees, including those who typically have little contact with external patrons, need to understand basic marketing concepts and how what they do on a day-to-day basis affects user satisfaction. These 'back-shop' folks, who may erroneously be overlooked in marketing efforts, play a key role in developing library collections and service that can meet user needs and even exceed their expectations" (p. 126).
According to Ms. Wallace, library staff need to understand their users by conducting what in the business world would be called "market research." Information gleaned from customer interactions is one source of this information, and she encourages technical services staff to participate in activities that bring them into contact with their users, including working on the reference desk and volunteering to help with library programs.
Armed with knowledge of users' needs, library staff can begin to anticipate their needs and tailor their work and services to meet those needs. For example, they can prioritize their work to meet user demands; enhance their cataloging by adding local subject headings; customize their OPACs to provide a better interface; and create canned searches for heavily-used materials.
I agree with Ms. Wallace's points about focusing on the user. We need to look at all of our processes with the user in mind. We need to evaluate what we do based on whether we're meeting those needs, and make decisions based on what's best for them. Our goal is to provide faculty, staff, and students with the resources they need in support of the teaching and research mission of the university. Timely processing of materials is part of that equation and should be balanced with providing quality access.
Tuesday, September 9, 2014
"IT Service Climate," by Ronnie Jia and Blaize Horner Reich
Ronnie Jia and Blaize Horner Reich. "IT Service Climate: An Essential Managerial Tool to Improve Client Satisfaction with IT Service Quality." Information Systems Management 28:2 (2011): 174-179.
Authors Jia and Reich take an interesting approach to service quality assessment. Making the point that an organization's climate can affect the quality of the services it offers customers, they go a step further and maintain that an organization's internal climate can predict customers' perceptions of service quality. To investigate, they developed a 10-question instrument that can be used to assess internal organizational climate. Their study showed that "service climate may be much more important than technical competence from the clients' point of view" (p. 174).
The authors begin by defining IT service climate as "shared perceptions of the practices and behaviors in their workplace that support the provision of IT service to business customers" (p. 174). They differentiate between climate and culture, which is described as deeper and based on core values and assumptions. Climate is more about perceptions and is an expression of culture.
The authors identify three dimensions to the IT service climate: service leadership, service vision, and service evaluation. Service leadership is about setting and communicating service goals. Service vision is about IT employees seeing themselves as having a role in meeting customer needs. Service evaluation is about evaluating employees on how they met their customer service goals. Of the three, the authors identify service leadership as being the most important.
The authors' research consisted of asking IT staff in four organizations the 10 questions and comparing their responses with their respective customers' responses to the IT-SERVQUAL instrument. They found that the impressions IT staff held of their own climate did, in fact, correlate with their customers' impressions of service quality. This relationship suggests that managers who work to improve climate can thereby improve customer perceptions of service quality, which can be very empowering for managers who want to take action to improve the bottom line.
The authors recommend incorporating their 10 questions into internal employee satisfaction surveys, examining the findings carefully, and deploying the survey on a regular basis.
I found this to be an interesting look at how an organization's climate can affect and predict the perceptions of customers. While my own research is less about climate and more about the services we provide to each other as internal service providers and customers, it's clear that our service vision is affected by climate. Although this article was specifically about IT services, the lessons learned could be applied to other types of services. The concepts of service leadership, service vision, and service evaluation are things we can all apply in our daily work.
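The core of the authors' analysis is a correlation between internal climate scores and external quality ratings. A minimal sketch of that computation, using Pearson's r on entirely made-up numbers (the article does not publish its raw data, and the authors' exact statistical procedure may differ):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: mean climate score reported by IT staff in each of
# four organizations, and the mean service-quality rating given by that
# organization's customers. A value of r near 1 would indicate the kind
# of staff-customer agreement the authors report.
climate = [3.1, 3.8, 4.2, 4.6]
quality = [3.0, 3.5, 4.4, 4.5]
r = pearson(climate, quality)
```

With only four organizations, any real correlation would of course need careful interpretation; the sketch only shows what "climate predicts perceived quality" means operationally.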
Authors Jia and Reich take an interesting approach to service quality assessment. Making the point that the climate of an organization can affect the quality of services that are offered to customers, they take it one step further and maintain that an organization's internal climate can predict customers' perceptions of quality service. To investigate this further, they developed a 10 question instrument that can be used to assess the internal organizational climate. Their study showed that "service climate may be much more important than technical competence from the clients' point of view" (p. 174).
The authors begin by defining IT service climate as "shared perceptions of the practices and behaviors in their workplace that support the provision of IT service to business customers" (p. 174). They differentiate between climate and culture, which is described as deeper and based on core values and assumptions. Climate is more about perceptions and is an expression of culture.
The authors identify three dimensions to the IT service climate: service leadership, service vision, and service evaluation. Service leadership is about setting and communicating service goals. Service vision is about IT employees seeing themselves as having a role in meeting customer needs. Service evaluation is about evaluating employees on how they met their customer service goals. Of the three, the authors identify service leadership as being the most important.
The authors' research consisted of asking IT staff in four organizations the 10 questions and comparing their responses with their respective customers' responses to the IT-SERVQUAL instrument. They found that the impressions IT staff had of their own climate did in fact correlate with their customers' impressions of service quality. This relationship suggests that managers who work to positively affect climate can thereby improve customer perceptions of service quality. This can be very empowering: managers can take concrete action to improve the bottom line.
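At its core, the authors' analysis is a correlation between two sets of survey scores: average internal climate ratings from IT staff and average service-quality ratings from their customers. As a rough sketch of that kind of comparison (the numbers below are invented for illustration and are not the authors' data, and this is not their actual analysis code), one might compute a Pearson correlation like this:

```python
# Illustration only: correlating mean internal service-climate scores
# with mean customer-reported service-quality scores per organization.
# All data values below are hypothetical.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# Hypothetical mean climate score from IT staff in four organizations (1-7 scale)
climate = [5.8, 4.2, 6.1, 3.9]
# Hypothetical mean service-quality score from each organization's customers
servqual = [5.5, 4.0, 6.3, 4.1]

r = pearson_r(climate, servqual)
print(round(r, 3))  # a strong positive correlation for these invented numbers
```

A positive correlation like this is what would support the claim that internal climate predicts customer perceptions; with only four organizations, of course, any such number should be interpreted cautiously.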
The authors recommend incorporating their 10 questions into internal employee satisfaction surveys, examining the findings carefully, and deploying the survey on a regular basis.
I found this to be an interesting look at how an organization's climate can affect and predict the perceptions of customers. While my own research is less about climate and more about the services we provide to each other as internal service providers and customers, it's clear that our service vision is affected by climate. Although this article was specifically about IT services, the lessons learned could be applied to other types of services. The concepts of service leadership, service vision, and service evaluation are things we can all apply in our daily work.
Wednesday, September 3, 2014
"Employees as Customers Judging Quality," by John B. Harer
I've taken a short hiatus from writing this blog, but am getting ready to ramp it up again. My new research project is going to address the use of internal customer service assessment surveys to gauge customer satisfaction with services provided by technical services and library systems and to identify potential areas for process improvement. I'm planning to use this blog to document my literature review. The first article I'm reviewing is:
John B. Harer. "Employees as Customers Judging Quality: Enhancing Employee Assessment." New Library World 109:7/8 (2008): 307-320.
Harer used an interesting approach to this topic. First he makes the point that many libraries follow the business world in implementing various management techniques, such as Continuous Quality Improvement and Total Quality Management. There is a focus on continually improving processes and output, and assessment is an important aspect of evaluating their success in doing so. Harer mentions that many libraries are using tools such as the LibQual surveys to assess quality, and makes the point that extending such assessment tools to library employees is a logical next step.
What's interesting to me is that instead of simply asking libraries whether they survey their employees on their views on quality, Harer looks at existing employee satisfaction survey instruments at Association of Research Libraries (ARL) member institutions and analyzes whether they ask questions that get at the employees' opinions about quality. It seems like a roundabout approach to get at this information. Nevertheless, Harer solicits participation in his study from the 96 U.S. ARL academic library members and receives responses from 30 of them. The responses include one employee satisfaction survey, four organizational climate surveys, eight exit interview surveys, 11 employee self-assessment evaluations, and three administrator evaluation surveys. Thirteen libraries responded that they did not have or use such instruments.
From this point forward in the article, Harer limits the discussion to the four organizational climate surveys that were submitted, and finds that only one of them specifically asks employees for their opinions regarding quality, although the two example questions provided address the needs of the user.
As I mentioned before, I think this is an interesting approach. I'm interested in another angle: employees' satisfaction with the services they offer each other. Obviously, the end user benefits from our output, whether that's in items processed or services offered. But our more immediate customers in libraries are each other, especially for technical services and library systems units. For example, bibliographers are customers of acquisitions departments. Bibliographers put their requests in to acquisitions staff, who are then responsible for carrying out those requests. Circulation staff might ask for an item to be rush-processed, so they are customers of cataloging staff. Even within a single department we are dependent on each other. When we ask for help on a tricky problem, does it get answered in a timely fashion, or does it sit on someone's desk for a month? These are all ways that we are dependent on each other. Our personal assessment can demonstrate overall satisfaction with the services that we provide to each other and identify areas in which we can improve. Improving services to each other will by extension provide better service to our users: the faculty, staff, and students who rely on our services for their teaching and research needs.
I agree that employees can provide important feedback regarding quality, but I think a more direct route is preferable. Internal customer services assessment is one more tool that we can use to improve our work and services.
Thursday, July 3, 2014
"Data Management Assessment and Planning Tools," by Andrew Sallans and Sherry Lake
Andrew Sallans and Sherry Lake. "Data Management Assessment and Planning Tools," In Research Data Management: Practical Strategies for Information Professionals (West Lafayette, IN: Purdue University Press, 2014. 436 pages. ISBN 9781557536648):87-107.
Sallans and Lake start off by discussing some of the challenges of data management. Data management has gotten a lot of attention lately. Funding agencies are driving a lot of this attention, and are making recommendations regarding the existence of a data management plan, but their requirements are often vague and can result in researchers simply doing the minimum to meet funding requirements. In this chapter, Sallans and Lake describe several efforts to assess specific data management plans.
The University of Virginia began its efforts in this area by conducting data management interviews. The interviewers hoped to identify "common research data problems and needs," identify the types of data being created, identify communities and researchers under pressure from grant requirements, identify partnerships for "institutional repository data deposit," and "develop opportunities to provide data management recommendations and training" (p. 91). They then needed to turn what they learned in the interviews into recommendations and weighted assessment factors. This led to the development of the DMVitals Tool, an Excel spreadsheet designed to collect information and produce reports. The authors provide guidance on how this tool can be used at other institutions.
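The general idea behind a tool like DMVitals is to translate interview answers into weighted ratings and roll them up into a report. As a very loose sketch of that kind of scoring (the category names, maturity levels, and scale below are invented for illustration and are not taken from the actual University of Virginia tool), one might write:

```python
# Hypothetical sketch of scoring data management interview responses.
# Categories and maturity levels are invented; DMVitals itself is an
# Excel spreadsheet with its own categories and rating logic.

LEVELS = {"none": 0, "ad hoc": 1, "documented": 2, "institutional": 3}

def sustainability_score(responses):
    """Average maturity level across categories, scaled to 0-100."""
    levels = [LEVELS[answer] for answer in responses.values()]
    return round(100 * sum(levels) / (3 * len(levels)))

interview = {
    "file formats": "documented",
    "metadata": "ad hoc",
    "storage and backup": "institutional",
    "data sharing": "none",
}
print(sustainability_score(interview))
```

The value of this kind of rollup is that it turns open-ended interview material into a comparable number per researcher or lab, which can then drive targeted recommendations and training.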
The DMPTool was subsequently developed through a collaborative effort with many other institutions. While the tool was not completed at the time of publication, it shows promise in helping researchers manage their data and improve their research data management practices.
Wednesday, July 2, 2014
"The Use of Life Cycle Models in Developing and Supporting Data Services," by Jake Carlson
Jake Carlson. "The Use of Life Cycle Models in Developing and Supporting Data Services," In Research Data Management: Practical Strategies for Information Professionals (West Lafayette, IN: Purdue University Press, 2014. 436 pages. ISBN 9781557536648): 63-86.
In the third chapter of this book, author Jake Carlson, Associate Professor of Library Science and data services specialist at Purdue University Libraries, introduces the reader to the concept of life cycle modeling for research data management. Life cycle models are commonly adopted by organizations that are trying to promote best practices for managing, organizing, and preserving research data. Like a biological organism, research data progresses through a cycle of transformations in format, application, use, and purpose.
Organizations have come to recognize that research data can be used to create new products and generate new areas of research, but often researchers aren't thinking beyond their own original uses for their data. Research data life cycle models can be used to illustrate the overall research process from start to finish, and demonstrate how data can be re-used by others. Data life cycle models are really a subset of the research life cycle. There are three types of life cycle models: individual, organizational, and community. They are an effective tool for designing and carrying out a research project, and help researchers articulate and diagram activities in the research project.
Organizational life cycle models are used by organizations that offer services or assistance to researchers, such as libraries, data repositories, scholarly societies, publishers, etc. One example of an organizational life cycle model is that used by ICPSR.
Carlson makes the point that in models, the action is orderly and linear; whereas in real life, there are many variables that can change the direction of a project. He concludes by stating that effective data services depend on an in-depth understanding of the needs of the researchers they're meant to serve.
Thursday, June 5, 2014
"Data Governance: Where Technology and Policy Collide" by MacKenzie Smith
MacKenzie Smith. "Data Governance: Where Technology and Policy Collide." In Research Data Management: Practical Strategies for Information Professionals (West Lafayette, IN: Purdue University Press, 2014. 436 pages. ISBN 9781557536648): 45-59.
Author MacKenzie Smith is University Librarian at the University of California, Davis and has a strong background in information technology and digital library issues. In this chapter she addresses a number of governance and policy issues related to the management of data resources. These include legal and policy issues; attribution and citation requirements; archives and preservation; discovery and provenance metadata; data schema and ontologies for discovery and sharing; and access to infrastructure for data analysis.
The legal issues can be very complex. Data that is public and shareable in one country might be under copyright in another. Privacy and confidentiality laws may differ as well. Another source of confusion is the distinction between commercial and non-commercial use. Technological issues are also challenging. Software that was used to create and analyze data must be available if the data are to be reusable. Metadata should be standardized and easily interpreted. The need to have data management plans is noted, and Smith ends by making a number of recommendations regarding data governance.
Recommendations include the need to have clear statements of policy and enforcement practices; an easy to use and trustworthy infrastructure; reward mechanisms for researchers who share their data; clarification of stakeholders' rights and responsibilities; and a harmonization of data usage agreements.
Friday, April 18, 2014
"The Policy and Institutional Framework" by James L. Mullins
James L. Mullins. "The Policy and Institutional Framework." In Research Data Management: Practical Strategies for Information Professionals (West Lafayette, IN: Purdue University Press, 2014. 436 pages. ISBN 9781557536648): 25-44.
Author James Mullins, Dean of Libraries at Purdue University, has been immersed in the development of data management policy and infrastructure since his arrival at Purdue in 2004. In this chapter, Mullins describes the development of national policy in the area of research data management, and follows that with a case study about Purdue's own move towards managing data.
Mullins begins by describing the realization within the scientific community that research data could and should be shared so that researchers weren't redundantly conducting research to get data that had already been done by another researcher. One of the projects that brought this to light was the Human Genome Project, which generated massive amounts of data. Federal granting agencies were especially interested in preventing the duplication of research, since they were funding many of the studies that were creating redundant data sets.
Studies sponsored by the National Science Foundation (NSF), the Association of Research Libraries (ARL), and others identified the many challenges that research organizations faced, and a movement to require data management plans as part of grant-funding requirements began to grow. Recognizing that there was no infrastructure to support data management, the NSF offered a series of grants to encourage organizations to develop and model such infrastructures. Other institutions, such as the Institute of Museum and Library Services (IMLS), also support research in this area. One such project resulted in the development of the Data Curation Profiles Toolkit (http://datacurationprofiles.org/). ARL got in on the game by creating the E-Science Institute with seed funding from 70 of its members, and has presented workshops and provided other resources on this issue.
The second half of this chapter was devoted to a description of Purdue University's efforts to create research data management services through collaboration with others across the university.
Thursday, April 17, 2014
Research Data Management, Edited by Joyce M. Ray
Joyce M. Ray. Research Data Management: Practical Strategies for Information Professionals. West Lafayette, IN: Purdue University Press, 2014. 436 pages. ISBN 9781557536648.
This isn't a full-fledged review of Research Data Management; rather, it's just a short introduction to this book. Joyce Ray, currently a visiting professor at the Berlin School of Library and Information Science at Humboldt University, has edited this volume containing nineteen essays and case studies which discuss how research data might be managed. The book is organized into six parts, addressing policy issues, planning, managing project data, archiving data in repositories, assessment, and in the sixth part, presenting four case studies. Clifford Lynch provides a concluding essay. The chapters are written by many of today's leaders and thinkers, all well-qualified to write on this topic.
Over the next week or two I will be discussing Research Data Management in this blog space. My university administration has just announced a forum on research data to be held in early May, so this book is very timely.
Wednesday, April 16, 2014
Changing Role of Senior Administrators, by Kathleen DeLong, Julie Garrison, and Marianne Ryan
Kathleen DeLong, Julie Garrison, and Marianne Ryan. Changing Role of Senior Administrators: A SPEC Kit. Washington, D.C.: Association of Research Libraries, 2012. 174 pages. ISBN 9781594078873.
In my last blog post I opened with a description of what SPEC Kits are; I will just add here that SPEC stands for "Systems and Procedures Exchange Center" and you can find more information about the entire series at http://publications.arl.org/SPEC_Kits.
Changing Role of Senior Administrators takes a look at how senior administrator positions have changed from 2007 to 2012. Only 46 member libraries responded to this survey, with a fairly low response rate of 37 percent. Their findings show that senior administrator titles are becoming more general, with less of an emphasis on specific areas of responsibility like public services or technical services. The tables on pages 18 through 29 demonstrate the changing positions and titles of senior administrators; tables on pages 35 through 51 show how those positions' responsibilities have changed and who their direct reports are.
The survey also addresses the 21st century skills that senior administrators must have, and indicates that these skills are acquired by attendance at professional institutes such as the ARL Research Library Leadership Fellows Program, the Frye Institute, or the Harvard Leadership Institute; reading professional literature, attending professional conferences, and networking. The survey also identifies desirable attributes of senior administrators, with "Changes/shapes library culture" at the top of the list, followed closely by "Functions in a political environment" and "Makes tough decisions."
Survey respondents indicate that they would be likely to redesign any senior administrator position, should a vacancy occur (79 percent), but only 45 percent expect to do so in the next one to three years. If there were a vacancy, they reported that they would expect the successful hire to come from another research library (91 percent), from within their own organization (67 percent), from another type of library (28 percent), or outside the library profession (19 percent).
This is an interesting snapshot of the current state of research library administrations and their viewpoints on how their administrations will change over the next few years.
Tuesday, April 15, 2014
Library Contribution to Accreditation, by Holly Mercer and Michael Maciel
Holly Mercer and Michael Maciel. Library Contribution to Accreditation: A SPEC Kit. Washington, D.C.: Association of Research Libraries, 2012. 184 pages. ISBN 9781594078859.
The Association of Research Libraries (ARL) conducts six surveys of its membership every year on topics of interest to its members. The survey results and accompanying documentation are published as a monograph, and are acquired by member libraries as well as other academic libraries that find their findings useful. Most SPEC kits include an executive summary, the survey questions and responses, a list of responding institutions, documentation supplied by the responding libraries, and a brief list of resources.
Library Contribution to Accreditation addresses library involvement in regional and programmatic accrediting activities. Response to this survey was on the low side, with 41 of the 115 academic ARL libraries responding. Ninety-five percent of respondents reported that they had been involved in accreditation activities within the last five years. There are six regional accrediting agencies in the U.S.; Canadian accreditation activities are conducted on the provincial level. I was surprised at the number and variety of programmatic accreditation agencies; there were 146 listed in the responses.
The types of data that responding libraries supplied for accreditation were collection holdings, facilities & equipment, financial data, instruction sessions, collections usage, staff qualifications & expertise, reference transactions, ILL transactions, digital projects & usage, scholarly communications activities, and "other data."
One of the survey questions asked respondents to describe what recommendations the accrediting agency had made to the library. In most cases, they had no recommendations, or simply stated that the library was meeting its goals and should continue doing what it's doing.
My interest in this topic came from an Association for Library Collections & Technical Services webinar that I gave on November 20, 2013, on "Assessment Strategies for Cataloging Managers." (This webinar will be available free on or soon after May 20, 2014 at: http://www.ala.org/alcts/confevents/upcoming/webinar/112013.) One bit of feedback that I received included this statement: "The content of the material covered was specifically for internal improvement within cataloging/technical services divisions, and not to meet the external requirements of things like accreditation, which requires measurable ways to assess how cataloging contributes to the educational goals of the college." Although the webinar was not intended to address cataloging's contributions to accreditation requirements, this made me curious about whether there are such expectations in our accreditation reports. According to Library Contribution to Accreditation, it doesn't appear that libraries are expected to report on cataloging data or outcomes in their accreditation reports, but I would like to look into this more closely.
Friday, March 21, 2014
Social Media Use in Libraries
I just reviewed Marketing with Social Media (A LITA Guide), edited by Beth C. Thomsett-Scott. It was published by ALA TechSource in 2014, and costs $60.90 on Amazon (ISBN 9781555709723).
I haven't been heavily involved in the use of social media in libraries, but this book makes a good case for how and why it can be successful. It's clear that whether you use blogs, Twitter, Facebook, or another platform for your marketing efforts, it takes a lot of time to do it thoughtfully. In preparation for writing my review, I used the subject headings in the LC CIP data block to find other books on the same topic. The top two subject headings were Libraries--Marketing and Online social networks--Library applications. I used the latter to search our own catalog, Minerva, and found many other books on the topic:
- Laura Solomon. The librarian's nitty-gritty guide to social media. Chicago: ALA Editions, 2013.
- Charles Harmon, Michael Messina, editors. Using social media in libraries: best practices. Lanham: The Scarecrow Press, 2013.
- Terry Ballard. Google this!: Putting Google and other social media sites to work for your library. Oxford: Chandos, 2012.
- Melissa A. Purcell. The networked library: A guide for the educational use of social networking sites. Santa Barbara, CA: Linworth/ABC-CLIO, 2012.
- Sarah K. Steiner. Strategic planning for social media in libraries. Chicago: ALA TechSource, 2012.
- Troy A. Swanson. Managing social media in libraries: Finding collaboration, coordination and focus. Oxford: Chandos, 2012.
- Laura Solomon. Doing social media so it matters: A librarian's guide. Chicago: ALA, 2011.
- Cliff Landis. A social networking primer for librarians. New York: Neal-Schuman, 2010.
I did a quick search on WorldCat, and found several similar titles published in the past few years:
- Walt Crawford. Successful social networking in public libraries. Chicago: ALA Editions, 2014.
- Denise A. Garofalo. Building communities: Social networking for academic libraries. Oxford: Chandos, 2013.
- Joe Murphy. Location-aware services and QR codes for libraries. Chicago: ALA TechSource, 2012.
These twelve books (counting the one I just reviewed) were all published in the last four years, and I'm sure there are many more on the same topic that we don't have in our collection. I find it interesting that there are three published by ALA Editions, three by ALA TechSource, three by Chandos, and one each by Scarecrow, Linworth/ABC-CLIO, and Neal-Schuman (now part of ALA Editions). Obviously, a hot topic!
Friday, March 14, 2014
Research Data Management
I recently reviewed a book for Catholic Library World called Delivering Research Data Management Services: Fundamentals of Good Practice. It was published in 2014 by Facet, and was edited by Graham Pryor, Sarah Jones, and Angus Whyte. (242 pages; available from Amazon for $94.95; ISBN 9780856049337). My review won't be published until June, and I don't want to repeat what's in the review, so I will just say that I found this book to be interesting and informative. It consists of five chapters addressing practical considerations for organizations that want to provide research data services, and five case studies of services provided by Johns Hopkins University, University of Southampton (UK), Monash University (Australia), the UK Data Service, and the Jisc Managing Research Data program.
Graham Pryor had previously edited a collection called Managing Research Data (Facet, 2012. 239 pages, ISBN 9781856047562.) It is more of an introduction to this topic, and addresses why research data needs to be managed, the lifecycle of data management, research data policies, planning, roles and responsibilities, and more. I've added this book to my reading list as well as another that I think will also be useful:
Joyce M. Ray. Research Data Management: Practical Strategies for Information Professionals. Purdue University Press, 2014. 300 pages. ISBN 9781557536648.
I'll be looking into the journal literature a little later.
Saturday, February 15, 2014
Joint Libraries: Models That Work, by Claire B. Gunnels, Susan E. Green, and Patricia M. Butler
Joint Libraries: Models That Work. By Claire B. Gunnels, Susan E. Green, and Patricia M. Butler. American Library Association, 2012, 220 pp., ISBN 978-0-8389-1138-9 (paper).
Collaborations and partnerships are ways that libraries have traditionally dealt with challenging economic circumstances. In Joint Libraries: Models That Work, authors Gunnels, Green, and Butler discuss joint libraries, a collaborative model that has worked for many communities. They define joint libraries as "collaborations between different types of libraries: public libraries and schools; universities and public libraries; community college and public; city and county, and more" (p. 2). With many years of experience working in joint libraries, the authors are uniquely positioned to write about them in this book.
Organized into ten chapters, this book provides a thorough introduction to the issues that need to be explored when considering or planning a joint library project. After an introduction to and a history of joint libraries, the cultural differences between academic and public libraries are addressed. The authors delve into management and human resource issues, legal considerations, and technological challenges. Of particular interest are the many case studies presented in Chapter 9. The varied configurations of joint libraries described there will provide readers who are considering a joint library approach with many models to explore.
This is a well-written book that explores a topic that has not been widely covered. It includes an index and bibliography, as well as two appendices that present actual joint library agreements, in both cases between a college and a public library. It would be useful for professional libraries where there is interest in the joint library model.
Previously published: Mugridge, R. L. (2012). [Review of the book Joint Libraries: Models That Work, by Claire B. Gunnels, Susan E. Green, & Patricia M. Butler]. Catholic Library World, 83(2), 146.
Friday, February 14, 2014
Staff Development: A Practical Guide, edited by Andrea W. Stewart, Carlette Washington-Hoagland, and Carol T. Zsulya
Staff Development: A Practical Guide, 4th ed. Edited by Andrea Wigbels Stewart, Carlette Washington-Hoagland, and Carol T. Zsulya. ALA Editions, 2013, 219 pp., ISBN 978-0-8389-1149-5 (paper).
Faced with frequent and rapid change in the workplace, library employees must continue to adapt and grow professionally. In this excellent fourth edition of Staff Development, editors Stewart, Washington-Hoagland, and Zsulya have brought together sixteen papers from leaders in the library field that provide the reader with the tools needed to implement a staff development program. In the introduction, readers are encouraged to think of staff development as incremental change rather than large, sweeping change. Staff members develop most effectively over time, by making small adjustments in their behavior or learning new skills.
Chapters are arranged in a logical order, in four parts. The first section starts with an introduction to the concept of staff development and is followed by chapters on conducting a needs assessment, setting goals, and the questions to be considered when starting a staff development program. The second section addresses the development process itself, with chapters on developing orientation programs for new staff, achieving consensus on core competencies for library employees, coaching staff, cross-training, customer service, leadership, and succession planning. The third collection of chapters addresses practical considerations, such as how to plan and implement a staff development program, the application of instructional design concepts to staff training, and the use of online videos to support training. Finally, the book concludes with a thoughtful discussion on assessment.
Overall, Staff Development is a well-written and thoughtful collection of chapters that address all aspects of staff development in libraries. It includes references at the end of each chapter as well as an index. It would be an excellent addition to any professional library.
Previously published: Mugridge, Rebecca L. Review of the book Staff Development: A Practical Guide, 4th ed. Edited by Andrea Wigbels Stewart, Carlette Washington-Hoagland, and Carol T. Zsulya. Catholic Library World 84:1 (2013): 60.