Thursday, September 11, 2014

"The 'Just Do It' Approach to Customer Service Development," by Erika L. Gavillet

Erika L. Gavillet. "The 'Just Do It' Approach to Customer Service Development: A Case Study." College & Research Libraries News 72:4 (2011): 229-236.

In this short case study, author Erika Gavillet presents a new approach to surveying library patrons. She points out that academic libraries in particular are evaluating customer service in order to make improvements or offer new services to their users. To avoid "survey fatigue," her unit, the Customer Services Group (CSG) at Newcastle University Library, developed a lightweight approach to gathering user feedback: a quick survey of students built around three questions (p. 230):
  • What five activities do you do most in the library? (with a list of services with tick boxes)
  • What else do you use in the library? (with a list of services with tick boxes)
  • If money were no object, what single change would improve your library experience the most?
They gathered 1,000 responses to the survey, which identified a number of changes the students wanted: more textbooks, PCs, and study space; longer opening hours; and less noise and fewer distractions.
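The article describes the survey itself rather than how the tick-box responses were tabulated, but a rough sketch of the kind of tallying involved (using made-up service names and responses, not data from the article) might look like this:

```python
from collections import Counter

# Hypothetical tick-box responses: each entry is the set of services one
# student ticked. The service names are illustrative, not from the article.
responses = [
    {"study space", "PCs", "textbooks"},
    {"study space", "printing"},
    {"PCs", "textbooks", "group study rooms"},
]

# Count how many respondents ticked each service, then rank the services.
tallies = Counter()
for ticked in responses:
    tallies.update(ticked)

for service, count in tallies.most_common():
    print(f"{service}: {count} respondents")
```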

In response to the survey, the CSG developed a plan to implement during the May examination period. All library staff would participate in regular sweeps of each floor of the library, checking for "noise, trailing wires from laptops, litter and unattended belongings" (p. 230). The campaign was promoted as a partnership with the students to keep the library quiet and tidy. Staff conducted 558 sweeps over the following month, leaving calling cards on unattended belongings to warn of theft and asking students to be quiet and remove any trailing wires.

In a follow-up survey conducted afterwards, students reported that the campaign was worthwhile and positive, but that they still found too much litter accumulating and disliked the calling cards library staff left on their unattended belongings.

This article shows that it's possible to gather feedback and implement changes in a relatively short period of time, and both library staff and students found the activity worthwhile. One question I have is how the students' other concerns were addressed. They originally asked for more textbooks, PCs, and study space, along with longer opening hours and less noise. From the article, it seems the only major concern the library's examination-period campaign addressed was noise; how the library plans to address the other issues is not explained.

Nevertheless, what I like about this article is that it demonstrates that not every library survey or study has to be statistically significant to be useful. Much of library research is about improving services or managing better. Surveys can be used to gather feedback and collect information that helps us run the library more effectively and efficiently, providing better services to our users. In this case study, a short survey drew a large number of responses, which helped library managers develop better services during a hectic time of the year.

Wednesday, September 10, 2014

"Marketing Mindset," by Karen Wallace

Karen Wallace. "Marketing Mindset: Focusing on the Customer, from Technical Services to Circulation." Feliciter 3 (2007): 126-129.

In this short newsletter article, Karen Wallace makes the point that all library staff should think about their work in terms of the user and how the users' experience is affected by what they do. She states, "All library employees, including those who typically have little contact with external patrons, need to understand basic marketing concepts and how what they do on a day-to-day basis affects user satisfaction. These 'back-shop' folks, who may erroneously be overlooked in marketing efforts, play a key role in developing library collections and service that can meet user needs and even exceed their expectations" (p. 126).

According to Ms. Wallace, library staff need to understand their users by conducting what the business world would call "market research." Information gleaned from customer interactions is one such source, and she encourages technical services staff to participate in activities that bring them into contact with their users, including working on the reference desk and volunteering to help with library programs.

Armed with knowledge of users' needs, library staff can begin to anticipate those needs and tailor their work and services to meet them. For example, they can prioritize their work to meet user demands; enhance their cataloging by adding local subject headings; customize their OPACs to provide a better interface; and create canned searches for heavily used materials.

I agree with Ms. Wallace's points about focusing on the user. We need to look at all of our processes with the user in mind. We need to evaluate what we do based on whether we're meeting those needs, and make decisions based on what's best for them. Our goal is to provide faculty, staff, and students with the resources they need in support of the teaching and research mission of the university. Timely processing of materials is part of that equation and should be balanced with providing quality access.

Tuesday, September 9, 2014

"IT Service Climate," by Ronnie Jia and Blaize Horner Reich

Ronnie Jia and Blaize Horner Reich. "IT Service Climate: An Essential Managerial Tool to Improve Client Satisfaction with IT Service Quality." Information Systems Management 28:2 (2011): 174-179.

Authors Jia and Reich take an interesting approach to service quality assessment. Making the point that an organization's climate can affect the quality of services offered to customers, they go a step further and maintain that an organization's internal climate can predict customers' perceptions of service quality. To investigate this, they developed a 10-question instrument for assessing internal organizational climate. Their study showed that "service climate may be much more important than technical competence from the clients' point of view" (p. 174).

The authors begin by defining IT service climate as "shared perceptions of the practices and behaviors in their workplace that support the provision of IT service to business customers" (p. 174). They differentiate climate from culture, which they describe as deeper and rooted in core values and assumptions; climate is more about perceptions and is an expression of culture.

The authors identify three dimensions of IT service climate: service leadership, service vision, and service evaluation. Service leadership is about setting and communicating service goals. Service vision is about IT employees seeing themselves as having a role in meeting customer needs. Service evaluation is about evaluating employees on how well they meet their customer service goals. Of the three, the authors identify service leadership as the most important.

The authors' research consisted of asking IT staff in four organizations the 10 questions and comparing their responses to their respective customers' responses to the IT-SERVQUAL instrument. They found that, in fact, IT staff's impressions of their own climate correlated with their customers' impressions of service quality. This relationship suggests that managers who work to improve climate can, in turn, improve customer perceptions of service quality, which can be very empowering for managers looking to take action that improves the bottom line.
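The article doesn't spell out the statistic used, but a minimal sketch of the kind of comparison described (assuming hypothetical per-organization averages and a simple Pearson correlation, my assumption rather than the authors' stated method) might look like this:

```python
from statistics import correlation  # Pearson correlation, Python 3.10+

# Hypothetical per-organization averages (four organizations, as in the study).
# These numbers are illustrative, not the authors' data.
staff_climate_avg = [3.8, 4.2, 3.1, 4.5]    # mean of the 10 climate questions, per org
client_quality_avg = [3.6, 4.0, 3.0, 4.4]   # mean IT-SERVQUAL score from clients, per org

r = correlation(staff_climate_avg, client_quality_avg)
print(f"Correlation between internal climate and perceived service quality: r = {r:.2f}")
```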

The authors recommend incorporating their 10 questions into internal employee satisfaction surveys, examining the findings carefully, and deploying the survey on a regular basis.

I found this to be an interesting look at how an organization's climate can affect and predict the perceptions of customers. While my own research is less about climate and more about the services we provide to each other as internal service providers and customers, it's clear that our service vision is affected by climate. Although this article was specifically about IT services, the lessons learned could be applied to other types of services. The concepts of service leadership, service vision, and service evaluation are things we can all apply in our daily work.

Wednesday, September 3, 2014

"Employees as Customers Judging Quality," by John B. Harer

I've taken a short hiatus from writing this blog, but am getting ready to ramp it up again. My new research project is going to address the use of internal customer service assessment surveys to gauge customer satisfaction with services provided by technical services and library systems and to identify potential areas for process improvement. I'm planning to use this blog to document my literature review. The first article I'm reviewing is:

John B. Harer. "Employees as Customers Judging Quality: Enhancing Employee Assessment." New Library World 109:7/8 (2008): 307-320.

Harer takes an interesting approach to this topic. First, he makes the point that many libraries follow the business world in implementing management techniques such as Continuous Quality Improvement and Total Quality Management. These approaches focus on continually improving processes and output, and assessment is an important part of evaluating how well that is going. Harer mentions that many libraries are using tools such as the LibQUAL+ survey to assess quality, and makes the point that extending such assessment tools to library employees is a logical next step.

What's interesting to me is that instead of simply asking libraries whether they survey their employees about their views on quality, Harer looks at existing employee satisfaction survey instruments at Association of Research Libraries (ARL) member institutions and analyzes whether they ask questions that get at employees' opinions about quality. It seems like a roundabout way to get at this information. Nevertheless, Harer solicits participation in his study from the 96 U.S. ARL academic library members and receives responses from 30 of them. The responses include one employee satisfaction survey, four organizational climate surveys, eight exit interview surveys, 11 employee self-assessment evaluations, and three administrator evaluation surveys. Thirteen libraries responded that they did not have or use such instruments.

From this point forward in the article, Harer limits the discussion to the four organizational climate surveys that were submitted, and finds that only one of them specifically asked employees for their opinions regarding quality, although the two example questions it provided did address the needs of the user.

As I mentioned before, I think this is an interesting approach. I'm interested in another angle: employees' satisfaction with the services they offer each other. Obviously, the end user benefits from our output, whether that's in items processed or services offered. But our more immediate customers in libraries are each other, especially for technical services and library systems units. For example, bibliographers are customers of acquisitions departments: bibliographers submit their requests to acquisitions staff, who are then responsible for carrying them out. Circulation staff might ask for an item to be rush-processed, so they are customers of cataloging staff. Even within a single department we depend on each other. When we ask for help on a tricky problem, does it get answered in a timely fashion, or does it sit on someone's desk for a month? These are all ways in which we depend on each other. Assessing ourselves in this way can demonstrate overall satisfaction with the services we provide to each other and identify areas in which we can improve. Improving services to each other will, by extension, provide better service to our users: the faculty, staff, and students who rely on us for their teaching and research needs.

I agree that employees can provide important feedback regarding quality, but I think a more direct route is preferable. Internal customer service assessment is one more tool we can use to improve our work and services.