HOW TO BRING AI INTO YOUR LIBRARY
Many top scientists and futurists, such as Ray Kurzweil, have contended that a human-AI hybrid allowing humans to tap into the internet with their brains will one day be commonplace. Maybe they're right. AI is already becoming a key part of our lives, both in growing fields (such as electronic medical reference and autonomous vehicles) and in tools that are already here (such as shopping algorithms and chatbots). How can we make these work for libraries?
At the last Computers in Libraries conference, there was a lot of talk about robots, algorithms, and AI. We discussed the philosophy, ethics, and future challenges of these emerging technologies, all lofty and wonderful ideas. However, we never got around to discussing practical implications, and that gap led me down the path I am on today.
AI, robots, automation--the terms sound far-off and futuristic or, at the very least, daunting. But let's talk about what we can do to implement the seeds of these emerging technologies today. There are many ways to automate in libraries, including self-checkouts and digitization efforts, which are great if your library can afford them. But my library, similar to many others, tends to shy away from big investments and pushes us to find creative solutions. For the purposes of this article, I want to focus on a cheap solution for freeing up time by eliminating the need for librarians to field common questions in person: a centralized AI.
A centralized AI is one that lives in the cloud and can be accessed through myriad devices, apps, and programs. Since AIs excel at repetitive tasks, we can have them sub in to answer common patron questions such as "Where is this book?" "When is this event?" and "What are your hours?" Once the AI is in place, it can work across several interfaces: on your website, through social media, or even in person with a smart speaker. There is a plethora of programs you can use, but I used Google's Dialogflow, and that's what I will be talking about today.
Dialogflow, formerly API.AI and, before that, Speaktoit, is a human-computer interaction platform operated by Google that allows you to create your own AI infrastructure, which you can then use to interact with patrons through Google Assistant, Facebook, Twitter, or any of its other partner services. The coding behind it, which primarily uses JavaScript and JSON, opens up a variety of possibilities if you are a dedicated code ninja, but it can get complicated. Luckily, there are many features that are much easier to work with. The three basic building blocks I am concerned with are intents, entities, and knowledgebases.
Intents are simple if/then scripts. If you say one thing to the AI, then it pops out the preset answer. If you say, "Hello," then the AI says, "Hello, I'm Such-and-Such Library's AI, nice to meet you. How can I help you?" In Dialogflow, the question is called a training phrase, and the answer is a response. The system likes for you to input many different training phrases for each intent; this teaches the AI to better understand your patrons who might ask questions in different ways. For example, for a question about when the library is open, you might want to put in other training phrases such as "When does the library close?" "What are the library's hours?" and "How late are you open today?"
Likewise, AI experts recommend that your responses be varied as well. This creates the illusion of a more natural conversation. For instance, for the aforementioned question about hours, your various responses could be, "The library is open 9-to-5," "The library's hours are 9-to-5," and "We open at 9 and close at 5." You can easily program different types of answers for different apps and programs by choosing the appropriate platform from the tabs above the response field. You might want to show a visual aid (such as a table of hours or a map of the library) on Facebook or Twitter, which use a screen--but not on a Google smart speaker or Alexa, which operate mostly through audio. But that's all just icing on the cake; at their core, you use intents for questions that have concrete, simple answers. For slightly more complex questions, you may need to use entities.
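To make the idea concrete, here is a minimal sketch of an intent as plain data. The field names are illustrative, not Dialogflow's exact export schema; the point is simply that one intent pairs many training phrases with a pool of varied responses, and picking a response at random approximates Dialogflow's response variation.

```python
import random

# An "intent" sketched as a dictionary: several ways of asking one
# question, mapped to several interchangeable answers.
hours_intent = {
    "name": "library.hours",
    "training_phrases": [
        "When is the library open?",
        "When does the library close?",
        "What are the library's hours?",
        "How late are you open today?",
    ],
    "responses": [
        "The library is open 9-to-5.",
        "The library's hours are 9-to-5.",
        "We open at 9 and close at 5.",
    ],
}

# Vary the answer to create the illusion of natural conversation.
print(random.choice(hours_intent["responses"]))
```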
Entities are like thesaurus entries; they allow you to associate similar words with one another, making it easier for the AI to understand questions. There are many variations of the question, "Where is the bathroom?" Some patrons might not use the word "bathroom." They might say restroom, toilet, or lavatory. In the entity framework, you put the term (bathroom) in the first column and the synonyms (restroom, toilet, lavatory) in the remaining columns. Any mention of a synonym in an intent will automatically point the AI to the shared term.
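The lookup behind an entity can be sketched in a few lines of Python. The shared term acts as the key, the synonyms as its list, and any synonym resolves back to the shared term; the entity and its synonyms here mirror the bathroom example above.

```python
# A small sketch of entity resolution: shared term in the first
# "column," synonyms in the rest.
ENTITIES = {
    "bathroom": ["bathroom", "restroom", "toilet", "lavatory"],
}

def resolve(word):
    """Return the shared term a synonym maps to, or None if unknown."""
    for term, synonyms in ENTITIES.items():
        if word.lower() in synonyms:
            return term
    return None

print(resolve("restroom"))  # -> bathroom
```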
Another example would be grouping call numbers or subjects. Someone may ask, "Where is the 900 section?" However, it is also likely that he or she will be more specific and ask, "Where are the 970s?" or "Where is 973?" You would not want the AI to be so granular as to say, "973 is on the bottom shelf of the third to last row on the second floor." Attempting to be that precise will increase your chance for error. Instead, make 900s, 970s, 973, and all the other numbers in the 900-999 range synonyms for 900s. The AI will recognize any number in the range and return the appropriate answer--in this case, "900s are on the second floor."
Similarly, you could have a list of closely related subjects point back to a general subject. You could make it so that American History, U.S. History, the American Revolution, the Civil War, and Presidential Biographies all point to American History or its corresponding call number range, the 970s. To add all the ranges quickly, Dialogflow allows you to upload CSV files (which are basically spreadsheets exported in a computer-readable format) or copy and paste the text in CSV format into a raw editor.
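Typing a hundred synonyms by hand is tedious, so it is worth generating the CSV programmatically. The sketch below builds the 900-999 entity row described above: the shared term first, then every call number in the range plus decade labels such as "970s" as synonyms. The exact CSV shape Dialogflow's raw editor expects may differ slightly, so treat this as a starting point rather than a finished upload.

```python
import csv
import io

def range_row(label, start, end):
    """Build one entity row: shared term first, then every number in
    the range (and decade labels like "970s") as synonyms."""
    synonyms = [label]
    for n in range(start, end + 1):
        synonyms.append(str(n))
        decade = f"{n}s"
        if n % 10 == 0 and decade not in synonyms:
            synonyms.append(decade)  # "910s", "920s", ... "990s"
    return [label] + synonyms

# Write the row in CSV form, ready to paste into the raw editor.
buf = io.StringIO()
csv.writer(buf).writerow(range_row("900s", 900, 999))
print(buf.getvalue()[:60])
```

Repeating the call for each hundreds range (000s, 100s, and so on) produces the whole Dewey mapping in seconds.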
Knowledgebases are the last building block you will need to know about. The knowledgebase portion of Dialogflow is still in beta, but it is worth mentioning because it works very well with FAQ pages, which have clearly stipulated questions and answers. Knowledgebases are files that Dialogflow puts together for you from simple inputs such as HTML or CSV. If your library has an FAQ webpage or a similar file, try running it through the knowledgebase module before you even start creating intents and entities. Even if it does not work perfectly, a converted FAQ could at least save you a lot of time.
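If your FAQ lives somewhere other than a clean webpage, a short script can reshape it into a two-column CSV of question-and-answer pairs, which is the kind of simple input the knowledgebase module is built to ingest. The questions and answers below are made up for illustration; check Dialogflow's current documentation for the exact format it expects.

```python
import csv

# Illustrative FAQ pairs; in practice, pull these from your
# library's existing FAQ page or document.
faq = [
    ("What are the library's hours?", "We are open 9-to-5."),
    ("How do I renew a book?", "Bring it to the desk or call us."),
]

# One question and one answer per row.
with open("faq.csv", "w", newline="") as f:
    csv.writer(f).writerows(faq)
```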
Now that you know the three basic foundational pieces, what are some tips and tricks for libraries? A cornerstone of my project was making the school's catalog searchable through the AI. Some catalogs may integrate with Dialogflow through advanced coding. However, our catalog is an older model without that option, so I needed to find a workaround.
My catalog query works through entities. In my case, I needed five entities with about 10,000 records each. When a patron asks about a book's title, which will be in one of my entities' synonym columns, the AI converts the title to a call number, which sits in that entity's shared-term column.
What will the result be if a patron asks, "Do you have Harry Potter?" The AI will answer, "Yes. The call number for that book is FIC ROWLING." Behind the scenes, the entity (in this case, called FicBook) has the term "FIC ROWLING" in Column 1, along with all the other call numbers in my catalog. The remaining columns hold the synonyms that contain Harry Potter: Harry Potter and the Sorcerer's Stone, Sorcerer's Stone, Harry Potter and the Chamber of Secrets, Chamber of Secrets, etc. Each row holds a title from the catalog.
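Building those 10,000-row entities by hand is out of the question, so the practical step is to script the conversion from a catalog export. The sketch below is a hedged illustration: the catalog records and the rule for generating shortened title variants (everything after "and the") are my own assumptions, not the output of any particular ILS.

```python
import csv
import io

# Illustrative catalog export: (call number, title) pairs.
catalog = [
    ("FIC ROWLING", "Harry Potter and the Sorcerer's Stone"),
    ("FIC ROWLING", "Harry Potter and the Chamber of Secrets"),
    ("FIC TOLKIEN", "The Fellowship of the Ring"),
]

def title_variants(title):
    """The full title, plus a shortened form such as "Sorcerer's Stone"."""
    variants = [title]
    if " and the " in title:
        variants.append(title.split(" and the ", 1)[1])
    return variants

# One entity row per title: call number (shared term) first,
# then the title and its variants as synonyms.
buf = io.StringIO()
writer = csv.writer(buf)
for call_number, title in catalog:
    writer.writerow([call_number] + title_variants(title))
print(buf.getvalue())
```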
For this intent, you would set the training phrase to "Do you have Harry Potter?" Dialogflow should recognize that "Harry Potter" is part of an entity, but if it doesn't, you can highlight the term and choose the appropriate entity (in my case, FicBook). The response output is "Yes. The call number for that book is $FicBook." Now, the intent will search the whole list in that entity before returning a match. I repeated the process to create a similar intent for each of the remaining entities.
You can also set follow-up intents and fallback intents. Follow-ups are intents linked to other intents that allow you to better simulate conversation. Instead of listening for specific questions, they listen for whether the user answered "yes" or "no" to the last question. For example, you might want your AI to give recommendations. At the end of a recommendation, the AI asks, "Would you like to know where that book is?" If the user answers "yes," then the follow-up intent gives the call number, while a "no" initiates a follow-up intent that gives a new recommendation.
Even with all of your follow-ups in place, the best laid plans can still go awry. That's what fallbacks are for. If the AI can't find an appropriate response to a question, then it will use the fallback. Make sure your fallback intent redirects the patron to an in-person librarian as a fail-safe. Dialogflow automatically creates a fallback, but the default only says, "Sorry, I don't know how to answer that." You will need to customize it to share librarian contact information; change the fallback response to something like, "Sorry, I don't think I can help you with that, but you could ask a librarian. Here is the phone number and email for the Reference Desk."
In conclusion, our centralized AI, which uses all the tools I've mentioned, is still learning--but the future is bright. Dialogflow tracks analytics, and we can listen for questions or phrases that we didn't think to add to the script and continuously build it out. The more data the AI takes in, the more accurate its answers can become.
All in all, we have received a positive response and hope to continue improving our AI. Right now, AI still needs live librarian backup, but programming it and implementing it at your library are much less daunting than they sound. You can use Dialogflow, or a similar service, to create a script for a central AI, so the AI can answer simple questions for you through social media, web chats, and smart speakers. Hopefully, this article has dispelled some of your fears, and you will try to make your own AI soon. Your library's AI won't be your new computer overlord yet, but incorporating it into your library has never been easier.
Daniel Geary is the emerging technologies librarian in the Bunn Library at The Lawrenceville School in Lawrenceville, N.J. If you would like to see the AI he is working on, you can check it out at tinyurl.com/BunnLibAI.
ABOUT THE LAWRENCEVILLE SCHOOL
The Lawrenceville School, founded in 1810, is one of the oldest college preparatory boarding schools in the U.S. With approximately 800 students and 110 classroom teachers, it boasts a student-teacher ratio of 8:1. The school's mission is "to inspire the best in each to seek the best for all." Among the school's objectives are to upgrade and modernize the Bunn Library as well as develop a series of design hub makerspaces around the campus. The work on enhancing digital access to the collection has already begun, as this article shows.
Caption: The library's centralized AI can even answer questions over a smart speaker.
Caption: Dialogflow's website
Caption: Shown here, functioning as a chatbot, the Bunn Library's AI directs users to the location of specific resources.
Caption: The library uses AI to help students search the collection.