
Chapter 5. Conclusion

Jason Griffey


Chapter 5 of Library Technology Reports (vol. 55, no. 1), "Conclusion"

The incredible pace of Moore's Law has brought the cost of computing power down to the point where technologists at even small organizations can afford to run machine learning systems locally.1 From open-source frameworks like TensorFlow, Keras, or Theano running on local hardware such as high-end GPUs, all the way down to $100 neural net engines like Intel's Movidius Neural Compute Stick, which lets pretrained neural nets run almost anywhere, programmers interested in experimenting with AI have an enormous wealth of options. It's even easier if you're running something that doesn't require local processing power, since every major cloud provider offers some way to run machine learning systems in the cloud: Amazon has Machine Learning on AWS, Microsoft has Azure Machine Learning Studio, Google has Cloud AI, and IBM has Watson Machine Learning. Even your phone has chips dedicated to AI processing; the newest iPhones include an Apple-designed eight-core neural chip just for doing AI work for apps and iOS.
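To give a sense of how little code a local experiment now requires, here is a minimal sketch using Keras, the high-level API bundled with TensorFlow. The dataset, layer sizes, and training settings are illustrative assumptions, not anything prescribed by this report:

```python
# Minimal sketch of a local machine-learning experiment with Keras.
# The random dataset and tiny network are placeholder assumptions
# chosen only to show the shape of the workflow.
import numpy as np
from tensorflow import keras

# Fake dataset: 100 samples, 8 features each, binary labels.
x = np.random.rand(100, 8).astype("float32")
y = np.random.randint(0, 2, size=(100,))

# A two-layer classifier: one hidden layer, one sigmoid output.
model = keras.Sequential([
    keras.Input(shape=(8,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# Training runs entirely on local hardware (CPU or GPU).
model.fit(x, y, epochs=3, verbose=0)

# Predict on one sample: output shape is (1, 1), a single probability.
print(model.predict(x[:1], verbose=0).shape)
```

The same model definition could be handed to any of the cloud services named above for training at larger scale; the point of the sketch is only that nothing beyond an ordinary laptop is required to start.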

Notes

1. Wikipedia, s.v. “Moore’s law,” last updated October 6, 2018, 05:25,
2. Chris Bourg, “What Happens to Libraries and Librarians When Machines Can Read All the Books?” Feral Librarian (blog), March 16, 2017,
3. Andres Guadamuz, “Artificial Intelligence and Copyright,” WIPO Magazine, no. 5/2017 (October 2017),
4. Naomi Rea, “Why One Collector Bought a Work of Art Made by Artificial Intelligence—and Is Open to Acquiring More,” Artnet News, April 3, 2018,



Published by ALA TechSource, an imprint of the American Library Association.