Neutrality and Algorithms in Libraries

T. J. Lamanna


Safiya Noble’s 2018 book Algorithms of Oppression has become an extremely popular read in our field as of late. While the book highlights important information about how our digital architecture de facto marginalizes people, it offers few remedies beyond expressing concern about how humans build algorithms and thereby shape how they work. The book argues that we must admit our algorithms are human-generated, but does little to explain how the situation can be remedied beyond “fixing the algorithms.” Algorithms cannot be neutral, nor should they be; they are created by people and thus inherit the biases, conscious or unconscious, of their creators. No human has the capacity to be unbiased, so no algorithm can be. Even if a truly neutral algorithm existed, left static it could easily be gamed by malicious actors trying to skew results. Algorithms need to be constantly worked and massaged to ensure they behave in a positive and progressive direction.



© 2021 OIF