Poor Information Literacy Skills and Practices as Barriers to Academic Performance: A Mixed Methods Study of the University of Dar es Salaam

Tina Klomsri (tinaklomsri@gmail.com) is an M.Sc. graduate of the Department of Computer and Systems Sciences, Stockholm University; Matti Tedre (matti@dsv.su.se) is an associate professor at the Department of Computer and Systems Sciences, Stockholm University.

The authors would like to thank Elias Mturi, Pascal Kunambi, and all the students and librarians at UDSM. We are especially grateful to the Nanyang Technological University, Wee Kim Wee School of Communication and Information Literacy Cluster for providing us the questionnaire used in this study.

Information and Communication Technology (ICT) is increasingly used in Tanzanian education. Knowing how to operate ICT alone is incomplete without knowing how to use it as a tool for organization, communication, research, and problem-solving. In recognition of this challenge, information literacy (IL) has been identified as a key attribute for students as they progress through their learning paths. Using a mixed methods strategy combining questionnaires and focus group discussions, this study measured the level of IL skills among the University of Dar es Salaam's (UDSM) postgraduate students to gain insight into the students' perceptions of and experiences with information problems. A total of 102 students from four institutions answered the online questionnaire, and 22 students participated in six focus group discussions. The students' questionnaire scores were poor in the majority of IL categories, suggesting that the current IL training is ineffective in imparting IL knowledge and skills. The study ends by discussing recommendations to improve current IL practices at the university.

Information and communication technology (ICT) can increase the quality of education in both developed and developing countries if it is used efficiently.1 ICT is being increasingly used in Tanzanian education, and the latest situational analysis of ICT in Tanzanian education showed that all universities have computer labs and many have high-bandwidth Internet connections through fiber optic cable.2 Although the majority of Tanzanian university students and staff have access to ICT and the Internet, analysts have noted that the integration and exploitation of ICT in teaching and learning practices remained limited.3 The same analysis also showed that current ICT training at Tanzanian universities covered basic uses of ICT, such as basic word processing and spreadsheets, basic statistics, and simple searches in journal databases and generic search engines. ICT training rarely, if ever, covers advanced topics that truly unleash the power of the Internet and computers in learning and research, such as Web 2.0/3.0 content creation and networking environments, critical information literacy, advanced search tools and techniques, and information management. In other words, knowing how to operate ICT alone is incomplete without knowing how to use it as a tool for organization, communication, research, and problem-solving.4

The University of Dar es Salaam

The University of Dar es Salaam was established in 1961, and it is the oldest and largest public university in Tanzania. UDSM started to give information literacy courses to students in 2001, when the Internet enabled free access to subscribed resources.5 Library orientation programs are given to students at the beginning of the academic year to introduce them to the layout of the library, its collections, and its services.6 At the time of writing, the library has two computer labs and a total of 41 computers with a fiber Internet connection. There are also 20 computers with access to the Online Public Access Catalogue (OPAC) located at convenient points in the library. The library subscribes to more than 30 e-journal databases with more than 10,000 online journal titles.7

IL training was included at UDSM as a stand-alone course in which students could participate voluntarily.8 This is still the case today, as students and staff are encouraged, but not obligated, to participate in the IL training programs. A study by Lwehabura found that approximately half of the respondents were not aware of the possibility to attend IL training.9 Among the students who attended the training, a majority (53 percent) expressed that the training was not effective, mainly due to the insufficient time and resources afforded for acquiring the skills. Today, however, there are more computers available at the library, and more students have access to a personal computer.

The most common method of teaching IL at UDSM is through lectures. The nature of these lectures, however, is teacher-centered and tends not to activate students' higher-order thinking skills.10 Moreover, there is no incentive for students to attend IL training, since they do not receive any credit or grade for the effort. Consequently, there is no way to guarantee that all students will participate in and benefit from the training provided.11

Research Problem

The advancement of technology in the past decades has changed the way education is delivered at UDSM. Lectures are no longer the students' primary source of information, as computers and Internet access have changed this scene. Technology is increasingly being integrated into the curriculum to support the teaching and learning environment. To understand the educational impact of ICT in Tanzanian education, as well as to explore pedagogies that improve it, there is a need to measure and assess students' Information Literacy (IL) skills.12 A study that measures the impact of ICT in Tanzanian education is timely and important, as the country shares many similarities with other African countries. This research study provides information about IL issues in a developing country context. In addition, the findings of this study can help academic staff at other universities who struggle with similar information literacy challenges to better understand how their students might approach information problems, which in turn can facilitate the improvement of future information literacy programs.

This research study has three aims. First, this study describes the level of IL skills among UDSM's postgraduate students using questionnaires from the Nanyang Technological University (NTU), Singapore.13 Second, this study explains the results of the questionnaire survey through focus group discussions with the students. Third, this study provides recommendations on how to improve IL practices at UDSM based on the empirical data. For that purpose, three research questions were defined:

  1. What is the level of information literacy skills among postgraduate students at the University of Dar es Salaam?
  2. What are the students’ perceptions and experiences with information problems that can explain the score of the most and least successful IL skills?
  3. How can the University of Dar es Salaam work pedagogically to improve the students’ IL skills?


This study describes IL skills of postgraduate students at the University of Dar es Salaam, Tanzania. Postgraduate students were chosen because at that level of study the students are required to be information literate and to have knowledge of how to conduct research.14 The students who took part in the survey and focus group discussions were all postgraduate students from four institutions situated at the University’s main campus, including College of Social Science (COSS), UDSM Business School (UDSMBS), UDSM School of Education (UDSMSE), and College of Natural and Applied Sciences (CNAS).

Literature Review

Information Literacy

Information Literacy is an umbrella term that encompasses concepts such as digital, visual, and media literacies, academic literacy, information handling, information skills, data selection, and data management.15 Digital, visual, and media literacies are related to an individual’s ability to read, write, and otherwise deal with digital sources effectively using ICT.16

Academic literacy refers to IL within the academic context where people are expected to understand how to use resources such as online databases, OPAC, journal articles, as well as experts and authoritative bodies to obtain knowledge and achieve their academic tasks.17

Information handling, information skills, data selection, and data management are IL competencies closely related to each other. The processes concern the ways a person interacts and communicates with information. It combines the intellectual processes of information use with the physical processes of information seeking.18

IL skills have become increasingly important in the present-day environment of rapid technological change and growing amounts of information resources due to computerization. Individuals are faced with diverse and often unfiltered information choices, which raises questions about authenticity, validity, and reliability.19 As a result, IL is seen as a key quality for most people today, as they are faced with large numbers of disparate information resources, which they need to manage in an effective and efficient way.20

All around the world, higher education institutions have introduced IL programs to strengthen students' information use, but in many African countries IL interventions are yet to be considered or implemented. The barriers facing many parts of Africa, such as scarce financial, material, and human resources, force the majority of African students to pass through the university system without ever mastering the art of information retrieval and use.21 Baro conducted a survey of universities in Africa and found that only a few institutions have successfully integrated IL courses into the curriculum.22 The situation is different from the early development of European and North American programs on information literacy practices in schools and universities, which never suffered from a similar lack of resources. Librarians and teachers around the world continue to address the challenge of integrating information skills instruction into the total curriculum,23 but there is a constant need for a better understanding of the contextual elements of IL education. Despite significant progress in the past decades, more effort is needed to ensure that students are information literate. In China, for instance, the government has supported the teaching of information literacy skills in the past decade, although few students enrolled in higher education are able to participate.24 The rapid proliferation of modern ICT equipment makes it ever more important for developing countries to ensure that their citizens have access to information skills instruction.

The Big6 Information Problem-Solving Model

The Big6 process model for information problem-solving was developed by Eisenberg and Berkowitz in 1987. Since then, the Big6 approach has become one of the most renowned and adopted approaches to teaching information literacy in K–12 education all around the world.25 The Big6 process model was used as IL assessment framework in this study because it integrates the traditional information skills with the use of technology.

The Big6 framework is divided into six major stages with two sub-stages under each (see table 1).26

The 6+3 Model for IL Standards

According to Mokhtar et al., information seeking today is not simply about finding "answers" but also about finding the "opinions" of other people.27 To acknowledge this shift in information seeking, Collaborative Information Seeking (CIS) was added to the Big6 process model. In addition, Ethics and Social Responsibility and Attitudes and Perception were also added as part of the mindset. These two mindsets are necessary to ensure that students understand how to use information in an ethical and responsible way, and that the students display IL-related attitudes such as having respect for diverse opinions. The aspects of CIS and Ethics and Social Responsibility were covered in the questionnaire survey, while the aspect of Perception (excluding Attitudes) was covered in the focus group discussions. The mindset Attitudes has deliberately been excluded, as it is beyond the scope of this study. In this study, the IL competencies and mindsets are called categories or IL skills, and the two terms are used interchangeably.

Mixed Methods Research Strategy

This research study is exploratory, and its research questions required methods from both quantitative and qualitative research. Instead of following one research paradigm, the choice of a mixed methods strategy was based on what was perceived as most suitable for answering the research questions.28 A particular strength of using different methods is that it allows the findings from one method to be triangulated with the findings of another.29 The triangulation of results in this study was done, first, by employing the quantitative method (questionnaires) to get an overview of the IL skill level of the postgraduate students. Then, the qualitative method (focus groups) was implemented, with the quantitative data informing the later qualitative study of the areas where the students performed most and least successfully. The focus group discussions compensated for the small sample size of the survey and provided a richer explanation of the survey findings.

This study is unique in the sense that it provides both a measurement of the students’ IL knowledge level, as well as a picture of their perceptions and experiences with information problems. The voice of the students offers another perspective of the IL phenomenon, and their insights account for a more comprehensive picture of the IL practices at the university.


The questionnaire used in this study was built on the framework of Eisenberg and Berkowitz's Big6 approach and supplemented with Information Ethics (awareness of censorship) and Collaborative Information Seeking as suggested by the 6+3 model.30 The original questionnaire was developed by a team comprising Information Studies and Education Faculty members of NTU.31 The questionnaire was divided into two sections. The first section contained nine demographic questions about the students, such as age, gender, education background, Internet and computer access, frequency of library visits, and use of library resources. The original survey included questions about public and national libraries, which were excluded in this study to maintain the focus on the UDSM University library.

The second section of the questionnaire consisted of thirty multiple-choice questions divided into eight categories to test the students' IL skills. The eight categories were: Task Definition, Information Seeking Strategies, Location and Access, Information Use, Information Synthesis, Information Evaluation, Information Ethics, and Collaborative Information Seeking. The majority of the questions had only one correct answer. Seven of the questions, however, had more than one correct answer. Answers were given different scores according to their perceived difficulty level. The majority of questions had a full mark of two points. The maximum score that the students could get on the knowledge test was fifty points.

The questions with multiple correct answers were treated as follows: if a student chose the best answer, he/she received the full mark for that question. If the student chose the second- or third-best answer, he/she received a lower mark. For example, question 35 asked the students whom they would consult to evaluate the information they obtained critically and competently. Critical evaluation of information includes the ability to examine and compare information from various sources to determine its validity, reliability, accuracy, authority, timeliness, and point of view and bias.32 Two points were given to students who chose "expert feedback" and one point to students who chose "assessment rubric."
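As an illustration, the partial-credit scheme described above can be sketched as a simple scoring function. The answer keys and point values below are hypothetical stand-ins, not the actual NTU instrument:

```python
# Hypothetical partial-credit answer key: each question maps answer
# choices to points. For multiple-correct-answer questions, the best
# answer earns the full mark and the second-best earns a lower mark;
# any other choice (including "don't know") earns zero.
ANSWER_KEY = {
    "q35": {"expert feedback": 2, "assessment rubric": 1},  # example from the text
    "q10": {"a": 2},  # single-correct-answer question (hypothetical)
}

def score_response(responses, key=ANSWER_KEY):
    """Sum the points earned across all answered questions."""
    return sum(key[q].get(answer, 0) for q, answer in responses.items())

student = {"q35": "assessment rubric", "q10": "a"}
print(score_response(student))  # 1 + 2 = 3
```

Summing such per-question marks over all thirty questions would yield the raw knowledge-test score out of fifty points.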

In this study, convenience sampling and snowball sampling were chosen. The questionnaires were distributed between December 2014 and February 2015. A professor from UDSM helped administer the questionnaires by distributing them to other professors and librarians, who in turn handed out the survey to their students. The researcher also approached students on UDSM's main campus to ask for their participation in the survey and whether they could nominate other postgraduate students who would like to participate in the study. As this study employed non-probability sampling techniques, descriptive statistics were found most appropriate for analyzing and presenting the data in a meaningful way.

SurveyMonkey, a web-based survey solution, was used to collect and retrieve the data. Since online questionnaires were used, the results could be biased toward more affluent students with access to the Internet and personal computers, since ICT costs can be prohibitively high in Tanzania. Regarding non-response bias through refusal, a few factors could discourage students from answering the survey: (1) the perceived amount of effort needed to respond to the questionnaire; (2) the absence of a reward for participation; and (3) the perceived difficulty of the questions. Therefore, the participating students could differ from the non-participating students in terms of personal interest, ambition, and diligence.

Using a preexisting questionnaire helps fulfill the requirements of validity and reliability in questionnaire design.33 However, some of the questions were contextualized for the students of Tanzania. For example, questions related to Asian culture were changed to African culture. One question about call numbers on books was removed, as this study focused mainly on online information searching. Lastly, to prevent feelings of frustration among the respondents, the answer "don't know" was added as an alternative.

Focus Groups

The focus group discussions in this study were semistructured. At the end of each focus group discussion, a summary of the topics discussed was presented to reduce the chance of misunderstanding. Focus group discussions were chosen as a method because IL can be perceived as a complex topic; the participants could therefore get support and ideas from each other, which can further stimulate the dynamics of the discussion.

Six focus group discussions were conducted in February 2015 with students from College of Natural and Applied Sciences (CNAS), College of Social Science (COSS) and UDSM School of Education (UDSMSE). All of the discussions were held in a quiet outside study area and lasted around fifty minutes each. The discussions were recorded and field notes were taken. Each discussion started with the moderator asking the participants about their background, such as department of study, frequency and purpose of library visits, IL training, and Internet access. Then, the discussions continued with in-depth questions about the students’ perceptions and experiences with information problem-solving.

The in-depth questions were designed so the students had to explain how they solved different information problems. This design choice was made because IL skills are not isolated incidents, but they rather are “connected activities that encompass a way of thinking about and using information.”34 Also, to understand why the students performed more or less successfully in certain IL categories, it was important to consider the students’ context in the broad information landscape.35 Hence, the focus group questions contained many follow-up questions to gain a deeper understanding of the IL context in which the students operate.

The first part of the in-depth discussion contained general questions that looked at the students' information needs and how the students solved typical information problems, both in their academic studies and in their everyday life. According to SCONUL, experience and information need are two factors affecting an individual's IL skill level.36 The second part of the in-depth discussion contained questions directly related to the findings of the questionnaires. The questionnaire results showed that the students performed least successfully in the following IL areas (mean score below 50/100): Information Evaluation, Location and Access, Information Use, Information Synthesis, and Information Ethics. The number of focus group discussions was determined by data saturation; in other words, data collection ceased when new data did not provide more information related to the research questions.37

Convenience sampling was applied because of the researcher's limited time for finding suitable participants. Creswell's data analysis spiral was used to analyze the discussions. The process of data analysis is best presented as a spiral, containing data management; reading and memoing; describing, classifying, and interpreting; and representing and visualizing data.38 The coding of the data was done in the computer program Dedoose.

The moderator remained neutral during all of the discussions and encouraged each participant to deepen his/her responses. Follow-up questions were asked so the participants could explain their perceptions and thoughts. Minimal feedback was given. To fulfill the reliability requirement, the topics of discussion were developed iteratively, and care was taken that no leading or obscure questions were included.

Research Ethics

Participation in this study was confidential, voluntary, and based on informed consent, which was obtained in writing from all participants. The questionnaire and discussion data were kept confidential. The researcher operated in a transparent manner by detailing the aim of the study and the requirements for participation.


Demographic profile of survey respondents

Students from four institutions responded to the survey: 21 percent of the respondents were students at UDSMSE, 23 percent at CNAS, 24 percent at UDSMBS, and 34 percent at COSS. In total, 156 students responded to the survey, but only 102 of the responses were complete. The average time for the respondents to complete the survey was thirty-seven minutes. Among the 102 respondents, the majority were born between 1974 and 1986 (62 percent), and 68 percent were male while 32 percent were female. One female respondent did not reveal her birth year.

For 90 percent of the respondents, it was their first or second year at the postgraduate level at UDSM. Only 10 percent of the respondents had studied three years or longer at UDSM. This was confirmed during the focus group discussions, as the majority of students explained that they did their undergraduate studies at another university.

Computer Skills and Internet Access

Almost all of the respondents owned a personal computer (95 percent), and the majority of the respondents had Internet access at their place of residence (76 percent). The focus group discussions revealed that the students accessed the Internet mainly through their smartphones. For the majority of students, entering university meant ownership of a first personal computer, as one of the students explained: "Most of us Tanzanians get access to laptops when we start at university level. You may know that it is some sort of prestige to enter a university. They [extended family] send you off with a laptop and stuff like that" (Participant G, Focus group 2). Before university, the students said, computers and the Internet were not available in school. When asked how the respondents learned computer skills, the focus group discussions showed that some students took an introductory course in basic computer skills. The students who did not attend such courses learned through practice and their peers.

Library Resource Usage Training and Library Visits

The majority of the respondents (90 percent) had received library resource usage training or training related to IL. Most of the respondents received their training at UDSM (59 percent). Some of the students without IL training explained, during the focus group discussions, that they were unaware of the existence of IL training. Other students, however, knew about the IL training but were not able to participate since they had another lecture that they needed to attend. All of the students who did not participate in the IL training thought that the training would be beneficial and said that they would attend the training if given the opportunity. Another interesting finding from the discussions was that some of the students decided not to participate in the IL training at the postgraduate level because they thought it unnecessary, as they had already taken the training at the undergraduate level.

The survey results show that the frequency of the students’ library resources usage in the past 12 months was slightly lower than the frequency of library visits. The discussions revealed that this was because the students visited the library only to study, without consulting the library’s resources. Many students thought the books in the library were outdated, so they would only visit the library to access wireless Internet and read the newspaper. “When we go to the library we can only access old books, but with Internet we get the current information or whatever you need. Sometimes, there is no need of going there. If you need any information, you can access through Internet” (Participant I, Focus group 2).

Information Literacy Test Results

All of the aspects of Eisenberg and Berkowitz's Big6, along with Mokhtar et al.'s new dimensions of Ethics and Social Responsibility and Collaborative Information Seeking, were tested using multiple-choice questions.39 The scores were normalized to 100 percent for each category and for the whole instrument. All of the scores were rounded to two decimals. Figure 1 shows the spread of the standardized percentage scores among the study population. The majority of respondents (68 percent) scored between 34 and 57 out of 100. The scores are low at the overall study-population level compared to the study of Foo et al., with a mean score of 45.59/100. As can be seen in table 2, Task Definition was the best performing area (62.87/100), while Information Ethics was the poorest performing area, with the alarmingly low mean percentage score of 18.63/100, compared with 73.60/100 in Foo et al.'s study of tertiary students in Singapore.
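The normalization step described above, rescaling raw category scores to a 0–100 range and rounding to two decimals, can be sketched as follows. The category maximum marks used here are illustrative assumptions, not the instrument's actual values:

```python
def normalize(raw_score, max_score):
    """Rescale a raw category score to a 0-100 percentage, rounded to two decimals."""
    return round(100 * raw_score / max_score, 2)

# Illustrative maximum marks per category (hypothetical, not the actual instrument)
category_max = {"Task Definition": 8, "Information Ethics": 4}
raw = {"Task Definition": 5, "Information Ethics": 1}

# Standardized percentage score per category
standardized = {cat: normalize(raw[cat], category_max[cat]) for cat in raw}
print(standardized)  # {'Task Definition': 62.5, 'Information Ethics': 25.0}
```

Standardizing each category to the same 0–100 scale is what allows categories with different numbers of questions and point values to be compared directly, as in tables 2–4.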

Table 3 lists the standardized mean score for each testing area of IL skills. The respondents scored over 70/100 on questions about search type, plagiarism, and evaluating information content. However, the respondents seemed to lack an understanding of how to differentiate fact from view and opinion, of censorship, and of citation styles. On these three questions, the respondents scored lower than 20/100, which suggests a serious issue related to these IL skills.

Male students were found to score higher than female students (46.57 vs. 43.55). Surprisingly, students who had not received IL-related training scored higher than students who had received training previously. The lowest standardized mean score was attributed to students who received IL training at UDSM (see figure 2).

As can be seen in table 4, students with no IL-related training scored higher in almost all IL categories except Information Synthesis and Information Ethics. The biggest difference between the scores can be found in CIS, where the students with no training outperformed their peers with training (48.91 vs. 60.00). This finding, however, should be viewed with caution, as only ten students did not participate in any IL-related training. Students in years 1 and 2 at UDSM scored higher than students who had studied 3 years or longer at UDSM (45.77 vs. 43.90) (see figure 3). This finding, too, should be viewed with caution, as only ten respondents had studied three years or longer at UDSM.

The results showed that the respondents with Internet access at their place of residence performed better than the respondents without (46.36 vs. 43.08). Having Internet access at the place of residence may facilitate the practice of information problem-solving. Figure 4 shows the distribution of standardized mean percentage scores across the four institutions. UDSMSE scored the highest, 49.48/100. The lowest mean percentage score, 41.09/100, was recorded at CNAS.

The final question of the survey asked the respondents whether they would consult several potential human information sources when completing the information tasks covered by the Big6 model. For defining the research topic and scope; organizing, compiling, finalizing, and presenting the answer to the research topic; and evaluating the completed product and process of information seeking, more respondents would consult professors, followed by peers (classmates). Evidently, collaborating with peers to solve information problems made up a considerable part of the respondents' academic life. Professors were also the first to be consulted for the task of formulating search strategies and statements and retrieving information, followed by librarians. For the task of identifying sources of relevant information, librarians were the first to be consulted, followed by professors. Lastly, more respondents tended to consult their peers (classmates), followed by professors, for the task of analyzing the quality of retrieved information and selecting relevant information for use. This is a matter of concern, since the respondents' mean scores were below 60/100 in the fields of Information Seeking Strategies and Information Use. Another concern was that more than 10 percent of the respondents did not consult anyone when formulating search strategies and statements and retrieving information, or when evaluating the completed product and process of information seeking, even though their mean scores in many of these fields were below 60/100.

Most Successful IL Skills

Task Definition

Task Definition was the respondents’ best performing IL category (62.87). Task Definition concerns the ability to recognize and define an information problem (including research topics and questions) and to identify the types and amount of information needed.40 The students explained that they practiced information searching every day and that almost all of their academic material was retrieved from the Internet, predominantly using Google as their search engine. Focus group discussions revealed that all of the students had conducted research with primary or secondary data at least once during their undergraduate studies. This may explain the respondents’ higher mean score in defining tasks and defining research topics and questions. Even though most of the students had conducted research before, they still thought more research training was needed.

An encouraging finding from the discussions was the students’ critical awareness of using Wikipedia as a source of information: “Of course there are some sites that you don’t trust as much, for example Wikipedia. You can read from Wikipedia, but then you should go to another source and see if it correlates or not” (Participant J, Focus group 3). Understanding how to use general online information such as Wikipedia is an important part of Task Definition.41

Information Seeking Strategies

Information Seeking Strategies was the respondents’ second best performing category (54.04). Information Seeking Strategies concerns the ability to consider all the information sources and to evaluate the sources to determine priorities.42 The survey data showed that the majority of respondents had a good understanding of choosing whom to consult on academic matters. During the discussions, almost all of the students explained that they consulted their supervisor or professor, followed by peers, when conducting research.

Since the students had conducted research on primary or secondary data, it was expected that they would perform well on the survey questions asking them to identify primary and secondary information sources and to evaluate the most appropriate sources of information. During the focus group discussions, however, only two students mentioned peer-reviewed material from scholarly journals as trustworthy for academic use. Most of the students had learned source evaluation skills from teachers at the university but expressed that more training was desired, preferably in proximity to thesis writing. A surprising number of students mentioned PDF documents as trustworthy. As one student explained: “First you enter your words and then PDF. Then you click Search. The information that appears there is trusted” (Participant R, Focus group 5). Also surprising was the number of students who claimed they did not evaluate digital information.

Collaborative Information Seeking

Collaborative Information Seeking was the respondents’ third best performing category (50.00). Although the quantitative results showed that the respondents primarily consulted professors when completing most of the information tasks, the discussions revealed that this was not always possible. Many students admitted to consulting their peers more than their supervisor: “The teacher told us that we could come and consult at any time you want, but most of the time they are not available” (Participant J, Focus group 3). Another student added, “We feel bad about it, because of course we need the help. But I think we are used to the situation now. We solve it using friends” (Participant I, Focus group 3).

Least Successful IL Skills

Information Evaluation

Information Evaluation was one of the less successful IL categories (47.06). Information Evaluation concerns the process of evaluating one’s information problem-solving process. The survey results indicated that the respondents had a thorough understanding of plagiarism, as it was one of their best performing IL areas. This did not, however, prevent the students from practicing plagiarism. One of the female students explained why she thought students plagiarized: “The teachers don’t even read our reports. They just look if it is attractive and if it is big” (Participant J, Focus group 3).

Merely 36 percent of the respondents chose expert feedback (the best answer) and 4 percent chose assessment rubrics (the second best answer) to help them evaluate information critically. This is consistent with the results of survey question forty, where the majority of respondents selected ‘peers’ to assist them in analyzing the quality of retrieved information and selecting relevant information for use. Regarding the copyright question, the focus group discussions showed that most students equated copyright with ownership of information, which can be given away if the owner decides to do so. This explains the mean score of 23.53.

Location and Access

The standardized mean score for Location and Access was 44.45. Location and Access concerns the ability to locate and efficiently use information resources.43 The best performing IL skill areas within Location and Access, with mean scores above 50, were type of search, knowledge of the library’s e-resources, Boolean operators, phrase search, and roles of reference librarians. Since the students practiced information searching daily, it was not surprising that they received better scores on the questions related to Internet search. The better scores on knowledge of the library’s e-resources and roles of reference librarians can be explained by the fact that a vast majority (90 percent) of the respondents had taken library resource usage training.

Many students explained that they used phrase search, quotes, or reading the results one by one to exclude irrelevant results when performing online searches. Some of the students, however, said that they had no strategy for narrowing down search results, and none of the students understood the logic behind Boolean operators (even though many of them used quotes). Yet the respondents performed well on one of the survey questions related to using Boolean operators. This was because the question resembled how the students normally narrow down search results, namely by including relevant phrases and quotes. On the other hand, the respondents performed less well on the question of how to broaden searches using Boolean operators. This indicates that the students lacked a proper understanding of how Boolean operators function.
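The distinction the students missed, that AND narrows a result set while OR broadens it, can be sketched with simple set logic. The toy inverted index below is invented purely for illustration and does not represent any UDSM database or search engine:

```python
# Toy inverted index: term -> set of document IDs containing it.
# The terms and document IDs are hypothetical, for illustration only;
# real search engines apply the same Boolean set logic at index scale.
index = {
    "literacy": {1, 2, 3, 5},
    "tanzania": {2, 5, 6},
    "digital":  {3, 4},
}

def search_and(*terms):
    """AND narrows: only documents containing every term."""
    results = [index.get(t, set()) for t in terms]
    return set.intersection(*results) if results else set()

def search_or(*terms):
    """OR broadens: documents containing any of the terms."""
    return set().union(*(index.get(t, set()) for t in terms))

def search_not(term, excluded):
    """NOT narrows: documents with `term` but without `excluded`."""
    return index.get(term, set()) - index.get(excluded, set())

print(search_and("literacy", "tanzania"))  # {2, 5} - fewer hits
print(search_or("literacy", "tanzania"))   # {1, 2, 3, 5, 6} - more hits
print(search_not("literacy", "digital"))   # {1, 2, 5}
```

The sketch makes the survey finding concrete: the students’ habit of adding phrases and quotes corresponds only to the narrowing (AND) case, which is why the broadening (OR) question proved harder.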

None of the students suggested the use of truncation when discussing their search strategies, which explains the low mean score on the question about truncation. Only one student mentioned that he used the university library’s OPAC to look up materials. The low use of the OPAC is most likely the reason behind the low mean score on that particular question.
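Truncation, the technique the students did not know, retrieves every term sharing a common stem. A minimal sketch, with a hypothetical vocabulary invented for illustration:

```python
# Minimal sketch of truncation: a trailing-* pattern such as "librar*"
# matches every indexed term that shares the prefix. The vocabulary
# is invented for illustration, not drawn from any real catalog.
vocabulary = ["library", "libraries", "librarian", "literacy", "internet"]

def truncation_search(pattern, terms):
    """Expand a trailing-* pattern to all matching terms;
    without the wildcard, require an exact match."""
    if pattern.endswith("*"):
        prefix = pattern[:-1]
        return [t for t in terms if t.startswith(prefix)]
    return [t for t in terms if t == pattern]

print(truncation_search("librar*", vocabulary))
# ['library', 'libraries', 'librarian']
print(truncation_search("literacy", vocabulary))
# ['literacy']
```

A single truncated query thus covers word variants (singular, plural, derived forms) that would otherwise require several separate searches.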

During the focus group discussions, some students mentioned problems with comprehending the English language as a barrier to accessing information and assessing its credibility. This would explain why the respondents performed poorly on some of the survey questions. For example, in question 21 the respondents had to select the best search statement to narrow search results. Many respondents omitted the word ‘Cantonese’, which was the key word in the search statement, as they most likely did not understand the word and therefore decided not to include it.

The low mean score on the question of how to use the index of a book suggests that the respondents had little experience of searching for topics in books. The discussions revealed that the students primarily used computers and the Internet to access academic materials; thus, they might be more familiar with searching using keyboard shortcuts such as “Ctrl+F.” It is important to mention, however, that a majority of respondents (78 percent) chose ‘table of contents’, which was the second best answer.

Information Use

The standardized mean score for Information Use was 37.56; it contained both one of the respondents’ best and one of their poorest IL skill areas. Information Use concerns the ability to evaluate the relevance of information and then extract the relevant information. During the discussions, many students explained that reliable information must come from recognized authors or institutions, and that if the information was dubious they would cross-check it. The students’ ability to evaluate information explains the better mean scores on the questions related to evaluating information content and cross comparison of content. Their evaluation criteria also clarify the low mean score on the question related to authoritative information sources, where the respondents had to identify the most impartial source. Thirty-four percent of the respondents chose the second best answer, the Ministry of Natural Resources and Tourism website, an authoritative institution they recognized. Merely 19 percent chose the United Nations (the best answer), as the respondents knew little about the intergovernmental organization.

During the focus group discussions, the students were asked to discuss the differences between facts and opinions. Most students agreed that facts must have empirical evidence: “A fact is something that has empirical evidence by one who is doing research. If there is no research, it is not a fact” (Participant R, Focus group 5). Opinions, on the other hand, were expressed explicitly: “When you talk about an opinion, it is when somebody says ‘my opinion’ and ‘your opinion’” (Participant Q, Focus group 5). Understanding how the students reasoned when differentiating between facts and opinions was important, as it explained the low mean scores on the IL skill areas critical assessment of information and fact, view, or opinion. The descriptions in these particular questions were written in an empirical fashion, which explains why the majority of respondents opted for the wrong answer without reflecting on the objectivity of the information, or on whether the information had supporting evidence.

Information Synthesis

Information Synthesis was the second poorest IL category (36.60). Information Synthesis concerns the ability to organize and communicate results, including the ability to cite properly and credit electronic resources. Information Synthesis consisted of two questions related to citation style. During the discussions, many students claimed that they seldom practiced referencing (even though they would like to learn more about it), so they had forgotten how to properly cite and write a bibliography. The students’ deficiencies in English language skills, as discussed earlier, might also have played a role in their poor performance. In one of the two survey questions, the respondents were asked to identify the title of a periodical; 47 percent chose the title of the article instead. The respondents mistook the English word ‘periodical’ for ‘article’, which would explain the low mean score on that particular question.

Information Ethics

Information Ethics in this study concerns awareness of censorship. It was the respondents’ poorest IL category (18.63). During the discussions, only one student knew the meaning of the word ‘censorship’; two others understood the concept but had not heard the term before, and the rest did not know what it meant. This would explain why 49 percent of the respondents chose ‘don’t know’ and 34 percent chose the wrong answer on the particular question.

Discussion and Conclusions

“What is the level of information literacy skills among postgraduate students at the University of Dar es Salaam?”

Although the findings should be viewed with caution, the survey data showed that students with no IL-related training scored higher than students with training in almost all IL categories, implying that the training at the university is ineffective in imparting IL knowledge. A previous study of four Tanzanian universities revealed that a majority of students found library resource usage training ineffective.44 The reasons for the ineffectiveness included inadequate time spent on training sessions, lack of awareness among students about library resource usage training, and the separation between the training and course offerings.

The students scored higher on the IL skills Task Definition and Information Seeking Strategies. Similarly, Foo et al. found these two IL skills (along with Location and Access) to be the highest scoring categories among Singaporean undergraduate students.45 According to the authors, the higher scores in these categories could be attributed to the systematic way in which these skills can be taught, for example, through IL-related training provided by the library, or acquired over time through practice. Collaborative Information Seeking (CIS) was another IL skill in which the students received a higher score. Proficiency in CIS skills can yield better results than individual efforts due to shared activities.46 CIS has become increasingly important with the proliferation of the web and, more recently, Web 2.0. Although it was encouraging to find that many students understood the concept of CIS, the evidence showed that more support from the institutions is needed.

The scores attained for Information Evaluation, Location and Access, Information Use, and Information Synthesis were unsatisfactory. As Foo et al.47 noted, these categories (except Location and Access) require “higher-order thinking skills to differentiate the quality and relevance of the retrieved information, and to subsequently synthesize, extract, and connect bits of information for use to complete [the] tasks.” Librarians and teaching staff need to put more effort into imparting these higher-order thinking skills to the students, for example, through student-centered learning.

Confirming the results of Lwoga, the students in this study scored lower on the IL skill Location and Access.48 Unlike Lwoga’s study, however, this study revealed that most students used only three techniques to narrow down search results: phrase searching, single-keyword searches, and quotes. Most students were not familiar with truncation, Boolean operators, or the use of the OPAC in the library. This suggests that IL training needs to focus more on search techniques to increase the students’ ability to locate information efficiently.

The lowest scores were attributed to the ability to differentiate fact, view, or opinion; understanding of censorship; and citation style. The teaching staff needs to concentrate on increasing the students’ knowledge in these areas. Without the teachers’ support in monitoring and educating the students, most IL programs will end up unsuccessful or severely limited. The involvement of teaching staff is crucial, as teachers are subject-specific experts and provide the context in which IL skills are exercised.49

“What are the students’ perceptions and experiences with information problems that can explain the score of the most and least successful IL skills?”

The results revealed many issues related to the students’ information problem-solving experiences that need to be addressed. The students reported a lack of coordination between the library staff and the teaching staff at the departments in communicating the importance and availability of IL training to the students. Consequently, many students missed the opportunity to acquire or hone their skills. This confirms the results of a previous study by Lwehabura,50 who concluded that the only way to make students attend the training and acquire IL skills was to make it compulsory and credit-bearing for all. This way, students and teaching staff would also take IL training more seriously, which matters because the findings concerning plagiarism and lack of support from teachers were disconcerting.

The students’ lack of ICT experience seemed to affect their IL skills negatively, as highlighted in previous literature.51 The present results showed that the students had access to computers but were not always able to use the medium to meet their academic needs. A study by Hargittai found that people who have been Internet users longer tend to have better online skills, such as finding information on the web more easily, because they have previous experience to draw on.52

A majority of the students preferred using the search engine Google to the library databases to retrieve literature. The same finding was observed in Lwoga’s study of undergraduate students at another Tanzanian university.53 As a result, the author proposed that IL training put more emphasis on the use of scholarly databases and indexes. It was disconcerting to find that many students evaluated information (and its sources) using questionable strategies or did not evaluate the information at all. For some students, this issue is exacerbated by deficient English language skills. Although English is the language of learning in Tanzania, it is not the students’ first language, and the evidence showed that this caused problems with comprehension. Hepworth and Wema designed and implemented an IL training course at UDSM and observed that students who do not use English as a first language are likely to find IL practices more challenging; for example, understanding academic literature and refining search terms require a good vocabulary.54 An encouraging finding, however, was that a majority of the students had a good understanding of how to use general online information sources, such as Wikipedia.


“How can the University of Dar es Salaam work pedagogically to improve the students’ IL skills?”

The findings from this study contribute to our knowledge about the level of IL skills among UDSM’s postgraduate students and the way in which the students approached different information problems. The students’ low IL test scores can be attributed to several divides. First, IL training is not prioritized in the curriculum and, therefore, not among the students either. Educational stakeholders, such as librarians and teaching staff, should take a proactive role in promoting IL initiatives both in the curriculum and in the library. Moreover, many previous studies, like this one, advocate integrating IL skills into the curriculum to ensure continuous practice of the skills in a meaningful context.

A second implication of this study concerns the students’ lack of ICT skills. In the future, IL and ICT training should be integrated so that students can take maximum advantage of all that ICT has to offer, as sheer access to technology does not by itself create information-literate individuals. At a later stage, less emphasis should be placed on computer skills and more on thinking skills and the broader aspects of IL.55

Educational stakeholders should also attend to the students’ English language skills, as all IL skills are underpinned by proficiency in English.56 Training should also aim to enhance the students’ higher-order thinking skills, for example by introducing student-centered and problem-based learning, preferably in combination with research training. Ideally, as noted by the students, the training should begin at the undergraduate level and continue throughout their university education. Similar IL training should also be given to students at the Masters and PhD levels.

A third implication of this study is that the students made low use of the library’s e-resources and scholarly databases due to perceived inconvenience and inaccessibility. Educational stakeholders should, therefore, ensure that subscriptions to the scholarly databases are accessible to students outside of the library. More effort should also be invested in training students to evaluate information critically, and in increasing their understanding of censorship and the ethical use of information in general.

A fourth implication of this study is that the students performed well in CIS, and this ability should be leveraged in the development of IL training, for example, through collaborative inquiry-based learning. Collaborative learning prepares students for the future workplace, as individuals seldom undertake work tasks alone. Educational stakeholders could use different collaborative avenues to promote IL training. For example, the library could use a learning management system to provide and extend opportunities for students to practice IL. The training could be conducted in groups or individually, and could include mentoring opportunities, online tutorials, and general IL guidelines.


One limitation of this study is the small sample size. The use of a convenience sample prohibits generalization of the results, as the sample is not representative of the postgraduate student population. Reliability can be considered a strength of this study, since the survey questions were designed to fit the context of the study and to be as straightforward as possible. The focus group questions were developed in an iterative manner, taking the survey results into consideration. A weakness regarding reliability in qualitative methods is that it is impossible to reconstruct the context in which the research was conducted; therefore, it is likely that other researchers would obtain slightly different results.

Future Research

A few questions were raised as a result of this research. Could IL delivery be enhanced at UDSM through collaborative inquiry-based learning, as suggested in this study? How can universities in Tanzania use ICT to promote IL practices, taking into account the level of ICT skills of the students? How can the universities (and also secondary schools) better engage the teaching staff in the development of IL practices, to help ensure that future graduates are information literate? Many questions remain unanswered, and there are numerous research opportunities to investigate the most suitable approach to improving IL training in Tanzanian education. These questions are important because, in an increasingly technology-driven world, it is imperative to equip students and citizens with IL skills and knowledge so they are able to function as independent lifelong learners.


  1. Tim Unwin, ICT4D—Information and Communication Technology for Development (Cambridge: Cambridge University Press, 2009).
  2. Patti Swarts and Esther Wachira, “Tanzania: ICT in education situational analysis,” A report to the Global e-Schools and Communities Initiative.
  3. Ibid.
  4. Ibid.
  5. Mark Hepworth and Evans Wema, “The Design and Implementation of an Information Literacy Training Course that Integrated Information and Library Science Conceptions of Information Literacy, Educational Theory and Information Behaviour Research: A Tanzanian Pilot Study,” Innovations in Teaching and Learning in Information and Computer Sciences, 5, no. 1 (2006): 1–23.
  6. Ibid.
  7. University of Dar es Salaam homepage, 2014, https://www.udsm.ac.tz/.
  8. Mugyabuso Lwehabura, “Information Literacy Delivery in Tanzanian Universities: An Examination of Its Effectiveness,” African Journal of Library, Archives and Information Science 18, no. 2 (2008): 157–68.
  9. Ibid.
  10. Hepworth and Wema, “The Design and Implementation of an Information Literacy Training Course.”
  11. Lwehabura, “Information Literacy Delivery in Tanzanian Universities.”
  12. Schubert Foo et al., “Information Literacy Skills of Humanities, Arts, and Social Science Tertiary Students in Singapore,” Reference & User Services Quarterly 53, no. 1 (2013), 40–50.
  13. Ibid.
  14. Christy Donaldson, “Information Literacy and the McKinsey Model: The McKinsey Strategic Problem-Solving Model Adapted to Teach Information Literacy to Graduate Business Students,” Library Philosophy and Practice 6, no. 2 (2004): 1–9.
  15. SCONUL Working Group on Information Literacy, The SCONUL Seven Pillars of Information Literacy: Core model for higher education (SCONUL, April, 2011).
  16. Colin Lankshear and Michele Knobel, “Digital Literacies: Concepts, Policies and Practices,” in Digital Literacies (New York: Peter Lang, 2008), 1–16.
  17. Hepworth and Wema, “The Design and Implementation of an Information Literacy Training Course.”
  18. Intan Azura Mokhtar et al., “Proposing a 6+3 Model for Developing Information Literacy Standards for Schools: A Case for Singapore,” Education for Information 27, no. 2–3 (2010): 505–21.
  19. Australian and New Zealand Institute for Information Literacy and Council of Australian University Librarians (ANZIIL), “Australian and New Zealand information literacy framework,” 2004, www.caul.edu.au/content/upload/files/info-literacy/InfoLiteracyFramework.pdf.
  20. SCONUL Working Group on Information Literacy, The SCONUL Seven Pillars of Information Literacy.
  21. Joseph Muema Kavulya, “Challenges Facing Information Literacy Efforts in Kenya: A Case Study of Selected University Libraries in Kenya,” Library Management 24, no. 4/5 (2003), 216–22.
  22. Emmanuel Baro, “A Survey of Information Literacy Education in Library Schools in Africa,” Library Review 60, no. 3 (2011), 202–17.
  23. Hannelore Rader, “The Global Significance of Information Literacy in Workforce Development: An International Perspective,” UNESCO Thematic Debate on Information Literacy, 2005, http://portal.unesco.org/ci/en/ev.php-URL_ID=20492&URL_DO=DO_TOPIC&URL_SECTION=201.html.
  24. Ibid.
  25. Michael Eisenberg, “A Big6 skills overview,” 2014, http://big6.com/pages/about/big6-skills-overview.php.
  26. Ibid.
  27. Mokhtar et al., “Proposing a 6+3 Model for Developing Information Literacy Standards for Schools.”
  28. Viswanath Venkatesh, Susan Brown, and Hillol Bala, “Bridging The Qualitative–Quantitative Divide: Guidelines For Conducting Mixed Methods Research In Information Systems,” MIS Quarterly 37, no. 1 (2013): 21–54.
  29. Martyn Denscombe, The Good Research Guide: For Small-Scale Social Research Projects, 4th ed. (New York: Open University Press, 2010).
  30. Mokhtar et al., “Proposing a 6+3 Model for Developing Information Literacy Standards for Schools”; M. Eisenberg, J. Murray, and C. Bartow, “Big6 By The Month: A Common Sense Approach to Effective Use of Common Standards for Information Literacy Learning,” Library Media Connection 32, no. 6 (2014): 38–41.
  31. Foo et al., “Information Literacy Skills of Humanities, Arts, and Social Science Tertiary Students in Singapore.”
  32. The Association of College and Research Libraries, a division of the American Library Association (ALA), “Information Literacy Competency Standards for Higher Education,” 2000, www.ala.org/acrl/sites/ala.org.acrl/files/content/standards/standards.pdf.
  33. Justus Randolph, Multidisciplinary Methods in Educational Technology Research and Development (Hämeenlinna: HAMK, 2008).
  34. Michael Eisenberg, “Information Literacy: Essential Skills for the Information Age,” Journal of Library Information Technology 28, no. 2 (2008), 39–47.
  35. SCONUL Working Group on Information Literacy, The SCONUL Seven Pillars of Information Literacy.
  36. Ibid.
  37. Mark Mason, “Sample Size and Saturation in PhD Studies Using Qualitative Interviews,” Forum: Qualitative Social Research 11, no. 3 (2010).
  38. John Creswell, Qualitative Inquiry & Research Design: Choosing Among Five Approaches, 2nd ed. (Thousand Oaks, CA: SAGE, 2007).
  39. Eisenberg, Murray, and Bartow, “Big6 By The Month”; Mokhtar et al., “Proposing a 6+3 Model for Developing Information Literacy Standards for Schools.”
  40. Michael Eisenberg, Doug Johnson, and Bob Berkowitz, “Information, Communications, and Technology (ICT) Skills Curriculum Based on the Big6 Skills Approach to Information Problem-Solving,” Library Media Connection 28, no. 6 (2010): 24.
  41. Ibid.
  42. Ibid.
  43. Ibid.
  44. Mugyabuso Lwehabura and Christine Stilwell, “Information literacy in Tanzanian universities: Challenges and potential opportunities,” Journal of Librarianship and Information Science 40, no. 3 (2008): 179–91.
  45. Foo et al., “Information Literacy Skills of Humanities, Arts, and Social Science Tertiary Students in Singapore.”
  46. Roberto González-Ibáñez, Muge Haseki, and Chirag Shah, “Let’s Search Together, but Not Too Close! An Analysis of Communication and Performance in Collaborative Information Seeking,” Information Processing and Management 49, no. 5 (2013), 1165–79.
  47. Foo et al., “Information Literacy Skills of Humanities, Arts, and Social Science Tertiary Students in Singapore.”
  48. Edda Tandi Lwoga, “Mapping Information Literacy Outcomes and Learning Experiences of Health Sciences Undergraduate Students,” The Canadian Journal of Library and Information Practice and Research 9, no. 1, (2014): 1–17.
  49. Lwehabura and Stilwell, “Information Literacy in Tanzanian Universities.”
  50. Ibid.
  51. Diana Rosenberg, “Towards the Digital Library: Findings of an Investigation to Establish the Current Status of University Libraries in Africa,” International Network for the Availability of Science Publications (2005): 1–29.
  52. Eszter Hargittai, “Differences in People’s Online Skills,” First Monday: Peer Reviewed Journal on the Internet 7, no. 4 (2002).
  53. Lwoga, “Mapping Information Literacy Outcomes and Learning Experiences of Health Sciences Undergraduate Students.”
  54. Mark Hepworth, “Developing Academic Information Literacy for Undergraduates through Inquiry Based Learning,” Innovation in Teaching and Learning in Information and Computer Sciences 8, no. 2 (2009): 2–13.
  55. Ibid.
  56. Neil Selwyn and Keri Facer, Beyond the Digital Divide: Rethinking Digital Inclusion for the 21st Century (FutureLab Report, 2007).
Figure 1. The Big6+3 Model as proposed by Mokhtar et al., 2009

Figure 2. Number and Percentage of Respondents across Institutions

Figure 3. Frequencies of Visiting University Library and Frequencies of Using the Library’s Resources

Figure 4. Histogram of Standardized Percentage Score for IL Skills

Figure 5. Standardized Mean Percentage Scores across Students’ IL Training Background

Figure 6. Standardized Mean Percentage Scores Across Year of Study

Figure 7. Standardized Mean Percentage Scores across Institutions

Table 1. Big6 Information Problem-Solving Process (Eisenberg, 2014)



1. Task Definition

1.1. Define the information problem

1.2. Identify the information needed in order to complete the task (to solve the information problem)

2. Information Seeking Strategies

2.1. Determine the range of possible sources (brainstorm)

2.2. Evaluate the different possible sources to determine priorities (select the best sources)

3. Location and Access

3.1. Locate sources (intellectually and physically)

3.2. Find information within sources

4. Use of Information

4.1. Engage (e.g. read, hear, view, touch) the information in a source

4.2. Extract relevant information from a source

5. Synthesis

5.1. Organize information from multiple sources

5.2. Present information

6. Evaluation

6.1. Judge the product (effectiveness)

6.2. Judge the information problem-solving process (efficiency)

Table 2. Standardized Percentage Score for Each IL Skill Category

IL Skill Category | Mean (Max 100)
Task Definition | 62.87
Information Seeking Strategies | 54.04
Collaborative Information Seeking | 50.00
Information Evaluation | 47.06
Location and Access | 44.45
Information Use | 37.56
Information Synthesis | 36.60
Information Ethics | 18.63

Table 3. Standardized Mean Score for Each Testing Area of IL Skills
Note: The green highlighted rows show the three areas where students performed the most successfully, and the red highlighted rows show the areas where the students performed the least successfully.

Task Definition: Brainstorming/Defining tasks; Research topics and questions
Information Seeking Strategies: Seeking expert opinion; Primary vs. secondary information sources; Appropriate sources of information; Reference resources
Location & Access: Knowledge of library e-resources; Roles of reference librarians; Using the library OPAC; Using index of a book; Narrowing search results; Boolean operators; Broadening searches; Phrase search; Stop words; Type of search; Truncation
Information Use: Evaluating information content; Cross comparison of content; Critical assessment of information; Fact, view or opinion?; Authoritative information source
Information Synthesis: Citation style (two questions)
Information Evaluation: Information evaluation tools and resources; Plagiarism; Copyright
Information Ethics: Censorship
Collaborative Information Seeking: Collaborative Information Seeking

Table 4. Comparison of Standardized Mean Percentage Scores for Each IL Skill between Students with Library Usage Training and Students without Library Usage Training
Note: The green highlighted rows indicate which student group scored better.

IL training: Yes (n = 92) | No (n = 10)
IL skills compared: Task Definition; Information Seeking Strategies; Location & Access; Information Use; Information Synthesis; Information Evaluation; Information Ethics; Collaborative Information Seeking


© 2021 RUSA