
Comparative evaluation of the level of competence expectations of LIS educators and practitioners in database management.

Introduction

LIS education programs must equip graduates with the professional knowledge and skills required to select, acquire, organize, store, maintain, retrieve and disseminate recorded information, commensurate with the demands of entry-level positions for LIS graduates (Reitz, 2004). As information technology continues to redefine the professional information management landscape, providing adequate training to prepare new professionals for current and future LIS work has become the challenge of LIS education. As Elmborg (2008) put it, a relevant education for the information sectors will include a synthesis of global, technical, and critical perspectives to encourage librarians to understand the changes they need to enact to ensure relevance for the future. Teaching for problem solving is a major educational goal for LIS programs (Mayer, 2002a). Educators want their students to be able to transfer what they have learned in the classroom to new situations, such as their jobs, upon graduation. Practitioners likewise expect that graduates in various fields can transfer what they have learned in school to their jobs, particularly at entry-level positions. Educators must inculcate in their students a level of competence that is transferable to new work environments, particularly in positions involving Database Management (DBM) tasks. Moran (2001) and Base, Hall-Ellis and Kinney (2008) acknowledge that there is a "growing rift" between educators and practitioners regarding the quality of LIS education. Moran is concerned that LIS education may not be targeted at the competence level required of graduates by their employers--the practitioners. An LIS education system in which educators train students with little or no consideration of the level of competence practitioners expect of graduates will likely be out of touch with the reality of their future job demands.
Base, Hall-Ellis and Kinney noted that the perception of many library managers is that LIS educators are not giving their students the tools and skills necessary for entry-level positions, particularly in cataloguing. Moran therefore suggested that it is time for LIS educators and practitioners to "connect."

LIS graduates now work in diverse information management institutions with such titles as technology initiatives librarian, internet services librarian, electronic services librarian, digital librarian, information technology librarian, and multimedia services librarian (Moyo, 2002). As new information professionals, LIS graduates are expected to possess a certain level of competence to effectively manage information resources and services in various information management environments. Reitz (2004) defines competence as the "capabilities expected of a person hired to perform a specific job or upon successful completion of a course of study or training" (p. 164). In other words, competence is the observable application of knowledge that results in success at a specific professional task. Here, competence is defined as the cognitive processes and knowledge required for efficient performance in a specified task or job position. Although several competence models have been offered for adequate practice in various facets of LIS work, competence itself has not been adequately identified.

As Paquette et al. (2006) reiterate, to say that a person knows something, or that the person must acquire some particular knowledge, is not sufficient. What is needed is to specify a degree or level of knowledge mastery--the level of competence. The level of competence intended in LIS education is identifiable in the learning objectives of the courses offered by LIS school faculty. Learning objectives are "definitive descriptions of desirable educational outcomes, often expressed in terms of what students should be able to do at the end of their course" (Ellington, 1984, p. 5). Learning objectives express what teachers intend students to learn as a result of their instruction; they represent the level of competence that teachers pursue in the course of instruction. These competence levels should align with the competence expectations of practitioners. The objective of this study is to present a framework for investigating, within a specific competency domain in LIS, the alignment between the level of competence expected of LIS graduates as expressed in educators' course objectives and the level of competence practitioners expect of those graduates.

The Case of DBM and Academic Libraries

In this study, database management is used as an example of a competence area for LIS graduates to assess the alignment of the level of competence offered in LIS education with that demanded by LIS practitioners. Several researchers have identified LIS tasks in which competence is a requirement in various types of libraries and library positions. So far, researchers in LIS have either concentrated on listings of competence requirements for LIS practitioners, especially those who are already employed, or suggested an array of courses to be included and expanded in LIS curricula. Tennant (1999), Ocholla (2002), Prestamo (2000), Heinrichs and Lim (2009), and Kroth, Phillips, and Eldredge (2009) focused on IT competence for LIS work and agree that competence in IT tasks is essential for LIS professionals irrespective of their job titles or positions. Heinrichs and Lim found a perceived need for librarians to obtain additional training in the areas of database development, web design, and multimedia skills. Mbabu (2009) and Macklin and Culp (2008) examined the level of information literacy instruction in LIS education, and both studies agree that current information literacy curricula do not align with required competency standards for current LIS practice. The studies therefore suggested the expansion of information literacy instruction within the larger context of student learning. From a cataloguing and technical services perspective, Hall-Ellis (2008) identified employers' competency expectations for cataloguers and technical services librarians to include education, theoretical knowledge, cataloguing competencies, and communication skills (including supervision and training). Thomas and Patel (2008, p. 303) suggested the use of Competency-Based Training (CBT) by LIS practitioners and educators to "better understand the jobs of specialized professionals, and to build curricula shaped by the knowledge, skills and abilities required to perform such work."
Lester and Van Fleet (2008) want to see professional competencies and standards documents from library and information professional associations used to plan the curricula of LIS programs. According to their study, practitioners use these standards documents in their recruitment programs, but educators have not paid adequate attention to applying these documents and standards to program planning and curriculum development.

Research in LIS has yet to develop or identify a conceptual framework for determining whether LIS education is aligned with practitioners' requirements of entry-level LIS graduates. The competence expectations of LIS practitioners need to be explored for a better understanding of the requirements of today's LIS job market. Such studies will support the efforts of LIS educators to align LIS education with current and emerging job competence requirements for LIS graduates.

DBM has been an integral part of information resources management for some time (Batini, Ceri, & Navathe, 1992). The creation and management of databases is an integral part of LIS work. Database creation and management is a core traditional bibliographic service. Today, library databases are in themselves a vehicle for information provision as library "collections now consist of online databases" (Intner, 2004, p. 9). Database design is the process of determining the organization of a database, including its structure, contents, and the applications to run it (Batini et al., 1992). In the past, DBM was regarded as a task for computer experts. Today, with advances in the application of IT in LIS, the task of managing databases, particularly bibliographic ones, has become a common skill requirement for many LIS positions. DBM is a part of the general knowledge background for LIS work in various types of libraries (Ocholla, 2002; Prestamo, 2000; Tennant, 1999).

LIS Schools in the US and Canada have taught DBM dating back to the early 1970s (Schwartz, 1990). In 1989, Eddison (1990) surveyed 63 ALA-accredited LIS schools in the US to find out, among other things, if they taught DBM courses and when they started teaching such courses. The result of her survey indicates that 36 LIS schools taught 56 courses: 29 were stand-alone courses where database design was the main topic, and in 27 DBM topics were part of a broader course. Six of the courses were first taught in 1970, and 42 began in the 1980s. DBM remains a visible component of every LIS curriculum. Course information on the websites of ALA-accredited LIS programs indicates that DBM is taught in all the programs either as a distinct course, or as a topic in a broader course. Measuring the level of competence offered in LIS education and that expected by the practitioners from LIS graduates as undertaken in the present study will provide a suitable platform to examine the alignment of LIS education with practitioner requirements, particularly in DBM. In this study, only courses solely dedicated to DBM instruction in ALA-accredited LIS programs in the US and Canada are included. Such courses are used as typical examples of DBM courses.

Concept mapping of the DBM field is employed as a basis for selecting the DBM courses included in this study. A concept map is a graphical node-arc representation of the relationships among a collection of concepts (Novak & Gowin, 1984). The concept map of DBM used for this study does not claim to be completely exhaustive; it is, however, intended to provide a framework for identifying DBM courses in LIS programs. To construct the DBM concept map, the Database Term Dictionary (College of Humanities, Ohio State University, 2004); Databasics: A Database Dictionary (Vines, 1998); and the Glossary of Database Terminology (High Energy Astrophysics Division, Harvard-Smithsonian Center for Astrophysics, Harvard University, 1996) were used to generate a comprehensive listing of DBM concepts that formed the map. The DBM concept map drawn from the 93 unique concepts identified here is assumed to represent the DBM domain as defined in this study. Figure 1 shows the derived DBM concept map.
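The node-arc structure of such a concept map can be represented programmatically. The following sketch is illustrative only: the concepts and links are a small hypothetical excerpt, not the 93-concept map compiled for the study, and the traversal function is an assumption about how such a map might be queried.

```python
# Illustrative sketch of a concept map as an adjacency mapping.
# The concepts and links below are a hypothetical excerpt, not the
# 93-concept DBM map used in the study.

dbm_concept_map = {
    "database management": ["database design", "data model", "DBMS"],
    "database design": ["normalization", "entity-relationship model"],
    "data model": ["relational model"],
    "DBMS": ["query language"],
}

def reachable(concept_map, start):
    """All concepts linked (directly or transitively) from a start node."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for neighbour in concept_map.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                stack.append(neighbour)
    return seen

print(sorted(reachable(dbm_concept_map, "database design")))
# ['entity-relationship model', 'normalization']
```

A course whose listed topics overlap substantially with nodes of such a map could then be flagged as a DBM course, which mirrors the mapping of course content into the concept map described above.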

This study also limits its focus to academic libraries. The patterns of database use in academic libraries reported by Tenopir and Read (2000) suggest that DBM is an essential task in academic libraries. Academic libraries acquire enormous volumes of information resources organized in databases for effective information management and service to users. Academic librarians therefore require significant competence in DBM to effectively manage and utilize the volume of information in the various databases, developed or acquired, that are maintained in academic libraries. After public libraries, academic libraries employ the largest percentage of LIS graduates in the US and Canada (Maatta, 2005). Work requirements vary from one type of library to another (Hayati, 2005). There is a need to adequately train the professionals who will form the work force in academic libraries, particularly in the area of DBM. The scope of this study is limited to academic libraries so as to examine in detail the level of competence LIS practitioners require of entry-level academic librarians. Further research, outside the scope of the current study, will be necessary to examine the levels of competence required in other types of libraries and in other library tasks and/or positions.

Classifying Learning Objectives with the Taxonomy Table

In education, learning objectives state what a teacher expects students to learn as a result of instruction. Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) describe learning objectives as "explicit formulations of the ways in which students are expected to be changed by the educative process" (p. 26). In the words of Ellington (1984), learning objectives are "definitive descriptions of desirable educational outcomes, often expressed in terms of what students should be able to do at the end of their course" (p. 5). Kemp, Morrison, and Ross (1998) describe learning objectives as statements regarding what the learner will know or be able to perform, under specific conditions, upon completion of instruction. Anderson and Krathwohl (2001) designed a two-dimensional table, which they call the Taxonomy Table (TT). The Knowledge Dimension (KD) and the Cognitive Process Dimension (CPD) form the vertical and horizontal axes of the TT, respectively (Table 1). Anderson and Krathwohl suggest that all learning objectives can be classified within the cells of the TT.

The TT provides a framework for classifying statements of what teachers expect or intend students to learn as a result of instruction, i.e., the learning objectives. A learning objective contains a verb/verb phrase and a noun/noun phrase. The verb/verb phrase generally describes the intended cognitive process. The noun/noun phrase generally describes the knowledge students are expected to acquire or construct (Anderson & Krathwohl, 2001). In contrast with the single dimension of the original Bloom's Taxonomy (Bloom et al., 1956), the Anderson/Krathwohl TT presents a two-dimensional framework for classifying learning objectives in support of the two facets of cognition prescribed by Sticht and McDonald (1989) and confirmed in Anderson and Krathwohl (2001). To classify a learning objective with the TT, the verb (cognitive process) and noun (knowledge) in the objective are placed within the corresponding rows and columns of the TT to indicate the cognitive process and the level of knowledge inherent in the objective. Figure 2 graphically demonstrates this process. Classifying a learning objective with the TT begins with the identification of the verb/verb phrase and the noun/noun phrase in the objective.

[FIGURE 1 OMITTED]

The verb/verb phrase is examined within the context of the six categories of the CPD: Remember, Understand, Apply, Analyze, Evaluate, and Create. The noun/noun phrase, on the other hand, is examined in the context of the four types of knowledge in the KD: Factual, Conceptual, Procedural, and Metacognitive. As shown in Figure 2, the objective "The student will learn to apply the reduce-reuse-recycle approach to conservation" would be classified as follows. The verb is "apply." Apply is one of the six cognitive process categories, so this verb is classified under Apply. The noun phrase is "the reduce-reuse-recycle approach to conservation." An approach is a method or technique and therefore falls under Procedural Knowledge. Thus, the objective is placed in the cell at the intersection of Apply and Procedural Knowledge.
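Placing a classified objective into a TT cell amounts to a two-way lookup. The sketch below illustrates this; the category lists follow Anderson and Krathwohl (2001), but the function and its inputs are hypothetical simplifications (in the study, classification was performed manually through content analysis, not by software).

```python
# Illustrative sketch: placing a classified learning objective in a
# Taxonomy Table cell. Category lists follow Anderson and Krathwohl
# (2001); the lookup function itself is a hypothetical simplification.

COGNITIVE_PROCESSES = ["Remember", "Understand", "Apply",
                       "Analyze", "Evaluate", "Create"]
KNOWLEDGE_TYPES = ["Factual", "Conceptual", "Procedural", "Metacognitive"]

def classify(verb_category: str, knowledge_type: str) -> tuple:
    """Return the (row, column) TT cell for a classified objective.

    Rows index the Knowledge Dimension, columns the Cognitive Process
    Dimension, matching the layout described for Table 1.
    """
    row = KNOWLEDGE_TYPES.index(knowledge_type)
    col = COGNITIVE_PROCESSES.index(verb_category)
    return (row, col)

# The worked example from the text: verb "apply", noun phrase
# "the reduce-reuse-recycle approach" (an approach -> Procedural Knowledge).
cell = classify("Apply", "Procedural")
print(cell)  # (2, 2): Procedural row, Apply column
```

The index positions also supply the 1-to-6 (CPD) and 1-to-4 (KD) rankings used later for the non-parametric comparisons, since the categories are ordered from lowest to highest competence level.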

A learning objective represents the intended content of the instruction given to students; it expresses what students are to learn--the cognitive process and knowledge for a course. The verb and noun elements represent the level of competence expected in the learning objective. The analysis of learning objectives with the Taxonomy Table (TT) will reveal the type(s) of cognitive processes and Knowledge--the level of competence--expected by the instructor. The TT, according to Krathwohl (2002), identifies knowledge and cognitive processes emphasized in objectives and:
   ... provides a clear, concise, visual representation of a particular
   course or unit. Once completed, the entries in the Taxonomy Table can
   be used to examine relative emphasis, curriculum alignment, and missed
   educational opportunities. Based on this examination teachers can
   decide where and how to improve the planning of curriculum and the
   delivery of instruction. (p. 213)


Previous studies reported in the literature have identified LIS tasks that require competence. Evaluation of the levels of competence in the learning objectives of LIS educators and practitioners within essential LIS tasks has yet to be done. Furthermore, the degree of alignment, if any, between LIS education and practitioners' requirements of LIS graduates has yet to be determined. Using the TT to analyze learning objectives obtained from LIS educators and practitioners helps to determine the level of competence intended for LIS students by educators and expected of LIS graduates by practitioners. As a framework, the TT can be utilized for the analysis of learning objectives irrespective of the discipline involved. DBM courses involve predominantly cognitive tasks such as data analysis, database programming, normalization, data design, and database modelling, making the TT an applicable framework for analyzing learning objectives in DBM.

[FIGURE 2 OMITTED]

Methodology

This study offers a theoretical framework for the evaluation and analysis of learning objectives. The comparison between educators' and practitioners' objectives in DBM will reveal any competence shortfall in LIS education and engender efforts to close such knowledge gaps between LIS education and the expectations placed on LIS graduates. To demonstrate the use of the Taxonomy Table (TT) for the analysis of learning objectives in LIS, DBM was selected as an essential competence requirement for LIS work. A sample of LIS educators and practitioners in this area was selected and their learning objectives obtained. With the use of concept mapping, the knowledge structure of DBM was identified and used to select the DBM courses included in the study. A modified database life cycle (MDBLC) was applied to identify five work processes in DBM: Database planning, Database design, Database implementation, Database operation, and Database maintenance. These work processes formed the basis of the comparison between educators' and practitioners' competence expectations in DBM. To ascertain whether there is a significant difference between the levels of competence proposed by educators and practitioners, a non-parametric test, the Mann-Whitney U test, is applied. Competence levels recommended for LIS students and graduates by educators and practitioners are distributed according to the ranking of the CPD (1 to 6) and KD (1 to 4) as stipulated in the Taxonomy Table.

Population

The population used for this study is made up of LIS educators that teach DBM in ALA-accredited LIS schools and LIS practitioners working in academic libraries, in the US and Canada. A review of the courses offered in each ALA-accredited LIS program as listed on their websites provided the list of DBM courses in each program. After mapping the course content of DBM courses offered in each program into the DBM concept map compiled for this study, 26 DBM courses were identified from all the ALA-accredited LIS programs. LIS schools where the identified DBM courses are offered were contacted by telephone to identify the faculty who teach each DBM course. Once identified, a letter was sent to each DBM faculty member to formally invite him/her to take part in the study.

LIS practitioners who participated in this study were drawn from academic libraries in universities in the US and Canada. The focus on academic libraries is due to the prevalence of database development and use in such libraries, as reported in Tenopir and Read (2000). The Association of Research Libraries (ARL) provided the platform for selecting the university libraries used in this study. Membership in ARL comprises 123 academic and research libraries in the US and Canada. The top 26 member libraries as ranked by the ARL were selected for this study. Each selected library was contacted by telephone to identify its database manager, and the librarians so identified were formally invited to participate in the study.

Data Collection

Learning objectives, found in the course information and/or outlines of DBM courses published on the websites of LIS schools that offer DBM courses, are largely stated in general terms and often do not express clearly what DBM educators expect the students who take these courses to learn or be able to do as a result of the instruction given. Also, the structure of these course objectives is not in conformity with the structure of learning objectives described earlier. The researcher therefore found course objectives as posted on library school websites inadequate as the only source of data from DBM educators for the study. To obtain appropriate learning objective statements from DBM educators and practitioners, a survey of both groups was conducted using telephone interviews. The interview questions included an additional note that explained the structure of learning objectives to the respondents. For the purpose of this study, e-mail was considered an effective alternative for data collection in cases where a telephone interview was not feasible (Kazmer & Xie, 2008). Two educators and three practitioners responded by e-mail. To facilitate data collection and encourage participation, a website with links to two web pages was constructed for educators and practitioners, and invitation letters were sent to both groups by mail. This made it easy for respondents to readily visit the study sites and participate. Ten educators and 12 practitioners participated in the study. Nine of the 10 educators are full-time faculty members, while one is a doctoral student in the LIS program where he teaches DBM. None of the educators may be considered a practitioner.

Data Analysis

The tape-recorded responses to the telephone interviews were transcribed. Content analysis is used as the technique to analyze the learning objective statements from all respondents, to determine the noun/verb coordinates in each derived learning objective, and to assign learning objectives to the DBM work processes. Content analysis is a "technique for gathering and analyzing the content of text. Content refers to words, meanings, pictures, symbols, ideas, themes, or any message that can be communicated. Text is anything written, visual or spoken, that serves as a medium for communication" (Neuman, 2006, p. 322). In LIS, content analysis has been used to analyze the effects of affective issues such as emotion and confidence on information behaviour (Julien, McKechnie, & Hart, 2005). This classification allowed for the comparison of the levels of competence proposed by educators and practitioners not only in DBM overall but also in each DBM work process.

Connolly, Begg, and Strachan (1996) describe DBM work processes as the database application lifecycle, which involves: Database planning, System definition, Requirements collection and analysis, Database design, DBMS selection, Application design, Prototyping, Implementation, Data conversion and loading, Testing, and Operational maintenance. Similarly, in Rob and Coronel (2000), DBM work comprises six phases: Database initial study, Database design, Implementation and loading, Testing and evaluation, Operation, and Maintenance and evolution. For the purpose of this study, the database management work processes identified above were revised to obtain a modified database life cycle (MDBLC) that is used to assess and compare the competence level requirements of DBM educators and practitioners. The MDBLC identifies five phases that are representative of all the work processes in database management, namely: Database planning, Database design, Database implementation, Database operation, and Database maintenance.

A total of 99 learning objectives were derived from DBM educators, while 141 learning objectives were derived from DBM practitioners. To determine the competence levels proposed in learning objectives derived from educators and practitioners, and to compare the results thereof, DBM competence requirements are classified according to the DBM work processes provided in the MDBLC. The MDBLC provides for five work processes, namely: Database planning, Database design, Database implementation, Database operation, and Database maintenance. In this study, each of these work processes is regarded as a competence area and is used to assess and compare the competence expectations of LIS educators and practitioners. According to Weber (1990), "To make valid inferences from the text, it is important that the classification procedure be reliable in the sense of being consistent: different people should code the same text in the same way" (p. 12). Inter-coder reliability has been the subject of intense methodological research efforts over the years (Krippendorff, 2004) and is important for validating research analysis procedures (Weber, 1990). To ensure inter-coder reliability in the content analysis and classification procedures used in this study, a panel of two experts, one with experience in DBM and the other with the Taxonomy Table, was recruited to verify the researcher's content analysis, competence area classification, and competence level assignments. The data sets for three educators and practitioners were randomly selected and analyzed by the experts at each of the stages noted above. The average agreement level between the researcher and the experts was 87% (in content analysis), 74.5% (in competence area classification) and 74% (in competence level assignment). A 70% inter-coder reliability measure was adopted in a similar analysis by Tabatabai (2005).
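Percentage agreement of the kind reported above can be computed as a simple item-by-item match between two coders. The sketch below assumes that simple matching was the measure used; the coder names and sample codes are hypothetical.

```python
# Minimal sketch of a percentage-agreement inter-coder reliability
# measure. The codes and coder labels below are hypothetical examples,
# not the study's data.

def percent_agreement(coder_a, coder_b):
    """Share (as a percentage) of items the two coders coded identically."""
    if len(coder_a) != len(coder_b):
        raise ValueError("coders must rate the same items")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100.0 * matches / len(coder_a)

researcher = ["Design", "Planning", "Design", "Operation", "Design",
              "Maintenance", "Planning", "Design"]
expert     = ["Design", "Planning", "Design", "Implementation", "Design",
              "Maintenance", "Design", "Design"]
print(percent_agreement(researcher, expert))  # 75.0
```

Simple percentage agreement does not correct for chance agreement; chance-corrected coefficients such as Krippendorff's alpha are stricter alternatives, but the 70% benchmark cited from Tabatabai (2005) is stated in percentage-agreement terms.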

Discussion

A total of 99 learning objectives were derived from the statements obtained from the LIS educators who participated in this study. The distribution of these learning objectives across the DBM competence areas is shown in Figure 3. Table 2 and Table 3 show details of the competence levels proposed by educators, as derived from their learning objectives, in the CPD and KD respectively. Empty cells in Table 2 and subsequent tables indicate that no learning objective was placed at the competence level in question, in either the CPD or the KD. For instance, in Table 2, no learning objective emphasizes the competence level Create in the CPD of Database Planning.

Similarly, a total of 141 learning objectives were derived from the statements obtained from LIS practitioners who participated in this study. The distribution of these learning objectives in the DBM Competence areas is shown in Figure 4. Tables 4 and 5 show details of the competence levels proposed by practitioners as derived from their learning objectives in the CPD and KD respectively.

In the KD, no learning objective obtained from the practitioners emphasizes the Metacognitive Knowledge level. In order to establish if there is a significant difference between the level of competence that LIS educators propose for students, as emphasized in the learning objectives of their DBM courses, and those that LIS practitioners require from LIS graduates as prescribed in learning objectives obtained from practitioners, the CPD and KD were considered separately within each DBM competence area, as well as in DBM overall. Table 6 highlights the distribution of the learning objectives obtained from educators and practitioners in the DBM Competence areas.

A chi-square test shows that there is no significant difference between educators and practitioners in the distribution of their learning objectives across the various DBM competence areas. This indicates that the DBM competence areas educators emphasize in their DBM courses are similar to those in which practitioners demand competence from LIS graduates. Table 7 and Table 8 lay out the levels of competence in all the DBM competence areas proposed by educators and required by practitioners in the CPD and KD, respectively.
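As a sketch of this check, the Pearson chi-square statistic for the Table 6 distribution can be computed as follows. The cell counts are reconstructed from the reported percentages and sample sizes (n = 99 educators, n = 141 practitioners), so they are an approximation of, not a substitute for, the study's raw data.

```python
# Sketch of the chi-square test on the Table 6 distribution. Counts are
# reconstructed from the reported percentages (an assumption), rows are
# educators and practitioners, columns are the five MDBLC areas:
# Planning, Design, Implementation, Operation, Maintenance.

def chi_square_statistic(table):
    """Pearson chi-square statistic for a contingency table (list of rows)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

counts = [[21, 48, 15, 11, 4],    # educators (n = 99)
          [21, 77, 15, 18, 10]]   # practitioners (n = 141)

stat = chi_square_statistic(counts)
# df = (2 - 1) * (5 - 1) = 4; the 0.05 critical value for df = 4 is 9.488,
# so a statistic below that threshold is not significant.
print(stat < 9.488)  # True: the two distributions do not differ significantly
```

With these reconstructed counts the statistic comes out well below the critical value, consistent with the non-significant result reported in the text.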

The Mann-Whitney U Test results shown in Table 9 and Table 10 provide details of the differences in competence level expectations between educators and practitioners in the CPD and KD, respectively. In Database Planning, the result indicates a significant difference (z = 3.037, p = 0.002) between the competence levels proposed by educators for LIS students and those expected by LIS practitioners in the CPD. Whereas educators emphasize Understand, followed by Remember, the two lowest competence levels, practitioners expect Analyze, followed by Create, which are higher competence levels.
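The Mann-Whitney U computation underlying these comparisons can be sketched as follows, using the normal approximation without a tie correction. The two rank samples are hypothetical, constructed only to mimic the Database Planning pattern (educators concentrated at Remember/Understand, practitioners at Analyze and above); they are not the study's data, and the resulting z value will therefore differ from the reported z = 3.037.

```python
# Illustrative Mann-Whitney U test on CPD rank data (1 = Remember ...
# 6 = Create), using the normal approximation without a tie correction.
# The samples are hypothetical, mimicking the Database Planning pattern.

from math import sqrt

def mann_whitney_z(x, y):
    """Return (U, z) for sample x against sample y, averaging tied ranks."""
    combined = sorted(x + y)
    # Assign each distinct value the mean of the ranks it occupies.
    rank_of = {}
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j] == combined[i]:
            j += 1
        rank_of[combined[i]] = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        i = j
    r1 = sum(rank_of[v] for v in x)
    n1, n2 = len(x), len(y)
    u = r1 - n1 * (n1 + 1) / 2
    mu = n1 * n2 / 2
    sigma = sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    return u, (u - mu) / sigma

educators = [1, 2, 2, 2, 1, 2, 2]         # mostly Remember/Understand
practitioners = [4, 4, 6, 4, 3, 5, 4, 6]  # mostly Analyze and above

u, z = mann_whitney_z(educators, practitioners)
print(abs(z) > 1.96)  # True: a significant difference at alpha = 0.05
```

Because the CPD and KD categories are ordinal rather than interval scales, a rank-based test of this kind is the appropriate choice over a t-test, which is presumably why the study applies a non-parametric procedure.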

The distribution of learning objectives obtained from educators indicates that the most emphasized DBM competence area is Database Design (48.5%), followed by Database Planning (21.2%), Database Implementation (15.2%), Database Operation (11.1%), and Database Maintenance (4.0%). Regarding the level of competence that LIS educators propose for students, as emphasized in the learning objectives of their DBM courses, the results in the CPD (see Table 2) show that the most emphasized competence level is Understand, with 35 (35.4%) of the 99 learning objectives obtained from the educators. It is followed by Apply (24.2%); Create (22.2%); Remember (10.1%); Analyze (5.1%); and Evaluate (3.0%). In the KD (Table 3), the level of competence most emphasized is Procedural Knowledge, with 40 (40.4%) of the 99 learning objectives obtained. Next to Procedural Knowledge are Conceptual Knowledge (36.4%) and Factual Knowledge (23.2%). Clearly, the level of competence most emphasized in DBM by educators is Understand/Procedural Knowledge, followed by Apply/Conceptual Knowledge. At this level, students are expected to learn to put database management concepts to work as database managers.

For the practitioners, Figure 4 shows that the most emphasized DBM competence area is also Database Design (54.6%), followed by Database Planning (14.9%), Database Operation (12.8%), Database Implementation (10.6%), and Database Maintenance (7.1%). Practitioners therefore attach varying degrees of importance to the various DBM competence areas. In the CPD (see Table 4), the level of competence that LIS practitioners emphasize most is Understand (28.4%), followed by Apply (17.7%); Analyze (16.3%); Create (15.6%); Remember (14.9%); and Evaluate (7.1%). In the KD, the level of competence most emphasized is Conceptual Knowledge (see Table 5), with 69 (48.9%) of the learning objectives considered in this dimension. Procedural Knowledge is next with 51 (36.2%) learning objectives, and lastly Factual Knowledge with 21 (14.9%) learning objectives. Thus the most emphasized level of competence in DBM for practitioners is Understand/Conceptual Knowledge. Like the educators, practitioners did not provide any learning objectives that emphasize Metacognitive Knowledge.

In this study, no statistically significant difference is found between the competence levels proposed for LIS students by educators and those expected of LIS graduates at entry-level positions by practitioners, in the CPD (z = 0.027, p = 0.978) as well as the KD (z = 0.245, p = 0.806). There is therefore alignment between LIS education and practitioner expectations of LIS graduates. Interestingly, although there is no statistically significant difference between LIS course objectives and practitioner expectations in DBM, this study shows that whereas practitioners demand Understand/Conceptual Knowledge, educators offer Understand/Procedural Knowledge to LIS students. This indicates that in the KD, educators' objectives in DBM more than match practitioners' expectations. LIS graduates who have taken DBM courses in ALA-accredited LIS programs will likely attain competence levels that match and even exceed employers' expectations, provided the educators design course delivery methods and evaluation criteria that ensure their learning objectives are actualized.

This study also reveals that, in the distribution of learning objectives obtained from the sampled educators and practitioners, Database Design is the most emphasized DBM competence area for both groups. Although the results for the CPD of Database Implementation and the KD of Database Planning are regarded as inconclusive, because the probabilities in these instances are not appreciably higher than [alpha] = 0.05, the Mann-Whitney U tests show no significant difference between educators and practitioners across all DBM competence areas overall.
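The Mann-Whitney U comparison used here treats each learning objective's competence level as an ordinal score and tests whether the two groups' distributions differ. A minimal pure-Python sketch of the test under the normal approximation (the competence codes below are hypothetical illustrations, not the study's data, and the sketch omits the tie correction to the variance, so its z will differ slightly from software that applies it):

```python
from math import sqrt, erf

def mann_whitney_u(x, y):
    """Two-sided Mann-Whitney U test via the normal approximation.

    x, y: lists of ordinal codes, e.g. CPD levels Remember=1 ... Create=6.
    Returns (U, z, p). Ties receive mid-ranks; the tie correction to the
    variance is omitted for brevity.
    """
    pooled = sorted(x + y)
    # Assign mid-ranks: each tied value gets the average of its 1-based positions.
    rank = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        rank[pooled[i]] = (i + 1 + j) / 2   # average of positions i+1 .. j
        i = j
    n1, n2 = len(x), len(y)
    r1 = sum(rank[v] for v in x)            # rank sum of the first sample
    u1 = r1 - n1 * (n1 + 1) / 2
    u = min(u1, n1 * n2 - u1)               # conventional U statistic
    mu = n1 * n2 / 2
    sigma = sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u1 - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided p-value
    return u, z, p

# Hypothetical CPD codes for two small groups of learning objectives.
educators     = [2, 2, 3, 5, 2, 6, 3, 4]
practitioners = [2, 4, 2, 3, 6, 2, 5, 3]
u, z, p = mann_whitney_u(educators, practitioners)
print(f"U = {u}, z = {z:.3f}, p = {p:.3f}")
```

A large p (well above 0.05), as reported in the study, means the null hypothesis of no difference between the two groups' distributions cannot be rejected.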

Conclusion

The results of the analysis of the educators' learning objectives and the practitioners' competence expectations in this study suggest that there is alignment between LIS education and practitioner expectations of LIS graduates in DBM. The only competence area in which a statistically significant difference was recorded between educators and practitioners is Database Planning, in the CPD. Although this result did not affect the overall DBM competence level difference indices, it is noteworthy that for Database Planning in the CPD, educators mostly emphasize Understand while practitioners mostly expect Analyze.

DBM is an essential competence area involving the use of IT in LIS work, particularly in academic libraries. The fear expressed by Swigger (1997) that there is a seemingly growing gap between LIS education and employer requirements of LIS graduates could not be established. Likewise, the "growing rift" between educators and practitioners regarding the quality of LIS education acknowledged in Moran (2001), which raised the concern that LIS education may not be targeted at the competence level employers require of graduates, appears not to exist in DBM: this study provides evidence of reasonable alignment between the level of competence offered in DBM education and the levels of competence practitioners require. With the TT (Taxonomy Table) presented and demonstrated as a conceptual framework for the analysis of learning objectives, LIS educators can now systematically and objectively monitor the competence demands of practitioners in all essential facets of LIS practice, especially in the volatile and rapidly developing area of IT. This will help ensure alignment between their learning objectives and the demands of their graduates' employers at all times. The application of the TT as a practical framework for analyzing learning objectives will also enable educators to address the issue of meaningful learning in LIS. Meaningful learning requires that instruction go beyond simple presentation of factual knowledge to the building and transfer of the knowledge and cognitive processes needed for successful problem solving (Anderson & Krathwohl, 2001). Courses in LIS programs will achieve a higher level of competence if their learning objectives emphasize higher cognitive processes and knowledge.
If LIS educators become aware that the level of competence expected of their graduates requires higher cognitive processes and knowledge, it becomes imperative that they design and pursue learning objectives that ensure the right level of competence is offered in their courses. Studies such as this one are recommended in other areas of LIS education; they could be used to check the alignment between LIS education and the practitioner requirements for accepting LIS graduates into entry-level positions in other essential LIS competency domains. This study explored only a segment of the IT competence requirements for LIS work--DBM in academic library settings within the US and Canada. Further research may be needed to explore other IT competence areas required for LIS work, and there is also a need to expand similar research into other types of libraries.

References

Anderson, L. W., & Krathwohl, D. R. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's Taxonomy of educational objectives (Complete ed.). New York, NY: Longman.

Base, M. A., Hall-Ellis, S. D., & Kinney, A. J. (2008). Making the connection: Focusing on the disconnect between LIS education and employer expectations. Journal of Education for Library and Information Science, 49(2), 91-92.

Bates, M. E. (1998). The newly minted MLS: What do we need to know today? Searcher, 6(5), 30-33.

Batini, C., Ceri, S., & Shamkant, B. N. (1992). Conceptual database design: An entity-relationship approach. Redwood City, CA: The Benjamin/Cummings Publishing Company.

Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York, NY: David McKay.

Connolly, T. M., Begg, C. E., & Strachan, A. D. (1996). Database systems: A practical approach to design, implementation and management. Wokingham, England: Addison-Wesley Publishers Limited.

Database Term Dictionary. (2004). Retrieved from http://his.osu.edu/help/database/terms.cfm

Eddison, B. (1990, February). Teaching information professionals about database design. Database, 33-37.

Ellington, H. (1984). Educational objectives, teaching and learning in higher education. Aberdeen, Scotland: Robert Gordon Institute of Technology.

Elmborg, J. K. (2008). Framing a vision for 21st-century librarianship: LIS education in changing times. The Serials Librarian, 55, 499-507.

Frank, D. G. (1989). Education for librarians in a major science-engineering library: Expectations and reality. Journal of Library Administration, 199(3/4), 107-116.

Glossary of Database Terminology. (1996). Retrieved from http://hea-www.harvard.edu/MST/simul/software/docs/pkgs/pgsql/glossary/glossary.html#databasemanager

Grissom, R. J. (1994). Statistical analysis of ordinal categorical status after therapies. Journal of Consulting and Clinical Psychology, 62(2), 281-284.

Hall-Ellis, S. D. (2008). Cataloger competencies ... What do employers require? Cataloging and Classification Quarterly, 46(3), 205-330.

Hayati, Z. (2005). Competency definition for Iranian library and information professionals. Journal of Education for Library and Information Science, 46(2), 165-192.

Heinrichs, J. H., & Lim, J. S. (2009). Emerging requirements of computer related competencies for librarians. Library and Information Science Research, 31(2), 101-106.

Intner, S. S. (2004). The invisible cataloguer. Technicalities, 24(5), 1, 9.

Johnson, R., & Kuby, P. (2000). Elementary statistics (8th ed.). Pacific Grove, CA: Duxbury.

Julien, H., McKechnie, L., & Hart, S. (2005). Affective issues in library and information science work: A content analysis. Library and Information Science Research, 27, 453-466.

Kazmer, M. M., & Xie, B. (2008). Qualitative interviewing in Internet studies: Playing with the media, playing with the method. Information, Communication, and Society, 11(2), 115-136.

Kemp, J. E., Morrison, G. R., & Ross, S. M. (1998). Designing effective instruction (2nd ed.). Upper Saddle River, NJ: Prentice Hall.

Krathwohl, D. R. (2002). A revision of Bloom's Taxonomy: An overview. Theory Into Practice, 41, 212-218.

Krippendorf, K. (2004). Reliability in content analysis: Some common misconceptions and recommendations. Human Communications Research, 30, 411-433.

Kroth, P. J., Phillips, H. E., & Eldredge, J. D. (2009). Leveraging change to integrate library and informatics competencies into a new CTSC curriculum: A program evaluation. Medical Reference Services Quarterly, 28, 221-234.

Lester, J., & Fleet, C. V. (2008). Use of professional competencies and standards documents for curriculum planning in schools of library and information studies education. Journal of Education for Library and Information Science, 49(1), 43-69.

Maatta, S. (2005, October). Closing the gap--Placements and salaries 2004. Library Journal. Retrieved from http://www.libraryjournal.com/article/CA6269428.html

Macklin, A. S., & Culp, F. B. (2008). Information literacy instruction: Competencies, caveats and a call to action. Science and Technology Libraries, 28(1), 45-61.

Mayer, R. E. (2002a). The promise of educational psychology: Teaching for meaningful learning. Upper Saddle River, NJ: Prentice Hall.

Mbabu, L. G. (2009). LIS curricula introducing information literacy courses alongside instructional classes. Journal of Education for Library and Information Science, 50(3), 203-210.

Moran, B. B. (2001, November). Practitioners versus LIS educators: Time to reconnect. Library Journal, 126(18), 52.

Moyo, L. (2002). CPE anywhere anytime: Online resources for the information society. In P. Layzell-Ward (Ed.), Continuing professional education for the information society: The World Conference on the Continuing Professional Education for the Library and Information Science Professions (pp. 224-231). München, Germany: K.G. Saur.

Neuman, W. L. (2006). Social research methods: Qualitative and quantitative approaches (6th ed.). Boston, MA: Pearson.

Novak, J. D., & Gowin, D. B. (1984). Learning to learn. New York, NY: Cambridge University Press.

Ocholla, D. N. (2002, June). An overview of the status of information and communication technologies (ICT): Their availability and usage in the academic dispensation for information services in Library schools of sub-Saharan Africa. Paper presented at the 9th International Conference, "Crimea 2002," Sudak, Ukraine.

Paquette, G., Leonard, M., Lundgren-Cayrol, K., Mihaila, S., & Gareau, D. (2006). Learning design based on graphical knowledge-modelling. Educational Technology & Society, 9(1), 97-112.

Prestamo, A. M. (2000). A comprehensive inventory of technology and computer skills for academic reference librarians. Unpublished doctoral dissertation, Oklahoma State University, Stillwater.

Reitz, J. M. (2004). Dictionary for Library and Information Science. Westport, CT: Libraries Unlimited.

Rob, P., & Coronel, C. (2000). Database systems: Design, implementation and management. Cambridge, MA: Course Technology, Thomas Learning.

Roe, R. A. (2002). Competences: A key toward the integration of theory and practice in work and organizational psychology. Gedrag en Organisatie, 15, 203-224.

Schwartz, C. (1990). Teaching database management. Database, 91-92.

Society for Human Resource Management. (2003). Competency toolkit: Competencies overview. Retrieved from http://www.shrm.org/competencies/overview.asp

Sticht, T. G., & McDonald, B. A. (1989). Making the nation smarter: The intergenerational transfer of competence. El Cajon, CA: Applied Behavioral & Cognitive Sciences.

Tabatabai, D. (2002). Information-seeking expertise: modeling a web search. Unpublished thesis, McGill University, Montreal, Quebec, Canada.

Tennant, R. (1999). Skills for the new millennium. The Library Journal, 39.

Tenopir, C., & Read, E. (2000). Patterns of database use in academic libraries. College and Research Libraries, 61, 234-246. Retrieved from http://www.ala.org/ala/acrl/acrlpubs/crljournal/backissues2000b/may00/tenopir.pdf

Thomas, C., & Patel, S. I. (2008). Competency based training for digital librarians: A viable strategy for an evolving workforce? Journal of Education for Library and Information Science, 49, 298-309.

Vines, R. (1998). Databasics: A database dictionary. Retrieved from http://www.geekgirls.com/database_dictionary.htm

Weber, R. P. (1990). Basic content analysis (2nd ed.). Beverly Hills, CA: Sage.

Chukwuemeka Dean Nwakanma

School of Information Studies, McGill University, 23 Rue Parmentier, Notre-Dame-de-l'Île-Perrot, Quebec, Canada J7V8Y6. E-mail: chukwuemekanwakanma@yahoo.ca
Table 1: The Taxonomy Table.

                                        The Cognitive Process Dimension

                                 1.           2.           3.          4.           5.           6.
The Knowledge Dimension       Remember    Understand     Apply      Analyze     Evaluate      Create

A. Factual Knowledge
B. Conceptual Knowledge
C. Procedural Knowledge
D. Metacognitive Knowledge

Table 2: The Competence Levels Proposed by Educators in Various DBM
Competence Areas: The CPD.

                                   Cognitive Process Dimension (CPD)

Competence Areas                  Remember    Understand     Apply

Planning
  Count                              7            8            3
  % within Competence
    Areas                          33.3%        38.1%        14.3%
  % within Cognitive Process
    Dimension (CPD)                70.0%        22.9%        12.5%

Design
  Count                              1            25           8
  % within Competence
    Areas                           2.1%        52.1%        16.7%
  % within Cognitive Process
    Dimension (CPD)                10.0%        71.4%        33.3%

Implementation
  Count                              1                         4
  % within Competence
    Areas                           6.7%                     26.7%
  % within Cognitive Process
    Dimension (CPD)                10.0%                     16.7%

Operation
  Count                              1            1            7
  % within Competence
    Areas                           9.1%         9.1%        63.6%
  % within Cognitive Process
    Dimension (CPD)                10.0%         2.9%        29.2%

Maintenance
  Count                                           1            2
  % within Competence
    Areas                                       25.0%        50.0%
  % within Cognitive Process
    Dimension (CPD)                              2.9%         8.3%

Total
  Count                              10           35           24
  % within Competence
    Areas                          10.1%        35.4%        24.2%
  % within Cognitive Process
    Dimension (CPD)                100.0%       100.0%       100.0%

                                   Cognitive Process
                                    Dimension (CPD)

Competence Areas                  Analyze      Evaluate

Planning
  Count                              2            1
  % within Competence
    Areas                           9.5%         4.8%
  % within Cognitive Process
    Dimension (CPD)                40.0%        33.3%

Design
  Count                              3
  % within Competence
    Areas                           6.3%
  % within Cognitive Process
    Dimension (CPD)                60.0%

Implementation
  Count
  % within Competence
    Areas
  % within Cognitive Process
    Dimension (CPD)

Operation
  Count                                           1
  % within Competence
    Areas                                        9.1%
  % within Cognitive Process
    Dimension (CPD)                             33.3%

Maintenance
  Count                                           1
  % within Competence
    Areas                                       25.0%
  % within Cognitive Process
    Dimension (CPD)                             33.3%

Total
  Count                              5            3
  % within Competence
    Areas                           5.1%         3.0%
  % within Cognitive Process
    Dimension (CPD)                100.0%       100.0%

                                   Cognitive Process
                                    Dimension (CPD)

Competence Areas                   Create       Total

Planning
  Count                                           21
  % within Competence
    Areas                                       100.0%
  % within Cognitive Process
    Dimension (CPD)                             21.2%

Design
  Count                              11           48
  % within Competence
    Areas                          22.9%        100.0%
  % within Cognitive Process
    Dimension (CPD)                50.0%        48.5%

Implementation
  Count                              10           15
  % within Competence
    Areas                          66.7%        100.0%
  % within Cognitive Process
    Dimension (CPD)                45.5%        15.2%

Operation
  Count                              1            11
  % within Competence
    Areas                           9.1%        100.0%
  % within Cognitive Process
    Dimension (CPD)                 4.5%        11.1%

Maintenance
  Count                                           4
  % within Competence
    Areas                                       100.0%
  % within Cognitive Process
    Dimension (CPD)                              4.0%

Total
  Count                              22           99
  % within Competence
    Areas                          22.2%        100.0%
  % within Cognitive Process
    Dimension (CPD)                100.0%       100.0%

Table 3: The Competence Levels Proposed by Educators in Various DBM
Competence Areas: The KD.

                                        Knowledge Dimension (KD)

                                          Factual     Conceptual
Competence Areas                         Knowledge     Knowledge

Planning
  Count                                     15             3
  % within Competence Areas                71.4%         14.3%
  % within Knowledge Dimension (KD)        65.2%         8.3%

Design
  Count                                      7            24
  % within Competence Areas                14.6%         50.0%
  % within Knowledge Dimension (KD)        30.4%         66.7%

Implementation
  Count                                                    4
  % within Competence Areas                              26.7%
  % within Knowledge Dimension (KD)                      11.1%

Operation
  Count                                      1             3
  % within Competence Areas                9.1%          27.3%
  % within Knowledge Dimension (KD)        4.3%          8.3%

Maintenance
  Count                                                    2
  % within Competence Areas                              50.0%
  % within Knowledge Dimension (KD)                      5.6%

Total
  Count                                     23            36
  % within Competence Areas                23.2%         36.4%
  % within Knowledge Dimension (KD)       100.0%        100.0%

                                         Knowledge
                                         Dimension
                                           (KD)

                                        Procedural
Competence Areas                         Knowledge       Total

Planning
  Count                                      3            21
  % within Competence Areas                14.3%        100.0%
  % within Knowledge Dimension (KD)        7.5%          21.2%

Design
  Count                                     17            48
  % within Competence Areas                35.4%        100.0%
  % within Knowledge Dimension (KD)        42.5%         48.5%

Implementation
  Count                                     11            15
  % within Competence Areas                73.3%        100.0%
  % within Knowledge Dimension (KD)        27.5%         15.2%

Operation
  Count                                      7            11
  % within Competence Areas                63.6%        100.0%
  % within Knowledge Dimension (KD)        17.5%         11.1%

Maintenance
  Count                                      2             4
  % within Competence Areas                50.0%        100.0%
  % within Knowledge Dimension (KD)        5.0%          4.0%

Total
  Count                                     40            99
  % within Competence Areas                40.4%        100.0%
  % within Knowledge Dimension (KD)       100.0%        100.0%

Table 4: The Competence Levels Proposed by Practitioners in Various
DBM Competence Areas: The CPD.

                                 Cognitive Process Dimension (CPD)

Competence Areas                 Remember    Understand     Apply

Planning
  Count                             3            3
  % within Competence
    Areas                         14.3%        14.3%
  % within Cognitive Process
    Dimension (CPD)               14.3%         7.5%

Design
  Count                             16           27           11
  % within Competence
    Areas                         20.8%        35.1%        14.3%
  % within Cognitive Process
    Dimension (CPD)               76.2%        67.5%        44.0%

Implementation
  Count                                          1            8
  % within Competence
    Areas                                       6.7%        53.3%
  % within Cognitive Process
    Dimension (CPD)                             2.5%         32.0%

Operation
  Count                                          9            5
  % within Competence
    Areas                                      50.0%        27.8%
  % within Cognitive Process
    Dimension (CPD)                             22.5%        20.0%

Maintenance
  Count                             2                         1
  % within Competence
    Areas                         20.0%                     10.0%
  % within Cognitive Process
    Dimension (CPD)                9.5%                      4.0%

Total
  Count                             21           40           25
  % within Competence
    Areas                         14.9%        28.4%        17.7%
  % within Cognitive Process
    Dimension (CPD)               100.0%       100.0%       100.0%

                                   Cognitive Process
                                    Dimension (CPD)

Competence Areas                 Analyze      Evaluate

Planning
  Count                             9            2
  % within Competence
    Areas                         42.9%         9.5%
  % within Cognitive Process
    Dimension (CPD)               39.1%        20.0%

Design
  Count                             8            4
  % within Competence
    Areas                         10.4%         5.2%
  % within Cognitive Process
    Dimension (CPD)               34.8%        40.0%

Implementation
  Count                                          2
  % within Competence
    Areas                                      13.3%
  % within Cognitive Process
    Dimension (CPD)                            20.0%

Operation
  Count                             2
  % within Competence
    Areas                         11.1%
  % within Cognitive Process
    Dimension (CPD)                8.7%

Maintenance
  Count                             4            2
  % within Competence
    Areas                         40.0%        20.0%
  % within Cognitive Process
    Dimension (CPD)               17.4%        20.0%

Total
  Count                             23           10
  % within Competence
    Areas                         16.3%         7.1%
  % within Cognitive Process
    Dimension (CPD)               100.0%       100.0%

                                Cognitive      Total
                                 Process
                                Dimension
                                  (CPD)

Competence Areas                  Create

Planning
  Count                             4            21
  % within Competence
    Areas                         19.0%        100.0%
  % within Cognitive Process
    Dimension (CPD)               18.2%        14.9%

Design
  Count                             11           77
  % within Competence
    Areas                         14.3%        100.0%
  % within Cognitive Process
    Dimension (CPD)               50.0%        54.6%

Implementation
  Count                             4            15
  % within Competence
    Areas                         26.7%        100.0%
  % within Cognitive Process
    Dimension (CPD)               18.2%        10.6%

Operation
  Count                             2            18
  % within Competence
    Areas                         11.1%        100.0%
  % within Cognitive Process
    Dimension (CPD)                9.1%        12.8%

Maintenance
  Count                             1            10
  % within Competence
    Areas                         10.0%        100.0%
  % within Cognitive Process
    Dimension (CPD)                4.5%         7.1%

Total
  Count                             22          141
  % within Competence
    Areas                         15.6%        100.0%
  % within Cognitive Process
    Dimension (CPD)               100.0%       100.0%

Table 5: The Competence Levels Proposed by Practitioners in Various DBM
Competence Areas: The KD.

                                       Knowledge Dimension (KD)

                                        Factual     Conceptual
Competence Areas                       Knowledge     Knowledge

Planning
  Count                                    9             7
  % within Competence Areas              42.9%         33.3%
  % within Knowledge Dimension (KD)      42.9%         10.1%

Design
  Count                                    6            49
  % within Competence Areas              7.8%          63.6%
  % within Knowledge Dimension (KD)      28.6%         71.0%

Implementation
  Count                                    2             5
  % within Competence Areas              13.3%         33.3%
  % within Knowledge Dimension (KD)      9.5%          7.2%

Operation
  Count                                    2             3
  % within Competence Areas              11.1%         16.7%
  % within Knowledge Dimension (KD)      9.5%          4.3%

Maintenance
  Count                                    2             5
  % within Competence Areas              20.0%         50.0%
  % within Knowledge Dimension (KD)      9.5%          7.2%

Total
  Count                                   21            69
  % within Competence Areas              14.9%         48.9%
  % within Knowledge Dimension (KD)     100.0%        100.0%

                                       Knowledge
                                       Dimension
                                         (KD)

                                      Procedural
Competence Areas                       Knowledge       Total

Planning
  Count                                    5            21
  % within Competence Areas              23.8%        100.0%
  % within Knowledge Dimension (KD)      9.8%          14.9%

Design
  Count                                   22            77
  % within Competence Areas              28.6%        100.0%
  % within Knowledge Dimension (KD)      43.1%         54.6%

Implementation
  Count                                    8            15
  % within Competence Areas              53.3%        100.0%
  % within Knowledge Dimension (KD)      15.7%         10.6%

Operation
  Count                                   13            18
  % within Competence Areas              72.2%        100.0%
  % within Knowledge Dimension (KD)      25.5%         12.8%

Maintenance
  Count                                    3            10
  % within Competence Areas              30.0%        100.0%
  % within Knowledge Dimension (KD)      5.9%          7.1%

Total
  Count                                   51            141
  % within Competence Areas              36.2%        100.0%
  % within Knowledge Dimension (KD)     100.0%        100.0%

Table 6: The Distribution of Learning Objectives Obtained from
Educators and Practitioners in the DBM Competence Areas.

                       Number of Learning   Number of Learning
                        Objectives from      Objectives from
DBM Competence Areas       Educators          Practitioners

Planning                   21 (21.2%)           21 (14.9%)
Design                     48 (48.5%)           77 (54.6%)
Implementation             15 (15.2%)           15 (10.6%)
Operation                  11 (11.1%)           18 (12.8%)
Maintenance                 4 (4.0%)            10 (7.1%)

Table 7: Competence Levels Proposed by Educators Versus
Those of Practitioners in DBM Competence Areas: The CPD.

                           Cognitive Process Dimension (CPD)

Competence Areas         Remember     Understand       Apply

Planning
  Educators
    Count                    7             8             3
    % within Type of       33.3%         38.1%         14.3%
      Respondent

Practitioners
    Count                    3             3
    % within Type of       14.3%         14.3%
      Respondent

Total
    Count                   10            11             3
    % within Type of       23.8%         26.2%         7.1%
      Respondent

Design
  Educators
    Count                    1            25             8
    % within Type of       2.1%          52.1%         16.7%
      Respondent

Practitioners
    Count                   16            27            11
    % within Type of       20.8%         35.1%         14.3%
      Respondent

Total
    Count                   17            52            19
    % within Type of       13.6%         41.6%         15.2%
      Respondent

Implementation
  Educators
    Count                    1                           4
    % within Type of       6.7%                        26.7%
      Respondent

Practitioners
    Count                                  1             8
    % within Type of                     6.7%          53.3%
      Respondent

Total
    Count                    1             1            12
    % within Type of       3.3%          3.3%          40.0%
      Respondent

Operation
  Educators
    Count                    1             1             7
    % within Type of       9.1%          9.1%          63.6%
      Respondent

Practitioners
    Count                                  9             5
    % within Type of                     50.0%         27.8%
      Respondent

Total
    Count                    1            10            12
    % within Type of       3.4%          34.5%         41.4%
      Respondent

Maintenance
  Educators
    Count                                  1             2
    % within Type of                     25.0%         50.0%
      Respondent

Practitioners
    Count                    2                           1
    % within Type of       20.0%                       10.0%
      Respondent

Total
    Count                    2             1             3
    % within Type of       14.3%         7.1%          21.4%
      Respondent

                          Cognitive Process Dimension (CPD)

Competence Areas           Analyze    Evaluate    Create     Total

Planning
  Educators
    Count                     2           1                    21
    % within Type of         9.5%        4.8%                100.0%
      Respondent
  Practitioners
    Count                     9           2           4        21
    % within Type of        42.9%        9.5%       19.0%    100.0%
      Respondent
  Total
    Count                    11           3           4        42
    % within Type of        26.2%        7.1%        9.5%    100.0%
      Respondent

Design
  Educators
    Count                     3                      11        48
    % within Type of         6.3%                   22.9%    100.0%
      Respondent
  Practitioners
    Count                     8           4          11        77
    % within Type of        10.4%        5.2%       14.3%    100.0%
      Respondent
  Total
    Count                    11           4          22       125
    % within Type of         8.8%        3.2%       17.6%    100.0%
      Respondent

Implementation
  Educators
    Count                                            10        15
    % within Type of                                66.7%    100.0%
      Respondent
  Practitioners
    Count                                 2           4        15
    % within Type of                    13.3%       26.7%    100.0%
      Respondent
  Total
    Count                                 2          14        30
    % within Type of                     6.7%       46.7%    100.0%
      Respondent

Operation
  Educators
    Count                                 1           1        11
    % within Type of                     9.1%        9.1%    100.0%
      Respondent
  Practitioners
    Count                     2                       2        18
    % within Type of        11.1%                   11.1%    100.0%
      Respondent
  Total
    Count                     2           1           3        29
    % within Type of         6.9%        3.4%       10.3%    100.0%
      Respondent

Maintenance
  Educators
    Count                                 1                     4
    % within Type of                    25.0%               100.0%
      Respondent
  Practitioners
    Count                     4           2           1        10
    % within Type of        40.0%       20.0%       10.0%    100.0%
      Respondent
  Total
    Count                     4           3           1        14
    % within Type of        28.6%       21.4%        7.1%    100.0%
      Respondent

Table 8: Competence Levels Proposed by Educators Versus Those of
Practitioners in DBM Competence Areas: The KD.

                                 Knowledge Dimension (KD)

                                  Factual     Conceptual    Procedural
Competence Areas                 Knowledge     Knowledge     Knowledge      Total

Planning
  Educators
    Count                           15             3             3           21
    % within Type of Respondent    71.4%         14.3%         14.3%       100.0%
  Practitioners
    Count                            9             7             5           21
    % within Type of Respondent    42.9%         33.3%         23.8%       100.0%
  Total
    Count                           24            10             8           42
    % within Type of Respondent    57.1%         23.8%         19.0%       100.0%

Design
  Educators
    Count                            7            24            17           48
    % within Type of Respondent    14.6%         50.0%         35.4%       100.0%
  Practitioners
    Count                            6            49            22           77
    % within Type of Respondent     7.8%         63.6%         28.6%       100.0%
  Total
    Count                           13            73            39          125
    % within Type of Respondent    10.4%         58.4%         31.2%       100.0%

Implementation
  Educators
    Count                                          4            11           15
    % within Type of Respondent                  26.7%         73.3%       100.0%
  Practitioners
    Count                            2             5             8           15
    % within Type of Respondent    13.3%         33.3%         53.3%       100.0%
  Total
    Count                            2             9            19           30
    % within Type of Respondent     6.7%         30.0%         63.3%       100.0%

Operation
  Educators
    Count                            1             3             7           11
    % within Type of Respondent     9.1%         27.3%         63.6%       100.0%
  Practitioners
    Count                            2             3            13           18
    % within Type of Respondent    11.1%         16.7%         72.2%       100.0%
  Total
    Count                            3             6            20           29
    % within Type of Respondent    10.3%         20.7%         69.0%       100.0%

Maintenance
  Educators
    Count                                          2             2            4
    % within Type of Respondent                  50.0%         50.0%       100.0%
  Practitioners
    Count                            2             5             3           10
    % within Type of Respondent    20.0%         50.0%         30.0%       100.0%
  Total
    Count                            2             7             5           14
    % within Type of Respondent    14.3%         50.0%         35.7%       100.0%
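The "% within Type of Respondent" rows in Tables 7 and 8 are row percentages: each cell count divided by that respondent group's row total. A minimal sketch of the computation (the function name is our own, not from the study's software), using the Planning/Educators row of Table 8 as a check:

```python
def row_percentages(counts):
    """Convert a row of counts into '% within type of respondent':
    each cell expressed as a percentage of the row total, to 1 d.p."""
    total = sum(counts)
    return [round(100 * c / total, 1) for c in counts]

# Planning, educators (Table 8): Factual=15, Conceptual=3, Procedural=3
print(row_percentages([15, 3, 3]))  # [71.4, 14.3, 14.3]
```

Applying the same function to the practitioners' Planning row `[9, 7, 5]` reproduces the reported 42.9%, 33.3%, and 23.8%.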

Table 9: Mann-Whitney U Test Results of Competence Level Difference
Between Educators and Practitioners: The CPD.

DBM Competence Areas    Edu--N    Pract--N    z Score      p      Remarks

Planning                  21         21        3.072     0.002    Significant difference
Design                    48         77        1.305     0.192    No significant difference
Implementation            15         15        1.702     0.089    Inconclusive result
Operation                 11         18        0.906     0.365    No significant difference
Maintenance                4         10        0.722     0.470    No significant difference
DBM Overall               99        141        0.027     0.978    No significant difference

Table 10: Mann-Whitney U Test Results of Competence Level Difference
Between Educators and Practitioners: The KD.

DBM Competence Areas    Edu--N    Pract--N    z Score      p      Remarks

Planning                  21         21        1.694     0.090    Inconclusive result
Design                    48         77        0.156     0.876    No significant difference
Implementation            15         15        1.296     0.195    No significant difference
Operation                 11         16        0.386     0.699    No significant difference
Maintenance                4         10        0.931     0.352    No significant difference
DBM Overall               99        141        0.245     0.806    No significant difference
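The z scores and p values in Tables 9 and 10 come from Mann-Whitney U tests comparing the competence-level ratings of the two respondent groups. A minimal stdlib sketch of the test's large-sample normal approximation (function names are our own; no tie correction is applied, so results on heavily tied ordinal data may differ slightly from statistical packages):

```python
import math

def ranks_with_ties(values):
    """Assign 1-based ranks to values, giving tied values their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        # Extend j to cover the whole run of tied values
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def mann_whitney_z(sample_a, sample_b):
    """U statistic, z score, and two-sided p via the normal approximation."""
    n1, n2 = len(sample_a), len(sample_b)
    ranks = ranks_with_ties(list(sample_a) + list(sample_b))
    r1 = sum(ranks[:n1])                      # rank sum of sample_a
    u1 = r1 - n1 * (n1 + 1) / 2               # U statistic for sample_a
    mu = n1 * n2 / 2                          # mean of U under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)  # sd of U under H0
    z = (u1 - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2))      # two-sided p from standard normal
    return u1, z, p
```

With hypothetical ratings like `mann_whitney_z([1, 2, 3], [4, 5, 6])`, the two fully separated samples yield U = 0 and a z score of about -1.96; identical samples yield z = 0 and p = 1, matching the "no significant difference" pattern reported for most competence areas.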

Figure 3. The distribution of learning objectives obtained from
educators in the DBM competence areas.

Competence Areas    Frequency   Percent

Planning               21        21.2
Design                 48        48.5
Implementation         15        15.2
Operation              11        11.1
Maintenance             4         4.0

Total                  99       100.0

Figure 4. The distribution of learning objectives obtained from
practitioners in the DBM competence areas.

Competence Areas      Frequency     Percent

Planning                  21          14.9
Design                    77          54.6
Implementation            15          10.6
Operation                 18          12.8
Maintenance               10           7.1

Total                     141        100.0

Figure 5. Distribution of learning objectives from educators and
practitioners.

                        Type of Respondent
Competence Areas     Educators   Practitioners     Total

Planning                 21            21            42
Design                   48            77           125
Implementation           15            15            30
Operation                11            18            29
Maintenance               4            10            14

Total                    99           141           240
COPYRIGHT 2011 Association for Library and Information Science Education

Author: Nwakanma, Chukwuemeka Dean
Publication: Journal of Education for Library and Information Science
Article Type: Report
Date: Jan 1, 2011