Concept inventory
A concept inventory is a criterion-referenced test designed to help determine whether a student has an accurate working knowledge of a specific set of concepts. Historically, concept inventories have taken the form of multiple-choice tests to aid interpretability and to ease administration in large classes. Unlike a typical, teacher-authored multiple-choice test, the questions and response choices on a concept inventory are the subject of extensive research. The aims of that research include ascertaining (a) the range of what individuals think a particular question is asking and (b) the most common responses to the questions. Concept inventories are evaluated to ensure test reliability and validity. In its final form, each question includes one correct answer and several distractors.
Ideally, a score on a criterion-referenced test reflects the amount of content knowledge a student has mastered. Criterion-referenced tests differ from norm-referenced tests in that (in theory) the former are not used to compare an individual's score to the scores of the group. Ordinarily, the purpose of a criterion-referenced test is to ascertain whether a student has mastered a predetermined body of content; upon obtaining a test score at or above a cutoff score, the student can move on to the body of content that follows next in the learning sequence. In general, items whose difficulty values (the proportion of test-takers answering correctly) fall between 30% and 70% provide the most information about student understanding.
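The two calculations mentioned above, item difficulty and a cutoff-score mastery decision, are simple proportions. The following is a minimal sketch; the example data, the 0.80 cutoff, and the function names are illustrative assumptions rather than values taken from any published inventory.

```python
# Minimal sketch: item difficulty and a cutoff-score mastery decision for a
# criterion-referenced test. The example data, 0.80 cutoff, and function names
# are illustrative assumptions, not values from any published concept inventory.

from typing import List


def item_difficulty(responses: List[bool]) -> float:
    """Item difficulty: the proportion of test-takers who answered the item correctly."""
    return sum(responses) / len(responses)


def is_informative(difficulty: float, low: float = 0.30, high: float = 0.70) -> bool:
    """Flag items whose difficulty falls in the 30%-70% band described above."""
    return low <= difficulty <= high


def has_mastered(score: float, cutoff: float = 0.80) -> bool:
    """Criterion-referenced decision: compare a student's overall score to a preset cutoff."""
    return score >= cutoff


# One item answered by five students (True = correct): difficulty 0.6, informative.
print(item_difficulty([True, True, False, True, False]))  # 0.6
print(is_informative(0.6))                                # True
print(has_mastered(0.85))                                 # True
```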
The distractors are incorrect or irrelevant answers that are usually (but not always) based on students' commonly held misconceptions.[1] Test developers often research student misconceptions by examining students' responses to open-ended essay questions and conducting "think-aloud" interviews with students. The distractors chosen by students help researchers understand student thinking and give instructors insights into students' prior knowledge (and, sometimes, firmly held beliefs). This foundation in research underlies instrument construction and design, and plays a role in helping educators obtain clues about students' ideas, scientific misconceptions, and didaskalogenic ("teacher-induced" or "teaching-induced") confusions and conceptual lacunae that interfere with learning.
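Because each distractor is tied to a commonly held misconception, tallying which distractors students choose is one way instructors can read an inventory's results. The item below is a sketch only: the question wording, options, and misconception labels are invented for illustration and are not drawn from any actual instrument.

```python
# Sketch of one inventory item whose distractors map to misconceptions; the
# question wording, options, and misconception labels below are invented for
# illustration and are not drawn from any actual instrument.

from collections import Counter

item = {
    "question": "A ball is thrown straight up. At the top of its flight, the net force on it is...",
    "correct": "directed downward (gravity acts on the ball)",
    "distractors": {
        "zero, because the ball is momentarily at rest": "motion (or rest) implies force (or no force)",
        "directed upward, left over from the throw": "an object carries the 'force of the throw' with it",
    },
}

# Tallying which distractors students choose points to the most common misconceptions.
responses = [
    "zero, because the ball is momentarily at rest",
    "directed downward (gravity acts on the ball)",
    "zero, because the ball is momentarily at rest",
    "directed upward, left over from the throw",
]
counts = Counter(responses)
for choice, misconception in item["distractors"].items():
    print(f"{counts[choice]} student(s) chose '{choice}' -> suggests: {misconception}")
```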
Concept inventories in use
Concept inventories are education-related diagnostic tests.[2] In 1985 Halloun and Hestenes introduced a "multiple-choice mechanics diagnostic test" to examine students' concepts about motion.[3] It evaluates student understanding of basic concepts in classical (macroscopic) mechanics. A little later, another concept inventory, the Force Concept Inventory (FCI), was developed.[3][4][5] The FCI was designed to assess student understanding of the Newtonian concepts of force. Hestenes (1998) found that while "nearly 80% of the [students completing introductory college physics courses] could state Newton's Third Law at the beginning of the course, FCI data showed that less than 15% of them fully understood it at the end". These results have been replicated in a number of studies involving students at a range of institutions (see the references below). That said, questions remain as to what exactly the FCI measures.[6] Results obtained with the FCI have led to greater recognition in the science education community of the importance of students' "interactive engagement" with the materials to be mastered.[7]
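Pre- and post-instruction FCI percentages such as those quoted above are often compared using Hake's normalized gain, a metric that is standard in this literature but is not discussed in this article; the scores in the sketch below are invented for illustration.

```python
# Sketch only: Hake's normalized gain is widely used to compare pre- and
# post-instruction FCI scores, but it is not discussed in this article, and the
# scores below are invented for illustration.

def normalized_gain(pre_pct: float, post_pct: float) -> float:
    """g = (post - pre) / (100 - pre): the fraction of the possible improvement achieved."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)


# A class moving from 30% to 65% on the FCI realizes half of its possible gain.
print(normalized_gain(30.0, 65.0))  # 0.5
```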
Since the development of the FCI, other physics instruments have been developed. These include the Force and Motion Conceptual Evaluation[9] and the Brief Electricity and Magnetism Assessment.[10] For a discussion of how a number of concept inventories were developed, see Beichner.[11]
In addition to physics, concept inventories have been developed in statistics,[12] chemistry,[13][14] astronomy,[15] basic biology,[16][17][18][19] natural selection,[20][21][22] genetics,[23] engineering,[24] geoscience,[25] and computer science.[26]
In many areas, foundational scientific concepts transcend disciplinary boundaries. An example of an inventory that assesses knowledge of such concepts is an instrument developed by Odom and Barrow (1995) to evaluate understanding of diffusion and osmosis.[27] In addition, there are non-multiple-choice conceptual instruments, such as the essay-based approach[14] and the essay and oral exams used to measure student understanding of Lewis structures in chemistry.[21][28]
Caveats associated with concept inventory use
Some concept inventories are problematic. The concepts tested may not be fundamental or important in a particular discipline, the concepts involved may not be explicitly taught in a class or curriculum, or answering a question correctly may require only a superficial understanding of a topic. A concept inventory can therefore either over-estimate or under-estimate a student's content mastery. Moreover, concept inventories designed to identify trends in student thinking may not be useful for monitoring learning gains that result from pedagogical interventions, and disciplinary mastery may not be the variable that a particular instrument actually measures. Users should take care to ensure that a concept inventory actually tests conceptual understanding, rather than test-taking ability, language skills, or other abilities that can influence test performance.
The use of multiple-choice exams as concept inventories is not without controversy. The very structure of multiple-choice concept inventories raises questions about the extent to which complex, and often nuanced, situations and ideas must be simplified or clarified to produce unambiguous response options. For example, a multiple-choice exam designed to assess knowledge of key concepts in natural selection[20] does not meet a number of standards of quality control.[22] One problem is that the two members of each of several pairs of parallel items, each pair designed to measure exactly one key concept in natural selection, sometimes have very different levels of difficulty.[21] Another problem is that the multiple-choice exam overestimates knowledge of natural selection relative to student performance on a diagnostic essay exam and a diagnostic oral exam, two instruments with reasonably good construct validity.[21] Although scoring concept inventories in the form of essay or oral exams is labor-intensive, costly, and difficult to implement with large numbers of students, such exams can offer a more realistic appraisal of students' actual levels of conceptual mastery, as well as of their misconceptions.[14][21] Recently, however, computer technology has been developed that can score essay responses on concept inventories in biology and other domains,[29] promising to facilitate the scoring of concept inventories organized as (transcribed) oral exams as well as essays.
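A quality-control check of the kind described above, comparing the difficulty of the two items in each parallel pair, can be expressed as a simple comparison of proportions. The concept labels, counts, and 0.15 flagging threshold below are illustrative assumptions, not figures from the cited studies.

```python
# Sketch of the pair-difficulty check described above: the concept labels, counts,
# and 0.15 flagging threshold are illustrative assumptions, not figures from the
# cited studies.

def difficulty(correct: int, total: int) -> float:
    """Proportion of students answering an item correctly."""
    return correct / total


# Each tuple: (concept, correct answers on item A, correct answers on item B, class size).
pairs = [
    ("variation", 120, 118, 150),
    ("heritability", 135, 60, 150),
]

for concept, a_correct, b_correct, n in pairs:
    gap = abs(difficulty(a_correct, n) - difficulty(b_correct, n))
    if gap > 0.15:
        print(f"Parallel items for '{concept}' differ markedly in difficulty (gap = {gap:.2f})")
```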
References
External links
- Astronomy
- Biology Concept Inventory
- Bio-Diagnostic Question Clusters
- Classroom Concepts and Diagnostic Tests
- Chemistry
- Diagnostic Question Clusters in Biology
- Engineering
- Evolution Assessment
- Force Concept Inventory
- Genetics
- Geosciences
- Molecular Life Sciences Concept Inventory
- Physics
- Statistics
- Thinking Like a Biologist
- ↑ "Development and Validation of Instruments to Measure Learning of Expert-Like Thinking." W. K. Adams & C. E. Wieman, 2010. International Journal of Science Education, 1-24. iFirst, Шаблон:Doi
- ↑ Template:Cite journal
- ↑ 3.0 3.1 Halloun, I. A., & Hestenes, D. (1985). Common sense concepts about motion. American Journal of Physics, 53, 1043-1055.
- ↑ Template:Cite journal
- ↑ Template:Cite journal
- ↑ Template:Cite journal
- ↑ Template:Cite journal
- ↑ Redish page. Visited Feb. 14, 2011
- ↑ Template:Cite journal
- ↑ Ding, L., Chabay, R., Sherwood, B., & Beichner, R. (2006). Evaluating an electricity and magnetism assessment tool: Brief Electricity and Magnetism Assessment (BEMA). Phys. Rev. ST Physics Ed. Research, 2 (7 pages).
- ↑ Template:Cite journal
- ↑ Allen, K (2006) The Statistics Concept Inventory: Development and Analysis of a Cognitive Assessment Instrument in Statistics. Doctoral dissertation, The University of Oklahoma. [1]
- ↑ Template:Cite web
- ↑ 14.0 14.1 14.2 Template:Cite journal
- ↑ [2] Astronomy Diagnostic Test (ADT) Version 2.0, visited Feb. 14, 2011
- ↑ Template:Cite journal
- ↑ Template:Cite journal
- ↑ D'Avanzo C, Anderson CW, Griffith A, Merrill J. 2010. Thinking like a biologist: Using diagnostic questions to help students reason with biological principles. (17 January 2010; www.biodqc.org/)
- ↑ Template:Cite journal
- ↑ 20.0 20.1 Template:Cite journal
- ↑ 21.0 21.1 21.2 21.3 21.4 Template:Cite journal
- ↑ 22.0 22.1 Nehm, R., & Schonfeld, I. S. (2010). The future of natural selection knowledge measurement: A reply to Anderson et al. (2010). Journal of Research in Science Teaching, 47, 358-362. [3] Template:Webarchive
- ↑ Template:Cite journal
- ↑ Concept Inventory Assessment Instruments for Engineering Science. Visited Feb. 14, 2011. [4]
- ↑ Libarkin, J. C., Ward, E. M. G., Anderson, S. W., Kortemeyer, G., & Raeburn, S. P. (2011). Revisiting the Geoscience Concept Inventory: A call to the community. GSA Today, 21(8), 26-28. [5] Template:Webarchive
- ↑ Caceffo, R.; Wolfman, S.; Booth, K.; Azevedo, R. (2016). Developing a Computer Science Concept Inventory for Introductory Programming. In Proceedings of the 47th ACM Technical Symposium on Computing Science Education (SIGCSE '16). ACM, New York, NY, USA, 364-369. DOI=https://dx.doi.org/10.1145/2839509.2844559 [6]
- ↑ Odom, A. L., & Barrow, L. H. (1995). Development and application of a two-tier diagnostic test measuring college biology students' understanding of diffusion and osmosis after a course of instruction. Journal of Research in Science Teaching, 32, 45-61.
- ↑ Template:Cite journal
- ↑ Template:Cite journal