
Quantitative Measurement of Pre-Service Teachers’ Competency of Questioning in Scaffolding Students’ Science Learning

Published in Research in Science Education.

Abstract

Questioning is a critical strategy for science teachers to scaffold students’ exploration and knowledge construction in inquiry-oriented science teaching. In science teacher preparation, open-ended teacher questions are advocated as an advantageous strategy to prompt student thinking. However, insufficient attention has been paid to how the science content knowledge embedded in teacher questions contributes to students’ conceptual understanding. Pre-service teachers (PSTs) may form a mindset of hands-off inquiry teaching in which students could achieve a learning objective by articulating their thoughts without guidance from teachers. In addition, existing methods for assessing questioning are mainly qualitative, relying on discourse analysis of limited scenarios, which may yield biased inferences about a teacher’s competency in questioning. Moreover, qualitative methods are unwieldy for large-scale analyses because of the complexity of synthesizing discourse information. In this study, we designed a written instrument for the quantitative assessment of PSTs’ pedagogical content knowledge of questioning. We introduce the free-response and multiple-choice versions of this instrument and report its administration to 108 PSTs. The findings support the validity and reliability of the instrument. As suggested by the results, the participating PSTs were aware of the importance of questioning in inquiry teaching. However, the PSTs’ difficulties with science content knowledge and knowledge of students’ understanding might impede them from determining effective guiding questions to scaffold student learning. Finally, we discuss the potential of this instrument in preparing PSTs’ questioning skills.



Funding

The research reported here was supported by the National Science Foundation through Grant #1838339 to Texas Tech University. The opinions, findings, and conclusions or recommendations expressed are our own and do not necessarily reflect the views of the National Science Foundation.

Author information


Corresponding author

Correspondence to Jianlan Wang.

Ethics declarations

Conflicts of interest

There are no conflicts of interest to report.

Ethics approval

This study was approved by the human research protection program of Texas Tech University, IRB No. 2019–579.

Informed consent

Informed consent was obtained from all individual participants included in the study.


Appendix. Exemplary responses and coding


1) Sample question about magnets

Context: You teach a new unit about magnets in the first grade. You provide students with several items, including paper clips, nails, aluminum cans, pencils, crayons, and books. They need to make predictions about whether certain items can be attracted by a magnet. You approach a group of students and have a conversation with them as shown below:

You: Which items do you think can be attracted by the magnet?

Students: Paper clips, nails, and cans.

You: Why?

Students: Because they are metal, magnets attract metal.

You: Do you think magnets attract all metals?

Students: Yes.

  • Q1. What can you conclude from the information provided about the students’ scientific content knowledge, both the strengths (i.e., what they know) and difficulties (i.e., what they do not know)? Put “n/a” in the box if you think there are no strengths or difficulties in students’ understanding.

  • Q2. How would you respond to the students? Please use direct quote of what you would say. What is(are) your purpose(s) behind that response?

Sentence stems were provided so that the PSTs could answer the two questions by filling in the blanks.

  • 1-a. Based on the students’ answers, they know that ().

  • 1-b. Based on the students’ answers, they do not know that ().

  • 2-a. I would respond to students by saying ().

  • 2-b Because ().

2) Exemplary responses from the PSTs (in parentheses), our codes, and justifications

Response 1—Example of effective guiding questions

  • 1-a. Based on the students’ answers, they know that (Students know that magnets attract metal.)

  • 1-b. Based on the students’ answers, they do not know that (Students do not know that magnets do not attract all metals.)

  • 2-a. I would respond to students by saying (“Okay, so the cans are made of metal. Do they attract to the magnet?”)

  • 2-b. Because (not every metal will attract magnets.)

Codes: O = 2, C = 2, S = 2, I = 2, PCK-Q = 8

Justification: In this response, the pre-service teacher (PST) accurately identified the strength and difficulty in students’ understanding (S = 2). The statements “magnets attract metal” and “not every metal will attract magnets”/“magnets do not attract all metals” suggest that the PST possessed the correct science content knowledge associated with this question (C = 2). The PST asked a question that referred to a specific phenomenon, i.e., whether aluminum cans can be attracted by a magnet; thus, it is a specific guiding question (O = 2). To answer this question, students need to test whether cans can be attracted by a magnet. They would then find that cans, although metal, cannot be attracted by a magnet, which challenges their conclusion that a magnet attracts all metals. Thus, this guiding question is likely to be effective (I = 2). Together, the PCK-Q score for this question is 2 + 2 + 2 + 2 = 8, which indicates that this PST is likely to scaffold students’ learning with effective guiding questions while teaching the topic of magnetic items.

Response 2 – Example of referring to educational terms as panacea responses

  • 1-a. Based on the students’ answers, they know that (Students know that metals are attracted by magnets.)

  • 1-b. Based on the students’ answers, they do not know that (Students do not know that not all metals are magnetic.)

  • 2-a. I would respond to students by saying (“That is a great inference, or prediction. In science, we call this a hypothesis. A hypothesis is an idea that can be tested to prove it or disprove it. How about we test your hypothesis?”)

  • 2-b. Because (I want students to learn from firsthand experiences as much as possible. Guide them through the process of self-discovery.)

Codes: O = 1, C = 2, S = 2, I = 0, PCK-Q = 5

Justification: In this response, the PST accurately identified the strength and difficulty in students’ understanding (S = 2). The statement “metals are attracted by magnets” in 1-a is not very accurate. However, with the statement in 1-b that “Students do not know that not all metals are magnetic”, it is safe to infer that the PST meant “items that can be attracted by a magnet are metal”. This PST possessed the correct science content knowledge associated with this question (C = 2). The PST used a question in 2-a. However, this question does not refer to any specific scientific phenomena or concepts as guidance. This question or the response in 2-a can be applied to any cases when students propose an idea. It cannot represent the PST’s intentional effort of guidance in this specific context (O = 1). The PST told students to test their hypothesis but did not provide clear guidance on how. According to the question stem, the students have tested the provided items and drawn an incorrect conclusion. Without clear guidance by referring to non-magnetic metals (e.g., aluminum cans), students may simply repeat the lab and draw the same conclusion. This is the situation where explicit intervention is probably needed from a teacher to help students realize their misconception. However, the guidance is missing from this PST’s response (I = 0). For the purpose of PST education, assigning I = 0 to this response is to convey to this PST that there is no universal response to students (i.e., test your hypothesis) by referring to general pedagogical knowledge. Instead, the response should be customized into pedagogical content knowledge for specific contexts or topics (e.g., test your hypothesis with aluminum cans or copper coins). Together, the PCK-Q score for this question is 1 + 2 + 2 + 0 = 5, which indicates that this PST is less likely to scaffold students’ learning with effective guiding questions while teaching the topic of magnetic items. 
This PST had sufficient science content knowledge to determine students’ strengths and difficulties but might not be skilled enough to determine a content-specific guiding question to bridge the gap in students’ understanding.

Response 3 – Example of diverting questions

  • 1-a. Based on the students’ answers, they know that (Students know that magnets attract metals).

  • 1-b. Based on the students’ answers, they do not know that (Students do not know how metals and non-metals work).

  • 2-a. I would respond to students by saying (“Class, what is the difference between metals and non-metals?”)

  • 2-b. Because (Aluminum is not a metal. Therefore, it is not magnetic.)

Codes: O = 2, C = 0, S = 1, I = 0, PCK-Q = 3

Justification: As in Response 2, the statement “magnets attract metals” in 1-a is inaccurate but could be interpreted as “items that can be attracted by a magnet are metal”. However, there is no statement such as “not all metals are magnetic” to support that interpretation. Moreover, the statement “Aluminum is not a metal. Therefore, it is not magnetic” indicates that this PST reasoned: “All metals are magnetic. Aluminum is not a metal, so it is not magnetic.” The analysis of students’ strength could thus be taken as incorrect as well. The analysis of students’ difficulty in 1-b is vague: it is unclear what “work” means, and there is no evidence in the question stem about whether students knew “how metals and non-metals work”. Thus, the S score could be 1 (taking 1-a as an accurate analysis of students’ strength) or 0 (taking 1-a as an inaccurate one)*. The statement in 2-b, “Aluminum is not a metal. Therefore, it is not magnetic.”, is clear evidence of the PST’s misconception, because aluminum is a metal (C = 0). This PST guided students with a question that referred to a specific topic, i.e., the difference between metals and non-metals (O = 2). However, this topic is not directly related to students’ difficulty in this context. Articulating the difference between metals and non-metals is unlikely to help students realize that not all metals can be attracted by a magnet (I = 0). Together, the PCK-Q score for this question is 2 + 0 + 1 + 0 = 3, which indicates that this PST is unlikely to scaffold students’ learning with effective guiding questions while teaching the topic of magnetic items. This PST might struggle with the science content knowledge associated with this question, which impeded her analysis of students’ strengths and difficulties.
She was probably aware of the importance of using guiding questions, but she might not be skilled enough to determine an effective guiding question to bridge the gap in students’ understanding.

Note*: This uncertainty is a limitation of our instrument. As with other written instruments, post-hoc interviews with respondents could clarify some of the uncertainty but would also increase the workload of assessment.
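The rubric arithmetic illustrated by the three responses above (PCK-Q as the sum of the O, C, S, and I codes, each scored 0 to 2) can be sketched in a few lines of Python. The function name and data layout here are illustrative assumptions, not part of the instrument itself:

```python
# Sketch of PCK-Q aggregation: each coded dimension (O, C, S, I) ranges 0-2,
# and the PCK-Q score is their sum (0-8). Names are illustrative only.
def pck_q_score(codes):
    """Sum the four rubric dimensions for one coded response."""
    return sum(codes[dim] for dim in ("O", "C", "S", "I"))

# The three exemplary responses coded in the appendix.
responses = {
    "Response 1": {"O": 2, "C": 2, "S": 2, "I": 2},
    "Response 2": {"O": 1, "C": 2, "S": 2, "I": 0},
    "Response 3": {"O": 2, "C": 0, "S": 1, "I": 0},
}

for name, codes in responses.items():
    print(name, pck_q_score(codes))
```

Running the sketch on the three exemplary responses reproduces the totals reported above (8, 5, and 3).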


Cite this article

Wang, J., Wang, Y., Kashef, S.S. et al. Quantitative Measurement of Pre-Service Teachers’ Competency of Questioning in Scaffolding Students’ Science Learning. Res Sci Educ (2024). https://doi.org/10.1007/s11165-024-10168-3
