Development and application of the Elementary School Science Classroom Environment Scale (ESSCES): measuring student perceptions of constructivism within the science classroom

This article describes the development, validation and application of a Rasch-based instrument, the Elementary School Science Classroom Environment Scale (ESSCES), for measuring students’ perceptions of constructivist practices within the elementary science classroom. The instrument, designed to complement the Reformed Teaching Observation Protocol (RTOP), is conceptualised using the RTOP’s three construct domains: Lesson Design and Implementation; Content; and Classroom Culture. Data from 895 elementary students were used to develop the Rasch scale, which was assessed for item fit, invariance and dimensionality. Overall, the data conformed to the assumptions of the Rasch model. In addition, the structural relationships among the retained items supported and validated the instrument as a measure of the reformed science classroom environment construct. The application of the ESSCES in a research study involving fourth-grade students provides evidence that educators and researchers have a reliable instrument for understanding the elementary science classroom environment through the lens of the students.
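The abstract refers to assessing the Rasch scale for item fit. As a minimal illustrative sketch (not the authors’ code, and with entirely hypothetical ability and difficulty values), the snippet below shows the dichotomous Rasch model probability and the standard infit and outfit mean-square statistics computed from model residuals:

```python
import numpy as np

def rasch_prob(theta, delta):
    """Dichotomous Rasch model: probability that a person with ability
    theta endorses/answers an item of difficulty delta (both in logits)."""
    return 1.0 / (1.0 + np.exp(-(theta - delta)))

# Hypothetical person abilities and item difficulties (logits)
theta = np.array([-1.0, 0.0, 1.0, 2.0])   # 4 persons
delta = np.array([-0.5, 0.5, 1.5])        # 3 items

# Expected scores: persons-by-items matrix of model probabilities
P = rasch_prob(theta[:, None], delta[None, :])

# Toy observed responses (deterministic here, purely for illustration)
X = (P > 0.5).astype(float)

# Model variance of each response and squared standardized residuals
W = P * (1.0 - P)
Z2 = (X - P) ** 2 / W

# Outfit: unweighted mean-square per item (sensitive to outliers);
# Infit: information-weighted mean-square per item (sensitive to
# unexpected responses near an item's difficulty)
outfit = Z2.mean(axis=0)
infit = ((X - P) ** 2).sum(axis=0) / W.sum(axis=0)
```

In practice such statistics are computed by dedicated software such as WINSTEPS (cited in the references); values near 1.0 indicate responses consistent with the Rasch model.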
Similar content being viewed by others

Students’ perceptions of their first experiences of secondary-school science in New Zealand
Article Open access 27 September 2022

Validation and application of the Constructivist Learning Environment Survey in English language teacher education classrooms in Iran
Article 10 February 2015

Learning science outside the classroom: development and validation of the out-of-school learning environments perception scale
Article 18 November 2020
References
- Adamson, S. L., Banks, B., Burtch, M., Cox, F., III, Judson, E., Turley, J., et al. (2003). Reformed undergraduate instruction and its subsequent impact on secondary school teaching practice and student achievement. Journal of Research in Science Teaching, 40, 939–957.
- American Association for the Advancement of Science (AAAS). (1993). Science for all Americans. Washington, DC: AAAS.
- Bond, T. G., & Fox, C. M. (2007). Applying the Rasch model: Fundamental measurement in the human sciences (2nd ed.). Mahwah, NJ: Lawrence Erlbaum Associates.
- Boone, W. J., & Scantlebury, K. (2006). The role of Rasch analysis when conducting science education research utilizing multiple-choice tests. Science Education, 90, 253–269.
- Boone, W. J., Townsend, J. S., & Staver, J. (2011). Using Rasch theory to guide the practice of survey development and survey data analysis in science education and to inform science reform efforts: An exemplar utilizing STEBI self-efficacy data. Science Education, 95, 258–280.
- Brooks, J. G., & Brooks, M. G. (1999). In search of understanding: The case for constructivist classrooms. Alexandria, VA: Association for Supervision and Curriculum Development.
- Cavanagh, R. F., & Romanoski, J. T. (2006). Rating scale instruments and measurement. Learning Environments Research, 9, 273–289.
- Cochran-Smith, M., & Lytle, S. L. (2006). Troubling images of teaching in No Child Left Behind. Harvard Educational Review, 76, 668–697.
- DeBoer, G. E. (2002). Student-centered teaching in a standards-based world: Finding a sensible balance. Science and Education, 11, 405–417.
- Dorans, N. J., & Kingston, N. M. (1985). The effects of violations of unidimensionality on the estimation of item and ability parameters and on item response theory equating of the GRE verbal scale. Journal of Educational Measurement, 22, 249–262.
- Falconer, K., Wyckoff, S., Mangala, J., & Sawada, D. (2001, April). Effect of reformed courses in physics and physical science on student conceptual understanding. Paper presented at the annual meeting of the American Educational Research Association, Seattle, WA.
- Fisher, D. L., & Fraser, B. J. (1981). Validity and use of My Classroom Inventory. Science Education, 65(1), 145–156.
- Fisher, D. L., & Fraser, B. J. (1983a). Validity and use of Classroom Environment Scale. Educational Evaluation and Policy Analysis, 5, 261–271.
- Fisher, D. L., & Fraser, B. J. (1983b). A comparison of actual and preferred classroom environments as perceived by science teachers and students. Journal of Research in Science Teaching, 20, 55–61.
- Fraser, B. J. (1990). Individualised Classroom Environment Questionnaire. Melbourne: Australian Council for Educational Research.
- Fraser, B. J. (1998). Classroom environment instruments: Development, validity and applications. Learning Environments Research, 1(1), 7–33.
- Fraser, B. J. (2012). Classroom learning environments: Retrospect, context and prospect. In B. J. Fraser, K. G. Tobin, & C. J. McRobbie (Eds.), Second international handbook of science education (pp. 1191–1239). New York: Springer.
- Fraser, B. J., Anderson, G. J., & Walberg, H. J. (1982). Assessment of learning environments: Manual for Learning Environment Inventory (LEI) and My Class Inventory (MCI) (3rd ed.). Perth: Western Australian Institute of Technology.
- Gable, R. K., Ludlow, L. H., & Wolf, M. B. (1990). The use of classical and Rasch latent trait models to enhance the validity of affective measures. Educational and Psychological Measurement, 50, 869–878.
- Gijbels, D., Van de Watering, G., Dochy, F., & Van den Bossche, P. (2006). New learning environments and constructivism: The students’ perspective. Instructional Science, 34, 213–226.
- Hambleton, R. K. (1993). Comparison of classical test theory and item response theory and their applications to test development. Educational Measurement: Issues and Practice, 12(3), 38–47.
- Hambleton, R. K. (2006). Good practices for identifying differential item functioning. Medical Care, 44(11, Suppl. 3), S182–S188.
- Herrenkohl, L. R., & Guerra, M. R. (1998). Participant structures, scientific discourse, and student engagement in fourth grade. Cognition and Instruction, 16, 431–473.
- Herrenkohl, L. R., Palincsar, A. S., DeWater, L. S., & Kawasaki, K. (1999). Developing scientific communities in classrooms: A sociocognitive approach. The Journal of the Learning Sciences, 8, 451–493.
- Jong, C., Pedulla, J. J., Reagan, E. M., Salomon-Fernandez, Y., & Cochran-Smith, M. (2010). Exploring the link between reformed teaching practices and pupil learning in elementary school mathematics. School Science and Mathematics, 110, 309–326.
- Lance, C. E., & Vandenberg, R. J. (Eds.). (2009). Statistical and methodological myths and urban legends: Doctrine, verity and fable in the organizational and social sciences. New York: Routledge, Taylor and Francis Group.
- Linacre, J. M. (2003). Rasch power analyses: Size vs. significance: Infit and outfit mean-square and standardized chi-square fit statistic. Rasch Measurement Transactions, 17(1), 918. http://www.rasch.org/rmt/rmt171n.htm
- Linacre, J. M. (2009a). WINSTEPS (Computer program 3.68). Chicago: MESA Press.
- Linacre, J. M. (2009b). A user’s guide to Winsteps Rasch-model computer programs (Program manual 3.68.0). Chicago: Winsteps.
- Loyens, S. M., & Gijbels, D. (2008). Understanding the effects of constructivist learning environments: Introducing a multi-directional approach. Instructional Science, 36, 351–357.
- Loyens, S. M., Rikers, R. M. J. P., & Schmidt, H. G. (2008). Relationships between students’ conceptions of constructivist learning and their regulation and processing strategies. Instructional Science, 36, 445–462.
- Ludlow, L. H., Enterline, S. E., & Cochran-Smith, M. (2008). Learning to teach for Social Justice-Beliefs Scale: An application of Rasch measurement principles. Measurement and Evaluation in Counseling and Development, 40, 194–214.
- Martin, A., & Hand, B. (2009). Factors affecting the implementation of argument in the elementary science classroom: A longitudinal case study. Research in Science Education, 39, 17–38.
- Marx, R. W., & Harris, C. J. (2006). No Child Left Behind and science education: Opportunities, challenges and risks. The Elementary School Journal, 106, 467–477.
- Masters, G. N. (1982). A Rasch model for partial credit scoring. Psychometrika, 47, 149–174.
- Moos, R. H., & Trickett, E. J. (1987). Classroom Environment Scale manual (2nd ed.). Palo Alto, CA: Consulting Psychologists Press.
- National Research Council. (1996). National science education standards. Washington, DC: National Academy Press.
- No Child Left Behind Act of 2001, 20 U.S.C. § 6319 (2008).
- Ogasawara, H. (2002). Exploratory second-order analyses for components and factors. Japanese Psychological Research, 44(1), 9–19.
- Olgun, O. S., & Adali, B. (2008). Teaching grade 5 life science with a case study approach. Journal of Elementary Science Education, 20(1), 29–44.
- Piburn, M., & Sawada, D. (2000a). Reformed Teaching Observation Protocol (RTOP) reference manual (Technical Report No. IN00-32). Tempe, AZ: Collaborative for Excellence in the Preparation of Teachers, Arizona State University.
- Piburn, M., & Sawada, D. (2000b). Reformed Teaching Observation Protocol (RTOP) training guide (Technical Report No. IN00-02). Tempe, AZ: Collaborative for Excellence in the Preparation of Teachers, Arizona State University.
- Pringle, R. M., & Martin, S. C. (2005). The potential impacts of upcoming high-stakes testing on the teaching of science in the elementary classroom. Research in Science Education, 35, 347–361.
- Rasch, G. (1960). Probabilistic models for some intelligence and attainment tests. Copenhagen: Danish Institute for Educational Research. (Expanded edition, 1980. Chicago: University of Chicago Press).
- Roth, W. M., & Bowen, G. M. (1995). Knowing and interacting: A study of culture, practices and resources in a grade 8 open-inquiry science classroom guided by a cognitive apprenticeship metaphor. Cognition and Instruction, 13(1), 73–128.
- Sawada, D., Piburn, M., Judson, E., Turley, J., Falconer, K., Benford, R., et al. (2002). Measuring reform practices in science and mathematics classrooms: The Reformed Teaching Observation Protocol (RTOP). School Science and Mathematics, 102, 17–38.
- Smith, E. V., Jr. (2000). Metric development and score reporting in Rasch measurement. Journal of Applied Measurement, 1, 303–326.
- Smith, E. V., Jr. (2002). Detecting and evaluating the impact of multidimensionality using item fit statistics and principal components analysis of residuals. Journal of Applied Measurement, 3, 205–231.
- Smith, C. L., Maclin, D., Houghton, C., & Hennessey, M. G. (2000). Sixth-grade students’ epistemologies of science: The impact of school science experiences on epistemological development. Cognition and Instruction, 18, 349–422.
- Smith, A. B., Rush, R., Fallowfield, L. J., Velikova, G., & Sharpe, M. (2008). Rasch fit statistics and sample size considerations for polytomous data. BMC Medical Research Methodology, 8(33), 1–11.
- Southerland, S., Kittleson, J., Settlage, J., & Lanier, K. (2005). Individual and group meaning-making in an urban third grade classroom: Red fog, cold cans, and seeping vapor. Journal of Research in Science Teaching, 42, 1032–1061.
- Southerland, S. A., Smith, L. K., Sowell, S. P., & Kittleson, J. M. (2007). Resisting unlearning: Understanding science education’s response to the United States’ national accountability movement. Review of Research in Education, 31, 45–78.
- Taylor, P. C., Fraser, B. J., & Fisher, D. L. (1997). Monitoring constructivist classroom learning environments. International Journal of Educational Research, 27, 293–302.
- Taylor, P. C., Fraser, B. J., & White, L. R. (1994, April). The revised CLES: A questionnaire for educators interested in the constructivist reform of school science and mathematics. Paper presented at the annual meeting of the American Educational Research Association, Atlanta, GA.
- Tenenbaum, G., Naidu, S., Jegede, O., & Austin, J. (2001). Constructivist pedagogy in conventional on-campus and distance learning practice: An exploratory investigation. Learning and Instruction, 11, 87–111.
- Thomas, G. P. (2004). Dimensionality and construct validity of an instrument designed to measure the metacognitive orientation of science classroom learning environments. Journal of Applied Measurement, 5, 367–384.
- Thurstone, L. L. (1940). Current issues in factor analysis. Psychological Bulletin, 37, 189–236.
- Tsai, C. (2000). Relationships between student science epistemological beliefs and perceptions of constructivist learning environments. Educational Research, 42, 193–205.
- Upadhyay, B., & DeFranco, C. (2008). Elementary students’ retention of environmental science knowledge: Connected science instruction versus direct instruction. Journal of Elementary Science Education, 20(2), 23–37.
- Wolfe, E. W., & Smith, E. V., Jr. (2007a). Instrument development tools and activities for measure validation using Rasch models: Part I—Instrument development tools. Journal of Applied Measurement, 8, 97–123.
- Wolfe, E. W., & Smith, E. V., Jr. (2007b). Instrument development tools and activities for measure validation using Rasch models: Part II—Validation activities. Journal of Applied Measurement, 8, 204–234.
- Wright, B. D., & Masters, G. N. (1982). Rating scale analysis. Chicago: MESA Press.
- Wright, B. D., & Masters, G. N. (2002). Number of person or item strata. Rasch Measurement Transactions, 16(3), 888. http://www.rasch.org/rmt/rmt163.htm
- Wright, B. D., & Mok, M. (2000). Rasch models overview. Journal of Applied Measurement, 1, 83–106.
- Wright, B. D., & Stone, M. H. (1979). Best test design: Rasch measurement. Chicago: MESA Press.
Author information
Authors and Affiliations
- Department of Elementary and Secondary Education, 75 Pleasant Street, Malden, MA, 02148, USA Shelagh M. Peoples
- Department of Educational Research, Measurement and Evaluation, Boston College, 140 Commonwealth Avenue, Chestnut Hill, MA, 02467, USA Laura M. O’Dwyer, Yang Wang, Jessica J. Brown & Camelia V. Rosca