
Programme Updates
Friday, 11:35: Session 20 (Mathew Toll & Shi Chunxu) is back on, in B48, replacing Sha Xie.

Friday: Win free books! Find out what happens next for publishing and where LCT4 is happening!

Friday, July 5 • 10:45am - 11:25am
Semantic analysis of introductory physics assessments: Towards cumulative learning


Physics is based on the application of highly abstract core concepts in different contexts ranging from commonplace to highly specialised. Success in building the hierarchical knowledge structure of physics requires cumulative learning, defined as learning that facilitates the ability to ‘transfer knowledge across contexts and build knowledge over time’ (Maton 2009).
In South Africa, where student cohorts have highly diverse educational backgrounds and where improving the success rates of under-represented groups is a priority (Conana, Marshall, and Case 2016), cultivating cumulative learning among first-year students is particularly critical and challenging. In first-year physics modules, cumulative learning means that students must be able to formulate the core concepts in the curriculum and apply them to analyse scenarios and solve problems in a variety of everyday contexts. Past experience from introductory physics modules in our department suggests that, when faced with an unfamiliar problem, students struggle to identify, formulate and apply the core concepts and instead try to re-apply patterns from previously worked examples. This is the problem we set out to address.

The Semantic dimension of Legitimation Code Theory offers a framework for distinguishing between levels of abstraction (semantic gravity) and complexity (semantic density) (Maton 2014). Shay (2008) proposed LCT as a useful theoretical framework for conceptualising the relation between knowledge and assessment criteria. Our study is further motivated by the argument that cumulative learning is promoted by exposing students to, and in particular assessing them across, an appropriate range of semantic gravity levels (Maton 2013; Kilpert and Shay 2013). To our knowledge, our study represents the first application of the Semantic dimension to cumulative assessments in physics.

In this paper we report how we used the Semantic dimension to critique the quality of the assessments of two sequential mainstream physics modules, and how the results were used to inform assessment and education practice.

The study concerns two calculus-based introductory physics modules offered in the first and second semesters, respectively, to students in physics, chemistry, mathematics and other related programmes. We started the study by analysing the semantic gravity profiles of past test and exam question papers from the 2012–2016 academic years. The results showed weaknesses in the papers and highlighted the difficulty students have at both the weaker and stronger ends of the semantic gravity range.
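
To make the analysis concrete, the short Python sketch below shows one way a semantic gravity profile for a single question paper could be tallied once its questions have been hand-coded; the four-level coding scheme (SG-- to SG++) and the example questions and mark allocations are hypothetical illustrations, not the instrument or data used in the study.

from collections import Counter

# Each question is hand-coded to a semantic gravity level and carries marks.
# Both the coding scheme and the data below are invented for illustration.
questions = [
    {"q": "1a", "sg": "SG++", "marks": 4},   # strongly context-bound, everyday scenario
    {"q": "1b", "sg": "SG+",  "marks": 6},
    {"q": "2",  "sg": "SG-",  "marks": 8},   # more abstract formulation
    {"q": "3",  "sg": "SG--", "marks": 7},   # highly abstract, decontextualised
]

# Tally the marks allocated to each semantic gravity level.
profile = Counter()
for item in questions:
    profile[item["sg"]] += item["marks"]

# Express the profile as the share of total marks at each level.
total = sum(profile.values())
for level in ("SG++", "SG+", "SG-", "SG--"):
    share = 100 * profile.get(level, 0) / total
    print(f"{level}: {share:.0f}% of marks")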

The results informed an intervention during the 2017 academic year. During interactive moderation sessions, the two lecturers and the internal moderator carried out a semantic gravity analysis of the test and exam question papers before finalising them. The process was empowering: it challenged the lecturers to rethink the dependence of knowledge on context and provided, for the first time, a practical language for categorising questions. It also guided both lecturers and students to focus on the core concepts in the teaching and learning of the module content. The impact of the intervention was determined by a detailed analysis of students’ grades before and during the intervention. The results show that the intervention improved students’ understanding of the core concepts and their ability to apply them in everyday scenarios, suggesting improved cumulative learning.
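
As a rough illustration of the kind of before/during comparison this involves, the sketch below compares cohort summary statistics on conceptual questions; the grade lists are invented placeholders and the study’s actual analysis is considerably more detailed.

from statistics import mean, stdev

# Hypothetical percentage grades on conceptual questions for two cohorts.
before = [48, 55, 42, 60, 51, 47, 58, 44]   # pre-intervention cohort (invented)
during = [56, 62, 50, 66, 59, 54, 64, 52]   # 2017 intervention cohort (invented)

print(f"before: mean = {mean(before):.1f}%, sd = {stdev(before):.1f}")
print(f"during: mean = {mean(during):.1f}%, sd = {stdev(during):.1f}")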

The next step in the study is the development of a translation device for analysing semantic density in physics assessment questions. Whereas semantic gravity has been an empowering tool for rethinking the focus and context dependence of our assessments, semantic density enables us to determine how complexity influences students’ responses to questions. Students’ diverse educational backgrounds may cause large variation in their ability to deal with complexity. We also need to investigate whether the intervention had an unintended effect on the level of complexity in the question papers. The initial results of these analyses will be discussed. The results of this study can inform educators from various disciplines on how Semantics can contribute to the design of assessments aimed at cumulative knowledge-building.


Friday July 5, 2019 10:45am - 11:25am
Room B47
