Wednesday, July 3 • 2:15pm - 2:55pm
Influence of feedback on assessment when using semantic waves


According to feedback from industry, at both local and international scales, about skill deficiencies in engineering graduates, and findings from cognitive science and thousands of educational research studies, there are serious deficiencies in traditional teaching methods. These have provoked calls for changes in how engineering curricula are structured, delivered, and assessed (Felder, 2012). The pedagogic community has critiqued how engineering courses are taught and assessed, showing a crucial misalignment between assessment approaches and assessment purposes (Biggs, 2003). A central principle in education, according to Biggs (2003), is constructive alignment, in which explicit learning objectives are aligned with forms of learning, methods of teaching, and assessment. Many engineering graduates are not truly career ready, despite intensified efforts to teach and assess the bodies of knowledge and skill sets demanded by industry and mandated by accreditation agencies. Empirical evidence of this gap is spotty and tentative but appears to be growing.

Wiggins (1989), who first coined the term ‘authentic assessment’, defined ‘tasks set in a real-world context’ as tasks which ‘are either replicas of or analogous to the kinds of problems faced by adult citizens and consumers or professionals in the field.’ By using authentic real-world environments and exemplars in assessment, educators can make engineering subjects more ‘visible’ (Winberg & Winberg, 2017), particularly if industry’s needs and recommendations are included in the design of engineering assessment practices. Feedback within assessment is most meaningful when it forms a linked series of learning opportunities across the entire learning cycle. This research paper focuses on how to enhance students’ performance by providing authentic and meaningful feedback during the assessment process in engineering education. Building on this line of thinking (Boud & Molloy, 2013; Carless, 2015; Carless & Boud, 2018), feedback is conceptualised as a process in which learners make sense of comments about the quality of their work in order to inform the development of future performance or learning strategies.
Prasetyo (2017) conducted a study comparing authentic assessment with traditional assessment. A study by Ghosh (2017) highlighted that authentic assessment collects evidence of students’ competence to perform workplace tasks. Prasetyo listed six elements that differentiate a traditional test from an authentic assessment, as shown in Table 1.
Table 1: Comparison between traditional assessment and authentic assessment (Prasetyo, 2017).
No | Aspect of the study | Traditional assessment (assessment of learning) | Authentic assessment (assessment for learning and as learning)
1 | Students’ preparation | Announced in advance to ensure validity | In most cases not announced beforehand
2 | Focus | Students’ performance | Students’ progress and engagement; valid and reliable performance
3 | Results | Score/grade | Marking according to a rubric; extended feedback (Mark 1, 2) and monitoring of progress
4 | Frequency | Regularly planned formative and summative assessments | Continuous throughout the year; demonstration (report, critique and feedback)
5 | Format | Oral presentations, multiple-choice tests, written exams based on the covered material | An integrated project, portfolio, logbook, case studies (problem-based tests), wicked problems, practical skills tests (formative feedback)
6 | Context | Frequently contextualised | Contextually simplified engineering problems and design solutions; disciplinary content in more applied ways



Following Shay (2008), Wolff and Hoffman (2014) give examples in which mechatronics students were expected to demonstrate that their solution to a given project problem is feasible and economically viable, the so-called ‘value of a product’ (the solution), as opposed to the value of a ‘process’ (how the problem was solved). These are two different functions served by assessment in engineering education.
Shay (2008) argued that different philosophical paradigms have profound implications for what we think we are assessing (a matter of ontology) and how we go about assessing it (a matter of epistemology). Are we assessing knowledge and, if so, what knowledge? Are we assessing knowers? Are we assessing knowing?
Following Shay (2008), Ingold (2018) argued for, firstly, a language of description for disciplinary knowledge and, secondly, a better understanding of the relationship between these disciplinary forms of knowledge and assessment. Suggestions include that instructors should clearly explain the purpose and expectations of the activity, acknowledge the challenges of the new approach, ramp up slowly, provide students with feedback and support throughout the process, and align activities with other courses. In Legitimation Code Theory (LCT), Semantics looks at how much meaning is packed into a word or into sustainable feedback.
LCT (Maton, 2012) is a sociological toolkit for the study of practice. LCT Semantics has been used to demonstrate how real learning in engineering involves a ‘cumulative’ process that sees the relationship between theoretical knowledge and practical knowledge applied beyond the classroom (Wolff, 2018:736). Maton’s legitimation codes provide a potentially useful theory for conceptualising the evaluative criteria which assessors bring to bear on student performance. Applying LCT as a theoretical framework promises to bring an enriched understanding of the role of assessment. Blackie (2014) applied a Semantics approach with chemical engineering students and found that assessment has a significant influence on the way in which students learn. A study by Hassan (2017) explored LCT Semantics concepts to investigate how feedback from tutors moved upward and downward along the semantic scale to create semantic waves as part of cumulative knowledge-building in practical exercises.
The Specialization dimension explains knowledge claims and knowledge practices as always involving both knowledge and knowers, and thus relations to knowledge and relations to knowers and their practices (Winberg, 2018).
As a lecturer, my challenge is to develop my assessment so that it properly distinguishes between gains in complexity and gains in abstraction. Although feedback is recognised as an essential part of assessment, it has been suggested that students disregard feedback if they perceive no potential gain from responding to it (Guzzomi et al., 2015). This situation is common in traditional assessment tasks, where feedback is often supplied too late to be integrated into the engineering tasks to improve performance. The conception of feedback as dialogue implies its integration throughout the learning process, not just as part of the formal assessment. Wolff (2018:1) argued that a synergistic ‘feedback approach’ is necessary, not just with respect to student feedback during a course, but also from the eventual site of practice: industry’s expectations of graduates.
The paper presents a pilot analysis, using LCT Semantics, of a case study in Mathematics for Engineering which involved the shift from manual to electronic assessment. For every chapter during the semester, students complete tutorials as part of a formative assessment by submitting them online. This intervention identifies when a student has a learning gap and how the student can improve the learning process and performance in order to be ready for the summative assessment. The feedback process on tutorials can increase students’ cumulative learning opportunities. The paper proposes a way to improve the feedback element using a Semantics approach, drawing on data from 2018 and 2019 from the formative assessments in Engineering Mathematics 1 for first-year students. This paper uses Semantics to explore how ed


Wednesday July 3, 2019 2:15pm - 2:55pm
Room B45
