Learning Designs - Products of the AUTC project on ICT-based learning designs
  Self-Assessment in Engineering
Snapshot
 

"Use of computer based numerical input tests for student self-assessment in Engineering Dynamics"

 
Design Team
   
  Team: Nathan Scott and Brian Stone

 

  Focus: Concept/Procedure Development
  Discipline: Engineering
  Target: Undergraduate (early)
  ICT used: Web plus special software (Flying Fish server)
  Scope: Entire subject

Designer's Summary

 

This learning design, implemented in an undergraduate Engineering Dynamics subject, focuses on self-assessment and on correcting misconceptions about core topics in dynamics, using a computer program that presents graphical problems for students to solve.

The tutorial system software supports problems with numerical, graphical or mathematical-expression responses. The software provides feedback indicating whether an answer is correct or incorrect, and this feedback is tailored to address the most common student misconceptions. In the specific learning design reported here, only the numerical response problem type was used.
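
To make this concrete, the sketch below (in Python, purely illustrative and not the FlyingFish implementation; the tolerance, misconception rules and all names are assumptions) shows how a numerical response might be checked both against the correct value and against values produced by known misconceptions, so the feedback can speak to the likely error rather than simply reporting "wrong".

```python
# Illustrative sketch only: NOT the FlyingFish implementation.
# The tolerance, the misconception rules and all names are assumptions.

def check_numerical_answer(submitted, correct, misconception_answers,
                           rel_tol=0.01):
    """Return (is_correct, feedback) for a numerical-response problem.

    misconception_answers maps the value a student would obtain under a
    known misconception to the feedback message for that misconception.
    """
    def close(a, b):
        return abs(a - b) <= rel_tol * max(abs(a), abs(b), 1e-12)

    if close(submitted, correct):
        return True, "Correct."

    # Compare the submission against answers produced by common
    # misconceptions, so the feedback addresses the likely error.
    for wrong_value, message in misconception_answers.items():
        if close(submitted, wrong_value):
            return False, message

    return False, "Incorrect. Check your working and try again."


# Example: a student forgets to convert km/h to m/s.
ok, feedback = check_numerical_answer(
    submitted=72.0,
    correct=20.0,  # 72 km/h expressed in m/s
    misconception_answers={
        72.0: "It looks like you have not converted km/h to m/s.",
    },
)
print(ok, feedback)
```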

The learning design is implemented in face-to-face tutorial sessions; its purpose is to help students consolidate concepts introduced in lectures.

The learning design is repeated weekly. Students are asked to work through a number of problems each week; each weekly problem set contains some "Practice" problems followed by some "Assessed" ones. Answers to the Assessed problems cannot be entered until the required Practice problems have been completed. A weekly deadline for the Assessed problems is strictly enforced, and late problems are automatically given a mark of zero.
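
A minimal sketch of how this gating and deadline rule might be enforced is given below; the record layout and names are illustrative assumptions, not details of the actual tutorial system.

```python
# Illustrative sketch only; the record layout and names are assumptions,
# not taken from the actual tutorial system.
from datetime import datetime

def can_submit_assessed(student_record, week, deadline, now=None):
    """Accept an Assessed answer only if this week's required Practice
    problems are complete and the strict weekly deadline has not passed."""
    now = now or datetime.now()
    practice_done = all(student_record["practice"][week].values())
    if not practice_done:
        return False, "Complete this week's Practice problems first."
    if now > deadline:
        student_record["assessed_mark"][week] = 0  # late work scores zero
        return False, "The deadline has passed; a mark of zero is recorded."
    return True, "Assessed answer accepted for marking."
```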

Where possible, the graphical problems are drawn from real physical situations (e.g., the velocity of an aircraft as measured by radar). Students are encouraged to work out a solution to each problem in small groups. However, each student is given slightly different numerical parameters for each problem, so independent work is required at some stage.
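
The per-student variation of parameters could be achieved along the lines of the sketch below; the seeding scheme, parameter ranges and function name are assumptions made for illustration only.

```python
# Illustrative sketch only: one way to give each student slightly
# different numbers for the same problem. The seeding scheme, ranges
# and names are assumptions, not details of the actual system.
import random

def aircraft_problem_parameters(student_id, problem_id="radar_velocity"):
    """Deterministically vary the numbers per student, so groups can
    discuss the method but each student must compute their own answer."""
    rng = random.Random(f"{student_id}:{problem_id}")
    speed_kmh = rng.randrange(540, 900, 20)    # aircraft speed in km/h
    bearing_deg = rng.randrange(0, 360, 15)    # radar bearing in degrees
    return {"speed_kmh": speed_kmh, "bearing_deg": bearing_deg}

print(aircraft_problem_parameters(student_id="12345678"))
```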

Any number of attempts may be made at Practice problems. However, on Assessed problems, each incorrect attempt causes a loss of marks.
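
The marking rule for Assessed problems might look something like the following sketch; the size of the per-attempt penalty is an assumed figure, not the scheme actually used in the subject.

```python
# Illustrative sketch only: the penalty per incorrect attempt is an
# assumed figure, not the scheme actually used in the subject.
def assessed_mark(max_mark, incorrect_attempts, penalty_fraction=0.25):
    """Each incorrect attempt at an Assessed problem forfeits a fraction
    of the available marks; Practice attempts carry no penalty."""
    mark = max_mark * (1.0 - penalty_fraction * incorrect_attempts)
    return max(mark, 0.0)

print(assessed_mark(max_mark=2.0, incorrect_attempts=1))  # 1.5
print(assessed_mark(max_mark=2.0, incorrect_attempts=5))  # 0.0
```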

Comment from the Project Team...

The rationale for the use of ICT and the nature of the teacher support provided in this implementation are important factors that contribute to its potential to foster high-quality learning. See the comments provided in the "Designer Debrief" section.

Rationale for Inclusion

 

This exemplar has been selected for inclusion for the following reasons:

  • It illustrates the use of ICT for self-assessment. The learning design focuses on correcting learner misconceptions about fundamental concepts in engineering dynamics by allowing learners to self-assess and practise with questions specifically designed to elicit a known misconception. The software utilised (FlyingFish) is a licensed commercial product.
  • The development of this ICT-based learning design was funded by the Australian Government's Committee for the Advancement of University Teaching (CAUT). Stone, Devenish and Entwistle received four CAUT grants in the period 1993-1997. These grants allowed the group to develop various innovative types of teaching, such as:
    • A comprehensive set of dynamics course notes, with movies of dynamic systems;
    • Some graphical input test types where the student sketches a graph or draws a vector on the computer screen;
    • Unique evolving lab exercises.

    The FlyingFish software, however, was developed independently.

Please Cite As:

  Scott, N. & Stone, B. (2002). Description of Use of computer based numerical input tests for student self-assessment in Engineering Dynamics. Retrieved , from Learning Designs Web site:
     