Computers & Geosciences, Volume 25, Number 3, 1999

Don M. Mackenzie
Division of Earth Sciences
University of Derby, Kedleston Road, Derby, DE22 1GB, England

John C. Butler
Department of Geosciences
University of Houston
Houston, TX 77204

From the Associate Editor

Given the international readership of Computers & Geosciences, it was decided early on to feature information about how the Internet is being used around the world. I met Jose Brilha at a meeting this summer at the University of Derby and was impressed with how he and his colleagues are using the Internet in their country.

The Tripartite Interactive Assessment Delivery System Project


An Overview of the System


The Tripartite Interactive Assessment Delivery System (TRIADS) is an assessment-generation tool kit for users of Authorware Professional (Macromedia). Building on earlier work, the basic system was initiated at the University of Derby in 1992 and developed in operation over the next four years. Since 1996 it has formed the basis of a joint project between the Universities of Liverpool and Derby and the Open University, U.K., to produce an interactive assessment system capable of testing learning outcomes. Development of the system is financed by the Higher Education Funding Council for England, and fourteen other UK universities are currently evaluating the product before wider distribution. A brief description of the development of the system and its essential features is given here.

Introduction and History of Development

A sudden rise in student numbers during 1988, and the increased workload associated with their assessment, led tutors in geology at the University of Derby to consider the application of a software authoring system to deliver tests to first-year undergraduates. Trials started in 1989 using a commercial DOS-based system that facilitated the production of rectangular hot-spot, label-diagram and single-text-entry interactions.

The outcomes were very encouraging: tests could be generated quite quickly and the results were similar to those obtained by traditional methods. By 1992 we had developed the concept of an assessment ‘engine’ which handled sign-on and score calculation, and into which question ‘objects’ could easily be inserted for display in sequence. Assessments developed using this system were used in two first-year undergraduate courses in geology. However, the limitations of the software were beginning to restrict the scope of question design.
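The ‘engine plus question objects’ design can be illustrated with a minimal sketch. This is hypothetical Python, for the idea only: the actual system was built in a commercial authoring environment, and the names `Question` and `run_assessment` are inventions for illustration.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Question:
    """A question 'object' inserted into the engine for display in sequence."""
    prompt: str
    check: Callable[[str], float]  # marks an answer, returning a fraction 0..1
    max_mark: int

def run_assessment(questions: List[Question], answers: List[str]) -> float:
    """The 'engine' part: sign-on omitted; delivers questions in sequence
    and accumulates the weighted score."""
    total = 0.0
    for q, a in zip(questions, answers):
        total += q.check(a) * q.max_mark
    return total
```

The point of the separation is that new question objects can be slotted into the engine without touching the sign-on or scoring code.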

Transition to the Windows environment opened the possibility of applying similar design principles in a more powerful authoring system such as Authorware Professional (Macromedia). Authorware provided a greater range of interaction styles, had full multimedia capabilities, and allowed individual segments of code (the question ‘objects’) to be saved as separate files. The University of Derby Interactive Assessment Delivery System (DIADS) was thus born, and an early version was incorporated into a courseware module written for the UK Earth Science Courseware Consortium (Mackenzie & Wilkins, 1994).

Between 1992 and 1996 the system was used increasingly to generate assessments in a number of disciplines at Derby. Tutors were given a free hand in question design, and an experienced Authorware programmer coded the questions into the system. Initially this was quite time-consuming because of the sometimes over-ambitious nature of individual question designs. However, it became apparent that many of these designs could be resolved into a sequence or combination of basic interactions for which code ‘templates’ could be developed.

The TRIADS Project and System

During this time a number of other UK Earth science departments were using Authorware Professional to develop interactive tutorials for the UK Earth Science Courseware Consortium. Some of these tutorials had an element of assessment built into them, and some departments built interactive assessments around them (Boyle and others, 1997). In 1996 the Earth science departments at the Universities of Liverpool and Derby and the Open University successfully bid for government funding to develop, from the DIADS system, an interactive assessment system capable of testing learning outcomes. The delivery system was renamed TRIADS, continuing the crystallographic theme of the initial acronym and recognizing the contribution that all three institutions were to make.

After eighteen months of operation, the TRIADS project (/triads/index.html) has resulted in much more widespread use of the system, in a network of 14 evaluation sites in UK universities. About half the evaluation sites are in departments of Earth sciences; the remainder are in a wide variety of other disciplines. The broader use of the system has resulted in the development of a number of question styles that had not previously been considered but that could be adapted for use across all disciplines.

A selection of code templates has been developed for each question style. These templates are extremely flexible and provide full error trapping, a variety of feedback types and sophisticated scoring mechanisms for fine-tuning of results. Tutors are now advised to use the templates to produce questions unless their specialist design is of particular academic importance or has the potential to be developed into a new template for wider application.

A system of devolved question design, with tutors using web-sourced ‘Question Definition Proformas’, is now recommended as the most efficient means of building a new assessment. Tutors do not need to know how to program the system; a faculty or school technician can be employed to paste the questions into the engine, adjust the graphics and compile the assessments for a range of disciplines.

The expansion of the range of question styles to over twenty has allowed tutors to design sequences of questions which begin to test understanding as well as basic knowledge, and to reduce the ‘guess factor’ that is inherent in more traditional multiple-choice tests.

A detailed description of the system and its operation will be published in due course, but a list of the facilities that it offers is given in Tables 1 and 2. The system provides a greater variety of question styles and more flexibility in operation than most commercially available assessment programs. Further details and a live demonstration of some of the question styles may be obtained from the project web site.

A single copy of Authorware Professional is required to generate an assessment in either a Macintosh or a PC environment. When packaged, the resulting assessments may be delivered without additional software over a Local Area Network, and over Intranets and the Internet by means of the ‘Shockwave’ plugin from Macromedia.


TRIADS evaluation sites are contracted to provide full feedback to the project team, from both academics and students, concerning the operation of the system. So that student survey results are received in a standard format for processing, assessment evaluation forms for student use are built into the ‘engine’ and can be activated as required by the tutor. Student feedback data are saved automatically to file. When completed, this is likely to be the largest survey yet undertaken of student perception of this mode of assessment.


The TRIADS project is resulting in extensive collaboration in the field of computer delivered assessment across a significant number of UK universities in a range of disciplines. It is intended that this will result in a fully functional and thoroughly tested assessment system for distribution during 1999.

The system will be highly interactive and capable of delivering over twenty basic question styles in a variety of configurations with full multimedia support. The ability to deliver such tests in the web environment opens exciting possibilities for increased flexibility of assessment in the future.


References

Mackenzie, D. M. and Wilkins, H. (1994) Preparing for Fieldwork 1 - Using a Compass-Clinometer. Interactive Courseware Module 2. UK Earth Science Courseware Consortium, University of Manchester.

Boyle, A. P., Bryon, D. N. and Paul, C. R. C. (1997) Computer-based learning and assessment: a palaeontological case study with outcomes and implications. Computers & Geosciences 23(5), 573-580.


Table 1. TRIADS - Summary of Features of Assessment Engine

Configuration: Password-protected menu allows the tutor to configure a test at run-time. The default configuration is set in the Authorware code.

Sign-on: Asks the user for full name, date of birth and academic group. Includes dyslexia and colour-blindness checkboxes.

Run modes: Sequential - no return to a question once answered. Paged - user may page between questions and re-answer during the test and/or at the end. Cycling - sequential delivery to the end, with optional return to questions scoring <100%. Formative/summative - switches feedback on/off. Menu for user (optional) - tutor-configurable - access to question groups. Show/hide total score, question scores and scores for groups of questions. Run-time printout (optional).

Evaluations: Module/course evaluation form - tutor-configurable (optional). Assessment evaluation forms - fixed content (optional).

Results filing: Up to 3 files, group or individual, fully configurable for content by tutors.

Scoring: Final score based on a fixed maximum mark or on the maximum for the questions attempted.

Editing mode: Skips sign-on and menu to allow rapid creation and testing of questions within the Authorware editor. Access via an on-screen button.

Tutorial areas: Code areas are provided for the insertion of previously created Authorware code, so that courseware materials may be used as introductions or as feedback to questions.

Question sequencer: Area of the engine into which question coding is placed. Individual questions or whole assessments may be saved as Authorware ‘model’ files for archiving and later use.

General: The whole system is open-coded within Authorware Professional. Fixed components of the engine are enclosed within red map icons that should not be opened or edited by the user. A copy of Authorware Professional is required to compile an assessment, but the resulting packaged files may be run on any suitable machine, over a Local Area Network, and over Intranets and the Internet by means of the ‘Shockwave’ plugin from Macromedia, without incurring any run-time licence fees. The system works on PC (Windows 3.11, 95 and NT) and Mac/PowerMac.
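The two final-score conventions listed under ‘Scoring’ in Table 1 can be sketched as follows. This is an illustrative Python fragment, not TRIADS code; the function name and signature are assumptions made for the example.

```python
from typing import List

def final_percentage(scores: List[float], max_marks: List[float],
                     attempted: List[bool],
                     use_attempted_max: bool = False) -> float:
    """Express the final score as a percentage of either the fixed test
    maximum or the maximum for the questions actually attempted."""
    earned = sum(scores)
    if use_attempted_max:
        # Denominator counts only the questions the student attempted.
        denominator = sum(m for m, a in zip(max_marks, attempted) if a)
    else:
        denominator = sum(max_marks)
    return 100.0 * earned / denominator if denominator else 0.0
```

A student who skips a question is thus penalised under the fixed-maximum rule but not under the attempted-maximum rule.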



Table 2. TRIADS - Summary of Question Templates & Styles

All templates are fully configurable for question/answer content, scoring, feedback, graphic design and mode of operation. Scores for individual questions can be weighted and/or carried over if desired. Video and sound support may be included. Feedback may range up to the level of full courseware units or Internet links.

Multiple-choice/response types

Text +/- graphics - up to 10 selections - optionally scored ‘Don’t know’.

True/False/Yes/No - up to five statements for which True/False/Don’t know answers are required.

Matrix +/- graphics - multiple response with up to 25 selections.

Rectangular hotspot - multiple choice/response selection of hidden rectangles on graphic.

Polygonal hotspot - multiple choice/response selection of hidden irregular areas on graphic.

Hot object - multiple choice/response selection of visible irregular shaped graphics.

Assertion-reason - user is asked to assess the quality of evidence for a statement.

Text Analysis - user selects line(s) of text in response to question (under development).

Move object types

Label diagram - up to 18 labels in group +/- dummy labels, dummy positions, and penalty scores.

Randomised label diagram - labels presented to user, one at a time, in random sequence.

Randomised graphic + label diagram - identify graphic and position its label on diagram.

Sequencing - build sequence text/graphic - 4 sequence scoring algorithms.

Classification - place text/graphics into classes.

Sequenced classification - place text/graphics into classes with contents in correct sequence.

Slider(s) on scale - move slider(s) to position on continuous X/Y scale.

Build diagram - move graphic objects to position - useful for simulations.

Text/Numeric entry

Single/multiple entry - up to 10 entries per screen/template, +/- graphics.

All templates of this type will cope with text entries, numeric entries, mixed text and numeric entries, and numeric entries embedded within text. The tutor may specify an error range for testing numeric entries. The user’s entry in the first entry position of a question may be used to branch the testing of subsequent entries.

Graph Plotting (under development)

A series of modular templates with tutor-configurable graph paper which can be ‘bolted together’ to test graph plotting and interpretation skills.

Draw object

Arrow/line - tests slope, intercept on Y, start & end positions, direction of draw.

Box/circle - outline area - check boundaries

Combination Types

Matrix-sequential - select from up to 25 statements - place selected statements in sequence

Matrix- classification - select from up to 25 statements - place selected statements in class.

Multiple text entry + labelling - type entry and match with label.

Multiple text entry + sliders - type entry and estimate value (under development)

(more in development)

Simulations and empty templates (shells)

Empty templates (shells) are provided so that assessment creators may program complex interactions, up to the level of full simulations. Code of any complexity will work in a shell and will be scored by the system, so long as a few simple rules are followed with respect to the nomenclature of scoring variables.
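The numeric-entry checking described under ‘Text/Numeric entry’ in Table 2, where an answer is accepted if it falls within a tutor-specified error range, can be sketched as follows. This is illustrative Python, not Authorware code, and the function name is hypothetical.

```python
def numeric_entry_correct(entry: str, target: float, tolerance: float) -> bool:
    """Accept a typed numeric entry if it parses as a number and falls
    within the tutor-specified error range (target plus or minus tolerance)."""
    try:
        value = float(entry)
    except ValueError:
        # Non-numeric input is simply marked wrong rather than crashing,
        # mirroring the full error trapping the templates provide.
        return False
    return abs(value - target) <= tolerance
```

With a target of 3.1416 and a tolerance of 0.01, an entry of "3.14" would be accepted while "3.2" or a non-numeric entry would not.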