PARALLELISM IN WORKING-, LEARNING- AND DO-ENVIRONMENTS
The Parallel Instruction Theory for Coaching in Learning Environments for Computer Simulation
by Rik Min
Published in the proceedings of EuroMedia 96: Telematics in a Multimedia Environment, Dec. 19-21, 1996; a publication of the Society for Computer Simulation International (SCS) (Eds. A. Verbraeck & P. Geril). (Long paper and figures available on another WEB page.)
This article reports on a research project in the field of human-computer interaction and learning tools technology. Patterns and experiences found in a large number of different working-, learning- and do-environments have been linked. The concepts 'parallelism' and 'interactive work environments' are discussed, as well as a design theory for such do-environments.
The design theory and ideas around the concept of parallelism are supposed to offer a solution for a number of recurring problems with the organisation of an educationally sound, interactive learning environment. Min's 'parallel instruction' theory, the 'PI-theory', which is primarily intended as a theoretical framework for the organisation of learning environments for simulation, also has its effect on other kinds of interactive working-environments.
The concepts 'parallel' and 'simultaneous' are dealt with. Parallel data flows and simultaneous processes occur everywhere in society. The focus is on how people handle several processes which are forced upon them simultaneously. The article describes the phenomenon of parallelism in user-interfaces and ends with a few results of this philosophy and advice on how work environments should be organised.
Parallel and simultaneous
In daily life people gather impressions continually. They are also expected to act all the time. This also applies to work in interactive learning environments.
In an interactive computer-based working environment there are continually all sorts of simultaneous processes providing different types of visual and auditory stimuli side by side, mixed and on top of each other. The attention here is mainly on electronic products, or interactive working environments. These are working environments based on a monitor, in contrast to (paper-based) working environments. 'Interactive' means working environments that are 'computer-based', whether or not the software runs 'at a distance', and that require two-way communication between user and system.
A person receives relevant and less relevant information from all sides. When working in an electronic environment there is a continuous process of supply and demand. People have learned to handle this fairly efficiently. A mechanism has been developed enabling a person to select and use only that which he needs. However, in some interactive working environments this is not possible. The question is whether this is due to the transmitter or the receiver: the equipment or the user?
The impressions concerned are visual or auditory; image or sound, but wind, heat, cold, and rain somehow influence the human being too. Brains are fed with information through all channels of perception. How does man select and remember what he needs in those continually changing circumstances?
Eyes and ears are the main channels of perception in most situations. We are trained to select the right image and the right dosage from vast amounts of information. As the eye is able to focus immediately on the one correct spot out of an immense number of parallel impressions, so the ear can focus on the one source it wants to hear among a quantity of different sources. From a series of voices, man is able to pick out the story he wants to hear.
This continuous and almost unconscious selection consumes a lot of the untrained person's energy with traditional media. Some educationalists say that information offered to inexperienced people should be free from all kinds of disturbing information. However, other experts say that an instruction or learning environment may well be exciting, challenging or tempting. Some researchers say that a learning environment may well be full of information, or even that it has to be like that. In the literature there are numerous directions for how a perfect transfer-, learning- or working-environment should look. But one should not follow these directions literally. Depending on the circumstances and the target group, the conditions under which a design guideline should or should not be applied may well be interpreted differently.
The above remarks also refer to situations involving the common media, such as television, video-recorders and interactive tape-slide series with remote control. Human beings are quite capable of seeing and hearing selectively and handling simultaneous processes. However, for computer-based working environments, and in particular interactive open working environments, other laws apply. Linear situations are much less frequent there. Relatively linear media are traditionally a world for instruction technologists. Really interactive media should be approached differently.
The television set was designed for the presentation of linear programs. It is true that image and sound were linked, but that form of parallelism provides no problems. The present problem of multimedia computers is parallelism within one channel of perception. Images appear and at the same time other images disappear. This requires the use of one's memory which may well be a problem in certain situations. The same happens in the audio field but we are better equipped to handle it. A watcher's memory is used differently in linear processes than in learning processes.
With courseware and interactive software in general but in particular with learning environments for simulations, this implicit restriction of the monitor indeed plays a part. One is not always aware that many problems with simulations are related to this handicap of the small monitor. In a teaching program for instance certain information has to be used in a different part of the lesson as well. In other words: information from an earlier moment should also be available later. The student would like to see that information without having to do anything. If a designer hasn't anticipated this with his product then it will stay in the cupboard, never to be used again.
When is simultaneous also parallel, and when and why is it important for a designer to know or realise this? We all know that in any presentation the use of extra tools is unavoidable. Everyone who speaks live uses a tool, even if it is only an overhead projector. It is used not merely because a lot of information can be presented in a relatively short time and because these things can be prepared at home, but also because they are retrievable. All these tools, and loose sheets in particular, can be quickly retrieved during the lecture when there are questions about an earlier part. So beside the speaker there is a parallel flow of information in the form of sheets, slides or electronic sheets, which besides being handy, clearly enable listeners to remember the message better than if the speaker had merely told his story. This, then, is the third advantage of tools. One would be inclined to think that one flow of information - the speaker - would be hard to follow, let alone a second or third flow. But this is not true at all, as will be demonstrated below in the experiments with 'talking heads' as a means of instruction with simulation. These psychological matters are rather complicated, for the reverse would be more logical. Somehow the message is more easily remembered. This may be due to redundancy, and to the fact that listeners are capable of more parallel actions than scientists think. Another factor might be that a speaker tends to radiate a certain ease or confidence. This should also be achieved in computer-based instruction. We assume that parallel events - within the scope of the story - may under certain conditions reinforce the main story. Information that supports the main flow of information but in a different form will create a better understanding in the listener, and therefore the information is better remembered.
There is of course an ideal balance between too many and too few simultaneous processes and parallel presentations. The point is whether matters may run parallel in every perception channel or whether this should be avoided. We must carefully consider the conditions under which something may or may not be applied.
Learning versus Instruction
Instruction theories differ from learning theories. Experts in the field of learning ('learning technologists') have developed well-structured ideas which are often at right angles to ideas from the traditional world of instruction ('instruction technologists'). This is of course due to the fact that instruction is something being offered (the 'supply side') and learning is more the gathering of knowledge by the user. This could be called the 'demand side' in education. 'Asking for certain information' and 'offering certain information' are two entirely different fields of science, both traditionally and ideologically. Just compare Gagné's and Romiszowski's ideas with those of Van Parreren and Papert. Van Parreren and Papert have approached the subject from the learning angle: student-controlled learning. Their points of departure, approach and targets in their solutions for educational problems are quite different.
In the world of educational technology there are two different fields: the world of 'learning technology' and that of 'instruction technology'. For the analyses below it is essential to distinguish clearly between 'learning tools' and 'means of instruction'. In principle, learning tools contain no knowledge, or very little, while means of instruction do contain knowledge. If learning tools do contain knowledge, then they are really a combination of learning and instruction tools. For the benefit of this discussion the two functions should be considered and analysed separately.
An example of this fundamental difference between instruction-rich and instruction-poor environments is a chemistry lab with nothing but a set-up for a practical. Apart from reading manuals there is no knowledge transfer. On the other side there is a course on television where knowledge is transferred, dosed in a certain way, with very little interaction.
Computer-based learning environments have to meet other requirements than computer-based instruction programs. With learning tools nothing happens until the user asks for it. With instruction programs the user has to do exactly what the designer wanted. In that type of courseware a target cannot be compared with targets in open learning environments. The main target with instruction tools is of course knowledge transfer. This is different for learning tools, because there insight is trained or tested, or knowledge that is already present is deepened. In general, knowledge is already acquired elsewhere (let's call that knowledge 'static') and somehow a deeper understanding should be achieved by means of another learning tool. Then 'static knowledge' will become 'dynamic knowledge' and all that has been learned will be better applicable.
Plain simulation programs and tools usually do not contain knowledge. Knowledge about the domain or subject of the simulation has already been touched upon in a traditional way, as part of the curriculum or a course. In general, curriculums or courses have a differentiated offer of learning tools. Simulation could be one of them. Lessons, tutorials, study materials, laboratory sessions, instruction books or an excursion are possible other tools. Each form has its pros and cons. Instruction materials play a part both in one-way and in two-way tools. In the latter case, instruction plays a secondary but essential part. The question in this investigation is what type of instruction plays a part in creating a proper learning environment. A number of forms have been investigated, viz.: paper support or instruction materials; electronic instruction with text and images; and electronic instruction with a 'talking head' instructor (desktop video).
Problems with paper-based materials
Simulation programs have always been provided with paper support or instruction materials. Simulations with this form of coaching, with all sorts of paper materials, proved the most successful in practice. In 1990 Van Schaick Zillesen and Gmelich Meijling investigated certain simulations and the role of paper materials versus electronic instruction materials beside or on a monitor (Min, 1995). In the years 1992-1994, that research and the pattern in the results really showed us what the pros and cons of sequential situations are and what the order of parallelism is, no matter the medium used. This in itself was an indication that parallelism plays a certain part here. However, we were not aware at the time of how important parallelism was to become for our research as a research topic and subject variable. The electronic instruction method - at least the linear form - proved unsuccessful for simulation environments. Apparently the medium or the software design methods were not yet sufficiently developed. At the time multitasking was not yet used for learning tools. This was also shown in our research. Instruction via the monitor did not work at crucial moments, and paper materials were not given sufficient credit by teachers and users. Often they were not read at all, and carefully prepared cases were left unused. Instead people tended to play with a simulation, the teachers included. Simulations which were used without instruction or coaching often disappeared without a trace, however well-designed. Research showed that the simulations which were successful had achieved this through carefully composed work sheets or booklets with set tasks and cases. It was practically always paper instruction material. Paper therefore had a big advantage. But nobody was aware of the main reason for this: parallelism. People were looking for modern solutions, and paper had a specific disadvantage: it does not prosper in a monitor culture.
It is also difficult to distribute and relatively difficult to copy. The arrival of tele-learning, the down-load possibility, on-line working and delivery on demand method make it necessary to continue searching for suitable electronic possibilities that can become successful. Computers prove to have disadvantages which will be discussed below.
Problems with electronic materials
Many researchers have for various reasons tried to solve coaching and instruction work in open work and learning environments electronically. Those efforts failed in the harsh reality of learning tools practice. Modern learning tools are used for a while and then they disappear again for reasons unknown. There are numerous examples to illustrate this.
Designers often try to solve this problem by using one half of the screen for instruction and the other for the actual learning environment. These solutions, called viewport solutions in our research, indicate the designer's awareness of the user's need to put information side by side in order to be able to compare things. It proves that users need to have things parallel. This is defined as parallelism of the first order. The problem with this is that the designer ends up needing a larger screen than that provided on a standard PC.
Learning environments imply a streamlined two-way traffic which should be realised in a natural way. It is not clear yet what is really essential for a good and effective learning environment. Much of what we can do is technically still imperfect.
It is still difficult to retrieve earlier presented and selected information onto the screen. It may well be more practical to print the information on the screen so that it can be put beside the monitor for further consultation, rather than to find the same information again and again and organise it on a screen. The fact that paper materials beside the monitor are quite practical goes to prove that users are more familiar with working in parallel than we think. Below is a list of some solutions that have been used so far:
* monitors with internal screen memory (page switching);
* several monitors side by side;
* very big monitors;
* scrollable information;
* hypertext like solutions;
* viewports / split-screen.
However, few of these solutions really work. They usually require a certain skill from the user, and in education every obstacle is one too many. Cramming a screen with all kinds of information is one solution (this is 1st order parallelism), but it has all kinds of ergonomic disadvantages: text that is too compact, poor syntax due to statements that are really too brief, an excess of information, letters on the screen that are too small, and so on.
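The ergonomic dead end of cramming everything onto one screen can be made concrete with a small calculation. The sketch below is purely illustrative (the function name and the pixel figures are assumptions for this example, not measurements from this research): it checks whether a number of side-by-side panels on a given screen each keep a minimum readable width.

```python
# Illustrative sketch: why 1st order parallelism (everything visible at once)
# runs out of screen space. All names and thresholds are assumptions.

MIN_READABLE_WIDTH = 160  # assumed minimum panel width (pixels) before text gets too small

def panels_fit(screen_width, n_panels, min_width=MIN_READABLE_WIDTH):
    """Return True if n_panels side-by-side panels each keep a readable width."""
    if n_panels == 0:
        return True
    return screen_width // n_panels >= min_width

# On an assumed 640-pixel screen, a handful of panels is the practical limit:
assert panels_fit(640, 4)      # 160 px per panel: just readable
assert not panels_fit(640, 5)  # 128 px per panel: letters too small, text too compact
```

The point of the sketch is only that the number of truly parallel panels grows linearly with screen width, which is exactly the resource a standard PC monitor lacks.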
Parallelism as a concept
Parallelism occurs everywhere. It is something that people can handle very well as a rule. It is present in our daily lives. Information reaches us through various channels. We have to select it if we want it to reach us by means of a perception channel, and then we can process it. A good example of the use of several perception channels and several media sources is the use of a walkman with an audio tape. On this tape a step-by-step explanation is given and exercises are evaluated. This method of instruction is insufficiently appreciated, although it is often applied with difficult software programs. The student can instruct himself at his own speed. This is a clear case of two parallel processes that are not connected; two simultaneous, a-synchronic processes, in other words. There is no technical connection, therefore the term a-synchronic is used. Besides, there are two entirely different applications, which are also separately designed: viz. software in the computer and instruction on tape. This division between the two media has considerable advantages for the design technique. Some more examples from daily life illustrate the important role parallelism plays; they are also examples of parallelism in systems without window techniques:
* a museum or exhibition set-up;
* a supermarket;
* a newspaper;
* loose sheets of paper and open books on a worktable or desk;
* a board on which tools are hung;
* a lesson in which the teacher reads out loud with the help of illustrated textbooks;
* news on TV (the event in a frame, subtitles, sound, logo etc.);
* a walkman plus tape (as a means of instruction or on a guided tour).
In all these examples information is put side by side, or parallel. The eye or ear determines what is read or seen first and what happens next. A newspaper is also a striking form of parallelism. Actually the newspaper has been the same for 150 years, and even today we cannot imagine life without newspapers. So it must be a very practical medium. Why is it so practical? Because when you open a page you can decide for yourself what you want to read and what you want to skip. This is a problem with a sequential source of information such as a film or book. The user in 1995 has often become too impatient for an ordinary linear medium. As a result he enjoys zapping. In a museum or supermarket one can see at a glance where one is going to start looking or buying and where one will go after that. Museums are set up according to 1st order parallelism, with an optimum amount of freedom for the user. Although certain people would like to limit that freedom (teachers or directors of museums), it is essential because it makes discovering, and learning through discovering, fun.
In a well-functioning window system, information components can easily be compared to information placed elsewhere in another window. This is essential for a good working environment in our present information society. Windows on a screen provide a host of new possibilities and the concept of parallelism gets a new dimension. Although windows are not a 100% form of parallelism, they are still very practical. In windowing software one has all the advantages that 1st order parallelism would otherwise require, for instance, larger monitors for. A number of examples in present computer practice where this type of parallelism plays an important part are:
* windowing operating systems;
* the desktop philosophy in general and full-size page design in particular;
* modern tools with the help of little windows (and/or pull down and pop up menus);
* an electronic multi-tasking working environment (with a clipboard system).
The advantages of data offered in parallel and data offered quasi-parallel, as is the case in 2nd order parallelism with windows, are obvious. Yet these concepts are not fully appreciated in learning psychology, instruction technology and instrumentation technology, probably because drills and linear instruction are still common practice.
Levels of parallelism
What are sensible and practical design solutions if we assume that computers and monitors will continue to have some deficiencies for the time being? The solution in the design of the working environment and user-interface is to observe the concept of parallelism and its consequences (the PI-theory, among others). We defined three types of parallelism, which each have a different effect on the behaviour of the user, viz.
* 1st order parallelism: all elements can be seen at a glance;
* 2nd order parallelism: the elements are on overlapping carriers;
* 3rd order parallelism: as 1st order, but with specific coaching.
Most of the examples here are forms of pure side by side elements or sources of information. That is defined as 1st order parallelism here. Elements on carriers such as paper, windows etc. and which are partly overlapping are counted as 2nd order parallelism. The last type is 1st order parallelism combined with something else: usually a relative linear coaching element. This is defined as 3rd order parallelism. First something about parallelism in general.
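For illustration, the distinction between the orders can be expressed as a small classification rule over window layouts. The sketch below is a hypothetical model of our own terms, not part of the original PI-theory publications; the rectangle representation and function names are assumptions.

```python
# Hypothetical sketch of the three orders of parallelism defined above.
# Windows are modelled as axis-aligned rectangles (x, y, width, height).

def overlaps(a, b):
    """True if two rectangles overlap (2nd order: elements on overlapping carriers)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def order_of_parallelism(windows, has_coaching=False):
    """Classify a screen layout into the 1st, 2nd or 3rd order described in the text."""
    any_overlap = any(
        overlaps(windows[i], windows[j])
        for i in range(len(windows)) for j in range(i + 1, len(windows))
    )
    if any_overlap:
        return 2  # 2nd order: elements on overlapping carriers (windows, loose sheets)
    if has_coaching:
        return 3  # 3rd order: side by side plus a specific coaching element
    return 1      # 1st order: all elements visible at a glance

side_by_side = [(0, 0, 300, 200), (310, 0, 300, 200)]
stacked      = [(0, 0, 300, 200), (150, 100, 300, 200)]
assert order_of_parallelism(side_by_side) == 1
assert order_of_parallelism(stacked) == 2
assert order_of_parallelism(side_by_side, has_coaching=True) == 3
```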
The desktop philosophy, together with the arrival of windows around 1982 and multi-tasking, was not only a technical breakthrough but proved a big step forward as well. This was ideal for the beginner who only wanted to be able to work with his program and nothing else. Working with windows is quite like working with loose sheets, books and tools on a desk. The type of parallelism with windows is parallelism of the second order. The effective screen surface became larger than 100% due to the arrival of multi-windowing applications, which is no small advantage. Pull-down menus also rapidly became popular all over the world, not least because there is a quite distinct form of second-order parallelism in pull-down menus: you can take a quick peep elsewhere without messing up everything on the screen.
People who do not want to, or cannot, remember commands immediately recognised the revolutionary aspect of window technology, in particular for education. For mouse and window the technical mechanisms need to be 100% reliable and response times need to be very fast. This is still insufficient on most PCs used at schools today.
Parallelism in the computer
Window techniques and multitasking systems turn out to be good solutions for the lack of screen space in do-environments. They enable things which earlier were inadequately or clumsily instrumented. The concept of Min's MacTHESIS philosophy (Min, 1990) is such a multi-windowing and multitasking system. The MacTHESIS philosophy and the matching MacTHESIS system proved very productive as a sort of directive for dimensioning bare simulation environments. It was also applicable on the instruction-supply side. Finally it offered a theoretical grip - through the PI-theory - on both the success of paper instruction methods and that of certain electronic instructions (Min, 1994).
Initially electronic methods for instruction and help systems were not as easy to handle, nor as cheap in use, as paper instruction materials. But copying software was obviously cheap and practical. The inclination of software designers to solve all problems by adapting the software proved more difficult in practice than anticipated. It appeared that the instruction component in software could also be produced electronically (in particular due to the arrival of new techniques). However, this is not always simple for the user: he gets lost in hyperspace.
In simulation the best ergonomic presentation form for (electronic) instruction proved to be a separate window. However, within that window there are design problems of a sequential nature. The idea of two separate windows is in principle quite feasible. The instruction program should be seen as separate from the simulation program, and they have to be used apart from each other; this yields specific advantages for both user and designer. The two different functionalities may well be two separate applications. This even makes the design and realisation of those applications easier and therefore usually cheaper. Every application can be made with its own specific tools. No feats are required to link the software. Min (1992) used the MacTHESIS system for the bare simulations and HyperCard for instruction. HyperCard is easy to use for teachers. It is in particular a perfect 'poor-man's tool' for writing all kinds of instruction materials or cases, even testing systems.
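The design principle of two separate, unlinked applications can be sketched in miniature. The classes and names below are a hypothetical illustration of the idea, not Min's actual MacTHESIS or HyperCard code: the simulation and the instruction share no state, so stepping one never touches the other (the a-synchronic, parallel coupling described above).

```python
# Minimal sketch (assumed names) of two separate, a-synchronically parallel
# applications: a bare simulation and an instruction component, with no
# technical connection between them.

class Simulation:
    """Bare simulation: advances its model state only when the user acts."""
    def __init__(self):
        self.time = 0.0
    def step(self, dt):
        self.time += dt  # stand-in for integrating the underlying model

class Instruction:
    """Separate instruction window: the user pages through tasks at his own speed."""
    def __init__(self, pages):
        self.pages = pages
        self.current = 0
    def next_page(self):
        self.current = min(self.current + 1, len(self.pages) - 1)
        return self.pages[self.current]

# The two run side by side; neither knows the other exists.
sim = Simulation()
instr = Instruction(["task 1: set the dose", "task 2: observe the response"])
sim.step(0.5); sim.step(0.5)
assert sim.time == 1.0
assert instr.current == 0  # instruction untouched by stepping the simulation
assert instr.next_page() == "task 2: observe the response"
```

Because the two halves are independent, each can be built with its own tools, which is exactly the design advantage the text attributes to the MacTHESIS-plus-HyperCard combination.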
Every user can decide for himself if he wants to use a certain component and to what extent. In our philosophy, instruction is only effective when the student realises its value. Our experiments have shown that in interactive, open learning environments, users indeed have a usually unconscious need to see things side by side, to put them side by side and, if necessary, to move them temporarily in order to be able to study underlying information. In other words, parallelism plays an important part in the user-interface of educational software, solving problems that face designers of screen solutions (Min, 1994). Traditional user-interfaces with a strong serial nature, whereby something that can be seen on the monitor disappears again with the next action, often rely too heavily on a person's short-term memory. A proper interactive learning tool should be a two-way medium with optimal reminders for the user. It should not have the disadvantages of one-way media such as linear programs, in which the viewer has often completely forgotten details supplied in the beginning by the time he really needs them.
The Parallel Instruction Theory
The 'parallel instruction theory' for simulation environments, in short the 'PI-theory', was developed and published by Min around 1992. In this theory researchers suppose that a user in an (open) learning environment can only work and learn if the environment has been designed in such a way that all relevant information for taking decisions is 'visible' or can be immediately 'retrieved'. On studying various kinds and large numbers of examples, it turned out time and again that traditionally presented instruction methods yielded better results than solutions provided in a modern way. Experiments with simulations realised within instruction-write-systems, and experiments with instruction-write-systems plus simulation systems, showed that there was no long-term solution. Simulations with paper instruction materials turned out to function much better at the time, although the reasons were not very clear. In the end we discovered the above-mentioned deficiencies of monitors and of the software technique used at the time, when carrying out a small and not very successful experiment for AKZO in Hengelo. Learning environments with a malfunctioning instruction component proved on closer study to have been too sequential in design and too rigidly linked in general. In other words, the instruction was too tightly 'synchronised with the simulation' and too rigid. Also, the instruction text disappeared completely from the screen while one was working with the simulation. It was surprising to see that such a design did not work, for the reverse seemed so much more obvious.
In ordinary courseware the designer has often put a lot of information in one picture or text. This is not to be recommended for various reasons, but it goes to prove that people in learning situations such as simulation environments like to be able to survey everything they have finally found or composed. Everything has to be close at hand and within view. The PI-theory can be described as follows: when designing an (open) learning or working environment, one should do everything within one's power to have all loose components within easy reach and ready for use. They should remain in the position (or state) which they have (or had) at the time. In particular the instruction, feedback, tasks, the solution room, scrapbook etc. should be parallel to the (open; bare) learning environment of the simulator. So far it is assumed that the usefulness of parallelism in open do-environments is due to:
1. the user's limited short-term memory as regards details or loose components; the monitor always wipes out (at least part of) the screen contents when the next image is shown;
2. the user wants to, should and must be able to compare, by physically comparing things from the past to things of the present;
3. the user wants to gain insight into cause/effect relations; through repeated verification and comparison;
4. the user wants to create his own frame of reference and should be able to do so by putting things that pass by on the screen side by side (by means of windows) and comparing them.
Further research will have to reveal which psychological variables and design variables play a part for the user and in the software, and also why and under which conditions the user will develop best in such an 'environment'.
The statements in this article are based on empirical research into the effects of specific prototypes, with the focus on the optimal design of open interactive learning environments with simulated phenomena of the type about which a lot of knowledge has been gathered recently. Modern methods and techniques which enable an adequate and dynamic design with high performance were tested in particular. This research has always had the characteristics of a specific R&D project. Every hypothesis about a design variable believed to be important is tested by means of an experiment with a single specific value or tuning of design variables, captured in a suitable multimedia prototype.
The research subject is always which aspects of design, structure and communication are of importance in well-functioning learning environments in relation to the learning targets that teachers demand. Those design variables, parameters, preconditions or system determinants were investigated with a view to the usefulness of a tool or program in relation to the learner, teacher and/or designer as the decisive factor. New techniques which computer science offered instrumentation technology were studied, in particular the benefits of techniques such as:
* movable windows
* 'a-synchronic instruction'
* digital and in particular desktop video (in general)
* video fragments as instruction ('talking head' concept)
* video fragments as feedback ('desktop video')
* 'intelligent feedback' (in simulations, monitored by a 'rule base system')
* model-driven animation (dynamic visualisations).
This prototyping approach is the best guarantee that research in the field of instrumentation technology will actually yield verifiable and relevant insights, viz. better functionality, sufficient performance (with emphasis on strong and fast), usability, a stimulating character, and a verifiable effectiveness as regards achieving the set learning targets. Relevant aspects were tested empirically by means of formative evaluation techniques, among others with video-recorded observations in the university studio. Some aspects were considered longitudinally and evaluated summatively at test schools. The aim is to develop theories in the field of methodology (design) of educational simulations in particular and of multimedia courseware in general.
Some experiments have been carried out with digital video. This multimedia element has tremendous potential in the field of learning tools, in particular as a linear instruction method. A speaking person - the face of the instructor - holds the attention of the audience, and the auditory information is highly effective; much more so than written text on a screen. There were also experiments with video as feedback in this type of environment. This feedback is essential for keeping the learning process, started by the instruction, going. Therefore video as instruction is defined in this research - in this process - as 'input' and feedback as 'output'. The whole system of cause and effect keeps the learning process and the learner going.
References
Benshoof, L.A., & Hooper, S. (1993). The effects of single- and multiple-window presentation on achievement during computer-based instruction. Journal of Computer Based Instruction, 20 (4), 113-117.
Min, F.B.M. (1995). Simulation technology & parallelism in learning environments: methods, concepts, models and systems. De Lier: Academic Book Center. ISBN 90-5478-036-3.
Min, F.B.M. (1994). Parallelism in open learning and working environments. British Journal of Educational Technology, 25 (2), 108-112. ISSN 0007-1013.
Min, F.B.M. (1992). Parallel instruction, a theory for educational computer simulation. Interactive Learning International, 8 (3), 177-183.