By Rik Min
Also published (as a short paper) under the title: Shortcomings of monitors; the problem of linear presentation media in learning situations; the importance of parallelism in open learning and working environments. (Published in the proceedings of EuroMedia 96: Telematics in a Multimedia Environment, Dec. 19-21, 1996; a publication of the Society for Computer Simulation International (SCS), Eds. A. Verbraeck & P. Geril; published on the Web.)
In an interactive computer-driven working environment, all sorts of simultaneous processes continually provide different types of visual and auditive information side by side, mixed and on top of each other. The focus here is mainly on electronic products, or interactive working environments: working environments based on a computer or a monitor, in contrast to (paper) working environments. The word 'interactive' here means 'on line': working environments that are computer-based, with software either 'at a distance' or not.
A person receives relevant and less relevant information from all sides. When working in an electronic environment there is a continuous process of supply and demand. People have learned to handle this fairly efficiently. A mechanism has been developed enabling a person to select and use only that which he needs. However, in some interactive working environments this is not possible. The question is whether this is due to the transmitter or the receiver: the equipment or the user?
The impressions concerned are auditive or visual: image or sound; but wind, storm, heat, cold and rain somehow influence the human being too. The brain is fed with information through all channels of perception. How does man select and remember what he needs under those continually changing circumstances?
Eyes and ears are the main channels of perception. We are trained to select the right image and the right dosage from vast masses of information. As the eye is able to focus immediately on the one correct spot out of an immense amount of parallel impressions, so the ear can focus on the one source it wants to hear among a quantity of different sources. From a series of voices, man is able to pick out the story he wants to hear.
This continuous and almost unconscious selection consumes a lot of the untrained person's energy with traditional media. Some educationalists say that information offered to inexperienced people should be free from all kinds of disturbing information. However, other scientists say that an instruction or learning environment may well be exciting, challenging or tempting. The literature contains numerous directions on how a perfect transfer, learning or working environment should look. But one should not follow these directions literally. Depending on the circumstances, the target group and the field, a design condition stating why something should or should not be applied may well be interpreted differently.
Within educational science, and in particular in applied educational science, there are instruction technologists who want an instruction to be clear, stripped of all trimmings, the reason being that trimmings may distract the receiver. But there are learning psychologists who say the reverse: some researchers say that a learning environment may well be full of information, or even that it has to be. The question is of course: who is right?
The above remarks also refer to situations involving the common media such as television, video recorders and interactive tape-slide series with remote control. Human beings are quite capable of seeing and hearing selectively and of handling simultaneous processes. However, for computer-based working environments, and in particular interactive open working environments, other laws apply: linear situations are much less frequent there. Relatively linear media are traditionally the world of instruction technologists. Really interactive media should be approached differently.
In the world of educational instrumentation technology ('educational technology') there are two different fields: the world of 'learning technology' and that of 'instruction technology'. In other words: there is the world of the designers of learning tools and the world of the designers of instruction. For the analyses below it is essential to distinguish clearly between 'learning tools' and 'means of instruction'. In principle, learning tools contain no knowledge, or very little, while means of instruction do contain knowledge. If learning tools do contain knowledge, they are really a combination of learning and instruction tools; otherwise one would have to count the underlying models or systems as knowledge. However, that is not the issue here; in practice one often comes across a combination of the two. For the benefit of this discussion the two functions should be considered and analysed separately.
An example of this fundamental difference between instruction-rich and instruction-poor environments is a chemistry lab containing nothing but a set-up for a practical: apart from reading manuals, no knowledge transfer takes place there. On the other hand there is a course on television, where knowledge is transferred, dosed in a certain way, with very little interaction. These are two opposites: learning tool versus means of instruction.
Computer-based learning environments have to meet different requirements from computer-based instruction programs. With learning tools, nothing happens until the user asks for it; with instruction programs, the user has to do exactly what the designer intended.
In that type of courseware a target cannot be compared with targets in open learning environments. The main target of instruction tools is of course knowledge transfer. This is different for learning tools, where insight is trained or tested, or knowledge that is already present is deepened. In general, knowledge is acquired elsewhere first (let us call that knowledge 'static') and a deeper understanding should somehow be achieved by means of another learning tool. 'Static knowledge' then becomes 'dynamic knowledge', and all that has been learned becomes better applicable.
Plain simulation programs and tools usually do not contain knowledge. Knowledge about the domain or subject of the simulation has already been touched upon in a traditional way, as part of the curriculum or a course. In general, curricula and courses offer a differentiated range of learning tools, of which simulation can be one. Lessons, tutorials, study materials, laboratory sessions, instruction books or an excursion are other possible tools. Each form has its pros and cons. Instruction materials play a part both in one-way and in two-way tools; in the latter case, instruction plays a secondary but essential part. The question in this investigation is what type of instruction is needed to create a proper learning environment. A number of forms have been investigated, viz.:
In 1990 Van Schaick Zillesen and Gmelich Meijling investigated certain simulations and the role of paper materials versus electronic instruction materials beside or on a monitor (Min, 1995). In the years 1992-1994, that research and the pattern in its results really showed us what the pros and cons of sequential situations are and what the order of parallelism is, no matter which medium is used.
Designers often try to solve this problem by using one half of the screen for instruction and the other for the actual learning environment. These solutions, called viewport solutions in our research, show that the designer is aware of the user's need to put information side by side in order to compare things. It proves that users need to have things in parallel. This is defined as parallelism of the first order. The problem is that in the end the designer needs a larger screen than the one provided on a standard PC.
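The viewport idea can be made concrete in a minimal sketch: one fixed screen is divided into an instruction pane and a work pane, so both stay visible side by side (1st order parallelism). The class names, sizes and pane contents below are illustrative assumptions, not from the paper.

```python
# Sketch of a "viewport" solution: one physical screen, two fixed panes.
class Viewport:
    def __init__(self, name, width, height):
        self.name = name
        self.width = width
        self.height = height
        self.content = []          # everything shown here stays visible

    def show(self, item):
        self.content.append(item)  # adding never wipes out earlier items

def split_screen(total_width, height, instruction_fraction=0.5):
    """Divide one fixed screen into two side-by-side viewports."""
    w_instr = int(total_width * instruction_fraction)
    instruction = Viewport("instruction", w_instr, height)
    work = Viewport("work", total_width - w_instr, height)
    return instruction, work

instruction, work = split_screen(640, 480)
instruction.show("Step 1: set the model parameters")
work.show("simulation graph")
```

The sketch also exposes the drawback named above: both panes must share one fixed `total_width`, so each half ends up smaller than a designer would like.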
With simulations someone has to prove his insight; his (passive) knowledge 'comes alive'. Designers of interactive software want to achieve completely different things with people than film or video designers. A television screen is a one-way medium. With a film or a discussion on television, the continuous disappearance of images is no problem: in a well-designed linear program there is always plenty of redundant information to get the message across.
With courseware and interactive software in general, but in particular with learning environments for simulations, this implicit restriction of the monitor does play a part. One is not always aware that many problems with simulations are related to this handicap of the small monitor. In a teaching program, for instance, certain information has to be used in a different part of the lesson as well; in other words, information from an earlier moment should also be available later. The student would like to see that information without having to do anything. If a designer has not anticipated this, his product will stay in the cupboard, never to be used again. This has happened to a lot of MS-DOS products. The Macintosh, and DTP with windows, pointed in the right direction. Meanwhile MacOS System 7 is the most suitable operating system for this type of experiment, as it allows multitasking in a very natural manner. Windows 3.1 and Windows 95 are a step in the right direction. Most possibilities of the tested prototypes designed on the Mac can be transferred to present Intel computers. However, as yet not all techniques operate smoothly at that level.
A designer really wants a larger effective screen surface, even if only virtually. This cannot be achieved without windows. But they have to move easily and quickly, without losing their unique contents, and it must be possible to call them up again, since otherwise they offer insufficient advantages.
It is still difficult to retrieve earlier presented and selected information onto the screen. It may well be more practical to print the information on the screen so that it can be put beside the monitor for further consultation, rather than finding the same information again and again and organising it on a screen. The fact that paper materials beside the monitor are quite practical goes to prove that users are more familiar with working in parallel than we think. Before windows became popular on computers, many people tried to solve this problem in various ways. Below is a list of efforts from the previous decade:
However, few of these 'old' solutions really work. They usually require a certain skill from the user, and in education every obstacle is one too many. Designers have come up with all kinds of solutions, e.g. cramming a screen with all kinds of information. This is 1st order parallelism, but it has all kinds of ergonomic disadvantages: text that is too compact, poor syntax due to statements that are really too brief, an excess of information, letters on the screen that are too small, and so on.
The desktop philosophy, together with the arrival of windows around 1982 and later multitasking (around 1990), was not only a technical breakthrough but proved to be a big step forward in other respects as well. This was ideal for the beginner who only wanted to be able to work with his program and nothing else. Working with windows was quite like working with loose sheets, books and tools on a desk. The fun of working and learning behind a desk was not unlike working and learning with information sources through windows.

Many professional software producers, in particular the traditional informaticians, did not see the use of this childishly simple user interface. The value of the ideas of Adele Goldberg and the Xerox Palo Alto Research Center on Smalltalk and the like (1980) was not recognised. But people in education understood the approach: just think of Papert (1980) and the breakthrough of Apple Macintosh computers in schools in the USA (1985-1995).

This type of parallelism with windows is defined as parallelism of the second order. The effective screen surface became larger than 100% due to the arrival of multi-windowing applications, which is no small advantage. Pull-down menus also rapidly became popular all over the world, not least because there is a quite distinct form of second-order parallelism in pull-down menus: you can take a quick peep elsewhere without messing up everything on the screen. This is implicit proof of parallel thinking during the design stage.

People who do not want to or cannot remember useless things (such as commands) immediately recognised the revolutionary aspect of a fast window technology, in particular for education. The computer world, and in particular the MS-DOS and Intel people, did not understand the impact of pull-down menus, mouse and windows. They realised far too late why this concept was a major step forward compared to scrolling screens and command-based software.
Traditional computer scientists who did not believe in this type of event-driven windowing system counted on the sheer force of a processor, and on the user's good memory for commands, rather than looking for a perfect concept that would enable everyone to use software. Primitive copies of the Macintosh concept on 286 and 386 PCs also made people blind to the advanced event-driven programs. For mouse and window mechanisms need to work 100%, and response times need to be fast. This is still insufficient on ordinary PCs, and particularly on the PCs used at schools.
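The contrast drawn here between command-based and event-driven software can be sketched minimally: the first relies on the user's memory for exact command names, the second simply dispatches whatever visible target the user points at. All names below are invented for illustration.

```python
# Command-based: the user must recall and type the exact command name.
def command_based(typed_commands, command_table):
    results = []
    for cmd in typed_commands:
        handler = command_table.get(cmd)
        results.append(handler() if handler else "unknown command")
    return results

# Event-driven: the user clicks a visible target; the system dispatches.
def event_driven(events, handlers):
    return [handlers[event["target"]](event) for event in events]

command_table = {"OPEN": lambda: "file opened"}
typed = command_based(["OPEN", "OEPN"], command_table)   # a typo slips in

handlers = {"open-button": lambda event: "file opened"}
clicked = event_driven([{"target": "open-button"}], handlers)
```

The typo in the command-based run fails, while the event-driven run cannot misfire in that way: there is nothing for the user to misremember, which is exactly the educational advantage claimed above.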
Mouse and windows are still far from perfect, and this applies to all manufacturers; they certainly do not run well on Windows 3.1 equipment. Mouse movements are influenced by the difficulty of the job. There are good and bad windowing computers. With most computers, the response time of the mouse movement depends too much on the status of the primary process: if an extensive read or write operation from or to the disk is active, the mouse moves across the screen in a hopelessly irregular manner. This is apart from the problem of the mouse with two or three buttons. A person needs only one mouse button; it has been shown that a mouse with one button is a lot pleasanter for most people, since they need not think about whether to press the right or the left one.
Macintosh System 7, and Motorola-based computers in general, are way ahead of the rest, in particular in respect of user-friendliness, for the developers of this platform have moved with the times. This is also due to the fact that there are software directives for these computers, and to the more systematic hardware of the Motorola chips. With Motorola-based computers the mouse has a small, separate, independent processor, which results in fast response times under all kinds of circumstances, no matter what else is going on inside the computer.
Most of the examples here are forms of purely side-by-side elements or sources of information; that is defined here as 1st order parallelism. Elements on carriers such as paper, windows etc. which partly overlap are counted as 2nd order parallelism. The last type is 1st order parallelism combined with something else, usually a relatively linear coaching element; this is defined as 3rd order parallelism. First something about parallelism in general.
The phenomenon of parallelism covers a wide area, wider than one would initially think. It is present in our daily lives. Information reaches us through several perception channels; we have to select it if we want it to come in through one of those channels, and only then can we process it.
A good example of the use of several perception channels and several media sources is the use of a walkman with an audio tape. On this tape a step-by-step explanation is given and exercises are evaluated. This method of instruction is insufficiently appreciated, yet it is often applied with difficult software programs. The student can instruct himself at his own speed. This is a clear case of two parallel processes that are not connected: two simultaneous, a-synchronic processes, in other words. There is no technical connection, hence the term a-synchronic. Besides, there are two entirely different applications, which are also designed separately: viz. the software in the computer and the instruction on tape. This division between the two media has considerable advantages for the design technique, but that is beyond the scope of this article.
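The a-synchronic situation can be sketched as two independent streams with no technical link, where the learner alone decides when each advances. The step texts and function names are invented for illustration.

```python
# Two a-synchronic processes: taped instruction and interactive software.
def tape_instruction():
    """The linear instruction stream (the 'walkman')."""
    yield "Step 1: start the program"
    yield "Step 2: change one parameter"
    yield "Step 3: compare the two curves"

def simulation_states():
    """The software stream; it advances only on user actions."""
    state = 0
    while True:
        yield f"model state {state}"
        state += 1

tape = tape_instruction()
sim = simulation_states()

# The learner interleaves the two streams at moments of his own choosing;
# neither stream knows about, or waits for, the other:
log = [next(tape), next(sim), next(sim), next(tape)]
```

Because the only synchronisation is the learner's own pace, either stream could be replaced or paused without touching the other, which is the design advantage alluded to above.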
Some more examples from daily life to illustrate the important role parallelism plays and some examples of parallelism in computer systems without window techniques:
In all these examples information is put side by side or parallel. The eye or ear determines what is read or seen first and what happens next.
A newspaper is also a curious form of parallelism. The newspaper has actually remained the same for 150 years, and even today we cannot imagine life without it. So it must be a very practical medium. Why is it so practical? Because when you open a page you can decide for yourself what you want to read and what you want to skip. This is a problem with a sequential source of information such as a film or a book. The user of 1995 has become too impatient for an ordinary linear medium; as a result he enjoys zapping.
In a museum or supermarket one can see at a glance where one is going to start looking or buying and where one will go after that. Museums are set up according to 1st order parallelism, with an optimum amount of freedom for the user. Although certain people (teachers or museum directors) would like to limit that freedom, it is essential because it makes discovering, and learning through discovering, fun.
There are many more examples of pure 1st order parallelism; loose overlapping sheets of paper and windows are examples of 2nd order parallelism. We will discuss those later; on this subject see earlier articles by Min (Min 1992, 1994 and 1995). An underlying window can be brought to the surface, contents included, with a simple click of the mouse. This is a fairly natural and automatic action, so that even when a window contains rather unique information which was hard to find, we do not hesitate to open another window: we know the information will not be lost. In a well-functioning window system, pieces of information can easily be compared with any information placed in another window. This is essential for a good working environment in our present information society. This manipulable 2nd order parallelism is just as important for the ultimate success of learning tools.
A museum or an exhibition can also be visited in a different way: one can wear a walkman with a tape recorder, which then becomes a sort of electronic 'guided tour'. This can sometimes be highly recommended for certain people. Visiting a museum with a simple walkman on your head then becomes a good example of 3rd order parallelism. This method, coaching with a walkman, links a specific advantage of linear instruction to the advantage of another concept: the instruction on tape is linear, but it runs parallel to what is on the wall or on the panels. Every target group chooses its own form of coaching, whether in a museum, at an exhibition or on a tourist route: a set route, a catalogue, a live guide or a guided tour with a walkman and earphones.
Another example of 3rd order parallelism is the set-up of a supermarket. By means of a sort of implicit but dominant route of preference, indicated by signs and paths, the customer is distracted from his own preferences and coached along the shelves in a certain order. The aim, of course, is that the customer will buy more because he sees more; a theory founded on scientific research.
The terms 'parallel' and 'linear' should be compared. An example is a speech accompanied by sheets or slides. Loose sheets can easily be used in conformity with the concept of parallelism. PowerPoint presentations, however (electronic sheets shown via a projection platform), are actually almost linear. The advantages of PowerPoint presentations are obvious, but they lack the one advantage of loose sheets, as does the slide presentation. Modern tools may therefore have disadvantages compared with ordinary sheets. When using ordinary loose sheets, the speaker is able to glance through his speech just before or during his talk; he may even decide to choose a different order. This overview is quickly lost when the same speaker uses the electronic sheets of PowerPoint for the same speech, for he cannot see in advance which picture will appear. But solutions have been found for this as well; parallel solutions, mind you.
If we compare these everyday examples with examples involving monitors and software programs we see a lot of similarities. Windows on a screen provide a host of new possibilities, and the concept of parallelism gets a new dimension. Although windows are not a 100% form of parallelism, they are still very practical. In windowing software one has all the advantages of 1st order parallelism, certainly with larger monitors. A number of examples in present computer practice where this type of parallelism plays an important part are:
The advantages of data offered in parallel, and of data offered quasi-parallel as in 2nd order parallelism with windows, are obvious. But in certain sectors of society these techniques were recognised only much later. Apple Macintosh's desktop concept is based on parallelism, yet these concepts are not fully appreciated in learning psychology, instruction technology and instrumentation technology, probably because drills and linear instruction are still rated very highly. Windows is a sophisticated concept, but more from the conceptual angle than from the angle of our research, viz. designing and setting up a learning environment for simulation with an instruction or coaching problem.
A lot of educational software still looks terribly linear, in spite of the designation 'interactive' and the fact that everything is interactive multimedia and hypermedia nowadays. This may be due to the conservative outlook of informaticians and computer manufacturers on the one hand, and the specific 'one-way' thinking of AV specialists on the other. The development of the portable PowerPC, Nintendo games, pocket computers and things like disposable Nintendo pocket games will soon bring the Intel world face to face with reality. For a user wants 'natural' and preferably 'dedicated' learning tools that are 'turn-key' and portable, and that therefore come in one piece and weigh and cost next to nothing.
We all know that, no matter what the presentation, the use of extra tools is unavoidable. Everyone who speaks live uses a tool, even if it is only an overhead projector. This is used not merely because a lot of information can be presented in a relatively short time and because one can prepare these things at home, but also because the materials are retrievable. All these tools, and loose sheets in particular, can quickly be retrieved during the lecture when there are questions about an earlier part.
So beside the speaker there is a parallel flow of information in the form of sheets, slides or electronic sheets, which, besides being handy, clearly enables listeners to remember the message better than if the speaker had merely told his story. This, then, is the third advantage of tools. One would be inclined to think that one flow of information, the speaker, would be hard to follow, let alone a second or third flow. But this is not true at all, as will be demonstrated below in the experiments with 'talking heads' as a means of instruction with simulation.
These psychological matters are rather complicated, for the reverse would seem more logical; yet somehow the message is more easily remembered. This may be due to redundancy, and to the fact that listeners are capable of more parallel actions than scientists think. Another factor might be that a speaker tends to radiate a certain ease or confidence; this should also be achieved in computer-based instruction. We assume that parallel events, within the scope of the story, may under certain conditions reinforce the main story. Information that supports the main flow of information, but in a different form, will create a better understanding in the listener, and therefore the information is better remembered. There is of course an ideal balance between too many and too few simultaneous processes and parallel presentations.
The point is whether matters may run parallel in every perception channel, or whether this should be avoided. Another question is which channel of perception is the most critical in this respect: the eye or the ear? We must carefully consider the conditions under which something may or may not be applied; one should never say never. A pattern that is present in one situation does not necessarily occur elsewhere, even when situations appear similar. However, people in educational science and in the courseware and media world often violate this principle.
Initially, electronic methods for instruction and help systems were not so easy to handle, nor as cheap in use, as paper instruction materials. But copying software was obviously cheap and practical. The inclination of software designers to solve all problems by adapting the software proved more difficult in practice than anticipated. It appeared that the instruction component of software could also be produced electronically (in particular thanks to the arrival of new techniques). However, this is not always simple for the user: he gets lost in hyperspace or is distracted during work.
In simulation, the best ergonomic presentation form for (electronic) instruction proved to be a separate window. Within that window, however, there are design problems of a sequential nature. The idea of two separate windows, usually resulting in two different applications, is in principle quite feasible. As the instruction program should be seen as separate from the simulation program, and as they have to be usable apart from each other, this yields specific advantages for both user and designer. The two functionalities may well be two separate applications as a whole. This even makes the design and realisation of those applications easier and therefore usually cheaper: every application can be made with its own specific tools, and no feats are required to link the software. Min (1992) used the MacTHESIS system for the bare simulations and HyperCard for the instruction. HyperCard is easy for teachers to use; it is, in particular, a perfect 'poor man's tool' for writing all kinds of instruction materials or cases, and even testing systems.
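The two-application split can be sketched under invented names: the simulation and the instruction are fully independent programs that merely coexist on one desktop. Neither holds a reference to the other, so each can be designed, replaced or used on its own; the toy dynamics and page texts below are illustrative assumptions.

```python
# Two independent applications that only share the screen, never state.
class SimulationApp:
    """Stands in for a bare simulation (toy dynamics only)."""
    def __init__(self):
        self.time = 0
        self.level = 100.0

    def step(self):
        self.time += 1
        self.level *= 0.5   # arbitrary decay, just to show visible dynamics
        return self.level

class InstructionApp:
    """Stands in for a stack of instruction pages."""
    def __init__(self, pages):
        self.pages = pages
        self.index = 0

    def next_page(self):
        self.index = min(self.index + 1, len(self.pages) - 1)
        return self.pages[self.index]

# The only 'link' is that both are open at the same time, side by side:
sim = SimulationApp()
instruction = InstructionApp(["Welcome", "Task 1", "Task 2"])
```

Because the classes share nothing, each could indeed be built with its own specific tool, which is the design and cost advantage named above.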
Researchers call this side-by-side use of two simultaneous processes an 'a-synchronic' user situation. The term indicates that 'open learning environment' refers not only to the simulation environment, but also to the more or less noncommittal use of the instruction program: every user can decide for himself whether he wants to use a certain component and to what extent. In our philosophy, instruction is only effective when the student realises its value and asks for it.

Our experiments have shown that in interactive open learning environments users indeed have a, usually unconscious, need to see things side by side, to put them side by side and, if necessary, to move them temporarily in order to study underlying information. In other words, parallelism plays an important part in the user interface of educative software in solving the problems that face designers of screen solutions nowadays. The experiments mentioned here and further down are not described in detail; they can be studied elsewhere (Van Schaick Zillesen, 1990; Min, 1992 and 1994).

Traditional user interfaces with a strongly serial nature, in which something visible on the monitor disappears again with the next action, often rely too heavily on a person's short-term memory. A proper interactive learning tool should be a two-way medium with optimal reminders for the user. It should not have the disadvantages of one-way media such as linear programs, in which the viewer has often completely forgotten details supplied at the beginning by the time he really needs them.
It is clear that users of educative software like to have all information well ordered and at hand; they want to be able to see the connection or coherence of things. An example will illustrate this. Large screens are favourites in work situations, which can be compared to learning situations. This is proved by the huge success of SUN workstations: they sell well to people who can afford them, although the price is still a problem. The popularity of large screens partly proves the ideas on parallelism presented here. Users want to be able to put relevant information somewhere, preferably on a large screen, so that they can consult it without having to remove other relevant information from the screen. Users of screen-oriented workstations have an implicit need to put many information sources or tools side by side as a reference, so that comparisons are possible. This is not mere laziness of the brain, but a result of the fact that learning is largely a question of making comparisons and trying to see the coherence.

In ordinary courseware the designer has often put a lot of information into one picture or text. That is not to be recommended, for various reasons, but it goes to prove that people in learning situations such as simulation environments like to be able to survey everything they have finally found or composed. Everything has to be close at hand and within view.

The PI-theory can be described as follows: when one has to design an open learning or working environment, one should do everything within one's power to have all loose components within easy reach and ready for use. They should remain in the position (or state) which they have (or had) at the time. In particular the instruction, feedback, tasks, the solution room, the scrapbook etc. should be parallel to the open (bare) learning environment of the simulator. The number of design variables is just as large as the number of qualitative and quantitative obscurities.
1. the user's limited short-term memory as regards details or loose components, because the monitor partly or entirely wipes out the image contents when the next image is shown;
2. the user wants to, must and can compare, by physically comparing things from the past with things of the present;
3. the user wants to gain insight into cause/effect relations, through repeated verification and comparison;
4. the user wants to create his own frame of reference and should be able to do so, by putting things that pass by on the screen side by side (by means of windows) and comparing them.
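The four requirements above can be sketched as a minimal window stack in which covering a window never destroys its contents, so earlier results stay available for later comparison. Class and method names are illustrative, not from any real windowing API.

```python
# A window stack that preserves covered windows, contents intact.
class Window:
    def __init__(self, title, content):
        self.title = title
        self.content = content     # preserved however often it is covered

class Desktop:
    def __init__(self):
        self.stack = []            # last element is the frontmost window

    def open(self, window):
        self.stack.append(window)

    def bring_to_front(self, title):
        """One click restores an underlying window, contents intact."""
        for i, window in enumerate(self.stack):
            if window.title == title:
                self.stack.append(self.stack.pop(i))
                return window
        raise KeyError(title)

desk = Desktop()
desk.open(Window("run 1", {"result": 3.4}))
desk.open(Window("run 2", {"result": 2.9}))   # this now covers "run 1"
restored = desk.bring_to_front("run 1")       # ..."run 1" was not erased
```

Here the earlier result can be put next to the later one again, which is exactly the comparing and frame-of-reference building that points 2-4 describe.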
Further research will have to reveal which psychological variables and design variables play a part for the user and in the software, and why and under which conditions the user will develop best in such an 'environment'. At the main branch of Randstad in Diemen they faced a similar problem; visits were paid on both sides and the solution was really quite obvious. This project will be discussed in more detail below because it is so specific.
All this is no hard evidence in favour of the parallel instruction theory for simulations or 'do' environments. But our findings are such that further empirical research will show whether our hypotheses are correct. In spite of this we published the idea, as a concept and as a theory, in 1992, partly because we developed a large number of prototypes with many different design variables, which anyone can request in order to experiment with. They will find that the concept, provided it is used under the conditions we discovered, really works. One way to guarantee a wide spread of our experimental products was to publish the lot in one go on CD-ROM, with a large number of software products, articles, manuals and figures, and a book on parallelism and simulation technology (Min, 1993; Min, 1995).
The research subject is always which aspects of design, structure and communication are important in well-functioning learning environments, in relation to the learning targets that teachers demand.
These design variables, parameters, preconditions or system determinants were investigated with the usefulness of a tool or program for the learner, teacher and/or designer as the decisive factor.
New techniques which computer science offered in instrumentation technology were also studied, in particular their benefits.
The research on which this study is based has always been design-oriented, i.e. 'prototype-driven' research. This prototype-driven approach, in the sense of R&D, is the best guarantee that research in the field of instrumentation technology will actually yield verifiable and relevant insights: better functionality, sufficient performance (with the emphasis on robust and fast), usability, a stimulating character, and a verifiable effectiveness in achieving the set learning targets. Relevant aspects were tested empirically with test subjects by means of formative evaluation techniques, among other things with video-recorded observations in the university studio. Some aspects were studied longitudinally and evaluated summatively at test schools. The aim was to develop theories in the field of the methodology (design) of educative simulations in particular and of multimedia courseware in general.

Some experiments have been carried out with digital video. This multimedia element has tremendous potential in the field of learning tools, in particular as a linear instruction method. A speaking person - the face of the instructor - holds the attention of the audience, and the auditive information is highly effective; much more so than written text on a screen. There were also experiments with video as feedback in this type of environment. Such feedback is essential to keep the learning process, once started by the instruction, going. Video as instruction is therefore defined in this research as 'input' and feedback as 'output'. This whole system of cause and effect keeps the learning process and the learner going.
Navigating and comparing input with output from elsewhere is a very demanding job. Min proposed some ideas from the PI-theory to solve this problem. The result was a pilot project, concluded at the end of 1994 and early in 1995, which yielded two overlapping prototypes developed by the University of Twente with the rapid-prototyping method. One uses a special Pascal procedure library to consolidate and conserve the model; the other was built with the MacTHESIS system to show the client the obvious advantages of parallelism.
When parallelism proved able to solve the problem of the Randstad group, a project was started in which a young enterprise (Koopal & Gritter multimedia) produced a second series of prototypes. The university, the faculty and in particular the faculty lab simply had no room, facilities or know-how to do this. The department did, but the person involved could not possibly set aside his other daily activities, and other scientific staff members in the field of modelling (for that is what it was at that stage) were not available.
Late in August 1995, Koopal & Gritter multimedia made a working prototype in HyperCard. When Randstad showed great satisfaction with this prototype, a second stage was entered, with the HyperCard prototype as the starting point for the development of a definitive strategic planning system on the Windows platform. Here ToolBook Multimedia 3.0 is applied. Meanwhile a number of steps of the strategic planning process have been elaborated in ToolBook, and the definitive program will be finished by the end of January 1996.
The following steps of the strategic planning process have been elaborated: an introduction and start-up data; data entry for the current year; the points of departure for strategic planning, viz. a unit planning and a branch planning; and a summary of the results. Between the steps in which the user enters data, windows are linked in most cases. Every link consists of an input window and an output window that are shown in parallel on the screen. This way input/output relations can be pictured more clearly, so that the user is better able to compare data. Besides, the program presents data visually in the form of line diagrams, histograms and pie diagrams.
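The linking of window pairs described above can be sketched as follows. This is a hypothetical illustration of the design principle - an input window and an output window kept on screen in parallel, with the output recomputed whenever the input changes - and not the actual HyperCard or ToolBook implementation; the planning figures and the `summarise` function are invented example data.

```python
# Hypothetical sketch of a linked window pair: an input window and an
# output window shown in parallel, so that changing the input lets the
# user immediately compare it with the recomputed output. Names and
# figures are illustrative, not taken from the Randstad program.

class LinkedPair:
    def __init__(self, compute):
        self.compute = compute     # turns input data into output data
        self.input = {}
        self.output = {}

    def enter(self, **data):
        # Entering data in the input window immediately refreshes the
        # output window; both remain visible side by side, so the user
        # can picture the input/output relation directly.
        self.input.update(data)
        self.output = self.compute(self.input)
        return self.input, self.output

# One invented planning step: unit planning -> summary of results.
def summarise(inp):
    return {"total_hours": inp.get("units", 0) * inp.get("hours_per_unit", 0)}

pair = LinkedPair(summarise)
inp, out = pair.enter(units=3, hours_per_unit=40)
print(inp, out)   # both halves of the pair, available for comparison
```

The essential design choice is that the output is never shown *instead of* the input: the pair is one unit, so the comparison the PI-theory calls for costs the user no navigation.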
Parallelism has thus made a contribution to solving a practical problem. We hope that, after a number of further tests have been carried out, we will be able to say that all potential users, and in particular inexperienced software users, can handle the planning system after a relatively short period of time.
Further research will reveal which conditions need to be distinguished and how the design variables described here should be applied, so that the user can work and learn as well as possible in this type of 'environment'. Open environments which are not really open for a user, and in which there is too much control, may induce situations that a teacher does not want and which the designer might well have prevented.
Ball, G., (1993)
Redesign of simulation materials for CD-ROM. OKT report (written in English). University of Twente, Enschede (Supervisor: Rik Min).
Gritter, H. (1993)
Het ontwerpen, ontwikkelen en evalueren van ISAV (Instructional Support for ArcView): een COO-programma ter ondersteuning van het geografische informatiesysteem ArcView [The design, development and evaluation of ISAV: a CAL program supporting the geographic information system ArcView]. MSc thesis, University of Twente, Enschede (in cooperation with the University of Southampton; J. Moonen and F.B.M. Min).
Gritter, H., W. Koopal and F.B.M. Min, (1994)
A New Approach to Computer Simulations. Article. Interact, European Platform for Interactive Learning, Vol. 1, no. 2, ISSN 0929-4465.
Min, F.B.M., (1994)
MacTHESIS designer's manual. Manual. University of Twente, Enschede (218 pages).
Min, F.B.M., (1992)
Parallel Instruction, a theory for Educational Computer Simulation. Article. Interactive Learning Intern., Vol. 8, no. 3, 177-183.
Min, F.B.M., (1994)
Parallelism in open learning and working environments. Article. British Journal of Educational Technology, Vol. 25, no. 2, pp. 108-112. ISSN 0007-1013.
Min, F.B.M., (1992)
The THESIS Family; universal design systems for educational computer simulation programs. In: Modellbildungssysteme - Konzepte und Realisierungen; J. Wedekind and W. Walser (Eds.), paper in proceedings of the Deutsches Institut für Fernstudien an der Universität Tübingen, 1992. Comet Verlag für Unterrichtssoftware, Duisburg, Germany.
Van Schaick Zillesen, P.G. (1990)
Methods and techniques for the design of educational computer simulation programs and their validation by means of empirical research. PhD thesis, University of Twente, Enschede, Holland (promotors: E. Warries and F.B.M. Min).
Schaick Zillesen, P.G. van, F.B.M. Min, M.R. Gmelich Meijling and B. Reimerink (1995).
Computer support of operator training based on an instruction theory about parallelism. Chapter in book. Kluwer Academic Publishers (Eds: M. Mulder, W. Nijhof and R. Brinkerhof). ISBN 0-7923-9599-9, pp. 209-226.