
Psychological Review, 1982, Vol. 89, No. 6, 693-707

Copyright 1982 by the American Psychological Association, Inc. 0033-295X/82/8906-0693$00.75

Theory of Serial Pattern Production: Tree Traversals

René Collard and Dirk-Jan Povel
University of Nijmegen, The Netherlands

Structural information about a serial pattern may be encoded in the form of a hierarchical memory code. Conversely, when the pattern is to be produced, the memory code should be decoded. In this article a theory on serial pattern production is proposed that builds on Restle's theory of structural trees and, especially, on the process models for sequence production developed by Greeno and Simon. According to the tree traversal interpreter, presented in this article, a structural tree corresponds to the interpretive process that operates on a hierarchical memory code. A comparative analysis of computational properties and psychological relevance is substantiated by empirical test and extension to recent work on the internal representation of music.

Serial pattern production refers to the process by which the elements of a sequence are produced from a hierarchical memory representation in an orderly serial fashion. The idea that a hierarchical code may be used to represent a serial pattern has been proposed by Simon and Sumner (1968) and Leeuwenberg (1969), among others. The concept of structural trees was introduced by Restle (1970) to depict the inner structure of serial patterns as represented by hierarchical codes. Essentially, Restle showed that high-level transitions in a structural tree cause more difficulty in anticipating serial patterns than do low-level transitions. He foresaw that an integration of memory code and structural tree could provide the basis of a theory of how serial patterns might actually be produced. Greeno and Simon (1974) showed that, in fact, different production models are conceivable. In the present article a model is presented that reflects the exact relation between memory code and structural tree. According to this model, called the tree traversal interpreter, a structural tree corresponds not so much to a hierarchical memory code as to the interpretive process that operates on that code.

This research was supported in part by the Netherlands Organization for the Advancement of Pure Research (Z.W.O.).

Requests for reprints should be sent to René Collard, Department of Experimental Psychology, University of Nijmegen, Montessorilaan 3, 6500 HE Nijmegen, The Netherlands.

The psychological issue under consideration may be best introduced by means of an example. Consider a person who wants to whistle a tune learned long ago. Assume that the tune is stored in memory not as a long series of notes but in a concise hierarchical code reflecting the melodic structure, for example, of the form proposed by Deutsch and Feroe (1981). For the whistler to actually produce the tune, the memory representation must be decoded in order to generate the proper sequence of notes. Theoretically, the decoding can be done in several different ways, as shown by Greeno and Simon (1974), who proposed the following models: the doubling, the recompute, and the push-down interpreters. These three interpreters and the tree traversal interpreter may be characterized as follows: First, the whistler might be able to decode in advance the memory representation to a complete series of notes that is temporarily stored in a short-term memory (the doubling interpreter). Second, it may be that the whistler recomputes each consecutive note to be produced by applying the memory code to the very first note of the tune (the recompute interpreter). Third, the whistler may use a push-down stack to store and recover notes preceding the last one in order to generate the next note (the push-down interpreter). Finally, the tree traversal interpreter assumes that the whistler uses the hierarchical memory code in an on-line fashion to compute each note from the immediately preceding one. It seems reasonable to expect that the amount of processing required at each location in the sequence relates to performance, be it latencies or errors made. In that case the different production models lead to distinguishably different predictions, which may be tested empirically.

This article is mainly concerned with the mathematical description of the tree traversal interpreter and the comparative analysis of computational properties and psychological relevance of processes for sequence production. To that end we will first recapitulate the basic framework underlying serial pattern research (Simon, 1972). For the general relation between such research and theories of perception, cognitive organization, and language that share the idea of hierarchical representation, the reader is referred to Greeno and Simon (1974), Restle (1979), and Deutsch and Feroe (1981).

Serial Pattern Research

Serial pattern research has mainly focused on the development and application of coding models for the internal representation of patterned sequences, such as letter and number series (Geissler, Klix, & Scheidereiter, 1978; Jones & Zamostny, 1975; Leeuwenberg, 1969; Simon & Kotovsky, 1963; Vitz & Todd, 1969), patterns of lights (Restle, 1970, 1976; Restle & Brown, 1970), temporal patterns (Povel, 1981), and musical patterns (Collard, Vos, & Leeuwenberg, 1981; Deutsch, 1980; Deutsch & Feroe, 1981; Leeuwenberg, 1971; Restle, 1970; Simon & Sumner, 1968). In each application the coding model provides a precise notation to describe the internal representation of serial patterns, usually in the form of a hierarchical code. As such, a code is a static structural description, independent of the processes that may be involved.

The underlying processes of encoding and decoding have been studied less extensively. Encoding corresponds to the process of the acquisition of pattern representations. So far, computer programs that simulate this process have been confined to strictly periodic patterns, for example, from the well-known Thurstone Letter Series Completion Test (Klahr & Wallace, 1970; Kotovsky & Simon, 1973; Simon & Kotovsky, 1963). A major theoretical problem in encoding is the inherent structural ambiguity of serial patterns: Generally, a pattern may be represented in so many different ways that it is neither practical nor feasible to generate all possible representations. But even if that could be done, the problem remains that only the most economical codes should be generated because only these appear to be perceptually relevant (Leeuwenberg, 1971; Simon, 1972). Recently, a solution to this thorny problem has been sought in a set-theoretical order of codes that is derived from the structural information of pattern codes, defined as the set of patterns having the same structural relations (Collard & Buffart, in press). The main subject of this article will be the opposite process, that of decoding, which corresponds to recall or production from memory.

In principle, each of the three stages—encoding, memory representation, and decoding—can be described in a separate model, with the constraint that the encoding model generates codes that belong to the coding model and the decoding model operates on codes from the coding model. Therefore, the coding model plays a central role. As Simon (1972) showed, all coding models proposed so far bear upon the same elementary transformations within a given alphabet, and the central core of the different notations can most conveniently be captured in what is usually called the TRM-model after the following basic operations: T for transpose, R for repeat, and M for mirror (Restle, 1970, 1976; Simon, 1972, 1978).

Notational Preliminaries

Following Greeno and Simon (1974), we will first restrict the analysis to strictly hierarchical codes with operations Tk of the form Tk(x) = (x tk(x)), where tk is a transposition on an ordered set of integers, defined by tk(x) = x + k. Thus, t0(x) = x, t1(x) = x + 1, and so on. Within the TRM-model, R is a shorthand for T0. So, sequence (2 2) is represented by R(2) because R(2) = (2 t0(2)) = (2 2). Similarly, (1 2) is represented by T1(1), because T1(1) = (1 t1(1)) = (1 2). The transpositions act on sequences of numbers in the same way, for example, t1(1 2 2 3) = (t1(1) t1(2) t1(2) t1(3)) = (2 3 3 4). Therefore, the operations may also be used hierarchically, so that (2 2 3 3) can be represented by T1(R(2)), because T1(R(2)) = T1(2 t0(2)) = T1(2 2) = (2 2 t1(2 2)) = (2 2 t1(2) t1(2)) = (2 2 3 3), and, similarly, (3 2 3 2) by R(T-1(3)). The mirror operation, M, and further generalizations will be defined later in this article. For the moment we will focus on some important group-theoretical properties of the transpositions, or transformations as they are generally called.

Figure 1. Structural tree for code T-1(T-3(T1(5))) of sequence (5 6 2 3 4 5 1 2).
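As an illustration (ours, not the article's), this code notation can be expanded mechanically in a few lines of Python; note that this merely spells out which sequence a code denotes, not how it would be produced:

```python
# Illustrative sketch (ours): expanding a TRM-style code with transpositions only.
def t(k, xs):
    """Transposition t_k applied elementwise to a sequence."""
    return [x + k for x in xs]

def T(k, xs):
    """Operation T_k(x) = (x t_k(x)): a sequence followed by its transposition."""
    return xs + t(k, xs)

def R(xs):
    """R is shorthand for T_0 (repetition)."""
    return T(0, xs)

print(T(1, R([2])))                     # T1(R(2))        -> [2, 2, 3, 3]
print(T(-1, T(-3, T(1, [5]))))          # T-1(T-3(T1(5))) -> [5, 6, 2, 3, 4, 5, 1, 2]
```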

The above transpositions constitute a well-known transformation group: the addition modulo K. The theory, however, applies to any set of operations derived from commutative transformation groups that act on sequences. In this context, a transformation on a set X is a one-to-one mapping u: X -> X or, in other words, a permutation of X. The transformations act on sequences of elements of X as if applied to each element individually, that is, u(x1 x2 . . . xn) = (u(x1) u(x2) . . . u(xn)). To each transformation u, an operation U corresponds, defined by U(x) = (x u(x)). In this way the transformations are denoted by lowercase letters, whereas the uppercase letters denote the corresponding operations. Let "o" denote composition of transformations; then we may summarize the basic properties of a commutative transformation group G as follows: for transformations t, u, and v from G,

Associativity: (t o u) o v = t o (u o v);

Commutativity: t o u = u o t;

Inverse: For each t there is a t^-1 such that t o t^-1 = e, where e is the identity transformation. Also recall that inverse distributes over composition: if t = u o v, then t^-1 = u^-1 o v^-1.

It is easy to see that these properties apply to the transpositions within the TRM-model. For instance, the transpositions commute because tk o tl = tk+l = tl+k = tl o tk.

Using the commutative property, we may depict a hierarchical code as a so-called structural tree (Restle, 1970). In this way each strictly hierarchical code corresponds to a strictly nested binary tree (Greeno & Simon, 1974). The special property of such a tree is that the operations within each level are identical (see Figure 1). This equivalence, however, is a static mathematical fact that does not show how codes relate effectively to structural trees. If one assumes that the internal representation of serial patterns is in the form of an abstract hierarchical formula but at the same time claims that the actual behavior in serial pattern processing is based on the corresponding structural tree (Restle, 1970, p. 487), one must conclude that memory code and structural tree should be related by means of appropriate interpretive processing of the code. Without such processing, a code remains just a formula and will not account for characteristics due to its structural tree. Simply writing out the formulas, for example, T-1(T-3(T1(5))) => T-1(T-3(5 6)) => T-1(5 6 2 3) => (5 6 2 3 4 5 1 2), will not do either, because then the generation of the second half of a sequence would be quite different from that of the first half, and therefore incompatible with the apparent symmetry of the structural tree.

Figure 2. Computational diagram of the tree traversal interpreter, converting code T-1(T-3(T1(5))) to sequence (5 6 2 3 4 5 1 2). (Each element, Si, is computed from its immediate predecessor, Si-1, by traversing the tree.)

The Tree Traversal Interpreter

We shall first explain the operation of the tree traversal interpreter informally. The basic idea is illustrated in Figure 2 for sequence S = (5 6 2 3 4 5 1 2) from code T-1(T-3(T1(5))). The computational diagram in Figure 2 is obtained from the structural tree in Figure 1 by surrounding each node with a triangle of transformations, according to Tk(x) = (x tk(x)). Within each triangle, the left-hand arrow denotes the identity, the bottom arrow corresponds to tk, and the right-hand arrow corresponds to the inverse, tk^-1.

The ith element of the sequence, Si, can be computed from its immediate predecessor Si-1 by traversing the tree and applying the operations that are encountered in the following way: straightforward when moving to the right, inversely when moving upward. Thus, T-1(T-3(T1(5))) is converted subsequently to S as follows: 5, t1(5) = 6, t-3(t1^-1(6)) = 2, t1(2) = 3, t-1(t-3^-1(t1^-1(3))) = 4, t1(4) = 5, t-3(t1^-1(5)) = 1, t1(1) = 2. Because composition is associative, we might as well compose the transformations first: 5, t1(5) = 6, (t-3 o t1^-1)(6) = t-4(6) = 2, and so on. In the formal analysis below, we will adopt the intermediate notation t-3 o t1^-1(Si), which leaves either possibility open. Technically speaking, the path through the tree should be called a preorder traversal (see Aho & Ullman, 1974, for instance). Because the leaves (sequence elements) must be produced from left to right anyway, we may call it simply a tree traversal in the present context.
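For readers who want to check the traversal mechanically, the informal procedure above can be sketched in Python (our illustration; the article gives no program). Each element is obtained from its predecessor by applying, per level, the transposition for a 0 => 1 bit change and its inverse for a 1 => 0 change:

```python
def traverse(s0, ks):
    """Tree traversal interpreter (sketch): ks[j-1] is the shift k_j of the
    level-j operation, level 1 innermost. Each S_i comes from S_{i-1} via the
    bit changes of a binary counter: 0 => 1 applies t_k, 1 => 0 its inverse."""
    m = len(ks)
    x = s0
    seq = [x]
    for i in range(1, 2 ** m):
        for j in range(m):
            prev_bit = (i - 1) >> j & 1
            curr_bit = i >> j & 1
            if curr_bit and not prev_bit:
                x += ks[j]              # apply t_{k_j}
            elif prev_bit and not curr_bit:
                x -= ks[j]              # apply the inverse
        seq.append(x)
    return seq

print(traverse(5, [1, -3, -1]))         # code T-1(T-3(T1(5))) -> [5, 6, 2, 3, 4, 5, 1, 2]
```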

The Tree Traversal Theorem

In order to specify the interpreter by a general equation Si = ri(Si-1), we need some notation regarding the position of leaves in binary trees. Let m be the depth of the tree; then the binary representation of leaf i < 2^m will be denoted by bi = bi,m bi,m-1 . . . bi,1, where bi,j is either 0 or 1. In terms of binary trees, this means that the left branches are labeled with a 0 and the right branches with a 1. The binary representation of leaf i can be seen as the path from the top of the tree to leaf i. For instance, in a binary tree of depth 3, the leaves are denoted from left to right as follows: S0 by 000, S1 by 001, S2 by 010, and so on. Now the tree traversal equation reads:


THEOREM. For operations Um, Um-1, . . . , U1, based on a commutative transformation group U, let S = (S0, S1, . . . , S2^m-1) be the sequence coded by Um(Um-1(. . . U1(S0) . . .)). Then

Si = ri(Si-1), where
ri = ri,m o ri,m-1 o . . . o ri,1;
ri,j = uj if bi-1,j = 0 and bi,j = 1,
     = uj^-1 if bi-1,j = 1 and bi,j = 0,
     = e if bi-1,j = bi,j.

PROOF. From Greeno and Simon's (1974) Equation 19, we have

Si = pi(S0), where
pi = pi,m o pi,m-1 o . . . o pi,1;
pi,j = uj if bi,j = 1,
     = e if bi,j = 0.

Replacing S0 by pi-1^-1(Si-1), we can rewrite this equation by associativity and commutativity of composition in U and by distributivity of inverse:

Si = (pi-1,m^-1 o pi,m) o (pi-1,m-1^-1 o pi,m-1) o . . . o (pi-1,1^-1 o pi,1)(Si-1).

Denoting (pi-1,j^-1 o pi,j) by ri,j, we have

if bi-1,j = 0 and bi,j = 1, then ri,j = pi,j = uj;
if bi-1,j = 1 and bi,j = 0, then ri,j = pi-1,j^-1 = uj^-1;
if bi-1,j = bi,j, then either ri,j = e o e = e or ri,j = pi-1,j^-1 o pi,j = uj^-1 o uj = e.

The Interpreter

The tree traversal theorem may be implemented in several different ways, depending on the machinery available. In essence, an m-bit binary counter has to be mapped on the code: adding one to the binary representation of i - 1, bi-1, corresponds to traversing the tree from position i - 1 to i, whereas a bit change 0 => 1 of the jth bit dictates the application of the transformation at the jth level, uj, and a change 1 => 0 dictates the application of the inverse, uj^-1. For the example of Figure 2, T-1(T-3(T1(5))), we have U3 = T-1, U2 = T-3, and U1 = T1. So u3 = t-1, u2 = t-3, and u1 = t1. Table 1 shows once more the derivation of sequence (5 6 2 3 4 5 1 2), using a binary counter.

Table 1
Code T-1(T-3(T1(5))) Converted to Sequence (5 6 2 3 4 5 1 2)

Binary counter        Transformation            Current
(T-1  T-3  T1)        to be applied             element
 0    0    0          --                        S0 = 5
 0    0    1          t1                        S1 = 6
 0    1    0          t-3 o t1^-1               S2 = 2
 0    1    1          t1                        S3 = 3
 1    0    0          t-1 o t-3^-1 o t1^-1      S4 = 4
 1    0    1          t1                        S5 = 5
 1    1    0          t-3 o t1^-1               S6 = 1
 1    1    1          t1                        S7 = 2
 0    0    0          t-1^-1 o t-3^-1 o t1^-1   S0 = 5

Note. A bit change 0 => 1 of the jth bit dictates the jth transformation to be applied; a bit change 1 => 0, the inverse. The last row shows the wrap-around of the counter when the sequence repeats.

Properties

The interpreter employs both commutativity and inverse of the transformations. Therefore, the transformations must constitute a commutative group. Using an argument similar to that of Greeno and Simon (1974, p. 195), we may prove that the tree traversal interpreter is the most efficient algorithm to compute Si from Si-1. (Leaving out a uj or a uj^-1 in the tree traversal theorem would change the value of some Si. Therefore, no shortcut is possible in computing Si from Si-1.)

The number of transformations to be applied equals the number of bit changes in the successive increments of bi. Let s denote the length of the sequence, that is, s = 2^m; then the total number of bit changes equals 2(s - 1). Figure 3 shows the computational costs per element.
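This count is easy to verify with a short sketch (ours, not the article's), taking the counter through one full cycle so that the wrap-around flips for the repeating pattern are included, as in the last row of Table 1:

```python
def total_bit_changes(m):
    """Bit flips of an m-bit binary counter over one full cycle, wrap-around
    included and masked to m bits; our sketch for checking the 2(s - 1) total."""
    s = 2 ** m
    return sum(bin(((i - 1) ^ i) & (s - 1)).count("1") for i in range(1, s + 1))

for m in (2, 3, 4):
    s = 2 ** m
    print(total_bit_changes(m), 2 * (s - 1))   # the two numbers agree for each m
```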


Figure 3. Performance profile of the tree traversal interpreter for code T-1(T-3(T1(5))). (Horizontal axis: sequence elements S0 through S7.)

From Table 1 it is easily seen that the "housekeeping operations" on the binary counter yield the same profile as the transformations do. So no matter what assumptions are made about the costs of elementary processing steps, a jag-shaped performance profile results that corresponds to tree traversing. There are no special memory requirements for storing sequence elements, except possibly one location for the current element.

On-Line Production

The tree traversal interpreter can also be seen as emitting a series of transformations rather than sequence elements. For the example T-1(T-3(T1( ))), we obtain by tree traversing t1, t-3 o t1^-1, t1, t-1 o t-3^-1 o t1^-1, t1, t-3 o t1^-1, t1. Thus, a hierarchical code that applies to the elements of the sequence may be decoded to a series of possibly composite transformations that correspond to the differences between subsequent elements. Such a way of on-line operating may prove valuable if the absolute value of sequence elements is hard to represent internally. On-line production cannot be performed by the interpreters reviewed below.
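A sketch of this on-line reading (our illustration): emit, at each step, the net transposition composed from the bit changes; for transpositions, composing simply means adding the shifts:

```python
def online_deltas(ks):
    """Emit the composed transposition (net shift) per step instead of the
    elements themselves; ks[j-1] is k_j of level j, level 1 innermost (sketch)."""
    m = len(ks)
    out = []
    for i in range(1, 2 ** m):
        d = 0
        for j in range(m):
            prev_bit = (i - 1) >> j & 1
            curr_bit = i >> j & 1
            if curr_bit and not prev_bit:
                d += ks[j]              # 0 => 1: compose with t_{k_j}
            elif prev_bit and not curr_bit:
                d -= ks[j]              # 1 => 0: compose with the inverse
        out.append(d)
    return out

print(online_deltas([1, -3, -1]))       # -> [1, -4, 1, 1, 1, -4, 1]
```

The second entry, -4, is exactly the composite t-3 o t1^-1 of the text.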

Comparison

Greeno and Simon (1974) proposed three different models for sequence production: the push-down, the recompute, and the doubling interpreters. The first one needs a push-down stack to store elements in the sequence prior to the last one produced. It uses the binary counter (see Table 1) to recover elements stored in the push-down stack in a similar way as the tree traversal interpreter employs the binary counter to apply the inverse transformations. Thus, it can be seen as a semi-tree-traversal because upward movements in the tree are performed by popping the push-down stack. See Figure 4.

Figure 4. Computational diagram of the push-down interpreter, converting code T-1(T-3(T1(5))) to sequence (5 6 2 3 4 5 1 2). (Each element, Si, is computed from some previously computed element, stored in the push-down stack.)

Figure 5. Computational diagram of the recompute interpreter, converting code T-1(T-3(T1(5))) to sequence (5 6 2 3 4 5 1 2). (Each element, Si, is computed from the first element, S0, by descending the tree.)
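The push-down interpreter can be sketched as follows (our illustration; recursion depth stands in for the push-down stack). Note that every element except S0 results from applying exactly one transformation to a previously stored element:

```python
def pushdown(s0, ks):
    """Push-down interpreter (sketch): ks[j-1] is the shift k_j of level j,
    level 1 innermost. The recursion stack plays the role of the push-down
    store; each right branch applies a single transformation to a stored value."""
    out = []
    def gen(x, level):
        if level == 0:
            out.append(x)                      # a leaf: emit the element
        else:
            gen(x, level - 1)                  # left subtree: x stays stacked
            gen(x + ks[level - 1], level - 1)  # right subtree: one transformation
    gen(s0, len(ks))
    return out

print(pushdown(5, [1, -3, -1]))         # -> [5, 6, 2, 3, 4, 5, 1, 2]
```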

The recompute interpreter computes each element, Si, from the first element, S0. Instead of bit changes (see Table 1), the absolute values of the bits are used to apply the appropriate transformations. Therefore, it can be seen as a tree descender. See Figure 5.
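In the same style (our sketch), the recompute interpreter applies t_{k_j} for every bit j that is set in i, descending the tree anew for each element:

```python
def recompute(s0, ks):
    """Recompute interpreter (sketch): S_i is computed from S_0 by applying
    t_{k_j} for every set bit j of i; ks[j-1] is k_j, level 1 innermost."""
    m = len(ks)
    return [s0 + sum(ks[j] for j in range(m) if i >> j & 1)
            for i in range(2 ** m)]

print(recompute(5, [1, -3, -1]))        # -> [5, 6, 2, 3, 4, 5, 1, 2]
```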

The doubling interpreter evaluates nested formulas in the usual way, that is, inside to outside, and therefore does not incorporate a structural tree, as has been shown already in the introduction. Moreover, doubling requires a short-term memory that must be capable of storing half the sequence. Hence, we will examine tree interpreters only.

Comparison of Tree Interpreters

In Table 2 some properties of the three tree interpreters are summarized. The tree traversal interpreter fits the special group-theoretical properties underlying the TRM-model exactly, whereas the push-down interpreter also applies to systems in which the transformations do not have inverses. The recompute interpreter computes the elements in a top-down fashion, or vice versa, and therefore, the transformations do not need to commute, a point that we will return to later.

Table 2
Comparison of Three Tree Interpreters for Strictly Nested Binary Trees

Interpreter      Transformations   Total number     Memory         Housekeeping operations
                 must form         to be applied    requirements   Pushes and pops   Binary counter
Tree traversal   Commutative       2(s - 1)         1              --                2(s - 1)
                 group
Push down        Commutative       s - 1            1/2 log s      2(s - 1)          2(s - 1)
                 semigroup
Recompute        Any               1/2 s log s      1              --                1/2 s log s

Note. s = sequence length.

Table 3
Comparison of Tree Interpreters for Repeating Sequence (2 3 2 3 5 6 5 6) From Code T3(R(T1(2)))

Interpreter      Sequence
Tree traversal   2 - 3 - - 2 - 3 - - - 5 - 6 - - 5 - 6 - - - 2 - 3 - - 2 . . .
Push down        2 - 3 - 2 - 3 - 5 - 6 - 5 - 6 2 - 3 - 2 . . .
Recompute        2 - 3 - 2 - - 3 - 5 - - 6 - - 5 - - - 6 2 - 3 - 2 . . .

Note. Each short line denotes a transformation to be applied.

The product of computational and memory costs of the tree traversal interpreter is linearly proportional to sequence length s, denoted O(s), whereas the other two tree interpreters require O(s log s). Unlike the recompute interpreter, the push-down interpreter may produce a similar performance profile for the popping of the push-down stack or for the housekeeping operations on the binary counter (Simon, 1978) as does the present tree traversal interpreter. The tree traversal profile for transformations, however (see Table 3), does not apply to the push-down interpreter because it computes each element of the sequence, except S0, by applying just one transformation to some previously stored element. In effect, the tree traversal interpreter is the one that comes closest to the basic idea underlying structural trees, namely, that the difficulty of any location in the sequence equals the level of transition immediately preceding it (Restle, 1970).

Generalizations

The main generalizations are from binary to n-ary trees and from strictly nested to not strictly nested trees. Unfortunately, Greeno and Simon (1974) did not provide a complete analysis for these generalizations. In fact, such an analysis should involve automata-theoretical issues that are beyond the scope of this paper too. Without elaborating all details, it is not difficult, however, to predict the behavior of the different interpreters for generalizations. The picture that emerges is that the recompute interpreter becomes less efficient when operating on n-ary trees, whereas the push-down interpreter demands a somewhat more sophisticated scheme for manipulating the push-down store when applied to not strictly hierarchical codes. The one that remains stable under both generalizations is the tree traversal interpreter. As indicated above, we will not elaborate at this time on all automata-theoretical details, but merely indicate the most characteristic differences.

Following Restle (1970), codes representable by n-ary structural trees are obtained by generalizing operations Tk: Tk(x) = (x tk(x)) to

Tk^n(x) = (x Tk^(n-1)(tk(x))), with Tk^0 = e.

In this way sequence (2 3 4 5) may be represented by T1^3(2). In generating the sequence from the code, we need an n-ary counter instead of a binary counter. The details are straightforward (see Greeno & Simon, 1974). What changes in Table 2 with respect to efficiency is the factor log s. For the push-down interpreter, the maximum size of the push-down stack remains proportional to the depth of the tree, whereas for the recompute interpreter the number of housekeeping operations and transformations to be applied may increase to (s - 1)(s - 2)/2, that is, O(s^2). "Recomputing" T1^n(1), for instance, yields 1, t1(1), t1(t1(1)), t1(t1(t1(1))), and so on. All other properties in Table 2 remain unchanged.
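The n-ary generalization can be sketched directly from the recursive definition (our illustration, with transpositions on integers):

```python
def t(k, xs):
    """Transposition t_k applied elementwise."""
    return [x + k for x in xs]

def T_pow(k, n, xs):
    """n-ary operation (sketch): T_k^n(x) = (x T_k^(n-1)(t_k(x))), T_k^0 = e."""
    if n == 0:
        return xs
    return xs + T_pow(k, n - 1, t(k, xs))

print(T_pow(1, 3, [2]))                 # T1^3(2) -> [2, 3, 4, 5]
```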

Codes that are not strictly hierarchical can be represented by concatenation. For instance, (1 1 2 3 2 2 3 4) may be represented by T1(R(1), T1(2)). Obviously, the number of memory locations needed for interpretive processing of not strictly hierarchical codes equals, at least, the number of independent initial elements. Again, the details of the counter are not very much more complicated (see Greeno & Simon, 1974). The push-down interpreter, however, needs a somewhat different scheme for manipulating the push-down store. The problem is that a push-down store is a "first in, last out" device. So, if a sequence is made up hierarchically of two different patterns and some elements of the first pattern are kept in the push-down stack and the appropriate elements of the second pattern are pushed on top of these, then the first-pattern elements cannot be released without losing the elements of the second pattern. It turns out that one needs either a rather complex scheme for transmitting information from the push-down stack to the memory locations for initial elements, and vice versa, or several different push-down stacks.

Mirroring

We have postponed a precise account of mirroring, denoted by m, because its interpretation is somewhat more complicated than that of transposition. An example may help to clarify the problem of interpreting the mirror operation. Suppose we have an ordered set of six elements, say (1 2 3 4 5 6); then m(1) = 6, m(2) = 5, m(3) = 4, and so on, and the sequence (2 3 5 4) can be represented by M(T1(2)), because M(T1(2)) = M(2 t1(2)) = M(2 3) = (2 3 m(2 3)) = (2 3 m(2) m(3)) = (2 3 5 4). If this code is depicted in a tree (see Figure 6), a problem arises because the right-hand T1 applies inversely.

The tree of T1(M(2)), however, fits correctly (2 5 3 4), but straightforward evaluation of T1(M(2)) via T1(2 5) yields (2 5 3 6). Greeno and Simon (1974) concluded that M should be viewed as an operation that affects the way the other operations work: Transposition has to be applied in a negative direction if and only if it is preceded by an odd number of reflections, depending on the type of interpretive process. This idea, however, does not seem easy to implement. (Also, it does not apply to the doubling interpreter.) Therefore, we propose another way to handle reflections that does not demand any special considerations for interpretive processing.

The problem is that mirroring, as used above, does not commute with transposition, that is, m o tk ≠ tk o m. In order to let the transformations commute, a double representation for sequence elements is needed. Suppose we have an ordered set E of six elements, (a b c d e f); then E will be represented by counting from the left, EL = (1L 2L 3L 4L 5L 6L), and by ER = (6R 5R 4R 3R 2R 1R), counting from the right. Thus Element b, for example, is both the second element from the left and the fifth element from the right. Within each representation of E, transpositions apply as usual, whereas mirroring has the function of changing from representation. For instance, t1(2R) = 3R and t1(2L) = 3L, whereas m(2R) = 2L and m(3L) = 3R. If E is stored in memory as a list with two-way access, or two lists with one-way access, the elements of the double representation can be viewed as retrieval operations. Thus, for the examples of Figure 6, no arithmetic complement needs to be taken. See Figure 7.

In this way codes, trees, and interpretive processing fit perfectly again. Note that double representation is more powerful: For example, sequence (2 1 6 5 3 2 5 4) may be represented hierarchically by T1(T-1(2L), T1(1R)), a code that could not be expressed in the earlier formalism.

Figure 6. Two examples of mirroring: For code M(T1(2)) the right-hand T1 in the structural tree must apply inversely, whereas straightforward evaluation of code T1(M(2)) yields (2 5 3 6).

702 RENE COLLARD AND DIRK-JAN POVEL

Figure 7. A double representation of elements solves the mirroring problem.

The above construction applies to any commutative transformation group T. Let Vh denote the set of subscripted elements vh, h being L or R. Moreover, let h̄ denote L if h = R, and conversely. Then transformations tk and transformations with reflection mk are defined by tk(vh) = (tk(v))h and mk(vh) = (tk(v))h̄, respectively. In particular, if the tks are transpositions modulo K, we have tk(vh) = (v + k)h and mk(vh) = (v + k)h̄, where + is addition modulo K. In fact, Greeno and Simon (1974) provided a similar group-theoretical definition. Basically, they employed an inverted copy, Ē, of an ordered set of elements E in such a way that mirroring both changes copy and takes the inverse, whereas transposition operates straightforwardly in E, and inversely in Ē. Some typical applications of mirroring will be given below.
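A minimal sketch of this definition (in Python, with the hypothetical names t and mr standing for tk and mk, and elements encoded as pairs (v, h)) makes the commutativity easy to verify:

```python
K = 6  # transpositions modulo K

def t(k, x):
    """tk(vh) = (v + k)h, with addition modulo K on the counts 1..K."""
    v, h = x
    return ((v + k - 1) % K + 1, h)

def mr(k, x):
    """mk(vh) = (v + k)h-bar: transpose and change representation."""
    v, h = x
    return ((v + k - 1) % K + 1, 'R' if h == 'L' else 'L')

# The tj and mk together form a commutative group, so codes, trees,
# and interpreters fit together without special-case processing.
x = (2, 'L')
assert t(3, mr(1, x)) == mr(1, t(3, x))
```

Any pair of these transformations can be applied in either order with the same result, which is the property the single-representation mirror failed to provide.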

Application

The main area of experimental research where the present theory may prove valuable is serial pattern production. In experiments, however, one usually has to account for all the processes involved. Kotovsky and Simon (1973) analyzed errors in the Thurstone Letter Series Completion Test and concluded that some errors are due to failures in the acquisition of pattern concepts whereas others are caused by faulty extrapolation of correct pattern descriptions. Also, in Restle's (1970) experiments on serial pattern learning, where college students tried to anticipate patterns of lights correctly, the jag-shaped error profile may not have been due to learning only. Because of this potential difficulty

in assigning errors, latency might be a variable that lends itself better to comparing different production models. Simon (1978) describes an experiment in which subjects were trained to apply each of the three interpretive models of Greeno and Simon (1974). For instance, a subject was asked to simulate the doubling interpreter and to decode M(R(T(1))), yielding the number sequence (1 2 1 2 6 5 6 5). Thereafter, the subject was asked to simulate the recompute interpreter and then the push-down interpreter. The latencies for performing the calculations as observed in the experiment qualitatively fit the theoretical performance curves very well, though considerable differences were found between subjects. Thus, the experiment shows that subjects can learn to mimic these particular methods to produce sequences. A somewhat different question is what method subjects actually use in production tasks while whistling a tune, imitating a rhythm, or tapping a "finger pattern."
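The calculation that such a subject simulates may be sketched as follows (Python; the operations T, R, and M and the complement 7 − x for a six-element alphabet follow the example, but the implementation itself is only illustrative):

```python
def T(seq):   # transpose: X followed by X moved up one step
    return seq + [x + 1 for x in seq]

def R(seq):   # repeat: X followed by X
    return seq + seq

def M(seq):   # mirror: X followed by its complement (7 - x for K = 6)
    return seq + [7 - x for x in seq]

# The doubling interpreter expands the code level by level, holding each
# intermediate sequence in short-term memory before producing the result.
print(M(R(T([1]))))   # [1, 2, 1, 2, 6, 5, 6, 5]
```

Each call doubles the length of the stored sequence, which is why the doubling interpreter's memory load grows with the level of the code.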

Tapping the Memory Code

Povel and Collard (in press) studied subjects who tapped finger patterns—sequences of taps made with different fingers. A pattern to be tapped was presented as a sequence of numbers, for example, (1 2 3 2 3 4), where the numbers 1 through 4 correspond with the index, middle, ring, and little fingers, respectively. After some test trials, subjects tapped the pattern as fast as possible in a repeated fashion. Apart from the possible effect of the memory code and the interpretive processing, the motor demands to produce the actual finger transitions might, of course, also determine the latencies. In order to disentangle these potential factors, sets of stimuli rather than single stimuli were used. Each set consisted of a number of cyclic permutations of one base sequence, for example, (3 2 1 2 3 4), (2 1 2 3 4 3), and (1 2 3 4 3 2). Such a set has two typical properties: First, when the sequences are tapped repeatedly, the motor demands will be equal for all. Second, if one sequence within the set can be coded hierarchically, so can the other sequences. In the above example the last three elements mirror the first three elements. The sequences can be coded by M(T-1²(3)), M(2 1 2), and M(T1²(1)), respectively. Therefore, the timing profile predicted from the tree traversal interpreter for all three patterns has the form —S1-S2-S3—S4-S5-S6, where "—" indicates a long interval and "-" a short interval, in a way similar to that in Table 3.
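Assuming this reading of the codes (M applied to three-element runs in a four-finger alphabet, with complement m(x) = 5 − x), the construction of such a stimulus set can be sketched as follows; run, M, and is_cyclic_shift are hypothetical helper names:

```python
def run(k, start, n):
    """A run of n steps of size k from start, e.g. T-1 applied twice to 3."""
    return [start + k * i for i in range(n + 1)]

def M(seq):
    """Mirror over the four-finger alphabet: m(x) = 5 - x."""
    return seq + [5 - x for x in seq]

base = M(run(-1, 3, 2))   # M of the run (3 2 1) -> [3, 2, 1, 2, 3, 4]
p2   = M([2, 1, 2])       # M(2 1 2)             -> [2, 1, 2, 3, 4, 3]
p3   = M(run(1, 1, 2))    # M of the run (1 2 3) -> [1, 2, 3, 4, 3, 2]

def is_cyclic_shift(a, b):
    """True if b is a cyclic permutation of a (equal motor demands)."""
    return len(a) == len(b) and any(b == a[i:] + a[:i] for i in range(len(a)))

assert is_cyclic_shift(base, p2) and is_cyclic_shift(base, p3)
```

All three patterns decode from structurally identical codes yet are cyclic shifts of one another, which is precisely what lets the design separate interpretive processing from motor demands.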

Such a set of cyclic permutations provides an interesting opportunity to study the production process: If the latencies were determined by the motor demands to produce the finger transitions, one would expect to find identical profiles when the observed timing profiles are displayed in a diagram such that the finger transitions are aligned. Conversely, if the latencies were mainly determined by the interpretive processing of the memory code, we would expect to find identical profiles when they are displayed with the codes aligned. For each of the 20 subjects in the experiment, the mean latencies between the elements were calculated for five repetitions. Next, this profile was transformed into ranks numbered 1 through 6. The profiles, shown in Figure 8, present mean ranks obtained by averaging over subjects.

Figure 8 clearly shows that in the finger-transitions-aligned display the profiles are quite dissimilar, whereas in the codes-aligned display the profiles are almost identical; moreover, the general form follows the prediction specified by the tree traversal interpreter.

It should be noted here that in case a pattern is ambiguous, further analysis is appropriate (Collard & Leeuwenberg, 1981; Restle, 1976, 1979). Here, the last pattern of the above example, (1 2 3 4 3 2), may alternatively be coded by T1³(1), T-1(3), that is, as a run of four elements followed by a group of two elements. Indeed, inspection of the individual profiles revealed that the timing profiles of some subjects correspond to this latter code rather than to the former code. Such an analysis (Povel & Collard, in press),

Figure 8. Latency profiles for the three patterns—(3 2 1 2 3 4), (2 1 2 3 4 3), and (1 2 3 4 3 2)—indicated with a straight line, a dashed line, and a dotted-dashed line, respectively. (Left panel: finger transitions aligned, with finger number on the abscissa; right panel: codes aligned, with element numbers S0 through S5 on the abscissa.)

Figure 9. Two different applications of mirroring. Each finger has a complementary finger in the same hand: For instance, c(1R) = 5R. In addition, each finger of the right hand can be mirrored to the left hand, and vice versa: For instance, m(1R) = 1L.

which takes structural ambiguity into account, renders more pronounced profiles than those given in Figure 8.

A Handy Model

The relation between finger patterns and number sequences depends on the fact that adjacent fingers are denoted by consecutive numbers. If the numbers are assigned to the fingers out of order, the serial instructions appear almost impossible to follow. Similarly, the transformations used in the codes can be conceived of as descriptions of finger transitions. For example, t1 may mean "tap with the next finger." This conception is compatible with the subjective experience found when repeatedly producing a finger pattern: Once the movement has started, the actual tapping seems to be done in relational terms, and one no longer realizes what fingers are actually involved. This may correspond to the property of the tree traversal interpreter that we have described as on-line production.

The model for the internal representation of the finger patterns used in the experiment is part of the model given in Figure 9. With the help of the transformations given in Figure 9, patterns tapped with both hands can also be adequately represented. For an experiment involving both hands, see Rosenbaum, Kenny, and Derr (in press). The model of Figure 9 provides a good example to illustrate the group-theoretical definition

of the mirror operation given above. Within each hand there is a transformation of taking the complement, denoted by c. Consequently, each finger is represented twice. For instance, the right-hand little finger is represented both as the first finger counting from the right and as the fifth finger counting from the left. In addition, there is a mapping from hand to hand that accounts for mirroring hands, denoted by m. Here, the double representation of fingers refers, one to one, to the fingers of different hands. Thus, the right-hand little finger, for instance, can be mirrored to the left-hand little finger and vice versa. The basic point of this illustration is that the nature of the mirroring transformation depends on the type of patterns involved. Essentially, mirroring has the function of changing copy, whereas the copies may or may not coincide.
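The two mirrorings may be sketched as follows (Python; the pair encoding of fingers and the names c and m are illustrative assumptions, not the paper's notation):

```python
# A finger is a pair (hand, v), with v counting 1..5 from the outside of
# that hand, so ('R', 1) denotes the right-hand little finger.
def c(finger):
    """Complement within the same hand: the other counting of the fingers."""
    hand, v = finger
    return (hand, 6 - v)

def m(finger):
    """Mirror to the corresponding finger of the other hand."""
    hand, v = finger
    return ('L' if hand == 'R' else 'R', v)

little_right = ('R', 1)
assert c(c(little_right)) == little_right   # c is its own inverse
assert m(little_right) == ('L', 1)          # mirrored to the left-hand little finger
```

Both c and m are involutions, but only m changes hands; which of the two is the appropriate "mirror" depends on the type of patterns involved, as noted above.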

Extension

From a mathematical point of view, one might ask the question: How is the theory of serial pattern production affected if the transformations used in the codes do not commute? The answer to that question leads to the recent work of Deutsch and Feroe (1981). In their article on the structural aspects of music perception, Deutsch and Feroe propose a coding model that, as they state, "differs from earlier ones in its basic architecture" (p. 504). First, we will see that in case the transformations commute, the framework underlying the classical TRM-model can be converted to their coding model. Second, this conversion is no longer possible if the transformations do not commute: Their approach is the one that relates consistently to structural trees. Third, the tree traversal interpreter as well as the other interpreters apply equally well in this extended framework.

Converting the Memory Code

We will consider only those aspects of the Deutsch and Feroe (1981) model that are essential for analyzing the basic architecture. Essentially, they converted the framework underlying the TRM-model while retaining the same elementary transformations. The conversion will be seen more easily if we rewrite the symbols Tk as the corresponding string of transformations, [e tk] (Greeno & Simon, 1974). Recall that e denotes the identity transformation. So, instead of T1(2) we will write [e t1](2), because [e t1](2) = (e(2) t1(2)) = (2 3). In hierarchical codes, each transformation in turn is applied to the whole string of sequence elements: Thus, T3(T1(1)) can be rewritten as [e t3]([e t1](1)) = [e t3](1 2) = (e(1 2) t3(1 2)) = (1 2 4 5). We will further refer to this type of application as the classical way.

It is also possible to apply the transformations to sequence elements the other way around; that is, the string of transformations is first applied to the first sequence element, then to the second element, and so on. Let us denote this type of application by writing the sequence elements at the left side of the formulas, for example, (1 4)T1 = (1 4)[e t1] = ((1)[e t1] (4)[e t1]) = (1 2 4 5). This type of application is called the top-down way because the higher level elements are generated first, which in turn serve to generate the lower level elements (Deutsch & Feroe, 1981). For instance, ((1)T3)T1 can be rewritten as ((1)[e t3])[e t1] = (1 4)[e t1] = (1 2 4 5). Note that in the top-down application the first and third elements are generated before the second and fourth.

The relation between classical codes and top-down codes is fairly straightforward: The codes have only to be converted such that the highest level becomes the lowest level, the second highest the second lowest, and so on. For the above example, T3(T1(1)) can be converted to ((1)T3)T1 and vice versa, while both represent the sequence (1 2 4 5). Using the commutative property of the transformations, the corresponding theorem for converting hierarchical codes can easily be proved. Why use a converted notation if the old one is equally good? The reason is that the model of Deutsch and Feroe (1981) has a feature that cannot be adequately captured in the classical framework.
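The two ways of applying a code may be sketched as follows (Python; the functions classical and top_down are our illustrative names, and a code is written as a list of transformation strings, highest level first):

```python
def classical(code, seq):
    """Apply each string to the whole sequence, innermost level first,
    as in T3(T1(1)) = [e t3]([e t1](1))."""
    for fs in reversed(code):
        seq = [f(x) for f in fs for x in seq]
    return seq

def top_down(seq, code):
    """Apply each string element by element, highest level first,
    as in ((1)T3)T1: higher level elements are generated first."""
    for fs in code:
        seq = [f(x) for x in seq for f in fs]
    return seq

e = lambda x: x                    # identity transformation
t = lambda k: (lambda x: x + k)    # transposition tk

# T3(T1(1)) in classical form equals ((1)T3)T1 in top-down form,
# because transpositions commute.
code = [[e, t(3)], [e, t(1)]]      # [e t3] above [e t1]
print(classical(code, [1]))        # [1, 2, 4, 5]
print(top_down([1], code))         # [1, 2, 4, 5]
```

The level reversal of the conversion theorem is absorbed by the `reversed` call: the same code list serves both readings as long as the transformations commute.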

Where the Routes Diverge

In their model, Deutsch and Feroe allowed transformations to operate in different alphabets occurring at different levels of a hierarchy. This feature of representation nicely takes into account that in music perception, different alphabets, familiar through prior learning, may be simultaneously involved. The important thing here is that if different alphabets are allowed in one and the same representation, the coding model is, in practice, noncommutative and cannot therefore be incorporated in the classical framework underlying the TRM-model. A typical, though somewhat artificial, example may help to clarify the two possible routes. Suppose k is a transformation that squares its argument, for example, k(2) = 4, k(3) = 9, and so on. Thus, K(2) = (2 4) and K(3) = (3 9). Note that k does not commute with transpositions, for example, t1(k(2)) = 5 whereas k(t1(2)) = 9. Now, the two ways of defining hierarchical codes appear to be no longer compatible: T1(K(2)) = (2 4 3 5) whereas, conversely, ((2)T1)K = (2 4 3 9). So, if the transformations fail to commute, the conversion between classical codes and top-down codes is no longer possible.
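The divergence can be checked directly in a short sketch (Python; classical applies each level of a code to the whole string, top_down applies it element by element; all names are illustrative):

```python
e  = lambda x: x          # identity transformation
t1 = lambda x: x + 1      # transposition by one step
k  = lambda x: x * x      # squaring: does not commute with t1

def classical(code, seq):          # each level applied to the whole string
    for fs in reversed(code):
        seq = [f(x) for f in fs for x in seq]
    return seq

def top_down(seq, code):           # each level applied element by element
    for fs in code:
        seq = [f(x) for x in seq for f in fs]
    return seq

code = [[e, t1], [e, k]]           # T1 above K
print(classical(code, [2]))        # T1(K(2))  -> [2, 4, 3, 5]
print(top_down([2], code))         # ((2)T1)K  -> [2, 4, 3, 9]
```

With the squaring transformation in the code, the two readings produce different sequences, so no conversion theorem can hold for noncommutative transformations.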

The left-hand tree of Figure 6 already provided an example showing that structural trees do not apply to classical codes if the transformations do not commute; however, as will be clear now, they do apply precisely to codes defined in the top-down fashion. So, we conclude that the top-down way of defining hierarchical codes fits exactly the concept of a structural tree, whereas the classical way is appropriate only if the transformations commute. As the basic idea proposed in this


article is that a structural tree actually corresponds to the interpretive process that operates on a hierarchical code, we will examine more closely the way in which Deutsch and Feroe (1981) handled the production of pitch sequences from memory.

Tree Traversal Again

In their article, Deutsch and Feroe (1981) employed the doubling interpreter to write out the codes in a top-down fashion. Furthermore, they assumed that intermediate results of the decoding remain stored in a fully redundant tree, similar to the one given in Figure 5. The main advantages of these assumptions are that errors made at a high level will be reflected at lower levels, because the elements are generated in a top-down fashion, and that the higher level elements will be recalled better, because they are more often represented. This is in accordance with musical intuitions and assumptions generally made by music theorists, as Deutsch and Feroe (1981) state. In fact, the same two advantages apply to the tree traversal interpreter, as can be seen in Figure 2: Errors made while traversing the higher levels will be carried over to the lower levels, and the higher the element in the tree, the more often the element will be encountered.

A final point to be considered is that Deutsch and Feroe's assumptions imply that a sequence of pitches is represented internally, during production, in a completely elaborated network of notes, as if the sequence were a visual display where all information is given in parallel. The tree traversal interpreter, however, assumes that the pitches are produced in an on-line fashion from the knowledge of a concise memory code. Even if the hierarchical structure of a serial pattern is indeed represented internally, during production, as a redundant network of elements, the tree traversal interpreter may be seen as assembling such a network, not in a parallel top-down fashion like the doubling interpreter but on-line, in a serial fashion. These extensions to other sorts of coding models need further elaboration and empirical testing. A point that certainly deserves more attention is the different possible implementations of the production models, along with refined assumptions about human processing of structural information.

Discussion

In this article a production model has been presented that fits exactly the relation between hierarchical memory codes and structural trees. According to the tree traversal interpreter proposed here, a sequence that is represented internally by a hierarchical code is produced by traversing the corresponding structural tree. The comparative study of different possible production models can be detached from the way codes are defined (Greeno & Simon, 1974). For instance, in case the transformations commute, hierarchical codes may also be defined in terms of the coding model of Deutsch and Feroe (1981), which appears to be appropriate for extensions as well. Generally, the importance of commutative systems, like the classical TRM-model, lies in the fact that codes are invariant under transformations, that is, t(C(x)) = C(t(x)). Thus, when a pattern is transformed, the structure remains invariant while the parameters vary, a point also emphasized by Restle (1979) for motion patterns and by Collard and Buffart (in press) for the encoding of serial patterns.
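The invariance t(C(x)) = C(t(x)) may be illustrated as follows (Python sketch; decode is a hypothetical classical evaluator for commutative codes built from transpositions):

```python
t = lambda k: (lambda x: x + k)   # transposition tk

def decode(code, start):
    """Classical evaluation of a commutative hierarchical code,
    e.g. [t(3), t(1)] for T3(T1(x)), innermost level applied first."""
    seq = [start]
    for tk in reversed(code):
        seq = seq + [tk(x) for x in seq]
    return seq

code = [t(3), t(1)]   # the code C, standing for T3(T1(x))

# Transposing the pattern transposes only the parameter x:
# the code C, i.e., the structure, is invariant.
assert decode(code, 6) == [x + 5 for x in decode(code, 1)]
```

Shifting the starting element by any amount shifts every produced element by the same amount, so the hierarchical structure of the pattern is carried unchanged across transpositions.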

The experiments discussed in this article refer to particular applications in serial pattern research. A production model that has proven to be valid in one application is not necessarily equally valid in another. Therefore, empirical testing is required for each specific application. The main difficulty in testing models for serial pattern production is to effectively disentangle all the processes involved. In the simple experiment on finger tapping referred to earlier, a relatively complicated design was required to separate experimentally the process of decoding from the process of execution. In principle, both processes may affect performance measures such as latencies and errors. When more complex tasks like typing, speaking, and piano playing are studied (Shaffer, 1982; Sternberg, Monsell, Knoll, & Wright, 1978), the problem becomes even more intricate. For, in such tasks, the interaction between the structural characteristics of the memory representation and the physical motor demands of the execution process is likely to be much more complicated.

Another instance of tangled processes can be observed when someone listens to a piece of music: On the one hand, internal representations are built up or activated; on the other hand, inferences are drawn from those representations. With respect to the inferences, it may be that the listener forms his or her expectations in an on-line fashion on the basis of the preceding tones and the structural information available.

References

Aho, A. V., & Ullman, J. D. The design and analysis of computer algorithms. Reading, Mass.: Addison-Wesley, 1974.

Collard, R. F. A., & Buffart, H. F. Minimization of structural information: A set-theoretical approach. Pattern Recognition, in press.

Collard, R. F. A., & Leeuwenberg, E. L. J. Temporal order and spatial context. Canadian Journal of Psychology, 1981, 35, 323-329.

Collard, R. F. A., Vos, P. G., & Leeuwenberg, E. L. J. What melody tells about metre in music. Zeitschrift für Psychologie, 1981, 189, 25-33.

Deutsch, D. The processing of structured and unstructured tonal sequences. Perception & Psychophysics, 1980, 28, 381-389.

Deutsch, D., & Feroe, J. The internal representation of pitch sequences in tonal music. Psychological Review, 1981, 88, 503-522.

Geissler, H., Klix, F., & Scheidereiter, U. Visual recognition of serial structure. In E. Leeuwenberg & H. Buffart (Eds.), Formal theories of visual perception. New York: Wiley, 1978.

Greeno, J. G., & Simon, H. A. Processes for sequence production. Psychological Review, 1974, 81, 187-197.

Jones, M. R., & Zamostny, K. P. Memory and rule structure in the prediction of serial patterns. Journal of Experimental Psychology: Human Learning and Memory, 1975, 104, 295-306.

Klahr, D., & Wallace, J. G. The development of serial completion strategies: An information processing analysis. British Journal of Psychology, 1970, 61, 243-257.

Kotovsky, K., & Simon, H. A. Empirical tests of a theory of human acquisition of concepts for sequential patterns. Cognitive Psychology, 1973, 4, 399-424.

Leeuwenberg, E. L. J. Quantitative specification of information in sequential patterns. Psychological Review, 1969, 76, 216-220.

Leeuwenberg, E. L. J. A perceptual coding language for visual and auditory patterns. American Journal of Psychology, 1971, 84, 307-349.

Povel, D. J. Internal representation of simple temporal patterns. Journal of Experimental Psychology: Human Perception and Performance, 1981, 7, 3-18.

Povel, D. J., & Collard, R. F. A. Structural factors in patterned finger tapping. Acta Psychologica, in press.

Restle, F. Theory of serial pattern learning: Structural trees. Psychological Review, 1970, 77, 481-495.

Restle, F. Structural ambiguity in serial pattern learning. Cognitive Psychology, 1976, 8, 357-381.

Restle, F. Coding theory of the perception of motion configurations. Psychological Review, 1979, 86, 1-24.

Restle, F., & Brown, E. R. Serial pattern learning. Journal of Experimental Psychology, 1970, 83, 120-125.

Rosenbaum, D. A., Kenny, S. B., & Derr, M. A. Hierarchical control of rapid movement sequences. Journal of Experimental Psychology: Human Perception and Performance, in press.

Shaffer, L. H. Rhythm and timing in skill. Psychological Review, 1982, 89, 109-122.

Simon, H. A. Complexity and the representation of patterned sequences of symbols. Psychological Review, 1972, 79, 369-382.

Simon, H. A. Induction and representation of sequential patterns. In E. L. J. Leeuwenberg & H. F. J. M. Buffart (Eds.), Formal theories of visual perception. New York: Wiley, 1978.

Simon, H. A., & Kotovsky, K. Human acquisition of concepts for sequential patterns. Psychological Review, 1963, 70, 534-546.

Simon, H. A., & Sumner, R. K. Pattern in music. In B. Kleinmuntz (Ed.), Formal representation of human judgment. New York: Wiley, 1968.

Sternberg, S., Monsell, S., Knoll, R. L., & Wright, C. E. The latency and duration of rapid movement sequences: Comparisons of speech and typewriting. In G. Stelmach (Ed.), Information processing in motor control and learning. New York: Academic Press, 1978.

Vitz, P. C., & Todd, T. C. A coded element model of the perceptual processing of sequential stimuli. Psychological Review, 1969, 76, 433-449.

Received December 23, 1981
Revision received May 25, 1982