
Appendix 1

Universal Numerators and Hash-sets

A universal numerator U_T(A, B) contains, for any T-sized subset S of A, a map f: A → B which is injective on S. A universal α-hash set U_T(A, B, α) contains, for any T-sized subset S of A, a map f: A → B whose index on S is not greater than α. The logarithm of the size of a numerator is the program complexity. The time is given to within a multiplicative constant.

Name                            Size of numerator       Precomputing time      Running time          Index guaranteed

Unordered                       |A|^T                   0                      T log|A|              0
Lexicographic (Sect. 5.7)       |A|^{T(1+o(1))}         T(log|A|)^2            log T · log log|A|    0
Two-step Digital (Sect. 5.5)    (log|A|)^T              T(log|A|)^3            log T · log|A|        0
Enumerative                     2^{|A|}                 T(log|A|)^2            log T · log|A|        0
Galois Linear (Sect. 5.2)       |A|                     |A||B| log|A| log|B|   log|A| log|B|         T/|B|
Galois Polynomial (Sect. 5.3)   |B|                     T|B| log|A| log|B|     log|A| log|B|         T log|A| / (|B| log|B|)

Obtained by Exhaustive Search:

Sect. 4.5                       e^{T(1+o(1))}           2^{|A|(1+o(1))}        log|A|                0
Optimal                         e^{T(1+o(1))} ln|A|     T^3 (ln|A|)^3          log|A| · log log|A|   0
Optimal, T ≥ log^2|A|           e^{T(1+o(1))}           T^3 (ln|A|)^3          log|A| / log log|A|   0


Appendix 2

MAIN CODES

1. The Shannon Code. (Section 2.5). A letter A_i, whose probability is p(A_i), is given a binary word f(A_i),

|f(A_i)| = ⌈−log p(A_i)⌉, i = 1, ..., k.

The redundancy does not exceed 1.
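The length assignment above is easy to check numerically; a minimal sketch (the distribution and the helper name are illustrative, not from the book):

```python
import math

def shannon_lengths(probs):
    """Shannon codeword lengths: letter i gets ceil(-log2 p_i) bits."""
    return [math.ceil(-math.log2(p)) for p in probs]

probs = [0.4, 0.3, 0.2, 0.1]                    # illustrative distribution
lengths = shannon_lengths(probs)
# Kraft's inequality holds for these lengths, so a prefix code exists:
kraft = sum(2.0 ** -l for l in lengths)
# Mean length exceeds the entropy by less than 1 bit (redundancy <= 1):
entropy = -sum(p * math.log2(p) for p in probs)
mean_len = sum(p * l for p, l in zip(probs, lengths))
```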

2. The Huffman Code. (Section 2.5). It is an optimal code for a stochastic source. The codes of the two least probable letters differ in the last digit only.
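A minimal construction via a binary heap, repeatedly merging the two least probable subtrees (the function and variable names are mine, not the book's):

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Build an optimal prefix code; returns a codeword per letter index."""
    tiebreak = count()                       # avoids comparing dicts on equal weights
    heap = [(p, next(tiebreak), {i: ""}) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)      # the two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {i: "0" + w for i, w in c1.items()}
        merged.update({i: "1" + w for i, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

code = huffman_code([0.4, 0.3, 0.2, 0.1])
```

In the resulting code the two least probable letters always share all digits but the last, as stated above.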

3. The Cover and Leung-Yan-Cheong Code. (Section 2.5). The code is not prefix. The letters A_1, ..., A_k are ordered by their probabilities.

The codelength of A_i is ⌈log((i + 2)/2)⌉, i = 1, ..., k. The coding map is injective.

4. The Gilbert and Moore Code. (Section 2.5). The letters are ordered by their probabilities. The codelength of A_i is ⌈−log p(A_i)⌉ + 1. The redundancy does not exceed 2. The code preserves the order.

5. The Khodak Code. (Section 2.7). It is a variable-to-block code. It is defined by a tree. Minus the log probability of each leaf does not exceed a given number, whereas minus the log probability of one of its sons does. The redundancy is O(1/d), d being the coding delay.

6. The Levenstein Code. (Section 2.2). It is a prefix encoding of integers. An integer x is given a binary word Lev x,

|Lev x| = log x + log log x · (1 + o(1)).
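One standard formulation of this code (write the binary expansion of x without its leading 1, recurse on the number of bits written, prefix with a unary count) can be sketched as follows; the function name is mine:

```python
def levenstein_encode(n):
    """Levenstein's prefix code for non-negative integers (one standard formulation)."""
    if n == 0:
        return "0"
    code = ""
    c = 1
    while True:
        bits = bin(n)[3:]        # binary expansion of n without its leading 1
        code = bits + code
        if not bits:             # n was 1: nothing written, stop
            break
        c += 1
        n = len(bits)            # recurse on the number of bits just written
    return "1" * c + "0" + code
```

For example levenstein_encode(1) == "10" and levenstein_encode(4) == "1110000"; the codelength grows as log x plus lower-order terms, matching the bound above.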

7. Empirical Combinatorial Entropic (ECE) Code. (Section 2.9). A word w is broken into n-length subwords, n > 0. The list of all different subwords is the vocabulary V of w. Each subword is given its number in V. The concatenation of those numbers is the main part of the ECE-code. There is a prefix, allowing one to find V and n.

8. The Shannon Empirical Code. (Section 2.9). Subwords of a word w are encoded in accordance with their frequencies.

9. Move to Front Code. (Sections 2.9, 3.5). The letters of an alphabet A are supposed to make a pile. The position of a letter in the pile is encoded by the monotone code. Take the letters of a word w one by one. Each letter is given the code of its position in the pile. The letter is then taken out of the pile and put on top. The codelength does not exceed

|w| H_A(w) + (log log|A| + C) · |w|,

H_A(w) being the empirical entropy of w.
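The pile discipline above takes only a few lines; a sketch emitting the sequence of positions (encoding each position by the monotone code is omitted, and the names are mine):

```python
def mtf_positions(word, alphabet):
    """Move-to-front: emit each letter's current position in the pile,
    then take the letter out of the pile and put it on top."""
    pile = list(alphabet)
    out = []
    for ch in word:
        pos = pile.index(ch)
        out.append(pos)
        pile.pop(pos)
        pile.insert(0, ch)
    return out

# Repeated letters quickly reach position 0, so the emitted positions
# are cheap to encode when letter frequencies are skewed.
positions = mtf_positions("aabbbc", "abc")
```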

10. Empirical Entropic (EE) Code. (Section 2.8). A word w over an alphabet B is given a vector

r(w) = (r_1(w), ..., r_{|B|}(w)),

r_i(w) being the number of occurrences of i in w, i = 1, ..., |B|. The suffix of the EE-code is the number of w within the set of words with one and the same vector r(w). The prefix is the number of such a set.
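The suffix of the EE-code is an enumerative rank; a sketch that numbers w among all words with the same vector r(w), taking lexicographic order as the (assumed) numbering:

```python
from math import factorial
from collections import Counter

def arrangements(counts):
    """Number of words with the given letter-occurrence vector."""
    n = sum(counts.values())
    x = factorial(n)
    for v in counts.values():
        x //= factorial(v)
    return x

def ee_suffix(word):
    """Rank of `word` among all words with the same vector r(w),
    in lexicographic order."""
    counts = Counter(word)
    rank = 0
    for ch in word:
        for c in sorted(counts):
            if c < ch:                  # words branching to a smaller letter here
                counts[c] -= 1
                rank += arrangements(counts)
                counts[c] += 1
        counts[ch] -= 1
        if counts[ch] == 0:
            del counts[ch]
    return rank
```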

11. Modified Empirical Entropic (MEE) Code. (Section 3.2). A word w, |w| = n, over an alphabet B is given a code MEE(w),

|MEE(w)| = n H_B(w) + ((|B| − 1)/2) log n + O(1).

The code is asymptotically optimal for the set of Bernoulli sources. Its redundancy is asymptotically ((|B| − 1) log n)/(2n).

12. Trofimov's Code. It is a variable-to-block code defined by a tree. The probability of a word w is averaged over all Bernoulli sources with respect to the Dirichlet measure with parameter (1/2, ..., 1/2). Minus the log average probability of each leaf does not exceed a given number, whereas such a probability of one of its sons does. The redundancy of Trofimov's code on any Bernoulli source is O(log log|L_Δ| / log|L_Δ|), |L_Δ| being the number of leaves of the coding tree.

13. Adaptive Code. Given a sample w, the adaptive code f_w takes a word a to the code f_w(a),

|f_w(a)| = ⌈−log E_{r(w)+1/2} p_S(a)⌉,

where p_S(a) is the probability for a word a to be generated by a source S, and E_{r(w)+1/2} is the average with respect to the Dirichlet measure with parameter r(w) + 1/2. This code is asymptotically the best.
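For Bernoulli sources the Dirichlet average with parameter r(w) + 1/2 can be computed sequentially: after counts r_i have been seen, the next occurrence of letter i has conditional probability (r_i + 1/2)/(n + |B|/2) (the Krichevsky-Trofimov estimator). A sketch of the resulting codelength; the function names are mine:

```python
import math

def kt_prob(a, counts, k=2):
    """Probability of word `a` under the Dirichlet(r + 1/2) average over
    Bernoulli sources, computed sequentially from the running counts."""
    counts = dict(counts)
    p = 1.0
    for ch in a:
        n = sum(counts.values())
        p *= (counts.get(ch, 0) + 0.5) / (n + k / 2)
        counts[ch] = counts.get(ch, 0) + 1
    return p

def adaptive_codelength(a, sample, k=2):
    """|f_w(a)| = ceil(-log2 E_{r(w)+1/2} p(a)) for the adaptive code."""
    counts = {}
    for ch in sample:
        counts[ch] = counts.get(ch, 0) + 1
    return math.ceil(-math.log2(kt_prob(a, counts, k)))
```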

14. The Ryabko Monotone Code. Let Σ be a set of sources and, for each S ∈ Σ, p_S(A_1) ≥ ... ≥ p_S(A_k), where p_S(A) is the probability for letter A to be generated by a source S ∈ Σ. The probabilities are not known. The best code f for Σ takes a letter A_i to f(A_i),

|f(A_i)| = ⌈−log((1/i)(1 − 1/i)^{i−1} · 2^{−R(Σ)})⌉,

where

R(Σ) = log Σ_{i=1}^{k} (1/i)(1 − 1/i)^{i−1} + α_k, |α_k| ≤ 1.

R(Σ) ~ log log k, |f(A_i)| ≤ log i + log log k + O(1), i = 1, ..., k.

15. Rapid Lexicographic Code. (Section 5.7). A word w is given its lexicographic number in a set S. The program length is O(|S| · |w|), the running time is O(log|S| · log|w|).

16. Digital Code. (Section 5.5). It is described by a tree Δ, whose nodes are given, first, numbers from 0 to |Δ| − 1 and, second, labels from 1 to |w|, w being a word encoded. Being at a node, take the letter at the labelled position and go to the left or right son of the node accordingly. The word is given the number of the leaf reached. The program length is O(|Δ| log|Δ|), the computation time is O(|w|).
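A sketch of such a tree for a small dictionary of equal-length binary words; choosing at each node the first position that splits the current set is my simplification, not the book's construction:

```python
def build_digital_tree(words, positions=None):
    """Binary decision tree over equal-length binary words: each inner node
    tests one labelled position; leaves are numbered left to right."""
    if positions is None:
        positions = list(range(len(words[0])))
    if len(words) == 1:
        return words[0]                       # leaf
    for pos in positions:                     # first position splitting the set
        zeros = [w for w in words if w[pos] == "0"]
        ones = [w for w in words if w[pos] == "1"]
        if zeros and ones:
            rest = [p for p in positions if p != pos]
            return (pos, build_digital_tree(zeros, rest),
                         build_digital_tree(ones, rest))
    raise ValueError("duplicate words")

def count_leaves(tree):
    return 1 if isinstance(tree, str) else (
        count_leaves(tree[1]) + count_leaves(tree[2]))

def leaf_number(tree, w):
    """Descend by the labelled positions; count the leaves passed on the left."""
    num = 0
    while isinstance(tree, tuple):
        pos, left, right = tree
        if w[pos] == "0":
            tree = left
        else:
            num += count_leaves(left)
            tree = right
    return num
```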

17. Two-step Digital Code. (Section 5.5). It is computed like the digital code up to a fixed level of the tree Δ. Then the numeration of the nodes is started from zero. The label of a node divides the corresponding dictionary into nearly equal parts. The program length is O(|Δ| log|w|), the computation time is O(|w|).

18. Linear Galois Code. (Section 5.2). A word w = w_1 w_2 ... w_μ, μ being the wordlength. A code φ_b is defined by a μ-length word b = b_1 ... b_μ:

φ_b(w) = b_1 w_1 + ... + b_μ w_μ,

the arithmetic being that of a Galois field.
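A sketch over a prime field GF(p); the book works over general Galois fields, so the prime modulus and the names are my simplification:

```python
import random

def linear_galois_hash(w, b, p):
    """phi_b(w) = b1*w1 + ... + b_mu*w_mu over the prime field GF(p)."""
    assert len(w) == len(b)
    return sum(bi * wi for bi, wi in zip(b, w)) % p

p = 101
b = [random.randrange(p) for _ in range(4)]   # the word b defining the code
w = [3, 1, 4, 1]
h = linear_galois_hash(w, b, p)
```

For fixed distinct words, a uniformly random b produces a collision with probability 1/|B|, which is where the small guaranteed index T/|B| of Appendix 1 comes from.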

19. Polynomial Galois Code. (Section 5.3). A word w = w_1 ... w_μ. A code γ_b is defined by an element b of a Galois field:

γ_b(w) = w_1 b + w_2 b² + ... + w_μ b^μ.
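The same setup with the defining word replaced by a single field element b, evaluated by Horner's scheme (cf. "Horner scheme" in the Index); again a sketch over a prime field, with names of my choosing:

```python
def poly_galois_hash(w, b, p):
    """gamma_b(w) = w1*b + w2*b^2 + ... + w_mu*b^mu over GF(p),
    evaluated by Horner's scheme: ((...(w_mu*b + w_{mu-1})*b + ...) + w_1)*b."""
    acc = 0
    for wi in reversed(w):
        acc = (acc + wi) * b % p
    return acc

h = poly_galois_hash([3, 1, 4], 2, 101)
```

Two distinct words of length μ agree on at most μ values of b, since their difference is a nonzero polynomial of degree at most μ; this is what bounds the guaranteed index of the numerator in Appendix 1.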

20. Optimal Perfect Hash-code. (Section 6.3). A word w of a dictionary S is given a log|S|-length code. The time to find it is either O(log|w|) or O(log|w| · log log|w|).

REFERENCES

1. Books on Data Compression and Information Retrieval

Aho A.V., Hopcroft J.E., and Ullman J.D. (1983) Data Structures and Algorithms. Addison-Wesley, Reading, MA.

Bell T.C., Cleary J.G., Witten I.H. (1990) Text Compression. Prentice Hall, Englewood Cliffs, N.J.

Gallager R.G. (1968) Information Theory and Reliable Communication. John Wiley and Sons. Inc.

Hamming R.W. (1980) Coding and Information Theory. Prentice Hall Inc., 233 p.

Knuth D.E. (1973) The Art of Computer Programming. 2nd ed., Addison-Wesley, Reading, MA, 3 volumes.

Lynch T.J. (1985) Data Compression, Techniques and Applications. Lifetime Learning Publications, Belmont.

Mehlhorn K. (1987) Data Structures and Algorithms 1: Sorting and Searching. Springer-Verlag.

Storer J.A. (1988) Data Compression: Methods and Theory. Computer Science Press, Rockville, MD.

2. Books and Papers Cited

Alon N. (1986) Explicit Construction of Exponential Families of k-independent Sets. Discrete Mathematics, Vol. 58(2), P. 191-195.

Andreev A.E. (1989) On the Complexity of Partial Boolean Functions Realization by Functional Gates Networks. Discrete Mathematics, Vol. 1, P. 35-46 (Rus).

Becker B., Simon H.U. (1988) How Robust is the n-cube. Information and Computation, Vol. 77, P. 162-178.

Belichev B.F. (1963) Identifying Key of Siberian Ants. Nauka Publishing House. (Rus).


Bentley J.L., Sleator D.D., Tarjan R.E., Wei V.K. (1986) A Locally Adaptive Data Compression Scheme. Comm. of ACM, Vol. 29(4), P. 320-330.

Bernstein S.N. (1946) Probability Theory. Moscow, Gostechizdat, 556 p. (Rus).

Blackwell D., Girshick M.A. (1954) Theory of Games and Statistical Decisions. John Wiley and Sons, Inc.

Chang C.C., Lee R.C.T. (1986) A Letter Oriented Perfect Hashing Scheme. The Computer Journal, Vol. 29(3), P. 277-281.

Cover T.M. (1973) Enumerative Source Encoding. IEEE. Trans. Inf. Th., Vol. IT-19(1), P. 73-77.

Cormack G.V., Horspool R., Kaiserswerth M. (1985) Practical Perfect Hashing. The Computer Journal, Vol. 28(1), P. 54-58.

Davisson L. (1973) Universal Noiseless Coding. IEEE. Trans. Inf. Th., Vol. 19(6), P. 783-795.

Davisson L., Leon-Garcia A. (1980) A Source Matching Approach to Finding Mini­max Codes. IEEE. Trans. lnf. Th., Vol. 26(2), P. 166-174.

Deza M. (1965) Effectiveness of Detecting or Correcting Noises. Probl. of Inf. Tr., Vol. 1(3), P. 29-39 (Rus).

Dietzfelbinger M., Karlin A., Mehlhorn K., Meyer auf der Heide F., Rohnert H., Tarjan R.E. (1988) Dynamic Perfect Hashing: Upper and Lower Bounds. 29-th Ann. Symp. on Found. of Comp. Sci., P. 524-531.

Dunham J.G. (1980) Optimal Noiseless Coding of Random Variables. IEEE. Trans. Inf. Th., Vol. IT-26(3), P. 345.

Elias P. (1975) Universal Codeword Sets and Representations of Integers. IEEE. Trans. Inf. Th., Vol.21(2), P. 194-203.

Elias P. (1987) Interval and Recency Rank Source Encoding: Two On-line Adaptive Variable-Length Schemes. IEEE. Trans. Inf. Th., Vol. IT-33(1), P. 3-10.

Feller W. (1957) An Introduction to Probability Theory and its Applications. John Wiley and Sons, Inc., Vol. 1, second edition.

Fiat A., Naor M., Schmidt S.P., Siegel A. (1992) Nonoblivious Hashing. JACM, Vol. 39(4), P. 764-782.

Fitingof B.M. (1966) Optimal Encoding under an Unknown or Changing Statistics. Problems of Information Transmission, Vol. 2(2), P. 3-11.

Fox E.A., Chen Q., Daoud A.P. (1992) Practical Minimal Perfect Hash Function for Large Databases. Comm. of ACM, Vol. 35(1), P. 105-121.

Fox E.A., Chen Q.,Heath L. (1992) A Faster Algorithm for Constructing Minimal Perfect Hash Functions. SIGIR Forum, P. 226-273.

Fredman M., Komlos J., Szemeredi E. (1984) Storing a Sparse Table with 0 (1) Worst Case Access Time. JACM, Vol. 31(3), P. 538-544.


Friedman J. (1984) Constructing O(n log n) Size Monotone Formulae for the k-th Elementary Symmetric Polynomial of n Boolean Variables. 25-th Ann. Symp. on Found. of Comp. Sci., P. 506-515.

Gilbert E.N. and Moore E.F. (1959) Variable-length Binary Encoding. BSTJ, Vol. 38(4), P. 933-967.

Goppa V.D. (1974) Arbitrary Noise Correction by Irreducible Codes. Problems of Information Transmission, Vol. 10(3), P. 118-119. (Rus).

Hansel G. (1962) Nombre des Lettres Necessaires Pour Ecrire une Fonction Symmetrique de n Variables. C.R. Acad. Sci. Paris, Vol. 261(21), P. 1297-1300.

Hartley R.V.L.(1928) Transmission of Information. BSTJ, Vol. 7(3), P. 535-563.

Huffman D.A. (1952) A Method for the Construction of Minimum-redundancy Codes. Proc. IRE, Vol. 40(9), P. 1098-1101.

Jablonskii S.V. (1959) Algorithmic Difficulties of the Synthesis of Minimal Contact Networks. Problems of Cybernetics, Vol. 2, P. 75-123 (Rus).

Jacobs C.T.M., van Emde Boas P. (1986) Two Results on Tables. Inf. Proc. Lett., Vol. 22, P. 43-48.

Jaeschke G. (1981) Reciprocal Hashing: A Method for Generating Minimal Perfect Hashing Functions. Comm. of ACM, Vol. 24, N. 12, P. 829-833.

Jelinek F., Schneider K. (1972) On Variable-Length-to-Block Coding. IEEE. Trans. Inf. Th., Vol. IT-18, N. 6.

Karp R.M., Rabin M.O. (1987) Efficient Randomized Pattern-Matching Algorithms. IBM J. Res. Develop, Vol. 31(2), P. 249-260.

Kleitman D., Spencer J. (1973) Families of k-independent Sets. Discrete Mathemat­ics, Vol. 6, P. 255-262.

Kolmogorov A.N. (1965) Three Approaches to the Definition of the Concept "the Quantity of Information". Problems of Information Transmission, Vol. 1(1), P. 3-11. (Rus).

Kolmogorov A.N., Tichomirov V.M. (1959) ε-entropy and ε-capacity of Sets in Metric Spaces. Uspechi Math. Nauk, Vol. 14(2), P. 3-86. (Rus).

Leung-Yan-Cheong S.K., Cover T.M. (1979) Some Equivalences Between Shannon Entropy and Kolmogorov Complexity. IEEE. Trans. Inf. Th., Vol. 24, N. 3, P. 331-338.

Levenstein V.I. (1968) The Redundancy and the Delay of a Decipherable Encoding of Integers. Problems of Cybernetics, Vol. 20, P. 173-179. (Rus).

Medvedev Ju.I. (1970) Some Theorems on Asymptotic Distribution of the χ²-statistic. Dokl. Acad. Sci. USSR, Vol. 192, N. 5, P. 987-990. (Rus).

Nagaev S.V. (1979) Large Deviations of Sums of Independent Random Variables. Ann. Probab., Vol. 7, N. 5, P. 745-789.


Nechiporuk E.I. (1965) The Complexity of Realization of Partial Boolean Functions by Valve Networks. Dokl. Acad. Sci. USSR, Vol. 163(1), P. 40-43. (Rus).

Petrov V.V. (1972) Sums of Independent Stochastic Variables. Moscow, Nauka, 414 p. (Rus).

Pippenger N. (1977) Information Theory and the Complexity of Boolean Functions. Math. Syst. Theory, Vol. 10, P. 129-167.

Poljak S., Pultr A., Rodl V. (1983) On Qualitatively Independent Partitions and Related Problems. Discr. Appl. Math., Vol. 6, P. 109-216.

Shannon C.E. (1948) A Mathematical Theory of Communication. BSTJ, Vol. 27, P. 379-423, 623-656.

Shannon C.E. (1949) The Synthesis of Two-terminal Switching Circuits. BSTJ, Vol. 28, N. 1, P. 59-98.

Schmidt J.P., Siegel A. (1990) The Spatial Complexity of Oblivious k-probe Hash Functions. SIAM J. Comp., Vol. 19, N. 5.

Sholomov L.A. (1969) Realization of Partial Boolean Functions by Functional Gate Networks. Problems of Cybernetics, N. 21, P. 215-227. (Rus).

Shtarkov Y., Babkin V. (1971) Combinatorial Encoding for Discrete Stationary Sources. 2-nd Int. Symp. Inf. Th., 1971, Akad. Kiado, Budapest, P. 249-257.

Sprugnoli R. (1977) Perfect Hashing Functions: a Single Probe Retrieving Method for Static Files. Comm. of ACM, Vol. 20, N. 11, P. 841-850.

Wyner A.D. (1972) An Upper Bound on the Entropy Series. Inf. Contr. Vol. 20, P. 176.

Ziv J., Lempel A. (1978) Compression of Individual Sequences via Variable-Length Coding. IEEE. Trans. Inf. Th., Vol. IT-24, N. 5, P. 530-536.

Zvonkin A.K., Levin L.A. (1970) The Complexity of Finite Objects and the Concepts of Information and Randomness through the Algorithm Theory. Uspechi Math. Nauk, Vol. 25(6), P. 85-127. (Rus).

3. The book has its origin in the following papers of the author and his students:

Hasin L.S. (1969) Complexity of Formulae over Basis {∨, &, ¬} Realizing Threshold Functions. Dokl. Acad. Sci. USSR, Vol. 189(4), P. 752-755. (Rus).

Khodak G.L. (1969) Delay-Redundancy Relation of VB-encoding. All-union Conference on Theoretical Cybernetics, Novosibirsk, P. 12. (Rus).

Khodak G.L. (1972) Bounds of Redundancy of per Word Encoding of Bernoulli Sources. Problems of Information Transmission. Vol. 8(2), P. 21-32. (Rus).


Krichevskii R.E. (1964) π-Network Complexity of a Sequence of Boolean Functions. Problems of Cybernetics, N. 12, P. 45-56. Moscow, Nauka. (Rus).

Krichevskii R.E. (1968) A Relation Between the Plausibility of Information about a Source and Encoding Redundancy. Problems of Information Transmission. Vol. 4, N 3, P. 48-57. (Rus).

Krichevskii R.E. (1976) The Complexity of Enumeration of a Finite Word Set. Dokl. Acad. Sci. USSR, Vol. 228, N. 5, P. 287-290. (Full version: Problems of Cybernetics, N. 36, P. 159-180 (1979); English translation in Selecta Mathematica Sovietica, Vol. 8, N. 2 (1989), P. 99-129.)

Krichevskii R.E. (1985) Optimal Hashing. Information and Control. Vol. 62, N. 1, P. 64-92.

Krichevskii R.E. (1986) Retrieval and Data Compression Complexity. Proc. Int. Congr. Math., Berkeley, 1986, P. 1461-1468.

Krichevskii R.E., Ryabko B.Ja. (1985) Universal Retrieval Trees. Discr. Appl. Math., Vol. 12, P. 293-302.

Krichevskii R.E., Ryabko B.Ya., Haritonov A.Y. (1981) Optimal Key for Taxons Ordered in Accordance with their Frequencies. Discr. Appl. Math., Vol. 3, P. 67-72.

Krichevskii R.E., Trofimov V.K. (1981) The Performance of Universal Encoding. IEEE. Trans. Inf. Th., Vol. 27, N. 2, P. 199-207.

Reznikova Zh.I., Ryabko B.Ja. (1986) Information Theoretical Analysis of the Language of Ants. Problems of Information Transmission, Vol. 22(3), P. 103-108. (Rus).

Ryabko B.Ja. (1979) The Encoding of a Source with Unknown but Ordered Probabilities. Problems of Information Transmission, Vol. 14(2), P. 71-77. (Rus).

Ryabko B.Ja. (1980,a) Universal Encoding of Compacts. Dokl. Acad. Sci. USSR. Vol. 252(6), P. 1325-1328. (Rus).

Ryabko B.Ja. (1980,b) Information Compression by a Book Stack. Problems of Information Transmission. Vol. 16(4), P. 16-21. (Rus).

Ryabko B.Ja. (1990) A Fast Algorithm of Adaptive Encoding. Problems of Information Transmission, Vol. 26, N. 4, P. 24-37. (Rus).

Trofimov V.K. (1974) The Redundancy of Universal Encoding of Arbitrary Markov Sources. Problems of Information Transmission. Vol. 10(4), P. 16-24. (Rus).

Trofimov V.K. (1976) Universal VB-Encoding of Bernoulli Sources. Discr. Analysis, Novosibirsk, N. 29, P. 87-100.

Potapov V.N. (1993) Fast Lexicographic Retrieval. Dokl. Acad. Sci. USSR, Vol. 330, N. 2, P. 158-160. (Rus).

Index

adaptive code, 75
arbitrary additive noise, 179
atom, 16
automaton sources, 10
Babkin and Shtarkov theorem, 103
Bernoulli source, 20
binary tree, 28
boolean functions, 8
cluster, 43
colliding index, 42
collision, 42
combinatorial entropy, 10
combinatorial source, 8
communication channel, 76
concerted probabilities, 27
Davisson and Leon-Garcia lemma, 76
digital search, 138
Dirichlet distribution, 80
doubling, 63
Elias claim, 95
empirical combinatorial entropic (ECE) code, 71
empirical combinatorial per letter entropy, 47
empirical entropic (EE) code, 73
empirical Shannon or Huffman code, 72
encoding cost, 27
entropy doubling effect, 202
epsilon-entropy, 8
exhaustive search, 138
Fekete claim, 23
Fibonacci source, 11
fingerprint maps, 155
fixed rate encoding, 27
Gilbert-Moore order-preserving code, 51
greedy algorithm, 148
group code, 202
Hartley entropy, 23
hashing, 27
Horner scheme, 144
Huffman code, 48
identifying keys, 96
improved variant of lexicographic retrieval, 164
independent sets, 135
indirect adaptive code, 90
information lossless code, 27
information retrieval, 39
injection, 42
injective polynomial, 181
interface function, 195
Jablonskii invariant classes, 22
Jensen inequality, 18
key-address transformations, 42
Khodak lemma, 56
Kolmogorov complexity, 63
Kolmogorov encoding, 27
Kraft inequality, 27
language of ants, 10
large deviations probabilities, 118
leaves, 28
Lempel-Ziv code, 72
Lempel-Ziv-Kraft inequality, 35
Leung-Yan-Cheong and Cover code, 49
Levenstein code, 34
lexicographic order, 29
lexicographic retrieval, 168
linear code, 179
linear Galois hashing, 142
Lipschitz space, 13
loading factor, 130
majorant of the Kolmogorov complexity, 68
Markov source, 20
McMillan and Karush claim, 34
membership function, 180
monotone functions, 41
monotone sources, 75
move to front code, 72
Nechiporuk covering lemma, 111
partially specified boolean function, 112
partition, 16
perfect hash-function, 42
piercing sets, 135
polynomial Galois hashing, 146
precedence function, 29
precomputing time, 152
prefix code, 28
program length, 40
redundancy, 27
remainder, 156
representation function, 179
running time, 40
Ryabko claim, 77
Shannon code, 48
Shannon entropy, 16
signature, 114
sliding window adaptive code, 91
source code, 70
stationary combinatorial source, 22
stationary source, 8
Stirling formula, 73
stochastic source, 8
string matching, 138
strongly universal, 105
syndrome, 203
table of a function, 41
table of indices, 144
tensor product, 138
threshold boolean functions, 27
Trofimov code, 83
two-level computation of monotone functions, 40
two-step digital search, 138
uniform maps, 126
universal code, 75
universal combinatorial code, 99
universal hash-set, 111
universal monotone code, 92
universal numerator, 123
universal set, 111
variable rate encoding, 27
Varshamov-Gilbert bound, 179
weakly universal, 105
William Occam's Razor Principle, 112
Wyner inequality, 95

Other Mathematics and Its Applications titles of interest:

P.H. Sellers: Combinatorial Complexes. A Mathematical Theory of Algorithms. 1979,200 pp. ISBN 90-277-1000-7

P.M. Cohn: Universal Algebra. 1981, 432 pp. ISBN 90-277-1213-1 (hb), ISBN 90-277-1254-9 (pb)

J. Mockor: Groups of Divisibility. 1983, 192 pp. ISBN 90-277-1539-4

A. Wawrzynczyk: Group Representations and Special Functions. 1986, 704 pp. ISBN 90-277-2294-3 (pb), ISBN 90-277-1269-7 (hb)

I. Bucur: Selected Topics in Algebra and its Interrelations with Logic, Number Theory and Algebraic Geometry. 1984,416 pp. ISBN 90-277-1671-4

H. Walther: Ten Applications of Graph Theory. 1985,264 pp. ISBN 90-277-1599-8

L. Beran: Orthomodular Lattices. Algebraic Approach. 1985, 416 pp. ISBN 90-277-1715-X

A. Pazman: Foundations of Optimum Experimental Design. 1986,248 pp. ISBN 90-277-1865-2

K. Wagner and G. Wechsung: Computational Complexity. 1986, 552 pp. ISBN 90-277-2146-7

A.N. Philippou, G.E. Bergum and A.F. Horodam (eds.): Fibonacci Numbers and Their Applications. 1986,328 pp. ISBN 90-277-2234-X

C. Nastasescu and F. van Oystaeyen: Dimensions of Ring Theory. 1987, 372 pp. ISBN 90-277-2461-X

Shang-Ching Chou: Mechanical Geometry Theorem Proving. 1987, 376 pp. ISBN 90-277-2650-7

D. Przeworska-Rolewicz: Algebraic Analysis. 1988,640 pp. ISBN 90-277-2443-1

C.T.J. Dodson: Categories, Bundles and Spacetime Topology. 1988, 264 pp. ISBN 90-277-2771-6

V.D. Goppa: Geometry and Codes. 1988, 168 pp. ISBN 90-277-2776-7

A.A. Markov and N.M. Nagorny: The Theory of Algorithms. 1988, 396 pp. ISBN 90-277-2773-2

E. Kratzel: Lattice Points. 1989, 322 pp. ISBN 90-277-2733-3

A.M.W. Glass and W.Ch. Holland (eds.): Lattice-Ordered Groups. Advances and Techniques. 1989, 400 pp. ISBN 0-7923-0116-1

N.E. Hurt: Phase Retrieval and Zero Crossings: Mathematical Methods in Image Reconstruction. 1989, 320 pp. ISBN 0-7923-0210-9

Du Dingzhu and Hu Guoding (eds.): Combinatorics, Computing and Complexity. 1989, 248 pp. ISBN 0-7923-0308-3


A.Ya. Helemskii: The Homology of Banach and Topological Algebras. 1989, 356 pp. ISBN 0-7923-0217-6

J. Martinez (ed.): Ordered Algebraic Structures. 1989, 304 pp. ISBN 0-7923-0489-6

V.I. Varshavsky: Self-Timed Control of Concurrent Processes. The Design of Aperiodic Logical Circuits in Computers and Discrete Systems. 1989, 428 pp.

ISBN 0-7923-0525-6

E. Goles and S. Martinez: Neural and Automata Networks. Dynamical Behavior and Applications. 1990, 264 pp. ISBN 0-7923-0632-5

A. Crumeyrolle: Orthogonal and Symplectic Clifford Algebras. Spinor Structures. 1990,364 pp. ISBN 0-7923-0541-8

S. Albeverio, Ph. Blanchard and D. Testard (eds.): Stochastics, Algebra and Analysis in Classical and Quantum Dynamics. 1990,264 pp. ISBN 0-7923-0637-6

G. Karpilovsky: Symmetric and G-Algebras. With Applications to Group Represen-tations. 1990, 384 pp. ISBN 0-7923-0761-5

J. Bosak: Decomposition of Graphs. 1990,268 pp. ISBN 0-7923-0747-X

J. Adamek and V. Trnkova: Automata and Algebras in Categories. 1990, 488 pp. ISBN 0-7923-0010-6

A.B. Venkov: Spectral Theory of Automorphic Functions and Its Applications. 1991, 280 pp. ISBN 0-7923-0487-X

M.A. Tsfasman and S.G. Vladuts: Algebraic Geometric Codes. 1991,668 pp. ISBN 0-7923-0727-5

H.J. Voss: Cycles and Bridges in Graphs. 1991,288 pp. ISBN 0-7923-0899-9

V.K. Kharchenko: Automorphisms and Derivations of Associative Rings. 1991, 386 pp. ISBN 0-7923-1382-8

A.Yu. Olshanskii: Geometry of Defining Relations in Groups. 1991, 513 pp. ISBN 0-7923-1394-1

F. Brackx and D. Constales: Computer Algebra with LISP and REDUCE. An Introduction to Computer-Aided Pure Mathematics. 1992, 286 pp.

ISBN 0-7923-1441-7

N.M. Korobov: Exponential Sums and their Applications. 1992,210 pp. ISBN 0-7923-1647-9

D.G. Skordev: Computability in Combinatory Spaces. An Algebraic Generalization of Abstract First Order Computability. 1992,320 pp. ISBN 0-7923-1576-6

E. Goles and S. Martinez: Statistical Physics, Automata Networks and Dynamical Systems. 1992, 208 pp. ISBN 0-7923-1595-2


M.A. Frumkin: Systolic Computations. 1992, 320 pp. ISBN 0-7923-1708-4

J. Alajbegovic and J. Mockor: Approximation Theorems in Commutative Algebra. 1992,330 pp. ISBN 0-7923-1948-6

I.A. Faradzev, A.A. Ivanov, M.M. Klin and A.J. Woldar: Investigations in Algebraic Theory of Combinatorial Objects. 1993, 516 pp. ISBN 0-7923-1927-3

I.E. Shparlinski: Computational and Algorithmic Problems in Finite Fields. 1992, 266 pp. ISBN 0-7923-2057-3

P. Feinsilver and R. Schott: Algebraic Structures and Operator Calculus. Vol. 1. Representations and Probability Theory. 1993, 224 pp. ISBN 0-7923-2116-2

A. G. Pinus: Boolean Constructions in Universal Algebras. 1993, 350 pp. ISBN 0-7923-2117-0

V.V. Alexandrov and N.D. Gorsky: Image Representation and Processing. A Recursive Approach. 1993,200 pp. ISBN 0-7923-2136-7

L.A. Bokut' and G.P. Kukin: Algorithmic and Combinatorial Algebra. 1993, 469 pp. ISBN 0-7923-2313-0

Y. Bahturin: Basic Structures of Modern Algebra. 1993, 419 pp. ISBN 0-7923-2459-5

R. Krichevsky: Universal Compression and Retrieval. 1994,219 pp. ISBN 0-7923-2672-5