
Complete list of TOP self-consistent vals up to 120

🔗Keenan Pepper <keenanpepper@gmail.com>

11/5/2011 7:18:18 PM

The following infinite vals are generalized patent vals for their *own* TOP tunings (the tunings with the least TOP-max error). Furthermore, I claim these are the *only* vals below 120 steps per 2/1 with this property.

All other infinite vals are not patent for their own TOP tunings, which means that the generalized patent val for that tuning is some other val (with, in turn, a possibly different TOP tuning).

This is great news for people looking to create no-limit temperaments, because we have a well-defined and *discrete* set of vals to work with. Note that you can't do this for TE, because the TE error doesn't converge (unless you analytically continue it or something). But TOP is no worse than TE; I remember when everybody was using TOP. Even http://xenharmonic.wikispaces.com/Tenney-Euclidean+tuning says "there are theoretical arguments favoring TOP".

Here are the first few in long-winded format:

First is the tuning, in steps per octave
Next comes the beginning of the infinite val, which is the GPV of that tuning
Next, the flattest and sharpest primes, which determine the TOP tuning
Next, the TOP error in cents per octave

0.815464876786
Val: [1, 1, 2, 2, 3, 3, 3, 3, 4, 4, 4, ...] (all of the following should have "..." because they're infinite vals)
Worst primes: 3 2
Error: 271.553262637
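
As a sanity check, this first entry can be reproduced in a few lines (an illustrative Python sketch, not the code that generated the list): round each prime to the nearest step to get the GPV, then look at the Tenney-weighted errors of the primes.

    from math import log2

    s = 0.815464876786                       # candidate tuning, in steps per octave
    primes = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31]

    val = [round(s * log2(p)) for p in primes]    # generalized patent val of this tuning
    step = 1200.0 / s                             # one step, in cents
    # Tenney-weighted error of each prime: error in cents divided by log2 of the prime
    werr = [(val[i] * step - 1200.0 * log2(p)) / log2(p) for i, p in enumerate(primes)]

    print(val)                   # [1, 1, 2, 2, 3, 3, 3, 3, 4, 4, 4]
    print(min(werr), max(werr))  # about -271.553 (prime 3, flattest) and +271.553 (prime 2, sharpest)

The flattest and sharpest weighted errors come out equal and opposite, which is the TOP condition, and their common magnitude is the 271.553 cents quoted above.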

1.06160631164
Val: [1, 2, 2, 3, 4, 4, 4, 5, 5, 5, 5]
Worst primes: 5 3
Error: 226.358709403

1.14601483711
Val: [1, 2, 3, 3, 4, 4, 5, 5, 5, 6, 6]
Worst primes: 2 5
Error: 152.893137906

1.86135311615
Val: [2, 3, 4, 5, 6, 7, 8, 8, 8, 9, 9]
Worst primes: 5 2
Error: 89.3845768332

2.02252493568
Val: [2, 3, 5, 6, 7, 7, 8, 9, 9, 10, 10]
Worst primes: 13 5
Error: 77.6402915263

2.02734724807
Val: [2, 3, 5, 6, 7, 8, 8, 9, 9, 10, 10]
Worst primes: 3 13
Error: 79.6457464189

2.26185950714
Val: [2, 4, 5, 6, 8, 8, 9, 10, 10, 11, 11]
Worst primes: 2 3
Error: 138.926139125

2.76185950714
Val: [3, 4, 6, 8, 10, 10, 11, 12, 12, 13, 14]
Worst primes: 3 2
Error: 103.469633661

3.00215313236
Val: [3, 5, 7, 8, 10, 11, 12, 13, 14, 15, 15]
Worst primes: 7 3
Error: 60.9545064918

A few of the vals with the least TOP error relative to their size are:

12.0753641528
Val: [12, 19, 28, 34, 42, 45, 49, 51, 55, 59, 60, 63, 65, 66, 67, 69]
Worst primes: 3 43
Error: 8.71514934392

25.9262587256
Val: [26, 41, 60, 73, 90, 96, 106, 110, 117, 126, 128, 135, 139, 141, 144]
Worst primes: 31 11
Error: 4.14601940058

28.9385927663
Val: [29, 46, 67, 81, 100, 107, 118, 123, 131, 141, 143, 151, 155, 157, 161, 166]
Worst primes: 7 29
Error: 3.55831859325

30.9950566273
Val: [31, 49, 72, 87, 107, 115, 127, 132, 140, 151, 154, 161, 166, 168, 172]
Worst primes: 37 31
Error: 3.47291662817

45.9931176763
Val: [46, 73, 107, 129, 159, 170, 188, 195, 208, 223, 228, 240, 246, 250, 255]
Worst primes: 29 5
Error: 2.3292364275

49.9331483721
Val: [50, 79, 116, 140, 173, 185, 204, 212, 226, 243, 247, 260, 268, 271, 277]
Worst primes: 3 41
Error: 2.1556303066

58.0580443925
Val: [58, 92, 135, 163, 201, 215, 237, 247, 263, 282, 288, 302, 311, 315, 322]
Worst primes: 47 19
Error: 1.81798494881

58.060825976
Val: [58, 92, 135, 163, 201, 215, 237, 247, 263, 282, 288, 302, 311, 315, 323]
Worst primes: 37 47
Error: 1.8454963235

80.0639853429
Val: [80, 127, 186, 225, 277, 296, 327, 340, 362, 389, 397, 417, 429, 434, 445, 459, 471, 475, 486]
Worst primes: 43 7
Error: 1.23848578403

80.0780680651
Val: [80, 127, 186, 225, 277, 296, 327, 340, 362, 389, 397, 417, 429, 435, 445, 459]
Worst primes: 13 43
Error: 1.31233467614

99.0887013374
Val: [99, 157, 230, 278, 343, 367, 405, 421, 448, 481, 491, 516, 531, 538, 550]
Worst primes: 2 13
Error: 1.07420526719

102.929503781
Val: [103, 163, 239, 289, 356, 381, 421, 437, 466, 500, 510, 536, 551, 559, 572]
Worst primes: 3 43
Error: 1.02540802112

And without further ado, here is the complete list (the first entries of the val are omitted, because they are easy to compute):

0.815464876786 3 2 271.553262637
1.06160631164 5 3 226.358709403
1.14601483711 2 5 152.893137906
1.86135311615 5 2 89.3845768332
2.02252493568 13 5 77.6402915263
2.02734724807 3 13 79.6457464189
2.26185950714 2 3 138.926139125
2.76185950714 3 2 103.469633661
3.00215313236 7 3 60.9545064918
3.08924219134 13 7 45.3013983674
3.12142892656 2 13 46.6820534136
3.87892137107 11 2 37.4574116929
3.91512086522 13 11 40.3931447649
3.91957541892 3 13 41.0239017758
4.2082541375 2 3 59.3844755651
4.7082541375 3 2 74.3577183337
4.89244008369 5 3 38.0173350469
5.04111037214 11 5 30.2334165362
5.09503374662 7 11 25.4678887283
5.17155390331 2 7 39.8071233175
5.79939762748 5 2 41.5082500788
5.85391979759 3 5 35.9865570105
6.15464876786 2 3 30.152577089
6.87797693034 13 2 21.2893536971
6.89706428508 7 13 22.4665552806
7.00748433567 5 7 19.9773957509
7.13086438827 3 5 32.0808959926
7.28557852143 2 3 47.0373388615
7.78557852143 3 2 33.0490243695
8.10104339821 2 3 14.9674643002
8.80676558073 5 2 26.3299052295
8.9258136179 17 5 15.9170836467
8.94254330419 3 17 14.6973965407
9.23197315179 2 3 30.152577089
9.73197315179 3 2 33.0490243695
9.96154007598 11 3 16.0620924255
10.0022216413 19 11 13.8026071395
10.0140720552 5 19 13.0030483831
10.1681186969 2 5 19.8406846213
10.8431078066 7 2 17.3631615044
10.8841143055 3 7 17.4538955075
11.1783677821 2 3 19.147816814
11.814133534 5 2 18.8790619785
11.9068904003 7 5 15.3242252813
11.9813511204 11 7 12.9895107839
12.0156007501 13 11 12.4959500114
12.0697455519 43 13 9.04457150884
12.0753641528 3 43 8.71514934392
12.3092975357 2 3 30.152577089
12.8092975357 3 2 17.8653791517
13.0849107836 5 3 15.0965377561
13.1754866501 2 5 15.9830134368
13.8908249292 5 2 9.43141142868
14.0463904975 3 5 14.1761047257
14.2556921661 2 3 21.5233743624
14.7556921661 3 2 19.8682242361
15.0329985775 17 3 8.72604419045
15.0645177349 7 17 8.26970062154
15.1584545228 2 7 12.5438531415
15.8365581164 7 2 12.3846519494
15.9012836296 3 7 9.6626003236
16.2020867964 2 3 14.9674643002
16.7020867964 3 2 21.4042621525
16.8884205703 7 3 10.4224862638
16.947165373 5 7 10.6764479965
17.1135311615 2 5 7.96079970148
17.8288694405 5 2 11.5182105113
17.8772242695 3 5 14.1761047257
18.1484814268 2 3 9.81777527019
18.8946068553 11 2 6.69353824458
18.95818514 17 11 7.60157553383
18.980861601 7 17 6.44105750666
19.0815403555 3 7 9.6626003236
19.2794111804 2 3 17.3912685036
19.7794111804 3 2 13.3829354755
20.0004368928 5 3 11.3586651613
20.0947003537 7 5 8.78427882639
20.1519048326 2 7 9.04558653928
20.8201746563 43 2 10.364486176
20.9619166067 3 5 8.08521901039
21.2258058107 2 3 12.7659215991
21.7258058107 3 2 15.144801993
21.9055898943 7 3 9.69349731606
21.9851269085 23 7 5.44288105048
21.9978817272 13 23 5.92372774336
22.0620165624 5 13 5.30389053273
22.1975905099 2 5 10.6817274507
22.8567355643 3 2 7.52151690135
23.1722004411 2 3 8.91760494701
23.843605347 5 2 7.87102373348
23.9918843942 7 5 6.3022740123
24.0987096795 3 7 6.14368381324
24.3031301946 2 3 14.9674643002
24.8031301946 3 2 9.52475613255
25.0858466202 7 3 7.23962917926
25.1349753265 5 7 7.43513559971
25.2049584632 2 5 9.75800678892
25.8234587359 7 2 8.2037622884
25.8649471006 11 7 6.41072526889
25.9262587256 31 11 4.14601940058
25.9395628267 5 31 4.57845845767
26.0696949695 3 5 9.27859293758
26.249524825 2 3 11.4070556323
26.749524825 3 2 11.2364691323
26.9159630021 5 3 9.54153792653
27.0078260031 11 5 5.54493303604
27.0709953338 23 11 4.4814757102
27.0954808616 2 23 4.22863999162
27.8751116633 11 2 5.37633735427
27.900098655 3 11 5.98662389979
28.1929075927 5 3 8.46729202306
28.2123264164 2 5 9.0312190475
28.6959194554 3 2 12.7159770621
28.9206626044 29 3 4.2366689025
28.9385927663 7 29 3.55831859325
29.0321593669 5 7 7.30899718766
29.1430029745 2 5 5.88832830799
29.8268492089 3 2 6.96623863384
30.1423140857 2 3 5.66568652866
30.7890178116 5 2 8.22301729866
30.907930893 13 5 3.91283215821
30.9772063141 29 13 3.88084492924
30.9828909743 31 29 3.87235886987
30.9950566273 37 31 3.47291662817
31.006435283 3 37 3.51710309945
31.2732438393 2 3 10.4847648306
31.7732438393 3 2 8.56404194147
32.0237413648 5 3 5.75858013364
32.1503709278 2 5 5.61253597068
32.8657092068 5 2 4.90325496521
32.9852210787 3 5 6.43465966334
33.2196384696 2 3 7.93404671797
33.7196384696 3 2 9.97738563332
33.945395686 11 3 4.41218036121
33.9746661404 7 11 4.76767122537
34.0979449812 2 7 3.4469519347
34.8505682232 3 2 5.14534314027
35.1660331 2 3 5.66568652866
35.87307716 5 2 4.24573022452
36.0592527085 13 5 3.91283215821
36.0874543234 3 13 4.13855858255
36.2969628536 2 3 9.81777527019
36.7969628536 3 2 6.62132297937
37.1124277304 2 3 3.63525871737
37.8789809167 7 2 3.83386502
37.9849771174 3 7 4.08135782953
38.2433574839 2 3 7.63607067811
38.7433574839 3 2 7.94900182341
38.939267474 5 3 5.497091003
39.0957833923 2 5 2.93996080482
39.8111216714 5 2 5.69323306737
39.9007471879 3 5 4.57653724213
40.1897521143 2 3 5.66568652866
40.6897521143 3 2 9.14966161038
40.8842872465 11 3 3.70254914513
40.9265833278 13 11 3.53673332585
40.9664200033 17 13 3.21569137637
40.9991330128 23 17 2.99005531888
41.016156348 5 23 2.98077655636
41.1724747875 2 5 5.0268959079
41.8206818679 3 2 5.14534314027
42.1361467446 2 3 3.87733825218
42.8184896246 5 2 5.08687841047
42.9062591302 7 5 4.51393378384
42.9938135774 23 7 2.99362313969
43.0054227435 3 23 2.85171945742
43.2670764982 2 3 7.40729034169
43.7670764982 3 2 6.38626621893
43.9892833821 7 3 4.79569625166
44.0493500624 5 7 3.2783540856
44.1798427408 2 5 4.88483605978
44.8855246038 17 2 3.06046273562
44.9030308947 5 17 3.010991093
45.0085255506 3 5 5.66594113804
45.2134711286 2 3 5.66568652866
45.7134711286 3 2 7.52151690135
45.8547935832 5 3 5.31447410225
45.9931176763 29 5 2.3292364275
46.0072142888 19 29 2.67287251981
46.0403166363 13 19 2.60199036589
46.080725772 7 13 3.38532780178
46.153467162 2 7 3.99017897787
46.8315707556 7 2 4.31578719331
46.8540752313 3 7 4.23289029617
47.1317381739 5 3 4.78641402707
47.187210694 2 5 4.76088392468
47.8658815362 7 2 3.36235647082
47.9465341029 5 7 3.53984685729
48.0932178878 3 5 3.55700756129
48.2907955125 2 3 7.22611030317
48.7907955125 3 2 5.14534314027
49.1062603893 2 3 2.59666417547
49.8595750633 11 2 3.37969033591
49.9115742042 29 11 2.32348709474
49.9284870443 41 29 2.21821046078
49.9331483721 3 41 2.1556303066
50.2371901429 2 3 5.66568652866
50.7371901429 3 2 6.21579215723
50.962571946 5 3 3.36100996155
51.0940690836 7 5 3.67419537999
51.1469174718 2 7 3.4469519347
51.8250210653 7 2 4.05160899669
51.8437181433 5 7 3.76202588015
51.9240516599 3 5 4.34045944429
52.1835847732 2 3 4.22166719316
52.859331846 7 2 3.19341503028
52.9596939883 17 7 2.60976336014
52.9940154279 11 17 2.15463299331
53.0773031551 13 11 2.5010665002
53.1050665326 5 13 2.97937424475
53.2009962506 3 5 4.57653724213
53.3145145268 2 3 7.07907472279
53.8145145268 3 2 4.13610658412
54.0472642832 5 3 4.7224116182
54.1326231586 2 5 2.93996080482
54.8479614377 5 2 3.32640028955
54.991253124 7 5 2.95348009053
55.0515012812 3 7 3.50099924853
55.2609091571 2 3 5.66568652866
55.7609091571 3 2 5.14534314027
56.0386382219 7 3 2.44352502996
56.1343440563 5 7 3.1294532976
56.2093145538 2 5 4.46860927877
56.8750062149 13 2 2.63723122229
56.9019642028 3 13 2.49453001542
57.2073037875 2 3 4.34847525633
57.7073037875 3 2 6.08649914215
57.87555082 7 3 3.52452473202
57.8862151402 5 7 3.6393402048
58.0259640141 19 5 2.38592487659
58.0580443925 47 19 1.81798494881
58.060825976 37 47 1.8454963235
58.072835322 17 37 1.87330629506
58.1134145121 2 17 2.34192768787
58.786005949 5 2 4.36826515198
58.8395777691 3 5 3.32655045748
59.1536984179 2 3 3.11794708299
59.8969319976 19 2 2.06490714632
59.9328459781 11 19 1.93069951085
59.9841456541 7 11 2.82751153883
60.0315280967 5 7 3.34797123786
60.1165223598 3 5 3.55700756129
60.2846281714 2 3 5.66568652866
60.7846281714 3 2 4.25183474932
61.055807546 7 3 2.8376376882
61.1301572678 11 7 2.69874485977
61.1408715897 2 11 2.76485930344
61.7933739022 5 2 4.01258746116
61.9242701063 3 5 1.8051905553
62.2310228018 2 3 4.45480966989
62.7310228018 3 2 5.14534314027
62.9840823408 17 3 2.07467688344
62.9993086726 5 17 2.29585745812
63.1547270184 2 5 2.93996080482
63.8619525554 3 2 2.59398478971
64.1774174321 2 3 3.31738058482
64.8007418555 5 2 3.68992339513
64.8913406821 11 5 2.60409944607
64.9346469876 7 11 1.93928398106
65.079387665 19 7 1.9642638865
65.0970168102 3 19 2.05082563474
65.3083471857 2 3 5.66568652866
65.8083471857 3 2 3.49474476987
66.0705687551 5 3 3.21575018753
66.1620949717 2 5 2.93996080482
66.8744120718 13 2 2.2535602064
66.8869643997 5 13 2.36993081361
67.032048469 3 5 2.74784245345
67.2547418161 2 3 4.54525838674
67.7547418161 3 2 4.34375237546
68.0353237852 11 3 1.8535594981
68.1096495055 2 11 1.9318761375
68.8739935559 7 2 2.19542566191
68.9377687191 3 7 2.89697069562
69.2011364464 2 3 3.48785797619
69.7011364464 3 2 5.14534314027
69.9014025272 5 3 2.26261759288
70.0769834674 11 5 2.1113151694
70.117217396 13 11 2.14843778372
70.1309600756 2 13 2.24083757719
70.815477762 5 2 3.12681199896
70.8628822411 3 5 3.36565211182
71.1475310768 2 3 2.48831251718
71.8998592884 19 2 1.67133642673
71.9378360094 29 19 1.62292589514
71.9462142174 23 29 1.67086724139
71.9752254404 13 23 1.53179979101
72.0382862152 5 13 1.9206622974
72.1398268317 3 5 3.55700756129
72.2784608304 2 3 4.62313381596
72.7784608304 3 2 3.6528253076
72.9860948643 5 3 3.31717788685
73.0896582839 13 5 2.05812846567
73.1172699249 2 13 1.92463299063
73.8228457153 5 2 2.87966603847
73.90562786 7 5 2.77229434741
73.9549380432 3 7 2.209294669
74.2248554607 2 3 3.63525871737
74.7248554607 3 2 4.41852239265
74.9420749838 7 3 2.21876468495
75.0487187923 5 7 1.77481011211
75.1841988314 2 5 2.93996080482
75.8557852143 3 2 2.28140467291
76.1712500911 2 3 2.69786972174
76.8302136685 5 2 2.65186816531
76.9911738478 11 5 1.55752991875
77.0605305211 7 11 1.86520552031
77.1351947691 3 7 2.51659181798
77.3021798446 2 3 4.69088729842
77.8021798446 3 2 3.05112513429
78.0938732271 5 3 2.17301885453
78.1915667847 2 5 2.93996080482
78.8608941754 7 2 2.11672707014
78.9459028327 5 7 2.00784639408
79.055352941 3 5 2.87090608036
79.248574475 2 3 3.7639714276
79.748574475 3 2 3.78327301753
79.9592443079 7 3 2.53380677345
80.0639853429 43 7 1.23848578403
80.0780680651 13 43 1.31233467614
80.1303659325 2 13 1.95230755692
80.8795042286 3 2 1.78778204801
81.1785655643 5 3 3.12455848399
81.1989347379 2 5 2.93996080482
81.6949691054 3 2 4.4805338392
81.9131069061 11 3 1.57866396088
81.9614783541 5 11 1.94626533592
82.0934378134 7 5 2.42335733041
82.141930111 2 7 2.07343719513
82.8258988589 3 2 2.52241596121
83.1395010338 7 3 2.06726914428
83.1762408916 2 7 2.54266203545
83.8543444852 7 2 2.0844074191
83.9468478948 11 7 1.68747152199
84.0157608418 3 11 1.45834946769
84.2722934893 2 3 3.87733825218
84.7722934893 3 2 3.22331509044
85.0093993363 5 3 2.34492746178
85.129508718 11 5 2.03618861725
85.1370618819 2 11 1.9318761375
85.8523175283 5 2 2.06423042661
85.9708790502 3 5 2.29518506215
86.2186881196 2 3 3.04372229846
86.7186881196 3 2 3.89275095998
86.9596242224 17 3 1.49975492305
86.9807410132 19 17 1.5845160144
87.0079258 7 19 1.28777434641
87.1337127861 5 7 1.88741717902
87.2136706444 2 5 2.93996080482
87.8496178732 3 2 2.05417572107
88.1566703578 7 3 2.36163831674
88.1696912014 2 7 2.30951746455
88.8477947949 7 2 2.05572064544
88.8855838701 5 7 2.24395767736
89.0292001639 13 5 1.6288681522
89.0698431073 3 13 1.46511344257
89.2960125036 2 3 3.9779492312
89.7903620396 5 2 2.80169883188
89.8017128222 3 5 2.80201089943
90.1061300925 13 3 1.55582757476
90.1297717894 2 13 1.72779919602
90.8826755629 17 2 1.54913269895
90.8881785659 11 17 1.60843494506
90.9277200889 13 11 1.68529730409
90.9515453773 7 13 1.56627572501
91.021462207 3 7 2.209294669
91.2424071339 2 3 3.18808512241
91.7424071339 3 2 3.36934084182
91.9249254455 5 3 2.49097064774
92.0331188508 7 5 1.71674603963
92.1288307305 2 7 1.67805100055
92.7977299929 5 2 2.6156244183
92.8864051594 3 5 1.8051905553
93.1888017643 2 3 2.43121612099
93.6888017643 3 2 3.98593936332
93.8455117458 7 3 2.08455194077
93.9056684914 13 7 1.69994728441
93.9513515007 31 13 1.16902509931
93.9639149746 29 31 1.24847125848
93.9790826438 41 29 1.18647411319
93.9801524622 5 41 1.18317921118
94.159083109 2 5 2.02741705357
94.8197315179 3 2 2.28140467291
95.1351963946 2 3 1.70531706164
95.8050979461 5 2 2.44123193481
95.9261500953 19 5 1.43513344328
95.9332849928 7 19 1.42038285178
96.038631531 3 7 1.7156719453
96.2661261482 2 3 3.31738058482
96.7661261482 3 2 2.90027753834
97.0257684717 7 3 1.69951031124
97.0733938235 5 7 2.11594444467
97.1664510623 2 5 2.05566090526
97.8817893413 5 2 1.4492255542
97.9941835222 3 5 2.45000319065
98.2125207786 2 3 2.59666417547
98.7125207786 3 2 3.49474476987
98.9473951632 17 3 1.31677419864
98.9718200793 11 17 1.35371234146
99.0281999737 13 11 1.46667862306
99.0887013374 2 13 1.07420526719
99.8434505321 3 2 1.88153915379
100.117396145 5 3 2.402395749
100.173819016 2 5 2.08220891157
100.889157295 5 2 1.31838990495
101.078875859 3 5 1.54471786673
101.289845163 2 3 3.43385059425
101.789845163 3 2 2.47751437871
102.042937796 7 3 1.97190264957
102.11573135 2 7 1.36000220696
102.897549599 23 2 1.19478531642
102.924666412 43 23 1.06677069704
102.929503781 3 43 1.02540802112
103.236239793 2 3 2.74601004454
103.736239793 3 2 3.05112513429
103.948229917 5 3 1.79142354141
104.111863527 2 5 1.28934616776
104.827201806 5 2 1.97809184379
104.909709631 3 5 2.00595966729
105.182634423 2 3 2.08362634252
105.896767286 7 2 1.16981151102
106.041549147 13 7 1.22434210603
106.075013988 5 13 1.4546011971
106.186654222 3 5 2.1523114006
106.313564177 2 3 3.53931330456
106.813564177 3 2 2.09451851531
107.129029054 2 3 1.44531193512
107.834569759 5 2 1.84093365824
107.960498146 11 5 1.54854339115
107.999616452 3 11 1.229210933
108.259958807 2 3 2.88149535629
108.759958807 3 2 2.64848786801
109.028411801 13 3 1.34756324305
109.068691791 5 13 1.18298950435
109.195922875 2 5 2.15307901792
109.848081197 73 2 1.65958805777
109.924898969 3 7 1.56239595013
110.206353438 2 3 2.24691333375
110.706353438 3 2 3.1829778875
110.863756027 5 3 1.94704500339
110.947493587 7 5 1.80689143056
111.044042537 29 7 1.00068230928
111.065891858 23 29 0.987496726004
111.097779459 2 23 1.05614487388
111.837283191 3 2 1.74593091981
112.140700617 5 3 1.76345092276
112.203290829 2 5 2.17416969228
112.867641132 11 2 1.40722921132
112.904123503 17 11 1.27779482094
112.931094929 31 17 1.03555743978
112.936373351 5 31 1.05314310102
113.092806543 7 5 1.85824264566
113.105155695 3 7 1.79015505874
113.283677821 2 3 3.0049641066
113.783677821 3 2 2.28140467291
114.092292636 7 3 1.11481073946
114.171253531 2 7 1.79996479543
114.849357124 7 2 1.57398748462
114.942068293 3 7 1.17819153198
115.230072452 2 3 2.395962584
115.730072452 3 2 2.79886680265
115.929205234 7 3 1.67553385562
115.987768559 5 7 1.4047112666
116.14133534 2 5 1.46031046908
116.856673619 5 2 1.47181715806
116.933014103 3 5 2.16544175388
117.176467082 2 3 1.807193064
117.787350177 5 2 2.16644475994
117.914606121 13 5 0.92376129355
117.999326209 7 13 0.963537122143
118.122325019 3 7 1.40662081595
118.307396836 2 3 3.11794708299
118.807396836 3 2 1.94536538377
119.056226727 5 3 1.90998861884
119.13530354 7 5 1.63279606943
119.164703841 2 7 1.658583476
119.842807434 7 2 1.57398748462
119.8849526 5 7 1.57019995018
120.017706441 3 5 1.41032452094

Keenan

🔗genewardsmith <genewardsmith@sbcglobal.net>

11/6/2011 10:52:56 AM

--- In tuning-math@yahoogroups.com, "Keenan Pepper" <keenanpepper@...> wrote:
>
> The following infinite vals are generalized patent vals for their *own* TOP tunings (the tunings with the least TOP-max error).

I'm not clear what you are computing.

🔗Keenan Pepper <keenanpepper@gmail.com>

11/6/2011 11:11:58 AM

--- In tuning-math@yahoogroups.com, "genewardsmith" <genewardsmith@...> wrote:
>
> --- In tuning-math@yahoogroups.com, "Keenan Pepper" <keenanpepper@> wrote:
> >
> > The following infinite vals are generalized patent vals for their *own* TOP tunings (the tunings with the least TOP-max error).
>
> I'm not clear what you are computing.

I assume you know what a generalized patent val is. (It's simply the function that rounds each prime to the nearest interval of a rank-1 temperament.)

I also assume that you know what a TOP tuning is. (It's the tuning that minimizes the maximum Tenney-weighted error of any interval.) The TOP error is guaranteed to exist for these GPVs because it's the maximum of a sequence that's bounded by a decreasing function (no convergence problems). Also, although for general temperaments there can be more than one TOP tuning, for these rank-1 temperaments I claim this can never happen; there is always one unique TOP tuning.

So we have a function from real numbers to integer sequences (GPV), and we have another function from integer sequences to real numbers (TOP tuning). We can compose them to get a function from reals to reals.

The fixed points of this R->R function are rank-1 temperaments that are the TOP tunings of their own GPVs. That's what these are.
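
Here is an illustrative sketch of the composed map (the function names are mine, and this is not the actual code behind the list). It relies on the standard fact that the TOP tuning of a rank-1 val is the midpoint of its largest and smallest weighted entries val[p]/log2(p), so that the flattest and sharpest primes get equal and opposite weighted errors; a long finite prime list stands in for the infinite val, with sympy assumed for the primes.

    from math import log2
    from sympy import primerange

    PRIMES = list(primerange(2, 1000))   # finite stand-in for the infinite val

    def gpv(s):
        """GPV of a tuning s, in steps per octave: round each prime to the nearest step."""
        return [round(s * log2(p)) for p in PRIMES]

    def top_steps(val):
        """TOP tuning of a rank-1 val, in steps per octave: the midpoint of the largest
        and smallest weighted entries val[p] / log2(p)."""
        weighted = [v / log2(p) for v, p in zip(val, PRIMES)]
        return (max(weighted) + min(weighted)) / 2

    s = 3.00215313236                               # an entry from the list posted earlier
    assert abs(top_steps(gpv(s)) - s) < 1e-9        # s is the TOP tuning of its own GPV
    assert abs(top_steps(gpv(3.05)) - 3.05) > 1e-3  # a generic tuning is not a fixed point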

Keenan

🔗Carl Lumma <carl@lumma.org>

11/6/2011 11:01:57 PM

Keenan wrote:
>First is the tuning, in steps per octave
>Next comes the beginning of the infinite val, which is the GPV of that tuning
>Next, the flattest and sharpest primes, which determine the TOP tuning
>Next, the TOP error in cents per octave
>
>0.815464876786
>Val: [1, 1, 2, 2, 3, 3, 3, 3, 4, 4, 4, ...] (all of the following
>should have "..." because they're infinite vals)
>Worst primes: 3 2
>Error: 271.553262637

Ah, now I see what you're driving at. Can you clarify how you
know which are the flattest and sharpest primes?

-Carl

🔗Keenan Pepper <keenanpepper@gmail.com>

11/7/2011 7:37:39 AM

--- In tuning-math@yahoogroups.com, Carl Lumma <carl@...> wrote:
> Ah, now I see what you're driving at. Can you clarify how you
> know which are the flattest and sharpest primes?

Since these are generalized patent vals, the worst unweighted error for any prime is 1/2 of a step of the temperament. Since the weights are decreasing, this means the worst weighted error is bounded by a decreasing function (it's no more than 1/2/log2(p)). So, given any error, I can find some limit such that if I check all errors in primes up to that limit, I'm guaranteed not to find any larger weighted errors for any primes above that limit.

Specifically, if I already have some prime whose weighted error is x, verifying that that's the worst weighted error of all primes only requires explicitly checking other primes up to the point where

1/2/log2(p) = x

p = 2^(1/(2x))

Now, this does grow exponentially fast as x shrinks, so I have to be a little careful in my algorithm. But if I have a tuning where some pair of primes have the same weighted error in opposite directions, and no other primes up to this cutoff have larger weighted error, then that should convince you that I've found the TOP tuning.
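
An illustrative sketch of that verification step (the function name is mine; weighted errors are measured in steps of the temperament, and sympy supplies the primes):

    from math import log2
    from sympy import primerange

    def check_top_fixed_point(s, tol=1e-9):
        """Confirm that s (steps per octave) is the TOP tuning of its own GPV."""
        def werr(p):   # weighted error of prime p under the GPV of s, in steps
            return (round(s * log2(p)) - s * log2(p)) / log2(p)

        # provisional worst error from the first few primes
        errs = {p: werr(p) for p in primerange(2, 32)}
        x = max(abs(e) for e in errs.values())
        # no prime above this cutoff can have a weighted error larger than x
        cutoff = 2 ** (1 / (2 * x))
        for p in primerange(2, int(cutoff) + 1):
            errs[p] = werr(p)

        flattest = min(errs, key=errs.get)
        sharpest = max(errs, key=errs.get)
        # TOP condition: the extreme weighted errors are equal and opposite
        assert abs(errs[flattest] + errs[sharpest]) < tol
        return flattest, sharpest, errs[sharpest] * 1200.0 / s   # error in cents

    print(check_top_fixed_point(3.00215313236))   # (7, 3, 60.95...), matching the list

The provisional worst error is taken over a subset of primes, so it can only understate the true worst; that makes the cutoff larger than strictly necessary, never too small.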

Keenan

🔗genewardsmith <genewardsmith@sbcglobal.net>

11/7/2011 10:04:43 AM

--- In tuning-math@yahoogroups.com, "Keenan Pepper" <keenanpepper@...> wrote:

> The fixed points of this R->R function are rank-1 temperaments that are the TOP tunings of their own GPVs. That's what these are.

How are you computing them in practice?

🔗genewardsmith <genewardsmith@sbcglobal.net>

11/7/2011 10:18:27 AM

--- In tuning-math@yahoogroups.com, "genewardsmith" <genewardsmith@...> wrote:
>
>
>
> --- In tuning-math@yahoogroups.com, "Keenan Pepper" <keenanpepper@> wrote:
>
> > The fixed points of this R->R function are rank-1 temperaments that are the TOP tunings of their own GPVs. That's what these are.
>
> How are you computing them in practice?

I see this has already been answered. What should these tunings be called?

🔗Carl Lumma <carl@lumma.org>

11/7/2011 11:46:16 AM

Well that makes sense. I'll have to run through some calculations
when I get a chance. It appears you've rather directly solved a
problem that eluded me for years!

-Carl

>Since these are generalized patent vals, the worst unweighted error
>for any prime is 1/2 of a step of the temperament. Since the weights
>are decreasing, this means the worst weighted error is bounded by a
>decreasing function (it's no more than 1/2/log2(p)). So, given any
>error, I can find some limit such that if I check all errors in primes
>up to that limit, I'm guaranteed not to find any larger weighted
>errors for any primes above that limit.
>
>Specifically, if I already have some prime whose weighted error is x,
>verifying that that's the worst weighted error of all primes only
>requires explicitly checking other primes up to the point where
>
>1/2/log2(p) = x
>
>p = 2^(1/(2x))
>
>Now, this does grow exponentially fast as x shrinks, so I have to be a
>little careful in my algorithm. But if I have a tuning where some pair
>of primes have the same weighted error in opposite directions, and no
>other primes up to this cutoff have larger weighted error, then that
>should convince you that I've found the TOP tuning.
>
>Keenan

🔗genewardsmith <genewardsmith@sbcglobal.net>

11/7/2011 11:53:56 AM

--- In tuning-math@yahoogroups.com, Carl Lumma <carl@...> wrote:
>
> Well that makes sense. I'll have to run through some calculations
> when I get a chance. It appears you've rather directly solved a
> problem that eluded me for years!

Now that we have these Pepper tunings, or whatever we should call them, what do we do with them? One thing to play with is the sequence of superparticular ratios mapped to unison by the Pepper map. Is there some use for that?

🔗Carl Lumma <carl@lumma.org>

11/7/2011 12:21:38 PM

Gene wrote:

>> Well that makes sense. I'll have to run through some calculations
>> when I get a chance. It appears you've rather directly solved a
>> problem that eluded me for years!
>
>Now that we have these Pepper tunings, or whatever we should call
>them, what do we do with them? One thing to play with is the sequence
>of superparticular ratios mapped to unison by the Pepper map. Is there
>some use for that?

I'm wondering for which divisions, if any, the best val is
not a self-consistent generalized patent val (and therefore
doesn't appear on Keenan's list).

One trend that's evident is that there are fewer entries on
Keenan's list for 'better' divisions.

It would be interesting to compare zeta tunings with Keenan's
results.

-Carl

🔗Keenan Pepper <keenanpepper@gmail.com>

11/7/2011 1:08:49 PM

--- In tuning-math@yahoogroups.com, "genewardsmith" <genewardsmith@...> wrote:
> I see this has already been answered. What should these tunings be called?

I've suggested "TOP self-consistent" as a name for these tunings and their associated vals. If anybody has a better name that more clearly indicates what they are, that's great. Maybe "TOP fixed-point" would be better.

Keenan

🔗Carl Lumma <carl@lumma.org>

11/7/2011 1:19:01 PM

--- In tuning-math@yahoogroups.com, "Keenan Pepper" <keenanpepper@...> wrote:

> I've suggested "TOP self-consistent" as a name for these tunings
> and their associated vals. If anybody has a better name that
> more clearly indicates what they are, that's great. Maybe
> "TOP fixed-point" would be better.

Fixed point actually tells you what it is, so I think
that's better. My vote for TOP-FP.

-Carl

🔗Mike Battaglia <battaglia01@gmail.com>

11/7/2011 3:55:16 PM

On Mon, Nov 7, 2011 at 2:53 PM, genewardsmith
<genewardsmith@sbcglobal.net> wrote:
>
> --- In tuning-math@yahoogroups.com, Carl Lumma <carl@...> wrote:
> >
> > Well that makes sense. I'll have to run through some calculations
> > when I get a chance. It appears you've rather directly solved a
> > problem that eluded me for years!
>
> Now that we have these Pepper tunings, or whatever we should call them, what do we do with them? One thing to play with is the sequence of superparticular ratios mapped to unison by the Pepper map. Is there some use for that?

I think there may very well be. Keenan might have some other ideas
what to do with them, but one idea he expressed in XA chat was to
wedge them together to start getting optimal w-limit linear
temperaments (where w is pronounced "omega").

I wonder though, what would the difference be in these two algorithms?

1) Finding 2 TOP-FP vals
2) Getting the wedgie

and

1) Finding 2 TOP-FP vals
2) Coming up with the fractional val representing the optimum tuning
3) Wedging the two fractional vals
4) Re-patenting by rounding the wedgie off directly

I'm intrigued by the latter, but I'm not sure what the implications might be.

-Mike

🔗Mike Battaglia <battaglia01@gmail.com>

11/7/2011 4:36:01 PM

Carl wrote:
>
> I'm wondering for which divisions, if any, the best val is
> not a self-consistent generalized patent val (and therefore
> doesn't appear on Keenan's list).

Likewise, and a few related questions:

1) Are there any generalized vals that are lower in TOP-max error than
the ones listed, but which aren't patent with respect to anything and
hence unobtainable by this method?
2) One stated objective for this approach was to obtain a set of
"fixed point" infinite vals that we know have desirable properties to
hence wedge together and obtain really good linear temperaments. If L
is the set of all linear temperaments you can obtain by using this
method, which hence consists of pairs of FP vals, does there exist an
infinite linear temperament under some threshold of badness deemed
"tolerable" that's not going to be in L? (Let's assume we have some
way to measure infinite-limit complexity for this.)

-Mike

🔗Keenan Pepper <keenanpepper@gmail.com>

11/7/2011 4:38:47 PM

--- In tuning-math@yahoogroups.com, Mike Battaglia <battaglia01@...> wrote:
> > Now that we have these Pepper tunings, or whatever we should call them, what do we do with them? One thing to play with is the sequence of superparticular ratios mapped to unison by the Pepper map. Is there some use for that?
>
> I think there may very well be. Keenan might have some other ideas
> what to do with them, but one idea he expressed in XA chat was to
> wedge them together to start getting optimal w-limit linear
> temperaments (where w is pronounced "omega").

Yes, this is something I had in mind. If you want to construct finite-rank omega-limit temperaments, you have to start with rank-1 omega-limit temperaments. There are way too many of those to deal with, but TOP fixed-point temperaments are an optimal, discrete subset.

> I wonder though, what would the difference be in these two algorithms?
>
> 1) Finding 2 TOP-FP vals
> 2) Getting the wedgie
>
> and
>
> 1) Finding 2 TOP-FP vals
> 2) Coming up with the fractional val representing the optimum tuning
> 3) Wedging the two fractional vals
> 4) Re-patenting by rounding the wedgie off directly
>
> I'm intrigued by the latter, but I'm not sure what the implications might be.

One of the basic properties of wedge product is that if you wedge together any set of things that is linearly dependent, you get zero.

Fractional vals representing JI tuning are all exactly proportional to each other.

Therefore, if you wedge together any two of these fractional vals, you get 0. So your second algorithm doesn't actually work at all.
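
A quick numerical illustration (a sketch; the wedge of two vals is taken here as the vector of 2x2 minors):

    from math import log2
    from itertools import combinations

    def wedge2(a, b):
        """Wedge of two vals: all 2x2 minors a[i]*b[j] - a[j]*b[i]."""
        return [a[i] * b[j] - a[j] * b[i] for i, j in combinations(range(len(a)), 2)]

    just = [log2(p) for p in [2, 3, 5, 7]]   # the JI point itself
    v_a = [12 * x for x in just]             # a fractional val tuned to exact JI
    v_b = [19 * x for x in just]             # another one; both are multiples of the same vector

    print(wedge2(v_a, v_b))   # every entry is 0, up to floating-point noise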

Keenan

🔗Mike Battaglia <battaglia01@gmail.com>

11/7/2011 4:49:16 PM

On Mon, Nov 7, 2011 at 7:38 PM, Keenan Pepper <keenanpepper@gmail.com> wrote:
>
> > 1) Finding 2 TOP-FP vals
> > 2) Coming up with the fractional val representing the optimum tuning
> > 3) Wedging the two fractional vals
> > 4) Re-patenting by rounding the wedgie off directly
> >
> > I'm intrigued by the latter, but I'm not sure what the implications might be.
>
> One of the basic properties of wedge product is that if you wedge together any set of things that is linearly dependent, you get zero.
>
> Fractional vals representing JI tuning are all exactly proportional to each other.
>
> Therefore, if you wedge together any two of these fractional vals, you get 0. So your second algorithm doesn't actually work at all.

How do these fractional vals represent JI? I thought that the
coefficients of these vals were linearly scaled versions of some
integer val so as to sync up with the TOP tuning for that val. For
example, the TOP tuning for 12p is <11.9844 18.9753 27.9636|, but
11.9844*log2(<2 3 5]) is <11.9844 18.9753 27.9636].

-Mike

🔗Mike Battaglia <battaglia01@gmail.com>

11/7/2011 4:50:25 PM

On Mon, Nov 7, 2011 at 7:49 PM, Mike Battaglia <battaglia01@gmail.com> wrote:
>
> How do these fractional vals represent JI? I thought that the
> coefficients of these vals were linearly scaled versions of some
> integer val so as to sync up with the TOP tuning for that val. For
> example, the TOP tuning for 12p is <11.9844   18.9753   27.9636|, but
> 11.9844*log2(<2 3 5]) is <11.9844   18.9753   27.9636].

Agh, clipboard fail. The second should have been <11.9844 18.9948 27.8269].

-Mike

🔗Keenan Pepper <keenanpepper@gmail.com>

11/7/2011 4:55:22 PM

--- In tuning-math@yahoogroups.com, Mike Battaglia <battaglia01@...> wrote:
>
> Carl wrote:
> >
> > I'm wondering for which divisions, if any, the best val is
> > not a self-consistent generalized patent val (and therefore
> > doesn't appear on Keenan's list).
>
> Likewise, and a few related questions:
>
> 1) Are there any generalized vals that are lower in TOP-max error than
> the ones listed, but which aren't patent with respect to anything and
> hence unobtainable by this method?

I would be very surprised if there were. I'm thinking up a proof.

Basically the only reason that some best vals aren't patent (in the restricted sense of "patent") is that you're rounding all the primes based on octaves being pure. I think that every best val is a generalized patent val.

Note that for each TOP-FP val, there are an infinite number of non-patent vals that have exactly the same TOP error as the patent val (make a small enough change to whatever really large primes you want to). But none of them are strictly better.
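
A quick numerical illustration of that claim (a sketch; it reuses the midpoint characterization of rank-1 TOP, and sympy for the primes):

    from math import log2
    from sympy import primerange

    PRIMES = list(primerange(2, 1000))

    def top_error_cents(val):
        """TOP error of a rank-1 val in cents: 1200*(max_w - min_w)/(max_w + min_w),
        where w = val[p]/log2(p)."""
        w = [v / log2(p) for v, p in zip(val, PRIMES)]
        return 1200.0 * (max(w) - min(w)) / (max(w) + min(w))

    s = 3.00215313236
    patent = [round(s * log2(p)) for p in PRIMES]
    tweaked = list(patent)
    tweaked[-1] += 1              # bump the mapping of a very large prime (997 here)
    print(top_error_cents(patent), top_error_cents(tweaked))   # both ~60.95

Bumping the mapping of a large prime by one step moves its weighted entry only slightly, so as long as it stays strictly between the extreme weighted entries, the TOP tuning and error are untouched, even though the val is no longer patent.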

> 2) One stated objective for this approach was to obtain a set of
> "fixed point" infinite vals that we know have desirable properties to
> hence wedge together and obtain really good linear temperaments. If L
> is the set of all linear temperaments you can obtain by using this
> method, which hence consists of pairs of FP vals, does there exist an
> infinite linear temperament under some threshold of badness deemed
> "tolerable" that's not going to be in L? (Let's assume we have some
> way to measure infinite-limit complexity for this.)

Now this is a much more significant question. I honestly have no idea.

Keenan

🔗Keenan Pepper <keenanpepper@gmail.com>

11/7/2011 5:02:39 PM

--- In tuning-math@yahoogroups.com, Mike Battaglia <battaglia01@...> wrote:
> How do these fractional vals represent JI? I thought that the
> coefficients of these vals were linearly scaled versions of some
> integer val so as to sync up with the TOP tuning for that val. For
> example, the TOP tuning for 12p is <11.9844 18.9753 27.9636|, but
> 11.9844*log2(<2 3 5]) is <11.9844 18.9753 27.9636].

Oh, you were talking about a different thing than I thought you were talking about. Hmmm.

Keenan

🔗Keenan Pepper <keenanpepper@gmail.com>

11/7/2011 5:46:39 PM

--- In tuning-math@yahoogroups.com, "Keenan Pepper" <keenanpepper@...> wrote:
>
> --- In tuning-math@yahoogroups.com, Mike Battaglia <battaglia01@> wrote:
> > How do these fractional vals represent JI? I thought that the
> > coefficients of these vals were linearly scaled versions of some
> > integer val so as to sync up with the TOP tuning for that val. For
> > example, the TOP tuning for 12p is <11.9844 18.9753 27.9636|, but
> > 11.9844*log2(<2 3 5]) is <11.9844 18.9753 27.9636].
>
> Oh, you were talking about a different thing than I thought you were talking about. Hmmm.

The thing that you were actually talking about does not seem useful because:

1. It's based on singling 2 out from all other primes. Why should the fractional val for 12p be <11.984 18.975 27.964| (where everything is in units of 2^(1/12)) rather than <12.043 19.069 28.101| (where everything is in units of 5^(1/28))? 28ed5 is an equally accurate characterization of this rank-1 temperament as 12ed2. (A sketch of this rescaling follows point 2.)

2. If you naively apply the algorithm you said, you get crazy wedgies that don't correspond to any reasonable temperament. For example, I applied it to <2 3 4 4| and <7 11 16 19| (yields meantone), and I got <<1 5 11 5 15 14||, which is a totally unreasonable temperament. It tempers out 243/160.
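
To make point 1 concrete, the second val is just the first one rescaled (a sketch, using Mike's figures from earlier in the thread):

    from math import log2

    v_ed2_units = [11.9844, 18.9753, 27.9636]   # fractional val for 12p, in units of 2^(1/12)
    # one unit of 2^(1/12) is 1/12 octave; one unit of 5^(1/28) is log2(5)/28 octaves
    scale = (1.0 / 12) / (log2(5) / 28)
    print([round(x * scale, 3) for x in v_ed2_units])   # [12.043, 19.069, 28.101]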

I think that the "rounding wedgie entries" operation just doesn't make sense.

Keenan

🔗Carl Lumma <carl@lumma.org>

11/7/2011 8:33:34 PM

>1) Are there any generalized vals that are lower in TOP-max error than
>the ones listed, but which aren't patent with respect to anything and
>hence unobtainable by this method?

I believe there must be (see my earlier message).

>2) One stated objective for this approach was to obtain a set of
>"fixed point" infinite vals that we know have desirable properties to
>hence wedge together and obtain really good linear temperaments.

I don't know how to wedge infinite vals. And even if I did,
finding the optimal tuning of an infinite rank 2 temperament
seems like a much harder problem...

-Carl

🔗genewardsmith <genewardsmith@sbcglobal.net>

11/7/2011 9:23:04 PM

--- In tuning-math@yahoogroups.com, Carl Lumma <carl@...> wrote:

> I don't know how to wedge infinite vals. And even if I did,
> finding the optimal tuning of an infinite rank 2 temperament
> seems like a much harder problem...

Wedging infinite vals is not a problem, but then what?

🔗Mike Battaglia <battaglia01@gmail.com>

11/7/2011 10:08:31 PM

On Tue, Nov 8, 2011 at 12:23 AM, genewardsmith
<genewardsmith@sbcglobal.net>wrote:

>
> --- In tuning-math@yahoogroups.com, Carl Lumma <carl@...> wrote:
>
> > I don't know how to wedge infinite vals. And even if I did,
> > finding the optimal tuning of an infinite rank 2 temperament
> > seems like a much harder problem...
>
> Wedging infinite vals is not a problem, but then what?
>

Then we work out a complexity measure for the resulting infinite
temperament that actually converges. And then all of these subgroup
temperaments, and even lower vs higher-limit extensions of the same
temperament, become unnecessary; they serve no purpose except as a way to
optimize for a select number of primes to the exclusion of others. There'd
be no point describing them as independent mathematical objects. It'd be
like saying that quarter-comma meantone is a contorted 2.5-limit
temperament or something.

-Mike

🔗Graham Breed <gbreed@gmail.com>

11/9/2011 1:03:02 AM

On 8 November 2011 00:55, Keenan Pepper <keenanpepper@gmail.com> wrote:
> --- In tuning-math@yahoogroups.com, Mike Battaglia <battaglia01@...> wrote:

>> 1) Are there any generalized vals that are lower in TOP-max error than
>> the ones listed, but which aren't patent with respect to anything and
>> hence unobtainable by this method?
>
> I would be very surprised if there were. I'm thinking up a proof.

The TOP tuning is the one for which the highest and lowest weighted
prime deviations are equal and opposite. The optimal tuning is the
middle of that range. If you take the nearest approximation to the
optimal tuning, as in the optimal patent val, and it's possible for it
to lie within the TOP range, then it will lie within that range. There may
not be a unique TOP val but it's difficult to see how the generalized
patent val for the TOP tuning wouldn't be TOP. Still, that seems to
be what you found, so I'm not sure.
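
To spell that out for the rank-1 case (a restatement of the above, not anything new): write w_p = val[p]/log2(p) for the weighted val entries and g for the step size in octaves. The equal-and-opposite condition is

    g*max(w) - 1 = -(g*min(w) - 1)   =>   g = 2/(max(w) + min(w)),

so the TOP tuning in steps per octave is 1/g = (max(w) + min(w))/2, the midpoint of the extreme weighted entries.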

A similar argument should work for TE error and the tuning with zero
mean deviation (which has come out before, and is easy to find, but I
forget how). The ZMD tuning is very close to the TE tuning so TE vals
should also be self-consistent almost all the time.

Graham