The cosine principle, radial effect and entropy in the generalized Lozi map

The generalized Lozi map is a good way to illustrate the cosine principle and the radial effect (as we term them in lay circles, to which I belong in this regard, as opposed to mathematicians). The generalized Lozi map is a 2-dimensional map defined thus:

x_{n+1} = 1 + y_n + a|x_n|
y_{n+1} = -x_n
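As a minimal sketch of one iteration (the helper name `lozi_step` is our own, not from any library):

```python
def lozi_step(x, y, a):
    """One iteration of the generalized Lozi map:
    x -> 1 + y + a|x|, y -> -x.
    The Jacobian determinant is 1 wherever x != 0, which is why the
    map is area-preserving."""
    return 1.0 + y + a * abs(x), -x

# Example: a few iterations of the classical gingerbread-man case a = 1,
# starting from the (arbitrarily chosen) point (0.2, 0):
x, y = 0.2, 0.0
for _ in range(5):
    x, y = lozi_step(x, y, 1.0)
```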

The map is area-preserving and yields “aesthetic” images for a \in [-0.6,1.1]. Additionally, the values a=-1; a=\sqrt{2} are also somewhat aesthetic and interesting. We have previously described the cosine principle for various dynamical systems, but we reiterate it here for the generalized Lozi map as it is one of the easiest to explain to a layperson. First, a few words on how we visualize this map. We start with the vertices of a 60-sided polygon circumscribed by a circle of radius r, centered at (0,0), and record the evolution of each vertex for 1000 iterations under the map. Since the map has an absolute value term, it is bilaterally symmetric along the line y=-x. Hence, we rotate the iterates by an angle of -\tfrac{\pi}{4}, then scale and center the points, and plot the evolute of each vertex (the orbit of the vertex) in a different color. Nine examples of such maps, starting with the said polygon in a circle of radius r=0.2, are shown in Figure 1.
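The above procedure can be sketched as follows (a hypothetical `vertex_orbits` helper of our own, assuming NumPy; the scaling, centering and per-vertex coloring of the actual plots are omitted):

```python
import numpy as np

def vertex_orbits(r=0.2, n_vertices=60, n_iter=1000, a=1.0):
    """Orbits of the vertices of an n_vertices-gon inscribed in a circle
    of radius r centered at (0,0), iterated under the generalized Lozi
    map. Each iterate is rotated by -pi/4 so that the symmetry line
    y = -x of the raw map becomes an axis of the plot."""
    c, s = np.cos(-np.pi / 4), np.sin(-np.pi / 4)
    orbits = []
    for t in np.linspace(0.0, 2.0 * np.pi, n_vertices, endpoint=False):
        x, y = r * np.cos(t), r * np.sin(t)
        pts = np.empty((n_iter, 2))
        for i in range(n_iter):
            x, y = 1.0 + y + a * abs(x), -x      # the generalized Lozi map
            pts[i] = (c * x - s * y, s * x + c * y)  # rotate by -pi/4
        orbits.append(pts)
    return orbits
```

Each entry of the returned list is one vertex's evolute, ready to be plotted in its own color.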


Figure 1

The values of the parameter are chosen such that a=2\cos\left(\tfrac{2\pi}{p/q}\right), where p,q are integers. We observe that the value of p determines a key aspect of the shape of the map: in each map there is a central, largely excluded area that takes the form of a polygon with p sides. This is the cosine principle. More generally, the shape of the central region of the map is determined by the p corresponding to the 2\cos(\theta) closest to a. Note that for the case \theta=\tfrac{\pi}{2} (i.e. a=0), we take a number relatively close to 0 instead, for at exactly 0 the map is degenerate. Outside the polygonal exclusion zone we may find chaotic behavior, but it is still bounded within a unique external shape. The chaos is particularly apparent in the cases a=-1; 1; \sqrt{2}, when the map respectively yields the headless gingerbread man, the classical gingerbread man and the tripodal gingerbread man strange attractors. At the other values of a, we see bands of chaos interspersed with rings of closed loops that resemble the period-doubling phenomenon seen in other strange attractors prior to the outbreak of full-fledged chaos.
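The parameter choice can be sketched thus (the helper name `cosine_parameter` is ours); the three strongly chaotic cases a=-1; 1; \sqrt{2} mentioned above correspond to p=3, 6, 8 with q=1:

```python
import math

def cosine_parameter(p, q=1):
    """a = 2*cos(2*pi/(p/q)); the map's central exclusion zone is then
    a p-sided polygon (the cosine principle)."""
    return 2.0 * math.cos(2.0 * math.pi * q / p)

# p = 3 -> a = -1      (headless gingerbread man)
# p = 6 -> a = 1       (classical gingerbread man)
# p = 8 -> a = sqrt(2) (tripodal gingerbread man)
```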

In Figure 2 we produce the same plot, changing the radius of the circumscribing circle of the initial polygon to r=0.45. We can see that at this radius, for low p the cosine principle remains dominant, but for large p the polygonal zone gets “smoothened” out (e.g. for p=9..11). This illustrates the radial principle, i.e. the effect of the starting radius on the degree of expression of the cosine principle in the map.


Figure 2

The degree of chaos can be seen as a measure of the entropy of the map. By following the colors, one can see that when a=-1; 1; \sqrt{2} the orbits of a given starting vertex wander all over the place within the attractor boundary. In contrast, for the other values of a, the evolutes are mostly limited to particular bands. When a \approx 0, the evolute of each vertex is limited to a certain concentric curve. Thus, the former lie at the high end of the entropy spectrum and the latter at the low end. A proxy for the entropy distribution of the attractor can be obtained by computing the coefficient of variation c, i.e. the ratio of the standard deviation to the mean of the distances of the evolutes of a particular vertex from the center of the map:

c=\dfrac{\sigma_d}{\mu_d}, where \sigma_d is the standard deviation and \mu_d the mean of the distances from the center.
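A minimal sketch of this entropy proxy (assuming NumPy; the function name `entropy_proxy` is our own):

```python
import numpy as np

def entropy_proxy(orbit, center=(0.0, 0.0)):
    """Coefficient of variation c = sigma_d / mu_d of the distances of
    an orbit's points from the center of the map."""
    d = np.hypot(orbit[:, 0] - center[0], orbit[:, 1] - center[1])
    return d.std() / d.mean()
```

For an evolute confined to a concentric curve the distances barely vary and c \approx 0, while a chaotic orbit wandering over the attractor gives a large c.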

We plot c for each vertex (at the 60 angles from 0..2\pi) for the maps with r=0.2 (Figure 1) and r=0.45 (Figure 2) in Figures 3 and 4 respectively. The mean c is shown as \mu for each plot.

Figure 3

Figure 4

We note that \mu for a=-1; 1; \sqrt{2} is significantly (an order of magnitude) greater than the \mu for the other a. Further, the radial effect can also be seen affecting the entropy of a map. While \mu remains roughly the same or decreases for the high-entropy triangular, hexagonal and octagonal a, for the remaining polygonal a the entropy rises at r=0.45 relative to r=0.2. In the pentagonal case the rise is mostly across the board, while we see specific peaks in the decagonal and heptagonal cases.

We next examine the radial effect and entropy more systematically for a fixed value of a by choosing the hendecagonal value a=2\cos\left(\tfrac{2\pi}{11/3}\right). The map is shown in Figure 5 and the entropy proxy c in Figure 6.

Figure 5

Figure 6

Here we see two disconnected effects of the radius. First, at certain values the inner hendecagon is lost (e.g. r=0.1) or becomes smoothened out (e.g. r=0.5; 0.6). Second, the entropy of the orbits of certain vertices rises dramatically for some values (e.g. r=0.5..0.8). The radial effect, whether on the entropy or on the expression of the polygonal inner zone, is not the same across different a values. However, more generally, the lower the number of polygon sides, the stronger the polygonal expression across r.

Finally, we touch upon a general philosophical point that can be realized from such chaotic systems. While it is not specific to this generalized Lozi attractor, we take this opportunity to articulate it because we have presented the entropy concept. Most people agree that attractors with neither too much nor too little entropy are aesthetically most pleasing. This also has a counterpart in biology. Selection tends to prefer systems with an optimal entropy. Too much entropy in a structure (say a protein) and it is too disordered to be useful for most functions. Too little entropy and it is too rigid to be useful for much. Moreover, from an evolvability viewpoint, too rigid a structure offers too little option for exploring multiple functions in biochemical function space. Too much disorder again means that it explores too much space to perform any function well enough to be selected. Hence, structures with some entropy optimum tend to be selected rather than those with minimum or maximum entropy.

Selection can be conceived as maximizing a certain function, say f(x) for simplicity, in a given entity under selection. This f(x) will then be the fitness function. We can see from the above that f(x) cannot directly or inversely track mean entropy, because that would not maximize fitness, which lies at some optimal entropy. It hence has to track something else, which would depend on the optimal band of entropy selected by the given constraints. For example, one field of constraints could select for an optimal band of mean c, like \mu \in [0.03, 0.1]. Such a field will select the a corresponding to the pentagon, heptagon, nonagon and decagon while avoiding the triangle, hexagon and octagon for too high an entropy and the square and hendecagon for too low an entropy (Figures 1, 3). This constraint field will also select for other values (e.g. Figures 7 and 8) whose \mu lies in this interval (Panels 1, 2, 6).

Thus, f(x) will be a function with local peaks that is very different from the underlying reality of a continuous entropy distribution from low to high. In other words, selection translates the underlying reality into a sensed structure very different from it. The philosophical corollary is that a sensed structure will be different from, and is unlikely to reflect, the underlying reality.

Figure 7

Figure 8
