


The spring bounces back: Introducing the Strain Elevation Tension Spring embedding algorithm for network representation

Jonathan Bourne

July 2020

Abstract

This paper introduces the Strain Elevation Tension Spring embedding (SETSe) algorithm, a graph embedding method that uses a physics model to create node and edge embeddings in undirected attribute networks. Using a low-dimensional representation, SETSe is able to differentiate between graphs that are designed to appear identical using standard network metrics such as number of nodes, number of edges and assortativity. The embeddings generated position the nodes such that sub-classes, hidden during the embedding process, are linearly separable, due to the way they connect to the rest of the network. SETSe outperforms five other common graph embedding methods on both graph differentiation and sub-class identification. The technique is applied to social network data, showing its advantages over assortativity as well as SETSe's ability to quantify network structure and predict node type. The algorithm has a convergence complexity of around O(n²), and the iteration speed is linear (O(n)), as is memory complexity. Overall, SETSe is a fast, flexible framework for a variety of network and graph tasks, providing analytical insight and simple visualisation for complex systems.

1 Introduction

With the rise of social media and e-commerce, graphs and complex networks have become a common concept in society. Their ubiquity and the already digitised nature of social networks have led to a great deal of research. Although there is a range of algorithms that can perform supervised learning directly on graphs [5, 16, 30, 29] (see Wu et al. [36] for a recent survey on the subject), a more common approach is to represent the nodes of the network in a latent vector space that traditional supervised learning techniques can then use. These graph embeddings create a vector representation of the graphs, preserving valuable network properties such as distance on the graph, community structure and node class [14, 21, 24, 28]. These algorithms tend to find embeddings by minimising the distance between similar nodes within the structure of the network.


The literature review by Goyal and Ferrara [13] provides a survey of the most significant of these algorithms. With recent improvements in neural networks, there has been substantial growth in research on graph embedding algorithms, with over 50 new neural-network-based embedding algorithms published between 2015 and 2020 [10].

Physics models can also be used to embed graphs in vector space; however, these are typically used only for drawing graphs. While other graph drawing techniques exist [11, 18, 19], force-directed physics models are some of the most popular [9, 12, 15]. These algorithms originated in the 1960s [33], and use simple physics to find an arrangement of nodes that provides an aesthetically pleasing plot of a graph or network. These algorithms are sometimes called 'spring embedders' [17] due to using springs or spring-like methods to place nodes, and typically attempt to optimise an ideal distance between nodes, minimising the energy of the system. Although popular at the end of the 20th century, spring embedders became less common in research as machine learning became more popular.

The importance of drawing graphs is discussed in several papers [6, 20, 23, 27]. These researchers demonstrate that graphs that are structurally very different can appear identical until visualised. A popular statistical equivalent is Anscombe's quartet [2], a series of four figures showing very different data. However, in terms of the mean, variance, correlation, linear regression and R², all four figures in the quartet appear identical. It highlights the importance of visualising data and the weaknesses of some commonly used statistical tools.

The Strain Elevation Tension Spring embedding (SETSe) algorithm takes its name from the embeddings it produces. SETSe takes the node attributes of a graph, representing them as a force. The edges are represented by springs whose stiffness is dependent on the edge weight. The algorithm finds the position of each node on a manifold such that the internal forces created by the nodes are balanced by the resistive forces of the springs and the network is in equilibrium. The SETSe algorithm acts as a hybrid between the advanced techniques of the machine learning graph embedders used for analysis and the intuitive simplicity of the spring embedders used for graph drawing.

This paper demonstrates that in a world of sophisticated machine learning, there is still a role for simple, intuitive embedding methods. It shows that SETSe can find meaningful embeddings for the Peel's quintet [23] series of graphs, as well as the relations in Facebook data [32]. It finds these embeddings efficiently, in linear iteration time and space complexity. An R package has been created which provides all the functionality necessary to run SETSe analysis/embeddings (available from https://github.com/JonnoB/rSETSe).

The paper asks whether SETSe can distinguish between graphs that are identical under traditional network metrics, and whether SETSe can be used to classify individual nodes. To gauge how well SETSe performs these tasks, it is compared against several popular graph embedding algorithms: node2vec [14], SDNE [35], LLE [28], Laplacian Eigenmaps [3] and HOPE [21].


2 Method

This section begins by providing a simple example of the SETSe algorithm. It then describes and defines the physics model that underpins the embedding method. The algorithmic implementation and practical issues related to convergence are also briefly described. The datasets used in this paper are then introduced. Finally, the analyses performed are described.

2.1 Introduction to SETSe: a simple example

The SETSe algorithm takes a network G containing the set of nodes V and the set of edges E. It converts the n attributes or variables of each node into orthogonal forces, where each attribute occupies a single dimension. In each dimension, the total sum (across the network) of the forces in that dimension is 0 (i.e. the network is balanced in all dimensions). Each edge is converted to a spring whose stiffness k is taken from an attribute of the edge, typically the edge weight. If no edge attribute is to be used, all the edges in the network take k as an arbitrary constant. The algorithm then positions each node on an (n+1)-dimensional manifold such that no node experiences a net force. Like other spring embedders [12, 15, 26], SETSe is subject to the n-body problem [1, 31] and must be solved iteratively.

The functioning of SETSe is best described using an example. Consider the simple network shown in figure 1. The network is planar, and so can be drawn in two dimensions (x and y) with no edges crossing. The nodes in the network have a single attribute/variable that acts perpendicular to the plane; as such, the nodes in the network act as beads whose movement is restricted to the z-axis and are fixed in x and y. The nodes in the network have an identical mass m, and the edges between the nodes have a common distance d, which is the length of the spring at rest. Node A exerts a force of 2, node B exerts no force on the network, while nodes C and D exert a force of −1 each, resulting in a net force of 0. The edges of the network are springs that, when stretched, act according to Hooke's law F = ΔHk, where ΔH is the extension of the edge and k the spring stiffness, such that 0 < k ≤ ∞.

Although the vertical forces (defined by the node attribute) across the network sum to 0, the individual nodes are not in equilibrium and so begin to move in the direction of their respective forces. The vertical distance between node pairs resulting from this movement extends the springs, creating a resistive force. The network will find a three-dimensional equilibrium when the net force acting on each node is 0. This occurs when the elevation of the nodes in the system is such that, for each node, the sum of the vertical tension in all edges connected to that node is equal and opposite to the force produced by the node itself. As an example, if k = 1000 and d = 1, the equilibrium positions of the nodes are 0.1450, 0.0185, −0.0818 and −0.0818 for the nodes A to D, respectively. The interested reader can confirm that the net forces on each node are 0, using Pythagoras' theorem and Hooke's law.

A crucial point in this example is that the nodes are beads. As such, the xy positions are fixed and only the vertical component of the spring tension has an impact on the nodes; all horizontal forces can be disregarded. Ignoring the horizontal force means that the xy position of the nodes can be ignored, reducing the initial layout of the network to a zero-dimensional point in space, out of which a one-dimensional node elevation appears. The horizontal distance between nodes is reduced to a crucial but abstract mapping value.

[Figure 1: a planar network of four nodes (A = 2, B = 0, C = −1, D = −1) drawn in the x–y plane.]

Figure 1: A network of four nodes and three edges. The node attributes are considered forces that act in the z-direction. The forces acting on the network balance, but the forces acting on the nodes only balance when the nodes are in the appropriate position on the z-axis; the network is shown in the x–y plane.

SETSe space is a non-Euclidean metric space of n+1 dimensions, where n is the number of attributes the network has. The (n+1)th dimension is the graph space, representing the graph adjacency matrix. The distance between connected nodes in the graph space is d_{i,j}. Dimensions 1 to n are Euclidean, while the graph dimension is not. This concept is visualised in figure 2, which shows nodes embedded in pairwise two-dimensional space and pairwise three-dimensional space. The graph space acts as the minimum distance between the nodes. Figure 2 shows that the nodes occupy parallel Euclidean hyper-planes separated by the graph space. As SETSe space is locally Euclidean, it is an (n+1)-dimensional manifold. As an example of a network that is pairwise Euclidean but not Euclidean overall, consider a maximally connected network of four nodes. There is no arrangement of the nodes on a plane where the distance between all nodes is equal, although pairwise all nodes can have the same distance.

[Figure 2: panels 'SETSe with one variable in two-dimensional space' and 'SETSe with two variables in three-dimensional space', showing the node forces F1 and F2 and the graph-space distance d.]

Figure 2: Two nodes in a network in two-dimensional and three-dimensional SETSe systems. The minimum distance d between the nodes is maintained by the graph space dimension.

2.2 Creating the physics model

The calculation of the solution of a single variable graph is described below. The extension for higher dimensions is described in the subsequent paragraph. The net force acting on node i can be written as F_{net,i} = F_i − F_{vten,i}, where F_i is the force produced by the node and F_{vten,i} is the vertical component of the net tension acting on the node from the springs. The net force on node i is shown again in equation 1, where F_{ten,i,j} is the total tension in edge i,j. The angle θ_{i,j} is the angle of the force between nodes i and j. The tension in an edge is given by Hooke's law as F_{ten,i,j} = k_{i,j}(H_{i,j} − d_{i,j}), where H_{i,j} is the length of the extended spring and d_{i,j} is the graph distance between the nodes. The length of the extended spring H_{i,j} can be found as it is the hypotenuse of the distance triangle between nodes i and j, such that H_{i,j} = √(Δz_{i,j}² + d_{i,j}²), where Δz_{i,j} is the elevation difference between nodes i and j, Δz_{i,j} = z_j − z_i. As cos θ = Δz/H, the equation for net tension can be rearranged into an alternative expression of edge tension, which is shown in equation 3. The strain component of SETSe is simple mechanical strain and is shown in equation 4. Strain and tension are perfectly correlated in the special case that k is constant for all edges in the network.

F_{net,i} = F_i − Σ_j F_{ten,i,j} cos θ_{i,j}    (1)

F_{net,i} = F_i − Σ_j k_{i,j}(H_{i,j} − d_{i,j}) Δz_{i,j} / H_{i,j}    (2)

F_{net,i} = F_i − Σ_j k_{i,j} Δz_{i,j} (1 − d_{i,j}/H_{i,j})    (3)

ε_{i,j} = (H_{i,j} − d_{i,j}) / d_{i,j}    (4)
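As a concrete illustration of equations 2 to 4, the short sketch below computes the tension, vertical tension and strain of a single edge; the elevations, k and d used here are illustrative values rather than anything taken from the paper's examples.

```r
# Tension (Hooke's law), vertical tension (eq. 2-3) and strain (eq. 4) of one edge i-j.
# z_i, z_j, k and d are illustrative values.
edge_embedding <- function(z_i, z_j, k, d) {
  dz     <- z_j - z_i          # elevation difference across the edge
  H      <- sqrt(dz^2 + d^2)   # length of the extended spring
  F_ten  <- k * (H - d)        # total tension in the edge
  F_vten <- F_ten * dz / H     # vertical component, since cos(theta) = dz / H
  strain <- (H - d) / d        # mechanical strain
  c(tension = F_ten, vertical_tension = F_vten, strain = strain)
}

edge_embedding(z_i = 0.10, z_j = -0.05, k = 1000, d = 1)
# With constant k, tension = k * d * strain, so tension and strain are perfectly correlated.
```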

The extension of SETSe from a graph with a single attribute to a graph with n attributes is straightforward. The hypotenuse vector is H_{i,j} = z_i − z_j + d, which is a vector of n+1 elements, where the first n elements are the differences in position between nodes i and j in the n dimensions, and the (n+1)th element is the graph distance d. The scalar length of H_{i,j} is the Euclidean distance between the nodes in (n+1)-dimensional space, i.e. H_{i,j} = √(Σ_{q=1}^{n} (z_{i,q} − z_{j,q})² + d_{i,j}²). To find the angle between the hypotenuse and the distance between the two nodes in dimension q, the cosine similarity is used, as shown in equation 5. As all entries of Δz_{q,i,j} are 0 apart from the qth entry, the cosine similarity simplifies to equation 6, which is the 'vertical' distance between the nodes in dimension q over the scalar length of H_{i,j}. It is then easy to see that equation 7 is the multidimensional equivalent of equation 3.

cos θ_{q,i,j} = (Δz_{q,i,j} · H_{i,j}) / (‖Δz_{q,i,j}‖ ‖H_{i,j}‖)    (5)

cos θ_{q,i,j} = Δz_{q,i,j} / H_{i,j}    (6)

F_{net,q,i} = F_{q,i} − Σ_j k_{i,j} Δz_{q,i,j} (1 − d_{i,j}/H_{i,j})    (7)
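To make the multidimensional case concrete, the sketch below evaluates equations 6 and 7 for a single pair of connected nodes carrying n = 2 attributes; the positions, k and d are illustrative.

```r
# Per-dimension spring term of eq. 7 for one edge i-j with n = 2 node attributes.
# Positions, stiffness and graph distance are illustrative values.
z_i <- c(0.20, -0.10)   # position of node i in the two attribute dimensions
z_j <- c(0.05,  0.15)   # position of node j
k   <- 1000             # spring stiffness of the edge
d   <- 1                # graph-space distance between the nodes

H <- sqrt(sum((z_i - z_j)^2) + d^2)   # scalar length of the hypotenuse vector

dz          <- z_j - z_i              # elevation difference in each dimension
cos_theta_q <- dz / H                 # eq. 6, one value per attribute dimension
spring_term <- k * dz * (1 - d / H)   # eq. 7, spring contribution per dimension
spring_term
```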

The distance d between the nodes is a key parameter when it comes to finding the final elevation embedding. If the distance is not a constant, it must be meaningful for the type of network being analysed. As an example, the distance in metres between two connected points on an electrical circuit is unlikely to be meaningful; however, a variable distance may be appropriate if traffic were being analysed. Understanding meaningful distance metrics is not explored in this paper, and the distance between nodes is considered a constant across all edges.

SETSe can be used on continuous and categorical variables. In both cases, the forces must be balanced; this is when the net value of the sum of the forces across all nodes equals 0. For continuous variables, the raw attribute force of node i (F_i) is normalised to create the balanced force F_{i,bal} by subtracting the mean from all values, as shown in equation 8, where |V| is the number of nodes in the network.

F_{i,bal} = F_i − (1/|V|) Σ_{i=1}^{|V|} F_i    (8)

For categorical node attributes, each level is treated as a binary attribute, making as many new dimensions as there are levels; this is similar to how such variables are treated in linear and logistic regression. The total force of level dimension γ is the fraction that level makes up of the total number of nodes, as shown in equation 9. The force produced by the nodes in each level dimension can then be treated as a continuous variable, as described previously in equation 8.

F_γ = |V_γ| / |V|    (9)
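The sketch below shows the two balancing steps on illustrative data: mean-centring a continuous attribute (equation 8) and expanding a categorical attribute into one dimension per level before mean-centring each (equation 9).

```r
# Balancing node forces so that each dimension sums to zero. Attribute values are illustrative.

# Continuous attribute (eq. 8): subtract the network mean from every node.
raw_force <- c(3, 1, 4, 0)
balanced  <- raw_force - mean(raw_force)
sum(balanced)   # exactly 0

# Categorical attribute (eq. 9): one binary dimension per level, each level's mean
# force is |V_level| / |V|, which is then subtracted as for a continuous attribute.
category   <- c("x", "x", "y", "z")
levels_mat <- sapply(unique(category), function(l) as.numeric(category == l))
balanced_levels <- sweep(levels_mat, 2, colMeans(levels_mat))
colSums(balanced_levels)   # every level dimension sums to 0
```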

With the force and distance relationship between pairs of nodes defined, it is now possible to look at the method used to find the equilibrium state of the network and its corresponding strain, elevation and tension embeddings. The difficulty in solving such a problem is that the relationship between the final elevation of a node and the force it experiences is non-linear. In addition, each node is affected by all other nodes and the spring stiffness k in the network. This interaction creates a situation that is similar to the n-body problem of astrophysics. In this case, although the nodes act as bodies, instead of exerting a force on all other nodes, as celestial bodies do, they only exert a force on those nodes with which they have a direct connection.

The equilibrium solution can be found by treating the problem as a dynamic system and iterating through discrete time steps until the system reaches the equilibrium point. By representing the network as a dynamic system, the acceleration and velocity of each node must be calculated. Using Newton's second law of dynamics, F = ma, where F is the force acting on the node and a is the acceleration, the nodes need to be assigned an arbitrary constant mass m (note that mass does not affect the final embeddings). The net force acting on each node at time step t is then given by equation 10, where F is the force generated by the node according to the node attribute. The system is assumed to be a viscous laminar fluid, and so the damping is simply the product of the velocity v and the coefficient of drag c. Friction is used to cause the system to slowly lose energy and converge. While it does not affect the value at convergence, it needs to be correctly parametrised or the system will not converge. This is discussed in the Appendix.

F_{net,i} = F_{vten,i} + F_i − c v_i    (10)

Knowing the net force acting on the node allows calculation of the equations of motion at each time step. Velocity can be calculated as v_t = v_{t−1} + (F_{net,t}/m) Δt, where v_t is the velocity at time t. Distance is the elevation embedding and is calculated by z_t = v_{t−1} Δt + (1/2) a_t Δt² + z_{t−1}.
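A minimal per-node version of these update rules is sketched below; the mass, drag, time step and starting force are illustrative, and the ordering of the damping term follows the text loosely rather than reproducing algorithm 1 line by line.

```r
# One discrete time step for a single node, using the equations of motion above.
# m, c, dt and the static force are illustrative values.
step_node <- function(z, v, f_static, m = 1, c = 0.5, dt = 0.1) {
  f_net <- f_static - c * v                  # damped net force (drag = c * v)
  a     <- f_net / m                         # Newton's second law, a_t
  z_new <- v * dt + 0.5 * a * dt^2 + z       # z_t = v_{t-1} dt + 0.5 a_t dt^2 + z_{t-1}
  v_new <- v + a * dt                        # v_t = v_{t-1} + a_t dt
  list(z = z_new, v = v_new, a = a)
}

step_node(z = 0, v = 0, f_static = 1)
```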

The SETSe algorithm is shown in algorithm 1. The equations described in equations 1 to 10 are converted to vectors or matrices, allowing all nodes and edges in the network to be updated simultaneously. The algorithm takes a graph G, which has been processed so that each edge has distance d_{i,j} and spring constant k_{i,j}. The dynamics of the network are all initialised at 0; only the forces exerted by the nodes are non-zero values. In algorithm 1, vectors are lower-case letters in bold, while matrices are bold capitals. The time in the system is represented by t and the time step per iteration is Δt. The elevation of each node in the system is represented by the matrix Z. The elevation difference across each edge is ΔZ, and is obtained by subtracting the transpose of the elevation matrix from the original elevation matrix. The hypotenuse, or total length, of each edge is represented by the matrix H and is found using the elevation difference ΔZ as well as the horizontal distance d. The vertical component of the tension in each edge is represented by F_vten and is the element-wise product of the edge spring stiffness matrix K with the extension of the edge; this matrix is then multiplied, again element-wise, by ΔZ/H, the cosine of the edge angle. Line 8 shows that the vertical component of the force f_vten is updated by summing each row of the F_vten matrix using a column vector of ones of length |V|. The elevation of each node is updated on line 9 of the algorithm. The vector z is then reshaped, using a function, into matrix form. Line 11 updates the velocity of each node in the network. Line 12 updates the static force on each node, f_static. Static force is the force exerted by the node minus the sum of the tensions exerted by all the connected edges. The next update is the system friction or drag, f_d. The system net force f_net is then updated. Finally, the acceleration a to be used in the next iteration is calculated.

One of the advantages that SETSe has over traditional force-expansion algorithms [15, 12] is that the distance from the optimal solution is known. In the other algorithms, the loss function is to reduce the total energy of the system to some unknown minimum. However, SETSe has a loss function more similar to the error metrics used in statistics or machine learning. The ideal static force of the system is 0 and the initial static force is Σ‖F_i‖, which is thus bounded in a finite space. This is an important consideration when it comes to efficient convergence and auto-convergence, and is discussed further in the Appendix. Although the stop condition of the algorithm is that the static force in the network is 0, in practice the system is said to have converged if f_static ≈ 0. In this paper, the tolerance for convergence will be f_static ≤ Σ‖F_i‖/10³, that is, when the static force is reduced to 1/1000th of the absolute sum of the forces exerted by the nodes.


Algorithm 1: The SETSe algorithm

Result: The strain, elevation and tension embeddings of the original graph

 1  t = 0
 2  f_static = f
 3  while Σ‖f_static‖ ≠ 0 do
 4      t = t + Δt
 5      ΔZ = Z − Zᵀ
 6      H = √(ΔZ² + d²)
 7      F_vten = K ∘ (H − d) ∘ (ΔZ/H)
 8      f_vten = F_vten J_|V|
 9      z = vΔt + (1/2)aΔt² + z
10      Z = f(z)
11      v = v + aΔt
12      f_static = f − f_vten
13      f_d = c_d v
14      f_net = f_static − f_d
15      a = f_net / m
16  end
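The dense-matrix sketch below follows algorithm 1 loosely for a single-attribute graph: it builds the ΔZ, H and F_vten matrices each step and iterates until the static force falls below the tolerance used in this paper. The example path graph, forces, k, d, mass, drag and time step are all illustrative; the reference implementation is the rSETSe package.

```r
# A dense-matrix sketch of the SETSe iteration (algorithm 1) for one attribute.
# Every numeric value below is illustrative.
n <- 4
A <- matrix(0, n, n)                         # adjacency matrix of a small path graph
A[cbind(1:3, 2:4)] <- 1
A <- A + t(A)

k <- 1000; d <- 1
K <- k * A                                   # spring stiffness on the edges
D <- d * A                                   # graph-space distance on the edges
f <- c(1, 0.5, -0.5, -1)                     # balanced node forces (sum to zero)
m <- 1; drag <- 1; dt <- 0.01
tol <- sum(abs(f)) / 1000                    # convergence tolerance from the paper

z <- v <- a <- rep(0, n)                     # elevation, velocity, acceleration
for (iter in 1:50000) {
  dZ     <- outer(z, z, "-")                 # dZ[i, j] = z_i - z_j
  H      <- sqrt(dZ^2 + D^2)                 # extended spring lengths
  F_vten <- ifelse(A == 1, K * (H - D) * dZ / H, 0)   # vertical tension per edge
  f_vten <- rowSums(F_vten)                  # vertical tension acting on each node
  f_static <- f - f_vten                     # distance from equilibrium
  if (sum(abs(f_static)) <= tol) break       # stop condition
  f_net <- f_static - drag * v               # damped net force (eq. 10)
  z     <- v * dt + 0.5 * a * dt^2 + z       # elevation update
  v     <- v + a * dt                        # velocity update
  a     <- f_net / m                         # acceleration for the next step
}
round(z, 4)                                  # converged elevation embedding
```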

2.2.1 Practical convergence issues

The implementation of the algorithm in the R package rSETSe has two modes: sparse and semi-sparse. Semi-sparse mode is used on smaller graphs, while sparse mode is for larger graphs (starting at 5,000–10,000 edges); the complexity of sparse mode is linear in the number of edges, O(|E|). The mode has no impact on the final embeddings. Time and space complexity are discussed further in section 3.3.

Although there are several parameters of the physical model that must be initialised before running the algorithm, only two have any real bearing on the final embeddings. Drag, time step and mass affect the rate of convergence, but not the final outcome (when F_static = 0). The distance between the nodes does affect the outcome, as the final elevation will change. However, when F_static = 0, the angle between the nodes is unaffected by the length of the edges and so elevation can be normalised. Only the force variable and the spring stiffness have an impact on the final converged values. The force variable is not controlled by the user, leaving only k. As such, the value of k must be constant for all networks under evaluation, or, if k is a function, then k = f(x) must be consistently parametrised.

All networks in this paper are embedded using bi-connected SETSe, a more advanced method than algorithm 1. This method breaks the network into bi-connected sub-graphs then calls auto-SETSe, which is an algorithm that chooses the coefficient of drag using a binary search. Both bi-connected SETSe and auto-SETSe are discussed in detail in the Appendix.

2.3 Data

Two datasets are used in this paper to illustrate how SETSe works and how it can be used to gain insight into network structure and behaviour. Both of the datasets have binary edges, meaning that k is constant across all edges in all networks. This reduces the embeddings from three to two, as strain and tension have a perfect linear relationship.

2.3.1 Peel’s quintet

Peel's quintet [23] is an example of the graph equivalent of Anscombe's quartet [2]. It is a collection of five binary-attribute graphs that have an identical number of nodes, edges, connections between and within classes, and assortativity. The networks are very different when visualised (see figure 3). Peel, Delvenne, and Lambiotte [23] achieve this situation by dividing the binary classes into two sub-classes, which have different mixing patterns. They then develop an alternative metric and demonstrate that it can distinguish between the quintet and other network structures. Peel's quintet is essentially a hierarchical stochastic block model. Each network has two blocks containing two sub-classes. Each sub-class contains 10 nodes, with a total of 40 nodes per network. Each network has 120 edges, with 40 edges connecting the classes together and 40 edges internally in each class. Because the number of edges connecting within and between the sub-classes is distinct, the overall network structure is itself distinct, even though in terms of traditional network metrics the networks are identical. Peel's quintet will be used as an example of how SETSe is affected by graph topology and the network attributes, in this case the two known communities and the hidden communities.

Figure 3 shows Peel's quintet. The classes are shown as either turquoise or red, while the two hidden classes are triangles or circles. While type A is simply a random network, the other networks show varying types of structure produced by the inter-hidden-group connection patterns. Table 1 shows the block models the networks are based on.

2.3.2 Facebook data

The Facebook 100 dataset by Traud, Mucha, and Porter [32] is a snapshot of the entire Facebook network on a single day in September 2005. At this time, Facebook was open only to 100 US universities. There were very few links between universities then, so each one can be considered a stand-alone unit. Such an assumption would not be possible now. The data allow insight into the structure of relationships in attributed social networks. The networks are anonymised and the universities are referred to by a reference. Caltech36 is the smallest university network, with only 769 nodes and 16,656 edges, while Penn94 has the most nodes with 41,554 and Texas84 has the most edges with 1,590,655.

[Figure: 'The Peel's quintet of assortativity-identical graphs', panels Type A to Type E.]

Figure 3: These networks were introduced by Peel, Delvenne, and Lambiotte [23]. They are all networks that have identical numbers of nodes, edges, group size, within-group connections and between-class connections. However, the networks are clearly structurally distinct.


Table 1: Block model for Peel’s quintet

(a) Type A

     a1  a2  b1  b2
a1   10  20  20  20
a2    -  10  20  20
b1    -   -  10  20
b2    -   -   -  10

(b) Type B

     a1  a2  b1  b2
a1   38   2  20  20
a2    -   0  20  20
b1    -   -   0   2
b2    -   -   -  38

(c) Type C

     a1  a2  b1  b2
a1   38   2   0   0
a2    -   0  80   0
b1    -   -  10  20
b2    -   -   -  10

(d) Type D

     a1  a2  b1  b2
a1   10  20   0   0
a2    -  10  80   0
b1    -   -  10  20
b2    -   -   -  10

(e) Type E

     a1  a2  b1  b2
a1   38   2   0   0
a2    -   0  80   0
b1    -   -   0   2
b2    -   -   -  38
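As an illustration of how a network with the exact edge counts of one of these block models could be generated, the sketch below samples a Type A network by drawing, for each pair of sub-classes, the required number of edges uniformly from the candidate node pairs. The construction is a plausible one for a fixed-count block model, not the code used by Peel, Delvenne, and Lambiotte.

```r
# Sample a 40-node network with the exact Type A edge counts from Table 1
# (10 nodes per sub-class a1, a2, b1, b2). Illustrative construction only.
library(igraph)
set.seed(42)

classes   <- c("a1", "a2", "b1", "b2")
sub_class <- rep(classes, each = 10)

# Required edge counts between (and within) sub-classes, Table 1(a).
counts <- matrix(c(10, 20, 20, 20,
                    0, 10, 20, 20,
                    0,  0, 10, 20,
                    0,  0,  0, 10),
                 nrow = 4, byrow = TRUE, dimnames = list(classes, classes))

pairs      <- t(combn(seq_along(sub_class), 2))              # all candidate node pairs
pair_block <- apply(pairs, 1, function(p) paste(sort(sub_class[p]), collapse = "-"))

edge_list <- NULL
for (r in 1:4) for (s in r:4) {
  m <- counts[r, s]
  if (m == 0) next
  cand <- pairs[pair_block == paste(classes[r], classes[s], sep = "-"), , drop = FALSE]
  edge_list <- rbind(edge_list, cand[sample(nrow(cand), m), , drop = FALSE])
}

g <- graph_from_edgelist(edge_list, directed = FALSE)
c(nodes = vcount(g), edges = ecount(g))   # node and edge totals implied by the block model
```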

The original study on this dataset [32] found that there were assortativity patterns within the variables that were generally common across all universities, such as a tendency to be connected to students who will graduate at the same time or who lived in the same university accommodation. The networks have seven attributes, all of which are categorical and have been anonymised. These attributes are: student type, gender, major, minor, dorm, year of graduation and high school.

2.4 Experimental analysis

The experiments are broken across the two datasets. The first set of experiments will focus on Peel's quintet, distinguishing network types, then distinguishing between node types. The second set of experiments will look at the Facebook data, first comparing assortativity with SETSe, which is similar to distinguishing between networks, then distinguishing between node types on the Facebook dataset. The two SETSe dimensions will be elevation and node tension. Node tension is the mean absolute tension in the edges connected to the node, v_{ten,i} = (Σ_j F_{ten,i,j}) / n, where for this expression only, n is the number of edges of node v_i.

This paper uses four accuracy metrics: accuracy (eq 11), balanced accuracy (eq 12), F1 score (eq 13) and Cohen's kappa (eq 14), where P, N, TP, TN, FP and FN are the number of positives, negatives, true positives, true negatives, false positives and false negatives, respectively. TPR is the true positive rate, TPR = TP/P, and TNR is the true negative rate, TNR = TN/N. p_o is the observed probability of two events occurring, e.g. the predicted class is one and the true class is one. p_e is the expected probability of two events occurring, given their overall prevalence.

ACC = (TP + TN) / (P + N)    (11)

BAL ACC = (TPR + TNR) / 2    (12)

F1 = 2TP / (2TP + FP + FN)    (13)

κ = (p_o − p_e) / (1 − p_e)    (14)
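The four metrics can be computed directly from the entries of a binary confusion matrix; the helper below is a small sketch with illustrative labels.

```r
# Accuracy, balanced accuracy, F1 and Cohen's kappa for a binary prediction (eq. 11-14).
# The truth and predicted vectors below are illustrative.
classification_metrics <- function(truth, predicted, positive = 1) {
  TP <- sum(predicted == positive & truth == positive)
  TN <- sum(predicted != positive & truth != positive)
  FP <- sum(predicted == positive & truth != positive)
  FN <- sum(predicted != positive & truth == positive)
  P  <- TP + FN
  N  <- TN + FP
  acc     <- (TP + TN) / (P + N)                          # eq. 11
  bal_acc <- (TP / P + TN / N) / 2                        # eq. 12
  f1      <- 2 * TP / (2 * TP + FP + FN)                  # eq. 13
  p_o     <- acc                                          # observed agreement
  p_e     <- (P * (TP + FP) + N * (TN + FN)) / (P + N)^2  # expected agreement by prevalence
  c(accuracy = acc, balanced_accuracy = bal_acc, f1 = f1,
    kappa = (p_o - p_e) / (1 - p_e))                      # eq. 14
}

classification_metrics(truth     = c(1, 1, 1, 0, 0, 0, 0, 1),
                       predicted = c(1, 1, 0, 0, 0, 1, 0, 1))
```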

2.4.1 Distinguishing between networks

The first analysis of the performance of SETSe will be to compare it to a selection of other node embedding methods using Peel's quintet [23]. The methods that it will be compared against are node2vec [14], SDNE [35], LLE [28], Laplacian Eigenmaps [3] and HOPE [21]. The methods cover three main areas: graph factorisation [28, 3, 21], random walks [14] and deep learning [35]. A set of 100 networks from each of the five classes of Peel's quintet will be generated and embedded into two dimensions using each of the embedding methods. The embeddings are produced at node level and will be aggregated to network level using the mean. The linear separability of the aggregated embeddings for each class will be compared.

2.4.2 Classifying class and sub-class of Peel’s quintet

This section will test the ability of the embedding methods to separate the hidden classes of Peel's quintet. The embeddings generated in the previous experiment will be used, and a multinomial logistic regression will be created for each network to see the accuracy of separating the nodes either into their known classes or into their hidden classes. The logistic regression will use the two embedding values as the independent variables. In the case of SETSe, these will be the node elevation and the mean node tension. The role of the logistic regression is not so much to be a predictive model but to test the separability of the data; as such, no cross-validation will be necessary. The accuracy measure will be simple accuracy, as the classes are balanced. The performance of SETSe will be compared against all other embedding methods.
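A sketch of the separability test for one embedded graph: fit a multinomial logistic regression on the two embedding dimensions and report the in-sample accuracy. The data frame is randomly generated for illustration, and nnet::multinom is one possible fitting routine rather than necessarily the one used in the paper.

```r
# Separability check: the two embedding dimensions are the only predictors of the sub-class.
# The node-level data frame below is randomly generated for illustration.
library(nnet)
set.seed(1)

node_df <- data.frame(
  tension   = runif(40),
  elevation = runif(40),
  sub_class = factor(rep(c("A1", "A2", "B1", "B2"), each = 10))
)

fit  <- multinom(sub_class ~ tension + elevation, data = node_df, trace = FALSE)
pred <- predict(fit, node_df)
mean(pred == node_df$sub_class)   # in-sample accuracy; no cross-validation is needed
```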

2.5 Relationship with assortativity

The Facebook data will be embedded for all of the 100 universities using the graduation year of the student. Although technically categorical, graduation year can be treated as continuous and will be treated as such for speed of calculation. Missing data will not exert a force. The embeddings will be aggregated to network level using the mean elevation and mean node tension. The resulting two-dimensional data will be compared to the assortativity scores of the data to see if there is a relationship between the SETSe embeddings and the network assortativity.
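A sketch of the network-level summary for a single university: graduation-year assortativity computed with igraph alongside the mean elevation and mean node tension of the SETSe embedding. The graph and the embedding columns are random placeholders for the real data.

```r
# Network-level comparison of assortativity with the aggregated SETSe embedding.
# The graph, year attribute and embeddings are random placeholders.
library(igraph)
set.seed(1)

g <- sample_gnp(200, 0.05)                                 # placeholder friendship graph
V(g)$year <- sample(2005:2009, vcount(g), replace = TRUE)  # placeholder graduation years

node_embed <- data.frame(elevation = rnorm(vcount(g)),     # placeholder SETSe output
                         tension   = abs(rnorm(vcount(g))))

c(assortativity  = assortativity(g, V(g)$year, directed = FALSE),
  mean_elevation = mean(node_embed$elevation),
  mean_tension   = mean(node_embed$tension))
```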


2.5.1 Predicting node class in Facebook data

This experiment tests how well SETSe can separate the classes within large and complex networks. Year of graduation will be the embedding class, and student type will be the hidden sub-class. The two main classes of student are types 1 and 2, which make up 80% and 15% of the dataset, respectively. The meaning of student type is not clear from the original paper, but it appears to be graduate student or alumna. Due to the distribution of student type 2, only 2005 will be used for the hidden sub-class test. Student type is chosen over the other variables, as dorm, major, minor and high school have so many levels that embedding would be impractical. Gender has only two levels; however, it has almost no assortativity, suggesting a complete lack of structure.

Due to the complexity of the data, a k-nearest-neighbour approach will be used. This method will label the node as the majority class of the nearest k nodes in SETSe space. The nearest-neighbour model will be compared against graph adjacency voting. Graph adjacency voting finds the majority class amongst all nodes with which the target node shares an edge. The graph adjacency voting will use the full network, but only student types 1 or 2 will count towards the totals. The metrics used to evaluate performance will be accuracy, balanced accuracy, Cohen's kappa and the F1 score. The hidden classes are highly imbalanced in some of the universities, and so the results need to be interpreted with care. The hidden class model accuracy will also be compared against the naive ratio of type 2 students (the majority class) to all students.
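The two voting schemes can be sketched as below: a majority vote over the k nearest nodes in the two-dimensional SETSe space, and a majority vote over a node's direct graph neighbours. The embedding coordinates, labels, graph and the choice k = 7 are illustrative placeholders.

```r
# k-nearest-neighbour voting in SETSe space versus graph adjacency voting.
# Coordinates, labels and the graph are random placeholders; k = 7 is illustrative.
library(igraph)
set.seed(1)

n            <- 300
embed_xy     <- cbind(elevation = rnorm(n), tension = abs(rnorm(n)))
student_type <- sample(c(1, 2), n, replace = TRUE, prob = c(0.8, 0.2))
g            <- sample_gnp(n, 0.03)

majority <- function(x) as.numeric(names(which.max(table(x))))

knn_vote <- function(i, k = 7) {                 # vote of the k nearest nodes in SETSe space
  d <- sqrt(rowSums((embed_xy - matrix(embed_xy[i, ], n, 2, byrow = TRUE))^2))
  majority(student_type[order(d)[2:(k + 1)]])    # order(d)[1] is the node itself
}

adjacency_vote <- function(i) {                  # vote of a node's direct neighbours
  nb <- as.integer(neighbors(g, i))
  if (length(nb) == 0) return(NA)
  majority(student_type[nb])
}

knn_pred <- sapply(seq_len(n), knn_vote)
adj_pred <- sapply(seq_len(n), adjacency_vote)
c(knn_accuracy       = mean(knn_pred == student_type),
  adjacency_accuracy = mean(adj_pred == student_type, na.rm = TRUE))
```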

2.6 Computational details

Each simulation used a single-core Intel Xeon Gold 2.3 GHz processor with 4 GB of RAM. Code and analysis used R version 4.0 and made extensive use of the igraph [8] and rSETSe (https://github.com/JonnoB/rSETSe) packages. The non-SETSe embeddings were done using the GEM library [13] in Python 3.6.

3 Results

3.1 Peel’s quintet

The first test of the SETSe algorithm uses Peel's quintet of networks. The 100 networks of each class are projected into SETSe space then aggregated using the mean absolute elevation and the mean of the node tension. Figure 4 shows the results of the six algorithms reducing the networks to two-dimensional space. It is clear that the different connection patterns between the nodes result in distinctive tension-elevation patterns within each graph class. This results in the five network types being trivially separable in SETSe space. The other graph embedding algorithms struggle to differentiate the network types; node2vec is the most successful, projecting the graph types into more or less concentric quarter rings in two dimensions. The HOPE algorithm also has some success but, like node2vec, fails to provide clear linear separability. The SDNE algorithm was unable to provide successful embeddings; this may be due to the number of embedding dimensions being so low, or due to the structure of the quintet graphs themselves.

[Figure 4: 'Separating Peel's Quintet in 2 dimensions'; panels HOPE, LapEig, LLE, node2vec, SDNE, SETSe; x-axis: first dimension (tension for SETSe); y-axis: second dimension (elevation for SETSe); colour: type A to E.]

Figure 4: One hundred examples of each of the five network classes of Peel's quintet introduced by Peel, Delvenne, and Lambiotte [23]. The networks are clearly linearly separable in the averaged SETSe space even though all 500 are identical using assortativity.

Clearly, SETSe is successful at separating the graph types. However, it is also interesting to know whether it can assign the nodes to the correct classes within each graph type. Figure 5 shows the SETSe embedding of the nodes in the elevation-tension dimensions for an example graph of each type. The nodes are coloured by the hidden sub-class. As can be seen, there are clear patterns in the node placement. Although the sub-classes of type A cannot be distinguished, types C, D and E appear to be linearly separable in the elevation dimension alone. In contrast, type B produces a roughly symmetrical distribution requiring both elevation and strain for separation. The separability is checked for all 500 networks using SETSe and the five other embedding types. The results are shown in figure 6. The figure shows how each embedding technique separates the classes and sub-classes for each type of graph in Peel's quintet. A multinomial logistic regression with two independent variables, reflecting the two dimensions of the embedding, was used to model the accuracy of each class and sub-class within the graphs.

As can be seen from figure 6, SETSe outperforms the other embedding algorithms in every case; again, node2vec and HOPE come next in terms of performance. When comparing pure linear separability, SETSe greatly outperforms all other embedding techniques.

[Figure 5: 'The position of individual nodes in each graph type of Peel's Quintet'; panels A to E; x-axis: node tension; y-axis: node elevation; colour: sub-class A1, A2, B1, B2.]

Figure 5: Embeddings of individual nodes; the x-axis is node tension and the y-axis is node elevation. This representation reveals the hidden structure of the groups in all network types apart from type A.

With the exception of identifying the sub-class of graph type A, SETSe can linearly separate the classes and sub-classes at least 67% of the time, and it can perfectly linearly separate the sub-classes in four of the ten cases. No other embedding method is close: HOPE can linearly separate all four sub-classes for graph D 32% of the time, but for most cases, linear separation is not possible on either the known binary class or the four hidden sub-classes. The SDNE algorithm was not included in figure 6 due to poor performance. When the other methods are allowed to embed the 20 node graphs in eight dimensions, their performance increases substantially to a level comparable with SETSe. As an additional comparison, three community detection algorithms were also compared: Fast Greedy [7], Walktrap [25] and Louvain [4]. These algorithms were not successful at distinguishing between the classes or sub-classes.

3.2 Analysing Facebook data using SETSe

The Facebook 100 dataset was embedded into SETSe space using the variable graduation year. The results for four different universities are shown in figure 7. For ease of viewing, the x-axis is presented on a log scale. Figure 7 shows three universities with low assortativity (Auburn, Maine and Michigan), meaning that there is a high degree of mixing between years, and one university with a high assortativity (BC), meaning students tend to associate within years. Although

[Figure 6: 'Comparison of graph embedding methods for class identification on Peel's Quintet'; one panel per graph type for class detection and sub-class detection; x-axis: embedding method (HOPE, LapEig, LLE, node2vec, SETSe); y-axis: accuracy.]

Figure 6: Using the knowledge of the groups, SETSe greatly outperforms the other node embedding techniques at separating the classes and sub-classes for each type of Peel's quintet. The upper and lower bounds of the boxes describe the 25th and 75th percentiles. The whiskers are 1.5 times the inter-quartile range.

●●

●● ●

●●

●●●

●●● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●●

● ● ●

●●

● ●

●●

● ●●

●●

●●●

●●

●●●

●●

●●

●●

●●●●

●●

●●

● ●

●●●

● ●

●●

●●

● ●

●●

●●

●●

●●

● ●

●●

● ●●

●●

●● ●

●●

●●

●●

● ●

●●

● ●

●●●

●●●●

●●

●●

●●

●● ●

●●● ●●

●●

●●

● ●●

●●

●●● ●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

●● ●

●●

●●

●●

● ●●

●●●●

●●

● ●

●●

●●

●●●

●●

● ●

●●●

● ●

●●

●●

●●

●●

● ●●

● ●

●● ●●

● ●

●●

●●

●●

●●

● ●

●●

● ●●●

●●

●●

●●

● ●●

●●●

● ●

●●

● ●

●● ●

●●

●●

●●

●●

●●

● ●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

●●●●

●●

●●

●●

● ●●

●●

●●

●●●

● ● ●

●●

●●

●●

●●

●●

●●●

●●

●●

●●

●● ●

●●

●●

● ●

●●

●● ●●

●●

●● ●●

●●

●●

● ●●

● ● ●

●●●

●●

●●

●●

● ●●

●●

●●

●●●●●

●●●

●● ●●

●●

●●

●●

●●

● ●

●● ●

●● ●

●●●

●●

●●

● ●●●

●●

● ●

●●

● ●

●●

● ●

● ●●

●●

●●

●●● ● ● ●

●● ●

●●●

●●

●●

●●

●●

●●

●●

● ●

●● ●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

● ●●●●●

●●

●●

●●

●●

● ●●

●●●●●

●● ●

●●

●●●●

●●

●●

●● ●

●●

●●●

● ●●

●●●

●● ●

●●

●●

●●

●●

●● ●

● ●

●●

●●●

●●

●●

●●

● ●

●●

●●

●●●

●●

●●●

● ●

●●

● ●●

● ●

●●

● ● ●

●●

●●

● ● ●●

●●●

●●●

● ●

●●

● ●

● ●●

●●

●●

●●

● ● ●

●●

●●●

●●

●●

●●● ●●●

●● ●

● ●

●●

●●●

●●

●●

●●

●●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●●●

●●●

●●

●●●

●●

●●

●●● ●

●●

●●

●●

●●

●●●

●●

● ●

●●●

●●●

●●

●●

●●

●●

●●

●●●

●●

● ●● ●

● ●●●

●● ●

●●

●●

●●●

●●

●●

●●

●●●

●●

●●

●●

●●● ●

●●

●●

●● ●

●●

● ●●

●●

●●

● ●●

●●● ●●

●●

● ●

● ●●

●●

●●

●●

●●●

●●●

●●

●●

●● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●●

● ●

●●

●●

●●

●●●

●●

●●

●●

●●

●●

●●

● ●●

●●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●

● ●●

●●

●●

●●

●●●

●●●●

●●

●●

●●

● ●

●●●●

●●●●

●●

●●

●●

●●

●●●

●●●

●●●

●●

● ●

●●

●●●

●●

●●

●●

●●

●●

●●

● ●●●

●●

●●

●● ●●

●●●

●●

● ●

●●

●●

● ●●

● ●

●●●

●●

●●

●●

●●

● ●

● ●●●

● ●

● ●

●●

●●●

●●

●● ●

● ●

●●

●●

●●●

●●●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

●●●

●● ● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●●

●● ●

●●●

●●●

●●

● ●

● ● ●

●●●

●●●

●●●

●●

● ●●●

●●

●●

●●

●●

●●

● ●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●●●

●●

●●

●●●

●●

●●

●●

●●

assort=0.19 abs elevation=0.76 tension=2.7

●●●

●●

●●

●● ●

● ●

●●

●●

●●●●

●● ●

●●

●●

●●

●●●●

●●

●●● ●

●●

●●●

●●

●● ●●

●●

● ●●

●●

● ●

●● ●●

●●●●

●●●

●●

●●

●●

●●

●●

●● ●

●●

● ●

●●

●●● ●

●●●

●●●

●● ●

●●

●●

●●●

●●

●● ●

●●

●●

●●

●●

● ● ● ●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●●

● ●

●●●

●●

●●

●●

●●●

●● ●

●●

●●

●●

●●

●●

●●

●●

● ●

●●●

●●

●●●

●●

●●

●● ●●

●●●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

●●●

●●●

●●

● ●●

●●●● ●●

●● ●

●●

●●

●●

●●

●●

● ●●●

●●

●●

●●●

●●●

●●●

●●

●●

●●

● ●

●●

●●

● ●

●●

●●

●●

● ●●

●●

●●●

●●

●●●●

●●

●●

● ●●

● ●●●

●●

●●●

●●

● ●

●●

●●

● ●●

●●

●●

●● ●

●●

●●

●●

●●

●●

● ● ●●

● ●●

●● ●

● ●

●●

●●

●●

●●

●●

●●●

●●

● ●●

●●

●●

●●

●●

●●

●●

●●●

●●

●●

● ●

●●●

●●

●●

●●● ●

●●●

●●●

●●●

● ●

●●

●●

●●

●●

●●

●●●

●●

● ●

●● ●

●●

● ●

●●

●●●

●●

●● ●

● ●●

●●

●● ●

●●

●●

●●

●●

●●

●●●

●●

●●

●●

●●

●● ●

●●

●●

●●

●●●

●●

● ●●

● ●●

●●

●●

● ●●

●●

●●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●●

●●●

●●

● ● ●

●●

●● ●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

● ●

●●

●●●

●● ●

● ●

●● ●

●●

●●

●●

●●●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

● ●●

●●

●●●

●●

●●●

●●

●●

●●

●●

●●

●●●

●●

●●●

● ●

●●

●●

●●

● ●●●

● ●●

●●

●●

● ●●● ●

●●

●●

●●

●●

●●

●● ●

●●

●● ●

●●

●● ●

●●

●●

●●

● ●

●●

●● ●●●

●●

●● ●

● ●

●●●●

● ●

●●

●●

●●

●●●●

● ●

●●●

●●

●●

●●●

●●

●● ●

●●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

● ●

●●

●●

●●

● ●●

●●

●●

●●

●●

●●

●●

●●●

●●

●●

● ●●

●●

●●

● ●●●

●●

● ●●●

●●

●●

●●

●●

●●

●●●

●●

●●

●●

● ● ●●● ●

●●●●

●●

●●

●●

●●

●●

●●●

●●

●●

● ●

●●

●●●●

●●

●●●

●●

●●

●●

●●

●●

● ●●

●●●●

●●

●●

●●

●●●

●●

●●

●●

●●

● ●●

●●

●●

●●

● ●● ●

●●

●●●

●●

● ●

●●

●●

●●

●●

●●

● ●● ●

●●●

●●

●●

●●●●

●●

●●● ●

● ●

●●

●●

●● ●

●●●

● ●

●●

●●

●●●

●●

●●

●●

●●

●● ●

●●●

●●

●● ●

●● ●●●

● ●

●●

●●

●●

●●

● ●●

● ●

●● ●

●●

●●

●● ●

●● ●●●

●● ●

●●

●●

●●

●●

●●

●●

● ●

●●●

●●

●●●●

●●●

●●

●●●●

●●

●●

● ●●

●● ●

●●

● ●

● ●

●●

●●● ●●

●●●

●●

●●

●●● ●

●●

● ●

●● ●

●●

● ●

●●

●●

●●

● ● ●

●●

●●

●●

● ●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

● ●

● ●

●●

●●

●●

●●

●●

●●

●●

●●●

●●

●●

● ● ●

● ●

● ●

●●

●● ●●

●● ●

●●

●●

●●

●●

●● ●

●●

●●

● ●●●

●● ●

●●●●

● ●●

●●●

●●

●● ●

●●

● ● ●

●●

●●

●●●

●●

●●

●●

●●●●

●●

●●

● ● ●

●●

●●

●●

●●

●●●

●●

●●

●●

●●

●●

●●

● ●●

●●●

●●

●●

●●

●●

●●●

●●

●●

●●

●●

●●● ●

●●

●●

● ●

●●

●●

● ●●● ●

●●

●● ●

●●

●●

●●●

●●

●●

●●

● ●

●●

●●

●●

●●

●● ●

●●

●●

●●

●●

● ●●

●● ●

●●

●●●●

●●

●● ●

●●

●●

●●● ●● ●

● ●

●●●

●●

●●

●●

●●

●●

●● ●●

●●

●●

● ●●

● ●

●●●

●●

●●

●●

●●

●●

● ●●●

●●

●●

●●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●● ●

● ●●

●●

●●

●● ●

● ●

●●

●●

●●●

●●

● ●

● ●

●●●

●●

● ●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●

●● ●

● ●

●●●●

●●

●●

● ●●●

●●●

● ●

●●●

●●

●●

●●

●●●

●●

●●

●●

●●

● ●

● ● ●

●●

● ●

●● ●

●●

●● ●

● ●●

●●

●●

●●●

●●

●●

●● ●●

● ● ●

●●

●●

●●

●●●

●●

●●

●●

●●

●● ●

●●

● ●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●●

●●

● ●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

● ●●●

●●

●●

● ●●

●● ●

●● ●

●●

●●

●●

●●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●●

●●

●●

●●

●●

●●

●●

●●

● ●●

●●

●●

●●

●●

● ●

●●

●●

● ●●●●

●●

●●

●●

●●●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●● ●

●●

● ●

●●

●●

●●

●●●

● ● ●●

●●●● ●

●●

●●

●● ● ●

●●●

●●

●●

● ●

● ●

●● ●

●●

●●

●●

●●

●●

●●

●●

● ●●●

●●

●●

● ●●

●●

●● ●

●● ●

●●

●●

●●

●●

●●

●●● ●

●●

●●

●●

● ●●

● ●●

● ●●

●●●

●●●

●● ●●

●●

●●

●●

●● ●●

●●

● ●

● ●

●●● ●

●●

●●

●●

●●

●●

●●

●●

● ●

●● ● ●

● ●

●●

●●

●●●

●●

● ● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

● ●

● ●

●●

●●

●●●

●● ●

●●

●●

● ●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●● ●

●●

● ●

● ●

●●

●●

●● ●

●●

● ●

●●

● ●

●●●

● ●

●●

● ●●

●●

●●

● ●

●●

●●

● ●

●●

●●

●● ●●

●●

●●●

●●

● ●●

● ●

●●●

● ●

●●●

●●

● ●

●●

●●

● ●

●●

●●

● ●●

●● ●

●●

●●

●●

●●

●●

●●

●●●

● ●

●●

●●

●●

●●

● ● ●

●●

●●

● ●

●●

●●

●●

●●

●● ●

● ●●

●●●

●●

● ●

●●

●●

●●

●●●

● ●

●●

● ●

●●

●●

● ●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●●

●●

●●

●●

●●

● ●●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

●● ●

●●

●●

●●

● ●●

●●●

●●

●● ●

● ●

●●

●●

● ●

●●●

● ●

●●● ●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

●● ●

●● ●

●●

●●

●●● ●●

● ●●

●●●

●●

●●●

● ●

●●

●●

● ●●

●●

● ●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●●●

●●

●● ●

●●

●●

●●

●●

●●

●● ●

●●

●●

● ● ●

●●

●●

●●●●●

●●

●●

●●

●●

● ●●

●●

●●

●●

● ●

●●

●●

●●

●●

●●●

●●

●●

●●●

●●

●●

●●●

●●

●●●

●●

●●

●●

●●

●●

●●●●

●●●

●●●

●●

●●●

●●

●●

●●

● ●●●

●●

● ●

●●

●●●

●● ●

●●●

●●

●●

●●●

●●

●●

● ●

● ●

● ●●

●●

●●

●●

● ●●

●●

●●

●●

●●

●●

●●

●●

●●

●● ●

● ●

●●

●●

●●

●●●

●●●● ●● ●●●

●●●

●●

●●

●● ●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●●●●

● ●●●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●●

●●

●● ●●

●● ●

●●

●●

●●

● ●

●● ●●

● ●●

●● ●●

●●●●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●●

● ●●

●●

●●●

●●●

●●

●●

●●

●●

●●

●●●

●●

●●

●● ●

●● ●

●●●● ●

● ●

●●

● ●●

●●

● ●●

● ●●

●●

●●

●●

●●

●●

● ●

●●● ●

●●

●●

●●●

● ●● ●

●●

●●

●●

●●

●●

●●

● ●

●●

●●●

●●

●●

●●

●●

●●●

●●

●●

●●

●●●

●●

●●

●●

●●

●●

●●● ●

●●

●●

● ●

●●

●●●

●●

●● ● ●

●●

● ●

●●

●●

●●

●●

●●● ●

●●

●●●

●●

●●

●●

●●

● ●●

●● ●

● ●●

●●

●●

●● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ● ●

●●

●●

●●

●●

●●

●● ●●

●●

●●

●●

●●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

●●●

●●

●●

●●

● ●

●●

●●

●●●

●●●

●●

●●

●●

●● ●

●●

●●●

●●●

●●

●●

● ●

●●●

●●

●●

●●●

●●

●●

●●

●●

●●

● ●●

●●

●●

●●●

● ●●●

●●

●●

●●●

●●

●●

● ●

●●

●● ● ●

●●

●●

●●

●● ●●●

● ●

●●

●●● ●

●● ●

●●

● ●●

●●●

●●

● ●

●●

●●

●●

●●

●●

●●

● ● ●

● ● ●●

●●

●●

● ●●●

●●

● ●

●●

●●●

● ●

●●●

● ● ●●

● ●●

●●

●●

●●

●●

●●

● ●

●●

● ●

●●●

●●●

●●

●●

●●

●●●

●●

● ●

● ●

●●

●●●

●●● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●●

●●● ●

●●●

●●

●●

● ●

●●

●●

●●●

● ●

●●●

●●

●●

●●

●●

●●

●●●

● ●

●●

●●●

●●

●●

●●

●●●

●●

●●

●●

● ●●

● ●

●● ●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

●●

● ● ●

●●

●●

● ●

●●

●●

●●●●●

● ●

● ●

●●

●●

● ●

●●●

●●

●● ●

● ●

●●●

●●●●

●●

●●

●●

●●

●●

●●●

●●

●●●

●●

●●

●●●

●●●

● ●

●●

●●

● ●●●

● ●●

●●

●●

● ●

●●●

●●

●●

●● ●

●●

●●●

●●

●●

●● ●

●● ●●●●

● ●

●●

●●

● ●●

●●

●●

●●

●●

●●

●●

●●●

●●

●●

●●

●●

●●

●●

assort=0.18 abs elevation=0.93 tension=3.9

●●

●●

●●

●●

●●●

●●

●●

●●

●●

●●

● ●

● ●

●●●

●●

●●●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●●

●●

●●

● ●

●●

●● ●

●●

●●●

● ●

●● ●

●● ●

●●

●●

●●●

●●

●●

● ●

● ●

● ●

●●

●●

●●●

●●

●●

●●●

● ●●

●●

●●

●●

● ●

●●●

●●

●● ●● ●●

●●●

● ●

●●

●● ●

●●

●●●

● ●●

●●

●●

● ●

●●

●●

●●

● ●

● ●●

●●

●●

●●

●●

●●●

● ●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

● ●

●●

● ●

●●

●●

●●

●●

●●

●●●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ● ●

●●

● ●●

● ●●

●●

● ●●

●●

●●

●●

●●●

●●

●●

●●

●●

●●

●● ●

●●

●●

●●

●●

● ●

●●

●●

●●

● ●

●●

●●

●●

●● ●

●●

● ●

●●

●●

●●●

●●

●●

●●

●●

●●

●●●

●●●

●●

●●

●●

●●●

●●

●● ●

●●

●●

●●

● ●●

● ●

●●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●● ●●●

●●

●●

● ●●

●●

●●

●●

●●●

●●

● ●●

● ●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●●

●●

●●

● ●●

●●

●●

● ●

●●

● ●

● ●

● ●

●●

●●

●● ●●●

●●

●●

●● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●●

●●

●●

●●

●●●

● ●●

●●●

●●

●●

●●

●●

●●

● ●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●●

●●

●●

●●

●●

● ●

● ●

●●

●●

●●

●●

●●●

●●

● ●

●●

●●

● ●

●●

●●

● ●

●●

● ●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●●●

●●

●●

● ●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●●

●●●

●●● ●

● ●

●●●

●●

●● ●

●●

●●

●●●

● ●●

●●

● ●●

●●

● ●

● ●●

●●

●●

●●

● ●

●●

●●●

●●

●●

●●

●●

●●●●

● ●

●●

●●

●●

● ●

●●

● ●

●●

●●

●●●

● ●

● ●●

●●● ●●

●●

● ●

●●

●●

●●

●●

● ●

● ●

●●

●●

● ●

●●

● ●

●●●

●●

●●

● ●

● ●●

● ●

●●●

● ● ●● ●

●●●

● ●

●●

●●

●●

●●

●●

●●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●

● ●●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●●

●●

● ●●

●●

●●

●●

●●

●●

●●●

●●

● ●

●●

●●

●●●

●●

●●

●●

●●

●●

●● ●

●●

●●

● ●

●●

●●

●●

●●

●●

● ●

● ●

● ●

●●

●●

●●

● ●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

● ●●

●●

●●

●●●

●●

●●

●●

●●

● ●

●●

● ●

●●

●●

●●

●●

●●

●● ●

● ● ●

● ●●

●●

● ●●

●●

●●

● ●

●● ●

●●

●●

●●●

●●

●●

●●

●●

●●

●●

●●●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●●

●●

●● ●

●●

●●

●●●

●●

● ●

● ●

●●●

●●

●●

●●

●●

● ●●

●●

● ●

●●

● ●

●●

● ●●

●●

●●

●●

●●

●●●

●●

●●

●●

●●

●●

● ●●●

●●

●●

●●●●

● ●

●●

● ●

●● ●

●●

● ●

●●● ●

●●

●●

● ●

●●

● ●

●●

●●

●●

●●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

● ●

●●●●

● ● ●●

● ●

●●

●●

●●●

●●

●●

●●

●●

●●

●●

●●● ●

●●

● ●

●●

●●

●●

●●

●●

● ●

●●

●●

●● ●

●●

● ●

●●

●●

●●

●●

●● ●

●●

●●

●●

●●

●●

●●

● ●

●●

● ●

●●

●●

● ●

● ●

● ●●

●●

●●

● ●●

●●● ●

●●

●●

●●

●●

●●

●●

● ●

● ●

●●

● ●●

●●

●●

●●●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

●●

● ● ●●

●●

●●

●●

●●●

●●

●●

●● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●● ●

●●

●●

●●

●●

●●

●●

●●

● ●

● ●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●●●

● ●

●●

● ●

●●

●●

● ●

●● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●●

●●●

●●●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●● ●

●●

● ●

●●

●●●

● ● ●

●● ●

●●

● ●

●●

● ●

● ●

●●

● ●

●●

●●

●●

●●

● ●

●●

● ●

●●

●●

●●

● ●

●●

●●

●●●

●●

●●

●●

● ●●

● ●

●●

●●

●●

●●

● ●●

● ●●●

●●

●● ●

●●

●●

●●

●●

●●

●●

●●●

●●

●● ●

●●

●●●

●●

● ●

●●

●●

●●

●●

● ●

●●●

●●

●●

●●

● ●

● ●

●●

●●

●●

●● ●

●●

●●

●●

●●

●●● ●

●●

● ●●

● ●

●●

● ●

●● ●

●●

●●

●●

●●

●●

●●

●●

● ●●

●●

●●

●●

●●

●●

●●

●● ●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●

● ●

●●

●●

● ●

●●

●●

●●●

●●

●●

●●

●●

●●

●● ●●

●●

● ●

●●

●●

●●●

●●

●●

●●

●●●

●●●

●●

●●

● ●

●●

● ●

●●

● ●

● ●

●●

● ●

●●

●●

●●● ●

● ●

●●

●●

●●●

●●

●●

●●

● ●

●●

● ●

●●

●●●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●●

●●

●●

● ● ●

● ●●

●●

●●

●●

● ●

●●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●●

●●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●

● ●●

● ●

● ●

●●

●●

●●

●●

●● ●● ●

●●

● ●

● ●

●●

●●

●●

●●

● ●

●●

●● ●

● ●

● ●●

●●

●●

●●● ●

● ●

●●

●● ●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●● ●

●●●

●●

●●●

●●

●●

●●

●●

●●

● ●

●●

● ●●

●●

● ●

●●

●●

●●

●●

● ●●

● ●●

●●

●●

●●●

●●

● ●●●●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

● ● ●

●●●●

●●● ●●

●●

● ●

●●

● ●

●●

●●

●●

●● ●●

●●●

●●

●●

●●

●●

●●●

●●

●●

● ●

●●

●●

● ●●

●●

●●

●●

● ●

●●

●●

●●

●●

● ●

●●

●●

●●

●●●

●●

●●

●●●

●●● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●●

●●

●●

●● ●

●●

●●

●●

●●

●●

● ●

● ●

● ●

●●

●●

●●

●●●

●●

●●

●●

●●

●●

●●●

●●

●●

●●

● ● ●

● ●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●● ●

●●

● ●

● ●

●●

●●

●●

●●

●●

● ●

●●

● ●

●●

●●

●●

●●

●● ●

●●

●●

●●

● ●

●●

● ●

● ●●

●●

●●

●●

● ●

●●

●●

●●●

●●

●●

●●

● ●

●●

●●

●●

●●●

●●

●●●

●●

●●

●● ●

●●

●●

●●

●●

● ●

●● ●

● ●

●● ●

● ●

●●

● ●

●●

●●

●●

●●

●●

●● ●

● ● ●

●●●

●●

●●

●●

●●

●●

●●

●●

● ●

● ●

● ●

●●

● ●

●●

●●

●●

●●

●●●

●●

●●

●●●

●●

●●

●●

● ●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

●●

●●●●

●●

●●

●●

●●

●●

●●●

●●

●●

●●●●

●●

● ●●

● ●●

● ●●

●●

●●●

● ●●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

● ●●

● ●

●●

●●●

● ●

●●

●●●

●●

● ●

●●

●●●

●●●●

●●

●●

●●

●● ●

●●

●●

●●

●●

●●

●●

●●

●●●

●● ●

●●

● ●

●●

● ●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●● ●

●●

●●

●●

●●

● ●●

●●

●●

●●

●●

●●●●

●●

●●

●●

●●

●●

●●●

●●●

●●

●● ●

●●●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●

● ●● ●

●●

●●●

●●

● ●●

●●

●●

●●

●●

●●

●●

● ●●

●●

●●

● ●●

●●

●●

●●

●●

●● ●

●●

●●

●●

●● ●

●●●

●●

●●

●● ●●● ●●

●●

●●

●●

●●

●●

● ●

●●

●●●

●● ●

●●

●●

●●

●●

● ●

●●●

●●●

●● ●

●●

● ●●●

●●

●●

●●

● ●●●

●●

● ●

●●●

●●

●●

●●

●●

●●

● ●

●●

●●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●●

●●

●●●●

● ●

●●

● ●

●●

● ●

●●

●●

●●

●●

●●

●●

● ●●

●●

●●

●●

●●

●●

●●

●●

● ●●

●●

●●

●●

● ●

●●

●●●

● ●●

●●

● ●

●●

●●

●●

●●●

●●

●● ●

●● ●

●●

●●

●●

●●

●● ●●

●●

●●

●●

● ●

●● ●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

●●

●●●

●●●●●

●●●●

●●

●●

●●●

●●

●● ●

●●

●●

●●

●●●

●●

● ●

●●

●●

● ●●●

●●

●●

●●

●●

●●

●●

●●●

●●●

●●

●●

● ●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●●

●●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●●

●●

●●

● ●

●●● ●

●●

●●

●●

●●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ● ●

●●

●●●

●●

●●

●●

●●

●●

● ●

● ●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

● ●

● ●

● ●

●●

●●

●●

●●

● ● ●

●● ●

●●●

●●

●●

●●

●●

●●

●●

●●

●●●

● ●

●●●

● ●●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●● ●

● ●

●●

●●

●●● ●

● ●

●●

●●

●●

●●

● ●

●●

● ●●

●●

● ●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

●●

● ●

●●

●●

●●

●●

●●

●●

● ●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●● ●

●●

●●

●●

●●

●●

●● ●

● ●

●●

●●

●●

●●●

●●

●●

●●

●● ●

● ●

●●

●●

●●

●●●

●●

●●

●●

●●

●●

●●

●●

●●●

●●

●●●

●●

●● ●

●●

● ●

● ●

● ●

● ●● ●

● ●

●●

● ●

●●

●●

●●●

●●

●●

●●

●●

●●

●● ●

● ● ●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●●

●●

●●

● ●

●●

●●

●●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●●

●●

●●

●●●

●●

●●●

●●

●●

●●

●●

●●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

●●●

●●

● ●

●●

● ●●

● ●

●●

● ●

●●●

●●

●●

●●

●●

● ●●

●●

●●●

●●

●● ●

●●

●●

●●

● ●●

●●

●●

●●●

●●

● ●●

●●

●●

●●● ●

●●

●●

● ●

● ●

● ●

●●

● ●

●●

●●

●●

●●

●●

●●●

●●

● ●

●●●

●●

●●

●●

● ●

●●

●●

●● ●

●●

● ●

●● ●●

●●

●●

●●●

● ●

●●

●● ●

●●

● ●

●●

●●

●●●

●●

●●

●●

●●

●●

●●●

●●

●●

●●

●●●

●●

●●

●●●

● ●

●●

●●

●●

●●

assort=0.58 abs elevation=1.3 tension=3.5

● ●●

●●

●●

●● ●

●●

●●●●

●● ●●

●●

●●●

●●

●●

●●

●●

●●

● ●

● ●●

●●

●●

●●●

●●

● ●

● ●

●●

● ●●●

● ●●

●●

●●

●●

●●

●●

●●

●●

●●

● ● ●

●●

● ●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

● ●

●● ●

● ●●

●●

●●

●●

●●

● ●●

●●

● ●

●●●

●●

●●●

●●

● ●

●●

●●

●●

● ●

●●

●●●

● ●

●●

●●

●●

●●

●●

●●

●●

● ●

●●

●●●

● ●

● ●

●●

●●

● ●

●●

●●

●●

●●

●●●

● ●

● ●●●

●●

●●

●●

●●●

●●●

●●

● ●

●●

● ●

●●

●●

●●

●●

● ●●

●●

● ●

●● ●

●●

● ●

●●

●●

●●

●●

●●●

●●

● ●

●●

●●●

● ●

●●

●●

●● ●

●●

●●

●●

● ●

●●

● ●

● ●

●●

● ●

●●

●●

●●

●●

●●

●●

● ●

●● ●

●●

●● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

● ●

●●

● ●

●●

●●

●●

●●

●●●

● ●

●●

●●

●●●

●●

● ●

●●

●●

● ●●

●●

●●●

●●

●●

● ●

●●

●●

●●

●●●

●●

●●

●●●

● ●

●●

●●

●●●

● ●

●●

● ●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●●

●●

●●●

●●

●● ●

●● ●

●●

●●

●●

●●●

●● ●●●

●●

●●

● ●

●●

●●

● ●●

●●

●●●

● ●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

● ●

●●

● ●

●●

●●

●●

● ●

●● ● ●

● ●

●●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●●

● ●●

●●

●●

●●

● ●

●●

●●

●●

●●●

●●

●●

●●

●●

●●

●●

●●

● ●

●● ●

●●

●● ●

●●

● ●●

●●

●●

●●

●●

●● ●

●●

●●

● ●●

● ●

●●

●●●

●●

●●

●● ●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●

●● ●

●●

●●

●●

●●●

●●

●●●

● ●

●●

●●●

●●

●●

●●●

●●

● ●

●●

●●

●●

●●

●●

● ●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●●

●●

●●

● ●

●●

●●

●●

●●

● ●●●●

● ●●

●●

●●

● ●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

● ●●

● ●

●●

● ●

● ●

●●

●●

●●

●●

●●

●●

●●●

● ●

●●

●●

●●

●●

●●●

● ●

●●

●●

● ●●

●●

●●

● ●

●● ●

●●●

●●

●●

●●

●●●

●●

● ●●●

●●

●●

●●

● ●

●●

●●●

●● ●

●●

●●●

●● ●

●●

● ●

● ●

●●●

●●

●●

●● ●

●●

●●

●●

●●

● ●

●●

● ●

●●

●●●

●●

●●

●●

●●

●● ●

●●

●●

●●

●●●

●●

● ●

●●

● ●

●●

● ●● ●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

●●

● ●

●●

●●

●●

●●●

●●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●●

● ●

●●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●

●●

●●

●●

● ●

●●

●●

●●

●●

●● ●

●●

●●

●●

●●

●●

●●●

● ●●

● ●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

●●

● ●

● ●

●●

● ●

●●

●●

●●●

●●

● ●

●●

● ●

●●

●●

● ●●●

●●

●●

●●

●● ●

●●

●●●

●●

●●

●●

●●

● ●●

●●

●●

●●

● ●

● ●

●●

●●

●●●

●●

●●

●●

assort=0.16 abs elevation=1.2 tension=5

Maine59 Mich67

Auburn71 BC17

−2.5 0.0 2.5 −2.5 0.0 2.5

−5

0

5

−5

0

5

log mean tension

elev

atio

n

Year

2004

2005

2006

2007

2008

2009

Node level Facebook embeddings of selected universities

Figure 7: The node-level embedding shows that the tension and elevation distributions of nodes by type are distinct and related to the overall network tension and elevation; this is not possible to explore with assortativity.

While it is not possible to explain why the universities have these differences (Peel, Delvenne, and Lambiotte [23] suggest it is due to housing allocation policy), it is possible to analyse the relation between assortativity and SETSe. It is clear from the figure that the years separate into roughly their own tension-elevation band within the data. This is what we would hope to see, given that the embedding uses a force based on the graduation year. The overall shape of the data is a funnel, with the ‘nose’ at the low-tension end and the wide opening at the higher-tension end. It is clear from the plot that the younger years (those graduating in 2008 and 2009) are more separate than the older years (graduating in 2004, 2005 and 2006). This is because they have had less time to form cross-year bonds, and so are more assortative. The nose is created because the nodes that are most central within their year experience the least tension.

The three low assortativity universities have very different tension and elevation scores. It can be seen that higher tension appears to create fuzzier, less clearly defined groups within each year. Low elevation creates a single ‘nose’ for the whole cone, but higher elevation starts separating out, forming a nose for each year. It is impractical to understand the differences between the universities by looking at the scatter plots of thousands of students across 100 universities. The elevation and node tension are thus aggregated at university level, and all 100 universities are plotted in Figure 8. The points are coloured by assortativity. It can be seen that while there is a positive relationship between tension and elevation, there is, in fact, a negative relationship between the tension and elevation dimensions and the value of assortativity.


[Figure 8, plot area: “The average elevation and tension embeddings of the facebook 100 dataset”. Scatter plot of mean absolute node elevation (0.8–1.4) against mean node tension (3.0–5.0), one point per university, coloured by year assortativity (0.2–0.5).]

Figure 8: There is a clear relationship between the SETSe embeddings and year assortativity; however, the pattern is dependent on elevation and tension. SETSe finds large differences between networks with very similar assortativity.

Analysing the distribution of the tension, elevation and assortativity scores for all 100 universities, it is found that the data are normally distributed. Creating a linear regression on assortativity using tension and elevation as independent variables provides an R² = 0.82, where the coefficients are significant at p < 0.001. Creating linear models using either tension or elevation alone provides R² = −0.006 and 0.429, respectively.
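As a rough illustration of the aggregation and regression just described, the sketch below uses base R on simulated stand-in data; the object and column names (node_embeddings, uni_assortativity, tension, elevation) are hypothetical rather than the rSETSe API, and only the reported R² of 0.82 comes from the text.

```r
# Minimal base-R sketch of the university-level aggregation and regression
# described above. All names are illustrative placeholders and the data are
# simulated; they are not drawn from the Facebook 100 dataset.
set.seed(1)
node_embeddings <- data.frame(
  university = rep(paste0("uni", 1:100), each = 50),  # 100 toy universities
  tension    = rexp(5000),                            # stand-in node tension
  elevation  = rnorm(5000)                            # stand-in node elevation
)
uni_assortativity <- data.frame(
  university    = paste0("uni", 1:100),
  assortativity = runif(100, 0.1, 0.6)                # stand-in year assortativity
)

# Mean node tension and mean absolute node elevation per university
node_embeddings$abs_elevation <- abs(node_embeddings$elevation)
uni_setse <- aggregate(cbind(tension, abs_elevation) ~ university,
                       data = node_embeddings, FUN = mean)
uni_stats <- merge(uni_setse, uni_assortativity, by = "university")

# Regression of assortativity on both SETSe aggregates (the text reports R^2 = 0.82)
fit <- lm(assortativity ~ tension + abs_elevation, data = uni_stats)
summary(fit)$r.squared

# Single-predictor models for comparison (adjusted R^2 can be slightly negative,
# as the near-zero value quoted in the text suggests)
summary(lm(assortativity ~ tension,       data = uni_stats))$adj.r.squared
summary(lm(assortativity ~ abs_elevation, data = uni_stats))$adj.r.squared
```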

Next, the ability of the embeddings to predict node class using the k nearest neighbours is tested against a baseline of graph-adjacency voting. The model is an effective predictor of year, with high values of kappa (59% when k is 9) and balanced accuracy (77% when k is 9), indicating that the model predicts above the naive baseline value. However, the graph adjacency model outperforms the kNN using SETSe by between 7% and 10% in terms of accuracy, kappa and f1 score, and by up to 15% on balanced accuracy. It should be noted that high levels of performance are expected, as the model is predicting on the data it was embedded with. Despite this caveat and the poor performance against the adjacency voting model, the results show that the embeddings produced are meaningful, even on large and complex networks, although the actual graph structure is lost.
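The comparison can be sketched as follows. This is a toy reconstruction of the idea rather than the paper's pipeline: it uses the class and igraph packages on simulated data, with knn.cv providing leave-one-out k-nearest-neighbour prediction in the embedding space and a simple majority vote over graph neighbours as the baseline; the names feat and year are placeholders.

```r
library(igraph)  # toy graph and neighbour lookup
library(class)   # knn.cv: leave-one-out k-nearest-neighbour classification

# Toy reconstruction of the comparison described above; 'feat' stands in for
# the per-node SETSe coordinates and 'year' for the graduation-year labels.
set.seed(1)
g    <- sample_gnp(500, 0.02)
year <- factor(sample(2004:2009, vcount(g), replace = TRUE))
feat <- cbind(tension = rexp(vcount(g)), elevation = rnorm(vcount(g)))

# 1) k nearest neighbours in SETSe space (each node predicted from the others)
knn_pred <- knn.cv(feat, cl = year, k = 9)

# 2) Baseline: majority vote over each node's graph neighbours
vote_pred <- sapply(seq_len(vcount(g)), function(v) {
  nb <- neighbors(g, v)
  if (length(nb) == 0) return(NA_character_)
  names(which.max(table(year[nb])))
})

mean(knn_pred == year)                               # kNN accuracy
mean(vote_pred == as.character(year), na.rm = TRUE)  # adjacency-vote accuracy
```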

Figure 6 showed that SETSe could uncover the hidden structure of the network; however, the question is whether SETSe can do the same on a complex real-world dataset. Using the same year embeddings for the Facebook 100 dataset, the k nearest neighbours were used to predict student type.


[Figure 9, plot area: “Predicting student type from year embeddings relative to graph neighbour voting”. Fraction of times SETSe beats graph neighbour voting (y-axis, 0.5–1.0) against the number of nearest neighbours in SETSe space, k (x-axis, 0–20), with one line per metric: accuracy, bal_accuracy, f_meas, kap.]

Figure 9: Using k nearest neighbours in SETSe space to predict the type of student performs better than graph neighbour voting, despite the embeddings being relative to year, not student type. SETSe beats the baseline more than 50% of the time for all values of k. Note that, for ease of viewing, the y-axis starts at 0.5.

The results show that the accuracy of the student type model, averaged across all 100 universities, outperforms the naive rate of predicting student type 2 over student type 1 at all values of k. The model performance is also quite stable for all values of k: accuracy is around 71%, balanced accuracy is 55%, kappa is very low at 12%, and the f1 score is around 79%. The low balanced accuracy is not significantly higher than 0.5, and the low kappa score signifies that the results could simply be due to chance. However, as shown in Figure 9, the model comprehensively beats the graph-neighbour voting method in all metrics for almost all values of k. Note that the f1 score is not reliable in this case, as there are a significant number of occasions where type 1 students are not predicted at all. This means that the f1 score is dependent on the class labelling; balanced accuracy and Cohen's kappa avoid this issue.

3.3 Complexity

The time taken to embed the Facebook 100 graphs is plotted in Figure 10. The left panel of the figure shows that the iteration time complexity is almost perfectly linear (O(|E|)). The time taken to reach convergence is shown in the right panel. The convergence complexity shows heteroscedasticity and is closer to running in quadratic time (O(|E|²)). This is in keeping with other spring system models, which are also O(n²), but it is slower than LLE, Laplacian Eigenmaps and HOPE, which run in O(|E|), as well as node2vec and SDNE, which run in O(|V|), for single-attribute networks.


[Figure 10, plot area: “Time complexity of SETSe on iteration and convergence of Facebook 100 data”. Two panels (time per iter; time taken) plotting seconds against number of edges (0 to 3×10^6).]

Figure 10: The per-iteration complexity is linear; however, the complexity of the total time to convergence is closer to quadratic.

It should be noted that the time to convergence is highly dependent on the network topology; see the Appendix on the bi-connected component method. On the system described in Section 2.6, a network of 40,000 nodes and 1.5 million edges uses about 1.9 GB of RAM and takes 3.9 hours to converge.
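One simple way to check scaling claims of this kind is to regress log running time on log edge count across the embedded networks; a slope near 1 indicates linear scaling and a slope near 2 quadratic scaling. The timing numbers below are invented purely to illustrate the fit.

```r
# Hypothetical timing table: per-iteration time and total convergence time
# against edge count (values are made up for illustration only).
timings <- data.frame(
  edges     = c(1e5, 3e5, 1e6, 3e6),
  iter_secs = c(0.015, 0.045, 0.15, 0.45),
  conv_secs = c(20, 180, 2000, 18000)
)

# Estimated scaling exponents from a log-log fit
coef(lm(log(iter_secs) ~ log(edges), data = timings))["log(edges)"]  # ~1: linear
coef(lm(log(conv_secs) ~ log(edges), data = timings))["log(edges)"]  # ~2: quadratic
```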

4 Discussion

SETSe acts as a hybrid between the advanced techniques of the graph embedders used for analysis and the intuitive simplicity of the spring embedders used for graph drawing. It does this by projecting the network onto an n+1-dimensional manifold, using node attribute as a force. This is different from traditional spring embedders, where all nodes exert an equal force [15, 12, 9]. As such, unlike the graph embedding algorithms that were used to benchmark performance in this paper, SETSe cannot be said to ‘learn’ the properties of the network. Instead, similar to the other spring embedders, SETSe finds an equilibrium position wherein all forces are balanced. SETSe also distinguishes itself from the graph embedders and spring embedders by being entirely deterministic. The SETSe algorithm is fast within each iteration, running in O(|E|) linear time. However, the time to convergence is not linear and appears to be closer to O(|E|²), which is slower than most machine learning embedders.
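To make the idea of a force-balance equilibrium concrete, the toy sketch below relaxes a vertical-only spring system in which each node carries a fixed force derived from a binary attribute and edges act as unit springs. It illustrates only the general mechanism of damped relaxation to equilibrium, not the SETSe algorithm itself (for which see the rSETSe package), and all parameter values are arbitrary.

```r
library(igraph)

# Toy vertical-only spring relaxation: each node carries a fixed force from a
# +/-1 attribute, edges pull neighbouring elevations together, and the damped
# system is iterated until the net force on every node is approximately zero.
# This is an illustration of force balance only, not the SETSe implementation.
set.seed(1)
g     <- sample_pa(60, directed = FALSE)       # small connected toy graph
A     <- as_adjacency_matrix(g, sparse = FALSE)
force <- sample(c(-1, 1), vcount(g), replace = TRUE)
force <- force - mean(force)                   # forces must sum to zero

k_spring <- 1; dt <- 0.05; damping <- 0.99
z <- v <- numeric(vcount(g))                   # elevations and velocities

for (iter in 1:50000) {
  # net force on each node: attribute force plus spring pull from neighbours
  net <- force + k_spring * (drop(A %*% z) - rowSums(A) * z)
  v   <- damping * (v + dt * net)
  z   <- z + dt * v
  if (max(abs(net)) < 1e-5) break              # near-static equilibrium reached
}

# a tension-like edge quantity: elevation difference across each edge
el <- as_edgelist(g, names = FALSE)
summary(abs(z[el[, 1]] - z[el[, 2]]))
```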


When separating Peel’s quintet, only SETSe managed to find a successful two-dimensional representation of the networks. Two points should be made on this, as the output of the other graph embeddings produces a two-dimensional representation of each node. SETSe produces a one-dimensional representation of each node and a two-dimensional representation of each edge (although strain and tension are identical in this case). Another point is that the other graph embedding algorithms can embed the graph in any number of dimensions; typically the graph would be embedded in higher-dimensional space and then visualised in two dimensions by embedding the nodes a second time using some other data-reduction method [34, 22]. These two points illustrate the fundamentally different approach to embedding that SETSe takes, as it is able to natively project high-dimensional data into a meaningful low-dimensional space without needing a secondary embedding method. The explicit edge embedding also allows for flexibility when it comes to projection choices, which is something other embedding methods lack. The goal of typical graph embedders is to minimise the distance between nodes according to some measure of similarity. In contrast, SETSe does not try to optimise the meaning in the data; instead, it maps the attributes and edge weights of the network to a new space (the manifold), and in doing so reveals properties of the network.

The work of Goyal and Ferrara [13] suggests that at least some of the other graph embedders would have been able to linearly separate the sub-classes of Peel’s quintet more successfully if more dimensions were used. However, this leads to new problems, such as how many dimensions should be used? What dimension reduction algorithm should then be chosen to reduce the dimensions again for plotting purposes, and how can the results be interpreted? In the case of using the output of the embedding for a model, dimensional parsimony is paramount. Being able to model the dependent variable in two dimensions is much more desirable than getting the same results with 16 dimensions. This is not to say that SETSe is always better than the other methods tested here. In particular, for applications where node similarity is the most important feature, or a large number of dimensions is advantageous, it is likely to be outperformed. Also, unlike most machine learning graph embedders, SETSe is unable to perform link prediction. Being bound by physics also means SETSe cannot be easily tuned to target specific goals. However, while most graph embedders are designed to express a similarity chosen by the designer and parametrised by the user, SETSe in contrast is broadly goal agnostic. Instead, it simply allows the graph to express in a different way what was already there and provides insight into the underlying data in the process. In some cases, SETSe could be used in the preprocessing stage of the more sophisticated graph embedders, providing edge and attribute data to support similarity optimisation.

5 Conclusions

The SETSe algorithm is an unsupervised graph embedding method that uses node and edge attributes to project a graph into three separate latent vector spaces.


The node vector space is pairwise Euclidean for connected nodes. Thus the projection of data into SETSe space can be considered a manifold of dimension n + 1, where n is the number of variables, and the final dimension is the graph space defining the minimum node distance. The vector space provides insight into the original data and can reveal structure that was not available when the embedding was produced.

This paper showed that SETSe outperforms several popular graph embedding algorithms on tasks of network classification and node classification. SETSe also provides distinctions between networks that appear to be similar or identical when using the popular assortativity metric.

SETSe is a physical system whose equilibrium state allows previously obscured patterns in the data to be expressed. Although it could be classed as an unsupervised learning algorithm, SETSe does not learn any properties of the system or attempt to optimise similarity. As such, it can be considered both as an auxiliary of, and a counterweight to, machine learning techniques. This is important because, although the value of machine learning to current research progress cannot be overstated, it is not the answer to everything. Moreover, the more sophisticated a technique, the more subjective choices are required, both in development and in parametrisation.

Although spring embedders lost popularity with the rise of machine learning, SETSe shows that there is still a role for unsupervised physics models in modern data analysis. The spring has bounced back.

6 Additional materials

The Appendix contains details on the auto-SETSe and bi-connected SETSe algorithms used to make network convergence easier.

In addition, an R package, rSETSe, has been created, which can be used to create SETSe embeddings. The package allows easy embedding of networks of tens of thousands of nodes and over a million edges on a normal laptop. The package can be installed from https://github.com/JonnoB/rSETSe.
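A minimal install sketch, assuming the package loads under the name given above; the remotes helper is just one common way to install from GitHub.

```r
# Install the package from the GitHub repository given in the text and load it.
install.packages("remotes")
remotes::install_github("JonnoB/rSETSe")
library(rSETSe)
```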

Acknowledgements

I would like to thank Connor Galbraith and Patrick De Mars for their thoughtful and patient advice at all stages of this project; Dr Ellen Webborn for her insightful and thorough feedback on the manuscript; and my supervisors Dr Elsa Arcaute and Dr Aidan O’Sullivan for giving me the space to pursue this idea. I acknowledge use of the UCL Myriad High Performance Computing Facility (Myriad@UCL), and associated support services, in the completion of this work.


References

[1] Sverre J. Aarseth. “The N-body problem”. In: Gravitational N-Body Simulations: Tools and Algorithms. Cambridge Monographs on Mathematical Physics. Cambridge University Press, 2003, pp. 1–17. doi: 10.1017/CBO9780511535246.002.

[2] F. J. Anscombe. “Graphs in Statistical Analysis”. In: The American Statistician 27.1 (1973), pp. 17–21. issn: 0003-1305. doi: 10.2307/2682899.

[3] Mikhail Belkin and Partha Niyogi. “Laplacian Eigenmaps for Dimensionality Reduction and Data Representation”. In: Neural Computation 15.6 (June 2003), pp. 1373–1396. Publisher: MIT Press. issn: 0899-7667. doi: 10.1162/089976603321780317.

[4] Vincent D Blondel et al. “Fast unfolding of communities in large networks”. In: Journal of Statistical Mechanics: Theory and Experiment 2008.10 (Jan. 2008), P10008. issn: 1742-5468. doi: 10.1088/1742-5468/2008/10/p10008.

[5] Shaosheng Cao, Wei Lu, and Qiongkai Xu. “Deep Neural Networks for Learning Graph Representations”. In: Thirtieth AAAI Conference on Artificial Intelligence. Feb. 2016.

[6] Hang Chen et al. “Same Stats, Different Graphs”. In: Graph Drawing and Network Visualization. Ed. by Therese Biedl and Andreas Kerren. Lecture Notes in Computer Science. Cham: Springer International Publishing, 2018, pp. 463–477. isbn: 978-3-030-04414-5. doi: 10.1007/978-3-030-04414-5_33.

[7] Aaron Clauset, M E J Newman, and Cristopher Moore. “Finding community structure in very large networks”. In: Physical Review E 70.6 (Jan. 2004). issn: 1539-3755. doi: 10.1103/physreve.70.066111.

[8] Gabor Csardi and Tamas Nepusz. “The igraph software package for complex network research”. In: InterJournal Complex Systems (2006), p. 1695.

[9] P. Eades. “A heuristic for graph drawing”. In: Congressus Numerantium 42 (1984), pp. 149–160.

[10] Matthias Fey and Jan Eric Lenssen. “Fast Graph Representation Learning with PyTorch Geometric”. In: arXiv:1903.02428 [cs, stat] (Apr. 2019). arXiv: 1903.02428.

[11] Arne Frick, Andreas Ludwig, and Heiko Mehldau. “A fast adaptive layout algorithm for undirected graphs (extended abstract and system demonstration)”. In: Graph Drawing. Ed. by Roberto Tamassia and Ioannis G. Tollis. Lecture Notes in Computer Science. Berlin, Heidelberg: Springer, 1995, pp. 388–403. isbn: 978-3-540-49155-2. doi: 10.1007/3-540-58950-3_393.

[12] Thomas M. J. Fruchterman and Edward M. Reingold. “Graph drawing by force-directed placement”. In: Software: Practice and Experience 21.11 (1991), pp. 1129–1164. issn: 1097-024X. doi: 10.1002/spe.4380211102.


[13] Palash Goyal and Emilio Ferrara. “Graph embedding techniques, applications, and performance: A survey”. In: Knowledge-Based Systems 151 (July 2018), pp. 78–94. issn: 0950-7051. doi: 10.1016/j.knosys.2018.03.022.

[14] Aditya Grover and Jure Leskovec. “node2vec: Scalable Feature Learning for Networks”. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. KDD ’16. San Francisco, California, USA: Association for Computing Machinery, Aug. 2016, pp. 855–864. isbn: 978-1-4503-4232-2. doi: 10.1145/2939672.2939754.

[15] Tomihisa Kamada and Satoru Kawai. “An algorithm for drawing general undirected graphs”. In: Information Processing Letters 31.1 (Apr. 1989), pp. 7–15. issn: 0020-0190. doi: 10.1016/0020-0190(89)90102-6.

[16] Thomas N. Kipf and Max Welling. “Semi-Supervised Classification with Graph Convolutional Networks”. In: (Sept. 2016).

[17] Stephen G Kobourov. “Force-Directed Drawing Algorithms”. In: Handbook of Graph Drawing and Visualization. CRC Press, June 2013, pp. 383–408. isbn: 978-1-138-03424-2.

[18] Y. Koren. “Drawing graphs by eigenvectors: theory and practice”. In: Computers & Mathematics with Applications 49.11 (June 2005), pp. 1867–1888. issn: 0898-1221. doi: 10.1016/j.camwa.2004.08.015.

[19] Martin Krzywinski et al. “Hive plots—rational approach to visualizing networks”. In: Briefings in Bioinformatics 13.5 (Sept. 2012), pp. 627–644. Publisher: Oxford Academic. issn: 1467-5463. doi: 10.1093/bib/bbr069.

[20] Justin Matejka and George Fitzmaurice. “Same Stats, Different Graphs: Generating Datasets with Varied Appearance and Identical Statistics through Simulated Annealing”. In: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. CHI ’17. Denver, Colorado, USA: Association for Computing Machinery, May 2017, pp. 1290–1294. isbn: 978-1-4503-4655-9. doi: 10.1145/3025453.3025912.

[21] Mingdong Ou et al. “Asymmetric Transitivity Preserving Graph Embedding”. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. KDD ’16. San Francisco, California, USA: Association for Computing Machinery, Aug. 2016, pp. 1105–1114. isbn: 978-1-4503-4232-2. doi: 10.1145/2939672.2939751.

[22] Karl Pearson. “LIII. On lines and planes of closest fit to systems of points in space”. In: (Nov. 1901). doi: 10.1080/14786440109462720.

[23] Leto Peel, Jean-Charles Delvenne, and Renaud Lambiotte. “Multiscale mixing patterns in networks”. In: Proceedings of the National Academy of Sciences 115.16 (Apr. 2018), pp. 4057–4062. issn: 0027-8424, 1091-6490. doi: 10.1073/pnas.1713019115.


[24] Bryan Perozzi, Rami Al-Rfou, and Steven Skiena. “DeepWalk: online learning of social representations”. In: Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. KDD ’14. New York, New York, USA: Association for Computing Machinery, Aug. 2014, pp. 701–710. isbn: 978-1-4503-2956-9. doi: 10.1145/2623330.2623732.

[25] Pascal Pons and Matthieu Latapy. “Computing communities in large networks using random walks”. In: Journal of Graph Algorithms and Applications 10.2 (2006), pp. 191–218. issn: 1526-1719. doi: 10.7155/jgaa.00124.

[26] Aaron Quigley and Peter Eades. “FADE: Graph Drawing, Clustering, and Visual Abstraction”. In: Graph Drawing. Ed. by Joe Marks. Lecture Notes in Computer Science. Springer Berlin Heidelberg, 2001, pp. 197–210. isbn: 978-3-540-44541-8.

[27] Liam J. Revell et al. “Graphs in phylogenetic comparative analysis: Anscombe’s quartet revisited”. In: Methods in Ecology and Evolution 9.10 (2018), pp. 2145–2154. issn: 2041-210X. doi: 10.1111/2041-210X.13067.

[28] Sam T. Roweis and Lawrence K. Saul. “Nonlinear Dimensionality Reduction by Locally Linear Embedding”. In: Science 290.5500 (Dec. 2000), pp. 2323–2326. Publisher: American Association for the Advancement of Science. issn: 0036-8075, 1095-9203. doi: 10.1126/science.290.5500.2323.

[29] Franco Scarselli et al. “The Graph Neural Network Model”. In: IEEE Transactions on Neural Networks 20.1 (Jan. 2009), pp. 61–80. issn: 1941-0093. doi: 10.1109/TNN.2008.2005605.

[30] Youngjoo Seo et al. “Structured Sequence Modeling with Graph Convolutional Recurrent Networks”. In: Neural Information Processing. Ed. by Long Cheng, Andrew Chi Sing Leung, and Seiichi Ozawa. Lecture Notes in Computer Science. Cham: Springer International Publishing, 2018, pp. 362–373. isbn: 978-3-030-04167-0. doi: 10.1007/978-3-030-04167-0_33.

[31] Volker Springel et al. “Simulations of the formation, evolution and clustering of galaxies and quasars”. In: Nature 435.7042 (June 2005), p. 629. issn: 1476-4687. doi: 10.1038/nature03597.

[32] Amanda L. Traud, Peter J. Mucha, and Mason A. Porter. “Social structure of Facebook networks”. In: Physica A: Statistical Mechanics and its Applications 391.16 (Aug. 2012), pp. 4165–4180. issn: 0378-4371. doi: 10.1016/j.physa.2011.12.021.

[33] W. T. Tutte. “How to Draw a Graph”. In: Proceedings of the London Mathematical Society s3-13.1 (1963), pp. 743–767. issn: 1460-244X. doi: 10.1112/plms/s3-13.1.743.


[34] L. J. P. van der Maaten and G. E. Hinton. “Visualizing high-dimensional data using t-SNE”. In: Journal of Machine Learning Research (2008). issn: 1532-4435. doi: 10.1007/s10479-011-0841-3.

[35] Daixin Wang, Peng Cui, and Wenwu Zhu. “Structural Deep Network Embedding”. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. San Francisco, California, USA: ACM, Aug. 2016, pp. 1225–1234. isbn: 978-1-4503-4232-2. doi: 10.1145/2939672.2939753.

[36] Zonghan Wu et al. “A Comprehensive Survey on Graph Neural Networks”. In: IEEE Transactions on Neural Networks and Learning Systems (2020), pp. 1–21. issn: 2162-2388. doi: 10.1109/TNNLS.2020.2978386.
