by
Garimella Ramamurthy (G. Rama Murthy)
International Institute of Information Technology
Hyderabad, India
rammurthy@iiit.ac.in
ABSTRACT
In this research paper, an interesting probability mass function is associated with the vertices of a graph. One among the various possible entropies (Shannon, Rényi, Tsallis, etc.) is then associated with such a probability mass function; specifically, the Shannon entropy is utilized to define a novel graph entropy. A characterization of minimum and maximum Shannon entropy graphs is discussed. By associating a symmetric stochastic matrix with the graph, a novel Shannon capacity of a graph is defined. Several interesting results in the spectral graph theory of structured graphs are reported, together with new results related to sparsest and densest cut computation (without invoking Cheeger's inequality).
I. INTRODUCTION:

Directed / undirected, weighted / unweighted graphs naturally arise in various applications. Such graphs are associated with matrices such as the weight matrix, incidence matrix, adjacency matrix, Laplacian, etc. These matrices implicitly specify the number of vertices / edges, the adjacency information of vertices (with edge connectivity) and other related information (such as edge weights). In recent years, there has been explosive interest in capturing networks arising in applications such as social networks, transportation networks and bio-informatics related networks (e.g. gene regulatory networks) using suitable graphs. Thus, NETWORK SCIENCE has led to important problems such as community extraction, frequent sub-graph mining, etc.

In research efforts related to large scale network science, various interesting scalar valued (performance related) measures are defined and utilized for making inferences on large scale graphs. Specifically, graphs are associated with a relevant probability mass function and related measures are computed. This research paper is an effort in that direction.

This research paper is organized in the following manner. In Section 2, relevant research literature is reviewed. In Section 3, a novel graph entropy is defined by associating a graph with an interesting and natural probability mass function, and its properties are studied. In Section 4, several interesting results related to the spectral graph theory of structured graphs are discussed. In Section 5, new results related to sparsest / densest cut computation are discussed. In Section 6, some interesting applications of such graph entropy are briefly specified. The research paper concludes in Section 7.

2. REVIEW OF RESEARCH LITERATURE:

In the research literature, there are already efforts to associate a probability mass function with a graph and compute one among various possible entropies (such as Rényi entropy, Tsallis entropy, etc.) [9, Simonyi]. Such efforts resulted in interesting insights into network science. The essential utility of those definitions of graph entropy relies on how well the associated probability mass function captures the uncertainty related to the graph structure (relevant to the application of interest).

The author innovated the concept of the vertex degree probability mass function by normalizing the vertex degree sequence by twice the total number of edges (or, equivalently, by the sum of all the vertex degrees). He then proposed one among various possible graph entropies by computing the Shannon entropy of such a Probability Mass Function (PMF). It naturally follows that such a PMF can be utilized to compute other entropies such as the Rényi entropy (leading to other definitions of graph entropy). It is expected that the selected probability mass function captures interesting uncertainty associated with the structure of the graph. Efforts are underway to explore interesting ramifications of the graph entropy proposed by the author. In the following sections, we study various properties of this graph entropy.

3. SHANNON GRAPH ENTROPY, GRAPH CAPACITY : PROPERTIES:

We begin the discussion with a definition.

Definition: The Vertex Degree Probability Mass Function of an undirected graph G on N vertices with |E| edges is the probability mass function { p_i : 1 <= i <= N }, where

p_i = d_i / ( 2 |E| ),

with d_i being the degree of vertex i (so that the p_i's sum to one).

With such a probability mass function, one of the various possible entropies (i.e. definitions such as Shannon, Tsallis, Rényi, etc.) can be associated and utilized as the definition of the corresponding graph entropy. Specifically, we have the following definition.

Definition: The Shannon entropy of an undirected graph G with Vertex Degree Probability Mass Function { p_i } is given by

H(G) = - sum_{i=1}^{N} p_i log2( p_i ).

Remark 1: By virtue of the above definition, the novel graph entropy is endowed with all the properties of Shannon entropy. Specifically, all the axioms satisfied by the Shannon entropy function are satisfied by such graph entropy.
Q.E.D.
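To make the definitions above concrete, the vertex degree PMF and the resulting Shannon graph entropy can be computed directly from an edge list. The following is a minimal sketch (Python; the function names are illustrative, not from the paper):

```python
import math
from collections import Counter

def degree_pmf(num_vertices, edges):
    """Vertex degree PMF: p_i = d_i / (2|E|), i.e. each vertex degree
    normalized by the sum of all vertex degrees."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    total = 2 * len(edges)  # sum of all vertex degrees
    return [deg[i] / total for i in range(num_vertices)]

def shannon_graph_entropy(num_vertices, edges):
    """Shannon graph entropy H(G) = -sum_i p_i log2(p_i)."""
    return -sum(p * math.log2(p)
                for p in degree_pmf(num_vertices, edges) if p > 0)

# Ring on 4 vertices: all degrees equal, so the PMF is uniform and
# H(G) = log2(N) = 2 bits, the maximum discussed below.
ring4 = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(shannon_graph_entropy(4, ring4))  # 2.0
```

Any entropy of the same PMF (Rényi, Tsallis) could be substituted for the Shannon formula in the last function without changing the PMF construction.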
Thus, the maximum Shannon graph entropy value associated with such a probability mass function is given by

H_max(G) = log2( N ),

attained exactly when the vertex degree PMF is uniform, i.e. when all vertex degrees are equal.
Q.E.D.

For instance, the ring connected graph and the clique (fully connected graph) are examples of undirected graphs having the maximum Shannon entropy. We now discuss when maximum entropy graphs exist on N vertices.

Claim: The tree connected graph that has maximum Shannon entropy is a line / bus connected graph.
Q.E.D.

MINIMUM ENTROPY GRAPHS:

Now, we consider minimum entropy graphs and study the spectrum of such graphs. From Lemma 2, we know that a minimum entropy graph must be a Hub-and-Spoke (star) graph. The adjacency matrix A of such a graph (taking the hub as the last vertex) has ones in the last row and last column (off the diagonal) and zeros elsewhere.

We now compute the Shannon graph entropy H(G) of a minimum entropy graph (Hub-and-Spoke graph). The hub has degree (N-1) and each of the (N-1) spokes has degree 1, so the degree PMF assigns probability 1/2 to the hub and 1/( 2(N-1) ) to each spoke. Hence

H(G) = (1/2) log2( 2 ) + (N-1) [ 1/( 2(N-1) ) ] log2( 2(N-1) )
     = 1 + (1/2) log2( N-1 ).
Q.E.D.

Novel Shannon Capacity of a Graph:

Hence, when N is even, the adjacency matrix A is unimodular.
Q.E.D.

Also, as the initial condition, we have
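The hub-and-spoke adjacency matrix just described, and the closed form H(G) = 1 + (1/2) log2(N-1), can be checked numerically. A sketch (Python with numpy assumed available; helper names are illustrative):

```python
import numpy as np

def star_adjacency(n):
    """Adjacency matrix of the hub-and-spoke graph on n vertices:
    vertex n-1 is the hub, vertices 0..n-2 are the spokes."""
    a = np.zeros((n, n))
    a[:-1, -1] = 1.0
    a[-1, :-1] = 1.0
    return a

def entropy_from_adjacency(a):
    """Shannon graph entropy of the vertex degree PMF read off A."""
    deg = a.sum(axis=1)
    p = deg / deg.sum()       # hub gets 1/2, each spoke 1/(2(n-1))
    return float(-(p * np.log2(p)).sum())  # every degree > 0 for a star

n = 8
h = entropy_from_adjacency(star_adjacency(n))
print(h, 1 + 0.5 * np.log2(n - 1))  # the two values agree
```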
Lemma 5: The eigenvalues of the adjacency matrix A of a minimum entropy graph on N vertices are +sqrt(N-1), -sqrt(N-1) and 0, where the multiplicity of the eigenvalue at 0 is (N-2). Equivalently, the characteristic polynomial of A is given by

det( lambda I - A ) = lambda^(N-2) ( lambda^2 - (N-1) ).

Proof: We prove the result by mathematical induction. Consider the base case where N = 3.

Let N be odd. Consider the vectors in which (N-1)/2 of the first (N-1) components are +1 and the other (N-1)/2 components are -1. Also, let the last component be ZERO. All such vectors are in the null space of the adjacency matrix of the minimum Shannon entropy graph when N is odd.

Now let N be even. Any one component among the first (N-1) components is zero. Also, consider the vectors in which (N-2)/2 of the remaining (N-2) components are +1 and the other (N-2)/2 components are -1. Further, let the last component be ZERO. All such vectors are in the null space of the adjacency matrix of the minimum Shannon entropy graph when N is even.

The above result holds for the case when N is even as well as odd. Thus, the characteristic polynomial of A is given by lambda^(N-2)( lambda^2 - (N-1) ). Hence, the eigenvalues of A are +sqrt(N-1), -sqrt(N-1) and 0, where the multiplicity of the eigenvalue at 0 is (N-2).
Q.E.D.

Characterization of Null Space of A:

The only vector in the null space of A that can also be a stable state is the "all-ones vector" or the "all-minus-ones vector".

Consider a corner x of the hypercube, with the last component being +1. Then the first (N-1) components of Ax are all equal to one. Thus, x cannot be an eigenvector of A, since the last component of Ax will be equal to (N-1). Similarly, when the last component of x is -1, we reason that x cannot be an eigenvector.

Claim: None of the corners of the hypercube can be an eigenvector of A.

5. SPARSEST CUT COMPUTATION in DIRECTED / UNDIRECTED UNWEIGHTED GRAPHS : SPECTRAL GRAPH THEORY:
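Before turning to cut computation, the spectral claims of the previous section can be sanity-checked numerically: the star adjacency matrix should have eigenvalues +/- sqrt(N-1) together with 0 of multiplicity (N-2), and (for odd N) any +1/-1 vector with balanced first N-1 components and a zero last component should lie in its null space. A sketch (numpy assumed available):

```python
import numpy as np

n = 7  # odd, so the balanced null-space vectors described above exist
a = np.zeros((n, n))
a[:-1, -1] = 1.0   # spokes adjacent to the hub (vertex n-1)
a[-1, :-1] = 1.0

# Eigenvalues: +/- sqrt(n-1) and 0 with multiplicity n-2 (Lemma 5).
eig = np.sort(np.linalg.eigvalsh(a))
print(eig)  # -sqrt(6), five zeros, +sqrt(6), up to rounding

# Null-space vector: three +1's and three -1's among the first n-1
# components, last component zero.
x = np.array([1.0, 1.0, 1.0, -1.0, -1.0, -1.0, 0.0])
print(np.allclose(a @ x, 0))  # True
```

The null-space check works because each of the first n-1 components of Ax equals x's last component (zero), while the last component of Ax is the sum of x's first n-1 components, which the balancing forces to zero.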
Definition: Let a weighted, directed graph be denoted by G = (V, E). Every edge has a weight and a direction associated with it. The weights of the directed edges can be represented by an N x N matrix M in which M(i, j) represents the weight of the edge from vertex i to vertex j. Let a subset of the vertex set V be denoted by S, and let S' = V \ S. The set of edges each of which has its tail at a vertex in S and its head at a vertex in S' is called a DIRECTED CUT of G. The directed cut with minimum total weight is called a DIRECTED MINIMUM CUT (DMC) in G.

Definition: The local minimum vectors of a quadratic form on the hypercube are called anti-stable states. This concept was first introduced in [8, RaN].

The following Theorem is in the same spirit as the corresponding Theorem for undirected graphs.

Theorem 3: Let M be the matrix of edge weights (M is not necessarily symmetric) in a weighted directed graph denoted by G = (V, E). The associated Hopfield-style network performs a local search for a DMC of G.

Claim 1: In summary, MIN / MAX cut computation in directed / undirected graphs is equivalent to the determination of the global optimum stable / anti-stable state of the associated Hopfield neural network.

Sparsest Cut Computation in Undirected, Unweighted Graphs:

In view of Theorem 2, we have the following claim.

Claim: Since the weight matrix of an undirected graph is a { 0, 1 } matrix, local / global minimum cut computation in the associated graph is equivalent to the computation of a local / global optimum stable state in the associated Hopfield neural network. Specifically, the global minimum cut computation corresponds to determining the cut with the smallest number of edges, i.e. the sparsest cut:

Number of Cut Edges in Sparsest Cut = ( 2 |E| - SLSV ).

Thus, ideally we would like to compute the Second Largest Stable State and the Second Largest Stable Value (SLSV). We specifically consider the following cases, in which such a computation is possible.

CASE I: The vertex degree of all vertices is a constant, say d, i.e. the "all-ones vector" is an eigenvector of the adjacency matrix A corresponding to the spectral radius d.

CASE II: The vertex degree of all vertices is NOT a constant, i.e. the "all-ones vector" is NOT an eigenvector of the adjacency matrix A. In this case, the following Lemmas proved in [Rama] provide an approach to bound the Second Largest Stable Value (SLSV).

The proof of the above Lemma is based on projecting points X on the hypercube onto the unit hypersphere Y using the following transformation:

Y = X / sqrt( N )

(each corner of the { +1, -1 }^N hypercube has Euclidean norm sqrt(N)). Either the all-ones vector is the eigenvector corresponding to the spectral radius of W, or a corner of the hypercube can be the eigenvector of W corresponding to the smallest eigenvalue only if the all-ones vector is not an eigenvector of W.
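The correspondence between cuts and hypercube corners used throughout this section can be made explicit: for a labelling x in {+1, -1}^N of an undirected, unweighted graph, x'Ax sums x_i x_j twice over the edges, so x'Ax = 2(|E| - 2 cut(x)) and hence cut(x) = (2|E| - x'Ax)/4; this matches the (2|E| - SLSV) form above up to the normalization chosen for the stable value. A brute-force sketch (Python; exponential in N, for illustration only, not the paper's spectral method):

```python
import itertools
import numpy as np

def cut_size(a, x):
    """Number of edges crossing the bipartition encoded by x in {+1,-1}^n."""
    num_edges = a.sum() / 2
    return int(round((2 * num_edges - x @ a @ x) / 4))

def sparsest_nontrivial_cut(a):
    """Brute force over hypercube corners, skipping the trivial
    all-(+1) / all-(-1) labellings whose cut is empty."""
    n = a.shape[0]
    best = None
    for bits in itertools.product([1.0, -1.0], repeat=n):
        x = np.array(bits)
        if abs(x.sum()) == n:  # trivial labelling, empty cut
            continue
        c = cut_size(a, x)
        if best is None or c < best[0]:
            best = (c, x)
    return best

# 4-cycle: every nontrivial bipartition severs at least 2 edges.
ring4 = np.array([[0, 1, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]], dtype=float)
print(sparsest_nontrivial_cut(ring4)[0])  # 2
```

Maximizing x'Ax over nontrivial corners minimizes the cut, which is exactly why the second largest stable value governs the sparsest cut in the discussion above.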