
Neutrosophic Sets and Systems, Vol. 14, 2016

University of New Mexico

On Single-Valued Neutrosophic Entropy of order α


Harish Garg, Nancy
School of Mathematics, Thapar University Patiala - 147004, Punjab, India. E-mail: harishg58iitr@gmail.com

Abstract: Entropy is one of the measures used for measuring the fuzziness of a set. In this article, we have presented an entropy measure of order α under the single-valued neutrosophic set environment by considering the pair of their membership functions as well as the hesitation degree between them. Based on this measure, some of its desirable properties have been proposed and validated by taking an example of a structural linguistic variable. Furthermore, an approach based on the proposed measure has been presented to deal with multi-criteria decision-making problems. Finally, a practical example is provided to illustrate the decision-making process.

Keywords: Entropy measure, neutrosophic set, multi-criteria decision-making, linguistic variable.

1 Introduction

In the real world, due to the complexity of decision making and the various constraints of today's life, it is difficult for decision makers to give their opinions in a precise form. To handle these situations, fuzzy set (FS) theory [1] and intuitionistic fuzzy set (IFS) theory [2] are successful theories for dealing with the uncertainties in the data. After these pioneering works, various researchers have worked on these theories in different domains, such as entropy measures, correlation coefficients, aggregation operators, and many others [3, 4, 5, 6, 7, 8, 9, 10, 11, 12]. However, both FS and IFS theories are not able to deal with indeterminate and inconsistent information. For example, if an expert takes an opinion from a certain person about a certain object, then the person may say that 0.5 is the possibility that the statement is true, 0.7 that the statement is false, and 0.2 that he or she is not sure of it. To resolve this, Smarandache [13] introduced a new component, called the indeterminacy-membership function, alongside the truth-membership function and the falsity-membership function, all independent components lying in ]⁻0, 1⁺[; the corresponding set is known as a neutrosophic set (NS), which is the generalization of IFSs and FSs. However, without further specification, NSs are difficult to apply to real-life problems. Thus, an extension of the NS, called single-valued NSs (SVNSs), has been proposed by Wang et al. [14]. After their pioneering work, researchers have been engaged in its extensions and applications in different disciplines. However, the most important task for the decision maker is to rank the objects so as to get the desired one(s). For this, researchers have incorporated SVNS theory into measure theory and applied it in many practically uncertain situations, such as decision making, pattern recognition and medical diagnosis, by using similarity measures [15, 16], distance measures [17, 18] and cosine similarity measures [19, 20, 21, 22]. Thus, it has been concluded that information measures such as entropy, divergence, distance and similarity are of key importance in a number of theoretical and applied statistical inference and data processing problems. But it has been observed from the above studies that these measures do not incorporate the decision-maker's preferences into the measure. Furthermore, the existing measures order the alternatives only linearly, and hence do not give the exact nature of each alternative. Therefore, keeping the criteria of flexibility and efficiency of neutrosophic sets, this paper presents a new parametric entropy measure of order α for measuring the fuzziness degree of a set. For this, an entropy measure of order α has been presented which makes the decision makers more reliable and flexible for the different values of this parameter. Based on it, some desirable properties of these measures have been studied.

The rest of the manuscript is summarized as follows. Section 2 presents some basic definitions about NSs. In Section 3, a new entropy of order α is proposed and its axiomatic justification is established. Further, various desirable properties of it in terms of joint and conditional entropies have been studied. An illustrative example to show its superiority has been described for a structural linguistic variable. Section 4 presents the MCDM method based on the proposed generalized entropy measure, along with an illustrative example for selecting the best alternative. Finally, a conclusion has been drawn in Section 5.

2 Preliminaries

In this section, some needed basic concepts and definitions related to neutrosophic sets (NSs) are introduced.

Definition 2.1. [13] A NS A in X is defined by its truth-membership function T_A(x), an indeterminacy-membership function I_A(x) and a falsity-membership function F_A(x), where all are subsets of ]⁻0, 1⁺[, such that ⁻0 ≤ sup T_A(x) + sup I_A(x) + sup F_A(x) ≤ 3⁺ for all x ∈ X.

Definition 2.2. [14] A NS A defined by

    A = {⟨x, T_A(x), I_A(x), F_A(x)⟩ | x ∈ X}

is called an SVNS when T_A(x), I_A(x), F_A(x) ∈ [0, 1]. For each point x in X, T_A(x), I_A(x), F_A(x) ∈ [0, 1] and 0 ≤ T_A(x) + I_A(x) + F_A(x) ≤ 3. Each such triple is called a single-valued neutrosophic number (SVNN), denoted by ⟨μ_A(x), ρ_A(x), ν_A(x)⟩ for x ∈ X, and the class of SVNSs on X is denoted by SVNS(X).
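For the computational illustrations that follow, it is convenient to fix a concrete representation of SVNSs. The short Python sketch below is ours, not part of the original paper: it models an SVNN as a validated triple and an SVNS as a list of such triples; the class name SVNN is an illustrative choice.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class SVNN:
        """A single-valued neutrosophic number (Definition 2.2):
        truth, indeterminacy and falsity degrees in [0, 1] with sum <= 3."""
        mu: float   # truth membership mu_A(x)
        rho: float  # indeterminacy membership rho_A(x)
        nu: float   # falsity membership nu_A(x)

        def __post_init__(self):
            assert all(0.0 <= v <= 1.0 for v in (self.mu, self.rho, self.nu))
            assert self.mu + self.rho + self.nu <= 3.0

    # An SVNS over X = {x1, ..., xn} is then simply a list of SVNNs;
    # the later sketches use bare (mu, rho, nu) tuples for brevity.
    A = [SVNN(0.4, 0.3, 0.9), SVNN(0.7, 0.5, 0.3)]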


Definition 2.3. Let A = ⟨μ_A(x), ρ_A(x), ν_A(x)⟩ and B = ⟨μ_B(x), ρ_B(x), ν_B(x)⟩ be two SVNSs. Then the following expressions are defined by [14]:

(i) A ⊆ B if and only if μ_A(x) ≤ μ_B(x), ρ_A(x) ≥ ρ_B(x) and ν_A(x) ≥ ν_B(x) for all x in X;

(ii) A = B if and only if A ⊆ B and B ⊆ A;

(iii) A^c = {⟨ν_A(x), ρ_A(x), μ_A(x)⟩ | x ∈ X};

(iv) A ∩ B = ⟨min(μ_A(x), μ_B(x)), max(ρ_A(x), ρ_B(x)), max(ν_A(x), ν_B(x))⟩;

(v) A ∪ B = ⟨max(μ_A(x), μ_B(x)), min(ρ_A(x), ρ_B(x)), min(ν_A(x), ν_B(x))⟩.

Majumdar and Samant [16] defined the concept of entropy for neutrosophic sets as below.

Definition 2.4. An entropy on SVNS(X) is defined as a real-valued function E : SVNS(X) → [0, 1] which satisfies the following axioms [16]:

(P1) E(A) = 0 if A is a crisp set;

(P2) E(A) = 1 if μ_A(x) = ρ_A(x) = ν_A(x);

(P3) E(A) = E(A^c) for all A ∈ SVNS(X);

(P4) E(A) ≤ E(B) if A ⊆ B, that is, μ_A(x) ≤ μ_B(x), ρ_A(x) ≥ ρ_B(x) and ν_A(x) ≥ ν_B(x), for μ_B(x) ≤ ρ_B(x) and μ_B(x) ≤ ν_B(x).

3 Entropy of order-α

In this section we propose a parametric entropy for SVNSs.

Definition 3.1. The entropy of order-α for an SVNS A is defined as

    E_\alpha(A) = \frac{1}{n(1-\alpha)} \sum_{i=1}^{n} \log_3\Big[\big(\mu_A^{\alpha}(x_i)+\rho_A^{\alpha}(x_i)+\nu_A^{\alpha}(x_i)\big)\big(\mu_A(x_i)+\rho_A(x_i)+\nu_A(x_i)\big)^{1-\alpha} + 3^{1-\alpha}\big(1-\mu_A(x_i)-\rho_A(x_i)-\nu_A(x_i)\big)\Big],    (1)

where α > 0, α ≠ 1.

Theorem 1. E_α(A) as defined in Definition 3.1 is an entropy for SVNSs.

Proof. In order to prove that E_α(A) is a valid measure, we have to prove that it satisfies the axioms given in Definition 2.4.

(P1) Let A be a crisp set, i.e., A = (1, 0, 0) or A = (0, 0, 1). Then from Definition 3.1 we get E_α(A) = 0.

(P2) Let μ_A(x_i) = ρ_A(x_i) = ν_A(x_i) for all x_i ∈ X. Then E_α(A) becomes

    E_\alpha(A) = \frac{1}{n(1-\alpha)} \sum_{i=1}^{n} \log_3\Big[3\mu_A^{\alpha}(x_i)\big(3\mu_A(x_i)\big)^{1-\alpha} + 3^{1-\alpha}\big(1-3\mu_A(x_i)\big)\Big]
                = \frac{1}{n(1-\alpha)} \sum_{i=1}^{n} \log_3\Big[3^{2-\alpha}\mu_A(x_i) + 3^{1-\alpha} - 3^{2-\alpha}\mu_A(x_i)\Big]
                = \frac{1}{n(1-\alpha)} \sum_{i=1}^{n} \log_3 3^{1-\alpha}
                = 1.

Now, let E_α(A) = 1; that is,

    \sum_{i=1}^{n} \log_3\Big[\big(\mu_A^{\alpha}(x_i)+\rho_A^{\alpha}(x_i)+\nu_A^{\alpha}(x_i)\big)\big(\mu_A(x_i)+\rho_A(x_i)+\nu_A(x_i)\big)^{1-\alpha} + 3^{1-\alpha}\big(1-\mu_A(x_i)-\rho_A(x_i)-\nu_A(x_i)\big)\Big] = n(1-\alpha),

so that, for each x_i ∈ X,

    \big(\mu_A^{\alpha}(x_i)+\rho_A^{\alpha}(x_i)+\nu_A^{\alpha}(x_i)\big)\big(\mu_A(x_i)+\rho_A(x_i)+\nu_A(x_i)\big)^{1-\alpha} + 3^{1-\alpha}\big(1-\mu_A(x_i)-\rho_A(x_i)-\nu_A(x_i)\big) = 3^{1-\alpha},

i.e.,

    \big(\mu_A(x_i)+\rho_A(x_i)+\nu_A(x_i)\big)^{1-\alpha}\Big[\mu_A^{\alpha}(x_i)+\rho_A^{\alpha}(x_i)+\nu_A^{\alpha}(x_i) - 3^{1-\alpha}\big(\mu_A(x_i)+\rho_A(x_i)+\nu_A(x_i)\big)^{\alpha}\Big] = 0.    (2)

From Eq. (2) we get either

    \mu_A(x_i)+\rho_A(x_i)+\nu_A(x_i) = 0, which implies \mu_A(x_i) = \rho_A(x_i) = \nu_A(x_i) = 0 for all x_i ∈ X,    (3)

or

    \frac{\mu_A^{\alpha}(x_i)+\rho_A^{\alpha}(x_i)+\nu_A^{\alpha}(x_i)}{3} - \Big(\frac{\mu_A(x_i)+\rho_A(x_i)+\nu_A(x_i)}{3}\Big)^{\alpha} = 0.    (4)

Now, consider the following function

    g(\lambda) = \lambda^{\alpha}, where λ ∈ [0, 1].

Differentiating it with respect to λ, we get

    g'(\lambda) = \alpha\lambda^{\alpha-1},    g''(\lambda) = \alpha(\alpha-1)\lambda^{\alpha-2}.

Because g''(λ) > 0 for α > 1 and g''(λ) < 0 for α < 1, g(λ) is convex or concave according as α > 1 or α < 1. So, for any points λ₁, λ₂ and λ₃ in [0, 1], we have

    \frac{g(\lambda_1)+g(\lambda_2)+g(\lambda_3)}{3} - g\Big(\frac{\lambda_1+\lambda_2+\lambda_3}{3}\Big) \ge 0 for α > 1,    (5)

    \frac{g(\lambda_1)+g(\lambda_2)+g(\lambda_3)}{3} - g\Big(\frac{\lambda_1+\lambda_2+\lambda_3}{3}\Big) \le 0 for α < 1.    (6)

In the above, equality holds only if λ₁ = λ₂ = λ₃. Hence from Eqs. (3), (4), (5) and (6) we conclude that Eqs. (2) and (4) hold only when μ_A(x_i) = ρ_A(x_i) = ν_A(x_i) for all x_i ∈ X.

(P3) Since A^c = {⟨x, ν_A(x), ρ_A(x), μ_A(x)⟩ | x ∈ X}, it follows that E_α(A^c) = E_α(A).

(P4) Rewrite the entropy function as

    f(x, y, z) = \frac{1}{1-\alpha}\log_3\Big[\big(x^{\alpha}+y^{\alpha}+z^{\alpha}\big)\big(x+y+z\big)^{1-\alpha} + 3^{1-\alpha}\big(1-x-y-z\big)\Big],    (7)

where x, y, z ∈ [0, 1]. In order to prove that the proposed entropy satisfies (P4), it is sufficient to prove that the function f defined in Eq. (7) is increasing with respect to x and decreasing with respect to y and z. For this, take the partial derivatives of the function with respect to x, y and z:

    \frac{\partial f}{\partial x} = \frac{\alpha x^{\alpha-1}(x+y+z)^{1-\alpha} + (1-\alpha)(x^{\alpha}+y^{\alpha}+z^{\alpha})(x+y+z)^{-\alpha} - 3^{1-\alpha}}{(1-\alpha)\big[(x^{\alpha}+y^{\alpha}+z^{\alpha})(x+y+z)^{1-\alpha} + 3^{1-\alpha}(1-x-y-z)\big]\ln 3},    (8)

    \frac{\partial f}{\partial y} = \frac{\alpha y^{\alpha-1}(x+y+z)^{1-\alpha} + (1-\alpha)(x^{\alpha}+y^{\alpha}+z^{\alpha})(x+y+z)^{-\alpha} - 3^{1-\alpha}}{(1-\alpha)\big[(x^{\alpha}+y^{\alpha}+z^{\alpha})(x+y+z)^{1-\alpha} + 3^{1-\alpha}(1-x-y-z)\big]\ln 3},    (9)

    \frac{\partial f}{\partial z} = \frac{\alpha z^{\alpha-1}(x+y+z)^{1-\alpha} + (1-\alpha)(x^{\alpha}+y^{\alpha}+z^{\alpha})(x+y+z)^{-\alpha} - 3^{1-\alpha}}{(1-\alpha)\big[(x^{\alpha}+y^{\alpha}+z^{\alpha})(x+y+z)^{1-\alpha} + 3^{1-\alpha}(1-x-y-z)\big]\ln 3}.    (10)

After setting ∂f/∂x = 0, ∂f/∂y = 0 and ∂f/∂z = 0, we get x = y = z. Also,

    ∂f/∂x ≥ 0 whenever x ≤ y, x ≤ z, α > 0, α ≠ 1,    (11)

    ∂f/∂x ≤ 0 whenever x ≥ y, x ≥ z, α > 0, α ≠ 1.    (12)

Thus, f(x, y, z) is an increasing function with respect to x for x ≤ y, x ≤ z, and decreasing when x ≥ y, x ≥ z. Similarly, we have

    ∂f/∂y ≤ 0 and ∂f/∂z ≤ 0 whenever x ≤ y, x ≤ z,    (13)

    ∂f/∂y ≥ 0 and ∂f/∂z ≥ 0 whenever x ≥ y, x ≥ z.    (14)

Thus, f(x, y, z) is a decreasing function with respect to y and z for x ≤ y, x ≤ z, and increasing when x ≥ y, x ≥ z. Therefore, from the monotonicity of the function f, and by taking two SVNSs A ⊆ B, i.e., μ_A(x) ≤ μ_B(x), ρ_A(x) ≥ ρ_B(x) and ν_A(x) ≥ ν_B(x), with μ_B(x) ≤ ρ_B(x) and μ_B(x) ≤ ν_B(x), we get E_α(A) ≤ E_α(B).

Example 3.1. Let A be an SVNS in the universe of discourse X = {x₁, x₂, x₃, x₄}, defined as A = {⟨x₁, 0.4, 0.3, 0.9⟩, ⟨x₂, 0.7, 0.5, 0.3⟩, ⟨x₃, 0.2, 0.9, 0.8⟩, ⟨x₄, 0.5, 0.4, 0.6⟩}. Then the entropy values for different values of α are E_{0.2}(A) = 0.9710, E_{0.5}(A) = 0.9303, E_2(A) = 0.7978, E_5(A) = 0.7246 and E_{10}(A) = 0.7039. It is clearly seen from this result that the value of E_α(A) decreases as α increases.
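The measure in Eq. (1) is easy to evaluate numerically. The following Python sketch is ours (the function name entropy_alpha is an illustrative choice); it implements Eq. (1) directly on (μ, ρ, ν) triples and reproduces the values reported in Example 3.1 above.

    import math

    def entropy_alpha(svns, alpha):
        """Entropy of order alpha, Eq. (1), for an SVNS given as a list of
        (mu, rho, nu) triples; requires alpha > 0 and alpha != 1."""
        assert alpha > 0 and alpha != 1
        total = 0.0
        for mu, rho, nu in svns:
            s = mu + rho + nu
            power_sum = mu**alpha + rho**alpha + nu**alpha
            # argument of log_3 in Eq. (1); the first term tends to 0 as
            # mu, rho, nu -> 0, so the s = 0 case is handled by its limit
            first = power_sum * s**(1 - alpha) if s > 0 else 0.0
            total += math.log(first + 3**(1 - alpha) * (1 - s), 3)
        return total / (len(svns) * (1 - alpha))

    A = [(0.4, 0.3, 0.9), (0.7, 0.5, 0.3), (0.2, 0.9, 0.8), (0.5, 0.4, 0.6)]
    for a in (0.2, 0.5, 2, 5, 10):
        print(a, round(entropy_alpha(A, a), 4))
    # prints 0.9710, 0.9303, 0.7978, 0.7246 and 0.7039, matching Example 3.1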


The above proposed entropy measure of order α satisfies the following additional properties. Consider two SVNSs A and B defined over X = {x₁, x₂, ..., x_n}. Take a partition of X as X₁ = {x_i ∈ X : A ⊆ B} and X₂ = {x_i ∈ X : A ⊇ B}. Then we define the joint and conditional entropies between them as follows.

(i) Joint entropy:

    E_\alpha(A \cup B) = \frac{1}{n(1-\alpha)} \sum_{i=1}^{n} \log_3\Big[\big(\mu_{A\cup B}^{\alpha}(x_i)+\rho_{A\cup B}^{\alpha}(x_i)+\nu_{A\cup B}^{\alpha}(x_i)\big)\big(\mu_{A\cup B}(x_i)+\rho_{A\cup B}(x_i)+\nu_{A\cup B}(x_i)\big)^{1-\alpha} + 3^{1-\alpha}\big(1-\mu_{A\cup B}(x_i)-\rho_{A\cup B}(x_i)-\nu_{A\cup B}(x_i)\big)\Big].

For brevity, write

    \Delta_A(x_i) = \big(\mu_A^{\alpha}(x_i)+\rho_A^{\alpha}(x_i)+\nu_A^{\alpha}(x_i)\big)\big(\mu_A(x_i)+\rho_A(x_i)+\nu_A(x_i)\big)^{1-\alpha} + 3^{1-\alpha}\big(1-\mu_A(x_i)-\rho_A(x_i)-\nu_A(x_i)\big),

so that E_α(A) = \frac{1}{n(1-\alpha)}\sum_{i=1}^{n}\log_3 \Delta_A(x_i). Since μ_{A∪B} = μ_B, ρ_{A∪B} = ρ_B, ν_{A∪B} = ν_B on X₁ and μ_{A∪B} = μ_A, ρ_{A∪B} = ρ_A, ν_{A∪B} = ν_A on X₂, the joint entropy becomes

    E_\alpha(A \cup B) = \frac{1}{n(1-\alpha)}\Big\{\sum_{x_i \in X_1}\log_3 \Delta_B(x_i) + \sum_{x_i \in X_2}\log_3 \Delta_A(x_i)\Big\}.    (15)

(ii) Conditional entropy:

    E_\alpha(A|B) = \frac{1}{n(1-\alpha)}\sum_{x_i \in X_2}\Big[\log_3 \Delta_A(x_i) - \log_3 \Delta_B(x_i)\Big]

and

    E_\alpha(B|A) = \frac{1}{n(1-\alpha)}\sum_{x_i \in X_1}\Big[\log_3 \Delta_B(x_i) - \log_3 \Delta_A(x_i)\Big].

Here E_α(A|B) is the entropy of A given B.
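Under the assumption used above, that every x_i falls in X₁ or X₂ (i.e., A and B are componentwise comparable at each point of X), the joint and conditional entropies can be computed directly. The sketch below is ours; _delta and joint_and_conditional are illustrative names, and the Δ helper mirrors the bracketed term of Eq. (1).

    import math

    def _delta(mu, rho, nu, alpha):
        """Delta(x_i): the argument of log_3 in Eq. (1) at a single point."""
        s = mu + rho + nu
        first = (mu**alpha + rho**alpha + nu**alpha) * s**(1 - alpha) if s > 0 else 0.0
        return first + 3**(1 - alpha) * (1 - s)

    def joint_and_conditional(A, B, alpha):
        """Returns (E(A u B), E(A|B), E(B|A)) per Eq. (15) and part (ii),
        assuming each point lies in X1 (A subset of B) or X2 (B subset of A)."""
        c = 1.0 / (len(A) * (1 - alpha))
        joint = cond_ab = cond_ba = 0.0
        for (ma, ra, na), (mb, rb, nb) in zip(A, B):
            la = math.log(_delta(ma, ra, na, alpha), 3)
            lb = math.log(_delta(mb, rb, nb, alpha), 3)
            if ma <= mb and ra >= rb and na >= nb:  # x_i in X1
                joint += lb
                cond_ba += lb - la
            else:                                   # x_i in X2
                joint += la
                cond_ab += la - lb
        return c * joint, c * cond_ab, c * cond_ba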


Theorem 2. For SVNSs A and B the following statements hold:

(i) E_α(A ∪ B) = E_α(A) + E_α(B|A);

(ii) E_α(A ∪ B) = E_α(B) + E_α(A|B);

(iii) E_α(A ∪ B) = E_α(A) + E_α(B|A) = E_α(B) + E_α(A|B);

(iv) E_α(A ∪ B) + E_α(A ∩ B) = E_α(A) + E_α(B).

Proof. (i) Here we prove (i) only; (ii) and (iii) follow from it. Using the notation Δ introduced above,

    E_\alpha(A) + E_\alpha(B|A) - E_\alpha(A \cup B)
      = \frac{1}{n(1-\alpha)}\Big\{\sum_{i=1}^{n}\log_3 \Delta_A(x_i) + \sum_{x_i \in X_1}\big[\log_3 \Delta_B(x_i) - \log_3 \Delta_A(x_i)\big] - \sum_{x_i \in X_1}\log_3 \Delta_B(x_i) - \sum_{x_i \in X_2}\log_3 \Delta_A(x_i)\Big\}
      = \frac{1}{n(1-\alpha)}\Big\{\sum_{i=1}^{n}\log_3 \Delta_A(x_i) - \sum_{x_i \in X_1}\log_3 \Delta_A(x_i) - \sum_{x_i \in X_2}\log_3 \Delta_A(x_i)\Big\}
      = 0.

(iv) For the SVNSs A and B we have μ_{A∩B} = μ_A, ρ_{A∩B} = ρ_A, ν_{A∩B} = ν_A on X₁ and μ_{A∩B} = μ_B, ρ_{A∩B} = ρ_B, ν_{A∩B} = ν_B on X₂, so that

    E_\alpha(A \cap B) = \frac{1}{n(1-\alpha)}\Big\{\sum_{x_i \in X_1}\log_3 \Delta_A(x_i) + \sum_{x_i \in X_2}\log_3 \Delta_B(x_i)\Big\}.

Hence, by the definition of the joint entropy E_α(A ∪ B) given in Eq. (15), we get

    E_\alpha(A \cup B) + E_\alpha(A \cap B) = E_\alpha(A) + E_\alpha(B).
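These identities are straightforward to confirm numerically. The snippet below is ours and reuses entropy_alpha from the sketch after Theorem 1; the set B is chosen arbitrarily except that, at every x_i, it is componentwise comparable with the set A of Example 3.1, as the partition X₁, X₂ requires.

    def union_intersection(A, B):
        """A u B and A n B, componentwise, per Definition 2.3 (iv)-(v)."""
        u = [(max(ma, mb), min(ra, rb), min(na, nb))
             for (ma, ra, na), (mb, rb, nb) in zip(A, B)]
        i = [(min(ma, mb), max(ra, rb), max(na, nb))
             for (ma, ra, na), (mb, rb, nb) in zip(A, B)]
        return u, i

    A = [(0.4, 0.3, 0.9), (0.7, 0.5, 0.3), (0.2, 0.9, 0.8), (0.5, 0.4, 0.6)]
    B = [(0.6, 0.2, 0.5), (0.5, 0.7, 0.4), (0.3, 0.6, 0.7), (0.4, 0.5, 0.8)]
    U, I = union_intersection(A, B)
    lhs = entropy_alpha(U, 2) + entropy_alpha(I, 2)
    rhs = entropy_alpha(A, 2) + entropy_alpha(B, 2)
    assert abs(lhs - rhs) < 1e-12   # Theorem 2 (iv)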


Theorem 3. For SVNSs A and B the following statements hold:

(i) E_α(A) − E_α(A ∩ B) = E_α(A|B);

(ii) E_α(B) − E_α(A ∩ B) = E_α(B|A).

Proof. We prove part (i) only; the other can be proven similarly. Consider

    E_\alpha(A) - E_\alpha(A \cap B)
      = \frac{1}{n(1-\alpha)}\Big\{\sum_{i=1}^{n}\log_3 \Delta_A(x_i) - \sum_{x_i \in X_1}\log_3 \Delta_A(x_i) - \sum_{x_i \in X_2}\log_3 \Delta_B(x_i)\Big\}
      = \frac{1}{n(1-\alpha)}\sum_{x_i \in X_2}\big[\log_3 \Delta_A(x_i) - \log_3 \Delta_B(x_i)\big]
      = E_\alpha(A|B).

This completes the proof.

Let A = ⟨x, μ_A(x), ρ_A(x), ν_A(x)⟩, x ∈ X, be an SVNS in X. For any positive real number n, Zhang et al. [23] defined A^n as follows:

    A^n = \big\langle x, \big(\mu_A(x)\big)^n, 1-\big(1-\rho_A(x)\big)^n, 1-\big(1-\nu_A(x)\big)^n \big\rangle.    (16)

Definition 4. The contraction of an SVNS A in the universe of discourse X is defined by

    CON(A) = ⟨x, μ_{CON(A)}(x), ρ_{CON(A)}(x), ν_{CON(A)}(x)⟩,

where μ_{CON(A)}(x) = [μ_A(x)]², ρ_{CON(A)}(x) = 1 − [1 − ρ_A(x)]² and ν_{CON(A)}(x) = 1 − [1 − ν_A(x)]², i.e., CON(A) = A².

Definition 5. The dilation of an SVNS A in the universe of discourse X is defined by

    DIL(A) = ⟨x, μ_{DIL(A)}(x), ρ_{DIL(A)}(x), ν_{DIL(A)}(x)⟩,

where μ_{DIL(A)}(x) = [μ_A(x)]^{1/2}, ρ_{DIL(A)}(x) = 1 − [1 − ρ_A(x)]^{1/2} and ν_{DIL(A)}(x) = 1 − [1 − ν_A(x)]^{1/2}, i.e., DIL(A) = A^{1/2}.
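Eq. (16) and the two modifiers above translate into one line of code. The sketch below is ours (power is an illustrative name); applied with n = 1/2, 2 and 3, it reproduces the sets A^{1/2}, A² and A³ listed in Example 3.2 below.

    def power(svns, n):
        """A^n per Eq. (16); CON(A) = power(A, 2), DIL(A) = power(A, 0.5)."""
        return [(mu**n, 1 - (1 - rho)**n, 1 - (1 - nu)**n)
                for mu, rho, nu in svns]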


An illustrative example has been tested on the concentration and dilation for comparing the performance of the proposed entropy with some existing entropies, given below, where π_A(x_i) = 1 − μ_A(x_i) − ν_A(x_i) denotes the hesitation degree:

(i) the entropy defined by Szmidt and Kacprzyk [5],

    E_{SK}(A) = \frac{1}{n}\sum_{i=1}^{n} \frac{\min\big(\mu_A(x_i), \nu_A(x_i)\big) + \pi_A(x_i)}{\max\big(\mu_A(x_i), \nu_A(x_i)\big) + \pi_A(x_i)};

(ii) the entropy defined by Burillo and Bustince [3],

    E_{BB}(A) = \frac{1}{n}\sum_{i=1}^{n} \pi_A(x_i);

(iii) the entropy defined by Zhang and Jiang [8],

    E_{ZJ}(A) = \frac{1}{n}\sum_{i=1}^{n} \frac{\mu_A(x_i) \wedge \nu_A(x_i)}{\mu_A(x_i) \vee \nu_A(x_i)};

(iv) the entropy defined by Zeng and Li [4],

    E_{ZL}(A) = 1 - \frac{1}{n}\sum_{i=1}^{n} \big|\mu_A(x_i) - \nu_A(x_i)\big|.

Example 3.2. Let X = {x₁, x₂, ..., x₅} be the universe of discourse, and let the SVNS A representing LARGE on X be defined as A = {⟨x₁, 0.1, 0.7, 0.8⟩, ⟨x₂, 0.3, 0.6, 0.5⟩, ⟨x₃, 0.5, 0.3, 0.4⟩, ⟨x₄, 0.9, 0.2, 0.0⟩, ⟨x₅, 1.0, 0.1, 0.0⟩}. Using the operations defined in Eq. (16) on SVNSs, we can generate the SVNSs A, A^{1/2}, A² and A³, where

A^{1/2} may be treated as "More or Less LARGE",
A² may be treated as "Very LARGE",
A³ may be treated as "Quite Very LARGE",

and these corresponding sets are computed as

A^{1/2} = {⟨x₁, 0.3162, 0.4523, 0.5528⟩, ⟨x₂, 0.5477, 0.3675, 0.2929⟩, ⟨x₃, 0.7071, 0.1633, 0.2254⟩, ⟨x₄, 0.9487, 0.1056, 0⟩, ⟨x₅, 1.0000, 0.0513, 0⟩};
A¹ = {⟨x₁, 0.1, 0.7, 0.8⟩, ⟨x₂, 0.3, 0.6, 0.5⟩, ⟨x₃, 0.5, 0.3, 0.4⟩, ⟨x₄, 0.9, 0.2, 0.0⟩, ⟨x₅, 1.0, 0.1, 0⟩};
A² = {⟨x₁, 0.01, 0.91, 0.96⟩, ⟨x₂, 0.09, 0.84, 0.75⟩, ⟨x₃, 0.25, 0.51, 0.64⟩, ⟨x₄, 0.81, 0.36, 0⟩, ⟨x₅, 1.00, 0.19, 0⟩};
A³ = {⟨x₁, 0.0010, 0.9730, 0.9920⟩, ⟨x₂, 0.0270, 0.9360, 0.8750⟩, ⟨x₃, 0.1250, 0.6570, 0.7840⟩, ⟨x₄, 0.7290, 0.4880, 0⟩, ⟨x₅, 1.000, 0.2710, 0⟩}.

The entropy values corresponding to the existing measures, as well as to the proposed measure for different values of α, are summarized in Table 1 for these different linguistic-variable SVNSs.

Table 1: Values of different entropy measures for IFSs

    Entropy measure              A^{1/2}    A         A^2       A^3       Ranking
    E_BB [3]                     0.0818    0.1000    0.0980    0.0934    (2341)
    E_ZL [4]                     0.4156    0.4200    0.2380    0.1546    (2134)
    E_SK [5]                     0.3446    0.3740    0.1970    0.1309    (2134)
    E_hc^2 [7]                   0.3416    0.3440    0.2610    0.1993    (2134)
    E_r^{1/2} [7]                0.6672    0.6777    0.5813    0.4805    (2134)
    E_ZJ [8]                     0.2851    0.3050    0.1042    0.0383    (2134)
    E_α(A) (proposed measure)
      α = 0.3                    0.7548    0.7566    0.6704    0.5774    (2134)
      α = 0.5                    0.7070    0.7139    0.6101    0.5137    (2134)
      α = 0.8                    0.6517    0.6637    0.5579    0.4731    (2134)
      α → 1                      0.6238    0.6385    0.5372    0.4611    (2134)
      α = 2                      0.5442    0.5727    0.4956    0.4513    (2134)
      α = 5                      0.4725    0.5317    0.4858    0.4793    (2341)
      α = 10                     0.4418    0.5173    0.4916    0.4999    (2431)
      α = 15                     0.4312    0.5112    0.4937    0.5064    (2431)
      α = 50                     0.4166    0.4994    0.4937    0.5064    (4231)
      α = 100                    0.4137    0.4965    0.4612    0.5112    (4231)

From this table, it has been concluded that with the increase of the parameter α, the entropy measures for the linguistic variables "More or Less LARGE", "LARGE" and "VERY LARGE" decrease. Also, it has been observed that whenever the value of α increases from 0 to 15, the pattern for the variable LARGE is E_α(A) > E_α(A^{1/2}) > E_α(A²) > E_α(A³), and the results coincide with the results of the existing measures. On the other hand, whenever the value of α increases beyond 15, the order of the patterns is slightly different. Hence the proposed entropy measure can be used as an alternative measure for computing the order value of the linguistic variables, as compared to the existing ones. Moreover, the proposed measure is more generalized, as different values of α give different choices to the decision-maker for assessing the results, and hence it is more reliable from the linguistic-variable point of view.
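The proposed-measure rows of Table 1 can be regenerated by combining the two earlier sketches, entropy_alpha and power (both ours). For instance, the α = 2 row:

    A = [(0.1, 0.7, 0.8), (0.3, 0.6, 0.5), (0.5, 0.3, 0.4),
         (0.9, 0.2, 0.0), (1.0, 0.1, 0.0)]          # the SVNS "LARGE"
    for n in (0.5, 1, 2, 3):
        print(n, round(entropy_alpha(power(A, n), 2), 4))
    # approx. 0.5442, 0.5727, 0.4956, 0.4513 -- the alpha = 2 row of Table 1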


4 MCDM problem based on the proposed entropy measure

In this section, we discuss the method for solving the MCDM problem based on the proposed entropy measure.

4.1 MCDM method based on the proposed entropy measure

Consider a set of different alternatives A = {A₁, A₂, ..., A_m} having different criteria C = {C₁, C₂, ..., C_n} in the neutrosophic environment. The steps for computing the best alternative are summarized as follows.

Step 1: Construct the decision-making matrix: Arrange each alternative A_i under each criterion C_j, according to the preferences of the decision maker, in the form of a neutrosophic matrix D_{m×n} = [⟨μ_ij, ρ_ij, ν_ij⟩], where μ_ij represents the degree to which alternative A_i satisfies criterion C_j, ρ_ij represents the degree to which alternative A_i is indeterminate about criterion C_j, and ν_ij represents the degree to which alternative A_i does not satisfy criterion C_j; here 0 ≤ μ_ij, ρ_ij, ν_ij ≤ 1 and μ_ij + ρ_ij + ν_ij ≤ 3, for i = 1, 2, ..., m and j = 1, 2, ..., n. The decision matrix is given below:

    D_{m \times n} = \begin{pmatrix} \langle\mu_{11},\rho_{11},\nu_{11}\rangle & \langle\mu_{12},\rho_{12},\nu_{12}\rangle & \cdots & \langle\mu_{1n},\rho_{1n},\nu_{1n}\rangle \\ \langle\mu_{21},\rho_{21},\nu_{21}\rangle & \langle\mu_{22},\rho_{22},\nu_{22}\rangle & \cdots & \langle\mu_{2n},\rho_{2n},\nu_{2n}\rangle \\ \vdots & \vdots & \ddots & \vdots \\ \langle\mu_{m1},\rho_{m1},\nu_{m1}\rangle & \langle\mu_{m2},\rho_{m2},\nu_{m2}\rangle & \cdots & \langle\mu_{mn},\rho_{mn},\nu_{mn}\rangle \end{pmatrix}

Step 2: Normalize the decision matrix: The criteria of the alternatives may be of the same type or of different types. If all the criteria are of the same kind, then there is no need for normalization. Otherwise, the benefit-type criterion values should be converted to the cost type by using the following method:

    r_{ij} = \begin{cases} \xi_{ij}^{c}, & j \in B \\ \xi_{ij}, & j \in C \end{cases}    (17)

where ξ_ij^c = ⟨ν_ij, ρ_ij, μ_ij⟩ is the complement of ξ_ij = ⟨μ_ij, ρ_ij, ν_ij⟩, and B and C denote the sets of benefit and cost criteria, respectively. Hence, we obtain the normalized NS decision matrix R = [r_ij]_{m×n}.

Step 3: Compute the aggregated value of the alternatives: Using the proposed entropy measure, aggregate the rating values corresponding to each alternative A_i (i = 1, 2, ..., m) and get the overall value r_i.

Step 4: Rank the alternatives: Rank all the alternatives A_i (i = 1, 2, ..., m) according to the values of the proposed entropy obtained from Step 3 and get the most desirable alternative.

4.2 Illustrative Example

Let us consider a multi-criteria decision-making problem. An investment company wants to invest a sum of money in the best option. There is a panel with four possible alternatives in which to invest the money, namely:

(i) A₁, a food company;

(ii) A₂, a transport company;

(iii) A₃, an electronics company;

(iv) A₄, a tyre company.

The decision maker takes the decision according to the three criteria given below:

a) C₁: growth analysis;

b) C₂: risk analysis;

c) C₃: environment impact analysis.

Then the following procedure has been followed for computing the best alternative as an investment.

Step 1: The value of each alternative A_i (i = 1, 2, 3, 4) with respect to each criterion C_j (j = 1, 2, 3) is obtained from a questionnaire answered by a domain expert. Thus, when the four possible alternatives are evaluated by the expert with respect to the above three criteria, we obtain the following single-valued neutrosophic decision matrix:

    D = \begin{pmatrix} \langle 0.5, 0.2, 0.3\rangle & \langle 0.5, 0.1, 0.4\rangle & \langle 0.7, 0.1, 0.2\rangle \\ \langle 0.4, 0.2, 0.3\rangle & \langle 0.3, 0.2, 0.4\rangle & \langle 0.8, 0.3, 0.2\rangle \\ \langle 0.4, 0.3, 0.1\rangle & \langle 0.5, 0.1, 0.3\rangle & \langle 0.5, 0.1, 0.4\rangle \\ \langle 0.6, 0.1, 0.2\rangle & \langle 0.2, 0.2, 0.5\rangle & \langle 0.4, 0.3, 0.2\rangle \end{pmatrix}

Step 2: Since criterion C₁ is a benefit criterion and C₂, C₃ are cost criteria, the above decision matrix is transformed into the following normalized matrix R = [⟨T_ij, I_ij, F_ij⟩]:

    R = \begin{pmatrix} \langle 0.3, 0.2, 0.5\rangle & \langle 0.5, 0.1, 0.4\rangle & \langle 0.7, 0.1, 0.2\rangle \\ \langle 0.3, 0.2, 0.4\rangle & \langle 0.3, 0.2, 0.4\rangle & \langle 0.8, 0.3, 0.2\rangle \\ \langle 0.1, 0.3, 0.4\rangle & \langle 0.5, 0.1, 0.3\rangle & \langle 0.5, 0.1, 0.4\rangle \\ \langle 0.2, 0.1, 0.6\rangle & \langle 0.2, 0.2, 0.5\rangle & \langle 0.4, 0.3, 0.2\rangle \end{pmatrix}

Step 3: Utilizing the proposed entropy measure with α = 2 to aggregate the values of all the alternatives, we get E_α(A₁) = 0.7437, E_α(A₂) = 0.8425, E_α(A₃) = 0.8092 and E_α(A₄) = 0.8089.

Step 4: Based on the above values, we conclude that the ranking of the given alternatives is

    E_α(A₂) > E_α(A₃) > E_α(A₄) > E_α(A₁).

Hence, A₂ is the best alternative, i.e., the investment company should invest in the transport company.


5 Conclusion

In this article, we have introduced an entropy measure of order α for single-valued neutrosophic numbers, for measuring the degree of fuzziness of a set in which the uncertainties present in the data are characterized by the truth, the indeterminacy and the falsity membership degrees. Some desirable properties corresponding to this entropy have also been illustrated. A structural linguistic variable has been taken as an illustration. Finally, a decision-making method has been proposed based on the entropy measures. To demonstrate the efficiency of the proposed measure, a numerical example from the investment field has been taken. A comparative study, as well as the effect of the parameter on the ranking of the alternatives, supports the theory, and hence demonstrates that the proposed measures provide an alternative way of solving decision-making problems.

References

[1] L. A. Zadeh, Fuzzy sets, Information and Control 8 (1965) 338-353.

[2] K. T. Atanassov, Intuitionistic fuzzy sets, Fuzzy Sets and Systems 20 (1986) 87-96.

[3] P. Burillo, H. Bustince, Entropy on intuitionistic fuzzy sets and on interval-valued fuzzy sets, Fuzzy Sets and Systems 78 (3) (1996) 305-316.

[4] W. Zeng, H. Li, Relationship between similarity measure and entropy of interval-valued fuzzy sets, Fuzzy Sets and Systems 157 (11) (2006) 1477-1484.

[5] E. Szmidt, J. Kacprzyk, Entropy for intuitionistic fuzzy sets, Fuzzy Sets and Systems 118 (3) (2001) 467-477.

[6] H. Garg, N. Agarwal, A. Tripathi, Generalized intuitionistic fuzzy entropy measure of order α and degree β and its applications to multi-criteria decision making problem, International Journal of Fuzzy System Applications 6 (1) (2017) 86-107.

[7] W. L. Hung, M. S. Yang, Fuzzy entropy on intuitionistic fuzzy sets, International Journal of Intelligent Systems 21 (2006) 443-451.

[8] Q. S. Zhang, S. Y. Jiang, A note on information entropy measure for vague sets, Information Sciences 178 (2008) 4184-4191.

[9] H. Garg, A new generalized improved score function of interval-valued intuitionistic fuzzy sets and applications in expert systems, Applied Soft Computing 38 (2016) 988-999.

[10] H. Garg, A new generalized pythagorean fuzzy information aggregation using einstein operations and its application to decision making, International Journal of Intelligent Systems 31 (9) (2016) 886-920.

[11] H. Garg, A novel correlation coefficients between pythagorean fuzzy sets and its applications to decision-making processes, International Journal of Intelligent Systems 31 (12) (2016) 1234-1253.

[12] H. Garg, Generalized intuitionistic fuzzy interactive geometric interaction operators using einstein t-norm and t-conorm and their application to decision making, Computers and Industrial Engineering 101 (2016) 53-69.

[13] F. Smarandache, A unifying field in logics. Neutrosophy: neutrosophic probability, set and logic, American Research Press, Rehoboth, 1999.

[14] H. Wang, F. Smarandache, Y. Q. Zhang, R. Sunderraman, Single valued neutrosophic sets, Multispace and Multistructure 4 (2010) 410-413.

[15] J. Ye, Multiple attribute group decision-making method with completely unknown weights based on similarity measures under single valued neutrosophic environment, Journal of Intelligent and Fuzzy Systems 27 (6) (2014) 2927-2935.

[16] P. Majumdar, S. K. Samant, On similarity and entropy of neutrosophic sets, Journal of Intelligent and Fuzzy Systems 26 (3) (2014) 1245-1252.

[17] J. Ye, Similarity measures between interval neutrosophic sets and their applications in multicriteria decision-making, Journal of Intelligent and Fuzzy Systems 26 (1) (2014) 165-172.

[18] H. L. Huang, New distance measure of single-valued neutrosophic sets and its application, International Journal of Intelligent Systems 31 (10) (2016) 1021-1032.

[19] S. Broumi, F. Smarandache, Correlation coefficient of interval neutrosophic set, Applied Mechanics and Materials 436 (2013) 511-517.

[20] I. Hanafy, A. Salama, K. Mahfouz, Correlation coefficients of neutrosophic sets by centroid method, International Journal of Probability and Statistics 2 (1) (2013) 9-12.

[21] S. Broumi, F. Smarandache, Cosine similarity measure of interval valued neutrosophic sets, Neutrosophic Sets and Systems 5 (2014) 15-20.

[22] J. Ye, Improved cosine similarity measures of simplified neutrosophic sets for medical diagnoses, Artificial Intelligence in Medicine 63 (2015) 171-179.

[23] H. Y. Zhang, J. Q. Wang, X. H. Chen, Interval neutrosophic sets and their application in multicriteria decision making problems, The Scientific World Journal, Volume 2014 (2014), Article ID 645953, 15 pages.

Received: November 11, 2016. Accepted: November 17, 2016.
