By
Dinu Mathew Panampunna
PGPDCM II
(2004-06)
Submitted to
Mudra Institute of Communications, Ahmedabad
In partial fulfillment of the requirements for the Postgraduate Programme
Diploma in Communications Management
Dissertation Guide:
Dr. Rasananda Panda
Faculty, MICA
Notice of Copyright
Executive Summary
ACKNOWLEDGEMENT
I am happy to place on record my gratitude towards numerous persons who have made
valuable contributions academically and non-academically towards the completion of this
work.
It is difficult for me to express in words the deep sense of gratitude I owe to Dr. Rasananda Panda for his generous assistance, constant supervision, and patient forbearance of the numerous problems and difficulties encountered during the present study. It would have been impossible for me to do this study but for the inspiration I received from Prof. Panda at different stages of my work.
I am thankful to Prof. Mathew, whom I constantly troubled with mallu expletives, Prof. Neeraj Amarnani, Prof. Naval Bhargav, Dr. Shubhra Gaur, and Prof. Anita Basalingappa for their valuable suggestions as my dissertation committee. I am sincerely grateful to them for the time they spent in discussion and for the help they rendered in shaping this thesis.
I express my sincere thanks to my best friends Gujju, Shubs, Smita, Mohit, and Abhijit, who sat as a patient audience to my intellectual outbursts, for the encouragement they extended during my study. I also sincerely express my heartfelt gratitude to my junior Gaurav Bose, who helped me with the typesetting and also donned the role of a competitive evaluator.
I would fail in my duty if I forgot to record my heartfelt gratitude to my fiancée, Lija Ramachandran, whose constant criticism and encouragement have helped me to improve the work significantly.
Last but not least, my heartfelt thanks go to my parents, who have been a continuous source of inspiration and encouragement over the years. Without their blessing, I would not have been able to start and complete my dissertation.
Dinu Mathew Panampunna
CONTENTS
Notice of Copyright
Executive Summary
Acknowledgement
Chapter 1
Chapter 2
Chapter 3
Extreme, synchronized rises and falls in financial markets occur infrequently but
they do occur. The problem with the models is that they did not assign a high
enough chance of occurrence to the scenario in which many things go wrong at
at the same time - the perfect storm scenario
(Business Week, September 1998).
"That which is static and repetitive is boring. That which is dynamic and random
is confusing. In between lies art." - John A. Locke
Regulators have criticized LTCM and banks for not stress-testing risk models
against extreme market movements... The markets have been through the financial
equivalent of several Hurricane Andrews hitting Florida all at once. Is the
appropriate response to accept that it was mere bad luck to run into such a rare
event - or to get new forecasting models that assume more storms in the future?
(The Economist, October 1998, after the LTCM rescue)
Chapter 1
Abstract
1.1 Introduction
Financial risk-taking is a concern of public policy because associated with the risk-taking actions of individuals there are externalities, i.e. costs and benefits accruing to society that are external to the calculations of the individual investor and not accounted for in the marketplace.1 In an economy where there are important externalities, competitive markets will be socially inefficient. The task of public policy, in this case of financial regulation, is to attempt to mitigate these inefficiencies. Financial externalities are particularly potent because they are transmitted macroeconomically. Yet despite all this talk of externalities, contagion and panics, a peculiarity of market expectations is that they seem to be remarkably stable (or tranquil) for substantial periods of time, even when underlying real circumstances might be decidedly unpropitious.
The externality of systemic risk is in large part manifest through what the economist John Maynard Keynes called a beauty contest. In Keynes's contest, beauty is not in the eye of the beholder. Instead, the game is won by those who can accurately assess what others think is beautiful. In financial markets, it is knowing what others believe to be true that is the key to knowing how markets will behave. The market is driven by participants' beliefs about what average opinion believes average opinion believes, and so on, ad infinitum (Keynes 1936; Eatwell and Taylor 2000).
1
There are a number of other important market failures in the financial sector which attract the concerns of public policy, most notably the asymmetry of information between individual savers and market professionals that is the motivation of consumer protection. This study deals solely with the market failure manifest in systemic risk.
The liberalisation of financial markets that has taken place over the past three decades has inevitably reduced the heterogeneity in financial markets. By definition, liberalisation has broken down market segmentation, and cross-market correlations have risen sharply. With liberalisation has come a growing professionalisation of financial management (BIS 1998: Chapter V) and extensive consolidation of financial institutions (Group of Ten 2001). The professional investor is not only subject to continual pressure to maximize short-term returns; in a competitive market, myopic (i.e. short-horizon) investment is also an optimal strategy (Kurz 1987). So whatever the preferences of the private investor might be, convergence on myopic strategies by professional investors is homogenizing the market.
Public policy also needs to take into account the fact that beliefs about average opinion transmit externalities through macroeconomic variables: the interest rate, say, or the general level of stock prices, or the exchange rate. So effective regulation of firms should be conceived in conjunction with macroeconomic policy. This is particularly true in an international setting, where a major focus of systemic risk is the exchange rate. In policy terms, macroeconomic action may be a far more efficient means of reducing systemic risk than traditional microeconomic regulation.
1.4 Methodology
To analyse the above stated objective, the following methodology is proposed. The study will briefly describe the aspects of point process (PP) theory that are central to the paper and discuss an intensity-based approach to inference for PPs. I adopt a different approach in which the model is specified via the vector stochastic intensity. This provides a natural and powerful modeling framework for multivariate market event data. Each element of the stochastic intensity is a continuous-time process that may be interpreted as the conditional hazard for the particular type of market event in question. The field upon which the hazard is conditioned is updated continuously as new information arrives, thus allowing other types of event to influence the hazard as they occur in continuous time. My approach is closest to that of Russell (1999), who also specifies a multivariate PP model via the stochastic intensity.
I would then like to introduce a new class of models for financial market event data, the generalized Hawkes models. These allow the estimation of the nature of the dependence of the intensity on the events of previous trading days rather than imposing strong, a priori assumptions concerning this dependence. Building on the generalized Hawkes model, I would then attempt to follow the suggestions of Russell (1999) for the construction of diagnostic tests for parametric, multivariate PP models.
There is now a reasonably large body of empirical work testing for the existence of contagion during financial crises. A range of different methodologies is in use, making it difficult to assess the evidence for and against contagion, and particularly its significance in transmitting crises between countries.2 The origins of current empirical studies of contagion stem from Sharpe (1964) and Grubel and Fadner (1971), and more recently from King and Wadhwani (1990), Engle, Ito and Lin (1990) and Bekaert and Hodrick (1992).
2
The literature on financial crises themselves is much wider than that canvassed here and is reviewed in Flood and Marion (1998), while more recent papers are represented by Allen and Gale (2000), Calvo and Mendoza (2000), Kyle and Xiong (2001) and Kodres and Pritsker (2002).
To simplify the analysis, the number of assets considered is three. Extending the model to N assets is straightforward, with an example given below. Let the returns of three asset markets during a non-crisis period be defined as xi,t, i = 1, 2, 3. All returns are assumed to have zero means. The returns could be on currencies, or national equity markets, or a combination of currency and equity returns in a particular country or across countries. The following trivariate factor model is assumed to summarise the dynamics of the three processes during a period of tranquility:

xi,t = λi wt + δi ui,t,   i = 1, 2, 3

where wt is a common shock and the ui,t are idiosyncratic shocks.
The definition of the term contagion varies widely across the literature. In this paper contagion is represented by the transmission of unanticipated local shocks to another country or market. This definition is consistent with that of Masson (1999a, b, c), who divides shocks to asset markets into common shocks, spillovers that result from some identifiable channel, and contagion, and, as shown below, with that of other approaches, such as Forbes and Rigobon (2002), where contagion is identified as a significant increase in cross-market correlation during the crisis period.

y1,t = λ1 wt + δ1 u1,t
y2,t = λ2 wt + δ2 u2,t + γ u1,t ...................(1.3)
y3,t = λ3 wt + δ3 u3,t

where the xi,t are replaced by yi,t to signify demeaned asset returns during the crisis period. The expression for y2,t now contains a contagious transmission channel as represented by unanticipated local shocks from the asset market in country 1, with its impact measured by the parameter γ. The fundamental aim of all empirical models of contagion is to test the statistical significance of the parameter γ.3

3
An important assumption underlying (1.3) is that the common shock and idiosyncratic shocks have the same impact during the crisis period as they have during the non-crisis period.
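To make the mechanics concrete, the factor model and the role of γ can be sketched in a small simulation. This is a purely illustrative exercise, not the estimation procedure used in the literature: the loadings, the crisis value γ = 0.8, and the use of a simple correlation comparison are all assumed for the example; actual tests of γ require sufficient moment conditions, as discussed below.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 100_000  # observations per period

# Illustrative loadings (not estimated from any data)
lam1, lam2 = 1.0, 1.0   # loadings on the common shock w_t
d1, d2 = 1.0, 1.0       # loadings on the idiosyncratic shocks u_{i,t}

def simulate(gamma, T):
    """Simulate (y1, y2) from the bivariate version of the factor model;
    gamma is the contagion parameter on u_{1,t} in the y2 equation."""
    w = rng.standard_normal(T)
    u1 = rng.standard_normal(T)
    u2 = rng.standard_normal(T)
    y1 = lam1 * w + d1 * u1
    y2 = lam2 * w + d2 * u2 + gamma * u1
    return y1, y2

y1_nc, y2_nc = simulate(0.0, T)   # non-crisis period: gamma = 0
y1_c, y2_c = simulate(0.8, T)     # crisis period: gamma > 0

corr_nc = np.corrcoef(y1_nc, y2_nc)[0, 1]  # theoretical value 0.5 here
corr_c = np.corrcoef(y1_c, y2_c)[0, 1]     # rises above 0.5 when gamma > 0
print(corr_nc, corr_c)
```

Comparing raw correlations across periods conflates contagion with changes in volatility, which is one reason the correlation-based tests in the literature require adjustment; the sketch only illustrates that a non-zero γ induces excess co-movement.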
The test for contagion presented so far is a test for contagion from country 1 to country 2. However, it is possible to test for contagion in many directions provided that there are sufficient moment conditions to identify the unknown parameters. For example, (1.3) can be extended as

yi,t = λi wt + δi ui,t + Σj≠i γi,j uj,t,   i = 1, 2, 3 ...................(1.6)

where γi,j measures contagion from the asset market in country j to that in country i.
1.9 Databases
The next chapter explains in detail the modeling of the hazards that accompany a contagion and attempts to make use of robust models to predict the hazard variables.
Chapter 2
Abstract
This chapter attempts to sketch the mathematical ideation behind time series analysis, extending it into stochastic processes. An attempt has been made to model trends in financial data using the Random Walk Hypothesis and its myriad variants. An S-variant point process model has been laid down to explain the effects that time and the type of event have on the stock price. The stochastic intensity approach and its diagnostic tool, the generalized Hawkes model, have been introduced to further strengthen the process. Itô's Lemma has been introduced at this stage to separate the deterministic and random components of a stock price and to analyse whether any deterministic character can be explained out of the random component in the generalized Hawkes model.
There are two main goals of time series analysis: (a) identifying the nature of the
phenomenon represented by the sequence of observations, and (b) forecasting
(predicting future values of the time series variable). Both of these goals require
that the pattern of observed time series data is identified and more or less formally
described. Once the pattern is established, one can interpret and integrate it with
other data. The mathematical ideation behind Time Series Analysis (TSA) is the concept of an abstract probability space (Ω, F, P) on which a point process {Ti} is defined such that 0 < Ti ≤ Ti+1, where:
Ω - sample space
F - sigma-algebra defined on Ω
P - probability measure on (Ω, F)
Ti - point process on (0, ∞)
If a time series is stationary, its mean, variance, and autocovariance (at various lags) remain the same no matter at what point one measures them; that is, they are time-invariant. Such a time series will tend to return to its mean, and fluctuations around this mean (measured by its variance) will have a broadly constant amplitude. If a time series is not stationary in the sense just defined, it is called a non-stationary time series. In other words, a non-stationary time series will have a time-varying mean or a time-varying variance or both.
It is important to understand why a stationary time series is preferred to a non-stationary one. If a time series is non-stationary, one can study its behaviour only for the time period under consideration; each set of time series data is therefore specific to a particular episode. As a result, it is not possible to generalize it to other time periods. Therefore, for the purpose of forecasting, such non-stationary time series may be of little practical value.
Consider the AR(1) model

Yt = ρ Yt-1 + ut,   -1 ≤ ρ ≤ 1 ...................(2.1)

a) Pure Random Walk
If ρ = 1, (2.1) becomes

Yt = Yt-1 + ut

which is nothing but a pure random walk without drift and hence is non-stationary. Its first difference, however,

ΔYt = (Yt - Yt-1) = ut ...................(2.2)

is stationary. To distinguish the possible kinds of trend, consider the more general model

Yt = β1 + β2 t + β3 Yt-1 + ut ...................(2.3)

b) Deterministic Trend
If β1 ≠ 0, β2 ≠ 0, β3 = 0:

Yt = β1 + β2 t + ut ...................(2.4)

Here, though the mean of Yt is β1 + β2 t, its variance is constant. Once the values of β1 and β2 are known, the mean can be forecast perfectly. Therefore, if one subtracts the mean of Yt from Yt, the resulting series will be stationary; hence the name trend stationary.

c) Random Walk with Drift and Deterministic Trend
If β1 ≠ 0, β2 ≠ 0, β3 = 1:

Yt = β1 + β2 t + Yt-1 + ut ...................(2.5)

Here Yt is non-stationary.
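The contrast between the non-stationary and trend-stationary cases can be illustrated numerically. The sketch below, with purely illustrative parameter values, simulates many paths of a pure random walk and of a trend-stationary series: the variance of the random walk grows linearly with time, while the detrended trend-stationary series has constant variance.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 2000, 400  # number of simulated paths, time periods

u = rng.standard_normal((N, T))
t = np.arange(1, T + 1)

# Pure random walk: Y_t = Y_{t-1} + u_t, so Var(Y_t) = t * sigma^2
# and the variance grows with time (non-stationary).
rw = np.cumsum(u, axis=1)
var_rw = rw.var(axis=0)  # cross-sectional variance at each date

# Trend-stationary series: Y_t = b1 + b2*t + u_t; subtracting the
# deterministic mean path leaves u_t, whose variance is constant.
b1, b2 = 1.0, 0.05
ts = b1 + b2 * t + u
var_detrended = (ts - (b1 + b2 * t)).var(axis=0)

print(var_rw[49], var_rw[-1])   # grows roughly like t: about 50, then 400
print(var_detrended[-1])        # stays near 1 at every date
```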
It becomes pertinent at this point, after laying out all the possibilities of the random walk model and having determined the variability of its trends, to look into more specific factors and theoretical constructs that define and determine multivariate market event data. (Multivariate refers to more than one event under consideration.) A relevant issue to be considered at this point is whether the obvious difficulties of modeling irregularly spaced event data warrant moving to aggregated data. A standard time series analysis of aggregated data using fixed intervals of real time is also problematic. Since the data record the timing and characteristics of individual market events, aggregation involves an undesirable loss of information. The characteristics and timing relations of individual transactions will be lost, undermining the advantages of moving to transactions data in the first place. The considerations set out above suggest that models for market event data set in continuous time are likely to provide important economic insights into the functioning of financial markets.
Market event data can be viewed as the realisation of a Marked Point Process, that is, as the realisation of a double sequence {(Ti, Zi)}, i ∈ {1, 2, ...}, of random variables, where Ti is the random occurrence time of the ith event and Zi is a vector of additional variables (or 'marks') associated with that event:
Ti - occurrence time of the ith market event
Zi - the event type
Whilst considerable progress has been made in modeling the univariate case using
time series models of durations, multivariate extensions of this work have been
slow to emerge in the econometrics literature. It is believed that approaching the
problem of modelling multivariate market event data by directly specifying the
stochastic intensity provides a powerful, flexible framework.
There is a family of models, the generalised Hawkes models, for which analytic likelihoods are available and diagnostic tests based on the integrated intensity can be constructed. In contrast to previous work, the models are general enough to allow one to estimate the nature of the dependence of the intensity on the events of previous trading days rather than imposing strong, a priori assumptions concerning this dependence.
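A minimal sketch of how a self-exciting intensity behaves can be given for the simplest univariate Hawkes model with an exponential kernel; the generalized Hawkes models referred to above are considerably richer, and the parameter values here are purely illustrative. The simulation uses Ogata's thinning algorithm, exploiting the fact that an exponential-kernel intensity only decays between events, so the current intensity is a valid upper bound.

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, horizon, seed=2):
    """Simulate event times of a univariate Hawkes process with intensity
    lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i))
    via Ogata's thinning, using the recursive form of the exponential kernel."""
    rng = np.random.default_rng(seed)
    events = []
    t, excited = 0.0, 0.0  # 'excited' = alpha * sum_i exp(-beta * (t - t_i))
    while True:
        lam_bar = mu + excited               # upper bound: intensity can only
        w = rng.exponential(1.0 / lam_bar)   # decay until the next event
        t += w
        if t >= horizon:
            break
        excited *= np.exp(-beta * w)         # decay the self-excited part
        if rng.uniform() <= (mu + excited) / lam_bar:
            events.append(t)                 # accept: a new event occurs
            excited += alpha                 # and excites future intensity
    return np.array(events)

ev = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, horizon=2000.0)
# With branching ratio alpha/beta = 2/3 < 1 the process is stationary with
# mean event rate mu / (1 - alpha/beta) = 1.5.
print(len(ev) / 2000.0)
```

Each event raises the intensity by alpha and so raises the chance of further events, which is the clustering behaviour that motivates this class of models for market event data.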
Let N(t) be a simple point process on (0, ∞), defined on (Ω, F, P) and adapted to some filtration {Ft}, and let λt be a positive, Ft-predictable process (the stochastic intensity). Then

E[N(t) - N(s) | Fs] = E[ ∫_s^t λ(u) du | Fs ] ...................(2.6)
This is one of the most crucial assumptions made here. The conditional expectation of the point process N(t), after filtering out effects through the stochastic intensity, is the ground on which the whole theoretical construction can be accepted or rejected. The stochastic filter should be able to identify conditional hazards after having observed the historical, predetermined contingency effects. The extended analysis should proceed from here to a very explicit attempt to predict additional events, which forms the very crux of what the study set out to achieve.
Standing at time s, having observed the history Fs, suppose one wishes to predict the number of additional events that will occur by time t using the conditional expectation E[(Nt - Ns) | Fs]. Then one can equivalently use

E[ ∫_s^t λ(u) du | Fs ] ...................(2.7)
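The diagnostic use of the integrated intensity can be illustrated with the time-rescaling property: under a correctly specified model, the integrated intensity between successive events is i.i.d. exponential with unit mean. The sketch below uses a simple inhomogeneous Poisson intensity, chosen so that the integral is available analytically; it stands in for the more involved Hawkes case.

```python
import numpy as np

rng = np.random.default_rng(3)

# Inhomogeneous Poisson process with intensity lam(t) = 2 + sin(t),
# simulated by thinning against the constant bound lam_max = 3.
lam = lambda t: 2.0 + np.sin(t)
lam_max, horizon = 3.0, 5000.0
t, events = 0.0, []
while True:
    t += rng.exponential(1.0 / lam_max)
    if t >= horizon:
        break
    if rng.uniform() <= lam(t) / lam_max:
        events.append(t)
events = np.array(events)

# Time-rescaling: Lambda_i = integral of lam(u) from t_{i-1} to t_i
# should be i.i.d. Exp(1) if the model is correct.  Here the integral
# is analytic: int (2 + sin u) du = 2u - cos(u).
Lam = lambda a, b: 2.0 * (b - a) - (np.cos(b) - np.cos(a))
rescaled = Lam(events[:-1], events[1:])
print(rescaled.mean(), rescaled.var())  # both close to 1 for Exp(1)
```

Departures of the rescaled interarrival times from the unit exponential distribution signal misspecification of the intensity, which is the basis of the diagnostic tests referred to above.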
One has to look back carefully at this point, since this forms one of the crucial building blocks of the analysis. This is also one of the inflexion points where the study lacks depth for an overall understanding of the subject. The intensity culled out from the predicted Ft process could well have been tested empirically for its validity against past data and the financial scams that have rocked the Indian financial market.
λj(t) = μj(t) + Σ λjk(t),   k = 1, ..., K ...................(2.9)

The stochastic intensity of the Hawkes model is thus the sum of a deterministic component μj(t) and K non-deterministic components λjk(t), k = 1, ..., K.
The mathematical rigor needed for a complete exposition of the Hawkes model is beyond the scope of the present study, but at this juncture it gives the much-needed theoretical support to continue with the original idea of separating the conditional hazards. It seems an empirical impossibility to model these hazards mathematically in full, but the purpose of the study should be seen as providing a new perspective on the idea of modeling financial contagion. To simplify the issue further, strong assumptions need to be made at this point. One might well assume that the non-deterministic component as defined by the Hawkes model follows a martingale process, i.e., that the conditional expected value of the next observation, given all the past observations, is equal to the last observation.
It may seem far-fetched at this point to introduce a theoretically conceived model, with all its uncertainties. The construction may well be a hazard in itself, but the entire intellectual exercise should be viewed in the light of creating a new perspective on the randomness approach. Financial markets and contingencies have always aroused the curiosity of academia at large, though randomness itself remains a topic still to be studied.
The randomness of the non-deterministic component still lurks in the dark, and in this situation another strong conceptual assumption is made: the non-deterministic component as derived in the Hawkes model may well follow an Itô process, to which Itô's Lemma applies (proof derived in the appendix):

dS = a dt + b dz ...................(2.11)

where dz is the increment of a Wiener process.
One of the strongest assumptions in this study is now put forward: the non-deterministic component is a function of the change in stock price and time. If a change in the stock price is governed by (2.11), and if one assumes that the random component is a function of the stock price and time, it naturally follows that this random component should itself have a deterministic and a non-deterministic part.

As mentioned earlier,

dS = a dt + b dz

Now suppose that λjk(t), the random component as derived in the Hawkes model, is a function of stock price and time:

λjk(t) = f(S, t) ...................(2.12)

Because λjk(t) is a function of the stochastic variable S, it will have a representation with a stochastic component as well as a deterministic component, of the form:

dλjk(t) = p dt + q dz ...................(2.13)
The crucial problem is how the functions p and q are related to the functions a and
b in the equation
dS = adt + bdz.
Itô's Lemma gives the answer. The deterministic and stochastic components are given by:

p = ∂f/∂t + (∂f/∂S)a + (1/2)(∂²f/∂S²)b² ...................(2.14)
q = (∂f/∂S)b ...................(2.15)
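Equations (2.14) and (2.15) can be checked numerically for a particular choice of f. The sketch below takes f(S, t) = S^2, an illustrative choice, and verifies by Monte Carlo that the mean implied by integrating the Itô drift p, namely E[S_T^2] = (S0 + aT)^2 + b^2 T, matches simulation; the b^2 term is exactly the contribution of the second-order term in (2.14).

```python
import numpy as np

rng = np.random.default_rng(4)

# dS = a dt + b dz with f(S, t) = S^2.  By (2.14)-(2.15):
#   p = 2*S*a + b^2        (the extra b^2 is the 0.5*f''*b^2 term)
#   q = 2*S*b
a, b, S0, T, n = 0.1, 0.4, 1.0, 1.0, 1000
dt = T / n
N = 20000  # Monte Carlo paths

dz = rng.standard_normal((N, n)) * np.sqrt(dt)
S = S0 + np.cumsum(a * dt + b * dz, axis=1)  # Euler paths of dS = a dt + b dz

# Integrating E[p] over [0, T] gives E[S_T^2] = (S0 + a*T)^2 + b^2 * T;
# ordinary calculus would miss the b^2 * T term.
mc = (S[:, -1] ** 2).mean()
exact = (S0 + a * T) ** 2 + b * b * T
print(mc, exact)
```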
Humankind has been concerned with randomness since prehistoric times, mostly
through divination (reading messages in random patterns) and gambling. The
opposition between free will and determinism has been a divisive issue in
philosophy and theology. Mathematicians focused at first on statistical
randomness and considered block frequencies (that is, not only the frequencies of
occurrences of individual elements, but also those of blocks of arbitrary length) as
the measure of randomness, an approach that extended into the use of information
entropy in information theory.
All these lend credence to the model explained in the study. The theoretical construction followed has shed some illuminating insights into the deterministic randomness of the non-deterministic component. Once the values of the constants are known, it makes sense to model the deterministic nature of the randomness associated with forecasting. Randomness as a concept looks beyond the inevitable, and this study has been an attempt to model the invisible within the explicit.

This theoretical construct may seem incredible, but it certainly offers a new perspective on randomness. Though its empirical validation requires complex spreadsheet programs and voluminous data, the theoretical underpinnings give no reason to dismiss it as absurd. Complex algorithms could be built on these models, though at present their absence acts as an impediment to the model's credibility and subsequent validation.
Chapter 3
Financial Contagion: Models and Perspectives
simply variables which are common across countries but omitted from the specification. Returns during crisis periods also display volatility clustering (time-varying volatility).
One set of stylized facts for contagion is the existence of strong regional effects in equity and currency
market contagion, such as documented by Agenor, Miller, Vines and Weber (1999), Eichengreen
(2002), Kaminsky and Reinhart (2002) and Krugman (2000). However, for bond market data this
regionality does not seem to be present, a feature noted by Masson, Chakravarty and Gulden (2003)
and confirmed in Dungey, Fry, Gonzalez-Hermosillo and Martin (2002) in a study of the Russian and
LTCM crises.
Butler and Joaquin (2002) conduct tests consistent with the very restrictive definition of contagion,
although their paper is not directly addressing this issue.
1990s. The strength of the self-fulfilling mechanism for contagion may also be a potent explanatory factor underlying the collapse of the ERM in 1992-93; see Drazen and Masson (1994).
1. Crises are in some way associated with an increase in the conditional volatility of financial market returns.
2. The association of excess returns in one country or market with excess returns in another country, after controlling for fundamentals (excess co-movement), is consistent with financial market contagion.
3. The theoretical insights into Itô's Lemma and its application to volatility models shed light on randomness.
4. The association of randomness in non-deterministic models may help in making forecasts smoother, leading to better accuracy.
New models will undoubtedly be required with the advent of new crises. However,
some of the salient aspects outlined in this study are likely to recur. These include:
the fundamental linkages, the means of transmission across countries and asset
classes, the statistical properties of the data, the simultaneous identification of
contagion, interdependency and herding and the endogenous identification of crisis
and non-crisis periods from sample data. Each of these issues is extremely important for assessing the appropriate policy response to prevent crises and to manage adequately those that occur.
Reference List
Agenor, P.R., Miller, M., Vines, D. and Weber, A. (1999), The Asian
Financial Crisis: Causes, Contagion and Consequences, Cambridge, UK:
Cambridge University Press.
Allen, F. and Gale, D. (1998). Optimal Financial Crises, Journal of Finance, 53: 1245-1284.
Allen, F. and Gale, D. (2000). Financial Contagion, Journal of Political Economy.
Boyer, B.H., Gibson, M.H. and Loretan, M. (1999). Pitfalls in Tests for Changes in Correlations, Working Paper 597R, Federal Reserve Board, International Finance Division.
Davis, R., T. Rydberg, and N. Shephard (2001). The CBin model for counts: Testing common features in the speed of trading, quote changes, limit and market order arrivals. Mimeo.
Diebold, F., J. Hahn, and A. Tay (1999). Multivariate density forecast evaluation and calibration in financial risk management: High-frequency returns on foreign exchange. Review of Economics and Statistics, 81: 661-673.
Diebold, F., T. Gunther, and A. Tay (1998). Evaluating density forecasts,
with applications to financial risk management, International Economic
Review, 39: 863-83.
Doornik, J. A. (2001). Ox 3.0 - An Object-Oriented Matrix Programming
Language. London: Timberlake Consultants Ltd.
Dufour, A. and R. F. Engle (2000). Time and the price impact of a trade,
Journal of Finance 55: 2467-98.
Easley, D. and M. O'Hara (1987). Price, trade size, and information in
securities markets, Journal of Financial Economics, 19: 69-90.
Easley, D. and M. O'Hara (1992). Time and the process of security price
adjustment, Journal of Finance, 47: 577-605.
Engle, R. F. (2000). The econometrics of ultra-high- frequency data.
Econometrica, 68: 1 - 22.
Engle, R. F. and J. R. Russell (1997). Forecasting the frequency of changes in quoted foreign exchange prices with the autoregressive conditional duration model. Journal of Empirical Finance, 4: 187-212.
Engle, R. F. and J. R. Russell (1998). Autoregressive conditional duration:
A new model for irregularly spaced transaction data. Econometrica, 66:
1127-1162.
Engle, R. F. and A. Lunde (2003). Trades and quotes: A bivariate point
process. Journal of Financial Econometrics, 1 (2): 159-188.
Glosten, L. R. and P. R. Milgrom (1985). Bid, ask and transaction prices in
a specialist market with heterogeneously informed traders. Journal of
Financial Economics, 14: 71-100.
Grammig, J. and M. Wellner (2002). Modeling the interdependence of volatility and inter-transaction duration processes. Journal of Econometrics.
Appendix
Derivation of Itô's Lemma
The Taylor series for f(S, t) gives the increment in λjk(t) as:

dλjk(t) = (∂f/∂t)dt + (∂f/∂S)dS + (1/2)(∂²f/∂S²)(dS)² + higher order terms

Substituting dS = a dt + b dz and writing dz = w(dt)^(1/2), where w is a standard normal variable, the second-order term becomes (1/2)(∂²f/∂S²)(b²w²dt). Noting that the expected value of w² is unity, the expected value of dλjk(t) is:

[∂f/∂t + (∂f/∂S)a + (1/2)(∂²f/∂S²)b²]dt

This is the deterministic component of dλjk(t). The stochastic component is the term that depends upon dz, which above is represented as w(dt)^(1/2). Therefore the stochastic component is:

[(∂f/∂S)b]dz

From the above derivation it would seem that there is an additional stochastic term arising from the random deviations of w² from its expected value of 1. This additional term, however, is of order dt, whereas the dz term is of order (dt)^(1/2), so it vanishes relative to the stochastic component as dt → 0.
Glossary
Contagion - This follows Eichengreen and Rose (1995) and Eichengreen, Rose
and Wyplosz (1996), who propose that contagion refers to the association of
excess returns in one country with excess returns in another country after
controlling for the effects of fundamentals. This definition is closely related to
true contagion, as defined in Kaminsky and Reinhart (2000), arising in the
absence of, or after controlling for, common shocks and all possible
interconnection channels.
Stochastic process - Any variable whose value changes over a period of time in an uncertain way is said to follow a stochastic process. Such processes can be classified into discrete time and continuous time: in the latter the underlying variable can take any value within a certain range, while in the former only certain discrete values are possible.
Systematic Risk- Risk that cannot be diversified and arises due to the correlations
between the returns from the investment and the stock market as a whole. The
investor expects a rate of return higher than the risk- free interest rate for bearing
positive amounts of systematic risk.
Itô's Lemma - This is derived from the Itô process, a generalised stochastic process in which the parameters a and b are functions of the value of the underlying variable, x, and time, t. Both the expected drift and the variance rate of an Itô process are liable to change over time. The lemma is used in the derivation of the famous Black-Scholes equation for the pricing of European options.
Black-Scholes Pricing Formulae - In their path-breaking paper, Black and Scholes succeeded in solving their differential equation to obtain exact formulas for the prices of European call and put options.
Standard Normal Distribution - A normal distribution with mean zero and standard deviation one.
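The closed-form prices mentioned in the Black-Scholes entry above can be written down directly. The sketch below implements the textbook formulas for a European call and put on a non-dividend-paying asset; the parameter values in the example are illustrative.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal cumulative distribution function via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes(S, K, T, r, sigma):
    """European call and put prices under the Black-Scholes model,
    for a non-dividend-paying underlying asset."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    call = S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)
    put = K * exp(-r * T) * norm_cdf(-d2) - S * norm_cdf(-d1)
    return call, put

call, put = black_scholes(S=100, K=100, T=1.0, r=0.05, sigma=0.2)
# Put-call parity holds by construction: call - put = S - K * exp(-r*T)
print(call, put)
```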