![](http://library.vu.edu.pk/cgi-bin/nph-proxy.cgi/000100A/http/web.archive.org/web/20040617223946im_/http:/=2fwww.twoday.net/static/images/mahalanobis/diamantentussi.jpg)
First of all, some background information:
Synthetic diamonds were first produced on February 16, 1953 in Stockholm by the QUINTUS project of ASEA, Sweden's major electrical manufacturing company, using a bulky apparatus designed by Baltzar von Platen. The discovery was kept secret, and in 1955 the General Electric Research Laboratory announced the invention of the first reproducible process for making diamonds. Ergo, synthetic diamonds are nothing new. Tiny synthetic diamonds are used in saw blades for cutting asphalt and marble, in drill bits for oil and gas drilling, and even as an exfoliant (a substance for removing dead skin cells) in cosmetics.
For quite some time there have been rumors that GE can already produce diamonds that are visually indistinguishable from their mined counterparts but has decided not to enter the jewelry business. Some people say that agents from De Beers, wearing dark sunglasses, were seen arriving at and leaving GE's corporate headquarters. A gentlemen's agreement? If one takes into account that GE would have to produce quite a lot of diamonds to sell them at reasonable prices (economies of scale), and that the value of diamonds would drop tremendously once scarcity is no longer an issue, one can assume that De Beers could have come up with incentives for GE to enter such an agreement.
But now, armed with inexpensive, mass-produced gems, two startups (Gemesis and Apollo Diamond) are launching an assault on the De Beers cartel. Here is the superb story.
The picture shows Malaysian model Kavita Kaur wearing one of the world's most expensive 'diamond' gowns in Kuala Lumpur (2002). The gown, worth 19 million ringgit ($5 million), has almost 2,500 diamonds hand-sewn into it by its creator, the American designer Anne Bowen. In total, the diamonds on the gown weigh 625.25 carats.
Mahalanobis - on Thursday, June 17, 2004, 06:00 - Category: economics
Numerous bloggers hosted by Weblogs.com are offline and scrambling to find new hosting after blogging pioneer Dave Winer abruptly closed the free service last weekend.
"I can't afford to host these sites," Winer wrote. "I don't want to start a site hosting business. These are firm, non-negotiable statements."
via Netcraft
TINSTAAFL-Story *yawn* Addendum:
How Not to Shutter a Service, LawMeme @ Yale Law School
"I can't afford to host these sites," Winer wrote. "I don't want to start a site hosting business. These are firm, non-negotiable statements."
via Netcraft
TINSTAAFL-Story *yawn* Addendum:
How Not to Shutter a Service, LawMeme @ Yale Law School
Mahalanobis - on Wednesday, June 16, 2004, 15:40 - Category: tech
Mahalanobis - on Tuesday, June 15, 2004, 19:18 - Category: Finance
![](http://library.vu.edu.pk/cgi-bin/nph-proxy.cgi/000100A/http/web.archive.org/web/20040617223946im_/http:/=2fwww.twoday.net/static/images/mahalanobis/cantodobrasil.jpg)
![](http://library.vu.edu.pk/cgi-bin/nph-proxy.cgi/000100A/http/web.archive.org/web/20040617223946im_/http:/=2fwww.twoday.net/static/images/mahalanobis/haiti.jpg)
South Africa: Apartheid was a social and political policy of racial segregation that resulted in the oppression of Black, Indian and Coloured South Africans by the ruling white minority between 1948 and 1994. Despite the severe restrictions on free political debate, resistance groups, such as the African National Congress, fought for liberation even though resistance often resulted in imprisonment or death.
via Like a Packet of Woodbines and interactivenarratives.org
Mahalanobis - on Monday, June 14, 2004, 19:18 - Category: geo
Mahalanobis - on Monday, June 14, 2004, 17:04 - Category: economics
![](http://library.vu.edu.pk/cgi-bin/nph-proxy.cgi/000100A/http/web.archive.org/web/20040617223946im_/http:/=2fwww.twoday.net/static/images/mahalanobis/recruitment01.jpg)
via Craig Newmark
Mahalanobis - on Saturday, June 12, 2004, 16:05 - Category: biz
![](http://library.vu.edu.pk/cgi-bin/nph-proxy.cgi/000100A/http/web.archive.org/web/20040617223946im_/http:/=2fwww.twoday.net/static/images/mahalanobis/sheepies.jpg)
So how did this mispricing occur? That information was available to everyone weeks before and so the market should have discounted it appropriately!
Thinking about this, a book written by the greatest speculator of all time, George Soros, came to my mind. In its opening pages he describes his way of thinking and points out the shortcomings of economic theory. He mainly criticises the efficient market hypothesis and replaces it with his concept of reflexivity. Referring to his mentor Sir Karl Popper, he states that perfect knowledge is not possible. Therefore, an individual's judgement of an actual situation can be "correct" only insofar as it is built on all available information, appropriately applied to economic theory.
One can claim that even the smartest individual can misjudge a market situation, because economic theory is not an axiomatic science like Euclidean geometry and is thus intrinsically false at every stage: we know, following Sir Karl Popper, that a natural science can never be complete and is a process of making hypotheses and falsifying them. Even assuming perfect information in financial markets, one would have to set a price for a buy/sell decision using one's more or less false economic theory. But other people (or the future itself) may disagree with that view! The market price then reveals its imperfection by becoming a more or less democratic vote over the set of existing theories. One could thereby realize an excess return simply by applying a "more correct" financial theory, e.g. by including long weekends in crude oil pricing.
For further reading about inefficiencies in financial markets or a more comprehensive explanation of the theory of reflexivity I recommend:
- The Alchemy of Finance by George Soros
- Hidden Collective Factors in Speculative Trading by Bertrand Roehner
stxx - on Saturday, June 12, 2004, 03:33 - Category: Finance
Mahalanobis - on Saturday, June 12, 2004, 02:44 - Category: fun
![](http://library.vu.edu.pk/cgi-bin/nph-proxy.cgi/000100A/http/web.archive.org/web/20040617223946im_/http:/=2fwww.twoday.net/static/images/mahalanobis/armsrace_opportunitycost_small.jpg)
...a competition between two or more countries for military supremacy. Each party competes to produce superior numbers of weapons or superior military technology in a technological escalation. The term "arms race" is used generically to describe any competition where there is no absolute goal, only the relative goal of staying ahead of the other competitors. Evolutionary arms races are common occurrences, e.g. predators evolving more effective means to catch prey while their prey evolves more effective means of evasion.

Taggert J. Brooks (A Random Walk, currently guest-blogging at Truck and Barter) writes: "What's interesting is the many economic relationships that are typified by arms races. Still more interesting is trying to understand the notion of equilibrium in the context of these ongoing battles, and asking the question: is it truly a stable equilibrium, or can technological innovation move you to another equilibrium?" He then gives two nice examples.
What first came to my mind was that if two countries engage in an arms race, then their (expected) armament levels must move in tandem, i.e. the armament-level time series must be cointegrated! It would be very interesting to know how far those series can drift apart before this relationship eventually breaks down. Even more interesting: could it be that politicians are eager to keep this gap smaller than necessary and thereby accelerate the whole process? Unfortunately, some research has already been done in this field ;-(.
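To make the cointegration idea concrete, here is a minimal Python sketch (my own illustration, not from Brooks and not based on any real armament data) that simulates two series driven by a common stochastic trend and runs an Engle-Granger test on them; the series names and parameters are invented.

```python
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(0)
n = 500

# Common stochastic trend: the underlying arms build-up both countries react to.
trend = np.cumsum(rng.standard_normal(n))

# Each country's armament level tracks the common trend plus idiosyncratic noise.
arms_a = 1.0 * trend + rng.standard_normal(n)
arms_b = 0.8 * trend + rng.standard_normal(n)

# Engle-Granger test: H0 = no cointegration. A small p-value says the two
# (individually nonstationary) series share a long-run equilibrium relation.
t_stat, p_value, _ = coint(arms_a, arms_b)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```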
related links:
Ralph Nader and the regulation of neuroscience, Tyler Cowen
VAR and Cointegration Inference, Michael Stastny ;-D
Mahalanobis - on Friday, June 11, 2004, 15:17 - Category: economics
![](http://library.vu.edu.pk/cgi-bin/nph-proxy.cgi/000100A/http/web.archive.org/web/20040617223946im_/http:/=2fwww.twoday.net/static/images/mahalanobis/arbeitsamt.jpg)
Does Germany have an unemployment problem?
What's quite interesting is that in Spain, another country with a very high unemployment rate, infojobs made it on the list (5th) and not oficina de empleo.
via Google Blog
Mahalanobis - on Friday, June 11, 2004, 03:15 - Category: economics
![](http://library.vu.edu.pk/cgi-bin/nph-proxy.cgi/000100A/http/web.archive.org/web/20040617223946im_/http:/=2fwww.twoday.net/static/images/mahalanobis/visitorville.jpg)
Visitors come to your web site from other sites (referrers). Some of these referrers are search engines. In VisitorVille, referrers are depicted as buses, and web pages on your site are depicted as buildings. When a new visitor arrives, a bus delivers them to a building. To move between buildings, visitors either walk, take a cab or, if you have designated them as a VIP, a limousine. VIPs also fly in by helicopter... you must check this out.
Mahalanobis - on Thursday, June 10, 2004, 22:13 - Category: tech
![](http://library.vu.edu.pk/cgi-bin/nph-proxy.cgi/000100A/http/web.archive.org/web/20040617223946im_/http:/=2fwww.twoday.net/static/images/mahalanobis/argentina_tango.jpg)
Listen to the interview (Real Audio, Windows Media Player) with Michael Reid, Americas editor of The Economist.
![](http://library.vu.edu.pk/cgi-bin/nph-proxy.cgi/000100A/http/web.archive.org/web/20040617223946im_/http:/=2fwww.twoday.net/static/images/mahalanobis/economist_arg.gif)
Mahalanobis - on Thursday, June 10, 2004, 18:29 - Category: economics
This is the title of a Cowles Foundation Discussion Paper written by econometrician Peter Phillips. Since it assumes some knowledge of econometrics, I recommend watching this video lecture by Clive Granger on Time Series and Cointegration if you are new to the field. Here (and here) is another starting point I highly recommend.
Phillips starts by discussing some general weaknesses and limitations of the econometric approach. A fundamental issue that bears on all practical economic analysis is the extent to which we can expect to understand economic phenomena by developing a theory, taking observations and fitting a model. According to Phillips, forty years of empirical experience in macroeconomic forecasting suggest that there are limits to our capacity to make predictions about economic activity. In fact, the performance of aggregate predictions has improved little over this time, in spite of much early optimism, enormous bodies of research in macroeconomic theory and modeling, improvements in econometric methods, and larger data sets of better quality.
All models are wrong: the position Phillips takes in this discussion is related to views about modeling similar to those of Cartwright and Hoover. Cartwright advances the notion that models can be interpreted as machines that generate laws (so-called nomological machines) and, even more flexibly, puts forward the view that the laws that may emerge from modeling are analogous to the morals we draw from story-telling fables. Hoover takes a sympathetic but even more flexible position, arguing that economic modeling is useful to the extent that it sheds light on empirical relationships. As Hoover puts it, talking about formal laws seems to do nothing for economics: "even accumulated falsifications or anomalies do not cause scientists to abandon an approach unless there is the prospect of a better approach on offer". This position is similar to that of Rissanen, who argues against the concept of a true model and sees statistics as a "language for expressing the regular features of the data".
But there has been a steady progression in modeling practice and econometric methodology. Over the last two decades a great deal of work has been done on cointegration, which has succeeded in addressing three principal features of macroeconomic data: joint dependence, serial dependence and non-stationarity. But although econometricians have produced a fairly complete theory of inference for unit-root nonstationary and jointly dependent time series data, the linkage mechanism between series often seems to be much more subtle than the linear concept of cointegration allows. Recent research has begun to tackle the difficulties of formulating and estimating models in which the time series are I(d) and have memory characterised by a possibly fractional parameter d (remember: I(0) = stationary series, I(1) = nonstationary series (unit root process), I(d) for d > 1 = explosive series). Empirical evidence (US data?) indicates that interest rates are roughly I(0.9), inflation I(0.6) and the money supply I(1.4). In consequence, no finite linear relation among these variables, or subsets of them, can be cointegrating in the conventional sense (an unbalanced relationship).
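To make the I(d) notation concrete, here is a small Python sketch of my own (not from the paper) that builds a fractionally integrated series by applying the truncated MA expansion of (1-L)^(-d) to white noise; the chosen d values simply echo the figures quoted above.

```python
import numpy as np

def frac_integrate(eps, d):
    """Apply (1 - L)^(-d) to white noise via its MA expansion, truncated at the sample size."""
    n = len(eps)
    psi = np.empty(n)
    psi[0] = 1.0
    for k in range(1, n):
        psi[k] = psi[k - 1] * (k - 1 + d) / k  # psi_k = Gamma(k+d) / (Gamma(d) Gamma(k+1))
    # x_t = sum_{k=0}^{t} psi_k * eps_{t-k}
    return np.convolve(eps, psi)[:n]

rng = np.random.default_rng(0)
eps = rng.standard_normal(2000)
inflation_like = frac_integrate(eps, 0.6)  # the "inflation" figure above: nonstationary but mean-reverting
rate_like      = frac_integrate(eps, 0.9)  # the "interest rate" figure: almost, but not quite, a unit root
```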
He then reminds us that unit roots inevitably cause trouble because of the nonstandard limit distributions and the discontinuities that arise in the limit theory as the autoregressive parameter passes through unity. Regression with integrated time series that is done in the econometric kitchen inevitably shows up in the attic of the asymptotic theory. He compares the situation to that of the fictional character Dorian Gray in the novel by Oscar Wilde (1891): the face of Dorian Gray showed no signs of aging as time passed, whereas the sins of his worldly existence showed up to torment him in the portrait of himself that he kept hidden in the attic. Unit roots also cause trouble because of the difficulty of discriminating between stochastic trends and deterministic trend alternatives, including models that may have trend breaks.
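As a quick illustration of that nonstandard limit theory (a simulation of my own, not taken from the paper): the OLS estimate of the autoregressive coefficient in a driftless unit-root AR(1) has a skewed, non-normal limiting distribution, and it converges at rate n rather than the usual square root of n.

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 500, 5000
stats = np.empty(reps)
for r in range(reps):
    y = np.cumsum(rng.standard_normal(n + 1))    # AR(1) with rho = 1 (a random walk)
    y_lag, y_cur = y[:-1], y[1:]
    rho_hat = (y_lag @ y_cur) / (y_lag @ y_lag)  # OLS without intercept
    stats[r] = n * (rho_hat - 1.0)               # scaled by n, not sqrt(n)

# The distribution is heavily skewed to the left: roughly two thirds of the mass
# lies below zero, instead of the 50% a symmetric (normal) limit would give.
print(np.mean(stats < 0))
```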
Phillips also points out that no one really understands trends, even though most of us see trends when looking at economic data. As by now everybody knows (spurious regression), any trend function postulated in an econometric specification will turn out to be statistically significant in large samples provided the data do in fact have a trend, whether or not it is of the same form as the one specified in the empirical regression. A well-known example is that polynomial trends are statistically significant (with probability one, asymptotically) when the true trend is stochastic, and vice versa. Also worth mentioning: when we include different trend functions in an empirical regression, they each compete in the explanation of the observed trend in the data. Correspondingly, when we regress a unit root stochastic trend on a time polynomial of degree K as well as a lagged variable, each of the K+1 regressors is a valid yardstick for the trend. If we let K -> ∞ as the number of observations n -> ∞, but with K/n -> 0 so that the regression remains meaningful as n grows large, then the coefficient of the lagged variable tends to unity, but at the reduced rate n/K. This reduction in the rate of convergence to a unit root coefficient demonstrates how seemingly irrelevant time-polynomial regressors can reduce the explanatory power of a lagged dependent variable even though the true model is an AR(1)!
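The spurious-trend effect is easy to reproduce; here is a minimal simulation of my own (not from the paper): regressing a driftless random walk on a deterministic linear trend typically produces a hugely "significant" trend coefficient, and the problem only gets worse as the sample grows.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
y_full = np.cumsum(rng.standard_normal(10_000))     # a pure random walk: no deterministic trend at all

for n in (100, 1_000, 10_000):
    y = y_full[:n]
    X = sm.add_constant(np.arange(n, dtype=float))  # intercept + linear time trend
    t_trend = sm.OLS(y, X).fit().tvalues[1]
    print(f"n = {n:6d}   t-stat on trend = {t_trend:8.1f}")

# The t-statistic is typically far outside +/-2 and tends to diverge with n,
# even though the true model contains no deterministic trend.
```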
The last econometric law is that the true data generating processes are unknown and inherently unknowable. Phillips says that we deal so much with models, random processes and probability spaces in our work as econometricians that it is easy to be lured into thinking that there must be an underlying true DGP. Even in the ideal situation where there is a true system and the only unknowns are a finite number of parameters to be estimated, closeness to the true system is limited.
To understand the line of reasoning you first need to know what the Kullback-Leibler distance is. Econometricians would like to know how close, on average (measured in terms of Kullback-Leibler distance), they can get to a true DGP using observed data. Assume that the true DGP is known up to a certain parameter θ and that Pθ is the corresponding unknown probability measure. The class of potential empirical models for the data generated by Pθ is very wide, but will normally depend on some rule of estimation for obtaining numerical values of the unknown parameters, leading to a usable empirical form of the model that can be represented by a proper probability measure, Gn, say. As a measure of 'goodness of fit' to the true model we may use the sequence of random variables given by the log likelihood ratio
![](http://library.vu.edu.pk/cgi-bin/nph-proxy.cgi/000100A/http/web.archive.org/web/20040617223946im_/http:/=2fwww.twoday.net/static/images/mahalanobis/rissanen1.gif)
computed for different empirical models Gn. Now Rissanen showed that if the true underlying series is stationary and if some additional technical conditions are fulfilled, then the Lebesgue measure (i.e., the volume in k dimensional Euclidean space) of the set
![](http://library.vu.edu.pk/cgi-bin/nph-proxy.cgi/000100A/http/web.archive.org/web/20040617223946im_/http:/=2fwww.twoday.net/static/images/mahalanobis/rissanen2.gif)
converges to 0 as n -> ∞ for any choice of empirical model Gn. This theorem shows that, whatever one's model, one can approximate the DGP (in KL distance) no better than (k/2)log(n). Thus, outside of a small set of parameters, we can get no closer to the truth than this bound, and the volume of the set for which we can do better actually converges to zero.
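For readers who cannot see the two images above, here is my reconstruction of the formulas in LaTeX, based on the surrounding text and on the way Rissanen's bound is usually stated; treat the exact normalisation (in particular the ε) as my reading rather than a verbatim copy of Phillips' notation.

```latex
% Kullback-Leibler distance between the true measure P and a model G: KL(P, G) = E_P[log(dP/dG)].
% Goodness of fit of an empirical model G_n to the true DGP P_theta^(n):
\[
  R_n(\theta) \;=\; \log \frac{dP^{(n)}_{\theta}}{dG_n}
\]
% Rissanen-type bound: for every eps > 0 and every choice of G_n, the set
\[
  A_n(\varepsilon) \;=\;
  \Bigl\{\, \theta \in \Theta \subset \mathbb{R}^{k} \;:\;
     E_\theta\!\bigl[R_n(\theta)\bigr] \;<\; (1-\varepsilon)\,\tfrac{k}{2}\,\log n \,\Bigr\}
\]
% has Lebesgue measure converging to zero as n -> infinity.
```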
This theorem justifies a certain amount of skepticism about models with a large number of parameters. The minimum achievable distance of an empirical model from the DGP in this theory increases linearly with the number of parameters: in essence, the more complex the system, the harder it is to construct a good empirical model. Also remarkable: coefficients of trending regressors (above we assumed the regressors are stationary) actually increase the bound, even though these coefficients may be estimable at faster rates than the coefficients of stationary variables.
More on this can be found here.
related links:
VAR and Cointegration Inference - Technical Notes (Michael Stastny)
Mahalanobis - on Thursday, June 10, 2004, 08:13 - Category: mathstat