Some evildoer at ScienceBlogs asks "Is String Theory an Unphysical Pile of Garbage?" and references
this paper (in particular p.54 to 57) and other heretic stuff.
"... in string theory, we don't know both the variables and the equations. In fact, unless another theory (...) comes along that encompasses and expands upon string theory, string theory isn't a fundamental theory at all, due to instabilities."
I think this is yet another job for SuperLumo, defender of the one true string theory!
While we wait for SuperLumo, let me add a few remarks about this (but keep in mind that I am not a string theorist).
The blog post and the paper it references basically complain about the lack of a fundamental, non-perturbative formulation of string theory. But it seems to me that the Maldacena conjecture provides such a non-perturbative description, at least currently for AdS backgrounds.
A while ago, string field theory was proposed as a more fundamental formulation, and the referenced paper suggests that it suffers from instabilities. I cannot judge the argument in detail, but it seems to me that string field theory is indeed no longer such a hot topic.
Most of the effort today seems focused on generalizing and understanding the AdS/CFT correspondence.
However, I also think it is a mistake to underestimate 'string theory as perturbation theory'.
After all, it provides the only known way to deal with quantum gravity while keeping local Lorentz invariance (i.e. a smooth spacetime) and quantum theory as we know it. And it gets surprisingly far, including consistent calculations of black hole entropy, making use of the amazing string dualities.
Many questions remain unanswered, and perhaps the early hopes that string theory would answer all questions of particle physics have turned into consternation about the multiverse, but I think the answer to the "incendiary title" of the blog post obviously has to be "No!".
This paper is a really interesting comment on the one proposing a "Quantum resolution to the arrow of time dilemma", discussed here previously.
"In this note we show that the argument is incomplete and furthermore, by providing a counter-example, argue that it is incorrect. Instead of quantum mechanics providing a resolution in the manner suggested, it allows enhanced classical memory records of entropy-decreasing events."
A Bayesian statistician understands probabilities as a numerical description of 'subjective uncertainty'. To make this a little more concrete, Bruno de Finetti considered betting odds and order books (see e.g. this paper [pdf!] for a more detailed discussion).
Meanwhile this approach is usually introduced in a shortened form such as the following (*):
"we say one has assigned a probability p(A) to an event A if,
before knowing the value of A, one is willing to either buy or sell a lottery ticket of the form 'Worth $1 if A' for an amount $p(A).
The personalist Bayesian position adds only that this is the full meaning of
probability; it is nothing more and nothing less than this definition."
But there is a problem with that.
As every professional gambler knows, if one places a series of bets one needs to follow the Kelly criterion to avoid certain ruin in the long run. But the Kelly fraction and thus the amount one should be willing to bet is strictly zero for the case of a fair bet (when one would be willing to take both the buy and sell side).
In other words, it is wrong to pay the price suggested in the above definition if one wants to survive in the long run.
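To make this concrete, here is a minimal sketch of the standard Kelly fraction f* = (bp - q)/b for a bet at net odds b with win probability p (the numerical probabilities and odds are made-up illustrations, not taken from the papers discussed):

```python
def kelly_fraction(p, b):
    """Kelly fraction f* = (b*p - q) / b for a bet that wins with
    probability p at net odds b (win b per unit staked), q = 1 - p."""
    q = 1.0 - p
    return (b * p - q) / b

# A fair bet (zero edge): even odds b = 1 at p = 0.5 gives f* = 0,
# i.e. the Kelly-optimal stake is strictly zero.
print(kelly_fraction(0.5, 1.0))

# A favorable bet warrants a positive stake, e.g. p = 0.55 at even odds
# gives f* = 0.1, i.e. bet one tenth of the bankroll.
print(kelly_fraction(0.55, 1.0))
```

Note that the fair-bet case, where a Bayesian is by definition willing to buy or sell the ticket at $p(A), is exactly the case where the Kelly stake vanishes.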
Now, it would seem that this is one of those 'technicalities' that one can easily sweep under a rug or into a footnote. But I think it would be difficult to somehow incorporate the Kelly criterion into the above definition of probability, because in order to derive the Kelly fraction one already needs to know quite a bit about probabilities in the first place.
It may make more sense to emphasize that, of course, an order book is really a process, and Bayesian probabilities are found in the limit in which such order books approach the fair price and bet sizes tend to zero. But unfortunately, in general we do not know much about the convergence of this process, and real-world order books do not exhibit such a limit [x].
There is one additional problem, because the Kelly criterion is actually the limiting case of a more general rule. Indeed, if the estimated probabilities contain just a small amount of noise, bankruptcy still looms even if the Kelly criterion is used. Therefore professional gamblers know that one must bet 'less than Kelly'. In general a rational agent will bet 'less than Kelly' due to risk aversion, and this means that the direct relationship between bet sizes and probability, as proposed in the above definition, is lost completely [x].
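The danger of noisy estimates can be seen deterministically from the expected log-growth per bet, g(f) = p log(1 + bf) + q log(1 - f): a small overestimate of p leads to a stake above the true Kelly optimum, which can flip the growth rate negative. A short sketch (again with made-up numbers, assuming even odds):

```python
import math

def log_growth(f, p, b=1.0):
    """Expected log-growth per bet when staking fraction f of the
    bankroll at net odds b, with true win probability p."""
    return p * math.log(1 + b * f) + (1 - p) * math.log(1 - f)

p_true = 0.55   # true win probability
f_true = 0.10   # Kelly fraction for p_true at even odds
f_over = 0.20   # Kelly fraction used by an agent that
                # overestimates p as 0.60

print(log_growth(f_true, p_true))  # positive: wealth grows
print(log_growth(f_over, p_true))  # negative: overbetting loses in the long run
```

Overestimating the win probability by only 0.05 doubles the stake and turns a positive growth rate into a negative one, which is why betting a fraction of Kelly is the practical rule.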
Therefore, I suggest that Bayesians simply refrain from using order books etc. (free markets are becoming quite unpopular anyways) to define probability and simply state that 'subjective uncertainty' is a self evident concept. After all, we know that self evidence is a very reliable kind of rug.
(*) I do not want to pick on this particular paper, since similar 'definitions' are
nowadays used in many cases. But since the paper is about the foundations of quantum theory I would like to point out that I have it on good authority that God does not play dice.
[x] If the risk aversion and probability estimates of the participating agents are correlated, this would already be one reason why the order book would not converge towards the 'fair price'.
We already have managed accounts and managed healthcare, now we also have managed world views, thanks to Scott. A very interesting application and I recommend that you try it and check your views on quantum mechanics, strong AI and other topics.
Of course, the devil is in the details: several commenters have already found ambiguities in the statements proposed by the world view manager, and the tensions it detects are often accompanied by comments disputing their validity.
But where Wittgenstein ran into a brick wall, Scott may yet succeed. His manager could indeed be the first application of Web 3.0 a.k.a. 'the semantic interwebs'.
In the future, bloggers and other opinionators will check the consistency of their views and rationally resolve tensions and perhaps we will have, in addition to virus and spam filters, consistency scans of web pages and blog posts.
e.g. "wvm just detected an unresolved tension between this and that blog post and recommends to shut down this blog until the tension is resolved."
I am sure that Web 3.0 will be a better place, with about 95% of the current blogs removed from the interwebs due to inconsistencies.
"Nothing could be simpler. It is merely a small wooden casket, the size and shape of a cigar box, with a single switch on one face. When you throw the switch, there is an angry, purposeful buzzing. The lid slowly rises, and from beneath it emerges a hand. The hand reaches down, turns the switch off and retreats into the box. With the finality of a closing coffin, the lid snaps shut, the buzzing ceases and peace reigns once more."
The 'ultimate machine' of Claude Shannon.