Corey: Michael Lew: Here is a reconstruction of severity as a kind of likelihood function.

Suppose I decide to carry out a statistical hypothesis test in the following fashion. First, I choose a one-dimensional statistic of the data to be collected such that its median (or expectation, or some other measure of central tendency) is a monotonically increasing function of a one-dimensional parameter of interest. Next, I choose a threshold and decide to report pass/fail according to whether the statistic is (i) less than, or (ii) greater than or equal to, the threshold. Naturally this pass/fail report is a random variable (well, random element technically, but whatevs), and hence has a sampling distribution that depends on the parameter value. So far so good.

Now I collect my data and find that, by a *remarkable* coincidence, the value of my statistic is exactly equal to the threshold I chose prior to seeing the data. I report the likelihood function of the parameter given not the complete data, not even the statistic, but the pass/fail variable. *This* likelihood function is mathematically identical to the severity function.

Obviously this likelihood function corresponds to a badly mangled description of the experiment that was actually carried out, so I'm not sure what can be learned from this mathematical reconstruction.
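The construction in the comment can be sketched numerically. This is a minimal illustration under assumptions of my own (not from the comment): the statistic is the sample mean of n normal observations with known standard deviation, so its distribution is monotone in the mean parameter, and the observed value lands exactly on the pre-chosen threshold. The likelihood of the parameter given only the binary pass report is then P(statistic >= threshold | mu), which traces out the severity-style curve.

```python
# Hypothetical example: sample mean of n normal draws with known sigma.
# All numbers here (n, sigma, threshold) are made up for illustration.
from math import sqrt
from statistics import NormalDist

n, sigma = 25, 2.0        # assumed sample size and known SD
threshold = 0.5           # pass/fail cutoff chosen before seeing data
se = sigma / sqrt(n)      # standard error of the sample mean

def likelihood_of_pass(mu):
    """P(sample mean >= threshold | mu): the likelihood of mu given
    only the binary pass/fail report, not the full data."""
    return 1 - NormalDist(mu, se).cdf(threshold)

# The curve is monotonically increasing in mu, and equals 0.5 when
# mu coincides with the threshold (the "remarkable coincidence" case
# where the observed statistic equals the cutoff).
print(likelihood_of_pass(0.5))
```

Note that the pass/fail variable is Bernoulli, so this one curve (together with its complement for the "fail" outcome) is the entire likelihood function of the reduced experiment; everything else about the data has been discarded.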
Burak (visitor), 16 December 2014, 14:21:02