Question
Let X₁, ..., Xₙ be a random sample of size n from the U(0, θ) distribution, where θ > 0
is an unknown parameter. Recall that the pdf f of the U(0, θ) distribution is of the form

f(x) = 1/θ   if 0 < x < θ,
f(x) = 0     otherwise.

Note that the information about θ contained in the random sample X₁, ..., Xₙ equals the
information about θ contained in the statistic

T = max(X₁, ..., Xₙ).
To understand why this is so, think of the random sample as being obtained in a sequential
manner; that is, you obtain X₁ and pause before obtaining X₂. What does X₁ tell you
about θ? It tells you that θ > X₁. Once you have X₁ and the information that θ > X₁,
obtain X₂. If X₂ > X₁, then you know a bit more about θ, namely that θ > X₂; however, if
X₂ ≤ X₁, then it contributes nothing, above and beyond what you already know from X₁,
to your knowledge about θ. In other words, once you have obtained X₁ and X₂, what you
know about θ is that it is greater than the maximum of X₁ and X₂. As such, any reasonable
estimator of θ should be a function of T.
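
To make the sequential argument above concrete, here is a minimal Python simulation sketch. It is not part of the original problem; the true value θ = 2.0 and the sample size n = 10 are arbitrary choices made only for illustration.

import random

random.seed(0)
theta = 2.0        # unknown in practice; fixed here only to generate illustrative data
n = 10             # illustrative sample size

running_max = 0.0
for i in range(1, n + 1):
    x = random.uniform(0, theta)    # one new observation from U(0, theta)
    if x > running_max:
        running_max = x             # only draws above the current maximum tighten the bound theta > running_max
    print(f"after {i} draws: the data imply theta > {running_max:.4f}")

Each draw either raises the running maximum, and so sharpens the lower bound on θ, or leaves it (and hence your knowledge of θ) unchanged, which is exactly the point made in the paragraph above.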
Construct an unbiased estimator of θ which is a function of T and calculate its
variance. To start with, you should calculate, in that order, the cdf, the pdf, the expected
value, and the variance of T.
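
For reference, here is a sketch of the standard derivation the problem outlines; treat it as something to check your own work against rather than as the posted solution. Using the independence of X₁, ..., Xₙ,

\[
F_T(t) = P(T \le t) = P(X_1 \le t, \ldots, X_n \le t) = \prod_{i=1}^{n} P(X_i \le t) = \left(\frac{t}{\theta}\right)^{n}, \qquad 0 \le t \le \theta,
\]
\[
f_T(t) = F_T'(t) = \frac{n\,t^{\,n-1}}{\theta^{n}} \quad (0 < t < \theta), \qquad
E[T] = \int_0^{\theta} t\, f_T(t)\, dt = \frac{n\theta}{n+1}, \qquad
E[T^2] = \frac{n\theta^{2}}{n+2},
\]
\[
\operatorname{Var}(T) = E[T^2] - (E[T])^2 = \frac{n\theta^{2}}{(n+1)^{2}(n+2)}.
\]

Since E[T] = nθ/(n+1), the estimator \(\hat{\theta} = \frac{n+1}{n}\,T\) is unbiased for θ, and

\[
\operatorname{Var}(\hat{\theta}) = \left(\frac{n+1}{n}\right)^{2} \operatorname{Var}(T) = \frac{\theta^{2}}{n(n+2)}.
\]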