Sum of two exponential series with equal means and variances
Assuming $A$ and $B$ are two non-negative real-valued random variables such that

$\mathrm{E}(A)=\mathrm{E}(B)$ (equal means)

$\mathrm{Var}(A)=\mathrm{Var}(B)<\epsilon$ (equal, small variances)

is there a way to prove that

$$\frac{1}{N}\sum_{j=1}^N e^{-a_j} \quad\text{and}\quad \frac{1}{N}\sum_{j=1}^N e^{-b_j}$$

are arbitrarily close to each other, where $a_j$ and $b_j$ are realizations drawn from $A$ and $B$, respectively? ($N$ can be assumed to be large as well.)
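For concreteness, here is a minimal simulation sketch of the setup (editor's addition; the Gamma/Lognormal pair is just one illustrative choice of two distributions sharing a mean and a small variance):

```python
# Minimal sketch of the question's setup: draw N realizations from two
# distributions with equal mean m and equal small variance v, then compare
# the two averages of e^{-x}. The Gamma/Lognormal choice is an assumption.
import numpy as np

rng = np.random.default_rng(42)
m, v, N = 1.0, 0.05, 100_000          # common mean, small common variance

a = rng.gamma(m**2 / v, v / m, N)     # Gamma(shape, scale): mean m, variance v
s2 = np.log(1 + v / m**2)             # Lognormal parameters matching (m, v)
b = rng.lognormal(np.log(m) - s2 / 2, np.sqrt(s2), N)

avg_a = np.exp(-a).mean()
avg_b = np.exp(-b).mean()
print(avg_a, avg_b, avg_a - avg_b)    # close, but a small gap persists
```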
Tags: variance, mean, exponential
asked 4 hours ago by nOp (new contributor), edited 4 hours ago
2 Answers
Although it is not explicitly specified, I will assume that you intend for all the realisations of these random variables to be independent (i.e., I will assume joint independence of all the random variables in both series). The difference between the two series is the random variable defined by the function:

$$D(N) = \frac{1}{N} \sum_{i=1}^N (e^{-A_i} - e^{-B_i}).$$

Since $e^{-a} \leqslant 1$ for all $a \geqslant 0$, it follows that $\mathbb{V}(e^{-A}) \leqslant \mathbb{E}(e^{-2A}) \leqslant 1$ for any non-negative random variable $A$. Thus, we have:

$$\begin{aligned}
\mathbb{V}(D(N))
&= \frac{1}{N^2} \sum_{i=1}^N \Big( \mathbb{V}(e^{-A_i}) + \mathbb{V}(e^{-B_i}) \Big) \\[6pt]
&\leqslant \frac{1}{N^2} \sum_{i=1}^N \Big( 1 + 1 \Big) \\[6pt]
&= \frac{1}{N^2} \cdot 2N \\[6pt]
&= \frac{2}{N}.
\end{aligned}$$

We therefore have $\lim_{N \rightarrow \infty} \mathbb{V}(D(N)) = 0$, so the variance of the difference converges to zero. We also have the constant mean difference:

$$\begin{aligned}
\mathbb{E}(D(N))
&= \frac{1}{N} \sum_{i=1}^N \Big( \mathbb{E}(e^{-A_i}) - \mathbb{E}(e^{-B_i}) \Big) \\[6pt]
&= \mathbb{E}(e^{-A}) - \mathbb{E}(e^{-B}).
\end{aligned}$$

Combining these results, we see that $D(N)$ converges in mean square to $\mathbb{E}(e^{-A}) - \mathbb{E}(e^{-B})$, which is a constant value. Since the variances of both random variables are small, this limiting value should be close to (but not necessarily equal to) zero. Thus, for large values of $N$ you would indeed expect the difference to converge towards a value near zero.
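A quick simulation sketch of this result (editor's addition; the Gamma/Lognormal pair is an illustrative assumption, not part of the argument): for each $N$, replicate $D(N)$ many times and check that its sample variance sits below the $2/N$ bound while its mean stays near $\mathbb{E}(e^{-A}) - \mathbb{E}(e^{-B})$.

```python
# Sketch: estimate Var(D(N)) over repeated replications and compare it to the
# 2/N bound derived above. A ~ Gamma, B ~ Lognormal, both with mean 1 and
# variance 0.05 (an assumed, illustrative pair).
import numpy as np

rng = np.random.default_rng(0)
m, v = 1.0, 0.05
k, theta = m**2 / v, v / m              # Gamma(k, theta): mean m, variance v
s2 = np.log(1 + v / m**2)               # matching Lognormal parameters
mu = np.log(m) - s2 / 2

def D(N, reps=2000):
    """reps independent replications of D(N), returned as a vector."""
    a = rng.gamma(k, theta, size=(reps, N))
    b = rng.lognormal(mu, np.sqrt(s2), size=(reps, N))
    return np.exp(-a).mean(axis=1) - np.exp(-b).mean(axis=1)

for N in [10, 100, 1000]:
    d = D(N)
    print(f"N={N:>5}: mean(D)={d.mean():+.5f}, "
          f"Var(D)={d.var():.2e} <= 2/N={2/N:.2e}")
```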
answered 2 hours ago by Ben, edited 21 mins ago
Just apply the law of large numbers: $e^{-a_j}$ has finite variance, because $a$ is positive and itself has finite variance. In fact, the variance of $e^{-a_j}$ is smaller than the variance of $a$. This is elaborated in this question: https://mathoverflow.net/questions/330348/proof-of-variance-bounds-for-transformed-random-variables/330357#330357
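A quick numerical check of that variance bound (editor's sketch; the distributions below are arbitrary positive examples — the inequality is driven by $e^{-x}$ being 1-Lipschitz on $[0,\infty)$):

```python
# Sketch: Var(e^{-a}) <= Var(a) for several positive distributions, since
# |e^{-x} - e^{-y}| <= |x - y| on [0, inf). The chosen distributions and
# parameters are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(1)
samplers = {
    "exponential": lambda n: rng.exponential(2.0, n),
    "gamma":       lambda n: rng.gamma(20.0, 0.05, n),
    "uniform":     lambda n: rng.uniform(0.0, 3.0, n),
}
for name, draw in samplers.items():
    a = draw(1_000_000)
    print(f"{name:>11}: Var(a) = {a.var():.4f}, "
          f"Var(e^-a) = {np.exp(-a).var():.4f}")
```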
answered 4 hours ago by Peter Mølgaard Pallesen, edited 1 hour ago