5.3.1 Covariance and Correlation

Consider two random variables $X$ and $Y$. Here, we define the covariance between $X$ and $Y$, written $\textrm{Cov}(X,Y)$. The covariance gives some information about how $X$ and $Y$ are statistically related. Let us provide the definition, then discuss the properties and applications of covariance. The covariance between $X$ and $Y$ is defined as \begin{align}%\label{} \nonumber \textrm{Cov}(X,Y)&=E\big[(X-EX)(Y-EY)\big]=E[XY]-(EX)(EY). \end{align} Note that \begin{align}%\label{} \nonumber E\big[(X-EX)(Y-EY)\big]&=E\big[XY-X(EY)-(EX)Y+(EX)(EY)\big]\\ \nonumber &=E[XY]-(EX)(EY)-(EX)(EY)+(EX)(EY)\\ \nonumber &=E[XY]-(EX)(EY). \end{align} Intuitively, the covariance between $X$ and $Y$ indicates how the values of $X$ and $Y$ move relative to each other. If large values of $X$ tend to happen with large values of $Y$, then $(X-EX)(Y-EY)$ is positive on average. In this case, the covariance is positive and we say $X$ and $Y$ are positively correlated. On the other hand, if $X$ tends to be small when $Y$ is large, then $(X-EX)(Y-EY)$ is negative on average. In this case, the covariance is negative and we say $X$ and $Y$ are negatively correlated. Example
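The equivalence of the two forms of the covariance can also be checked numerically. The following is a minimal sketch (assuming NumPy, with sample means standing in for expectations) that compares $E\big[(X-EX)(Y-EY)\big]$ with $E[XY]-(EX)(EY)$ on simulated data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Two positively related samples: Y = X + noise.
x = rng.normal(size=100_000)
y = x + 0.5 * rng.normal(size=100_000)

# Sample versions of the two equivalent expressions for Cov(X, Y):
# E[(X - EX)(Y - EY)]  and  E[XY] - (EX)(EY).
cov_centered = np.mean((x - x.mean()) * (y - y.mean()))
cov_product = np.mean(x * y) - x.mean() * y.mean()

print(cov_centered, cov_product)  # equal up to floating-point error
```

Since large values of `x` go with large values of `y` here, both estimates come out positive, illustrating positive correlation.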
Now we discuss the properties of covariance. Lemma
\begin{align}%\label{} \nonumber \textrm{Cov}\left(\sum_{i=1}^{m}a_iX_i, \sum_{j=1}^{n}b_jY_j\right)=\sum_{i=1}^{m} \sum_{j=1}^{n} a_ib_j \textrm{Cov}(X_i,Y_j). \end{align} All of the above results can be proven directly from the definition of covariance. For example, if $X$ and $Y$ are independent, then as we have seen before $E[XY]=EX\,EY$, so \begin{align}%\label{} \nonumber \textrm{Cov}(X,Y)=E[XY]-EX\,EY=0. \end{align} Note that the converse is not necessarily true. That is, if $\textrm{Cov}(X,Y)=0$, $X$ and $Y$ may or may not be independent. Let us prove Item 6 in Lemma 5.3, $\textrm{Cov}(X+Y,Z)=\textrm{Cov}(X,Z)+\textrm{Cov}(Y,Z)$. We have \begin{align}%\label{} \nonumber \textrm{Cov}(X+Y,Z)&=E[(X+Y)Z]-E(X+Y)EZ\\ \nonumber &=E[XZ+YZ]-(EX+EY)EZ\\ \nonumber &=E[XZ]-EX\,EZ+E[YZ]-EY\,EZ\\ \nonumber &=\textrm{Cov}(X,Z)+\textrm{Cov}(Y,Z). \end{align} You can prove the rest of the items in Lemma 5.3 similarly. Example
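The identity $\textrm{Cov}(X+Y,Z)=\textrm{Cov}(X,Z)+\textrm{Cov}(Y,Z)$ holds exactly for sample covariances as well, since the proof above uses only linearity of expectation. A minimal sketch (assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(1)
x, y, z = rng.normal(size=(3, 1000))

def cov(a, b):
    # Sample analogue of E[AB] - (EA)(EB).
    return np.mean(a * b) - a.mean() * b.mean()

lhs = cov(x + y, z)             # Cov(X + Y, Z)
rhs = cov(x, z) + cov(y, z)     # Cov(X, Z) + Cov(Y, Z)
print(lhs, rhs)                 # equal up to floating-point error
```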
One of the applications of covariance is finding the variance of a sum of several random variables. In particular, if $Z=X+Y$, then \begin{align}%\label{} \nonumber \textrm{Var}(Z)&=\textrm{Cov}(Z,Z)\\ \nonumber &=\textrm{Cov}(X+Y,X+Y)\\ \nonumber &=\textrm{Cov}(X,X)+\textrm{Cov}(X,Y)+ \textrm{Cov}(Y,X)+\textrm{Cov}(Y,Y)\\ \nonumber &=\textrm{Var}(X)+\textrm{Var}(Y)+2 \textrm{Cov}(X,Y). \end{align} More generally, for $a,b \in \mathbb{R}$, we conclude: \begin{align}\label{eq:var-aX+bY} \textrm{Var}(aX+bY)=a^2\textrm{Var}(X)+b^2\textrm{Var}(Y)+2ab \textrm{Cov}(X,Y) \hspace{20pt} (5.21) \end{align} Correlation Coefficient: The correlation coefficient, denoted by $\rho_{XY}$ or $\rho(X,Y)$, is obtained by normalizing the covariance. In particular, we define the correlation coefficient of two random variables $X$ and $Y$ as the covariance of the standardized versions of $X$ and $Y$. Define the standardized versions of $X$ and $Y$ as \begin{align}\label{eq:normalize} U=\frac{X-EX}{\sigma_X}, \hspace{10pt} V=\frac{Y-EY}{\sigma_Y} \hspace{20pt} (5.22) \end{align} Then, \begin{align}%\label{} \nonumber \rho_{XY}=\textrm{Cov}(U,V)&=\textrm{Cov}\left(\frac{X-EX}{\sigma_X},\frac{Y-EY}{\sigma_Y}\right)\\ \nonumber &=\textrm{Cov}\left(\frac{X}{\sigma_X},\frac{Y}{\sigma_Y}\right) &(\textrm{by Item 5 of Lemma 5.3})\\ \nonumber &=\frac{\textrm{Cov}(X,Y)}{\sigma_X \sigma_Y}. \end{align} \begin{align}%\label{} \nonumber \rho_{XY}=\rho(X,Y)=\frac{\textrm{Cov}(X,Y)}{\sqrt{\textrm{Var}(X)\,\textrm{Var}(Y)}}=\frac{\textrm{Cov}(X,Y)}{\sigma_X \sigma_Y} \end{align} A nice thing about the correlation coefficient is that it is always between $-1$ and $1$. This is an immediate result of the Cauchy-Schwarz inequality, which is discussed in Section 6.2.4. One way to prove that $-1 \leq \rho \leq 1$ is to use the following inequality: \begin{align}%\label{} \alpha \beta \leq \frac{\alpha^2+\beta^2}{2}, \hspace{10pt} \textrm{for }\alpha,\beta \in \mathbb{R}. \end{align} This is because $(\alpha-\beta)^2 \geq 0$. 
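Both Equation 5.21 and the standardized-variable definition of $\rho_{XY}$ can be illustrated with sample quantities. The following is a minimal sketch (assuming NumPy; sample means and the population-style standard deviation stand in for $E$ and $\sigma$):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=5000)
y = 2.0 * x + rng.normal(size=5000)

def cov(a, b):
    # Sample analogue of E[AB] - (EA)(EB).
    return np.mean(a * b) - a.mean() * b.mean()

# Equation 5.21: Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y).
a, b = 3.0, -1.0
lhs = cov(a * x + b * y, a * x + b * y)
rhs = a**2 * cov(x, x) + b**2 * cov(y, y) + 2 * a * b * cov(x, y)

# rho as the covariance of the standardized versions U and V (Equation 5.22):
u = (x - x.mean()) / x.std()
v = (y - y.mean()) / y.std()
rho = cov(u, v)

print(lhs, rhs)  # equal up to floating-point error
print(rho)       # agrees with np.corrcoef(x, y)[0, 1]
```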
The equality holds only if $\alpha=\beta$. From this, we can conclude that for any two random variables $U$ and $V$, \begin{align}%\label{} E[UV] \leq \frac{EU^2+EV^2}{2}, \end{align} with equality only if $U=V$ with probability one. Now, let $ U$ and $V$ be the standardized versions of $X$ and $Y$ as defined in Equation 5.22. Then, by definition $\rho_{XY}=\textrm{Cov}(U,V)=EUV$. But since $EU^2=EV^2=1$, we conclude \begin{align}%\label{} \rho_{XY}=E[UV] & \leq \frac{EU^2+EV^2}{2}=1, \end{align} with equality only if $U=V$. That is, \begin{align}%\label{} \frac{Y-EY}{\sigma_Y}=\frac{X-EX}{\sigma_X}, \end{align} which implies \begin{align}%\label{} Y&=\frac{\sigma_Y}{\sigma_X} X+ \left(EY-\frac{\sigma_Y}{\sigma_X} EX\right)\\ &=aX+b, \hspace{3pt} \textrm{where $a$ and $b$ are constants.} \end{align} Replacing $X$ by $-X$, we conclude that \begin{align}%\label{} \nonumber \rho(-X,Y) \leq 1. \end{align} But $\rho(-X,Y)=-\rho(X,Y)$, thus we conclude $\rho(X,Y) \geq -1$. Thus, we can summarize some properties of the correlation coefficient as follows. Properties of the correlation coefficient:
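The boundary cases $\rho=\pm 1$ occur exactly when $Y=aX+b$, with the sign of $\rho$ matching the sign of $a$. A minimal sketch (assuming NumPy):

```python
import numpy as np

x = np.arange(10.0)

def rho(a, b):
    # Sample correlation: Cov(A, B) / (sigma_A * sigma_B).
    am, bm = a - a.mean(), b - b.mean()
    return np.mean(am * bm) / (a.std() * b.std())

r1 = rho(x, 4 * x + 7)    # exact linear relation, positive slope: rho = 1
r2 = rho(x, -2 * x + 1)   # exact linear relation, negative slope: rho = -1
print(r1, r2)
```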
Definition
Note that as we discussed previously, two independent random variables are always uncorrelated, but the converse is not necessarily true. That is, if $X$ and $Y$ are uncorrelated, then $X$ and $Y$ may or may not be independent. Also, note that if $X$ and $Y$ are uncorrelated, then from Equation 5.21 we conclude that $\textrm{Var}(X+Y)=\textrm{Var}(X)+\textrm{Var}(Y)$. If $X$ and $Y$ are uncorrelated, then \begin{align}%\label{} \nonumber \textrm{Var}(X+Y)=\textrm{Var}(X)+\textrm{Var}(Y). \end{align} More generally, if $X_1,X_2,...,X_n$ are pairwise uncorrelated, i.e., $\rho(X_i,X_j)=0$ when $i \neq j$, then \begin{align}%\label{} \nonumber \textrm{Var}(X_1+X_2+...+X_n)=\textrm{Var}(X_1)+\textrm{Var}(X_2)+...+\textrm{Var}(X_n). \end{align} Note that if $X$ and $Y$ are independent, then they are uncorrelated, and so $\textrm{Var}(X+Y)=\textrm{Var}(X)+\textrm{Var}(Y)$. This is a fact that we stated previously in Chapter 3, and now we can easily prove it using covariance. Example
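A standard illustration that uncorrelated does not imply independent takes $X$ symmetric about zero and $Y=X^2$: the two are clearly dependent, yet $\textrm{Cov}(X,Y)=E[X^3]-EX\,E[X^2]=0$. A minimal sketch (assuming NumPy):

```python
import numpy as np

# X takes the values -2, -1, 0, 1, 2 with equal probability; Y = X^2.
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = x**2

# Sample analogue of E[XY] - (EX)(EY); here E[XY] = E[X^3] = 0 by symmetry.
cov_xy = np.mean(x * y) - x.mean() * y.mean()
print(cov_xy)  # 0: uncorrelated, even though Y is a function of X
```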