# convert volatility of log returns to dollars

3 messages

## convert volatility of log returns to dollars

While not specifically an "R" question, I was hoping someone in the group could help with an answer. (It is finance related and I am coding in R.) I'm stuck on converting volatility of log returns to an absolute dollar amount. The main question is, "How many dollars/cents is one standard deviation, given our current price and volatility of returns?"

For example (`x` is a series of prices):

```r
range(x)
# 753.0 763.5
mean(x)
# 758.9125
sd(x)
# 2.61521  -- I assume that this answers my main question with "$2.62"
```

Now, if I take the log returns:

```r
r <- diff(log(x))
r[1] <- 0
```

and then calculate the SD of the returns:

```r
sd(r)
# 0.001201941
```

this is where I'm stuck. How can I translate the 0.001201941 into the correct monetary value? (Which I assume is $2.62.)

Thanks!

-N

---

Hi,

I'm a bit confused about how to convert a GARCH of log returns into a dollar value. My understanding is that GARCH will give us the variance; the square root of that is the standard deviation. But just multiplying the SD by the price does not appear to produce the correct values. (They seem too large.)

To make it simpler: if I have 100 prices, I first calculate the log returns, then calculate the volatility, and I have the volatility of the log returns. How can I convert that back to a physical dollar value? I want to answer the question, "How many dollars/cents is one standard deviation, given the current price and volatility of returns?"

Thanks!

--
Noah Silverman
UCLA Department of Statistics
8117 Math Sciences Building
Los Angeles, CA 90095
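For what it's worth, the conversion being asked about here has a standard first-order answer: for small log returns, a one-standard-deviation *one-period* move is approximately current price × σ_r in dollars. Note that this is a different quantity from `sd(x)`: the latter measures the dispersion of the price *level* over the whole window, not the size of a typical one-period move, which is likely why the two numbers ($2.62 vs. ~$0.91 here) don't match. A quick numeric sketch (shown in Python/NumPy purely for illustration; the starting price, volatility, and seed are made up, and the same arithmetic carries over to R directly):

```python
import numpy as np

rng = np.random.default_rng(42)
p0 = 758.0      # hypothetical starting price, like the ~759 in the post
sigma = 0.0012  # hypothetical per-period sd of log returns

# Simulate a price path with known log-return volatility.
logret = rng.normal(0.0, sigma, size=2000)
x = p0 * np.exp(np.cumsum(logret))

# Volatility of log returns, as in the post's sd(r).
r = np.diff(np.log(x))
sigma_r = r.std()

# One-standard-deviation one-period move, in dollars:
dollar_move = x[-1] * sigma_r

# Compare against the sd of the actual one-period dollar changes:
actual_move = np.diff(x).std()

# Note: x.std() measures dispersion of the price *level* over the
# window -- a different quantity, not comparable to dollar_move.
print(dollar_move, actual_move)
```

The two printed values should agree closely (both near p0 × sigma ≈ $0.91), while `x.std()` depends on how far the level wanders over the sample. The same logic applies to a GARCH fit: the conditional sigma it reports is in return units, so multiplying by the current price gives the expected one-period dollar move, not the dispersion of the price level.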