
View Full Version : Need help conceptualizing the constant "e"


derosnec
11-16-2007, 02:30 PM
I've used the constant e many times in my finance classes and know how to use it, but I don't understand what e is. To me it's just a number and I use it for discounting.

I looked it up in wikipedia and that page confuses me.

Can anyone explain e to me in kindergarten language?

jay_shark
11-16-2007, 03:10 PM
By definition, e is the limit of (1+1/n)^n as n approaches infinity.

Try this out yourself: plug a large number in for n on your calculator, and the result will approximately match the value of e your calculator gives.

If we expand this using the binomial theorem, we can represent e as an infinite sum:

e = 1/0! + 1/1! + 1/2! + 1/3! + 1/4! + ...
e ≈ 2.718281828...

e is an irrational number, much like pi, so it cannot be represented as a fraction a/b. There are many interesting properties of e, but it's probably worthwhile to start with the definition before getting into other topics.
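A quick way to convince yourself of both characterizations is to compute them directly; here is a minimal Python sketch (an editorial addition, not part of the original post):

```python
import math

# e as the limit of (1 + 1/n)^n: plug in a large n
n = 10**7
limit_approx = (1 + 1/n) ** n

# e as the infinite sum 1/0! + 1/1! + 1/2! + ...
series_approx = sum(1 / math.factorial(k) for k in range(20))

print(limit_approx)    # approaches 2.718281828...
print(series_approx)   # the series converges to the same value much faster
```

Note how slowly the limit converges compared to the factorial series; 20 terms of the series already agree with e to machine precision.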

derosnec
11-16-2007, 04:06 PM
ok makes sense.

now, and you might not know this, but why do i use it in finance so much? we use it for continuously compounding interest rates. for example, $100 at 5% interest continuously compounded for 2 years is $100e^(0.05*2). so, why e?
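For the concrete numbers above, a one-line check (a sketch; the dollar figure printed is computed here, not quoted from the thread):

```python
import math

# $100 at 5% continuously compounded for 2 years: 100 * e^(0.05 * 2)
amount = 100 * math.exp(0.05 * 2)
print(round(amount, 2))   # 110.52
```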

T50_Omaha8
11-16-2007, 04:34 PM
[ QUOTE ]
ok makes sense.

now, and you might not know this, but why do i use it in finance so much? we use it for continuously compounding interest rates. for example, $100 at 5% interest continuously compounded for 2 years is $100e^(0.05*2). so, why e?

[/ QUOTE ]

Because the definition of e...
[ QUOTE ]
the limit of (1+1/n)^n as n approaches infinity

[/ QUOTE ] ...is the formula for compound interest with the number of times you compound tending to infinity.

Say you compound interest 1 time over a period at rate r.

You get (1+r)^1 = 1+r

Say you compound interest 4 times over a period.

You get (1+r/4)^4

Say you compound monthly.

You get (1+r/12)^12.

Say you compound continuously.

You get lim(n to infinity) (1+r/n)^n = e^r

I forget the proof of the very last equality, but I imagine it's not hard to come by. It should be pretty easy to see where the definition of e comes into play, though, based on its similarity to the formula for compound interest.
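The compounding ladder above can be checked numerically: as n grows, (1 + r/n)^n closes in on e^r (a small Python sketch, not from the original post):

```python
import math

r = 0.05
for n in (1, 4, 12, 365, 10**6):   # yearly, quarterly, monthly, daily, ~continuous
    print(n, (1 + r/n) ** n)       # each row creeps up toward the limit
print("e^r =", math.exp(r))        # the limiting value
```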

jay_shark
11-16-2007, 05:30 PM
Pretty much what Omaha said, but I'll expand some more.

When we talk about compound interest, we have the familiar formula A = P(1+r/n)^(nt)

P = principal amount
r = annual interest rate
n = the number of compounding periods per annum
t = time in years


The above formula may be re-written as

A = P*[(1+r/n)^(n/r)]^(rt)

Substitute n/r = x:

A = P*[(1+1/x)^x]^(rt)

So as x becomes large, the quantity (1+1/x)^x approaches e, giving

A = P*e^(rt)

Example: Find the amount after 3 years if $1000 is invested at an interest rate of 12% per annum, compounded continuously.

Solution: Using A = P*e^(rt) with P = 1000, r = 0.12 and t = 3,
A = 1000*e^(0.12*3)
A = $1433.33
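The worked example checks out numerically (a sketch):

```python
import math

# P = $1000, r = 12% per annum, t = 3 years, compounded continuously
A = 1000 * math.exp(0.12 * 3)
print(round(A, 2))   # 1433.33
```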

PairTheBoard
11-16-2007, 06:35 PM
[ QUOTE ]
By definition e is the limit of (1+1/n)^n as n approaches infinity .

[/ QUOTE ]

I think this definition is arbitrary. e shows up in all sorts of places and you could use any one of them as the starting point to define it. My first introduction to e was by way of the area under the curve 1/x. e is that number such that the area from 1 to e under the curve 1/x is 1. The natural log function, ln, is defined as the area under the curve 1/x so from this definition e is that number such that ln(e)=1. From this definition properties like the one above can be derived.

PairTheBoard
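PairTheBoard's area definition can also be checked numerically: a midpoint-rule approximation of the area under 1/x from 1 to e should come out to 1 (an illustrative sketch, not from the original post):

```python
import math

# Midpoint-rule approximation of the integral of 1/x from 1 to e
N = 100_000
a, b = 1.0, math.e
h = (b - a) / N
area = h * sum(1 / (a + (i + 0.5) * h) for i in range(N))
print(area)   # very close to 1, i.e. ln(e) = 1
```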

Fly
11-16-2007, 09:54 PM
Mathematicians were looking for a function that is equal to its own derivative. They narrowed down the search to functions of the form f(x) = a^x, where a is real.

For a fixed x,

f'(x) = lim (1/h)( f(x+h) - f(x) ) as h --> 0

Setting f'(x) = f(x) gives

a^x = lim (1/h)( a^(x+h) - a^x ) as h --> 0

factoring out a^x from the right hand side

a^x = a^x * lim (1/h)( a^x - 1 ) as h --> 0

1 = lim (1/h)( a^x - 1 ) as h --> 0

e is defined to be the unique value of a such that the equation above is true. You can massage the equation above and substitute h = 1/n to get the definition provided by previous posters.
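Numerically, the difference quotient (a^h - 1)/h at small h approximates ln(a), and it equals 1 exactly when a = e (a sketch; the function name is mine):

```python
import math

def slope_at_zero(a, h=1e-6):
    """Difference-quotient approximation of d/dx a^x at x = 0, i.e. ln(a)."""
    return (a ** h - 1) / h

print(slope_at_zero(2))        # ~0.693 = ln 2
print(slope_at_zero(math.e))   # ~1: e^x is its own derivative
print(slope_at_zero(3))        # ~1.099 = ln 3
```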

David Sklansky
11-16-2007, 10:16 PM
e dollars is the amount of money you would have at the end of the year if you put one dollar into a bank that offered 100% interest compounded continuously.

For those who don't know what this means, let's change it to a bank that compounds your million dollars every 3.65 days, at a 100% annual interest rate. So every 3.65 days they give you one percent. After 3.65 days you would have $1,010,000. After 7.30 days you would have $1,020,100. After 10.95 days you would have $1,030,301. At the end of the year you would have just short of "e" million dollars (as opposed to two million with no compounding, or 2.25 million if interest were compounded every six months). The thing is that even though shrinking the time period for compounding makes you more and more money, you run into one of those limit thingies that jason and boris love, and you can't get past e.
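Sklansky's schedule (1% credited every 3.65 days, 100 times in a year) is easy to replay (a sketch; the final figure is computed here, not quoted):

```python
balance = 1_000_000.0
for _ in range(100):        # 1% interest every 3.65 days for a year
    balance *= 1.01
print(round(balance, 2))    # just short of e million
```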

The more important thing about e concerns making prop bets when poker tournaments redraw. If the number of players left is more than a few (anywhere from 20 to a googol), the chance that everybody draws a new seat is almost exactly one out of e.

Fly
11-16-2007, 10:27 PM
oops, in the last 2 lines of my post, it should be a^h not a^x inside the limit =/

David Sklansky
11-16-2007, 11:25 PM
[ QUOTE ]
[ QUOTE ]
By definition e is the limit of (1+1/n)^n as n approaches infinity .

[/ QUOTE ]

I think this definition is arbitrary. e shows up in all sorts of places and you could use any one of them as the starting point to define it. My first introduction to e was by way of the area under the curve 1/x. e is that number such that the area from 1 to e under the curve 1/x is 1. The natural log function, ln, is defined as the area under the curve 1/x so from this definition e is that number such that ln(e)=1. From this definition properties like the one above can be derived.

PairTheBoard

[/ QUOTE ]

Somebody else must have written this post for you.

Enrique
11-17-2007, 01:13 PM
[ QUOTE ]
Mathematicians were looking for a function that is equal to its own derivative. They narrowed down the search to functions of the form f(x) = a^x, where a is real.

For a fixed x,

f'(x) = lim (1/h)( f(x+h) - f(x) ) as h --> 0

a^x = lim (1/h)( a^(x+h) - a^x ) as h --> 0

factoring out a^x from the right hand side

a^x = a^x * lim (1/h)( a^x - 1 ) as h --> 0

1 = lim (1/h)( a^x - 1 ) as h --> 0

e is defined to be the unique value of a such that the equation above is true. You can massage the equation above and substitute h = 1/n to get the definition provided by previous posters.

[/ QUOTE ]

I don't think this is true. I think Euler was the first one to talk about the constant, and he was trying to sum power series. Working out the properties of those sums, he found "e" (although of course he didn't call it that) and noticed it was an important constant for summing stuff.

The property that DS mentions about everyone getting a new seat is a cool probability that Euler discovered while working on what is called the hat problem: n people enter a party and each one leaves his hat at the door to dance. If you give them their hats back randomly, what is the probability that no one gets his own hat back? The answer is 1/e.
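For small n one can enumerate all permutations and watch the no-match probability settle near 1/e (an editorial brute-force sketch):

```python
import math
from itertools import permutations

def no_match_probability(n):
    """Probability that a uniformly random permutation of n hats has no fixed point."""
    derangements = sum(
        1 for p in permutations(range(n)) if all(p[i] != i for i in range(n))
    )
    return derangements / math.factorial(n)

for n in (2, 3, 5, 8):
    print(n, no_match_probability(n))
print("1/e =", 1 / math.e)   # ~0.3679
```

Already at n = 8 the exact probability agrees with 1/e to about five decimal places.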

Fly
11-17-2007, 01:40 PM
[ QUOTE ]

The property that DS mentions about everyone getting a new seat, is a cool probability that Euler discovered while working on what is called the hat problem


[/ QUOTE ]

I've never heard it called the "hat problem". Either "matching problem", "problem of recontre", or for the simplest case, counting derangements.

borisp
11-17-2007, 06:46 PM
[ QUOTE ]
The more important thing about e concerns making prop bets when poker tournaments redraw

[/ QUOTE ]
lololololololololol...etc

Oh, and

[ QUOTE ]
Somebody else must have written this post for you

[/ QUOTE ]

lololololololololol...etc

David Sklansky
11-17-2007, 10:22 PM
[ QUOTE ]
[ QUOTE ]

The property that DS mentions about everyone getting a new seat, is a cool probability that Euler discovered while working on what is called the hat problem


[/ QUOTE ]

I've never heard it called the "hat problem". Either "matching problem", "problem of recontre", or for the simplest case, counting derangements.

[/ QUOTE ]

Would make a small bet that it's known mainly by the words "Euler's problem of the misaddressed letters".

Fly
11-17-2007, 10:49 PM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]

The property that DS mentions about everyone getting a new seat, is a cool probability that Euler discovered while working on what is called the hat problem


[/ QUOTE ]

I've never heard it called the "hat problem". Either "matching problem", "problem of recontre", or for the simplest case, counting derangements.

[/ QUOTE ]

Would make a small bet that it's known mainly by the words "Euler's problem of the misaddressed letters".

[/ QUOTE ]

Sure, so long as "mainly" corresponds to what has more (relevant) google hits and/or a wiki entry.

Edit - Spelling Correction

Should be problem of rencontres not recontre

borisp
11-17-2007, 11:59 PM
[ QUOTE ]
...The answer is 1/e...

[/ QUOTE ]
I think you mean that the answer tends to 1/e as n approaches infinity. Lol mathaments. Btw, wiki thinks that Bernoulli was the first to "discover" e, and apparently he did it by considering continuously compounded interest. Of course, this could be wrong.

Here is a cooler problem, imo: show that the expected value of the # of people who get their hat back is 1, independent of n.

blah_blah
11-18-2007, 12:08 AM
I don't want to spoil it, but in fact, if \sigma is an element of the symmetric group on n letters, p is a polynomial, and \fix\sigma denotes the number of fixed points of \sigma, then there are methods to evaluate

\sum_{\sigma\in S_n} p(\fix\sigma)

here is a simple proof for the case p = id which generalizes to higher degree polynomials. let \fix_i \sigma = 1 if \sigma fixes the ith place and 0 otherwise.

\sum_{\sigma\in S_n} \fix\sigma =
\sum_{\sigma\in S_n} \sum_i \fix_i\sigma =
\sum_i \sum_{\sigma\in S_n} \fix_i\sigma =
\sum_i (n-1)! =
n(n-1)! =
n!

which is the desired result.
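blah_blah's identity is easy to verify by brute force for small n (a sketch; the helper name is mine):

```python
from itertools import permutations
from math import factorial

def fix(p):
    """Number of fixed points of the permutation p (given as a tuple)."""
    return sum(1 for i, x in enumerate(p) if i == x)

for n in range(1, 7):
    total = sum(fix(p) for p in permutations(range(n)))
    print(n, total, factorial(n))   # the last two columns match
```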

Fly
11-18-2007, 12:19 AM
[ QUOTE ]

Here is a cooler problem, imo: show that the expected value of the # of people who get their hat back is 1, independent of n.

[/ QUOTE ]

How is this cooler? This is way easier to solve than the original problem; just use indicator functions <---- answer in white.

borisp
11-18-2007, 12:23 AM
[ QUOTE ]
I don't want to spoil it

[/ QUOTE ]
Then why did you post the answer? :)

Say \sigma is a permutation of n letters, and V is a vector space of dimension n, with basis e_i. Define a linear map e_i \mapsto e_{\sigma(i)}. The trace of this linear map is equal to the number of fixed points of \sigma. This observation, together with the fact that trace is linear, is basically your argument.

borisp
11-18-2007, 12:25 AM
[ QUOTE ]
[ QUOTE ]

Here is a cooler problem, imo: show that the expected value of the # of people who get their hat back is 1, independent of n.

[/ QUOTE ]

How is this cooler? This is way easier to solve than the original problem; just use indicator functions <---- answer in white.

[/ QUOTE ]
Cooler in that it admits several elegant and simple solutions. To me, easier problems are cooler.

Fly
11-18-2007, 12:44 AM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]

Here is a cooler problem, imo: show that the expected value of the # of people who get their hat back is 1, independent of n.

[/ QUOTE ]

How is this cooler? This is way easier to solve than the original problem; just use indicator functions <---- answer in white.

[/ QUOTE ]
Cooler in that it admits several elegant and simple solutions. To me, easier problems are cooler.

[/ QUOTE ]

Can't disagree with that =).

That post also jarred my memory and reminded me of an amazingly simple solution to the 1st problem using exponential generating functions. Thanks :)

blah_blah
11-18-2007, 01:24 AM
[ QUOTE ]
[ QUOTE ]

Here is a cooler problem, imo: show that the expected value of the # of people who get their hat back is 1, independent of n.

[/ QUOTE ]

How is this cooler? This is way easier to solve than the original problem; just use indicator functions <---- answer in white.

[/ QUOTE ]

sure, this provides an easy solution, but it's essentially a linear technique and thus isn't particularly useful if you want to generalize the problem.

what is

\sum_{\sigma\in S_n} [\fix(\sigma)]^2
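Brute force over small n answers blah_blah's question (a sketch; since the expected number of fixed points is 1 and, for n >= 2, E[X(X-1)] = 1 as well, the sum comes out to 2*n!):

```python
from itertools import permutations
from math import factorial

def fix(p):
    # number of fixed points of the permutation p
    return sum(1 for i, x in enumerate(p) if i == x)

for n in range(2, 7):
    total = sum(fix(p) ** 2 for p in permutations(range(n)))
    print(n, total, 2 * factorial(n))   # the last two columns agree for n >= 2
```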

thylacine
11-18-2007, 02:21 AM
[ QUOTE ]

Here is a cooler problem, imo: show that the expected value of the # of people who get their hat back is 1, independent of n.

[/ QUOTE ]

Hmm, how about n(1/n)=1!
uA

David Sklansky
11-18-2007, 02:58 AM
[ QUOTE ]
[ QUOTE ]

Here is a cooler problem, imo: show that the expected value of the # of people who get their hat back is 1, independent of n.

[/ QUOTE ]

Hmm, how about n(1/n)=1!
uA

[/ QUOTE ]

Obviously. And while I'm mad that I only just got to this thread, so you posted it first, I'm quite happy that boris was nice enough to post a question that perfectly shows why a clever amateur will sometimes beat not-so-clever pros.

Permit me to answer it in a way that everyone will understand.

There are 1,000 players in a tournament redrawing for seats. The Rio is paying five thousand dollars to any player who gets his own seat. Each player has a one in a thousand chance of making five thousand dollars. Each player has an EV of $5.

I go around buying up everyone's EV for face value. It costs me five grand. Each purchase is a break-even purchase for me, thus the whole deal is a break-even thing for me. Therefore my expected payoff from the Rio (which ranges from zero to $5 million) is the $5,000 I paid. If the expected value of my prize is $5,000, the expected number of matches is one. And of course this would work for any size tournament.

borisp
11-18-2007, 03:31 AM
[ QUOTE ]
[ QUOTE ]

Here is a cooler problem, imo: show that the expected value of the # of people who get their hat back is 1, independent of n.

[/ QUOTE ]

Hmm, how about n(1/n)=1!
uA

[/ QUOTE ]
Ok, mr genius, show that the outcomes are independent. I know this is "intuitively obvious" but actually providing a proof is an altogether different matter.

And for once, I actually agree with Sklansky rigor. Tournament reseating is the perfect way to conceptualize this problem, for those who already have experience with poker tournaments.

Subfallen
11-18-2007, 03:46 AM
[ QUOTE ]

And for once, I actually agree with Sklansky rigor. Tournament reseating is the perfect way to conceptualize this problem, for those who already have experience with poker tournaments.

[/ QUOTE ]

I can't even tell when you're being sarcastic anymore, lol.

What needs to be proven re: independent outcomes? Algebraically A's probability of receiving his seat doesn't change after B receives an unknown seat:
(1/n)(0) + ((n-1)/n)(1/(n-1)) = 1/n

So it seems to follow by induction that no B,C,D...Z...(n-1) prior assignments would ever change the algebraic probability for A receiving his seat. Is algebra not good enough here?

borisp
11-18-2007, 03:57 AM
[ QUOTE ]
[ QUOTE ]

And for once, I actually agree with Sklansky rigor. Tournament reseating is the perfect way to conceptualize this problem, for those who already have experience with poker tournaments.

[/ QUOTE ]

I can't even tell when you're being sarcastic anymore, lol.

What needs to be proven re: independent outcomes? Algebraically A's probability of receiving his seat doesn't change after B receives an unknown seat:
(1/n)(0) + ((n-1)/n)(1/(n-1)) = 1/n

So it seems to follow by induction that no B,C,D...Z...(n-1) prior assignments would ever change the algebraic probability for A receiving his seat. Is algebra not good enough here?

[/ QUOTE ]
When something is obvious, it has not yet overcome its requirement of being written down.

However, you have provided a proof of what I was demanding. My point was that this notion (independence) must be acknowledged, otherwise the solution is incomplete.

And with regard to sarcasm, etc...if you are able to develop a method for determining this sort of thing, plz to be sending it to my home base, for we would be greatly appreciating such an algorithm :)

David Sklansky
11-18-2007, 04:10 AM
Now please make yourself useful and go over to my thread "Improving On Buffet And Desert Cat" on the Business Forum and explain to DesertCat that I'm right. (Even though my argument doesn't meet your brand of rigor.)

borisp
11-18-2007, 05:48 AM
[ QUOTE ]
Now please make yourself useful and go over to my thread "Improving On Buffet And Desert Cat" on the Business Forum and explain to DesertCat that I'm right. (Even though my argument doesn't meet your brand of rigor.)

[/ QUOTE ]
Like, seriously, this is a loaded request.

One way to explain your point in the business forum is to point out that history books are only written by the winners. This sufficiently discredits the opinion(s) of Buffett (or whomever)...or so I think. The point is that an informed opinion is worth more than the literature assigns it, since the literature has a natural bias.

In any event, I'll do my best. And if I do, I'm only doing it out of irrational hero worship. I actually DID wear the cover off of my Theory of Poker book.

ADDboy
11-18-2007, 09:50 AM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]

Here is a cooler problem, imo: show that the expected value of the # of people who get their hat back is 1, independent of n.

[/ QUOTE ]

Hmm, how about n(1/n)=1!
uA

[/ QUOTE ]
Ok, mr genius, show that the outcomes are independent. I know this is "intuitively obvious" but actually providing a proof is an altogether different matter.


[/ QUOTE ]

Why do we care about whether or not the outcomes are independent? The question is about expected values, and the expected value of the sum is the sum of the expected values regardless of whether or not the variables are independent.

Fly
11-18-2007, 12:57 PM
[ QUOTE ]

I'm quite happy that boris was nice enough to post a question that perfectly shows why clever amateur will sometimes beat not so clever pros


[/ QUOTE ]

If I call every bet, I will never be bluffed!!!!!!!!!!!!

Fly
11-18-2007, 01:05 PM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]

Here is a cooler problem, imo: show that the expected value of the # of people who get their hat back is 1, independent of n.

[/ QUOTE ]

Hmm, how about n(1/n)=1!
uA

[/ QUOTE ]
Ok, mr genius, show that the outcomes are independent. I know this is "intuitively obvious" but actually providing a proof is an altogether different matter.


[/ QUOTE ]

Why do we care about whether or not the outcomes are independent? The question is about expected values, and the expected value of the sum is the sum of the expected values regardless of whether or not the variables are independent.

[/ QUOTE ]

n(1/n) = 1/n + 1/n + .... + 1/n (n terms in the sum) where the ith term represents the probability that the ith person receives his hat. If the events were not independent, the 2nd through nth terms could be different, yielding a different sum.

PairTheBoard
11-18-2007, 01:27 PM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]

Here is a cooler problem, imo: show that the expected value of the # of people who get their hat back is 1, independent of n.

[/ QUOTE ]

Hmm, how about n(1/n)=1!
uA

[/ QUOTE ]
Ok, mr genius, show that the outcomes are independent. I know this is "intuitively obvious" but actually providing a proof is an altogether different matter.


[/ QUOTE ]

Why do we care about whether or not the outcomes are independent? The question is about expected values, and the expected value of the sum is the sum of the expected values regardless of whether or not the variables are independent.

[/ QUOTE ]

n(1/n) = 1/n + 1/n + .... + 1/n (n terms in the sum) where the ith term represents the probability that the ith person receives his hat. If the events were not independent, the 2nd through nth terms could be different, yielding a different sum.

[/ QUOTE ]

Actually, the events are not "independent" as the term is normally used. To be "independent" would mean that the probability of me getting my hat stays the same regardless of whether Joe gets his hat or not. But it does change. If Joe gets his hat, my chance improves to 1/(n-1) instead of 1/n.

So the concept being talked about is not "independence" as the term is used in probability. I'm not sure it has a name. But it's the same concept as the one we invoke when we say that the unseen cards in other players' hands do not affect my probability of hitting my flush card on the river. With the hats, each person can say ahead of time that he will have 1/n chance of getting his own hat regardless of what place in line he is in when they are handed out. That defines n dependent indicator functions each of whose EV can be added despite their not being independent.

What is the correct term for the concept that unseen cards don't affect my probabilities regardless of whether they are in other players' hands or at the bottom of the deck? I'm not sure. I don't think it's something we ask to be proved all the time though.



PairTheBoard
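PairTheBoard's 1/(n-1) figure can be confirmed by enumerating permutations, using exact arithmetic (an editorial sketch; the function name is mine):

```python
from fractions import Fraction
from itertools import permutations

def p_mine_given_joes(n):
    """P(person 0 gets own hat | person 1 gets own hat) under a uniform permutation."""
    joe_matches = [p for p in permutations(range(n)) if p[1] == 1]
    both_match = [p for p in joe_matches if p[0] == 0]
    return Fraction(len(both_match), len(joe_matches))

print(p_mine_given_joes(5))   # 1/4, i.e. 1/(n-1), versus the unconditional 1/5
```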

David Sklansky
11-18-2007, 02:47 PM
Two in a row. Unbelievable.

thylacine
11-18-2007, 04:50 PM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]

Here is a cooler problem, imo: show that the expected value of the # of people who get their hat back is 1, independent of n.

[/ QUOTE ]

Hmm, how about n(1/n)=1!
uA

[/ QUOTE ]
Ok, mr genius, show that the outcomes are independent. I know this is "intuitively obvious" but actually providing a proof is an altogether different matter.


[/ QUOTE ]

Why do we care about whether or not the outcomes are independent? The question is about expected values, and the expected value of the sum is the sum of the expected values regardless of whether or not the variables are independent.

[/ QUOTE ]

n(1/n) = 1/n + 1/n + .... + 1/n (n terms in the sum) where the ith term represents the probability that the ith person receives his hat. If the events were not independent, the 2nd through nth terms could be different, yielding a different sum.

[/ QUOTE ]

Actually, the events are not "independent" as the term is normally used. To be "independent" would mean that the probability of me getting my hat stays the same regardless of whether Joe gets his hat or not. But it does change. If Joe gets his hat, my chance improves to 1/(n-1) instead of 1/n.

So the concept being talked about is not "independence" as the term is used in probability. I'm not sure it has a name. But it's the same concept as the one we invoke when we say that the unseen cards in other players' hands do not affect my probability of hitting my flush card on the river. With the hats, each person can say ahead of time that he will have 1/n chance of getting his own hat regardless of what place in line he is in when they are handed out. That defines n dependent indicator functions each of whose EV can be added despite their not being independent.

What is the correct term for the concept that unseen cards don't affect my probabilities regardless of whether they are in other players' hands or at the bottom of the deck? I'm not sure. I don't think it's something we ask to be proved all the time though.



PairTheBoard

[/ QUOTE ]


FWIW, when I wrote "Hmm, how about n(1/n)=1!" I was fully aware that the events were not independent (for n>1), and I was fully aware that it didn't matter.

Fly
11-18-2007, 07:29 PM
PTB,

concept that is not independence = symmetry?

borisp
11-19-2007, 04:07 AM
ugh, yeah, I misused the term "independence." I know nothing about formal statistics, hence the wrong vocabulary. What I really meant was the concept PTB alluded to, which is to observe that there is a symmetry among the hat recipients.

But my real point was that a solution is incomplete when the author has not accounted for all relevant considerations, even when some of them are obvious. Of course, this is a subjective judgment in any case.

borisp
11-19-2007, 04:30 AM
[ QUOTE ]
Now please make yourself useful and go over to my thread "Improving On Buffet And Desert Cat" on the Business Forum and explain to DeserstCat that I'm right. (Even though my argument doesn't meet your brand of rigor.)

[/ QUOTE ]
It now occurs to me that I have no idea what you are talking about in that thread.

David Sklansky
11-19-2007, 05:06 AM
If your estimate of the future price of all stocks is on average more accurate than the present market price, but at the same time you are not sure you have an edge investing unless your estimate is at least 30% higher or lower than the market price (because you admit you are not perfect), then, barring weird discontinuous functions, you can conclude that the actual EV of a stock is on average somewhere in between your estimate and the market's estimate. (As long as the market price is not totally random.)

The subject arose because Buffett and DesertCat say the right way to invest is to know stock analysis, look at the information about a stock before knowing its price, estimate its true value, and then invest if there is a large discrepancy. I pointed out that this is fine but that the original estimate should be moved to varying degrees toward the market price. DesertCat disagreed. I said if you disagree you can't claim you need a large cushion to invest.

jay_shark
11-23-2007, 07:50 PM
Here is a rigorous solution for the expected number of hat matches.

Let X denote the number of matches. We can compute E(X) by writing X = X1 + X2 + X3 + ... + Xn,

where Xi = 1 if the ith person selects his own hat, and 0 otherwise.

For each i, the ith person is equally likely to select any of the n hats, so

E(Xi) = P(Xi = 1) = 1/n

E(X) = E(X1) + E(X2) + ... + E(Xn) = n*(1/n) = 1

Hence, on average, exactly one person selects his own hat.
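A Monte Carlo check of the thread's conclusion: the expected number of matches hovers around 1 regardless of n (a sketch; the trial counts and seed are my own choices):

```python
import random

def expected_matches(n, trials=100_000, seed=1):
    """Monte Carlo estimate of the expected number of people who get their own hat back."""
    rng = random.Random(seed)
    hats = list(range(n))
    total = 0
    for _ in range(trials):
        rng.shuffle(hats)                                  # random redraw of hats/seats
        total += sum(1 for i, h in enumerate(hats) if i == h)
    return total / trials

print(expected_matches(10))                   # close to 1
print(expected_matches(1000, trials=5_000))   # still close to 1 for much larger n
```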