Two Plus Two Newer Archives  

#1 - 05-16-2007, 01:26 AM
viciouspenguin (Senior Member, Join Date: Mar 2006, Location: Commerce/Vegas, Posts: 1,442)
Game Theory Question

There are two firms in a market that produce an identical product. Each firm has either one or zero units to sell: the probability of having a unit to sell is q and the probability of having no units to sell is 1-q. There is a single consumer interested in buying only one unit of the good, at a price not to exceed $1. If both firms have capacity available, the consumer buys from the lower-priced firm. If only one firm has capacity, the consumer buys from that firm, provided its price does not exceed $1. You must decide what price to charge for the good if you have a unit available to sell. Note that at the time of making this decision you do not know whether your competitor will have a unit to sell, nor what price it will choose.

What are the optimal price choices for the following 3 cases?

1. q = 1/4
2. q = 1/2
3. q = 3/4
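
For readers who want to experiment, here is a minimal Python sketch of the rules above. The function names are illustrative, and it assumes, as the replies below do, that a price tie is broken by a coin flip and that a price above $1 never sells.

[code]
import random

def seller_revenue(my_price, rival_price, q, rng=random):
    """One random market outcome for 'my' firm: revenue of 0 or my_price.

    Assumes a price tie is broken by a coin flip and that a price above $1
    never sells (neither rule is stated in the problem; they are the ones
    adopted in the replies below).
    """
    i_have_unit = rng.random() < q
    rival_has_unit = rng.random() < q
    if not i_have_unit or my_price > 1.0:
        return 0.0
    if not rival_has_unit or rival_price > 1.0:
        return my_price                      # the rival cannot compete
    if my_price < rival_price:
        return my_price                      # I undercut the rival
    if my_price > rival_price:
        return 0.0                           # the rival undercuts me
    return my_price if rng.random() < 0.5 else 0.0   # tie: coin flip

def expected_revenue(my_price, rival_price, q, trials=200_000, seed=0):
    rng = random.Random(seed)
    return sum(seller_revenue(my_price, rival_price, q, rng)
               for _ in range(trials)) / trials

# e.g. q = 1/4, charging 99c against a rival charging $1: about 0.2475
print(expected_revenue(0.99, 1.00, 0.25))
[/code]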
#2 - 05-16-2007, 02:33 AM
m_the0ry (Senior Member, Join Date: Aug 2006, Posts: 790)
Re: Game Theory Question

Question: does q represent the probability of having a unit for both firms, or is the probability of one firm having a unit unknown?
#3 - 05-16-2007, 01:28 PM
bigmonkey (Senior Member, Join Date: May 2007, Posts: 118)
Re: Game Theory Question

Interesting question. I'm finding it hard to work out straight away, so I'll just post my thoughts on it (the q = 1/4 case). Firstly, we don't know what happens if both firms have it in stock and charge exactly the same price. I assume the consumer flips a coin to choose who he buys it from, which is equivalent to the sellers selling half a product each.

If you charge a dollar then you have an EV of at least 18.75c: that's the chance that you have it in stock and your opponent doesn't (1/4 * 3/4 = 3/16) multiplied by your dollar.

It seems what you want to be doing is predicting what your opponent will charge and then going 1c below that (let's assume you have to charge to the nearest cent). If your opponent charged $1 and you charged 99c, your EV would be 24.75c (1/4 * 99c, since whenever you have a unit you undercut him). I think that is the highest payoff available.

I've done a table with values to the nearest 10c, as any more is too much effort.

[Table image: EV for each pair of prices in 10c steps, with dominated strategies shaded yellow.]
The yellow fields represent strategies that both players know neither will take, because they are dominated by the prices above them. But 80c isn't dominated by 90c, because there are some possibilities where you get paid more at 80c than at 90c.
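
A table like this can be recomputed directly; the sketch below (a reconstruction, not the original image) tabulates the unconditional EV for each pair of prices on a 10c grid with q = 1/4, using the coin-flip tie-break, and flags the prices that are dominated by some other price on the same grid.

[code]
from fractions import Fraction

Q = Fraction(1, 4)
PRICES = [Fraction(k, 10) for k in range(1, 11)]   # 10c, 20c, ..., $1.00

def ev(my_price, rival_price, q=Q):
    """Unconditional EV of charging my_price against a rival charging
    rival_price (includes the probability q of having a unit at all)."""
    if my_price < rival_price:
        sell_if_rival_present = Fraction(1)        # I undercut the rival
    elif my_price == rival_price:
        sell_if_rival_present = Fraction(1, 2)     # coin-flip tie-break
    else:
        sell_if_rival_present = Fraction(0)
    return q * ((1 - q) + q * sell_if_rival_present) * my_price

def dominated(p):
    """True if some single alternative price does at least as well against
    every rival price on the grid and strictly better against at least one."""
    return any(
        all(ev(alt, r) >= ev(p, r) for r in PRICES)
        and any(ev(alt, r) > ev(p, r) for r in PRICES)
        for alt in PRICES if alt != p
    )

print("me\\rival" + "".join(f"{float(r):>7.2f}" for r in PRICES))
for p in PRICES:
    row = "".join(f"{float(ev(p, r)):>7.3f}" for r in PRICES)
    tag = "   <- dominated" if dominated(p) else ""
    print(f"{float(p):<8.2f}{row}{tag}")
# only 0.80, 0.90 and 1.00 survive, matching the three prices kept above
[/code]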

Now, with these 3 strategies available to each of us, this is what we are left with: if I go 100 I want my opponent to go 100; if I go 90 I want my opponent to go 100; and if I go 80 I want my opponent to go 100. Also, if I go 100 he ought to go 90; if I go 90 he ought to go 80; and if I go 80 he ought to go 100.

Hence there is no equilibrium in pure strategies here. But if we believed he would choose randomly between these three strategies, then we are better off going with one dollar. It basically becomes like stone, paper, scissors, with him thinking you'll go 100, so he goes 90, so you go 80, so he goes 100, and so on.

I'm not that knowledgeable about game theory, so I'd be interested if anybody can do this problem a shorter way than me, or can come up with an equation and some observations based on my chart to handle the other values of q. Also, at which point, to the nearest cent, does a price become dominated?
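
On that last question, one way to check is by brute force: under the same assumptions, compare every whole-cent price against simply charging $1 (weak dominance, with the rival also restricted to whole cents). A rough sketch:

[code]
from fractions import Fraction

Q = Fraction(1, 4)
CENTS = [Fraction(k, 100) for k in range(1, 101)]   # 1c ... $1.00

def ev(my_price, rival_price, q=Q):
    # same payoff as in the table sketch above: coin flip on ties
    if my_price < rival_price:
        sell = Fraction(1)
    elif my_price == rival_price:
        sell = Fraction(1, 2)
    else:
        sell = Fraction(0)
    return q * ((1 - q) + q * sell) * my_price

def dominated_by_dollar(p):
    """Charging $1 does at least as well against every rival cent price,
    and strictly better against at least one."""
    one = Fraction(1)
    return (all(ev(one, r) >= ev(p, r) for r in CENTS)
            and any(ev(one, r) > ev(p, r) for r in CENTS))

lowest_surviving = min(p for p in CENTS if not dominated_by_dollar(p))
print(float(lowest_surviving))   # 0.76: every price of 75c or below is
                                 # (weakly) dominated by charging the dollar
[/code]

The cutoff comes out at 1 - q (75c here): below that, even a price that sells for sure earns less than charging the full dollar earns from the rival being out of stock alone.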
#4 - 05-16-2007, 02:27 PM
HP (Senior Member, Join Date: Oct 2004, Location: DZ-015, Posts: 2,783)
Re: Game Theory Question

I am assuming that if both firms choose the same price, the consumer flips a coin to make a choice.

For any positive q, it would appear there is a co-operation/competition effect between the two firms, resulting in only one Nash equilibrium, with both of them setting a price of 0.
#5 - 05-16-2007, 02:32 PM
HP (Senior Member, Join Date: Oct 2004, Location: DZ-015, Posts: 2,783)
Re: Game Theory Question

However, I'm not sure about that.
#6 - 05-16-2007, 03:08 PM
HP (Senior Member, Join Date: Oct 2004, Location: DZ-015, Posts: 2,783)
Re: Game Theory Question

Okay, I've changed my answer:

There is a Nash equilibrium only at both firms setting the price at 0, for any q except q = 1/2.

In the q = 1/2 case, both firms should choose a price P from the uniform distribution over [0,1].

My answer is subject to further change...
#7 - 05-16-2007, 03:20 PM
HP (Senior Member, Join Date: Oct 2004, Location: DZ-015, Posts: 2,783)
Re: Game Theory Question

Update on my next answer, lol:

When q > 1/2, the only Nash equilibrium is both firms selling at a price of 0.

When q = 1/2, both firms should choose a price from the uniform distribution over (0,1).

When q < 1/2, I'm not sure yet...

Edit: nope, this ain't right either. Allow me to start over.
#8 - 05-16-2007, 04:17 PM
HP (Senior Member, Join Date: Oct 2004, Location: DZ-015, Posts: 2,783)
Re: Game Theory Question

Alright, my latest answer:

Each firm chooses a mixed strategy. If p(x) represents the probability density function of the price a firm chooses, p(x) is the following:

p(x) = (1-q)/(q*x^2) when (1-q) < x < 1

and p(x) = 0 everywhere else.
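
This proposed density can be sanity-checked numerically. The sketch below (an illustrative check with made-up helper names, not from the thread) samples the rival's price from p(x) by inverting its CDF and estimates the expected revenue, given that you have a unit, of charging various prices inside the support; if the density really is an equilibrium, every price in the support should earn about the same amount, and they all come out near 1 - q.

[code]
import random

def sample_price(q, rng):
    """Draw a price from the proposed density (1-q)/(q*x^2) on ((1-q), 1)
    by inverting its CDF, F(x) = 1/q - (1-q)/(q*x)."""
    u = rng.random()
    return (1 - q) / (1 - q * u)

def expected_revenue(my_price, q, trials=400_000, seed=1):
    """Monte Carlo expected revenue, given that my firm has a unit, when the
    rival has a unit with probability q and then prices using sample_price()."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        if rng.random() >= q:                  # rival has no unit: I sell
            total += my_price
        elif sample_price(q, rng) > my_price:  # rival priced above me: I sell
            total += my_price
    return total / trials

for q in (0.25, 0.5, 0.75):
    # any price inside the support ((1-q), 1) should earn about (1 - q)
    probes = [1 - q + 0.01, 1 - q / 2, 0.99]
    print(q, [round(expected_revenue(p, q), 3) for p in probes])
[/code]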

This is my final guess!! At least until someone else posts in this thread.

And yeah, this is for 0 < q <= 1.

When q = 0, it would seem both firms choose a price of 0.

Edit: I'm talking about Nash equilibria, obv.

Edit 2: wrote a fraction down wrong.
#9 - 05-17-2007, 12:32 AM
janneman (Junior Member, Join Date: May 2007, Posts: 3)
Re: Game Theory Question

I don't really know anything about Nash equilibria, but if q = 3/4 then by picking price = 1 your company will make 1/4 on average (the chance the opponent has no unit). So, regardless of any assumptions, it seems the price has to be at least 1/4 for a strategy to beat this trivial one: a lower price earns less than 1/4 even if you always sell.

Assume your opponent's price is uniformly distributed in [0,1]. Call your price P. On average you will make:

(1-q)P + q(1-P)P = P - qP^2
(a parabola with its maximum at P = 1/(2q), capped at P = 1 since the price cannot exceed $1)

So if q = 1/4: P_optimal = 1
if q = 1/2: P_optimal = 1
if q = 3/4: P_optimal = 2/3

I don't know, however, if you're allowed to make this assumption (I guess not, if there's game theory involved).
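
Under that uniform-rival assumption, the optima can be double-checked by brute force over whole-cent prices; a small sketch (the function name is just for illustration):

[code]
def expected_revenue_vs_uniform(P, q):
    """janneman's expression: sell for sure if the rival has no unit (prob 1-q),
    otherwise sell only if the rival's uniform[0,1] price exceeds P (prob 1-P)."""
    return (1 - q) * P + q * (1 - P) * P   # = P - q*P**2

cents = [k / 100 for k in range(0, 101)]
for q in (0.25, 0.5, 0.75):
    best = max(cents, key=lambda P: expected_revenue_vs_uniform(P, q))
    print(q, best)   # expect 1.00, 1.00, and about 0.67 (i.e. 2/3)
[/code]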