The concept of information structure in incomplete information games
I would like your help understanding the concept of an information structure in incomplete information games, as used on pp. 6-7 of this paper.
Let me summarise the game as described in the paper.
There are $N \in \mathbb{N}$ players, with $i$ denoting a generic player.
There is a finite set of states $\Theta$, with $\theta$ denoting a generic state.
A basic game $G$ consists of
- for each player $i$, a finite set of actions $A_i$, where we write $A \equiv A_1 \times A_2 \times \dots \times A_N$, and a utility function $u_i: A \times \Theta \rightarrow \mathbb{R}$;
- a full-support prior $\psi \in \Delta(\Theta)$.
An information structure $S$ consists of
- for each player $i$, a finite set of signals $T_i$, where we write $T \equiv T_1 \times T_2 \times \dots \times T_N$;
- a signal distribution $\pi: \Theta \rightarrow \Delta(T)$.
A decision rule of the incomplete information game $(G,S)$ is a mapping
$$
\sigma: T \times \Theta \rightarrow \Delta(A).
$$
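For concreteness, here is a small illustration of an information structure (my own example, not taken from the paper): take a single player ($N = 1$), two states $\Theta = \{\theta_0, \theta_1\}$, signals $T_1 = \{t_0, t_1\}$, and an accuracy parameter $q \in [\tfrac{1}{2}, 1]$ with
$$
\pi(t_0 \mid \theta_0) = \pi(t_1 \mid \theta_1) = q, \qquad \pi(t_1 \mid \theta_0) = \pi(t_0 \mid \theta_1) = 1 - q.
$$
At $q = 1$ the signal fully reveals the state, while at $q = \tfrac{1}{2}$ it is uninformative; this is the sense in which I read signals as being more or less informative.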
My question:
I interpret $\pi(t \mid \theta)$ as the probability that, conditional on the realisation $\theta$ of the state, the players receive the signals $t_1, \dots, t_N$, respectively. Depending on the information structure, these signals are more or less informative about the state.
If this interpretation is correct, then I'm confused by the first sentence on p. 7 of the linked paper: "If there is complete information, i.e., if $\Theta$ is a singleton, [...]".
My first guess was that complete information should be characterised by specifying $S$, not by restricting the support of the state. In other words, I thought that complete information corresponds to the information structure $\bar{S} \equiv (\bar{T}, \bar{\pi})$ such that
- for each player $i$, $\bar{T}_i = \Theta$;
- $\bar{\pi}(\theta \mid \theta) = 1$ for every $\theta \in \Theta$ (see the two-state illustration below).
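(A two-state illustration of mine, not from the paper: if $\Theta = \{\theta_0, \theta_1\}$, then $\bar{T}_i = \{\theta_0, \theta_1\}$ for every player and
$$
\bar{\pi}\big((\theta_0, \dots, \theta_0) \mid \theta_0\big) = \bar{\pi}\big((\theta_1, \dots, \theta_1) \mid \theta_1\big) = 1,
$$
so every player learns the realised state with certainty.)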
Where am I wrong? Why do the authors define complete information as $|\Theta| = 1$?
game-theory bayesian-game cooperative-game-theory
asked May 9 at 16:11 by user3285148 (edited May 9 at 16:28)
Maybe I am missing something, but if the set of states is a singleton there is no uncertainty (i.e. complete information), right? There is only one state, which by definition is realized.
– user20105, May 9 at 16:36
That is correct. I understand that $|\Theta| = 1$ implies complete information. But $\bar{S}$ as I define it also implies complete information. I'm just wondering why the authors prefer to characterise complete information using $|\Theta| = 1$ rather than $\bar{S}$; $\bar{S}$ seems more natural to me. Or am I making a mistake somewhere?
– user3285148, May 9 at 16:42
I posted this as an answer; maybe it will help.
– user20105, May 9 at 17:15
2 Answers
The two formulations are equivalent in the sense that if every type of player always learns the state of the world, there is no uncertainty, so there is really no need to carry the realized state around as a variable. By assuming $|\Theta| = 1$ you simplify the notation without losing generality (of course you lose some information, but this is irrelevant for the purposes of that paper). If you want, you can read the authors' statements as holding state by state.
– Regio, answered May 9 at 17:18
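One way to unpack the state-by-state reading (my own elaboration, not part of the answer above): under the complete information structure $\bar{S}$ proposed in the question, the only signal profile that occurs with positive probability in state $\theta$ is $t = (\theta, \dots, \theta)$, so a decision rule $\sigma: T \times \Theta \rightarrow \Delta(A)$ effectively reduces to one object per state,
$$
\sigma_\theta \equiv \sigma\big((\theta, \dots, \theta), \theta\big) \in \Delta(A), \qquad \theta \in \Theta,
$$
and each $\sigma_\theta$ can be analysed as a decision rule of a complete information game whose state space is the singleton $\{\theta\}$, i.e. a game with $|\Theta| = 1$.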
What you are defining is the complete information structure $\bar{S}$ with $\bar{T}_i = \Theta$ for all $i$ and
$$
\bar{\pi}(t \mid \theta) =
\begin{cases}
1, & \text{if } t_i = \theta \text{ for all } i, \\
0, & \text{otherwise,}
\end{cases}
$$
for all $\theta \in \Theta$.
Note that this is not the same as the case in which $\Theta$ is a singleton; it is rather an extreme information structure. The same authors define it here ("Bayes Correlated Equilibrium and the Comparison of Information Structures", Bergemann & Morris); it beats me why they didn't include this detail in the paper you are looking at.
– user20105, answered May 9 at 17:14
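To spell out why $\bar{S}$ also delivers complete information in the everyday sense (a short derivation of mine, not from the answer above): with a full-support prior $\psi$ and the $\bar{\pi}$ displayed above, player $i$'s posterior after observing the signal $t_i = \theta$ is
$$
\Pr(\theta' \mid t_i = \theta) = \frac{\psi(\theta')\,\mathbf{1}\{\theta' = \theta\}}{\psi(\theta)} = \mathbf{1}\{\theta' = \theta\},
$$
so every player is certain of the realised state even though $\Theta$ need not be a singleton, which is exactly the distinction this answer draws.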