Université de Genève, 2000, 125 pp.
Dynamic games are mathematical models of the interaction between different agents who control a dynamical system. Such situations occur in many instances, such as armed conflicts (e.g. a duel between a bomber and a jet fighter), economic competition (e.g. R&D investments by computer companies), and parlor games (Chess, Bridge). These examples concern dynamical systems because the actions of the agents (also called players) influence the evolution over time of the state of the system (the position and velocity of the aircraft, the stock of know-how of high-tech firms, the positions of the remaining pieces on a chess board, etc.). The difficulty in deciding how these agents should behave stems from the fact that each action an agent takes at a given time will influence the reactions of the opponent(s) at later times. These notes present the basic concepts and models proposed in the burgeoning game-theory literature for representing these dynamic interactions.
I Elements of Classical Game Theory
Decision Analysis with Many Agents
Solution Concepts for Noncooperative Games
II Repeated and Sequential Games
Repeated Games and Memory Strategies
Shapley's Zero-Sum Markov Game
Nonzero-Sum Markov and Sequential Games
III Differential Games
Controlled Dynamical Systems
IV A Differential Game Model