Publication:

Approximate information maximization for bandit games


Published: 06 Oct 2023

Alex Barbier-Chebbah, Christian L. Vestergaard, Jean-Baptiste Masson, Etienne Boursier

Link to HAL – hal-04264220 (2023)

Entropy maximization and free energy minimization are general physical principles for modeling the dynamics of various physical systems. Notable examples include modeling decision-making within the brain using the free-energy principle, optimizing the accuracy-complexity trade-off when accessing hidden variables with the information bottleneck principle (Tishby et al., 2000), and navigation in random environments using information maximization (Vergassola et al., 2007). Building on this principle, we propose a new class of bandit algorithms that maximize an approximation to the information of a key variable within the system. To this end, we develop an approximated analytical physics-based representation of an entropy to forecast the information gain of each action and greedily choose the one with the largest information gain. This method yields strong performance in classical bandit settings. Motivated by its empirical success, we prove its asymptotic optimality for the two-armed bandit problem with Gaussian rewards. Owing to its ability to encompass the system’s properties in a global physical functional, this approach can be efficiently adapted to more complex bandit settings, calling for further investigation of information maximization approaches for multi-armed bandit problems.
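
The abstract describes a general recipe: maintain a belief over each arm, forecast the information gain of each possible pull, and greedily choose the arm with the largest gain. The sketch below illustrates that recipe for a two-armed Gaussian bandit. It is not the paper's analytical approximation: it estimates the entropy of the best-arm identity by Monte Carlo over conjugate Gaussian posteriors, and the function names, priors, and sampling parameters are assumptions made only for this example.

```python
# Minimal sketch of a greedy information-maximization bandit for two arms with
# Gaussian rewards. NOT the paper's method: the paper uses an approximated
# analytical entropy, whereas this sketch estimates it by Monte Carlo.
import numpy as np

rng = np.random.default_rng(0)

def best_arm_entropy(mu, var, n_samples=1000):
    """Binary entropy (nats) of the event 'arm 0 is best' under the current
    Gaussian posteriors (this is the exact best-arm entropy for two arms)."""
    draws = rng.normal(mu, np.sqrt(var), size=(n_samples, len(mu)))
    p = np.mean(np.argmax(draws, axis=1) == 0)
    p = np.clip(p, 1e-6, 1 - 1e-6)
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))

def expected_entropy_after(arm, mu, var, noise_var=1.0, n_outer=50):
    """Average posterior entropy after one hypothetical pull of `arm`."""
    total = 0.0
    for _ in range(n_outer):
        # Simulate a reward from the posterior predictive of `arm`.
        r = rng.normal(mu[arm], np.sqrt(var[arm] + noise_var))
        # Conjugate Gaussian update for that hypothetical observation.
        post_var = 1.0 / (1.0 / var[arm] + 1.0 / noise_var)
        post_mu = post_var * (mu[arm] / var[arm] + r / noise_var)
        mu2, var2 = mu.copy(), var.copy()
        mu2[arm], var2[arm] = post_mu, post_var
        total += best_arm_entropy(mu2, var2)
    return total / n_outer

def info_max_bandit(true_means, horizon=100, noise_var=1.0):
    """Greedily pull the arm whose observation is forecast to reduce the
    entropy of the best-arm identity the most."""
    k = len(true_means)
    mu, var = np.zeros(k), np.full(k, 10.0)  # broad Gaussian priors (assumed)
    total_reward = 0.0
    for _ in range(horizon):
        base = best_arm_entropy(mu, var)
        gains = [base - expected_entropy_after(a, mu, var, noise_var)
                 for a in range(k)]
        a = int(np.argmax(gains))             # greedy information-gain choice
        r = rng.normal(true_means[a], np.sqrt(noise_var))
        post_var = 1.0 / (1.0 / var[a] + 1.0 / noise_var)
        mu[a] = post_var * (mu[a] / var[a] + r / noise_var)
        var[a] = post_var
        total_reward += r
    return total_reward, mu

if __name__ == "__main__":
    total, est = info_max_bandit([0.2, 0.5])
    print(f"total reward: {total:.1f}, posterior means: {est}")
```

This sketch only illustrates the greedy information-gain selection rule; it does not reproduce the asymptotic optimality result, which in the paper relies on the specific analytical approximation of the entropy.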