
Reward foraging task and model-based analysis reveal how fruit flies learn value of available options

Research output: Contribution to journal › Journal article › Research › peer-review

Standard

Reward foraging task and model-based analysis reveal how fruit flies learn value of available options. / Seidenbecher, Sophie E.; Sanders, Joshua I.; von Philipsborn, Anne C.; Kvitsiani, Duda.

In: PLOS ONE, Vol. 15, No. 10 October, e0239616, 10.2020.



Bibtex

@article{63f43dc7d4d9495ba464c6eefea5641c,
title = "Reward foraging task and model-based analysis reveal how fruit flies learn value of available options",
abstract = "Foraging animals have to evaluate, compare and select food patches in order to increase their fitness. Understanding what drives foraging decisions requires careful manipulation of the value of alternative options while monitoring animals' choices. Value-based decision-making tasks, in combination with formal learning models, have provided both an experimental and a theoretical framework to study foraging decisions in lab settings. While these approaches were successfully used in the past to understand what drives choices in mammals, very little work has been done on fruit flies, despite the fact that fruit flies have served as a model organism for many complex behavioural paradigms. To fill this gap we developed a single-animal, trial-based decision-making task in which freely walking flies experienced optogenetic sugar-receptor neuron stimulation. We controlled the value of the available options by manipulating the probabilities of optogenetic stimulation. We show that flies integrate the reward history of chosen options and forget the value of unchosen options. We further discover that flies assign higher values to rewards experienced early in the behavioural session, consistent with formal reinforcement learning models. Finally, we show that probabilistic rewards affect the walking trajectories of flies, suggesting that accumulated value controls the navigation vector of flies in a graded fashion. These findings establish the fruit fly as a model organism for exploring the genetic and circuit basis of reward foraging decisions.",
author = "Seidenbecher, {Sophie E.} and Sanders, {Joshua I.} and {von Philipsborn}, {Anne C.} and Duda Kvitsiani",
year = "2020",
month = oct,
doi = "10.1371/journal.pone.0239616",
language = "English",
volume = "15",
journal = "PLOS ONE",
issn = "1932-6203",
publisher = "Public Library of Science",
number = "10 October",
}

RIS

TY - JOUR

T1 - Reward foraging task and model-based analysis reveal how fruit flies learn value of available options

AU - Seidenbecher, Sophie E.

AU - Sanders, Joshua I.

AU - von Philipsborn, Anne C.

AU - Kvitsiani, Duda

PY - 2020/10

Y1 - 2020/10

N2 - Foraging animals have to evaluate, compare and select food patches in order to increase their fitness. Understanding what drives foraging decisions requires careful manipulation of the value of alternative options while monitoring animals' choices. Value-based decision-making tasks, in combination with formal learning models, have provided both an experimental and a theoretical framework to study foraging decisions in lab settings. While these approaches were successfully used in the past to understand what drives choices in mammals, very little work has been done on fruit flies, despite the fact that fruit flies have served as a model organism for many complex behavioural paradigms. To fill this gap we developed a single-animal, trial-based decision-making task in which freely walking flies experienced optogenetic sugar-receptor neuron stimulation. We controlled the value of the available options by manipulating the probabilities of optogenetic stimulation. We show that flies integrate the reward history of chosen options and forget the value of unchosen options. We further discover that flies assign higher values to rewards experienced early in the behavioural session, consistent with formal reinforcement learning models. Finally, we show that probabilistic rewards affect the walking trajectories of flies, suggesting that accumulated value controls the navigation vector of flies in a graded fashion. These findings establish the fruit fly as a model organism for exploring the genetic and circuit basis of reward foraging decisions.

AB - Foraging animals have to evaluate, compare and select food patches in order to increase their fitness. Understanding what drives foraging decisions requires careful manipulation of the value of alternative options while monitoring animals' choices. Value-based decision-making tasks, in combination with formal learning models, have provided both an experimental and a theoretical framework to study foraging decisions in lab settings. While these approaches were successfully used in the past to understand what drives choices in mammals, very little work has been done on fruit flies, despite the fact that fruit flies have served as a model organism for many complex behavioural paradigms. To fill this gap we developed a single-animal, trial-based decision-making task in which freely walking flies experienced optogenetic sugar-receptor neuron stimulation. We controlled the value of the available options by manipulating the probabilities of optogenetic stimulation. We show that flies integrate the reward history of chosen options and forget the value of unchosen options. We further discover that flies assign higher values to rewards experienced early in the behavioural session, consistent with formal reinforcement learning models. Finally, we show that probabilistic rewards affect the walking trajectories of flies, suggesting that accumulated value controls the navigation vector of flies in a graded fashion. These findings establish the fruit fly as a model organism for exploring the genetic and circuit basis of reward foraging decisions.

UR - http://www.scopus.com/inward/record.url?scp=85092225838&partnerID=8YFLogxK

U2 - 10.1371/journal.pone.0239616

DO - 10.1371/journal.pone.0239616

M3 - Journal article

C2 - 33007023

AN - SCOPUS:85092225838

VL - 15

JO - PLOS ONE

JF - PLOS ONE

SN - 1932-6203

IS - 10 October

M1 - e0239616

ER -