The mesolimbic and mesocortical dopamine pathways serve as the neurobiological foundation for reward processing and motivated behavior. Originating in the ventral tegmental area, these pathways project to the nucleus accumbens, striatum, and prefrontal cortex, regions critical for evaluating reward value and guiding decision-making (Schultz, 2016). Dopamine neurons exhibit a remarkable property: they encode prediction errors, signaling the difference between expected and received rewards. This prediction-error signal is central to reinforcement learning, allowing organisms to adjust behavior according to reward outcomes.
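The core of this computation can be illustrated with a minimal sketch of a Rescorla-Wagner-style value update, written here in Python; the learning rate and reward sequence are illustrative assumptions rather than parameters from any cited study.

# Minimal sketch of a reward prediction error (RPE) update, in the spirit of
# temporal-difference accounts of dopamine signaling. Learning rate and reward
# values are illustrative assumptions, not parameters from any cited study.
def update_value(expected, received, learning_rate=0.1):
    """Return the prediction error and the updated value estimate."""
    prediction_error = received - expected            # dopamine-like error signal
    new_expected = expected + learning_rate * prediction_error
    return prediction_error, new_expected

expected_value = 0.0
for trial, reward in enumerate([1.0, 1.0, 0.0, 1.0], start=1):
    rpe, expected_value = update_value(expected_value, reward)
    print(f"trial {trial}: reward={reward:.1f}  RPE={rpe:+.2f}  expectation={expected_value:.2f}")

Positive prediction errors (outcomes better than expected) increase the learned value and negative errors decrease it, which is the basic logic by which phasic dopamine responses are thought to drive reinforcement learning.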
Electrophysiological studies in non-human primates have provided a direct demonstration of dopamine’s role in temporal discounting. Kobayashi and Schultz (2008) showed that midbrain dopaminergic neurons respond to rewards in a way that varies with both magnitude and delay: responses increased monotonically with reward magnitude but declined as the delay to reward lengthened. This pattern mirrors the subjective undervaluation of delayed rewards observed in behavioral studies of temporal discounting and points to a neurobiological basis for the effect.
Temporal discounting is a core feature of decision-making that describes how individuals weigh immediate against delayed rewards. It is typically measured with choice tasks in which people choose between smaller-sooner and larger-later rewards. The rate at which future rewards are discounted, the discount rate, varies substantially across individuals and can be described mathematically with hyperbolic or exponential functions (Frederick et al., 2002).
The hyperbolic discounting model is given by the equation V = A/(1 + kD), in which V is the subjective value, A is the reward magnitude, D is the delay, and k is the discount rate. Larger values of k correspond to steeper discounting and a stronger preference for immediate rewards. The discount rate is a robust quantitative index of impulsive choice: higher rates have been associated with addiction, pathological gambling, and other conditions marked by impulsive decision-making.
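To make the equation concrete, the short sketch below computes subjective values for a hypothetical choice between an immediate and a delayed reward across several discount rates; the amounts, delay, and k values are arbitrary illustrative numbers, not data from any cited study.

# Hyperbolic discounting: V = A / (1 + k * D)
# Amounts, delay, and k values are arbitrary illustrative numbers.
def hyperbolic_value(amount, delay_days, k):
    """Subjective value of a reward of size `amount` received after `delay_days`."""
    return amount / (1.0 + k * delay_days)

immediate = 50.0                      # smaller-sooner option, available now
delayed_amount, delay = 100.0, 30.0   # larger-later option, 30 days away

for k in (0.01, 0.05, 0.20):          # shallow to steep discounting
    v_delayed = hyperbolic_value(delayed_amount, delay, k)
    choice = "larger-later" if v_delayed > immediate else "smaller-sooner"
    print(f"k={k:.2f}: V(delayed) = {v_delayed:5.1f} -> prefers {choice}")

At k = 0.01 the delayed 100 is still worth about 77 and is preferred, whereas at k = 0.20 it is worth only about 14, so the immediate 50 wins; this is how larger k values translate into impulsive choice.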
Parkinson’s disease (PD) is a particularly valuable model for studying the role of dopamine in decision-making. The disease is characterized by progressive degeneration of dopaminergic neurons in the substantia nigra pars compacta, with 60–80% of these cells lost before symptoms become clinically apparent (Kish et al., 1988). The degeneration is regionally selective, initially affecting dopaminergic input to the dorsal striatum before progressing to the ventral striatum and other dopaminergic systems.
Dopamine replacement therapy, the standard treatment for Parkinson’s disease, affords a unique opportunity to investigate the impact of dopaminergic modulation on cognition. By testing patients in both medicated (ON) and unmedicated (OFF) states, researchers can directly compare the acute effects of dopamine manipulation on decision-making within the same individuals. This within-subject design provides strong evidence for a causal relationship between dopamine and temporal discounting while controlling for the inter-individual differences that can confound between-group comparisons.
Strong evidence that PD fundamentally alters temporal discounting comes from a systematic meta-analysis by Pennisi et al. (2020). Examining data from several studies, the authors found that patients with PD showed significantly steeper delay discounting than age-matched healthy controls. This held both when patients were ON their dopaminergic medication and when they were briefly withdrawn from it (OFF state), suggesting that the pathophysiology of PD itself, and not just medication effects, underlies the altered intertemporal choice. The most striking finding of the meta-analysis was that PD patients with medication-induced behavioral addictions, such as pathological gambling or compulsive shopping, showed even steeper discounting than PD patients without them.

Voon et al. (2010) tested this association directly and found that PD patients with impulse control disorders had discount rates approximately twice as steep as those without. These patients strongly preferred the smaller reward now over the larger reward later, even when the latter was far more valuable in the long run. This pattern is consistent with a complex interaction between disease pathology, dopaminergic treatment, and patient vulnerability factors: the elevated temporal discounting observed in patients with impulse control disorders may reflect an underlying vulnerability that is exposed or amplified by dopaminergic medication, rather than being caused by the medication itself.
The effects of dopaminergic medication on temporal discounting provide some of the most direct evidence for dopamine’s causal role in this process. A methodologically rigorous study by Wagner et al. (2024) administered L-DOPA to 76 participants in a double-blind, placebo-controlled crossover design. L-DOPA reliably reduced discount rates, albeit with small effect sizes, indicating that dopamine replacement made participants more willing to wait for larger future rewards. Importantly, the study’s large sample and pre-registered design address the limited statistical power and researcher degrees of freedom that complicated the interpretation of earlier, smaller studies.
Foerde et al. (2016) provided further evidence by directly comparing PD patients in ON and OFF medication states using a temporal discounting task. They found that patients made more farsighted choices when ON dopaminergic medication compared to when OFF. Specifically, participants showed reduced devaluation of reward as a function of time delay when dopamine was restored. The authors used computational modelling to demonstrate that the effect was caused by changes in the subjective value assigned to delayed reward, and not by variations in choice consistency or other choice parameters.
These pharmacological effects appear to be independent of improvements in motor function, suggesting that dopamine specifically enhances the subjective value of delayed rewards. This dissociation between cognitive and motor effects of dopaminergic medication is crucial for establishing that changes in temporal discounting reflect genuine shifts in decision-making rather than improved motor output that merely facilitates expression of existing preferences.
Neuroimaging research has offered valuable insight into the neural systems through which dopamine influences temporal discounting. Joutsa et al. (2015) used positron emission tomography (PET) to assess dopamine synthesis capacity in patients with PD and found that temporal discounting was associated specifically with dopaminergic terminal function in the left caudate nucleus. This finding directly links a quantitative measure of dopaminergic neurotransmission to individual differences in temporal discounting behaviour, providing a neuroanatomical basis for the behavioral observations.
The striatum is richly innervated by dopaminergic fibres and plays a critical role in reward processing and decision-making. Dopaminergic degeneration in PD progresses from the dorsal to the ventral striatum, with the caudate affected earlier than the nucleus accumbens. This pattern of degeneration may explain the specific profile of decision-making deficits observed in PD, with alterations in executive function and planning (associated with dorsal striatal circuits) often preceding changes in reward sensitivity and motivation (associated with ventral striatal circuits). Functional magnetic resonance imaging (fMRI) studies have also shown that PD patients exhibit disrupted patterns of activation in the ventromedial prefrontal cortex, ventral striatum, and amygdala during reward-based decision tasks (Politis et al., 2013). These regions form a network involved in representing subjective value and integrating reward information over time, and disrupted function of this network in PD likely contributes to the altered temporal discounting observed behaviourally.

Dopaminergic Function and Temporal Discounting (from Joutsa et al., 2015)
This figure illustrates the relationship between dopamine synthetic capacity in the left caudate and temporal discounting behaviour in Parkinson’s disease patients. Positron emission tomography (PET) scans highlight areas of dopamine production, with warmer colours (e.g., red, yellow) indicating higher dopaminergic activity. The accompanying scatter plot shows a negative correlation between caudate dopamine levels and discount rates, demonstrating that lower dopamine function is associated with steeper discounting (i.e., a stronger preference for immediate rewards). These findings support the role of striatal dopamine in modulating intertemporal decision-making.
Dopamine modulates temporal discounting by influencing neural circuits specialized for reward valuation and cognitive control. Le Bouc et al. (2016) conducted a sophisticated study dissociating the motor and motivational functions of dopamine in PD patients. They found that dopamine depletion reduced patients’ peak motor responses to reward, independent of reward magnitude. This suggests that dopamine exerts a general influence on reward-driven behaviour, which in turn may shape how patients weigh immediate against delayed rewards.
The authors employed computational modelling to distinguish between several possible mechanisms by which dopamine might affect effort-based decision-making. Their results suggested that dopamine specifically modulates the subjective value of rewards relative to the effort required to obtain them, a process computationally similar to how dopamine may modulate the subjective value of delayed versus immediate rewards in temporal discounting tasks.
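The parallel can be made explicit with a simple illustration in which reward value is devalued by the effort required to obtain it, using the same hyperbolic form introduced above for delay; this form and its parameters are assumptions for illustration, not the specific model fitted by Le Bouc et al. (2016).

# Illustrative parallel between delay discounting and effort discounting.
# The hyperbolic forms and parameter values are assumptions for illustration,
# not the model fitted by Le Bouc et al. (2016).
def delay_discounted_value(amount, delay, k_delay):
    return amount / (1.0 + k_delay * delay)

def effort_discounted_value(amount, effort, k_effort):
    return amount / (1.0 + k_effort * effort)

reward = 100.0
print(delay_discounted_value(reward, delay=30.0, k_delay=0.05))    # devalued by waiting
print(effort_discounted_value(reward, effort=30.0, k_effort=0.05)) # devalued by exertion

In both cases a cost, whether waiting time or physical effort, scales down the subjective value of the same nominal reward, which is why dopaminergic effects on effort-based valuation are often taken to be informative about delay-based valuation as well.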
The modulatory effects of dopamine extend beyond the striatum to cortical regions, particularly the prefrontal cortex. Moustafa and Gluck (2011) developed a neurocomputational model demonstrating that prefrontal dopamine is important for attentional performance, working memory, and other cognitive functions critical for evaluating and comparing rewards across different time points. Their model showed that dopaminergic medication enhances these aspects of executive function in PD patients, which may contribute to an improved ability to consider future rewards when making intertemporal choices.
Computational models have provided an informative account of how dopamine dysfunction contributes to altered temporal discounting. Reinforcement learning models cast dopamine as a prediction-error signal that updates the expected value of actions and outcomes (Montague et al., 1996). In these models, dopamine signals the difference between received and expected rewards, allowing organisms to learn which actions lead to valuable outcomes.
In the context of temporal discounting, these models suggest that dopamine plays a crucial role in representing the value of future rewards. When dopamine signalling is disrupted, as in PD, the ability to accurately represent and compare the value of immediate versus delayed rewards becomes impaired.
This computational framework helps explain why dopamine depletion might lead to steeper discounting: the neural representation of delayed rewards becomes disproportionately weakened relative to that of immediate rewards.

More recent computational work has employed hierarchical Bayesian estimation to model the decision process as well as choice performance in temporal discounting tasks. Peters and Büchel (2011) demonstrated that temporal discounting data are best explained by nonlinear drift-diffusion models of temporal discounting. These models suggest that dopamine influences both the rate of evidence accumulation (drift rate) and the decision boundary when choosing between an immediate and a delayed reward, offering a mechanistic account of how dopamine modulates the decision process itself.
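A rough sketch of this idea is shown below: a drift-diffusion process whose drift rate scales with the difference in hyperbolically discounted values of the two options. The scaling factor, noise level, and threshold are illustrative assumptions, not the specific model or parameters of Peters and Büchel (2011).

# Schematic drift-diffusion simulation of a single intertemporal choice.
# Drift is scaled by the difference in hyperbolically discounted values;
# all parameters are illustrative assumptions, not fitted values.
import random

def discounted_value(amount, delay, k):
    return amount / (1.0 + k * delay)

def simulate_choice(v_later, v_sooner, drift_scale=0.02, noise=0.1, threshold=1.0, seed=None):
    """Accumulate noisy evidence; +threshold -> larger-later, -threshold -> smaller-sooner."""
    rng = random.Random(seed)
    drift = drift_scale * (v_later - v_sooner)   # the term dopamine is proposed to modulate
    evidence, steps = 0.0, 0
    while abs(evidence) < threshold:
        evidence += drift + rng.gauss(0.0, noise)
        steps += 1
    return ("larger-later" if evidence > 0 else "smaller-sooner"), steps

v_later = discounted_value(100.0, delay=30.0, k=0.02)   # delayed option
v_sooner = discounted_value(50.0, delay=0.0, k=0.02)    # immediate option
choice, response_steps = simulate_choice(v_later, v_sooner, seed=1)
print(choice, response_steps)

In this framing, a dopaminergic change in drift rate alters how quickly and how consistently the valuation difference is translated into a choice, while a change in the decision boundary trades speed against choice consistency.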
One of the most important insights from computational approaches is the recognition that dopamine’s effects on cognition follow an inverted U-shaped function. Castrellon et al. (2019) conducted a systematic review demonstrating that both too little and too much dopamine can impair optimal decision-making. This non-linear relationship helps explain seemingly contradictory findings in the literature, where both dopamine agonists and antagonists have sometimes been reported to increase impulsivity. In PD, this inverted U-shaped relationship is of particular importance. Early in the disease, when dopamine depletion is moderate, replacement therapy can restore function toward the optimal level and improve decision-making. As the disease progresses and dopamine loss worsens, however, the same amount of medication may push some brain regions, particularly those less affected by the disease, beyond their optimal level and thereby worsen certain aspects of decision-making.
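One way to picture this logic is with a schematic inverted-U curve; the Gaussian shape, baseline levels, and dose increment below are illustrative assumptions, not a fitted dose-response relationship.

# Schematic inverted-U relationship between dopamine level and decision-making
# performance. Shape, baselines, and dose are illustrative assumptions only.
import math

def performance(dopamine_level, optimum=1.0, width=0.4):
    """Inverted-U: performance peaks at `optimum` and falls off on either side."""
    return math.exp(-((dopamine_level - optimum) ** 2) / (2 * width ** 2))

dose = 0.5                # the same medication increment applied to both circuits
circuits = {"severely depleted circuit": 0.3, "relatively spared circuit": 0.9}

for name, baseline in circuits.items():
    before, after = performance(baseline), performance(baseline + dose)
    print(f"{name}: performance {before:.2f} -> {after:.2f}")

Under these assumptions the same dose moves the depleted circuit toward the peak of the curve (performance improves) while pushing the relatively spared circuit past it (performance declines), which is one way to visualize why a fixed medication regimen can help some functions and harm others.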
This framework helps explain why different subcomponents of reward processing are differentially affected by dopamine state in PD. Meder et al. (2019) found that option valuation and the vigor of reward responses were impaired when PD patients were OFF medication, whereas reinforcement learning was impaired when they were ON medication. This pattern suggests that dopamine replacement therapy may have a “sweet spot” for optimal cognitive functioning, with different aspects of reward-based decision-making having different optimal levels of dopaminergic stimulation.

Computational Model of Dopamine’s Influence on Reward-Based Decision-Making (from Meder et al., 2019)
This figure presents a reinforcement learning-based computational model demonstrating how dopamine replacement therapy (DRT) affects different aspects of decision-making in Parkinson’s disease. The top graph illustrates an inverted U-shaped function, where moderate dopamine levels optimize reward-based learning, but excessive or deficient dopamine impairs decision-making. The bottom graph shows simulation results comparing ON- and OFF-medication states, highlighting differences in option valuation, reward response vigour, and learning rates. These results provide a mechanistic explanation for the paradoxical effects of dopamine therapy on impulsivity and cognitive control.
Despite accumulating evidence for dopamine’s involvement in temporal discounting, there is still significant heterogeneity in the direction of reported drug effects. Petzold et al. (2019) found that baseline impulsivity moderated the effect of L-DOPA on value-based decision-making, with more impulsive participants responding differently to dopaminergic stimulation than less impulsive participants. This indicates that between-subject differences in baseline dopaminergic activity could be pivotal for predicting responses to dopaminergic manipulation.
The relationship among dopamine, temporal discounting, and impulse control disorders (ICDs) in PD also needs clarification. Although ICDs are frequent neuropsychiatric adverse effects of dopamine replacement therapy, Martini et al. (2018) reported in their meta-analysis no significant difference in reward processing impairment between PD patients with and without ICDs. This implies that the mechanisms underlying ICDs are more complex than altered reward processing alone and may also involve other neurotransmitter systems or individual vulnerability factors.
The pharmacological heterogeneity of the drugs whose effects on temporal discounting have been studied is another limitation of current research. Levodopa and dopamine agonists differ in their pharmacology and receptor binding profiles and, consequently, may have different effects on decision-making. Voon et al. (2017) emphasized that dopamine agonists with preferential D2/D3 receptor stimulation are more strongly associated with ICDs than levodopa, which stimulates both the D1 and D2 receptor families. Such pharmacological specificity may be critical for clarifying dopamine’s effects on temporal discounting.
The vast majority of research has focused on the acute effects of dopaminergic drugs on temporal discounting; far less has examined chronic effects or how effects change with disease progression. Disease stage, duration of dopamine therapy, and the incidence of motor fluctuations also vary across studies, which may account for some of the discrepancies. In addition, the potential contribution of non-dopaminergic neurotransmitter systems to temporal discounting in PD remains poorly researched, even though serotonergic and noradrenergic systems are also known to modulate aspects of impulsivity and decision-making.
Longitudinal Studies and Disease Progression
Future research should track changes in temporal discounting throughout PD progression and treatment. Longitudinal studies would help clarify how intertemporal choice patterns evolve with disease progression and long-term dopamine replacement therapy. Weintraub et al. (2010) conducted a large cross-sectional study of 3,090 PD patients and reported that ICDs were associated with dopamine agonist therapy, younger age, and earlier age at disease onset. Longitudinal follow-ups of such research could investigate whether early alterations in temporal discounting serve as predictive markers for the development of ICDs or other neuropsychiatric symptoms. Understanding the temporal dynamics of these changes would provide valuable insight into the progressive nature of cognitive alterations in PD. As dopaminergic degeneration advances from the dorsal to the ventral striatum, different aspects of decision-making may become impaired at different stages of the disease. Tracking these changes longitudinally would provide a more complete picture of how dopamine dysfunction affects temporal discounting throughout the course of the disease.
Because patients respond variably to dopaminergic therapy, future research should also focus on tailoring dopamine replacement therapy regimens to individual patients. Nombela et al. (2014) identified several distinct types of impulsivity in PD, implying that different patients may exhibit different forms of impulsive behaviour. This heterogeneity underscores the need for individualized strategies to manage cognitive and behavioural symptoms in PD.
Identifying markers or behavioural measures that predict how individuals respond to dopaminergic drugs would be particularly valuable. Pharmacogenetic studies investigate how differences in dopamine receptor and transporter genes affect drug response. For instance, polymorphisms in the DRD2 and DRD3 genes, which encode dopamine receptors, have been linked to the risk of impulse control disorders in individuals with Parkinson’s disease treated with dopamine agonists. Such genetic markers could help identify patients who are more likely to experience decision-making problems while on dopaminergic treatment.
The evidence from Parkinson’s disease research reveals a fundamental point about dopamine’s role in decision-making: dopamine is not simply a mechanism for encoding reward magnitude, but a key regulator of how the brain values rewards over time. Disruptions in dopamine signaling, whether caused by neurodegeneration, medication, or other factors, alter the weighing of immediate and delayed rewards in predictable ways. Dopamine loss, as in Parkinson’s disease, increases the preference for immediate rewards by making it difficult to assign value to future ones; intact dopamine function is therefore critical for sustaining the cognitive processes involved in planning, such as working memory and learning from rewards. Conversely, dopaminergic overstimulation, particularly within certain circuits, promotes an impulsive preference for immediate rewards, a pattern seen in conditions in which impulse control breaks down. This bidirectional effect, in which both too little and too much dopamine alter intertemporal choice, reflects an important principle of brain function: dopamine helps the brain calibrate the value of rewards over time so that it can balance short-term demands against long-term benefits. When dopamine homeostasis is disrupted, this calibration is compromised, leading to the maladaptive decision-making seen in conditions ranging from addiction to schizophrenia.

These findings have implications well beyond PD. They imply that disorders of impulsivity, compulsivity, and reward dysfunction can be viewed, at least in part, as disruptions of the brain’s capacity to integrate temporal information into decision-making. As research advances, targeting dopamine’s role in temporal cognition could pave the way for novel treatments for neuropsychiatric conditions characterized by impulsive or maladaptive reward preferences.