<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article
  PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.0 20120330//EN" "JATS-journalpublishing1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" article-type="research-article" dtd-version="1.0" xml:lang="en">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">JM</journal-id>
<journal-id journal-id-type="nlm-ta">Jahrb Musik</journal-id>
<journal-title-group>
<journal-title>Jahrbuch Musikpsychologie</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Jahrb. Musik.</abbrev-journal-title>
</journal-title-group>
<issn pub-type="epub">2569-5665</issn>
<publisher><publisher-name>PsychOpen</publisher-name></publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">jbdgm.2019v29.67</article-id>
<article-id pub-id-type="doi">10.5964/jbdgm.2019v29.67</article-id>
<article-categories><subj-group subj-group-type="heading"><subject>Forschungsberichte zum Themenschwerpunkt</subject></subj-group></article-categories>
<title-group>
<article-title>Time as the Ink That Music Is Written With: A Review of Internal Clock Models and Their Explanatory Power in Audiovisual Perception</article-title>
<trans-title-group xml:lang="de">
<trans-title>Zeit als Grundlage der Musik: Ein Überblick zu Modellen innerer Uhren und deren Erklärungswert für die audiovisuelle Wahrnehmung</trans-title>
</trans-title-group>
<alt-title alt-title-type="right-running">A Review of Internal Clock Models</alt-title>
<alt-title specific-use="APA-reference-style" xml:lang="en">Time as the ink that music is written with: A review of internal clock models and their explanatory power in audiovisual perception</alt-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes"><name name-style="western"><surname>Wang</surname><given-names>Xinyue</given-names></name><xref ref-type="corresp" rid="cor1"><sup>*</sup></xref><xref ref-type="aff" rid="aff1"><sup>a</sup></xref></contrib>
<contrib contrib-type="author"><name name-style="western"><surname>Wöllner</surname><given-names>Clemens</given-names></name><xref ref-type="aff" rid="aff1"><sup>a</sup></xref></contrib>
 <contrib contrib-type="reviewer"><name name-style="western"><surname>Auhagen</surname><given-names>Wolfgang</given-names></name></contrib>
 <contrib contrib-type="reviewer"><name name-style="western"><surname>Rötter</surname><given-names>Günther</given-names></name></contrib>
<aff id="aff1"><label>a</label>Institut für Systematische Musikwissenschaft, <institution>Universität Hamburg</institution>, <addr-line>Hamburg</addr-line>, <country>Germany</country></aff>
</contrib-group>
<author-notes>
<corresp id="cor1"><label>*</label>Institut für Systematische Musikwissenschaft, Universität Hamburg, Neue Rabenstr. 13, 20354 Hamburg, Germany. <email xlink:href="xinyue.wang@uni-hamburg.de">xinyue.wang@uni-hamburg.de</email></corresp>
</author-notes>
<pub-date pub-type="epub"><day>01</day><month>07</month><year>2020</year></pub-date>
<pub-date pub-type="collection" publication-format="electronic"><year>2020</year></pub-date>
<volume>29</volume>
  <volume-id pub-id-type="title">Musikpsychologie — Musik im audiovisuellen Kontext</volume-id>
<elocation-id>e67</elocation-id>
<history>
<date date-type="received">
<day>30</day>
<month>09</month>
<year>2019</year>
</date>
<date date-type="accepted">
<day>08</day>
<month>05</month>
<year>2020</year>
</date>
</history>
<permissions><copyright-year>2020</copyright-year><copyright-holder>Wang &amp; Wöllner</copyright-holder><license license-type="open-access" specific-use="CC BY 4.0" xlink:href="https://creativecommons.org/licenses/by/4.0/"><license-p>This is an open-access article distributed under the terms of the Creative Commons Attribution (CC BY) 4.0 License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.</license-p></license></permissions>
<abstract>
<p>The current review addresses two internal clock models that have dominated discussions in timing research over the last decades. More specifically, it discusses whether the central or the intrinsic clock model better describes the fluctuations in subjective time. Identifying the timing mechanism is critical to explain and predict timing behaviours in various audiovisual contexts. Music stands out for its prominence in real-life scenarios along with its great potential to alter subjective time. An emphasis on how music, as a complex dynamic auditory signal, affects timing accuracy led us to examine the behavioural and neuropsychological evidence that supports either clock model. In addition to the timing mechanisms, an overview of internal and external variables, such as attention and emotions, as well as the classic experimental paradigms is provided, in order to examine how the mechanisms function in response to changes occurring particularly during music experiences. Neither model can explain the effects of music on subjective timing entirely: The intrinsic model applies primarily to subsecond timing, whereas the central model applies to the suprasecond range. In order to explain time experiences in music, one has to consider the target intervals as well as the contextual factors mentioned above. Further research is needed to bridge the gap between the theories, and suggestions for future empirical studies are outlined.</p>
</abstract><trans-abstract xml:lang="de">
<p>Dieser Überblick befasst sich mit zwei Modellen der inneren Uhr, die in den letzten Jahrzehnten die Diskussion in der Forschung zur Zeitwahrnehmung und -gestaltung bestimmt haben. Insbesondere wird diskutiert, ob das zentrale oder das intrinsische Uhrenmodell Schwankungen der subjektiven Zeit besser erklärt. Dabei ist das Erkennen des zugrundeliegenden Mechanismus' entscheidend, um das Zeiterleben im Nachhinein zu erklären oder in verschiedenen audiovisuellen Kontexten vorherzusagen. Musik zeichnet sich durch ihre Bedeutung in realen Szenarien sowie durch ihr großes Potenzial zur Veränderung des subjektiven Zeiterlebens aus. Musik kann als komplexes dynamisches Audiosignal die zeitliche Genauigkeit beeinflussen. Dies ist der Hintergrund, verhaltensbezogene und neuropsychologische Belege zu diskutieren, die eines der Uhrenmodelle oder beide unterstützen. Neben den Zeitmechanismen wird ein Überblick auf interne und externe Variablen wie Aufmerksamkeit und Emotion, sowie auf klassische experimentelle Paradigmen gegeben. Dadurch wird dargelegt, welche Rolle den Mechanismen zukommt hinsichtlich der Reaktion auf Änderungen im Stimulusmaterial, insbesondere beim Erleben von Musik. Im Ergebnis kann kein Modell die Auswirkungen von Musik auf das subjektive Zeiterleben vollständig erklären. Während das intrinsische Modell in erster Linie das Zeiterleben für sehr kurze Dauern unterhalb einer Sekunde zu erklären vermag, bietet das zentrale Modell einen höheren Erklärungswert für den Suprasekundenbereich, das heißt für das Timing von Sekunden bis Minuten. Um Zeiterfahrungen in der Musik zu erklären, müssen die Zielintervalle sowie die oben genannten Kontextfaktoren berücksichtigt werden. Weitere Forschungen sind erforderlich, um die Kluft zwischen den Theorien zu schließen, wobei Vorschläge für künftige empirische Studien skizziert werden.</p></trans-abstract>
<kwd-group kwd-group-type="author"><kwd>internal clock models</kwd><kwd>Dynamic Attending Theory</kwd><kwd>Scalar Expectancy Theory</kwd><kwd>music perception</kwd><kwd>audiovisual timing</kwd></kwd-group>
<kwd-group kwd-group-type="translator" xml:lang="de"><kwd>Innere Uhrenmodelle</kwd><kwd>Dynamic Attending Theory</kwd><kwd>Scalar Expectancy Theory</kwd><kwd>Musikwahrnehmung</kwd><kwd>audiovisuelles Zeiterleben</kwd></kwd-group>
</article-meta>
</front>
<body>
<sec sec-type="intro"><title></title>
<p>The properties of time have long attracted the interest of researchers. From a Newtonian perspective, time is seen as an arrow flying eternally forward, whereas in classical thermodynamics the passage of time resembles the irreversible increase of entropy, the degree of disorder in the universe (<xref ref-type="bibr" rid="r63">Lieb &amp; Yngvason, 1999</xref>). The psychological study of time perception, by contrast, has taken yet other views. One of the earliest efforts to capture an internal timing system stemmed from doctors’ skill in making accurate estimates of time based on heartbeats and breathing (<xref ref-type="bibr" rid="r41">Goudriaan, 1921</xref>). Composers, as experts of time and timing, discovered the link between the tempo at which their works are played and the impression of duration among the audience. Ravel once complained to Furtwängler that his Boléro, when played too fast, would feel unjustifiably long (<xref ref-type="bibr" rid="r75">Nichols, 2011</xref>). Ravel’s somewhat paradoxical observation comes close to that of cognitive scientists: music played at various tempi can induce corresponding time distortions (e.g., <xref ref-type="bibr" rid="r32">Droit-Volet et al., 2010</xref>).</p>
<p>Much as Ravel perceived, music is a form of art closely intertwined with time. A rich vein of literature has pointed out that rhythmic patterns, or beats, are fundamentally embedded in all genres of music, leading to perceptual periodicities (<xref ref-type="bibr" rid="r64">London, 2004</xref>; <xref ref-type="bibr" rid="r76">Nozaradan, 2014</xref>). The perceived periodicities provide a sequence of external events, which can subsequently be internalized as representations of time (<xref ref-type="bibr" rid="r34">Droit-Volet et al., 2013</xref>; <xref ref-type="bibr" rid="r40">Gibbon et al., 1984</xref>; <xref ref-type="bibr" rid="r57">Jones &amp; Boltz, 1989</xref>). Music- or beat-induced movements have been observed across a wide range of ages as evidence of individuals’ anticipation of rhythmic patterns – among infants (<xref ref-type="bibr" rid="r115">Zentner &amp; Eerola, 2010</xref>), pre-schoolers (<xref ref-type="bibr" rid="r36">Eerola et al., 2006</xref>), and adults (<xref ref-type="bibr" rid="r22">Burger et al., 2018</xref>). By synchronising either proactively or passively with musical beats, individuals’ perceptual time becomes subject to modification. The general tendency is that fast music leads to duration overestimation, whereas slow music leads to underestimation (<xref ref-type="bibr" rid="r34">Droit-Volet et al., 2013</xref>; <xref ref-type="bibr" rid="r107">Wang &amp; Shi, 2019</xref>).</p>
<p>In this review, timing refers to the active process of monitoring temporal order through explicit (e.g., tapping) or implicit (e.g., silent counting) actions, with an emphasis on the effort involved in the task. Duration estimation, the act of gauging past time, constitutes the second component of time perception. It encompasses both prospective timing (knowing in advance that the duration of an event is to be judged) and retrospective timing (judging a duration only after it has elapsed) (<xref ref-type="bibr" rid="r114">Zakay &amp; Block, 1995</xref>). In this sense, one can passively experience the passage of time with or without attending to it, which affects subsequent judgments.</p>
<p>In relation to duration estimation, it is equally important to underline the role of beat perception, or inner timing, as the ability to perceive and predict the temporal location of events. As a supporter of the cerebral clock model, <xref ref-type="bibr" rid="r83">Pöppel (1989)</xref> hypothesized that maintaining a constant tempo in music production, especially in classical music, depends on a time-keeping mechanism that functions mainly by tracking temporal order, such as synchrony and succession, in addition to measuring durations. In the same vein, a “3-second window of temporal integration” (<xref ref-type="bibr" rid="r83">Pöppel, 1989</xref>, p. 86) was assumed to constitute the psychological present. This has consequences for perceiving musical tempo and for the integration of beats, and hence for subjective experiences of time. Similar attempts at developing clock models were based on beat perception (<xref ref-type="bibr" rid="r60">Langner, 2002</xref>; <xref ref-type="bibr" rid="r93">Schulze, 1978</xref>). Schulze, in particular, anticipated the later <italic>Dynamic Attending Theory</italic> (<xref ref-type="bibr" rid="r57">Jones &amp; Boltz, 1989</xref>) by emphasizing the variability of internal clock speed under the influence of environmental cues (accelerating and decelerating beat patterns).</p>
<p>While some studies investigated musical tempo in order to formulate hypotheses about clock models, other research found that variables embedded in tempo, such as isochrony, salience, or complexity, directly evoked changes in the functioning of the internal clock. <xref ref-type="bibr" rid="r84">Povel and Essens (1985)</xref> observed in their experiments that different groupings of rhythmic beats led to different temporal reproductions, giving rise to a best-fitting internal clock. Explanations lie in the coupling of beat accents with the clock ‘tick’: the stronger the beat pattern (in this case, the higher the metrical level) and the less complex the metrical structure, the more likely the beat is to activate the internal time-recording system and be represented in temporal processing. Recognizing beat perception not only furthers our understanding of human timing in general, but also of music listening and performing in particular, which can likewise be understood in terms of proactive and passive timing. In fact, frequent exposure to musical beat production appears to enhance one’s temporal sensitivity, and this effect may transfer to other sensory modalities (from audition to vision; <xref ref-type="bibr" rid="r27">Cicchini et al., 2012</xref>). Apart from external training, the stability of intrinsic rhythm is also positively correlated with tempo reproduction performance (<xref ref-type="bibr" rid="r70">McPherson et al., 2018</xref>). It is therefore essential to look into the complexity of musical tempo itself.</p>
<p>Musical tempo is subject to ambiguity. The complexity of tempo structures in music has long been recognized (e.g., <xref ref-type="bibr" rid="r85">Pressnitzer et al., 2011</xref>). Tempo is marked not only by the number of note events in a melody (<xref ref-type="bibr" rid="r7">Behne, 1976</xref>) or by the patterns of percussion instruments, but also by changes in pitch, timbre, or loudness (<xref ref-type="bibr" rid="r16">Brochard et al., 2003</xref>), as well as phrasing and articulation (<xref ref-type="bibr" rid="r4">Auhagen &amp; Busch, 1998</xref>). The multiple sound sources of the instruments in a symphony orchestra vary tremendously across sections and therefore constitute auditory streams that are hard to disentangle (<xref ref-type="bibr" rid="r95">Shamma &amp; Micheyl, 2010</xref>), especially for non-musicians. Note that the difficulty of correctly identifying temporal structures in music is not the same as that of correctly identifying the tempo of music, since the latter has more to do with detecting the absolute ‘speed’ and tempo changes. Attempts have been made to examine the thresholds for detecting musical tempo acceleration and deceleration, for instance among musically trained and untrained groups (e.g., <xref ref-type="bibr" rid="r37">Ellis, 1991</xref>). There are several hypotheses about how we cope with “noisy” auditory signals in terms of time and tempo perception. Some argued that the process of tempo extraction depends mainly on periodic regularities (<xref ref-type="bibr" rid="r69">McDermott et al., 2011</xref>), while others emphasized the importance of learning, regardless of the complexity of the tempo structure (<xref ref-type="bibr" rid="r1">Agus et al., 2010</xref>).</p>
<p>A small number of studies aimed at the disentanglement of auditory rhythmic features and revealed how tempo salience affects perceptual time. One study investigated different metrical levels and found effects on listeners’ sense of time (<xref ref-type="bibr" rid="r49">Hammerschmidt &amp; Wöllner, 2020</xref>). More specifically, the lower the metrical level individuals attended to by tapping (e.g., eight notes versus half notes), the longer a music excerpt was perceived, providing some evidence for the impact of event density (cf. <xref ref-type="bibr" rid="r7">Behne, 1976</xref>). In this case, the count of time was affected by the number of beats registered in memory.</p>
<p>Apart from music, inputs from other sensory modalities may also affect temporal processing. Indeed, psychological research has often used visual stimuli such as flashes or flickering lights to investigate time. For instance, studies of the entrainment effect in single modalities showed that the presence of either visual flickers or pure tones led to stronger entrainment (e.g., <xref ref-type="bibr" rid="r79">Ortega &amp; López, 2008</xref>; <xref ref-type="bibr" rid="r103">Treisman &amp; Brogan, 1992</xref>; <xref ref-type="bibr" rid="r104">Treisman et al., 1990</xref>). The effect, nevertheless, is not limited to one modality. Past research suggests that auditory signals of various complexities can enhance the entrainment effect for visual sequences and, in some cases, transfer to attentional acuity in the other modality (<xref ref-type="bibr" rid="r12">Bolger et al., 2013</xref>; <xref ref-type="bibr" rid="r38">Escoffier et al., 2010</xref>). In Bolger et al.’s study, participants performed equally well in a target detection task regardless of the target modality (auditory or visual) when entrained with tone sequences. Another case in point is the cross-modal transfer of tempo discrimination between the auditory and tactile domains, where training with rhythmic sounds led to enhanced performance in the latter (<xref ref-type="bibr" rid="r74">Nagarajan et al., 1998</xref>). These studies provide evidence that the cognitive processes involved in timing and time perception function at a domain-general level.</p>
<p>In this review, an overview is provided of internal clock models that were established or further developed in recent years. In particular, the aim is to show how each model accounts for the experience of musical time in auditory and audiovisual contexts. We will tackle questions such as: How does music facilitate temporal processing? What are the timing mechanisms and models, and how does each explain the interplay between music and perceptual time? What are the implications of studying music and time perception?</p></sec>
<sec sec-type="other1"><title>The Internal Clock</title>
<p>Comparable to an actual clock, the internal clock has served as an analogy for the timing mechanism in humans and animals (<xref ref-type="bibr" rid="r35">Eagleman et al., 2005</xref>; <xref ref-type="bibr" rid="r52">Ivry &amp; Schlerf, 2008</xref>). The temporal order of events is recorded by multiple sensory modalities and processed, according to different theories, along a variety of pathways before becoming representations of time, that is, the occurrence of “clock ticks”. Early in the discussion, hypotheses stated that time perception was a form of information processing that depended strongly on recording capacity (<xref ref-type="bibr" rid="r78">Ornstein, 1969</xref>). Researchers such as <xref ref-type="bibr" rid="r6">Barry (1990)</xref> and <xref ref-type="bibr" rid="r93">Schulze (1978)</xref> emphasized the importance of music as an environmental construct of attention that shapes both perceived time (in terms of its duration) and the passage of time (the perceived speed).</p>
<p>In recent years, two major theories have been at the forefront of discussions as to how temporal units are recorded, differing as to whether a specific cognitive module is dedicated to timing. The ‘no clock’ hypothesis, or state-dependent network, and the ‘central clock’ hypothesis have both received increasing attention in research (<xref ref-type="bibr" rid="r44">Grondin, 2010b</xref>). The latter, in particular, encompasses two theories: the Dynamic Attending Theory, based on a non-linear cumulation of temporal units, and the Scalar Expectancy Theory, which assumes that the emission of temporal pulses follows a linear course (<xref ref-type="bibr" rid="r52">Ivry &amp; Schlerf, 2008</xref>). As <xref ref-type="bibr" rid="r98">Stern (1897)</xref> had already pointed out for “Präsenzzeit” (the experienced present moment) and the time ranges of other cognitive processes, it appears that different theories function best at specific time ranges (<xref ref-type="fig" rid="f1">Figure 1</xref>).</p>
<fig id="f1" position="anchor" fig-type="figure" orientation="portrait"><label>Figure 1</label><caption>
<p>An overview of the internal clock models specified by interval ranges (subsecond, suprasecond, seconds to minutes, and minutes to hours) as well as by the division of central vs. intrinsic model.</p>
<p><italic>Note.</italic> The most important features described in the overview were discussed in detail in the following sections. For a review of the research methods adopted in timing studies, see <xref ref-type="bibr" rid="r44">Grondin (2010b)</xref>.</p></caption><graphic xlink:href="jbdgm.2019v29.67-f1" position="anchor" orientation="portrait"/></fig>
<sec><title>The Intrinsic Clock Model</title>
<p>Unlike the traditional view of a clock, some researchers believe that there might be no clock at all. Such a ‘no-clock’ model is known as a state-dependent timing system or intrinsic model (<xref ref-type="bibr" rid="r52">Ivry &amp; Schlerf, 2008</xref>). The ‘state’ here describes the specific circumstances generated by a neural network in response to external changes. Timing is seen as an implicit function of each neural network that is activated for a given sensory modality and is sensitive to pre- and post-interval changes. It is postulated that activity-elicited changes in neural networks directly reflect the inherent temporal structures and therefore serve as references for timing in sub-second intervals (e.g., <xref ref-type="bibr" rid="r58">Karmarkar &amp; Buonomano, 2007</xref>). The process is also referred to as “a temporal-to-spatial transformation” (<xref ref-type="bibr" rid="r58">Karmarkar &amp; Buonomano, 2007</xref>, p. 3), or the “intrinsic model”, as researchers hypothesize that timing is an integral function of neural activity (<xref ref-type="bibr" rid="r52">Ivry &amp; Schlerf, 2008</xref>).</p>
<p>The model essentially suggests that timing as a function is distributed across a variety of neural structures in which oscillatory patterns stay consistent, known as the recurrent neural network (<xref ref-type="bibr" rid="r21">Buonomano &amp; Laje, 2011</xref>). Ramping, or climbing, activity in neural oscillations across frequency bands from gamma to beta and alpha (e.g., <xref ref-type="bibr" rid="r111">Wittmann, 2013</xref>) has been revealed as a physiological basis for the model, in addition to neural spikes across a wide range of brain regions such as the striatum (<xref ref-type="bibr" rid="r47">Gu et al., 2015</xref>). The time stamps, or accumulated states, are hypothesized to be expressed at both the micro level (individual neurons) and the macro level (population-wide neuronal excitation/inhibition) (<xref ref-type="bibr" rid="r21">Buonomano &amp; Laje, 2011</xref>).</p>
<p>The intrinsic model proposes that timing is an inherent function of multiple dynamic neural networks. This flexibility allows a network to calibrate against previous durations and to judge the duration of the current event on that basis. Furthermore, because timing is hypothesized to be implicit, no external trigger is needed even when an event is absent. However, the state-dependent network also has its shortcomings. Studies suggest that the model is only applicable to the subsecond range: the cumulative effects of previous events, whether enhancing or reducing one’s temporal sensitivity, diminish within roughly 300 milliseconds (<xref ref-type="bibr" rid="r20">Buonomano et al., 2009</xref>). Moreover, the model does not offer a clear explanation of cross-modal temporal information integration. This is where the central clock model provides a useful alternative perspective.</p></sec>
<sec><title>The Central Clock Model</title>
<p>The central timing mechanism, also known as the dedicated clock model (e.g., <xref ref-type="bibr" rid="r3">Allman et al., 2014</xref>), stemmed from <xref ref-type="bibr" rid="r102">Treisman’s (1963)</xref> work. Decades of research into the human timing mechanism were based on this model and have assumed that timing is a specific cognitive module, hypothetically located across the global neural network (e.g., <xref ref-type="bibr" rid="r2">Allman &amp; Meck, 2012</xref>).</p>
<p>Where is the ‘clock’ in our brain? Neurological studies supporting the timing mechanism as an independent cognitive module dedicated solely to this function do not necessarily assume a single brain structure for it. Research rather supports the roles of a wide range of brain regions working collaboratively to process time (e.g., <xref ref-type="bibr" rid="r19">Buhusi &amp; Meck, 2005</xref>). The cerebellum, for example, is involved in short duration judgments, arguably from a few hundred milliseconds to 30 seconds (<xref ref-type="bibr" rid="r2">Allman &amp; Meck, 2012</xref>). Disruptions to other cortical and subcortical structures, including the basal ganglia, can lead to timing deficits in larger time frames as well. In <xref ref-type="bibr" rid="r94">Schwartze and colleagues’ (2011)</xref> study, participants with basal ganglia lesions failed to detect tempo acceleration and deceleration of tones and could not entrain tapping movements to the signals. The evidence implicates a global network in which multiple brain regions are involved; impairment at any part of the chain could lead to malfunctions of the timing mechanism. Two prominent theories with consequences for time processing in music will be explained below.</p>
<sec><title>The Dynamic Attending Theory (DAT)</title>
<p>DAT, also known as the oscillator model, hypothesizes that the ability to estimate the duration of past events depends on the coupling between attentional pulses and the occurrences of external events (<xref ref-type="bibr" rid="r57">Jones &amp; Boltz, 1989</xref>). The theory supports the presence of a central clock, in the sense that the allocation of attention as a limited resource is based on the expectation of the next event on the timeline (<xref ref-type="bibr" rid="r62">Large &amp; Jones, 1999</xref>). An exogenous stimulus, when aligned with the peak of attention, is best retained in working memory and transformed into representations of time (<xref ref-type="bibr" rid="r5">Barnes &amp; Jones, 2000</xref>). Essential to this theory is that, like the unstable periodicities of external events, the emission of attentional pulses or oscillations is a non-linear process (<xref ref-type="bibr" rid="r61">Large, 2008</xref>).</p>
<p>Holding the central clock premise, DAT suggests that attention plays a critical role in regulating the frequency of the pacemaker pulses, in line with the Attentional Gate Model (<xref ref-type="bibr" rid="r9">Block &amp; Gruber, 2014</xref>; <xref ref-type="bibr" rid="r114">Zakay &amp; Block, 1995</xref>). More specifically, the “gate” through which temporal units pass before registering with the counter device opens wider when more attention is assigned to the specific time point. When attention is shifted to matters irrelevant to temporal cues, fewer pulses are recorded, leading to duration underestimation. Some argue that DAT applies exclusively to prospective timing, whereas retrospective timing is subject to contextual influences and memory retrieval (<xref ref-type="bibr" rid="r9">Block &amp; Gruber, 2014</xref>; <xref ref-type="bibr" rid="r47">Gu et al., 2015</xref>).</p>
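The gating logic described above can be illustrated with a minimal simulation. This is an illustrative sketch only: the pulse rate, the attention level, and the probabilistic gate are hypothetical choices for demonstration, not parameters taken from the literature.

```python
import random

def estimate_duration(true_duration_s, attention_to_time, pulse_rate_hz=50.0, seed=0):
    """Toy sketch of an attentional-gate pacemaker (hypothetical parameters).

    Pulses are emitted at a constant rate; the 'gate' passes each pulse
    with a probability equal to the share of attention paid to time.
    The accumulated count is read out against a full-attention baseline.
    """
    rng = random.Random(seed)
    n_pulses = int(true_duration_s * pulse_rate_hz)
    # Each pulse passes the gate only if attention to time is high enough.
    recorded = sum(1 for _ in range(n_pulses) if rng.random() < attention_to_time)
    # Convert the recorded count back into seconds.
    return recorded / pulse_rate_hz

full = estimate_duration(10.0, attention_to_time=1.0)       # undivided attention
distracted = estimate_duration(10.0, attention_to_time=0.6)  # attention partly elsewhere
```

With full attention the estimate matches the true 10 s; with attention diverted, fewer pulses are registered and the same interval is judged shorter, mirroring the underestimation the model predicts.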
<p>Note that DAT is hypothesized to function mostly within the suprasecond range, because prospective timing recedes with time owing to the limited capacity of working memory (WM) (for a review, see <xref ref-type="bibr" rid="r47">Gu et al., 2015</xref>). Concurrent tasks that require extra attentional resources can reduce timing accuracy (<xref ref-type="bibr" rid="r17">Brown &amp; Boltz, 2002</xref>). <xref ref-type="bibr" rid="r82">Polti et al. (2018)</xref> explored the interval boundary of attention in prospective timing and found that the magnitude of WM interference in time estimation tasks increased proportionally with interval length (30 to 90 s). In a more naturalistic setting, gamers were asked to estimate elapsed time (12, 35, or 58 minutes) either knowingly (prospective timing) or not (retrospective; <xref ref-type="bibr" rid="r100">Tobin et al., 2010</xref>). The 12-minute session was estimated as significantly longer in the prospective than in the retrospective paradigm, while estimation differences were less pronounced in the 35- and 58-minute conditions, suggesting that DAT’s predictive power may be reduced for longer intervals. However, no evidence so far has clearly delineated the interval ranges within which each model fits best. This question deserves further exploration and will be discussed in the conclusion.</p></sec>
<sec><title>The Scalar Expectancy Theory (SET)</title>
<p>Another well-known model of the internal timing mechanism argues that the perceived amount of time is composed of regularly emitted pulses from a pacemaker, as an analogy of an internal clock, accumulated by a counter device; it is therefore also known as the pacemaker-counter model (<xref ref-type="bibr" rid="r39">Gibbon, 1977</xref>; <xref ref-type="bibr" rid="r40">Gibbon et al., 1984</xref>; <xref ref-type="bibr" rid="r102">Treisman, 1963</xref>). This model specifies that temporal processing is accomplished in roughly three steps: the clock, the memorisation, and the judgment of time. Accordingly, in order to explain the temporal flow, SET proposes that subjective time is composed of (a) the representation of objective durations, and (b) the estimation variance, or error rate, given by Weber’s fraction (<xref ref-type="bibr" rid="r2">Allman &amp; Meck, 2012</xref>; <xref ref-type="bibr" rid="r44">Grondin, 2010b</xref>). The variance is hypothesized to stem from transferring the clock readings to working memory (<xref ref-type="bibr" rid="r71">Meck, 1984</xref>; <xref ref-type="bibr" rid="r102">Treisman, 1963</xref>). Inaccuracy in duration estimation, according to SET, is also subject to the influence of attention, clock-speed error, task switching, decision error, and other factors (<xref ref-type="bibr" rid="r2">Allman &amp; Meck, 2012</xref>). The longer the duration, the larger the variance. SET’s applicability across the sub-second to supra-second range, however, remains controversial. <xref ref-type="bibr" rid="r43">Grondin (2010a)</xref> focused on violations of the scalar property when examining participants’ timing performance in the subsecond range, and found a tendency for Weber’s fraction to increase as the interval approached 1 s. This suggests that SET does not provide a powerful explanation of timing behaviour in the millisecond range. The applicability of SET in the millisecond-to-second range has nevertheless received further support from audiovisual evidence. Evidence is still needed to draw a clear boundary of the ‘time ranges of best fit’ for SET.</p>
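The scalar property, variance growing in proportion to the timed interval so that Weber's fraction stays constant, can be sketched with a toy pacemaker-counter simulation in which the clock's speed is perturbed multiplicatively on each trial. The noise level (a coefficient of variation of 0.1) and the trial counts are hypothetical values chosen purely for illustration.

```python
import random
import statistics

def reproduce_interval(duration_s, clock_noise_cv=0.1, rng=random):
    """One trial of a toy pacemaker-counter: the clock's speed is perturbed
    multiplicatively per trial, so error in the stored count scales with
    the interval itself (the scalar property)."""
    return duration_s * rng.gauss(1.0, clock_noise_cv)

def weber_fraction(duration_s, n_trials=5000, seed=1):
    """Estimate Weber's fraction (SD / mean of reproduced intervals)."""
    rng = random.Random(seed)
    estimates = [reproduce_interval(duration_s, rng=rng) for _ in range(n_trials)]
    return statistics.stdev(estimates) / statistics.mean(estimates)

wf_short = weber_fraction(2.0)  # a 2 s target interval
wf_long = weber_fraction(8.0)   # an 8 s target interval
```

Because the per-trial noise is multiplicative, the standard deviation of the reproductions grows fourfold between 2 s and 8 s, yet the Weber fraction stays close to 0.1 for both, which is exactly the constancy the scalar property asserts (and which Grondin's subsecond data violate).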
<p>From a neurobiological perspective, the striatal beat frequency (SBF) theory offers a biological grounding for the original pacemaker-counter model (<xref ref-type="bibr" rid="r73">Miall, 1989</xref>; <xref ref-type="bibr" rid="r105">van Rijn et al., 2014</xref>). Unlike that abstract model, SBF theory instantiates the clock pulses as oscillations of medium spiny neurons (MSNs) located in the striatum. Researchers proposed that neurons oscillating at different frequencies reset when timing begins, receiving inputs from cortical neurons firing as a consequence of dopaminergic release (<xref ref-type="bibr" rid="r72">Merchant et al., 2013</xref>). Detection of synchronous neural oscillation is known as coincidence detection (<xref ref-type="bibr" rid="r19">Buhusi &amp; Meck, 2005</xref>). The MSNs are capable of detecting coincident oscillations from cortical neurons firing at similar frequencies (the input) and translating them into temporal units (the output). To account for the scalar property, that is, the variance in accumulative timing, it has been proposed that the neural oscillations drift out of phase after the initial alignment, each returning to its inherent frequency. As a result, the discrepancy increases proportionally until the neurons that initially fired together desynchronize completely. The MSNs nevertheless retain a robust ability to detect temporal patterns over durations of up to minutes, despite the complexity of the inputs, owing to the sheer number of cortical neurons (e.g., <xref ref-type="bibr" rid="r66">Matell et al., 2003</xref>), making the theory viable for a wider range of durations.</p></sec></sec></sec>
<sec sec-type="other2"><title>Factors Overarching Both Clock Models</title>
<sec><title>Sensory Modalities</title>
<p>Multisensory inputs often interact with one another in daily life. A vase dropping to the ground is usually followed by a shattering sound; a knock on the door produces a knocking sound. More broadly, signals from vision, hearing, touch, smell and taste together constitute an intangible framework of timing references. It is therefore critical to understand the specificity of each sensory modality, and their joint effects, in temporal perception.</p>
<p>The dominant role of audition in temporal processing has been evidenced by a series of studies (e.g., <xref ref-type="bibr" rid="r14">Boltz, 2017</xref>; <xref ref-type="bibr" rid="r26">Chen et al., 2018</xref>; <xref ref-type="bibr" rid="r88">Repp &amp; Penel, 2002</xref>). A number of studies supported higher precision in temporal discrimination in audition compared to vision (e.g., <xref ref-type="bibr" rid="r61">Large, 2008</xref>; <xref ref-type="bibr" rid="r81">Phillips &amp; Hall, 2002</xref>). Furthermore, auditory temporal processing is capable of interfering with visual timing: participants’ performance in identifying correct rhythmic visual patterns was most heavily compromised when the task was accompanied by a new string of isochronous sounds rather than by a visual display (<xref ref-type="bibr" rid="r48">Guttman et al., 2005</xref>). One may assume that temporal information derived from auditory events weighs more than that derived from visual inputs. The auditory dominance view is, however, not without dispute. <xref ref-type="bibr" rid="r106">van Wassenhove and colleagues (2008)</xref> found that incongruent visual displays could distort temporal perception of auditory information in both directions. A recent finding, in addition, suggests that when the two sensory modalities were incongruent, temporal perception was biased towards the visually perceived tempo of natural human movements rather than that of the drumbeats (<xref ref-type="bibr" rid="r108">Wang et al., 2019</xref>).</p>
<p>Research showed that auditory stimuli can effectively distort visual perception (e.g., <xref ref-type="bibr" rid="r23">Burr et al., 2013</xref>). The auditory driving effect describes the perceptual coupling of fluttering sounds to visual flicker rates, provided the temporal gap between the flutter and the flicker does not exceed a certain range (<xref ref-type="bibr" rid="r97">Shipley, 1964</xref>). In other words, perceptual integration is accomplished by averaging auditory and visual input frequencies while assigning more weight to the former. A robust auditory driving effect could be observed even when the sounds were presented as a brief distractor (<xref ref-type="bibr" rid="r23">Burr et al., 2013</xref>). Furthermore, <xref ref-type="bibr" rid="r26">Chen and colleagues’ (2018)</xref> study suggested that, in addition to traditional regular flutters, irregular auditory inputs accompanying the visual flickers could also distort the perception of the latter. Similar observations of audiovisual bias were reported as the fission and fusion illusions (<xref ref-type="bibr" rid="r96">Shams et al., 2002</xref>): in the former, one flash is perceived as two when accompanied by two beeps, while in the latter, two visual events are perceived as one when presented simultaneously with a single beep.</p>
<p>The arguably dominant position of the auditory modality must therefore be taken into account when exploring the role of music in temporal processing. It should be noted that music encompasses not only complex acoustic signals, but also a rich source of emotions that alter subjective time. Films are an example of how music shapes experiences of time. In a study comparing slow motion film scenes with the same scenes played back in real time, participants’ temporal judgments of the scenes’ duration were significantly influenced when music was present (<xref ref-type="bibr" rid="r113">Wöllner et al., 2018</xref>). While slow motion scenes led to an underestimation of time, the same scenes in real time seemed to last relatively longer, and music yielded more accurate time estimations. Furthermore, music led to higher physiological arousal and larger pupil diameters in observers, suggesting that music modulates emotional responses and experiences of time in audiovisual scenes.</p></sec>
<sec><title>Working Memory and Attention</title>
<p>Central to SET are the memory stage, in which the current pulse count is held in working memory, and the judgment stage, in which that count is compared to references retrieved from long-term memory (<xref ref-type="bibr" rid="r40">Gibbon et al., 1984</xref>; <xref ref-type="bibr" rid="r105">van Rijn et al., 2014</xref>). Individual differences in short-term memory capacity, and the corresponding discrepancies in timing performance, draw attention to the role of working memory in temporal processing (e.g., <xref ref-type="bibr" rid="r15">Broadway &amp; Engle, 2011</xref>). More specifically, a higher working memory capacity implies greater potential to hold time units at the second and third stages of timing, and thus greater precision (<xref ref-type="bibr" rid="r99">Teki &amp; Griffiths, 2014</xref>).</p>
<p>Working memory is positively related to other executive functions such as selective and divided attention (<xref ref-type="bibr" rid="r29">Colflesh &amp; Conway, 2007</xref>), in both the auditory and the visual modality (<xref ref-type="bibr" rid="r112">Wöllner &amp; Halpern, 2016</xref>). The relative weights of the two shift across timing scenarios. This is particularly relevant for understanding the different mechanisms behind prospective and retrospective timing, as mentioned before (<xref ref-type="bibr" rid="r10">Block &amp; Zakay, 1997</xref>). In the oscillator model (DAT), attentional pulses are emitted in order to track external beats. These pulses are recorded and transferred to working memory before entering the stage of comparison with a reference duration in long-term memory (<xref ref-type="bibr" rid="r10">Block &amp; Zakay, 1997</xref>; <xref ref-type="bibr" rid="r40">Gibbon et al., 1984</xref>). Attention diverted from a timing task results in fewer temporal units being counted and consequently in underestimations of time, while attention directed to timing leads to overestimations regardless of test durations (<xref ref-type="bibr" rid="r82">Polti et al., 2018</xref>). Although direct evidence is lacking, we hypothesize a similar effect for music listening: a listener instructed to time a piece of music before it commences processes the passage of time differently than one asked to estimate the time elapsed at the end of the excerpt.</p>
<p>The interpretation of the roles of WM and attention also depends on the theory. DAT, compared to SET, highlights the role of attention rather than working memory (e.g., <xref ref-type="bibr" rid="r56">Jones, 2010</xref>; <xref ref-type="bibr" rid="r57">Jones &amp; Boltz, 1989</xref>). It postulates that attention, when quantified as regularly emitted pulses, can synchronize with external periodicity and therefore serve as a reference for time. The periodicities of external events, that is, regular or irregular patterns, do affect the strength of their synchronisation with attentional pulses. The more predictable an exogenous pattern is, the stronger the effect, known widely as the temporal entrainment effect (<xref ref-type="bibr" rid="r5">Barnes &amp; Jones, 2000</xref>; <xref ref-type="bibr" rid="r92">Schroeder &amp; Lakatos, 2009</xref>). This has been evidenced by a number of visual (<xref ref-type="bibr" rid="r30">Cravo et al., 2013</xref>), auditory (<xref ref-type="bibr" rid="r5">Barnes &amp; Jones, 2000</xref>; <xref ref-type="bibr" rid="r56">Jones, 2010</xref>), and movement (<xref ref-type="bibr" rid="r22">Burger et al., 2018</xref>) studies. <xref ref-type="bibr" rid="r54">Jones (1981</xref>, <xref ref-type="bibr" rid="r55">1990</xref>) proposed that the characteristics of the information, in this case musical expressions, could distort the perception of time. Empirical studies supporting her claims found, for instance, that music was perceived to be slower when there were more pitch variations and inconsistent metrical accents (<xref ref-type="bibr" rid="r13">Boltz, 1998</xref>). We may predict that music genres with more predictable rhythms, such as pop and rock, are associated with higher duration sensitivity and better timing accuracy than genres with less predictability, such as jazz.</p></sec>
<sec><title>Emotions</title>
<p>“Time flies when you are having fun.” Understanding the role of emotions in time perception is important for comprehending how music distorts subjective time, as music essentially conveys a wide spectrum of emotions. The relatively small number of studies that have directly examined the effects of musical emotions on subjective time show that strongly emotional content was more engaging and subsequently better processed and stored in WM, leading to time overestimation (for a review, see <xref ref-type="bibr" rid="r90">Schäfer et al., 2013</xref>). Music, a powerful tool for inducing emotions, was found to induce a sense of timelessness (duration overestimation) as well as a faster passage of time when an individual is completely immersed in the experience (<xref ref-type="bibr" rid="r51">Herbert, 2012</xref>). Apart from aesthetic pleasure, other types and intensities of emotion may also have an impact on how music distorts the perception of time. The reasons may lie in psycho-physiological arousal levels: higher arousal is believed to cause time overestimation (e.g., <xref ref-type="bibr" rid="r34">Droit-Volet et al., 2013</xref>). One group of participants, for instance, was presented with emotional film excerpts to induce corresponding emotions (<xref ref-type="bibr" rid="r33">Droit-Volet et al., 2011</xref>). Results indicate that, compared to baseline temporal judgments, participants tended to overestimate durations after watching scary films. There are nevertheless findings implying the opposite, namely that higher emotional arousal leads to duration underestimation, especially from a retrospective point of view (<xref ref-type="bibr" rid="r51">Herbert, 2012</xref>).</p>
<p>Another line of studies investigates the impact of emotional valence on temporal processing. In retrospective paradigms, positive emotions induced by happy music led to duration underestimation, while negative emotions in sad music led to duration overestimation (<xref ref-type="bibr" rid="r8">Bisson et al., 2008</xref>). It was speculated that positive emotions gave rise to fewer contextual changes than negative ones, thereby registering fewer events in memory. Some evidence, on the contrary, implies that valence does not matter: highly arousing emotional pictures accelerated the internal clock speed and caused a leftward shift in reaction times compared with pictures of low emotional arousal, regardless of their valence (<xref ref-type="bibr" rid="r31">Droit-Volet &amp; Berthon, 2017</xref>).</p>
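<p>The arousal account can be sketched as a clock whose pulse rate is scaled by arousal while the read-out remains calibrated to the baseline rate; the rates below are hypothetical values chosen purely for illustration:</p>

```python
def perceived_duration(physical_ms, clock_rate_hz, baseline_hz=100.0):
    """Toy pacemaker read-out: ticks are accumulated at the current
    clock rate but interpreted against the calibrated baseline rate,
    so a speeded (aroused) clock inflates perceived duration."""
    ticks = physical_ms / 1000.0 * clock_rate_hz
    return ticks / baseline_hz * 1000.0

# At baseline the estimate is veridical; with the clock sped up by 15%
# under high arousal, the same 2 s interval is judged about 2.3 s long.
calm = perceived_duration(2000, clock_rate_hz=100.0)
aroused = perceived_duration(2000, clock_rate_hz=115.0)
```

<p>The same scaling also captures the reported leftward shift in reaction times: a speeded clock reaches any internal criterion earlier in physical time.</p>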
<p>The seemingly puzzling observations may be explained by the mechanism through which emotions act on time perception. One approach is rooted in the emission rate of attentional pulses, which can be modulated by affective states, especially the arousal level. According to the pacemaker-counter model (<xref ref-type="bibr" rid="r102">Treisman, 1963</xref>), more attentional pulses are emitted when the arousal level is high, and these are subsequently recorded as the sum of clock ticks, that is, the perceived duration. Attention can either facilitate or hinder the interaction between emotions and temporal processing. More specifically, when attention is allocated to sustaining the temporal units, the effect leads to duration overestimation; in contrast, when attention is shifted from temporal information to the emotionally charged event, fewer ticks are accumulated, resulting in duration underestimation.</p></sec></sec>
<sec sec-type="other3"><title>Modality-Specific Evidence for the Internal Clock Models</title>
<sec><title>Audiovisual Evidence for the Intrinsic Clock Model</title>
<p>Time-dependent neural oscillations are specific to sensory modalities. Studies have revealed that neuron excitation and inhibition can be elicited by a specific type of sensory input, such as sound (<xref ref-type="bibr" rid="r91">Schnupp et al., 2006</xref>) or visual flicker (<xref ref-type="bibr" rid="r24">Burr et al., 2007</xref>). Researchers found with MEG that the time-dependent decodability of visual objects within a window of 1000 ms varied significantly, suggesting that time might be an inherent feature of the local visual network (<xref ref-type="bibr" rid="r25">Carlson et al., 2013</xref>). Furthermore, transcranial magnetic stimulation studies revealed that auditory timing can be dissociated from timing in other sensory modalities (<xref ref-type="bibr" rid="r18">Bueti et al., 2008</xref>), as participants performed worse in a duration discrimination task (pure tones, 10 to 40 ms) when receiving disruptions to the auditory cortex. One might propose that, when listening to complex auditory signals such as music, particular groups of neurons in the human auditory cortex generate time-dependent responses, which simultaneously serve as time codes. However, relatively few studies with humans have directly confirmed the time-dependent variability of the local auditory network (<xref ref-type="bibr" rid="r101">Toiviainen et al., 2019</xref>).</p>
<p>The dissociation in timing abilities among sensory modalities also suggests that time is processed as a local flow of information. Early findings demonstrate significantly higher timing precision with hearing than with vision (e.g., <xref ref-type="bibr" rid="r80">Penney et al., 2000</xref>), indicating a superiority of audition over vision in providing temporal cues. Timing is a highly selective, localized process even within one modality. <xref ref-type="bibr" rid="r24">Burr and colleagues (2007)</xref> successfully modulated the perceived durations of target visual stimuli by manipulating the apparent flicker rate in a confined retinal region. Their finding is among the first to empirically support (a) the spatial-temporal connection in neural representations, and (b) the modality specificity of temporal processing, particularly the superiority of audition (e.g., <xref ref-type="bibr" rid="r88">Repp &amp; Penel, 2002</xref>). In <xref ref-type="bibr" rid="r65">Lustig and Meck’s (2011)</xref> study, the modality effect was stronger for participants at both ends of the age spectrum. One potential cause is that older adults were more susceptible to the varying allocation of attention under different experimental conditions, whereas children might be influenced by still-developing sensory functions. This is not to say that SDN is a ‘one modality, one clock’ system; rather, it is a large network that also covers the interactions between multiple networks.</p>
<p>Taken together, from the intrinsic model’s perspective, time is a consequence of cumulative states in a recurrent neural network that represent the amount of change induced by external stimuli. In this sense, when a piece of music is listened to repeatedly, the perceived durations of both the music and an accompanying video will be altered when they are presented again later on.</p></sec>
<sec><title>Audiovisual Evidence for the Dynamic Attending Theory</title>
<p>DAT places particular emphasis on attention, given that the count of temporal units depends on how well attentional pulses synchronize with the external event, a phenomenon known as the temporal entrainment effect. The term specifies the coupling of the tempo of extrinsic temporal cues with that of pacemaker pulses (<xref ref-type="bibr" rid="r56">Jones, 2010</xref>). The emphasis on external entrainment, as by music, dates back to the early days of the formulation of the clock model (<xref ref-type="bibr" rid="r6">Barry, 1990</xref>; <xref ref-type="bibr" rid="r83">Pöppel, 1989</xref>). Neurobiological evidence suggests that just noticeable differences for auditory gaps can be modulated when neural activities are entrained to specific frequency bands and amplitudes (<xref ref-type="bibr" rid="r50">Henry et al., 2014</xref>). Regarding music, the synchronization between neural oscillations and musical beats has been substantiated as the steady-state evoked potential (SS-EP) elicited by periodicity in musical beats (<xref ref-type="bibr" rid="r76">Nozaradan, 2014</xref>).</p>
<p>Behavioural evidence provides similar findings. A fast tempo was found to lead to overestimation, or “time dilation”, and a slow tempo to underestimation, or “time contraction”, in both auditory (<xref ref-type="bibr" rid="r107">Wang &amp; Shi, 2019</xref>) and visual perception (<xref ref-type="bibr" rid="r79">Ortega &amp; López, 2008</xref>). In addition, behavioural entrainment to external beats was found across age ranges and stimulus types, including auditory sequences and music excerpts (for a review, see <xref ref-type="bibr" rid="r89">Repp &amp; Su, 2013</xref>). The experimental paradigms usually provide participants with a rhythmic beat that ceases (or not) after a short period of entrainment and require them to continue tapping or moving along with the beats. <xref ref-type="bibr" rid="r11">Boasson and Granot (2012)</xref> adopted a paradigm of tapping to pitch rises and falls in multiple melodic sequences in order to examine the entrainment effect. In their study, musicians and non-musicians uniformly exhibited faster-paced tapping with rising pitch. This is consistent with other findings revealing no difference in predictive timing between musically trained and untrained groups (e.g., <xref ref-type="bibr" rid="r87">Repp, 2010</xref>), whereas other studies indicated that musicians (percussionists) exhibited better entrainment performance when engaged in intense beat production activities (<xref ref-type="bibr" rid="r27">Cicchini et al., 2012</xref>). These studies suggest that individuals actively entrain with external rhythms and perceive past durations accordingly, and they may provide evidence of the wide applicability of DAT.</p>
<p>Building upon the simple click paradigms discussed previously (<xref ref-type="bibr" rid="r104">Treisman et al., 1990</xref>), research in recent years has used naturalistic stimuli, since DAT is most applicable to music and speech. Periodic tone entrainment studies yielded new results: <xref ref-type="bibr" rid="r110">Wearden et al. (2017)</xref> replicated the residual effect of the classic click-train paradigm, that is, the higher the preceding click frequency, the longer the following duration is perceived to be. They also observed similar effects with irregular tones as well as white noise. This study thus revealed multiple ways to activate, and to speed up, the internal clock: periodic and aperiodic clicks, rhythmic visual flickers, and even white noise influenced the results. In addition, the entrainment effect was found to persist for as long as 8 s after hearing high-frequency clicks, indicating a latency between the activation and cessation of attentional pulse emission.</p>
<p>More complex stimuli such as music are processed similarly. Fast music was perceived to last longer than slow music owing to the accumulation of more temporal units. In a study using Mozart’s Sonata for Two Pianos (K. 448), participants tended to overestimate the duration when the excerpt was at the “fast” (120 BPM) end of the spectrum (<xref ref-type="bibr" rid="r107">Wang &amp; Shi, 2019</xref>). The effect is nevertheless subject to the allocation of attention. <xref ref-type="bibr" rid="r59">Keller and Burnham (2005)</xref> emphasized the flexibility of attention when listening to musical meter, which can comprise multiple metrical layers. Tracking high versus low metrical structures is therefore expected to have corresponding effects on psychological time (cf. <xref ref-type="bibr" rid="r49">Hammerschmidt &amp; Wöllner, 2020</xref>), as the former should hypothetically lead to fewer mental counts and thus time compression. Neurological evidence likewise indicated that focusing on different temporal structures led to alignments in steady-state evoked potential (SS-EP) frequencies deciphered from EEG recordings (<xref ref-type="bibr" rid="r77">Nozaradan et al., 2012</xref>). In this case, neural entrainment reflects that attending to local features of complex auditory signals can form mental representations of time by modulating the original neural oscillations.</p>
<p><xref ref-type="bibr" rid="r28">Cocenas-Silva et al. (2011)</xref> observed a time dilation effect when more attention was allocated to the temporal features of music. When participants were asked to group excerpts of various arousal levels by their estimated lengths, highly arousing excerpts tended to be overestimated. The finding is consistent with <xref ref-type="bibr" rid="r34">Droit-Volet et al.’s (2013)</xref> observation that faster music, thought to be more arousing, was judged to last longer than slower, less arousing music. We might reason that when individuals attend to the temporal features of auditory signals, the temporal entrainment effect is stronger than when they attend to other features such as chords and pleasantness.</p></sec>
<sec><title>Audiovisual Evidence for the Scalar Expectancy Theory</title>
<p>The following section examines evidence from multiple sensory modalities that either supports or challenges SET. To establish a solid ground for SET, researchers have sought evidence for Weber’s fraction, a constant ratio of timing variability to duration, across different sensory modalities, durations, populations, and other conditions. <xref ref-type="bibr" rid="r109">Wearden and Jones (2007)</xref> probed the scalar property of subjective timing using two variations of a duration comparison task with auditory tones ranging from 600 ms to 10 s. They found a linear increase of timing variability with duration, conforming to Weber’s law. The effect is consistent also in the visual domain: in a duration discrimination study, <xref ref-type="bibr" rid="r42">Grondin (2001)</xref> found that participants exhibited similar sensitivity to intervals marked by visual flickers between 600 and 900 ms, in accordance with Weber’s law. The ratio changed, however, when the inter-stimulus interval went beyond 900 ms, a violation possibly due to explicit counting.</p>
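<p>The test applied in these studies can be stated compactly: compute the Weber fraction (just-noticeable difference divided by base duration) at each interval and check whether it remains constant. The threshold values below are invented for illustration and do not come from the cited experiments:</p>

```python
def weber_fractions(durations_ms, jnds_ms):
    """Weber fraction (JND / base duration) at each interval."""
    return [jnd / dur for jnd, dur in zip(jnds_ms, durations_ms)]

def scalar_property_holds(durations_ms, jnds_ms, tol=0.02):
    """Scalar timing predicts a constant fraction; flag a violation
    when the spread of fractions exceeds the tolerance."""
    fractions = weber_fractions(durations_ms, jnds_ms)
    return max(fractions) - min(fractions) <= tol

# Hypothetical data: a flat ~6% fraction in the 600-900 ms range,
# but a fraction that climbs once intervals approach 2 s.
flat = scalar_property_holds([600, 750, 900], [36, 45, 54])         # True
rising = scalar_property_holds([1000, 1500, 1900], [60, 105, 171])  # False
```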
<p>Similarly, mixed findings have been reported in multimodal studies. Hypothetically, if the scalar property holds across modalities, one should expect a consistent linear increase in each modality. This was indeed the case when participants performed predictive saccades, or eye-movement timing, with intervals from 500 to 1000 ms presented either as visual flashes or as auditory tone flutters (<xref ref-type="bibr" rid="r53">Joiner et al., 2007</xref>). Comparing Weber’s ratios between the two modalities, however, revealed that auditory timing had greater variability than visual timing, as shown in participants’ reactive eye movements when tracking the periodic cues. One might hence deduce that the scalar property holds, but is also subject to stimulus modality. <xref ref-type="bibr" rid="r9">Block and Gruber (2014)</xref> argued that a cross-modal transfer effect is restricted to intervals below a 3 to 5 s window, beyond which automatic processing diminishes owing to the limited capacity of working memory.</p>
<p>On the other hand, evidence against the scalar property has been presented in auditory studies. <xref ref-type="bibr" rid="r45">Grondin (2012)</xref> adopted three approaches to measuring Weber’s ratio: duration discrimination, reproduction, and categorization tasks on a spectrum from 1 to 1.9 seconds using pure tones. In all three tasks, Weber’s ratio was higher for longer intervals, regardless of the number of interval repetitions (1, 3, or 5 times). These results indicate an inconsistency in Weber’s ratio, or temporal sensitivity, despite the different emphases of each paradigm on the timing process. <xref ref-type="bibr" rid="r43">Grondin (2010a)</xref> pointed out that the failure to conform to the pacemaker-counter model, on which SET is built, arose because the model no longer applied in this duration range (see <xref ref-type="fig" rid="f1">Figure 1</xref>). More specifically, a cut-off point at 1.2 to 1.3 s was observed, in line with observations from other studies (for a review, see <xref ref-type="bibr" rid="r68">Matthews &amp; Meck, 2014</xref>). The question is how time is processed beyond that point. Some researchers proposed that a learning effect might have altered the variance, as the brain was shaped by multiple exposures to the same interval (<xref ref-type="bibr" rid="r67">Matthews &amp; Grondin, 2012</xref>). Findings across timing tasks and sensory modalities nevertheless support the presence of a unitary clock system.</p>
<p>Despite the controversial evidence, reports investigating timing precision across multiple sensory modalities align with what striatal beat frequency theory proposes: a familiarity effect reflected in enhanced synaptic communication between neurons. This might lead to higher processing efficiency and smaller variability compared to unfamiliar intervals. <xref ref-type="bibr" rid="r45">Grondin’s (2012)</xref> experiments revealed that participants performed better in 3- and 5-interval discrimination than when only one interval was presented. Frequent exposure to timing tasks, as part of music training, may also point to the benefits of enhanced neural connections. In <xref ref-type="bibr" rid="r86">Rammsayer and Altenmüller’s (2006)</xref> study, musicians outperformed non-musicians in a perceptual timing task, showing less variance and thus higher temporal sensitivity, for instance in duration discrimination. Musicians, however, did not exhibit significant superiority in a temporal generalization task, in which participants compared the duration of an excerpt to a reference presented at the beginning, hypothetically stored in working memory. The authors attributed this to the intervals exceeding working memory capacities. This explanation is equally applicable to <xref ref-type="bibr" rid="r46">Grondin and Killeen’s (2009)</xref> results, in which participants in a reproduction-by-tapping task performed significantly better if they counted or sang, compared to doing nothing. It might thus be concluded that SET predicts timing performance only within short intervals of no more than 2 s (for a review, see <xref ref-type="bibr" rid="r52">Ivry &amp; Schlerf, 2008</xref>). Nevertheless, it is equally important to understand timing within a few notes as well as in larger musical structures such as phrases.</p></sec></sec>
<sec sec-type="conclusions"><title>Conclusions</title>
<p>This review has discussed two internal clock models: the intrinsic and the central clock models. The intrinsic model emphasizes automatic processing of temporal information in the sub-second range, while the central clock model explains the supra-second (seconds to minutes) range of timing, which demands higher levels of cognitive control. Somewhat controversially, the Scalar Expectancy Theory, which can be seen as a specific account of the central clock model, applies to timing in the seconds range only, while the Dynamic Attending Theory works for timing intervals from seconds to minutes. According to SET and DAT, short intervals are represented linearly through the accumulation of pacemaker pulses, while longer intervals are represented nonlinearly, as pulse emission is calibrated to align with external periodicity. As for intervals of hours and longer, the timing process is subject to contextual changes and memory segmentation, and relevant research is scarce.</p>
<p>Audition, among all modalities, shows superiority in temporal processing, exhibiting higher sensitivity in detecting changes and estimating interval lengths than vision and the other senses. In this sense, modality specificity supports a distributed timing mechanism. Yet more evidence is needed to explain the cross-modal transfer of training effects in, for example, duration discrimination. Despite years of debate on the superiority of one clock model over the other, there is, to the best of our knowledge, no conclusive evidence. We conclude that each model has its best fit at a different duration scale, and that the models differ as to whether discrete events (SET) or complex streams (DAT), such as in music, are at the core of the investigation.</p>
<p>Regarding the explanatory power of the internal clock models for the perception of musical time, it is therefore necessary to consider an interval-specific approach. Short-interval timing within the milliseconds range plays a crucial role in music production, such as expressive microtiming, whereas long-interval timing is more strongly modulated by attention, emotion, and working memory, consequently adding more variables to the equation. In this regard, timing paradigms set in ecologically valid environments such as music concerts, movies, or sports deserve more attention. Ways of applying clock models to longer-interval timing and time estimation are yet to be investigated.</p></sec>
</body>
<back><fn-group><fn fn-type="financial-disclosure"><label>Funding</label>
<p>This research was supported by a grant from the European Research Council to the second author (grant agreement: 725319) for the project ‘‘Slow motion: Transformations of musical time in perception and performance’’ (SloMo).</p></fn></fn-group>
<ref-list><title>References</title>
<ref id="r1"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Agus</surname><given-names>T. R.</given-names></name><name name-style="western"><surname>Thorpe</surname><given-names>S. J.</given-names></name><name name-style="western"><surname>Pressnitzer</surname><given-names>D.</given-names></name></person-group> (<year>2010</year>). <article-title>Rapid formation of robust auditory memories: Insights from noise.</article-title> <source>Neuron</source>, <volume>66</volume>(<issue>4</issue>), <fpage>610</fpage>–<lpage>618</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuron.2010.04.014</pub-id><pub-id pub-id-type="pmid">20510864</pub-id></mixed-citation></ref>
<ref id="r2"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Allman</surname><given-names>M. J.</given-names></name><name name-style="western"><surname>Meck</surname><given-names>W. H.</given-names></name></person-group> (<year>2012</year>). <article-title>Pathophysiological distortions in time perception and timed performance.</article-title> <source>Brain</source>, <volume>135</volume>(<issue>3</issue>), <fpage>656</fpage>–<lpage>677</lpage>. <pub-id pub-id-type="doi">10.1093/brain/awr210</pub-id><pub-id pub-id-type="pmid">21921020</pub-id></mixed-citation></ref>
<ref id="r3"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Allman</surname><given-names>M. J.</given-names></name><name name-style="western"><surname>Teki</surname><given-names>S.</given-names></name><name name-style="western"><surname>Griffiths</surname><given-names>T. D.</given-names></name><name name-style="western"><surname>Meck</surname><given-names>W. H.</given-names></name></person-group> (<year>2014</year>). <article-title>Properties of the internal clock: First-and second-order principles of subjective time.</article-title> <source>Annual Review of Psychology</source>, <volume>65</volume>, <fpage>743</fpage>–<lpage>771</lpage>. <pub-id pub-id-type="doi">10.1146/annurev-psych-010213-115117</pub-id><pub-id pub-id-type="pmid">24050187</pub-id></mixed-citation></ref>
<ref id="r4"><mixed-citation publication-type="book">Auhagen, W., &amp; Busch, V. (1998). The influence of articulation on listeners’ regulation of performed tempo. In R. Kopiez &amp; W. Auhagen (Eds.), <italic>Controlling creative processes in music</italic> (pp. 69–92). Bern, Switzerland: Peter Lang.</mixed-citation></ref>
<ref id="r5"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Barnes</surname><given-names>R.</given-names></name><name name-style="western"><surname>Jones</surname><given-names>M. R.</given-names></name></person-group> (<year>2000</year>). <article-title>Expectancy, attention, and time.</article-title> <source>Cognitive Psychology</source>, <volume>41</volume>(<issue>3</issue>), <fpage>254</fpage>–<lpage>311</lpage>. <pub-id pub-id-type="doi">10.1006/cogp.2000.0738</pub-id><pub-id pub-id-type="pmid">11032658</pub-id></mixed-citation></ref>
<ref id="r6"><mixed-citation publication-type="book">Barry, B. R. (1990). <italic>Musical time: the sense of order</italic>. Hillsdale, NY, USA: Pendragon Press.</mixed-citation></ref>
<ref id="r7"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Behne</surname><given-names>K. E.</given-names></name></person-group> (<year>1976</year>). <article-title>„Zeitmaße”: Zur Psychologie des musikalischen Tempoempfindens.</article-title> <source>Die Musikforschung</source>, <volume>29</volume>(<issue>2</issue>), <fpage>155</fpage>–<lpage>164</lpage>.</mixed-citation></ref>
<ref id="r8"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Bisson</surname><given-names>N.</given-names></name><name name-style="western"><surname>Tobin</surname><given-names>S.</given-names></name><name name-style="western"><surname>Grondin</surname><given-names>S.</given-names></name></person-group> (<year>2008</year>). <article-title>Remembering the duration of joyful and sad musical excerpts: Assessment with three estimation methods.</article-title> <source>NeuroQuantology</source>, <volume>7</volume>(<issue>1</issue>), <fpage>46</fpage>–<lpage>57</lpage>. <pub-id pub-id-type="doi">10.14704/nq.2009.7.1.206</pub-id></mixed-citation></ref>
<ref id="r9"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Block</surname><given-names>R. A.</given-names></name><name name-style="western"><surname>Gruber</surname><given-names>R. P.</given-names></name></person-group> (<year>2014</year>). <article-title>Time perception, attention, and memory: A selective review.</article-title> <source>Acta Psychologica</source>, <volume>149</volume>, <fpage>129</fpage>–<lpage>133</lpage>. <pub-id pub-id-type="doi">10.1016/j.actpsy.2013.11.003</pub-id><pub-id pub-id-type="pmid">24365036</pub-id></mixed-citation></ref>
<ref id="r10"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Block</surname><given-names>R. A.</given-names></name><name name-style="western"><surname>Zakay</surname><given-names>D.</given-names></name></person-group> (<year>1997</year>). <article-title>Prospective and retrospective duration judgments: A meta-analytic review.</article-title> <source>Psychonomic Bulletin &amp; Review</source>, <volume>4</volume>, <fpage>184</fpage>–<lpage>197</lpage>. <pub-id pub-id-type="doi">10.3758/BF03209393</pub-id><pub-id pub-id-type="pmid">21331825</pub-id></mixed-citation></ref>
<ref id="r11"><mixed-citation publication-type="confproc">Boasson, A. D., &amp; Granot, R. (2012). Melodic direction’s effect on tapping. In E. Cambouropoulos, C. Tsougras, P. Mavromatis, &amp; K. Pastiadis (Eds.), <italic>Proceedings of the 12th international conference on music perception and cognition, and the 8th triennial conference of the European society for the cognitive sciences of music</italic> (pp. 110–119). The joint conference ICMPC – ESCOM 2012, Thessaloniki, Greece. Retrieved from <ext-link ext-link-type="uri" xlink:href="http://icmpc-escom2012.web.auth.gr/files/papers/110_Proc.pdf">http://icmpc-escom2012.web.auth.gr/files/papers/110_Proc.pdf</ext-link></mixed-citation></ref>
<ref id="r12"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Bolger</surname><given-names>D.</given-names></name><name name-style="western"><surname>Trost</surname><given-names>W.</given-names></name><name name-style="western"><surname>Schön</surname><given-names>D.</given-names></name></person-group> (<year>2013</year>). <article-title>Rhythm implicitly affects temporal orienting of attention across modalities.</article-title> <source>Acta Psychologica</source>, <volume>142</volume>, <fpage>238</fpage>–<lpage>244</lpage>. <pub-id pub-id-type="doi">10.1016/j.actpsy.2012.11.012</pub-id><pub-id pub-id-type="pmid">23357092</pub-id></mixed-citation></ref>
<ref id="r13"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Boltz</surname><given-names>M. G.</given-names></name></person-group> (<year>1998</year>). <article-title>Tempo discrimination of musical patterns: Effects due to pitch and rhythmic structure.</article-title> <source>Perception &amp; Psychophysics</source>, <volume>60</volume>(<issue>8</issue>), <fpage>1357</fpage>–<lpage>1373</lpage>. <pub-id pub-id-type="doi">10.3758/BF03207998</pub-id><pub-id pub-id-type="pmid">9865077</pub-id></mixed-citation></ref>
<ref id="r14"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Boltz</surname><given-names>M. G.</given-names></name></person-group> (<year>2017</year>). <article-title>Auditory driving in cinematic art.</article-title> <source>Music Perception</source>, <volume>35</volume>(<issue>1</issue>), <fpage>77</fpage>–<lpage>93</lpage>. <pub-id pub-id-type="doi">10.1525/mp.2017.35.1.77</pub-id></mixed-citation></ref>
<ref id="r15"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Broadway</surname><given-names>J. M.</given-names></name><name name-style="western"><surname>Engle</surname><given-names>R. W.</given-names></name></person-group> (<year>2011</year>). <article-title>Individual differences in working memory capacity and temporal discrimination.</article-title> <source>PLOS ONE</source>, <volume>6</volume>(<issue>10</issue>), <elocation-id>e25422</elocation-id>. <pub-id pub-id-type="doi">10.1371/journal.pone.0025422</pub-id><pub-id pub-id-type="pmid">22003391</pub-id></mixed-citation></ref>
<ref id="r16"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Brochard</surname><given-names>R.</given-names></name><name name-style="western"><surname>Abecasis</surname><given-names>D.</given-names></name><name name-style="western"><surname>Potter</surname><given-names>D.</given-names></name><name name-style="western"><surname>Ragot</surname><given-names>R.</given-names></name><name name-style="western"><surname>Drake</surname><given-names>C.</given-names></name></person-group> (<year>2003</year>). <article-title>The “ticktock” of our internal clock: Direct brain evidence of subjective accents in isochronous sequences.</article-title> <source>Psychological Science</source>, <volume>14</volume>(<issue>4</issue>), <fpage>362</fpage>–<lpage>366</lpage>. <pub-id pub-id-type="doi">10.1111/1467-9280.24441</pub-id><pub-id pub-id-type="pmid">12807411</pub-id></mixed-citation></ref>
<ref id="r17"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Brown</surname><given-names>S. W.</given-names></name><name name-style="western"><surname>Boltz</surname><given-names>M. G.</given-names></name></person-group> (<year>2002</year>). <article-title>Attentional processes in time perception: Effects of mental workload and event structure.</article-title> <source>Journal of Experimental Psychology: Human Perception and Performance</source>, <volume>28</volume>(<issue>3</issue>), <fpage>600</fpage>–<lpage>615</lpage>. <pub-id pub-id-type="doi">10.1037/0096-1523.28.3.600</pub-id><pub-id pub-id-type="pmid">12075891</pub-id></mixed-citation></ref>
<ref id="r18"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Bueti</surname><given-names>D.</given-names></name><name name-style="western"><surname>van Dongen</surname><given-names>E. V.</given-names></name><name name-style="western"><surname>Walsh</surname><given-names>V.</given-names></name></person-group> (<year>2008</year>). <article-title>The role of superior temporal cortex in auditory timing.</article-title> <source>PLOS ONE</source>, <volume>3</volume>(<issue>6</issue>), <elocation-id>e2481</elocation-id>. <pub-id pub-id-type="doi">10.1371/journal.pone.0002481</pub-id><pub-id pub-id-type="pmid">18575615</pub-id></mixed-citation></ref>
<ref id="r19"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Buhusi</surname><given-names>C. V.</given-names></name><name name-style="western"><surname>Meck</surname><given-names>W. H.</given-names></name></person-group> (<year>2005</year>). <article-title>What makes us tick? Functional and neural mechanisms of interval timing.</article-title> <source>Nature Reviews: Neuroscience</source>, <volume>6</volume>(<issue>10</issue>), <fpage>755</fpage>–<lpage>765</lpage>. <pub-id pub-id-type="doi">10.1038/nrn1764</pub-id><pub-id pub-id-type="pmid">16163383</pub-id></mixed-citation></ref>
<ref id="r20"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Buonomano</surname><given-names>D. V.</given-names></name><name name-style="western"><surname>Bramen</surname><given-names>J.</given-names></name><name name-style="western"><surname>Khodadadifar</surname><given-names>M.</given-names></name></person-group> (<year>2009</year>). <article-title>Influence of the interstimulus interval on temporal processing and learning: Testing the state-dependent network model.</article-title> <source>Philosophical Transactions of the Royal Society of London. Series B</source>, <volume>364</volume>(<issue>1525</issue>), <fpage>1865</fpage>–<lpage>1873</lpage>. <pub-id pub-id-type="doi">10.1098/rstb.2009.0019</pub-id><pub-id pub-id-type="pmid">19487189</pub-id></mixed-citation></ref>
<ref id="r21"><mixed-citation publication-type="book">Buonomano, D. V., &amp; Laje, R. (2011). Population clocks: Motor timing with neural dynamics. In S. Dehaene, &amp; E. Brannon (Eds.), <italic>Space, time and number in the brain</italic> (pp. 71–85). Cambridge, MA, USA: Academic Press. <pub-id pub-id-type="doi">10.1016/B978-0-12-385948-8.00006-2</pub-id></mixed-citation></ref>
<ref id="r22"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Burger</surname><given-names>B.</given-names></name><name name-style="western"><surname>London</surname><given-names>J.</given-names></name><name name-style="western"><surname>Thompson</surname><given-names>M. R.</given-names></name><name name-style="western"><surname>Toiviainen</surname><given-names>P.</given-names></name></person-group> (<year>2018</year>). <article-title>Synchronization to metrical levels in music depends on low-frequency spectral components and tempo.</article-title> <source>Psychological Research</source>, <volume>82</volume>(<issue>6</issue>), <fpage>1195</fpage>–<lpage>1211</lpage>. <pub-id pub-id-type="doi">10.1007/s00426-017-0894-2</pub-id><pub-id pub-id-type="pmid">28712036</pub-id></mixed-citation></ref>
<ref id="r23"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Burr</surname><given-names>D.</given-names></name><name name-style="western"><surname>Della Rocca</surname><given-names>E.</given-names></name><name name-style="western"><surname>Morrone</surname><given-names>M. C.</given-names></name></person-group> (<year>2013</year>). <article-title>Contextual effects in interval-duration judgements in vision, audition and touch.</article-title> <source>Experimental Brain Research</source>, <volume>230</volume>(<issue>1</issue>), <fpage>87</fpage>–<lpage>98</lpage>. <pub-id pub-id-type="doi">10.1007/s00221-013-3632-z</pub-id><pub-id pub-id-type="pmid">23864044</pub-id></mixed-citation></ref>
<ref id="r24"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Burr</surname><given-names>D.</given-names></name><name name-style="western"><surname>Tozzi</surname><given-names>A.</given-names></name><name name-style="western"><surname>Morrone</surname><given-names>M. C.</given-names></name></person-group> (<year>2007</year>). <article-title>Neural mechanisms for timing visual events are spatially selective in real-world coordinates.</article-title> <source>Nature Neuroscience</source>, <volume>10</volume>(<issue>4</issue>), <fpage>423</fpage>–<lpage>425</lpage>. <pub-id pub-id-type="doi">10.1038/nn1874</pub-id><pub-id pub-id-type="pmid">17369824</pub-id></mixed-citation></ref>
<ref id="r25"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Carlson</surname><given-names>T.</given-names></name><name name-style="western"><surname>Tovar</surname><given-names>D. A.</given-names></name><name name-style="western"><surname>Alink</surname><given-names>A.</given-names></name><name name-style="western"><surname>Kriegeskorte</surname><given-names>N.</given-names></name></person-group> (<year>2013</year>). <article-title>Representational dynamics of object vision: The first 1000 ms.</article-title> <source>Journal of Vision</source>, <volume>13</volume>(<issue>10</issue>), <fpage>1</fpage>–<lpage>19</lpage>. <pub-id pub-id-type="doi">10.1167/13.10.1</pub-id><pub-id pub-id-type="pmid">23908380</pub-id></mixed-citation></ref>
<ref id="r26"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Chen</surname><given-names>L.</given-names></name><name name-style="western"><surname>Zhou</surname><given-names>X.</given-names></name><name name-style="western"><surname>Müller</surname><given-names>H. J.</given-names></name><name name-style="western"><surname>Shi</surname><given-names>Z.</given-names></name></person-group> (<year>2018</year>). <article-title>What you see depends on what you hear: Temporal averaging and crossmodal integration.</article-title> <source>Journal of Experimental Psychology: General</source>, <volume>147</volume>(<issue>12</issue>), <fpage>1851</fpage>–<lpage>1864</lpage>. <pub-id pub-id-type="doi">10.1037/xge0000487</pub-id><pub-id pub-id-type="pmid">30211578</pub-id></mixed-citation></ref>
<ref id="r27"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Cicchini</surname><given-names>G. M.</given-names></name><name name-style="western"><surname>Arrighi</surname><given-names>R.</given-names></name><name name-style="western"><surname>Cecchetti</surname><given-names>L.</given-names></name><name name-style="western"><surname>Giusti</surname><given-names>M.</given-names></name><name name-style="western"><surname>Burr</surname><given-names>D. C.</given-names></name></person-group> (<year>2012</year>). <article-title>Optimal encoding of interval timing in expert percussionists.</article-title> <source>The Journal of Neuroscience</source>, <volume>32</volume>(<issue>3</issue>), <fpage>1056</fpage>–<lpage>1060</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.3411-11.2012</pub-id><pub-id pub-id-type="pmid">22262903</pub-id></mixed-citation></ref>
<ref id="r28"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Cocenas-Silva</surname><given-names>R.</given-names></name><name name-style="western"><surname>Bueno</surname><given-names>J. L. O.</given-names></name><name name-style="western"><surname>Molin</surname><given-names>P.</given-names></name><name name-style="western"><surname>Bigand</surname><given-names>E.</given-names></name></person-group> (<year>2011</year>). <article-title>Multidimensional scaling of musical time estimations.</article-title> <source>Perceptual and Motor Skills</source>, <volume>112</volume>(<issue>3</issue>), <fpage>737</fpage>–<lpage>748</lpage>. <pub-id pub-id-type="doi">10.2466/11.24.PMS.112.3.737-748</pub-id><pub-id pub-id-type="pmid">21853763</pub-id></mixed-citation></ref>
<ref id="r29"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Colflesh</surname><given-names>G. J.</given-names></name><name name-style="western"><surname>Conway</surname><given-names>A. R.</given-names></name></person-group> (<year>2007</year>). <article-title>Individual differences in working memory capacity and divided attention in dichotic listening.</article-title> <source>Psychonomic Bulletin &amp; Review</source>, <volume>14</volume>(<issue>4</issue>), <fpage>699</fpage>–<lpage>703</lpage>. <pub-id pub-id-type="doi">10.3758/BF03196824</pub-id><pub-id pub-id-type="pmid">17972736</pub-id></mixed-citation></ref>
<ref id="r30"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Cravo</surname><given-names>A. M.</given-names></name><name name-style="western"><surname>Rohenkohl</surname><given-names>G.</given-names></name><name name-style="western"><surname>Wyart</surname><given-names>V.</given-names></name><name name-style="western"><surname>Nobre</surname><given-names>A. C.</given-names></name></person-group> (<year>2013</year>). <article-title>Temporal expectation enhances contrast sensitivity by phase entrainment of low-frequency oscillations in visual cortex.</article-title> <source>The Journal of Neuroscience</source>, <volume>33</volume>(<issue>9</issue>), <fpage>4002</fpage>–<lpage>4010</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.4675-12.2013</pub-id><pub-id pub-id-type="pmid">23447609</pub-id></mixed-citation></ref>
<ref id="r31"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Droit-Volet</surname><given-names>S.</given-names></name><name name-style="western"><surname>Berthon</surname><given-names>M.</given-names></name></person-group> (<year>2017</year>). <article-title>Emotion and implicit timing: The arousal effect.</article-title> <source>Frontiers in Psychology</source>, <volume>8</volume>, <elocation-id>176</elocation-id>. <pub-id pub-id-type="doi">10.3389/fpsyg.2017.00176</pub-id><pub-id pub-id-type="pmid">28261125</pub-id></mixed-citation></ref>
<ref id="r32"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Droit-Volet</surname><given-names>S.</given-names></name><name name-style="western"><surname>Bigand</surname><given-names>E.</given-names></name><name name-style="western"><surname>Ramos</surname><given-names>D.</given-names></name><name name-style="western"><surname>Bueno</surname><given-names>J. L. O.</given-names></name></person-group> (<year>2010</year>). <article-title>Time flies with music whatever its emotional valence.</article-title> <source>Acta Psychologica</source>, <volume>135</volume>, <fpage>226</fpage>–<lpage>232</lpage>. <pub-id pub-id-type="doi">10.1016/j.actpsy.2010.07.003</pub-id><pub-id pub-id-type="pmid">20674884</pub-id></mixed-citation></ref>
<ref id="r33"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Droit-Volet</surname><given-names>S.</given-names></name><name name-style="western"><surname>Fayolle</surname><given-names>S. L.</given-names></name><name name-style="western"><surname>Gil</surname><given-names>S.</given-names></name></person-group> (<year>2011</year>). <article-title>Emotion and time perception: Effects of film-induced mood.</article-title> <source>Frontiers in Integrative Neuroscience</source>, <volume>5</volume>, <elocation-id>33</elocation-id>. <pub-id pub-id-type="doi">10.3389/fnint.2011.00033</pub-id><pub-id pub-id-type="pmid">21886610</pub-id></mixed-citation></ref>
<ref id="r34"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Droit-Volet</surname><given-names>S.</given-names></name><name name-style="western"><surname>Fayolle</surname><given-names>S.</given-names></name><name name-style="western"><surname>Lamotte</surname><given-names>M.</given-names></name><name name-style="western"><surname>Gil</surname><given-names>S.</given-names></name></person-group> (<year>2013</year>). <article-title>Time, emotion and the embodiment of timing.</article-title> <source>Timing &amp; Time Perception</source>, <volume>1</volume>, <fpage>99</fpage>–<lpage>126</lpage>. <pub-id pub-id-type="doi">10.1163/22134468-00002004</pub-id></mixed-citation></ref>
<ref id="r35"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Eagleman</surname><given-names>D. M.</given-names></name><name name-style="western"><surname>Tse</surname><given-names>P. U.</given-names></name><name name-style="western"><surname>Buonomano</surname><given-names>D.</given-names></name><name name-style="western"><surname>Janssen</surname><given-names>P.</given-names></name><name name-style="western"><surname>Nobre</surname><given-names>A. C.</given-names></name><name name-style="western"><surname>Holcombe</surname><given-names>A. O.</given-names></name></person-group> (<year>2005</year>). <article-title>Time and the brain: How subjective time relates to neural time.</article-title> <source>The Journal of Neuroscience</source>, <volume>25</volume>(<issue>45</issue>), <fpage>10369</fpage>–<lpage>10371</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.3487-05.2005</pub-id><pub-id pub-id-type="pmid">16280574</pub-id></mixed-citation></ref>
<ref id="r36"><mixed-citation publication-type="confproc">Eerola, T., Luck, G., &amp; Toiviainen, P. (2006). An investigation of pre-schoolers’ corporeal synchronization with music. In M. Baroni, A. R. Addessi, R. Caterina, &amp; M. Costa (Eds.), <italic>Proceedings of the 9th international conference on music perception and cognition</italic> (pp. 472–476). The Society for Music Perception and Cognition and the European Society for the Cognitive Sciences of Music, Bologna, Italy.</mixed-citation></ref>
<ref id="r37"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Ellis</surname><given-names>M. C.</given-names></name></person-group> (<year>1991</year>). <article-title>Research note: Thresholds for detecting tempo change.</article-title> <source>Psychology of Music</source>, <volume>19</volume>(<issue>2</issue>), <fpage>164</fpage>–<lpage>169</lpage>. <pub-id pub-id-type="doi">10.1177/0305735691192007</pub-id></mixed-citation></ref>
<ref id="r38"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Escoffier</surname><given-names>N.</given-names></name><name name-style="western"><surname>Sheng</surname><given-names>D. Y. J.</given-names></name><name name-style="western"><surname>Schirmer</surname><given-names>A.</given-names></name></person-group> (<year>2010</year>). <article-title>Unattended musical beats enhance visual processing.</article-title> <source>Acta Psychologica</source>, <volume>135</volume>, <fpage>12</fpage>–<lpage>16</lpage>. <pub-id pub-id-type="doi">10.1016/j.actpsy.2010.04.005</pub-id><pub-id pub-id-type="pmid">20451167</pub-id></mixed-citation></ref>
<ref id="r39"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Gibbon</surname><given-names>J.</given-names></name></person-group> (<year>1977</year>). <article-title>Scalar Expectancy Theory and Weber’s law in animal timing.</article-title> <source>Psychological Review</source>, <volume>84</volume>(<issue>3</issue>), <fpage>279</fpage>–<lpage>325</lpage>. <pub-id pub-id-type="doi">10.1037/0033-295X.84.3.279</pub-id></mixed-citation></ref>
<ref id="r40"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Gibbon</surname><given-names>J.</given-names></name><name name-style="western"><surname>Church</surname><given-names>R. M.</given-names></name><name name-style="western"><surname>Meck</surname><given-names>W. H.</given-names></name></person-group> (<year>1984</year>). <article-title>Scalar timing in memory.</article-title> <source>Annals of the New York Academy of Sciences</source>, <volume>423</volume>, <fpage>52</fpage>–<lpage>77</lpage>. <pub-id pub-id-type="doi">10.1111/j.1749-6632.1984.tb23417.x</pub-id><pub-id pub-id-type="pmid">6588812</pub-id></mixed-citation></ref>
<ref id="r41"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Goudriaan</surname><given-names>J. C.</given-names></name></person-group> (<year>1921</year>). <article-title>Le rythme psychique dans ses rapports avec les fréquences cardiaque et respiratoire.</article-title> <source>Archives Néerlandaises de Physiologie</source>, <volume>6</volume>, <fpage>77</fpage>–<lpage>110</lpage>.</mixed-citation></ref>
<ref id="r42"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Grondin</surname><given-names>S.</given-names></name></person-group> (<year>2001</year>). <article-title>Discriminating time intervals presented in sequences marked by visual signals.</article-title> <source>Perception &amp; Psychophysics</source>, <volume>63</volume>(<issue>7</issue>), <fpage>1214</fpage>–<lpage>1228</lpage>. <pub-id pub-id-type="doi">10.3758/BF03194535</pub-id><pub-id pub-id-type="pmid">11766945</pub-id></mixed-citation></ref>
<ref id="r43"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Grondin</surname><given-names>S.</given-names></name></person-group> (<year>2010</year><comment>a</comment>). <article-title>Unequal Weber fractions for the categorization of brief temporal intervals.</article-title> <source>Attention, Perception &amp; Psychophysics</source>, <volume>72</volume>(<issue>5</issue>), <fpage>1422</fpage>–<lpage>1430</lpage>. <pub-id pub-id-type="doi">10.3758/APP.72.5.1422</pub-id><pub-id pub-id-type="pmid">20601721</pub-id></mixed-citation></ref>
<ref id="r44"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Grondin</surname><given-names>S.</given-names></name></person-group> (<year>2010</year><comment>b</comment>). <article-title>Timing and time perception: A review of recent behavioral and neuroscience findings and theoretical directions.</article-title> <source>Attention, Perception &amp; Psychophysics</source>, <volume>72</volume>(<issue>3</issue>), <fpage>561</fpage>–<lpage>582</lpage>. <pub-id pub-id-type="doi">10.3758/APP.72.3.561</pub-id><pub-id pub-id-type="pmid">20348562</pub-id></mixed-citation></ref>
<ref id="r45"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Grondin</surname><given-names>S.</given-names></name></person-group> (<year>2012</year>). <article-title>Violation of the scalar property for time perception between 1 and 2 seconds: Evidence from interval discrimination, reproduction, and categorization.</article-title> <source>Journal of Experimental Psychology: Human Perception and Performance</source>, <volume>38</volume>(<issue>4</issue>), <fpage>880</fpage>–<lpage>890</lpage>. <pub-id pub-id-type="doi">10.1037/a0027188</pub-id><pub-id pub-id-type="pmid">22390291</pub-id></mixed-citation></ref>
<ref id="r46"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Grondin</surname><given-names>S.</given-names></name><name name-style="western"><surname>Killeen</surname><given-names>P. R.</given-names></name></person-group> (<year>2009</year>). <article-title>Tracking time with song and count: Different Weber functions for musicians and nonmusicians.</article-title> <source>Attention, Perception &amp; Psychophysics</source>, <volume>71</volume>(<issue>7</issue>), <fpage>1649</fpage>–<lpage>1654</lpage>. <pub-id pub-id-type="doi">10.3758/APP.71.7.1649</pub-id><pub-id pub-id-type="pmid">19801624</pub-id></mixed-citation></ref>
<ref id="r47"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Gu</surname><given-names>B. M.</given-names></name><name name-style="western"><surname>van Rijn</surname><given-names>H.</given-names></name><name name-style="western"><surname>Meck</surname><given-names>W. H.</given-names></name></person-group> (<year>2015</year>). <article-title>Oscillatory multiplexing of neural population codes for interval timing and working memory.</article-title> <source>Neuroscience and Biobehavioral Reviews</source>, <volume>48</volume>, <fpage>160</fpage>–<lpage>185</lpage>. <pub-id pub-id-type="doi">10.1016/j.neubiorev.2014.10.008</pub-id><pub-id pub-id-type="pmid">25454354</pub-id></mixed-citation></ref>
<ref id="r48"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Guttman</surname><given-names>S. E.</given-names></name><name name-style="western"><surname>Gilroy</surname><given-names>L. A.</given-names></name><name name-style="western"><surname>Blake</surname><given-names>R.</given-names></name></person-group> (<year>2005</year>). <article-title>Hearing what the eyes see: Auditory encoding of visual temporal sequences.</article-title> <source>Psychological Science</source>, <volume>16</volume>(<issue>3</issue>), <fpage>228</fpage>–<lpage>235</lpage>. <pub-id pub-id-type="doi">10.1111/j.0956-7976.2005.00808.x</pub-id><pub-id pub-id-type="pmid">15733204</pub-id></mixed-citation></ref>
<ref id="r49"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Hammerschmidt</surname><given-names>D.</given-names></name><name name-style="western"><surname>Wöllner</surname><given-names>C.</given-names></name></person-group> (<year>2020</year>). <article-title>Sensorimotor synchronization with higher metrical levels in music shortens perceived time.</article-title> <source>Music Perception</source>, <volume>37</volume>(<issue>4</issue>), <fpage>263</fpage>–<lpage>277</lpage>. <pub-id pub-id-type="doi">10.1525/mp.2020.37.4.263</pub-id></mixed-citation></ref>
<ref id="r50"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Henry</surname><given-names>M. J.</given-names></name><name name-style="western"><surname>Herrmann</surname><given-names>B.</given-names></name><name name-style="western"><surname>Obleser</surname><given-names>J.</given-names></name></person-group> (<year>2014</year>). <article-title>Entrained neural oscillations in multiple frequency bands comodulate behavior.</article-title> <source>Proceedings of the National Academy of Sciences of the United States of America</source>, <volume>111</volume>(<issue>41</issue>), <fpage>14935</fpage>–<lpage>14940</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.1408741111</pub-id><pub-id pub-id-type="pmid">25267634</pub-id></mixed-citation></ref>
<ref id="r51"><mixed-citation publication-type="book">Herbert, R. (2012). <italic>Everyday music listening: Absorption, dissociation and trancing</italic>. Abingdon, United Kingdom: Routledge.</mixed-citation></ref>
<ref id="r52"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Ivry</surname><given-names>R. B.</given-names></name><name name-style="western"><surname>Schlerf</surname><given-names>J. E.</given-names></name></person-group> (<year>2008</year>). <article-title>Dedicated and intrinsic models of time perception.</article-title> <source>Trends in Cognitive Sciences</source>, <volume>12</volume>(<issue>7</issue>), <fpage>273</fpage>–<lpage>280</lpage>. <pub-id pub-id-type="doi">10.1016/j.tics.2008.04.002</pub-id><pub-id pub-id-type="pmid">18539519</pub-id></mixed-citation></ref>
<ref id="r53"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Joiner</surname><given-names>W. M.</given-names></name><name name-style="western"><surname>Lee</surname><given-names>J. E.</given-names></name><name name-style="western"><surname>Lasker</surname><given-names>A.</given-names></name><name name-style="western"><surname>Shelhamer</surname><given-names>M.</given-names></name></person-group> (<year>2007</year>). <article-title>An internal clock for predictive saccades is established identically by auditory or visual information.</article-title> <source>Vision Research</source>, <volume>47</volume>(<issue>12</issue>), <fpage>1645</fpage>–<lpage>1654</lpage>. <pub-id pub-id-type="doi">10.1016/j.visres.2007.02.013</pub-id><pub-id pub-id-type="pmid">17445858</pub-id></mixed-citation></ref>
<ref id="r54"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Jones</surname><given-names>M. R.</given-names></name></person-group> (<year>1981</year>). <article-title>Music as a stimulus for psychological motion: Part I. Some determinants of expectancies.</article-title> <source>Psychomusicology: Music, Mind, and Brain</source>, <volume>1</volume>(<issue>2</issue>), <fpage>34</fpage>–<lpage>51</lpage>. <pub-id pub-id-type="doi">10.1037/h0094282</pub-id></mixed-citation></ref>
<ref id="r55"><mixed-citation publication-type="book">Jones, M. R. (1990). Musical events and models of musical time. In R. A. Block (Ed.), <italic>Cognitive models of psychological time</italic> (pp. 207-240). NJ, USA: Lawrence Erlbaum Associates.</mixed-citation></ref>
<ref id="r56"><mixed-citation publication-type="book">Jones, M. R. (2010). Attending to sound patterns and the role of entrainment. In A. C., Nobre, &amp; J. T., Coull (Eds.), <italic>Attention and time</italic> (pp. 317-330). Oxford, United Kingdom: Oxford University Press. <pub-id pub-id-type="doi">10.1093/acprof:oso/9780199563456.003.0023</pub-id></mixed-citation></ref>
<ref id="r57"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Jones</surname><given-names>M. R.</given-names></name><name name-style="western"><surname>Boltz</surname><given-names>M.</given-names></name></person-group> (<year>1989</year>). <article-title>Dynamic attending and responses to time.</article-title> <source>Psychological Review</source>, <volume>96</volume>(<issue>3</issue>), <fpage>459</fpage>–<lpage>491</lpage>. <pub-id pub-id-type="doi">10.1037/0033-295X.96.3.459</pub-id><pub-id pub-id-type="pmid">2756068</pub-id></mixed-citation></ref>
<ref id="r58"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Karmarkar</surname><given-names>U. R.</given-names></name><name name-style="western"><surname>Buonomano</surname><given-names>D. V.</given-names></name></person-group> (<year>2007</year>). <article-title>Timing in the absence of clocks: Encoding time in neural network states.</article-title> <source>Neuron</source>, <volume>53</volume>(<issue>3</issue>), <fpage>427</fpage>–<lpage>438</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuron.2007.01.006</pub-id><pub-id pub-id-type="pmid">17270738</pub-id></mixed-citation></ref>
<ref id="r59"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Keller</surname><given-names>P. E.</given-names></name><name name-style="western"><surname>Burnham</surname><given-names>D. K.</given-names></name></person-group> (<year>2005</year>). <article-title>Musical meter in attention to multipart rhythm.</article-title> <source>Music Perception</source>, <volume>22</volume>(<issue>4</issue>), <fpage>629</fpage>–<lpage>661</lpage>. <pub-id pub-id-type="doi">10.1525/mp.2005.22.4.629</pub-id></mixed-citation></ref>
<ref id="r60"><mixed-citation publication-type="book">Langner, J. (2002). <italic>Musikalischer Rhythmus und Oszillation</italic> (Vol. 13). Bern, Switzerland: Peter Lang Publishing.</mixed-citation></ref>
<ref id="r61"><mixed-citation publication-type="book">Large, E. W. (2008). <italic>Resonating to musical rhythm: Theory and experiment</italic>. In S. Grondin (Ed.), <italic>Psychology of time</italic> (pp. 189-232). Bingley, United Kingdom: Emerald Group Publishing. <pub-id pub-id-type="doi">10.1016/B978-0-08046-977-5.00006-5</pub-id></mixed-citation></ref>
<ref id="r62"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Large</surname><given-names>E. W.</given-names></name><name name-style="western"><surname>Jones</surname><given-names>M. R.</given-names></name></person-group> (<year>1999</year>). <article-title>The dynamics of attending: How people track time-varying events.</article-title> <source>Psychological Review</source>, <volume>106</volume>(<issue>1</issue>), <fpage>119</fpage>–<lpage>159</lpage>. <pub-id pub-id-type="doi">10.1037/0033-295X.106.1.119</pub-id></mixed-citation></ref>
<ref id="r63"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Lieb</surname><given-names>E. H.</given-names></name><name name-style="western"><surname>Yngvason</surname><given-names>J.</given-names></name></person-group> (<year>1999</year>). <article-title>The physics and mathematics of the second law of thermodynamics.</article-title> <source>Physics Reports</source>, <volume>310</volume>, <fpage>1</fpage>–<lpage>96</lpage>. <pub-id pub-id-type="doi">10.1016/S0370-1573(98)00082-9</pub-id></mixed-citation></ref>
<ref id="r64"><mixed-citation publication-type="book">London, J. (2004). <italic>Hearing in time: Psychological aspects of musical meter</italic>. Oxford, United Kingdom: Oxford University Press. <pub-id pub-id-type="doi">10.1093/acprof:oso/9780195160819.001.0001</pub-id></mixed-citation></ref>
<ref id="r65"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Lustig</surname><given-names>C.</given-names></name><name name-style="western"><surname>Meck</surname><given-names>W. H.</given-names></name></person-group> (<year>2011</year>). <article-title>Modality differences in timing and temporal memory throughout the lifespan.</article-title> <source>Brain and Cognition</source>, <volume>77</volume>(<issue>2</issue>), <fpage>298</fpage>–<lpage>303</lpage>. <pub-id pub-id-type="doi">10.1016/j.bandc.2011.07.007</pub-id><pub-id pub-id-type="pmid">21843912</pub-id></mixed-citation></ref>
<ref id="r66"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Matell</surname><given-names>M. S.</given-names></name><name name-style="western"><surname>Meck</surname><given-names>W. H.</given-names></name><name name-style="western"><surname>Nicolelis</surname><given-names>M. A.</given-names></name></person-group> (<year>2003</year>). <article-title>Interval timing and the encoding of signal duration by ensembles of cortical and striatal neurons.</article-title> <source>Behavioral Neuroscience</source>, <volume>117</volume>(<issue>4</issue>), <fpage>760</fpage>–<lpage>773</lpage>. <pub-id pub-id-type="doi">10.1037/0735-7044.117.4.760</pub-id><pub-id pub-id-type="pmid">12931961</pub-id></mixed-citation></ref>
<ref id="r67"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Matthews</surname><given-names>W. J.</given-names></name><name name-style="western"><surname>Grondin</surname><given-names>S.</given-names></name></person-group> (<year>2012</year>). <article-title>On the replication of Kristofferson’s (1980) quantal timing for duration discrimination: Some learning but no quanta and not much of a Weber constant.</article-title> <source>Attention, Perception &amp; Psychophysics</source>, <volume>74</volume>(<issue>5</issue>), <fpage>1056</fpage>–<lpage>1072</lpage>. <pub-id pub-id-type="doi">10.3758/s13414-012-0282-3</pub-id><pub-id pub-id-type="pmid">22391892</pub-id></mixed-citation></ref>
<ref id="r68"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Matthews</surname><given-names>W. J.</given-names></name><name name-style="western"><surname>Meck</surname><given-names>W. H.</given-names></name></person-group> (<year>2014</year>). <article-title>Time perception: The bad news and the good.</article-title> <source>Cognitive Science</source>, <volume>5</volume>(<issue>4</issue>), <fpage>429</fpage>–<lpage>446</lpage>. <pub-id pub-id-type="doi">10.1002/wcs.1298</pub-id><pub-id pub-id-type="pmid">25210578</pub-id></mixed-citation></ref>
<ref id="r69"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>McDermott</surname><given-names>J. H.</given-names></name><name name-style="western"><surname>Wrobleski</surname><given-names>D.</given-names></name><name name-style="western"><surname>Oxenham</surname><given-names>A. J.</given-names></name></person-group> (<year>2011</year>). <article-title>Recovering sound sources from embedded repetition.</article-title> <source>Proceedings of the National Academy of Sciences of the United States of America</source>, <volume>108</volume>(<issue>3</issue>), <fpage>1188</fpage>–<lpage>1193</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.1004765108</pub-id><pub-id pub-id-type="pmid">21199948</pub-id></mixed-citation></ref>
<ref id="r70"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>McPherson</surname><given-names>T.</given-names></name><name name-style="western"><surname>Berger</surname><given-names>D.</given-names></name><name name-style="western"><surname>Alagapan</surname><given-names>S.</given-names></name><name name-style="western"><surname>Fröhlich</surname><given-names>F.</given-names></name></person-group> (<year>2018</year>). <article-title>Intrinsic rhythmicity predicts synchronization-continuation entrainment performance.</article-title> <source>Scientific Reports</source>, <volume>8</volume>(<issue>1</issue>), <elocation-id>11782</elocation-id>. <pub-id pub-id-type="doi">10.1038/s41598-018-29267-z</pub-id><pub-id pub-id-type="pmid">30082734</pub-id></mixed-citation></ref>
<ref id="r71"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Meck</surname><given-names>W. H.</given-names></name></person-group> (<year>1984</year>). <article-title>Attentional bias between modalities: Effect on the internal clock, memory, and decision stages used in animal time discrimination.</article-title> <source>Annals of the New York Academy of Sciences</source>, <volume>423</volume>(<issue>1</issue>), <fpage>528</fpage>–<lpage>541</lpage>. <pub-id pub-id-type="doi">10.1111/j.1749-6632.1984.tb23457.x</pub-id><pub-id pub-id-type="pmid">6588813</pub-id></mixed-citation></ref>
<ref id="r72"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Merchant</surname><given-names>H.</given-names></name><name name-style="western"><surname>Harrington</surname><given-names>D. L.</given-names></name><name name-style="western"><surname>Meck</surname><given-names>W. H.</given-names></name></person-group> (<year>2013</year>). <article-title>Neural basis of the perception and estimation of time.</article-title> <source>Annual Review of Neuroscience</source>, <volume>36</volume>, <fpage>313</fpage>–<lpage>336</lpage>. <pub-id pub-id-type="doi">10.1146/annurev-neuro-062012-170349</pub-id><pub-id pub-id-type="pmid">23725000</pub-id></mixed-citation></ref>
<ref id="r73"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Miall</surname><given-names>C.</given-names></name></person-group> (<year>1989</year>). <article-title>The storage of time intervals using oscillating neurons.</article-title> <source>Neural Computation</source>, <volume>1</volume>, <fpage>359</fpage>–<lpage>371</lpage>. <pub-id pub-id-type="doi">10.1162/neco.1989.1.3.359</pub-id></mixed-citation></ref>
<ref id="r74"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Nagarajan</surname><given-names>S. S.</given-names></name><name name-style="western"><surname>Blake</surname><given-names>D. T.</given-names></name><name name-style="western"><surname>Wright</surname><given-names>B. A.</given-names></name><name name-style="western"><surname>Byl</surname><given-names>N.</given-names></name><name name-style="western"><surname>Merzenich</surname><given-names>M. M.</given-names></name></person-group> (<year>1998</year>). <article-title>Practice-related improvements in somatosensory interval discrimination are temporally specific but generalize across skin location, hemisphere, and modality.</article-title> <source>The Journal of Neuroscience : The Official Journal of the Society for Neuroscience</source>, <volume>18</volume>(<issue>4</issue>), <fpage>1559</fpage>–<lpage>1570</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.18-04-01559.1998</pub-id><pub-id pub-id-type="pmid">9454861</pub-id></mixed-citation></ref>
<ref id="r75"><mixed-citation publication-type="book">Nichols, R. (2011). <italic>Ravel</italic>. London, United Kingdom: Yale University Press. <pub-id pub-id-type="doi">10.1093/ml/gcs082</pub-id></mixed-citation></ref>
<ref id="r76"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Nozaradan</surname><given-names>S.</given-names></name></person-group> (<year>2014</year>). <article-title>Exploring how musical rhythm entrains brain activity with electroencephalogram frequency-tagging.</article-title> <source>Philosophical Transactions of the Royal Society of London. Series B</source>, <volume>369</volume>, <elocation-id>20130393</elocation-id>. <pub-id pub-id-type="doi">10.1098/rstb.2013.0393</pub-id><pub-id pub-id-type="pmid">25385771</pub-id></mixed-citation></ref>
<ref id="r77"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Nozaradan</surname><given-names>S.</given-names></name><name name-style="western"><surname>Peretz</surname><given-names>I.</given-names></name><name name-style="western"><surname>Mouraux</surname><given-names>A.</given-names></name></person-group> (<year>2012</year>). <article-title>Selective neuronal entrainment to the beat and meter embedded in a musical rhythm.</article-title> <source>The Journal of Neuroscience</source>, <volume>32</volume>(<issue>49</issue>), <fpage>17572</fpage>–<lpage>17581</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.3203-12.2012</pub-id><pub-id pub-id-type="pmid">23223281</pub-id></mixed-citation></ref>
<ref id="r78"><mixed-citation publication-type="book">Ornstein, R. E. (1969). <italic>On the experience of time</italic>. London, United Kingdom: Penguin Publisher.</mixed-citation></ref>
<ref id="r79"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Ortega</surname><given-names>L.</given-names></name><name name-style="western"><surname>López</surname><given-names>F.</given-names></name></person-group> (<year>2008</year>). <article-title>Effects of visual flicker on subjective time in a temporal bisection task.</article-title> <source>Behavioural Processes</source>, <volume>78</volume>(<issue>3</issue>), <fpage>380</fpage>–<lpage>386</lpage>. <pub-id pub-id-type="doi">10.1016/j.beproc.2008.02.004</pub-id><pub-id pub-id-type="pmid">18358636</pub-id></mixed-citation></ref>
<ref id="r80"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Penney</surname><given-names>T. B.</given-names></name><name name-style="western"><surname>Gibbon</surname><given-names>J.</given-names></name><name name-style="western"><surname>Meck</surname><given-names>W. H.</given-names></name></person-group> (<year>2000</year>). <article-title>Differential effects of auditory and visual signals on clock speed and temporal memory.</article-title> <source>Journal of Experimental Psychology: Human Perception and Performance</source>, <volume>26</volume>(<issue>6</issue>), <fpage>1770</fpage>–<lpage>1787</lpage>. <pub-id pub-id-type="doi">10.1037/0096-1523.26.6.1770</pub-id><pub-id pub-id-type="pmid">11129373</pub-id></mixed-citation></ref>
<ref id="r81"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Phillips</surname><given-names>D. P.</given-names></name><name name-style="western"><surname>Hall</surname><given-names>S. E.</given-names></name></person-group> (<year>2002</year>). <article-title>Auditory temporal gap detection for noise markers with partially overlapping and non-overlapping spectra.</article-title> <source>Hearing Research</source>, <volume>174</volume>(<issue>1-2</issue>), <fpage>133</fpage>–<lpage>141</lpage>. <pub-id pub-id-type="doi">10.1016/S0378-5955(02)00647-0</pub-id><pub-id pub-id-type="pmid">12433404</pub-id></mixed-citation></ref>
<ref id="r82"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Polti</surname><given-names>I.</given-names></name><name name-style="western"><surname>Martin</surname><given-names>B.</given-names></name><name name-style="western"><surname>van Wassenhove</surname><given-names>V.</given-names></name></person-group> (<year>2018</year>). <article-title>The effect of attention and working memory on the estimation of elapsed time.</article-title> <source>Scientific Reports</source>, <volume>8</volume>(<issue>1</issue>), <elocation-id>6690</elocation-id>. <pub-id pub-id-type="doi">10.1038/s41598-018-25119-y</pub-id><pub-id pub-id-type="pmid">29703928</pub-id></mixed-citation></ref>
<ref id="r83"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Pöppel</surname><given-names>E.</given-names></name></person-group> (<year>1989</year>). <article-title>The measurement of music and the cerebral clock: A new theory.</article-title> <source>Leonardo</source>, <volume>22</volume>, <fpage>83</fpage>–<lpage>89</lpage>. <pub-id pub-id-type="doi">10.2307/1575145</pub-id></mixed-citation></ref>
<ref id="r84"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Povel</surname><given-names>D. J.</given-names></name><name name-style="western"><surname>Essens</surname><given-names>P.</given-names></name></person-group> (<year>1985</year>). <article-title>Perception of temporal patterns.</article-title> <source>Music Perception</source>, <volume>2</volume>(<issue>4</issue>), <fpage>411</fpage>–<lpage>440</lpage>. <pub-id pub-id-type="doi">10.2307/40285311</pub-id></mixed-citation></ref>
<ref id="r85"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Pressnitzer</surname><given-names>D.</given-names></name><name name-style="western"><surname>Suied</surname><given-names>C.</given-names></name><name name-style="western"><surname>Shamma</surname><given-names>S.</given-names></name></person-group> (<year>2011</year>). <article-title>Auditory scene analysis: The sweet music of ambiguity.</article-title> <source>Frontiers in Human Neuroscience</source>, <volume>5</volume>, <elocation-id>158</elocation-id>. <pub-id pub-id-type="doi">10.3389/fnhum.2011.00158</pub-id><pub-id pub-id-type="pmid">22174701</pub-id></mixed-citation></ref>
<ref id="r86"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Rammsayer</surname><given-names>T.</given-names></name><name name-style="western"><surname>Altenmüller</surname><given-names>E.</given-names></name></person-group> (<year>2006</year>). <article-title>Temporal information processing in musicians and nonmusicians.</article-title> <source>Music Perception</source>, <volume>24</volume>(<issue>1</issue>), <fpage>37</fpage>–<lpage>48</lpage>. <pub-id pub-id-type="doi">10.1525/mp.2006.24.1.37</pub-id></mixed-citation></ref>
<ref id="r87"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Repp</surname><given-names>B. H.</given-names></name></person-group> (<year>2010</year>). <article-title>Sensorimotor synchronization and perception of timing: Effects of music training and task experience.</article-title> <source>Human Movement Science</source>, <volume>29</volume>(<issue>2</issue>), <fpage>200</fpage>–<lpage>213</lpage>. <pub-id pub-id-type="doi">10.1016/j.humov.2009.08.002</pub-id><pub-id pub-id-type="pmid">20074825</pub-id></mixed-citation></ref>
<ref id="r88"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Repp</surname><given-names>B. H.</given-names></name><name name-style="western"><surname>Penel</surname><given-names>A.</given-names></name></person-group> (<year>2002</year>). <article-title>Auditory dominance in temporal processing: New evidence from synchronization with simultaneous visual and auditory sequences.</article-title> <source>Journal of Experimental Psychology: Human Perception and Performance</source>, <volume>28</volume>(<issue>5</issue>), <fpage>1085</fpage>–<lpage>1099</lpage>. <pub-id pub-id-type="doi">10.1037/0096-1523.28.5.1085</pub-id><pub-id pub-id-type="pmid">12421057</pub-id></mixed-citation></ref>
<ref id="r89"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Repp</surname><given-names>B. H.</given-names></name><name name-style="western"><surname>Su</surname><given-names>Y. H.</given-names></name></person-group> (<year>2013</year>). <article-title>Sensorimotor synchronization: A review of recent research (2006–2012).</article-title> <source>Psychonomic Bulletin &amp; Review</source>, <volume>20</volume>(<issue>3</issue>), <fpage>403</fpage>–<lpage>452</lpage>. <pub-id pub-id-type="doi">10.3758/s13423-012-0371-2</pub-id><pub-id pub-id-type="pmid">23397235</pub-id></mixed-citation></ref>
<ref id="r90"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Schäfer</surname><given-names>T.</given-names></name><name name-style="western"><surname>Fachner</surname><given-names>J.</given-names></name><name name-style="western"><surname>Smukalla</surname><given-names>M.</given-names></name></person-group> (<year>2013</year>). <article-title>Changes in the representation of space and time while listening to music.</article-title> <source>Frontiers in Psychology</source>, <volume>4</volume>, <elocation-id>508</elocation-id>. <pub-id pub-id-type="doi">10.3389/fpsyg.2013.00508</pub-id><pub-id pub-id-type="pmid">23964254</pub-id></mixed-citation></ref>
<ref id="r91"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Schnupp</surname><given-names>J. W.</given-names></name><name name-style="western"><surname>Hall</surname><given-names>T. M.</given-names></name><name name-style="western"><surname>Kokelaar</surname><given-names>R. F.</given-names></name><name name-style="western"><surname>Ahmed</surname><given-names>B.</given-names></name></person-group> (<year>2006</year>). <article-title>Plasticity of temporal pattern codes for vocalization stimuli in primary auditory cortex.</article-title> <source>The Journal of Neuroscience</source>, <volume>26</volume>(<issue>18</issue>), <fpage>4785</fpage>–<lpage>4795</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.4330-05.2006</pub-id><pub-id pub-id-type="pmid">16672651</pub-id></mixed-citation></ref>
<ref id="r92"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Schroeder</surname><given-names>C. E.</given-names></name><name name-style="western"><surname>Lakatos</surname><given-names>P.</given-names></name></person-group> (<year>2009</year>). <article-title>Low-frequency neuronal oscillations as instruments of sensory selection.</article-title> <source>Trends in Neurosciences</source>, <volume>32</volume>(<issue>1</issue>), <fpage>9</fpage>–<lpage>18</lpage>. <pub-id pub-id-type="doi">10.1016/j.tins.2008.09.012</pub-id><pub-id pub-id-type="pmid">19012975</pub-id></mixed-citation></ref>
<ref id="r93"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Schulze</surname><given-names>H. H.</given-names></name></person-group> (<year>1978</year>). <article-title>The detectability of local and global displacements in regular rhythmic patterns.</article-title> <source>Psychological Research</source>, <volume>40</volume>(<issue>2</issue>), <fpage>173</fpage>–<lpage>181</lpage>. <pub-id pub-id-type="doi">10.1007/BF00308412</pub-id><pub-id pub-id-type="pmid">693733</pub-id></mixed-citation></ref>
<ref id="r94"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Schwartze</surname><given-names>M.</given-names></name><name name-style="western"><surname>Keller</surname><given-names>P. E.</given-names></name><name name-style="western"><surname>Patel</surname><given-names>A. D.</given-names></name><name name-style="western"><surname>Kotz</surname><given-names>S. A.</given-names></name></person-group> (<year>2011</year>). <article-title>The impact of basal ganglia lesions on sensorimotor synchronization, spontaneous motor tempo, and the detection of tempo changes.</article-title> <source>Behavioural Brain Research</source>, <volume>216</volume>(<issue>2</issue>), <fpage>685</fpage>–<lpage>691</lpage>. <pub-id pub-id-type="doi">10.1016/j.bbr.2010.09.015</pub-id><pub-id pub-id-type="pmid">20883725</pub-id></mixed-citation></ref>
<ref id="r95"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Shamma</surname><given-names>S. A.</given-names></name><name name-style="western"><surname>Micheyl</surname><given-names>C.</given-names></name></person-group> (<year>2010</year>). <article-title>Behind the scenes of auditory perception.</article-title> <source>Current Opinion in Neurobiology</source>, <volume>20</volume>(<issue>3</issue>), <fpage>361</fpage>–<lpage>366</lpage>. <pub-id pub-id-type="doi">10.1016/j.conb.2010.03.009</pub-id><pub-id pub-id-type="pmid">20456940</pub-id></mixed-citation></ref>
<ref id="r96"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Shams</surname><given-names>L.</given-names></name><name name-style="western"><surname>Kamitani</surname><given-names>Y.</given-names></name><name name-style="western"><surname>Shimojo</surname><given-names>S.</given-names></name></person-group> (<year>2002</year>). <article-title>Visual illusion induced by sound.</article-title> <source>Brain Research: Cognitive Brain Research</source>, <volume>14</volume>(<issue>1</issue>), <fpage>147</fpage>–<lpage>152</lpage>. <pub-id pub-id-type="doi">10.1016/S0926-6410(02)00069-1</pub-id><pub-id pub-id-type="pmid">12063138</pub-id></mixed-citation></ref>
<ref id="r97"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Shipley</surname><given-names>T.</given-names></name></person-group> (<year>1964</year>). <article-title>Auditory flutter-driving of visual flicker.</article-title> <source>Science</source>, <volume>145</volume>(<issue>3638</issue>), <fpage>1328</fpage>–<lpage>1330</lpage>. <pub-id pub-id-type="doi">10.1126/science.145.3638.1328</pub-id><pub-id pub-id-type="pmid">14173429</pub-id></mixed-citation></ref>
<ref id="r98"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Stern</surname><given-names>L. W.</given-names></name></person-group> (<year>1897</year>). <article-title>Psychische Präsenzzeit.</article-title> <source>Zeitschrift für Psychologie und Physiologie der Sinnesorgane</source>, <volume>13</volume>, <fpage>325</fpage>–<lpage>349</lpage>.</mixed-citation></ref>
<ref id="r99"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Teki</surname><given-names>S.</given-names></name><name name-style="western"><surname>Griffiths</surname><given-names>T. D.</given-names></name></person-group> (<year>2014</year>). <article-title>Working memory for time intervals in auditory rhythmic sequences.</article-title> <source>Frontiers in Psychology</source>, <volume>5</volume>, <elocation-id>1329</elocation-id>. <pub-id pub-id-type="doi">10.3389/fpsyg.2014.01329</pub-id><pub-id pub-id-type="pmid">25477849</pub-id></mixed-citation></ref>
<ref id="r100"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Tobin</surname><given-names>S.</given-names></name><name name-style="western"><surname>Bisson</surname><given-names>N.</given-names></name><name name-style="western"><surname>Grondin</surname><given-names>S.</given-names></name></person-group> (<year>2010</year>). <article-title>An ecological approach to prospective and retrospective timing of long durations: A study involving gamers.</article-title> <source>PLOS ONE</source>, <volume>5</volume>(<issue>2</issue>), <elocation-id>e9271</elocation-id>. <pub-id pub-id-type="doi">10.1371/journal.pone.0009271</pub-id><pub-id pub-id-type="pmid">20174648</pub-id></mixed-citation></ref>
<ref id="r101"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Toiviainen</surname><given-names>P.</given-names></name><name name-style="western"><surname>Burunat</surname><given-names>I.</given-names></name><name name-style="western"><surname>Brattico</surname><given-names>E.</given-names></name><name name-style="western"><surname>Vuust</surname><given-names>P.</given-names></name><name name-style="western"><surname>Alluri</surname><given-names>V.</given-names></name></person-group> (<year>2019</year>). <article-title>The chronnectome of musical beat.</article-title> <source>NeuroImage</source>, <elocation-id>116191</elocation-id>. <comment>Advance online publication</comment>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2019.116191</pub-id><pub-id pub-id-type="pmid">31525500</pub-id></mixed-citation></ref>
<ref id="r102"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Treisman</surname><given-names>M.</given-names></name></person-group> (<year>1963</year>). <article-title>Temporal discrimination and the indifference interval: Implications for a model of the "internal clock".</article-title> <source>Psychological Monographs</source>, <volume>77</volume>(<issue>13</issue>), <fpage>1</fpage>–<lpage>31</lpage>. <pub-id pub-id-type="doi">10.1037/h0093864</pub-id><pub-id pub-id-type="pmid">5877542</pub-id></mixed-citation></ref>
<ref id="r103"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Treisman</surname><given-names>M.</given-names></name><name name-style="western"><surname>Brogan</surname><given-names>D.</given-names></name></person-group> (<year>1992</year>). <article-title>Time perception and the internal clock: Effects of visual flicker on the temporal oscillator.</article-title> <source>The European Journal of Cognitive Psychology</source>, <volume>4</volume>(<issue>1</issue>), <fpage>41</fpage>–<lpage>70</lpage>. <pub-id pub-id-type="doi">10.1080/09541449208406242</pub-id></mixed-citation></ref>
<ref id="r104"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Treisman</surname><given-names>M.</given-names></name><name name-style="western"><surname>Faulkner</surname><given-names>A.</given-names></name><name name-style="western"><surname>Naish</surname><given-names>P. L.</given-names></name><name name-style="western"><surname>Brogan</surname><given-names>D.</given-names></name></person-group> (<year>1990</year>). <article-title>The internal clock: Evidence for a temporal oscillator underlying time perception with some estimates of its characteristic frequency.</article-title> <source>Perception</source>, <volume>19</volume>(<issue>6</issue>), <fpage>705</fpage>–<lpage>742</lpage>. <pub-id pub-id-type="doi">10.1068/p190705</pub-id><pub-id pub-id-type="pmid">2130371</pub-id></mixed-citation></ref>
<ref id="r105"><mixed-citation publication-type="book">van Rijn, H., Gu, B. M., &amp; Meck, W. H. (2014). Dedicated clock/timing-circuit theories of time perception and timed performance. In H. Merchant &amp; V. de Lafuente (Eds.), <italic>Neurobiology of interval timing</italic> (pp. 75–99). Berlin, Germany: Springer. <pub-id pub-id-type="doi">10.1007/978-1-4939-1782-2_5</pub-id></mixed-citation></ref>
<ref id="r106"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>van Wassenhove</surname><given-names>V.</given-names></name><name name-style="western"><surname>Buonomano</surname><given-names>D. V.</given-names></name><name name-style="western"><surname>Shimojo</surname><given-names>S.</given-names></name><name name-style="western"><surname>Shams</surname><given-names>L.</given-names></name></person-group> (<year>2008</year>). <article-title>Distortions of subjective time perception within and across senses.</article-title> <source>PLOS ONE</source>, <volume>3</volume>(<issue>1</issue>), <elocation-id>e1437</elocation-id>. <pub-id pub-id-type="doi">10.1371/journal.pone.0001437</pub-id><pub-id pub-id-type="pmid">18197248</pub-id></mixed-citation></ref>
<ref id="r107"><mixed-citation publication-type="confproc">Wang, X., &amp; Shi, Z. (2019, September 10–12). <italic>Temporal entrainment effect: Can music enhance our attention resolution in time?</italic> [Poster presentation]. 12th International Conference of Students of Systematic Musicology (SysMus), Berlin, Germany.</mixed-citation></ref>
<ref id="r108"><mixed-citation publication-type="confproc">Wang, X., Wöllner, C., &amp; Shi, Z. (2019, September 6–8). <italic>Perceiving tempo in incongruent audiovisual contexts: An exploratory study with a temporal bisection paradigm</italic> [Poster presentation]. Jahrestagung der Deutschen Gesellschaft für Musikpsychologie, Eichstätt, Germany.</mixed-citation></ref>
<ref id="r109"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Wearden</surname><given-names>J. H.</given-names></name><name name-style="western"><surname>Jones</surname><given-names>L. A.</given-names></name></person-group> (<year>2007</year>). <article-title>Is the growth of subjective time in humans a linear or nonlinear function of real time?</article-title> <source>Quarterly Journal of Experimental Psychology</source>, <volume>60</volume>(<issue>9</issue>), <fpage>1289</fpage>–<lpage>1302</lpage>. <pub-id pub-id-type="doi">10.1080/17470210600971576</pub-id><pub-id pub-id-type="pmid">17676559</pub-id></mixed-citation></ref>
<ref id="r110"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Wearden</surname><given-names>J. H.</given-names></name><name name-style="western"><surname>Williams</surname><given-names>E. A.</given-names></name><name name-style="western"><surname>Jones</surname><given-names>L. A.</given-names></name></person-group> (<year>2017</year>). <article-title>What speeds up the internal clock? Effects of clicks and flicker on duration judgements and reaction time.</article-title> <source>Quarterly Journal of Experimental Psychology</source>, <volume>70</volume>(<issue>3</issue>), <fpage>488</fpage>–<lpage>503</lpage>. <pub-id pub-id-type="doi">10.1080/17470218.2015.1135971</pub-id><pub-id pub-id-type="pmid">26811017</pub-id></mixed-citation></ref>
<ref id="r111"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Wittmann</surname><given-names>M.</given-names></name></person-group> (<year>2013</year>). <article-title>The inner sense of time: How the brain creates a representation of duration.</article-title> <source>Nature Reviews Neuroscience</source>, <volume>14</volume>(<issue>3</issue>), <fpage>217</fpage>–<lpage>223</lpage>. <pub-id pub-id-type="doi">10.1038/nrn3452</pub-id><pub-id pub-id-type="pmid">23403747</pub-id></mixed-citation></ref>
<ref id="r112"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Wöllner</surname><given-names>C.</given-names></name><name name-style="western"><surname>Halpern</surname><given-names>A. R.</given-names></name></person-group> (<year>2016</year>). <article-title>Attentional flexibility and memory capacity in conductors and pianists.</article-title> <source>Attention, Perception &amp; Psychophysics</source>, <volume>78</volume>(<issue>1</issue>), <fpage>198</fpage>–<lpage>208</lpage>. <pub-id pub-id-type="doi">10.3758/s13414-015-0989-z</pub-id><pub-id pub-id-type="pmid">26404532</pub-id></mixed-citation></ref>
<ref id="r113"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Wöllner</surname><given-names>C.</given-names></name><name name-style="western"><surname>Hammerschmidt</surname><given-names>D.</given-names></name><name name-style="western"><surname>Albrecht</surname><given-names>H.</given-names></name></person-group> (<year>2018</year>). <article-title>Slow motion in films and video clips: Music influences perceived duration and emotion, autonomic physiological activation and pupillary responses.</article-title> <source>PLOS ONE</source>, <volume>13</volume>(<issue>6</issue>), <elocation-id>e0199161</elocation-id>. <pub-id pub-id-type="doi">10.1371/journal.pone.0199161</pub-id><pub-id pub-id-type="pmid">29933380</pub-id></mixed-citation></ref>
<ref id="r114"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Zakay</surname><given-names>D.</given-names></name><name name-style="western"><surname>Block</surname><given-names>R. A.</given-names></name></person-group> (<year>1995</year>). <article-title>An attentional-gate model of prospective time estimation.</article-title> <source>Time and the Dynamic Control of Behavior</source>, <fpage>167</fpage>–<lpage>178</lpage>.</mixed-citation></ref>
<ref id="r115"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name name-style="western"><surname>Zentner</surname><given-names>M.</given-names></name><name name-style="western"><surname>Eerola</surname><given-names>T.</given-names></name></person-group> (<year>2010</year>). <article-title>Rhythmic engagement with music in infancy.</article-title> <source>Proceedings of the National Academy of Sciences of the United States of America</source>, <volume>107</volume>(<issue>13</issue>), <fpage>5768</fpage>–<lpage>5773</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.1000121107</pub-id><pub-id pub-id-type="pmid">20231438</pub-id></mixed-citation></ref>
</ref-list>
<fn-group>
<fn fn-type="conflict"><p>The authors have declared that no competing interests exist.</p></fn>
</fn-group>
<ack>
<p>The authors have no support to report.</p>
</ack>
</back>
</article>