<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article
  PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD with MathML3 v1.2 20190208//EN" "JATS-journalpublishing1-mathml3.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:ali="http://www.niso.org/schemas/ali/1.0/" article-type="editorial" dtd-version="1.2" xml:lang="en">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">JM</journal-id>
<journal-id journal-id-type="nlm-ta">Jahrb Musik</journal-id>
<journal-title-group>
<journal-title>Jahrbuch Musikpsychologie</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Jahrb. Musik.</abbrev-journal-title>
</journal-title-group>
<issn pub-type="epub">2569-5665</issn>
<publisher><publisher-name>PsychOpen</publisher-name></publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">jbdgm.137</article-id>
<article-id pub-id-type="doi">10.5964/jbdgm.137</article-id>
<article-categories>
<subj-group subj-group-type="heading"><subject>Spots</subject></subj-group>
</article-categories>
<title-group>
<article-title>emoTouch Web – A Web-Based System for Continuous Real Time Studies With Smartphones, Tablets, and Computers</article-title>
<trans-title-group xml:lang="de">
<trans-title>emoTouch Web – Ein webbasiertes System für kontinuierliche Echtzeit-Studien mit Smartphones, Tablets und Computern</trans-title>
</trans-title-group>
<alt-title alt-title-type="right-running">emoTouch Web – Continuous Real Time Research</alt-title>
<alt-title specific-use="APA-reference-style" xml:lang="en">emoTouch Web – a web-based system for continuous real time studies with smartphones, tablets, and desktop computers</alt-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes"><name name-style="western"><surname>Louven</surname><given-names>Christoph</given-names></name><xref ref-type="corresp" rid="cor1">*</xref><xref ref-type="aff" rid="aff1"><sup>1</sup></xref></contrib>
<contrib contrib-type="author"><name name-style="western"><surname>Scholle</surname><given-names>Carolin</given-names></name><xref ref-type="aff" rid="aff1"><sup>1</sup></xref></contrib>
<contrib contrib-type="author"><name name-style="western"><surname>Gehrs</surname><given-names>Fabian</given-names></name><xref ref-type="aff" rid="aff1"><sup>1</sup></xref></contrib>
<contrib contrib-type="author"><name name-style="western"><surname>Lenz</surname><given-names>Antonia</given-names></name><xref ref-type="aff" rid="aff1"><sup>1</sup></xref></contrib>
  
<aff id="aff1"><label>1</label><institution content-type="dept">Institut für Musikwissenschaft und Musikpädagogik</institution>, <institution>Universität Osnabrück</institution>, <addr-line><city>Osnabrück</city></addr-line>, <country country="DE">Germany</country></aff>
</contrib-group>
  
<author-notes>
<corresp id="cor1"><label>*</label>Institut für Musikwissenschaft und Musikpädagogik, Universität Osnabrück, 49069 Osnabrück, Germany. <email xlink:href="clouven@uos.de">clouven@uos.de</email></corresp>
</author-notes>
  
<pub-date pub-type="epub"><day>10</day><month>03</month><year>2022</year></pub-date>
  <pub-date pub-type="collection" publication-format="electronic"><year>2022</year></pub-date>
<volume>30</volume>
  <volume-id pub-id-type="title">Musikpsychologie – Empirische Forschungen - Ästhetische Experimente</volume-id>
  <elocation-id>e137</elocation-id>

<permissions><copyright-year>2022</copyright-year><copyright-holder>Louven, Scholle, Gehrs, &amp; Lenz</copyright-holder><license license-type="open-access" specific-use="CC BY 4.0" xlink:href="https://creativecommons.org/licenses/by/4.0/"><license-p>This is an open-access article distributed under the terms of the Creative Commons Attribution (CC BY) 4.0 License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.</license-p></license></permissions>
<abstract>
<p>emoTouch Web is a new web-based system for designing, conducting and evaluating continuous real-time studies in all disciplines dealing with time-bound phenomena, such as music, theatre, dance, film, commercials, lectures, speeches or sporting events. The system uses modern web and network technologies to easily turn any modern smartphone, tablet, or computer into a flexible and reliable research tool for continuous response studies in laboratory, online and live settings. Even participants’ own devices can easily be used (“Bring-Your-Own-Device”, BYOD). The system contains coordinated tools for the graphical and numerical review, evaluation and analysis of the collected real-time data in both longitudinal and cross-sectional views. emoTouch Web may be used free of charge for scientific purposes.</p>
</abstract><trans-abstract xml:lang="de">
<p>emoTouch Web ist ein neues, webbasiertes System zur Gestaltung, Durchführung und Auswertung kontinuierlicher Echtzeitstudien in allen Disziplinen, die sich mit zeitgebundenen Phänomenen befassen, wie Musik, Theater, Tanz, Film, Werbespots, Vorträge, Reden oder Sportveranstaltungen. Das System nutzt moderne Web- und Netzwerktechnologien, um jedes moderne Smartphone, Tablet oder jeden Computer auf einfache Weise als flexibles und zuverlässiges Forschungsinstrument für kontinuierliche Echtzeit-Studien in Labor-, Online- und Live-Settings nutzbar zu machen. Auch die eigenen Geräte der Studienteilnehmer können dabei problemlos genutzt werden ("Bring-Your-Own-Device", BYOD). Das System enthält aufeinander abgestimmte Werkzeuge zur grafischen und numerischen Darstellung, Auswertung und Analyse der erhobenen Echtzeitdaten im Längs- und Querschnitt. emoTouch Web kann für wissenschaftliche Zwecke kostenlos genutzt werden.</p></trans-abstract>
<kwd-group kwd-group-type="author"><kwd>software</kwd><kwd>research tool</kwd><kwd>continuous response</kwd><kwd>interface</kwd><kwd>real time</kwd><kwd>live</kwd><kwd>online study</kwd><kwd>timeline</kwd><kwd>BYOD</kwd></kwd-group>
<kwd-group kwd-group-type="translator" xml:lang="de"><kwd>Software</kwd><kwd>Forschungswerkzeug</kwd><kwd>kontinuierliche Bewertung</kwd><kwd>Schnittstelle</kwd><kwd>Echtzeit</kwd><kwd>Live</kwd><kwd>Online-Studie</kwd><kwd>Zeitleiste</kwd><kwd>BYOD</kwd></kwd-group>
</article-meta>
</front>
<body>
  <sec sec-type="intro"><title/>
<p>Music and its performance are dynamic phenomena that evolve over time. Therefore, research on musical processes needs suitable methods and tools that make these processes observable continuously in real time, rather than just collecting summarizing data at the end of a test session. The necessity for <italic>continuous response</italic> research approaches and tools has been stressed in music psychology in recent years (<xref ref-type="bibr" rid="r1">Kopiez, 2005</xref>; <xref ref-type="bibr" rid="r2">Kopiez et al., 2011</xref>; <xref ref-type="bibr" rid="r5">Schubert, 2010</xref>) and is basically similar to the situation in other research fields on time-bound phenomena, such as theatre, dance, film, commercials, lectures, speeches, or sports performances. Since the 1980s, several continuous response interfaces have been developed (see <xref ref-type="bibr" rid="r2">Kopiez et al., 2011</xref>). In 2013, our team at Osnabrück University published <italic>emoTouch</italic> for Apple iPad, the first continuous response tool on a standalone multitouch device (<xref ref-type="bibr" rid="r3">Louven &amp; Scholle, 2015</xref>).</p>
<p>However, all these interfaces had a serious disadvantage, since they required special hardware (sliders, knobs, PCs with mice, PDAs, tablets with specific apps, etc.) that a researcher first had to provide to all participants before a real-time study could be conducted. This imposes a significant limitation: for example, it is hardly possible to provide costly iPads or wired hardware sliders to hundreds of listeners in a live concert setting.</p></sec>
<sec sec-type="other1"><title>The New Web-Based Concept</title>
<p>The new <italic>emoTouch Web</italic> continuous response research system implements a completely new concept that solves these problems fundamentally (<xref ref-type="bibr" rid="r4">Louven et al., 2022</xref>).</p>
<fig id="f0" position="anchor" fig-type="illustration" orientation="portrait">
  <?float-left 0.15 ?>
<graphic xlink:href="jbdgm.137-f0" position="anchor" orientation="portrait"/></fig>
<p>The whole system is based on web and network technologies, with an <italic>emoTouch</italic> server as the central element that hosts the database and manages the information flow (<xref ref-type="fig" rid="f1">Figure 1</xref>). Studies are designed, controlled and monitored in the Researcher Interface (RI), a web application that runs on any kind of modern device and stores its data directly on the server. The RI also creates a study-specific QR code containing a URL that merely has to be accessed in order to participate in a study. Whenever this URL is called, the server deploys the study to the calling device, where it runs within the web browser (‘Participant Interface’, PI). The PI is completely hardware independent and works on any modern smartphone, tablet, laptop or desktop computer with an internet connection. The data generated by the participants’ actions (slider movements, etc.) are continuously recorded with time stamps and stored directly on the <italic>emoTouch</italic> server.</p>
  
  <fig id="f1" position="anchor" fig-type="figure" orientation="portrait"><label>Figure 1</label><caption>
<title>emoTouch Web System Architecture</title></caption><graphic xlink:href="jbdgm.137-f1" position="anchor" orientation="portrait"/></fig>
  
<p>Of course, <italic>emoTouch Web</italic> can be used to conduct continuous real-time studies in laboratory settings with desktop computers or to realize continuous response online studies across the internet. But the system particularly provides completely new possibilities for research at live events with potentially hundreds of participants: Subjects can easily participate in a study by using the smartphones they carry in their pockets anyway ('Bring-Your-Own-Device', BYOD), simply by scanning an <italic>emoTouch</italic> QR code that is printed, for example, in the program booklet.</p></sec>
<sec sec-type="other2"><title>Research With emoTouch Web</title>
<p><italic>emoTouch Web</italic> provides all necessary tools to design, conduct and analyse continuous real-time studies in laboratory, online and live settings.</p>
<p>The screen layout that will show up as PI on the participants’ devices is designed in the RI’s study editor (<xref ref-type="fig" rid="f2">Figure 2</xref>). It offers numerous, fully customizable interface elements like one-dimensional horizontal and vertical sliders, 2D rating areas, category scales, checkboxes, buttons, images and text elements.</p>
  
  <fig id="f2" position="anchor" fig-type="figure" orientation="portrait"><label>Figure 2</label><caption>
<title>emoTouch Researcher Interface With Study Editor</title></caption><graphic xlink:href="jbdgm.137-f2" position="anchor" orientation="portrait"/></fig>
  
<p>The number of interface elements on a screen layout is unlimited. Multiple scales, rating areas and buttons can even be used simultaneously on a multitouch device. All layouts will dynamically adapt to the different screen sizes and display ratios of the various participants’ devices. Besides graphical elements, layouts can also contain audio and video files that play directly on the participant’s device, which is especially useful for presenting stimuli in laboratory settings or online studies. Since <italic>emoTouch Web</italic> studies are not limited to just one layout, but can consist of several ‘parts’ with different layouts, it is also possible to implement quite complex study workflows that require multiple, changing interfaces.</p>
<p>While a study is running, its course and data flow can be controlled, monitored and observed in real time from the ‘realisation’ section of the Researcher Interface. For example, the researcher can actively trigger layout changes for all participating devices simultaneously if an appropriate layout needs to be shown in a changing live situation.</p>
<p>All data generated by the participating devices is stored immediately on the <italic>emoTouch</italic> server. The RI provides built-in coordinated tools for the graphical and numerical review, evaluation and analysis of this data in longitudinal and cross-sectional views, e.g. timeline graphs with mean and standard deviation, heat plots, descriptive statistics, or the calculation of interrater reliabilities, autocorrelations and Granger causalities. This also makes it possible to review some aspects of the participants’ behaviour that are essential for assessing data quality in live situations, since subjects use their own devices voluntarily and autonomously and could, for example, briefly switch to a different application or browser tab during a study.</p>
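<p>The cross-sectional timeline summaries mentioned above (e.g. mean and standard deviation across participants at each point in time) can be sketched as follows. This is an illustrative Python sketch under assumed data structures, not the Researcher Interface’s actual implementation: each participant’s timestamped recording is first aligned on a common time grid by sample-and-hold, since participants do not move their sliders at identical moments.</p>

```python
from statistics import mean, stdev


def cross_section(recordings, grid):
    """Mean and SD across participants on a common time grid.

    `recordings` is a list with one entry per participant, each a
    time-ordered list of (timestamp, value) pairs; `grid` is the list
    of time points to evaluate. Values are carried forward between
    samples (sample-and-hold). Illustrative sketch only.
    """
    aligned = []
    for rec in recordings:
        series = []
        last = rec[0][1]  # before the first sample, hold its value
        i = 0
        for t in grid:
            # advance to the most recent sample at or before time t
            while i < len(rec) and rec[i][0] <= t:
                last = rec[i][1]
                i += 1
            series.append(last)
        aligned.append(series)
    # cross-section: aggregate over participants at each grid point
    means = [mean(col) for col in zip(*aligned)]
    sds = [stdev(col) for col in zip(*aligned)]
    return means, sds
```
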
<p>If the built-in tools do not meet a study’s needs, the analysis can be extended with custom Python and JavaScript evaluation scripts. In addition, all data can also be exported in a variety of formats for further processing and analysis with other software tools.</p>
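<p>As an example of the kind of custom evaluation script meant here, the following Python sketch computes the lag-k autocorrelation of one exported continuous rating series. The export format and any loading step are assumptions and are therefore omitted; adapt them to the format actually chosen on export.</p>

```python
def autocorr(series, lag=1):
    """Lag-`lag` autocorrelation of one continuous rating series.

    `series` is a plain list of rating values sampled at regular
    intervals (e.g. one exported column). Illustrative sketch of a
    custom evaluation script, not emoTouch Web's built-in routine.
    """
    n = len(series)
    m = sum(series) / n
    # total variance of the series (unnormalized sum of squares)
    var = sum((x - m) ** 2 for x in series)
    # covariance of the series with its lagged copy
    cov = sum((series[i] - m) * (series[i + lag] - m)
              for i in range(n - lag))
    return cov / var
```

A strongly alternating series yields a negative lag-1 autocorrelation, while a slowly drifting slider trace yields a value close to 1, which is one reason autocorrelation matters when interpreting continuous response data.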
<p><italic>emoTouch Web</italic> may be used free of charge for scientific purposes. Detailed information is available at <ext-link ext-link-type="uri" xlink:href="https://www.emotouch.de">https://www.emotouch.de</ext-link>.</p></sec>
</body>
<back>
<ref-list><title>References</title>
  <ref id="r1"><mixed-citation publication-type="book">Kopiez, R. (2005). Experimentelle Interpretationsforschung. In H. de la Motte-Haber &amp; G. Rötter (Eds.), <italic>Handbuch der Systematischen Musikwissenschaft Bd. 3: Musikpsychologie</italic> (pp. 459–514). Laaber Verlag.</mixed-citation></ref>
<ref id="r2"><mixed-citation publication-type="book">Kopiez, R., Dressel, J., Lehmann, M., &amp; Platz, F. (2011). <italic>Vom Sentographen zur Gänsehautkamera: Entwicklungsgeschichte und Systematik elektronischer Interfaces in der Musikpsychologie</italic>. Tectum.</mixed-citation></ref>
<ref id="r3"><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name name-style="western"><surname>Louven</surname>, <given-names>C.</given-names></string-name>, &amp; <string-name name-style="western"><surname>Scholle</surname>, <given-names>C.</given-names></string-name></person-group> (<year>2015</year>). <article-title>emoTouch für iPad: Ein flexibles, mobiles Forschungswerkzeug zur Erhebung kontinuierlicher Probandenratings in ein und zwei Dimensionen.</article-title> <source>Jahrbuch Musikpsychologie</source>, <volume>25</volume>, <fpage>250</fpage>–<lpage>253</lpage>. <pub-id pub-id-type="doi">10.23668/psycharchives.2840</pub-id></mixed-citation></ref>
<ref id="r4"><mixed-citation publication-type="web">Louven, C., Scholle, C., Gehrs, F., &amp; Lenz, A. (2022). <italic>emoTouch Web</italic><italic>.</italic> <ext-link ext-link-type="uri" xlink:href="https://www.emotouch.de">https://www.emotouch.de</ext-link></mixed-citation></ref>
  <ref id="r5"><mixed-citation publication-type="book">Schubert, E. (2010). Continuous self-report methods. In P. N. Juslin &amp; J. A. Sloboda (Eds.), <italic>Handbook of music and emotion: Theory, research, applications</italic> (pp. 223–253). Oxford University Press.</mixed-citation></ref>
</ref-list>
<fn-group>
  <fn fn-type="financial-disclosure"><p>The development of emoTouch Web at Osnabrück University (Germany) was funded by the Volkswagen Foundation.</p></fn>
</fn-group>
<fn-group>
  <fn fn-type="conflict"><p>Christoph Louven is the editor-in-chief of Jahrbuch Musikpsychologie.</p></fn>
</fn-group>
<ack>
<p>The authors have no additional (i.e., non-financial) support to report.</p>
</ack>
</back>
</article>
