
Perspect Integr Med : Perspectives on Integrative Medicine

Guidelines
Reporting Guidelines for Music-Based Interventions: An Update and Validation Study☆
Sheri L. Robb1,*, Stacey Springs2, Emmeline Edwards3, Tasha L. Golden4, Julene K. Johnson5, Debra S. Burns6, Melita Belgrave7, Joke Bradt8, Christian Gold9,10,11, Assal Habibi12, John R. Iversen13, Miriam Lense14, Jessica A. MacLean15, Susan M. Perkins16
Perspectives on Integrative Medicine 2025;4(3):205-212.
DOI: https://doi.org/10.56986/pim.2025.10.009
Published online: October 31, 2025

1Indiana University, School of Nursing and School of Medicine, Indianapolis, IN, United States

2Harvard University, Faculty of Arts and Sciences, Cambridge, MA, United States

3National Center for Complementary and Integrative Health, Bethesda, MD, United States

4University of Florida, Center for Arts in Medicine, Gainesville, FL, United States

5University of California San Francisco, Institute for Health & Aging, San Francisco, CA, United States

6University of Memphis, College of Communication and Fine Arts, Memphis, TN, United States

7Arizona State University, School of Music, Dance, and Theatre, Tempe, AZ, United States

8Drexel University, Department of Creative Arts Therapies, Philadelphia, PA, United States

9NORCE Norwegian Research Centre AS, Bergen, Norway

10Grieg Academy Department of Music, University of Bergen, Norway

11Department of Clinical and Health Psychology, Faculty of Psychology, University of Vienna, Austria

12University of Southern California, Brain and Creativity Institute, Los Angeles, CA, United States

13McMaster University, Department of Psychology, Neuroscience and Behaviour, Hamilton, Ontario, Canada

14Vanderbilt University, School of Medicine and Vanderbilt University Medical Center, Nashville, TN, United States

15Indiana University, Department of Speech, Language, Hearing Sciences and Program in Neuroscience, Bloomington, IN, United States

16Indiana University, School of Medicine and Richard M. Fairbanks School of Public Health, Indianapolis, IN, United States

*Corresponding author: Sheri L. Robb, Indiana University Schools of Nursing and Medicine, 600 Barnhill Drive, NU409, Indianapolis, IN 46202, United States, Email: shrobb@iu.edu
☆ This article is a reprint of the article first reported in Frontiers in Psychology (Robb, S. L., Springs, S., Edwards, E., Golden, T., Johnson, J., Burns, D., Belgrave, M., Bradt, J., Gold, C., Habibi, A., Iversen, J., Lense, M., Maclean, J., Perkins, S. Reporting Guidelines for Music-Based Interventions: An Update and Validation Study. Frontiers in Psychology 2025;16:1–8. doi: 10.3389/fpsyg.2025.1551920).
• Received: July 2, 2025   • Revised: July 11, 2025   • Accepted: July 16, 2025

©2025 Jaseng Medical Foundation

This is an open access article under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).

  • Background
    Detailed intervention reporting is essential to interpretation, replication, and translation of music-based interventions (MBIs). The 2011 Reporting Guidelines for Music-Based Interventions were developed to improve transparency and reporting quality of published research; however, problems with reporting quality persist.
  • Methods
    The purpose of this study was to update and validate the 2011 reporting guidelines using a rigorous Delphi approach that involved an interdisciplinary group of MBI researchers, and to develop an explanation and elaboration guidance statement to support dissemination and usage. We followed the methodological framework for developing reporting guidelines recommended by the EQUATOR Network and guidance recommendations for developing health research reporting guidelines. Our three-stage process included: (1) an initial field scan, (2) a consensus process using Delphi surveys (two rounds) and Expert Panel meetings, and (3) development and dissemination of an explanation and elaboration document.
  • Results
    First-Round survey findings revealed that the original checklist items were capturing content that investigators deemed essential to MBI reporting; however, they also revealed problems with item wording and terminology. Subsequent Expert Panel meetings and the Second-Round survey centered on reaching consensus on item language. The revised RG-MBI checklist has a total of 12 items that pertain to eight components of MBIs: name, theory/scientific rationale, content, interventionist, individual/group delivery, setting, delivery schedule, and treatment fidelity.
  • Conclusion
    We recommend that authors, journal editors, and reviewers use the RG-MBI guidelines, in conjunction with methods-based guidelines (e.g., CONSORT), to accelerate and improve the scientific rigor of MBI research.
Detailed intervention reporting is essential to interpretation, replication, and eventual translation of music-based interventions (MBIs) into practice. Persistent problems with the reporting quality of MBIs represent a significant barrier to advances in scientific research and translation of findings to clinical practice and community settings [1–4]. Interest in the quality of published research reports emerged in the 1980s due to growing awareness of deficiencies in reports of clinical trials at the time [5,6]. For example, several studies in this period found that an increasing number of randomized controlled trials (RCTs) had missing or inaccurate information, such as whether the assessment of outcomes was masked, whether a primary endpoint was specified, or how sample size was determined [5,7,8]. As a result, the use of reporting guidelines was recommended.
Reporting guidelines are a simple, structured tool for health researchers to use while writing manuscripts, which provides a minimum list of information needed to ensure a published manuscript can be understood by a reader, replicated by a researcher, used to inform clinical decisions, and included in systematic reviews [9]. The Consolidated Standards of Reporting Trials (CONSORT) and Transparent Reporting of Evaluations with Non-randomized Designs (TREND) guidelines were developed to improve the quality and transparency of published research [10,11]. Subsequent publications centered on complexities related to the reporting of behavioral and non-pharmacological interventions, noting that CONSORT and TREND, which have only one item dedicated to intervention reporting, were inadequate [12–15]. This led to the development of supplemental guidelines specific to intervention reporting, including elaborated CONSORT guidelines for reporting non-pharmacological interventions [12,13] and the Template for Intervention Description and Replication (TIDieR) checklist [16].
Music-based interventions are especially difficult to describe fully, due to the complexity of music stimuli (e.g., rhythm, pitch, tempo, harmonic structure, timbre), the variety of music experiences (e.g., active music making, music listening), and other factors unique to MBIs. To determine whether intervention reporting guidelines were necessary, Robb and Carpenter [17] examined how authors described music interventions and found significant gaps in reporting that hinder cross-study comparisons, generalization, and integration of findings into practice. Subsequently, Robb, Burns, and Carpenter [18] developed Reporting Guidelines for Music-Based Interventions (RG-MBI), which specified components of music interventions that publishing authors were encouraged to report and discuss. The checklist included 11 items organized across seven component areas: intervention content (five items), theory, delivery schedule, interventionist, treatment fidelity, setting, and unit of delivery (one item each).
The 2011 RG-MBIs are available through the EQUATOR Network [19] and have been referenced by authors in > 430 publications. However, recent reviews reveal sustained problems with reporting quality [20–33]. In their 2018 review of MBI reporting quality, Robb et al [4] found overall reporting quality was poor, with fewer than 50% of authors reporting information for four of the seven checklist components (theory, interventionist qualifications, treatment fidelity, setting). Reporting of intervention content was also poor; again, fewer than 50% of authors reported information about the music used, decibel levels/controls, or materials [4].
Sustained problems with reporting quality suggest limited uptake by authors and journal editors of the 2011 music reporting guidelines. This may be due to limited awareness of those guidelines, problems with perceived relevance or clarity of checklist items, and/or the absence of an explanation and elaboration document to provide practical examples across diverse areas of MBI intervention research. Thus, to ensure validity of current checklist items and improve uptake of the reporting guidelines, we completed a rigorous process to update the current guidelines and to establish a process by which to disseminate the resulting validated checklist and guidance statements.
We followed the methodological framework for developing reporting guidelines recommended by the EQUATOR Network [34] and recommendations for developing health research reporting guidelines [35]. The lead author convened a nine-member advisory group that included leaders from the National Institutes of Health (NIH) Music and Health initiative, music intervention researchers, and policy advocates (see acknowledgements). The advisory group worked with authors SR and SS to develop the study protocol and registered the RG-MBI update with the EQUATOR network [36]. Here we report methods and findings from our three-stage process: (1) field scan, (2) consensus process including Delphi survey and Expert Panel, and (3) resulting modified checklist and planned explanation and elaboration (E&E) guidance statement. This study did not meet criteria for Human Subjects Research and was exempt from Institutional Review Board approval.
1. Stage 1: Field scan
In 2018, based on items specified in the RG-MBI, Robb et al [4] examined the reporting quality of published music intervention studies. Overall, reporting quality was determined to be poor, and the terminology used to describe interventions was varied and inconsistent. Golden et al [3] found similar problems with reporting and recommended the generation and uptake of reporting guidelines [37].
Building on these two reviews, and as our first step, authors SR and JM conducted a field scan of systematic reviews of MBIs published between 2018 and 2022. The purpose of the field scan was to examine and elucidate gaps in reporting quality to inform our Delphi survey and processes. Specifically, we examined whether authors of the systematic reviews discussed reporting quality and, if so, whether they identified additional problems not captured in the current guidelines. We identified 33 systematic reviews, 48% (n = 16) of which discussed specific problems with reporting quality. Notably, all the identified problems had been captured by the 2011 MBI reporting guidelines, suggesting limited awareness or uptake of those early guidelines. As such, the field scan findings supported the use of the 2011 RG-MBI checklist as the starting point for a subsequent Delphi Survey process; it also indicated the need to engage stakeholders and interdisciplinary experts to improve content, item clarity, and usage of the guidelines (see Supplement 1) [38].
2. Stage 2: Item revision and consensus (Delphi survey and expert panel)
The purpose of Stage 2 was to invite music intervention researchers to evaluate content of the 2011 MBI checklist; specifically, they were asked to determine the importance of each item, identify gaps in content, identify problems with wording, and to reach consensus regarding recommended changes to the checklist. Our Delphi process, based on methods described by Sinha and colleagues [37], included two survey rounds to reach item consensus, with the plan to add additional rounds as needed. Following each survey round, an Expert Panel reviewed all survey data and made final consensus decisions concerning checklist items. In this section we provide details about the Expert Panel, survey participants, and methods for reaching consensus.

2.1. Participants

2.1.1. Interdisciplinary expert panel

The Advisory Group worked with lead authors SR and SS to identify expert panelists with varied expertise and who represent different stakeholder groups engaged in the design, conduct, and dissemination of music and health research. Selection criteria were to identify investigators conducting research: (1) along the translational science continuum, (2) across various domains (sociological, psychological, clinical, community health), (3) with varied methodological expertise, and (4) from a variety of disciplinary backgrounds. This eleven-member panel (authors EE, TG, JJ, DB, MB, JB, CG, AH, JI, ML, and SP) included individuals with expertise in the design, conduct, dissemination, and publication of music and behavioral intervention research from the United States, Europe, and Canada. The group included authors of the original MBI reporting guidelines, journal editors, and researchers with expertise in music cognition and neuroscience, music therapy, intervention research, biostatistics, and community music interventions.

2.1.2. Survey participants

Individuals invited to participate in the Delphi survey included United States-based and international experts in music and music-based intervention research. The target sample comprised Cochrane review authors, NIH MBI Toolkit panelists, journal editors, established authors/investigators (including NIH-funded Music and Health grant recipients and authors of systematic reviews identified through our initial field scan), and representatives from patient advocacy and arts organizations. Professional backgrounds included behavioral health, neuroscience, nursing, medicine, music therapy, social work, psychology, and public health. The target sample included 106 experts for Round One and 103 experts for Round Two. Accepting the invitation to complete the survey constituted participants’ consent to participate.

2.2. Round one survey

The survey opened with a brief overview of the survey purpose, defined key terms, provided an estimated time commitment (including the number of rounds), and emphasized the importance of completing each round. Each reporting item from the original guidelines (12 items total) was assigned an identification number to facilitate random ordering. Participants were asked to rate the importance of each item on a four-point Likert scale (1 = item has limited importance and is not required for reporting; 2 = item has moderate importance; 3 = item has high importance; 4 = item has very high importance and is essential to reporting). For each item, participants could also provide additional comments or edits to improve the reporting criterion. For items that received a rating of “1 = limited importance” or “4 = very high importance,” we asked participants to provide their rationale for selecting that value and to include any references supporting their rationale, if possible. The final two survey items asked participants for additional criteria they believed should be reported in published music intervention research (Question 13) and any additional comments they wanted to share about their responses or the survey (Question 14). See Supplement 2 for the survey [38].

2.2.1. Round one data collection and sample

To ensure confidentiality, the Indiana University Center for Survey Research (CSR) distributed and managed survey data using a Qualtrics web survey, with recruitment via e-mail. Potential participants were sent an e-mail invitation; non-respondents and respondents who did not answer all of the first 12 questions received up to two e-mail reminders. To bolster responses, the first author personally e-mailed non-respondents to request their response before the third and final CSR reminder. Additionally, a special reminder with a separate survey link for Questions 13 and 14 only was sent to respondents who had partially completed the survey but had not reached these questions. The first-round survey opened November 3, 2022, and closed January 17, 2023.
The Round One survey was sent to 103 experts for completion after removing three who self-identified as ineligible. The final sample for Round One analysis comprised 65 respondents (including partial and complete responses), for a response rate of 63%. Median time to complete the survey was 14 min, with an IQR of 23.2 min, excluding outliers (> 70 min). Outliers included 5 respondents with survey times between 107.90 and 341.42 min and 6 respondents over 1,160 min. We excluded these cases because they represent individuals who filled out the survey but never submitted it, or who selected submit only after leaving the survey open in their browser for some time.

2.2.2. Round one analysis and expert panel meeting consensus

All data from completed surveys were downloaded to an Excel spreadsheet for descriptive analysis. Likert scores were grouped based on the four response categories: Limited importance (1); Moderate importance (2); High importance (3); Very high importance (4). For each item, we calculated descriptive statistics for each response category (frequency, percent, valid percent, cumulative valid percent). The consensus criterion for retaining an item was defined as ≥ 80% of survey respondents rating the item as having “High” or “Very High” importance. Items scoring below this threshold were reviewed by the Expert Panel to determine inclusion, removal, and/or refinement of the item for the second-round survey. In addition, comments provided in open-response fields for all items, as well as any suggested additional items (Question 13), were downloaded verbatim for analysis. Two independent reviewers (SR, SS) identified common themes and then discussed independent findings to reach agreement. In advance of the first Expert Panel meeting, panelists received numeric data, common themes, and representative statements for each item, along with a synthesized list of any new items and related comments.
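The a priori consensus rule described above (retain an item when at least 80% of respondents rate it “High” or “Very High”) can be sketched in a few lines of code. This is an illustrative sketch only; the function name, threshold constant, and example ratings are hypothetical and not drawn from the study’s actual analysis.

```python
# Illustrative sketch of the Round One consensus rule: an item is
# retained when >= 80% of respondents rate it 3 ("High") or
# 4 ("Very High") on the four-point importance scale.
from collections import Counter

CONSENSUS_THRESHOLD = 0.80  # a priori threshold from the study protocol

def item_consensus(ratings):
    """Return (proportion rating 3 or 4, whether consensus was reached)."""
    counts = Counter(ratings)
    total = sum(counts.values())
    high = counts[3] + counts[4]
    proportion = high / total
    return proportion, proportion >= CONSENSUS_THRESHOLD

# Made-up example: 52 of 65 respondents (80%) rate the item 3 or 4.
ratings = [4] * 30 + [3] * 22 + [2] * 10 + [1] * 3
proportion, retained = item_consensus(ratings)
print(f"{proportion:.0%} rated High/Very High; retained: {retained}")
# → 80% rated High/Very High; retained: True
```

Under this rule, items such as Q6: Intervention Materials (64% in Round One) would fall below the threshold and be referred to the Expert Panel for discussion.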
Expert Panel members were charged with discussing and reaching consensus on: (1) item retention/removal based on numeric and narrative survey data, (2) item-level revision based on narrative data, and (3) inclusion of any newly identified items. During meetings, a facilitator invited each panelist to share a unique comment or insight, with an option to pass or affirm another’s comment. Once each panelist had had the opportunity to comment, the group assessed whether it had reached consensus. The authors had originally identified the nominal group technique as their planned approach to reaching consensus; however, the panel did not require voting or ranking to achieve consensus on each item.

2.2.3. Round one survey and expert panel meeting results

Findings from the Round One survey are available in Supplement 2 [38]. Three items did not reach the threshold score for consensus (≥ 80% of survey respondents rating the item as having “High” or “Very High” importance): Q4: Music (78%), Q6: Intervention Materials (64%), and Q11: Setting (75%). Associated comments pointed to the need for revised language (Q4; Q6), with some suggestions that Q11 could be removed and captured in methods-specific checklists. Consensus from the Expert Panel was that the current checklist items were adequate, important, and relevant (no items were removed or added). However, there was also consensus that the wording of all checklist items needed revision, and that this revision process should be the focus of the Round Two survey. To inform revised item language for the second survey, we used discussion notes from the Expert Panel meeting and gave panelists time after the meeting to submit more detailed edits. The lead authors (SR, SS) then synthesized these recommendations to create revised item language for the second survey.

2.3. Round two survey

All eligible participants from the first-round survey (n = 102; one person removed by request) were invited to complete the second-round survey, which provided a side-by-side comparison of checklist items (original vs. revised). For each of the 12 items, participants were asked to indicate one of three options: (1) prefer the original checklist wording; (2) prefer the revised checklist wording; or (3) suggest an edit (with an open text box to provide revised wording). See Supplement 3 for the survey [38].

2.3.1. Round two data collection and sample

Invitation and reminder e-mails followed the same structure and frequency as in Round One. The survey opened May 31, 2023, and closed July 18, 2023. The final sample for Round Two analysis comprised 61 respondents (including partial and complete responses), for a response rate of 60%. Median time to complete the survey was seven minutes, with an IQR of 5.3 min, excluding outliers (> 70 min). Outliers included 9 respondents with survey times between 70.5 and 965.5 min and 9 respondents over 3,273 min. We excluded these cases because they represent individuals who filled out the survey but never submitted it, or who selected submit only after leaving the survey open in their browser for some time.

2.3.2. Round two analysis

To determine whether there was consensus for original or revised items, we calculated frequency, mean, and percent scores for each item. Consensus was defined as items that were selected by ≥ 80% of survey respondents. In addition, the panel used open-ended comments from survey respondents to determine whether an item required further revision. The Expert Panel’s charge was to review items that did not reach consensus, using discussion as well as survey respondents’ open-ended comments to inform final changes to item wording, sentence structure, or organization.

2.3.3. Results round two survey and expert panel meetings

Findings from the Round Two survey are available in Supplement 3 [38]. Survey respondents preferred the revised language for all items; however, three items did not reach the threshold for consensus (≥ 80% of respondents preferring the revised item language): Q2: Person Selecting the Music (63%); Q3: Music (74%); and Q9: Treatment Fidelity (52%). For all items, we received suggestions on how item language could be improved. The Expert Panel held two subsequent meetings in which it discussed survey respondent recommendations, terminology, whether to include embedded examples, and the final order of checklist items (including alignment with the TIDieR and CONSORT Non-Pharmacological checklists). All Expert Panel decisions were made using our a priori consensus threshold of ≥ 80% agreement.
1. Revised reporting guidelines for music-based interventions
The revised Reporting Guidelines for Music-Based Interventions appear in Table 1 [38,39].
The 2011 Reporting Guidelines for Music-Based Interventions were developed to improve transparency and reporting quality of published research. Despite an increased number of publications citing the guidelines, recent reviews indicate persistent problems with reporting quality. Incomplete and inconsistent reporting of MBIs impedes cross-study comparisons, interpretation, replication, and application of findings to clinical practice and community-based programming.
To improve uptake of the RG-MBIs by a larger and more diverse group of MBI researchers, we convened a team of experts from diverse disciplines to engage in a rigorous Delphi study process. This process revealed that the original checklist items were indeed capturing content that investigators deemed essential to MBI reporting; however, it also identified important problems with existing items that may have been affecting their uptake and effective usage. In particular, findings indicated changes in wording and terminology that would allow checklist items to be inclusive of a wide range of music experiences (e.g., music as a sound stimulus and creating music/musicking) and approaches (e.g., social, psychological, physical, neurological, and biological). The illumination of these issues resulted in robust discussion among Expert Panelists and several rounds of revisions to item language in the guidelines. By engaging an international and diverse group of experts to revise item language, our expectation is that the revised checklist will be clearer, easier to apply, and of greater relevance to a diverse group of MBI investigators.
To further facilitate usage, items were re-ordered to align with the TIDieR checklist, including the addition of item one from the TIDieR checklist [16]. Expert Panel members also co-authored an Explanation and Elaboration (E&E) guidance document to accompany the revised RG-MBI [39]. This document includes a rationale for each item, concrete instructions for optimally reporting each item, and annotated examples from published manuscripts. Our expectation is that the revised RG-MBI will be of greater utility to investigators across a wider range of disciplines and that the E&E document will support greater adoption of the RG-MBI by authors and journal editors.
A primary limitation of this validation study was limited representation of investigators and stakeholders from countries outside the United States. Reliance on systematic reviews, Cochrane Reviews, journal editors, and US-based research initiatives to generate our survey sample did not ensure representation of music and health researchers, clinicians, and advocates at a global level. Second, we did not obtain information about survey respondents’ professional background and country, limiting our ability to assess representation. Finally, we did not conduct a formal study to investigate researchers’ awareness of the 2011 RG-MBIs to gain further insight into specific barriers to adoption.
We recommend that authors, journal editors, and reviewers use the RG-MBI guidelines, in conjunction with methods-based guidelines like CONSORT and TREND, to accelerate and improve the scientific rigor of MBI research. We also recommend a review of MBI reporting quality in five years to evaluate the impact of the revised guidelines and subsequent international studies centered on RG-MBI utility, along with barriers and facilitators to their adoption.
Supplementary materials are available at https://doi.org/10.56986/pim.2025.10.009.

Acknowledgements

Special thanks to our Advisory Panel members: Wen G. Chen, PhD; Emmeline Edwards, PhD; Tasha Golden, PhD; Julene Johnson, PhD; Susan Magsamen, MAS; Coryse St. Hillaire-Clarke, PhD; Dana Greene-Schloesser, PhD; Stacey Springs, PhD. Special thanks to Elizabeth Harman, PhD, MT-BC for assistance with reference management and formatting.

Author Contributions

Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Supervision, Validation, Visualization, Writing - original draft, Writing - review & editing: SLR. Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Supervision, Validation, Visualization, Writing - original draft, Writing - review & editing: SS. Conceptualization, Investigation, Validation, Writing - original draft, Writing - review & editing: EE. Conceptualization, Investigation, Validation, Writing - original draft, Writing - review & editing: TLG. Conceptualization, Investigation, Validation, Writing - original draft, Writing - review & editing: JKJ. Investigation, Validation, Writing - original draft, Writing - review & editing: DSB. Investigation, Validation, Writing - original draft, Writing - review & editing: MB. Investigation, Validation, Writing - original draft, Writing - review & editing: JB. Investigation, Validation, Writing - original draft, Writing - review & editing: CG. Investigation, Validation, Writing - original draft, Writing - review & editing: AH. Investigation, Validation, Writing - original draft, Writing - review & editing: JRI. Investigation, Validation, Writing - original draft, Writing - review & editing: ML. Investigation, Validation, Writing - original draft, Writing - review & editing: JAM. Investigation, Validation, Writing - original draft, Writing - review & editing: SMP.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Author Use of AI Tools Statement

The authors declare that no Generative AI was used in the creation of the manuscript.

Funding

The author(s) declare that financial support was received for the research and/or publication of this article. This study was funded by the Walther Cancer Foundation through Dr. Sheri Robb’s endowed professorship.

Ethical Statement

The Human Research Protection Program of the Indiana University waived the need for ethics approval and oversight for the collection, analysis, and publication of anonymized data for this non-interventional study. Invited survey participants were provided details about the study (purpose, duration, and procedures) and that individual responses would be kept confidential. Accepting the invitation to complete the survey constituted participants’ consent to participate.

De-identified data supporting the conclusions of this article are included in online supplementary material (see original article: https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2025.1551920/full#SM1). Further inquiries can be directed to the corresponding author.
Table 1
Reporting Guidelines for Music-Based Interventions Checklist*
Item Number | Item | Location** (Page or Appendix Number)
1 Brief Name
Provide the name or phrase that describes the intervention.
2 Intervention Theory and/or Scientific Rationale
Provide a rationale for the music and/or music experience(s). Specify how essential features of the music and music experience(s) are expected to influence targeted outcomes.
3 Intervention Content
For Items 3a – 3e, describe the music intervention with enough detail to support replication. When applicable, describe procedures for tailoring the intervention.
 3a Music Selection
Describe the process for how music was selected including who was involved in music selection.
 3b Music
Specify key details about the music that may be relevant to specified outcomes of interest. Characteristics may include compositional features of the music (such as tempo, harmony, rhythm, pitch, tonality, form, instrumentation), sound intensity or volume, lyrics, and/or how the music relates to the participants’ cultural identity and heritage. When using published music, provide reference for a sound recording or sheet music.
 3c Music Delivery Method
Provide details about how music was provided to or created with participants (such as live, recorded, computer generated). Include any details necessary for replication. This might include size of performing group, use of playback equipment, person controlling volume.
 3d Materials
List all materials necessary for the music experience. Include music and non-music equipment and materials.
 3e Intervention Strategies
Describe the music intervention strategy or strategies being studied (such as music listening, improvisation, song writing, rhythmic auditory stimulation).
4 Interventionist
Specify interventionist qualifications, credentials, training, and/or experience. Indicate how many interventionists delivered the music experience.
5 Individual or Group Intervention
Specify whether interventions were delivered to individuals or groups of individuals. For group interventions, specify the size of the group.
6 Setting
Describe where the intervention was delivered. Include location, privacy level, ambient sound, and/or any other factors that may have affected participants’ experiences.
7 Intervention Delivery Schedule
Report the number of sessions, session length (for example, 60 min), frequency (for example, 3x/week), time interval between sessions (for example, single day, three consecutive days), and duration (for example, over 4 weeks). Include any practice, experiences, or tasks assigned to participants between intervention sessions.
8 Treatment Fidelity
Describe strategies and/or measures used to ensure that the music intervention was delivered and received as intended.

We recommend using this checklist in conjunction with the Reporting Guidelines for Music-Based Interventions Checklist: Explanation & Elaboration Guide [39].

The focus of the RG-MBI is on reporting details of the music-based intervention under investigation. Importantly, the checklist was designed to be used in conjunction with methodological checklists such as CONSORT for randomized controlled trials, SPIRIT for clinical trial protocols, and guidelines for other study designs (see www.equator-network.org). For example, when reporting findings from a randomized controlled trial, the RG-MBI checklist can serve as an extension of Item 5: Interventions on the CONSORT 2010 checklist.

* Reproduced with permission from [38].

** Use N/A if an item is not applicable for the intervention being described.

Item 1 is taken from the TIDieR checklist. Following RG-MBI item validation, we ordered RG-MBI Items 2–8 to coincide with the order of TIDieR items based on content.

Parenthetical details are examples only; they are not intended to be exhaustive.

  • [1] Edwards E, St Hillaire-Clarke C, Frankowski DW, Finkelstein R, Cheever T, Chen WG, et al. NIH music-based intervention toolkit: music-based interventions for brain disorders of aging. Neurology 2023;100(18):868−78.
  • [2] Chen WG, Iversen JR, Kao MH, Loui P, Patel AD, Zatorre RJ, et al. Music and brain circuitry: strategies for strengthening evidence-based research for music-based interventions. J Neurosci 2022;42(45):8498−507.
  • [3] Golden TL, Springs S, Kimmel HJ, Gupta S, Tiedemann A, Sandu CC, et al. The use of music in the treatment and management of serious mental illness: a global scoping review of the literature. Front Psychol 2021;12:649840.
  • [4] Robb SL, Hanson-Abromeit D, May L, Hernandez-Ruiz E, Allison M, Beloat A, et al. Reporting quality of music intervention research in healthcare: a systematic review. Complement Ther Med 2018;38:24−41.
  • [5] Matthews R, Chalmers I, Rothwell P. Douglas G Altman: statistician, researcher, and driving force behind global initiatives to improve the reliability of research. BMJ 2018;361:k2588.
  • [6] Altman DG. The scandal of poor medical research. BMJ 1994;308(6924):283−4.
  • [7] Pocock SJ, Hughes MD, Lee RJ. Statistical problems in the reporting of clinical trials. A survey of three medical journals. N Engl J Med 1987;317(7):426−32.
  • [8] Sauerbrei W, Bland M, Evans SJW, Riley RD, Royston P, Schumacher M, et al. Doug Altman: driving critical appraisal and improvements in the quality of methodological and medical research. Biom J 2021;63(2):226−46.
  • [9] Equator Network [Internet]. Enhancing the QUAlity and transparency of health research: what is a reporting guideline? [cited 2024 Jun 16]. Available from: https://www.equator-network.org/about-us/what-is-a-reporting-guideline/
  • [10] Des Jarlais DC, Lyles C, Crepaz N, TREND Group. Improving the reporting quality of non-randomized evaluations of behavioral and public health interventions: the TREND statement. Am J Public Health 2004;94(3):361−6.
  • [11] Schulz KF, Altman DG, Moher D. CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. J Pharmacol Pharmacother 2010;1(2):100−7.
  • [12] Boutron I, Moher D, Altman DG, Schulz KF, Ravaud P, CONSORT Group. Methods and processes of the CONSORT Group: example of an extension for trials assessing nonpharmacologic treatments. Ann Intern Med 2008;148(4):W60−6.
  • [13] Boutron I, Moher D, Altman DG, Schulz KF, Ravaud P, CONSORT Group. Extending the CONSORT statement to randomized trials of nonpharmacologic treatment: explanation and elaboration. Ann Intern Med 2008;148(4):295−309.
  • [14] Dijkers M, Kropp GC, Esper RM, Yavuzer G, Cullen N, Bakdalieh Y. Quality of intervention research reporting in medical rehabilitation journals. Am J Phys Med Rehabil 2002;81(1):21−33.
  • [15] Perera R, Heneghan C, Yudkin P. Graphical method for depicting randomised trials of complex interventions. BMJ 2007;334(7585):127−9.
  • [16] Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ 2014;348:g1687.
  • [17] Robb SL, Carpenter JS. A review of music-based intervention reporting in pediatrics. J Health Psychol 2009;14(4):490−501.
  • [18] Robb SL, Burns DS, Carpenter JS. Reporting guidelines for music-based interventions. J Health Psychol 2011;16(2):342−52.
  • [19] Equator Network [Internet]. Enhancing the QUAlity and transparency of health research: about us [cited 2024 Jun 16]. Available from: https://www.equator-network.org/about-us/
  • [20] Bradt J, Dileo C, Myers-Coffman K, Biondo J. Music interventions for improving psychological and physical outcomes in people with cancer. Cochrane Database Syst Rev 2021;10(10):CD006911.
  • [21] de Witte M, Pinho ADS, Stams GJ, Moonen X, Bos AER, van Hooren S. Music therapy for stress reduction: a systematic review and meta-analysis. Health Psychol Rev 2022;16(1):134−59.
  • [22] de Witte M, Spruit A, van Hooren S, Moonen X, Stams GJ. Effects of music interventions on stress-related outcomes: a systematic review and two meta-analyses. Health Psychol Rev 2020;14(2):294−324.
  • [23] Duzgun MV, Ozer Z. The effects of music intervention on breast milk production in breastfeeding mothers: a systematic review and meta-analysis of randomized controlled trials. J Adv Nurs 2020;76(12):3307−16.
  • [24] Gao Y, Wei Y, Yang W, Jiang L, Li X, Ding J, et al. The effectiveness of music therapy for terminally ill patients: a meta-analysis and systematic review. J Pain Symptom Manage 2019;57(2):319−29.
  • [25] Jespersen KV, Pando-Naude V, Koenig J, Jennum P, Vuust P. Listening to music for insomnia in adults. Cochrane Database Syst Rev 2022;8(8):CD010459.
  • [26] Monsalve-Duarte S, Betancourt-Zapata W, Suarez-Cañon N, Maya R, Salgado-Vasco A, Prieto-Garces S, et al. Music therapy and music medicine interventions with adult burn patients: a systematic review and meta-analysis. Burns 2022;48(3):510−21.
  • [27] Moreno-Morales C, Calero R, Moreno-Morales P, Pintado C. Music therapy in the treatment of dementia: a systematic review and meta-analysis. Front Med (Lausanne) 2020;7:160.
  • [28] Nguyen KT, Xiao J, Chan DNS, Zhang M, Chan CWH. Effects of music intervention on anxiety, depression, and quality of life of cancer patients receiving chemotherapy: a systematic review and meta-analysis. Support Care Cancer 2022;30(7):5615−26.
  • [29] Wang C, Li G, Zheng L, Meng Q, Wang S, Yin H, et al. Effects of music intervention on sleep quality of older adults: a systematic review and meta-analysis. Complement Ther Med 2021;59:102719.
  • [30] Wang X, Zhang Y, Fan Y, Tan XS, Lei X. Effects of music intervention on the physical and mental status of patients with breast cancer: a systematic review and meta-analysis. Breast Care 2018;13(3):183−90.
  • [31] Yang T, Wang S, Wang R, Wei Y, Kang Y, Liu Y, et al. Effectiveness of five-element music therapy in cancer patients: a systematic review and meta-analysis. Complement Ther Clin Pract 2021;44:101416.
  • [32] Yangoz ST, Ozer Z. The effect of music intervention on patients with cancer-related pain: a systematic review and meta-analysis of randomized controlled trials. J Adv Nurs 2019;75(12):3362−73.
  • [33] Yangoz ST, Ozer Z. Effects of music intervention on physical and psychological problems in adults receiving haemodialysis treatment: a systematic review and meta-analysis. J Clin Nurs 2022;31(23–24):3305−26.
  • [34] Equator Network [Internet]. How to develop a reporting guideline [cited 2024 Jun 16]. Available from: https://www.equator-network.org/toolkits/developing-a-reporting-guideline/
  • [35] Moher D, Schulz KF, Simera I, Altman DG. Guidance for developers of health research reporting guidelines. PLoS Med 2010;7(2):e1000217.
  • [36] Equator Network [Internet]. Updating the reporting guidelines for music-based interventions: 2023 [cited 2024 Jun 16]. Available from: https://www.equator-network.org/library/reporting-guidelines-under-development/reporting-guidelines-under-development-for-other-study-designs/#MUSIC
  • [37] Sinha IP, Smyth RL, Williamson PR. Using the Delphi technique to determine which outcomes to measure in clinical trials: recommendations for the future based on a systematic review of existing studies. PLoS Med 2011;8(1):e1000393.
  • [38] Robb SL, Springs S, Edwards E, Golden TL, Johnson JK, Burns DS, et al. Reporting Guidelines for Music-Based Interventions: an update and validation study. Front Psychol 2025;16:1551920.
  • [39] Robb SL, Story KM, Harman E, Burns DS, Bradt J, Edwards E, et al. Reporting Guidelines for Music-Based Interventions Checklist: explanation and elaboration guide. Front Psychol 2025;16:1552659.

      Table 1 Reporting Guidelines for Music-Based Interventions Checklist*