<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.1 20120330//EN" "http://jats.nlm.nih.gov/publishing/1.1/JATS-journalpublishing1-mathml3.dtd">
<article article-type="research-article" dtd-version="1.1" xml:lang="en"
   xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"
   xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
   <front>
      <journal-meta>
         <journal-id journal-id-type="DOAJ">15041611</journal-id>
         <journal-title-group>
            <journal-title>Voices: A World Forum for Music Therapy</journal-title>
         </journal-title-group>
         <issn>1504-1611</issn>
         <publisher>
            <publisher-name>GAMUT - Grieg Academy Music Therapy Research Centre (NORCE &amp;
               University of Bergen)</publisher-name>
         </publisher>
      </journal-meta>
      <article-meta>
         <article-id pub-id-type="doi">10.15845/voices.v20i3.2930</article-id>
         <article-categories>
            <subj-group subj-group-type="heading">
               <subject>Research</subject>
            </subj-group>
         </article-categories>
         <title-group>
            <article-title>Program Directors’ Perceptions of the CBMT Exam</article-title>
         </title-group>
         <contrib-group>
            <contrib contrib-type="author">
               <name>
                  <surname>Meadows</surname>
                  <given-names>Anthony</given-names>
               </name>
               <xref ref-type="aff" rid="A_Meadows"/>
               <address>
<email>ameadows2@su.edu</email>
               </address>
            </contrib>
            <contrib contrib-type="author">
               <name>
                  <surname>Eyre</surname>
                  <given-names>Lillian</given-names>
               </name>
               <xref ref-type="aff" rid="L_Eyre"/>
            </contrib>
         </contrib-group>
         <aff id="A_Meadows"><label>1</label>Shenandoah University, USA</aff>
         <aff id="L_Eyre"><label>2</label>Boyer College of Music and Dance, Temple University,
            USA</aff>
         <contrib-group>
            <contrib contrib-type="editor">
               <name>
                  <surname>Norris</surname>
                  <given-names>Marisol Samantha</given-names>
               </name>
            </contrib>
         </contrib-group>
         <contrib-group>
            <contrib contrib-type="reviewer">
               <name>
                  <surname>Bain</surname>
                  <given-names>Candice</given-names>
               </name>
            </contrib>
            <contrib contrib-type="reviewer">
               <name>
                  <surname>Keith</surname>
                  <given-names>Douglas</given-names>
               </name>
            </contrib>
         </contrib-group>
         <pub-date pub-type="pub">
            <day>1</day>
            <month>11</month>
            <year>2020</year>
         </pub-date>
         <volume>20</volume>
         <issue>3</issue>
         <history>
            <date date-type="received">
               <day>3</day>
               <month>12</month>
               <year>2019</year>
            </date>
            <date date-type="accepted">
               <day>3</day>
               <month>7</month>
               <year>2020</year>
            </date>
         </history>
         <permissions>
            <copyright-statement>Copyright: 2020 The Author(s)</copyright-statement>
            <copyright-year>2020</copyright-year>
            <license license-type="open-access"
               xlink:href="http://creativecommons.org/licenses/by/4.0/">
<license-p>This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International License (<uri>http://creativecommons.org/licenses/by/4.0/</uri>), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.</license-p>
            </license>
         </permissions>
         <self-uri xlink:href="https://voices.no/index.php/voices/article/view/2930"
            >https://voices.no/index.php/voices/article/view/2930</self-uri>
         <abstract>
            <p>Forty-one academic program directors completed a survey eliciting their perceptions
               of the Certification Board for Music Therapists (CBMT) board certification exam.
Survey questions addressed the meaningfulness and utility of the exam in evaluating safe and competent practice, reasons students might fail the exam, and exam preparation methods; open-ended questions allowed participants to express specific concerns about the exam, if they had any. On average, program directors perceived the
               exam to be “neither effective nor ineffective” in evaluating clinical competence,
               with open-ended responses suggesting the majority of these faculty had a range of
concerns about the exam. These concerns are categorized and defined, and reflective comments serve to stimulate discussion about the meaningfulness and utility of the exam as it is currently constructed. </p>
         </abstract>
         <kwd-group kwd-group-type="author-generated">
            <kwd>CBMT</kwd>
            <kwd>board certification exam</kwd>
            <kwd>education and training</kwd>
            <kwd>music therapy program directors</kwd>
         </kwd-group>
      </article-meta>
   </front>
   <body>
      <!-- sec lvl 2 begin -->
      <sec>
         <title>Study Context</title>
         <p>This article emerged from our experiences as program directors of two music therapy
            training programs in the United States. In our experiences preparing students for
            internship and professional life, we observed inconsistencies in our students’ abilities
            to successfully pass the Certification Board for Music Therapists (CBMT) board
            certification exam. We observed students whom we evaluated as strong entry-level
            clinicians repeatedly having difficulty passing the exam, whereas other students, whom
            we evaluated as less clinically competent based on their performance in fieldwork and
internship, passed the exam on their first attempt. The main difference between these students appeared to be their competence as test takers—that is, students who tended to
            be better at taking multiple choice, timed tests tended to be more successful at passing
            the CBMT exam, even though this did not always correlate with evaluations of their
            clinical competence. </p>
         <p> We were also (and remain) concerned about the declining pass rates for the CBMT exam,
            and what this suggests about the relationship between academic training programs and the
evaluation of entry-level competence. In its latest communication about our students’ pass rates (November 2019), CBMT reported that only 70% of students passed the exam on their first attempt, and that those who did not had only a 59% chance of passing on a subsequent attempt. Given the 4½-year commitment students make to professional preparation, it has become an ethical concern for us to consider which students we accept into our academic programs, and whether we should accept students who self-identify as struggling with multiple choice tests, especially those that are timed. </p>
         <p>These concerns are compounded by our struggles to understand the extent to which these
            kinds of evaluations are suitable for determining which students can practice
            competently and safely. We do not have, to the best of our knowledge, any publicly
            available data that affirms the construct or predictive validity of the CBMT exam,
            making it difficult to understand whether these declining pass rates are a product of
            poor academic preparation, a disconnect between academic preparation and exam content, a
            disconnect between the exam and clinical practice competence, a combination thereof, or
            something else altogether. </p>
         <p> Furthermore, surprisingly little has been written about the CBMT exam, particularly
            articles that discuss the relationship between academic preparation, exam scores, and
clinical competence. This is further compounded by accreditation requirements of the National Commission for Certifying Agencies (NCCA), which accredits the CBMT certification program. NCCA does not permit music therapy faculty to serve on the CBMT exam committee, the primary reason being that doing so may create unfair advantages for some academic programs.
            However, this requirement also creates a further disconnect between exam construction
            and academic programs. </p>
         <p> This article, which identifies and describes AMTA-approved academic program directors’
perceptions of the CBMT exam, is—we hope—a helpful step in opening a scholarly dialogue
            about the exam and its relationship to clinical competence. Forty-one program directors
            responded (58% of program directors in the United States), providing a total of 152
            written responses to our open-ended questions in addition to nine Likert and Likert-type
questions. This suggested to us that the topic was important to those who responded,
            and that their responses were worthy of consideration as part of a larger dialogue about
            the exam. </p>
         <p> We now invite you into this dialogue. In doing so, we would like you to consider the
following: First, although you will find that we take a critical stance in relation to the exam, particularly in the Reflective Comments, we are not advocating for the exam to be
            discontinued. We believe the exam serves an important purpose in advancing the
            profession in the United States, particularly given the ways that healthcare delivery is
            changing. Rather, we are asking that faculty voices be heard, the data be considered,
            and dialogue ensue. Second, we can imagine that some faculty perceptions about the exam
            may be inaccurate, given that faculty do not have first-hand knowledge of the exam. If
this is the case, we hope the reader will see this as an important part of the dialogue, illustrating the barriers faculty may experience in understanding the exam, its
            construction, and its relationship to clinical competence. Finally, we share this
            article with you in the spirit of advancing the profession by engaging in difficult but
            necessary conversations related to education and training, and the impact this has on
            student preparation. </p>
      </sec>
      <!-- sec lvl 2 end -->
      <!-- sec lvl 2 begin -->
      <sec>
         <title>Introduction</title>
         <p>The purpose of this study was to illuminate and summarize the perceptions of music
            therapy academic program directors regarding the Certification Board for Music
            Therapists (CBMT) board certification exam. Recent changes to the cut score for the CBMT
exam have impacted music therapy students’ abilities to pass the exam, with first-time pass rates falling from 84% to 70% over the last decade (<xref ref-type="bibr"
               rid="WBBBCFHHHKKMNS2017">Wylie et al., 2017</xref>). This significant drop in pass
rates has been reflected in academic programs. Between 2005 and 2015, the percentage of academic programs with a 90% average first-time pass rate fell from 43% to 15%, while the percentage of programs with a pass rate lower than 70% increased from 10% to 47% (<xref ref-type="bibr" rid="WBBBCFHHHKKMNS2017">Schneck, in Wylie et
               al., 2017</xref>). Factors contributing to this decline have not been adequately
            explored, although discussions about the effectiveness of undergraduate educational
            programs in preparing students for professional practice are ongoing (<xref
               ref-type="bibr" rid="HTC2020">Hsiao et al., 2020</xref>; <xref ref-type="bibr"
               rid="WBBBCFHHHKKMNS2017">Wylie et al., 2017</xref>). This article describes academic
            program directors’ perceptions of the relevance and meaningfulness of the exam, and
            through an examination of these findings, seeks to encourage discussion about the exam
            and its relationship to competent clinical practice.</p>
         <!-- sec lvl 3 begin -->
         <sec>
            <title>Contextualizing the CBMT Exam</title>
            <!-- sec lvl 4 begin -->
            <sec>
               <title>CBMT and the Board Certification Exam</title>
               <p>The Certification Board for Music Therapists (CBMT) was established in 1983, and
                  its current mission is to ensure “a standard of excellence in the development,
                  implementation, and promotion of an accredited certification program for safe and
                  competent music therapy practice” (CBMT, n.d.-a). According to Aigen and Hunter
                     (<xref ref-type="bibr" rid="AH2018">2018</xref>), one of the purposes of
                  establishing the CBMT credential (MT-BC) was to “determine who was qualified to
                  practice music therapy based on a national examination” (p. 186), and to promote
                  music therapy reimbursement for members of the two music therapy associations that
                  existed at the time (National Association of Music Therapy and the American
                  Association of Music Therapy). In doing so, the goal was to create higher
                  professional standards by requiring board certified music therapists to
                  participate in ongoing continuing education, something not previously required. </p>
               <p>Subsequent to the unification of these two associations, and the creation of the
                  American Music Therapy Association (AMTA) in 1998, CBMT became the credentialing
                  body for all students completing AMTA-approved programs. The CBMT certification
                  program is accredited by the National Commission for Certifying Agencies, and CBMT
                  is a charter member of the Institute for Credentialing Excellence (CBMT,
                  n.d.-a).</p>
            </sec>
            <!-- sec lvl 4 end -->
            <!-- sec lvl 4 begin -->
            <sec>
               <title>The Board Certification Exam </title>
               <p>Candidates for board certification have successfully completed the academic and
                  clinical training requirements for music therapy, or its equivalent, as
                  established by AMTA (CBMT Candidate Handbook, 2019). The board certification exam
                  consists of 150 multiple choice questions, completed in 3 hours, of which 130
                  questions are graded (20 are non-scored experimental questions). These questions
                  are distributed among four domain areas, as follows: </p>
               <list list-type="roman-upper">
                  <list-item>
                     <p>Referral, Assessment and Treatment Planning – 40 items </p>
                  </list-item>
                  <list-item>
                     <p>Treatment Implementation and Termination – 70 items</p>
                  </list-item>
                  <list-item>
                     <p>Ongoing Evaluation and Documentation of Treatment – 10 items </p>
                  </list-item>
                  <list-item>
                     <p>Professional Development and Responsibilities – 10 items </p>
                  </list-item>
               </list>
               <p>According to the CBMT (<xref ref-type="bibr" rid="WBBBCFHHHKKMNS2017">Wylie et
                     al., 2017</xref>), the exam cut score (the number of questions that the
                  candidate must answer correctly in order to pass) is currently 95, and 70% of
students pass this exam on their first attempt (2015–2017 [partial] data).
                  Candidates who pass the exam are entitled to call themselves Board-Certified Music
                  Therapists (MT-BC). </p>
            </sec>
            <!-- sec lvl 4 end -->
            <!-- sec lvl 4 begin -->
            <sec>
               <title>The CBMT Exam and Undergraduate Music Therapy Curriculum</title>
               <p>The relevance and meaningfulness of the CBMT exam has been informally discussed
                  among music therapy faculty since the inception of the exam, but recently these
                  discussions have become more focused, stimulated in part by the creation of the
                  Master’s Level Entry (MLE) subcommittee (<xref ref-type="bibr" rid="AMTA2011"
                     >AMTA, 2011</xref>), and in part by the CBMT Executive Director’s presentation
                  to the MLE subcommittee and faculty in November 2017 in which CBMT reported a
steady decline in the percentage of first-time test-takers passing the exam—from 84% (2005–2010) to 70% (2015–second quarter of 2017; Wylie et al., 2017, pp. 10–11). </p>
<p>At the time of the MLE report, CBMT suggested a number of reasons for the decline in pass rates, all based on anecdotal evidence. These included perceived
                  inconsistencies across programs and internships, anxiety about the MT-BC
                  requirement for employment, time between internship and taking the exam, limited
                  experience with multiple choice exams, application and analysis of knowledge vs.
                  memorization, poor study skills and/or test taking skills, and a possible increase
                  of international students for whom English is a second language (<xref
                     ref-type="bibr" rid="WBBBCFHHHKKMNS2017">Wylie et al., 2017, p. 10</xref>). In
                  support of CBMT’s observations, <xref ref-type="bibr" rid="HTC2020">Hsiao et al.
                     (2020)</xref> found that while 85.6% of the survey participants taking the CBMT
                  exam passed on the first attempt, that number dropped to 50% for participants for
                  whom English was a second language. </p>
               <p>Of particular concern to educators was CBMT Executive Director Schneck’s
                  perception of academic training programs, which she connected to declining
                  first-time pass rates at the AMTA national conference in 2017. According to
                  Schneck, “[W]hile music therapy clinical practice is advancing, as reflected in
                  the practice analyses and increased cut scores; [sic.] the change is not being
                  driven by music therapy education as indicated in the declining pass rates” (in
Wylie et al., 2017, pp. 10–11). Although the CBMT Executive Director does not
                  clarify this statement further, the message she appears to be communicating is
                  that academic programs (some or all) are not keeping up with clinical practice.
                  This assumption that the CBMT exam represents current clinical practice and that
                  academic training programs are not keeping up was, in part, the stimulus for this
                  research study. The authors felt it was important to understand how other program
                  directors were perceiving the problems associated with declining first-time pass
                  rates, and what kinds of solutions they envisioned, if they perceived these to be
                  necessary. </p>
               <p>Schneck (<xref ref-type="bibr" rid="WBBBCFHHHKKMNS2017">in Wylie et al.,
                     2017</xref>) also expressed concerns regarding the AMTA Professional and
                  Advanced Competencies documents, citing job tasks identified in the CBMT Board
                  Certification Domains that do not appear in the AMTA Professional Competencies
                  document, or that are identified only in the AMTA Advanced Competencies document.
This discrepancy has two important implications. The first is that it suggests AMTA and CBMT are not in agreement about what constitutes professional competence; the second is that AMTA-approved academic training programs, in their requirement
                  to teach AMTA Professional Competencies to their students, may not be teaching all
                  of the CBMT Domains identified in the Candidate Handbook. </p>
               <p>However, other factors associated with exam outcomes emerged from <xref
                     ref-type="bibr" rid="HTC2020">Hsiao et al.’s (2020)</xref> survey of 662 music
                  therapists who completed the exam between 2012 and 2017. Using data from music
                  therapists who completed the exam after 2015, they found that self-reported
                  cumulative Grade Point Average (GPA) and test anxiety scores were significant
                  predictors of one’s ability to pass the board certification exam on the first
                  attempt. Specifically, they found that a) a one unit increase in GPA increased the
                  likelihood of passing the exam on the first attempt by 4.386%, and b) for every
                  unit increase in anxiety as measured by the Westside Test Anxiety Scale, the
                  probability of passing the exam on the first attempt decreased by 53.6%. These
                  findings appear to suggest that academic standing and test anxiety contribute
                  significantly to exam success. </p>
<p>Certainly, other factors may contribute to the decrease in pass rates as well, though we have no published data identifying them. For
                  example, the increase in failures might indicate that professional standards are
                  becoming higher, and that the exam serves the purpose it should—to act as a
                  gateway to ensure competent practice and prevent under-prepared MT-BCs from
                  entering the profession. Or it may be that inflated academic grades and the
                  acceptance of students who do not develop to the extent expected during the course
                  of the program contribute to student performance that leads to lower pass rates.
                  However, findings from <xref ref-type="bibr" rid="HTC2020">Hsiao et al.’s
                     (2020)</xref> research suggest that grade inflation may not be a contributing
                  factor, as students with higher GPAs were more successful in passing the exam on
                  their first attempt. Therefore, one must consider whether increased failures might
                  be related to other factors, such as exam validity. At present, however, we simply
                  do not know why students are passing at a much lower rate in recent exams. </p>
            </sec>
            <!-- sec lvl 4 end -->
            <!-- sec lvl 4 begin -->
            <sec>
               <title>Academic Challenges Preparing Music Therapy Students</title>
               <p>Concerns regarding an overburdened undergraduate curriculum and expanding scope of
                  practice have existed for decades. As early as 1962, Braswell advocated for major
                  curriculum changes that would give precedence to the unique requirements of a
                  music therapy program, as opposed to a music program with a therapy component
                     (<xref ref-type="bibr" rid="B1962">Braswell, 1962</xref>). Similarly, in 1989
                  Bruscia argued that the profession was at a crossroads, with a burgeoning
                  undergraduate curriculum that was no longer able to competently prepare students
                  for professional practice. As a solution to this problem, Bruscia (<xref
                     ref-type="bibr" rid="B1989">1989</xref>) proposed three levels of competencies,
                  for the bachelor’s, master’s and doctoral levels, defining the breadth and depth
                  of music therapy practice at each level while doing so. </p>
               <p>In the same year, Dileo Maranto (<xref ref-type="bibr" rid="DM1989">1989</xref>)
                  published a summary from the California Symposium on Education and Training,
                  wherein educators sought solutions to curriculum problems. Primary among these
                  recommendations was a proposal for different levels of certification in music
                  therapy, with the CBMT exam occurring after completion of the first level of
                  training. Educators at this symposium proposed a variety of routes to achieve
                  advanced clinical practice, with clear delineations between types and content of
                  practice at each level. Over a decade later, Groene and Pembrook (<xref
                     ref-type="bibr" rid="GP2000">2000</xref>) identified additional educator
                  concerns regarding curriculum, with faculty suggesting specific subject areas that
                  could be reduced to create more room for music therapy-specific content in the
                  undergraduate curriculum. </p>
               <p>Two recent articles point to the continued and urgent concerns of educators to
                  address what is becoming, for some faculty members, an untenable problem (<xref
                     ref-type="bibr" rid="F2018">Ferrer, 2018</xref>; <xref ref-type="bibr"
                     rid="LRBJ2018">Lloyd et al., 2018</xref>). In her in-depth interviews of music
                  therapy faculty and professional members, Ferrer (<xref ref-type="bibr"
                     rid="F2018">2018</xref>) found patterns of concerns regarding the undergraduate
                  music therapy degree, which included: </p>
               <list list-type="order">
                  <list-item>
                     <p>General agreement that the number of requirements set by AMTA and NASM is
                        too high, </p>
                  </list-item>
                  <list-item>
                     <p>Concerns regarding general education requirements in undergraduate programs,
                        and the extent to which this supports or obfuscates the degree focus, </p>
                  </list-item>
                  <list-item>
                     <p>Concerns that students leave academic programs feeling overwhelmed, unable
                        to integrate information in a way that gives them a comprehensive
                        understanding of professional practice, and </p>
                  </list-item>
                  <list-item>
                     <p>Concerns that students are just not developmentally ready to work as music
                        therapists, in that “they have not had the life experiences necessary to
                        fully empathize with individuals facing complex situations” (<xref
                           ref-type="bibr" rid="F2018">Ferrer, 2018, p. 90</xref>). </p>
                  </list-item>
               </list>
               <p>A similar concern was expressed by Aigen and Hunter (<xref ref-type="bibr"
                     rid="AH2018">2018</xref>). In citing the MLE report (<xref ref-type="bibr"
rid="WBBBCFHHHKKMNS2017">Wylie et al., 2017</xref>) and declining CBMT exam first-time pass rates, they concluded that the knowledge required for entry-level practice may be too extensive to be taught in four years, and recommended that consideration be
                  given to master’s level entry as a necessary step in addressing professional
                  preparation (p. 192). </p>
               <p>In an effort to understand more about the challenges that faculty face teaching in
                  undergraduate music therapy programs, <xref ref-type="bibr" rid="LRBJ2018">Lloyd
                     et al. (2018)</xref> interviewed music therapy faculty about the internal and
                  external challenges they face in addressing music therapy competencies. While a
                  number of their findings were similar to Ferrer’s (<xref ref-type="bibr"
                     rid="F2018">2018</xref>), <xref ref-type="bibr" rid="LRBJ2018">Lloyd et al.’s
                     (2018)</xref> observations of, and insights into, addressing AMTA competencies
                  in academic programs appear very relevant to the current discussion. This is
                  reflected in a concern expressed by one participant (first quotation) that these
                  authors then contemplate in terms of student preparedness (second quotation):</p>
               <disp-quote>
                  <p>[H]ow many of these [AMTA competencies] are appropriate to be evaluat[ed]? To
                     what extent should students be achieving these competencies? To what extent and
                     where should they be demonstrated? Is 70% okay? Is 70% okay in multicultural
                     areas but not ethics? </p>
               </disp-quote>
               <disp-quote>
                  <p>Though the competencies served as a guide for course content and program
                     structure, program directors and faculty were left to further define those
                     terms, making determinations as to what actually constituted competence, and
                     how to create an environment that allowed students to reach those goals. (p.
                     113) </p>
               </disp-quote>
               <p>
                  <xref ref-type="bibr" rid="LRBJ2018">Lloyd et al.’s (2018)</xref> insights add
                  another important dimension to the CBMT exam discussion—the role of AMTA guidance
and oversight. AMTA appears to provide little guidance about minimally competent practice beyond a list of professional competencies. AMTA also
                  undertakes an evaluation every 10 years, through the Academic Program Approval
                  Committee, of the extent to which these competencies can be verified as being
                  addressed in each academic program, but this does not include evaluation of CBMT
                  domains and related competencies. While the competencies and domains provide an
                  essential foundation for competent educational preparation, this organizational
                  approach appears to leave program directors (and program faculty) with decisions
                  about which competencies, if any, receive emphasis in their academic program.
                  These decisions, at an individual program level, then shape the ways in which
                  students are prepared for the CBMT exam and professional life. The advantage of
                  this approach is that it allows the values, beliefs and experiences of individual
                  program directors and their faculty to shape the development of their students in
                  ways that they believe prepare them for competent practice. The disadvantage is
                  that this preparation may not align with the way the CBMT exam evaluates
                  preparedness, even though the academic program has been approved by AMTA.</p>
                <p>This incongruity leads us to an important juncture, at which the following
                   dimensions interact, illuminating a complex constellation of challenges impacting
                   academic training programs and their relationship to student preparation, as
                   measured by the CBMT exam: </p>
               <list list-type="order">
                  <list-item>
                     <p>CBMT exam first-time pass rates have steadily declined over the last
                        decade.</p>
                  </list-item>
                  <list-item>
                     <p>The CBMT Executive Director states that the scope of music therapy practice
                        is changing and expanding, as defined by the CBMT practice analysis, but
                        that “the change is not being driven by music therapy education” (<xref
                           ref-type="bibr" rid="WBBBCFHHHKKMNS2017">Wylie et al., 2017, p.
                        10</xref>). </p>
                  </list-item>
                  <list-item>
                      <p>Faculty in academic training programs have expressed, for a number of years,
                         concerns regarding the undergraduate music therapy curriculum and their
                         ability to prepare students for professional practice. </p>
                  </list-item>
               </list>
                <p>The relationship between CBMT exam test scores and academic program preparation
                   therefore reflects a complex interplay of factors that is worthy of
                   examination. This study addresses one dimension of this complex dynamic: academic
                  program directors’ perceptions of the meaningfulness and utility of the CBMT exam
                  as a measure of clinical competence. The following research questions guide this
                  purpose:</p>
               <p>According to academic program directors,</p>
               <list list-type="order">
                  <list-item>
                     <p>How effectively does the CBMT exam evaluate students’ abilities to practice
                        competently?</p>
                  </list-item>
                  <list-item>
                     <p>To what extent, if at all, is there a relationship between students’ overall
                        academic and clinical competence and their CBMT exam performance? </p>
                  </list-item>
                  <list-item>
                     <p>What factors, if any, contribute to students’ exam failures? </p>
                  </list-item>
                  <list-item>
                     <p>If music therapy academic program directors provide CBMT exam preparation,
                        how effective do they perceive this preparation to be?</p>
                  </list-item>
                  <list-item>
                     <p>What changes, if any, should be made to the CBMT exam? </p>
                  </list-item>
               </list>
            </sec>
            <!-- sec lvl 4 end -->
         </sec>
         <!-- sec lvl 3 end -->
      </sec>
      <!-- sec lvl 2 end -->
      <!-- sec lvl 2 begin -->
      <sec>
         <title>Method</title>
         <sec>
            <title>Participants</title>
             <p>Seventy-two program directors at AMTA-approved academic training programs,
                identified from the American Music Therapy Association organization directory, were
                invited to participate in this study. In order to ensure anonymity, no descriptive
                information was collected from participants. </p>
         </sec>
         <sec>
            <title>Survey Design</title>
            <p>Specifically designed for this study, the survey comprised 14 questions: nine Likert
               and Likert-type questions and five open-ended questions. See Appendix A for the
               complete survey. After initial construction, the survey went through six rounds of
               revision and included feedback from four music therapy experts and one music therapy
               educator with knowledge in one or more of the following areas: survey design,
               question construction, knowledge of academic training programs, and data analysis
               procedures for survey research. </p>
         </sec>
         <sec>
            <title>Procedures</title>
            <p>Potential participants were sent an email invitation describing the study, along with
               a hyperlink to the survey itself. A follow-up email was sent to all potential
               participants 10 days after the initial invitation. The survey was housed in
               SurveyMonkey, and individual responses were stored in the secure SurveyMonkey portal
               and then downloaded to the researchers’ password-protected personal computers in
                summary form after the close of the survey. The survey included a consent form,
                which participants could review and accept prior to starting the survey; those who
                did not wish to proceed could leave the survey portal. </p>
         </sec>
         <sec>
            <title>IRB Approval</title>
             <p>The study was reviewed by the Shenandoah University Institutional Review Board and
                determined to be exempt from review. </p>
         </sec>
         <sec>
            <title>Data Analysis </title>
            <p>Two forms of data analysis were undertaken: </p>
             <p>1. Summary data for the nine Likert and Likert-type questions were calculated as
                percentages and averages and presented in table or descriptive written form. In doing
                so, we followed guidelines provided by Boone and Boone (<xref ref-type="bibr"
                   rid="BB2012">2012</xref>) and Harpe (<xref ref-type="bibr" rid="H2015"
                >2015</xref>). We treated Likert and Likert-type response items (individual
                items) as interval-level measurements (e.g., effective to ineffective; agree to
                disagree), assuming the distance between each response option was equal. Further, we
                used criteria provided by Harpe (<xref ref-type="bibr" rid="H2015">2015</xref>) when
                considering responses in Table 1 (faculty perceptions of the effectiveness of the
                CBMT exam in evaluating student competence in the CBMT domains) as a Likert scale
                response, calculating a total average score across all items.</p>
            <p>2. Data from the open-ended questions were analyzed qualitatively, using analytical
               procedures consistent with qualitative content analysis (<xref ref-type="bibr"
                  rid="GK2016">Ghetti &amp; Keith, 2016</xref>; <xref ref-type="bibr" rid="OC2016"
                  >O’Callahan, 2016</xref>). These procedures were as follows:</p>
            <list list-type="order">
               <list-item>
                  <p>The researchers read responses to each open-ended survey question in their
                     entirety to understand these responses as a whole. </p>
               </list-item>
               <list-item>
                  <p>After reviewing each answer, Eyre took primary responsibility for data
                     analysis, organizing answers to each survey question into categories based upon
                     their similarities. The number of responses in each category was also noted,
                     and a summary table was created.</p>
               </list-item>
               <list-item>
                  <p>All of these categories of response were examined together, and further
                     organized into broader themes that subsumed related categories.</p>
               </list-item>
               <list-item>
                  <p>These themes were then shaped into a written narrative that expressed the
                     dimensionality of the theme, using category data to do so. </p>
               </list-item>
               <list-item>
                  <p>Once completed, narrative summaries were reviewed and verified by the
                     co-researchers, with all theme and category data verified against the original
                     responses to ensure the integrity of the analytic process. </p>
               </list-item>
            </list>
             <p>After completing this series of procedural steps for each question, these narrative
                summaries were arranged sequentially by survey question. In order to maintain clarity
                and transparency with regard to qualitative responses in the Results section,
                qualitative codes in each narrative summary were transformed into quantitative
                representations for statistical description. This data-transformation approach is
                appropriate in mixed-methods studies for purposes of “paradigmatic corroboration,”
                where both qualitative and quantitative responses examine a similar data set (<xref
                   ref-type="bibr" rid="S2016">Saldaña, 2016, p. 26</xref>).</p>
            <p>Because of the complexity of the answers to each question and the overlap in
               responses between questions, a further analytic stage was undertaken to represent the
               data in ways that a) capture meta-themes, and b) reflect the responses of program
               directors as a whole. In doing so, this process integrated themes across questions,
               removed redundancies in participant responses between questions, and created an
               overall (meta) perspective of participant responses. This was undertaken as
               follows:</p>
            <list list-type="order">
               <list-item>
                  <p>The researchers read the narrative summaries for each question to understand
                     them as a whole. </p>
               </list-item>
               <list-item>
                  <p>These narratives were reorganized into cross-question themes and sub-themes
                     based upon the research questions.</p>
               </list-item>
               <list-item>
                  <p>Redundancies were removed and narrative descriptions created that included:
                     categories (headings), themes (subheadings) and subthemes (topical paragraphs)
                     that reflected an overall conceptualization of participant responses. </p>
               </list-item>
               <list-item>
                  <p>These newly formed narrative descriptions were verified independently by the
                     co-authors, who checked each other’s categories, themes and subthemes against
                     the original written responses (raw data). </p>
               </list-item>
            </list>
            <p>The Discussion section reflects the final written product of this analytic process.
            </p>
         </sec>
         <sec>
            <title>Methods for Ensuring Trustworthiness</title>
            <p>The integrity of the data analysis process was ensured primarily by documenting each
               stage in the analytic process in detail so that each researcher could independently
               verify the co-researcher’s work. As such, researcher triangulation occurred through
               1) numerous discussions of the data, coding procedures, and theme development, 2)
               independent examination of the process through which categories, themes and subthemes
               were developed, 3) independent verification of the number of coded items in each
               category, and 4) independent analysis of the narratives created to form the
               Discussion section. In these ways, each co-researcher was held accountable to the
               other, and each stage of the data analysis process was transparent and available for
                verification. Readers are invited to review all the original responses to the
                open-ended questions in Appendix B and to review these in relation to the
                co-researchers’ data analysis processes (presented in this document). </p>
         </sec>
         <!-- sec lvl 2 end -->
         <!-- sec lvl 2 begin -->
         <sec>
            <title>Results</title>
             <p>Forty-one program directors at AMTA-approved academic training programs participated
                in this study, a response rate of 57%. Survey results are divided into two sections:
                the first provides summary data regarding program directors’ responses to the nine
                Likert and Likert-type questions; the second presents an analysis of responses to the
                five open-ended questions contained within the survey. </p>
         </sec>
         <sec>
            <title>Summary of Responses to Likert and Likert-type Questions</title>
            <p>Tables 1 and 2 summarize program directors’ responses to two questions addressing the
               effectiveness of the CBMT exam in evaluating clinical competence. Descriptive
               statistics were calculated by assigning a numerical value to each rating (1 =
               ineffective to 5 = effective) and calculating the average ratings accordingly. </p>
            <table-wrap id="tbl1">
               <label>Table 1</label>
               <!-- optional label and caption -->
               <caption>
                  <p>Faculty Perceptions of the Effectiveness of the CBMT Exam in Evaluating Student
                     Competence in the CBMT Domains</p>
               </caption>
               <table>
                  <thead>
                     <tr>
                        <th>CBMT Domain</th>
                        <th>Mean rating</th>
                        <th>Standard deviation</th>
                        <th>95% confidence interval</th>
                     </tr>

                  </thead>
                  <tbody>
                      <tr>
                         <td>1. Referral, assessment and treatment planning</td>
                         <td>3.54</td>
                         <td>1.27</td>
                         <td>3.14–3.94</td>
                      </tr>
                      <tr>
                         <td>2. Treatment, implementation and termination</td>
                         <td>3.27</td>
                         <td>1.38</td>
                         <td>2.83–3.70</td>
                      </tr>
                      <tr>
                         <td>3. Documentation and evaluation</td>
                         <td>3.44</td>
                         <td>1.32</td>
                         <td>3.02–3.96</td>
                      </tr>
                      <tr>
                         <td>4. Professional development and responsibilities</td>
                         <td>3.61</td>
                         <td>1.20</td>
                         <td>3.23–3.99</td>
                      </tr>
                  </tbody>
               </table>
            </table-wrap>
            <p>Table 1 summarizes faculty perceptions regarding the effectiveness of the CBMT exam
               in evaluating the four domain areas derived from the CBMT practice analysis. Average
               responses varied for each item, from 3.27 (treatment, implementation and termination)
               to 3.61 (professional development and responsibilities). When responses to these
               questions were treated as a Likert scale response (<xref ref-type="bibr" rid="BB2012"
                  >Boone &amp; Boone, 2012</xref>; <xref ref-type="bibr" rid="H2015">Harpe,
                  2015</xref>), the total average response was 3.34. Scores in the 3 to 4 range are
               rated as “neither effective nor ineffective” (a rating of 3) to “somewhat effective”
               (a rating of 4). </p>
            <table-wrap id="tbl2">
               <label>Table 2</label>
               <!-- optional label and caption -->
               <caption>
                  <p>Faculty Perceptions of the Relationship between Clinical Competence, Academic
                     Grades, and the CBMT Exam</p>
               </caption>
               <table>
                  <thead>
                     <tr>
                        <th>Clinical competence, academic grades and CBMT exam performance</th>
                        <th>Mean response</th>
                        <th>Standard deviation</th>
                        <th>95% confidence interval</th>
                     </tr>
                  </thead>
                  <tbody>
                      <tr>
                         <td>Clinical competence and exam performance</td>
                         <td>3.32</td>
                         <td>1.25</td>
                         <td>2.92–3.71</td>
                      </tr>
                      <tr>
                         <td>Academic grades and exam performance</td>
                         <td>3.78</td>
                         <td>1.12</td>
                         <td>3.42–4.13</td>
                      </tr>
                  </tbody>
               </table>
            </table-wrap>
            <p>Table 2 provides a summary of faculty perceptions regarding the relationship between
               clinical competence, academic grades and the CBMT exam. When faculty rated the
               relationship between clinical competence and exam performance, the average of these
               responses was 3.32. When faculty rated the relationship between academic grades and
               exam performance, the average of these responses was 3.78. Scores in the 3 to 4 range
               are rated as “neither related nor unrelated” (a rating of 3) to “somewhat related” (a
               rating of 4).</p>
            <p>Table 3 provides a summary of the reasons program directors perceive students “could
               or might fail the exam” (1 = disagree to 5 = agree). Average responses to each
               question varied from 2.40 (Unable to provide breadth and depth of coursework) to 3.47
               (I do not teach to the exam). Of particular interest were responses concerning
               academic preparation for the exam (I do not teach to the exam; I am unable to provide
               the breadth and depth of coursework; inconsistent philosophy) and perceptions related
               to the validity of the exam (the exam is not valid; some of the exam is irrelevant to
               competent and/or safe clinical practice). Regarding academic preparation, the average
               response to the item “unable to provide breadth and depth of coursework” was 2.4,
               where scores in the 2 to 3 range reflect “somewhat disagree” to “neither agree nor
               disagree.” When program directors were asked about “inconsistent philosophy,” the
               average of these responses was 2.63, where scores in the 2 to 3 range reflect
               “somewhat disagree” to “neither agree nor disagree.” Finally, when program directors
               were asked to rate the extent of agreement regarding “I do not teach to the exam,”
               the average of these responses was 3.47, where scores in the 3 to 4 range reflect
               “neither agree nor disagree” to “somewhat agree.”</p>
            <table-wrap id="tbl3">
               <label>Table 3</label>
               <!-- optional label and caption -->
               <caption>
                  <p>Faculty Perceptions of Reasons Why Students Could or Might Fail the CBMT
                     Exam</p>
               </caption>
               <table>
                  <thead>
                     <tr>
                        <th>Reason student could or might fail exam</th>
                        <th>Mean response</th>
                        <th>Standard deviation</th>
                        <th>95% confidence interval</th>
                     </tr>
                  </thead>
                  <tbody>
                     <tr>
                        <td>I do not teach to the exam</td>
                        <td>3.47</td>
                        <td>1.45</td>
                        <td>3.00–3.95</td>
                     </tr>
                     <tr>
                        <td>Unable to provide breadth and depth of coursework</td>
                        <td>2.40</td>
                        <td>1.37</td>
                        <td>1.96–2.84</td>
                     </tr>
                     <tr>
                        <td>Inconsistent philosophy</td>
                        <td>2.63</td>
                        <td>1.43</td>
                        <td>2.17–3.08</td>
                     </tr>
                     <tr>
                        <td>Fieldwork and internship supervisors share equal responsibility</td>
                        <td>3.41</td>
                        <td>1.37</td>
                        <td>2.97–3.85</td>
                     </tr>
                     <tr>
                        <td>The exam is not valid</td>
                        <td>2.80</td>
                        <td>1.49</td>
                        <td>2.32–3.28</td>
                     </tr>
                     <tr>
                        <td>Recent changes to the passing score make it too hard</td>
                        <td>2.88</td>
                        <td>1.36</td>
                        <td>2.44–3.31</td>
                     </tr>
                     <tr>
                        <td>Some of the exam is irrelevant to competent and/or safe practice</td>
                        <td>3.28</td>
                        <td>1.47</td>
                        <td>2.81–3.74</td>
                     </tr>
                  </tbody>
               </table>
            </table-wrap>
             <p>Further, when program directors were asked about their perceptions of the validity of
                the exam, responses to these two questions were as follows. When asked to respond to
                the statement “the exam is not valid,” the average of these responses was 2.8, where
                scores in the 2 to 3 range reflect “somewhat disagree” to “neither agree nor
                disagree.” When asked to respond to the statement “some of the exam is irrelevant to
                competent and/or safe clinical practice,” the average of these responses was 3.28,
                where scores in the 3 to 4 range reflect “neither agree nor disagree” to “somewhat
                agree.” </p>
             <p>Program directors were also asked to indicate their level of agreement with a
                statement made by CBMT Executive Director Schneck that was included in the final
                report of the Master’s Level Entry subcommittee: “While music therapy clinical
                practice is advancing, as reflected in the practice analysis and increased cut
                scores; [sic] the change is not being driven by music therapy education as indicated
                in the declining pass rate” (in <xref ref-type="bibr" rid="WBBBCFHHHKKMNS2017">Wylie
                   et al., 2017, pp. 10–11</xref>). When these responses were examined as a whole,
                18% (7) of faculty disagreed, 15% (6) somewhat disagreed, 15% (6) neither agreed nor
                disagreed, 23% (9) somewhat agreed, and 30% (12) agreed, with the average of these
                responses being 3.33. Scores in the 3 to 4 range are rated as “neither agree nor
                disagree” (a rating of 3) to “somewhat agree” (a rating of 4). </p>
            <p>Finally, program directors were asked if they, or their faculty colleagues, provided
               specific preparation for the CBMT exam: 59% said they did, whereas 41% said they did
               not. When those faculty who said they provided specific preparation for the CBMT exam
                  (<italic>n</italic> = 28) were asked how effective they perceived their
                preparation activities to be, 3% (1) reported they were ineffective, none (0) found
                them somewhat ineffective, 8% (3) described them as neither effective nor
                ineffective, 41% (16) described them as somewhat effective, and 21% (8) described
                them as effective.</p>
         </sec>
         <sec>
            <title>Responses to Open-Ended Questions</title>
            <p>Participant responses to the five open-ended questions are summarized below (Tables
               4–8) and described in detail in the Discussion. This includes comments related to the
                exam (Table 4), why program directors believe students may be failing the exam
               (Table 5), how academic programs support student preparation for the exam (Table 6),
               perceived barriers to effective preparation (Table 7), and what changes, if any,
                program directors would make to the exam (Table 8). In providing these summary data,
                   <italic>respondents</italic> and <italic>responses</italic> are differentiated.
                The term <italic>respondents</italic> denotes the total number of program directors
                who responded to a particular question, whereas <italic>responses</italic> reflects
                the number of comments made for each theme or category of response. Thus, the total
                number of responses is often larger than the number of respondents, as many of the
                program directors gave detailed written responses that were included in the analysis for
               multiple themes. Percentage calculations in each of these categories are based on the
               number of respondents who made a statement about a particular theme, compared to the
               total number of respondents for the question. </p>
         </sec>
         <sec>
            <title>Comments related to the CBMT exam </title>
             <p>Program directors were given the opportunity to describe what they perceived the CBMT
                exam to evaluate, based on their experiences of preparing students to take the exam.
               Thirty-nine program directors responded, providing 52 responses that are summarized
               in Table 4. These responses are as follows: 1) 54% (21) of respondents questioned or
               criticized the exam’s ability to evaluate clinical competence, 2) 41% (16) perceived
               the exam to evaluate test-taking abilities, 3) 31% (12) perceived the exam to
               evaluate a student’s ability to practice competently and/or safely, and 4) 8% (3)
               were unsure what the exam evaluated. </p>
            <table-wrap id="tbl4">
               <label>Table 4</label>
               <!-- optional label and caption -->
               <caption>
                  <p>Program Directors’ Comments Related to the CBMT Exam</p>
               </caption>
               <table>
                  <thead>
                     <tr>
                        <th>Categories of overall narrative responses to CBMT Exam</th>
                        <th>Percentage of <italic>N</italic></th>
                        <th><italic>n</italic>
                        </th>
                     </tr>
                  </thead>
                  <tbody>
                     <tr>
                        <td>Questioned or criticized the exam’s ability to evaluate clinical
                           competence</td>
                        <td>54%</td>
                        <td>21</td>
                     </tr>
                     <tr>
                        <td>Exam evaluates test-taking abilities</td>
                        <td>41%</td>
                        <td>16</td>
                     </tr>
                     <tr>
                        <td>Exam evaluates ability to practice competently and/or safely</td>
                        <td>31%</td>
                        <td>12</td>
                     </tr>
                     <tr>
                        <td>Unsure what exam evaluates</td>
                        <td>8%</td>
                        <td>3</td>
                     </tr>
                  </tbody>
               </table>
            </table-wrap>
         </sec>
         <sec>
            <title>Comments Related to the MLE Report</title>
            <p>Program directors were given an opportunity to respond to CBMT Executive Director
               Schneck’s (<xref ref-type="bibr" rid="WBBBCFHHHKKMNS2017">in Wylie et al., 2017, p.
                  10</xref>) comments related to academic training and exam pass rates. Twenty-nine
               program directors responded, providing a total of 49 responses, which are summarized
                in Table 5. These include concerns regarding: 1) external validity, or the ability of
                the exam to measure entry-level competency (62%); 2) erroneous assumptions on the
                part of CBMT regarding the reason for increased failures (24%); 3) an undergraduate
                curriculum that cannot expand to reflect the breadth and depth of our developing
                practice (21%); and 4) insufficient communication by and between CBMT, AMTA, and the
                National Association of Schools of Music (NASM) (21%). </p>
            <table-wrap id="tbl5">
               <label>Table 5</label>
               <!-- optional label and caption -->
               <caption>
                  <p>Faculty Perceptions of Reasons Why Students May be Failing the CBMT Exam</p>
               </caption>
               <table>
                  <thead>
                     <tr>
                        <th>Categories of narrative responses to reasons for student failures of
                           CBMT exam</th>
                        <th>Percentage of <italic>N</italic>
                        </th>
                        <th>
                           <italic>n</italic>
                        </th>
                     </tr>
                  </thead>
                  <tbody>
                     <tr>
                        <td>Concerns with external validity: Ability of exam to measure entry level
                           competency </td>
                        <td>62%</td>
                        <td>18</td>
                     </tr>
                     <tr>
                        <td>Erroneous assumptions on the part of the CBMT regarding reason for
                           increased failures</td>
                        <td>24%</td>
                        <td>7</td>
                     </tr>
                     <tr>
                        <td>Undergraduate curriculum cannot expand to reflect the breadth and depth
                           of our developing practice</td>
                        <td>21%</td>
                         <td>6</td>
                     </tr>
                     <tr>
                        <td>Issues of poor communication between organizations and educators</td>
                        <td>21%</td>
                        <td>6</td>
                     </tr>
                     <tr>
                        <td>Concerns with internal validity: Often more than one “correct” answer
                           depending on perspective </td>
                        <td>17%</td>
                        <td>5</td>
                     </tr>
                     <tr>
                        <td>Problems related to disconnect among requirements related to governing
                           organizations (AMTA, CBMT, &amp; NASM)</td>
                        <td>17%</td>
                        <td>5</td>
                     </tr>
                     <tr>
                        <td>Faculty may not be current with evidence-based practice in their
                           teaching</td>
                        <td>7%</td>
                        <td>2</td>
                     </tr>
                  </tbody>
               </table>
            </table-wrap>
         </sec>
         <sec>
            <title>CBMT Exam Preparation </title>
             <p>Program directors were asked whether they provided specific kinds of exam
                preparation for their students and, if so, to describe this preparation.
                Twenty-eight program directors (59%) reported that they (or their faculty
                colleagues) provided specific exam preparation. The three main methods were: 1)
                formal instruction (32%), 2) instructional methods within a specific course or
                courses (25%), and 3) specific exam preparation outside of class (21%). These
                responses are summarized in Table 6. </p>
            <table-wrap id="tbl6">
               <label>Table 6</label>
               <caption>
                  <p>Specific CBMT Exam Preparation Methods</p>
               </caption>
               <table>
                  <thead>
                     <tr>
                        <th>Categories of narrative responses to methods of academic preparation for
                           CBMT exam</th>
                        <th>Percentage of <italic>N</italic>
                        </th>
                        <th>
                           <italic>n</italic>
                        </th>
                     </tr>
                  </thead>
                  <tbody>
                     <tr>
                        <td>Formal instruction, context unspecified</td>
                        <td>32%</td>
                        <td>9</td>
                     </tr>
                     <tr>
                        <td>Instructional methods within specific course or courses</td>
                        <td>25%</td>
                        <td>7</td>
                     </tr>
                     <tr>
                        <td>Specific exam preparation outside of class</td>
                        <td>21%</td>
                        <td>6</td>
                     </tr>
                     <tr>
                         <td>Informal instruction (one-on-one mentorship with faculty and
                            graduates)</td>
                        <td>11%</td>
                        <td>3</td>
                     </tr>
                     <tr>
                        <td>Distribution of prepared materials</td>
                        <td>7%</td>
                        <td>2</td>
                     </tr>
                     <tr>
                        <td>Internship Seminar</td>
                        <td>4%</td>
                        <td>1</td>
                     </tr>
                  </tbody>
               </table>
            </table-wrap>
             <p>Program directors were also asked if they experienced any barriers in preparing
                students for the exam. Thirty-four program directors responded, providing a total of
                54 responses, which are summarized in Table 7. Although the majority of the
                responses (<italic>n</italic> = 24; 71%) reflect a belief that the exam preparation
                they provided students was effective or somewhat effective, the four most common
                barriers reported were: 1) a lack of knowledge about exam content and a lack of
                experience with the exam (32%), 2) the perception that the undergraduate curriculum
                is insufficient in providing the breadth of knowledge and skills required to prepare
                students for competent practice (27%), 3) students’ weak analytical and critical
                thinking skills (9%), and 4) students waiting too long after internship to take the
                exam (9%).</p>
            <table-wrap id="tbl7">
               <label>Table 7</label>
               <caption>
                  <p>Perceived Barriers to Exam Preparation</p>
               </caption>
               <table>
                  <thead>
                     <tr>
                        <th>Categories of narrative responses to barriers to academic preparation
                           for CBMT exam</th>
                        <th>Percentage of <italic>N</italic>
                        </th>
                        <th>
                           <italic>n</italic>
                        </th>
                     </tr>
                  </thead>
                  <tbody>
                     <tr>
                        <td>Lack of knowledge about the exam content and lack of experience with the
                           exam created barriers</td>
                        <td>32%</td>
                        <td>11</td>
                     </tr>
                     <tr>
                        <td>Undergraduate curriculum insufficient in music therapy and related
                           credits given the breadth of knowledge and practice skills required to
                           prepare students for competent practice</td>
                        <td>27%</td>
                        <td>9</td>
                     </tr>
                     <tr>
                        <td>Students may have a weakness with analytical and critical thinking
                           skills rendering them less effective as test takers</td>
                        <td>9%</td>
                        <td>3</td>
                     </tr>
                     <tr>
                        <td>Students may wait too long after internship to take the exam or neglect
                           to reach out to educators for help after internship</td>
                        <td>9%</td>
                        <td>3</td>
                     </tr>
                     <tr>
                        <td>NASM curriculum focus on classical Western music does not meet
                           educational needs of music therapy students</td>
                        <td>6%</td>
                        <td>2</td>
                     </tr>
                     <tr>
                        <td>Persons who are not native English speakers have additional problems
                           with the structure of the exam</td>
                        <td>6%</td>
                        <td>2</td>
                     </tr>
                  </tbody>
               </table>
            </table-wrap>
         </sec>
         <sec>
             <title>Changes to the CBMT Exam</title>
             <p>Finally, program directors were asked what changes, if any, they would make to the
                exam. Thirty-four program directors responded, providing a total of 41 responses,
                which are summarized in Table 8. A wide variety of changes were suggested; these
                are presented in detail in the Discussion. </p>
            <table-wrap id="tbl8">
               <label>Table 8</label>
               <caption>
                  <p>Changes Program Directors Would Make to the Exam</p>
               </caption>
               <table>
                  <thead>
                     <tr>
                        <th>Categories of narrative responses to changes needed in CBMT exam</th>
                        <th>Percentage of <italic>N</italic>
                        </th>
                        <th>
                           <italic>n</italic>
                        </th>
                     </tr>
                  </thead>
                  <tbody>
                     <tr>
                        <td>Various changes needed (presented in Discussion)</td>
                        <td>85%</td>
                        <td>29</td>
                     </tr>
                     <tr>
                        <td>Concerns about lack of educators’ knowledge about: exam content, exam
                           changes, changing the focus of the exam without providing information to
                           faculty</td>
                        <td>21%</td>
                        <td>7</td>
                     </tr>
                     <tr>
                        <td>No changes needed; exam is accurate and valid and reflects current
                           practice</td>
                        <td>15%</td>
                        <td>5</td>
                     </tr>
                  </tbody>
               </table>
            </table-wrap>
         </sec>
      </sec>
      <sec>
         <title>Discussion</title>
         <p>The purpose of this study was to examine academic program directors’ perceptions of the
            meaningfulness and utility of the CBMT exam in measuring clinical competence. This
            purpose was guided by a series of research questions that also sought to understand the
            relationship between academic and clinical competence and exam performance, reasons
            students “could or might fail the exam,” and any changes program directors would make to
            the CBMT exam. In this section we combine responses to the Likert and Likert-type
            questions with our analysis of the narrative responses (open-ended questions), to
             provide a comprehensive picture of participants’ perceptions of the exam. In doing so,
             we begin by addressing the effectiveness of the exam, then turn to reasons students
             could or might fail the exam, and conclude the Discussion by summarizing the
             recommendations these program directors made for changing the exam. </p>
         <p>Summary data from the Likert and Likert-type questions (Tables 1–3) suggest that program
            directors do not, on average, perceive the exam to be effective or ineffective in
            evaluating student competence to practice safely and effectively (x&#772; = 3.34; Table
            1), and that, on average, they viewed clinical competence and academic grades as neither
            related nor unrelated to CBMT exam performance (x&#772; = 3.32 and x&#772; = 3.78; Table
            2). These averages were clarified in the written responses (open-ended questions), where
            the majority of the faculty (54%; Table 4) questioned the exam’s capacity to evaluate
             clinical competence. When reporting what they believed the CBMT exam evaluates, 41% of
             respondents indicated that they perceived the exam to be a test of a student’s ability
             to take a standardized multiple-choice test.</p>
         <p>These faculty perceptions conflict somewhat with a recently completed study
            investigating certificants’ perceptions and experiences of the board certification exam,
            including identifying predictors of exam success. <xref ref-type="bibr" rid="HTC2020"
                >Hsiao et al. (2020)</xref> found that GPA and test anxiety were two strong
             predictors of exam success, with GPA showing an odds ratio of 4.386 (<italic>p
             </italic>= .001), suggesting that a one-unit increase in GPA multiplies the odds of
             passing the exam on the first attempt by 4.386 (an increase of roughly 339%).
             Additionally, anxiety showed an odds ratio of 0.464 (<italic>p </italic>&lt; .001),
             suggesting that for every unit increase in anxiety, the odds of passing the exam on
             the first attempt decrease by 53.6%. </p>
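Because odds ratios are easy to misread as percentages, the conversion can be sanity-checked with a minimal sketch (the helper function is ours, not from the study; only the odds ratios 4.386 and 0.464 come from Hsiao et al., 2020): an odds ratio is a multiplier on the odds, so the percent change in odds is (OR − 1) × 100.

```python
# Illustrative sketch: converting a logistic-regression odds ratio into a
# percent change in odds. The helper name is ours, not from Hsiao et al. (2020);
# only the reported odds ratios (4.386 for GPA, 0.464 for anxiety) come from the study.

def percent_change_in_odds(odds_ratio: float) -> float:
    """Percent change in odds per one-unit increase in the predictor."""
    return (odds_ratio - 1.0) * 100.0

# GPA: OR = 4.386 -> each one-unit increase multiplies the odds by 4.386,
# an increase of about 338.6% (not 4.386%).
print(round(percent_change_in_odds(4.386), 1))  # 338.6

# Anxiety: OR = 0.464 -> each one-unit increase multiplies the odds by 0.464,
# a decrease of about 53.6%.
print(round(percent_change_in_odds(0.464), 1))  # -53.6
```

The sign of (OR − 1) indicates direction: ratios above 1 raise the odds of passing, ratios below 1 lower them.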
         <p>Significantly, of 192 respondents who provided comments in the <xref ref-type="bibr"
               rid="HTC2020">Hsiao et al. (2020)</xref> study in response to the question, “Would
            you agree that the [board certification] examination reflects your competence as a music
            therapist?,” almost all stated reasons for their disagreement. The following three
             reasons were given: 1) the exam did not fully reflect the test takers’ actual educational
             and clinical experiences (<italic>n </italic>= 64; 33.3%), 2) the testing format
            addressed only content knowledge and favored one type of test taker (<italic>n
            </italic>= 43; 22.4%), and 3) the test was subjective due to its bias toward certain
            theoretical approaches and philosophies and its US-centrism (<italic>n </italic>= 10;
            5.4%). This appears to align with the findings from this study, in which faculty
            expressed concerns about the construction of the exam.</p>
         <sec>
            <title>Construct and Content Validity </title>
            <p>Concerns regarding content and construct validity were expressed throughout the
               survey, particularly in the open-ended questions (see Table 5). The leading problem
               that program directors had with the exam was its concreteness, which they viewed as
               disregarding the complex clinical contexts in which knowledge and skills are applied.
               From this perspective, clinical interventions are dependent upon the context in which
               they occur, whereas the exam accepts only one correct answer, which is often correct
               only within one particular theoretical approach (e.g. behavioral) and/or clinical
               context. Thus, from their perspective, the exam does not adequately reflect clinical
               practice, revealing instead a disconnection between the “correct” answer and the
               myriad ways clinicians actually think about their work with clients. </p>
            <p>Some educators also reported that their clinical philosophy and way of thinking about
               and teaching music therapy was not consistent with the exam format or content, nor
               was the exam able to evaluate the kinds of clinical practice skills they identified
               as central to competent practice. For example, these educators believed that students
               who reflected deeply about the clinical context of the question tended to choose more
               incorrect answers because they were aware of the multiple ways one might respond to a
               client, depending on the context of the response. In addition, other educators
               observed that the test questions often had distractors that led students to have
               difficulty recognizing what knowledge the question was testing. Overall, for these
               respondents, the exam was therefore perceived to be a measure of test taking (i.e.
               understanding what is being tested) rather than evaluating sound clinical
               decision-making skills that are foundational to safe and competent practice. </p>
         </sec>
         <sec>
            <title>Evaluating Entry-Level Practice</title>
            <p>Program directors also expressed a range of opinions as to whether the current exam
                is reflective of entry-level practice. This perceived lack of content validity was
                premised upon concerns that some questions included in the exam may
               require knowledge that extends beyond what one can reasonably be expected to know at
               the bachelor’s level. These concerns were connected to the construction of the exam,
               which is derived from the Practice Analysis. The Practice Analysis is undertaken
               every five years (the last was in 2019) to generate a list of job tasks that are used
               to generate and define Board Certification Domains. These domains are then used in
                the construction of the exam, which is managed by the testing firm Applied
                Measurement Professionals (AMP; CBMT, n.d.-b). </p>
            <p>While the involvement of AMP ensures psychometrically sound procedures, many faculty
               remained doubtful that the exam solely tests bachelor’s level competency. These
               concerns were based on the perception that the exam may include questions formulated
               from 1) the knowledge and experience of graduate equivalency professionals (master’s
                degree), whom they do not perceive as comparable to bachelor’s level entry (BLE)
               professionals, 2) professionals who have specialized training (e.g. NICU, NMT etc.),
               3) professionals working in a specialized environment performing duties that require
               an advanced breadth and depth of practice (beyond BLE), and 4) professionals who
               carry out duties that extend beyond the job description of a music therapist (e.g.
               activity director, recreational therapist or case manager). Thus, these faculty
               believed that the Practice Analysis process results in an exam that extends beyond
               the scope and training for a BLE music therapist as defined by AMTA for entry into
               the profession (see Table 5). </p>
            <p>Faculty also questioned whether it was preferable to have an exam that evaluated
               entry-level practice at the bachelor’s level or required a master’s degree to enter
               the profession. These comments were related to the MLE Subcommittee report and the
               AMTA Board of Directors’ decision about MLE, which was unknown at the time of this
               survey. A frequent comment was that undergraduate education and training was not
               adequate to teach entry-level skills and knowledge, while the current exam tests
               beyond entry level. The response of one educator succinctly captures the perspective
               of a number of these faculty members: </p>
            <disp-quote>
               <p>“[Music Therapy] is not an evolving undergraduate profession, we have huge burn
                  out because jobs are being created for graduates with undergraduate level training
                  who cannot and should not be exposed to the deeper, advanced clinical work
                  required from graduate level training. We are in desperate need to re-think this
                  as it impacts so much more than internal decisions. This one exam is impacting our
                  profession and field as it is known to the public, job market, and any kind of
                  potential career trajectory right up into leadership positions in [administration]
                  where we need [graduates] to be heading to support and sustain the future of the
                  field.”</p>
            </disp-quote>
         </sec>
         <sec>
            <title>CBMT, AMTA and NASM</title>
            <p>Throughout the narrative responses, program directors expressed a range of concerns
               about the exam that they viewed to be a result of the relationship between CBMT,
               AMTA, and NASM. The primary theme expressed through these statements was the
               disconnect between the Professional Competencies of AMTA and the CBMT Domains, which
               faculty perceived as being qualitatively different. Further, the distribution of
               credits required by NASM for music therapy degree programs was another concern,
               particularly the core music credits that many perceived as having little application
               to the music skills required for music therapy practice. Finally, some faculty felt
               that the exclusion of educators during exam construction exacerbated the perceived
               disconnect between AMTA and CBMT competencies, making it difficult for educators to
               create a curriculum based on AMTA requirements that also addresses the CBMT domains. </p>
            <p>The majority of program directors also felt that there was a lack of communication
               between CBMT, AMTA and educators, and that this posed a number of problems in
               preparing students for the exam. Notable among these concerns were the following: 1)
               faculty lack accurate knowledge about exam content and are likely to have no recent
               experience taking the exam, 2) there is no systematic mechanism through which faculty
               are informed about changes to cut scores, 3) there is no way for educators to
               identify where the focus of the exam will shift from cycle to cycle based on the
               practice analysis, and 4) there are no examples of actual test questions that mirror
               the current exam. When taken as a whole, this appears to suggest that educators
               perceive significant obstacles in preparing students for the exam. </p>
         </sec>
         <sec>
            <title>Comments Related to Decreased Pass Rates</title>
            <p>Program directors’ responses to Executive Director Schneck’s statement regarding
               educational preparation and clinical competence (“While music therapy clinical
               practice is advancing, as reflected in the practice analyses and increased cut
               scores; [sic] the change is not being driven by music therapy education as indicated
               in the declining pass rates”) may be viewed from two different perspectives. When
                Likert and Likert-type responses are examined, they suggest widely divergent points
               of view: 18% (7) of faculty disagreed with the Executive Director’s statement, 15%
               (6) somewhat disagreed, 15% (6) neither agreed nor disagreed, 23% (9) somewhat
               agreed, and 30% (12) agreed. When the average of these responses was calculated, the
               mean was 3.3. Scores in the 3 to 4 range are rated as “neither agree nor disagree” (a
               rating of 3) to “somewhat agree” (a rating of 4). However, written responses from
               program directors provide a different picture, with most faculty members responding
               in ways that suggest increased failures were the result of systemic problems in the
               construction of the exam (already described in this section). </p>
            <p>Some educators also drew attention to the fact that there is no evidence that the
               recent decline in the pass rate is associated with inadequate academic preparation.
                Logically, they pointed out, if clinical practice is advancing, how can CBMT conclude
               that education is not driving this advancement? Two educators take this logic
               further, suggesting that improvements in education may be causing lower scores, as
               students are being taught more advanced skills and concepts than those represented in
               the exam—which requires a more cognitive, cause-effect clinical perspective. </p>
         </sec>
         <sec>
            <title>Responsibilities of Faculty </title>
            <p>While most faculty members perceived that increased failures were due to systemic
               problems, two program directors suggested that educators themselves may be at fault,
               in that they may not be keeping abreast of evidence-based practice, or they may not
               include an adequate number of clinical approaches and philosophies, thus failing to
               prepare students. Several faculty members also stressed that it was the educator’s
               job to prepare students to take this kind of test by developing test-taking skills
               throughout their undergraduate or equivalency training. Recent successful exam
               candidates in <xref ref-type="bibr" rid="HTC2020">Hsiao et al.’s (2020)</xref> study
               found academic course work to be helpful or very helpful (65%), but they requested
               additional support in terms of: 1) provision of an overview of the exam process, 2)
               provision of current resources to prepare for the exam, 3) assistance to develop
               skills needed to take standardized tests, and 4) the inclusion of exam questions in
               academic coursework that would be similar to questions used in the board
               certification exam. Certainly, the apparent discrepancy between the level of
               effectiveness with which faculty believe they prepare students for the exam (the
               majority of responses reflect a belief that the exam preparation they provide
                students was effective or somewhat effective), and the continued decline in
                first-time pass rates, suggests that faculty should give increased attention to the
                preparation methods and materials they provide for their students.</p>
         </sec>
         <sec>
            <title>Responsibilities of Students</title>
             <p>Some faculty also recognized that students had a role to play in passing the exam,
                noting that while some students had excellent clinical skills and an instinctive
                awareness when working with clients, they did not possess the analytical and critical
                thinking skills needed to pass the exam. Others believed that the timing of the test
               was a factor, since students who take the exam too long after internship and/or
               neglect to ask for help in preparing for the exam have more difficulty passing.
               Finally, some program directors observed that students for whom English is a second
               language had much greater difficulty passing the exam because of the ways in which
               language is used in the exam, a perception consistent with <xref ref-type="bibr"
                  rid="HTC2020">Hsiao et al.’s (2020)</xref> study findings. </p>
         </sec>
         <sec>
            <title>Recommendations for Changes to the CBMT Exam</title>
            <p>While 5 of the 34 respondents (15%) to the question “What changes, if any, would you
               make to the CBMT exam?” believed that no changes are needed, 29 program directors
               recommended changes. These include 1) organizational and curriculum changes, 2) exam
               content changes, 3) alternate exam formats, and 4) entry level and advanced practice
               exams. Each will be briefly summarized below. </p>
         </sec>
         <sec>
            <title>Organizational and Curriculum Changes</title>
            <p>Program directors suggested a number of solutions that address their perception that
               there is a lack of communication and coordination between CBMT, AMTA, and faculty.
               For example, one program director suggested that CBMT could be more forthcoming in
               helping academic programs to identify where their students are having problems so
               that these can be addressed. Another suggested that AMTA perform a complete review of
               their educational competencies and realign them to more accurately reflect current
               practice, as defined by the CBMT scope of practice. </p>
            <p>Two specific solutions related to an over-extended undergraduate curriculum were also
               suggested. These were: 1) addressing the percent of music therapy courses in the
               undergraduate curriculum required by NASM and AMTA, and 2) reviewing the content of
                the NASM requirements with a particular focus on musical skills development. Program
               directors concerned with music skills requirements believe that by reducing the
               applied music and ensemble requirements, students could focus more on
                  <italic>clinical</italic> music skills and ensemble requirements, which are
               ever-expanding and demanding, though often underestimated by applied music faculty.
               Such a focus on clinical music skills would be of great benefit in preparing students
               for their profession as <italic>music </italic>therapists, while also addressing
               burdens associated with an over-extended undergraduate curriculum. </p>
         </sec>
         <sec>
            <title>Exam Content </title>
            <p>Program directors suggested a number of solutions to address their concerns regarding
               exam content. These included: </p>
            <list list-type="bullet">
               <list-item>
                  <p>Removing questions pertaining to theoretical orientations and models that
                     require institute training (as they believe this level of knowledge reflects
                     advanced practice) </p>
               </list-item>
               <list-item>
                  <p>Reducing the scope of practice, particularly where it pertains to knowledge
                     that is not based on music therapy</p>
               </list-item>
               <list-item>
                  <p>Removing specialized medical terminology from exam questions</p>
               </list-item>
               <list-item>
                  <p>Eliminating questions that are reflective of private practice (e.g.
                     termination; billing)</p>
               </list-item>
               <list-item>
                  <p>Aligning CBMT and AMTA competencies</p>
               </list-item>
             </list>
             <p>Educators also suggested developing resources to help with exam preparation,
                including creating study guides, providing more access to retired test questions,
                and providing more resources for international students. This need for resources
                that help students to prepare for the exam, as well as CBMT resources that more
                accurately mirror the exam, is supported by <xref ref-type="bibr" rid="HTC2020"
                   >Hsiao et al.’s (2020)</xref> study of recent exam takers.</p>
         </sec>
         <sec>
            <title>Alternative Exam Formats</title>
            <p>Because of the challenges they perceived that some students have with the exam, a
               number of program directors favored exploring alternative ways of evaluating clinical
               competence. These included adding a live clinical component to the exam, focusing
               more exam questions on clinical practice, and providing alternative test-taking
                formats for students with disabilities. Some respondents in Hsiao et al.’s (<xref
                   ref-type="bibr" rid="HTC2020">2020</xref>) study also suggested that CBMT might
               enhance the validity of the exam by considering other forms of testing such as essay
               questions and experiential components. </p>
         </sec>
         <sec>
            <title>Entry-Level and Advanced Practice Exams</title>
            <p>Finally, several program directors suggested creating different levels of exams,
               which they believed would address issues related to the scope of practice and an
               over-burdened undergraduate curriculum. These solutions included: 1) creating entry
               level and advanced practice exams, and 2) creating a tiered exam system similar to
               nursing. Such a system would reflect different levels of training and expertise,
               allowing for both undergraduate and graduate levels of examination. </p>
         </sec>
         <sec>
            <title>Study Limitations </title>
             <p>Three limitations are acknowledged when considering the findings from this study. The
                first is the sample size. While the sample reflects the responses of over half of all
                eligible program directors, a larger sample of academic faculty may have yielded
                different categories and distributions of responses. Second, as no identifying
                information was gathered on participants, comparisons between different groups of
                respondents were not
               possible. Gathering data such as program philosophy, recent first-time program pass
               rates, years as an educator, and region may have provided additional information that
               clarified responses and distinguished categories of response. Finally, as program
               directors are not permitted on the CBMT Exam Committee, nor do they have access to
               the exam, their survey responses reflect perceptions about the exam, and as such, may
               vary in their accuracy and/or depth of understanding. </p>
         </sec>
         <sec>
            <title>Reflective Comments</title>
            <p>The findings from this survey reveal a broad range of perspectives among program
               directors about the meaningfulness of the CBMT exam in measuring clinical competence.
                Summary data from the Likert and Likert-type questions suggest that program directors,
                on average, perceive the exam as neither effective nor ineffective in evaluating
                competence to practice safely and effectively, and that, on average, they view
                clinical competence and academic grades as neither related nor unrelated to
                CBMT exam performance. Written responses from participants provide a clearer picture,
               with 54% of responses critical of the exam’s ability to evaluate clinical competence,
               and a further 41% suggesting the exam evaluates test-taking abilities. These concerns
               were further expressed in the number of comments related to external validity, with
               62% of responses expressing concerns about the ability of the exam to measure
               undergraduate entry-level job tasks and competence. </p>
            <p>While these findings suggest that program directors have a number of concerns about
               the CBMT exam, a number of faculty also appear to believe the CBMT exam is a relevant
               and meaningful measure of clinical competence (see Table 1). One might ask, how is
               this possible? How can it be that some faculty are supportive of the exam, perhaps
               even strongly so, whereas other faculty are critical of the exam, perhaps equally
               strongly so? </p>
            <p>These conflicting perspectives may serve as an important starting point for a larger
               discussion about how music therapy is defined and practiced, and it may serve to
               bring us closer to addressing core concerns about the exam expressed in this survey.
               For example, perhaps those program directors who expressed support for the exam do so
               because they are aligned philosophically with the exam. That is, they believe that
               the exam evaluates the “correct” or “right” way of thinking about music therapy; that
               there are “first” and “best” clinical decisions that can be made outside of the
               clinical context in which they occur; and that music therapy is best understood
               causally. From this perspective, it follows that music therapy interventions can be
               understood objectively, with clients behaving in predictable ways in relation to
               musical stimuli, and that we can therefore predict the ways groups of clients respond
               to a music experience (whether this be a specific music element, activity, or
               experience). When this clinical perspective is taken, then the CBMT exam appears to
               make sense, and may well be a reliable way of measuring clinical competence. </p>
             <p>What happens, though, if you don’t believe music therapy works this way? Or if you
                were not trained to think this way because your instructors taught you a different
                way of thinking about music therapy? What happens if you believe that the benefits of
               music are not causal, and that music experiences evoke myriad reactions from clients,
               both conscious and unconscious, and these are best addressed within the context of
               the unique therapeutic relationship between the client and their music therapist?
               From this perspective, each therapeutic process may be different, even when working
               with clients who have the same diagnosis and clinical goals, and therefore deciding
               the “first” and “best” response to a client can only occur within the specific
               context of that client or session. </p>
            <p>While we can understand music as a stimulus, in which the specific elements of music
               evoke specific responses from clients, this is only one way of understanding music.
               We can also understand music as a symbol, a metaphor, a cultural marker, an energy
               system, and a portal to the spirit world, just to name a few such perspectives. Such
               perspectives reflect more than philosophical differences in individual music
               therapists’ approaches to work with clients. They reflect equally valid ways of
               thinking about and practicing music therapy, and as such are equally important ways
               of understanding clinical competence. </p>
             <p>Herein lies a core concern about the CBMT exam, and this may be one way of drawing
               together the plethora of concerns that program directors have regarding the exam:
               whereas music therapy can be practiced in a wide variety of ways, each of which has
               its own integrity, the CBMT exam may only evaluate one way of thinking about music
               therapy clinical practice. </p>
            <p>Students’ struggles with the exam, especially in the last decade, may therefore
               reflect two important things: 1) their exam performance may be an indication of the
               extent to which their academic program is philosophically aligned with the ways in
               which the CBMT exam defines music therapy clinical practice, and 2) the drop in
               first-time pass rates may reflect a deepening and differentiation of clinical
               practice knowledge that no longer aligns with the fundamental premises of the exam.
               That is, educators are advancing clinical practice, and one of the ways of doing so
               is to develop the sophistication of their own theoretical perspective. If this
               perspective does not align with the philosophical premises of the CBMT exam, measured
               in the ways exam questions ask students to think about therapy, then students in
               these programs may well do poorly. </p>
            <p>A second and related concern has to do with the relationship between the academic
               training program, the internship, and the CBMT exam. As any client would hope, music
               therapy students must pass three levels of evaluation before they can work
               clinically: academic, internship, and exam. In this process, the academic training
               program “approves” the student for internship. That is, they vouch for the student by
               verifying to the internship director that the student has met all the academic and
               clinical training competencies necessary to start internship. Second, the internship
               director, at the end of a successful internship, vouches for the student. Through
               their final evaluation, the internship director says, in essence: “This student is
               ready to work as a music therapist.” That is, prior to being eligible to take the
               exam, the student has passed two levels of evaluation that verify the student’s
               competence. How is it that, even with these two levels of verification, students are
               not able to practice because they cannot pass the exam? Would it not be equally
               plausible to say that the exam is not measuring the student’s competence, especially
               if the internship director, who has observed the student working clinically for 6
               months (approximately 1000 hours), says that the student is competent? </p>
            <p>These clinical practice problems are compounded by AMTA and CBMT’s definitions of
               music therapy, as characterized by the Professional Competencies and Board
               Certification Domains. From these perspectives, music therapy has cognitive,
               communicative, emotional, musical, physiological, psychosocial, sensorimotor and
               spiritual benefits (CBMT Domain I.B.3) that can be addressed behaviorally,
               developmentally, humanistically, psychodynamically, neurologically, and medically
                (CBMT Domain II.A.4). Also included in CBMT’s treatment approaches and models are
                holistic, culture-centered, community music therapy, and improvisational approaches
                (CBMT Domain II.A.4). How much of each of these theories are students expected to know, and even
               more importantly, how much clinical practice knowledge should students have about
               each model and treatment approach in relation to each clinical setting? None of this
               is defined, and yet students are being evaluated on these competencies. </p>
            <p>Such a broad definition of music therapy has significant clinical practice
               implications. For example, should we expect a 22-year-old new graduate to work
               psychotherapeutically with a 54-year-old man with testicular cancer who has just been
                told his disease is terminal and he should “get his house in order”? According to
               both AMTA and CBMT, this newly board-certified music therapist has met the competency
               requirements to practice with this client (AMTA Professional Competencies 10.3, 10.5,
               13.5 and 13.13; CBMT Domains I.B.3.c and II.A.4.i).</p>
            <p>Further, how should this student’s competence to practice be evaluated prior to
               starting work? Does the CBMT exam evaluate minimal competence to practice when the
               music therapist is working psychodynamically with an adult addressing emotional
               goals? We propose that a majority of psychodynamically trained music therapists would
               argue that the CBMT exam does not evaluate this kind of clinical competence, even
               though the student has the designated credential (MT-BC) to practice. </p>
            <p>Finally, we acknowledge concerns expressed by some faculty that some academic
               programs may not be “keeping up.” We believe this is an important topic for
               discussion, especially in light of the increased concerns expressed by many faculty
               about their students’ mental health, an overburdened undergraduate curriculum, and
               the financial pressures many students experience completing an undergraduate degree.
               But we also suggest any such discussions be carefully considered, especially if
                “keeping up” is only being measured by first-time pass rates for the CBMT exam. The
               findings from this survey present a much more complex picture, in which it is equally
               important to ask: “Is the CBMT exam keeping up with clinical practice?” And, perhaps
               more importantly: “What kind of clinical practice is the CBMT exam evaluating?”</p>
         </sec>
      </sec>
      <!-- sec lvl 2 end -->
      <!-- sec lvl 2 begin -->
      <sec> 
         <title>Acknowledgement</title>
          <p>The authors would like to thank Dr. Audra Gollenberg for her assistance
         with the analysis and interpretation of the survey data. </p>
      </sec>
      <sec>
         <title>About the authors</title>
         <p>Anthony Meadows is the Director of Music Therapy at Shenandoah University (Virginia,
             USA). He has more than 20 years of clinical experience, working with both children and
            adults, and 18 years of experience as an educator. Anthony has served in a wide range of
            positions, including the Assembly of Delegates (AMTA), MAR-AMTA Research Committee, and
            as Editor of <italic>Music Therapy Perspectives</italic> (2011–2018). He has published
            broadly, including research in cancer care and the edited volume <italic>Developments in
               Music Therapy Practice: Case Study Perspectives</italic> (2011). </p>
          <p>Lillian Eyre is a board-certified music therapist (MT-BC), a licensed professional counselor
            (LPC, Pennsylvania), and a Fellow of the Association for Music &amp; Imagery (FAMI). She
            is a visiting associate professor at Temple University, USA. Prior to joining Temple,
            Eyre was Associate Professor and Director of Music Therapy at Immaculata University,
            USA. In 1995, she founded music therapy programs in psychiatry, dialysis and long-term
             care in the McGill University Health System, Canada, where she worked until 2006. She
            co-founded <italic>Le groupe Musiart</italic>, a performing arts group and choir for
            persons with serious mental illness. She serves on the editorial review board of
                <italic>Music Therapy Perspectives</italic> and the <italic>Canadian Journal of Music
               Therapy</italic>. In addition to article and chapter publications, she edited
               <italic>Guidelines to Music Therapy Practice in Mental Health</italic> (2013,
            Barcelona Publishers).</p>
      </sec>
      <!-- sec lvl 2 end -->
   </body>
   <back>
      <ref-list>
         <ref id="AH2018">
            <!--Aigen, K., & Hunter, B. (2018). The creation of the American Music Therapy Association: Two perspectives. <italic>Music Therapy Perspectives, 36</italic>(2), 183–194. <uri>https://doi.org/10.1093/mtp/miy016</uri>.-->
            <element-citation publication-type="journal" publication-format="web">
               <person-group person-group-type="author">
                  <name>
                     <surname>Aigen</surname>
                     <given-names>K</given-names>
                  </name>
                  <name>
                     <surname>Hunter</surname>
                     <given-names>B</given-names>
                  </name>
               </person-group>
               <year>2018</year>
               <article-title>The creation of the American Music Therapy Association: Two
                  perspectives</article-title>
               <source>Music Therapy Perspectives</source>
               <volume>36</volume>
               <issue>2</issue>
               <fpage>183</fpage>
               <lpage>194</lpage>
               <pub-id pub-id-type="doi" xlink:href="https://doi.org/10.1093/mtp/miy016"
                  >10.1093/mtp/miy016</pub-id>
            </element-citation>
         </ref>
         <ref id="AMTA2011">
            <!--American Music Therapy Association (AMTA) (2011). <italic>Master’s level entry: Core considerations</italic>. <uri>http://www.musictherapy.org/assets/1/7/Masters_Level_Entry_Core_Considerations.pdf</uri>-->
            <element-citation publication-type="book" publication-format="web">
               <person-group person-group-type="author">
                  <collab>American Music Therapy Association (AMTA)</collab>
               </person-group>
               <year>2011</year>
               <source>Master’s level entry: Core considerations</source>
               <uri>http://www.musictherapy.org/assets/1/7/Masters_Level_Entry_Core_Considerations.pdf</uri>
            </element-citation>
         </ref>
         <ref id="AMTA2013">
            <!--American Music Therapy Association (AMTA) (2013). American Music Therapy Association Professional Competencies. <uri>https://www.musictherapy.org/about/competencies/</uri>-->
            <element-citation publication-type="book" publication-format="web">
               <person-group person-group-type="author">
                  <collab>American Music Therapy Association (AMTA)</collab>
               </person-group>
               <year>2013</year>
               <source>American Music Therapy Association Professional Competencies</source>
               <uri>https://www.musictherapy.org/about/competencies/</uri>
            </element-citation>
         </ref>
         <ref id="BB2012">
            <!--Boone, H., & Boone, D. (2012). Analyzing Likert data. Journal of Extension, 50(2). Article Number 2TOT2. <uri>https://www.joe.org/joe/2012april/tt2.php</uri>-->
            <element-citation publication-type="journal" publication-format="web">
               <person-group person-group-type="author">
                  <name>
                     <surname>Boone</surname>
                     <given-names>H</given-names>
                  </name>
                  <name>
                     <surname>Boone</surname>
                     <given-names>D</given-names>
                  </name>
               </person-group>
               <article-title>Analyzing Likert data</article-title>
               <year>2012</year>
               <source>Journal of Extension</source>
               <volume>50</volume>
               <issue>2</issue>
               <elocation-id>2TOT2</elocation-id>
               <uri>https://www.joe.org/joe/2012april/tt2.php</uri>
            </element-citation>
         </ref>
         <ref id="B1962">
            <!--Braswell, C. (1962). The future of psychiatric music therapy: A review of the profession. In Music Therapy, 1961, Eleventh Book of Proceedings of the National Association for Music Therapy, 11 (65–76). University of Kansas Press. -->
            <element-citation publication-type="book" publication-format="print">
               <person-group person-group-type="author">
                  <name>
                     <surname>Braswell</surname>
                     <given-names>C</given-names>
                  </name>
               </person-group>
               <chapter-title>The future of psychiatric music therapy: A review of the
                  profession</chapter-title>
               <year>1962</year>
               <source>Music Therapy, 1961, Eleventh Book of Proceedings of the National Association
                  for Music Therapy</source>
               <volume>11</volume>
               <fpage>65</fpage>
               <lpage>76</lpage>
               <publisher-name>University of Kansas Press</publisher-name>
            </element-citation>
         </ref>
         <ref id="B1989">
            <!--Bruscia, K. (1989). The current content of music therapy education at undergraduate and graduate levels. Music Therapy Perspectives, 7, 83–87.-->
            <element-citation publication-type="journal" publication-format="print">
               <person-group person-group-type="author">
                  <name>
                     <surname>Bruscia</surname>
                     <given-names>K</given-names>
                  </name>
               </person-group>
               <article-title>The current content of music therapy education at undergraduate and
                  graduate levels</article-title>
               <year>1989</year>
               <source>Music Therapy Perspectives</source>
               <volume>7</volume>
               <fpage>83</fpage>
               <lpage>87</lpage>
            </element-citation>
         </ref>
         <ref id="CBMT2019a">
            <!--Certification Board for Music Therapists. (n.d.-a). <italic>About CBMT</italic>. Retrieved May 11th, 2019. <uri>https://cbmt.org/about-cbmt/</uri>-->
             <mixed-citation publication-type="webpage" publication-format="web">Certification Board
                for Music Therapists. (n.d.-a). <italic>About CBMT</italic>. Retrieved May 11,
                2019. <uri>https://cbmt.org/about-cbmt/</uri>
             </mixed-citation>
         </ref>
         <ref id="CBMT2019b">
            <!--Certification Board for Music Therapists. (n.d.-b). <italic>What is a Practice Analysis?</italic> Retrieved August 1st, 2019. <uri>https://www.cbmt.org/frequently-asked-questions/</uri>-->
             <mixed-citation publication-type="webpage" publication-format="web">Certification Board
                for Music Therapists. (n.d.-b). <italic>What is a Practice Analysis?</italic>
                Retrieved August 1, 2019.
                <uri>https://www.cbmt.org/frequently-asked-questions/</uri>
             </mixed-citation>
         </ref>
         <ref id="CBMT2019">
            <!--Certification Board for Music Therapists. (2019). <italic>Candidate Handbook</italic>. CBMT. <uri>https://www.cbmt.org/candidates/certification/</uri>-->
            <element-citation publication-type="book" publication-format="web">
               <person-group person-group-type="author">
                  <collab>Certification Board for Music Therapists</collab>
               </person-group>
               <year>2019</year>
               <source>Candidate Handbook</source>
               <publisher-name>CBMT</publisher-name>
               <uri>https://www.cbmt.org/candidates/certification/</uri>
            </element-citation>
         </ref>
         <ref id="DM1989">
            <!--Dileo Maranto, C. (1989). A letter from the president. <italic>Music Therapy Perspectives, 6</italic>, 7–9. <uri>https://doi.org/10.1093/mtp/7.1.7</uri>-->
            <element-citation publication-type="journal" publication-format="web">
               <person-group person-group-type="author">
                  <name>
                     <surname>Dileo Maranto</surname>
                     <given-names>C</given-names>
                  </name>
               </person-group>
               <year>1989</year>
               <article-title>A letter from the president</article-title>
               <source>Music Therapy Perspectives</source>
               <volume>6</volume>
               <fpage>7</fpage>
               <lpage>9</lpage>
               <pub-id pub-id-type="doi" xlink:href="https://doi.org/10.1093/mtp/7.1.7"
                  >10.1093/mtp/7.1.7</pub-id>
            </element-citation>
         </ref>
         <ref id="F2018">
            <!--Ferrer, A. (2018). Music therapy profession: An in-depth analysis of the perceptions of Educators and AMTA board members. <italic>Music Therapy Perspectives, 36</italic>(1), 87–96. <uri>https://doi.org/10.1093/mtp/miw041</uri>-->
            <element-citation publication-type="journal" publication-format="web">
               <person-group person-group-type="author">
                  <name>
                     <surname>Ferrer</surname>
                     <given-names>A</given-names>
                  </name>
               </person-group>
               <year>2018</year>
               <article-title>Music therapy profession: An in-depth analysis of the perceptions of
                  Educators and AMTA board members</article-title>
               <source>Music Therapy Perspectives</source>
               <volume>36</volume>
               <issue>1</issue>
               <fpage>87</fpage>
               <lpage>96</lpage>
               <pub-id pub-id-type="doi" xlink:href="https://doi.org/10.1093/mtp/miw041"
                  >10.1093/mtp/miw041</pub-id>
            </element-citation>
         </ref>
         <ref id="GK2016">
            <!--Ghetti, C. M., & Keith, D. R. (2016). Qualitative content analysis. In Murphy, K. & Wheeler, B. L. (Eds.), <italic>Music Therapy Research </italic>(pp. 965–977). Plass, NN: Barcelona Publishers.-->
            <element-citation publication-type="book-chapter" publication-format="print">
               <person-group person-group-type="author">
                  <name>
                     <surname>Ghetti</surname>
                     <given-names>C M</given-names>
                  </name>
                  <name>
                     <surname>Keith</surname>
                     <given-names>D R</given-names>
                  </name>
               </person-group>
               <year>2016</year>
               <chapter-title>Qualitative content analysis</chapter-title>
                <person-group person-group-type="editor">
                   <name>
                      <surname>Murphy</surname>
                      <given-names>K</given-names>
                   </name>
                   <name>
                      <surname>Wheeler</surname>
                      <given-names>B L</given-names>
                   </name>
                </person-group>
               <source>Music Therapy Research</source>
               <fpage>965</fpage>
               <lpage>977</lpage>
               <publisher-name>Barcelona Publishers</publisher-name>
            </element-citation>
         </ref>
         <ref id="GP2000">
            <!--Groene, R., & Pembrook, R. (2000). Curricular issues in music therapy: A survey of Collegiate faculty. <italic>Music Therapy Perspectives, 18</italic>(2), 92-102. <uri>https://doi.org/10.1093/mtp/18.2.92</uri>-->
            <element-citation publication-type="journal" publication-format="web">
               <person-group person-group-type="author">
                  <name>
                     <surname>Groene</surname>
                     <given-names>R</given-names>
                  </name>
                  <name>
                     <surname>Pembrook</surname>
                     <given-names>R</given-names>
                  </name>
               </person-group>
               <year>2000</year>
               <article-title>Curricular issues in music therapy: A survey of Collegiate
                  faculty</article-title>
               <source>Music Therapy Perspectives</source>
               <volume>18</volume>
               <issue>2</issue>
               <fpage>92</fpage>
               <lpage>102</lpage>
               <pub-id pub-id-type="doi" xlink:href="https://doi.org/10.1093/mtp/18.2.92"
                  >10.1093/mtp/18.2.92</pub-id>
            </element-citation>
         </ref>
         <ref id="H2015">
            <!--Harpe, S. (2015). How to analyze Likert and other rating scale data. Currents in Pharmacy Teaching and Learning, 7, 836–850. <uri>https://doi.org/10.1016/j.cptl.2015.08.001</uri>-->
            <element-citation publication-type="journal" publication-format="web">
               <person-group person-group-type="author">
                  <name>
                     <surname>Harpe</surname>
                     <given-names>S</given-names>
                  </name>
               </person-group>
               <article-title>How to analyze Likert and other rating scale data</article-title>
               <year>2015</year>
               <source>Currents in Pharmacy Teaching and Learning</source>
               <volume>7</volume>
               <fpage>836</fpage>
               <lpage>850</lpage>
               <pub-id pub-id-type="doi" xlink:href="https://doi.org/10.1016/j.cptl.2015.08.001"
                  >10.1016/j.cptl.2015.08.001</pub-id>
            </element-citation>
         </ref>
         <ref id="HTC2020">
            <!--Hsiao, F. Tan, X., Tang, J., & Chen, M. (2020). Factors associated with music therapy board certification examination outcomes. Music Therapy Perspectives, 38(1), 51–60. <uri>https://doi.org/10.1093/mtp/miz017</uri>-->
            <element-citation publication-type="journal" publication-format="web">
                <person-group person-group-type="author">
                   <name>
                      <surname>Hsiao</surname>
                      <given-names>F</given-names>
                   </name>
                   <name>
                      <surname>Tan</surname>
                      <given-names>X</given-names>
                   </name>
                   <name>
                      <surname>Tang</surname>
                      <given-names>J</given-names>
                   </name>
                   <name>
                      <surname>Chen</surname>
                      <given-names>M</given-names>
                   </name>
                </person-group>
               <article-title>Factors associated with music therapy board certification examination
                  outcomes</article-title>
               <year>2020</year>
               <source>Music Therapy Perspectives</source>
               <volume>38</volume>
               <issue>1</issue>
               <fpage>51</fpage>
               <lpage>60</lpage>
               <pub-id pub-id-type="doi" xlink:href="https://doi.org/10.1093/mtp/miz017"
                  >10.1093/mtp/miz017</pub-id>
            </element-citation>
         </ref>
         <ref id="LRBJ2018">
            <!--Lloyd, K., Richardson, T., Boyle, S., & Jackson, N. (2018). Challenges in music therapy undergraduate education: Narratives from the front lines. Music Therapy Perspectives, 36(1), 108–116. <uri>https://doi.org/10.1093/mtp/mix009</uri>-->
            <element-citation publication-type="journal" publication-format="web">
               <person-group person-group-type="author">
                  <name>
                     <surname>Lloyd</surname>
                     <given-names>K</given-names>
                  </name>
                  <name>
                     <surname>Richardson</surname>
                     <given-names>T</given-names>
                  </name>
                  <name>
                     <surname>Boyle</surname>
                     <given-names>S</given-names>
                  </name>
                  <name>
                     <surname>Jackson</surname>
                     <given-names>N</given-names>
                  </name>
               </person-group>
               <article-title>Challenges in music therapy undergraduate education: Narratives from
                  the front lines</article-title>
               <year>2018</year>
               <source>Music Therapy Perspectives</source>
               <volume>36</volume>
               <issue>1</issue>
               <fpage>108</fpage>
               <lpage>116</lpage>
               <pub-id pub-id-type="doi" xlink:href="https://doi.org/10.1093/mtp/mix009"
                  >10.1093/mtp/mix009</pub-id>
            </element-citation>
         </ref>
         <ref id="OC2016">
            <!--O’Callahan, C. (2016). Grounded theory. In Murphy, K. & Wheeler, B.L. (Eds.), Music Therapy Research (3rd ed., pp. 1001–1021). Plass, NN: Barcelona Publishers.-->
            <element-citation publication-type="book-chapter" publication-format="print">
               <person-group person-group-type="author">
                  <name>
                     <surname>O’Callahan</surname>
                     <given-names>C</given-names>
                  </name>
               </person-group>
               <year>2016</year>
               <chapter-title>Grounded theory</chapter-title>
               <person-group person-group-type="editor">
                  <name>
                     <surname>Murphy</surname>
                     <given-names>K</given-names>
                  </name>
                  <name>
                     <surname>Wheeler</surname>
                     <given-names>B L</given-names>
                   </name>
                </person-group>
               <source>Music Therapy Research</source>
               <edition>3</edition>
               <fpage>1001</fpage>
               <lpage>1021</lpage>
               <publisher-name>Barcelona Publishers</publisher-name>
            </element-citation>
         </ref>
         <ref id="S2016">
            <!--Saldaña, J. (2016). The coding manual for qualitative researchers (3rd ed.). Sage Publications. -->
            <element-citation publication-type="book" publication-format="print">
               <person-group person-group-type="author">
                  <name>
                     <surname>Saldaña</surname>
                     <given-names>J</given-names>
                  </name>
               </person-group>
               <year>2016</year>
               <source>The coding manual for qualitative researchers</source>
               <edition>3</edition>
               <publisher-name>Sage Publications</publisher-name>
            </element-citation>
         </ref>
         <ref id="WBBBCFHHHKKMNS2017">
            <!--Wylie, M., Borling, J., Borczon, R., Briggs, C., Creagan, J., Furman, A., Hairston, M., Hughes, M., Hunter, B., Kahler, E., Kaplan, R., Montague, E., Neugebauer, C., & Snell, A. (2017). A question of degree: Final report of the Master’s Level Entry (MLE) subcommittee. <uri>https://www.musictherapy.org/assets/1/7/MLE_11-30-17_Part_I.pdf</uri>-->
            <element-citation publication-type="book" publication-format="web">
               <person-group person-group-type="author">
                  <name>
                     <surname>Wylie</surname>
                     <given-names>M</given-names>
                  </name>
                  <name>
                     <surname>Borling</surname>
                     <given-names>J</given-names>
                  </name>
                  <name>
                     <surname>Borczon</surname>
                     <given-names>R</given-names>
                  </name>
                  <name>
                     <surname>Briggs</surname>
                     <given-names>C</given-names>
                  </name>
                  <name>
                     <surname>Creagan</surname>
                     <given-names>J</given-names>
                  </name>
                  <name>
                     <surname>Furman</surname>
                     <given-names>A</given-names>
                  </name>
                  <name>
                     <surname>Hairston</surname>
                     <given-names>M</given-names>
                  </name>
                  <name>
                     <surname>Hughes</surname>
                     <given-names>M</given-names>
                  </name>
                  <name>
                     <surname>Hunter</surname>
                     <given-names>B</given-names>
                  </name>
                  <name>
                     <surname>Kahler</surname>
                     <given-names>E</given-names>
                  </name>
                  <name>
                     <surname>Kaplan</surname>
                     <given-names>R</given-names>
                  </name>
                  <name>
                     <surname>Montague</surname>
                     <given-names>E</given-names>
                  </name>
                  <name>
                     <surname>Neugebauer</surname>
                     <given-names>C</given-names>
                  </name>
                  <name>
                     <surname>Snell</surname>
                     <given-names>A</given-names>
                  </name>
               </person-group>
               <year>2017</year>
               <source>A question of degree: Final report of the Master’s Level Entry (MLE)
                  subcommittee</source>
               <uri>https://www.musictherapy.org/assets/1/7/MLE_11-30-17_Part_I.pdf</uri>
            </element-citation>
         </ref>
      </ref-list>
   </back>
</article>
