The MCHES® examination contains 165 items: 150 are scored, and 15 are pilot items that do not contribute to the final score. Examinees are informed that the exam contains pilot items; however, candidates are not told which items are being piloted and which are being scored. NCHEC pilots new items on the MCHES® exam to maintain the statistical integrity of the certification examination. Piloting determines the psychometric properties of an item before it is included as a scored item on an examination and allows items that do not perform at acceptable levels for a certifying examination to be removed.
Determining the passing point for a multiple-choice examination involves a standard-setting study, typically conducted each time the exam's content outline is updated. A representative panel of subject-matter experts develops a standard of performance for minimally qualified candidates (also known as borderline candidates, those who know just enough to have earned the certification). Panel members then rate how well minimally qualified candidates would perform on each test question. Questions with disparate ratings are discussed at length, and the panel comes to agreement on a recommended cut score for the organization's decision makers to review and finalize.
The goal of the standard-setting study is not to determine how many candidates will pass and how many will fail. The goal is to determine how much of the test must be answered correctly in order to pass, taking into account the difficulty of the questions for minimally qualified candidates. With the cut score determined in this way, all candidates who meet the minimally qualified standard will pass. There is no predetermined "curve" in which a certain percentage of candidates must either pass or fail. Technically, all candidates could pass or all candidates could fail, but the passing rate usually falls somewhere in between and can vary from testing window to testing window. The standard of performance on which the cut score is based remains the same until it is changed by a new standard-setting study, usually conducted 5-7 years after the previous one. Variation in the passing rate from one testing window to the next reflects the ability of the group of candidates taking the test in each window to meet the minimally qualified standard.
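The rating process described above is an Angoff-style procedure: each panelist estimates the probability that a minimally qualified (borderline) candidate would answer each scored item correctly, and those ratings are combined into a recommended raw cut score. The sketch below is a minimal illustration of that combination step only; the panel size, ratings, and resulting numbers are hypothetical and do not represent NCHEC's actual study or data.

```python
import numpy as np

# Hypothetical Angoff-style panel ratings: each entry is a panelist's estimate of
# the probability that a minimally qualified (borderline) candidate answers that
# item correctly. Shape: (number of panelists, number of scored items).
rng = np.random.default_rng(42)
ratings = rng.uniform(0.4, 0.9, size=(12, 150))   # 12 panelists, 150 scored items (illustrative)

# Average the ratings across panelists for each item, then sum across items to get
# the expected raw score of a borderline candidate: the recommended raw cut score.
item_means = ratings.mean(axis=0)
recommended_raw_cut = item_means.sum()

print(f"Recommended raw cut score: {recommended_raw_cut:.1f} out of {ratings.shape[1]} items")
```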
MCHES® exam candidates will receive an email notification of provisional results as "pass/did not pass" status after exiting the exam. Provisional results are provided as a convenience to the examinee, and an examinee's official score may differ from the provisional result. Additional evaluation of the functioning of all exam items is conducted during and at the close of each exam testing window to ensure that all examinees are evaluated accurately and fairly. To protect confidentiality, official results will not be provided by e-mail, phone, or fax.
Item analyses are conducted and the results are reviewed for each examination form administered. Reliability of the examination is calculated using Cronbach's coefficient alpha. Reliability coefficients above 0.80 are considered satisfactory for credentialing exams. The MCHES® exam reliability coefficient, as determined by Cronbach's alpha, has consistently met or exceeded this standard over the years.
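For reference, Cronbach's alpha is computed from the variances of the individual items and the variance of examinees' total scores: alpha = k/(k-1) * (1 - sum of item variances / total-score variance), where k is the number of scored items. The sketch below shows that calculation for a 0/1 (correct/incorrect) response matrix; the simulated responses, abilities, and difficulties are illustrative assumptions, not NCHEC exam data.

```python
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """Cronbach's coefficient alpha for an (examinees x items) 0/1 score matrix."""
    k = responses.shape[1]                              # number of scored items
    item_variances = responses.var(axis=0, ddof=1)      # variance of each item
    total_variance = responses.sum(axis=1).var(ddof=1)  # variance of total raw scores
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Illustrative only: simulate correlated responses with a simple Rasch-style model
# (hypothetical abilities and difficulties, not real examinee data).
rng = np.random.default_rng(0)
ability = rng.normal(0.0, 1.0, size=(69, 1))       # one latent ability per simulated examinee
difficulty = rng.normal(-1.0, 1.0, size=(1, 150))  # one difficulty per simulated item
p_correct = 1.0 / (1.0 + np.exp(-(ability - difficulty)))
responses = (rng.random(p_correct.shape) < p_correct).astype(int)

print(round(cronbach_alpha(responses), 2))
```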
Candidates will receive an official scaled score report showing each candidate's overall pass/fail status, as well as diagnostic information about the candidate's performance in each Area of Responsibility, or domain. A diagnostic level of "proficient," "moderately proficient," or "below proficient" will be presented for each Area of Responsibility. This information is provided to aid in self-assessment and can help candidates focus on areas in which they need further study or professional development. It is important to note that the Area of Responsibility proficiency information is provided only to inform decisions about further study and professional development. Each MCHES® candidate's pass/fail status is determined solely by performance on the entire examination, not by performance in individual Areas of Responsibility.
The exam score is confidential and will not be disclosed unless NCHEC receives a written request to do so from the candidate or is directed to do so by subpoena or court order. A candidate who wants scores released to another entity must indicate in writing which particular scores may be disclosed and identify specifically the person or organization to which the scores should be released. No candidate scores will be given by telephone, facsimile, or e-mail for any reason.
Statistical Information Regarding the MCHES® April 2023 and October 2023 Exams
| | April 2023 Examination | October 2023 Examination |
|---|---|---|
| Number of Items | 150 | 150 |
| Pass Point | 600 | 600 |
| Average Scale Score | 626 | 587.18 |
| Standard Deviation | 67 | 58.17 |
| Range of Scale Scores | 463-736 | 452-728 |
| Range of Possible Scores | 200-800 | 200-800 |
| Number of Candidates | 69 | 55 |
| Pass Rate | 66.18% | 40.43% |
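Scaled scores such as those in the table above are typically produced by a transformation of raw scores onto the 200-800 reporting scale, with the raw cut score mapped to the pass point of 600. NCHEC's actual conversion is not published here; the sketch below is a hypothetical two-segment linear conversion, with an assumed raw cut, shown only to illustrate how a reporting scale of this kind can work.

```python
def to_scale(raw_score: float,
             raw_cut: float,
             n_scored_items: int = 150,
             scale_min: int = 200,
             scale_cut: int = 600,
             scale_max: int = 800) -> float:
    """Hypothetical linear raw-to-scale conversion anchored at the cut score.

    A raw score at the cut maps to the pass point (600); a perfect raw score maps
    to the scale maximum (800). This is an illustration, not NCHEC's formula.
    """
    if raw_score >= raw_cut:
        scaled = scale_cut + (raw_score - raw_cut) * (scale_max - scale_cut) / (n_scored_items - raw_cut)
    else:
        scaled = scale_cut - (raw_cut - raw_score) * (scale_cut - scale_min) / raw_cut
    return max(scale_min, min(scale_max, scaled))

# Example with a hypothetical raw cut of 105 out of 150 scored items:
print(round(to_scale(112, raw_cut=105)))  # above the cut -> scaled score above 600
print(round(to_scale(98, raw_cut=105)))   # below the cut -> scaled score below 600
```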
The table below shows combined results for each Area of Responsibility across the 2023 MCHES® exams (April and October), including the distribution of candidates by diagnostic level and the percentage of items answered correctly.
| | I. Assessment of Needs and Capacity | II. Planning | III. Implementation | IV. Evaluation and Research | V. Advocacy | VI. Communication | VII. Leadership and Management | VIII. Ethics and Professionalism |
|---|---|---|---|---|---|---|---|---|
| % of Questions on MCHES® Exam | 12 | 10 | 9 | 20 | 9 | 12 | 18 | 10 |
| Diagnostic Level (number and % of candidates) | | | | | | | | |
| Proficient | 50 (40%) | 30 (24%) | 39 (31%) | 27 (22%) | 30 (24%) | 53 (43%) | 47 (38%) | 58 (47%) |
| Moderately Proficient | 68 (55%) | 79 (64%) | 79 (64%) | 65 (52%) | 64 (52%) | 62 (50%) | 53 (43%) | 53 (43%) |
| Below Proficient | 6 (5%) | 15 (12%) | 6 (5%) | 32 (26%) | 29 (24%) | 9 (7%) | 24 (19%) | 13 (10%) |
| % of Items Correct | | | | | | | | |
| Average | 72 | 77 | 77 | 65 | 71 | 70 | 72 | 72 |
| Minimum | 22 | 33 | 21 | 17 | 7 | 29 | 22 | 33 |
| Maximum | 100 | 100 | 100 | 90 | 100 | 100 | 93 | 93 |