Development and validation of a tool to measure telehealth educational environment (THEEM)
BMC Medical Education volume 25, Article number: 136 (2025)
Abstract
Background
Telehealth is gaining importance in improving healthcare access and outcomes, particularly in underserved regions. Despite its potential, healthcare providers in developing countries struggle to effectively utilize telehealth tools, highlighting the need for structured training. This study aims to develop and validate a specialized tool to assess the telehealth educational environment, addressing the unique challenges of integrating clinical, technological, and interpersonal skills in telehealth education.
Objectives
(1) To develop a tool for measuring the telehealth educational environment, addressing the unique aspects of telehealth education. (2) To evaluate the validity and reliability of the tool.
Method
This mixed-methods exploratory study had two phases. In Phase 1, the THEEM tool was developed through a literature review and expert feedback. In Phase 2, the tool was validated for content validity, response process validity (via cognitive interviews), and construct validity (via exploratory factor analysis, EFA). Reliability was assessed using Cronbach’s alpha. Participants were telehealth physicians selected via purposive sampling. Data analysis involved calculating the item-level content validity index (I-CVI) and the scale-level content validity index (S-CVI), with values ≥ 0.7 considered acceptable. For factor analysis, factors with eigenvalues > 1 and items with factor loadings ≥ 0.50 were retained. A Cronbach’s alpha of ≥ 0.7 was considered acceptable internal consistency.
Results
The preliminary THEEM tool consisted of 35 items. Following expert content validation, the number of items was reduced to 32 based on I-CVI values; the S-CVI was 0.86. Exploratory factor analysis yielded three factors and a further reduction to 30 items. The internal consistency of the final 30-item tool was 0.9, indicating excellent internal consistency.
Conclusion
The THEEM tool provides a valuable, reliable, and valid instrument for assessing the telehealth educational environment. Its development fills a significant gap in existing measurement tools, specifically addressing the needs of telehealth education.
Background
Telehealth utilizes digital technologies to deliver healthcare services remotely, especially in underserved areas. Recognized by the World Health Organization (WHO) as a key strategy for 2020–2024, digital health aims to improve healthcare access, outcomes, and equity globally [1]. In Pakistan, telehealth is an emerging field, gaining attention due to its potential to enhance healthcare delivery alongside increasing mobile phone usage. By November 2024, 109 telemedicine companies were operational in Pakistan, including SehatKahani, with over 7,000 doctors (90% women) [2], and eDoctor, which has trained more than 1,500 doctors across 27 countries [3]. These platforms provide initial training and continuing medical education (CME) for their doctors, and 64.8% of surveyed clinicians have at least a basic understanding of telemedicine [4]. Telehealth offers new opportunities and challenges for educators and healthcare professionals. As its adoption increases, clinicians need to adapt to the evolving telehealth environment, which involves integrating technology, delivering virtual patient care, and maintaining effective communication and empathy. While the majority of clinicians have a basic understanding of telemedicine and view it positively [4], widespread adoption is hindered by regulatory barriers, inadequate digital infrastructure, and low digital literacy [5]. Additionally, a lack of confidence among physicians in delivering effective telehealth services highlights the need to upskill the workforce [6]. A study suggests that over 60% of healthcare providers in developing regions such as Pakistan face challenges in effectively utilizing telehealth tools, underscoring the urgent need for structured training programs [7].
Existing tools for online learning primarily focus on general content delivery and instructor–learner interaction [8] but fail to capture the specific needs of telehealth education, such as integrating clinical, technological, and interpersonal skills. For telehealth clinicians and faculty, learning involves understanding both the intricacies of telehealth delivery and the digital healthcare experience [9]. Telehealth educational environments require learners to navigate advanced medical technologies, manage remote consultations, and maintain empathy and communication through virtual platforms, a combination not typically encountered in traditional e-learning [10]. There is thus a critical gap in instruments that measure the telehealth learning environment and address the specific requirements of telehealth physicians, necessitating new tools to ensure quality education.
Therefore, a specialized tool is essential for evaluating the telehealth educational environment and identifying challenges and areas for improvement.
This study aimed to develop and validate a specialized tool to evaluate the telehealth educational environment.
Significance of the tool
The THEEM tool is designed for educators and curriculum developers in medical universities, to evaluate the telehealth educational environment for undergraduate and postgraduate training; for practicing physicians in continuing medical education programs, to assess and enhance their telehealth learning experiences; and for healthcare administrators, to identify areas for improving telehealth-based clinical practices. By measuring the unique aspects of the telehealth educational environment, the tool helps address challenges, supports effective teaching and learning, and ensures better preparedness of physicians, ultimately leading to improved patient care outcomes.
Methodology
Study design
A mixed-methods exploratory sequential design was used, with two phases: a qualitative phase (Phase 1) for tool development and a quantitative phase (Phase 2) for tool validation [11].
The research was conducted at Riphah International University. Ethical approval was obtained from the IRC/Ethics Review Committee at Islamic International Medical College. Informed consent was secured from all participants prior to their involvement in the study in each step. Participants were provided with detailed information about the study’s purpose, procedures, and their right to withdraw at any time without consequence.
Phase 1: tool development (qualitative)
Phase 1 involved developing the first version of the ‘Telehealth Educational Environment Measurement (THEEM)’ tool and included (1) Literature review and (2) Development of preliminary tool items with expert feedback.
1. Literature review with identification of themes: A comprehensive literature review was conducted to identify research articles related to telehealth and the online educational environment. Key themes were extracted via an inductive approach, addressing the key components of the telehealth educational environment, including technological, clinical, and interpersonal aspects, to inform the design of the tool items.
2. Development of preliminary items: The research team developed preliminary items for the THEEM tool. The items were designed to reflect the identified themes, phrased as statements rated on a five-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree). The THEEM tool was developed in English, and all validation procedures were conducted in this language.
Qualitative feedback was then obtained from two telehealth experts and two medical educationists, who suggested content additions, deletions, and modifications to finalize the initial version of the tool for validation in Phase 2.
Phase 2: tool validation (quantitative)
This phase comprised three steps: (1) establishing content validity, (2) response process validity, and (3) construct validity and reliability of the THEEM tool.
Step 1: content validity
Participants
For content validation, 10 telehealth experts were recruited through purposive sampling to ensure the inclusion of individuals with relevant expertise in providing informed feedback on item relevance and clarity. The experts were contacted directly through professional networks and provided with study details. The use of 10 experts aligns with established guidelines for content validity studies, where a panel of 5–10 subject matter experts is considered sufficient to evaluate the relevance and clarity of tool items [12].
Inclusion criteria
1. Physicians actively working in telehealth settings.
2. A minimum of 5 years of experience in telehealth practice.
3. An administrative or leadership role in telehealth.
Data collection method
The preliminary tool was shared via Google Forms, and the experts rated each item as follows:
- Relevance: on a 4-point Likert scale (1 = not relevant to 4 = highly relevant).
- Clarity: on a 3-point Likert scale (1 = not clear to 3 = very clear).
Data analysis
The content validity index (CVI) was calculated for individual items (I-CVI) and for the scale (S-CVI), based on expert agreement on item relevance. Relevance ratings of 3 or 4 were coded as 1 (agreement), and ratings of 1 or 2 were coded as 0 (no agreement) (Refer Table 1).
- The I-CVI was calculated as: I-CVI = number of experts in agreement / total number of experts. Items with I-CVI values less than 0.7 were excluded from the tool.
- The S-CVI was calculated from the I-CVI values as: S-CVI = sum of I-CVI scores / total number of items.
- The content clarity average (CCA) for each item was calculated as: sum of clarity scores / total number of experts. An average > 2.4 was taken as acceptable clarity; item statements with values below 2.4 were modified for clarity [13].
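The coding scheme and formulas above can be sketched in a few lines of Python. This is an illustrative example only: the ratings matrix below is invented, not the study’s data (the study collected ratings from 10 experts on 35 items via Google Forms).

```python
import numpy as np

# Hypothetical relevance ratings (rows = 10 experts, columns = 3 items)
# on the 4-point scale; these values are invented for illustration.
relevance = np.array([
    [4, 3, 2],
    [4, 4, 1],
    [3, 4, 2],
    [4, 3, 2],
    [3, 4, 1],
    [4, 4, 2],
    [4, 3, 2],
    [3, 4, 1],
    [4, 4, 2],
    [4, 3, 2],
])

# Ratings of 3 or 4 are coded 1 (agreement); 1 or 2 are coded 0.
agreement = (relevance >= 3).astype(int)

# I-CVI: number of experts in agreement / total number of experts.
i_cvi = agreement.mean(axis=0)

# S-CVI: sum of I-CVI scores / total number of items.
s_cvi = i_cvi.mean()

# Items with I-CVI < 0.7 are excluded.
retained = i_cvi >= 0.7
print(i_cvi, round(s_cvi, 2), retained)
```

In this invented example the third item fails the 0.7 threshold and would be dropped, while the first two are retained.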
Step 2: validity of the response process through cognitive interviews
Participants
Five telehealth physicians were selected via convenience sampling as per standard guidelines for cognitive interviewing [14]. The inclusion criterion was telehealth physicians with a minimum of 3 years of experience.
Data collection method
Online cognitive interviews were conducted via Zoom (Zoom.us) with the 5 telehealth physicians, to check for respondents’ ease in understanding the tool items. A pilot interview was first conducted with one telehealth physician to identify potential issues, allowing for adjustments to be made before conducting the actual online interviews. The interviews were conducted via the ‘think-aloud technique’ by the primary researcher. To increase the credibility of the technique and reduce bias, another researcher was also present. The interviews were recorded for later review. The participants were instructed to read each item from the tool, rephrase the meaning of the item in their own words and clarify any unclear words or phrases. They were also asked to select appropriate responses on the basis of their understanding of the items so that the clarity of their understanding could be guaranteed.
Data analysis
Memos were created independently by the 2 researchers during the cognitive interviews, categorizing each item into 3 categories: (1) items with no problems in understanding, (2) items with minor problems in understanding, and (3) items with major problems in understanding. Items requiring clarification were reworded following a review by all the authors [14].
Step 3: establishing construct validity and reliability
Exploratory Factor Analysis (EFA) was conducted to evaluate the construct validity of the telehealth learning environment questionnaire. Before conducting the EFA, the dataset was assessed for suitability using the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy and Bartlett’s test of sphericity. Following this, Principal Component Analysis (PCA) with Varimax rotation was employed to identify the underlying factor structure. The internal consistency reliability of the tool was calculated through Cronbach’s alpha.
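Both suitability checks can be computed directly from the correlation matrix. The sketch below is a minimal illustration using simulated one-factor data (the dataset and variable names are our own, not the study’s); KMO values near 1 and a significant Bartlett p-value indicate that the data are factorable.

```python
import numpy as np
from scipy.stats import chi2

# Simulated responses: 320 respondents x 8 items sharing one latent trait
# (illustrative only; the study administered 32 items).
rng = np.random.default_rng(1)
latent = rng.normal(size=(320, 1))
data = 3 + latent + 0.8 * rng.normal(size=(320, 8))

R = np.corrcoef(data, rowvar=False)
n, p = data.shape

# Partial correlations via the inverse correlation matrix.
V = np.linalg.inv(R)
partial = -V / np.sqrt(np.outer(np.diag(V), np.diag(V)))
off = ~np.eye(p, dtype=bool)

# KMO: squared correlations relative to squared correlations plus
# squared partial correlations, summed over off-diagonal entries.
kmo = (R[off] ** 2).sum() / ((R[off] ** 2).sum() + (partial[off] ** 2).sum())

# Bartlett's test of sphericity: tests whether R is an identity matrix.
stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
p_value = chi2.sf(stat, df=p * (p - 1) / 2)
```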
Participants
Purposive sampling was used to recruit 320 telehealth physicians. The sample size was calculated on the basis of an item-to-participant ratio of 1:10, as reported in the literature (32 items × 10 = 320) [11]. The inclusion criterion was telehealth physicians working in different telehealth settings.
Data collection method
The finalized tool, after response process validity, was shared as a Google Form along with consent via emails, WhatsApp, and Facebook groups, including those managed by telehealth companies like SehatKahani, eDoctor and Mentor Health.
The form consisted of two sections:
- Section 1: participant demographics.
- Section 2: THEEM items rated on a 5-point Likert scale: 1 = strongly disagree, 2 = somewhat disagree, 3 = neither agree nor disagree, 4 = somewhat agree, and 5 = strongly agree.
Data analysis
The data were exported from Google Forms as an Excel file and analysed in SPSS version 26. Factors were retained based on the Kaiser-Guttman criterion (eigenvalue > 1) and examination of the scree plot. Only items with factor loadings ≥ 0.50 were retained, ensuring that each item contributed substantially to its factor.
The internal consistency of the THEEM tool was assessed using Cronbach’s alpha. A Cronbach’s alpha of ≥ 0.7 was considered an acceptable value for satisfactory internal consistency for the subscales and scale. Values ≥ 0.80 were considered indicative of good reliability.
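The retention and reliability rules above can also be sketched numerically. The example below uses simulated Likert data (invented for illustration; it does not reproduce the study’s SPSS workflow or its 32-item dataset).

```python
import numpy as np

# Simulated Likert responses: 320 respondents x 8 items driven by one
# shared latent trait (illustrative only).
rng = np.random.default_rng(0)
latent = rng.normal(size=(320, 1))
data = np.clip(np.rint(3 + latent + 0.8 * rng.normal(size=(320, 8))), 1, 5)

# Kaiser-Guttman criterion: retain factors whose correlation-matrix
# eigenvalues exceed 1 (sorted in descending order).
eigvals = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
n_factors = int((eigvals > 1).sum())

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total-score variance).
k = data.shape[1]
alpha = k / (k - 1) * (1 - data.var(axis=0, ddof=1).sum()
                       / data.sum(axis=1).var(ddof=1))
```

With a single strong latent trait, the first eigenvalue dominates and alpha clears the 0.7 acceptability threshold.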
Results
Phase one: tool development
Phase 1 of the development of the THEEM tool involved a comprehensive literature review, development of preliminary items, and expert feedback.
1. Literature review and identification of themes: The literature review identified seven themes: technology, learning resources, instructor support, learner engagement, assessment and evaluation, social support, and motivation. These formed the foundation for developing the THEEM tool.
2. Item development: Thirty-five preliminary items were developed for the THEEM tool, designed to reflect the seven key themes and structured on a five-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree). The number of items developed under each theme was:
- Technology (6 items)
- Learning resources (6 items)
- Instructor support (5 items)
- Learner engagement (3 items)
- Assessment and evaluation (5 items)
- Social support (5 items)
- Motivation (5 items)
3. Expert feedback: After the development of the preliminary items, feedback was obtained from two telehealth experts and two medical educationists. The experts provided qualitative feedback on the content and clarity of the items, suggesting modifications to ensure that the tool accurately captured the intended aspects of the telehealth learning environment. Based on their feedback, items were revised for clarity, relevance, and appropriateness before further validation in Phase 2.
Phase two: tool validation
Content validation
The content validity index (CVI) was calculated for individual items (I-CVI) and the overall scale (S-CVI). Items with I-CVI < 0.7 were excluded, as shown in Table 1, and clarity issues were addressed for items with a content clarity average (CCA) < 2.4 (Refer Table 2) [15]. Of the 35 items, three were eliminated during this phase, and eight items were modified for improved clarity. The final S-CVI was 0.86, indicating strong content validity.
Response process validity
Five telehealth physicians participated in cognitive interviews using the think-aloud technique. Seven items categorized as having minor or major issues were rephrased by the authors based on participant feedback, for better comprehension (Refer Table 3).
Construct validity and reliability
Exploratory factor analysis (EFA)
EFA was conducted on data from 320 telehealth physicians; 253 (79.1%) were based in Pakistan, while 67 (20.9%) were from other countries, including the USA, UAE, UK, Bangladesh, Singapore, Saudi Arabia, Qatar, and Australia. Females predominated, with 245 (76.6%) participants versus 75 (23.4%) males. The Kaiser-Meyer-Olkin (KMO) value of 0.959 and a significant Bartlett’s test (p < 0.001) confirmed the data’s suitability for factor analysis (Refer Table 4) [16].
Three factors were extracted based on eigenvalues > 1 (Refer Fig. 1). Items with factor loadings ≥ 0.5 were retained (Refer Table 5). The factor loading values indicate the degree of association between the items and the underlying factors derived from the analysis [17]. The three identified factors were:
- Factor 1: Learner engagement and support (17 items). This construct measures the level of physician engagement and satisfaction, reflecting the overall learner experience in the telehealth educational environment.
- Factor 2: Instructor guidance and communication (6 items). This factor measures learners’ perceptions of communication and support from instructors during the telehealth education program.
- Factor 3: Quality and accessibility of educational resources and technology (7 items). This factor evaluates the effectiveness, value, and accessibility of educational resources, along with the reliability of the technology used (e.g., the telehealth platform, internet connectivity, and audio/video quality).
Two items with factor loadings < 0.5 were eliminated, resulting in a final tool of 30 items.
Reliability
The internal consistency of the 30-item tool and of the individual factors was evaluated using Cronbach’s alpha in SPSS version 26. The overall Cronbach’s alpha for the full tool (30 items) was 0.9, indicating excellent internal consistency. The individual factors also demonstrated strong reliability, with values ranging from 0.884 to 0.950 (Refer Table 6).
A summary flowchart detailing the methods and outcomes at each step in the development and validation of the THEEM tool is given below (Refer Fig. 2):
Discussion
Telehealth offers benefits such as improved access to care and cost reduction; however, it also presents safety risks if implemented without proper training and competence of the healthcare workforce [18]. Numerous studies emphasize the importance of evidence-based approaches to interactive health communication, highlighting the need for better access and improved safety in virtual healthcare delivery [19]. One key distinction between telehealth learning environments and other online learning environments is their integration of real-time patient interactions, which adds layers of complexity [20]. Unlike standard online learning environments, telehealth educational environments require learners to navigate advanced medical technologies and manage remote consultations through virtual platforms [21]. Existing tools, such as the SPROUT Telehealth Evaluation and Measurement (STEM), have primarily focused on the impact of telehealth on patients [15]. Similarly, the extensive literature on evaluating online learning environments, such as the development and validation of the EEAM, while informative, does not directly address the unique challenges of the telehealth learning environment [16].
This study aimed to develop and validate the Telehealth Educational Environment Measurement (THEEM) tool to assess the telehealth learning environment, which is increasingly becoming a crucial part of healthcare education, especially in underserved regions. The tool was developed through a systematic process involving a comprehensive literature review, expert feedback, and validation via content validity, response process validity, and construct validity.
In the development phase, the THEEM tool items were derived from key themes identified in the literature on telehealth education: technology, learning resources, instructor support, learner engagement, assessment and evaluation, social support, and motivation. Technological infrastructure is crucial for ensuring seamless communication and interactive learning, which is essential for the success of telehealth programs. Learner engagement and motivation, extending beyond content delivery to include active participation and interaction with digital platforms, are critical to enhancing outcomes in telehealth education. Instructor support is another essential element, offering guidance on navigating technology, addressing questions, and fostering empathy during remote consultations, as is effective assessment and feedback.
The THEEM tool evaluates the adequacy of support to ensure learners are confident in their skills. Additionally, the quality of educational resources, such as simulations, is key to achieving learning objectives in telehealth training. The tool assesses the quality and accessibility of these resources to ensure they meet the needs of healthcare professionals. By addressing these components, the THEEM tool captures these aspects of telehealth education, ensuring a comprehensive evaluation of the environment. Its ability to accurately reflect the core aspects of telehealth education, such as technological infrastructure, instructor support, and learner engagement, further establishes its utility in educational settings [22].
Furthermore, this tool not only facilitates a better understanding of the educational environment but also serves as a catalyst for changes in telehealth education. By providing a structured approach to evaluating the telehealth learning environment, the THEEM tool can help educational institutions build more comprehensive training programs that are aligned with clinical and technological best practices. This approach ensures that telehealth professionals can effectively leverage virtual platforms, fostering confidence, proficiency, and an emphasis on patient safety [23]. With structured evaluations, telehealth education can evolve to be more responsive to learners’ needs, supporting their growth as competent, adaptable, and safety-conscious practitioners [24].
The relevance of the THEEM tool extends beyond individual training programs, influencing the broader landscape of telehealth. However, its effectiveness also requires continuous refinement and alignment with evolving telehealth technologies and practices, ensuring its relevance in dynamic healthcare environments.
The validation of the THEEM tool was a crucial step in ensuring its applicability and reliability in assessing the telehealth educational environment. The content validity index, calculated for both individual items (I-CVI) and the overall scale (S-CVI), showed that the tool met the required relevance thresholds, with I-CVI values above 0.7 and an overall S-CVI of 0.86. This high agreement among experts confirms the tool’s content validity, supporting the importance of expert consensus in the development of educational measurement tools, as emphasized in previous studies [25].
The tool’s construct validity, confirmed through factor analysis with factor loadings ≥ 0.50, shows that it can reliably assess key aspects of the telehealth educational environment. This ensures the tool effectively measures important dimensions of telehealth education and provides useful insights to improve training programs. It revealed three distinct factors: Learner engagement and support, instructor guidance and communication, and quality and accessibility of educational resources and technology. These three factors align well with the essential components of telehealth education identified in the literature. The learner engagement factor emphasizes the need for active participation and sustained interest in telehealth education, which is critical for achieving optimal learning outcomes. Instructor support and communication have been identified as crucial factors for fostering successful learning environments in telehealth settings [26]. Similarly, the importance of technology and resources in telehealth training has been well-documented [27], as telehealth platforms and tools serve as the foundation for remote healthcare delivery.
These findings are consistent with existing literature that suggests that effective telehealth education requires a balance between technological tools, instructor involvement, and student engagement [26, 27]. By evaluating these critical dimensions, the THEEM tool provides a comprehensive measure of the telehealth learning environment, offering insights that can guide improvements in telehealth training and professional development.
The THEEM tool demonstrated excellent internal consistency, with a Cronbach’s alpha of 0.90, indicating its reliability in measuring the telehealth learning environment. This high value exceeds the commonly accepted threshold of ≥ 0.70 for educational measurement tools [28]. The result highlights the tool as a dependable instrument for evaluating telehealth education and its potential for widespread use in telehealth training programs.
The THEEM tool has significant implications for telehealth education. It provides a comprehensive framework for assessing key components of the telehealth learning environment. Given the rapid growth of telehealth, particularly in response to the COVID-19 pandemic, such tools are crucial for identifying gaps in training, addressing challenges, and improving educational outcomes.
Moreover, the tool’s ability to identify areas for improvement in telehealth programs can help inform future curriculum development, ensuring that healthcare professionals are adequately trained to provide effective care in virtual settings especially in countries with infrastructural limitations, low digital literacy, and a lack of formal training programs [29].
Conclusion
The THEEM tool provides a valuable, reliable, and valid instrument for assessing the telehealth educational environment. Its development fills a significant gap in existing measurement tools, specifically addressing the needs of telehealth education. This tool can be used by healthcare institutions to evaluate and improve their telehealth training programs, ultimately contributing to the professional development of healthcare providers and the advancement of telehealth services.
Limitations of the study
This study has a few limitations. Potential bias may exist because the experts were selected from a single country. Also, only telehealth physicians were included, excluding input from other key stakeholders such as IT professionals, patients, technology companies, nurses, and other healthcare providers; this may limit generalizability. Participant responses may also have been influenced by subjective experiences or social desirability bias.
Future recommendations
Future research should establish the effectiveness of the THEEM tool in different telehealth training environments. Longitudinal studies could assess the tool’s ability to track improvements in telehealth education over time. Empirical research to establish cutoff scores for the tool would also aid interpretation of findings.
Data availability
All data generated or analysed during this study are included in this published article.
References
Kazi AM, Qazi SA, Ahsan N, Khawaja S, Sameen F, Saqib M, et al. Current Challenges of Digital Health Interventions in Pakistan: mixed methods analysis. J Med Internet Res. 2020;22(9):e21691.
Khan S. Sehat Kahani is showing Pakistan that digital health services can change lives– for both patients and doctors [Internet]. 5 January 2023. 2023. Available from: https://www.gavi.org/vaccineswork/sehat-kahani-showing-pakistan-digital-health-services-can-change-lives-both?utm_source=chatgpt.com
Tracxn. Top 5 startups in Telemedicine in Pakistan in Nov, 2024 - Tracxn [Internet]. 2024. Available from: https://tracxn.com/d/explore/telemedicine-startups-in-pakistan/__aSUs8XMypMHc8_kgG8CjWrfm-rYlEY4YO0B1RixWqr4/companies?utm_source=chatgpt.com
Karim M, Saeed Khurram S, Aga IZ, Muzzamil M, Hashmi S, Saeed M et al. Analysis of telemedicine service delivery in Karachi, Pakistan: A cross-sectional survey examining practices and perspectives of healthcare providers. Clin Epidemiol Glob Health [Internet]. 2024;27(March):101607. Available from: https://doi.org/10.1016/j.cegh.2024.101607
Bilal W, Qamar K, Siddiqui A, Kumar P, Essar MY. Digital health and telemedicine in Pakistan: Improving maternal healthcare. Ann Med Surg [Internet]. 2022;81. Available from: https://api.semanticscholar.org/CorpusID:251674307
Kazmi S, Yasmin F, Siddiqui SA, Shah M, Tariq R, Nauman H, et al. Nationwide assessment of knowledge and perception in reinforcing telemedicine in the age of COVID-19 among medical students from Pakistan. Front Public Health. 2022;10(March):1–8.
Chike-Harris KE, Durham C, Logan A, Smith G, DuBose-Morris R. Integration of telehealth education into the health care provider curriculum: a review. Telemed e-Health. 2021;27(2):137–49.
Maynard K, Knickerbocker J. A telemedicine standardized patient experience: enhancing the virtual Classroom and Preparing for Alternative modalities of Care. Nurs Educ Perspect. 2023;44(3):181–2.
Klasen JM, Meienberg A, Bogie BJM. Medical student engagement during COVID-19: Lessons learned and areas for improvement. Med Educ [Internet]. 2021;55(1):115–8. Available from: https://doi.org/10.1111/medu.14405
Rashid M, Foulds J, Forgie S. Practical tips on incorporating learners into virtual clinical care. MedEdPublish. 2020;9:167.
Artino ARJ, La Rochelle JS, Dezee KJ, Gehlbach H. Developing questionnaires for educational research: AMEE Guide 87. Med Teach. 2014;36(6):463–74.
Yusoff MSB. ABC of response process validation and face Validity Index calculation. Educ Med J. 2019;11(3):55–61.
Acknowledgements
Not Applicable.
Funding
This project was self-funded and undertaken as part of the first author’s master’s degree program.
Author information
Contributions
MA led the conceptualization, resource management, formal analysis and manuscript writing. HR and MA handled data curation and investigation. MS and MA designed the methodology. MS and RA oversaw project administration and validation and provided critical feedback (manuscript editing and finalization). All the authors read and approved the final manuscript.
Ethics declarations
Ethics approval and consent to participate
This study was conducted in full compliance with the Declaration of Helsinki and the guidelines of the National Bioethics Committee for Research, Pakistan, with approval obtained from the Institutional Review Committee/Ethics Review Committee of Islamic International Medical College (Ref No. Riphah/IIMC/IRC/23/3026). Verbal informed consent was obtained from the participants in the cognitive interviews to ensure their understanding and voluntary participation. For the construct validation, written informed consent was obtained from all participants, who were fully informed about the study’s purpose, procedures, and rights, including the right to withdraw at any stage without consequences. These measures were taken to uphold ethical standards and participant autonomy.
Consent for publication
Not applicable.
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Electronic supplementary material
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
Cite this article
Arif, M., Sajjad, M., Khan, R.A. et al. Development and validation of a tool to measure telehealth educational environment (THEEM). BMC Med Educ 25, 136 (2025). https://doi.org/10.1186/s12909-025-06751-5