Are Video Recordings Reliable for Assessing Surgical Performance? A Prospective Reliability Study Using Generalizability Theory

Bibliographic Details
Published in: Simulation in healthcare : journal of the Society for Medical Simulation, Vol. 18, No. 4, pp. 219-225
Main Authors: Frithioff, Andreas, Frendø, Martin, Foghsgaard, Søren, Sørensen, Mads Sølvsten, Andersen, Steven Arild Wuyts
Format: Journal Article
Language: English
Published: United States: Lippincott Williams & Wilkins, 01-08-2023
Abstract

Introduction: Reliability is pivotal in surgical skills assessment. Video-based assessment can be used for objective assessment without the physical presence of assessors. However, its reliability for surgical assessments remains largely unexplored. In this study, we evaluated the reliability of video-based versus physical assessments of novices' surgical performances on human cadavers and 3D-printed models, an emerging simulation modality.

Methods: Eighteen otorhinolaryngology residents performed 2 to 3 mastoidectomies on a 3D-printed model and 1 procedure on a human cadaver. Performances were rated by 3 experts evaluating the final surgical result using a well-known assessment tool. Performances were rated both hands-on/physically and by video recordings. Interrater and intrarater reliability were explored using κ statistics, and the optimal number of raters and performances required in either assessment modality was determined using generalizability theory.

Results: Interrater reliability was moderate, with a mean κ score of 0.58 (range, 0.53-0.62) for video-based assessment and 0.60 (range, 0.55-0.69) for physical assessment. Video-based and physical assessments were equally reliable (G coefficient 0.85 vs 0.80 for 3D-printed models and 0.86 vs 0.87 for cadaver dissections). The interaction between rater and assessment modality contributed 8.1% to 9.1% of the estimated variance. For the 3D-printed models, 2 raters evaluating 2 video-recorded performances or 3 raters physically assessing 2 performances yielded sufficient reliability for high-stakes assessment (G coefficient >0.8).

Conclusions: Video-based and physical assessments were equally reliable. Some raters were affected by changing from physical to video-based assessment; consequently, assessment should be either physical or video based, not a combination.
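For readers unfamiliar with the statistics named in the abstract: interrater agreement of the kind reported above (mean κ around 0.6) is conventionally computed with Cohen's kappa. The following is a minimal Python sketch of unweighted Cohen's kappa for two raters; the scores are invented toy data, not ratings from the study.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b) and rater_a, "need paired, non-empty ratings"
    n = len(rater_a)
    # Observed proportion of exact agreement.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal category frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_expected = sum(freq_a[c] * freq_b[c] for c in freq_a.keys() | freq_b.keys()) / n**2
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical item scores (e.g., 0-2 points per assessment-tool item) from two raters.
rater_1 = [2, 1, 0, 2, 1, 1, 2, 0, 1, 2]
rater_2 = [2, 1, 1, 2, 1, 0, 2, 0, 1, 2]
print(round(cohen_kappa(rater_1, rater_2), 2))  # prints 0.69 for this toy data
```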
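The generalizability-theory result (a G coefficient above 0.8 with, for example, 2 raters and 2 video-recorded performances) comes from a decision (D) study, in which estimated variance components are projected onto different numbers of raters and performances. Below is a minimal sketch of that projection for an assumed fully crossed trainee x rater x performance design; the variance components are illustrative placeholders, not estimates from the paper.

```python
def g_coefficient(var_trainee, var_tr, var_tp, var_residual, n_raters, n_performances):
    """Relative G coefficient for a crossed trainee x rater x performance design.

    Error variances are divided by the planned numbers of raters and
    performances, as in a generalizability-theory decision (D) study.
    """
    error = (var_tr / n_raters
             + var_tp / n_performances
             + var_residual / (n_raters * n_performances))
    return var_trainee / (var_trainee + error)

# Illustrative variance components (not values from the study).
components = dict(var_trainee=1.0, var_tr=0.25, var_tp=0.40, var_residual=0.60)

for n_raters in (1, 2, 3):
    for n_perf in (1, 2, 3):
        g = g_coefficient(**components, n_raters=n_raters, n_performances=n_perf)
        print(f"{n_raters} rater(s) x {n_perf} performance(s): G = {g:.2f}")
```

Sampling either facet more densely raises the coefficient, which is the logic behind the rater/performance combinations recommended in the abstract.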
AuthorAffiliation From the Copenhagen Hearing and Balance Center, Department of Otorhinolaryngology—Head & Neck Surgery and Audiology (A.F., M.F., S.F., M.S., S.A.W.A.), Rigshospitalet, Copenhagen; and Copenhagen Academy for Medical Education and Simulation (CAMES; A.F., M.F., S.A.W.A.), Center for HR & Education, Copenhagen, Denmark
Copyright Lippincott Williams & Wilkins
Copyright © 2022 Society for Simulation in Healthcare.
DOI 10.1097/SIH.0000000000000672
Discipline Medicine
EISSN 1559-713X
EndPage 225
ISSN 1559-2332
Issue 4
Language English
License Copyright © 2022 Society for Simulation in Healthcare.
PMID 36260767
PageCount 7
PublicationDate 2023-08-01
PublicationPlace United States
PublicationTitle Simulation in healthcare : journal of the Society for Medical Simulation
PublicationTitleAlternate Simul Healthc
PublicationYear 2023
Publisher Lippincott Williams & Wilkins
StartPage 219
Volume 18