Research

Clinical judgment of GPs for the diagnosis of dementia: a diagnostic test accuracy study

Samuel Thomas Creavin, Judy Haworth, Mark Fish, Sarah Cullum, Anthony Bayer, Sarah Purdy and Yoav Ben-Shlomo
BJGP Open 2021; 5 (5): BJGPO.2021.0058. DOI: https://doi.org/10.3399/BJGPO.2021.0058
Samuel Thomas Creavin,1 Judy Haworth,1 Mark Fish,2 Sarah Cullum,3 Anthony Bayer,4 Sarah Purdy1 and Yoav Ben-Shlomo1

1 Population Health Sciences, University of Bristol, Bristol, UK
2 Royal Devon and Exeter NHS Foundation Trust, Exeter, UK
3 Department of Psychological Medicine, School of Medicine, The University of Auckland, Grafton, New Zealand
4 School of Medicine, Cardiff University, Cardiff, UK

Correspondence: sam.creavin@bristol.ac.uk

Abstract

Background GPs often report using clinical judgment to diagnose dementia.

Aim To investigate the accuracy of GPs’ clinical judgment for the diagnosis of dementia.

Design & setting Diagnostic test accuracy study, recruiting from 21 practices around Bristol, UK.

Method The clinical judgment of the treating GP (index test) was based on the information immediately available at their initial consultation with a person aged ≥70 years who had cognitive symptoms. The reference standard was an assessment by a specialist clinician, based on a standardised clinical examination and made according to the 10th revision of the International Classification of Diseases (ICD-10) criteria for dementia.

Results A total of 240 people were recruited, with a median age of 80 years (interquartile range [IQR] 75–84 years), of whom 126 (53%) were men and 132 (55%) had dementia. The median duration of symptoms was 24 months (IQR 12–36 months) and the median Addenbrooke's Cognitive Examination III (ACE-III) score was 75 (IQR 65–87). GP clinical judgment had sensitivity 56% (95% confidence interval [CI] = 47% to 65%) and specificity 89% (95% CI = 81% to 94%). Positive likelihood ratio was higher in people aged 70–79 years (6.5, 95% CI = 2.9 to 15) compared with people aged ≥80 years (3.6, 95% CI = 1.7 to 7.6), and in women (10.4, 95% CI = 3.4 to 31.7) compared with men (3.2, 95% CI = 1.7 to 6.2), whereas the negative likelihood ratio was similar in all groups.

Conclusion A GP clinical judgment of dementia is specific, but confirmatory testing is needed to exclude dementia in symptomatic people whom GPs judge as not having dementia.

  • dementia
  • general practice
  • sensitivity and specificity
  • medical history taking
  • symptom assessment

How this fits in

Previous studies in this area have investigated the accuracy of GP clinical judgment as a screening test for dementia in unselected people attending a primary care clinic or as a retrospective test based on their knowledge of their patient. Some studies have derived the accuracy of judgment from the medical records, which may not reflect the judgment of the clinician. The role of the GP in supporting a more effective route to diagnosis for people with dementia is a research priority for patients, carers, and clinicians. This study shows that, in a symptomatic older adult, clinical judgment may be useful for helping to confirm a diagnosis of dementia, but GP judgment should not by itself be used to exclude dementia.

Introduction

The James Lind Alliance has identified the role of general practice in supporting a more effective route to diagnosis of dementia as a priority for health research.1 People with symptoms of dementia have historically faced long delays to get an assessment and an explanation for their symptoms.2 Approaches to address waiting lists have included psychiatrists supporting primary care memory clinics,3 integrated one-stop clinics,4 and training GPs to make a diagnosis in uncomplicated cases,5,6 which is supported by the National Institute for Health and Care Excellence (NICE).7 Some GPs have in the past been hesitant about diagnosing dementia when there is no disease-modifying treatment,8 and disclosure of a diagnosis can still be problematic, especially if the affected person is not seeking help.9 The situation has been complicated in the UK by controversial policies that have funded case-finding for dementia.10–12 Formally evaluating cognition takes time and familiarity with tests. A GP could use a range of brief cognitive assessments13 to evaluate a person with symptoms of dementia, and national guidelines differ on which test to use.14,15 Instead, GPs report using non-standardised processes16 such as clinical judgment17 to diagnose dementia. The sensitivity of GP clinical judgment for diagnosing dementia has been reported as between 51%18 and 100%,19 and the specificity ranges from 58%20 to 100%.19

Previous studies investigating the accuracy of GP clinical judgment have typically suffered from one of two significant limitations.21 First, they have used a definition of clinical judgment that is of unclear relevance to practice, such as judgment in hindsight, or have relied on diagnoses documented in medical records, which are systematically incomplete.22 Second, they have sampled unselected people attending general practice regardless of symptoms, which is more akin to screening. The aim of this study was to address these limitations and to investigate the prospective diagnostic accuracy of GP clinical judgment for the diagnosis of dementia syndrome in symptomatic people aged >70 years.23

Method

Population

Participants were recruited from 21 participating GP surgeries in the Bristol, North Somerset, and South Gloucestershire (BNSSG) area, a diverse geographic area within 15 miles of the city of Bristol covering a total population of around 900 000 people across 82 GP practices. Research clinics were held in four of the participating GP surgeries, strategically located for accessibility. A minimum sample size of 200 was calculated, based on a specificity of 95% in prior studies and a 75% prevalence of dementia in local memory clinic data.24
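The paper cites Flahault et al.24 for the sample-size rationale. As a rough illustration only of how a figure of this order can arise from the stated inputs (95% specificity, 75% dementia prevalence), the sketch below applies a common normal-approximation (Buderer-style) calculation for estimating specificity; the ±6 percentage point precision and the choice of formula are assumptions for illustration, not the authors' exact method.

import math

def n_for_specificity(spec, prevalence, precision, z=1.96):
    """Total sample size so that the non-diseased subgroup is large enough to
    estimate specificity with the given 95% CI half-width (normal approximation;
    illustrative only, not necessarily the method used in the study)."""
    n_nondiseased = (z ** 2) * spec * (1 - spec) / precision ** 2
    return n_nondiseased / (1 - prevalence)  # inflate for the diseased fraction

# Assumed inputs: specificity 95% (prior studies), dementia prevalence 75%
# (local memory clinic data), and an assumed precision of +/-6 percentage points.
print(round(n_for_specificity(spec=0.95, prevalence=0.75, precision=0.06)))  # ~203

With these assumed inputs the calculation gives roughly 200, consistent with the minimum the authors report; the published protocol23 and Flahault et al.24 describe the actual approach.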

Inclusion and exclusion criteria

Participants were people with cognitive symptoms but no prior diagnosis of dementia, aged >70 years, and who had been referred by their GP to this research study. Cognitive symptoms were not specified but generally include disturbance in memory, language, executive function, behaviour, and visuospatial skills.25 Symptoms were required to be present for at least 6 months, and could be reported by the person themselves, a family member, a professional, or another person. There was no severity threshold. Cognitive problems did not need to be the focus of the consultation and (as routine practice) GPs could enquire about cognition if they perceived a problem. Symptom duration was determined from the clinical history. An accompanying informant was mandatory. All participants were offered free accessible transport and translation services. People were excluded if they had a known neurological disorder (that is, Parkinsonism, multiple sclerosis, learning disability, Huntington’s disease), were registered as blind, had profound deafness (that is, were unable to use a telephone), had a psychiatric disorder requiring current secondary care input, or if cognitive symptoms were either rapidly progressive or coincident with neurological disturbance. People with cognitive problems so advanced that they were unable to consent were excluded, as a lay advisory group judged that the research process would be overly burdensome for them. GPs were encouraged to make a clinical judgment and refer a consecutive series of all eligible patients with cognitive symptoms to the study, regardless of what their clinical judgment was or of any test results. GPs gave study information, including a leaflet, and obtained verbal consent to share contact details with the study; the referral form included the person's age, sex, contact details, and the GP's clinical judgment. The study team contacted people referred by GPs to re-confirm eligibility, provide further written study details, and offer a research clinic appointment. The research team took written consent from all participants.

Index test of clinical judgment

The referring GP recorded their clinical judgment using an electronic referral form during a consultation with their patient about cognitive symptoms. Clinical judgment was operationalised as 'normal' cognition, 'cognitive impairment not dementia (CIND)', or 'dementia' as options for response to the question 'Is your gut feeling that this person has ___ ?'. GPs were not specially trained, were not required to arrange any test, and could refer people simultaneously or subsequently to NHS services. The study team contacted the practice at least three times to obtain any missing referral data.

Reference standard

At the research clinic, a single specialist physician conducted a standardised assessment lasting approximately 60 minutes, comprising clinical history, the ACE-III,26 Brief Assessment Schedule Depression Cards (BASDEC),27 and the informant-completed Bristol Activities of Daily Living (BADL) Questionnaire.28 The specialist was not aware of other test results, including GP judgment or investigations. The reference standard was based on the evaluation of the specialist physician for dementia, according to ICD-10 criteria29 for each individual patient. Specific cut-offs on the aforementioned measures were not used and the expert used their integrated assessment to reach a diagnosis. CIND was diagnosed by the same expert and included Petersen mild cognitive impairment (MCI)30 and other causes of cognitive impairment that met neither criteria for ICD-10 dementia nor Petersen MCI, such as traumatic brain injury or affective disorder. Medical records were reviewed for all participants 6 months after the research clinic to identify any subsequent information that would contradict this judgment. A second specialist adjudicated cases where there was diagnostic uncertainty at the research clinic using the initial specialist assessment and the medical record review; the second specialist also did not have access to the GP judgment. Study data were electronically entered and managed using REDCap (Research Electronic Data Capture) hosted at the University of Bristol.31

Statistical methods

Separate logistic regression analyses were used with non-participation (referred by GP but not taking part) as the dependent variable and GP judgment, age (in years), and female sex as the independent variables to test the hypothesis of no association with these variables. Time from referral to appointment was described using median and IQR, and logistic regression was used to test the hypothesis of no association between time to appointment (in days) and dementia (as the dependent variable). Measures of diagnostic test accuracy were calculated together with 95% CIs, for GP judgment of dementia against reference standard of dementia. Sensitivity analyses were done to explore whether accuracy varied by age (<80 years and ≥80 years, since prediction models perform differently in these age groups)32 and sex. Cochran’s Q test was used to test the hypothesis of no difference in likelihood ratios between groups.33 This diagnostic test accuracy study is reported in line with STARDdem guidelines.34
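As a minimal sketch of the accuracy measures described here, the snippet below computes sensitivity, specificity, and positive and negative likelihood ratios with approximate 95% CIs from a 2×2 table. The counts used (TP = 74, FP = 12, FN = 58, TN = 96) are reconstructed from the reported totals and percentages purely for illustration; the definitive cross-tabulation is in Table 1, and the published analysis may have used different CI methods.

import math

Z = 1.96  # two-sided 95%

def proportion_ci(k, n):
    """Wilson score interval for a proportion (one common choice; the study
    may have used a different method)."""
    p = k / n
    denom = 1 + Z**2 / n
    centre = (p + Z**2 / (2 * n)) / denom
    half = Z * math.sqrt(p * (1 - p) / n + Z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

def lr_ci(lr, k1, n1, k2, n2):
    """Approximate 95% CI for a likelihood ratio on the log scale (Simel method)."""
    se = math.sqrt(1 / k1 - 1 / n1 + 1 / k2 - 1 / n2)
    return lr * math.exp(-Z * se), lr * math.exp(Z * se)

# Illustrative counts reconstructed from the reported results
tp, fp, fn, tn = 74, 12, 58, 96

sens = tp / (tp + fn)          # ~0.56
spec = tn / (tn + fp)          # ~0.89
lr_pos = sens / (1 - spec)     # ~5
lr_neg = (1 - sens) / spec     # ~0.5

print("sensitivity", round(sens, 2), proportion_ci(tp, tp + fn))
print("specificity", round(spec, 2), proportion_ci(tn, tn + fp))
print("LR+", round(lr_pos, 1), lr_ci(lr_pos, tp, tp + fn, fp, fp + tn))
print("LR-", round(lr_neg, 2), lr_ci(lr_neg, fn, tp + fn, tn, fp + tn))

With these reconstructed counts the output reproduces the headline figures reported below (sensitivity 56%, specificity 89%, LR+ about 5, LR− about 0.5).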

Results

Participants

Recruitment took place between March 2015 and May 2017. Figure 1 shows a flowchart for inclusion in the study. The theoretically 'eligible' figure of 1735 people was derived from the age-specific incidence of dementia35 and the demographics of the population in the participating practices (34 956 people aged >70 years).36 The number approached is unknown. One person who consented withdrew before any data were collected because they were acutely ill. Of the 240 with available data, there were 20 borderline cases that were adjudicated by a second specialist. The 240 people were classified by the reference standard as: 'normal' cognition (n = 47); 'dementia' (n = 132, of whom one met DSM-5 but not ICD-10 criteria because they had subjective but not objective amnesia); or 'CIND' (n = 61), of whom 59 met criteria for MCI and the remaining two had an affective disorder and a brain injury, respectively. There was little evidence of an association between non-participation and a GP clinical judgment of CIND (odds ratio [OR] 1.2; 95% CI = 0.55 to 2.41) or dementia (OR 1.9; 95% CI = 0.90 to 3.93). Compared with people who participated, non-participants were older (OR per year 1.08; 95% CI = 1.04 to 1.12) and more often female (OR 1.88; 95% CI = 1.21 to 2.92). The median time between referral (clinical judgment) and the clinic appointment (reference standard) was 47 days (IQR 30–72 days). The longest interval was 177 days, owing to difficulties attending earlier appointments. There was no association between time from referral to appointment and dementia (OR per day 1.0; 95% CI = 0.99 to 1.01). Table 1 shows the demographics of participants and a cross-tabulation of GP opinion against the reference standard, allowing derivation of the diagnostic accuracy of clinical judgment for both CIND and dementia.

Figure 1. STARDdem flowchart for inclusion of participants in the study. CIND = cognitive impairment not dementia. ICD-10 = International Classification of Diseases, 10th revision. MCI = mild cognitive impairment.

a One person had to withdraw part way through the reference test as they were acutely ill. b Dementia according to ICD-10.29 c One person met criteria for ICD-10 dementia and also had features of normal pressure hydrocephalus; expert review endorsed a reference standard diagnosis of dementia. d Of 61 with CIND: 59 met criteria for Petersen MCI;30 one had an affective disorder and one a brain injury.

Table 1. Characteristics of participants by cognitive category

Two people could not complete the ACE-III because English was not their first language; both had declined an interpreter. In both cases sufficient information was available from other parts of the assessment for a categorisation about cognition to be made (one had normal cognition, one had dementia). For the 238 people who had an ACE-III score, the median was 75 (IQR 65–87). Referring GPs judged that 34 people had normal cognition, 86 had dementia, and 120 had CIND; the one person who withdrew from the study owing to acute illness had been judged by the referring GP to have CIND. People whom GPs judged as having dementia had a total ACE-III score IQR of 60–74, with a 90th centile of 81/100 and a highest score of 95/100. Similarly, people whom GPs judged as having CIND had an ACE-III score IQR of 71–89.

Diagnostic accuracy

Table 2 shows the diagnostic accuracy of GP judgment for dementia. The sensitivity of GP judgment was 56% (95% CI = 47% to 65%) and the specificity was 89% (95% CI = 81% to 94%). Clinical judgment was more useful for ruling in dementia than for ruling it out, with higher specificity and positive predictive value than sensitivity and negative predictive value. In people aged ≥80 years, clinical judgment had similar sensitivity (P = 0.296) and specificity (P = 0.798) to those aged <80 years. There was weak evidence that clinical judgment in women had a higher specificity (P = 0.074) and a higher sensitivity (P = 0.064) than clinical judgment in men.

Table 2. Accuracy of GP judgment for the diagnosis of dementia

Discussion

Summary

From 21 participating GP surgeries, 456 people were referred and 240 were evaluated. Of these, 132 (55%; 95% CI = 48% to 61%) had dementia. Clinical judgment as a single test had a positive likelihood ratio (LR+) of 5 (95% CI = 3 to 9) and a negative likelihood ratio (LR−) of 0.5 (95% CI = 0.4 to 0.6) for the target condition of dementia. People whom GPs judged as having dementia had a total ACE-III score IQR of 60–74, and those whom they judged as having MCI had a total ACE-III IQR of 71–89. This compares with published ACE-III thresholds of <82 for dementia37 and <88 for MCI,37 and suggests that, in this study, GPs were not being overly restrictive in their judgment of dementia, nor overly liberal in their judgment of CIND.

Strengths and limitations

The patient selection in the current study closely reflects real-world clinical practice in the UK, with efforts to avoid exclusion based on language, transport, or appointment availability. Participants were included with a range of GP opinions about the presence of cognitive impairment in people who had presented with symptoms in a consultation; typically 2.5 problems are discussed per appointment.38 The index test reflects an average measure of diagnostic accuracy for an estimated 142 whole-time equivalent GPs working in different settings,39 who were not specially trained. GPs were instructed not to use any formal test to inform their judgment, but it is possible that brief cognitive tests, such as the General Practitioner Assessment of Cognition (GPCOG),40 may have been occasionally used. Based on previous studies, clinical judgment is likely to be based on rules of thumb,16 not formal tests,17 and information on referral forms indicated that judgment was informed by 'face-to-face presentation'. The interval between clinical judgment and the reference standard was unlikely to be associated with a significant progression in cognitive impairment.15 The index test for all consenting participants was fully verified, follow-up data were obtained after 6 months, and uncertain cases were adjudicated.

There was no evidence of selective participation by cognitive status, but non-participants may differ in other unmeasured ways that affect diagnostic accuracy. As reported in the Results, it is estimated that up to 1735 people in the study population would have developed symptoms in the study period, but it is unknown how many of these presented to their GP. The authors have no data on recruitment bias, but dementia was less prevalent than predicted from local memory clinic data, suggesting a lower threshold for referral to the study. Any systematic selection bias in whom GPs referred to the study (such as excluding more frail people) would limit the generalisability of the findings to that group. An important limitation is that, despite the provision of translation services, the population comprised largely White, native English speakers. In addition, the CIs for the subgroups are still wide. People with advanced cognitive impairment who could not consent were excluded, so the findings cannot be generalised to that group, although it is likely that GPs would be more sensitive in identifying cognitive impairment at a more advanced stage.

Comparison with existing literature

Table 3 summarises the features of this study compared with the existing literature.41,42 A major strength of this study for applicability to practice is that it is one of only two studies to evaluate symptomatic people. The present study has the smallest number of participants undergoing the index test, but only one other study has complete verification by the reference standard.43 The present study has lower sensitivity and higher specificity than the French study,20 but this could be because the French study verified only 26% of people who underwent the index test (participating GPs referred five patients per GP over 2 years), or because other studies did not require participants to be symptomatic and consequently had a lower prevalence of dementia (ranging from 2% to 29%).44–47

Table 3. Summary of seven studies investigating GP judgment for the diagnosis of dementia

Implications for practice

The accuracy of clinical judgment was comparable to that of other brief cognitive tests, many of which are now subject to licensing restrictions. The test characteristics of clinical judgment would support a staged approach to subsequent testing: for example, highly sensitive tests performed in people whom GPs judge as not having dementia but where there is significant patient concern (to rule out disease), and highly specific but minimally burdensome tests performed in people whom GPs do think have dementia. This would be a change to current practice, where cognitive testing is typically done with the same tests regardless of GP judgment.
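To make this staged approach concrete, the short calculation below (a sketch using the standard odds form of Bayes' theorem, not part of the published analysis) shows how the reported likelihood ratios shift the probability of dementia in a population like this one, where the pre-test probability (prevalence) was 55%: a positive GP judgment raises it to roughly 86%, while a negative judgment only lowers it to roughly 38%, which is why confirmatory testing is needed on the rule-out side.

def post_test_probability(pre_test_prob, likelihood_ratio):
    """Convert a pre-test probability to a post-test probability via odds."""
    pre_odds = pre_test_prob / (1 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Figures taken from the study: prevalence 55%, LR+ ~5, LR- ~0.5
prevalence = 0.55
print(round(post_test_probability(prevalence, 5.0), 2))   # ~0.86 after a 'dementia' judgment
print(round(post_test_probability(prevalence, 0.5), 2))   # ~0.38 after a 'not dementia' judgment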

Notes

Funding

The Wellcome Trust (Fellowship 108804/Z/15/z, £321,248), Avon Primary Care Research Collaboration (£19,705), The Claire Wand Fund (£5,040), and the National Institute for Health Research School for Primary Care Research (£9,971). This research was funded in whole, or in part, by the Wellcome Trust [108804/Z/15/z]. For the purpose of Open Access, the author has applied a CC BY public copyright licence to any Author Accepted Manuscript version arising from this submission. For access to data, please contact the corresponding author. The Western Clinical Research Networks approved an application for service support costs to cover room hire in GP surgeries and GPs referring people to the study. YBS is supported by the NIHR Applied Research Collaboration West (NIHR ARC West). The views expressed in this article are those of the author(s) and not necessarily those of the NIHR or the Department of Health and Social Care.

Provenance

Freely submitted; externally peer reviewed.

Acknowledgements

The authors thank the participants and the staff at participating practices, without whom this work would not have been possible. The staff at the West of England Clinical Research Network arranged for redaction, collection, and transport of medical records from general practices.

Competing interests

The authors declare that no competing interests exist.


Ethical approval

The National Research Ethics Service Committee London – Bromley (reference 14/LO/2025) gave a favourable ethical opinion on 25 November 2014. NHS Research and Development approvals were granted by Avon Primary Care Research Collaboration on behalf of Bristol, North Somerset and South Gloucestershire clinical commissioning groups. The University of Bristol acted as Sponsor.

  • Received April 2, 2021.
  • Accepted June 1, 2021.
  • Copyright © 2021, The Authors

This article is Open Access: CC BY license (https://creativecommons.org/licenses/by/4.0/)

References

1. James Lind Alliance (2014) Dementia Top 10 Priorities. https://www.jla.nihr.ac.uk/priority-setting-partnerships/dementia/top-10-priorities/ (accessed 13 Aug 2021).
2. Manthorpe J, Samsi K, Campbell S, et al. (2013) From forgetfulness to dementia: clinical and commissioning implications of diagnostic experiences. Br J Gen Pract 63(606):e69–e75. doi:10.3399/bjgp13X660805.
3. Greaves I, Greaves N, Walker E, et al. (2015) Gnosall primary care memory clinic: eldercare facilitator role description and development. Dementia 14(4):389–408. doi:10.1177/1471301213497737.
4. Wells CE, Smith SJ (2017) Diagnostic care pathways in dementia: a review of the involvement of primary care in practice and innovation. J Prim Care Community Health 8(2):103–111. doi:10.1177/2150131916678715.
5. Dodd E, Cheston R, Fear T, et al. (2014) An evaluation of primary care led dementia diagnostic services in Bristol. BMC Health Serv Res 14(1):592. doi:10.1186/s12913-014-0592-3.
6. Dodd E, Cheston R, Fear T, et al. (2014) An evaluation of primary care led dementia diagnostic services in Bristol. BMC Health Serv Res 14:592. doi:10.1186/s12913-014-0592-3.
7. National Institute for Health and Care Excellence (2013) 5.1 Improving early identification, assessment and diagnosis. In: Support for commissioning dementia care [CMG48]. London: NICE, 64.
8. Iliffe S, Wilcock J, Drennan V, et al. (2015) Changing practice in dementia care in the community: developing and testing evidence-based interventions, from timely diagnosis to end of life (EVIDEM). Programme Grants Appl Res 3(3):1–596. doi:10.3310/pgfar03030.
9. Low L-F, McGrath M, Swaffer K, Brodaty H (2019) Communicating a diagnosis of dementia: a systematic mixed studies review of attitudes and practices of health practitioners. Dementia 18(7-8):2856–2905. doi:10.1177/1471301218761911.
10. Burns A (2012) The benefits of early diagnosis of dementia. BMJ 344:e3556. doi:10.1136/bmj.e3556.
11. Burns A, and 51 colleagues (2013) Alistair Burns and 51 colleagues reply to David Le Couteur and colleagues. BMJ 347:f6125. doi:10.1136/bmj.f6125.
12. Le Couteur DG, Doust J, Creasey H, Brayne C (2013) Political drive to screen for pre-dementia: not evidence based and ignores the harms of diagnosis. BMJ 347:f5125. doi:10.1136/bmj.f5125.
13. Brown J (2015) The use and misuse of short cognitive tests in the diagnosis of dementia. J Neurol Neurosurg Psychiatry 86(6):680–685. doi:10.1136/jnnp-2014-309086.
14. Davis DHJ, Creavin ST, Yip JLY, et al. (2015) Montreal Cognitive Assessment for the diagnosis of Alzheimer’s disease and other dementias. Cochrane Database Syst Rev (10):CD010775. doi:10.1002/14651858.CD010775.pub2.
15. Creavin ST, Wisniewski S, Noel-Storr AH, et al. (2016) Mini-Mental State Examination (MMSE) for the detection of dementia in clinically unevaluated people aged 65 and over in community and primary care populations. Cochrane Database Syst Rev (1):CD011145. doi:10.1002/14651858.CD011145.pub2.
16. Pentzek M, Fuchs A, Wiese B, et al. (2009) General practitioners' judgment of their elderly patients' cognitive status. J Gen Intern Med 24(12):1314–1317. doi:10.1007/s11606-009-1118-2.
17. O'Connor DW, Pollitt PA, Hyde JB, et al. (1988) Do general practitioners miss dementia in elderly patients? BMJ 297(6656):1107–1110. doi:10.1136/bmj.297.6656.1107.
18. Kaduszkiewicz H, Zimmermann T, Van den Bussche H, et al. (2010) Do general practitioners recognize mild cognitive impairment in their patients? J Nutr Health Aging 14(8):697–702. doi:10.1007/s12603-010-0038-5.
19. De Lepeleire J, Aertgeerts B, Umbach I, et al. (2004) The diagnostic value of IADL evaluation in the detection of dementia in general practice. Aging Ment Health 8(1):52–57. doi:10.1080/13607860310001613338.
20. Rondeau V, Allain H, Bakchine S, et al. (2008) General practice-based intervention for suspecting and detecting dementia in France: a cluster randomized controlled trial. Dementia 7(4):433–450. doi:10.1177/1471301208096628.
21. Creavin ST, Noel-Storr AH, Richard E, et al. (2017) Clinical judgement by primary care physicians for the diagnosis of all-cause dementia or cognitive impairment in symptomatic people. Cochrane Database Syst Rev (2):CD012558. doi:10.1002/14651858.CD012558.
22. Russell P, Banerjee S, Watt J, et al. (2013) Improving the identification of people with dementia in primary care: evaluation of the impact of primary care dementia coding guidance on identified prevalence. BMJ Open 3(12):e004023. doi:10.1136/bmjopen-2013-004023.
23. Creavin ST, Cullum SJ, Haworth J, et al. (2016) Towards improving diagnosis of memory loss in general practice: TIMeLi diagnostic test accuracy study protocol. BMC Fam Pract 17(1):79. doi:10.1186/s12875-016-0475-2.
24. Flahault A, Cadilhac M, Thomas G (2005) Sample size calculation should be performed for design accuracy in diagnostic test studies. J Clin Epidemiol 58(8):859–862. doi:10.1016/j.jclinepi.2004.12.009.
25. American Psychiatric Association (2013) Diagnostic and Statistical Manual of Mental Disorders (DSM-5). Washington, DC: American Psychiatric Association.
26. Hsieh S, Schubert S, Hoon C, et al. (2013) Validation of the Addenbrooke's Cognitive Examination III in frontotemporal dementia and Alzheimer's disease. Dement Geriatr Cogn Disord 36(3-4):242–250. doi:10.1159/000351671.
27. Adshead F, Cody DD, Pitt B (1992) BASDEC: a novel screening instrument for depression in elderly medical inpatients. BMJ 305(6850):397. doi:10.1136/bmj.305.6850.397.
28. Bucks RS, Ashworth DL, Wilcock GK, Siegfried K (1996) Assessment of activities of daily living in dementia: development of the Bristol Activities of Daily Living Scale. Age Ageing 25(2):113–120. doi:10.1093/ageing/25.2.113.
29. World Health Organization (1993) The ICD-10 Classification of Mental and Behavioural Disorders: Diagnostic Criteria for Research. Geneva: WHO.
30. Petersen RC (2004) Mild cognitive impairment as a diagnostic entity. J Intern Med 256(3):183–194. doi:10.1111/j.1365-2796.2004.01388.x.
31. Harris PA, Taylor R, Minor BL, et al. (2019) The REDCap consortium: building an international community of software platform partners. J Biomed Inform 95:103208. doi:10.1016/j.jbi.2019.103208.
32. Walters K, Hardoon S, Petersen I, et al. (2016) Predicting dementia risk in primary care: development and validation of the Dementia Risk Score using routinely collected data. BMC Med 14(1):6. doi:10.1186/s12916-016-0549-y.
33. Cohen JF, Chalumeau M, Cohen R, et al. (2015) Cochran's Q test was useful to assess heterogeneity in likelihood ratios in studies of diagnostic accuracy. J Clin Epidemiol 68(3):299–306. doi:10.1016/j.jclinepi.2014.09.005.
34. Noel-Storr AH, McCleery JM, Richard E, et al. (2014) Reporting standards for studies of diagnostic test accuracy in dementia: the STARDdem initiative. Neurology 83(4):364–373. doi:10.1212/WNL.0000000000000621.
35. Prince M, Wimo A, Guerchet M (2015) World Alzheimer Report 2015. The global impact of dementia: an analysis of prevalence, incidence, cost and trends. https://www.alzint.org/u/WorldAlzheimerReport2015.pdf (accessed 26 Jul 2021).
36. NHS Digital (2018) Patients registered at a GP practice, April 2018. https://files.digital.nhs.uk/A7/EF50EA/gp-reg-pat-prac-topic-int.pdf (accessed 26 Jul 2021).
37. Beishon LC, Batterham AP, Quinn TJ, et al. (2019) Addenbrooke's Cognitive Examination III (ACE-III) and mini-ACE for the detection of dementia and mild cognitive impairment. Cochrane Database Syst Rev (12):CD013282. doi:10.1002/14651858.CD013282.pub2.
38. Salisbury C, Procter S, Stewart K, et al. (2013) The content of general practice consultations: cross-sectional study based on video recordings. Br J Gen Pract 63(616):e751–e759. doi:10.3399/bjgp13X674431.
39. NHS Digital (2019) General Practice Workforce: selected CCG information, NHS Bristol, North Somerset and South Gloucestershire CCG. London: NHS Digital.
40. Brodaty H, Pond D, Kemp NM, et al. (2002) The GPCOG: a new screening test for dementia designed for general practice. J Am Geriatr Soc 50(3):530–534. doi:10.1046/j.1532-5415.2002.50122.x.
41. van den Dungen P, van Marwijk HWM, van der Horst HE, et al. (2012) The accuracy of family physicians' dementia diagnoses at different stages of dementia: a systematic review. Int J Geriatr Psychiatry 27(4):342–354. doi:10.1002/gps.2726.
42. Mitchell AJ, Meader N, Pentzek M (2011) Clinical recognition of dementia and cognitive impairment in primary care: a meta-analysis of physician accuracy. Acta Psychiatr Scand 124(3):165–183. doi:10.1111/j.1600-0447.2011.01730.x.
43. Valcour VG, Masaki KH, Curb JD, Blanchette PL (2000) The detection of dementia in the primary care setting. Arch Intern Med 160(19):2964–2968. doi:10.1001/archinte.160.19.2964.
44. Cooper B, Bickel H, Schäufele M (1992) The ability of general practitioners to detect dementia and cognitive impairment in their elderly patients: a study in Mannheim. Int J Geriatr Psychiatry 7(8):591–598. doi:10.1002/gps.930070809.
45. Pond CD, Mant A, Kehoe L, et al. (1994) General practitioner diagnosis of depression and dementia in the elderly: can academic detailing make a difference? Fam Pract 11(2):141–147. doi:10.1093/fampra/11.2.141.
46. De Lepeleire J, Aertgeerts B, Umbach I, et al. (2004) The diagnostic value of IADL evaluation in the detection of dementia in general practice. Aging Ment Health 8(1):52–57. doi:10.1080/13607860310001613338.
47. Kaduszkiewicz H, Zimmermann T, Van den Bussche H, et al. (2010) Do general practitioners recognize mild cognitive impairment in their patients? J Nutr Health Aging 14(8):697–702. doi:10.1007/s12603-010-0038-5.