

Role of professional networks on social media in addressing clinical questions at general practice: a cross-sectional study of general practitioners in Australia and New Zealand




Abstract

Background

Clinicians frequently have questions about patient care. However, for more than half of the questions generated, answers are never pursued, and when they are, they are often not answered satisfactorily. We aimed to characterise the clinical questions asked, and the answers provided, by general practitioners (GPs) through posts to a popular professional social media network.


Methods

In this cross-sectional study, we analysed clinical questions and answers posted between January 20th and February 10th 2018 on a popular GP-restricted (Australia, New Zealand) Facebook group. Each clinical question was categorised as a ‘background’ or ‘foreground’ question, by type (e.g. treatment, diagnosis), and by clinical topic (e.g. cardiovascular). Each answer provided in response to an included question was categorised as: (i) a short answer (e.g. agree/disagree); (ii) an answer that provided an explanation or justification; or (iii) an answer that referred to a published relevant evidence resource.


Results

Of 1060 new posts during the study period, 204 (19%) included a clinical question. GPs most commonly asked about treatment (n = 87; 43%) and diagnosis (n = 59; 29%). Five major topics (23% skin, 10% psychology, 9% cardiovascular, 8% female genital, and 7% musculoskeletal) accounted for 118 (58%) questions. Each question received on average 10 (SD = 9) answers: 42% were short; 51% provided an explanation; and only 6% referred to relevant research evidence. Only 3 answers referred to systematic reviews.


Conclusions

In this sample of Australian and New Zealand GPs who were members of a GP social media group, the clinical questions asked could be organised into a limited number of question types and topics. This might help guide the development of GP learning programs.


Background

There has been a rapid expansion of information in health care over the last few decades [1]. Keeping up with this information overload is becoming harder, if not impossible [2, 3]. An information paradox exists: despite being overwhelmed by this huge amount of information, clinicians frequently face personal knowledge gaps, ask clinical questions about patient care, and have many unanswered questions [1, 4].

A systematic review of clinical questions raised by clinicians showed that clinicians ask about 1 question for every 2 patients [5]. However, for more than half of the questions generated, answers are never pursued, and when they are, they are often not answered satisfactorily [5, 6], suggesting missed opportunities for continuous learning. Lack of time and clinicians’ doubt about the existence and usefulness of relevant answers are the most commonly reported barriers to pursuing answers to clinical questions [5, 7]. Thus, addressing clinicians’ personal knowledge gaps provides an opportunity for continuing learning and enhanced patient care. This is especially important for general practitioners (GPs), whose information needs are much broader than those of other specialties because of the wider spectrum of clinical problems encountered [8]. Consulting colleagues to answer clinical questions is one of the most common strategies that clinicians adopt to cope with information overload [1, 2]. Clinicians are increasingly using social media to communicate and network with colleagues, share information, and disseminate research findings [9]. Thus, understanding clinicians’ use of social media networks to overcome information overload and address clinical questions generated from patient care is warranted. We aimed to characterise the clinical questions asked, and the answers provided, by general practitioners posting to a popular professional social media network.


Methods

In this cross-sectional study, we analysed all clinical questions posted on a popular GP-restricted Facebook group, ‘GPs Down Under’ (GPDU), between January 20th and February 10th 2018. GPDU is a GP community-led, closed professional Facebook group restricted to GPs practising in Australia and New Zealand. It has over 5800 members and generates over 50 posts per day.

The criteria for GPDU group membership include being a GP or a GP registrar, working in general practice, and being registered to practise in Australia and/or New Zealand. A three-step verification procedure is used. Two of the co-authors (KM and KP) were co-developers and are administrators of the GPDU Facebook group.

Two of the co-authors (KM and KP) extracted all data for the posts made during the study period, including each original post and all subsequent comments and replies. One of the authors, who is also a member of GPDU (PG), de-identified the data to create an anonymised dataset for screening and analysis.

We screened all posts made to the group during the study period to identify those that included a clinical question, as defined by Ely et al. [10]: ‘questions about medical knowledge that could potentially be answered by general sources such as textbooks and journals, not questions about patient data that would be answered by the medical record’. These clinical question posts are the focus of this analysis. We categorised each included question as a ‘background’ question (e.g. What is myocardial infarction?) or a ‘foreground’ question (e.g. In adult patients with myocardial infarction, does aspirin increase survival?). We also classified the type of each question (e.g. treatment, diagnosis) per the taxonomy used by Ely et al. [10], and the clinical topic of each question according to the revised version of the International Classification of Primary Care (ICPC-2) [11]. ICPC is a coding system co-developed and endorsed by the World Organization of Family Doctors to allow for more appropriate classification of data frequently encountered in primary care settings [12, 13]. We screened all comments for answers provided in response to each question and classified each answer as: (i) a short answer (e.g. yes/no or agree/disagree); (ii) an answer that provided an explanation (e.g. justified the answer or gave supporting clinical examples); or (iii) an answer that referred to a published relevant evidence resource (e.g. provided a website link to a research article or guideline). Three of the authors (LA, TH, and PG) independently analysed a random sample of 5% of posts and discussed their coding until consensus was reached. LA then coded the questions and answers of the remaining included posts. Any uncertainties in coding decisions were resolved by a co-author with extensive experience in primary care (PG).

The study was approved by Bond University Human Research Ethics Committee. Group members were informed that all new posts during the study period would be anonymously used for research purposes without breaching members’ privacy.


Results

During the study period, 504 GPs contributed a total of 1060 new posts, of which 204 (19%) included a clinical question. Of these 204 included questions, 174 (85%) were foreground and 30 (15%) were background questions. The characteristics of the clinical questions posted to the GPDU group are presented in Table 1. Overall, most questions asked (165; 81%) concerned around 14 (30%) of the 42 identified question types: 87 (43%) were about treatment, followed by 59 (29%) about diagnosis. The most frequently asked question types were: (i) 34 (17%) questions about the efficacy or indication of a treatment (e.g. Does procedure/treatment x work for condition y?); (ii) 28 (14%) questions about the management (i.e. diagnostic or therapeutic) of a condition or finding (e.g. How should I manage condition/finding/situation y?); and (iii) 23 (11%) questions about the cause or interpretation of unspecified multiple findings (e.g. What does this patient have given these findings?). Table 2 lists the 10 most frequently asked clinical question types, with examples from the included questions.

Table 1 Characteristics of clinical questions posted to the GPDU group (n = 204)
Table 2 The most frequently asked clinical question types, with examples from the included questions

The clinical question topics were fairly evenly distributed across clinical topics, reflecting the range of patients seen by GPs. Nonetheless, over half of all included clinical questions (n = 118; 58%) concerned five major clinical topics. The five most frequently addressed topics were skin (n = 47; 23%; 11 about a skin neoplasm/lesion and 9 related to a ‘rash’), mental health (n = 20; 10%), cardiovascular (n = 19; 9%), women’s health (n = 17; 8%), and musculoskeletal (n = 15; 7%). Table 3 shows the distribution of clinical questions across the clinical topics.

Table 3 The distribution of clinical questions across clinical topics per ICPC-2 classification system

The 204 included questions elicited 4065 comments, with a mean of 20 (SD 19) comments per included question (this count includes all comments posted in reply to a clinical question, whether or not they provided an answer). GPDU members commented on, and provided answers for, all 204 included questions. On average, 10 (SD 9) of the 20 (SD 19) comments were answers to the posted question; the remaining comments did not answer the clinical question posed in the post. On average, 42% (SD 27%) of these answers were short answers; 51% (SD 27%) provided an explanation or justification; and 6% (SD 11%) referred to a published relevant evidence resource. Only three answers referred to evidence derived from systematic reviews (Table 1).
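The answer-category percentages above are per-question proportions averaged across questions (hence a mean and SD for each category). A minimal sketch of that computation, using hypothetical answer counts rather than the study data:

```python
import statistics

# Hypothetical coded data: for each question, the number of answers in
# each category (short / explained / referenced evidence).
# These counts are illustrative only, not the study dataset.
questions = [
    {"short": 4, "explained": 5, "referenced": 1},
    {"short": 2, "explained": 6, "referenced": 0},
    {"short": 7, "explained": 3, "referenced": 1},
]

def per_question_share(question, category):
    """Share of one question's answers that fall in the given category."""
    total = sum(question.values())
    return question[category] / total

# Average each category's per-question shares across all questions.
for category in ("short", "explained", "referenced"):
    shares = [per_question_share(q, category) for q in questions]
    print(f"{category}: mean {statistics.mean(shares):.0%}, "
          f"SD {statistics.stdev(shares):.0%}")
```

Averaging per-question shares (rather than pooling all answers) weights each question equally regardless of how many answers it attracted.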

Figure 1 shows the engagement of GPDU members in asking and answering clinical questions by time of day and day of the week. Engagement peaked in the mornings (9 am) and on Thursdays, with a decline in activity in the late afternoon (4-5 pm) and on weekends.

Fig. 1

The activity of GPDU members in posting clinical questions (solid line) and comments (dashed line) per time (panel a) and day (panel b)
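The day-and-time tallies behind Figure 1 can be reproduced, in outline, by bucketing post timestamps by hour of day and by weekday. A minimal sketch with illustrative timestamps (not the study data):

```python
from collections import Counter
from datetime import datetime

# Hypothetical post timestamps; the study's raw data are not public.
timestamps = [
    "2018-01-25 09:10", "2018-01-25 09:45", "2018-01-25 16:30",
    "2018-02-01 09:05", "2018-02-03 11:00",
]
parsed = [datetime.strptime(t, "%Y-%m-%d %H:%M") for t in timestamps]

# Tally activity by hour of day and by weekday name.
by_hour = Counter(dt.hour for dt in parsed)
by_day = Counter(dt.strftime("%A") for dt in parsed)

print("busiest hour:", by_hour.most_common(1))
print("busiest day:", by_day.most_common(1))
```

Plotting the two Counters as line charts gives panels equivalent to Figure 1's time-of-day and day-of-week views.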


Discussion

In this study of GPs’ use of a social media network to answer their clinical questions, GPs posted approximately 10 questions per day. The majority of questions asked were about treatment and diagnosis, and more than half of all included clinical questions concerned a small number of clinical topics.

Our results regarding question types are consistent with a systematic review of 11 studies, which examined 7012 clinical questions raised by clinicians (mostly GPs) at the point of care and found that the majority of clinical questions concerned treatment (34%) and diagnosis (24%), with 30% of the question types accounting for 80% of the questions asked [5]. Similarly, treatment and diagnosis were the most frequently observed types of clinical questions by Allan et al. (who observed 38 GPs during 420 consultations) [14] and Green et al. (who interviewed 64 residents after 401 consultations) [15].

Despite the wide spectrum of clinical presentations seen by GPs in practice, we found that most clinical questions concerned a handful of clinical topics. This is consistent with previous studies of the most frequently asked clinical question topics [14, 16] and of the most commonly managed conditions in general practice settings [11]. For instance, Bjerre et al. analysed 1871 questions asked by 88 Canadian GPs and found that musculoskeletal, skin, and cardiac were among the five most frequently asked question topics [17].

In this sample of GPs, evidence-based resources (e.g. systematic reviews) were infrequently used to support answers to the posted clinical questions. This aligns with the findings of a systematic review of 19 studies describing the information-seeking behaviour of clinicians, which found that evidence-based resources were rarely used by clinicians as a primary source of information to guide their decisions [18, 19].

A limitation of this study is that we focused on questions that GPs pursued, articulated, and posted to find an answer (i.e. known unknowns); we likely missed their unpursued recognised questions as well as their unrecognised questions (i.e. unknown unknowns). Direct observation studies and post-consultation interviews may better capture the information needs of clinicians at the point of care (being less susceptible to memory bias), although these methods might generate superfluous questions from clinicians because they are being observed or interviewed [7, 16]; nor would they be useful in investigating the role of social media networks in addressing clinicians’ clinical questions. Another limitation is that screening and coding of the posts were performed by one author, with three authors independently coding only a random sample of 5% of posts. Further, we analysed questions posted in a single restricted Facebook group by GPs who are likely to be active social media users (504 of 5800 GPDU members); therefore, our findings may not generalise to GPs who are less active on social media, who use other social media platforms, or who do not use social media at all. We also did not verify the validity of the provided answers or the evidence used to support them. Thus, answers that referred to sources of evidence might not be accurate or correct, and answers that did not cite a source of evidence might nonetheless have been evidence-based or correct (i.e. the lack of reference to evidence sources does not necessarily mean that these answers were not evidence-based).

Our finding that the majority of questions asked concerned a limited number of question types and topics suggests that questions raised on social media networks may help guide the development of future GP continuing-learning programs (e.g. tailored to identified information needs) and research activities (e.g. by identifying research-practice evidence gaps) [20]. Although professional social media networks might be useful in providing evidence-based answers to clinical questions, the quality of the evidence underpinning the answers provided on social media should be questioned. Disadvantages of using a social media network to answer clinical questions might include: (i) GP members being responsible for discerning relevant answers and ascertaining the validity of the answers provided; and (ii) the possibility of delivering and perpetuating unsound answers to a large group of GPs. Therefore, methods to enhance the active dissemination of question-specific evidence-based information (such as by Facebook group administrators or evidence champions) are warranted [21].


Conclusions

In this sample of Australian and New Zealand GPs who were members of a GP social media group, the majority of clinical questions asked concerned a limited number of question types and topics, which may help inform the development of future GP continuing-learning programs and research activities. The validity of the evidence underpinning the answers provided to clinical questions asked on social media needs to be considered.



Abbreviations

GP: General Practitioner

GPDU: General Practitioners Down Under

ICPC: International Classification of Primary Care

SD: Standard Deviation


References

1. Smith R. Strategies for coping with information overload. BMJ. 2010;341:c7126.

2. Smith R. What clinical information do doctors need? BMJ. 1996;313(7064):1062–8.

3. Bastian H, Glasziou P, Chalmers I. Seventy-five trials and eleven systematic reviews a day: how will we ever keep up? PLoS Med. 2010;7(9):e1000326.

4. Gorman PN, Helfand M. Information seeking in primary care: how physicians choose which clinical questions to pursue and which to leave unanswered. Med Decis Mak. 1995;15(2):113–9.

5. Del Fiol G, Workman TE, Gorman PN. Clinical questions raised by clinicians at the point of care: a systematic review. JAMA Intern Med. 2014;174(5):710–8.

6. Covell DG, Uman GC, Manning PR. Information needs in office practice: are they being met? Ann Intern Med. 1985;103(4):596–9.

7. Ely JW, Osheroff JA, Chambliss ML, Ebell MH, Rosenbaum ME. Answering physicians' clinical questions: obstacles and potential solutions. J Am Med Inform Assoc. 2005;12(2):217–24.

8. Bennett NL, Casebeer LL, Kristofco R, Collins BC. Family physicians' information seeking behaviors: a survey comparison with other specialties. BMC Med Inform Decis Mak. 2005;5:9.

9. von Muhlen M, Ohno-Machado L. Reviewing social media use by clinicians. J Am Med Inform Assoc. 2012;19(5):777–81.

10. Ely JW, Osheroff JA, Gorman PN, Ebell MH, Chambliss ML, Pifer EA, Stavri PZ. A taxonomy of generic clinical questions: classification study. BMJ. 2000;321(7258):429–32.

11. Cooke G, Valenti L, Glasziou P, Britt H. Common general practice presentations and publication frequency. Aust Fam Physician. 2013;42(1–2):65–8.

12. Hofmans-Okkes IM, Lamberts H. The international classification of primary care (ICPC): new applications in research and computer-based patient records in family practice. Fam Pract. 1996;13(3):294–302.

13. World Organisation of Family Doctors (WONCA). ICPC-2: International Classification of Primary Care. Oxford: Oxford University Press; 1997.

14. Allan GM, Ma V, Aaron S, Vandermeer B, Manca D, Korownyk C. Residents' clinical questions: how are they answered and are the answers helpful? Can Fam Physician. 2012;58(6):e344–51.

15. Green ML, Ciampi MA, Ellis PJ. Residents' medical information needs in clinic: are they being met? Am J Med. 2000;109(3):218–23.

16. Gonzalez-Gonzalez AI, Dawes M, Sanchez-Mateos J, Riesgo-Fuertes R, Escortell-Mayor E, Sanz-Cuesta T, Hernandez-Fernandez T. Information needs and information-seeking behavior of primary care physicians. Ann Fam Med. 2007;5(4):345–52.

17. Bjerre LM, Paterson NR, McGowan J, Hogg W, Campbell CM, Viner G, Archibald D. What do primary care practitioners want to know? A content analysis of questions asked at the point of care. J Contin Educ Health Prof. 2013;33(4):224–34.

18. Dawes M, Sampson U. Knowledge management in clinical practice: a systematic review of information seeking behavior in physicians. Int J Med Inform. 2003;71(1):9–15.

19. Alper BS, White DS, Ge B. Physicians answer more clinical questions and change clinical decisions more often with synthesized evidence: a randomized trial in primary care. Ann Fam Med. 2005;3(6):507–13.

20. Albarqouni L, Hoffmann T, Straus S, et al. Core competencies in evidence-based practice for health professionals: consensus statement based on a systematic review and Delphi survey. JAMA Netw Open. 2018;1(2):e180281.

21. Cheston CC, Flickinger TE, Chisolm MS. Social media use in medical education: a systematic review. Acad Med. 2013;88(6):893–901.



Acknowledgements

We thank all the members of the GPDU Facebook group for their participation.


Funding

L. Albarqouni is supported by an Australian Government Research Training Program Scholarship. The funders had no role in the design and conduct of the study, data collection, and analysis or in article preparation and approval.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

Author information

LA, TH, and PG conceived the research idea. All authors contributed to the design of the study. LA drafted the original manuscript. All authors contributed to the revision of the paper and approved the final manuscript for submission.

Correspondence to Loai Albarqouni.

Ethics declarations

Ethics approval and consent to participate

The study was approved by Bond University Human Research Ethics Committee. Individual consent from each GPDU Facebook member was waived by the ethics committee. Instead, group members were informed that all new posts during the study period would be anonymously used for research purposes without breaching members’ privacy.

Consent for publication

Not applicable.

Competing interests

Karen Price and Katrina McLean are administrators of the GPDU Facebook group. The authors declare that they have no other competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.




Keywords

  • Information needs
  • Clinical questions
  • Social media networks
  • Evidence-based practice