

Using hospital discharge data for injury research or surveillance? An observational study illustrating the impact of administrative change
Gabrielle Davie1, Dave Barson1, Jean C Simpson1, Rebbecca Lilley1, Pauline Gulliver2, Colin Cryer1

1 Injury Prevention Research Unit, Department of Preventive and Social Medicine, University of Otago, Dunedin, New Zealand
2 Section of Social and Community Health, University of Auckland, Auckland, New Zealand

Correspondence to Gabrielle Davie, IPRU, Department of Preventive and Social Medicine, Dunedin School of Medicine, University of Otago, Dunedin 9054, New Zealand; gabrielle.davie{at}otago.ac.nz

Abstract

Introduction Hospital discharge data provide an important basis for determining priorities for injury prevention and monitoring trends in incidence. This study aims to illustrate the impact of a recent change in administrative practice on estimates of hospitalised injury incidence and to investigate the extent to which different case selection affects trends in injury incidence rates.

Methods New Zealand (NZ) hospital discharges (2000–2014) with a primary diagnosis of injury were identified. Additional case selection criteria included first admissions only, and for serious injury, a high threat-to-life estimate. Comparisons were made, over time and by District Health Board, between hospitalised injury incidence estimates that included, or not, short-stay emergency department (SSED) discharges.

Results Of the 1 229 772 injury hospital discharges, 365 114 were SSED; 16% of the annual total in 2000, 38% in 2014. Identification of readmissions prior to the exclusion of SSED discharges resulted in 30 724 cases being erroneously removed. Age-standardised rates of hospitalised injury over the 15-year period increased by, on average, 2.7% per year when SSED discharges were included; there was minimal secular change (−0.2%) when SSEDs were excluded. For serious hospitalised injury, the annual increase was 2.3% when SSED was included compared with 1.1% when SSEDs were excluded.

Conclusion Spurious trends in hospitalised injury incidence can result when administrative practices are not appropriately accounted for. Exclusion of SSED discharges before the identification of readmissions and the use of a severity threshold are recommended to minimise the reporting bias in NZ hospitalised injury incidence estimates.

  • injury incidence
  • trends
  • hospital discharge data
  • bias


Introduction

Hospital discharge data provide an important basis for determining priorities for prevention, identifying emerging issues and monitoring trends in incidence worldwide. Before hospital discharge data are used for these purposes, processing is advocated to ensure that cases are selected in a way that sufficiently minimises reporting biases.1–3 For injury, there is a considerable body of work illustrating the variation in estimates obtained when different inclusion and exclusion criteria are used to identify cases.4–9 Substantial changes to the administrative systems used to collect and collate hospital discharge data can unduly affect injury incidence estimates, depending on the case selection criteria used. Restricting cases to those with a primary diagnosis of injury, retaining first admissions only and using an objective severity measure (rather than length of stay) have been advocated to minimise the impact of reporting biases on injury estimates.10 11

An illustrative example of the impact of administrative changes, and of the importance of regularly reviewing inclusion and exclusion criteria, is provided by recent changes in reporting practices associated with New Zealand (NZ) hospital discharge data collection. Historically, short-stay emergency department (SSED) events, in which a patient is treated in ED for 3 hours or more and discharged alive without being admitted to hospital as an inpatient, were not included in NZ’s National Minimum Dataset (NMDS) of hospital discharges.12 13 In 1999, admission practices began to vary considerably by District Health Board (DHB) and, subsequently, the reporting and capture of ED events in the NMDS steadily increased. Despite NZ’s Ministry of Health (MoH) mandating the reporting of SSED events in 2007, consistent reporting was not considered to have been achieved until July 2012.12 In a factsheet published in 2015, the MoH recommended ‘excluding these events from analyses of hospital data for the years prior to 2012/13’.14

One of the criteria recommended for identifying cases for inclusion in injury incidence estimates is that ‘first admissions only’ are retained, that is, subsequent admissions for the same injury event (readmissions) are excluded. NZ is in the fortunate position of having a person-level identifier (National Health Index) and date of injury included in hospital discharge data, enabling the identification and exclusion of readmissions.15 To our knowledge, no recommendations exist on whether readmissions should be identified before or after SSED discharges are excluded. Similarly, the impact of the ordering of these case selection criteria on injury incidence estimates is unknown. At a theoretical level, a number of scenarios exist in which, depending on when readmissions are identified, an injury event would or would not be included. One example is a patient being treated in ED, discharged and then admitted to a hospital ward a few days later; another is a patient being treated in ED and then transferred to a ward in another hospital.

The aims of this study are to illustrate the impact of a recent change in NZ administrative practices on estimates of hospitalised injury incidence and to investigate the extent to which different case selection criteria affect trends in the rate of hospitalised injury incidence. The hypothesis was that, depending on how SSED events are managed, estimates of hospitalised injury incidence obtained from NZ hospital discharge data will vary considerably over time and between DHBs.

Methods

For illustrative purposes, two operational definitions of injury incidence were used to assess the impact of the inclusion of SSED discharges. The first definition included all hospital discharges reported to the NMDS with a primary diagnosis of injury that were not identified as readmissions; estimates obtained using this definition are referred to as ‘hospitalised injury incidence’. The second definition, used to obtain ‘serious hospitalised injury incidence’ estimates, further restricted the cases identified under the first definition to those considered serious, using a threat-to-life measure (further details below).

Hospital discharges with a primary diagnosis of injury (International Classification of Diseases, 10th Revision (ICD-10) codes S00-T78) and a date of discharge within 2000–2014 were extracted from the NMDS. The starting year was chosen to align with NZ’s adoption of ICD-10.
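For readers preparing similar extracts, the selection step can be expressed as a simple filter over discharge records. The sketch below is illustrative only, not the authors’ extraction code: it assumes a hypothetical pandas DataFrame nmds with columns prim_diag (primary diagnosis ICD-10 code) and discharge_date, which are not the NMDS field names.

    import pandas as pd

    def select_injury_discharges(nmds: pd.DataFrame) -> pd.DataFrame:
        # Keep discharges with a primary injury diagnosis (ICD-10 S00-T78)
        # and a discharge date in 2000-2014.  Column names are assumptions.
        dates = pd.to_datetime(nmds["discharge_date"])
        in_period = dates.between(pd.Timestamp("2000-01-01"),
                                  pd.Timestamp("2014-12-31"))
        code = nmds["prim_diag"].str.upper()
        block = pd.to_numeric(code.str[1:3], errors="coerce")  # two-digit block of the code
        is_injury = code.str.startswith("S") | (code.str.startswith("T") & (block <= 78))
        return nmds[in_period & is_injury].copy()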

Readmissions to hospital for subsequent treatment of a previously hospitalised injury were identified using unique person identifiers and the dates of injury, admission and discharge, using a method described previously.15 The goal of removing readmissions is to avoid counting the same injury event more than once without removing incident cases.
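As a simplified illustration of the ‘first admissions only’ rule (the full method is described in reference 15 and uses additional date logic not reproduced here), the sketch below flags the earliest admission for each person and injury date as the incident event and all later discharges as readmissions. The column names (nhi, injury_date, admission_date) are assumptions made for illustration.

    def flag_readmissions(df: pd.DataFrame) -> pd.DataFrame:
        # Within each person/injury-date group, treat the earliest admission
        # as the incident (first) admission and flag later discharges as
        # readmissions.  Simplified relative to the published method.
        df = df.sort_values(["nhi", "injury_date", "admission_date"])
        first = ~df.duplicated(subset=["nhi", "injury_date"], keep="first")
        return df.assign(readmission=~first)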

Serious injury events were identified as either: (1) discharge events in which the patient was discharged dead from the first admission or any subsequent readmission or (2) discharge events in which the patient was discharged alive but had an ICD-based injury severity score (ICISS) ≤0.931 (ie, the combined injuries gave an estimated threat to life of 6.9% or more).12 16
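ICISS is the product of the diagnosis-specific survival risk ratios (SRRs) for all injury diagnoses recorded against an event, so the severity criterion can be sketched as below. The SRR lookup table, the column names (diag_codes, died) and the treatment of unknown codes are all assumptions made for illustration, not the authors’ implementation.

    def flag_serious(df: pd.DataFrame, srr: dict) -> pd.DataFrame:
        # ICISS = product of survival risk ratios over all injury diagnoses;
        # an event is 'serious' if the patient died or ICISS <= 0.931
        # (estimated threat to life of 6.9% or more).
        def iciss(codes):
            survival = 1.0
            for c in codes:
                survival *= srr.get(c, 1.0)  # unknown codes treated as non-contributory
            return survival
        scores = df["diag_codes"].apply(iciss)
        return df.assign(iciss=scores, serious=df["died"] | (scores <= 0.931))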

SSED discharges were identified as those discharge events with a Health Specialty (a classification describing the specialty or service to which a healthcare user has been assigned, reflecting the nature of the services provided) in the range M05-M08 (discharge from emergency medical services) and a stay of no more than one (mid)night in hospital.14 As the concern was that identifying readmissions prior to excluding SSED discharges might remove incident cases, two readmission indicators were created so that the resulting datasets could be compared. The first indicator assigned readmission status (first admission or not) prior to the exclusion of SSED discharges, while the second assigned readmission status after their exclusion.
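The SSED flag itself is a two-part test, sketched below under the same hedged assumptions about column names (health_specialty, nights_in_hospital). The two readmission indicators described above then correspond to applying a readmission-flagging step either before or after rows with ssed equal to True are dropped.

    def flag_ssed(df: pd.DataFrame) -> pd.DataFrame:
        # MoH SSED rule as applied here: discharged from an emergency
        # medical specialty (M05-M08) with no more than one (mid)night in
        # hospital.  Column names are assumptions.
        em_specialty = df["health_specialty"].isin(["M05", "M06", "M07", "M08"])
        short_stay = df["nights_in_hospital"] <= 1
        return df.assign(ssed=em_specialty & short_stay)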

DHB was obtained directly from the NMDS. The DHB of the facility submitting the discharge data was used to compare reporting practices, and the DHB of the patient’s usual residence was used to compare rates per 100 000 person-years.

Annual estimated resident populations for 2000–2014 by DHB and 5-year age group were obtained from Statistics NZ. Direct age-standardised rates (ASRs) and corresponding 95% CIs by DHB were calculated using Stata’s dstdize command. The average percentage increase in annual ASRs was calculated by using regression to estimate the average change over the 15-year period and then expressing this as an average annual increase relative to 2000. StataSE V.14.2 was used for the analysis.17
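The analysis used Stata’s dstdize command; for readers working in other environments, the sketch below shows the equivalent direct standardisation arithmetic, and one plausible reading of the trend summary, in Python (NumPy). It is not a reproduction of the authors’ code, and the inputs are assumed to be arrays aligned by 5-year age group.

    import numpy as np

    def direct_asr(counts, person_years, std_pop):
        # Direct age standardisation: weight the age-specific rates by the
        # standard population's age distribution; returned per 100 000
        # person-years.
        counts = np.asarray(counts, dtype=float)
        person_years = np.asarray(person_years, dtype=float)
        weights = np.asarray(std_pop, dtype=float) / np.sum(std_pop)
        return 100_000 * np.sum(weights * counts / person_years)

    def average_annual_increase(asrs):
        # One plausible reading of the trend summary: fit a linear regression
        # of ASR on year and express the slope (average annual change) as a
        # percentage of the first year's ASR.
        asrs = np.asarray(asrs, dtype=float)
        years = np.arange(len(asrs))
        slope, _ = np.polyfit(years, asrs, 1)
        return 100 * slope / asrs[0]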

Results

For 2000–2014, there was a total of 1 229 772 hospital discharges with a primary diagnosis of injury reported to the NMDS, approximately 80 000 per year for an average population of 4.2 million. As a first step in understanding the magnitude of the administrative change, the relative contribution of SSED discharges to non-SSED discharges over time was calculated. Of the 1.2 million injury discharges, 367 875 had a health specialty in the range M05-M08, thus satisfying one of the two components of the MoH’s SSED definition. Only 2260 (0.7%) of these ‘M05-M08’ discharges had two or more (mid)nights in hospital, resulting in n=365 114 (30% of 1 229 772) being identified as SSED (table 1). The increasing contribution of SSED events to the NMDS over time is dramatic (table 1). Between 2000 and 2014, there was close to a fourfold increase (from 9804 to 37 851) in the number of annual SSED events compared with a 19% increase (from 52 599 to 62 749) in the number of non-SSED events. The ratio of SSED events to non-SSED events increased steadily from 0.19 in 2000 to a peak of 0.61 in 2013.

Table 1

The frequency of short-stay emergency department (SSED) injury discharges and the ratio of SSED injury discharges to non-SSED injury discharges from 2000 to 2014 in New Zealand’s National Minimum Dataset of hospital discharges

Variation in patterns of reporting of SSED injury discharges by NZ’s 20 DHBs over the 15-year period is also dramatic. Examining injury discharges by DHB and year indicates that no SSED discharges were reported on 50 occasions (17% of 300 DHB-years). At the other end of the spectrum, on 24 occasions (8%), DHBs reported more SSED injury discharges than non-SSED injury discharges, with this ratio reaching as high as 4:1. As shown in figure 1, the ratios of SSED injury discharges to non-SSED injury discharges for six DHBs (chosen for illustrative purposes) indicate that at least one DHB was reporting a large proportion of SSED events prior to 2000 (labelled ‘Pre-2000’), whereas another reported almost no SSED events over the 15-year period (labelled ‘Minimal’). A steady increase over time in the ratio of SSED to non-SSED injury discharges was apparent for one DHB (labelled ‘Steady’), whereas three DHBs showed a sudden uptake in the reporting of SSED events, each at a different time point (labelled ‘Sharp 2004’, ‘Sharp 2007’, ‘Sharp 2010’). Of the 20 DHBs, five could be qualitatively classified as ‘Pre-2000’, four as ‘Steady’ and one as ‘Minimal’, with the remaining ten exhibiting ‘Sharp’ increases at different years from 2004 to 2012.

Figure 1

The ratio of short-stay emergency department (SSED) injury discharges to non-SSED injury discharges reported to New Zealand’s National Minimum Dataset of hospital discharges for six District Health Boards (DHB) chosen for illustrative purposes.

The variation in the inclusion of injury SSED discharges over time and by DHB confirms the need to abide by the MoH’s recommendation to exclude SSED discharges from analyses of NZ hospital discharge data. As part of the process of creating analytical data from administrative data, an investigation was undertaken to determine whether readmissions should be identified before or after SSED discharges are excluded. Options for when readmissions could be identified during this process are presented in figure 2.

Figure 2

From administrative data to analytical data fit for injury incidence research: options for the timing of the identification of readmissions in New Zealand’s hospital discharge data. ED, emergency department; SSED, short-stay emergency department.

Under option 1, 101 357 discharges (8.2% of total injury discharges) were identified as readmissions; of these, 6161 were also SSED, leaving 95 196 non-SSED readmissions to be removed (table 2). When readmissions were identified after the exclusion of SSED discharges (option 2), 64 472 were identified. As the 769 462 cases retained under option 1 were a complete subset of those retained under option 2, the additional 30 724 cases retained under option 2 were examined.

Table 2

Estimating hospitalised injury incidence: number of injury cases retained for two options of case selection ordering using New Zealand hospital discharge data (2000–2014)

Through communication with expert MoH staff and manual review of discharge data grouped at the patient level, a number of scenarios were identified in which a hospitalised incident injury event was erroneously removed from analysis under option 1 (online supplementary table 1). If readmissions are identified prior to excluding SSED cases (option 1), an injury event will not be retained for analysis when the first discharge associated with that event is an SSED discharge. Readmissions should therefore be identified after excluding SSED discharges (option 2).
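To make the ordering effect concrete, the toy example below reuses the illustrative flag_ssed and flag_readmissions sketches from the Methods. It constructs a single injury event consisting of an SSED attendance followed three days later by an inpatient admission; under option 1 the event is lost entirely, whereas under option 2 the inpatient admission is retained as the incident case. All identifiers and codes are hypothetical.

    toy = pd.DataFrame({
        "nhi":                ["ABC1234", "ABC1234"],
        "injury_date":        ["2010-06-01", "2010-06-01"],
        "admission_date":     ["2010-06-01", "2010-06-04"],
        "health_specialty":   ["M05", "SXX"],   # ED attendance, then a hypothetical non-ED specialty
        "nights_in_hospital": [0, 5],
    })
    toy = flag_ssed(toy)

    # Option 1: readmissions flagged before SSED exclusion.  The ward
    # admission is flagged as a readmission, so removing readmissions and
    # then SSED discharges leaves no record of the injury event.
    opt1 = flag_readmissions(toy)
    opt1 = opt1[~opt1["readmission"] & ~opt1["ssed"]]   # 0 rows retained

    # Option 2: SSED discharges excluded first.  The ward admission becomes
    # the first admission and the event is correctly retained.
    opt2 = flag_readmissions(toy[~toy["ssed"]])
    opt2 = opt2[~opt2["readmission"]]                   # 1 row retained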

The age-standardised hospitalised injury incidence rates per 100 000 person-years calculated using option 2 varied from a high of 1284 in 2001 to a low of 1224 in 2004 (figure 3, online supplementary table 2). The consistency of these ASRs over the 15-year period is in stark contrast to the ASRs based on numerators that include SSED discharges, again emphasising the need to exclude SSED. In comparison, the trends over time in the two series of ASRs for serious hospitalised injury incidence were more similar. Over the 15-year period, the ASR based on numerators including SSED increased by, on average, 2.7% per year compared with minimal change (−0.2%) when SSED discharges were excluded. For serious hospitalised injury, the average annual increase was 2.3% when SSED discharges were included compared with 1.1% when they were excluded.

Figure 3

Age-standardised rates (ASRs) by year (2000–2014) calculated using four methods of estimating injury incidence from New Zealand hospital discharge data. SSED, short-stay emergency department.

At the DHB level, ASRs calculated using the different selection criteria also vary considerably (table 3). To help illustrate this variation, for each method of estimating injury incidence, the DHBs were ranked from lowest to highest based on their ASRs. Variation in ranking is marked: for example, West Coast’s rank varies from 8th to 18th and South Canterbury’s from 3rd to 15th. Interestingly, Capital and Coast DHB consistently has the lowest ASR irrespective of the case selection criteria used.

Table 3

Frequencies and age-standardised rates (ASRs) by District Health Boards (DHBs) using four methods of identifying injury incidence from NZ hospital discharge data (2000–2014)

For hospitalised injury incidence, Waitemata, the DHB with the largest estimated resident population, had an average ASR over the 15-year period of 2049 per 100 000 when SSED were included compared with 1225 per 100 000 when SSED were excluded. This change in ASR resulted in Waitemata substantially improving its ASR ranking from 16th of 20 DHBs to 8th. More dramatic was the drop in hospitalised injury incidence ASR ranking for South Canterbury DHB from 3rd to 15th when SSEDs were excluded. Auckland and West Coast DHBs also had substantial changes in rank of hospitalised injury incidence when SSEDs were excluded; Auckland’s rank improved from 14th to 3rd, whereas the West Coast’s rank dropped from 8th to 18th.

Changes in rank between ASRs for serious hospitalised injury incidence calculated using numerators including and then excluding SSEDs were, on average, less dramatic, with 65% (n=13) of DHBs changing rank by no more than two places. The two DHBs with the greatest change in rank were South Canterbury (dropped from 4th to 13th) and Waitemata (improved from 12th to 4th).

Discussion

Given the obvious variation in the inclusion of SSED discharges over time and by DHB, users of any NZ hospital discharge data extract covering 2000 onwards clearly need to consider the impact of SSED reporting on their work and exclude SSED cases where necessary. The quadrupling of injury SSED discharges reported to the NMDS between 2000 and 2014 substantially altered the ratio of SSED events to non-SSED events for injury discharges, from 1:5 to 3:5 over the 15-year period. Marked differences in trends over time were also apparent in the age-standardised hospitalised injury incidence rates by year, depending on whether SSED discharges were included or not. These injury-specific findings endorse the MoH’s recommendation to exclude SSED from analyses of NZ hospital discharge data and emphasise that this is critical when calculating hospitalised injury incidence.18

Building on the MoH’s factsheet, which stated that excluding SSED events had ‘different effects on the data depending on the cause of the hospitalisation’, this study provides the first comprehensive analysis of the impact of SSED events on injury incidence estimates.14 One of our findings is that, for injury, identifying readmissions prior to excluding SSED events results in discharges being erroneously removed. The MoH factsheet released in June 2015 was welcome but potentially too late to avoid the reporting of spurious findings, especially given that variation in reporting by DHB is apparent as early as the turn of the millennium. With the current emphasis on the use, including linkage, of secondary data, it is critical that all those involved in the acquisition, collection and provision of administrative data are cognisant that changes may jeopardise the validity of research based on such data. Informative and timely communication of changes made at any stage of administrative data collection, and of their likely impact on analyses using the resulting data, is thus essential.

Trends observed in hospitalised injury incidence estimates that have not excluded SSED discharges will be affected by reporting bias. This is a concern because, following international trends, NZ’s national hospital discharge data are increasingly being linked with additional administrative datasets maintained by NZ’s MoH and other government agencies. Such linkage enables novel research opportunities but also increases the likelihood that, without appropriate action, including data cleaning, conclusions drawn from data not collected for research purposes will be misinformed.19–21

The focus of this paper was to investigate reporting bias caused by changes in the reporting of SSEDs in NZ. Differences in treatment practices are also likely to have occurred over time and between places, at the primary, secondary and tertiary levels. The use of a severity threshold, such as that used in this paper, has previously been advocated to ameliorate these effects.22 While it was outside the scope of this study to investigate changes in treatment practices, investigation of the impact of these changes on serious and moderately severe injury incidence rates is recommended in future studies. With the use of appropriate severity thresholds, however, these impacts are expected to be small.

Interestingly, the ASR for hospitalised injury incidence excluding SSED is relatively constant over the 15-year period, whereas the corresponding ASR for cases identified using a severity threshold increased from 130 per 100 000 to 146 per 100 000. This increase was particularly noticeable during 2004–2006. Explanations for this are not obvious but could include administrative changes in the precision of registration or medical specialty assignment, limitations of ICISS or, more likely given the reported validity of serious injury indicators, a real increase. Changes in treatment practices, such as a shift from inpatient admissions to outpatient clinics, may also have affected hospitalised injury incidence estimates over time and by DHB, although the use of a ‘threat-to-life’ severity threshold should have minimised this.

The existence of a nationally compiled hospital discharge dataset is a strength of this study and made the comprehensive analysis undertaken possible. A possible limitation is the assumption that the application of the 3-hour time limit used to define SSED is sufficiently stable across DHBs and over time. Although the time limit is well specified in the NMDS data dictionary, variation may exist in its application; this is unlikely, however, to result in a systematic bias such as that observed. In analyses comparing different case selection criteria for identifying injury incidence, the use of the DHB of the patient’s usual residence rather than the DHB of hospital treatment could also be seen as a limitation. DHB of usual residence was used to align with the population denominators. As there are clearly DHB-level reporting behaviours that influence the number of injury discharges reported to the NMDS, discharge summaries for the 16.9% of patients hospitalised for injury outside their DHB of usual residence may not have been processed in the same way as those for patients hospitalised within it.

The MoH recommendation, as reported in the 2015 factsheet, is that SSED events should be identified using the Health Specialty code and length of stay.14 Users should be aware that this may underestimate SSED events, as variation in DHB-level reporting behaviours may affect SSED identification; for instance, South Canterbury DHB initially reported SSED events using M00 (General Medicine) rather than M05-M08. The available discharge type codes were extended in July 2007 so that discharges from ED acute facilities could be specifically identified; further research should explore whether this provides a more accurate method of identifying SSED events in data collected after that date. Also required is research into definitions (theoretical and operational) that lead to the most valid estimates of injury incidence (ie, stable over time and place) using hospital discharge data that include SSED events. Although their inclusion is currently problematic, consistently collected national ED discharge data will be useful for research purposes, assuming the accuracy of ED discharge diagnoses is ‘fit-for-purpose’. Recent Canadian research assessing the quality of ICD-10 coding in ED records concluded that diagnosis codes showed high agreement and reliability, although variations were observed across hospitals.23

Conclusion

Spurious trends in injury incidence are obtained when changes in administrative practices relating to NZ hospital discharge data are not appropriately accounted for. To minimise the reporting bias in hospitalised injury incidence estimates over time, in NZ, we endorse the MoH recommendation to exclude SSED events but add that, for injury, SSED should be excluded before readmissions are identified. The use of a severity threshold is also recommended. Given the worldwide interest in the use and linkage of large administrative datasets, it is essential that the ‘trips and traps’ of data sources are understood and well documented.

What is already known on the subject

  • Datasets produced from hospital discharge information are a cornerstone of injury research and injury surveillance.

  • Changes in administrative processes have the potential to lead to spurious injury incidence estimates.

What this study adds

  • Injury incidence estimates that use any New Zealand hospital discharge data should exclude short-stay emergency department discharges, then identify and exclude readmissions to increase the validity of comparisons over time and by District Health Board.

  • This study is a valuable reminder that in an age where routinely collected data are being increasingly used and often linked, the ‘trips and traps’ associated with each data source need to be understood and well documented.

Acknowledgments

The authors would like to acknowledge the Ministry of Health (MoH) for their role as custodian of New Zealand’s hospital discharge data. The authors would like to thank Chris Lewis (information analyst, MoH) for providing helpful comments on an earlier draft of this paper.

References

Footnotes

  • Contributors GD: conceptualisation, funding acquisition, formal analysis and writing – original draft preparation; DB: conceptualisation, funding acquisition and data curation; JCS: funding acquisition; RL: conceptualisation; all authors: writing – review and editing.

  • Funding Funding for this research was obtained from the University of Otago.

  • Competing interests None declared.

  • Patient consent for publication Not required.

  • Ethics approval This research is covered under ethical approval granted by The New Zealand Health and Disability Ethics Committee OTA/99/02/008/AM08.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Data availability statement New Zealand’s MoH are the custodians of the hospital discharge data used in this research. Requests for these data should be directed to data-enquiries@moh.govt.nz