As the focus of the journal continues to expand, we have enlisted many experts to our board and family of reviewers. With this issue we welcome three new board members, Denise Kendrick, Mark Stevenson and Nancy Peck, and with regret say farewell and sincere thanks to others. This issue also heralds several new policies. One is to feature more guest editorials to provide a wider perspective, such as that which Roy Shephard and Steve Marshall offer with respect to the study by Conn et al (p117) on sports injuries. A second is to encourage more reports on program evaluations. But first, some long awaited good news.
INDEXED AT LAST! IMPACT FACTOR COMING SOON
With this issue Injury Prevention will appear in the Institute for Scientific Information’s Current Contents (Clinical Medicine and Social and Behavioral Sciences). This is welcome news, albeit long overdue. It means, for better or worse, that we will eventually receive an impact factor.
The Institute for Scientific Information offers many services to scientists, among which Current Contents is perhaps the best known. In the old days this took the form of a booklet that reproduced the latest table of contents page for all journals that the Institute for Scientific Information deemed worthy of indexing. Because of its cost, researchers usually shared a subscription to one or more of these compilations. When a new issue of Current Contents arrived, each “partner” leafed through it and ticked the papers for which they wanted reprints; a secretary then sent postcards requesting them.
This seemingly primitive process is now virtually unknown. In most parts of the world it has been replaced by the photocopier or by the computer. For many journals, PDF files can be found and printed. But the essence of staying current remains the same. It is based on including the contents of new issues of most worthy journals in databases like Current Contents.
The impact factor, much loved by deans, department chairs, and promotion committees in spite of its imperfections, is “a measure of the frequency with which the ‘average article’ in a journal has been cited in a particular year or period”,1 and in this respect alone, it reflects a journal’s scientific status. Specifically, an impact factor is calculated by dividing the number of current year citations to the source items published in that journal during the previous two years by the number of those source items.
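The two-year arithmetic described above can be sketched in a few lines; note that the journal figures below are invented purely for illustration, not real citation data:

```python
# Two-year impact factor for year Y:
#   citations received in Y to items published in Y-1 and Y-2,
#   divided by the number of citable items published in Y-1 and Y-2.
# All numbers below are hypothetical.

def impact_factor(citations_this_year, items_prev_two_years):
    """Citations this year to the previous two years' articles,
    divided by the number of those articles."""
    return citations_this_year / items_prev_two_years

# Suppose a journal published 40 + 45 citable articles over the
# previous two years, and those articles were cited 170 times this year:
print(impact_factor(170, 40 + 45))  # → 2.0
```

A journal whose average article is cited twice in the window thus earns an impact factor of 2.0, which is why the measure is so sensitive to both citation habits and the number of items a journal publishes.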
The decision to include Injury Prevention in Current Contents is another indicator of the scholarly importance of the journal. We are confident that, viewed in the context of our field, that is, in comparison with other journals focusing on this topic, our ranking will be impressive.
PROGRAM EVALUATIONS WANTED: WALKING A TIGHTROPE
After consulting the editorial board, I have decided the time has come to act on one of the many sermons I have been preaching—that more injury prevention programs be evaluated. I now offer a possible incentive for doing so: the opportunity to share findings with others. In the future, we will encourage such reports by making it somewhat easier for authors to survive our rigorous peer review process. The bar will not be lowered; it will simply be bent a bit.
On the one hand, we will expand our view of what is acceptable as much as we responsibly can. We realize how difficult evaluations can be and how intimidating it can be to submit a report to the tough scrutiny of reviewers. Although reviewers will be expected to be as critical as before, we will give the benefit of the doubt to authors of these reports. On the other hand, to keep the balance, we will do so with the understanding that such papers may be accompanied by a constructive commentary addressing their shortcomings. We will also commission papers providing guidelines for such studies and reports.
Although most scientists agree that the randomized trial is the ideal design for evaluation, few would argue that anything less is unacceptable. Nor, however, do we wish to suggest that any evaluation is better than none. But programs that attract public attention, funding, or both and remain unproven pose immense problems for everyone in the field. Thus, our goal is to encourage many more such studies.
The paper by Macarthur in this issue illustrates the challenges well (p112). It is an important contribution in spite of its acknowledged limitations. For future reports, much will hinge on how well authors address similar shortcomings. Readers can then judge for themselves how much confidence to place in the conclusions. Clearly, we will not knowingly publish any report that is fatally flawed. But to encourage more program evaluators to share their results without violating basic scientific principles, we will begin walking this tightrope. Bear with us as we learn to do so and be patient if we occasionally fall off.
KEYWORDS: ONE KEY TO BEING FOUND
In this issue, assistant editor Genevieve Gore provides a guide to the often confusing task of computerized searches. Her advice is intended equally for authors and readers. Before launching any new study, researchers (later, would-be authors) must be reasonably certain what the literature has to say about their topic to ensure that they will not be reinventing wheels. Likewise, when a study is completed and a manuscript is being prepared, this exercise needs to be repeated to update references. Both steps require a thorough search of all relevant databases. Similarly, when papers in this journal or others kindle a reader’s interest, they may begin a web based quest for related publications.
In either situation, searching efficiently hinges on understanding the process by which papers are indexed and this, in turn, leads to considering the choice of keywords provided by authors. Although Medline (Index Medicus) does not do so, many other systems include these words when indexing a paper. In such instances, choosing them wisely may make the difference between others finding your paper readily or having it sink into oblivion. Because the number of keywords an author can provide is limited, and because other elements are used in the indexing process, a few suggestions seem in order.
First, don’t waste words by including terms that are automatically indexed because they appear in the title, the abstract, or the name of the journal. Second, don’t use terms that are so peculiar that no sane reader would consider including them in a search strategy. Third, don’t include terms that are so general they would yield thousands of hits, such as these examples from recent submissions: incidence, hospitalization, injuries, prevention, or provision. Finally, to be on the safe side, don’t rely on keywords at all but try to include all essential terms somewhere in the title or abstract. If in doubt about the wisdom of including a word, use it in a search and see what it yields. Authors should consider including some or all of the terms they used (or should have used) when searching the literature to bring their paper up to date before submission.
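The first and third rules above lend themselves to a mechanical check. The sketch below is illustrative only: the stop list, sample title, and abstract are invented for demonstration, and real indexing systems are more sophisticated than a substring test.

```python
# Hypothetical filter applying two of the keyword rules above:
#   rule 1: drop terms already present in the title or abstract
#           (they are indexed automatically);
#   rule 3: drop terms too general to narrow any search.
# The TOO_GENERAL list echoes the examples cited from recent submissions.

TOO_GENERAL = {"incidence", "hospitalization", "injuries",
               "prevention", "provision"}

def choose_keywords(candidates, title, abstract):
    """Keep candidate keywords that add searchable value."""
    indexed_text = (title + " " + abstract).lower()
    return [k for k in candidates
            if k.lower() not in indexed_text     # rule 1: already indexed
            and k.lower() not in TOO_GENERAL]    # rule 3: too general

# Invented example submission:
title = "Helmet use among child cyclists"
abstract = "We surveyed helmet wearing in school-age cyclists."
print(choose_keywords(["bicycle", "helmet", "prevention"], title, abstract))
# → ['bicycle']
```

Here “helmet” is discarded because the title already supplies it, and “prevention” because it is too general; only “bicycle” would earn its place in the limited keyword slots.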
Attentive readers may have noted that since the journal was redesigned last year we no longer list keywords alongside the abstract. Nonetheless, we still ask authors for them, and they are used by the technical editor for indexing and by Bench>Press when searching for possible reviewers. For these reasons, and because some indexes use them, I advise taking much more care in choosing them than so often appears to be the case in the papers that cross my desk.