2016 User Survey

As in previous years, we attempted to collect some information about the users of the light and neutron sources. Of particular interest is again the number of users common across the different facilities. This year the scope of the survey differs somewhat from earlier years: rather than collecting data from the individual user offices, we collected the publications together with the authors and related data. In a sense we interpret an AUTHOR as a facility USER, which appears fully justified since any author of an experiment-based publication is certainly a beneficiary of the user services provided by the neutron and photon user facilities. The change in procedure slightly changes the accounting for individual users: users involved in the preparation or post-processing of experiments are captured more accurately, whereas users registered at the facilities who never co-authored any experiment-driven publication drop out of the statistics. Naturally, not all experiments result in publications, and not all publications properly acknowledge all facilities and instruments involved, so publication-based user statistics are bound to arrive at somewhat smaller user numbers. Overall, the figures statistically describing the facilities' user communities remain remarkably similar.

Aim of the user survey

The primary goal of the survey is the description of the user communities: how many users are there, how many users are common across facilities (users involved in experiments at more than a single facility), and where do users come from? The publication-based user survey has the advantage of answering further questions: how are users collaborating, which scientific categories are being tackled, and what is the impact of the research?

Resources used

We tried to harvest as much information as possible. In particular, data have been obtained from:

and we gratefully acknowledge the services, user-friendly APIs and data we could harvest without being blocked, even while retrieving more than a handful of data items.

We also attempted to collect some information directly from publishers' sites, searching for scientific classifications, metrics or just author names and affiliations, but with very little success.


All facilities maintain databases or at least lists of publications related to experiments pursued at the facility. In almost all cases these lists are openly accessible; in some cases a registration is required. The quality and accessibility of the publication lists vary substantially: some facilities provide complete, instrument-specific exports in formats that are very easy to parse, while others provide HTML pages with MS Windows specific encoding. All publication lists contained information about authors, titles and journals, and the majority also list DOIs, which makes further processing comparatively simple.

Where available we harvested the pure DOIs from the publication lists. Where no DOIs were available, the publication titles were used to obtain them. The CrossRef API makes it amazingly easy to obtain a DOI for a publication title, and CrossRef does a fantastic job identifying the publication from very limited information. Unfortunately the CrossRef API will always return a match, even if it doesn't resemble the search pattern at all, and the score reported by CrossRef is not really useful for distinguishing good matches from completely wrong ones. So we needed to employ some fuzzy comparison of the presumed publication title and CrossRef's match. As it turns out, in very many cases the title deposited in a publication database was not identical to the title of the publication in the journal; mathematical and chemical formulas were particularly cumbersome.
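The fuzzy comparison can be sketched as follows. This is a minimal illustration using only Python's standard library, not the survey's actual matching code; the similarity threshold of 0.9 is an assumed value:

```python
import difflib
import re

def normalize_title(title):
    """Lowercase, drop embedded markup and punctuation, collapse whitespace."""
    title = re.sub(r"<[^>]+>", " ", title)            # strip HTML remnants (sub/superscripts etc.)
    title = re.sub(r"[^a-z0-9 ]", " ", title.lower()) # drop punctuation and formula symbols
    return " ".join(title.split())

def titles_match(ours, crossref_hit, threshold=0.9):
    """Accept a CrossRef hit (e.g. from a query against api.crossref.org/works)
    only if the normalized titles are near-identical."""
    ratio = difflib.SequenceMatcher(
        None, normalize_title(ours), normalize_title(crossref_hit)
    ).ratio()
    return ratio >= threshold
```

Normalizing before comparing is what absorbs the formula-induced differences (sub/superscript markup, stray punctuation) mentioned above.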

Where DOIs were available, we used the pure DOIs and disregarded any additional information. A small but not insignificant number of DOIs (roughly 1%) were either incorrect or not resolvable. Where possible, these DOIs have been corrected using the additional information.
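Reducing a harvested entry to a pure DOI can be sketched roughly like this; `clean_doi` and its regular expression are our own simplification for illustration, not an exhaustive DOI syntax check:

```python
import re

# Crude DOI pattern: modern DOIs start with "10." followed by a registrant code.
DOI_RE = re.compile(r"^10\.\d{4,9}/\S+$")

def clean_doi(raw):
    """Strip common URL/label prefixes and whitespace.

    Returns a bare lowercase DOI, or None if the remainder doesn't look like a DOI.
    (DOIs are case-insensitive, so lowercasing helps deduplication.)
    """
    doi = raw.strip()
    for prefix in ("https://doi.org/", "http://dx.doi.org/", "doi:", "DOI:"):
        if doi.startswith(prefix):
            doi = doi[len(prefix):]
    doi = doi.strip().lower()
    return doi if DOI_RE.match(doi) else None
```

Whether a cleaned DOI actually resolves is a separate check (e.g. a lookup against the doi.org resolver), which is where the roughly 1% of broken DOIs show up.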

Quality of the data

Of course, the data are not perfect, and retrieval of DOIs from publication titles failed for roughly 10% of the items, mostly because some publication lists also contained entries which never had DOIs assigned. Though we tried to collect only DOIs of proper peer-reviewed articles, some other publication types also made it into the dataset, partly because facilities tag non-articles as articles and partly because some publishers do the same.

The final publication list with more than 58,000 entries is however very solid, being solely derived from validated DOIs and cross-validated using a number of different data sources.

The author lists (525,000 non-unique entries, roughly 130,000 unique authors) are somewhat more error-prone: authors change affiliations or don't always use an identical subset of affiliations, use variations of their names, have more than one ORCID or ResearcherID, and identifiers like ORCID, Scopus or ResearcherID are available for only a subset of authors, all of which makes a close to 100% accurate assignment difficult. Despite the small errors in the author lists, the general figures and trends are very consistent.
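In simplified form, assigning authors to unique identities could look like the sketch below; `author_key` is a hypothetical helper, and the surname-plus-initial fallback deliberately trades a little precision (occasionally merging distinct authors) for recall across name variants:

```python
import unicodedata

def author_key(full_name, orcid=None):
    """Prefer a stable identifier; otherwise fall back to 'surname + first initial'.

    This collapses 'J. Smith', 'John Smith' and 'Smith, John' onto the same key.
    """
    if orcid:  # identifiers beat name heuristics whenever they are present
        return ("orcid", orcid)
    name = unicodedata.normalize("NFKD", full_name)
    name = "".join(c for c in name if not unicodedata.combining(c))  # strip accents
    name = name.replace(",", " ").lower()
    parts = name.split()
    if not parts:
        return ("name", "")
    # Heuristic: 'Smith, John' puts the surname first; 'John Smith' puts it last.
    surname, given = (parts[0], parts[1:]) if "," in full_name else (parts[-1], parts[:-1])
    initial = given[0][0] if given else ""
    return ("name", f"{surname} {initial}")
```

Counting unique authors then amounts to counting unique keys, which is exactly where the residual error comes from: two authors sharing a surname and initial, or one author with inconsistent spellings, end up mis-keyed.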

One major problem is the inclusion of articles from LHC experiments in some of the facilities' publication lists. Even though photon and neutron facilities might have contributed to such publications, the contribution is bound to be very small. LHC publications tend to come with extremely long author lists (we had up to 3000 authors on a single publication), which would render any statistics fairly useless. Such publications have therefore been excluded from all of the data.
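Such a filter is trivial to sketch; the threshold of 100 authors below is an illustrative assumption, not the cut-off actually used in the survey:

```python
def drop_mega_collaborations(publications, max_authors=100):
    """Exclude records whose author lists would swamp the statistics.

    `publications` is assumed to be a list of dicts with an "authors" list;
    LHC-style papers with thousands of authors fail the length check.
    """
    return [p for p in publications if len(p["authors"]) <= max_authors]
```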

The facilities

We collected publication data from 29 different facilities for the years 2010-2016 (where available). Please resist the temptation to compare facilities by the number of publications or users!

The idea behind these surveys is to document the commonalities across facilities and scientific disciplines, not to establish a metric. The facilities are in very different states, and experimental setup requirements can differ enormously. For example, a standard MX experiment can be done within minutes (including structure determination), whereas a complex setup at a neutron source or FEL can easily take more than just a few days.

Some global figures

Number of facilities included: 29
Number of Neutron facilities: 7
Number of Photon facilities: 22
Number of (unique) publications (DOIs): 70026
Number of (unique) users (authors): 178107
Countries of users home institutions: 131

Total number of unique users for individual years:

year          2010   2011   2012   2013   2014   2015   2016   Sum 2010-2016   Unique 2010-2016
users        33039  36522  38476  43191  45192  47026  44196          287642             178107
publications  8574   9461   9393  10636  11017  11218   9727           70026              70026
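The gap between the summed yearly user counts (287642) and the unique total (178107) is simply the difference between adding per-year counts, which counts a repeat user once per year, and taking the union of the per-year user sets. A minimal sketch:

```python
def yearly_and_total_users(users_by_year):
    """users_by_year maps year -> set of (deduplicated) author identifiers.

    Returns per-year counts, their plain sum, and the size of the union.
    """
    per_year = {year: len(users) for year, users in users_by_year.items()}
    summed = sum(per_year.values())                      # repeat users counted per year
    unique = len(set().union(*users_by_year.values()))   # each user counted once
    return per_year, summed, unique
```

With toy data `{2010: {"a", "b"}, 2011: {"b", "c", "d"}}` the sum is 5 while the union holds 4 users, since "b" appears in both years.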

The number of users is continuously increasing. The counts for 2016 are somewhat lower simply because recording publications takes some time; the lists are presumably not yet complete for 2016.

Brief overview of publications and users per facility

# Facility Type P|N Country Region DOIs Public Publications Users U/P
1 ALBA Synchrotron P Spain Europe yes yes 368 1802 4.9
2 ALS Synchrotron P USA America yes yes 5833 20388 3.5
3 ANKA Synchrotron P Germany Europe yes yes 348 1553 4.5
4 ANSTO Neutron Source N Australia Asia-Pacific yes yes 1276 3699 2.9
5 APS Synchrotron P USA America yes yes 10605 30370 2.9
6 Australian Synchrotron Synchrotron P Australia Asia-Pacific no yes 2184 6075 2.8
7 CLS Synchrotron P Canada America yes yes 1629 5593 3.4
8 DESY Synchrotron/FEL P Germany Europe yes yes 2305 8508 3.7
9 DLS Synchrotron P UK Europe yes yes 4683 17843 3.8
10 ELETTRA Synchrotron/FEL P Italy Europe yes yes 2436 7621 3.1
11 ESRF Synchrotron P France Europe yes yes 13342 40213 3.0
12 FRM-II Fission Reactor N Germany Europe yes yes 1191 3769 3.2
13 HZB Ber II Fission Reactor N Germany Europe yes yes 574 1817 3.2
14 HZB Bessy II Synchrotron P Germany Europe yes yes 2640 9272 3.5
15 ILL Fission Reactor N France Europe yes yes 3470 10610 3.1
16 ISIS Pulsed Neutron and Muon Source N UK Europe yes yes 2871 8442 2.9
17 LCLS FEL P USA America yes yes 803 3295 4.1
18 LNLS Synchrotron P Brazil America yes yes 1786 5802 3.2
19 MAX Synchrotron P Sweden Europe no yes 868 3220 3.7
20 NSLS Synchrotron P USA America yes no 3270 10341 3.2
21 NSRRC Synchrotron P Taiwan Asia-Pacific yes yes 1659 3560 2.1
22 ORNL Spallation Neutron Source N USA America no yes 1643 4766 2.9
23 PSI SINQ Spallation Neutron Source N Switzerland Europe yes no 861 2590 3.0
24 PSI SLS Synchrotron P Switzerland Europe yes no 3597 14510 4.0
25 SACLA FEL P Japan Asia-Pacific yes yes 66 454 6.9
26 SOLEIL Synchrotron P France Europe no yes 1340 6398 4.8
27 SPRING8 Synchrotron P Japan Asia-Pacific no yes 4151 9066 2.2
28 SSLS Synchrotron P Singapore Asia-Pacific no yes 104 404 3.9
29 SSRL Synchrotron P USA America yes yes 2378 9665 4.1

User counts

Users in Common

Users Geography

One of the questions asked in earlier user surveys was the origin of our users, or rather the geographical location of their home institutions. Hardly surprising in view of the facilities involved in this survey, the largest number of users come from labs located in the USA.

(1) Number of unique users (authors) per country
(2) Number of unique users (authors) per country divided by the number of inhabitants
(3) Number of unique users (authors) per country divided by GDP (2015)

The shift in ranking for GDP- or population-normalized data is quite remarkable. One could also scale by e.g. expenditure on the military or on research & education. http://data.worldbank.org/ is a great resource for country-related data.
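The normalization itself is a one-liner; the helper below and the toy values in its usage are purely illustrative, not the survey's actual figures:

```python
def normalized_ranking(user_counts, denominators):
    """Rank countries by users per unit of some denominator (population, GDP, ...).

    Countries missing from the denominator table are silently skipped.
    """
    rates = {
        country: user_counts[country] / denominators[country]
        for country in user_counts
        if country in denominators and denominators[country] > 0
    }
    return sorted(rates, key=rates.get, reverse=True)
```

For instance, with hypothetical counts `{"A": 1000, "B": 100}` and populations (in millions) `{"A": 100.0, "B": 1.0}`, country B overtakes A once normalized, which is exactly the kind of ranking shift seen in the figures.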


A view on collaboration profiles

Publications, Publishers, Impact and Citations


Altmetric collects and collates all of this disparate information to provide a single, visually engaging and informative view of the online activity surrounding scholarly content. Altmetric in essence provides some information about the popularity of research outcomes, and offers a nice, easy-to-use API giving access to quite a bit of data.

We were mostly interested in the "cited_by" counts (see the API documentation for details) obtained via DOIs. Altmetric data were available for roughly a third of the PaN publications, and of those only very few had a significant number of entries in any of the cited_by counts. Among the so-called social platforms, only Twitter contributes significantly to the Altmetric scores. Interestingly, there was essentially no correlation between the various counts and the number of citations a publication has received, so the Altmetric indices are indeed orthogonal to the classic citation-based metrics.
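Pulling the counters out of an Altmetric response can be sketched as follows; the field names follow the `cited_by_*` naming convention of the Altmetric details endpoint, though the exact set of fields present varies per record:

```python
def extract_cited_by(record):
    """Collect all 'cited_by_*' counters from an Altmetric API response dict.

    A record is the JSON object returned for one DOI (e.g. from
    https://api.altmetric.com/v1/doi/<doi>); non-counter fields like the
    composite score are ignored here.
    """
    return {key: value for key, value in record.items() if key.startswith("cited_by_")}
```

Tabulating these dictionaries across all publications is what shows Twitter dominating the counts while most other sources stay near zero.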

The Altmetric scores are largely very low. The research done at PaN facilities is dominated by materials science and fundamental research, and it's difficult to interest a larger audience in, e.g., the magnetic properties of rare metal compounds. So this is not really surprising.

The publications with the highest scores were one about cannabis enhancing fracture healing and a publication in Science on insect evolution. Both (and a few more) are in the top 5% of all research outputs ever tracked by Altmetric.

The interest in cannabis is kind of obvious (told ya, it's good for my health, mom). For the publication on insect evolution, the huge number of authors (close to 100) and institutions involved and the mentions of the nice cover art could already account for a substantial part of the Altmetric score.

Clearly, metrics derived from sources outside the usual scientific circles tend to disappear in the social noise. Any tweet from a soccer team like Real Madrid receives more attention than the highest-scored publication in our field, and some single tweets from the team outperform the tweets for all 60,000 publications from photon and neutron research collected over the last 7 years combined. Whether this reflects the relative importance or recognition of soccer versus science, or just the (lack of) relevance of social metrics, remains an open question.

Scientific areas


We made a few interesting observations while harvesting information from the various sources.