Impact factors of GIScience journals continue to increase in 2016

I am not a particular fan of the journal impact factor (IF): it is obsolete, it is susceptible to manipulation, and it does not guarantee quality. Furthermore, the distribution of citations within the same journal is usually highly skewed, which makes it inappropriate to talk about arithmetic means (on which IF is based). Even some editors of journals with a high IF denounce it openly.

Having said that, it’s hard to deny its importance in academia. For instance, in some countries it conditions promotions and the allocation of government funding.

For starters: the impact factor is a measure of the citability of recently published papers in a journal. It is supposed to quantify the influence, reputation, and prestige of a journal, gauged and consolidated in a single number. The IF of a journal is calculated yearly as the number of citations made to all papers published in the journal in the two preceding years, divided by the number of those papers (resulting in the yearly average citedness of recent papers). For instance, the 2015 impact factor of a journal is the ratio of the number of citations in 2015 to papers published in 2013 and 2014, and the number of papers published in 2013 and 2014. For example, the journal Housing Theory & Society published 42 papers in 2013 and 2014. In 2015 there were 43 citations to these 42 papers, as indexed by Thomson Reuters, resulting in an impact factor of 1.024. This also means that papers published before 2013 do not count. The same goes for papers published in 2015 (so a citation from a 2015 paper to a 2015 paper doesn’t count, nor ever will, oddly enough). Finally, not all journals have an IF – only those that are considered influential and of high quality by Thomson Reuters.
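
To make the arithmetic explicit, here is a minimal sketch in Python using the Housing Theory & Society numbers from above:

```python
def impact_factor(cites_this_year, papers_prev_two_years):
    """Impact factor for a given year: citations received that year to
    papers published in the two preceding years, divided by the number
    of those papers."""
    return cites_this_year / papers_prev_two_years

# Housing Theory & Society: 43 cites in 2015 to the 42 papers
# published in 2013 and 2014
print(round(impact_factor(43, 42), 3))  # 1.024
```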

The IFs are announced yearly by Thomson Reuters in the Journal Citation Reports. The impact factors for 2015 have just been announced in the 2016 Journal Citation Reports. You can check them here if your institution has a subscription.

As I did last year, I checked the new IFs for the 19 journals that I consider relevant to people in GIScience. The list is derived from my scientometric paper published in IJGIS earlier this year (with the exception of JOSIS – Journal of Spatial Information Science, because it is not yet indexed by Thomson Reuters). For an extended list of journals please see the page compiled by my group.

The results are presented in the table below, together with the IFs for 2013 and 2014:


Generally the IFs continue to rise, as was the case last year. On average, the IFs grew by 16.5%.

The impact factor of 5 out of 18 journals dropped: PFG‘s IF shrank by 24% (the largest drop in relative terms), and IJDE’s by 0.5 (the largest in absolute terms).
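
Relative and absolute drops rank journals differently; a tiny illustration (the IF values below are hypothetical, not those of PFG or IJDE):

```python
def if_change(old_if, new_if):
    """Absolute and relative (per cent) change of an impact factor."""
    absolute = new_if - old_if
    relative = 100 * absolute / old_if
    return absolute, relative

# A hypothetical journal whose IF fell from 2.0 to 1.5
absolute, relative = if_change(2.0, 1.5)
print(absolute, relative)  # -0.5 -25.0
```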

Cartography and Geographic Information Science is a clear winner, as its IF continues to grow substantially – from 0.5 to 2.2 in just two years.

Some journals that experienced a considerable boost this year had seen their IF plummet in the previous edition. Take GeoInformatica: 1.288 in 2013, down to 0.745 in 2014, and up by 42% to 1.061 in 2015 – a substantial increase, but still lower than two years ago.

Another piece of news is that the ISPRS International Journal of Geo-Information (IJGI), the Open Access journal published by MDPI, received its first IF (0.651).

For some journals it paid off to deliberately delay paginating papers to boost the impact factor. There are some quintessential cases of holding papers without pagination for a long time, like Transactions in GIS:

[Screenshot: list of papers in press at Transactions in GIS, 14 June 2016]

For IJDE that was not so lucrative: despite holding papers for a long time, its IF dropped. I wonder how much further it would have dropped without such methods.

In total, the sum of all IFs continued to increase:


This might indicate that GIScience papers have recently been attaining increased reach outside the field. The results also show that IFs can be quite dynamic: they go up and down, and the difference in just one year can be substantial – for a third of the journals (6/18) the IF changed by more than 30%.

Despite the general aversion to the IF, and its flaws, it’s certainly good news that GIScience papers continue to get more attention.

j j j

An improved LOD specification for 3D building models

The CityGML 2.0 LODs are an industry standard for conveying the grade of 3D city models. However, the 5 LODs are not defined precisely, and they are not sufficient for this purpose. In our new paper

Biljecki, F., Ledoux, H., & Stoter, J. (2016). An improved LOD specification for 3D building models. Computers, Environment and Urban Systems, 59, 25–37. doi:10.1016/j.compenvurbsys.2016.04.005

we present a refined series of 16 LODs that overcomes these issues:


The author’s version PDF is freely available here. Please use the publisher’s version if available to you.

Abstract: The level of detail (LOD) concept of the OGC standard CityGML 2.0 is intended to differentiate multi-scale representations of semantic 3D city models. The concept is in practice principally used to indicate the geometric detail of a model, primarily of buildings. Despite the popularity and the general acceptance of this categorisation, we argue in this paper that from a geometric point of view the five LODs are insufficient and that their specification is ambiguous.

We solve these shortcomings with a better definition of LODs and their refinement. Hereby we present a refined set of 16 LODs focused on the grade of the exterior geometry of buildings, which provide a stricter specification and allow less modelling freedom. This series is a result of an exhaustive research into currently available 3D city models, production workflows, and capabilities of acquisition techniques. Our specification also includes two hybrid models that reflect common acquisition practices. The new LODs are in line with the LODs of CityGML 2.0, and are intended to supplement, rather than replace the geometric part of the current specification. While in our paper we focus on the geometric aspect of the models, our specification is compatible with different levels of semantic granularity. Furthermore, the improved LODs can be considered format-agnostic.

Among other benefits, the refined specification could be useful for companies for a better definition of their product portfolios, and for researchers to specify data requirements when presenting use cases of 3D city models. We support our refined LODs with experiments, proving their uniqueness by showing that each yields a different result in a 3D spatial operation.

j j j

A scientometric analysis of selected GIScience journals

Scientometrics and bibliometrics provide statistical techniques for measuring and analysing science. Scientometric studies have been carried out in many disciplines, but not particularly in GIScience. My study, published in IJGIS, bridges this gap.

Biljecki, F. (2016). A scientometric analysis of selected GIScience journals. International Journal of Geographical Information Science, vol. 30(7), July 2016, pp. 1302-1335. doi:10.1080/13658816.2015.1130831

Abstract: A set of 12,436 papers published in 20 GIScience journals in the period 2000–2014 were analysed to extract publication patterns and trends. This comprehensive scientometric study focuses on multiple aspects: output volume, citations, national output and efficiency (output adjusted with econometric indicators), collaboration, altmetrics (Altmetric score, Twitter mentions, and Mendeley bookmarking), authorship, and length. Examples of notable observations are that 5% of countries account for 76% of global GIScience output; a paper published 15 years ago received a median of 12 citations; and the share of international collaborations in GIScience has more than tripled since 2000 (31% of papers had authors from multiple countries in 2014, an increase from 10% in 2000).

The author’s version PDF is freely available here. Please use the authoritative publisher’s version if available to you.



j j j

Two papers at UDMV 2015: (1) conversion between CityGML and OBJ, and (2) visibility analysis in point clouds

My group is organising the forthcoming Eurographics Workshop on Urban Data Modelling and Visualisation 2015. I’ve co-authored two peer-reviewed papers that will be presented at the workshop:


Biljecki, F., & Arroyo Ohori, K. (2015). Automatic Semantic-preserving Conversion Between OBJ and CityGML. Eurographics Workshop on Urban Data Modelling and Visualisation 2015, Delft, Netherlands, pp. 25-30. [PDF] [DOI]


Peters, R., Ledoux, H., & Biljecki, F. (2015). Visibility Analysis in a Point Cloud Based on the Medial Axis Transform. Eurographics Workshop on Urban Data Modelling and Visualisation 2015, Delft, Netherlands, pp. 7-12. [PDF] [DOI]


The code supporting the first paper is available on GitHub.

j j j

Speed of publication of 18 GIS journals (publication delay)

TL;DR: I have analysed the submitted/accepted/online dates of 1023 papers published in 18 GIScience journals to determine the median lag from submission to publication for each journal. Several ranked lists of journals are presented as plots. On average it takes 6.9 months from submission to get a paper reviewed and published online, and about 2.8 more months to have it included in an issue.

Edit on 18 Sep 2015: this blog post attracted substantial interest (thanks everyone for spreading it around). In the meantime two new journals have been added to the analysis.

Edit on 18 Jan 2016: I have published a related (scientometric) study in IJGIS which investigates many more different aspects related to GIScience publishing.


Scholarly publishing can be an annoying process due to an often slow peer-review stage, editorial decision, and inert typesetting procedure. This processing time is also known as publication delay: the chronological distance between the moment a paper is submitted and its publication (Amat, 2009). It can take several months (or years?) to see one’s paper published in an issue of a journal. Naturally, publishing great papers takes time, but long publication delays are frequently caused by the inefficiency of journals and publishers (not to mention reviewers), and are frowned upon by authors. They have been found to considerably influence one’s decision where to submit a paper (Strevens 2003, Dong et al 2006, Carroll 2001, Swan and Brown 2003, Rousseau and Rousseau 2012, Solomon and Björk 2012). Therefore it comes as no surprise that publishers have started to boast about their processing times.

Publication delay is a principal topic in scientometrics, with many analyses and papers written on it. Long story short: while a long publication delay is a source of frustration for authors, it can actually benefit journals by boosting their impact factor (Tort et al, 2012), hence the interest from the scientometric community. The goal of this analysis is to expose the publication delay of journals in GIScience (Geographical Information Science), as done in similar analyses, e.g. for PLOS journals and for various disciplines.


There is no firm consensus on how to measure publication delay. Some researchers consider it the time from submission to publication, while others distinguish between publication online and publication in an issue. Some consider only the delay from the moment a paper is available online to the date it becomes paginated and in print (allocated to a volume, assigned page numbers, and distributed to readers).

In the digital age publication in an issue is becoming less and less important, and to me (as an author) what matters most is the moment a paper becomes available online. However, the date a paper is published in print is still the authoritative one for determining the impact factor, hence its relevance to scientometrics.

Let’s consider all the aforementioned measures:

  • A: submission to acceptance (includes peer review, editorial decision, and revision; often in multiple rounds).
  • B: acceptance to published online; i.e. online posting (includes proof reading and typesetting the paper by the publisher, among other things).
  • C: published online to published in an issue (the period the paper is available online but not yet part of an issue with page numbers and not yet distributed to readers).
  • PD: the publication delay (from submitted to the journal to published online; A+B).
  • TPD: the total publication delay (from submitted to published in an issue; A+B+C).
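
Given the dates that publishers record for each paper, these lags are straightforward to compute; a minimal sketch (the dates below are hypothetical):

```python
from datetime import date

def delays(submitted, accepted, online, in_issue):
    """Lags A, B, C, PD, and TPD (in days) for a single paper."""
    A = (accepted - submitted).days    # peer review, revisions, decision
    B = (online - accepted).days       # typesetting and online posting
    C = (in_issue - online).days       # waiting to be placed in an issue
    return {"A": A, "B": B, "C": C, "PD": A + B, "TPD": A + B + C}

# A hypothetical paper's timeline
print(delays(date(2014, 1, 10), date(2014, 6, 1),
             date(2014, 6, 20), date(2014, 10, 1)))
```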

The following figure illustrates the typical process of publishing, with the considered lags.


Data has been acquired from publishers’ records. 18 GIScience journals have been considered for this analysis. These are the journals I usually consider when preparing a paper, with some additions for diversity. The list is a subset of the extensive list compiled at my department. The selected journals are listed below (alphabetically):

  1. AAG: Annals of the Association of American Geographers (Taylor and Francis)
  2. CaGIS: Cartography and Geographic Information Science (Taylor and Francis)
  3. CEUS: Computers, Environment, and Urban Systems (Elsevier)
  4. C&G: Computers and Geosciences (Elsevier)
  5. EPB: Environment and Planning B: Planning and Design (Sage) – date of acceptance is missing
  6. GEAN: Geographical Analysis (Wiley)
  7. GEIN: Geoinformatica (Springer)
  8. G&RS: GIScience and Remote Sensing (Taylor and Francis)
  9. JAG: International Journal of Applied Earth Observation and Geoinformation (Elsevier)
  10. IJDE: International Journal of Digital Earth (Taylor and Francis)
  11. IJGI: ISPRS International Journal of Geo-Information (MDPI)
  12. IJGIS: International Journal of Geographical Information Science (Taylor and Francis)
  13. P&RS: ISPRS Journal of Photogrammetry and Remote Sensing (Elsevier)
  14. JGS: Journal of Geographical Systems (Springer)
  15. JSS: Journal of Spatial Science (Taylor and Francis)  – not fully transparent (does not state the submission and acceptance date)
  16. PE&RS: Photogrammetric Engineering & Remote Sensing (ASPRS) – date of online posting is missing
  17. SCC: Spatial Cognition and Computation (Taylor and Francis)
  18. TGIS: Transactions in GIS (Wiley) – not fully transparent (does not state the submission and acceptance date)

All articles published in 2014 have been selected, except those published in special issues. This results in 1023 papers included in the analysis, enough to draw a solid conclusion about the publication delay.

Luckily most publishers are fully transparent about the chronological record of each paper (although for some papers the data was missing). However, the analysis is limited for some journals because their publishers do not state the dates of submission and acceptance of each paper:


Not enough information to derive the publication delay; only C can be calculated. Example of Wiley / Transactions in GIS


With several metrics at hand, several ranking lists of journals can be derived, each highlighting a different aspect.

The results are shown graphically, with a table with all the data in the end. The plots show the distribution of individual papers in each journal. The small white point denotes the median, and the thick stroke the interquartile range.

Let’s start with A: the median time from submission of a paper to its acceptance per journal, i.e. peer review:

Exhibit A: Time from the submission of the paper to its acceptance. Entries are sorted by median. (Violin plot made possible by Matplotlib and Seaborn, and inspired by Daniel Himmelstein.)

The ISPRS International Journal of Geo-Information is a clear winner here with a median of 79 days: less than 3 months from a paper submission to acceptance. No wonder this fairly new journal is rapidly gaining popularity. The plot also shows that there is a substantial difference between journals.

After a paper is accepted, it takes some time to get it published online (B):

Exhibit B: Delay of online posting after acceptance

The ranking differs from the previous one, and Computers & Geosciences emerges as the quickest (median of 11 days). This can be explained by the practice of posting the accepted manuscript online upon acceptance, while the final publisher-formatted version is being prepared. This is not the final version of the paper, but having the accepted version available in the meantime is a favourable measure to alleviate publication delay. Kudos to the journals that make their papers accessible as soon as possible.

AAG and GEAN do not fit the plot due to their pitiful performance: their medians are 7.3 and 14.7 months, respectively. Yes, it took Geographical Analysis more than a year to publish an already accepted paper. Shameful. However, it seems that this practice was not continued in 2015.

The lag from submission to online publication (A+B) is what most people care about (PD: publication delay). Another ranking, summing the two metrics above, is exposed:

Exhibit C: Publication delay (from submission to online publication) of the considered GIS journals

IJGI is first with a median of 92 days, JAG comes second with 161 days, and C&G is third with 186 days. Again, a substantial discrepancy between journals is exposed, ending with Geographical Analysis with a performance of almost two years.

The three plots above also reveal something else of interest: some journals seem to have a limit on the duration of the process, e.g. for IJGI and JAG almost no paper took more than approximately 6 and 12 months, respectively, to get published. Furthermore, their narrow distributions reveal a consistent process. For other journals, some papers seem to be stuck in peer review much longer than others.

After a paper is accepted and published online, most authors don’t care anymore. But we are not done yet: the following plot shows the lag from publication online to publication in an issue and print (C). A long online-to-print lag can artificially boost a journal’s impact factor, so technically, editors may hold a paper online for a very long time to manipulate their impact factor.

Exhibit D: Time from online posting to publication in print

Again, another ranking is exposed, with GIScience & Remote Sensing, Cartography and Geographic Information Science, and the Annals of the Association of American Geographers being the quickest three when it comes to the time between online posting and publication in print. IJDE does not fit the plot owing to its slowness, with a median of 18.6 months. The story with GEAN is a bit different: apparently its papers are published online only when they are published in an issue, so technically C is always zero. Of course, this poor practice is reflected in its very long B. Note that this plot debuts JSS and TGIS, the latter not being particularly quick. However, the impact factor of TGIS has recently increased by nearly 40%. Coincidence?

The C (online-to-print) delay, while overlooked by most authors, appears to be exploited by journals to boost their impact factor. For instance, JAG, which recently increased its impact factor by 36.7%, already has its February 2016 issue almost ready with a dozen papers at the time of this blog post (September 2015):

[Screenshot: JAG’s February 2016 issue in preparation, 6 September 2015]

I don’t see a reason to hold unpaginated papers online for months, other than impact-factor related ones. Not to mention that there is no other explanation for publishing a 2016 issue when we are still chronologically far from it.

Finally, the total publication delay (TPD) is given below (from submitting a paper to getting it published in an issue):

Exhibit E: Total publication delay (time from paper submission to publication in print/issue)

As illustrated in the plot, a paper submitted to a GIS journal can take anywhere from a few months to a few years to be published in an issue.

The results for some journals are stimulating and for some are appalling, prompting me to reconsider my list of GIScience journals. The key takeaways:

  • IJGI is the quickest journal from submission to online posting, followed closely by JAG and C&G (or G&RS and CaGIS if you consider the complete cycle). MDPI (publisher of IJGI) is definitely disrupting the GIS publishing scene, for the better.
  • Assuming that the records are correct, Geographical Analysis takes 15 months to publish a paper after acceptance. Ridiculous.
  • Within a journal there may be considerable differences (e.g. IJGIS exhibits significant variation), and the medians should be taken as a guideline rather than a guarantee when submitting a paper.
  • Some editors appear to artificially boost impact factors by deliberately holding papers online for a long time (in press without pagination), e.g. International Journal of Digital Earth takes 18.6 months to assign a paper to an issue, the slowest of all analysed journals. Coincidence or not, IJDE has seen the highest increase of the impact factor in GIS.
  • Most papers in GIS take around 7 months to get published, but again, there are differences:

Distribution of the publication delay of GIS journal papers. Joint data for all journals.

Finally, here are the medians in a table: the three metrics A, B, and C, and the publication delays PD and TPD. All values are in months; dashes mark values that cannot be computed because the corresponding dates are not published.

Journal     A     B     C     PD    TPD
AAG         8.0   7.3   1.4   16.9  17.9
CaGIS       4.9   1.3   1.3   6.3   8.3
CEUS        8.5   0.9   2.3   9.4   11.8
CG          5.6   0.4   2.9   6.1   9.0
EPB         –     –     3.7   20.5  24.0
GEAN        11.5  14.7  0.0   23.3  23.3
GEIN        8.6   1.2   8.4   9.6   17.9
GRS         4.6   1.5   1.1   6.6   7.3
IJDE        7.0   0.9   18.6  7.6   27.7
IJGI        2.6   0.5   1.8   3.0   4.1
IJGIS       6.5   1.4   3.8   8.5   12.3
JAG         4.0   1.0   4.9   5.3   10.6
JGS         9.9   0.8   5.9   10.7  15.8
JSS         –     –     1.3   –     –
PERS        6.1   –     –     –     11.4
PRS         5.8   1.0   1.6   7.0   8.4
SCC         5.6   0.5   2.2   5.9   9.1
TGIS        –     –     10.5  –     –
All papers  5.6   1.0   2.8   6.9   10.7


And here is a stacked bar plot with the medians of the three components (A, B, C):

Decomposing the metrics A, B, and C (medians) for the selected journals. The sum of these metrics corresponds to the total publication delay (TPD), however, the sum of medians may slightly deviate from the median of TPD.


The last line of the table above exposes the median of all analysed GIS papers. How does that compare to other disciplines? Björk and Solomon (2013) conducted an extensive cross-disciplinary experiment. I incorporated my findings in their data:

Comparing the publication delay in GIS with other disciplines: could be better. Data source: doi:10.1016/j.joi.2013.09.001


Comparing the results reveals that the total publication delay in GIS is exactly at the average of the considered disciplines (slightly longer than one year). However, metric A is longer than the cross-disciplinary average. Furthermore, the delay found in GIS is considerably longer than the average in engineering, but not far from earth sciences, to which GIS could partly be assigned. The longer delay may also be explained by the fact that some GIS journals are closer to the social sciences, which rank among the disciplines with the longest publication delays.

Now let’s examine individual papers. What are the fastest and slowest papers? Here are some extremes:


I didn’t know it was possible to get a paper accepted the same day (ironic, considering that GEAN is the slowest journal in the sample; note that it then took a year and a half to publish the paper after acceptance). An explanation is that it could be a paper resubmitted to the same journal after a prolonged revision. Our champion is followed by a few papers accepted within a month:







At the other end of the scale, we have a Geoinformatica paper that took more than 4 years from submission to publication in an issue. Mind you, it takes less time to complete a PhD.



Finally, an interesting thing to explore is the relation between the following two measures:

  • A, which includes a possible delay by authors, peer-reviewers, and editors; and
  • B, which includes a possible delay by authors, editors, and publication office.

The two are only weakly related (correlation of 0.20).
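
For completeness, such a correlation can be reproduced with a few lines of Python (the per-paper lags below are made up for illustration; the real data yields r ≈ 0.20):

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equally long samples."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical A (submission-to-acceptance) and B (acceptance-to-online)
# lags in days for five papers
A = [120, 200, 150, 300, 90]
B = [30, 25, 40, 35, 20]
print(round(pearson(A, B), 2))
```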

(Cor)relation between A and B


Publishing in GIS can be slow. Choose your journal wisely.

References and further reading

Amat, C. B. (2009). Editorial and publication delay of papers submitted to 14 selected Food Research journals. Influence of online posting. Scientometrics, 74(3), 379–389.

Björk, B.-C., & Solomon, D. (2013). The publishing delay in scholarly peer-reviewed journals. Journal of Informetrics, 7(4), 914–923.

Carroll, R. J. (2001). Review times in statistical journals: Tilting at windmills? Biometrics, 57(1), 1–6.

Dong, P., Loh, M., & Mondry, A. (2006). Publication lag in biomedical journals varies due to the periodical’s publishing model. Scientometrics, 69(2), 271–286.

Rousseau, S., & Rousseau, R. (2012). Interactions between journal attributes and authors’ willingness to wait for editorial decisions. Journal of the American Society for Information Science and Technology, 63(6), 1213–1225.

Strevens, M. (2003). The role of the priority rule in science. The Journal of Philosophy, 100(2), 55–79.

Solomon, D. J., & Björk, B.-C. (2012). Publication fees in open access publishing: Sources of funding and factors influencing choice of journal. Journal of the American Society for Information Science and Technology, 63(1), 98–107.

Swan, A., & Brown, S. (2003). Authors and electronic publishing: what authors want from the new technology. Learned Publishing, 16(1), 28–33.

Tort, A. B. L., Targino, Z. H., & Amaral, O. B. (2012). Rising Publication Delays Inflate Journal Impact Factors. PLOS One, 7(12), e53374.

Trivedi, P. K. (1993). An analysis of publication lags in econometrics. Journal of Applied Econometrics, 8(1), 93–100.

Woolston, C. (2015). Long wait for publication plagues many journals. Nature, 523(7559), 131.

Disclaimer and further information:

  1. Data has been prepared with care, but I cannot guarantee that it’s 100% correct, as I noticed many errors in the publishers’ records (e.g. a date of acceptance 2 years before the date of submission), which I filtered out; other errors could have passed unnoticed.
  2. Many predatory and low-quality journals are very quick in publishing papers. With this analysis I don’t want to imply that there is a correlation between publication delay of a journal and its quality. All of the analysed journals are ISI journals so a reasonable degree of quality is ensured.
  3. The presented data is an aggregation of hundreds of records, so it goes without saying that your mileage may vary: this is no guarantee that your submission will fall into these averages.
  4. The date of publication in an issue is somewhat ambiguous in scientometrics. For instance, an issue may bear the date of March, but that can deviate from the actual date of distribution: it could have been distributed in February or April. Moreover, some journals prepare issues well in advance, and keep filling them until the cover month. Therefore, as the authoritative date I have taken the first day of the month on the cover (i.e. 1 March). Exceptions are JSS and IJGI, as they obviously release their issues at the end of the month or later; those have been adjusted to the actual release date as I received them.
  5. Many papers are delayed because of authors (e.g. long time to revise the paper), but scientometric researchers assume this is uniform for all journals, so it’s not specifically elaborated.
j j j