QUEBRADA VERDE _Desastre de Seveso -Foreword_1_22

Stellar timescales

Stars are systems that remain stable for most of their lives, but the changes from one phase to another are transitional stages governed by much shorter timescales. Even so, almost all of these timescales vastly exceed the human one. Stars sit in a delicate hydrostatic equilibrium between the pressure originating in their nuclear reactions and the gravitational attraction generated by their entire mass. The net vertical acceleration of the plasma that makes up a star is usually almost zero, which is why stars are almost always said to be in quasi-static conditions. In fact pressure slightly wins out, which leads to small losses of mass in the form of stellar wind, flares, coronal mass ejections and other ejective phenomena; but for stars of less than 10 solar masses these losses are negligible compared with the total mass. We can therefore write an equation that sets the pressure produced by the radial motion of the stellar material equal to the sum of the positive (outward) pressure forces generated in the core and the negative (inward) force of gravity, approximately

ρ (d²r/dt²) = (F_p + F_g) / S,

where ρ (rho) is the density, r the distance to the centre, S the surface area, F_g the gravitational force (inward, negative) and F_p the pressure force (outward, positive). Under equilibrium conditions this equation is approximately zero, since the two forces tend to cancel.


Dynamical timescale

Occasionally a large imbalance arises between pressure and gravity. This happens in the final moments of a star's life, when the nuclear reactions that sustain it exhaust their fuel and become unable to halt the collapse. On what timescale would the star then change? To get an idea of that timescale we use the equation described in the previous section. If the pressure term is set to zero we obtain the free-fall timescale; if instead the gravity term is removed we obtain an explosive timescale. In fact the two timescales are similar, and both can be called the dynamical timescale. Isolating terms, and making order-of-magnitude approximations along the way, one obtains

t_dyn ≈ √(R³ / (G M)) ≈ 1610 (R³ / M)^(1/2) s,

where 1610 is the value of G^(−1/2) calculated with masses and radii in solar units. Thus for the Sun the dynamical time is about 1600 seconds, roughly half an hour. As can be seen, if one of the two forces were to fail, events would unfold very suddenly until equilibrium was recovered. Note: this is also the timescale on which sound waves, or pressure waves, propagate.

Thermal timescale

This measures how long a star can subsist at a given luminosity out of its reserves of gravitational potential energy (Ω). It is, for example, the timescale that governs the life of protostars.
These bodies gain temperature through gravitational collapse until they reach the hydrogen ignition point, at which moment they become true stars. In hydrostatic equilibrium one can make use of the Virial Theorem, according to which

E = Ω/2 and U = −Ω/2,

where E is the total energy, U the internal energy and Ω the gravitational potential energy. Thus, when a contraction occurs, half of the released potential energy is converted into internal energy, which is nothing other than thermal agitation; this raises the interior temperature. The other half is released as radiation, which contributes to the star's luminosity. In stars, as a burning phase is exhausted the luminosity should tend to fall, but those losses are compensated by a contraction of the core. The core can heat up so much that at some point it begins to burn the ashes of the previous phase, entering a second phase of helium burning. In reality stars not only do not fade but actually brighten over time, because ever more material becomes involved in fusion, precisely as a result of that rise in core temperatures. Indeed, the core not only contracts but also extends its boundary, taking in fresh, unprocessed layers of hydrogen. One can say that stars heat up as they lose energy. The variation of a star's total energy is thus equal to its luminosity. The gravitational potential energy is of order

Ω ≈ −G M² / R,

and combining this with the Virial relations above, the thermal timescale comes out as

t_th ≈ G M² / (2 R L) ≈ 2×10^7 (M² / (R L)) yr (in solar units).

For the Sun this gives a thermal time of about 20 million years. For a time this was the only hypothesis for the Sun's brightness, and the discrepancy between the small age calculated for the Sun and the geological and fossil records dating back thousands of millions of years was a great mystery.
This remained so until nuclear energy was discovered. Note: this is also the timescale on which thermal waves propagate.

Nuclear timescale

This measures how long a star can subsist on its reserves of hydrogen, helium, or whatever fuel it is burning at the time. To study it, it is basically enough to treat the hydrogen stage, which occupies 90% of a star's life. Stars are bodies made mostly of hydrogen, with helium in a smaller proportion; in the case of the Sun, 70% of it is hydrogen. Most of this hydrogen does not fuse: only around 10% of it will end up being consumed in the core of ordinary stars like the Sun. If in the previous case the thermal time was the gravitational potential energy divided by the star's luminosity, here it is the same but using the nuclear energy obtained from the fusion reactions. That is:

t_nuc = E_nuc / L

where

E_nuc = 0.1 X M Q_H,

with X the hydrogen mass fraction (0.7 in the case of the Sun), M the total mass and Q_H the energy released by the fusion of one gram of hydrogen. The coefficient 0.1 reflects the assumption that only about 10% of that hydrogen will take part in the fusion reactions. Knowing that Q_H = 6.3×10^18 erg/g, and that on the main sequence the luminosity-mass relation is L ∝ M^3.5, the nuclear time comes out approximately as

t_nuc ≈ 10^10 (M/M☉)^(−2.5) yr.

For the Sun this gives around 9 billion years of nuclear time, which is an approximate value for the Sun's stay on the main sequence. It is clear, then, that

t_nuc ≫ t_th ≫ t_dyn.
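As a rough numerical check, the three timescales discussed above can be evaluated for the Sun. The sketch below assumes standard CGS reference values for G, the solar mass, radius and luminosity (these constants are not given in the article itself):

```python
import math

# Reference CGS values for the Sun (standard constants; not from the article)
G = 6.674e-8          # gravitational constant [cm^3 g^-1 s^-2]
M_SUN = 1.989e33      # solar mass [g]
R_SUN = 6.957e10      # solar radius [cm]
L_SUN = 3.846e33      # solar luminosity [erg/s]
SEC_PER_YR = 3.156e7  # seconds per year

# Dynamical timescale: t_dyn ~ sqrt(R^3 / (G M)); ~1600 s for the Sun
t_dyn = math.sqrt(R_SUN**3 / (G * M_SUN))

# Thermal (Kelvin-Helmholtz) timescale: t_th ~ G M^2 / (2 R L); ~2e7 yr
t_th_yr = G * M_SUN**2 / (2 * R_SUN * L_SUN) / SEC_PER_YR

# Nuclear timescale: t_nuc ~ 0.1 * X * M * Q_H / L; of order 1e10 yr
X = 0.7        # hydrogen mass fraction of the Sun
Q_H = 6.3e18   # energy released per gram of hydrogen fused [erg/g]
t_nuc_yr = 0.1 * X * M_SUN * Q_H / L_SUN / SEC_PER_YR

print(f"t_dyn ~ {t_dyn:.0f} s")     # about half an hour
print(f"t_th  ~ {t_th_yr:.1e} yr")
print(f"t_nuc ~ {t_nuc_yr:.1e} yr")
```

The three results reproduce the hierarchy t_nuc ≫ t_th ≫ t_dyn quoted in the text, to within the order-of-magnitude accuracy of the formulas.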

Seveso disaster

The Seveso disaster was an industrial accident that occurred at around 12:37 on 9 July 1976 at a small chemical plant in the municipality of Seveso, 25 km north of Milan, in the Lombardy region of Italy. The accident released quantities of the dioxin TCDD into the environment, which reached populated areas and caused a variety of effects. According to those who seek to play down the accident, its main effect was panic. In Italy it is known as "the Italian Hiroshima", which in the judgment of those who trivialize its consequences is a complete exaggeration, since no human being lost their life in the accident [citation needed], including babies, even though everyone went on living there for more than fifteen days. Pets and other domestic animals died within days of being abandoned without water or food [citation needed] by the terrified population. In the opinion of those who seek to play down the accident, it is the best example of how panic can do far more damage than the event that triggers the uncontrolled fear. Babies in gestation who remained in their mothers' wombs showed no deformities attributable to the accident [citation needed]. Scientific investigations up to the year 2008 show no increase in the cancer incidence rate in the province [citation needed]. The industrial safety rules of the European Union are known as the Seveso II Directive.

Situation

The industrial plant belonged to ICMESA (Industrie Chimiche Meda Società), a subsidiary of Givaudan, which was in turn a subsidiary of Hoffmann-La Roche. The factory had been built long before [citation needed] and the local population did not regard it as a source of danger [citation needed]. However, the factory produced as a by-product the substance 2,3,7,8-tetrachlorodibenzo-p-dioxin, or TCDD, which is considered one of the most lethal dioxins known. For example, 6 millionths of a gram of 2,3,7,8-TCDD can kill a rat [citation needed], and it is known as one of the components of Agent Orange, the chemical defoliant used by the United States in the Vietnam War to clear the dense vegetation of the Vietnamese jungle.

Accident

The accident occurred in one of the factory buildings in which the herbicide 2,4,5-T was being produced. Owing to human error, around midday on the Saturday an uncontrolled reaction developed and burst the safety disc. An aerosol cloud formed containing, among other toxic substances, TCDD (somewhere between a few hundred grams and a few kilograms), sodium hydroxide (caustic soda), glycol (HO-CH2CH2-OH) and sodium trichlorophenate, and spread over an area of 18 km² around the factory.

Affected zones

The affected area was divided into three zones according to the concentration of TCDD in the soil; zone A was further divided into 7 subzones. The local population was advised not to touch or eat local vegetables or fruit.
Zone A: TCDD soil concentration above 50 micrograms per square metre (µg/m²); 736 residents.
Zone B: TCDD soil concentration between 5 and 50 µg/m²; around 4,700 residents.
Zone R: TCDD soil concentration below 5 µg/m²; around 31,800 residents.

Consequences

Several babies born a few months after the accident had deformities. Those who seek to play down the accident say these cannot be directly attributed to it, since the babies who had been in early gestation, the most vulnerable, were born months later without deformities [citation needed]. 1,600 people were examined, and 417 had the skin disease chloracne, caused by the dioxin. Five decontamination workers contracted a liver disease, despite working only short shifts and wearing protection that was evidently [citation needed] inadequate. 400 "high-risk" pregnant women underwent abortions, illegal in Italy but authorized in special circumstances, according to some because of the risk of congenital malformations and according to others because of the panic that had gripped the population. The Italian government made a special loan of 40 billion lire, which by June 1978 had grown to 115 billion.[1] Paolo Paoletti, production director at Icmesa, was murdered in Monza on 2 February 1980 by the radical left-wing organization Prima Linea. Most of the individual compensation claims were settled individually. On 19 December 1980 representatives of the Lombardy Region and the Italian Republic, and Icmesa, signed a compensation agreement in the presence of the Italian prime minister, Arnaldo Forlani; total compensation came to 20 billion lire (€10.3 million). 3,300 animals that had been abandoned were found dead, most of them rabbits and poultry that had tried to survive by feeding on contaminated vegetation.
To prevent the toxin from entering the food chain, 80,000 animals were slaughtered. 15 children were immediately hospitalized with skin inflammation. Herwig von Zwehl, technical director of Icmesa, and Dr Paolo Paoletti, production director at Icmesa, were arrested. Two government commissions were set up to establish a plan for quarantining and decontaminating the area. The treatment of the affected soil was so thorough that the level of dioxin in 2008 was even lower than that normally found.

Criminal case

In September the criminal court of Monza sentenced five former ICMESA employees, and the company Givaudan, to terms of two and a half to five years. All of them appealed. In May 1985 the Milan court of appeal found three of the five defendants not guilty. The other two appealed to the Supreme Court in Rome. On 23 May 1986 the Supreme Court in Rome upheld the convictions of the remaining two, who were given suspended prison sentences of one and a half and two years.

Seveso Directive

In 1982 the then ten member states of the European Community agreed new safety rules for industrial plants handling dangerous substances, in Directive 82/501/EEC, the so-called "Seveso Directive",[2] which imposed strict industrial regulations. In 1996 this rule was updated as Directive 96/82/EC on the control of major-accident hazards involving dangerous substances. The directive was updated in 1999 and revised in 2001, and subsequently in 2003 by Directive 2003/105/EC of 31 December.[3] In Spain, Royal Decree RD 119/2005 of 4 February 2005, known as SEVESO III, was enacted.

Conclusions

The safety operations of the company's directors and of the local government were poorly coordinated and, to some extent, incompetent. It took a week for it to be announced that dioxin had been released, and another week before the evacuation began. Very few scientific studies had demonstrated the danger of dioxin up to that time, and there was hardly any industrial regulation. The local population did not know what to do and was frightened, and it was a traumatic experience for these small rural communities.

Some long-term consequences (translated from the Italian Wikipedia article): at the time of the disaster many scientists argued that a real and concrete epidemic in the area was possible. Today, some scientific studies report that the number of cancer deaths has remained roughly at the average for Brianza (the subregion of the province of Monza and Brianza that includes Seveso), but those studies are contested by some civic committees. The German documentary Gambit was made about Joerg Sambeth, the technical director of Icmesa, who was sentenced to five years at the first trial, later reduced to two, and was released on parole.[4]
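The three-zone scheme described above reduces to a simple threshold rule on soil concentration. A minimal sketch in Python, assuming the thresholds given in the text (the function name and the handling of exact boundary values are illustrative, not part of the original classification):

```python
# TCDD soil-zoning thresholds used at Seveso, in micrograms per square metre
# (thresholds from the text; function name and boundary handling are assumptions).
def seveso_zone(tcdd_ug_per_m2: float) -> str:
    """Classify a soil reading into zone A, B or R by TCDD concentration."""
    if tcdd_ug_per_m2 > 50:
        return "A"   # > 50 ug/m^2: most contaminated, 736 residents
    if tcdd_ug_per_m2 >= 5:
        return "B"   # 5-50 ug/m^2: around 4,700 residents
    return "R"       # < 5 ug/m^2: around 31,800 residents

print(seveso_zone(80.0), seveso_zone(12.0), seveso_zone(1.0))  # A B R
```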

Foreword

This 5th edition includes the following principal changes from the previous edition: weblinks and associated information have been updated; errata identified in the 4th edition have been corrected; the Afterword section has been re-written and addresses the question of GIS and Big Data; and as with the 4th edition, this edition is provided in web and special PDF electronic formats only.

Geospatial Analysis: A Comprehensive Guide to Principles, Techniques and Software Tools originated as material to accompany the spatial analysis module of MSc programmes at University College London delivered by the principal author, Dr Mike de Smith. As is often the case, from its conception through to completion of the first draft it developed a life of its own, growing into a substantial Guide designed for use by a wide audience. Once several of the chapters had been written, notably those covering the building blocks of spatial analysis and surface analysis, the project was discussed with Professors Longley and Goodchild, who kindly agreed to contribute to the contents of the Guide itself. As such, this Guide may be seen as a companion to the pioneering book on Geographic Information Systems and Science by Longley, Goodchild, Maguire and Rhind, particularly the chapters that deal with spatial analysis and modeling. Their participation has also facilitated links with broader spatial literacy and spatial analysis programmes. Notable amongst these are the GIS&T Body of Knowledge materials provided by the Association of American Geographers, together with the spatial educational programmes provided through UCL and UCSB.

The formats in which this Guide has been published have proved to be extremely popular, encouraging us to seek to improve and extend the material and associated resources further.
Many academics and industry professionals have provided helpful comments on previous editions, and universities in several parts of the world have now developed courses which make use of the Guide and the accompanying resources. Workshops based on these materials have been run in Ireland, the USA, East Africa, Italy and Japan, and a Chinese version of the Guide (2nd ed.) was published by the Publishing House of Electronics Industry, Beijing, PRC (www.phei.com.cn) in 2009.

A unique, ongoing feature of this Guide is its independent evaluation of software, in particular the set of readily available tools and packages for conducting various forms of geospatial analysis. To our knowledge, there is no similarly extensive resource available in printed or electronic form. We remain convinced that there is a need for guidance on where to find and how to apply selected tools. Inevitably, some topics have been omitted, primarily where there is little or no readily available commercial or open source software to support particular analytical operations. Other topics, whilst included, have been covered relatively briefly and/or with limited examples, reflecting the inevitable constraints of time and the authors' limited access to some of the available software resources.

Every effort has been made to ensure the information provided is up-to-date, accurate, compact, comprehensive and representative; we do not claim it to be exhaustive. However, with fast-moving changes in the software industry and in the development of new techniques, it would be impractical and uneconomic to publish the material in a conventional manner. Accordingly the Guide has been prepared without intermediary typesetting. This has enabled the time between producing the text and delivery in electronic (web, e-book) formats to be greatly reduced, thereby ensuring that the work is as current as possible.
It also enables the work to be updated on a regular basis, with embedded hyperlinks to external resources and suppliers, thus making the Guide a more dynamic and extensive resource than would otherwise be possible. This approach does come with some minor disadvantages. These include: the need to provide rather more subsections to chapters, and keywording of terms, than would normally be the case in order to support topic selection within the web-based version; and the need for careful use of symbology and embedded graphic symbols at various points within the text to ensure that the web-based output correctly displays Greek letters and other symbols across a range of web browsers.

We would like to thank all those users of the book for their comments and suggestions, which have assisted us in producing this latest edition.

Mike de Smith, UK; Mike Goodchild, USA; Paul Longley, UK. 2015 (5th edition)

1 Introduction and terminology

In this Guide we address the full spectrum of spatial analysis and associated modeling techniques that are provided within currently available and widely used geographic information systems (GIS) and associated software. Collectively such techniques and tools are often now described as geospatial analysis, although we use the more common form, spatial analysis, in most of our discussions.

The term GIS is widely attributed to Roger Tomlinson and colleagues, who used it in 1963 to describe their activities in building a digital natural resource inventory system for Canada (Tomlinson 1967, 1970). The history of the field has been charted in an edited volume by Foresman (1998) containing contributions by many of its early protagonists. A timeline of many of the formative influences upon the field up to the year 2000 is available via http://www.casa.ucl.ac.uk/gistimeline/ and is provided by Longley et al. (2010). Useful background information may be found at the GIS History Project website (NCGIA): http://www.ncgia.buffalo.edu/gishist/. Each of these sources makes the unassailable point that the success of GIS as an area of activity has fundamentally been driven by the success of its applications in solving real-world problems. Many applications are illustrated in Longley et al. (Chapter 2, "A gallery of applications"). In a similar vein the website for this Guide provides companion material focusing on applications. Amongst these are a series of sector-specific case studies drawing on recent work in and around London (UK), together with a number of international case studies.

In order to cover such a wide range of topics, this Guide has been divided into a number of main sections or chapters. These are then further subdivided, in part to identify distinct topics as closely as possible, facilitating the creation of a website from the text of the Guide.
Hyperlinks embedded within the document enable users of the web and PDF versions of this document to navigate around the Guide and to external sources of information, data, software, maps, and reading materials.

Chapter 2 provides an introduction to spatial thinking, recently described by some as "spatial literacy", and addresses the central issues and problems associated with spatial data that need to be considered in any analytical exercise. In practice, real-world applications are likely to be governed by the organizational practices and procedures that prevail with respect to particular places. Not only are there wide differences in the volume and remit of data that the public sector collects about population characteristics in different parts of the world, but there are differences in the ways in which data are collected, assembled and disseminated (e.g. general-purpose censuses versus statistical modeling of social surveys, property registers and tax payments). There are also differences in the ways in which different data holdings can legally be merged, and in the purposes for which data may be used, particularly with regard to health and law enforcement data. Finally, there are geographical differences in the cost of geographically referenced data. Some organizations, such as the US Geological Survey, are bound by statute to limit charges for data to sundry costs such as the media used for delivering data, while others, such as most national mapping organizations in Europe, are required to exact much heavier charges in order to recoup much or all of the cost of data creation. Analysts may already be aware of these contextual considerations through local knowledge, and other considerations may become apparent through browsing metadata catalogs. GIS applications must by definition be sensitive to context, since they represent unique locations on the Earth's surface.

This initial discussion is followed in Chapter 3 by an examination of the methodological background to GIS analysis.
Initially we examine a number of formal methodologies and then apply ideas drawn from these to the specific case of spatial analysis. A process known by its initials, PPDAC (Problem, Plan, Data, Analysis, Conclusions), is described as a methodological framework that may be applied to a very wide range of spatial analysis problems and projects. We conclude Chapter 3 with a discussion on model-building, with particular reference to the various types of model that can be constructed to address geospatial problems.

Subsequent chapters present the various analytical methods supported within widely available software tools. The majority of the methods described in Chapter 4 (Building blocks of spatial analysis) and many of those in Chapter 6 (Surface and field analysis) are implemented as standard facilities in modern commercial GIS packages such as ArcGIS, MapInfo, Manifold, TNTMips and Geomedia. Many are also provided in more specialized GIS products such as Idrisi, GRASS, QGIS (with the SEXTANTE plugin), Terraseer and ENVI. Note that GRASS and QGIS (which includes GRASS in its download kit) are open source.

In addition we discuss a number of more specialized tools, designed to address the needs of specific sectors or technical problems that are otherwise not well supported within the core GIS packages at present. The methods covered in Chapter 5, which focuses on statistical methods, and in Chapter 7 and Chapter 8, which address Network and Location Analysis and Geocomputation, are much less commonly supported in GIS packages, but may provide loose- or close-coupling with such systems, depending upon the application area. In all instances we provide detailed examples and commentary on software tools that are readily available.

As noted above, throughout this Guide examples are drawn from and refer to specific products; these have been selected purely as examples and are not intended as recommendations.
Extensive use has also been made of tabulated information, providing abbreviated summaries of techniques and formulas for reasons of both compactness and coverage. These tables are designed to provide a quick reference to the various topics covered and are, therefore, not intended as a substitute for fuller details on the various items covered. We provide limited discussion of novel 2D and 3D mapping facilities, and of the support for digital globe formats (e.g. KML and KMZ), which is increasingly being embedded into general-purpose and specialized data analysis toolsets. These developments confirm the trend towards integration of geospatial data and presentation layers into mainstream software systems and services, both terrestrial and planetary (see, for example, the KML images of Mars DEMs at the end of this Guide).

Just as all datasets and software packages contain errors, known and unknown, so too do all books and websites, and the authors of this Guide expect that there will be errors despite our best efforts to remove these! Some may be genuine errors or misprints, whilst others may reflect our use of specific versions of software packages and their documentation. Inevitably with respect to the latter, new versions of the packages that we have used to illustrate this Guide will have appeared even before publication, so specific examples, illustrations and comments on scope or restrictions may have been superseded. In all cases the user should review the documentation provided with the software version they plan to use, check release notes for changes and known bugs, and look at any relevant online services (e.g. user/developer forums and blogs on the web) for additional materials and insights.

The web version of this Guide may be accessed via the associated Internet site: http://www.spatialanalysisonline.com. The contents and sample sections of the PDF version may also be accessed from this site. In both cases the information is regularly updated.
The Internet is now well established as society's principal mode of information exchange, and most GIS users are accustomed to searching for material that can easily be customized to specific needs. Our objective for such users is to provide an independent, reliable and authoritative first port of call for conceptual, technical, software and applications material that addresses the panoply of new user requirements.

1.1 Spatial analysis, GIS and software tools

Our objective in producing this Guide is to be comprehensive in terms of concepts and techniques (but not necessarily exhaustive), representative and independent in terms of software tools, and above all practical in terms of application and implementation. However, we believe that it is no longer appropriate to think of a standard, discipline-specific textbook as capable of satisfying every kind of new user need. Accordingly, an innovative feature of our approach here is the range of formats and channels through which we disseminate the material.

Given the vast range of spatial analysis techniques that have been developed over the past half century, many topics can only be covered to a limited depth, whilst others have been omitted because they are not implemented in current mainstream GIS products. This is a rapidly changing field, and increasingly GIS packages are including analytical tools as standard built-in facilities or as optional toolsets, add-ins or analysts. In many instances such facilities are provided by the original software suppliers (commercial vendors or collaborative non-commercial development teams), whilst in other cases facilities have been developed and are provided by third parties. Many products offer software development kits (SDKs), programming languages and language support, scripting facilities and/or special interfaces for developing one's own analytical tools or variants.

In addition, a wide variety of web-based or web-deployed tools have become available, enabling datasets to be analyzed and mapped, including dynamic interaction and drill-down capabilities, without the need for local GIS software installation. These tools include the widespread use of Java applets, Flash-based mapping, AJAX and Web 2.0 applications, and interactive Virtual Globe explorers, some of which are described in this Guide.
They provide an illustration of the direction that many toolset and service providers are taking.

Throughout this Guide there are numerous examples of the use of software tools that facilitate geospatial analysis. In addition, some subsections of the Guide, and the software section of the accompanying website, provide summary information about such tools and links to their suppliers. Commercial software products rarely provide access to source code or full details of the algorithms employed. Typically they provide references to books and articles on which procedures are based, coupled with online help and white papers describing their parameters and applications. This means that results produced using one package on a given dataset can rarely be exactly matched to those produced using any other package or through hand-crafted coding. There are many reasons for these inconsistencies, including: differences in the software architectures of the various packages and the algorithms used to implement individual methods; errors in the source materials or their interpretation; coding errors; inconsistencies arising out of the ways in which different GIS packages model, store and manipulate information; and differing treatments of special cases (e.g. missing values, boundaries, adjacency, obstacles, distance computations etc.).

Non-commercial packages sometimes provide source code and test data for some or all of the analytical functions provided, although it is important to understand that "non-commercial" often does not mean that users can download the full source code. Source code greatly aids understanding, reproducibility and further development. Such software will often also provide details of known bugs and restrictions associated with functions; although this information may also be provided with commercial products, it is generally less transparent.
In this respect non-commercial software may meet the requirements of scientific rigor more fully than many commercial offerings, but is often provided with limited documentation, training tools, cross-platform testing and/or technical support, and thus is generally more demanding on its users and system administrators. In many instances open source and similar not-for-profit GIS software may also be less generic, focusing on a particular form of spatial representation (e.g. a grid or raster spatial model). Like some commercial software, it may also be designed with particular application areas in mind, such as addressing problems in hydrology or epidemiology.

The process of selecting software tools encourages us to ask: (i) what is meant by geospatial analysis techniques? and (ii) what should we consider to be GIS software? To some extent the answer to the second question is the simpler, if we are prepared to be guided by self-selection. For our purposes we focus principally on products that claim to provide geographic information systems capabilities, supporting at least 2D mapping (display and output) of raster (grid based) and/or vector (point/line/polygon based) data, with a minimum of basic map manipulation facilities. We concentrate our review on a number of the products most widely used or with the most readily accessible analytical facilities. This leads us beyond the realm of pure GIS. For example: we use examples drawn from packages that do not directly provide mapping facilities (e.g. Crimestat) but which provide input and/or output in widely used GIS map-able formats; products that include some mapping facilities but whose primary purpose is spatial or spatio-temporal data exploration and analysis (e.g. GS+, STIS/SpaceStat, GeoDa, PySal); and products that are general- or special-purpose analytical engines incorporating mapping capabilities (e.g. MATLab with the Mapping Toolbox, WinBUGS with GeoBUGS). For more details on these and other example software tools, please see the website page: http://www.spatialanalysisonline.com/software.html

The more difficult of the two questions above is the first: what should be considered as geospatial analysis?
In conceptual terms, the phrase identifies the subset of techniques that are applicable when, as a minimum, data can be referenced on a two-dimensional frame and relate to terrestrial activities. The results of geospatial analysis will change if the location or extent of the frame changes, or if objects are repositioned within it: if they do not, then "everywhere is nowhere", location is unimportant, and it is simpler and more appropriate to use conventional, aspatial, techniques.

Many GIS products apply the term (geo)spatial analysis in a very narrow context. In the case of vector-based GIS this typically means operations such as: map overlay (combining two or more maps or map layers according to predefined rules); simple buffering (identifying regions of a map within a specified distance of one or more features, such as towns, roads or rivers); and similar basic operations. This reflects (and is reflected in) the use of the term spatial analysis within the Open Geospatial Consortium (OGC) simple feature specifications (see further Table 4-2). For raster-based GIS, widely used in the environmental sciences and remote sensing, this typically means a range of actions applied to the grid cells of one or more maps (or images), often involving filtering and/or algebraic operations (map algebra). These techniques involve processing one or more raster layers according to simple rules, resulting in a new map layer, for example replacing each cell value with some combination of its neighbors' values, or computing the sum or difference of specific attribute values for each grid cell in two matching raster datasets.
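The raster operations just described are straightforward to express in code. The fragment below is a minimal sketch in plain Python (illustrative only, not the implementation used by any particular GIS package): a focal mean filter that replaces each cell with the average of its 3x3 neighborhood, and a cell-wise difference of two matching raster layers.

```python
def focal_mean(raster):
    """Replace each cell with the mean of its 3x3 neighborhood.

    Edge cells use only the neighbors that actually exist
    (no padding value is assumed)."""
    rows, cols = len(raster), len(raster[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            vals = [raster[r][c]
                    for r in range(max(i - 1, 0), min(i + 2, rows))
                    for c in range(max(j - 1, 0), min(j + 2, cols))]
            out[i][j] = sum(vals) / len(vals)
    return out

def layer_difference(a, b):
    """Cell-wise difference of two matching raster layers (map algebra: A - B)."""
    return [[av - bv for av, bv in zip(row_a, row_b)]
            for row_a, row_b in zip(a, b)]

# Two small, matching raster layers (grid cells as nested lists)
layer_a = [[1.0, 2.0, 3.0],
           [4.0, 5.0, 6.0],
           [7.0, 8.0, 9.0]]
layer_b = [[1.0] * 3 for _ in range(3)]

smoothed = focal_mean(layer_a)
diff = layer_difference(layer_a, layer_b)
print(smoothed[1][1])   # 5.0: the center cell becomes the mean of all nine values
print(diff[2][2])       # 8.0
```

In practice such operations are applied to rasters with millions of cells, which is why GIS packages implement them in optimized, compiled code rather than interpreted loops; the sketch merely shows the rule being applied.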
Descriptive statistics, such as cell counts, means, variances, maxima, minima, cumulative values, frequencies and a number of other measures and distance computations, are also often included in this generic term spatial analysis.

However, at this point only the most basic of facilities have been included, albeit those that may be the most frequently used by the greatest number of GIS professionals. To this initial set must be added a large variety of statistical techniques (descriptive, exploratory, explanatory and predictive) that have been designed specifically for spatial and spatio-temporal data. Today such techniques are of great importance in social and political sciences, despite the fact that their origins may often be traced back to problems in the environmental and life sciences, in particular ecology, geology and epidemiology. It is also to be noted that spatial statistics is largely an observational science (like astronomy) rather than an experimental science (like agronomy or pharmaceutical research). This aspect of geospatial science has important implications for analysis, particularly the application of a range of statistical methods to spatial problems.

Limiting the definition of geospatial analysis to 2D mapping operations and spatial statistics remains too restrictive for our purposes. There are other very important areas to be considered. These include: surface analysis, in particular analyzing the properties of physical surfaces, such as gradient, aspect and visibility, and analyzing surface-like data fields; network analysis, examining the properties of natural and man-made networks in order to understand the behavior of flows within and around such networks; and locational analysis. GIS-based network analysis may be used to address a wide range of practical problems such as route selection and facility location, and problems involving flows such as those found in hydrology. In many instances location problems relate to networks and as such are often best addressed with tools designed for this purpose, but in others existing networks may have little or no relevance or may be impractical to incorporate within the modeling process. Problems that are not specifically network constrained, such as new road or pipeline routing, regional warehouse location, mobile phone mast positioning, pedestrian movement or the selection of rural community health care sites, may be effectively analyzed (at least initially) without reference to existing physical networks.
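To make the route selection problem mentioned above concrete, the sketch below applies Dijkstra's shortest-path algorithm, the core of most GIS network routing tools, to a toy road network. The junction names and travel times are invented for illustration only.

```python
import heapq

def shortest_path_cost(graph, start, goal):
    """Dijkstra's algorithm: least-cost route through a weighted network.

    `graph` maps each node to a dict of {neighbor: edge_cost}."""
    frontier = [(0.0, start)]      # priority queue of (cost so far, node)
    best = {start: 0.0}            # cheapest known cost to each node
    while frontier:
        cost, node = heapq.heappop(frontier)
        if node == goal:
            return cost
        if cost > best.get(node, float("inf")):
            continue               # stale queue entry; a cheaper route was found
        for neighbor, weight in graph[node].items():
            new_cost = cost + weight
            if new_cost < best.get(neighbor, float("inf")):
                best[neighbor] = new_cost
                heapq.heappush(frontier, (new_cost, neighbor))
    return float("inf")            # goal unreachable from start

# A toy road network: travel times (minutes) between junctions
roads = {
    "A": {"B": 5.0, "C": 2.0},
    "B": {"A": 5.0, "D": 1.0},
    "C": {"A": 2.0, "D": 7.0},
    "D": {"B": 1.0, "C": 7.0},
}
print(shortest_path_cost(roads, "A", "D"))  # 6.0, via A-B-D rather than A-C-D
```

Real street networks add one-way restrictions, turn penalties and time-dependent costs, but the underlying computation is the same least-cost search.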
Locational analysis "in the plane" is also applicable where suitable network datasets are not available, or are too large or expensive to be utilized, or where the location algorithm is very complex or involves the examination or simulation of a very large number of alternative configurations.

A further important aspect of geospatial analysis is visualization (or geovisualization): the use, creation and manipulation of images, maps, diagrams, charts, 3D static and dynamic views, high resolution satellite imagery and digital globes, and their associated tabular datasets (see further Slocum et al., 2008; Dodge et al., 2008; Longley et al., 2010, ch.13; and the work of the GeoVista project team). For further insights into how some of these developments may be applied, see Andrew Hudson-Smith (2008) "Digital Geography: Geographic visualization for urban environments" and Martin Dodge and Rob Kitchin's earlier "Atlas of Cyberspace", which is now available as a free downloadable document.

GIS packages and web-based services increasingly incorporate a range of such tools, providing static or rotating views, draping images over 2.5D surface representations, providing animations and fly-throughs, dynamic linking and brushing, and spatio-temporal visualizations. This latter class of tools has been, until recently, the least developed, reflecting in part the limited range of suitable compatible datasets and the limited set of analytical methods available, although this picture is changing rapidly. One recent example is the availability of image time series from NASA's Earth Observation Satellites, yielding vast quantities of data on a daily basis (e.g.
Aqua mission, commenced 2002; Terra mission, commenced 1999). Geovisualization is the subject of ongoing research by the International Cartographic Association (ICA) Commission on Geovisualization, who have organized a series of workshops and publications addressing developments in geovisualization, notably with a cartographic focus.

As datasets, software tools and processing capabilities develop, 3D geometric and photo-realistic visualization are becoming a sine qua non of modern geospatial systems and services; see Andy Hudson-Smith's Digital Urban blog for a regularly updated commentary on this field. We expect to see an explosion of tools, services and datasets in this area over the coming years; many examples are included as illustrations in this Guide. Other examples readers may wish to explore include: the static and dynamic visualizations at 3DNature and similar sites; the 2D and 3D Atlas of Switzerland; urban 3D modeling programmes such as LandExplorer and CityGML; and the integration of GIS technologies and data with digital globe software, e.g. data from Digital Globe and GeoEye/Satellite Imaging, and Earth-based frameworks such as Google Earth, Microsoft Virtual Earth, NASA Worldwind and Edushi (Chinese). There are also automated translators between GIS packages such as ArcGIS and digital Earth models (see for example Arc2Earth).

These novel visualization tools and facilities augment the core tools utilized in spatial analysis throughout many parts of the analytical process: exploration of data; identification of patterns and relationships; construction of models; dynamic interaction with models; and communication of results. See, for example, the recent work of the city of Portland, Oregon, who have used 3D visualization to communicate the results of zoning, crime analysis and other key local variables to the public. Another example is the 3D visualizations provided as part of the web-accessible London Air Quality network (see example at the front of this Guide).
These are designed to enable:

users to visualize air pollution in the areas in which they work, live or walk
transport planners to identify the most polluted parts of London
urban planners to see how building density affects pollution concentrations in the City and other high density areas, and
students to understand pollution sources and dispersion characteristics

Physical 3D models and hybrid physical-digital models are also being developed and applied to practical analysis problems. For example: 3D physical models constructed from plaster, wood, paper and plastics have been used for many years in architectural and engineering planning projects; hybrid sandtables are being used to help firefighters in California visualize the progress of wildfires (see Figure 1-1A, below); very large sculptured solid terrain models (e.g. see STM) are being used for educational purposes, to assist land use modeling programmes, and to facilitate participatory 3D modeling in less-developed communities (P3DM); and 3D digital printing technology is being used to rapidly generate 3D landscapes and cityscapes from GIS, CAD and/or VRML files, with planning, security, architectural, archaeological and geological applications (see Figure 1-1B, below, and the websites of Z Corporation and Stratasys for more details). To create large landscape models multiple individual prints, which are typically only around 20cm x 20cm x 5cm, are made, in much the same manner as raster file mosaics.

Figure 1-1A: 3D Physical GIS models: Sand-in-a-box model, Albuquerque, USA

GIS software, notably in the commercial sphere, is driven primarily by demand and applicability, as manifest in willingness to pay. Hence, to an extent, the facilities available often reflect commercial and resourcing realities (including the development of improvements in processing and display hardware, and the ready availability of high quality datasets) rather than the status of development in geospatial science.
Indeed, there may be many capabilities available in software packages that are provided simply because it is extremely easy for the designers and programmers to implement them, especially those employing object-oriented programming and data models. For example, a given operation may be provided for polygonal features in response to a well-understood application requirement, which is then easily enabled for other features (e.g. point sets, polylines) despite the fact that there may be no known or likely requirement for the facility.

Despite this cautionary note, for specific well-defined or core problems, software developers will frequently utilize the most up-to-date research on algorithms in order to improve the quality (accuracy, optimality) and efficiency (speed, memory usage) of their products. For further information on algorithms

Intended audience and scope

This Guide has been designed to be accessible to a wide range of readers: from undergraduates and postgraduates studying GIS and spatial analysis, to GIS practitioners and professional analysts. It is intended to be much more than a cookbook of formulas, algorithms and techniques; its aim is to provide an explanation of the key techniques of spatial analysis using examples from widely available software packages. It stops short, however, of attempting a systematic evaluation of competing software products. A substantial range of application examples are provided, but any specific selection inevitably illustrates only a small subset of the huge range of facilities available. Wherever possible, examples have been drawn from non-academic sources, highlighting the growing understanding and acceptance of GIS technology in the commercial and government sectors.

The scope of this Guide incorporates the various spatial analysis topics included within the NCGIA Core Curriculum (Goodchild and Kemp, 1990) and as such may provide a useful accompaniment to GIS Analysis courses based closely or loosely on this programme. More recently the Education Committee of the University Consortium for Geographic Information Science (UCGIS), in conjunction with the Association of American Geographers (AAG), has produced a comprehensive Body of Knowledge (BoK) document, which is available from the AAG bookstore (http://www.aag.org/cs/aag_bookstore). This Guide covers materials that primarily relate to the BoK sections CF: Conceptual Foundations; AM: Analytical Methods; and GC: Geocomputation. In the general introduction to the AM knowledge area the authors of the BoK summarize this component as follows:

This knowledge area encompasses a wide variety of operations whose objective is to derive analytical results from geospatial data. Data analysis seeks to understand both first-order (environmental) effects and second-order (interaction) effects.
Approaches that are both data-driven (exploration of geospatial data) and model-driven (testing hypotheses and creating models) are included. Data-driven techniques derive summary descriptions of data, evoke insights about characteristics of data, contribute to the development of research hypotheses, and lead to the derivation of analytical results. The goal of model-driven analysis is to create and test geospatial process models. In general, model-driven analysis is an advanced knowledge area where previous experience with exploratory spatial data analysis would constitute a desired prerequisite. (BoK, p. 83 of the e-book version)

1.3.1 GIS and related software tools

The GIS software and analysis tools that an individual, group or corporate body chooses to use will depend very much on the purposes to which they will be put. There is an enormous difference between the requirements of academic researchers and educators, and those with responsibility for planning and delivery of emergency control systems or large scale physical infrastructure projects. The spectrum of products that may be described as a GIS includes (amongst others):

highly specialized, sector specific packages: for example civil engineering design and costing systems; satellite image processing systems; and utility infrastructure management systems
transportation and logistics management systems
civil and military control room systems
systems for visualizing the built environment for architectural purposes, for public consultation or as part of simulated environments for interactive gaming
land registration systems
census data management systems
commercial location services and Digital Earth models

The list of software functions and applications is long, and in some instances suppliers would not describe their offerings as a GIS. In many cases such systems fulfill specific operational needs, solving a well-defined subset of spatial problems and providing mapped output as an incidental but essential part of their operation. Many of the capabilities may be found in generic GIS products. In other instances a specialized package may utilize a GIS engine for the display and in some cases processing of spatial data (directly, or indirectly through interfacing or file input/output mechanisms). For this reason, and in order to draw a boundary around the present work, reference to application-specific GIS will be limited.

A number of GIS packages and related toolsets have particularly strong facilities for processing and analyzing binary, grayscale and color images.
They may have been designed originally for the processing of remote sensed data from satellite and aerial surveys, but many have developed into much more sophisticated and complete GIS tools, e.g. Clark Labs' Idrisi software; MicroImages' TNTMips product set; the ERDAS suite of products; and ENVI with associated packages such as RiverTools. Alternatively, image handling may have been deliberately included within the original design parameters for a generic GIS package (e.g. Manifold), or may simply be a toolset for image processing that can be combined with mapping tools (e.g. the MATLab Image Processing Toolbox). Whatever their origins, a central purpose of such tools has been the capture, manipulation and interpretation of image data, rather than spatial analysis per se, although the latter inevitably follows from the former.

In this Guide we do not provide a separate chapter on image processing, despite its considerable importance in GIS, focusing instead on those areas where image processing tools and concepts are applied for spatial analysis (e.g. surface analysis). We have adopted a similar position with respect to other forms of data capture, such as field and geodetic survey systems and data cleansing software: although these incorporate analytical tools, their primary function remains the recording and georeferencing of datasets, rather than the analysis of such datasets once stored.

For most GIS professionals, spatial analysis and associated modeling is an infrequent activity. Even for those whose job focuses on analysis, the range of techniques employed tends to be quite narrow and application focused. GIS consultants, researchers and academics, on the other hand, are continually exploring and developing analytical techniques. For the first group and for consultants, especially in commercial environments, the imperatives of financial considerations, timeliness and corporate policy loom large, directing attention to: delivery of solutions within well-defined time and cost parameters; working within commercial constraints on the cost and availability of software, datasets and staffing; ensuring that solutions are fit for purpose and meet client and end-user expectations and agreed standards; and in some cases, meeting political expectations.

For the second group of users it is common to make use of a variety of tools, data and programming facilities developed in the academic sphere. Increasingly these make use of non-commercial, wide-ranging spatial analysis software libraries, such as the R-Spatial project (in R); PySal (in Python); and Splancs (in S).

Sample software products

The principal products we have included in this latest edition of the Guide are listed on the accompanying website's software page. Many of these products are free, whilst others are available (at least in some form) for a small fee for all or selected groups of users. Others are licensed at varying per-user prices, from a few hundred to over a thousand US dollars per user. Our tests and examples have largely been carried out using desktop/Windows versions of these software products. Different versions that support Unix-based operating systems and more sophisticated back-end database engines have not been utilized. In the context of this Guide we do not believe these selections affect our discussions in any substantial manner, although such issues may have performance and systems architecture implications that are extremely important for many users. OGC compliant software products are listed on the OGC resources web page: http://www.opengeospatial.org/resource/products/compliant.
To quote from the OGC: "The OGC Compliance Testing Program provides a formal process for testing compliance of products that implement OpenGIS Standards. Compliance Testing determines that a specific product implementation of a particular OpenGIS Standard complies with all mandatory elements as specified in the standard and that these elements operate as described in the standard."

Software performance

Suppliers should be able to provide advice on performance issues (e.g. see the ESRI web site, "Services" area, for relevant documents relating to their products) and in some cases such information is provided within product Help files (e.g. see the "Performance Tips" section within the Manifold GIS help file). Some analytical tasks are very processor- and memory-hungry, particularly as the number of elements involved increases. For example, vector overlay and buffering is relatively fast with a few objects and layers, but slows appreciably as the number of elements involved increases. This increase is generally at least linear with the number of layers and features, but for some problems grows in a highly non-linear (i.e. geometric) manner. Many optimization tasks, such as optimal routing through networks or trip distribution modeling, are known to be extremely hard or impossible to solve optimally, and methods to achieve a best solution with a large dataset can take a considerable time to run (see Algorithms and computational complexity theory for a fuller discussion of this topic). Similar problems exist with the processing and display of raster files, especially large images or sets of images. Geocomputational methods, some of which are beginning to appear within GIS packages and related toolsets, are almost by definition computationally intensive. This certainly applies to large-scale (Monte Carlo) simulation models, cellular automata and agent-based models, and some raster-based optimization techniques, especially where modeling extends into the time domain.
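The non-linear growth just described is easy to demonstrate. A naive all-pairs distance computation over n points, a building block of many proximity analyses, requires n(n-1)/2 distance evaluations, so doubling the point count roughly quadruples the work. The following sketch (illustrative only; the synthetic point coordinates are invented) counts the evaluations directly:

```python
import itertools
import math

def all_pairs_distances(points):
    """Naive O(n^2) computation of every inter-point Euclidean distance."""
    return [math.dist(p, q) for p, q in itertools.combinations(points, 2)]

for n in (100, 200, 400):
    points = [(i % 17, i % 13) for i in range(n)]  # arbitrary synthetic points
    pairs = len(all_pairs_distances(points))
    print(n, pairs)  # grows as n(n-1)/2: 4950, then 19900, then 79800
```

Production GIS code avoids this quadratic blow-up where it can, using spatial indexes (e.g. quadtrees or R-trees) to restrict comparisons to nearby features, which is one reason identical analyses can differ so much in run time between packages.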

A frequent criticism of GIS software is that it is over-complicated, resource-hungry and requires specialist expertise to understand and use. Such criticisms are often valid, and for many problems it may prove simpler, faster and more transparent to utilize specialized tools for the analytical work and draw on the strengths of GIS in data management and mapping to provide input/output and visualization functionality. Example approaches include: (i) using high-level programming facilities within a GIS (e.g. macros, scripts, VBA, Python); many add-ins are developed in this way; (ii) using wide-ranging programmable spatial analysis software libraries and toolsets that incorporate GIS file reading, writing and display, such as the R-Spatial and PySal projects noted earlier; (iii) using general purpose data processing toolsets (e.g. MATLab, Excel, Python's Matplotlib, Numeric Python (NumPy) and other libraries from Enthought); or (iv) directly utilizing mainstream programming languages (e.g. Java, C++). The advantage of these approaches is control and transparency; the disadvantages are that software development is never trivial, is often subject to frustrating and unforeseen delays and errors, and generally requires ongoing maintenance. In some instances analytical applications may be well-suited to parallel or grid-enabled processing, as for example is the case with GWR (see Harris et al., 2006).

At present there are no standardized tests for the quality, speed and accuracy of GIS procedures. It remains the buyer's and user's responsibility and duty to evaluate the software they wish to use for the specific task at hand, and by systematic controlled tests or by other means establish that the product, and the facility within that product they choose to use, is truly fit for purpose: caveat emptor! Details of how to obtain these products are provided on the software page of the website that accompanies this book.
The list maintained on Wikipedia is also a useful source of information and links, although it is far from being complete or independent. A number of trade magazines and websites (such as Geoplace and Geocommunity) provide ad hoc reviews of GIS software offerings, especially new releases, although coverage of analytical functionality may be limited.
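As an illustration of the control and transparency that hand-crafted code can offer over a black-box GIS facility (a sketch in plain Python with invented parcel coordinates, not a substitute for a tested GIS routine), the classic ray-casting point-in-polygon test that underlies many overlay operations fits in a dozen lines:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: cast a ray rightwards from (x, y) and count
    how many polygon edges it crosses; an odd count means the point
    is inside. `polygon` is a list of (x, y) vertices in order."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]          # wrap around to close the ring
        if (y1 > y) != (y2 > y):               # edge straddles the point's latitude
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:                    # crossing lies to the right
                inside = not inside
    return inside

# A simple square "parcel" with invented coordinates
parcel = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]
print(point_in_polygon(5.0, 5.0, parcel))   # True
print(point_in_polygon(15.0, 5.0, parcel))  # False
```

Writing the test yourself makes its treatment of special cases (here, points exactly on an edge or vertex are handled by the strict inequalities) explicit, which is exactly the kind of detail that differs silently between commercial packages.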

1.3.2 Suggested reading

There are numerous excellent modern books on GIS and spatial analysis, although few address software facilities and developments. Hypertext links are provided here, and throughout the text where they are cited, to the more recent publications and web resources listed.

As a background to this Guide, any readers unfamiliar with GIS are encouraged to first tackle Geographic Information Systems and Science (GISSc) by Longley et al. (2010). GISSc seeks to provide a comprehensive and highly accessible introduction to the subject as a whole. The GB Ordnance Survey's Understanding GIS also provides an excellent brief introduction to GIS and its application.

Some of the basic mathematics and statistics of relevance to GIS analysis is covered in Dale (2005) and Allan (2004). For detailed information on datums and map projections, see Iliffe and Lott (2008). Useful online resources for those involved in data analysis, particularly with a statistical content, include the StatsRef website and the e-Handbook of Statistical Methods produced by the US National Institute of Standards and Technology (NIST). The more informally produced set of articles on statistical topics provided under the Wikipedia umbrella are also an extremely useful resource. These sites, and the mathematics reference site Mathworld, are referred to (with hypertext links) at various points throughout this document. For more specific sources on geostatistics and associated software packages, the European Commission's AI-GEOSTATS website is highly recommended, as is the web site of the Center for Computational Geostatistics (CCG) at the University of Alberta. For those who find mathematics and statistics something of a mystery, de Smith (2006) and Bluman (2003) provide useful starting points.
For guidance on how to avoid the many pitfalls of statistical data analysis, readers are recommended the material in the classic work by Huff (1993) How to lie with statistics, and the 2008 book by Blastland and Dilnot, The tiger that isn't.

A relatively new development has been the increasing availability of out-of-print published books, articles and guides as free downloads in PDF format. These include: the series of 59 short guides published under the CATMOG umbrella (Concepts and Techniques in Modern Geography), published between 1975 and 1995, most of which are now available at the QMRG website (a full list of all the guides is provided at the end of this book); the AutoCarto archives (1972-1997); the Atlas of Cyberspace by Dodge and Kitchin; and Fractal Cities, by Batty and Longley.

Undergraduates and MSc programme students will find Burrough and McDonnell (1998) provides excellent coverage of many aspects of geospatial analysis, especially from an environmental sciences perspective. Valuable guidance on the relationship between spatial process and spatial modeling may be found in Cliff and Ord (1981) and Bailey and Gatrell (1995). The latter provides an excellent introduction to the application of statistical methods to spatial data analysis. O'Sullivan and Unwin (2010, 2nd ed.) is a more broad-ranging book covering the topic the authors describe as Geographic Information Analysis. This work is best suited to advanced undergraduates and first year postgraduate students. In many respects a deeper and more challenging work is Haining's (2003) Spatial Data Analysis: Theory and Practice. This book is strongly recommended as a companion to the present Guide for postgraduate researchers and professional analysts involved in using GIS in conjunction with statistical analysis.

However, these authors do not address the broader spectrum of geospatial analysis and associated modeling as we have defined it.
For example, problems relating to networks and location are often not covered, and the literature relating to this area is scattered across many disciplines, being founded upon the mathematics of graph theory, with applications ranging from electronic circuit design to computer networking, and from transport planning to the design of complex molecular structures. Useful books addressing this field include Miller and Shaw (2001) Geographic Information Systems for Transportation (especially Chapters 3, 5 and 6), and Rodrigue et al. (2006) "The geography of transport systems" (see further: http://people.hofstra.edu/geotrans/).

As companion reading on these topics for the present Guide we suggest the two volumes from the Handbooks in Operations Research and Management Science series by Ball et al. (1995): Network Models, and Network Routing. These rather expensive volumes provide collections of reviews covering many classes of network problems, from the core optimization problems of shortest paths and arc routing (e.g. street cleaning), to the complex problems of dynamic routing in variable networks, and a great deal more besides. This is challenging material and many readers may prefer to seek out more approachable material, available in a number of other books and articles, e.g. Ahuja et al. (1993), Mark Daskin's excellent book Network and Discrete Location (1995), and the earlier seminal works by Haggett and Chorley (1969) and Scott (1971), together with the widely available online materials accessible via the Internet. Final recommendations here are Stephen Wise's excellent GIS Basics (2002) and Worboys and Duckham (2004), which address GIS from a computing perspective. Both these volumes cover many topics, including the central issues of data modeling and data structures, key algorithms, system architectures and interfaces.

Many recent books described as covering (geo)spatial analysis are essentially edited collections of papers or brief articles. As such most do not seek to provide comprehensive coverage of the field, but tend to cover information on recent developments, often with a specific application focus (e.g. health, transport, archaeology). The latter is particularly common where these works are selections from sector- or discipline-specific conference proceedings, whilst in other cases they are carefully chosen or specially written papers.
Classic amongst these is Berry and Marble (1968) Spatial Analysis: A reader in statistical geography. More recent examples include GIS, Spatial Analysis and Modeling, edited by Maguire, Batty and Goodchild (2005), and the excellent (but costly) compendium work The SAGE Handbook of Spatial Analysis, edited by Fotheringham and Rogerson (2008).

A second category of companion materials to the present work is the extensive product-specific documentation available from software suppliers. Some of the online help files and product manuals are excellent, as are associated example data files, tutorials, worked examples and white papers (see for example ESRI's What is GIS?, which provides a wide-ranging guide to GIS). In many instances we utilize these to illustrate the capabilities of specific pieces of software and to enable readers to replicate our results using readily available materials. In addition some suppliers, notably ESRI, have a substantial publishing operation, including more general (i.e. not product specific) books of relevance to the present work. Amongst their publications we strongly recommend the ESRI Guide to GIS Analysis, Volume 1: Geographic patterns and relationships (1999) by Andy Mitchell, which is full of valuable tips and examples. This is a basic introduction to GIS analysis, which he defines in this context as a process for looking at geographic patterns and relationships between features. Mitchell's Volume 2 (July 2005) covers more advanced techniques of data analysis, notably some of the more accessible and widely supported methods of spatial statistics, and is equally highly recommended. A number of the topics covered in his Volume 2 also appear in this Guide. David Allen has recently produced a tutorial book and DVD (GIS Tutorial II: Spatial Analysis Workbook) to go alongside Mitchell's volumes, and these are obtainable from ESRI Press.
Those considering using Open Source software should investigate the recent books by Neteler and Mitasova (2008), Tyler Mitchell (2005) and Sherman (2008).

In parallel with the increasing range and sophistication of spatial analysis facilities to be found within GIS packages, there has been a major change in spatial analytical techniques. In large measure this has come about as a result of technological developments and the related availability of software tools and detailed publicly available datasets. One aspect of this has been noted already: the move towards network-based location modeling where in the past this would have been unfeasible. More general shifts can be seen in the move towards local, rather than simply global, analysis, for example in the field of exploratory data analysis; in the increasing use of advanced forms of visualization as an aid to analysis and communication; and in the development of a wide range of computationally intensive and simulation methods that address problems through micro-scale processes (geocomputational methods). These trends are addressed at many points throughout this Guide.
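The shift from global to local analysis mentioned above can be shown in miniature (a sketch only; the attribute values along the transect are invented). A single global mean dilutes a localized anomaly, while a moving-window local mean, the simplest form of local statistic, reveals where it sits:

```python
def local_means(values, half_window=1):
    """Local (moving-window) means: one statistic per location, computed
    from each location's neighborhood rather than the whole dataset.
    Windows are truncated at the ends of the sequence."""
    out = []
    for i in range(len(values)):
        window = values[max(i - half_window, 0):i + half_window + 1]
        out.append(sum(window) / len(window))
    return out

# Invented attribute values along a transect, with a local anomaly at position 3
transect = [2.0, 2.0, 2.0, 20.0, 2.0, 2.0, 2.0]

global_mean = sum(transect) / len(transect)
print(round(global_mean, 2))   # about 4.57: the anomaly is diluted globally
print(local_means(transect))   # the local statistic stands out around position 3
```

Formal local statistics used in exploratory spatial data analysis (e.g. local indicators of spatial association) elaborate on this same idea: compute the measure per location from its neighborhood, then map the result.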