

Email: Peter Collier 
Sample articles are available to download free from: http://www.tandfonline.com/action/showOpenAccess?journalCode=ysre20
To find other articles written by a particular author:
Survey Review 45, No 333. November/December 2013

1. Land surveying in ancient Mesopotamia: ethical ‘Algebraic Geometry’
The earliest known evidence of a system of land measurement comes from Ancient Mesopotamian clay tablets dating from around 3500 BC. This essay suggests why and how the practice began and how it developed until the Old Babylonian era 2500 years later. Archaeological evidence from clay tablets, analysed in relation to the natural environment and social context of the time, shows the interdependence of Mesopotamian mathematics and land surveying as a response to practical needs, made possible by the invention of writing. Sumerian literature of the time is cited to show that land surveying was practised by the gods, who handed down its symbols to earthly rulers to imbue them with justice and truth. Further information:
2. Towards a common basis for 3D Cadastres from a legal perspective
Legal aspects play an increasingly important role in establishing a functional 3D Cadastre, especially in urban areas with multiple strata, crossing networks and overlapping legal spaces in the form of apartments and utility infrastructures. A wide range of national laws and regulations govern the way these real world objects and associated rights, restrictions and responsibilities (RRRs) are defined and represented in two- or three-dimensional (2D/3D) space. In fact, despite all the research and latest technological advances, no country has so far developed a true 3D Cadastre, and in all legal systems basic terminology and key concepts need to be clarified first in order to achieve harmonisation of the different Land Administration Systems (LASs) in the near future. This paper aims to contribute towards this harmonisation by identifying similarities, differences and constraints imposed by jurisdictions that need to be addressed to ensure 3D cadastral registration and representation of real world objects, as well as determination of RRRs complying with the different national land laws. The paper also engages with worldwide trends in modelling legal relations between spatial units and RRRs on the basis of existing standards that ensure harmonisation and interoperability. Thoughts and proposals for the integration of legally defined spaces into a 3D Cadastre system are offered, including the clarification of RRRs and related source documents, within the context of the Land Administration Domain Model, recently approved as an official International ISO Standard. Further information:
3. Technical aspects of a 3D hybrid cadastral model
Urbanisation and the increasing use of space above and below the land surface, with a fragmented tenure status, bring the need to transform existing cadastral systems based on a two-dimensional parcel representation of space (2D cadastral unit) into three-dimensional cadastral systems. Recent technological developments provide the necessary tools, such as spatial databases, 3D GIS and CAD applications, that may be effectively used for 3D modelling, spatial analysis and visualisation of 3D cadastral objects. This paper proposes a smooth transition from the existing 2D GIS national cadastral systems to a 3D hybrid model, so as to preserve the existing 2D unit systems but also integrate 3D representations of the physical objects. Three alternative approaches for the development of the 3D cadastral system are developed and presented. In the first two approaches, the cadastral registration of objects is implemented in the SDBMS of Oracle Spatial, which is linked to a Geographic Information System (ArcInfo) and AutoCAD Map 3D respectively, so that the properties are visualised in three dimensions. In the third approach, the registration and 3D representation of cadastral objects are implemented in the ArcGIS environment, while Google SketchUp is chosen for their 3D modelling because of its capability to incorporate 3D models into ArcGIS. An application of the 3D cadastral process is developed for the island of Kimolos, Greece, in which typical examples of special real property objects exist, such as settlements carved into the rocks with overlapping projections of the private properties, and/or settlements with buildings arching over the alleys with overlapping projections of the private and public properties. The results of the implementation of the three approaches are given, and the advantages and weaknesses of each approach are assessed through appropriate comparisons. Finally, a proposal for the best procedure for creating a 3D hybrid cadastral model is given. Further information:
4. Integrating network structures of different geometric representations
The availability of reliable vector geodata is increasing rapidly. However, there is still a lack of appropriate tools and processes for integrated data management and analysis solutions that can handle the diversity of geodata, since structural, geometric and topological aspects all affect their data modelling. This paper presents a process designed to handle not only existing geometric and topological differences but also structural differences associated with the interoperation and representation of 2D networks. While network structures (such as roads) are usually treated as areal objects in cadastral databases, they are commonly treated as linear objects in topographic databases. Our integration method is designed to solve not only the positional conflicts in the geodata, but also the dissimilarities that result from the different structural geometric representation primitives used. A localised geometric matching process is introduced for aligning these networks, in which distortions are monitored and quantified locally via sets of specifically selected observation constraints derived from the geometric structures. The aim is to ensure that the spatial consistency of the 2D geodata is maintained. The outcome presents a significant improvement on the initial state, suggesting a reliable solution to the problem of creating a homogeneous unified geodata infrastructure with a statistically sound basis. Further information:
5. Change detection of buildings in suburban areas from high resolution satellite data developed through object based image analysis
The aim of this paper was to investigate the development of a fuzzy knowledge base within an object based image analysis (OBIA) system for automatic change detection of buildings. A multitemporal analysis of very high resolution satellite data (QuickBird and IKONOS) was performed. Two case studies for the Keratea suburb of Athens, Greece, were selected. For each dataset, primitive image objects were created through multiresolution segmentation, in five hierarchical levels, following a mixed top-down strategy established by a trial-and-error procedure. Subsequently, each object was assigned by fuzzy classification to one of the classes representing the land cover/use categories of each level. The aim of the classification procedure was to separate the image objects into buildings and non-buildings, extract the changes occurring between the two dates, and perform qualitative and quantitative evaluation. Further information:
6. Voxel based volumetric visibility analysis of urban environments
The objective of this work is to develop integrated volumetric visibility analysis and modelling for environmental and urban systems. This work involves interdisciplinary research efforts that focus primarily on the architectural design discipline and Geoinformatics. The work integrates an advanced Spatial Openness Index (SOI) model within a realistic geovisualised Geographical Information System (GIS) environment. It is based on the assumption that the measured volume of visible space can indicate the perceived density. Most previous work aimed at computing visibility in open terrain and was based on the common Line of Sight (LOS) approach. Open terrain is usually defined as a 2·5D Digital Elevation Model (DEM), and the visibility analysis is carried out by computing profiles from a viewpoint to all the DEM points. Applying the DEM/LOS method in a 3D urban environment causes severe difficulties: a 2·5D DEM structure is unable to accurately model 3D objects (especially complex buildings), and exact visibility computation is a long process that requires very detailed scanning of the 3D objects. Accordingly, in order to bypass these difficulties, a new approach has been developed, based on subdividing the urban environment volume into voxels (volume elements, representing a value on a regular grid in 3D space). Implementing a spatial intersection between the buildings and the 3D grid of voxels on the one hand, and applying a sophisticated computation sequence that handles each voxel only once on the other, enables the efficient computation of visibility in a fast, flexible and accurate process. Moreover, in contrast to the common approach of a binary visibility decision, in which a point is either visible or invisible, the suggested approach computes visibility as a continuous figure with in-between values from fully visible to fully invisible. This visibility model measures the volume of visible space at any required viewpoint. The model enables accurate 3D simulation of the built environment regarding built structure and surrounding vegetation. A 3D model of our case study, the Neve Sha'anan neighbourhood in Haifa, was developed. The paper introduces the model, explains its main attributes and demonstrates the procedure of evaluating/measuring a realistic built environment. The model is planned to be assessed using subjective residents’ evaluations. The results of this research show its potential contribution to professional users, such as researchers, designers and city planners, while also being easily used by non-professionals such as city dwellers, contractors and developers. Further information:
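The continuous visibility idea above can be sketched in a few lines of code. This is a minimal illustration under simplifying assumptions, not the authors' SOI implementation: blocked voxels are stored as a set of integer grid indices (a made-up layout), and visibility between two voxel centres is the fraction of sample points along the sight line that fall in empty voxels, yielding a continuous value between fully invisible (0·0) and fully visible (1·0).

```python
def visible_fraction(blocked, src, dst, samples=100):
    """Continuous visibility between two voxel centres: the fraction of
    sample points along the sight line lying outside blocked voxels."""
    clear = 0
    for i in range(1, samples + 1):
        t = i / (samples + 1)
        # nearest voxel index for the sample point on the segment src-dst
        p = tuple(int(round(s + t * (d - s))) for s, d in zip(src, dst))
        if p not in blocked:
            clear += 1
    return clear / samples

# a single blocked voxel between the two viewpoints (hypothetical scene)
blocked = {(2, 0, 0)}
print(visible_fraction(blocked, (0, 0, 0), (4, 0, 0)))  # roughly 0.75
```

A real implementation would intersect building footprints with the voxel grid and visit each voxel once, as the abstract describes; the sampling loop here is only meant to show how the binary visible/invisible decision generalises to a fraction.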
7. Can visibility predict location? Visibility graph of food and drink facilities in the city
The spatial arrangement of socioeconomic facilities in the city is shaped by the interaction of many individuals in the context of a particular physical structure. The urban physical environment displays characteristics of networks (graphs) whose nodes and edges are embedded in space. For decades, the analysis of urban network structure has represented an attractive model for describing urban phenomena. This paper presents novel means of understanding how socioeconomic activities are distributed in the urban environment, what forces influence their spatial patterns and how urban structure and functions are mutually dependent. We investigate the functional aspect of urban spatial networks; specifically, we study the spatial distribution of food and drink public facilities in the historical district of Tel Aviv-Yafo, Israel. These places (cafés, coffee shops, restaurants and others) are known as ‘third places’ in urban sociology and play an important role in establishing a sense of place. We propose a novel graph analytic framework in which the third places are incorporated by means of visual accessibility. The development of this framework emerged from the concept of the Integrative Visibility Graph (IVG), a quantitative method based on visibility analysis of urban structure and its functioning. Several centrality measures from complex network theory are applied to the proposed graphs in order to evaluate the structural position of third place locations in the urban network. Our findings illustrate a strong correlation between street centrality values and third place distribution. Further information:
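To make the centrality idea concrete, here is a hedged sketch of one such measure (closeness centrality, computed by breadth-first search) on a toy undirected graph; the IVG construction itself is not reproduced, and the node names are invented for illustration.

```python
from collections import deque

def closeness(adj, s):
    """Closeness centrality of node s in an unweighted graph:
    (number of other reachable nodes) / (sum of shortest-path lengths)."""
    dist = {s: 0}
    q = deque([s])
    while q:                      # breadth-first search from s
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    total = sum(dist.values())
    return (len(dist) - 1) / total if total else 0.0

# toy visibility graph: a chain a - b - c
adj = {'a': ['b'], 'b': ['a', 'c'], 'c': ['b']}
print(closeness(adj, 'b'))  # the middle node scores highest: 1.0
```

The middle node of the chain scores highest, mirroring the paper's intuition that structurally central, visually accessible locations are attractive to third places.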
8. Obituary: Alan Frederic Wright Further information:
9. Correspondence on ‘Establishment of the MGI EDM calibration baseline’ by B. Bozic, H. Fan and Z. Milosavljevic Further information:
10. Response to correspondence on ‘Establishment of the MGI EDM Calibration Baseline’, by B. Bozic, H. Fan and Z. Milosavljevic Further information:
Survey Review 45, No 332. September/October 2013

1. Assessment of EGM2008 over Britain using vertical deflections, and problems with historical data
Vertical deflections synthesised from the Earth Gravitational Model 2008 (EGM2008) agree with astrogeodetic vertical deflections observed over mainland Britain to within ~1·2″ RMS (north-south) and ~1·4″ RMS (east-west), which is commensurate with values reported for North America, Australia and parts of continental Europe. For this assessment in Britain, there has been the additional need to transform the observed relative vertical deflections to absolute ones. Not applying horizontal datum transformations led to spurious results, so absolute vertical deflections must always be used to assess EGMs. Three datum transformations were trialled (three-parameter, seven-parameter and OSTN02), which show similar results when considering the estimated ~0·3″ precision of these historical (1950-1976) astrogeodetic observations. Several other problems were encountered because of the historical nature of the data, comprising the destruction of survey pillars, ambiguous station names and the mixture of horizontal geodetic datums available in Britain. Further information:
2. Minimum mapping units in topographic information systems: a case study from Croatia
The paper discusses the minimum mapping units used in topographic information systems at a scale of 1:25,000. It draws on experience in data acquisition based on given criteria within ATKIS, traditional Croatian topographic maps and STOKIS. Emphasis is given to forest and arable land minimum mapping units (MMUs). A more adequate MMU is recommended on the basis of the research conducted. The model can also be used for other topographic information system scales. The research results are applicable in countries attempting to create their own topographic information system, or wanting to produce a new edition of their object catalogue, as is the case in Croatia. Further information:
3. Automatic positional accuracy assessment of geospatial databases using line-based methods
In this paper, we present a methodology for automating the positional assessment of vector geospatial databases (GDBs). We establish a framework that takes a non-punctual perspective of the accuracy assessment problem, i.e. considering polygons (which represent buildings) as closed linear shapes and using techniques based on buffer generation on the perimeter lines, which allow us to analyse the displacement between the location of the polygons stored in a GDB (tested source) and the location given by another GDB (reference source) of higher accuracy. In order to determine a set of homologous objects in both GDBs, a genetic algorithm is used. The results obtained demonstrate the viability of this evaluation and its potential compared to traditional methods. Further information:
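A toy version of the buffer idea can be written in a few lines. This is not the authors' algorithm (and omits the genetic matching of homologous objects entirely); it merely illustrates the principle of reporting the fraction of a tested outline that lies within a buffer of chosen width around a reference line, using made-up coordinates.

```python
import math

def point_segment_dist(p, a, b):
    """Shortest distance from point p to the segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    L2 = dx * dx + dy * dy
    if L2 == 0.0:                       # degenerate segment
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / L2))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def point_line_dist(p, line):
    """Distance from p to a polyline given as a list of vertices."""
    return min(point_segment_dist(p, a, b) for a, b in zip(line, line[1:]))

def buffer_inclusion(tested, reference, tol):
    """Fraction of tested vertices lying within a buffer of half-width tol
    around the reference line: a crude stand-in for the buffer overlay."""
    inside = sum(1 for p in tested if point_line_dist(p, reference) <= tol)
    return inside / len(tested)

ref = [(0.0, 0.0), (10.0, 0.0)]                       # reference perimeter line
tst = [(0.0, 0.3), (5.0, -0.2), (10.0, 0.6)]          # tested perimeter vertices
print(buffer_inclusion(tst, ref, 0.5))                # 2 of 3 vertices inside
```

A production method would buffer the reference polygon geometry itself and intersect line lengths rather than counting vertices, but the displacement statistic being estimated is the same in spirit.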
4. Laser scanner point cloud colouring algorithm applied on a real site
Virtual 3D models serve many fields, such as architectural conservation, cultural heritage documentation, and industrial and engineering applications. It is difficult to recover a complex site as a model; in some cases, the modelling process is impossible for objects like trees and pedestrians. A coloured point cloud offers a suitable solution, especially for complex sites: it saves time, effort and, of course, money. Historical sites can then be accessed virtually by people all over the world, not only as tourists but also as researchers. A coloured point cloud is obtained by fusing the recent techniques of laser scanning and digital photogrammetry. Scanners equipped with digital cameras deliver a coloured point cloud for the captured scene in one step. After the registration step between multiple point clouds, however, the user obtains poor colour quality. In this paper, a new laser points colouring (LPC) algorithm, integrated in the 3DImage software, is developed in order to reprocess the merged point clouds. This simplifies the documentation workflow, giving more historical sites the chance to be documented. The recovery of a cross-section in Germany and a historical site in Turkey are presented. The final coloured point clouds for both sites show the efficiency of the developed algorithm and software. Further information:
5. A coordinate vector correction method to improve the traditional affine transformation of graphic digitised cadastral maps
Parametric coordinate transformation treats the whole transformation region under the same conditions for coordinate adjustment; the practical errors and ideal corrected values are not discussed individually. The proposed coordinate vector correction method uses ‘area’ and ‘point’ to further correct individual coordinates in order to amend the blind spots resulting from overall coordinate transformation. The experimental areas in this study were five graphic area sectors in Taichung City. The graphic digitised TWD67 coordinate system was converted using affine transformation into the TWD97 coordinate system. The coordinate vector correction method was then used to correct the various boundary point coordinates, and the results were compared with the affine transformation results and analysed. According to the data obtained from the experimental areas, the root-mean-square error of the converted and registered areas obtained with the traditional affine transformation could be reduced by 6-69% by using the proposed coordinate vector correction method. The number of land parcels exceeding the margin of error (the margin of error adopted in this paper is 0·2√A+0·0003A, where A is the total area of the land parcel in m²) decreased by 26-35%. As for boundary point displacement, the average boundary point offset of the areas was 0·3-4·2 cm. The positions of boundary points could be corrected by small amounts using the proposed coordinate vector correction method, so that the discrepancies between the area formed by the boundary points and the registered area could be effectively resolved. Further information:
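For readers unfamiliar with the baseline being improved here, a six-parameter planar affine transformation can be fitted by least squares as sketched below. The coordinates are invented for illustration, and the snippet does not include the paper's vector correction step; it only shows the conventional transformation that the correction is applied on top of.

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(3):
        piv = max(range(c, 3), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(3):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [v - f * w for v, w in zip(M[r], M[c])]
    return [M[i][3] / M[i][i] for i in range(3)]

def affine_fit(src, dst):
    """Least-squares affine parameters mapping src -> dst:
    X = a*x + b*y + c,  Y = d*x + e*y + f."""
    rows = [(x, y, 1.0) for x, y in src]
    AtA = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    AtX = [sum(r[i] * X for r, (X, _) in zip(rows, dst)) for i in range(3)]
    AtY = [sum(r[i] * Y for r, (_, Y) in zip(rows, dst)) for i in range(3)]
    return solve3(AtA, AtX), solve3(AtA, AtY)

# made-up control points: a stretch by (2, 3) plus a shift of (10, 20)
src = [(0, 0), (1, 0), (0, 1), (1, 1)]
dst = [(10, 20), (12, 20), (10, 23), (12, 23)]
(a, b, c), (d, e, f) = affine_fit(src, dst)
```

With four or more well-distributed control points the normal equations above are the standard way the TWD67 to TWD97 conversion by affine transformation would be set up; the paper's contribution is the per-point correction applied afterwards.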
6. Ontology for real estate cadastre
Standardisation efforts in the geospatial domain, led by ISO TC 211 and the OpenGIS Consortium, have produced many specifications and standards that define the structure and encoding rules of data and the interfaces of services. These standards provide syntactic interoperability of geospatial data and services, while neglecting their semantics. In order to achieve semantic interoperability, i.e. to explicitly define the meaning of data and services so as to make them understandable to both machines and humans, there are attempts to develop a Geospatial Semantic Web, in which the semantics of data will be expressed explicitly and formally. An important part of this process is the development of domain ontologies, i.e. a common vocabulary for a given domain. In order to bring semantics into cadastral systems, it is necessary to develop an ontology for the real estate cadastre. The authors therefore propose an ontology-based knowledge model for the field of real estate cadastre, based on the ISO 19152 international standard and other geospatial standards, as a core ontology for the cadastre on top of which a domain ontology for a specific country should be built. The domain ontology presented in this paper refers to the real estate cadastre of Serbia, but a similar approach can be adopted for different cadastral organisations in other countries. In this way, different domain ontologies based on various cadastral organisations will conform to the same standard-based core ontology, which will provide a basis for the semantic search and integration of cadastral data at national and international levels. Further information:
7. Solving the planar intersection problem using a Gauss quadrature rule exact for third-order monomials
The intersection problem is a basic problem in geodesy and surveying. In the planar intersection problem, if the observation noises of the intersection angles are assumed Gaussian, the calculation of the mean and covariance of the target’s coordinates is constructed as a multidimensional nonlinear integral, i.e. the Gauss weighted integral of the nonlinear intersection functions. From the perspective of numerical integration, the conventional method based on the Jacobian matrix of a nonlinear function is just the result of simplifying the integrated nonlinear function by its first-order Taylor series truncation. The Gauss quadrature rule which is exact for all monomials of order not greater than three, constructed by McNamee and Stenger, is adopted to numerically solve the integral, and a derivative-free method (without the derivation of the Jacobian matrix of the nonlinear intersection function) is proposed to calculate the mean and covariance of the target’s coordinates. The proposed method uses five elaborately sampled quadrature points and their corresponding weights to transform the mean and covariance of the intersection angles into the mean and covariance of the target’s coordinates. Both the mean and covariance of the coordinates calculated by the Gauss quadrature rule based method are exact up to second-order terms of the Taylor series of the intersection function. A simulation is constructed in which the true mean and covariance of the target’s coordinates are obtained with the Monte Carlo method using 100,000 randomly sampled sets of intersection angles. The mean and covariance of the coordinates are estimated using both the conventional and the proposed method, and both are compared to the Monte Carlo true values. Simulation results show that the Gauss quadrature rule based method gives higher precision for both the coordinates and their covariance than the conventional method. The work may also have value for other geodetic and surveying topics in which the nonlinear transformation of a mean and/or covariance is involved. Further information:
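The five-point, degree-3 propagation idea (closely related to the unscented transform) can be sketched as follows. The station layout, bearings and noise level below are invented for illustration, and the weights are one symmetric choice of a degree-3 rule, not necessarily the exact McNamee-Stenger construction used in the paper.

```python
import math

A, B = (0.0, 0.0), (100.0, 0.0)     # hypothetical known stations

def intersect(t1, t2):
    """Planar forward intersection of bearing t1 from A and t2 from B
    (bearings measured clockwise from north, i.e. the +y axis)."""
    d1 = (math.sin(t1), math.cos(t1))
    d2 = (math.sin(t2), math.cos(t2))
    r = (B[0] - A[0], B[1] - A[1])
    cross = lambda u, v: u[0] * v[1] - u[1] * v[0]
    s = cross(r, d2) / cross(d1, d2)          # distance along ray from A
    return (A[0] + s * d1[0], A[1] + s * d1[1])

def quadrature_propagate(mu, sig):
    """Five-point degree-3 quadrature for two independent angle errors
    with standard deviation sig, propagated through intersect()."""
    n, k = 2, 1.0
    c = math.sqrt(n + k) * sig
    pts = [(mu[0], mu[1]),
           (mu[0] + c, mu[1]), (mu[0] - c, mu[1]),
           (mu[0], mu[1] + c), (mu[0], mu[1] - c)]
    w = [k / (n + k)] + [1.0 / (2 * (n + k))] * 4
    ys = [intersect(t1, t2) for t1, t2 in pts]
    mean = tuple(sum(wi * y[i] for wi, y in zip(w, ys)) for i in range(2))
    cov = [[sum(wi * (y[a] - mean[a]) * (y[b] - mean[b])
                for wi, y in zip(w, ys)) for b in range(2)] for a in range(2)]
    return mean, cov

mu = (math.radians(45.0), math.radians(-45.0))   # bearings toward (50, 50)
mean, cov = quadrature_propagate(mu, math.radians(0.01))
```

No Jacobian is formed anywhere: the nonlinear `intersect` function is simply evaluated at the five weighted points, which is exactly the derivative-free character the abstract describes.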
8. Search procedure for improving the modified ambiguity function approach
The modified ambiguity function approach (MAFA) is a method of GNSS carrier phase data processing. In this method, the functional model of the adjustment problem contains conditions ensuring the ‘integerness’ of the ambiguities. These conditions are expressed in the form of a differentiable function. A prerequisite for obtaining the correct solution is a mechanism ensuring not only the ‘integerness’ of the ambiguities, but also appropriate convergence of the computational process. One such mechanism is a cascade adjustment, applying linear combinations of the L1 and L2 signals with integer coefficients and various wavelengths. Another method of increasing the efficiency of the MAFA method is based on the application of the integer decorrelation matrix to transform the observation equations into equivalent, but better conditioned, observation equations. This paper presents a search procedure as a further technique for improving the MAFA method. This technique, together with the decorrelation procedure, reduces the number of stages of the cascade adjustment and yields the correct solution even when the a priori position is a few metres away from the actual position. An example of data processing using the proposed algorithm is given, and the results of numerical tests based on real data are presented. Further information:
9. Impact of different GNSS antenna calibration models on height determination in the ASG-EUPOS network: a case study
Before 6 November 2006, the International GNSS (global navigation satellite system) Service used relative phase centre models for GNSS receiver antennas. When absolute calibration models were introduced, significant differences appeared in the scale of GNSS networks compared to very long baseline interferometry and side-looking radar measurements. The differences were due to the lack of GNSS satellite antenna calibration models. Once this problem was sufficiently resolved, the International GNSS Service decided to switch from relative to absolute models for both satellites and receivers. This decision caused some variations in the results of GNSS network solutions, especially in the vertical component. To date, the problem of switching from relative to absolute antenna phase centre variations has mainly been considered for global or continental networks using relatively long observation sessions. The aim of this paper was to study the height differences caused by using different calibration models in GNSS observation processing in the national GBAS network (ASG-EUPOS). The analysis used 3 days of GNSS data, collected with four different receivers and antennas, divided into 1 h observation sessions. The results of the calculations show that switching from relative to absolute phase centre variation models may have a significant effect on height determination in the ASG-EUPOS network, particularly in high accuracy applications. Further information:
Survey Review 45, No 331. July/August 2013

1. The point cadastre requirement revisited
Certain countries need to establish a faster, cheaper and more fit-for-purpose cadastre than those offered by conventional strategies. This paper reintroduces the strategy of the point cadastre: a cadastral system in which geographic points are used to represent land parcels. When point features are combined with satellite imagery and freely available topographic maps (e.g. OpenStreetMap), and managed using cloud-based geographic information services, a simple cadastral solution becomes apparent. This paper concentrates only on defining the drivers and requirements for point cadastres. Three discrete studies were used to generate the requirements: expert group meetings, a pressure cooker meeting and an online questionnaire. The requirements are classified under preparation, functional, quality and architectural categories. Preparation requirements illustrate the need for contextual awareness before commencing any point cadastre project. Functional requirements are found to be similar to the requirements of parcel based cadastres; however, the necessity for parcel boundary identification is removed. Quality requirements promote the need for ‘ease of use’ and ‘low cost’: ‘accuracy’ is found to rank lowest of the six quality requirements. Architectural requirements provide various options for collecting, storing, maintaining and visualising the cadastral point information. Together, the requirements provide a basic blueprint for cadastral practitioners considering point cadastre solutions. Further work is required on the development of indicators for assessing achievement of the requirements in practice. Further information:
2. Obtaining orthophotographs using SRTM digital models
Orthophotographs, orthoimages and orthophotomaps have become common and fundamental documents for urban planning, civil engineering projects, etc., by themselves or as a complement to geographic information systems. The success of these products, as opposed to topographic maps, lies in the ease of locating zones and buildings on them. This paper presents several digital elevation models (DEMs) obtained by photogrammetric correlation and shuttle radar topography mission (SRTM) models. Not only is the potential application of SRTM models analysed, but also their direct use and their support for photogrammetric correlation. This process has been applied to several zones that differ in topography: plain, undulating and mountainous. The analysis showed these models to be useful in obtaining orthophotomaps accurate enough to work at medium to small scales in mapping, engineering projects, etc. They save time and money in the map production process, and they are mainly useful in engineering works and projects in developing areas that have no more accurate digital terrain model (DTM) available, nor the means to obtain one. Valid scales are analysed for each type of terrain. Likewise, the DTMs would be very useful for obtaining not only orthophotographs but also rectified photographs in global applications for free distribution. A cost reduction study is also presented for the case in which these models are used as part of the digital photogrammetric process to obtain orthophotomaps. Further information:
3. Accuracy of vertical datum surfaces in coastal and offshore zones
The Vertical Offshore Reference Frames (VORF) project is described, with a summary of the methodology and an explanation of the data sets used. The latter include satellite altimetry, tidal and geoid models, long and short term tide gauge data, and specially undertaken GNSS observations. The paper goes on to describe the theoretical basis for deriving spatially variant error estimates that respond to the varying quality of the input data. It then describes the testing programme undertaken by the United Kingdom Hydrographic Office, which included 245 checks on datum connections at mostly coastal points, 63 comparisons between VORF-corrected tidal levels observed with GNSS and tide gauge data, and six specially commissioned offshore tide gauge deployments. It is shown that across the vast majority of the domain of applicability the VORF surfaces meet their target accuracies of 0·10 m inshore and 0·15 m offshore (both 1σ values), and that the formal uncertainties are a fair reflection of the errors actually encountered. The main discrepancy between the modelled surfaces and the test data is found in the sharply varying tidal regime south of Portland on the south coast of England; however, preliminary results from incorporating the next generation of global ocean tide models show a marked improvement in this area. Further information:
4. Establishment of the MGI EDM calibration baseline
This paper deals with the estimation of the quality of a baseline established by the Serbian Military Geographic Institute for the calibration of distance measurement devices for military use. The basic characteristics of the baseline are explained, and a plan for checking the baseline quality is proposed. The measurements realised so far can be grouped into two phases. The measurements have been processed, and estimates of the distances of this length standard have been obtained. The standard deviations of the least squares estimates of the lengths were better than 0·3 mm in each epoch. This precision offers the possibility of checking all measurement devices with a minimum calibration uncertainty of ±(1 mm + 1 ppm). The stability of the pillars is also analysed: the conventional deformation analysis method was applied to three datasets, and the results obtained by evaluating them are shown. Further information:
5. Msplit transformation of coordinates
The transformation of coordinates allows for the conversion of coordinates from one geodetic system to another. Usually, the determination of transformation parameters is performed by means of the least squares method. Unfortunately, the least squares method is not immune to outliers. This means that if, for any reason, some reference points are disturbed by gross errors, or they belong to two different archival coordinate systems, the transformation parameters will be estimated with those errors. It is therefore very important to identify incorrect data and remove them from the estimation process, or to decrease their influence on the estimated parameters. This problem can be solved by applying Msplit estimation to calculate the transformation parameters. The method of estimation adopted in the paper allows the determination of two competitive vectors of transformation parameters and two competitive residual vectors. The suitability of using the Msplit estimation method in the process of coordinate transformation was tested on a real geodetic network. In the ‘Msplit estimation’ section the authors present the idea of Msplit estimation, along with its application to the estimation of transformation parameters. The authors performed the calculations in three scenarios, with different numbers, values and distributions of gross errors respectively. The results of the transformations compared with the catalogue values of the coordinates, as well as the differences between the coordinates after the Helmert transformation, the Msplit transformation (for the parameter vector Vα), the Msplit transformation (for the parameter vector Vβ) and the catalogue coordinates, are presented in the ‘Numerical example’ section. Further information:
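The Helmert (four-parameter similarity) transformation used here as the comparison baseline has a closed-form least squares solution; a minimal sketch with invented coordinates is given below. Msplit estimation itself, which splits the parameters into two competitive vectors, is beyond a short example and is not reproduced.

```python
def helmert2d(src, dst):
    """Least-squares four-parameter (Helmert) transformation:
    X = a*x - b*y + tx,  Y = b*x + a*y + ty,
    with a = s*cos(alpha), b = s*sin(alpha)."""
    n = len(src)
    mx = sum(p[0] for p in src) / n
    my = sum(p[1] for p in src) / n
    mX = sum(p[0] for p in dst) / n
    mY = sum(p[1] for p in dst) / n
    sxx = sum((x - mx) ** 2 + (y - my) ** 2 for x, y in src)
    a = sum((X - mX) * (x - mx) + (Y - mY) * (y - my)
            for (x, y), (X, Y) in zip(src, dst)) / sxx
    b = sum((Y - mY) * (x - mx) - (X - mX) * (y - my)
            for (x, y), (X, Y) in zip(src, dst)) / sxx
    tx = mX - a * mx + b * my
    ty = mY - b * mx - a * my
    return a, b, tx, ty

# made-up control points: rotation 90 degrees, scale 2, shift (5, 5)
src = [(0, 0), (1, 0), (0, 1)]
dst = [(5, 5), (5, 7), (3, 5)]
a, b, tx, ty = helmert2d(src, dst)
```

A gross error in any of the `dst` points would bias all four parameters, which is the weakness of least squares that motivates the Msplit approach described above.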
6. Checking GNSS-determined positions with EDM-observed distances Standard surveying practice in Connecticut requires that positions determined using global navigation satellite system methods be checked by comparing their inversed distance against an electronic distance measuring device distance, but no guidance was provided on how to do this. A statistical test is called for, which requires means, variances and a probability density function for the test statistic. The electronic distance measuring device distance can be assumed to follow a normal distribution, but inversed distances belong to an unnamed distribution (akin to a chi distribution). This paper first formulates a standard statistical test to perform the check using a maximum likelihood ratio, which turns out to be not very practical. Several simplifying assumptions are shown to be warranted that allow a much simpler, but practically equivalent, z-test. Formulas for three-dimensional, geocentric Cartesian and two-dimensional, topocentric Cartesian positions are developed, with allowable limits placed on horizontal and vertical separations for topocentric coordinate system positions. Further information:
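The kind of simplified z-test the paper arrives at can be roughly illustrated as follows, a sketch assuming independent, approximately normal distances and a 5% significance level; the paper's actual formulas for geocentric and topocentric coordinates are more detailed.

```python
import math

def distance_check_z(d_gnss, sigma_gnss, d_edm, sigma_edm):
    """Two-sided z-test at the 5% level on the difference between an
    inversed GNSS distance and an EDM distance (metres, with standard
    deviations); returns (passes, z)."""
    z = (d_gnss - d_edm) / math.sqrt(sigma_gnss**2 + sigma_edm**2)
    return abs(z) <= 1.96, z  # 1.96 = two-sided 5% critical value of N(0,1)
```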
7. On accuracy specifications of electronic distance meter The electronic distance meter (EDM) is an essential surveying tool. The principle of operation is the same among various EDM devices such as the stand-alone EDM device, the theodolite-mounted EDM unit and the coaxial design integrated with a total station. However, two interpretations exist for EDM accuracy specifications. This study investigates the meaning and cause of the two interpretations of identical accuracy specifications. Explanations from textbooks and manufacturers are reviewed. The procedures for deriving the values used in the accuracy specifications in practice are also briefly detailed. This research suggests that a unified interpretation of EDM accuracy specifications would be beneficial for EDM education and practice. Further information:
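The two readings of a specification such as ±(a mm + b ppm) are commonly taken to be simple addition versus combination in quadrature; the sketch below illustrates both under that assumption (which of the two interpretations the paper analyses is not stated in the abstract).

```python
import math

def edm_spec_additive(d_m, a_mm, b_ppm):
    # reading 1: constant and distance-proportional parts simply added
    return a_mm + b_ppm * d_m / 1000.0

def edm_spec_quadrature(d_m, a_mm, b_ppm):
    # reading 2: the two parts combined in quadrature (root sum of squares)
    return math.sqrt(a_mm**2 + (b_ppm * d_m / 1000.0)**2)
```

For a ±(2 mm + 2 ppm) instrument over 1 km, the two readings give 4·0 mm and about 2·8 mm respectively, which is why the ambiguity matters in practice.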
8. Influence of lateral refraction on measured horizontal directions The objective of this research was to gain new insights concerning the impact of lateral refraction on the measurement of horizontal angles. The study includes various tests designed to better understand this phenomenon. This paper presents information from measurements conducted at the Laboratory for Measuring and the Measurement Technique of the Faculty of Geodesy in Zagreb and describes the preparatory activities, measurements and calculations of temperature gradients. Results of 10 different tests are shown, in which hot and cold sources of refraction were placed at different distances from the instrument. The measurement results were analysed and, using indirect measurements, new constants were calculated for the existing algorithms for calculating the effects of lateral refraction. The new algorithms are given. Further information:
9. Efficiency of BERNESE single baseline rapid static positioning solutions with search strategy Rapid static GPS positioning is an alternative to real time kinematic (RTK) GPS positioning or to static GPS positioning. With careful attention, one can obtain cm-level positioning using the method. Rapid static GPS has become popular through online rapid static GPS positioning services that emerged only recently, and researchers have since studied the accuracy of rapid static GPS from various software packages. However, a thorough statistical evaluation of rapid static GPS results has been missing in the field. In this study, we analysed GPS data from SOPAC archives, aiming to provide a richer statistical description of BERNESE rapid static GPS solutions. We applied SEARCH ambiguity resolution in a single baseline mode over baseline lengths shorter than 10 km. We investigated the distribution of the results, determined an appropriate outlier detection strategy using traditional and robust techniques, and studied systematic behaviours in the data sets. The research results indicate that the number of successful solutions from the processing is decreased due to failures in ambiguity fixing if typical 1–15 min observing session durations are adopted. The failed ambiguity resolution results exist as outliers in the solution series, which degrades the efficiency of the method. At this stage, robust outlier detection appears to be a fast, effective and objective method to clean these outliers. A fall in the outlier percentage is observed by extending the session duration for rapid static surveying from the typical 1–15 to 30 min. Further information:
10. Contribution of instrument centring to the uncertainty of a horizontal angle One of the sources of uncertainty in the measurement of a horizontal angle in surveying comes from the instrument centring. There are two ways of evaluating the uncertainty coming from this source. One is to assume that the instrument centring uncertainty is dependent on both directions in the angle. The other is to assume that it is independent for each direction. In this study, the differences between the two approaches are analysed and some examples are given. For lengths of sight longer than 100–150 m, these differences are minimal. For short sighting distances of 10 m and a 1 mm centring uncertainty, a maximum positive difference of 90 centesimal seconds (cc) (29·1'') is derived when the angle is close to 0 or 400 grads (0° or 360°), and a maximum negative difference of −40 cc (−13'') when the angle equals 200 grads (180°). The differences increase as the instrument centring uncertainty increases; an accurate setting up of the instrument is thus important. When assessing the effect of the differences between the two approaches on the uncertainty of a horizontal angle, other sources of uncertainty should also be taken into account, mainly reading and pointing, target centring and target levelling. As is shown in the examples, the differences between the two approaches become small once all components are considered. This study is relevant and useful for the uncertainty budget of any surveying task where the two approaches are applicable, e.g. traverses or the assessment of tolerances in straight line traverses between two points of known coordinates. In addition, it is useful to investigate whether the differences are relevant in least squares adjustments of survey networks of observed directions when compared to network adjustments of observed angles. Further information:
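The two approaches can be illustrated with standard textbook propagation formulas (an assumption; the paper's exact derivation may differ), where e is the centring uncertainty, d1 and d2 the sighting distances and β the angle. With e = 1 mm and d1 = d2 = 10 m these formulas give differences of roughly +90 cc near 0 grads and about −37 cc near 200 grads, close to the values quoted above.

```python
import math

RHO_CC = 400 * 10000 / (2 * math.pi)  # centesimal seconds (cc) per radian

def sigma_angle_dependent(e, d1, d2, beta):
    """Centring uncertainty of an angle when both directions are affected by
    the SAME instrument displacement (correlated case); e, d1, d2 in metres,
    beta in radians; result in cc."""
    c = math.sqrt(d1**2 + d2**2 - 2 * d1 * d2 * math.cos(beta))
    return e * c / (d1 * d2) * RHO_CC

def sigma_angle_independent(e, d1, d2):
    """Centring uncertainty when the two directions are treated as
    independent; result in cc."""
    return e * math.sqrt(1 / d1**2 + 1 / d2**2) * RHO_CC
```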
Survey Review 45, No 330. May/June 2013 1. Displacement of GNSS permanent stations depending on the distance to the epicentre due to Japan’s earthquake on 11 March 2011 One of the effects of earthquakes is the permanent displacement of the areas near the epicentre. Global navigation satellite system (GNSS) technologies and permanent station networks have created a tool and an essential terrestrial reference frame for the study of those displacements. This paper aims to use GNSS techniques to locate and geographically quantify the seismic displacements arising over time as a result of the Tohoku earthquake of 11 March 2011. International GNSS Service data have been used to adjust the coordinates of the permanent stations, and calculations using these coordinates indicate that displacement decreases as the distance to the epicentre increases. Further information:
2. Optimisation of the new Croatian fundamental levelling network A height system reconstruction includes renovation of the benchmark fields and geometric levelling networks. It is a periodic process whose purpose is to ensure a reliable, high-quality and accurate height system for different users. The height system reconstruction in the Republic of Croatia is based on the II NVT levelling lines (second levelling of high accuracy). Croatia’s specific territorial shape and geopolitical relationships must be considered for the new levelling polygons. Since the height system reconstruction in Croatia is imminent, the purpose of this article is to investigate a priori the influences of levelling lines in the neighbouring countries on the accuracy of the nodal benchmarks in the newly projected Croatian fundamental levelling network. Further information:
3. Motorised digital levels: development and applications A low-cost motorisation of the Leica DNA03 digital level was designed to enable the automatic monitoring of deformations without compromising its manual use in levelling. The upgrade of the instrument and its electronic control system are described together with laboratory tests carried out to study the effects of ambient temperature on the level. A simple model for the correction of measurements as a function of the temperature is proposed. The detection of vertical movements with submillimetre accuracy was obtained in a simulated permanent monitoring system. The operational advantages of the instrument are shown in a field test. Despite the encouraging results, research is still needed to find a reliable model for the correction of the staff readings under changing temperature conditions. Further information:
4. Fast GNSS ambiguity resolution by ant colony optimisation Global Navigation Satellite System (GNSS) carrier phase ambiguity resolution (AR) is the key technique for high precision positioning and navigation. Ant colony optimisation (ACO), a stochastic metaheuristic method, solves combinatorial optimisation problems by constructing solutions iteratively using a colony of ants guided by pheromone trails and heuristic information. This paper explores the effectiveness of ACO in dealing with the AR problem and the closest lattice point problem. The performance of this new method is evaluated on several simulated examples of different dimensions. The results show that the proposed algorithm can compete efficiently with other promising approaches to the problem and often provides integer optimal solutions in the simulated scenarios. We hope that this paper provides a starting point for research into applying the ACO algorithm and other stochastic methods to the AR problem and other GNSS problems, given the simplicity of the algebraic manipulation involved. Further information:
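The underlying integer (closest lattice point) problem can be sketched as a brute-force search over candidate integer vectors; ACO replaces this exhaustive enumeration with a stochastic, pheromone-guided construction of candidates. The helper below is illustrative only, not the paper's algorithm.

```python
import itertools
import numpy as np

def ils_bruteforce(a_float, Qinv, radius=2):
    """Find the integer vector minimising (a - a_float)^T Qinv (a - a_float)
    within +/-radius of the rounded float solution, by exhaustive search.
    The search space grows exponentially with dimension, which is what
    motivates metaheuristics such as ACO."""
    a0 = np.round(a_float).astype(int)
    best, best_val = None, np.inf
    for off in itertools.product(range(-radius, radius + 1), repeat=len(a_float)):
        cand = a0 + np.array(off)
        r = cand - a_float
        val = float(r @ Qinv @ r)
        if val < best_val:
            best, best_val = cand, val
    return best, best_val
```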
5. Reengineering of Turkish land administration The role assigned to cadastral systems has evolved over time from supporting taxation to assisting the land market, land management and sustainable development. This change has forced countries to reengineer their traditional cadastral systems into land administration systems. Governments need to develop their own solutions in the reengineering process for their own circumstances. In this context, this paper examines the findings of academic research carried out to analyse the efficiency of the current Land Administration System (LAS) in Turkey and then to develop a new vision for the future of the Turkish LAS. The analysis shows that there is a need for reengineering the Turkish LAS. The main characteristics of the vision, inspired by the case study research carried out in some European countries and the statements of some well known international reports, are a land law, a leading institution and a land information system. Further information:
6. Cycle slips detection algorithm for low cost single frequency GPS RTK positioning Conventionally, cycle slips are detected by combining phase observations or phase/code observations. However, this is unsuitable for single frequency receivers in real time kinematic (RTK) positioning. Therefore, this study introduces an algorithm based on the outlier detection concept for the detection and repair of cycle slips during GPS RTK positioning. The efficacy of the algorithm was verified on a low cost single frequency GPS receiver. To investigate the ability of the algorithm to detect errors within a single cycle, a minimum detectable bias (MDB) of less than one cycle was used as the index of success. Experiments verified the availability of the algorithm up to 96·12% (10° mask angle). The algorithm was able to accurately detect the time at which cycle slips occur and precisely estimate their size in various simulated scenarios. Finally, tests were performed on real data, and the results confirm that the proposed algorithm is applicable to single frequency RTK positioning. Further information:
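A much-simplified stand-in for such detection and repair is to flag jumps in a linear prediction of the phase series and remove the nearest-integer slip; the threshold and prediction model below are illustrative assumptions, not the paper's algorithm (which uses a formal statistical test with a minimum detectable bias).

```python
def detect_and_repair(phase_cycles, threshold=0.5):
    """Flag epochs where the phase deviates from a two-point linear
    prediction by more than `threshold` cycles, estimate the slip as the
    nearest integer and repair the remainder of the series."""
    phase = list(phase_cycles)
    slips = {}
    for k in range(2, len(phase)):
        predicted = 2 * phase[k - 1] - phase[k - 2]  # linear extrapolation
        jump = phase[k] - predicted
        if abs(jump) > threshold:
            slip = round(jump)                       # integer-cycle estimate
            slips[k] = slip
            for j in range(k, len(phase)):           # remove slip downstream
                phase[j] -= slip
    return phase, slips
```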
7. Robustness analysis using the measure of external reliability for multiple outliers Robustness analysis is a natural merger of reliability and strain analyses and is defined as the ability to resist deformations induced by the maximum undetectable errors as determined from internal reliability analysis. Thus far, robustness analysis has been carried out using reliability theory based on the assumption of a single outlier. However, in practice, there might be multiple outliers in a data set, and measures of reliability for multiple outliers ought therefore to be used. This paper extends robustness analysis so that it can determine the deformation induced by multiple undetected errors through the evaluation of a strain matrix using the proper external reliability measure. In this study, the question of whether a network is robust against deformations induced by two or more undetected outliers is investigated. The results indicate that in the case of multiple outliers, the robustness of geodetic networks decreases. Further information:
8. Cadastral system migration from deeds registration to titles registration: case study of Sri Lanka Sri Lanka has two types of cadastral system operating in the country and is in the process of cadastral system migration from deeds registration to titles registration. The deeds system, which was introduced in 1863, is well established, and the titles registration system, the introduction of which is ongoing, was initiated in 1998. As expected, the deeds system has several drawbacks, while the titles system meets unexpected difficulties. The purpose of this paper is to give an overview and a comparative analysis of both systems, thereby suggesting options for improvement. Further information:
9. More efficient methods among commonly used robust estimation methods for GPS coordinate transformation Generally, coordinates obtained by global positioning systems (GPS) can be applied once they are transformed into local or national coordinates. The four- and seven-parameter models for GPS coordinate transformation are two of the most frequently used methods. Robust estimation methods are often used to eliminate or weaken the influence of gross errors on the coordinate transformation. The current paper uses simulation experiments with different numbers of coincident points and of gross errors included in the observations to compare the robustness of 13 commonly used robust estimation methods. The results indicate that the L1 and Geman–McClure methods are relatively more efficient than the other robust estimation methods for GPS coordinate transformation with the four- and seven-parameter models. Further information:
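A typical way to realise such robust estimators is iteratively reweighted least squares; the sketch below approximates the L1 solution by reweighting with w_i = 1/|v_i|. This is an illustration of the general technique, not the paper's exact implementation, and other estimators in the comparison differ only in the weight function.

```python
import numpy as np

def irls_l1(A, l, iters=50, eps=1e-8):
    """Approximate the L1-norm solution of A x ~ l by iteratively
    reweighted least squares: w_i = 1/max(|v_i|, eps) down-weights
    observations with large residuals, e.g. gross errors."""
    x, *_ = np.linalg.lstsq(A, l, rcond=None)     # ordinary LS start
    for _ in range(iters):
        v = l - A @ x
        w = 1.0 / np.maximum(np.abs(v), eps)      # L1 weight function
        x = np.linalg.solve(A.T @ (w[:, None] * A), A.T @ (w * l))
    return x
```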
Survey Review 45, No 329. March/April 2013 1. ZigBee network positioning with support of Real-Time Kinematic GPS and terrestrial measurements Ad hoc network based positioning is particularly useful when fixed positioning infrastructures are unavailable, or were destroyed in a disaster. It can also be used as an augmented system to continuously provide spatial information when satellite positioning systems fail. ZigBee is an emerging wireless technology based on the IEEE 802.15.4 standard. Its advantages include low cost, low power consumption and licence-free operating frequencies. In addition to applications for low volume data transmissions, some ZigBee modules such as the TI/Chipcon CC2431 already have a built-in location engine for positioning applications. One of the main challenges in ZigBee positioning is to obtain the coordinates of the ZigBee reference nodes that form a network of control points for position fixing. This is particularly challenging when the network is continuously expanding and covers areas with various obstructions, and when the control point coordinates of newly covered areas need to be defined in the ZigBee system. In this paper, the ZigBee wireless technology and its positioning concept are introduced, followed by an accuracy test along a long and narrow corridor and a proposal of an algorithm for the quick establishment of a ZigBee network. Our indoor investigations show that ZigBee positioning can generally achieve better than 5 m accuracy, that a ZigBee network can be efficiently and effectively established with the support of Networked Transport of RTCM via Internet Protocol based Real-Time Kinematic GPS (NTRIP RTK GPS) and conventional measurements, and that the network node connectivity status can be monitored by a modelling algorithm of the node connectivity matrix. Further information:
2. Real time dynamic precise point positioning with mixture filtering The linearisation of nonlinear observation equations in precise point positioning (PPP) always introduces additional errors. To weaken such linearisation errors and improve the performance of unscented Kalman filtering (UKF), a mixture filtering approach combining UKF and extended Kalman filtering (EKF) is presented in this contribution. First, the PPP solutions estimated on the basis of EKF are taken as predicted information for UKF, which avoids the situation where the matrix decomposition becomes impossible during UKF implementation; second, the accuracy of the PPP solutions can be optimised by rerunning the UKF. An airborne test is used to demonstrate the performance of the new mixture filtering concept. The experimental results show that the accuracy of dynamic PPP with mixture filtering is better than its counterpart estimated with EKF only. Further information:
3. Iteratively reweighted total least squares: a robust estimation in errors-in-variables models In this contribution, the iteratively reweighted total least squares (IRTLS) method is introduced as a robust estimation method in errors-in-variables (EIV) models. The method is a follow-up to iteratively reweighted least squares (IRLS), which is applied to the Gauss–Markov and/or Gauss–Helmert models when the observations are corrupted by gross errors (outliers). In a relatively new class of models, namely EIV models, IRLS and other known robust estimation methods introduced in the geodetic literature cannot be directly applied, because the vector of observations or the coefficient matrix of the EIV model may be falsified by gross errors. IRTLS can then be a good alternative as a robust estimation method in EIV models. The method is based on the algorithm for the weighted total least squares problem, following the traditional Lagrange approach to optimising the target function of this problem. A new weight function is also introduced for the IRTLS approach in order to obtain better results. A simulation study and an empirical example give insight into the robustness and the efficiency of the proposed procedure. Further information:
4. The land administration domain model (LADM) as the reference model for the Cyprus land information system (CLIS) In this paper, the enhancement of the data model of the Cyprus land information system (CLIS) through the adoption of the land administration domain model (LADM) is examined. The CLIS was established in 1999, within the Department of Lands and Surveys, to support the operation of the Cyprus cadastral system, and has met the majority of its initially set goals. It is, however, now broadly accepted that the CLIS should be improved and upgraded, and that a new data model should be introduced to facilitate the manipulation and provision of data to internal and external users/customers in a more effective way. The need to enhance the CLIS coincides with the introduction of the LADM, which is under development within Technical Committee 211 of the International Organization for Standardization and identified as ISO 19152. The LADM provides an abstract, conceptual schema with three basic packages: parties (such as people and organisations); administrative rights, responsibilities and restrictions (such as property rights); and spatial units (such as parcels, buildings and networks), with the latter having one subpackage: surveying and spatial representation [6]. [Note: At the time of writing (January 2012), the LADM was at the Final Draft International Standard stage.] In this paper, the basic entities of the CLIS are presented and restructured so as to comply with the LADM. After analysing the characteristics of the LADM, it is concluded that it is compatible with the CLIS and can be used as a data model framework for the CLIS's upgrade. Thus, a Cyprus country profile is proposed. The adoption of the LADM is a great opportunity for the Department of Lands and Surveys to introduce an International Organization for Standardization standard model, based on model-driven architecture, and to gain all the benefits derived from such a move.
Such benefits include the improvement in the effectiveness and the efficiency of the current system and the expansion of the services provided by the CLIS to the broader land administration system and to the Cyprus community. The new functionality includes: better structuring of the rights, responsibilities and restrictions (and related source documents); better fitting in the information infrastructure, both national (e.g. valuation, taxation, building, address and person registrations) and international (e.g. INSPIRE cadastral parcels); and future capabilities for representing three-dimensional spatial units (e.g. legal spaces related to apartments or utility infrastructure). Further information:
5. Research on least squares adjustment of high precision network of triangulateration The Helmert method has been widely used in data processing for high precision networks consisting of different types of observations. For triangulaterations, the analyses show that the iterative calculation cannot proceed smoothly when the initial coordinates of the unknown points come directly from the field observations. One reason is that model errors may arise from linearising the nonlinear observation functions using Taylor's theorem. This problem can be avoided by using the traditional adjustment solutions of the unknown points as the initial values in high precision networks. Moreover, if enough redundant observations have been measured, the improved two-time-weighting estimation (ITTWE) solves the problem more conveniently and efficiently, with the weight matrix determined by the a posteriori estimation for the electronic distance measurement instrument. The data processing results from a real nuclear power plant project show consistency between the Helmert and ITTWE methods. Further information:
6. Statistical testing of the independence of direction observations Independence of observations is often assumed when adjusting a geodetic network. Unlike distance observations, no dependence on environmental conditions is known for horizontal direction observations. In order to determine the dependence of horizontal direction observations, we established a test geodetic network of one station and four observation points. Measurements of the highest possible accuracy were carried out using a Leica TS30 total station along with precise GPH1P prisms. Two series of one hundred sets of angles were measured, the first one in poor observation conditions. Using different methods, i.e. variance–covariance matrices, the χ² test and time series analyses, the independence of the measured directions, reduced directions and horizontal angles was tested. The results show that the independence of horizontal direction observations is not obvious, certainly not in poor conditions. In this case, it would be appropriate for geodetic network adjustments to use a variance–covariance matrix calculated from the measurements instead of a diagonal variance–covariance matrix. Further information:
7. Evolution of land registration and cadastral survey systems in Sri Lanka This paper reviews the evolutionary process of land registration and cadastral survey systems in Sri Lanka. It is a case study from Sri Lanka in the southern part of Asia. This case is of relevance to other countries in the region, as they share common experiences of extensive periods of Western rule in each country's recent history. The study investigates how diverse policy objectives, evident in various stages of Sri Lankan history, have led to evolutionary change in the processes of land registration and in cadastral survey systems in Sri Lanka. Four prominent historical stages are discussed: the ancient Sinhalese kingdom (before 1505), the period of Portuguese rule (1505–1658), the period of Dutch rule (1658–1796) and the period of British rule (1796–1948). It is evident that the power shifts between different successive regimes with varying land policy objectives have greatly influenced the evolution of the land registration and cadastral survey systems in the country. Further information:
8. Metaheuristic optimisation approach for designing reliable and robust geodetic networks Robustness analysis is a combination of reliability and geometrical strength analysis using a strain technique. It refers to the ability of a network to resist deformations caused by the largest undetectable blunders. The displacement of each point in the network is computed in order to measure the robustness of the network. This paper tries to optimally design a geodetic network in the sense of high reliability and geometrical strength. For this purpose, a metaheuristic method called the shuffled frog leaping algorithm (SFLA) is used to solve the first order design problem in which the geometric configuration of the network is optimised. Such algorithms have been developed to determine high quality solutions to complex optimisation problems. The efficiency of the method is demonstrated using a synthetic network example. The results show that the displacements can be decreased by maximising the minimum redundancy number in the network. This procedure can yield both reliable and robust networks. Further information:
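Internal reliability, on which this robustness measure builds, is commonly summarised by redundancy numbers; the sketch below computes them from a design matrix A and weight matrix P. This is the standard textbook formulation, assumed rather than taken from the paper, and maximising the minimum r_i is the objective the abstract describes.

```python
import numpy as np

def redundancy_numbers(A, P):
    """r = diag(R), R = I - A (A^T P A)^{-1} A^T P; each r_i in [0, 1]
    measures how well observation i is controlled by the others, and
    sum(r) equals the number of redundant observations."""
    N = A.T @ P @ A
    R = np.eye(A.shape[0]) - A @ np.linalg.solve(N, A.T @ P)
    return np.diag(R).copy()
```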
9. Studies on renovation of cadastral sheets for urbanisation In this article, the renovation of insufficient cadastral sheets in Turkey is investigated. After the information on cadastre and renovation is presented in detail, examples from different villages in the central town of Osmaniye are given, following the sequence of the renovation process, to illustrate the application. Osmaniye is a new and developing city and is therefore urbanising rapidly. Accurate and updated maps are clearly needed in this environment of rapid urbanisation. This renovation is also an important step towards strong urbanisation and a land information system with current and accurate data. Further information:
Survey Review 45, No 328. January/February 2013 1. Measuring meteorological data along the ray path of a distance meter with an ultralight aircraft When measuring distances, the density of the air that the electromagnetic waves travel through is important. In practice, the meteorological parameters are usually measured at the endpoints of the measured lines, which may be oversimplified for long distances. Thus, the basic idea behind our research was to measure the meteorological parameters along the path of the measuring beam through the air using an ultralight aircraft flying at low speed. The test measurements were carried out on two lines of a small geodynamic network of the Coalmine Velenje, where the observations are used to determine displacements due to mining in this area. Three flights were made in different atmospheric conditions. Within one flight, we conducted two independent measurements of the two lines; in this way, we wanted to obtain more precise data on the actual conditions of the atmosphere. The aim of our analysis is to critically evaluate the calculations found in the literature and to find an optimal way to account for the meteorological parameters in the calculation of the actual atmosphere when measuring longer distances in difficult measuring conditions. We expect that the obtained data, which reflect the actual conditions in the atmosphere, can have a significant influence on the measurement of longer distances. Further information:
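The motivation can be sketched with the first velocity correction: the instrument assumes a reference refractive index, while the true mean index along the path differs. The density-scaling refractivity model below is a deliberately simplified assumption, not the formula used in the paper.

```python
def first_velocity_correction(d_measured_m, n_actual, n_reference):
    """Correction to a measured distance when the instrument assumed the
    refractive index n_reference but the true mean index was n_actual."""
    return d_measured_m * (n_reference - n_actual) / n_actual

def refractivity_simple(p_hpa, t_celsius, n_group=1.0003):
    """Crude dry-air model: scale the group refractivity at standard
    conditions (1013.25 hPa, 0 deg C) by the density ratio p/T."""
    n_std = (n_group - 1.0) * 1e6
    return 1.0 + n_std * (p_hpa / 1013.25) * (273.15 / (273.15 + t_celsius)) * 1e-6
```

Even a 1 × 10⁻⁵ error in the assumed mean index amounts to about 10 mm over 1 km, which is why sampling the atmosphere along the whole path matters for long lines.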
2. Crustal deformation analysis using collocation models based on regional tectonic features The collocation method, when applied to crustal deformation analysis, can reliably estimate the deformation trend not only at the observed stations but also at unobserved stations. If there are faults or hidden faults in the area being studied, however, conventional collocation is not applicable to deformation analysis because the deformation is discontinuous. Two alternative collocation methods are proposed. One is based on the regional geological tectonic background (called geo_LSC): it divides the area roughly according to geological information and then establishes collocation models separately in each subdivision. Stations near a boundary are assigned to the appropriate subdivision by a confidence-interval test of the deformation parameters. The other method is based on robust collocation with a high breakdown point (called geo_RC). Since local deformation or abnormal observations may influence the reliability of the modelling, robust collocation with a high breakdown point is first proposed to calculate model parameters that are not disturbed by abnormal measurements. A simulated example and a practical example are given; the results show that the proposed methods are effective in actual deformation analysis. Further information:
3. Deferred monumentation and the shakedown factor Any guarantee of secure title is only as good as our ability to clarify what land is being spoken about. However, in countries where the majority of boundaries are straight lines between marked turning points, experience shows that boundary features such as fences and walls are not always erected in sympathy with corner boundary marks. In other words, legally speaking, what rightholders see is not always what they get. This article explores two questions: first, whether the placing of boundary corner marks should be deferred until occupation lines have shaken down to positions mutually agreed by adjoining rightholders, and second, whether boundary marks should be placed only in specified conflict cases. For the first question, a case study of high density suburbs in Zimbabwe is considered, where legal boundary corner marks are typically placed some years after physical boundary features have been erected. This practice achieves a close congruence between physical and legal boundaries but also has drawbacks that make it difficult to justify deferring monumentation unless the later surveys are done at very low cost. The second question draws on the case of New Zealand, in particular the responses made to a proposal in 2007 to mark boundaries only in conflict cases, but also the implications for disaster situations offered by the Canterbury earthquake. The article finishes with a more global discussion stemming from the two case studies and concludes that boundary marks placed early in the development process serve a public as well as a private good function, from a suburb's early development through to its more mature phases, especially when related to a network of well defined survey marks. It is further concluded that boundary marks with well defined centres fulfil an important role in densifying urban survey control networks. Further information:
4. Local to ETRS89 datum transformation for Slovenia: triangle-based transformation using virtual tie points The present paper presents a geodetic datum transformation between the old and new national coordinate reference systems of Slovenia. The basis for transformation is a set of about 2000 points coordinated in both systems. Virtual tie points are used, which form a regular triangular network covering the entire country. In order to enable extrapolation, the network was expanded, thereby reducing its density. Coordinate shifts between both coordinate systems were determined using best-fit transformation in the immediate neighbourhood of each virtual tie point. Weights assigned to these points depend upon their density and distance from the virtual tie point. The results prove significant advantages of the proposed model: high accuracy, minimisation of distortions, continuity and reversibility of transformation. Therefore, the model has been chosen for the transformation of all spatial databases which continuously cover the entire territory of the country and require transformation accuracy of better than one metre. Further information:
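A simplified stand-in for such distance-based weighting is plain inverse distance weighting of the coordinate shifts; the function below is illustrative only, since the paper's weighting also accounts for point density and operates on a triangular network.

```python
import numpy as np

def idw_shift(query_xy, tie_xy, tie_shift, power=2, eps=1e-9):
    """Inverse-distance-weighted interpolation of coordinate shifts at
    query_xy from virtual tie points tie_xy carrying shifts tie_shift."""
    d = np.linalg.norm(tie_xy - query_xy, axis=1)
    if d.min() < eps:                      # query coincides with a tie point
        return tie_shift[d.argmin()]
    w = 1.0 / d**power
    return (w[:, None] * tie_shift).sum(axis=0) / w.sum()
```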
5. Thermal3DImage Energy saving is an important issue: heating and cooling systems in buildings consume too much energy, and new regulations have therefore appeared in Europe to control the energy consumed by such systems. Building insulation efficiency is our concern here, not the heating or cooling systems themselves. Thermography is a practical tool for checking insulation efficiency. This tool requires the geometry and visible view of the building concerned in order to compute correctly the amount of heat emitted and then check it against the accepted limits. In this paper, a technique for integrating the thermal image, the laser scanner point cloud/mesh and the digital image is introduced. After registering and fusing the three data sets, a Thermal3DImage is created. An evaluation test was performed, and it was found that registering the thermal and visual images using two-dimensional transformation equations provided reliable results. Further information:
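The abstract does not specify which two-dimensional transformation the authors used, so as an illustration only, here is a minimal sketch of one common choice: a least-squares 2D similarity (Helmert) transform estimated from matched control points between the thermal and visual images. All point values below are synthetic:

```python
import math

def fit_similarity(src, dst):
    """Closed-form least-squares fit of a 2D similarity transform
    u = a*x - b*y + tx, v = b*x + a*y + ty, from matched point pairs."""
    n = len(src)
    xc = sum(p[0] for p in src) / n
    yc = sum(p[1] for p in src) / n
    uc = sum(p[0] for p in dst) / n
    vc = sum(p[1] for p in dst) / n
    num_a = num_b = den = 0.0
    for (x, y), (u, v) in zip(src, dst):
        dx, dy, du, dv = x - xc, y - yc, u - uc, v - vc
        num_a += dx * du + dy * dv   # projection onto rotation cos-term
        num_b += dx * dv - dy * du   # projection onto rotation sin-term
        den += dx * dx + dy * dy
    a, b = num_a / den, num_b / den
    tx = uc - a * xc + b * yc
    ty = vc - b * xc - a * yc
    return a, b, tx, ty

# Synthetic check: rotate by 30 deg, scale by 2, translate by (5, -3),
# then verify the fit recovers exactly these parameters.
s, th, tx0, ty0 = 2.0, math.radians(30.0), 5.0, -3.0
a0, b0 = s * math.cos(th), s * math.sin(th)
src = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (7.0, 3.0)]
dst = [(a0 * x - b0 * y + tx0, b0 * x + a0 * y + ty0) for x, y in src]
print(fit_similarity(src, dst))
```

A similarity transform handles rotation, uniform scale and translation between the two image planes; an affine or projective model would be needed if the sensors differ more strongly in geometry.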
6. Local variance factors in deformation analysis of non-homogeneous monitoring networks This paper proposes a modification of the classical deformation analysis algorithm for non-homogeneous (e.g. linear-angular) monitoring networks. The basis for the proposed solution is the idea of local variance factors. The theoretical discussion is complemented by an example of its application to a simulated horizontal monitoring network. The results obtained confirm the usefulness of the proposed solution. Further information:
7. An improved cascading ambiguity resolution (CAR) method with Galileo multiple frequencies Modernised GPS will provide navigation signals in three frequency bands and Europe's Galileo system will provide navigation signals in four. When more carrier-phase frequency bands are available, more frequency combinations with longer equivalent wavelengths can be formed. If the carrier-phase ambiguities can be resolved quickly, the carrier-phase measurements are effectively converted to ‘pseudorange’-type measurements, but with much higher measurement precision. Cascading ambiguity resolution (CAR) and integer least squares (ILS) methods are widely used for multiple-frequency ambiguity resolution. However, the CAR method has a weakness: when fixing the ambiguities of one combination, only part of the available measurements is used, which is why the success rate of CAR is lower than that of ILS. In this paper, we propose an algorithm to improve CAR for multiple-frequency ambiguity resolution. Instead of directly using the formed combination, the original carrier-phase measurements of E1, E5a, E5b and E6 are used at every step while solving for the ambiguities of the different combinations. Based on simulated Galileo data, the ambiguity resolution performance of the improved CAR method is investigated and compared with the ILS method. The proposed method is shown to outperform the ILS method in terms of the time required for ambiguity resolution and the rate of incorrectly fixed ambiguities. We have also adapted the method for long baselines, and the ambiguity resolution performance for baselines of different lengths is investigated. Further information:
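The "longer equivalent wavelengths" mentioned in the abstract come from integer combinations of carrier frequencies: a combination with coefficients n_i has frequency Σ n_i·f_i and wavelength c divided by that frequency. A minimal sketch, using the published Galileo centre frequencies (the combination choices below are standard examples, not necessarily the ones used in the paper):

```python
C = 299_792_458.0  # speed of light, m/s
# Galileo carrier centre frequencies (Hz)
FREQS = {"E1": 1575.42e6, "E5a": 1176.45e6, "E5b": 1207.14e6, "E6": 1278.75e6}

def combination_wavelength(coeffs):
    """Equivalent wavelength (m) of the integer combination sum n_i * f_i."""
    f = sum(n * FREQS[band] for band, n in coeffs.items())
    return C / f

# Extra-widelane E5b - E5a: combined frequency only ~30.69 MHz,
# giving an equivalent wavelength of nearly 10 m -- easy to fix first.
print(combination_wavelength({"E5b": 1, "E5a": -1}))
# Widelane E1 - E5a: ~0.75 m, fixed at a later cascade step.
print(combination_wavelength({"E1": 1, "E5a": -1}))
```

This ordering — from the longest equivalent wavelength down to the original carriers — is exactly what the cascading in CAR exploits: each fixed combination narrows the search space for the next, shorter-wavelength one.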
8. Factors affecting the estimation of GPS receiver instrumental biases The Global Positioning System (GPS) has been widely used to investigate the ionosphere through the estimation of the total electron content (TEC) and its distribution in space. One of the important factors affecting the accuracy of ionospheric TEC estimation is the hardware differential code biases (DCBs) inherent in both GPS satellites and receivers. This paper investigates various factors affecting the accuracy of GPS receiver instrumental bias estimation. Through a number of designed tests, we conclude that the most important factor is the accuracy of the ionosphere model. Some of the large daily variations of receiver DCBs detected in other studies, such as for receivers in low-latitude regions, are due not to DCB changes but to estimation errors. The estimated DCB values can vary significantly for different ionospheric models and different network sizes. For example, the receiver DCB values obtained from the global and the station-specific models differ by −2·5 to 14·3 TEC units (TECU) for different stations. Different data processing methods also contribute to DCB estimation errors: results from smoothed and non-smoothed GPS observations show that the DCB difference reaches up to 6·8 TECU for some stations, with a mean difference of 3 TECU. On the other hand, the elevation cutoff angle does not play an important role in ionospheric delay estimation; for elevation cutoff angles from 10 to 30°, our tests show that the DCB estimation differences are <0·4 TECU. Further information:
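The coupling between DCBs and TEC that this abstract discusses follows from the standard dual-frequency relation: slant TEC is proportional to the geometry-free pseudorange difference P2 − P1, so any unmodelled code bias maps directly into apparent TEC. A minimal sketch of that relation (the functions and values are illustrative, not the paper's estimator):

```python
C = 299_792_458.0          # speed of light, m/s
F1, F2 = 1575.42e6, 1227.60e6  # GPS L1 and L2 carrier frequencies, Hz

def stec_from_pseudoranges(p1, p2):
    """Slant TEC (TECU) from dual-frequency pseudoranges in metres,
    ignoring satellite/receiver DCBs and measurement noise."""
    k = F1**2 * F2**2 / (40.3 * (F1**2 - F2**2))  # electrons/m^2 per metre
    return k * (p2 - p1) / 1e16                   # 1 TECU = 1e16 el/m^2

def dcb_ns_to_tecu(dcb_ns):
    """Apparent TEC error (TECU) caused by an unmodelled DCB in ns."""
    return stec_from_pseudoranges(0.0, C * dcb_ns * 1e-9)

print(dcb_ns_to_tecu(1.0))  # ~2.85 TECU per nanosecond of L1-L2 DCB
```

The conversion factor of roughly 2.85 TECU per nanosecond explains why TECU-level model differences, such as the 6·8 TECU smoothing effect reported above, correspond to substantial apparent bias variations.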
9. Refraction corrections from complex measurements of atmospheric parameters for electronic tacheometry In this paper, a new method for the estimation of the effect of the atmosphere is presented. The proposed procedure is based on the complex application of geodetic and meteorological measurements. The field test showed the efficiency of this method for electronic tacheometry. The average standard deviation of the elevation differences equalled 8 mm over 1 km and the number of additional measurements required is minimal. An electronic gradientometer developed by the authors was used as a mobile instrument for the measurement of atmospheric parameters. Further information:
10. Book Review Mapping South Africa, a historical survey of South African Maps and Charts. A Duminy, 2011. ISBN 9781431402212. Further information:

