Sample articles are available to download free from: http://www.tandfonline.com/action/showOpenAccess?journalCode=ysre20
Survey Review 47, No 345. November/December 2015
1. A phase space reconstruction based single channel ICA algorithm and its application in dam deformation analysis Noise, systematic errors and deformation components caused by different forces, such as water pressure and temperature, should be separated from dam deformation monitoring data to facilitate effective dam deformation analysis and, consequently, to determine potential dam structural damage. As source signals are assumed to be mutually independent, independent component analysis (ICA) can be used to separate the source signals from their mixtures. In addition to signal independence, another requirement of ICA theory is that the number of mixtures should not be less than the number of source signals. Unfortunately, in most cases of dam deformation monitoring, only a single channel of observed displacement data is available for each monitoring point. Therefore, ICA with single channel data input (called single channel ICA) is necessary in the application of deformation analysis. In this paper, we apply a phase space reconstruction based single channel ICA (PSR-ICA) algorithm to de-noise deformation monitoring data and to separate the deformation components introduced by different forces from the de-noised data. A numerical simulation is conducted, and the results indicate that PSR-ICA is an efficient tool not only for de-noising data but also for separating deformation components caused by different forces. PSR-ICA is then further used to process the displacement monitoring data of Wuqiangxi Dam. The results indicate that the two extracted main displacement components are nearly consistent with the displacement components computed by a regression model that uses temperature and water level as variables. Both the numerical simulation and the real-life dam demonstrate that PSR-ICA is an effective tool for separating deformation components caused by different forces and is therefore beneficial to dam deformation analysis. Further information:
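As a rough illustration of the PSR-ICA idea, the sketch below embeds a single displacement series into phase space by time-delay embedding and then applies ICA to the resulting trajectory matrix. This is a minimal sketch, not the authors' implementation: FastICA stands in for whatever ICA variant the paper uses, and the embedding dimension, lag and toy signal are all illustrative assumptions.

```python
# Minimal sketch of phase space reconstruction based single channel ICA.
# Assumptions: FastICA as the ICA stage; dim, lag and the toy signal are illustrative.
import numpy as np
from sklearn.decomposition import FastICA

def psr_ica(x, dim=6, lag=1, n_sources=2):
    """Embed a single displacement series into phase space, then run ICA."""
    n = len(x) - (dim - 1) * lag
    # Time-delay embedding: each column is a delayed copy of the series,
    # turning one channel into a multichannel "mixture" matrix.
    traj = np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])
    ica = FastICA(n_components=n_sources, random_state=0)
    return ica.fit_transform(traj)      # estimated independent components

# Toy mixture: annual (temperature-like) + slow (water-level-like) terms + noise
t = np.arange(2000)
x = (np.sin(2 * np.pi * t / 365) + 0.5 * np.sin(2 * np.pi * t / 1200)
     + 0.1 * np.random.randn(t.size))
components = psr_ica(x)
```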
2. Extendable linearised adjustment model for deformation analysis This paper gives a linearised adjustment model for the affine, similarity and congruence transformations in 3D that is easily extendable with other parameters to describe deformations. The model considers all coordinates stochastic. Full positive semi-definite covariance matrices and correlation between epochs can be handled. The determination of transformation parameters between two or more coordinate sets, determined by geodetic monitoring measurements, can be handled as a least squares adjustment problem. It can be solved without linearisation of the functional model if it concerns an affine, similarity or congruence transformation in one-, two- or three-dimensional space. If the functional model describes more than such a transformation, it is hardly ever possible to find a direct solution for the transformation parameters. Linearising the functional model and applying the least squares formulas is then an appropriate way of working. The adjustment model is given as a model of observation equations with constraints on the parameters. The starting point is the affine transformation, whose parameters are constrained to obtain the parameters of the similarity or congruence transformation. In this way the use of Euler angles is avoided. Because the model is linearised, iteration is necessary to reach the final solution. In each iteration step, approximate coordinates are needed that fulfil the constraints. For the affine transformation it is easy to obtain approximate coordinates. For the similarity and congruence transformations, the approximate coordinates have to comply with the constraints. To achieve this, use is made of the singular value decomposition of the rotation matrix. To show the effectiveness of the proposed adjustment model, total station measurements in two epochs of monitored buildings are analysed. Coordinate sets with full, rank deficient covariance matrices are determined from the measurements and adjusted with the proposed model. Testing the adjustment for deformations results in detection of the simulated deformations. Further information:
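The SVD step the abstract alludes to, generating approximate values that satisfy the rotation constraints, can be illustrated by projecting an approximate matrix onto the nearest rotation (the orthogonal Procrustes solution). A minimal sketch, assuming the constraint handling reduces to something of this form:

```python
# Hedged sketch: nearest rotation to an approximate affine matrix via SVD,
# one way to obtain approximate values satisfying congruence constraints.
import numpy as np

def nearest_rotation(A):
    """Return the rotation matrix R closest to A in the Frobenius norm."""
    U, _, Vt = np.linalg.svd(A)
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])  # guard against reflection
    return U @ D @ Vt

A = np.array([[0.98, -0.21, 0.02],
              [0.20,  0.97, -0.05],
              [-0.01, 0.06,  1.01]])   # approximate, not exactly orthogonal
R = nearest_rotation(A)
assert np.allclose(R @ R.T, np.eye(3), atol=1e-12)
```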
3. Least fourth powers: optimisation method favouring outliers A new optimisation method is proposed in this paper. The least fourth powers method allows fitting a geometric figure to a set of points in such a way that the maximal displacement between the fitted figure and the points is smaller than in the least squares method. This property can be very useful in some engineering tasks, e.g. in the realignment of a railway track. The objective function of the new optimisation method is proposed, along with an analysis of some theoretical properties of the new method. It is pointed out that some computational problems can appear, and appropriate computational techniques are proposed to overcome them. Detailed algorithms are presented and illustrated by numerical examples. The efficiency of the various computational techniques is compared, and the resulting conclusions are presented. Further information:
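As an illustration of the objective, the sketch below fits a straight line by minimising the sum of fourth powers of the residuals and compares the maximal deviation against an ordinary least squares fit. A general-purpose minimiser stands in for the paper's dedicated computational techniques; the data are illustrative.

```python
# Sketch of the least fourth powers idea on a straight-line fit.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = np.linspace(0, 100, 50)                      # e.g. chainage along a track
y = 0.5 * x + 3 + rng.normal(0, 0.02, x.size)    # points near a straight line

def l4_cost(p):
    """Sum of fourth powers of residuals for line y = p[0]*x + p[1]."""
    return np.sum((y - (p[0] * x + p[1])) ** 4)

p_ls = np.polyfit(x, y, 1)                       # least squares, for comparison
p_l4 = minimize(l4_cost, p_ls).x                 # least fourth powers

max_dev_ls = np.max(np.abs(y - np.polyval(p_ls, x)))
max_dev_l4 = np.max(np.abs(y - (p_l4[0] * x + p_l4[1])))
# max_dev_l4 is typically no larger than max_dev_ls: the property
# the abstract highlights for realignment tasks.
```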
4. Establishment of new Oman National Geodetic Datum ONGD14 The main objective of enhancing national geodetic control is to create a homogeneous horizontal survey control for the country. The realisation of a new geocentric datum, namely the Oman National Geodetic Datum (ONGD14), provides the GPS user community with a modern surveying infrastructure. The existing national geodetic control will be strengthened through the enhancement of the old control network and the future establishment of GPS Oman National CORS Network stations. A dedicated project has been initiated by the National Survey Authority of Oman for the establishment of ONGD14. The first task involved checking the quality of the existing geodetic network of first and second order GPS control stations. This was followed by a longer GPS geodetic measurement campaign involving 20 existing control stations throughout the country. GPS processing for precise baseline computation was carried out using Bernese software. In realising the ONGD14, a network of 20 occupied control stations was connected to almost 50 IGS sites in the vicinity of Oman. A nation-wide combined network adjustment of common station coordinates in the latest solution of the ITRF2008 frame at epoch 2013 (ITRF2008@2013) was then performed to establish the ONGD14. Further information:
5. Ambiguity resolution with double troposphere parameter restriction for long range reference stations in NRTK System Correct ambiguity resolution between reference stations is the key to calculating high precision Network Real-Time Kinematic (NRTK) differential information. For long range reference stations (≥50 km), the double difference troposphere model residuals should be treated as parameters to be solved, but this aggravates the ill conditioning of the ambiguity resolution (AR) model between reference stations; as a result, ambiguity fixing becomes more difficult in the long range case. In this paper, a new method with double troposphere parameter restriction is put forward for ambiguity resolution between long range reference stations. The proposed method applies the GPT2 model, a state-of-the-art empirical troposphere model, to form a high precision a priori estimate of the double difference troposphere delay. Based on the principles of Tikhonov regularisation, a regularisation criterion for the double difference restriction model is then built. The difference between the troposphere estimate and the true value is used as a restriction parameter to improve the estimation of the unknown parameters and to optimise the ambiguity search range. Trials verify a significant reduction in the ill conditioning of the parametric resolution functions when the double troposphere restriction model is applied. The success rate of ambiguity resolution within 60 s is above 98% for baselines over 80 km, which is an immense improvement over conventional methods. Further information:
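The regularisation step can be sketched as a Tikhonov-type estimator in which the float solution is pulled toward the GPT2-derived a priori troposphere values. This is an assumed, simplified form of the restriction model; the matrix names and the regularisation weight are illustrative only, not the paper's actual formulation.

```python
# Hedged sketch of Tikhonov-style regularisation with an a priori value:
# minimise (l - Ax)' P (l - Ax) + alpha * (x - x_prior)' R (x - x_prior).
import numpy as np

def regularised_solution(A, l, P, x_prior, R, alpha):
    """Solve the regularised normal equations; x_prior carries e.g. the
    GPT2-based troposphere delay estimate toward which x is pulled."""
    N = A.T @ P @ A + alpha * R
    b = A.T @ P @ l + alpha * R @ x_prior
    return np.linalg.solve(N, b)
```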
6. Subsidence monitoring using D-InSAR and probability integral prediction modelling in deep mining areas Land subsidence processes in deep mining areas extend over long durations, and land deformation models should be derived from many field observations. In this paper, the capability of ALOS PALSAR pairs with short and long temporal baselines for monitoring deep mining subsidence has been investigated in the area of Xuzhou, Jiangsu province. For image pairs with poor temporal baselines, it is difficult to correctly generate the whole subsidence basin, and more information is lost in areas with rapid changes in deformation and vegetation. Therefore, an approach combining differential interferometric synthetic aperture radar (D-InSAR) results and probability integral model (PIM) results to generate the whole mining subsidence basin is proposed. D-InSAR-derived subsidence observations are used to deduce prediction parameters, and then the parameters and mining conditions of working faces are used in a probability integral model to obtain the whole subsidence basin. The results are compared with levelling field survey data, and the prediction results and levelling measurements agree well with each other. Further information:
7. Data quality assessment and the positioning performance analysis of BeiDou in Hong Kong As the satellite constellation, frequency structure, time and coordinate reference system of the Chinese BeiDou are all different from those of GPS, the quality of its observations is also expected to differ from GPS. At the same time, the positioning performance of the combined GPS and BeiDou system should be much better in view of the availability of more visible satellites. The purpose of this paper is to assess the quality of BeiDou pseudorange and carrier phase observations, and to compare the performance of GPS, BeiDou and the combined system in terms of availability, integrity and positioning accuracy with practical data in Hong Kong. The experimental results show that the noise and multipath of GEO (geostationary earth orbit) satellite pseudorange measurements fluctuate within 1.5 m, while the variations for IGSOs (inclined geosynchronous orbit satellites) and MEOs (medium earth orbit satellites) are more obvious. The amplitudes of the triple-carrier combination time series of GEO, IGSO and MEO carrier phase measurements are 4, 6, and 6 mm, respectively. In addition, the absolute and relative positioning precision of BeiDou is very similar to that of GPS, and the performance of the combined system is best. Further information:
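One standard way to quantify pseudorange multipath and noise, as this study does, is the TEQC-style MP combination, in which geometry and ionosphere cancel. The sketch below is a generic illustration, not necessarily the exact combination used in the paper; the BeiDou B1I/B2I frequencies are the published values.

```python
# Hedged sketch of the MP1 code multipath/noise combination for BeiDou.
import numpy as np

F1, F2 = 1561.098e6, 1207.140e6          # BeiDou B1I / B2I frequencies (Hz)
ALPHA = (F1 / F2) ** 2

def mp1(P1, L1_m, L2_m):
    """P1: pseudorange (m); L1_m, L2_m: carrier phase (m).
    Geometry and ionosphere cancel, leaving code multipath + noise plus a
    constant ambiguity term that drops out once the mean is removed."""
    k = 2.0 / (ALPHA - 1.0)
    m = P1 - (1.0 + k) * L1_m + k * L2_m
    return m - np.mean(m)                # remove the ambiguity/bias term
```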
8. Locating and estimating multiple gross errors during coordinate transformation It is important to detect gross errors when using coincidence points for coordinate transformation. Gross errors in the coincidence points are sometimes unavoidable and affect the accuracy of the estimated transformation parameters. The authors propose a method that is different from the commonly used robust estimation and least squares with hypothesis testing methods, calling their technique for locating gross errors the ‘full search’ algorithm. A location matrix is formed when the full search is complete. The authors have also developed a strict estimation equation for multiple gross errors, which estimates the errors and the optimal transformation parameters. Coordinate transformation experiments were used to compare the residuals and transformation parameters of three schemes. The results indicate that the proposed method can give satisfactory results when transforming coordinates. Further information:
Survey Review 47, No 344. September/October 2015
The following articles were presented at the FIG Commission 3 Annual Workshop “Geospatial Crowdsourcing and VGI: Establishment of SDI & SIM”, Bologna, Italy, 4-7 November 2014
1. SDI and crowdsourced spatial information management automation for disaster management Modern disaster reporting is becoming increasingly sophisticated with ready access to social media and user-friendly online mapping tools. Citizen engagement in location enabled disaster reporting is increasingly evident, and the availability of crowd generated geospatial data is higher than ever before. Crowd generated geospatial content is current and more diverse than conventional geographic information; however, quality and credibility issues exist. Although spatial data infrastructures (SDIs) have proven successful in supporting disaster management activities in the past, delays in providing public mapping portals and gaps in data are common. Crowd support and crowd generated spatial data have the potential to speed up disaster management actions and disaster mitigation. Within the study, crowd communications that occurred during the 2011 Queensland floods through the Australian Broadcasting Corporation's (ABC's) QLD flood crisis map were critically analysed to investigate the readiness of current information sources to support disaster management. The accuracy of the reported event locations was compared against the authoritative Queensland Government street network, OpenStreetMap's (OSM's) streets and Google streets to assess the accuracy of the street and address names provided through the crowdsourced data. The study reveals that several issues exist regarding the quality of the data provided and the intent of the data provider. Moreover, the results indicate that the direct usage of reported locations is problematic and that semantic processing of the location information, along with available spatial data, may be required to improve data quality. Further information:
2. Enhancing cadastral surveys by facilitating the participation of owners While some countries have over centuries developed a nation-wide spatial framework, others have been left behind. However, because of global economic and social challenges, there is an urgent need for those countries to develop similar systems in a fast and efficient way for their economic survival. Building such a framework should always be based on each country's resources and capacities. In systematic cadastral registrations, the participation of owners has always been crucial for the success of the project. Within this concept, the authors investigate the potential of new tools to increase the participation of land right holders and local volunteers (non-professionals) and to enhance the cadastral surveying procedure. A hybrid “crowdsourcing” approach is proposed, based mainly on the direct participation of property rights holders, in which the role of the cadastral surveyor is crucial. A commercial application for smartphones is tested. The smartphone's GPS is used only for general positioning on the basemap, to avoid gross errors when owners are not accustomed to using aerial photos. A recent orthoimage of 20 cm pixel size in urban areas and an orthoimage basemap of 50 cm pixel size in rural areas are used as basemaps. Once roughly positioned, owners may then digitise the boundary coordinates on the basemap off-line, with little training support; they may also work from a distance. In areas where the property boundaries are easily recognised on the basemap, boundary coordinates reach the geometric accuracy expected for the basemap, and no gross errors are detected. Attachment of photos is also possible (e.g. photos of the property or of the deeds). Restricted access to personal data may also be achieved. The method is also useful for cadastre updating purposes (e.g. the periodic updating of a buildings database). Further information:
3. Towards the production of digital terrain models from volunteered GPS trajectories There currently exists a wide variety of online resources providing mapping infrastructures and geographic information. Most web-based map services, such as Google Maps, Yahoo! Maps and Bing Maps, are based largely on data collected by authoritative mapping agencies. Alternatively, some relatively new web-map services, such as OpenStreetMap (OSM) and Wikimapia, are based mostly on volunteered data collected by the public (i.e. crowdsourced mapping). Although such volunteer-based map service platforms show increasing planimetric (2D) accuracy, completeness and update rates of their mapping infrastructure, surprisingly there is a lack of comparable data and accuracy measures with respect to the third dimension, i.e. height; more specifically, the topographic representation based on the volunteered collected data. Most of these web services still rely on existing open-source authoritative topographic infrastructures, and not on data collected by the volunteers. Moreover, topographic information that is open to the public and free to use, e.g. the advanced spaceborne thermal emission and reflection radiometer and the shuttle radar topography mission, is typically available with relatively low height accuracy (not better than 5 m) and low planimetric resolution (over 30 m). Volunteered data, on the other hand, collected by individuals situated ‘all over’ the globe, can offer new capabilities and data characteristics with potentially higher quality. This research examines the feasibility of using the crowdsourced volunteered geographic information working paradigm for the task of producing a reliable digital terrain model (DTM) infrastructure for general use. This is achieved by collecting GPS observations available from VGI data sources, while applying a 2D Kalman filter based algorithm aimed at reducing noise and ambiguities. This paper presents the methodology, together with preliminary analysis results achieved by this implementation, showing the feasibility of the working methodology and the good accuracy of the DTM generated. Further information:
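The filtering idea can be sketched per DTM cell: repeated, noisy volunteered height fixes falling in one cell are fused recursively. The paper applies a 2D Kalman filter over the whole grid; the scalar recursion below, with illustrative noise values, shows only the per-cell update.

```python
# Minimal per-cell sketch of Kalman filtering of volunteered GPS heights.
def kalman_height(observations, r=25.0, q=0.01):
    """Scalar Kalman filter; r = observation variance (m^2), q = process noise.
    Returns the fused height estimate and its variance."""
    h, p = observations[0], r            # initialise from the first fix
    for z in observations[1:]:
        p += q                           # predict (terrain assumed static)
        k = p / (p + r)                  # Kalman gain
        h += k * (z - h)                 # update with the new height fix
        p *= (1 - k)
    return h, p

h_hat, var = kalman_height([102.1, 98.7, 100.9, 101.5, 99.8])
```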
4. OpenStreetMap for cadastral purposes: an application using VGI for official processes in urban areas The scope of the paper is to test whether online dynamic maps such as OpenStreetMap (OSM) can be used for official mapping projects such as the Cadastre, to investigate the advantages and the concerns of online, open-to-the-public procedures, and to identify those differences between experts and amateurs that play a critical role in such official projects. The research is focused on the use of OSM in urban areas as an alternative to official cadastral surveys. This paper presents the possibilities and perspectives of OSM for spatial and attribute cadastral data collection and storage for the compilation of draft cadastral maps, as an alternative methodology within the terms of volunteered geographic information (VGI). The authors carried out a practical experiment in an extended part of the historic city centre of Athens and updated the online dynamic map of OSM with attribute and spatial cadastral data. Surveying students explored the capacities of the dynamic map in two steps: (a) in a section where the polygons of the buildings already existed on the map, they had to improve it with attribute data, and (b) in another section where no relevant polygons existed, spatial and attribute data enhancement was required. The research was based on the various approaches that each student adopted and the freedom that OSM offers to users. The results show that users can easily distinguish the differences in capacities between OSM and commercial software: the inexpensive, easy to use and quick methodology of OSM in contrast to the accurate, authoritative and assured methodology of the commercial software. Further information:
5. Participatory mapping in support of improved land administration and management of natural resources In many countries, the trend of building and maintaining efficient land administration systems has expanded rapidly over the past decades. There has also been growing awareness and development of spatial data infrastructures, the sustainable management of natural resources and preservation of the environment because of the realisation that these are all vital to socioeconomic progress around the world. The number of participatory mapping initiatives is quickly increasing in many parts of the world. Participatory mapping has emerged as a process and a powerful tool, utilising visual techniques to better understand local natural resources, together with their management, dynamics and related challenges, and with potential solutions to the challenges. It is a relatively fast way of gaining information from those who live with and use these resources. Although there are differences among the initiatives in their methods, applications and users, the common theme linking them is that the process of map-making is undertaken by a group of non-experts who are associated with each other based on a shared interest. Decisions about resource tenure are some of the most critical ones for forests and livelihoods in many contexts, and secure tenure arrangements are an important prerequisite for achieving sustainable forest management. Hereinafter a general overview of the current status of land administration in Albania is presented, with an emphasis on the modern development and changing priorities of the national registration institution and the government. The more innovative part of the paper deals with the participatory mapping initiative of communal forests and pasture use rights in Albania, the experiences gained and the suggested path ahead. Experience with participatory mapping in land registration combined with the forestry management in Europe is relatively rare when compared to other parts of the world. Further information:
6. Quantitative evaluation of volunteered geographic information paradigms: social location-based services case study As of 2010, 90% of the data that existed in the world had been created within the previous 2 years, while personal location data have been singled out as one of the five primary ‘big data’ streams in the 2011 McKinsey report. By 2020, the volume of existing data will increase 50-fold, and a large percentage of this volume will be associated with geospatial data. One of the reasons for this is the existence of the volunteered geographic information (VGI) paradigm, which encapsulates the idea of using the Internet (Web 2.0) to create, share, visualise, and analyse geographic information and knowledge. This neogeography revolution has started to fundamentally transform how geographic data are acquired, maintained, analysed, visualised, and consequently used. Thus, it has the potential to influence common practices, since it captures a broad knowledge of the environment we live in, in all aspects of life, enabling new services, applications and processes to be developed – all of which are location based. The diversity of applications and services that explore the potential of VGI argues for its current relevance, ranging from transportation network analysis, to air pollution and air quality, to natural disaster decision-making systems. This revolution has contributed to the development of two important working and knowledge paradigms, Crowdsourcing and Wisdom of the Crowd, widely used today within the mapping and geo-information discipline. Still, the two terms are commonly misused and interchanged. This paper aims at distinguishing between the terms via the quantitative and theoretical examination of four widely used social location-based services: OpenStreetMap (OSM), Moovit, Waze and Ushahidi. Eight primary characteristics that influence the paradigms of both Crowdsourcing and Wisdom of the Crowd are defined and examined, aiming to investigate and emphasise the differences between the four services, namely: diversity, decentralisation, independency, aggregation, knowledge, activity, privacy and exploitation. It was found that OSM is an excellent example of a Crowdsourcing service, while Ushahidi, though considered a Crowdsourcing service, has characteristics that align better with Wisdom of the Crowd. Moovit and Waze do not correspond to the Crowdsourcing paradigm, and are thus categorised as Wisdom of the Crowd services. Further information:
Regular contributions to the journal
7. Alternative methodology for classical geodetic reference system assessment using GNSS and recent tectonic plate model: case of Hellenic Geodetic Reference System of 1987 Classical geodetic reference systems suffer in many cases from severe inconsistencies, mainly because they were realised from older types of observations. Using GNSS, these distortions can be identified by implementing a 3D similarity transformation between the old geodetic system and a modern terrestrial reference frame. A similarity transformation is a common solution for the assessment of a classical geodetic reference system. However, low accuracy height information can cause significant distortions in the results of the 3D similarity transformation. In the present study we describe an alternative methodology for classical geodetic reference system assessment using GNSS and a recent tectonic plate model. Further information:
8. Kinematic cycle slip detection and correction for carrier phase based navigation applications in urban environment in case of ultra high rate GNSS observations In recent years, GNSS has been used ever more widely for all kinds of applications, including those in urban environments. Carrier phase observations from GNSS are necessary for precise navigation applications; before they can be used, however, cycle slips have to be detected and corrected. For static GPS observations, as the distance between the receiver and the GPS satellites varies smoothly, testing quantities formed from carrier phase observations between satellites can be used for cycle slip detection and correction. These testing quantities have two advantages. First, as only carrier phase observations are used, they are not affected by the large noise of code observations. Second, the wavelength of the testing quantities is about 20 cm, long enough to be insensitive to carrier phase noise and multipath. However, these testing quantities generally cannot be used for kinematic observations with a sampling interval of 1 s, because the receiver is moving and the distance between the receiver and the satellites is no longer smooth. Kinematic cycle slip detection and correction has been a challenge for many years. Currently, two methods are popular: geometry-free and time relative. Both methods are sensitive to observation noise and multipath of carrier phase and code, especially the latter. For carrier phase based applications in urban environments, this weakness becomes even more pronounced. For kinematic ultra high rate observations, the changes in the speed and acceleration of the receiver can be neglected over very short intervals, such as 1 s or less, if there is no abrupt movement. In this case, the distance between the receiver and the satellites can be regarded as smooth, and the testing quantities formed between satellites can be used for cycle slip detection and correction. On this basis, a new kinematic cycle slip detection and correction method is proposed in this paper, aimed at navigation applications in urban environments with ultra high rate GPS observations. The new method has three features: first, it is based on the use of ultra high rate observations (20 Hz); second, the changes in speed and acceleration of the vehicle are neglected; third, code measurements are not involved. The new method is tested with practical ultra high rate GPS observations in an urban environment and compared with the ionospheric residual method and the time relative method. The numerical results show that the new method performs markedly better than the others, with all cycle slips detected and determined reliably. Further information:
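For reference, the classic geometry-free (ionospheric residual) test that the paper compares against can be sketched as follows: the combination λ1φ1 − λ2φ2 is free of geometry, so its epoch-to-epoch change stays small unless a cycle slip occurs. The detection threshold below is illustrative.

```python
# Sketch of the geometry-free (ionospheric residual) cycle slip test.
import numpy as np

C = 299_792_458.0
F1, F2 = 1575.42e6, 1227.60e6            # GPS L1/L2 frequencies (Hz)
LAM1, LAM2 = C / F1, C / F2              # carrier wavelengths (m)

def geometry_free_slips(phi1, phi2, threshold=0.05):
    """phi1, phi2: carrier phase in cycles per epoch.
    Returns indices of epochs where a cycle slip is suspected."""
    gf = LAM1 * np.asarray(phi1) - LAM2 * np.asarray(phi2)   # metres
    dgf = np.abs(np.diff(gf))            # epoch-to-epoch change
    return np.where(dgf > threshold)[0] + 1
```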
9. New data processing strategy for single frequency GPS deformation monitoring Although the application of single frequency receivers in GPS deformation monitoring is limited mainly by the effect of ionospheric delays, research has continued because of the much lower price of single frequency receivers. In this paper, we introduce a new data processing strategy for deformation monitoring networks in which the baselines between any two nearest stations are processed, instead of only the baselines formed between the reference stations and monitoring stations as in the traditional strategy. As a result, most of the baselines in the monitoring network are very short, so that the ionospheric effects can be safely ignored. The results from the experiments show that the new strategy can eliminate the effect of ionospheric delay by processing the short baselines in network mode. The accuracy and integrity of the deformation solutions can be improved by the presented strategy. Further information:
Survey Review 47, No 343. July/August 2015
1. Rapid urbanisation and slum upgrading: What can land surveyors do? This paper aims at understanding the domain of rapid urbanisation and slum upgrading, and therefore collects facts to clarify the status quo. The paper highlights relevant aspects, such as the development of new forms of spatial planning, modern slum upgrading methods, the provision of security for flexible people–land relationships, the linking of informality and formality, and the enhancement of land and property tax revenue to facilitate urban services. It appears that, for all of these aspects, a role for the land surveying profession can be formulated. This role requires mastering the newest geospatial and non-spatial technologies and the capability to design and maintain cost-effective land information systems that can deliver relevant services to urban residents and city managers. Further information:
2. Digital cadastral map as foundation of coordinate based cadastre of Serbia The subject of the research is the methodology for the successful production of the digital cadastral map (DCM) database in a new spatial reference system (SRS) of Serbia. The DCM is treated as the basis for establishment of the coordinate based cadastre (CBC) in Serbia. The research is focused on the following: the status of the state survey data especially regarding analogue cadastral maps and state geodetic reference network, production of DCM database, the introduction of a new spatial reference system, and legislation covering this field in Serbia. The existing geodetic/surveying network for cadastral/topographic survey and analogue cadastral maps are analysed in terms of their quality. The research also covers an analysis of present methods and procedures for DCM production using data obtained from various sources. Coordinate differences for DCM points and differences between areas of polygon features obtained using original survey data and those obtained by digitising scanned cadastral maps are analysed. The procedure of legislating information letters and resolutions on changes of areas within the DCM production procedure and the impact of the transition from the Gauss–Krüger to the UTM projection on areas for DCM polygon features are also analysed. Further information:
3. Database inconsistency errors correction, on example of LPIS databases in Poland Topological correctness of objects is a problem that needs solving in every project related to the development and updating of spatial databases. This problem can be solved using contemporary GIS software. However, the GIS systems, while working on large volumes of data, tend to be inefficient in editing separated elementary areas (parcels, non-eligible areas) which are presented in the form of objects. An alternative to GIS systems can be the development of a CAD software based technology. The article presents the technology of detection and automatic elimination of topological errors related to a lack of consistency in the official Land Parcel Identification System databases. The developed methodology operates on virtual objects in a CAD software environment. It can be freely used on outdated databases, which contain very large numbers of objects. This technology was used on a test object, made available by a Government Agency. The developed methodology allowed for an automatic correction of errors that result from the inconsistency of the two databases, a non-eligible areas database containing information on the land use, and the reference database (cadastral database). This methodology is a part of the technology for improving the quality and performance of LPIS database systems control. Further information:
4. Weighted total least squares for solving non-linear problem: GNSS point positioning In this contribution, two algorithms are developed for parameter estimation in a non-linear measurement error model with errors in both the coefficient matrix and the vector of measurement. They are based on the complete description of the variance–covariance matrices of the observation errors and of the coefficient matrix errors without any restriction. The paper reinvestigates the nonlinear measurement model associated with GNSS point positioning. Various simulation experiments indicate that GNSS point positioning is much better formulated as a non-linear WTLS problem with errors in both the coefficient matrix and measurement variables. The efficacy of the proposed algorithms is verified through the numerical results. Further information:
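For context, the underlying non-linear point positioning model can be sketched as a plain Gauss-Newton least squares solution for receiver coordinates and clock bias; the paper's contribution is to replace this with a WTLS formulation in which errors in the linearised coefficient matrix are modelled as well. A minimal sketch, with equal weighting assumed:

```python
# Hedged sketch of single point positioning by Gauss-Newton least squares.
import numpy as np

def gauss_newton_spp(sat_pos, pseudoranges, x0=None, iters=8):
    """sat_pos: (n,3) satellite coordinates (m); pseudoranges: (n,) in metres.
    Solves for receiver position and clock bias in metres (4 unknowns)."""
    x = np.zeros(4) if x0 is None else np.asarray(x0, float)
    for _ in range(iters):
        rho = np.linalg.norm(sat_pos - x[:3], axis=1)        # geometric ranges
        # Design matrix: unit line-of-sight vectors plus clock column.
        A = np.hstack([(x[:3] - sat_pos) / rho[:, None],
                       np.ones((len(rho), 1))])
        dx = np.linalg.lstsq(A, pseudoranges - (rho + x[3]), rcond=None)[0]
        x += dx
    return x
```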
5. Divisional ambiguity resolution for long range reference stations in network RTK Increasing the distance between reference stations in network RTK can improve the flexibility of reference station siting, reduce the number of reference stations and cut construction and maintenance costs. However, it also decreases the spatial correlation of errors and affects the real time performance and effectiveness of network RTK positioning. Thus, the key techniques of long range network RTK need to be further optimised, and ambiguity resolution between reference stations is one of the most critical. In this paper, a divisional ambiguity resolution method for long range reference stations in network RTK is proposed to deal with the long fixing time and low success rate, especially for low elevation satellites. Building on the conventional two-step ambiguity resolution, the new method first calculates the wide-lane ambiguity by forming the Melbourne–Wübbena (M-W) combination of pseudo-range and carrier observations. Second, instead of resolving the conventional ambiguities as a whole, the satellites are divided into high elevation and low elevation groups, and the integer ambiguities of the high elevation satellites are resolved by means of the ionosphere-free combination with the relative zenith wet delay as a parameter. Finally, a Kalman filter model is established from the observation equations of the high elevation satellites with resolved ambiguities and of the low elevation satellites, helping to obtain the ambiguities of the low elevation satellites quickly. A 196 km baseline is selected for comparative analysis. The results show that it takes 733 s to resolve the ambiguities of all satellites with conventional ambiguity resolution as a whole, and 252 s with the divisional ambiguity resolution proposed in this paper. The new method saves about two thirds of the resolution time and also greatly improves the efficiency of ambiguity resolution for low elevation satellites on long baselines. Further information:
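The first step, the Melbourne–Wübbena (M-W) combination, can be sketched directly from its standard definition; GPS L1/L2 frequencies are assumed here, and all inputs are taken in metres.

```python
# Sketch of the Melbourne-Wuebbena combination for the float wide-lane ambiguity.
import numpy as np

C = 299_792_458.0
F1, F2 = 1575.42e6, 1227.60e6            # GPS L1/L2 frequencies (Hz)
LAM_WL = C / (F1 - F2)                   # wide-lane wavelength, ~0.862 m

def widelane_ambiguity(L1, L2, P1, P2):
    """L1, L2: carrier phase (m); P1, P2: pseudorange (m), arrays over epochs.
    Wide-lane phase minus narrow-lane code leaves the wide-lane ambiguity."""
    mw = (F1 * L1 - F2 * L2) / (F1 - F2) - (F1 * P1 + F2 * P2) / (F1 + F2)
    return np.mean(mw) / LAM_WL          # float value, rounded to fix the integer
```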
6. Detection of high speed railway track static regularity with laser trackers Track regularity is of vital importance to the safety of high speed railway operation. A laser tracker can collect highly accurate three-dimensional (3D) point measurements and is therefore considered a promising surveying technique for the detection of railway track static irregularity, as opposed to using a total station. This study proposes a new approach that uses a laser tracker as the main sensor for obtaining the coordinates of left- and right-track points to detect potential track static irregularities. In this method, the reflecting target of the laser tracker is mounted on a track inspection trolley moving in a round trip along the railway track. A field experiment was conducted to validate the approach by comparing the results with field measurements gathered using a track inspection trolley. The results show that the track static regularity detection method with laser trackers is feasible and indicate that track geometry parameters such as gauges, elevations and lateral deviations of the centreline, superelevations, lateral profiles and vertical profiles obtained using the laser tracker and a track inspection trolley are in good agreement. The average deviations of track centreline elevations, lateral deviations and gauges are 0·8, 0·7 and 0·3 mm, respectively. Further information:
7. Research on regional zenith tropospheric delay based on neural network technology Tropospheric delay is a primary error source in earth observations and a variety of radio navigation technologies. However, the main problem remains that not all strategic points around the world will have a GPS receiver. To overcome this shortcoming, a fusion model incorporating a back propagation neural network (BPNN) is proposed to compensate for errors in the prediction of zenith tropospheric delay (ZTD). The input parameters include the surface meteorological data (altitude, atmospheric pressure, absolute temperature and water vapour partial pressure) and the Hopfield model predicted ZTD; the output is the difference between the Hopfield predicted and high altitude balloon derived ZTD (taken as the true ZTD). The datasets covered ∼250 meteorological observation stations between 8:00 and 20:00 on May 1st, 10th and 30th, July 1st, August 1st and September 1st, 2010. Data from 16 uniformly distributed stations at 8:00 on May 1st were used for training, and the remainder for validation. Modelling results from the Hopfield and current model were compared by reference to the true ZTD. The current model generates an average RMSE value of 0.0040 m, compared with 0.3445 m for the Hopfield model. Overall, the current model can improve ZTD prediction accuracy by more than 90%. In addition, ZTD predictions from the current model were compared with those obtained directly from GPS data, indicating that our model provides a good alternative for ZTD prediction when a GPS receiver at a specific location is absent. Further information:
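A minimal sketch of the fusion idea: a small network learns the residual between the Hopfield-predicted and true ZTD from the surface meteorological inputs listed in the abstract. The scikit-learn MLP, feature ranges and synthetic residual below are illustrative assumptions, not the paper's configuration.

```python
# Hedged sketch: BPNN learns the Hopfield-vs-true ZTD residual.
import numpy as np
from sklearn.neural_network import MLPRegressor

# X columns: altitude (m), pressure (hPa), temperature (K),
# vapour pressure (hPa), Hopfield-predicted ZTD (m); y: true - Hopfield ZTD (m).
rng = np.random.default_rng(0)
X = rng.uniform([0, 850, 250, 0, 1.8], [3000, 1030, 320, 40, 2.8], size=(200, 5))
y = 0.05 * np.sin(X[:, 0] / 1000) + 0.001 * (X[:, 3] - 20)   # synthetic residual

net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
net.fit(X, y)
ztd_corrected = X[:, 4] + net.predict(X)   # Hopfield prediction + learned residual
```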
8. Three-dimensional lunar surface reconstruction based on remote sensing imagery from Chang'E-2 In this paper, we propose a complete solution for effective, automatic and accurate reconstruction of the lunar surface based on the linear array push-broom imagery from Chang'E-2 (CE-2). First, with the sparse ephemeris data, an approach for estimating the corresponding areas between forward (F) imagery and backward (B) imagery is proposed by exploiting the imaging characteristics of linear push-broom cameras. Second, feature based matching (FBM) is conducted between F and B imagery, followed by area based matching (ABM) for dense correspondence. Third, the extrinsic parameters of every scan line are estimated according to the supplied sparse ephemeris, and the orientation matrix is derived. Finally, the digital ortho map (DOM) and digital elevation model (DEM) of one orbit are generated automatically. The results show that the relative and absolute accuracies of triangulation are relatively high and generally acceptable. The visualised 3D realistic scene and the mosaicked DOM and DEM of the lunar surface validate the feasibility of the proposed solution. Further information:
Survey Review 47, No 342. May/June 2015
1. Efficient obstruction analysis for GNSS relative positioning of terrestrial mobile mapping system A mobile mapping system (MMS) is a combination of several modern sensors and is capable of acquiring detailed spatial information in a fast and automatic manner. However, the result obtained is highly dependent on the quality of the global navigation satellite system (GNSS) solution, which is expected to constantly provide accurate positioning for the system. As a consequence, in urban or mountainous areas where satellite signals are frequently obstructed by local topography, the reliability of a terrestrial MMS is not always guaranteed. In this study, a rigorous approach for predicting GNSS satellite reception under topographic obstruction along an MMS survey path is developed. By utilising a digital terrain model and a line-of-sight analysis, visible satellites for a moving platform are identified and used for estimating positioning quality under a GNSS relative positioning model. Based on the results from real case studies, it is shown that the proposed approach provides a reliable prediction of GNSS satellite obstruction and relative positioning quality. Consequently, the locations and/or epochs associated with poor GNSS positioning quality can be identified beforehand, and an appropriate survey schedule can be arranged. Further information:
2. Irregular variations in GPS time series by probability and noise analysis The character of the topocentric components in ETRF2000(R08) from the Polish ASG-EUPOS system was analysed using skewness and kurtosis derived from the data probability density function. Data from 115 permanent GPS stations with a time span of more than 5 years were used. The main goal of this research was to show that any unmodelled systematics can disrupt the results. The obtained median values of skewness and kurtosis clearly indicate discrepancies between the assumed normality of the GPS time series distribution and reality, mainly due to stochastic and/or deterministic parts still present in the data. A quadratic relationship between skewness and kurtosis was developed with empirically determined constants. Noise analysis using Maximum Likelihood Estimation was also performed under the assumptions of white and white plus power-law noise. The estimated spectral indices for power-law noise are close to −1 (flicker noise). The uncertainties of the intraplate velocities under the white and white plus power-law noise assumptions were calculated, and it was shown that these uncertainties can be underestimated by up to 5 mm/year. Additionally, we ran 1000 simulations aimed at showing how the values of skewness and kurtosis change when some mismodelled part of the data remains in the time series. We assumed different values of trend (1 and 6 mm/year), seasonal amplitude (1 and 6 mm) and offset (3 and 10 mm), with flicker noise of 1 mm amplitude, to see how the metrics we use are biased. Further information:
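The screening step can be sketched directly: skewness and excess kurtosis of detrended residuals should both be near zero for a purely Gaussian series, and an unmodelled seasonal term (left in deliberately below) biases both metrics, which is the effect the paper's simulations quantify. The toy series is illustrative.

```python
# Sketch of the probability-based screening of a GPS coordinate time series.
import numpy as np
from scipy import stats

t = np.arange(2000, dtype=float)
series = (0.002 * t + 3.0 * np.sin(2 * np.pi * t / 365.25)
          + np.random.randn(t.size))                 # mm, toy daily series

# Remove the trend only; the seasonal signal is deliberately left unmodelled.
residuals = series - np.polyval(np.polyfit(t, series, 1), t)

skew = stats.skew(residuals)
kurt = stats.kurtosis(residuals)       # excess kurtosis, 0 for a normal law
```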
3. Application of Msplit estimation to determine control points displacements in networks with unstable reference system This paper presents the use of Msplit estimation to determine the displacements of control points in an unstable reference system. The theoretical part presents a strategy for adjusting observation sets measured in several measurement epochs using an additional (virtual) measurement epoch. The paper also includes several examples demonstrating possible applications of the method in surveying practice. The examples demonstrate the efficiency of a virtual measurement epoch in adjusting observation sets measured in an unstable reference system and also when gross errors appear in observations. Further information:
4. Combined adjustment of angle and distance measurements in a dam monitoring network The safety control and structural analysis of Portuguese large dams has been using, for many decades, small triangulation networks to monitor the horizontal displacements of a carefully selected set of target points on the dams. The recent surge in the use of motorised tacheometers, which enable automatic measurement of angles and distances, is changing traditional triangulation networks into combined angles/distances networks. This paper addresses the estimation of weights for the combined adjustment of angles and distances. Further information:
5. Method for precise determination of eccentric instrument set-ups A method for precise determination of station eccentre elements is described. The approach presented is based on adapting a digital camera to an optical plummet. The measurement set is described, as are the problems that arise in its determination and application together with their solutions, including the necessary algorithms. Tests have been performed that confirm the effectiveness of the proposed approach. The estimated accuracy in determining eccentre elements corresponds to centring with an accuracy of at least 0.1 mm. Further information:
6. Adapting 2D positional control methodologies based on linear elements to 3D In this study we describe the adaptation of several 2D positional control methodologies based on linear elements to 3D. The selected methods have been used previously in several studies for controlling different kinds of lines and cartographic products, but always in a planimetric way. This study analyses the consequences of the inclusion of the heights in these methods by adapting the processes of calculation to the Z component. The methods to be adapted have been selected from the literature and are related to those based on calculations of distances and areas. The proposed methodology is applied to a real case, more concretely to a road extracted from two official databases in Spain, in order to check it and as a preliminary test for performing future 3D positional controls using these methods. Further information:
7. Assessment of underground wine cellars using geographic information technologies Geographic information technologies (GIT) are essential to many fields of research, such as the preservation and dissemination of knowledge of cultural heritage buildings, a category which includes traditional underground wine cellars. This paper presents a methodology based on research carried out on this type of rural heritage building. The data were acquired using the following sensors and techniques: EDM, total station, close range photogrammetry and laser scanning, and subsequently processed with specific software, verified for each case, in order to obtain a satisfactory graphic representation of these underground wine cellars. Two key aspects of this work are the accuracy of the data processing and the visualisation of these traditional constructions. The methodology includes an application for geovisualising these traditional constructions on mobile devices in order to contribute to raising awareness of this unique heritage. Further information:
8. Effect of sea level rise in the validation of geopotential/geoid models in Metro Manila, Philippines The release of new global geopotential models (GGMs) has raised the question of whether these GGMs could now supplant the development of regional/local geoid models. For geodetic surveying purposes, a geoid model fitted to local conditions will greatly help in the resolution of many vertical datum issues, especially for an archipelagic country such as the Philippines. Since the geoid, as a vertical reference surface, is used to convert ellipsoidal heights acquired through GNSS surveys to orthometric heights, it is vital that it should at least match the heights derived through geodetic levelling within the allowed accuracy. However, the use of mean sea level (MSL) as the vertical reference for geodetic levelling introduces another concern, especially where accelerated sea level rise (ASLR) is evident. The results of the validation and comparison of recent GGMs show that EGM2008 and the 2012-released EIGEN-6C2, both classified as combined-type models, have RMS values of around 14 and 13 cm, respectively, in the study area. These results were achieved when ASLR was considered in the validation. The local geoid model (LGM), developed using the least squares modification of Stokes formula with additive corrections (LSMSA) approach, has an RMS of around 11 cm using an integration cap of 3° from the limit of the study area. The LGM validated the level of the combined-type GGMs with reference to the local vertical datum in Manila Bay. Based on the results of this study, the ASLR contributed much to the departure of the latest MSL from the geoid. Further information:
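The validation arithmetic behind the quoted RMS figures rests on the relation H = h − N between orthometric height H, ellipsoidal height h and geoid undulation N: a geoid model is judged by the misfit between its undulations and the GNSS-minus-levelling values. A minimal sketch with illustrative numbers:

```python
# Sketch of GGM validation against GNSS/levelling benchmarks; values illustrative.
import numpy as np

h = np.array([45.312, 37.804, 51.120])     # GNSS ellipsoidal heights (m)
H = np.array([2.150, -5.290, 8.010])       # levelled orthometric heights (m)
N_model = np.array([43.30, 42.95, 43.25])  # GGM geoid undulations (m)

misfit = N_model - (h - H)                 # model minus geometric undulation
rms = np.sqrt(np.mean(misfit ** 2))        # the kind of RMS figure quoted above
```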
9. Neo-cadastres: innovative solution for land users without state based land rights, or just reflections of institutional isomorphism? In many countries, authority for maintaining records of land ownership lies with national or state institutes, called ‘cadastres’ or ‘land registries’. The emergence of volunteered geographic information (VGI) and crowdsourcing potentially challenges this state based authority, enabling the construction of ‘neo-cadastres’ (by analogy with ‘neogeography’). Individual citizens can themselves map and record land tenure rights. This paper explores if and how VGI and crowdsourcing may redefine the state based cadastres and land registries, and the roles of land users who claim land rights outside of the state based institutions. Using theories of institutions and isomorphism, we hypothesise that a parallel exists between emerging open systems and situations where no state based cadastre exists: participants in both situations will position themselves to protect their interests. Three cases from Ghana, Canada and Indonesia demonstrate how land users indicate their land tenure right boundaries based on personal views (the neo-cadastre), rather than on rules stipulated by a national authority (the traditional cadastre). In each case, land users root their behaviour partly in local dependency relations and social advocacy networks. These locally embedded rules may not necessarily coincide with hierarchical institutional relations. Even in participatory adjudication activities conducted under the authority of national cadastres, land users are not completely free in providing their land boundaries. Instead, they adhere to certain locally embedded microsocial conventions. We conclude that VGI and crowdsourcing based neo-cadastres will likely redefine the roles of land users in cadastres from passive to active. This creates an opportunity for citizens, but also a potential risk. Where the implicit rules of the neo-cadastre do not co-evolve with traditional cadastral institutions, conflicts over land information and access to land may emerge. Neo-cadastres would then be an artefact of this dissatisfaction, and may reflect new directions for cadastral institutions. Neo-cadastres will not be a direct trigger for wider cadastral change, but a piece of evidence that change and resistance are occurring. Further information:
Survey Review 47, No 341. March/April 2015
1. Mine surface deformation monitoring using modified GPS RTK with surveying rod: initial results The reasonable and effective exploitation and optimal utilisation of coal resources, as well as ensuring the safe operation of the main buildings, railways, high-voltage power lines and other facilities of mining-influenced areas, are the main technical requirements of the coal industry. The key problem is to implement methods and techniques that acquire deformation information, summarise the deformation laws, and ultimately serve mine production and construction. For the surface deformation of mining areas, traditional measurement techniques (automatic/digital levels, theodolites, total stations, etc.) are generally used to measure the well-arranged (several hundred) monitoring points repetitively, obtain deformation information and calculate the deformation parameters. These parameters are used to design protective coal pillars. However, surface deformation is expensive to monitor and, in several coal mine areas, particularly in mountainous regions, cannot be measured by traditional techniques at all. Hence, effective techniques must be employed to obtain deformation data. Although global positioning system real-time kinematic (GPS RTK) technology provides an effective solution to this problem, traditional GPS RTK with a surveying rod can only determine deformation at centimetre-level positioning precision because the rod is subject to vertical deviation and shaking errors; thus, it cannot satisfy the precision requirements of surface deformation monitoring. A new GPS RTK surveying method involving rod measuring is proposed in this study to address these issues. This method can effectively avoid the effects of the vertical deviation and shaking errors of the surveying rod, weaken the impact of multipath error in the U direction, and further improve positioning precision. Experimental results show that, at a 1 s sampling interval, the standard deviations estimated by the proposed method in the N, E and U directions are 11·4, 8·9, and 4·9 mm for 10 s observation, respectively, and 8·9, 5·1, and 4·0 mm for 20 s observation. The impact of the major error source was reduced, and positioning precision improved significantly. This result provides strong technical support for fast and high-precision mine surface deformation monitoring. Further information:
2. Development and evaluation of GNSS/INS data processing software for position and orientation systems Currently there are few supportive and reliable commercial software packages for GNSS/INS data processing. The shortage of compatible GNSS/INS data processing software has become a bottleneck in the development of position and orientation systems (POS) for survey and mapping applications. This paper therefore introduces a GNSS/INS data processing software package called Cinertial, recently developed in China, with open definitions of data formats and parameters. The algorithm design of the software is described in detail, including the realisation of INS mechanisation, Kalman filtering and backward smoothing. The developed software is tested and evaluated by comparing it with mainstream commercial software through processing the same field test datasets. A precise and feasible comparison procedure is proposed in the paper, so as to evaluate the quality of any new POS software in an efficient, convenient and cost-effective way. The procedure was applied to compare the new software with two well-known commercial software packages by processing airborne and terrestrial datasets, respectively. The results show that the new software can achieve the same level of accuracy as current commercial software on the market, reflecting new progress in POS development in China. Further information:
3. Estimation of multi-constellation GNSS observation stochastic properties using single receiver single satellite data validation method The single receiver single satellite validation method is a technique that screens data from each satellite independently to detect and identify faulty observations. A new method for estimating the stochastic properties of multi-constellation GNSS observations is presented, utilising parameters of this validation method. Agreement of the characteristics of the validation statistics with theory is used as the criterion to select the best precision of the observations, and the spectral density and correlation time of the unknowns. A curve fitting approach in an iterative scheme is employed. The method is applicable to any GNSS with any arbitrary number of frequencies. The method's results and performance are demonstrated using multiple-frequency data from GPS, GLONASS and Galileo in static and kinematic modes. Further information:
4. Determination of local geoid model in Attica Basin, Greece Orthometric elevations are nowadays determined not only through spirit or trigonometric levelling but also using GNSS techniques, on condition that a geoid model exists in the area of interest. In this paper, the estimation of a local geoid model in the area of the Attica Basin is presented. For this purpose, a network of 15 points was established and measured using GNSS techniques. From the existing orthometric elevations and the geometric heights obtained after the adjustment, the geoid heights were determined. The best fitting surface approximating the local geoid in the Attica Basin is estimated and presented using three different parametric models as well as interpolation functions. Further information:
5. Photogrammetric techniques and surveying applied to historical map analysis Since historical maps are kept in digital archives, they carry not only the planimetric errors attributable to the cartographer but also those introduced during digitisation. This paper describes a methodology based on photogrammetric techniques and surveying for evaluating the accuracy of historical maps. The aim of the study is to take reference points on tracing paper from the original map, together with ground control points, and use photogrammetric techniques to recover the metrics of the digitised map. This ensures that the historical map in digital form has the same metric quality as the original map, since the errors arising from scanning are removed. The method is applied to the 1775 map of the Real Sitio de Aranjuez (Spain), demonstrating that photogrammetric techniques are useful tools for the correction and analysis of historical maps. Consequently, photogrammetric techniques applied to the analysis of historical maps open up new perspectives for the study of historical cartography. Further information:
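Recovering the metrics of a scanned map typically reduces to estimating a transformation from image coordinates to ground coordinates using control points. A sketch with a 2D affine model and hypothetical point pairs; the paper's actual photogrammetric model may differ:

```python
import numpy as np

# Estimate a 2D affine transform (pixel -> ground) from control points,
# then apply it to digitised map features. Coordinates are hypothetical.
px = np.array([[120, 340], [980, 310], [150, 1200], [1010, 1180]], float)
gnd = np.array([[4500.0, 3200.0], [5400.0, 3210.0], [4520.0, 2310.0],
                [5420.0, 2330.0]])

# x' = a*x + b*y + c ; y' = d*x + e*y + f, solved as one least squares system
n = len(px)
A = np.zeros((2 * n, 6))
A[0::2, 0:2], A[0::2, 2] = px, 1.0
A[1::2, 3:5], A[1::2, 5] = px, 1.0
params, *_ = np.linalg.lstsq(A, gnd.reshape(-1), rcond=None)

def to_ground(p):
    a, b, c, d, e, f = params
    return np.array([a * p[0] + b * p[1] + c, d * p[0] + e * p[1] + f])

print(to_ground([500.0, 700.0]))   # map a digitised point to ground units
```

With more than the minimum three control points, the least squares residuals also quantify the map's planimetric accuracy.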
6. Target identification in terrestrial laser scanning Target identification is an important process in terrestrial laser scanner (TLS) measurements; however, owing to strong competition between manufacturers, the design of laser scanners is kept secret and is usually protected by accompanying proprietary software. Moreover, the target identification algorithms (i.e. definitions of the target centre) are not specified. This makes it difficult for users to objectively compare scanners from different manufacturers and to judge the reliability of the scan data captured by a brand scanner and its accompanying software. This paper presents a unified general method for target identification. The proposed method consists of four major steps: first, determination of the target plane; second, classification of the reflection intensity values and extraction of the border between white and black; third, detection and elimination of erroneous points from step two; and fourth, fitting of the intersection lines and calculation of the centre of the two lines. Because TLS is a reflectorless surveying mode that can receive hundreds of signals, its measurements require more stringent objective conditions than traditional measurement by total stations (TS). Therefore, robust estimation methods are used to reduce the influence of random errors; moreover, the errors-in-variables (EIV) model is also introduced to deal with the captured data. Finally, the target's centre is obtained from an iteration process. For the experiments, a Leica HDS 7000 terrestrial laser scanner, with its accompanying software, Cyclone, and a Leica Laser Tracker AT901 were employed. The performance of the proposed method is compared with Cyclone and with earlier methods from published studies at different resolutions and distances. The paper concludes that the proposed method can obtain reliable results at the same level of accuracy as those obtained using the accompanying software; thus, it provides an objective means to compare the quality of different scanners. The advantage is that the method only makes use of information provided by all scanners and does not require additional proprietary information that cannot be accessed. Further information:
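Steps one and four of the pipeline (fitting the target plane, then intersecting the two border lines) can be sketched compactly; the robust reweighting and EIV handling of steps two and three are omitted, and the points below are synthetic:

```python
import numpy as np

# Plane fit by SVD and 2D line intersection, two steps of the pipeline above.
def fit_plane(pts):
    """Return (centroid, unit normal) of the best-fit plane through pts."""
    c = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - c)
    return c, vt[-1]                      # smallest singular vector = normal

def intersect_lines(p1, d1, p2, d2):
    """Intersect two 2D lines given as point + direction."""
    A = np.column_stack([d1, -d2])
    t = np.linalg.solve(A, p2 - p1)
    return p1 + t[0] * d1

pts = np.random.default_rng(2).normal(0, 0.001, (200, 3))
pts[:, 2] = 0.0                           # synthetic points near the z=0 plane
c, nrm = fit_plane(pts)
centre = intersect_lines(np.array([0.0, -1.0]), np.array([1.0, 1.0]),
                         np.array([0.0, 1.0]), np.array([1.0, -1.0]))
print(nrm, centre)                        # ~[0, 0, ±1] and [1, 0]
```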
7. Evaluation of cadastre renovation studies in Turkey In Turkey, cadastral surveys have been carried out by various measurement techniques, at different standards and based on different legislation. Consequently, one of the most important problems with the cadastre is that approximately half of the current cadastral maps are insufficient to meet the expectations of today’s modern cadastre. In order to resolve the problem and renew the cadastral maps, the General Directorate of Land Registry and Cadastre commenced the Land Registry and Cadastre Modernization Project in 2009. Within the scope of the project, it is intended that 10 million parcels will be renewed by the end of 2014. This study aims to evaluate the cadastre renovation studies in Turkey with respect to their legal and technical aspects using a case study area. Results show that cadastre renovation studies may be insufficient to enrich and update the cadastral data. As a solution, legal amendments should be made in the Cadastre Law in a manner that will allow a new cadastral survey to be performed. Further information:
Survey Review 47, No 340. January/February 2015 1. Use of genetic algorithm and sliding windows for optimising ambiguity fixing rate in GPS kinematic positioning mode Two keys to achieving high precision positioning results using GPS carrier phase observations are the data differencing technique and the ambiguity resolution process. The double differencing technique has been widely used to reduce biases in GPS observations. However, un-modelled biases still remain in the GPS observations, and they can reduce the number of ambiguity-fixed solutions, especially in the GPS kinematic positioning mode. Therefore, noisy or unwanted GPS satellites must be identified and removed from the data processing step. A previous study successfully demonstrated the use of a Genetic Algorithm (GA) to select the best combination of GPS satellites, improving the number of ambiguity-fixed solutions in GPS kinematic positioning mode. Further investigation has been carried out to enhance the number of ambiguity-fixed solutions by varying the finite-length selection windows used in the data processing step. This paper presents the methodology and test results obtained from optimising the ambiguity fixing rate using a sliding window and a GA. Further information:
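The GA part of the approach amounts to searching over satellite subsets for the one that maximises the fix rate. A toy sketch of that search, in which the fitness function and the "noisy" satellites are stand-in assumptions for the real ambiguity-fixing statistics:

```python
import random

# Toy GA over satellite subsets: individuals are bit masks; the fitness
# function is a stand-in for the ambiguity fix rate the paper optimises.
random.seed(0)
N_SATS, POP, GENS = 10, 20, 40
noisy = {2, 7}                            # pretend these satellites are noisy

def fitness(mask):
    used = [i for i in range(N_SATS) if mask[i]]
    if len(used) < 5:                     # need enough satellites for a fix
        return 0.0
    return len(used) - 3.0 * sum(i in noisy for i in used)

pop = [[random.randint(0, 1) for _ in range(N_SATS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[: POP // 2]             # keep the fitter half
    children = []
    for _ in range(POP - len(parents)):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, N_SATS)         # one-point crossover
        child = a[:cut] + b[cut:]
        i = random.randrange(N_SATS)              # single-bit mutation
        child[i] ^= 1
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
print([i for i in range(N_SATS) if best[i]])      # noisy sats excluded
```

The sliding window enters by re-running this selection over successive finite-length data windows rather than once for the whole session.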
2. Reducing distance dependent bias in low-cost single frequency GPS network to complement dual frequency GPS stations in order to derive detailed surface deformation field A total of 17 low-cost single-frequency L1 global positioning system (GPS) receivers with real-time internet transmission have been set up since 2008 to densify the pre-existing network of continuously operating reference stations (CORS) in southeastern Taiwan. The main objective of this study is to investigate the validity and uncertainty of the L1 stations in southeastern Taiwan. It is well known that the main error source of single-frequency GPS relative positioning in low latitude areas is atmospheric delay, even if the relative distance is only a few kilometres. In this study, two correction algorithms, adopting local ionospheric models and applying correction terms from local CORS, are tested to estimate the long-period accuracy of station positioning. The results indicate that the standard deviation of calibrated relative positioning increases linearly with baseline length. The positioning accuracies derived from applying correction terms from CORS are satisfactory, with standard deviation/baseline ratios of 0·11±0·02, 0·12±0·02 and 0·44±0·06 mm km–1 in the north, east and up components, respectively, for relative distances under 30 km. The corresponding positioning scatter amounts to 3, 3 and 13 mm in the north, east and up components, respectively. Although the local ionospheric model algorithm can significantly reduce positioning variation, especially in the north component, the correction terms method yields the best positioning results for all three components. Further information:
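The quoted mm km–1 ratios are slopes of positioning scatter versus baseline length. A minimal sketch of estimating such a ratio with a least squares line through the origin; the baseline lengths and scatter values below are invented:

```python
import numpy as np

# Fit std = k * baseline through the origin; k is the mm/km ratio.
baseline_km = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
std_up_mm = np.array([2.3, 4.2, 6.8, 8.7, 11.2, 13.1])   # hypothetical scatter

k = (baseline_km @ std_up_mm) / (baseline_km @ baseline_km)
resid = std_up_mm - k * baseline_km
k_sigma = np.sqrt(resid @ resid / (len(baseline_km) - 1)
                  / (baseline_km @ baseline_km))
print(f"up component: {k:.2f} ± {k_sigma:.2f} mm/km")
```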
3. Linear observation based total least squares This paper presents a total least squares (TLS) method, formulated iteratively, for the case where the observations are linear, with applications in two-dimensional linear regression and three-dimensional coordinate transformation. Second-order small terms are preserved, and the unbiased solution and the variance component estimate are both obtained rigorously from traditional non-linear least squares theory. Compared with the traditional TLS algorithm dealing with the so-called errors-in-variables (EIV) model, this algorithm treats all the observations involved in the observation vector and the design matrix equally; the non-linear adjustment with constraints, or the partial EIV model, can also be solved using the same method. In addition, all the observation errors can be considered in the heteroscedastic or correlated case, and the calculation of the solution and the variance component estimate is much simpler than in the traditional TLS and its related improved algorithms. Statistical experiments show the deviations between the designed true values of the variables and those estimated by this algorithm and the traditional least squares algorithm respectively, and the mean value of the posterior variance in 1000 simulations of coordinate transformation is computed as well, to test and verify the efficiency and unbiasedness of this algorithm. Further information:
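For contrast with the iterative formulation described here, the classical EIV line fit has a closed-form solution via the SVD. A sketch on simulated data (the true line and noise levels are assumptions):

```python
import numpy as np

# Classical errors-in-variables line fit by SVD-based total least squares.
rng = np.random.default_rng(3)
x_true = np.linspace(0, 10, 50)
y_true = 2.0 * x_true + 1.0
x = x_true + rng.normal(0, 0.1, 50)       # noise in the "design matrix" too
y = y_true + rng.normal(0, 0.1, 50)

# TLS minimises orthogonal distances: take the smallest right singular
# vector of the centred data matrix [x - mean, y - mean].
xc, yc = x - x.mean(), y - y.mean()
_, _, vt = np.linalg.svd(np.column_stack([xc, yc]))
a_tls = -vt[-1, 0] / vt[-1, 1]            # slope from the null direction
b_tls = y.mean() - a_tls * x.mean()
print(f"slope ≈ {a_tls:.3f}, intercept ≈ {b_tls:.3f}")
```

Unlike ordinary least squares, which attributes all error to y, this fit spreads the error over both coordinates, which is the situation the paper's linear observation formulation generalises to correlated and heteroscedastic cases.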
4. On evaluation of different methods for quality control of correlated observations This paper evaluates, compares, and discusses different methods for quality control in geodetic data analysis in the general scenario of correlated observations and multiple outliers. The investigated methods are the data snooping procedure, statistical tests for multiple outliers, the recently proposed quasi-accurate detection of outliers method for correlated observations, the Danish method for correlated observations, the robust estimator for correlated observations based on bifactor equivalent weights, and the robust estimator for correlated observations based on a local sensitivity downweighting strategy. To evaluate these methods, outliers of between 3σ and 9σ in magnitude (positive and/or negative) are randomly generated and added to some observations (σ being the standard deviation of the respective observation) in two different global navigation satellite system (GNSS) networks that contain correlated observations. For each network, 15 000 scenarios are run using Monte Carlo simulation: 5000 with one outlier, 5000 with two outliers, and 5000 with three outliers. The investigated methods have advantages and limitations, and the discussions and conclusions about the experiments are presented in detail. Further information:
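Data snooping, the first of the listed methods, tests each observation's normalised residual against a critical value. A toy sketch for correlated observations; the one-unknown network, cofactor matrix and injected outlier below are invented:

```python
import numpy as np

# Baarda-style data snooping: compute normalised residuals w_i and flag
# the largest against a critical value (3.29 for alpha ≈ 0.001).
A = np.array([[1.0], [1.0], [1.0], [1.0]])      # four observations of one unknown
Q = 0.25 * (np.eye(4) + 0.2 * (np.ones((4, 4)) - np.eye(4)))  # correlated cofactors
y = np.array([10.01, 9.99, 10.02, 10.60])        # last observation has an outlier

W = np.linalg.inv(Q)
x_hat = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
v = y - A @ x_hat                                # residuals
Qv = Q - A @ np.linalg.inv(A.T @ W @ A) @ A.T    # cofactor matrix of residuals

w = np.empty(4)
for i in range(4):
    e = np.zeros(4); e[i] = 1.0                  # test one outlier at a time
    w[i] = (e @ W @ v) / np.sqrt(e @ W @ Qv @ W @ e)

print(np.round(w, 2), "-> flag:", np.argmax(np.abs(w)), "(|w| > 3.29?)")
```

With multiple simultaneous outliers this one-at-a-time test can mask or swamp, which is precisely why the paper compares it against robust estimators.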
5. Progress of cost recovery on cadastre based on land management implementation in Turkey The General Directorate of Land Registry and Cadastre (GDLRC, in Turkish TKGM), which is under the authority of the Prime Ministry of the Republic of Turkey, has drawn on the private sector for the production of cadastral products since 2005, making production more effective and economical. During this process, the Licensed Surveying Engineer and Bureau (LSEB, in Turkish LİHKAB) was established, and on-demand change operations following the completion of cadastral works were transferred to the LSEB. In addition, the GDLRC has begun improving data quality to meet the cadastral data standards of the European Union. Moreover, since 1993, cadastral maps and project information have been sold to interested parties by the GDLRC through a revolving-fund scheme. It is intended that Turkey's experience with revolving funds should be of benefit to other countries considering similar arrangements. In this context, the progress of cost recovery is investigated using selected areas as examples. The data obtained show that cost recovery proceeds rapidly in areas where property transactions are intensive. However, if the diversity of cadastral data were increased during cost recovery on cadastral works in Turkey, the cost recovery system would be much more effective than it is at present; the insufficient diversity of cadastral data is found to be a disadvantage for the cost recovery system in Turkey. Finally, to address the problems in cost recovery, work on enriching the scope and content of the cadastre should be started; initiating this enrichment work would be beneficial and would sustain the progress of cost recovery. Further information:
6. Core immovable property vocabulary for European linked land administration Public information regarding immovable property is recorded in national registries, including the cadastre and land registry, the building and dwelling (or address) registry, and the property tax registry. An efficient land administration system is supposed to orchestrate these registries for cross-border and cross-sector land administration services. Recent developments in Semantic Web technologies and the Linked Data approach provide an efficient and flexible solution for web based integration of the datasets recorded in these registries. The registries concerned were specified according to different data models and standards, which leads to interoperability problems. The public organisations concerned therefore need a common data model that clearly identifies the recording units of these registries and their core attributes, as well as the relationships between them. The present research responds to this requirement by developing a common vocabulary, a Core Immovable Property Vocabulary, as an extension to the e-Government Core Vocabularies (version 1·0) recently issued by the European Commission (EC). The Core Immovable Property Vocabulary is developed by simplifying and capturing the minimal characteristics of complex domain standards and data specifications, thereby enabling a plain representation of immovable property units and the core attributes needed for land administration processes. It thus allows land administration datasets to be published within the Resource Description Framework (RDF) and facilitates their integration with other registries, such as address and civil registries, encoded according to the e-Government Core Vocabularies. The vocabulary is the result of an individual research effort carried out by the authors of the present article, and has recently been hosted on the EC ISA (Interoperability Solutions for European Public Administration) Joinup platform as a semantic asset (see https://joinup.ec.europa.eu/asset/cipv/description). The main contribution is the introduction of the Linked Data approach to the land administration domain, which may make land administration datasets more accessible via the Web. Further information:
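Publishing a property unit as RDF along these lines takes only a few triples. A sketch with the rdflib library; the class and property names (cipv:Parcel, cipv:identifier, cipv:area) and the namespace URI are illustrative guesses, not the vocabulary's actual terms:

```python
from rdflib import Graph, Literal, Namespace, RDF

# Publish one parcel as Linked Data; vocabulary terms are hypothetical.
CIPV = Namespace("http://example.org/cipv#")
EX = Namespace("http://example.org/parcels/")

g = Graph()
g.bind("cipv", CIPV)
parcel = EX["parcel-42"]
g.add((parcel, RDF.type, CIPV.Parcel))
g.add((parcel, CIPV.identifier, Literal("TR-06-0042")))
g.add((parcel, CIPV.area, Literal(1250.0)))          # square metres, assumed

print(g.serialize(format="turtle"))
```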
7. On the establishment and implementation of GPS CORS for cadastral surveying and mapping in Indonesia Fast integer ambiguity resolution for single epoch observations is one of the main issues of GPS precise positioning in real time surveying applications. An improved combination of the dual frequency correlation method (DUFCOM) and the direct calculation method (DC), named the fast ambiguity resolution for single epoch (FARSE) scheme, is proposed in this paper. A software package based on the proposed scheme, Gsertcas, is developed for monitoring construction cranes. With the help of Gsertcas, the performance and suitability of FARSE are investigated through simulation experiments. The experimental results demonstrate that the success rate of ambiguity resolution is above 97% and that the root mean square of the position solution with correct ambiguity resolution is better than 3·8 mm. Further information:
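The last step of any single-epoch scheme is mapping float ambiguities to integers and validating the fix. A generic textbook sketch of that step (nearest-integer rounding with a simple ratio test), offered only as a placeholder for the DUFCOM/DC/FARSE algorithms, whose details are not given here:

```python
import numpy as np

# Round float ambiguities to integers and validate with a ratio test.
def fix_ambiguities(a_float, Qa, ratio_threshold=3.0):
    a_int = np.round(a_float)                       # nearest-integer candidate
    Qinv = np.linalg.inv(Qa)
    def dist(a):
        d = a_float - a
        return d @ Qinv @ d
    # second-best candidate: perturb the least certain ambiguity by one cycle
    j = np.argmax(np.diag(Qa))
    a_2nd = a_int.copy()
    a_2nd[j] += 1.0 if a_float[j] > a_int[j] else -1.0
    ratio = dist(a_2nd) / max(dist(a_int), 1e-12)
    return (a_int, True) if ratio > ratio_threshold else (a_float, False)

a_float = np.array([5.1, -3.05, 12.9])              # hypothetical float solution
Qa = np.diag([0.01, 0.02, 0.015])                   # its covariance
print(fix_ambiguities(a_float, Qa))                 # fixed to [5, -3, 13]
```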
8. Impact of vertical deflection on direct georeferencing of airborne images This paper analyses the influence of the deflection of the vertical (DOV) on the direct georeferencing (DG) of image data from an aerial digital frame camera. If the value of the DOV is not taken into account, a systematic error is introduced into the determination of point positions. The investigation considers several values of the vertical deflection and, for each of them, analyses the vertical and horizontal errors obtained by varying the field of view (FOV) and the flying height. Further information:
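The order of magnitude of the systematic error is easy to work out: an uncorrected deflection ξ tilts the camera axis by ξ, displacing a nadir ground point by roughly h·tan ξ for flying height h. A back-of-the-envelope sketch with assumed ξ and h values:

```python
import math

# Horizontal displacement of a nadir point from an uncorrected deflection
# of the vertical: error ≈ h * tan(xi). Values of xi and h are assumptions.
for xi_arcsec in (5.0, 10.0, 20.0):
    xi = math.radians(xi_arcsec / 3600.0)
    for h in (1000.0, 3000.0):           # flying heights in metres
        print(f"xi = {xi_arcsec:4.0f}\", h = {h:.0f} m -> "
              f"horizontal error ≈ {h * math.tan(xi) * 100:.1f} cm")
```

Even a 10″ deflection at 3000 m flying height displaces a nadir point by roughly 15 cm, which is why the effect matters for high-accuracy direct georeferencing.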