Volunteered geographic information (VGI) is a large, up-to-date data source that is easily available to the public, and it enables public participation through easy data entry in support of scientific research. OpenStreetMap (OSM) is one of the most popular VGI projects and has become a major substitute source of geographic data over the past years. Because the quality of OSM data is highly variable, its various aspects have been investigated in previous studies, and assessing the reliability of volunteered geographic data has been a topic of interest to researchers in recent years. The objective of this study is to introduce an approach for computing reliability indicators, as tools for assessing OSM data quality, from the history of the data. To prepare the required data, the history file of the OSM dataset for the study region was extracted. Then, the historical data were cleaned by identifying and eliminating outliers. Afterward, the reliability indicator was calculated from criteria such as the number of versions, the number of participating users, temporal variations, and the number of tag edits. In the last step, to evaluate the proposed approach, the reliability level of the features was compared with their spatial accuracy, calculated by matching the OSM features against official data. The results show that, among 7478 OSM features, approximately 4338 features (58.01% of the dataset) have a reliability above 50%, and among 5659 matched OSM features, 4429 (78.26% of the dataset) have a similarity of above 70%. Increasing the number of versions, the number of users, and the temporal variation range of a route increased the reliability, whereas tag editing reduced it. Moreover, a correlation coefficient of 0.695 between reliability and spatial accuracy indicates a direct relationship between reliability and the quality of the OSM dataset.
Najmeh Teimoory; Rahim Ali Abbaspour; Alireza Chehreghan. Reliability extracted from the history file as an intrinsic indicator for assessing the quality of OpenStreetMap. Earth Science Informatics 2021, 14(3), 1413-1432.
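The reliability criteria listed in this abstract can be illustrated with a small sketch. This is not the paper's formula; the weights and saturation scales below are hypothetical, chosen only to reproduce the qualitative behavior described (more versions, users, and temporal range raise reliability; tag edits lower it):

```python
# Illustrative sketch only: a reliability score that grows with version
# count, contributor count, and temporal range, and shrinks with tag
# edits, as described qualitatively in the abstract. Weights and
# saturation scales are hypothetical.

def reliability(n_versions, n_users, temporal_range_days, n_tag_edits,
                weights=(0.3, 0.3, 0.2, 0.2)):
    """Return a reliability score in [0, 1]; weights are hypothetical."""
    wv, wu, wt, we = weights
    # Saturating normalization maps unbounded counts into [0, 1).
    norm = lambda x, scale: x / (x + scale)
    return (wv * norm(n_versions, 5)
            + wu * norm(n_users, 3)
            + wt * norm(temporal_range_days, 365)
            + we * (1 - norm(n_tag_edits, 5)))  # tag edits reduce reliability

# A feature edited by many users over a long period scores higher than a
# single-version, single-user feature.
stable = reliability(n_versions=8, n_users=5, temporal_range_days=700, n_tag_edits=1)
fresh = reliability(n_versions=1, n_users=1, temporal_range_days=10, n_tag_edits=4)
```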
The continuous development of positioning technologies and of computing solutions for integrating large trajectory data sets offers many novel research opportunities. Among various research domains, the extraction of users' movement patterns is an important issue yet to be addressed. While many previous studies have analyzed human and animal movements from a predominantly geometrical point of view, additional semantics are still required to provide a better understanding of the patterns that emerge. User activity data provide important information resources for analyzing and predicting movement patterns in urban environments. This study introduces a computational framework that combines the geometric and activity‐based dimensions of human trajectories. First, the geometric dimension considers a series of parameters (i.e., turning points, curvature, and self‐intersections) that are extracted by a convex‐hull algorithm and characterize a given trajectory. Second, user activity transitions are modeled to reveal recurrent patterns. Finally, geometric and activity patterns are integrated into a unified trajectory modeling framework, which favors the analysis of human movement patterns by taking both dimensions into account. The entire approach and framework were experimented with on the LifeMap Korean trajectory data set, commonly considered a reference benchmark. The experiments showed how the integration of the geometric and activity‐based dimensions can provide a better understanding of the patterns and trends that emerge from a large trajectory data set.
Amin Hosseinpoor Milaghardan; Rahim Ali Abbaspour; Christophe Claramunt; Alireza Chehreghan. An activity‐based framework for detecting human movement patterns in an urban environment. Transactions in GIS 2021, 1.
Earthquake hazards cause changes in landforms, economic losses, and human casualties. Seismic vulnerability mapping (SVM) provides key information for preventing and predicting earthquake damage. The purpose of this study is to train and compare a Classification Tree Analysis (CTA) learner model, with the three splitting algorithms Gini, Entropy, and Ratio, against a Fuzzy ARTMAP (FAM) model through the development of hybrid models for SVM. Thirteen seismic vulnerability conditioning factors (SVCFs), covering environmental, physical, and social aspects, were selected using experts' opinions and experience and prepared for use in this study. For seismic vulnerability mapping and model training, a database of training sites was created through a hybrid Multi-Criteria Decision Analysis–Multi-Criteria Evaluation (MCDA-MCE) process. Then, 70% of the points were used for training and 30% for validating the models' results based on the holdout method. Moreover, Relative Operating Characteristics (ROC), the Seismic Relative Index (SRI), and the Frequency Ratio (FR) were used to validate the results. The areas under the curve (AUC) for the Gini, Entropy, and Ratio algorithms and the FAM model are 0.895, 0.890, 0.876, and 0.783, respectively. The results of the three validation methods show the highest performance for the Gini splitting algorithm. Accordingly, the social and physical vulnerability of Sanandaj city was determined based on the optimal MCE-Gini model: 27% of the area and 62% of the population of Sanandaj are highly vulnerable to earthquakes. Factors such as worn urban texture, high population density, and environmental conditions were among the most important factors affecting seismic vulnerability.
Peyman Yariyan; Rahim Ali Abbaspour; Alireza Chehreghan; Mohammadreza Karami; Artemi Cerdà. GIS-based seismic vulnerability mapping: a comparison of artificial neural networks hybrid models. Geocarto International 2021, 1-24.
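The splitting criteria compared in this study are standard decision-tree measures and can be sketched briefly. The example below shows the Gini and Shannon-entropy impurity measures on class labels; it is a generic illustration, not code from the study:

```python
# Generic illustration of two of the compared split criteria: the Gini
# and entropy impurity of a node's class labels.
import math
from collections import Counter

def gini(labels):
    """Gini impurity: 1 - sum(p_i^2) over class proportions."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def entropy(labels):
    """Shannon entropy: -sum(p_i * log2 p_i) over class proportions."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

# A pure node has zero impurity under both measures; a 50/50 node is maximal.
pure = ["high"] * 4
mixed = ["high", "high", "low", "low"]
```

A tree learner chooses the split that most reduces the chosen impurity measure in the child nodes.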
Snow avalanches can destroy lives and infrastructure and are very important phenomena in some regions of the world. This study maps snow avalanche susceptibility in the Sirvan Watershed, Iran, using a new approach. Two statistical models – belief function (Bel) and probability density (PD) – are combined with two learning models – multi-layer perceptron (MLP) and logistic regression (LR) – to predict avalanche susceptibility using remote sensing data in a geographic information system (GIS). A snow avalanche inventory map was generated from Google Earth imagery, regional documentation, and field surveys. Of 101 avalanche locations, 71 (70%) were used to train the models and 30 (30%) to validate them. Fourteen snow avalanche conditioning factors were used as independent variables in the predictive modeling process. First, the weights of the Bel and PD techniques were applied to each class of factors. Then, they were combined with the MLP and LR learning models for snow avalanche susceptibility mapping (SASM). The results were validated using positive predictive values, negative predictive values, sensitivity, specificity, accuracy, root-mean-square error, and area-under-the-curve (AUC) values. The AUCs for the PD-LR, Bel-LR, Bel-MLP, and PD-MLP hybrid models are 0.941, 0.936, 0.931, and 0.924, respectively. Based on the validation results, the PD-LR hybrid model achieved the best accuracy among the models. This hybrid modeling approach can provide accurate and reliable evaluations of snow avalanche-prone areas for management and decision making.
Peyman Yariyan; Mohammadtaghi Avand; Rahim Ali Abbaspour; Mohammadreza Karami; John P. Tiefenbacher. GIS-based spatial modeling of snow avalanches using four novel ensemble models. Science of The Total Environment 2020, 745, 141008.
Peyman Yariyan; Mohammadtaghi Avand; Rahim Ali Abbaspour; Ali Torabi Haghighi; Romulus Costache; Omid Ghorbanzadeh; Saeid Janizadeh; Thomas Blaschke. Flood susceptibility mapping using an improved analytic network process with statistical models. Geomatics, Natural Hazards and Risk 2020, 11(1), 2282-2314.
One of the requirements for planning and decision-making in infrastructure development is preparing a landslide hazard map. For this purpose, in this article, Shannon entropy and Dempster–Shafer evidence theory were used to prepare the hazard map and to account for data uncertainty in the Tutkabon region, Guilan Province. Slope, elevation, geomorphological conditions, surface curvature, proximity to rivers, and proximity to faults were used as the factors affecting landslide occurrence. Using these parameters, the landslide hazard map was prepared with the entropy index, and belief values were calculated by the Dempster–Shafer method. To investigate the uncertainty, disbelief and uncertainty values were also calculated by the Dempster–Shafer method, and in the Shannon entropy method the maps were compared before and after applying the entropy. Finally, by comparing the locations of landslide occurrences in the study area with the modeled hazard map, a value of 0.69 was obtained for the area under the prediction rate curve (as a measure of total prediction precision) with entropy, versus 0.54 without it. Similarly, evaluation of the belief-based hazard map produced by the Dempster–Shafer method indicates that 65% of landslide occurrences fall in the high and very high hazard classes, with a value of 0.74 for the area under the prediction rate curve of the belief map.
Amin Hosseinpoor Milaghardan; Rahim Ali Abbaspour; Mina Khalesian. Evaluation of the effects of uncertainty on the predictions of landslide occurrences using the Shannon entropy theory and Dempster–Shafer theory. Natural Hazards 2019, 100(1), 49-67.
Spatiotemporal movement pattern discovery has stimulated considerable interest due to its numerous applications, including data analysis, machine learning, data segmentation, data reduction, abnormal behaviour detection, noise filtering, and pattern recognition. Trajectory clustering is among the most widely used approaches for extracting interesting patterns from large trajectory datasets. In this paper, given the strong performance of density-based clustering, we present a comparison of eight similarity measures in density-based clustering of moving objects' trajectories. In particular, the distance functions Euclidean, L1, Hausdorff, Fréchet, Dynamic Time Warping (DTW), Longest Common SubSequence (LCSS), Edit Distance on Real sequence (EDR), and Edit distance with Real Penalty (ERP) are applied in DBSCAN on three datasets with varying characteristics. Experimental results are evaluated using both internal and external indices. Furthermore, we propose two modified validation measures for density-based trajectory clustering that can deal with arbitrarily shaped clusters of different densities and sizes. These measures aim to evaluate trajectory clusters effectively in both spatial and spatio-temporal respects. The evaluation results show that the choice of an appropriate distance function depends on the data and its movement parameters. Overall, however, Euclidean distance shows superiority over the other distance functions with regard to the Purity index, and EDR provides better performance in terms of the spatial and spatio-temporal quality of clusters. Finally, in terms of computation time and scalability, Euclidean, L1, and LCSS are the most efficient distance functions.
A. Moayedi; R. Ali Abbaspour; A. Chehreghan. An evaluation of the efficiency of similarity functions in density-based clustering of spatial trajectories. Annals of GIS 2019, 25(4), 313-327.
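One of the compared measures can be sketched concretely. The following is a generic dynamic time warping (DTW) implementation for 2-D point trajectories, illustrative only and not the study's code:

```python
# Generic DTW sketch: the distance between two trajectories given as
# lists of (x, y) points, with plain Euclidean distance as point cost.
import math

def dtw(a, b):
    """DTW distance between point sequences a and b."""
    INF = float("inf")
    n, m = len(a), len(b)
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(a[i - 1], b[j - 1])
            # Allow insertion, deletion, or match, keeping the cheapest path.
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# Unlike point-wise Euclidean distance, DTW aligns sequences of
# different lengths: the repeated sample in t2 costs nothing.
t1 = [(0, 0), (1, 0), (2, 0)]
t2 = [(0, 0), (1, 0), (1, 0), (2, 0)]
```

Any of the other distance functions (LCSS, EDR, ERP, ...) can be plugged into DBSCAN the same way, as the pairwise metric.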
Afsaneh Nasiri; Rahim Ali Abbaspour; Alireza Chehreghan. A Geometric Approach to Improve the Quality of Voluntary Spatial Information Using Data History (Case Study: OSM Linear Data). Journal of Geospatial Information Technology 2019, 7(2), 177-194.
Air pollution is a major concern in some megacities of Iran. Certain cities have reached extremely harmful levels of air pollution that pose a serious risk to the daily lives of Iranians. According to news reports, the air quality index of the city of Tehran hovers around 159, more than three times the World Health Organization's advised maximum. For air pollution abatement, it is necessary to know the pollution distribution in the area precisely, which in turn requires properly located air quality monitoring stations that measure the spatial pollutant distribution. According to various reports, the city needs at least 56 air quality monitoring stations to properly measure Tehran's air quality; however, there are currently only 20 stations within the city. Thus, the main purpose of this study was to identify the most suitable areas for deploying new air quality monitoring stations. This study integrated hybrid multi-criteria decision-making (MCDM) theories and geographical information system (GIS) processes to determine suitable areas for establishing air quality monitoring stations. Unlike traditional models, the proposed MCDM method, ANP-OWA, is an efficient decision analysis that considers dependencies between criteria and defines scenarios ranging from pessimistic to optimistic conditions for decision makers. The method was applied to several parameters: point, area, and line sources; population density; sensitive receptors; distance from current air quality stations; prediction error; and the spatial distribution of CO, NO2, SO2, and PM10 pollutants. The output results specified several suitable locations for establishing air pollution monitoring stations within Tehran Province, and their stability and reliability were evaluated with a robust sensitivity analysis method. Moreover, the results demonstrated that the proposed method produces stable results. Knowledge of population density, distance from current air quality stations, and the spatial distribution of CO is essential when selecting locations for air quality monitoring stations.
Mohammad Kazemi-Beydokhti; R. Ali Abbaspour; M. Kheradmandi; Ali Bozorgi-Amiri. Determination of the physical domain for air quality monitoring stations using the ANP-OWA method in GIS. Environmental Monitoring and Assessment 2019, 191(2), 299.
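The OWA component of ANP-OWA can be sketched in a few lines. The example below is a generic ordered weighted averaging operator; the candidate scores and weight vectors are hypothetical, illustrating only how the positional weights move the decision between pessimistic and optimistic scenarios:

```python
# Generic OWA sketch: positional weights apply to the *sorted* scores
# (descending), not to particular criteria, which is what lets a single
# weight vector encode the decision-maker's optimism or pessimism.
def owa(scores, weights):
    """Ordered weighted average of criterion scores."""
    assert len(scores) == len(weights)
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * s for w, s in zip(weights, sorted(scores, reverse=True)))

site = [0.9, 0.4, 0.6]                    # hypothetical suitability scores of one cell
optimistic = owa(site, [1.0, 0.0, 0.0])   # best score dominates
pessimistic = owa(site, [0.0, 0.0, 1.0])  # worst score dominates
neutral = owa(site, [1 / 3, 1 / 3, 1 / 3])  # plain average
```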
The grey wolf optimizer (GWO) is a nature-inspired algorithm that simulates the predatory behavior of grey wolves. The GWO divides the hunting process into three stages: encircling, hunting, and attacking the prey. Since its introduction, the GWO has found applications in a wide range of engineering and science fields. However, when tackling more complex optimization problems, especially high-dimensional and multimodal tasks, GWO may easily fall into local optima or fail to find the global best, and its convergence behavior may be unsatisfying. In this study, the performance of the basic GWO is enhanced using effective exploratory and exploitative mechanisms such as random leaders, opposition-based learning, Lévy flight patterns, random spiral-form motions, and greedy selection. These concepts improve the global exploration and local exploitation capacities of the conventional technique and deepen the searching advantages of GWO on more complex problems; the proposed mechanisms also ameliorate the convergence tendencies and the quality of the solutions. To verify the efficacy of the proposed method, called OBLGWO, it is compared to a comprehensive set of new and state-of-the-art optimizers on 23 benchmark test sets and 30 well-known CEC problems. Additionally, the proposed OBLGWO is applied to tuning the key parameters of a kernel extreme learning machine (KELM) on two real-world problems. The experimental results and analysis demonstrate that the proposed OBLGWO can significantly outperform GWO, previously enhanced GWO variants, and other well-established algorithms in terms of convergence speed and solution quality.
Ali Asghar Heidari; Rahim Ali Abbaspour; Huiling Chen. Efficient boosted grey wolf optimizers for global search and kernel extreme learning machine training. Applied Soft Computing 2019, 81, 105521.
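Opposition-based learning, one of the mechanisms named in this abstract, is simple to sketch. The code below is a generic illustration (not the paper's implementation), using a sphere test function and hypothetical bounds to show how evaluating the opposite point can yield a better candidate:

```python
# Generic opposition-based learning (OBL) sketch: for each candidate
# solution, also evaluate its "opposite" point within the search bounds
# and keep whichever scores better (minimization).
def opposite(x, lower, upper):
    """Opposite point: lb + ub - x in each dimension."""
    return [lo + hi - xi for xi, lo, hi in zip(x, lower, upper)]

def obl_select(x, lower, upper, fitness):
    """Keep the better of a solution and its opposite."""
    xo = opposite(x, lower, upper)
    return x if fitness(x) <= fitness(xo) else xo

# Sphere function as a toy objective; bounds are hypothetical.
sphere = lambda v: sum(t * t for t in v)
lower, upper = [-10.0, -10.0], [-2.0, -2.0]
x = [-9.0, -9.0]
best = obl_select(x, lower, upper, sphere)  # opposite [-3, -3] is closer to 0
```

In OBLGWO-style variants this selection is typically applied alongside the usual position-update rules, widening exploration at low cost.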
The present paper combines the strengths of path-dependent and non-path-dependent modeling approaches to detect sustainable urban development profiles in a rapidly expanding region in Iran. The Cellular Automata–Markov Chain (CA-MC) model is calibrated and validated through an integrative application of the Multi-Criteria Evaluation (MCE) method and a Multi-Layer Perceptron Neural Network (MLPNN) algorithm, in addition to different Kappa-based metrics. The CA-MC model is then employed to project urban growth trajectories in the study area. By combining the results of the different urban allocation approaches, highly suitable zones for future development are detected; these lands would not be identifiable if each approach were implemented alone. The potential lands for future urbanization are evaluated and ranked in terms of their connectivity and compactness, as well as their effect on other land-use categories such as forest and agriculture. The suggested methodology not only improves the calibration of a spatially explicit model but also provides a wider range of options for policy makers. The findings of this study can help municipal decision makers protect against further ecological consequences of ill-planned and uncontrolled urbanization.
Reza Arasteh; Rahim Ali Abbaspour; Abdolrassoul Salmanmahiny. A modeling approach to path dependent and non-path dependent urban allocation in a rapidly growing region. Sustainable Cities and Society 2018, 44, 378-394.
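The Markov-chain half of CA-MC can be sketched as a matrix-vector step. The transition probabilities and land-use shares below are hypothetical, illustrating only how transition rates estimated from two past land-use maps project the future land-use composition:

```python
# Generic Markov-chain land-use sketch: shares of each class evolve by
# one multiplication with a transition-probability matrix. All numbers
# below are hypothetical, for illustration only.
def step(shares, transition):
    """One Markov step: new_share[j] = sum_i shares[i] * P[i][j]."""
    n = len(shares)
    return [sum(shares[i] * transition[i][j] for i in range(n)) for j in range(n)]

# Classes: urban, agriculture, forest. Each row sums to 1.
P = [
    [1.00, 0.00, 0.00],  # urban stays urban
    [0.10, 0.85, 0.05],  # agriculture mostly persists, some urbanizes
    [0.02, 0.08, 0.90],  # forest slowly converts
]
shares = [0.2, 0.5, 0.3]
projected = step(shares, P)  # urban share grows at the expense of the others
```

The cellular-automata half then decides *where* those projected quantities are allocated on the map, using neighborhood rules and suitability surfaces.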
The casualties and financial losses caused by large earthquakes have underscored the importance of predicting them. Earthquake prediction is divided into four categories: long term, intermediate term, short term, and immediate. The M8 algorithm is an intermediate-term, middle-range prediction algorithm primarily designed to predict earthquakes of magnitude 8 or more, and it was later applied to smaller magnitudes. The Iranian Plateau is less exposed to earthquakes of magnitude 8 or more, and the seismicity rate in this region is generally low; thus, the original M8 is not suitable for this region. The objective of this study is to modify the M8 algorithm for the prediction of major M7.0+ earthquakes in the Iranian Plateau. Major earthquakes of magnitude 7 or more in the Iranian Plateau from 1975 to 2018 are considered the target earthquakes. The hit rate times one minus the alarm rate is defined as the objective function, and the particle swarm optimization meta-heuristic algorithm is used to maximize it. The optimized M8 could predict 14 out of 17 large earthquakes in the Iranian Plateau while occupying 31.7% of the spatio-temporal space with alarms. The results show that, by employing an optimization algorithm, the M8 algorithm can be modified for efficient prediction of target magnitudes below 8 in regions with low seismicity rates.
Ali Ramezani; Rahim Ali Abbaspour; Masoud Mojarab. An Optimization of Using the M8 Algorithm for Prediction of Major M7.0+ Earthquakes in the Iranian Plateau. Pure and Applied Geophysics 2018, 176(1), 119-131.
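The objective function named in this abstract follows directly from its description: hit rate times one minus alarm rate. A minimal sketch, plugging in the reported figures (14 of 17 targets hit, 31.7% alarm coverage):

```python
# Sketch of the objective the PSO maximizes, as stated in the abstract:
# hit rate x (1 - alarm rate).
def objective(hits, total_targets, alarm_fraction):
    """alarm_fraction is the share of the spatio-temporal space under alarm."""
    hit_rate = hits / total_targets
    return hit_rate * (1.0 - alarm_fraction)

# Reported optimum: 14 of 17 target earthquakes predicted while alarms
# occupied 31.7% of the spatio-temporal space.
score = objective(hits=14, total_targets=17, alarm_fraction=0.317)
```

Maximizing this product rewards parameter sets that catch targets without flooding the region with alarms, which is why it suits a region where M8's default parameters over- or under-alarm.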
Evaluation of the quality of volunteered geographic information (VGI) has been the subject of a plethora of research in recent years as an imperative issue. In this paper, the corresponding objects between datasets of OpenStreetMap, one of the best-known VGI projects, and reference datasets are identified based on an automatic linear object matching method. Moreover, to identify more corresponding objects in the two datasets and measure completeness more accurately, geometric properties are used. The results showed that 92% of the objects of the OSM dataset matched the reference dataset, and the total length of the matched objects was 87% of the total length of the objects. The corresponding objects in the two datasets have an average spatial similarity degree of 0.86. By examining the objects of the OSM dataset from 2013 to 2017, a rise of 87.2% was detected in individuals' participation in creating objects, and the average spatial similarity degree also improved by 0.15.
Alireza Chehreghan; Rahim Ali Abbaspour. An evaluation of data completeness of VGI through geometric similarity assessment. International Journal of Image and Data Fusion 2018, 9(4), 319-337.
OpenStreetMap (OSM) has proven to serve as a promising free global encyclopedia of maps with an increasing popularity across different user communities and research bodies. One of the unique characteristics of OSM has been the availability of the full history of users’ contributions, which can leverage our quality control mechanisms through exploiting the history of contributions. Since this aspect of contributions (i.e., historical contributions) has been neglected in the literature, this study aims at presenting a novel approach for improving the positional accuracy and completeness of the OSM road network. To do so, we present a five-stage approach based on a Voronoi diagram. In the first stage, the OSM data history file is retrieved, and in the second stage, the corresponding data elements for each object in the historical versions are identified. In the third stage, data cleaning is carried out on the historical datasets to identify and remove outliers. In the fourth stage, by applying the Voronoi diagram method, one representative version is extracted for each set of historical versions. In the final stage, by examining the spatial relations of each object in the history file, the topology of the target object is enhanced. For validation, the latest version of the OSM data and the result of our approach are compared against a reference dataset. Given a case study in Tehran, our findings reveal that the completeness and positional precision of OSM features can be improved by up to 14%. Our conclusions draw attention to the exploitation of the historical archive of contributions in OSM as an intrinsic quality indicator.
Afsaneh Nasiri; Rahim Ali Abbaspour; Alireza Chehreghan; Jamal Jokar Arsanjani. Improving the Quality of Citizen Contributed Geodata through Their Historical Contributions: The Case of the Road Network in OpenStreetMap. ISPRS International Journal of Geo-Information 2018, 7(7), 253.
The rapid proliferation of sensors and big data repositories offers many new opportunities for data science. Among many application domains, the analysis of large trajectory datasets generated from people’s movements at the city scale is one of the most promising research avenues still to explore. Extracting trajectory patterns and outliers in urban environments is a direction still requiring exploration for many management and planning tasks. The research developed in this paper introduces a spatio-temporal framework, named STE-SD (Spatio-Temporal Entropy for Similarity Detection), based on the concept of entropy introduced by Shannon in his seminal theory of information and recently extended to the spatial and temporal dimensions. Our approach considers several complementary trajectory descriptors whose distributions in space and time are quantitatively evaluated. The trajectory primitives considered include curvatures, stop points, self-intersections, and velocities. These primitives are identified and then qualified using the notion of entropy as applied to the spatial and temporal dimensions. The whole approach is experimented with and applied to urban trajectories derived from the Geolife dataset, a reference data benchmark collected in the city of Beijing.
Amin Hosseinpoor Milaghardan; Rahim Ali Abbaspour; Christophe Claramunt. A Spatio-Temporal Entropy-based Framework for the Detection of Trajectories Similarity. Entropy 2018, 20(7), 490.
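As a rough illustration of the entropy-based qualification described in the abstract above, the sketch below estimates the Shannon entropy of a set of trajectory descriptor values (e.g. curvatures or velocities) from a histogram. The binning scheme and bin count are illustrative assumptions, not the paper's actual discretization.

```python
import math
from collections import Counter

def shannon_entropy(values, bins=8):
    """Shannon entropy (in bits) of a set of trajectory descriptor
    values, estimated from an equal-width histogram."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0  # guard against identical values
    counts = Counter(min(int((v - lo) / width), bins - 1) for v in values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Descriptor values spread across their range carry higher entropy
# than values concentrated around a single peak:
spread = [i / 10 for i in range(40)]
peaked = [0.5] * 36 + [0.1, 0.2, 0.3, 0.4]
assert shannon_entropy(spread) > shannon_entropy(peaked)
```

In this spirit, a trajectory whose curvatures or velocities are highly dispersed would be assigned a higher spatial or temporal entropy than a regular one.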
Nowadays, location-based data collected by GPS-equipped devices such as smartphones and cars are often stored as spatio-temporal sequences of points denoted as trajectories. The analysis of the resulting large trajectory databases, such as the detection of patterns, outliers, and stops, is of great importance for many application domains. Over the past few years, several successful trajectory data infrastructures have been progressively developed for a large range of applications in both the terrestrial and maritime environments. However, it still appears that, amongst the many research issues to consider, the uncertainties that arise when analyzing local trajectory properties have not been completely taken into account. In particular, determining certainty rates while detecting stop points might have a valuable impact in most cases. The framework developed in this paper introduces an approach based on the Dempster-Shafer theory of evidence, whose objective is to detect trajectory stop points and their associated degrees of uncertainty. The approach is evaluated using a large urban trajectory database and compared to several computational algorithms introduced in previous studies. The results show that our approach reduces uncertainty values when detecting trajectory stop points and significantly improves recall and precision.
Amin Hosseinpoor Milaghardan; Rahim Ali Abbaspour; Christophe Claramunt. A Dempster-Shafer based approach to the detection of trajectory stop points. Computers, Environment and Urban Systems 2018, 70, 189-196.
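The core of the Dempster-Shafer machinery used above is the rule of combination, which fuses mass functions from independent evidence sources. The sketch below combines two hypothetical sources of stop evidence (e.g. low speed and high point density) over the frame {stop, move}, with 'theta' standing for ignorance; the specific masses and sources are assumptions for illustration.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination over the frame {stop, move}.
    Masses are dicts over 'stop', 'move', and 'theta' (ignorance)."""
    combined = {}
    conflict = 0.0
    for a, pa in m1.items():
        for b, pb in m2.items():
            if a == b:
                key = a                      # agreement (incl. theta & theta)
            elif 'theta' in (a, b):
                key = a if b == 'theta' else b  # theta refines to the other set
            else:
                conflict += pa * pb          # stop vs. move: conflicting mass
                continue
            combined[key] = combined.get(key, 0.0) + pa * pb
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}

# Two moderately confident "stop" sources reinforce each other:
speed_evidence = {'stop': 0.6, 'move': 0.1, 'theta': 0.3}
density_evidence = {'stop': 0.5, 'move': 0.2, 'theta': 0.3}
fused = dempster_combine(speed_evidence, density_evidence)
assert fused['stop'] > speed_evidence['stop']
```

The residual mass on 'theta' after fusion is one natural way to express the degree of uncertainty attached to a detected stop point.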
Large volumes of trajectory-based data require the development of appropriate data manipulation mechanisms that offer efficient computational solutions. In particular, the identification of meaningful geometric points of such trajectories is still an open research issue. Detecting these critical points requires identifying self-intersecting, turning, and curvature points so that specific geometric characteristics worth identifying can be denoted. This research introduces an approach called Trajectory Critical Point detection using Convex Hull (TCP-CH) to identify a minimum number of critical points. The results can be applied to large trajectory datasets in order to reduce storage costs and complexity for further data mining and analysis. The main principles of the TCP-CH algorithm include computing convex areas, convex hull curvatures, turning points, and intersecting points. The experimental validation applied to the Geolife trajectory dataset reveals that the proposed framework can identify most of the intersecting points in a reasonable computing time. Finally, a comparison of the proposed algorithm with other methods, such as the turning function, shows that our approach performs relatively well when considering the overall detection quality and computing time.
Amin Hosseinpoor Milaghardan; Rahim Ali Abbaspour; Christophe Claramunt. A Geometric Framework for Detection of Critical Points in a Trajectory Using Convex Hulls. ISPRS International Journal of Geo-Information 2018, 7, 14.
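To give a concrete feel for one class of critical points mentioned above, the sketch below flags turning points via the heading change between consecutive segments. This is a deliberate simplification: TCP-CH itself works through convex hulls and also handles curvature and self-intersection points, and the angle threshold here is an assumed parameter.

```python
import math

def turning_points(traj, angle_threshold_deg=30.0):
    """Flag trajectory vertices where the heading changes by more than a
    threshold -- a simplified stand-in for one class of critical points."""
    critical = []
    for i in range(1, len(traj) - 1):
        (x0, y0), (x1, y1), (x2, y2) = traj[i - 1], traj[i], traj[i + 1]
        h1 = math.atan2(y1 - y0, x1 - x0)
        h2 = math.atan2(y2 - y1, x2 - x1)
        # Wrap the heading difference into (-pi, pi] before thresholding.
        turn = abs(math.degrees((h2 - h1 + math.pi) % (2 * math.pi) - math.pi))
        if turn > angle_threshold_deg:
            critical.append(i)
    return critical

# An L-shaped path turns sharply only at its corner (index 2):
path = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]
assert turning_points(path) == [2]
```

Keeping only such critical points (plus endpoints) is what enables the storage reduction the abstract refers to.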
Tourism activities are highly dependent on spatial information. Finding the most interesting travel destinations and attractions and planning a trip accordingly are still open research issues for GIScience research applied to the tourism domain. Nowadays, huge amounts of information are available over the World Wide Web that may be useful in planning a visit to destinations and attractions. However, it is often time-consuming for a user to select the most interesting destinations and attractions and plan a trip according to his or her own preferences. Tourism recommender systems (TRSs) can be used to overcome this information overload problem and to propose items that take the user’s preferences into account. This chapter reviews related topics in tourism recommender systems, including different tourism recommendation approaches and user profile representation methods applied in the tourism domain. The authors illustrate the potential of tourism recommender systems as applied to the tourism domain through the implementation of an illustrative geospatial collaborative recommender system using the Foursquare dataset.
Zahra Bahramian; Rahim Ali Abbaspour; Christophe Claramunt. Toward Geospatial Collaborative Tourism Recommender Systems. Handbook of Research on the Impacts and Implications of COVID-19 on the Tourism Industry 2018, 212-248.
Automated fare collection (AFC) systems are regarded as valuable resources for public transport planners. In this paper, AFC data are utilized to analyze and extract mobility patterns in a public transportation system. For this purpose, the smart card data are fed into a proposed metaheuristic-based aggregation model and then converted into an O-D matrix between stops, since the size of O-D matrices makes it difficult to reproduce the measured passenger flows precisely. The proposed strategy is applied to a case study from Haaglanden, the Netherlands. In this research, the moth-flame optimizer (MFO) is utilized and evaluated for the first time as a new metaheuristic algorithm (MA) in estimating transit origin-destination matrices. The MFO is a novel, efficient swarm-based MA inspired by the celestial navigation of moths in nature. To investigate the capabilities of the proposed MFO-based approach, it is compared to methods that utilize the K-means algorithm, the gray wolf optimization algorithm (GWO), and the genetic algorithm (GA). The sum of the intra-cluster distances and the computational time of operations are considered as the evaluation criteria to assess the efficacy of the optimizers. The optimality of the solutions of the different algorithms is measured in detail, and travelers' behavior is analyzed to achieve a smooth and optimized transport system. The results reveal that the proposed MFO-based aggregation strategy can outperform the other evaluated approaches in terms of convergence tendency and optimality of the results, and that it can be utilized as an efficient approach to estimating transit O-D matrices.
A. A. Heidari; A. Moayedi; Rahim Ali Abbaspour. Estimating Origin-Destination Matrices Using an Efficient Moth Flame-Based Spatial Clustering Approach. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences 2017, XLII-4/W4, 381-387.
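At the heart of the MFO algorithm referenced above is Mirjalili's logarithmic-spiral position update, in which each moth flies toward its assigned flame. The sketch below implements that single update step; mapping moths to candidate cluster centroids of smart-card stops, and the choice of the spiral constant b, are illustrative assumptions.

```python
import math
import random

def mfo_spiral_update(moth, flame, b=1.0, rng=random):
    """One moth-flame optimizer move: the moth spirals toward its flame,
    following M' = D * e^(b*t) * cos(2*pi*t) + F with t drawn from [-1, 1],
    where D is the per-dimension distance to the flame F."""
    t = rng.uniform(-1.0, 1.0)
    return [
        abs(f - m) * math.exp(b * t) * math.cos(2 * math.pi * t) + f
        for m, f in zip(moth, flame)
    ]

# A moth already sitting on its flame stays there (D = 0):
assert mfo_spiral_update([1.0, 2.0], [1.0, 2.0]) == [1.0, 2.0]
```

In an O-D aggregation setting, each candidate solution would encode a set of stop-cluster centroids, with flames being the best solutions found so far; iterating this update drives the swarm toward compact clusters.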
Users planning a trip to a given destination often search for the most appropriate points of interest, which is a non-straightforward task as the range of available information is very large and not well structured. The research presented in this paper introduces a context-aware tourism recommender system that overcomes the information overload problem by providing personalized recommendations based on the user’s preferences. It also incorporates contextual information to improve the recommendation process. As previous context-aware tourism recommender systems suffer from a lack of formal definitions to represent contextual information and user preferences, the proposed system is enhanced using an ontology approach. We also apply a spreading activation technique to contextualize user preferences and learn the user profile dynamically according to the user’s feedback. The proposed method assigns greater effect in the spreading process to nodes whose preference values are assigned directly by the user. The results show the overall performance of the proposed context-aware tourism recommender system through an experimental application to the city of Tehran.
Z. Bahramian; Rahim Ali Abbaspour; C. Claramunt. A Context-Aware Tourism Recommender System Based on a Spreading Activation Method. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences 2017, XLII-4/W4, 333-339.
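The spreading activation idea used above can be sketched minimally: directly rated ontology concepts propagate a decayed share of their activation to neighboring concepts. The toy concept graph, decay factor, and iteration count below are illustrative assumptions; the paper's actual method operates on a tourism ontology and weights user-assigned nodes more strongly.

```python
def spread_activation(graph, seeds, decay=0.5, iterations=2):
    """Simple spreading activation: seed nodes (user-rated concepts)
    propagate a decayed share of their activation to their neighbors."""
    activation = dict(seeds)
    for _ in range(iterations):
        updates = {}
        for node, value in activation.items():
            for neighbor in graph.get(node, []):
                updates[neighbor] = updates.get(neighbor, 0.0) + decay * value
        for node, value in updates.items():
            activation[node] = activation.get(node, 0.0) + value
    return activation

# A toy concept graph: rating "museum" also warms up related concepts.
ontology = {'museum': ['art_gallery', 'history'], 'art_gallery': ['museum']}
result = spread_activation(ontology, {'museum': 1.0}, iterations=1)
assert result['art_gallery'] == 0.5
```

Items linked to concepts with high final activation would then be ranked higher in the recommendation list.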