Comparing multilevel modelling and artificial neural networks in house price prediction

Yingyu Feng, Kelvyn Jones

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

22 Citations (Scopus)
336 Downloads (Pure)

Abstract

Two advanced modelling approaches, multilevel models (MLM) and artificial neural networks (ANN), are employed to model house prices. These approaches and the standard hedonic price model are compared in terms of predictive accuracy, capability to capture location information, and explanatory power. The models are applied to 2001-2013 house prices in the Greater Bristol area, using secondary data from the Land Registry, the Population Census and Neighbourhood Statistics, so that the models could be applied nationally. The results indicate that MLM offers good predictive accuracy with high explanatory power, especially when neighbourhood effects are explored at multiple spatial scales.
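The core of the MLM approach can be illustrated with a short sketch. The snippet below fits a two-level random-intercept model of log house price, with individual sales nested in neighbourhoods, using Python's statsmodels. This is a minimal illustration, not the paper's actual code: the file name and column names (price, floor_area, bedrooms, lsoa) are hypothetical placeholders, and the paper's full specification explores neighbourhood effects at multiple spatial scales rather than the single level shown here.

# Minimal sketch of a two-level multilevel (random-intercept) house price
# model, in the spirit of the paper's MLM approach. All names below
# (bristol_sales.csv, price, floor_area, bedrooms, lsoa) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("bristol_sales.csv")  # hypothetical Land Registry extract

# The fixed effects play the role of hedonic attributes; the grouping
# factor gives each neighbourhood (here an LSOA) its own random intercept,
# so location effects are modelled rather than left in the residual.
model = smf.mixedlm(
    "np.log(price) ~ floor_area + bedrooms",
    data=df,
    groups=df["lsoa"],
)
result = model.fit()
print(result.summary())

Extending this toward the paper's multi-scale setting would mean adding further grouping levels (for example, districts above LSOAs), which statsmodels can approximate through its variance-component options.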
Original language: English
Title of host publication: 2015 2nd IEEE International Conference on Spatial Data Mining and Geographical Knowledge Services (ICSDM 2015)
Subtitle of host publication: Proceedings of a meeting held 8-10 July 2015, Fuzhou, China.
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 108-114
Number of pages: 7
Volume: 1
ISBN (Electronic): 9781479977482
ISBN (Print): 9781479977505
DOIs
Publication status: Published - 8 Jul 2015
Event: Spatial Data Mining and Geographical Knowledge Services (ICSDM), 2015 2nd IEEE International Conference - Fuzhou, China
Duration: 8 Jul 2015 - 10 Jul 2015

Conference

Conference: Spatial Data Mining and Geographical Knowledge Services (ICSDM), 2015 2nd IEEE International Conference
Country/Territory: China
City: Fuzhou
Period: 8/07/15 - 10/07/15

Keywords

  • Artificial neural networks
  • House prices
  • Multilevel modelling
  • Predictive accuracy
