The Robot Crawler Model on Complete k-Partite and Erdős-Rényi Random Graphs

A. Davidson*, A. Ganesh

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

Abstract

Web crawlers are used by internet search engines to gather information about the web graph. In this paper we investigate a simple process which models such software by walking around the vertices of a graph. Once initial random vertex weights have been assigned, the robot crawler traverses the graph deterministically, following a greedy algorithm: it always visits the neighbour of least weight and then updates that weight to be the highest overall. We consider the maximum, minimum and average number of steps taken by the crawler to visit every vertex of, firstly, sparse Erdős–Rényi random graphs and, secondly, complete k-partite graphs. Our work is closely related to a paper of Bonato et al., who introduced the model.
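The greedy dynamics in the abstract can be sketched in a few lines of Python. This is a minimal illustrative simulation, not code from the paper: the adjacency-dict representation is an assumption, and the choice of initial weights as a random permutation of -n, …, -1 follows the convention of Bonato et al. (any distinct values would do).

```python
import random

def robot_crawler(adj, seed=None):
    """Simulate the robot crawler on a connected graph.

    adj: dict mapping each vertex to a list of its neighbours.
    Returns the number of steps the crawler takes to visit
    every vertex at least once.
    """
    rng = random.Random(seed)
    vertices = list(adj)
    n = len(vertices)
    # Initial weights: a uniformly random permutation of -n, ..., -1,
    # so unvisited vertices always have smaller weight than visited ones.
    weight = dict(zip(vertices, rng.sample(range(-n, 0), n)))
    # The crawler starts at the vertex of least initial weight.
    current = min(vertices, key=weight.get)
    visited = {current}
    next_weight = 1  # visited vertices receive weights 1, 2, 3, ...
    weight[current] = next_weight
    steps = 0
    while len(visited) < n:
        # Greedy move: step to the neighbour of least weight,
        # then raise that vertex's weight above all others.
        current = min(adj[current], key=weight.get)
        next_weight += 1
        weight[current] = next_weight
        visited.add(current)
        steps += 1
    return steps
```

On the complete graph K_n every unvisited vertex is a neighbour with negative weight, so the crawler visits all n vertices in exactly n - 1 steps regardless of the initial weights.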

Original language: English
Title of host publication: Algorithms and Models for the Web Graph - 16th International Workshop, WAW 2019, Proceedings
Editors: Konstantin Avrachenkov, Paweł Prałat, Nan Ye
Publisher: Springer Verlag
Pages: 57-70
Number of pages: 14
ISBN (Print): 9783030250690
Publication status: Published - 4 Jul 2019
Event: 16th International Workshop on Algorithms and Models for the Web Graph, WAW 2019 - Brisbane, Australia
Duration: 6 Jul 2019 - 7 Jul 2019

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11631 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 16th International Workshop on Algorithms and Models for the Web Graph, WAW 2019
Country: Australia
City: Brisbane
Period: 6/07/19 - 7/07/19

Keywords

  • random graphs
  • web crawlers
