Visual and Tactile 3D Point Cloud Data from Real Robots for Shape Modeling and Completion

  • Yasemin Bekiroglu (Contributor)
  • Mårten Björkman (Contributor)
  • Gabriela Zarzar Gandler (Creator)
  • Johannes Exner (Contributor)
  • Carl Henrik Ek (Contributor)
  • Danica Kragic (Contributor)

    Dataset

    Description

    If you use this data, please cite: Y. Bekiroglu, M. Björkman, G. Zarzar Gandler, J. Exner, C. H. Ek, D. Kragic. Visual and Tactile 3D Point Cloud Data from Real Robots for Shape Modeling and Completion, Data in Brief (2020), https://doi.org/10.1016/j.dib.2020.105335.

    The data was used for shape completion and modeling via implicit surface representations and Gaussian process regression in: G. Zarzar Gandler, C. H. Ek, M. Björkman, R. Stolkin, Y. Bekiroglu. Object shape estimation and modeling, based on sparse Gaussian process implicit surfaces, combining visual data and tactile exploration, Robotics and Autonomous Systems (2020), https://doi.org/10.1016/j.robot.2020.103433. It was also used in part in: M. Björkman, Y. Bekiroglu, V. Högman, D. Kragic. Enhancing visual perception of shape through tactile glances, IEEE/RSJ International Conference on Intelligent Robots and Systems (2013).
    Date made available: 2020
    Publisher: Mendeley Data
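    To illustrate the kind of modeling the description refers to, here is a minimal sketch of a Gaussian process implicit surface (GPIS) fit, not the authors' implementation (the cited paper uses sparse GP implicit surfaces on real robot data). It uses scikit-learn's `GaussianProcessRegressor` on a synthetic unit sphere: surface points are labeled 0, an interior anchor -1, and exterior anchors +1, so the GP posterior mean acts as a signed implicit function whose zero level set approximates the shape. All names and parameter values here are illustrative assumptions.

    ```python
    # Hedged toy example, not the dataset authors' code: a GP implicit
    # surface fit to a synthetic sphere instead of real sensor data.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(0)
    dirs = rng.normal(size=(100, 3))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)

    surface = dirs               # "point cloud" on the unit sphere -> value 0
    interior = np.zeros((1, 3))  # one interior anchor point     -> value -1
    exterior = 2.0 * dirs        # exterior anchor points        -> value +1

    X = np.vstack([surface, interior, exterior])
    y = np.concatenate([np.zeros(100), [-1.0], np.ones(100)])

    # A fixed RBF kernel (optimizer=None skips hyperparameter tuning)
    # keeps this toy fast and deterministic; real data would need tuning.
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                                  alpha=1e-4, optimizer=None)
    gp.fit(X, y)

    # The learned implicit function should be negative inside the sphere
    # and positive outside; its zero crossing approximates the surface.
    inside = gp.predict(np.array([[0.5, 0.0, 0.0]]))
    outside = gp.predict(np.array([[1.5, 0.0, 0.0]]))
    ```

    In a shape-completion setting, visual and tactile point clouds would supply the zero-valued surface samples, and the GP posterior provides both a completed surface and an uncertainty estimate over unobserved regions.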
