Probabilistic logic programming formalisms permit the definition of potentially very complex probability distributions. This complexity can often make learning hard, even when structure is fixed and learning reduces to parameter estimation. In this paper an approximate Bayesian computation (ABC) method is presented which computes approximations to the posterior distribution over PRISM parameters. The key to ABC approaches is that the likelihood function need not be computed; instead, a ‘distance’ between the observed data and synthetic data generated by candidate parameter values is used to drive the learning. This makes ABC highly appropriate for PRISM programs, which can have an intractable likelihood function but from which synthetic data can be readily generated. The algorithm is experimentally shown to work well on an easy problem, but further work is required to produce acceptable results on harder ones.
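The likelihood-free idea described above can be illustrated with a minimal rejection-ABC sketch. This is not the paper's algorithm, and the coin-flip model, tolerance value, and function names below are illustrative assumptions; it only shows the generic loop: draw parameters from the prior, simulate synthetic data, and keep draws whose distance to the observed data falls within a tolerance.

```python
import random

def abc_rejection(observed, simulate, distance, prior_sample, eps, n_samples):
    """Generic rejection ABC: accept parameter draws whose synthetic
    data lie within eps of the observed data; no likelihood is evaluated."""
    accepted = []
    while len(accepted) < n_samples:
        theta = prior_sample()          # draw candidate parameters from the prior
        synthetic = simulate(theta)     # generate synthetic data from the model
        if distance(observed, synthetic) <= eps:
            accepted.append(theta)      # retain draws close to the observation
    return accepted

# Toy example (hypothetical): infer a coin's bias from one observed
# heads-count out of N flips, with a uniform(0, 1) prior on the bias.
random.seed(0)
true_p = 0.7
N = 100
observed = sum(random.random() < true_p for _ in range(N))

posterior = abc_rejection(
    observed=observed,
    simulate=lambda p: sum(random.random() < p for _ in range(N)),
    distance=lambda a, b: abs(a - b),   # distance on the summary statistic
    prior_sample=random.random,         # uniform(0, 1) prior
    eps=2,                              # tolerance: accept counts within 2
    n_samples=200,
)
estimate = sum(posterior) / len(posterior)
```

The accepted draws approximate the posterior; their mean should land near the true bias of 0.7. In a PRISM setting the `simulate` step would correspond to sampling observations from the program under candidate parameter values.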