Inductive programming is a new machine learning paradigm combining functional programming (FP), for writing statistical models, with the information-theoretic criterion Minimum Message Length (MML), to prevent overfitting. Type-classes specify general properties that statistical models must have. Many statistical models, estimators, and operators have polymorphic types. Useful operators transform and combine models and estimators to form new ones; FP's compositional style of programming is a great advantage in this domain. MML fits well with FP, providing a compositional measure of the complexity of a model from the complexities of its parts. Inductive programming is illustrated by a case study of Bayesian networks. Networks are built from classification (decision) trees. Trees, and hence networks, are general [4] as a natural consequence of the method. Discrete and continuous variables, and missing values, are handled. Trees are built from partitioning functions and models on data spaces. Finally, the Bayesian networks are applied to a challenging data set on lost persons.
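The central idea, that a statistical model carries both a message length for itself and a negative log-probability for each datum, and that both quantities are additive under composition, can be sketched as below. This is a minimal, hypothetical illustration, not the paper's actual definitions: the names `Model`, `coin`, `pairM`, and `msg` are assumptions, the model cost of `coin` is a placeholder, and a record of functions stands in for the type-classes the abstract describes.

```haskell
-- Hypothetical sketch: a model over data of type a supplies the first
-- part of a two-part MML message (its own description length) and the
-- second part (negative log-probability of each datum).
data Model a = Model
  { msg1 :: Double        -- length of the model's description, in nits
  , nlPr :: a -> Double   -- negative log-probability of one datum
  }

-- A Bernoulli (coin) model over Bool; the cost 1.0 is a placeholder
-- for the real MML cost of stating the parameter p.
coin :: Double -> Model Bool
coin p = Model
  { msg1 = 1.0
  , nlPr = \x -> negate (log (if x then p else 1 - p))
  }

-- Composition: an independent-pair model.  Message lengths and
-- negative log-likelihoods are both additive, which is what makes
-- MML's complexity measure compositional in the FP sense.
pairM :: Model a -> Model b -> Model (a, b)
pairM m n = Model
  { msg1 = msg1 m + msg1 n
  , nlPr = \(x, y) -> nlPr m x + nlPr n y
  }

-- Total two-part message length of a model together with a data set;
-- MML prefers the model minimising this total, penalising complexity.
msg :: Model a -> [a] -> Double
msg m xs = msg1 m + sum (map (nlPr m) xs)
```

For example, `msg (coin 0.5) [True, False]` evaluates to the placeholder model cost plus `2 * log 2` nits for the data, and `pairM`-style combinators are the kind of polymorphic operators the abstract refers to.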