The natural, or generational, learning process consists of building models from the available experiences. Each generation learns from the models obtained by its predecessors and produces a new model from its own batch of experiences. In this note, we discuss this step-by-step learning procedure for supervised classification and regression problems on large datasets. We show that the stepwise learning procedure performs competitively with the approach that fits a single model to the entire dataset.
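As a minimal illustration of the stepwise idea (not the exact procedure studied in this note), the sketch below lets each generation continue from its predecessor's model and update it on a new batch, then compares against a single model fit on the full dataset. The synthetic data, the batch sizes, and the choice of scikit-learn's SGDRegressor with partial_fit are all assumptions introduced here for illustration.

```python
# A sketch of generational/stepwise learning under the assumptions stated above:
# each generation inherits the previous model and updates it on its own batch.
import numpy as np
from sklearn.linear_model import SGDRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n_batches, batch_size, n_features = 10, 1_000, 20
true_w = rng.normal(size=n_features)  # hypothetical ground-truth weights

def make_batch():
    """Generate one synthetic batch of (X, y) experiences."""
    X = rng.normal(size=(batch_size, n_features))
    y = X @ true_w + 0.1 * rng.normal(size=batch_size)
    return X, y

# Stepwise learner: each generation updates the model it inherited.
stepwise = SGDRegressor(random_state=0)
X_all, y_all = [], []
for _ in range(n_batches):
    X, y = make_batch()
    stepwise.partial_fit(X, y)  # update the inherited model on the new batch only
    X_all.append(X)
    y_all.append(y)

# Baseline: a single model fit on the entire dataset at once.
X_full, y_full = np.vstack(X_all), np.concatenate(y_all)
single = SGDRegressor(random_state=0, max_iter=1000, tol=1e-3).fit(X_full, y_full)

# Compare the two approaches on a held-out batch.
X_test, y_test = make_batch()
print("stepwise MSE:     ", mean_squared_error(y_test, stepwise.predict(X_test)))
print("single-model MSE: ", mean_squared_error(y_test, single.predict(X_test)))
```

On this toy regression problem the batch-by-batch learner typically reaches an error close to that of the single full-data fit, which is the kind of comparison the note examines.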