Single-cell methods are beginning to reveal the intrinsic heterogeneity of cell populations. However, quantifying single-cell behaviour from time-lapse microscopy data remains challenging, owing to the difficulty of extracting reliable cell trajectories and lineage information over long time-scales and across several generations. To address this challenge, we developed a hybrid deep learning and Bayesian cell tracking approach to reconstruct lineage trees from live-cell microscopy data (Figure 1). We implemented a residual U-Net model coupled with a CNN cell-state classifier to enable accurate instance segmentation of cell nuclei. To track cells over time and through cell divisions, we developed a Bayesian cell tracking methodology that uses image-derived input features to retrieve multi-generational lineage information from a corpus of thousands of hours of live-cell imaging data. Using our approach, we extracted 20,000+ fully annotated single-cell trajectories from over 3,500 hours of video footage, organised into multi-generational lineage trees spanning up to 8 generations and fourth-cousin distances. Benchmarking against other tracking algorithms, including assessments of lineage tree reconstruction, demonstrates that our approach yields high-fidelity results on our data while requiring minimal manual curation. To demonstrate the robustness of our minimally supervised cell tracking methodology, we retrieved cell cycle durations and their extended inter- and intra-generational family relationships in 5,000+ fully annotated cell lineages without any manual curation. Our analysis expands the depth and breadth of investigated cell lineage relationships, using approximately two orders of magnitude more data than previous studies of cell cycle heritability.
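The multi-generational relationships summarised above (generation depth, cousin distance) can be expressed over a simple parent map from each daughter cell to its mother. The sketch below is purely illustrative and is not the implementation described in this work; all names (`parent`, `ancestors`, `cousin_degree`) are hypothetical, and it assumes the common genealogical convention in which siblings share a parent and nth cousins share an ancestor n+1 divisions back.

```python
def ancestors(parent, cell):
    """Return the chain of ancestors of `cell`, nearest first.

    `parent` maps each daughter cell ID to its mother cell ID;
    founder cells are absent from the map.
    """
    chain = []
    while cell in parent:
        cell = parent[cell]
        chain.append(cell)
    return chain


def generation(parent, cell):
    """Generation depth: 0 for a founder cell, +1 per division."""
    return len(ancestors(parent, cell))


def cousin_degree(parent, a, b):
    """Cousin degree between two same-generation cells:
    0 = siblings, 1 = first cousins, ..., 4 = fourth cousins
    (which share an ancestor five divisions back).
    Returns None if no common ancestor is recorded."""
    anc_b = set(ancestors(parent, b))
    for dist, ancestor in enumerate(ancestors(parent, a), start=1):
        if ancestor in anc_b:
            return dist - 1
    return None
```

For example, two cells whose lineages diverge at a shared ancestor five divisions back would be reported as fourth cousins, matching the deepest family relationship quantified above.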