Bayesian optimization (BO) has proven to be an effective approach for guiding sample-efficient exploration of materials domains and is increasingly used in automated materials-optimization setups. However, when exploring novel materials, sample quality may vary unexpectedly, which can even invalidate the optimization procedure if it remains undetected. This issue limits the use of highly automated optimization loops, especially in high-dimensional materials spaces with many samples. Sample quality may be hard to define unequivocally for a machine, but human scientists are usually good at judging it, at least at a cursory yet often sufficient level. In this work, we demonstrate that humans can be added to the BO loop as experts who comment on sample quality, which yields more trustworthy BO results. We implemented human-in-the-loop BO via a data fusion approach and ran virtual BO cycles on experimental perovskite film stability data from the literature. The human-in-the-loop approach facilitates automated materials design and characterization by reducing the occurrence of invalid optimization results.
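As an illustrative sketch only (not the study's actual implementation), one simple way to fuse human quality judgments into a BO loop is to inflate the observation-noise variance of samples the expert flags as defective, so the surrogate model effectively discounts their measured values. Everything below is hypothetical: the toy objective `true_stability`, the defective region at x < 0.2 where the instrument reports spuriously high stability, and the `expert_quality` oracle; a minimal NumPy Gaussian process with an upper-confidence-bound acquisition stands in for a full BO framework.

```python
import numpy as np

def rbf(a, b, ls=0.15):
    """Squared-exponential kernel between two 1-D point sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(X, y, noise_var, Xs):
    """GP posterior mean/variance with per-sample observation noise."""
    K = rbf(X, X) + np.diag(noise_var) + 1e-9 * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = rbf(X, Xs)
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - np.sum(v * v, axis=0), 1e-12, None)
    return mu, var

def true_stability(x):
    # Hypothetical stability landscape, optimum at x = 0.6.
    return np.exp(-(x - 0.6) ** 2 / 0.02)

def measure(x):
    # Films synthesized below x = 0.2 are defective: the instrument
    # reports a spuriously high, misleading stability value.
    return 1.5 if x < 0.2 else true_stability(x)

def expert_quality(x):
    # Human glance at the film: 0 = clearly bad sample, 1 = fine.
    return 0.0 if x < 0.2 else 1.0

def bo_loop(n_iter=15, use_human=True):
    grid = np.linspace(0.0, 1.0, 201)
    X, y, nv = [0.05, 0.5, 0.95], [], []

    def record(x):
        y.append(measure(x))
        q = expert_quality(x) if use_human else 1.0
        # Data fusion step: low human quality -> large noise variance,
        # so the GP largely ignores the corrupted measurement.
        nv.append(1e-4 + 100.0 * (1.0 - q))

    for x in X:
        record(x)
    for _ in range(n_iter):
        mu, var = gp_posterior(np.array(X), np.array(y), np.array(nv), grid)
        x_next = grid[np.argmax(mu + 2.0 * np.sqrt(var))]  # UCB acquisition
        X.append(x_next)
        record(x_next)
    mu, _ = gp_posterior(np.array(X), np.array(y), np.array(nv), grid)
    return grid[np.argmax(mu)]  # recommended composition
```

With the expert flags active, the recommendation lands near the true optimum; without them, the defective region's inflated readings pull the recommendation into invalid territory, mirroring the failure mode the abstract describes.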