Whole slide imaging (WSI) systems recently approved by the FDA for primary diagnosis have opened new possibilities for digital pathology to increase its impact on clinical care and research. However, the process of scanning slides, including autofocus failures, combined with data storage and management needs, represents a far more complex, expensive, and resource-intensive workflow than the simple glass-slide-and-microscope workflow, which hinders adoption. This project aimed to address WSI acquisition issues by 1) creating 'on-demand' resolution-recovery algorithms that generate high-resolution image outputs from low-resolution WSI inputs in areas of user interest, and 2) automatically detecting and correcting regions of poor focus, eliminating the need for QC personnel and reducing high re-scan rates. To address both goals, we used unsupervised Cycle-Consistent Adversarial Networks (CycleGAN), which are designed to work without paired training data. More specifically, we propose two networks, SR-CycleGAN and refocus-CycleGAN, to achieve up to 4-times super-resolution and dynamic focus recovery in whole slide images without any a priori knowledge, and we demonstrate the generalizability of the models across multiple tissue types. We also present a deployment pipeline for practical scenarios, in which users can interactively select regions of interest for on-demand super-resolution, and focus quality is automatically detected and corrected in WSIs.
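The key property that lets CycleGAN train without paired low-/high-resolution (or blurred/sharp) patches is the cycle-consistency constraint: mapping an image to the other domain and back should reproduce the original. The toy sketch below illustrates that constraint only; the generators `G` and `F` here are hypothetical stand-ins (simple scalings), not the convolutional generators used in SR-CycleGAN or refocus-CycleGAN, and the adversarial losses are omitted.

```python
def G(x):
    """Toy 'low-res -> high-res' generator: a stand-in scaling, not a real CNN."""
    return [2.0 * v for v in x]

def F(y):
    """Toy 'high-res -> low-res' generator: the (approximate) inverse mapping."""
    return [0.5 * v for v in y]

def cycle_consistency_loss(x):
    """L1 cycle loss mean(|F(G(x)) - x|).

    In CycleGAN this term is added to the adversarial losses so that
    unpaired training still learns mappings that invert each other.
    """
    recon = F(G(x))
    return sum(abs(r - v) for r, v in zip(recon, x)) / len(x)

patch = [0.1, 0.4, 0.9]  # stand-in for a low-resolution image patch
print(cycle_consistency_loss(patch))  # 0.0, since F exactly inverts G here
```

In the full method, this loss is computed in both directions (low-to-high-to-low and high-to-low-to-high) and combined with discriminator losses on each domain.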