Most real-world networks are too large to be measured or studied directly, and there is substantial interest in estimating global network properties from smaller sub-samples. One of the most important global properties is the number of vertices/nodes in the network. Estimating the number of vertices in a large network is a major challenge in computer science, epidemiology, demography, and intelligence analysis. In this paper we consider a population random graph G = (V, E) drawn from the stochastic block model (SBM) with K communities/blocks. A sample is obtained by randomly choosing a subset of vertices W and letting G(W) be the subgraph of G induced by W. In addition to G(W), we observe the total degree of each sampled vertex and its block membership. Given this partial information, we propose an efficient PopULation Size Estimation algorithm, called PULSE, that accurately estimates the size of the whole population as well as the size of each community. To support our theoretical analysis, we perform an extensive set of experiments to study the effects of sample size, K, and the SBM parameters on the accuracy of the estimates. The experimental results also demonstrate that PULSE significantly outperforms a widely used method, the network scale-up estimator, in a wide variety of scenarios.
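The sampling and observation model described above can be illustrated with a short simulation. The sketch below is a minimal illustration, not the PULSE estimator itself: it assumes networkx and numpy, uses hypothetical block sizes and edge probabilities, generates an SBM population graph, samples a uniformly random vertex subset W, and records exactly the data the paper assumes is observed, namely the induced subgraph G(W), each sampled vertex's total degree in the full graph, and its block membership.

```python
# Minimal sketch of the observation model (not the PULSE estimator).
# Block sizes and edge probabilities below are hypothetical choices.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)

# Population graph from a stochastic block model with K = 2 blocks.
block_sizes = [600, 400]                      # hypothetical community sizes
P = [[0.02, 0.005],
     [0.005, 0.03]]                           # within-/between-block edge probabilities
G = nx.stochastic_block_model(block_sizes, P, seed=0)

# Nodes are labeled 0..n-1 in block order, so block membership follows from sizes.
block_of = {}
node = 0
for b, size in enumerate(block_sizes):
    for _ in range(size):
        block_of[node] = b
        node += 1

# Sample: choose a uniformly random subset W of vertices.
n_sample = 100
W = [int(v) for v in rng.choice(G.number_of_nodes(), size=n_sample, replace=False)]

# Observed data: the induced subgraph G(W), plus each sampled vertex's
# total degree in the full graph G and its block membership.
G_W = G.subgraph(W).copy()
observed = {v: {"total_degree": G.degree(v), "block": block_of[v]} for v in W}

# The estimation task is to recover the total number of vertices
# (and the per-block counts) from G_W and `observed` alone.
```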
Author Information
Lin Chen (Yale University)
Amin Karbasi (Yale)
Forrest W. Crawford (Yale University)
More from the Same Authors
- 2020 Poster: Submodular Maximization Through Barrier Functions »
  Ashwinkumar Badanidiyuru · Amin Karbasi · Ehsan Kazemi · Jan Vondrak
- 2020 Poster: Continuous Submodular Maximization: Beyond DR-Submodularity »
  Moran Feldman · Amin Karbasi
- 2020 Spotlight: Submodular Maximization Through Barrier Functions »
  Ashwinkumar Badanidiyuru · Amin Karbasi · Ehsan Kazemi · Jan Vondrak
- 2020 Poster: Minimax Regret of Switching-Constrained Online Convex Optimization: No Phase Transition »
  Lin Chen · Qian Yu · Hannah Lawrence · Amin Karbasi
- 2020 Poster: Online MAP Inference of Determinantal Point Processes »
  Aditya Bhaskara · Amin Karbasi · Silvio Lattanzi · Morteza Zadimoghaddam
- 2019 Poster: Adaptive Sequence Submodularity »
  Marko Mitrovic · Ehsan Kazemi · Moran Feldman · Andreas Krause · Amin Karbasi
- 2019 Poster: Online Continuous Submodular Maximization: From Full-Information to Bandit Feedback »
  Mingrui Zhang · Lin Chen · Hamed Hassani · Amin Karbasi
- 2019 Poster: Stochastic Continuous Greedy ++: When Upper and Lower Bounds Match »
  Amin Karbasi · Hamed Hassani · Aryan Mokhtari · Zebang Shen
- 2018 Poster: Do Less, Get More: Streaming Submodular Maximization with Subsampling »
  Moran Feldman · Amin Karbasi · Ehsan Kazemi
- 2018 Spotlight: Do Less, Get More: Streaming Submodular Maximization with Subsampling »
  Moran Feldman · Amin Karbasi · Ehsan Kazemi
- 2017 Workshop: Discrete Structures in Machine Learning »
  Yaron Singer · Jeff A Bilmes · Andreas Krause · Stefanie Jegelka · Amin Karbasi
- 2017 Poster: Interactive Submodular Bandit »
  Lin Chen · Andreas Krause · Amin Karbasi
- 2017 Poster: Streaming Weak Submodularity: Interpreting Neural Networks on the Fly »
  Ethan Elenberg · Alex Dimakis · Moran Feldman · Amin Karbasi
- 2017 Oral: Streaming Weak Submodularity: Interpreting Neural Networks on the Fly »
  Ethan Elenberg · Alex Dimakis · Moran Feldman · Amin Karbasi
- 2017 Poster: Gradient Methods for Submodular Maximization »
  Hamed Hassani · Mahdi Soltanolkotabi · Amin Karbasi
- 2016 Poster: Fast Distributed Submodular Cover: Public-Private Data Summarization »
  Baharan Mirzasoleiman · Morteza Zadimoghaddam · Amin Karbasi
- 2015 Poster: Distributed Submodular Cover: Succinctly Summarizing Massive Data »
  Baharan Mirzasoleiman · Amin Karbasi · Ashwinkumar Badanidiyuru · Andreas Krause
- 2015 Spotlight: Distributed Submodular Cover: Succinctly Summarizing Massive Data »
  Baharan Mirzasoleiman · Amin Karbasi · Ashwinkumar Badanidiyuru · Andreas Krause