Journal Articles

  • S. Contreras, J. Dehning, S.B. Mohr, F.P. Spitzner and V. Priesemann
    Towards a Long-Term Control of COVID-19 at Low Case Numbers
    [arXiv]
    The traditional long-term solutions for epidemic control involve eradication or herd immunity [1, 2]. Neither of them will be attained within a few months for the COVID-19 pandemic. Here, we analytically derive the existence of a third, viable solution: a stable equilibrium at low case numbers, where test-trace-and-isolate policies partially compensate for local spreading events, and only moderate contact restrictions remain necessary. Across wide parameter ranges of our complementary compartmental model [3], the equilibrium is reached at or below 10 daily new cases per million people. Such low levels had been maintained over months in most European countries. However, this equilibrium is endangered (i) if contact restrictions are relaxed, or (ii) if case numbers grow too high. The latter destabilisation marks a novel tipping point beyond which the spread self-accelerates because test-trace-and-isolate capacities are overwhelmed. To reestablish control quickly, a lockdown is required. We show that a lockdown is either effective within a few weeks, or tends to fail its aim. If effective, recurring lockdowns are not necessary (contrary to the oscillating dynamics previously presented in the context of circuit breakers [4-6], and contrary to a regime with high case numbers) if moderate contact reductions are maintained. Hence, at low case numbers, control is easier, and more freedom can be granted. We demonstrate that this strategy reduces case numbers and fatalities by a factor of five compared to a strategy that only aims at avoiding major congestion of hospitals. Furthermore, our solution minimises lockdown duration, and hence economic impact. In the long term, control will successively become easier due to immunity through vaccination or large-scale testing programmes. International coordination would further facilitate the implementation of this solution.
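    The tipping point described in this abstract can be illustrated with a deliberately simple caricature (not the paper's full compartmental model): while daily case numbers stay within tracing capacity, test-trace-and-isolate keeps the effective reproduction number below one and cases settle at a stable low equilibrium; beyond capacity, spread self-accelerates. All parameter values below are illustrative assumptions.

```python
# Toy piecewise-linear case dynamics: TTI holds R_eff below 1 only while
# daily new cases stay within tracing capacity. Parameters are assumptions
# for illustration, not values from the paper.

def step(cases, capacity=100.0, r_tti=0.9, r_overwhelmed=1.3, influx=5.0):
    """One day of the toy dynamics; influx models externally introduced cases."""
    r_eff = r_tti if cases <= capacity else r_overwhelmed
    return r_eff * cases + influx

def run(cases, days=100):
    for _ in range(days):
        cases = step(cases)
    return cases

# Below capacity, cases converge to the stable equilibrium
# cases* = influx / (1 - r_tti) = 50; above capacity they diverge.
low = run(80.0)    # converges towards 50
high = run(200.0)  # self-accelerating growth once tracing is overwhelmed
```

    Starting below the tracing capacity, the system relaxes to the low-case equilibrium; starting above it, the same dynamics diverge, which is the qualitative content of the tipping point.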
  • S. Contreras, J. Dehning, M. Loidolt, J. Zierenberg, F.P. Spitzner, J.H. Urrea-Quintero, S.B. Mohr, M. Wilczek, M. Wibral and V. Priesemann
    The Challenges of Containing SARS-CoV-2 via Test-Trace-and-Isolate
    Nat. Commun. 12, 378 (2021)
    [arXiv]
    Without a cure, vaccine, or proven long-term immunity against SARS-CoV-2, test-trace-and-isolate (TTI) strategies present a promising tool to contain its spread. For any TTI strategy, however, mitigation is challenged by pre- and asymptomatic transmission, TTI-avoiders, and undetected spreaders, which strongly contribute to "hidden" infection chains. Here, we study a semi-analytical model and identify two tipping points between controlled and uncontrolled spread: (1) the behavior-driven reproduction number $R_t^H$ of the hidden chains becomes too large to be compensated by the TTI capabilities, and (2) the number of new infections exceeds the tracing capacity. Both trigger a self-accelerating spread. We investigate how these tipping points depend on challenges like limited cooperation, missing contacts, and imperfect isolation. Our results suggest that TTI alone is insufficient to contain an otherwise unhindered spread of SARS-CoV-2, implying that complementary measures like social distancing and improved hygiene remain necessary.
  • F.P. Spitzner, J. Dehning, J. Wilting, A. Hagemann, J.P. Neto, J. Zierenberg and V. Priesemann
    MR. Estimator, a Toolbox to Determine Intrinsic Timescales from Subsampled Spiking Activity
    (under review)
    [GitHub] [arXiv]
    Here we present our Python toolbox 'MR. Estimator' to reliably estimate the intrinsic timescale from electrophysiological recordings of heavily subsampled systems. Originally intended for the analysis of time series from neuronal spiking activity, our toolbox is applicable to a wide range of systems where subsampling (the inability to observe the whole system in full detail) limits our capability to record. Applications range from epidemic spreading to any system that can be represented by an autoregressive process. In the context of neuroscience, the intrinsic timescale can be thought of as the duration over which any perturbation reverberates within the network; it has been used as a key observable to investigate a functional hierarchy across the primate cortex and serves as a measure of working memory. It is also a proxy for the distance to criticality and quantifies a system's dynamic working point.
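    The multistep-regression (MR) idea behind the toolbox can be sketched in a few lines of numpy (this is an illustrative reimplementation with assumed parameters, not the toolbox's own API): for an autoregressive process, the lag-$k$ regression slopes decay as $r_k = b\,m^k$, so fitting this exponential decay recovers the branching parameter $m$ and the intrinsic timescale $\tau = -\Delta t/\ln m$ even when only a fraction of the activity is observed, whereas the conventional lag-1 estimator is biased by subsampling.

```python
# Illustrative MR sketch: recover the intrinsic timescale of a subsampled
# AR(1) process from the exponential decay of lag-k regression slopes.
# All parameters (m_true, drive, sampling fraction) are assumptions.
import numpy as np

rng = np.random.default_rng(0)

m_true, steps, drive = 0.95, 200_000, 100.0
drives = rng.poisson(drive, steps)          # Poisson input drive
a = np.empty(steps)
a[0] = drive / (1 - m_true)                 # start at the stationary mean
for t in range(steps - 1):
    a[t + 1] = m_true * a[t] + drives[t]
observed = rng.binomial(a.astype(int), 0.2)  # observe only 20% of the units

def mr_estimate(x, k_max=15):
    """Estimate m from the decay of the lag-k regression slopes r_k = b m^k."""
    x = x - x.mean()
    slopes = [np.dot(x[:-k], x[k:]) / np.dot(x[:-k], x[:-k])
              for k in range(1, k_max + 1)]
    # a linear fit of log r_k versus k yields log m as the slope
    log_m = np.polyfit(np.arange(1, k_max + 1), np.log(slopes), 1)[0]
    return np.exp(log_m)

m_hat = mr_estimate(observed)
tau_hat = -1.0 / np.log(m_hat)  # intrinsic timescale, in time steps
```

    Despite the heavy subsampling, the fitted $m$ comes out close to the true value of 0.95, because subsampling rescales the prefactor $b$ but leaves the decay rate $m$ untouched.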
  • J. Dehning, J. Zierenberg, F.P. Spitzner, M. Wibral, J.P. Neto, M. Wilczek and V. Priesemann
    Inferring Change Points in the Spread of COVID-19 Reveals the Effectiveness of Interventions
    Science 369 (2020)
    [GitHub] [arXiv]
    Keeping the lid on infection spread: From February to April 2020, many countries introduced variations on social distancing measures to slow the ravages of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). Publicly available data show that Germany has been particularly successful in minimizing death rates. Dehning et al. quantified three governmental interventions introduced to control the outbreak. The authors predicted that the third governmental intervention, a strict contact ban in place since 22 March, switched incidence from growth to decay. They emphasize that relaxation of controls must be done carefully, not only because there is a 2-week lag between a measure being enacted and the effect on case reports but also because the three measures used in Germany only just kept virus spread below the growth threshold.
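    The core idea of locating a change point in spreading dynamics can be illustrated with a much simpler least-squares caricature (the paper itself uses Bayesian inference on an SIR model, not this approach; all numbers below are synthetic): on a log scale, exponential growth is a straight line, so a switch in the growth rate appears as a kink that a piecewise-linear fit can locate.

```python
# Toy change-point detection: grid search over candidate change days,
# fitting log(cases) piecewise-linearly. Synthetic data, assumed rates.
import numpy as np

rng = np.random.default_rng(1)

# synthetic daily cases: growth rate switches from 0.12 to -0.05 at day 30
days = np.arange(60)
t_true, g_before, g_after = 30, 0.12, -0.05
log_n = np.where(days < t_true,
                 2.0 + g_before * days,
                 2.0 + g_before * t_true + g_after * (days - t_true))
cases = np.exp(log_n + rng.normal(0.0, 0.05, days.size))  # reporting noise

def fit_change_point(y, t_grid):
    """Return the candidate change day minimising the two-segment SSE."""
    log_y = np.log(y)
    best = None
    for t_c in t_grid:
        sse = 0.0
        for seg in (slice(0, t_c), slice(t_c, len(y))):
            t = np.arange(len(y))[seg]
            coef = np.polyfit(t, log_y[seg], 1)        # linear fit per segment
            sse += np.sum((log_y[seg] - np.polyval(coef, t)) ** 2)
        if best is None or sse < best[1]:
            best = (t_c, sse)
    return best[0]

t_hat = fit_change_point(cases, range(5, 55))  # recovers a day near 30
```

    The Bayesian treatment in the paper additionally propagates reporting delays and yields credible intervals for the change points and growth rates, which this sketch does not attempt.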
  • J.P. Neto, F.P. Spitzner and V. Priesemann
    A Unified Picture of Neuronal Avalanches Arises from the Understanding of Sampling Effects
    (under review)
    [GitHub] [arXiv]
    To date, it is still impossible to sample the entire mammalian brain with single-neuron precision. This forces one to either use spikes (focusing on few neurons) or to use coarse-sampled activity (averaging over many neurons, e.g. LFP). Naturally, the sampling technique impacts inference about collective properties. Here, we emulate both sampling techniques on a spiking model to quantify how they alter observed correlations and signatures of criticality. We discover a general effect: when the inter-electrode distance is small, electrodes sample overlapping regions in space, which increases the correlation between the signals. For coarse-sampled activity, this can produce power-law distributions even for non-critical systems. In contrast, spike recordings enable one to distinguish the underlying dynamics. This explains why coarse measures and spikes have produced contradicting results in the past; all of these results are now consistent with a slightly subcritical regime.
  • F.P. Spitzner, J. Zierenberg and W. Janke
    The Droplet Formation-Dissolution Transition in Different Ensembles: Finite-Size Scaling from Two Perspectives
    SciPost Phys. 5, 062 (2018)
    The formation and dissolution of a droplet is an important mechanism related to various nucleation phenomena. Here, we address the droplet formation-dissolution transition in a two-dimensional Lennard-Jones gas to demonstrate a consistent finite-size scaling approach from two perspectives using orthogonal control parameters. For the canonical ensemble, this means that we fix the temperature while varying the density and vice versa. Using specialised parallel multicanonical methods for both cases, we confirm analytical predictions at fixed temperature (rigorously only proven for lattice systems) and corresponding scaling predictions from expansions at fixed density. Importantly, our methodological approach provides us with reference quantities from the grand canonical ensemble that enter the analytical predictions. Our orthogonal finite-size scaling setup can be exploited for theoretical and experimental investigations of general nucleation phenomena, provided one identifies the corresponding reference ensemble and adapts the theory accordingly. In this case, our numerical approach can be readily translated to the corresponding ensembles and thereby proves very useful for numerical studies of equilibrium droplet formation, in general.
  • J. Zierenberg, N. Fricke, M. Marenz, F.P. Spitzner, V. Blavatska and W. Janke
    Percolation Thresholds and Fractal Dimensions for Square and Cubic Lattices with Long-Range Correlated Defects
    Phys. Rev. E 96, 062125 (2017)
    [] [arXiv]
    We study long-range power-law correlated disorder on square and cubic lattices. In particular, we present high-precision results for the percolation thresholds and the fractal dimension of the largest clusters as a function of the correlation strength. The correlations are generated using a discrete version of the Fourier filtering method. We consider two different metrics to set the length scales over which the correlations decay, showing that the percolation thresholds are highly sensitive to such system details. By contrast, we verify that the fractal dimension $d_f$ is a universal quantity and unaffected by the choice of metric. We also show that for weak correlations, its value coincides with that for the uncorrelated system. In two dimensions we observe a clear increase of the fractal dimension with increasing correlation strength, approaching $d_f \to 2$. The onset of this change does not seem to be determined by the extended Harris criterion.
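    The Fourier filtering method mentioned in this abstract can be sketched compactly in numpy: white noise is filtered in Fourier space with a power-law spectral density $S(q) \sim q^{a-d}$, which produces real-space correlations decaying as $r^{-a}$, and the resulting field is thresholded to occupy a chosen fraction of sites. This is a simplified continuous variant for illustration, not the discrete version used in the paper.

```python
# Simplified Fourier filtering method for long-range correlated site
# disorder on a square lattice (d = 2). Parameter choices are assumptions.
import numpy as np

def correlated_disorder(L, a, p, seed=0):
    """Boolean L x L map with correlations ~ r^{-a}, occupied fraction p."""
    rng = np.random.default_rng(seed)
    noise = rng.normal(size=(L, L))
    # radial frequency magnitude on the FFT grid
    q = np.sqrt(np.add.outer(np.fft.fftfreq(L) ** 2, np.fft.fftfreq(L) ** 2))
    q[0, 0] = np.inf                       # suppress the zero mode
    amplitude = q ** ((a - 2) / 2.0)       # sqrt of S(q) ~ q^{a-d}, d = 2
    field = np.fft.ifft2(np.fft.fft2(noise) * amplitude).real
    threshold = np.quantile(field, 1 - p)  # occupy exactly a fraction p
    return field > threshold

lattice = correlated_disorder(256, a=1.5, p=0.6)
```

    Thresholding a correlated Gaussian field in this way yields clustered, patchy defects rather than the salt-and-pepper pattern of uncorrelated dilution, which is what shifts the percolation thresholds studied in the paper.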
  • N. Fricke, J. Zierenberg, M. Marenz, F.P. Spitzner, V. Blavatska and W. Janke
    Scaling Laws for Random Walks in Long-Range Correlated Disordered Media
    Condens. Matter Phys. 20, 13004 (2017)
    [arXiv]
    We study the scaling laws of diffusion in two-dimensional media with long-range correlated disorder through exact enumeration of random walks. The disordered medium is modelled by percolation clusters with correlations decaying with the distance as a power law, $r^{-a}$, generated with the improved Fourier filtering method. To characterize this type of disorder, we determine the percolation threshold $p_c$ by investigating cluster-wrapping probabilities. At $p_c$, we estimate the (sub-diffusive) walk dimension $d_w$ for different correlation exponents $a$. Above $p_c$, our results suggest a normal random walk behavior for weak correlations, whereas anomalous diffusion cannot be ruled out in the strongly correlated case, i.e., for small $a$.
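    Exact enumeration, as opposed to sampling individual trajectories, evolves the walker's full probability distribution, so the mean squared displacement is obtained without statistical error. The following minimal numpy sketch uses a "blind ant" walker (it stays put when it picks a blocked neighbour) on uncorrelated dilution for simplicity; the paper's study uses long-range correlated disorder.

```python
# Exact enumeration of a blind-ant random walk on a diluted square lattice:
# evolve the probability distribution and track the mean squared
# displacement (MSD). Uncorrelated dilution here is a simplifying assumption.
import numpy as np

def msd_exact(open_sites, steps):
    """Return the MSD after each step, computed from the full distribution."""
    L = open_sites.shape[0]
    p = np.zeros((L, L))
    p[L // 2, L // 2] = 1.0                # walker starts at the centre
    x = np.arange(L) - L // 2
    r2 = np.add.outer(x ** 2, x ** 2)      # squared distance from the origin
    out = []
    for _ in range(steps):
        p_new = p.copy()
        for ax, s in ((0, 1), (0, -1), (1, 1), (1, -1)):
            dest_open = np.roll(open_sites, -s, axis=ax)  # neighbour open?
            flow = 0.25 * p * dest_open                   # mass that moves
            p_new += np.roll(flow, s, axis=ax) - flow     # blocked picks stay
        p = p_new
        out.append(float((p * r2).sum()))
    return out

L, steps = 64, 20
free = np.ones((L, L), dtype=bool)
rng = np.random.default_rng(2)
diluted = rng.random((L, L)) < 0.7     # 70% open sites, uncorrelated
diluted[L // 2, L // 2] = True         # start site must be open
msd_free = msd_exact(free, steps)      # free lattice: MSD(t) = t exactly
msd_dil = msd_exact(diluted, steps)    # dilution slows the spread
```

    On the free lattice the MSD grows exactly as $t$ (normal diffusion); on diluted or correlated clusters it grows as $t^{2/d_w}$ with $d_w > 2$, which is the sub-diffusive walk dimension estimated in the paper.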