This year's Dynamics Days was hosted by Northwestern University in Evanston, Illinois, from January 4-6, 2019. As those familiar with the conference might expect, the topics covered ranged widely across the sciences, from active fluids to biological problems to reservoir computing, with many stops in between. I won't be able to give every talk its due here, but there were some memorable highlights.
Starting with the classic nonlinear dynamics of coupled oscillators, we were treated to two talks on how synchronizing clusters arise from network structure. Louis Pecora (Naval Research Lab) delivered a great talk illuminating the group-theoretic reasoning behind clustering in multilayered networks with a different type of oscillator in each layer, and how to reconcile the asymmetry between the oscillator types. Takashi Nishikawa (Northwestern University) followed later with a fast method for finding independently synchronizable clusters, which he used to enumerate stable chimera states. Three-minute flash talks let Zachary Nicolaou (Northwestern) and Yuanzhao Zhang (Northwestern) give us a taste of their coupled-oscillator projects before the poster session: Nicolaou examined the rich dynamical repertoire of Janus oscillator networks, while Zhang highlighted how heterogeneity can aid the long-term stability of synchrony under noisy conditions. Finally, Hyunsuk Hong (Chonbuk National University, Korea) walked us through a definition of heat for coupled oscillators and how that heat dissipates as the oscillators approach synchrony.
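None of the speakers' specific models are reproduced here, but the flavor of synchronization they all build on can be seen in the textbook Kuramoto model, where each oscillator's phase is pulled toward the population's mean phase with coupling strength K. Below is a minimal pure-Python sketch (all parameter values are illustrative, not from any talk) showing the order parameter r, which climbs toward 1 as the population synchronizes:

```python
import cmath
import math
import random

def kuramoto(K, N=50, steps=2000, dt=0.01, seed=0):
    """Euler-integrate the Kuramoto model and return the final order
    parameter r = |<exp(i*theta)>|; r near 1 indicates synchrony."""
    rng = random.Random(seed)
    theta = [rng.uniform(0, 2 * math.pi) for _ in range(N)]
    omega = [rng.gauss(0.0, 0.5) for _ in range(N)]  # natural frequencies
    for _ in range(steps):
        # mean-field form: dtheta_i/dt = omega_i + K * r * sin(psi - theta_i)
        z = sum(cmath.exp(1j * t) for t in theta) / N
        r, psi = abs(z), cmath.phase(z)
        theta = [t + dt * (w + K * r * math.sin(psi - t))
                 for t, w in zip(theta, omega)]
    return abs(sum(cmath.exp(1j * t) for t in theta) / N)

weak, strong = kuramoto(K=0.1), kuramoto(K=4.0)
print(f"r(K=0.1) = {weak:.2f},  r(K=4.0) = {strong:.2f}")
```

With coupling well above the critical value the oscillators lock together (r near 1), while weak coupling leaves them incoherent; the cluster-synchronization talks concern the richer case where network symmetries let only subsets of oscillators lock.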
In the intersection of oscillators with neuroscience, Jeff Moehlis (University of California, Santa Barbara) compared several methods for controlling populations of neural oscillators in the context of avoiding the pathological synchrony observed in Parkinson's disease. Current methods (pun intended) of deep-brain stimulation (DBS) deliver constant high-frequency pulses, which are very effective in certain patients, but more energy-efficient stimulation strategies could extend the battery life of these devices and reduce the number of costly replacement surgeries.
My favorite talk of the weekend was that of Sara Solla (Northwestern), whose team tackled the challenge of identifying consistent brain activity underlying a learned behavior from recordings taken months or years apart. With no way to record from every neuron involved in the task, or even from the same neurons in each session, they projected the measured activity onto a low-dimensional manifold, which not only showed consistent dynamics across long periods of time but also organized the trajectories in phase space according to the task at hand (an eight-directional eye saccade task). This discovery of "cognitive space" (my words, not theirs) may have huge implications for brain-machine interfaces and the interpretation of neural activity in general.
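I don't know the details of the team's dimensionality-reduction pipeline, but the basic move, projecting many noisy channels onto a dominant low-dimensional direction, can be sketched with plain PCA. Here is a toy example (synthetic data, power iteration for the leading component; everything about it is illustrative, not from the talk):

```python
import math
import random

def leading_pc(data):
    """Leading principal component of `data` (rows = time samples,
    cols = channels) via power iteration on the covariance matrix."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    X = [[row[j] - means[j] for j in range(d)] for row in data]
    # covariance C = X^T X / n
    C = [[sum(X[t][i] * X[t][j] for t in range(n)) / n for j in range(d)]
         for i in range(d)]
    v = [1.0] * d
    for _ in range(200):  # power iteration: v <- C v / ||C v||
        w = [sum(C[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# synthetic "recordings": 3 channels all driven by one latent signal + noise
rng = random.Random(1)
latent = [math.sin(0.1 * t) for t in range(500)]
mix = [0.8, 0.5, -0.3]  # how strongly each channel reflects the latent
data = [[m * s + rng.gauss(0, 0.05) for m in mix] for s in latent]
v = leading_pc(data)
# alignment between the recovered direction and the true mixing vector
mnorm = math.sqrt(sum(m * m for m in mix))
align = abs(sum(a * b for a, b in zip(v, mix))) / mnorm
```

The recovered direction lines up with the latent signal's mixing vector even though no single channel is recorded cleanly, which is the intuition behind comparing manifolds across sessions that sample different neurons.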
Two more neuroscience perspectives came from Bingni Brunton (University of Washington), who used data-driven modeling to understand the dynamics of several neurological systems, healthy and unhealthy, and Zachary Kilpatrick (University of Colorado, Boulder), who approached drift-diffusion decision making in an environment whose state switches stochastically and derived the statistically optimal rate at which to discount old information. It turns out humans and other animals discount a bit too slowly.
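To see why discounting matters at all, consider a deterministic toy (my own illustration, not Kilpatrick's model): the hidden state flips sign partway through, and we compare a perfect evidence integrator against a leaky one that forgets old evidence at rate lam.

```python
# Toy illustration: an environment whose hidden state flips sign at
# step 100, observed without noise to keep the comparison deterministic.
# A perfect integrator drags all of its stale evidence past the switch;
# a leaky (discounting) accumulator forgets at rate lam and recovers fast.

def accumulate(obs, lam):
    """Leaky accumulator: a <- (1 - lam) * a + obs. lam = 0 is a
    perfect integrator; larger lam discounts older evidence faster."""
    a, trace = 0.0, []
    for o in obs:
        a = (1.0 - lam) * a + o
        trace.append(a)
    return trace

obs = [+1.0] * 100 + [-1.0] * 120  # state flips at step 100
perfect = accumulate(obs, lam=0.0)
leaky = accumulate(obs, lam=0.1)

def recovery(trace):
    """Steps after the flip before the accumulator's sign matches
    the new state."""
    for i, a in enumerate(trace[100:]):
        if a < 0:
            return i
    return None

print(recovery(perfect), recovery(leaky))
```

The perfect integrator needs as many steps to change its mind as it spent accumulating the old state, while the leaky one recovers in a handful; the analytical question is how to set the discount rate against the environment's switching rate, and the talk's punchline was that animals sit below that optimum.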
A similar process of discounting old information was one of the insights that let Mauricio Santillana (Harvard Medical School) improve "nowcasting" for medical professionals. He explained how he kept beating Google's Flu Trends algorithm (which aimed to predict flu prevalence from search data) despite working with far less pristine data. He has also used machine learning to predict the best time to change a patient's treatment, based on the continuous-time data streaming from monitoring devices. The opportunities to inform healthcare with machine learning are exciting.
In another public-health-relevant talk, Lauren Childs (Virginia Tech) discussed the dynamics of malaria infection, in particular how the parasite expresses different protein markers to keep the immune system on the run for an extended period of time. Several distinctive signatures in parasite-count traces were examined and matched by an intricate model of each step of the parasite's life cycle.
We learned more biology from Tomas Bohr (Technical University of Denmark), who explained the remarkable system by which trees distribute sugar, from its production in the leaves to its transport through phloem tubes back to the growing parts of the tree. He put forward a leaky pipe-flow model for short- to medium-needled trees, but many mysteries remain to be explained.
Sean Cornelius (Northeastern University) offered a compellingly general explanation for the ubiquity of Laplace-distributed growth rates across many fields, from biological populations to corporate profits. In competitive networks where each node's activation is bistable, he explained, noise produces growth-rate distributions that are not Gaussian but Laplace.
Active fluids are another intersection of the natural and the artificial: microorganisms often propel themselves by spinning, and engineered microrotors show great potential for applications. Petia Vlahovska (Northwestern) examined active fluids of Quincke rotors, presenting both particle and continuum models of their behavior. Vincenzo Vitelli (University of Chicago) examined time-reversal-symmetry-breaking properties of fluids of active, magnetically driven microrotors, including a new dissipationless viscosity term.
Throughout the conference, machine learning showed its remarkable practicality, cropping up in a large portion of the talks over the weekend. Reservoir computing was used to 'learn' many nonlinear dynamical systems, with applications such as model-free control of chaotic systems, as brought to light in a flash talk by Daniel Canaday (Ohio State University). Edward Ott (University of Maryland) showed that even though machine-learned models necessarily diverge from the true nonlinear system, they sometimes retain 'realistic-looking' dynamics and match the system's ergodic (long-term statistical) behavior.
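For readers who haven't met reservoir computing: a fixed random recurrent network is driven by the input signal, and only a linear readout from the reservoir's state is trained. The sketch below (my own minimal example with untuned parameters, not any speaker's system) trains a readout by ridge regression to predict a sine wave one step ahead:

```python
import math
import random

rng = random.Random(42)
N = 30  # reservoir size

# fixed random weights; the 0.1 scale keeps the recurrent dynamics
# contracting (the "echo state" property) -- a rough choice, not tuned
W_in = [rng.uniform(-0.5, 0.5) for _ in range(N)]
W = [[rng.uniform(-0.1, 0.1) for _ in range(N)] for _ in range(N)]

def step(x, u):
    """One reservoir update: x <- tanh(W x + W_in * u)."""
    return [math.tanh(sum(W[i][j] * x[j] for j in range(N)) + W_in[i] * u)
            for i in range(N)]

def solve(A, b):
    """Solve A w = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (M[r][n] - sum(M[r][k] * w[k]
                              for k in range(r + 1, n))) / M[r][r]
    return w

# drive the reservoir with a sine wave and collect its states
dt, T, washout, split = 0.5, 400, 50, 300
u = [math.sin(dt * t) for t in range(T + 1)]
x, states = [0.0] * N, []
for t in range(T):
    x = step(x, u[t])
    states.append(x)

# train only the readout: w = (S^T S + beta I)^-1 S^T y, one-step targets
train, beta = range(washout, split), 1e-4
A = [[sum(states[t][i] * states[t][j] for t in train)
      + (beta if i == j else 0.0) for j in range(N)] for i in range(N)]
b = [sum(states[t][i] * u[t + 1] for t in train) for i in range(N)]
w = solve(A, b)

# evaluate one-step-ahead predictions on held-out steps
errs = [sum(wi * xi for wi, xi in zip(w, states[t])) - u[t + 1]
        for t in range(split, T)]
rmse = math.sqrt(sum(e * e for e in errs) / len(errs))
print(f"test RMSE = {rmse:.4f}")
```

Because the recurrent weights are never trained, fitting reduces to a single linear regression, which is what makes the approach cheap enough to wrap around chaotic systems for the forecasting and model-free control applications mentioned above.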
Many more insights were shared, but these margins are too small to contain them. Suffice it to say that there was something for everyone, which is one of the great features of dynamics as a field, and the reason I'm looking forward to Dynamics Days 2020!