DJ Strouse

I am a PhD student in Physics at Princeton University, working with Bill Bialek and David Schwab. I'm broadly interested in approaches to artificial general intelligence. My PhD is funded by a Hertz Fellowship and a DOE Computational Science Graduate Fellowship.

I did a master's at the University of Cambridge with Mate Lengyel as a Churchill Scholar and studied physics and mathematics at the University of Southern California, where I worked with Bartlett Mel and Paolo Zanardi. I've also spent time at DeepMind working with Matt Botvinick, Stanford University working with Kwabena Boahen, the Institute for Quantum Computing working with Andrew Childs, and Spotify NYC working with their machine learning team.

Email  /  Twitter  /  CV  /  Blog  /  Google Scholar  /  GitHub

Research

I'm interested in reinforcement learning (RL), information theory, and deep learning, with an eye toward understanding and creating intelligent agents. Much of my work has focused on the information bottleneck (IB), and lately I've been interested in using IB to improve the training of RL agents. In the past, I've also worked on quantum information theory and computational neuroscience.

Current Projects
Signalling and hiding intentions to cooperate and compete
DJ Strouse, Max Kleiman-Weiner, David Schwab, Matt Botvinick
in progress
code

We encourage RL agents to signal/hide their intentions in a cooperative/competitive multi-agent setting by adding an information-theoretic regularizer to their cost function that encourages their actions to provide/hide information about their goal.
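
As a rough sketch of the idea (my notation; details may differ from the writeup): with goal G, states S, and actions A, the regularized objective is roughly

    E[ sum_t r_t ] + β I(A; G | S),

with β > 0 encouraging actions that reveal the goal (cooperation) and β < 0 encouraging actions that hide it (competition).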

Hierarchical reinforcement learning via variational information minimization
DJ Strouse, Jane Wang, Neil Rabinowitz, David Pfau, Matt Botvinick
in progress
talk / note on Distral

We encourage RL agents to develop efficient hierarchical representations of task structure in a multi-goal environment by limiting the information about the goal that is used by the policy.
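
A minimal sketch of the variational bound involved (my notation, assuming a goal-conditioned policy π(a|s,g) and a goal-agnostic default policy π0(a|s)):

    I(A; G | S) ≤ E_{s,g} [ D_KL( π(a|s,g) || π0(a|s) ) ],

so penalizing the KL to a shared default policy upper-bounds the goal information used by the policy; this KL penalty is also the one that appears in Distral.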

Efficient use of discrete latent variables using the deterministic information bottleneck
DJ Strouse, David Schwab
in progress
note

We use the deterministic information bottleneck to regularize discrete latent variable models, encouraging the use of as few latent variables as possible for a given level of performance.
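
As a rough sketch (my notation; see the note for the actual formulation): for a discrete latent T with encoder q(t|x) and decoder p(x|t), the regularized objective looks roughly like

    E[ -log p(x|t) ] + β H(T),

where the entropy term plays the role that the rate term I(X;T) plays in a standard variational IB objective, pushing the marginal q(t) to concentrate on as few latent states as possible.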

Publications
The information bottleneck and geometric clustering
DJ Strouse, David Schwab
arXiv Preprint, 2017
arXiv / code

We show how to use the (deterministic) information bottleneck to perform geometric clustering, introducing a novel information-theoretic model selection criterion.
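
Roughly (my notation): given points indexed by i at locations x_i, one smooths each point into a distribution p(x|i), e.g. a Gaussian of width s centered at x_i, and then solves the (D)IB problem over cluster assignments q(c|i),

    minimize I(c; i) − β I(c; x)   (or H(c) − β I(c; x) in the deterministic case),

so that clusters compress the point index while preserving information about location.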

The deterministic information bottleneck
DJ Strouse, David Schwab
Neural Computation (NECO), 2017 & Uncertainty in Artificial Intelligence (UAI), 2016
arXiv / code / UAI / NECO

We introduce the deterministic information bottleneck (DIB), an alternative formulation of the information bottleneck that uses entropy instead of mutual information to measure compression. This results in a hard clustering algorithm with a built-in preference for using fewer clusters.
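
Concretely, writing both objectives over the encoder q(t|x) that compresses X into T while preserving information about a relevance variable Y:

    L_IB  = I(X; T) − β I(T; Y)
    L_DIB = H(T) − β I(T; Y)

Minimizing L_DIB trades the entropy of the compressed representation against relevance, and its optimal encoders turn out to be deterministic (hard) assignments.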

Optimizing online learning capacity in a biologically-inspired neural network
Xundong Wu, DJ Strouse, Bartlett Mel
Society for Neuroscience (SfN), 2011
poster

We study the optimal conditions for online recognition memory in a biologically-inspired neural network with "dendrite-aware" learning rules.

Levinson's theorem for graphs
Andrew Childs, DJ Strouse
Journal of Mathematical Physics (JMP), 2011
arXiv / JMP / talk

We prove an analog of a classic result in quantum scattering theory for the setting of scattering on graphs. The goal is to provide additional tools for designing quantum algorithms in this setting.
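
For context (the standard result, not anything specific to the paper): Levinson's theorem relates the number of bound states N of a potential to the winding of the scattering phase, roughly

    δ(0) − δ(∞) = N π,

and the paper establishes a counting relation of this type for scattering on graphs.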


Good artists copy.