Shape Preserving Curve Detectors

A Simplified Story of Curve Neurons

Before running detailed experiments, let's look at a high-level and slightly simplified story of how the ten curve neurons in 3b work.

Each neuron's ideal curve can be rendered with feature visualization, which uses optimization to find superstimuli.
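To make the mechanics concrete, here is a minimal sketch of feature visualization by gradient ascent, assuming a hypothetical PyTorch `model` and a handle on the `layer` containing the unit of interest. Real feature visualization adds transformation robustness and a frequency-space parameterization that this sketch omits.

```python
import torch

def visualize_unit(model, layer, unit, steps=512, lr=0.05):
    """Optimize an input image to strongly excite one channel of `layer`."""
    image = torch.randn(1, 3, 224, 224, requires_grad=True)  # start from noise
    optimizer = torch.optim.Adam([image], lr=lr)

    activation = {}
    def hook(module, inputs, output):
        activation["value"] = output
    handle = layer.register_forward_hook(hook)

    for _ in range(steps):
        optimizer.zero_grad()
        model(torch.sigmoid(image))                  # squash pixels into [0, 1]
        # Maximize the mean activation of the chosen channel (a "superstimulus")
        loss = -activation["value"][0, unit].mean()
        loss.backward()
        optimizer.step()

    handle.remove()
    return torch.sigmoid(image).detach()
```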

Each curve detector implements a variant of the same algorithm, tuned to a different orientation: it responds to a wide variety of curves near its preferred orientation and fires less as the orientation rotates away. Together, for example, they tile the rim of a clock, with each neuron firing for the segment that matches its orientation.
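One way to see this tuning directly is to probe a unit with synthetic curve stimuli and trace its response as the curve rotates. The sketch below is a minimal version of that experiment; `model`, `layer`, `unit`, and the `unit_activation` helper are hypothetical placeholders rather than the article's actual setup.

```python
import numpy as np
import torch
from PIL import Image, ImageDraw

def render_curve(orientation_deg, size=224):
    """Draw one 90-degree arc rotated to the given orientation on a gray field."""
    img = Image.new("L", (size, size), color=128)
    draw = ImageDraw.Draw(img)
    r = size // 3
    box = [size // 2 - r, size // 2 - r, size // 2 + r, size // 2 + r]
    start = orientation_deg - 45
    draw.arc(box, start=start, end=start + 90, fill=255, width=4)
    return img

def tuning_curve(model, layer, unit, orientations=range(0, 360, 10)):
    """Record the unit's response to the synthetic curve at each orientation."""
    responses = []
    for angle in orientations:
        x = torch.from_numpy(np.array(render_curve(angle), dtype=np.float32) / 255.0)
        x = x.expand(3, -1, -1).unsqueeze(0)          # grayscale -> 1x3xHxW batch
        responses.append(unit_activation(model, layer, unit, x))  # hypothetical helper
    return list(orientations), responses
```

If the simplified story holds, the resulting tuning curve should peak at the unit's preferred orientation and fall off smoothly as the curve rotates away from it.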

One hypothesis is that combing is useful because perpendicular lines appear across many natural curved features, like tires, clocks, and logos.

A related hypothesis is that combing might allow curve detectors to be used for fur detection in some contexts. Another hypothesis is that a curve has higher "contrast" when perpendicular lines run towards it. Recall that in the dataset examples, the strongest negative pre-ReLU activations were curves at opposite orientations. If a curve detector wants to see a strong change in orientation between the curve and the space around it, it may treat perpendicular lines as providing more contrast than a solid color.

Finally, we think it's possible that combing is really just a convenient way to implement curve detectors: a side effect of a shortcut in circuit construction rather than an intrinsically useful feature. In conv2d1, edge detectors are inhibited by perpendicular lines in conv2d0. One of the things a line or curve detector needs to do is check that the image is not just a single repeating texture, but that it has a strong line surrounded by contrast. It seems to do this by weakly inhibiting parallel lines alongside the tangent. Being excited by a perpendicular line may be an easy way to implement an "inhibit an excitatory neuron" pattern, which allows for capped inhibition without creating dedicated neurons at the previous layer.
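Hypotheses like these come partly from reading the weights that connect early edge and line channels to later units. As a rough sketch, inspecting a single cross-channel weight pattern in a convolutional layer looks like this, where `conv2d1` and the channel indices are illustrative placeholders rather than the article's actual neuron numbers:

```python
import torch

def cross_channel_weights(conv_layer, out_unit, in_unit):
    """Return the kH x kW spatial weight pattern from `in_unit` to `out_unit`."""
    w = conv_layer.weight.detach()    # shape (out_channels, in_channels, kH, kW)
    return w[out_unit, in_unit]

# Positive entries in the returned pattern excite the downstream unit at that
# spatial offset; negative entries inhibit it. Example (placeholder indices):
# pattern = cross_channel_weights(conv2d1, out_unit=42, in_unit=7)
```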

Combing is not unique to curves. We also observe it in line detectors and, more broadly, in shape features that are built up from lines. A lot more work could be done exploring the combing phenomenon. Why does combing form? Does it persist in adversarially robust models? Is it an example of what Ilyas et al. call a "non-robust feature"?

Conclusion

Compared to fields like neuroscience, artificial neural networks make careful investigation easy. We can read and write to every weight in the neural network, use gradients to optimize stimuli, and analyze billions of realistic activations across a dataset. Composing these tools lets us run a wide range of experiments that show us different perspectives on a neuron. If every perspective shows the same story, it’s unlikely we’re missing something big.

Given this, it may seem odd to invest so much energy into just a handful of neurons. We agree. We first estimated it would take a week to understand the curve family. Instead, we spent months exploring the fractal of beauty and structure we found.

Many paths led to new techniques for studying neurons in general, like synthetic stimuli or using circuit editing to ablate a neuron's behavior. Others are only relevant for some families, such as the equivariance motif or our hand-trained "artificial artificial neural network" that reimplements curve detectors. A couple were curve-specific, like analyzing curve detectors as a kind of curve analysis algorithm.
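As a concrete example of what circuit editing for ablation could look like, the sketch below zeroes every weight that reads from a chosen channel, so downstream layers no longer see that neuron. It assumes hypothetical PyTorch conv modules and placeholder indices; it is a sketch of the idea rather than the exact editing procedure used here.

```python
import torch

@torch.no_grad()
def ablate_unit(next_layer, unit):
    """Zero every weight in `next_layer` that reads from channel `unit`,
    so downstream computation no longer sees that neuron's output."""
    next_layer.weight[:, unit, :, :] = 0.0   # weight layout: (out, in, kH, kW)

# Usage sketch (placeholder names): ablate a unit, then re-run the stimuli that
# excited it and check which downstream behaviors change.
# ablate_unit(conv2d2, unit=379)
```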

Frequently Asked Questions

Q: What is the main goal of this article?
A: The main goal of this article is to provide an overview of the curve detectors in the InceptionV1 neural network and their behavior.

Q: What is the significance of the curve detectors?
A: The curve detectors are significant because they demonstrate naturally occurring equivariance in neural networks, which is a key property of human vision.

Q: What is the relationship between the curve detectors and the combing phenomenon?
A: Curve detectors exhibit combing: small lines perpendicular to the curve excite the detector along its tangent. The article discusses several hypotheses for why combing forms, including the possibility that it is a side effect of how the circuit constructs curve detectors rather than an intrinsically useful feature.

Q: What are the implications of this research?
A: The implications of this research are that it provides a deeper understanding of the inner workings of neural networks and how they implement vision tasks.
