Making Transformers Smarter: A Memory Boost for Symbolic Tasks | HackerNoon

The authors propose a new attention mechanism to enhance the model architecture for improved systematic generalization.