![Chaitanya K. Joshi | @chaitjo@sigmoid.social on Twitter: "Exciting paper by Martin Jaggi's team (EPFL) on Self-attention/Transformers applied to Computer Vision: "A self-attention layer can perform convolution and often learns to do so](https://pbs.twimg.com/media/EKRtjJ9U8AAyOz3.jpg:large)
Chaitanya K. Joshi | @chaitjo@sigmoid.social on Twitter: "Exciting paper by Martin Jaggi's team (EPFL) on Self-attention/Transformers applied to Computer Vision: "A self-attention layer can perform convolution and often learns to do so
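The claim quoted in that tweet (from the EPFL paper by Cordonnier, Loukas & Jaggi, ICLR 2020) is that a multi-head self-attention layer with relative positional encodings can express any convolution, using one head per kernel offset. Below is a minimal NumPy sketch of that construction, not the paper's code: all names are illustrative, and the attention weights are hard-coded as one-hot over each head's fixed offset, which is the degenerate attention pattern the equivalence relies on.

```python
# Sketch: multi-head self-attention where head h attends only to the pixel at
# a fixed relative offset reduces exactly to a KxK convolution.
import numpy as np

rng = np.random.default_rng(0)
H, W, C_in, C_out, K = 6, 6, 4, 3, 3          # image size, channels, kernel size
X = rng.standard_normal((H, W, C_in))
kernel = rng.standard_normal((K, K, C_in, C_out))
# One "head" per kernel offset: (-1,-1) ... (1,1) for K=3.
offsets = [(dy, dx) for dy in range(-(K // 2), K // 2 + 1)
                    for dx in range(-(K // 2), K // 2 + 1)]

def conv2d(X, kernel):
    """Reference KxK convolution with zero padding."""
    pad = K // 2
    Xp = np.pad(X, ((pad, pad), (pad, pad), (0, 0)))
    out = np.zeros((H, W, C_out))
    for y in range(H):
        for x in range(W):
            patch = Xp[y:y + K, x:x + K]              # (K, K, C_in)
            out[y, x] = np.einsum('klc,klcd->d', patch, kernel)
    return out

def attention_as_conv(X, kernel):
    """Each head's attention is one-hot on its relative offset; its value
    projection is the kernel slice at that offset. Summing heads gives
    exactly conv2d above."""
    pad = K // 2
    Xp = np.pad(X, ((pad, pad), (pad, pad), (0, 0)))
    out = np.zeros((H, W, C_out))
    for dy, dx in offsets:
        W_v = kernel[dy + pad, dx + pad]              # (C_in, C_out) value projection
        for y in range(H):
            for x in range(W):
                # attention weight is 1 on pixel (y+dy, x+dx), 0 elsewhere
                out[y, x] += Xp[y + dy + pad, x + dx + pad] @ W_v
    return out

assert np.allclose(conv2d(X, kernel), attention_as_conv(X, kernel))
print("one-hot relative-position attention == 3x3 convolution")
```

The paper's full result is stronger (a learned layer with enough heads *can* represent this pattern and often converges toward it); the sketch only verifies the expressivity direction of the claim.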
![Vision Transformers: Natural Language Processing (NLP) Increases Efficiency and Model Generality | by James Montantes | Becoming Human: Artificial Intelligence Magazine](https://miro.medium.com/v2/resize:fit:1400/0*y-DGZNTUMAKNV-76.jpg)
Vision Transformers: Natural Language Processing (NLP) Increases Efficiency and Model Generality | by James Montantes | Becoming Human: Artificial Intelligence Magazine
![Tsinghua & NKU's Visual Attention Network Combines the Advantages of Convolution and Self-Attention, Achieves SOTA Performance on CV Tasks | Synced](https://i0.wp.com/syncedreview.com/wp-content/uploads/2022/02/image-86.png?fit=960%2C538&ssl=1)
Tsinghua & NKU's Visual Attention Network Combines the Advantages of Convolution and Self-Attention, Achieves SOTA Performance on CV Tasks | Synced
![How Attention works in Deep Learning: understanding the attention mechanism in sequence models | AI Summer](https://theaisummer.com/static/e9145585ddeed479c482761fe069518d/ee604/attention.png)
How Attention works in Deep Learning: understanding the attention mechanism in sequence models | AI Summer
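For reference, the mechanism that article covers reduces to a few lines. A minimal sketch, assuming standard scaled dot-product attention as in "Attention Is All You Need"; the shapes and names are illustrative:

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)   # subtract max for stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_q, d), K: (n_k, d), V: (n_k, d_v).
    Each query output is a weighted mix of the values, with weights given
    by softmax over query-key similarities."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])   # (n_q, n_k) similarities
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ V                        # (n_q, d_v)

rng = np.random.default_rng(0)
Q = rng.standard_normal((5, 8))
K = rng.standard_normal((7, 8))
V = rng.standard_normal((7, 16))
print(scaled_dot_product_attention(Q, K, V).shape)  # (5, 16)
```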
![New Study Suggests Self-Attention Layers Could Replace Convolutional Layers on Vision Tasks | Synced](https://i0.wp.com/syncedreview.com/wp-content/uploads/2020/01/image-25-1.png?fit=1137%2C526&ssl=1)