Exactly seven years ago, a team of research scientists at Google Research and Google Brain released a paper titled “Attention Is All You Need”. In it, they proposed a “simple network architecture” called the “Transformer”. That architecture became the pivotal, now-legendary design that powers every large language model on the planet. Such is the nature of technology: one research paper is enough to change the world. What advancements are waiting for us in the future? Will there be another paper like this one? More importantly, can we afford to miss it? That is why we come to you every week with the latest updates from the world of technology, right from the edge of tomorrow. Let’s dive into what happened this week.