May 4, 2024


“Transformers” and the 8 Google employees who changed the history of artificial intelligence (WIRED.jp)

Vaswani remembers collapsing on the couch in his office one night while writing the paper with his team. As he stared at the curtain that separated the sofa from the rest of the room, he was struck by how the patterns on the fabric looked like synapses and neurons. He told Gomez, who was there with him, that what they were working on would go far beyond machine translation. “Ultimately, speech, audio information, and visual information must be integrated into a single structure, just like the human brain,” Vaswani says. “I had a strong hunch that what our team was developing was something comprehensive.”

“Attention Is All You Need”

However, the company's senior management saw the research as just another interesting AI project. When I asked several members of the Transformer team whether their bosses had ever called them in to report on the project's progress, the answers were sparse. Still, “the team knew that this project could become something very big,” says Uszkoreit. “That's why we were so deliberate about the sentence on future research at the end of the paper.”

That sentence anticipated what was to come, a future in which every mode of human expression is fed into Transformer models: “We are excited about the future of attention-based models and plan to apply them to other tasks. We plan to extend the Transformer to problems involving input and output modalities other than text … such as images, audio and video.”

A few nights before the deadline, Uszkoreit realized the paper still needed a title. According to Jones, the team had agreed to essentially reject the accepted best practices, LSTMs among them, in favor of a technique called attention. Jones then remembered that the Beatles had written the song “All You Need Is Love.” So why not call it “Attention Is All You Need”?
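For readers wondering what the “attention” in the title actually does, here is a minimal sketch of the scaled dot-product attention operation at the heart of the paper, written in Python with NumPy. The function name, array shapes, and toy numbers are illustrative, not from the article; the underlying formula, softmax(QKᵀ/√d_k)V, is the one the paper introduces.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention, as in "Attention Is All You Need".

    Q, K: arrays of shape (seq_len, d_k); V: array of shape (seq_len, d_v).
    Returns an array of shape (seq_len, d_v).
    """
    d_k = Q.shape[-1]
    # Similarity of every query with every key, scaled by sqrt(d_k).
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the keys turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # Each output token is a weighted average of the value vectors.
    return weights @ V

# Toy self-attention example: 3 tokens, 4-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (3, 4)
```

Unlike an LSTM, which processes tokens one after another, this operation lets every token look at every other token in a single step, which is what made the architecture so easy to parallelize.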


Why the Beatles?

“Because I'm British,” Jones says. “I literally thought about it for five seconds. I never thought we would actually use it.”

The team kept collecting experimental results right up until the deadline. “The English-to-French translation numbers came in about five minutes before we submitted the paper,” Parmar says. “I sat in the little kitchen in Building 1965 and entered that last number.” By the time they submitted the paper, less than two minutes remained before the deadline.

Like almost every technology company, Google immediately filed a provisional patent on the work. This was not to prevent others from using the ideas, but to build up its patent portfolio as a defense against infringement claims.

When the conference reviewers' responses came back, the reactions were mixed. “One was positive, one was very positive, and one was like, ‘Okay,’” Parmar says. The paper was accepted for one of the evening poster sessions.

By December, the paper had become a hot topic. The four-hour session on December 6 was packed with scientists wanting to learn more about the research. The team talked until their voices were hoarse, and even after the session ended at 10:30 p.m., a crowd remained. “There was a security guard who had to ask us to leave,” Uszkoreit says. Perhaps his most satisfying moment was when computer scientist Sepp Hochreiter came up and praised the team's work. That was quite a compliment: Hochreiter is the co-inventor of long short-term memory (LSTM), the model that Transformers had just displaced as a standard tool in artificial intelligence.


“No one really understood what that meant.”

Still, Transformers didn't conquer the world overnight, and they didn't even take over Google right away. Kaiser recalls that around the time the paper was published, Shazeer proposed to Google executives that the company abandon its entire search index and train a huge Transformer network instead, a proposal that would have completely changed how Google organizes information. At the time, even Kaiser thought the idea was far-fetched. Now, everyone thinks it's only a matter of time.