Looking into the mathematics and the data reveals that transformers are both overused and underused.
Transformers are best known for their applications in natural language processing. They were originally designed for translating between languages,[1] and are now most famous for their use in large language models like ChatGPT (generative pretrained transformer).
But since their introduction, transformers have been applied to ever more tasks, with great results. These include image recognition,[2] reinforcement learning,[3] and even weather prediction.[4]
Even the seemingly specific task of language generation with transformers holds plenty of surprises, as we’ve already seen. Large language models have emergent properties that feel more intelligent than mere next-word prediction. For example, they may know various facts about the world, or replicate nuances of a person’s style of speech.
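At its core, that next-word prediction step can be sketched very simply: the model assigns a score (logit) to every word in its vocabulary, converts the scores to probabilities with a softmax, and picks a likely word. The toy vocabulary and logits below are invented for illustration, not output from a real model.

```python
import math

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical scores a model might assign to candidate next words
# after the prompt "The cat sat on the ..."
vocab = ["cat", "mat", "dog"]
logits = [1.0, 3.0, 0.5]

probs = softmax(logits)
# Greedy decoding: take the highest-probability word.
next_word = vocab[max(range(len(vocab)), key=lambda i: probs[i])]
print(next_word)  # prints "mat"
```

Everything surprising about large language models is built on repeating this one step, token after token.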
The success of transformers has led some people to ask whether transformers can do everything. If transformers generalize to so many tasks, is there any reason not to use a transformer?
Clearly, there is still a case for other machine learning models and, as is often forgotten these days, non-machine-learning models and the human brain. But transformers do have a number of unique properties, and have shown incredible results so far. There is also a considerable mathematical and empirical basis…