education [2021/09/08 09:59]
fablpd
 \\
  
  * **GANs with Transformers**: Since its introduction in 2017, the Transformer architecture has revolutionized NLP machine learning models. Thanks to the scalability of self-attention-only architectures, models can now scale into trillions of parameters, allowing human-like capacities of