Help:Editing

<references /></pre>
{{Paper|title=Attention Is All You Need|authors=Ashish Vaswani et al.|url=https://arxiv.org/abs/1706.03762|tldr=They propose a new network architecture called the Transformer, based solely on attention mechanisms, which outperforms existing models in machine translation tasks while being more parallelizable and requiring less training time.|publication=arXiv|year=2017}}
==== References ====
<references />
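The <references /> tag above is where the MediaWiki Cite extension renders footnotes collected from <ref> tags elsewhere on the page. As a minimal sketch of how the two fit together (using standard Cite syntax, with the paper already cited on this page as the example source):

<pre>
The Transformer relies entirely on attention mechanisms.<ref>Vaswani et al., "Attention Is All You Need", arXiv:1706.03762 (2017).</ref>

==== References ====
<references />
</pre>

When the page is saved, the footnote text appears under the References heading, numbered and linked back to where the <ref> tag was placed.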

