In the meantime, try it for yourself!

* [Talk to Transformer](https://talktotransformer.com/) - "See how a modern neural network completes your text. Type a custom snippet or try one of the examples."
* [Write With Transformer](https://transformer.huggingface.co/) - "See how a modern neural network auto-completes your text"

This week's exercise:

* Read some or all of the articles about GPT-2 listed below
* Try to generate your own snippets using the [sample GPT-2 generator](http://text-gen.simon-mo.com/)
* Decide for yourself the merits of OpenAI's approach
* Supplementary question: how would _you_ deal with the potential social impacts of AI?

GLTR does something slightly different; it tells you how likely it is that the text _could have been_ generated by GPT-2.

* [GLTR / glitter](http://gltr.io/dist/index.html)
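GLTR's core trick can be sketched in a few lines: for each token in a text, ask a language model how highly it ranked that token among its predictions given the preceding context. Machine-generated text tends to use mostly top-ranked words, while human text dips into less predictable ones. Here is a minimal toy sketch of that idea, using a tiny bigram model in place of GPT-2 (the function names and sample corpus are invented for illustration — GLTR itself queries a real transformer model):

```python
from collections import Counter, defaultdict

def build_bigram_model(corpus):
    """Count next-word frequencies for each preceding word (toy stand-in for GPT-2)."""
    model = defaultdict(Counter)
    words = corpus.lower().split()
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def token_ranks(model, text):
    """For each actual next word, return its rank among the model's predictions
    (rank 1 = the model's top guess; None = never seen after that context)."""
    words = text.lower().split()
    ranks = []
    for prev, actual in zip(words, words[1:]):
        candidates = [w for w, _ in model[prev].most_common()]
        ranks.append(candidates.index(actual) + 1 if actual in candidates else None)
    return ranks

corpus = "the cat sat on the mat the cat ate the fish the dog sat on the rug"
model = build_bigram_model(corpus)
# "mat" is only the model's 2nd guess after "the", so it gets rank 2.
print(token_ranks(model, "the cat sat on the mat"))  # [1, 1, 1, 1, 2]
```

GLTR's web demo visualizes exactly these ranks as colour-coded tokens: a wall of top-10 (green) words is a hint the text came from a model rather than a person.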

(Bear in mind that the sample GPT-2 generator is a cut-down version of the actual tool.)