Anyway, while looking over the twitter feed of that “predictive text” guy I found this, which I had never seen before (although @deusvulture says it’s famous)
Recurrent neural networks: what the fuck
(What’s especially impressive is that the parameters were fit using an individual piece of consumer hardware. High-end consumer hardware, but still. This isn’t just throwing giant clusters at the problem)
If you haven’t, check out talcos’s MTG card generation using RNNs (can give link if you want). It’s where I learned about how hype RNNs are.
I have, and they’re hilarious, especially the “meaningful, novel, but profoundly useless cards”:
* When $THIS enters the battlefield, each creature you control loses trample until end of turn.
* Whenever another creature enters the battlefield, you may tap two untapped Mountains you control.
* 3, : Add 2 to your mana pool.
* Legendary creatures can’t attack unless its controller pays 2 for each Zombie you control.
But it didn’t bowl me over in the way the article linked in the OP did, I imagine because generating M:TG cards is an unusually hard task. The game has a fairly strict syntax on top of English grammar, and also has a very large card base which already includes most card ideas beneath a certain complexity level (excepting “profoundly useless cards”), so the RNN has three separate problems to solve: “make cards that obey English grammar,” “make cards that obey M:TG syntax,” and “make cards that don’t strike an M:TG player as boring or pointless.”
By contrast, the Shakespeare generation in the OP article is an unusually easy task. Most of us don’t know much about Early Modern English syntax, and thus are used to Shakespeare sounding grammatically weird in arbitrary ways, so it’s hard to distinguish RNN grammar mistakes from authentic Shakespearean grammar. Also, Shakespeare is such a prestigious author that imitating him seems especially impressive for any given level of imitation quality.
(Of course I am making these judgments of difficulty in hindsight, with knowledge of how impressive the RNN output actually was, so they should be taken with a grain of salt.)
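(For the curious: the generation mechanism behind both the Shakespeare article and the card generator boils down to a character-level sampling loop. Here’s a toy sketch of just that loop, with untrained random weights and made-up variable names, so it produces gibberish rather than Shakespeare, but it shows the shape of the thing:)

```python
import numpy as np

# Toy char-level RNN sampling loop. Weights are random and untrained,
# purely to illustrate the mechanism; a real model would fit Wxh/Whh/Why
# to a text corpus first.
rng = np.random.default_rng(0)
chars = list("abcdefghijklmnopqrstuvwxyz ")
V, H = len(chars), 32          # vocab size, hidden-state size

Wxh = rng.normal(0, 0.1, (H, V))   # input -> hidden
Whh = rng.normal(0, 0.1, (H, H))   # hidden -> hidden (the "recurrent" part)
Why = rng.normal(0, 0.1, (V, H))   # hidden -> output logits

def sample(n, seed_ix=0):
    h = np.zeros(H)
    ix, out = seed_ix, []
    for _ in range(n):
        x = np.zeros(V)
        x[ix] = 1.0                            # one-hot encode current char
        h = np.tanh(Wxh @ x + Whh @ h)         # update recurrent state
        logits = Why @ h
        p = np.exp(logits - logits.max())      # softmax over next-char dist
        p /= p.sum()
        ix = rng.choice(V, p=p)                # sample the next character
        out.append(chars[ix])
    return "".join(out)

print(sample(40))
```

(The only thing that changes between “Shakespeare generator” and “MTG card generator” is the training text; the sampling loop is the same either way.)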
(via eikotheblue)
