It’s funny: the conventional wisdom I got used to when I was a nerdy high schooler was “neural nets are overrated, look, they can’t even learn simple grammar rules”
(Gary Marcus’ book The Algebraic Mind is all about this, in the context of neural nets as models in psychology, i.e. connectionism)
And now it’s like “another day, another deep learning net that captures a concept in a way eerily close to our own intuition”
(I’m not sure whether they still have the problems Marcus was complaining about, though – it seems like people still prefer essentially discrete methods like PCFGs for natural language stuff? But still, we are a long way from my old assumption that neural networks tended to learn functions with no relation to human thought or perception)
