It’s funny, the conventional wisdom I got used to when I was a nerdy high schooler was “neural nets are overrated, look, they can’t even learn simple grammar rules”

(Gary Marcus’ book The Algebraic Mind is all about this, in the context of neural nets as models in psychology, i.e. connectionism)

And now it’s like “another day, another deep learning net that captures a concept in a way eerily close to our own intuition”

(I’m not sure if they still have the problems Marcus was complaining about, though – it seems like people still prefer more essentially discrete methods like PCFGs for natural language stuff?  But still, we are a long way from my old assumption that neural networks tended to learn functions that had no relation to human thought or perception)

  1. shacklesburst reblogged this from nostalgebraist
  2. flamingwendigo reblogged this from nostalgebraist
  3. typicalacademic reblogged this from nostalgebraist and added:
    Neural nets being overrated is still the conventional wisdom I get from my advisor (and, to varying degrees, from the...
  4. nostalgebraist reblogged this from atonaltantrum and added:
    ah yes, that famous pair of antonyms, “tremendous” and “david.brett_@_thomsonreuters.com” (Very interesting, thanks for...
  5. atonaltantrum reblogged this from nostalgebraist and added:
    But also have you seen Jordan Ellenberg’s post on word2vec? It’s not quite as impressive as it seems at first....