It’s not always easy to distinguish between existentialism and a bad mood.
I’m not spending the additional 34 minutes apparently required to find out what in the world they think neural network training actually is, such that it could ever possibly involve strategy on the part of the network, but I’m willing to bet it’s extremely dumb.
I’m almost certain I’ve seen EY catch shit on twitter (from actual ml researchers no less) for insinuating something very similar.
It’s a sad fate that sometimes befalls engineers who are good at talking to audiences, and who work for a company big enough to afford making that their primary role.
edit: I love that he’s chief evangelist though, like he has a bunch of little google cloud clerics running around doing chores for him.
debate pervert in a reply-guy world
Well done.
Seriously, the mandatory forced equanimity of the text went from merely off-putting to pretty gross as it became increasingly apparent that the Nonlinear people are basically sociopaths who make it a point of pride to flagrantly abuse anyone who finds themselves at the other end of a business arrangement with them. Not to mention that their employment model and accounting practices, as described, seem wildly illegal anywhere that isn’t a libertarian dystopia, even before getting into the allegations about workplace romance.
Except they are EAs doing unspecified x-risk work, aka literally God’s work, so they are afforded every lenience and every benefit of the doubt, I guess.
It hasn’t worked ‘well’ for computers since like the Pentium; what are you talking about?
The premise was pretty dumb too: if you notice that a (very reductive) technological metric has been rising roughly exponentially, you should probably assume we’re still at the low-hanging-fruit stage of R&D and that it’ll stabilize as the field matures, instead of proudly proclaiming that surely it’ll approach infinity and break reality.
There’s nothing smart or insightful about seeing a line on a graph trending upwards and assuming it’s gonna keep doing that no matter what. Not to mention that type of decontextualized wishful thinking is emblematic of the TREACLES mindset mentioned in the community’s blurb, which you should check out.
So yeah, he thought up the Singularity, which is little more than a metaphysical excuse to ignore regulations and negative externalities, because with the tech rapture around the corner any catastrophic mess we make getting there won’t matter. See also: the whole current AI debacle.