Superintelligence, by Nick Bostrom

An extremely important topic that has some of the most focused minds on the planet (Gates, Musk, Minsky, Hawking rip...) issuing desperate calls to pay attention to the possible risks of a reckless sprint towards AI before we can adequately control it. This book does a fantastic job of explaining the perils of this situation, and the frankly near-impossible task of devising ways to keep ourselves safe from a superintelligence once it comes about. The speed with which an AI would go from human-level intelligence to singleton superintelligence is hard to believe until you really spend some time thinking about how many evolutionary iterations such a system could perform each second. Then think about how we could possibly contain a system that would be orders of magnitude beyond us, and the problem becomes clear.

Interestingly, he makes a case for accelerating certain types of research that would help us understand how to control an intelligence explosion, while being extremely careful not to simply hasten the arrival of said explosion. A fine line to walk, for sure.

As solid as this book is for the first half, the second half became something of an exercise in repetition, with the author exploring nearly identical topics from very slightly different angles and re-covering a fair amount of already-seen ground. I find myself writing a lot of reviews like this. Perhaps I just get bored with a book around page 200? I suppose that's possible. In any case, even if only for the first half, this text is, as Musk and Gates indicate, required reading if you want to understand the dangers that may be suddenly unleashed by any one of the well-meaning AI labs around the world.

I also appreciate any book with a fine bibliography, and this one qualifies. I've snapped pics of it for future reference; see if you can find some fun suggestions in there for yourself! Lots of Minsky and Hofstadter and Rhodes. An entire page of the author's own previous work, which seemed a little excessive! But, I suppose when you are an expert in the field...