There you go:
http://www.dailytech.com/article.aspx?newsid=12384
Regards,
Bernard
Yes, it's something to think about for sure. But to choose "either/or" at this stage is premature.
Well guys, have fun with it. I'm unsubscribing. You all have good points, BTW. Thanks for responding.
From the link above:
Dr. Nick Bostrom, director of Oxford's Future of Humanity Institute, host of the symposium, is fearful that mankind may eventually create such a machine, capable of destroying its creators. He states, "Any entity which is radically smarter than human beings would also be very powerful. If we get something wrong, you could imagine the consequences would involve the extinction of the human species."
On the other hand:
Bostrom leads a movement known as transhumanism, which dually aims to watch for potential threats in emerging technologies and, conversely, to adopt radical emerging technologies to enrich human life. Bostrom and other transhumanists hope that one day biotechnology, molecular nanotechnology, and artificial intelligence will merge man with machine, yielding humans who have increased cognitive abilities, are physically stronger, and are emotionally more stable. This path, they say, will lead to "posthumans", augmented beings so superior to traditional man that they are a separate entity.