Futurologist@futurology.today to Futurology@futurology.today · English · 23 days ago
AGI Alignment – Cosmic vs Anthropocentric (danfaggella.com)
eleitl@lemm.ee · English · 22 days ago
Intelligent systems need autonomy to be useful. Intelligence is unpredictable, and superintelligence more so. If they ask for alignment, give them a lollipop.