https://www.vox.com/the-highlight/23447596/artificial-intelligence-agi-openai-gpt3-existential-risk-human-extinction
It cannot be repeated often enough: venture capitalists and industry are not really interested in human-centered AI.
And I am starting to see the writing on the wall: a wave of PR activity from DESTRUCTIVE AI deniers, just as happened with tobacco, oil, and climate change.
What we need is the equivalent of what the IPCC is for climate science, but for human-centered, controllable AI.
Milton Ponson
GSM: +297 747 8280
PO Box 1154, Oranjestad
Aruba, Dutch Caribbean
Project Paradigm: Bringing ICT tools for sustainable development to all stakeholders worldwide through collaborative research on applied mathematics, advanced modeling, software, and standards development
On Sunday, November 27, 2022 at 10:30:16 AM AST, Paola Di Maio <paola.dimaio@gmail.com> wrote:
I am glad to see Stanford conceding that humans must remain at the center of AI. There is a lot to dig into that is relevant to this CG: what are the implications for us here?
https://hai.stanford.edu/news/language-models-are-changing-ai-we-need-understand-them