- From: Dave Raggett <dsr@w3.org>
- Date: Sat, 6 Jul 2019 20:30:01 +0100
- To: Owen Ambur <owen.ambur@verizon.net>
- Cc: public-aikr@w3.org
- Message-Id: <2E39833E-7FE0-44E9-994A-46040F63FB64@w3.org>
Hmm, a simpler interpretation is that feelings and emotions are computations that guide our behaviour in respect to our goals and our social interactions with others. Some of this further relates to fast vs slow modes of thinking as popularised by Daniel Kahneman: "System 1 and System 2 are two distinct modes of decision making: System 1 is an automatic, fast and often unconscious way of thinking. It is autonomous and efficient, requiring little energy or attention, but is prone to biases and systematic errors. System 2 is an effortful, slow and controlled way of thinking."

This is all too evident in how people think about politics, and for me it suggests that as we work on developing strong AI, we need to ensure that AI systems have feelings along with empathy and compassion, and avoid the lazy ways of thinking that far too many humans apply to politics and society. If anyone is actually interested in working on the practical aspects of this, please contact me directly.

> On 6 Jul 2019, at 18:50, Owen Ambur <owen.ambur@verizon.net> wrote:
>
> In Incognito: The Secret Lives of the Brain <http://www.eagleman.com/incognito>, David Eagleman downplays the role of consciousness in determining our behavior, most of which is on autopilot. https://www.linkedin.com/pulse/consciously-connected-communities-owen-ambur/
>
> In Against Empathy: The Case for Rational Compassion, Paul Bloom says, "When some people think about empathy, they think about kindness. I think about war." (p. 188)
>
> While the math eludes me, the broader logic seems clear:
>
> Do we want to use our powers of reasoning merely to justify our emotions, after the fact, as seems to be natural for us? And should we use AI to augment (accentuate) the expression of our emotions ... as "social" networking services tend to do? (It seems like mind-altering drugs might be more efficiently and effectively applied for that purpose.)
>
> Or might we prefer to apply logic (math) to improve the outcomes of our actions?
>
> Which of those two alternatives might make us "feel" better (be more satisfied) in the long run?

Dave Raggett <dsr@w3.org> http://www.w3.org/People/Raggett
W3C Data Activity Lead & W3C champion for the Web of things
Received on Saturday, 6 July 2019 19:30:06 UTC