Re: Open letter urging to pause AI

Exactly. An open letter to the UN and a resolution wouldn't put this back
in the box. You're not going to stop the world from turning, or halt an
era-changing, rapidly accelerating technology that has all eyes on it, and
you're certainly not going to achieve that with an email to a couple of W3
lists with no real eyes on them. Totally futile.

To try to contribute something useful, though: as somebody who uses it both
personally and in a work capacity, I have been jarred by its presentation
of falsehoods as facts, and by its fabrication of URLs, authors, and sources
for responses which simply don't exist.

Similarly, the /flow/ of conversations can be dangerous. Yesterday, for
example, GPT-4 solved a complex equation for me which (correctly) told me
exactly the resistor I needed in an audio attenuation circuit. I then asked
it to perform some others I'd already done manually, and it gave answers,
well explained with the maths, that would have had me putting in 63 volts
and blowing out both my ears and likely the walls. But a lay person would
trust the conversation after the first couple of complicated answers were
correct.
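For what it's worth, the cross-check that catches this kind of error takes
only a few lines. As a sketch (the component values below are made up for
illustration, not the figures from my actual conversation), a simple
resistive attenuator follows Vout = Vin * R2 / (R1 + R2), so you can always
re-derive what the model told you:

```python
# Hypothetical sanity check for a model-suggested attenuator value.
# All component values are illustrative assumptions.

def divider_vout(vin, r1, r2):
    """Output voltage of a resistive divider (Vout taken across R2)."""
    return vin * r2 / (r1 + r2)

# A line-level source (~1 V) attenuated for a sensitive input:
vout = divider_vout(vin=1.0, r1=9000.0, r2=1000.0)
assert abs(vout - 0.1) < 1e-9  # 20 dB of attenuation

# If a suggested value implies tens of volts at the output,
# something in the chain is wrong, whatever the explanation says.
```

The point isn't the circuit; it's that a two-line independent check would
have flagged the 63-volt answer immediately.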

That is to say, prominent warnings and reminders to users are certainly
needed before real damage happens.

On Fri, Mar 31, 2023 at 10:55 AM Hugh Glaser <hugh@glasers.org> wrote:

>
> > On 30 Mar 2023, at 23:33, Dan Brickley <danbri@danbri.org> wrote:
> >
> ...
> >
> > Having been here 25+ years I have some instincts about which topics will
> just fill up email inboxes with no ultimate impact on the world and
> benefit. Right now “something must be banned” threads on AI look to me to
> fall in that category.
> >
> > Cheers,
> >
> > Dan
> >
>
> Exactement, mon ami.
> When I saw the original open letter, my heart sank, as it reminded me of
> those ‘viral’ emails, reporting something that needs to be forwarded to
> everyone you know.
> Where essentially the virus is the email itself, prompting discussions
> like this one.
> Thus wasting everyone’s time, to pretty much no useful purpose.
> And this is my useless contribution.
>
> Hugh
>

Received on Friday, 31 March 2023 12:18:24 UTC