- From: Binyamin <7raivis@inbox.lv>
- Date: Tue, 17 Jun 2014 23:40:04 +0300
- To: Doug Schepers <schepers@w3.org>, public-pointer-events@w3.org
- Cc: Marcos Caceres <w3c@marcosc.com>, public-device-apis@w3.org
- Message-ID: <CABj=UkK4sVKGTk=X578W8DE34Ub7To=Avb5vgNUYav=64NfMmQ@mail.gmail.com>
בע"ה
I think today's screens could be made more sensitive (with multiple
built-in sensors) and could recognize whether they are touched by a
human body, a plastic object, or a metal object, by measuring
temperature and electrical conductivity.
Binyamin
On Mon, Jan 6, 2014 at 11:59 PM, Doug Schepers <schepers@w3.org> wrote:
> Hi, Binyamin–
>
> Sorry, I missed this earlier...
>
> I don't know the context here, so I don't know exactly what you're
> commenting on. Have you seen the Pointer Events spec?
>
> Here is the interface [1]:
>
> dictionary PointerEventInit : MouseEventInit {
>     long      pointerId = 0;
>     long      width = 0;
>     long      height = 0;
>     float     pressure = 0;
>     long      tiltX = 0;
>     long      tiltY = 0;
>     DOMString pointerType = "";
>     boolean   isPrimary = false;
> };
>
> It doesn't support shape detection per se, but it does address contact
> geometry via width and height (so, an implied rectangle), and pressure, as
> well as multi-touch. All of this assumes that the device itself is capable
> of these distinctions, of course.
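>
> For illustration only (a minimal sketch; the "#target" selector is
> hypothetical, and this assumes a browser with Pointer Events
> support), a handler could read the contact geometry like this:
>
>     const target = document.querySelector<HTMLElement>("#target");
>     target?.addEventListener("pointerdown", (ev) => {
>       // ev is a PointerEvent: width/height describe the contact
>       // rectangle in CSS pixels; pressure is normalized to [0, 1].
>       console.log(ev.width, ev.height, ev.pressure, ev.isPrimary);
>     });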
>
> It doesn't detect different materials (finger, gloves, foot, pencil,
> etc.), because most screens can't make those distinctions (though some can
> detect the degree of conductivity of the touching item), but the pointerType
> property [2] does allow you to distinguish between mouse, pen, and touch
> (where it is known).
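>
> As a rough sketch of that (again with a hypothetical "#surface"
> element), branching on pointerType could look like:
>
>     const surface = document.querySelector<HTMLElement>("#surface");
>     surface?.addEventListener("pointerdown", (ev) => {
>       switch (ev.pointerType) {
>         case "mouse": /* mouse-specific handling */ break;
>         case "pen":   /* stylus-specific handling */ break;
>         case "touch": /* finger-specific handling */ break;
>         default: break; // empty string: device type not detected
>       }
>     });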
>
> If you have other feedback, or want to suggest new features or reliable
> ways of doing more detection, please let us know at
> public-pointer-events@w3.org.
>
> [1] http://www.w3.org/TR/pointerevents/#pointerevent-interface
> [2] http://www.w3.org/TR/pointerevents/#widl-PointerEvent-pointerType
>
> Regards-
> -Doug
>
>
> On 1/6/14 2:53 PM, Binyamin wrote:
>
>> בע"ה
>>
>>
>> Hi Marcos and Doug,
>>
>> Any feedback on the proposed feature: touch coordinates, shape, size
>> and strength?
>>
>>
>> Binyamin
>>
>>
>> On Wed, Dec 18, 2013 at 2:09 AM, Marcos Caceres <w3c@marcosc.com
>> <mailto:w3c@marcosc.com>> wrote:
>> >
>> > Hi Binyamin,
>> > Sorry to top post, but I wonder if this is better feedback for the
>> IndieUI group? I’ve cc’ed Doug Schepers who can probably say where this
>> feedback would be most valuable.
>> >
>> > http://www.w3.org/WAI/IndieUI/
>> >
>> > --
>> > Marcos Caceres
>> >
>> >
>> > On Tuesday, December 17, 2013 at 5:09 PM, Binyamin wrote:
>> >
>> > > בע"ה
>> > >
>> > >
>> > > Implement a Web API that returns high-sensitivity multi-touch
>> > > coordinates, shape, size and strength.
>> > >
>> > >
>> > > Steps to reproduce:
>> > > Touch the screen with any kind and shape of object (finger, glove,
>> > > foot, pencil, etc.).
>> > >
>> > > Current results:
>> > > Currently it returns just approximate touch coordinates.
>> > >
>> > > Expected results:
>> > > Return data on touch coordinates, shape, size and strength
>> > > (strength could also be converted to weight). It must also work
>> > > with multi-touch.
>> > >
>> > > Resources: the Georgia Tech pressure-based sensor
>> > > https://plus.google.com/+BinyaminLaukstein/posts/SKykHFaESFe is able
>> > > to return all of that data.
>> > > A very basic touch-size measure is currently implemented on Android
>> > > as MotionEvent.getSize():
>> > > http://developer.android.com/reference/android/view/MotionEvent.html#getSize%28int%29
>> > >
>> > >
>> > > Binyamin
>>
>
Received on Tuesday, 17 June 2014 20:41:09 UTC