Re: When are "open" data open?

Hi Pierre,

I may be missing some perspective or mechanism, as I only recently started 
looking into this work. Noticing the deadline, I decided it was better to 
voice my understanding now.

> If you check the GLD recommendations that are currently being created, 
> the standard that is recommended to use is RDF and vocabularies based on 
> this standard, which I think is quite flexible and able to adapt freely 
> with the changing "world reality". If you think this is not the case, it 
> would be interesting to hear your point of view, even if I think it's an 
> issue wider than the GLD group.

RDF is indeed very flexible. So flexible, in fact, that it is non-operational 
on its own, so instead we get a lot of low-level one-size-fits-nothing 
standards.

I see this work as the latter. From what I can see, the chosen level of 
abstraction is too restrictive: ONE standard failing both at supporting the 
reality of change in many different directions in parallel and, especially, 
at security.

In general, I see governments creating tomorrow's legacy and problems today 
by making one-size-fits-nothing standards and central structures. We see the 
mistakes in reality, as in the UK, EU, Denmark, Estonia eGovernment etc., 
where the short-term focus on just getting some connection creates 
inflexible, insecure and centralised legacy structures. It turns into 
building "virtual mainframes" on open structures instead of extending the 
open structures to the application layers. Whether it is bad craftsmanship 
or ideological regulation is irrelevant, as the consequences are real.

> As Richard mentioned, it would be good to understand how these comments 
> specifically relate to the current GLD work.

What we need are semantic structures that
a) Support control at endpoints instead of centralisation
b) Support heterogeneous resolution and asynchronous change at runtime.
From one aspect we need non-ambiguity in order not to kill patients; from 
another aspect we need the flexibility to grasp the world without assuming 
we can define everything for everyone, and without ending up in 
dysfunctional and ineffective structures.

I think one input is the suggestion of a more dynamic structure based on 
gradual standardisation, i.e. a standard for gradually creating 
interoperability between existing systems rather than forcing everything 
into one standard.
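
To make that concrete, here is a rough sketch of what I mean (the 
vocabularies and terms are hypothetical, and I assume Python with rdflib): 
pairwise, revisable bridges between existing vocabularies rather than one 
master schema everything must be forced into.

```python
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, SKOS

# Two hypothetical, already-deployed government vocabularies.
DK = Namespace("http://example.org/dk-gov/terms#")
UK = Namespace("http://example.org/uk-gov/terms#")

# A small "bridge" graph, published separately from either vocabulary.
bridge = Graph()
# Strong mapping where the meanings are known to coincide.
bridge.add((DK.postnummer, OWL.equivalentProperty, UK.postcode))
# Weaker, revisable mapping where the match is only approximate today.
bridge.add((DK.virksomhed, SKOS.closeMatch, UK.businessEntity))

def equivalents(g: Graph, prop) -> set:
    """Collect properties the bridge declares strictly equivalent to `prop`."""
    out = {prop}
    out.update(o for _, _, o in g.triples((prop, OWL.equivalentProperty, None)))
    out.update(s for s, _, _ in g.triples((None, OWL.equivalentProperty, prop)))
    return out

# A consumer merges the bridge with its own data and widens queries at
# runtime, so interoperability grows mapping by mapping instead of by decree.
print(equivalents(bridge, DK.postnummer))
```

Neither vocabulary is forced to change; the bridge can be corrected or 
extended as understanding improves.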

> If you think there are recommendations that are missing, please be 
> specific on where and what could be rewritten.

Sorry, I don't have a sponsor. It would require a lot more work to go into 
that level of detail. I know it is not particularly helpful, but that is 
the reality.

> I have checked the linksmart sourceforge page, and it's not clear what it 
> offers and how it solves the issues you have raised.

My point was an analogy.

A number of aspects:
1) Run-time resolution is fully data-driven, and each "device" is able to be 
instantiated as different logical devices simultaneously.
2) Semantic devices can be combined and nested in such a way that any 
combination of physical and logical devices can be created. I don't see 
mechanisms to link logic with data to create new semantic data.
3) The implementation does not contain the full structure. My focus was on 
semantic resolution of security at runtime, mapped towards an overall 
security ontology. In security, one critical element is that evaluations of 
security properties can change, hence the need for runtime third-party 
assertion or status verification as part of a semantic link. I don't see 
such mechanisms (a rough sketch of what I mean follows after point 5).
4) The structure hides properties, e.g. IPv6 addresses are contained as 
they leak information.

5) My real point is the need for heterogeneous semantic definitions. I am 
not certain that it is beneficial to lock structures against ONE definition; 
rather, or in parallel, we should have mechanisms to link data elements in a 
more loosely coupled manner, including the ability to have multiple 
definitions semantically with various degrees of formalism. E.g. I should 
be able to make semantic data definitions that can change and are resolved 
at runtime, as we get wiser and the semantic linkage grows as we dig into 
the mechanisms.
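
As a rough illustration of the mechanism I miss under point 3 (everything 
here is hypothetical; the status service, names and types are mine, not 
LinkSmart's or the GLD group's): a semantic link that is only dereferenced 
while a third party currently vouches for its target.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Callable


@dataclass
class SemanticLink:
    target: str          # URI of the linked definition or resource
    status_service: str  # URI of a third-party assertion/status service


@dataclass
class Assertion:
    valid: bool
    expires: datetime    # assertions age; security evaluations change


def resolve(link: SemanticLink,
            check_status: Callable[[str, str], Assertion],
            fetch: Callable[[str], dict]) -> dict:
    """Dereference the link only while a third party vouches for the target."""
    assertion = check_status(link.status_service, link.target)
    if not assertion.valid or assertion.expires <= datetime.now(timezone.utc):
        raise PermissionError(f"no current assertion for {link.target}")
    return fetch(link.target)
```

The point is only that revocation and re-evaluation become part of following 
the link, not an afterthought at the perimeter.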

E.g. in the HYDRA project we worked to create parameterized security 
resolved against an ontology of disjoint security objectives, with ontology 
support to compare the semantic contribution to security of various 
technologies and crypto-proofs.

> Now, for security of the opened data, it's an interesting issue, and there 
> is a section about it in the GLD recommendation document. Maybe you want 
> to make specific comments on this.

I am not sure which section you refer to.

But e.g. the definition of a person is clearly hopeless, as it assumes 
identification, which means that the structure is incompatible with cloud 
(as that would require pseudonyms only):
http://xmlns.com/foaf/spec/#term_Person
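
To illustrate what "pseudonyms only" could look like in practice (a toy 
sketch in Python; the key handling and identifiers are hypothetical): a 
subject reference can be stable within one context yet unlinkable across 
contexts, whereas foaf:Person is designed to identify.

```python
import hashlib
import hmac


def context_pseudonym(subject_id: str, context: str, context_key: bytes) -> str:
    """Pseudonym that is stable inside one context but, without the
    per-context key, cannot be linked to other contexts or to the raw id."""
    mac = hmac.new(context_key, f"{context}:{subject_id}".encode(), hashlib.sha256)
    return "urn:pseudo:" + mac.hexdigest()


# Separate keys per context, held by separate controllers.
health_key = b"key-held-by-the-health-domain"
tax_key = b"key-held-by-the-tax-domain"

p_health = context_pseudonym("010190-1234", "health", health_key)
p_tax = context_pseudonym("010190-1234", "tax", tax_key)
assert p_health != p_tax  # the same citizen, no cross-context linkage
```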

Similarly, I cannot find anything dealing with distributed key control and 
management. Perimeter security is clearly unable to cope with the 
challenges; we need to build security into the data structures and the 
means of encryption.
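
Again only a toy sketch of the principle (assuming Python and the 
`cryptography` package; the record layout is invented, not a GLD structure): 
the sensitive field is encrypted inside the data structure itself, and the 
key stays with the citizen or their chosen key service rather than with the 
central register, so publication and access become separate decisions.

```python
from cryptography.fernet import Fernet

# Key generated and held at the endpoint (the data subject's side), never
# stored alongside the published record.
citizen_key = Fernet.generate_key()
f = Fernet(citizen_key)

record = {
    "case_type": "building-permit",         # safe to share openly
    "municipality": "Lyngby-Taarbaek",      # safe to share openly
    "applicant": f.encrypt(b"name, address, national id").decode(),
}

# The register can publish `record` without being able to read the applicant
# field; decryption is a separate, per-request decision by the key holder.
plaintext = f.decrypt(record["applicant"].encode())
```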

Try, for instance, having a look at this for intra-country interoperability 
and security in the cloud (where security is not even realistic in theory):
http://digitaliser.dk/resource/896495



Hope the above is useful.

Semantic interoperability is important, but as Einstein said: make it as 
simple as possible, but not simpler.


Regards

Stephan Engberg


- - - - - - - - - - - - - - - - - 
Stephan J. Engberg
Priway - Security in Context




Pierre Andrews <pierre.andrews@gmail.com>
25-03-2013 20:16

 
        To:     Stephan.Engberg@priway.com
        cc:     public-gld-comments@w3.org
        Subject:        Re: When are "open" data open?


Hi Stephan,

I have checked the linksmart sourceforge page, and it's not clear what it 
offers and how it solves the issues you have raised.

As Richard mentioned, it would be good to understand how these comments 
specifically relate to the current GLD work. If you think there are 
recommendations that are missing, please be specific on where and what 
could be rewritten.

If you check the GLD recommendations that are currently being created, the 
standard that is recommended to use is RDF and vocabularies based on this 
standard, which I think is quite flexible and able to adapt freely with 
the changing "world reality". If you think this is not the case, it would 
be interesting to hear your point of view, even if I think it's an issue 
wider than the GLD group.

Now, for security of the opened data, it's an interesting issue, and there 
is a section about it in the GLD recommendation document. Maybe you want 
to make specific comments on this.

Thanks,

Pierre

--
Pierre Andrews, Ph.D.
Research Fellow


On Mon, Mar 25, 2013 at 12:02 PM, <Stephan.Engberg@priway.com> wrote:
Dear Sir,

Creating semantic interoperability represents huge possibilities for
cost reduction, improving quality and enabling new kinds of previously
unseen solutions.

However, when studying the available work on linked data, two vital aspects
that are not incorporated jump to my mind - one about innovation, or
continuous change, and one about empowerment, or the assurance that control
rests with the entity at risk and defining the demand (mostly the citizen):

a) The approach assumes standardisation around a single universal definition.
b) The approach fails to separate between data that are safe to share and
data that represent a risk to someone.

Ad a) Making structures around a single universal standard would make
everything stalemated by legacy.
We need structures that are much more resilient to continuous change in many
directions. And yes, this means that we must accept that we cannot FORCE the
world into a standard bucket unless such a bucket is able to grasp the
world reality.

I suggest a nested approach without any assumptions on outcome. We applied
such an approach in the EU HYDRA project, which is partly implemented at
http://sourceforge.net/projects/linksmart/


Ad b) Even more important is the need to respect fundamental rights and
society's needs.

Bureaucrats and cynical corporate interests want to exchange data ABOUT
someone, as that increases their power and ability to profit. However, such a
structure represents a failure by design. EVEN if "anonymised" or
"pseudonymised", such an approach represents a certain failure, as it drives
linkage in sources without security.

I kindly refer you to this presentation, which in essence states the key
elements:
https://ec.europa.eu/digital-agenda/sites/digital-agenda/files/Stephan.pdf

As can be seen, the definition of what can constitute "open" data, and how
data must be encapsulated to maintain or eliminate linkage to context, is
not a simple question.

We should be extremely careful NOT to see this from a system-centric or
bureaucratic perspective for WHATEVER excuse, e.g. assuming that researchers
or even security administrators CAN access and link data on individuals for
research purposes.


I kindly suggest to you that failure to incorporate the two issues above
represents a failure to the economy no smaller than that of former Eastern
European Communism, as it leads to legacy-based ineffectiveness and massive
centralisation of power and control at the expense of citizens and
society.


Sincerely,

Stephan Engberg
Priway - Security in Context

 ..  because the alternative is not an option

=======================================================
Stephan Engberg | Stephan.Engberg@priway.com
Priway - VAT/SE DK  25 77 53 76
Stengaards Alle 33D - 2800 Kgs. Lyngby - Denmark
Tel.: (+45) 2834 0404  - Internet: www.priway.com
