Re: W3C position on URIs http:// vs. https://

Yes, I suspect that this is telling us that very few applications actually “use” the ontologies, in the sense of requiring triples or inference from their content, which is why they don’t need to be loaded.
Steve Harris once explained(!) the role of schema-like things to me.
For DBs, the schema tells the data providers what structure they can put into the DB, and how to query it, but also tells the DB how to structure itself.
For RDF stores, it is only the first two things; the KB doesn’t usually need the schema. This means that the purpose of the schema is (almost solely) to inform users of the KB about how to interact with it. The fact that people effectively offer only HTML versions of their schemas underlines this.
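
In rdflib terms the point looks something like this (a minimal sketch; the data and names are invented, and nothing here ever dereferences the FOAF schema):

# Query RDF data without ever loading the ontology it uses.
# Assumes rdflib is installed; the graph is purely illustrative.
from rdflib import Graph

data = """
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
@prefix ex:   <http://example.org/> .

ex:alice a foaf:Person ;
    foaf:name  "Alice" ;
    foaf:knows ex:bob .

ex:bob a foaf:Person ;
    foaf:name "Bob" .
"""

g = Graph()
g.parse(data=data, format="turtle")

# The FOAF schema is never fetched; the query works on the triples alone.
query = """
PREFIX foaf: <http://xmlns.com/foaf/0.1/>
SELECT ?name WHERE { ?person a foaf:Person ; foaf:name ?name . }
"""
for row in g.query(query):
    print(row.name)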

This is of course a great blessing of SemWeb technology, and possibly what I love the most about it.
But oft with blessings come curses, however small.

There are some tools that work with ontologies/schemas to build UIs (Fresnel, for example), validate RDF and so on (see the sketch below), but they don’t seem to be widely used.
And perhaps if they were, we would spend a lot more time arguing over ontology content and maintenance, which would not necessarily be a positive use of our valuable time.
After all, we have already spent quite a while discussing the http(s) issue. :-)
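
For instance, the validation case might look something like this (a minimal sketch, assuming rdflib and pySHACL; the shape and the data are invented):

# Validate a small graph against a SHACL shape.
# Assumes rdflib and pyshacl are installed (pip install pyshacl).
from rdflib import Graph
from pyshacl import validate

shapes = Graph().parse(data="""
@prefix sh:     <http://www.w3.org/ns/shacl#> .
@prefix xsd:    <http://www.w3.org/2001/XMLSchema#> .
@prefix schema: <https://schema.org/> .
@prefix ex:     <http://example.org/> .

ex:PersonShape a sh:NodeShape ;
    sh:targetClass schema:Person ;
    sh:property [
        sh:path schema:name ;
        sh:minCount 1 ;
        sh:datatype xsd:string
    ] .
""", format="turtle")

data = Graph().parse(data="""
@prefix schema: <https://schema.org/> .
@prefix ex:     <http://example.org/> .

ex:alice a schema:Person .
""", format="turtle")

conforms, _, report = validate(data, shacl_graph=shapes)
print(conforms)   # False: ex:alice has no schema:name
print(report)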

Cheers
Hugh

> On 19 Jun 2023, at 12:08, Dan Brickley <danbri@danbri.org> wrote:
> 
> 
> 
> On Thu, 15 Jun 2023 at 00:38, Nathan Rixham <nathan@webr3.org> wrote:
> I'd argue that if schemas were used and dereferenced often, this would all have been solved many years ago.
> 
> Quite. RDF is in many ways close to something that would in other settings be called “schema-less”. By having a solid general pattern for data, and reassurances about computing with partial information, you can get many tasks done without fetching any RDFS. There is still value in *having* a design, but the design is used up front rather than consulted at runtime. OWL can also be seen from this perspective: as a kind of super-fancy “javadoc”, used to bring some order and discipline to the documentation of graph data.
> 
> RDF in the wild almost always needs tidying up, filtering, cleaning and normalizing before use in serious applications. Within those steps, tidying up a few http: -> https: mappings is amongst the most trivial of challenges.
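
To make the tidy-up point concrete, here is a minimal sketch (assuming rdflib >= 6; the graph and the choice of namespace are purely illustrative):

# Rewrite http://schema.org/ IRIs to https://schema.org/ as a cleaning pass.
from rdflib import Graph, URIRef

def to_https(term):
    # Only the schema.org namespace is rewritten; other IRIs and literals pass through.
    if isinstance(term, URIRef) and str(term).startswith("http://schema.org/"):
        return URIRef("https://" + str(term)[len("http://"):])
    return term

src = Graph().parse(data="""
@prefix schema: <http://schema.org/> .

<http://example.org/alice> a schema:Person ;
    schema:name "Alice" .
""", format="turtle")

clean = Graph()
for s, p, o in src:
    clean.add((to_https(s), to_https(p), to_https(o)))

print(clean.serialize(format="turtle"))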
> 
> The reason btw Schema.org serves the same json-ld context with/without TLS is that we never wanted our adoption of JSON-LD to accidentally turn the site into a piece of critical software infrastructure. It is now totally statically served, including that context file. If JSON-LD had a thing we could put in there to express a migration intent from http to https so that willing modern parsers could be nudged towards https triples, that could be worth exploring.
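
For comparison, the blunt version of that nudge already exists in plain JSON-LD: a context that maps terms straight to https IRIs makes term-keyed documents parse to https triples (a minimal sketch, assuming rdflib >= 6 with its bundled JSON-LD parser; the document and context here are invented, not the real schema.org context):

# Parse a JSON-LD document whose context maps terms directly to https IRIs.
from rdflib import Graph

doc = """
{
  "@context": {
    "Person": "https://schema.org/Person",
    "name": "https://schema.org/name"
  },
  "@type": "Person",
  "name": "Alice"
}
"""

g = Graph()
g.parse(data=doc, format="json-ld")
for s, p, o in g:
    print(s, p, o)   # the type and the name predicate come out as https://schema.org/... IRIs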
> 
> Cheers,
> 
> Dan
> 
> 
> 
> 
> More common schemas like RDFS, schema.org and the like would be receiving tens of thousands of requests per second, likely far higher, and would have mirrors all over the place, methods in place to treat a document from URL A as URL B, integrity checks, versions, long caches, and all the solutions widely implemented and available for things which are heavily utilized around the web.
> 
> Could you even transclude a foreign http schema, in a browser, from a document served elsewhere over https, without it being blocked or producing a load of console error messages?
> 
> The elephant in the room here is that schemas are hardly ever utilized, or dereferenced. Yes, somebody will be doing it, some of you, but it's certainly not being done at scale at the web level. If it were, this wouldn't be a discussion in 2023.
