Re: RDF finally has its long awaited Generic Client!

Kingsley,

Thanks for reviewing my comments. I mostly agree with you. It seems like
this is a "chicken and egg" problem:

The client, even for LLMs, is still the browser. Which came first: the
browser or the (application) server? Currently, LLM front ends are
sophisticated web applications with AI back ends, rendering their
(contextual) request / response cycles by traditional "browser
understandable" means, with awesome capabilities in their response
formatting and layout.

They seem to "understand" data formats (such as RDF). But what if we
tackle the problem from the "server side"? AFAIK, besides the Linked Data
principles, which rely on HTTP, and SPARQL endpoints, there is nothing
that upgrades "servers" to provide "clients" with augmented semantic
capabilities, the kind of upgraded server features that would enable
clients to be actual "semantic" browsers. Servers still just render web
applications, with REST / HTTP principles in the best case.
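
To make that baseline concrete, here is a minimal sketch (in Python, using
the requests library and the public DBpedia services purely as an
illustrative assumption) of the only two server-side mechanisms I am
referring to: dereferencing an entity URI with HTTP content negotiation,
and querying a SPARQL endpoint over HTTP. Everything beyond this point
(rendering, use cases, behaviors) is left entirely to the client:

import requests

# 1. Linked Data principle: the entity URI is dereferenceable over HTTP,
#    and content negotiation lets the client ask for RDF instead of HTML.
#    (DBpedia is assumed here only as a well-known example.)
entity = "http://dbpedia.org/resource/Berlin"
turtle = requests.get(entity, headers={"Accept": "text/turtle"})
print(turtle.text[:500])  # raw triples describing the entity

# 2. SPARQL Protocol: the query travels as an ordinary HTTP parameter and
#    the results come back as JSON; interpreting them is the client's job.
query = "SELECT ?p ?o WHERE { <http://dbpedia.org/resource/Berlin> ?p ?o } LIMIT 10"
results = requests.get(
    "https://dbpedia.org/sparql",
    params={"query": query},
    headers={"Accept": "application/sparql-results+json"},
)
for binding in results.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])

The server's entire "semantic" contribution ends with returning triples or
result bindings; nothing in either exchange tells the client how to turn
them into an application.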

In the post below I try to lay out a couple of concepts regarding what
server-side resource, addressing and linking arrangement patterns for an
augmented "Semantic Addressing" (or "Semantic Hypermedia Addressing") could
look like, bridging the gap between merely indexing data into information
at scale and adding a "knowledge" augmentation step over the relationships
between resources:

https://sebxama.blogspot.com/2025/10/semantic-hypermedia-addressing.html

Best regards,
Sebastián.


On Sat, Oct 11, 2025, 11:24 AM Kingsley Idehen <kidehen@openlinksw.com>
wrote:

> Hi Sebastian,
> On 10/8/25 7:35 PM, Sebastian Samaruga wrote:
>
> A "generic client" sounds to me like a "generic client" for any kind of
> database. It lacks what is meant by an application ("killer" or not), like
> its domain's use cases and its behaviors and flows rendered meaningfully in
> a user interface (or API). Browsers allowed us to do that, not without
> significant effort, by declaratively stating the "meaning" and "behaviors"
> of components rendered on each application's pages (flows).
>
>
> A web browser is a generic client for a collection of documents accessible
> via HTTP. As stated in the post, LLMs are an equivalent for structured data
> constructed from hyperlinks using RDF.
>
>
>
> IMHO, what we need is a framework that, for any integrated / linked source
> of (semantic) data, renders useful applications for us, translating what is
> expressed in the simple statements of the source data into a
> "representation" that allows interacting with the underlying data in a
> contextualized, use-case-driven fashion. All this by only "feeding" the
> "browser" the data and schemes to be aligned, inferring the rest by means
> of aggregation, alignment and activation of those source data / schemes.
>
>
> That’s all fine—and LLMs and what you’re seeking aren’t mutually
> exclusive. My point isn't that there’s only one kind of RDF client, despite
> my use of “the.” My fundamental point is that LLMs uniquely handle tasks
> that have challenged RDF clients for decades, thereby finally giving RDF
> "escape velocity" for even broader use, by way of their ability to handle
> the following:
>     1.    Proper use of standardized identifiers — avoiding the pitfalls
> exemplified by the infamous HttpRange-14 permathread.
>     2.    Negating confusion associated with use of hash- or slash-based
> HTTP URIs for entity naming, in line with Linked Data Principles.
>     3.    RDF visualization that actually conveys RDF’s unique value,
> rather than distorting it in ways that make its elegance appear as
> unnecessary esoterica (e.g., visualization that doesn't differentiate it
> from Labeled Property Graphs (LPGs)).
>
> These are the reasons I strongly believe that LLMs are the generic client
> for RDF, just as Mosaic and later Netscape were for HTML—unleashing a
> global Web of Documents connected via HTTP.
>
> Also, my article includes live demonstrations that back up this viewpoint.
> I’d be happy to review any live demos you have as well—no installations,
> just an HTTP URI I can click to follow my nose through your counterpoint.
>
>
>
> All this leveraging Semantic inference, heuristics (FCA: Formal Concept
> Analysis), Domain Driven Development, DCI (Data, Contexts and Interactions)
> design patterns and, of course, GenAI / LLMs.
>
> Sorry for the self-ad, but this is what I've been working on for a long
> time:
> https://sebxama.blogspot.com/2025/10/semantic-web-genai-enabled-eai.html
>
>
> See my comment above :)
>
> Kingsley
>
>
> Regards,
> Sebastián.
>
>
> On Mon, Sep 29, 2025, 1:52 PM Kingsley Idehen <kidehen@openlinksw.com>
> wrote:
>
>> Hi Everyone,
>>
>> It’s been a while!
>>
>> Something important is happening right now, thanks to the emergence of
>> LLMs as the long-awaited generic RDF client (the so-called “killer app”).
>> We all know how Mosaic → Mozilla/Netscape made HTML and HTTP globally
>> usable by end-users and developers alike. Well, the very same thing is
>> finally happening with RDF—albeit some 20+ years later than expected.
>>
>> Here’s a post I recently published on LinkedIn about this critical
>> development:
>>
>>
>> https://www.linkedin.com/pulse/large-language-models-llms-powerful-generic-rdf-clients-idehen-xwhfe
>>
>> --
>> Regards,
>>
>> Kingsley Idehen 
>> Founder & CEO
>> OpenLink Software
>> Home Page: http://www.openlinksw.com
>> Community Support: https://community.openlinksw.com
>>
>> Social Media:
>> LinkedIn: http://www.linkedin.com/in/kidehen
>> Twitter : https://twitter.com/kidehen
>>
>>
> --
> Regards,
>
> Kingsley Idehen 
> Founder & CEO
> OpenLink Software
> Home Page: http://www.openlinksw.com
> Community Support: https://community.openlinksw.com
>
> Social Media:
> LinkedIn: http://www.linkedin.com/in/kidehen
> Twitter : https://twitter.com/kidehen
>
>
