- From: Daniel Campos Ramos <danielcamposramos.68@gmail.com>
- Date: Thu, 4 Dec 2025 11:52:07 -0300
- To: public-aikr@w3.org
- Message-ID: <e1fc8038-9a17-4260-a103-0edf4c3f921b@gmail.com>
Paola,
Thanks for the follow‑up. Let me work through the misunderstandings
carefully.
1. MVCIC vs K3D
* MVCIC is a human‑in‑the‑loop orchestration method for chaining
multiple free‑tier LLMs. It lives under
docs/multi_vibe_orchestration/ as Markdown chain files, tutorials,
and logs—no compiled code, only browser‑readable instructions.
* K3D is a spatial KR architecture. That is the “House/Galaxy”
material you have seen. The two are documented separately so they
don’t get conflated.
2. Demo / TPAC timeline
* TPAC breakout 14 Nov: you confirmed we would show “the orchestration
method.” I started the live walkthrough (see recording, timestamp
where you say “OK, about time to stop… we reached our third
five‑minute slot”).
* At the following session (17 Nov) the schedule again allowed an
“open floor”; I asked the question in chat because you had already
moved on to your deck. That is why there is no second partial demo:
there was no interest in resuming it.
* The “demo” for MVCIC is literally the chain file and tutorial.
Because the entire process is manual coordination (step 1 briefing →
step 2 prompt partner → step 3 human verification → step 4 chain log
→ step 5 synthesis), a screen recording would just be me copying
text across tabs. The documentation already provides the
reproduction steps.
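For readers who prefer a concrete picture, the five manual steps above can be sketched as a minimal chain-log structure. This is purely illustrative: the class and field names below are mine, not part of the MVCIC docs, and the real process is human coordination across browser tabs, not code.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ChainStep:
    """One entry in a chain log (illustrative field names)."""
    step: int               # 1=briefing, 2=prompt partner, 3=verification, ...
    role: str               # e.g. "briefing", "prompt_partner"
    content: str            # the text the human copies between tabs
    verified: bool = False  # set by the human in the loop (step 3)

@dataclass
class ChainLog:
    """Step 4: the running record of the whole chain."""
    topic: str
    steps: List[ChainStep] = field(default_factory=list)

    def add(self, step: ChainStep) -> None:
        self.steps.append(step)

    def synthesis_ready(self) -> bool:
        # Step 5 (synthesis) should only happen once every prior
        # step has passed human verification.
        return all(s.verified for s in self.steps)

# Example: a two-step chain still awaiting verification of step 2
log = ChainLog(topic="ontology alignment")
log.add(ChainStep(step=1, role="briefing", content="...", verified=True))
log.add(ChainStep(step=2, role="prompt_partner", content="..."))
print(log.synthesis_ready())  # False until the human verifies step 2
```

The point the sketch makes is the same one the docs make: nothing advances to synthesis without explicit human verification of each link in the chain.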
3. Repository access
* Everything is Markdown and lives in GitHub’s web UI; there’s no need
to download executables. Links are in each email because you asked
for them “at the top.” Example:
https://github.com/danielcamposramos/Knowledge3D/tree/main/docs/multi_vibe_orchestration
* NotebookLM has been open and you’ve had edit access since day one;
it now carries the disclaimer you requested.
4. “Out of scope” / “produce conceptual artefacts”
* The wiki edits you requested on Nov 12 were made and cited (see AIKR
wiki history 12 Nov). They were later reverted without feedback. If
there are specific headings/labels missing, I’m happy to re‑add them
once you specify the required format (“vocab entry with definition,
use case, KR alignment,” etc.).
* MVCIC has the conceptual artefacts you’re asking for: structured
steps, KR tie‑ins, use cases, and reasoning on why human
verification matters.
* K3D’s KR vocabularies (e.g., K3D_NODE_SPEC, REALITY_ENABLER) already
map to the “blue bubbles” in your diagram—several CG members
(Milton, Dave, Tyson) have acknowledged that alignment on list.
5. “Inspired by vibe coding” vs attribution
* Your 10 Nov “Human Process…” note predates Karpathy’s 26 Nov repo,
so citing his project today as the inspiration doesn’t match the
timestamp.
* If parts of MVCIC fed that note, please add the citation; likewise,
I will keep referencing the CG materials I use (Quine PDF, ontology
diagrams, etc.). That keeps provenance clear in both directions.
6. Language / communication
* I’m communicating in English (and can switch to
Portuguese/Spanish/French if needed).
* The difficulty isn’t language; it’s that the same questions keep
getting answered and then revisited. To avoid loops, I’ll keep
replying with pointers:
* MVCIC method description (10 Nov email “Re: ANP automating the
orchestration”)
* Chain docs (URL above)
* TPAC transcript link/timecode
* Wiki change logs (12 Nov entry)
* NotebookLM link (with disclaimer)
7. Next steps
* If you still want a screen capture, I can record a short walkthrough
of the chain process; just be aware it will look exactly like what
the docs already show.
* Please let me know the exact wiki format you want (section template,
naming convention, etc.) so I can re‑add the entries in the manner
you’ll accept.
We’re all trying to do KR work that’s transparent and reproducible.
I’ll continue to keep every contribution grounded, open, cited, and
aligned to the CG scope, with no access restrictions and no
gatekeeping.
Best regards,
Daniel