Re: [community-group] [Meta] Do functions/transforms happen before tokens.json (i.e. to generate it)? Or within tokens.json? (#238)

Might have some bias as the person behind the modern [Token Operations](https://github.com/ddamato/token-operations) approach.

I'm not sure if "formulae" was carefully chosen as the term for non-static values, but I think it may leave out a feature of the spec: aliasing.

In a purely static ecosystem, a system that reads tokens can traverse to a token and know, once it arrives, that there is no further work to do; the value found there is sent upstream exactly as-is.

In an ecosystem where the thing at the end of the traversal says "there's more work to do," the system takes on some level of dynamism, even if trivial. I'd argue that aliasing a token is the simplest form of computing a token.

So in my mind, if we want to allow aliasing, then we are opening the door to other types of dynamic value generation. Meanwhile, if we are interested in static representations, then once the traversal reaches a particular token there is no more "work" for token distribution tools to do; WYSIWYG.
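To make the distinction concrete, here is a minimal sketch of a reader, assuming DTCG-style `$value` keys and `{group.token}` alias syntax. For a static value the traversal ends immediately; an alias forces the reader to do more work, which is the trivial "computation" described above:

```python
def resolve(tokens: dict, path: str):
    """Traverse to a token; if its value is an alias, there is more work to do."""
    node = tokens
    for part in path.split("."):
        node = node[part]
    value = node["$value"]
    # A purely static reader could stop here. An alias means another traversal.
    if isinstance(value, str) and value.startswith("{") and value.endswith("}"):
        return resolve(tokens, value[1:-1])  # recurse: the simplest computed token
    return value

tokens = {
    "color": {
        "primary": {"$value": "#0000ff", "$type": "color"},
        "accent": {"$value": "{color.primary}", "$type": "color"},
    }
}

resolve(tokens, "color.accent")  # follows the alias to color.primary
```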

I've spoken to @kaelig about Token Operations before, and he is admittedly not convinced that formulae are appropriate in the spec. It is truly a matter of where the responsibility lies. The question is: are we aiming to define how _people_ are meant to curate a token file, or how _systems_ are meant to read it? If this is system-based and we don't expect people to be writing these files, then I believe a static file is more appropriate. However, if we believe people will be authoring these files, formulaic niceties like aliasing and others will improve the DX of authoring.

I came to Token Operations with the assumption that people would be working in these files in lieu of waiting for tools to support the curation process. The concept of Token Operations (or other formulae) can still exist outside of the spec in order to make the process of curating thousands of token values more manageable. I find value not just in standardizing the format of the file, but also in standardizing how dynamic values could be resolved. I do recognize that this value might be out of scope for the specification.
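As a sketch of what "outside the spec" could look like: a curation step that resolves operations ahead of time and emits a purely static file for distribution tools. The `$operations` key and its shape below are hypothetical illustrations, not the actual Token Operations syntax:

```python
def bake(tokens: dict) -> dict:
    """Resolve dynamic values at curation time so the output is fully static.

    The `$operations` key here is a hypothetical stand-in for a formulaic
    layer (such as Token Operations) that lives outside the token spec.
    """
    baked = {}
    for name, token in tokens.items():
        value = token["$value"]
        for op, operand in token.get("$operations", []):
            if op == "multiply":  # hypothetical op: scale a numeric value
                value = value * operand
        # Only the resolved value survives; distribution tools see no formulae.
        baked[name] = {"$value": value, "$type": token["$type"]}
    return baked

source = {
    "space-base": {"$value": 4, "$type": "dimension"},
    "space-lg": {"$value": 4, "$type": "dimension",
                 "$operations": [("multiply", 4)]},
}

static_tokens = bake(source)  # space-lg resolves to a plain 16
```

The point of the design is that the dynamism is a pre-processing concern: the file handed to downstream tools remains WYSIWYG.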

-- 
GitHub Notification of comment by ddamato
Please view or discuss this issue at https://github.com/design-tokens/community-group/issues/238#issuecomment-2087675005 using your GitHub account


-- 
Sent via github-notify-ml as configured in https://github.com/w3c/github-notify-ml-config

Received on Tuesday, 30 April 2024 22:58:16 UTC