Re: [community-group] $type should be a required property (#139)

Perhaps the spec could be worded more clearly, but: **Every token has an unambiguous type** and tools are expressly forbidden from trying to guess a type (e.g. from the value or name).

However, the `$type` property on a token is only one of several ways of specifying what the type is. The algorithm for determining a token's type is described in the [Design Token - Type chapter](https://design-tokens.github.io/community-group/format/#type-0).

As it stands, the order of precedence is (from highest to lowest):

1. The `$type` property on the token object, if present. Otherwise...
2. If the token is a reference, the type of the token it is referencing. Otherwise...
3. The inherited type from the nearest parent group that has a `$type` property. Otherwise...
4. Whichever of the basic JSON types the token's value is
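For illustration, that order of precedence could be sketched roughly like this (a hypothetical, non-normative implementation; the `$type`/`$value` property names come from the draft spec, but the function, its signature, and the reference-detection shortcut are my own assumptions):

```python
def resolve_type(path, tree):
    """Resolve a token's type using the order of precedence above.

    `path` is a list of names leading to the token, e.g. ["colors", "primary"];
    `tree` is the parsed tokens file as nested dicts.
    """
    groups, name = path[:-1], path[-1]
    node = tree
    inherited = None
    for part in groups:
        node = node[part]
        if "$type" in node:
            inherited = node["$type"]  # nearest parent group wins
    token = node[name]
    # 1. Explicit $type on the token object itself
    if "$type" in token:
        return token["$type"]
    # 2. If the value is a reference like "{colors.primary}",
    #    use the referenced token's resolved type
    value = token["$value"]
    if isinstance(value, str) and value.startswith("{") and value.endswith("}"):
        return resolve_type(value[1:-1].split("."), tree)
    # 3. Inherited $type from the nearest parent group, if any
    if inherited is not None:
        return inherited
    # 4. Fall back to the basic JSON type of the value
    if isinstance(value, bool):  # bool before number: bool is an int in Python
        return "boolean"
    if isinstance(value, (int, float)):
        return "number"
    if isinstance(value, str):
        return "string"
    return "object" if isinstance(value, dict) else "array"

tokens = {
    "colors": {
        "$type": "color",
        "primary": {"$value": "#ff0000"},          # inherits "color" (rule 3)
        "accent": {"$value": "{colors.primary}"},  # reference -> "color" (rule 2)
    },
    "line-height": {"$value": 1.5},                # falls back to "number" (rule 4)
}
```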

If we mandated that every token object must have a `$type` property, then that would change things quite substantially. For instance, we'd need to remove the ability to have a `$type` property on a group since it would become completely useless.

Personally, I prefer to retain the ability to set a default type at the group level (yes, I'm a lazy typist! 😛) but that only makes sense if the `$type` property on tokens remains optional.
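For example, a group-level default might look like this (the group and token names here are made up for illustration):

```json
{
  "colors": {
    "$type": "color",
    "primary": { "$value": "#0066ff" },
    "secondary": { "$value": "#ff6600" }
  }
}
```

Both `primary` and `secondary` inherit the `color` type from their parent group, so neither needs its own `$type` property — but either could still set one to override the group's default.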

Btw, the last fallback case (4.) is a bit like the "default" type idea proposed in @reblim's OP. Admittedly, most of those basic types are unlikely to be useful in the context of a design system (though `number` could be useful for things like unitless line heights). However, I do believe they are worth keeping in the spec for the following reasons:

* I think having some kind of fallback or "default" type is useful to guarantee that every token always has a type - even when the author forgot to set one via the `$type` property. However, just picking one of the spec's "normal" types (`color`, `dimension`, `fontFamily`, etc.) feels arbitrary. Saying a string is a string, a number is a number etc. might not be useful but at least it's not _wrong_.
* For teams and tools that have exotic requirements which are not directly covered by the spec, these fallback types in conjunction with `$extensions` could provide a neat way to define unofficial types. E.g. Let's say you want to create "roughness" tokens for 3D graphics which define how glossy or matte a texture is. Perhaps you'd represent the value of such tokens as a number, but since there's no "roughness" type in the spec, you could define an extension to differentiate your roughness tokens from ones that are "just" numbers[^1]. If we did not have these fallback types, then you'd necessarily need to pick one of the spec's actual types and misuse it for your intended purpose. To me that feels more hacky.
* The 1st editor's draft had a concept of user-defined composite types. While this was removed in the 2nd draft in favour of a set of pre-defined composite types, I for one would like to revisit this idea in future versions of the spec. It could become an "official" way to define your own token types (as opposed to the unofficial route I outlined in the previous point) and my hunch is that having those basic JSON types available could be useful for that. I can imagine we might one day provide a way to extend or compose them into whatever you need.

Note that I think it's perfectly acceptable for tools to ignore tokens that have a type that's not relevant to them. For example, imagine I wrote a tokens file that contained a `color` token and a `boolean` token. If I loaded that file into a design tool like Figma, I'd probably expect it to make the color token available to me wherever I can set the color for something (essentially like the "color styles" Figma has today). However, the boolean token probably has no use in the context of that tool, so I think it would be fine for Figma to just ignore it.

In my view, what _every_ tool MUST be capable of is parsing the entire file and implementing that type resolution algorithm in order to determine what kinds of tokens there are. Whether it subsequently does stuff with all or just a subset of the tokens is entirely up to the tool though.

[^1]: Example of how a hypothetical "roughness" token could look (please excuse the crappy formatting, but it seems I can't do code _blocks_ in footnotes): `{ "shiny": { "$value": 0.8, "$extensions": { "com.example.custom-type": "roughness" } } }`.

-- 
GitHub Notification of comment by c1rrus
Please view or discuss this issue at https://github.com/design-tokens/community-group/issues/139#issuecomment-1166640576 using your GitHub account


-- 
Sent via github-notify-ml as configured in https://github.com/w3c/github-notify-ml-config

Received on Sunday, 26 June 2022 20:45:13 UTC