Re: [community-group] How should tools process higher fidelity values than they can handle internally? (#157)

I generally agree with everything above, and I think that lowering fidelity only for modified tokens is a good strategy.

For me this has a lot of overlap with "forward compatibility" and providing an "escape hatch" in case a tool needs to process something it wasn't designed for.

I think the principle behind this can be abstracted further and will improve the format overall.

- explicit vs. implied information
- destructuring values vs. micro syntaxes
- raw vs. encoded data
- no type overloading
- ...

_A hex color is composed of 3 or 4 numbers that have been encoded and then concatenated.
It also has an implied color space of sRGB._
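To illustrate that point, here is a minimal Python sketch that destructures the hex microsyntax into explicit raw values. The output shape and key names are hypothetical, chosen for the example; they are not part of the format:

```python
def destructure_hex_color(hex_value: str) -> dict:
    """Decode a hex color microsyntax into an explicit, destructured value.

    Makes the implied information explicit: the sRGB color space, and the
    channel numbers hidden inside the concatenated hex string.
    """
    digits = hex_value.lstrip("#")
    if len(digits) not in (6, 8):
        raise ValueError(f"unsupported hex color: {hex_value!r}")
    # Each channel is one byte, encoded as two hex digits.
    channels = [int(digits[i:i + 2], 16) / 255 for i in range(0, len(digits), 2)]
    return {
        "colorSpace": "srgb",        # explicit instead of implied
        "components": channels[:3],  # raw numbers instead of encoded text
        "alpha": channels[3] if len(channels) == 4 else 1.0,
    }
```

A tool that receives such a destructured value can round-trip or lower fidelity deliberately, instead of re-parsing a string whose meaning is only implied.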

Defining solid principles around the points above, and revisiting past choices in the format with them in mind, will make the format better.

-- 
GitHub Notification of comment by romainmenke
Please view or discuss this issue at https://github.com/design-tokens/community-group/issues/157#issuecomment-1176154524 using your GitHub account


-- 
Sent via github-notify-ml as configured in https://github.com/w3c/github-notify-ml-config

Received on Wednesday, 6 July 2022 12:21:06 UTC