2021-06-13: Updated for TS4.1+
Not truly at compile-time, no. There is a suggestion (now at microsoft/TypeScript#41160) to allow regular-expression validated string types, but it's not clear that it will ever be implemented. If you want to go to that suggestion and give it a 👍 and describe a compelling use case that isn't already listed, it couldn't hurt (but it probably won't really help either).
You could try to use template literal types for this to programmatically generate a big union which matches every acceptable string literal. This even kind of works if you only need three-ish digits:
type UCaseHexDigit = '0' | '1' | '2' | '3' | '4' | '5' | '6' | '7' |
  '8' | '9' | 'A' | 'B' | 'C' | 'D' | 'E' | 'F'
type HexDigit = UCaseHexDigit | Lowercase<UCaseHexDigit>
type ValidThreeDigitColorString = `#${HexDigit}${HexDigit}${HexDigit}`;
// type ValidThreeDigitColorString = "#000" | "#001" | "#002" | "#003" | "#004" | "#005"
// | "#006" | "#007" | "#008" | "#009" | "#00A" | "#00B" | "#00C" | "#00D" | "#00E"
// | "#00F" | "#00a" | "#00b" | "#00c" | "#00d" | "#00e"
// | ... 10626 more ... | "#fff"
but since such template literal types can only handle unions on the order of tens-of-thousands of members, this will break if you try to do this with six digits:
type ValidSixDigitColorString =
  `#${HexDigit}${HexDigit}${HexDigit}${HexDigit}${HexDigit}${HexDigit}`; // error!
// ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
// Expression produces a union type that is too complex to represent
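For what it's worth, the three-digit version really can be used as an ordinary annotation. A small self-contained sketch (the commented-out line would be a compile error):

```typescript
// Types repeated from above so this snippet stands alone
type UCaseHexDigit = '0' | '1' | '2' | '3' | '4' | '5' | '6' | '7' |
  '8' | '9' | 'A' | 'B' | 'C' | 'D' | 'E' | 'F';
type HexDigit = UCaseHexDigit | Lowercase<UCaseHexDigit>;
type ValidThreeDigitColorString = `#${HexDigit}${HexDigit}${HexDigit}`;

const ok: ValidThreeDigitColorString = "#a0F"; // okay: all three digits are hex
// const bad: ValidThreeDigitColorString = "#a0G"; // error! 'G' is not a hex digit
console.log(ok);
```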
So you'll have to use a workaround.
One workaround is to use template literal types as a generic constraint instead of making ValidColorString a concrete type. Have a type like AsValidColorString<T> that takes a string type T and checks whether it's valid. If it is, it's left alone; if not, it becomes a valid color string that is "close" to the bad one. For example:
type ToHexDigit<T extends string> = T extends HexDigit ? T : 0;
type AsValidColorString<T extends string> =
  T extends `#${infer D1}${infer D2}${infer D3}${infer D4}${infer D5}${infer D6}` ?
  `#${ToHexDigit<D1>}${ToHexDigit<D2>}${ToHexDigit<D3>}${ToHexDigit<D4>}${ToHexDigit<D5>}${ToHexDigit<D6>}` :
  T extends `#${infer D1}${infer D2}${infer D3}` ?
  `#${ToHexDigit<D1>}${ToHexDigit<D2>}${ToHexDigit<D3>}` :
  '#000'

const asTextProps = <T extends string>(
  textProps: { color: T extends AsValidColorString<T> ? T : AsValidColorString<T> }
) => textProps;
It's pretty complicated; mostly it splits apart the string T and inspects each character, converting bad ones into 0. Then, instead of annotating something as TextProps, you call asTextProps on it for validation:
const textProps = asTextProps({
  color: "#abc" // okay
})

const badTextProps = asTextProps({
  color: "#00PS1E" // error
  //     ~~~~~~~~~
  // Type '"#00PS1E"' is not assignable to type '"#00001E"'.(2322)
})
This works at compile time, but might be more trouble than it's worth.
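One caveat worth mentioning (my own observation, sketched below): the conditional-type check only operates on string *literal* types, so a color string whose type has widened to plain string will be rejected even if it's perfectly valid at runtime:

```typescript
// Definitions repeated from above so this snippet compiles on its own
type UCaseHexDigit = '0' | '1' | '2' | '3' | '4' | '5' | '6' | '7' |
  '8' | '9' | 'A' | 'B' | 'C' | 'D' | 'E' | 'F';
type HexDigit = UCaseHexDigit | Lowercase<UCaseHexDigit>;
type ToHexDigit<T extends string> = T extends HexDigit ? T : 0;
type AsValidColorString<T extends string> =
  T extends `#${infer D1}${infer D2}${infer D3}${infer D4}${infer D5}${infer D6}` ?
  `#${ToHexDigit<D1>}${ToHexDigit<D2>}${ToHexDigit<D3>}${ToHexDigit<D4>}${ToHexDigit<D5>}${ToHexDigit<D6>}` :
  T extends `#${infer D1}${infer D2}${infer D3}` ?
  `#${ToHexDigit<D1>}${ToHexDigit<D2>}${ToHexDigit<D3>}` :
  '#000';
const asTextProps = <T extends string>(
  textProps: { color: T extends AsValidColorString<T> ? T : AsValidColorString<T> }
) => textProps;

const literalProps = asTextProps({ color: "#abc" }); // okay: "#abc" is a literal type
const dynamic: string = ["#", "a", "b", "c"].join("");
// asTextProps({ color: dynamic }); // error! T infers as plain string, which fails the check
console.log(literalProps.color);
```

So if your color strings are ever built at runtime, the compile-time-only approach can't help, which is part of why the type-guard fallback below still earns its keep.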
Finally, you can fall back to the pre-TS4.1 solution: make a nominal-ish subtype of string with a user-defined type guard that narrows string values to it... and then jump through all sorts of hoops to use it:
type ValidColorString = string & { __validColorString: true };
function isValidColorString(x: string): x is ValidColorString {
  // you want hex, right? Anchored, and without the 'g' flag
  // (a global regex makes test() stateful across calls)
  return /^#[0-9a-fA-F]{3}([0-9a-fA-F]{3})?$/.test(x);
}
Usage:
// (assuming the ITextProps from your question looks something like
// interface ITextProps { color: ValidColorString })
const textProps: ITextProps = {
  color: "#abc"
}; // error, the compiler doesn't know that "#abc" is a ValidColorString

const color = "#abc";
if (isValidColorString(color)) {
  const textProps2: ITextProps = {
    color: color
  }; // okay now
} else {
  throw new Error("The world has ended");
}
The latter isn't perfect, but it at least gets you a bit closer to enforcing such constraints.
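If the separate if/else bothers you, an assertion function (available since TS 3.7) smooths things a bit. A sketch (assertValidColorString and the inline props type are names I made up, not from your question):

```typescript
type ValidColorString = string & { __validColorString: true };

// An assertion-function version of the guard; after a successful call the
// compiler treats the argument as a ValidColorString for the rest of the scope
function assertValidColorString(x: string): asserts x is ValidColorString {
  if (!/^#[0-9a-fA-F]{3}([0-9a-fA-F]{3})?$/.test(x)) {
    throw new Error(`"${x}" is not a valid color string`);
  }
}

const color = "#abc";
assertValidColorString(color); // throws at runtime if invalid
const textProps: { color: ValidColorString } = { color }; // okay, no if/else needed
console.log(textProps.color);
```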
Hope that gives you some ideas; good luck!
Playground link to code