I think the whole point of a type is to fully capture the constraints of your data, which suggests that ideally it would.
TypeScript is able to do this with strings using template literal types:
type Suit = "C" | "D" | "H" | "S";
type Rank = "A" | "2" | "3" | "4";
type Card = `${Suit}${Rank}`;
const club = "C";
const ace = "A";
const aceOfClubs: Card = `${club}${ace}`;
Even though I never declared club or ace as a Suit or Rank, TypeScript infers their literal types and accepts the Card assignment on the last line. This is a stronger form of typing than what you are settling for, where 1..100 doesn't know its own range.
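And the constraint is enforced, not just inferred: if either piece falls outside its union, the assignment is rejected. A quick sketch using the same types (badSuit is my own name, for illustration):

type Suit = "C" | "D" | "H" | "S";
type Rank = "A" | "2" | "3" | "4";
type Card = `${Suit}${Rank}`;
const badSuit = "X"; // not a member of Suit
const ace = "A";
// Error: Type '"XA"' is not assignable to type 'Card'.
const notACard: Card = `${badSuit}${ace}`;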
The difference you are referring to was already asserted in your first comment. I'm afraid whatever you were trying to add here is getting lost. Unless you were, strangely, just repeating yourself for no reason?
You'd like what? For the type checker to understand math? We know that already. You said that in your first comment. And the second one. And the third one. And now seemingly the fourth one. But let it be known, whatever new information you are trying to add here is getting lost in translation.
Range types like 1..100 are difficult to track through arithmetic operations.
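A minimal sketch of why, assuming the range is encoded as a union of numeric literals (the names Die, a, b, and sum are mine):

type Die = 1 | 2 | 3 | 4 | 5 | 6;
const a: Die = 3;
const b: Die = 4;
// The sum of two dice is always between 2 and 12, but + widens
// the result to plain number, so the range information is lost:
const sum = a + b;      // inferred as number, not 2 | 3 | ... | 12
const lost: Die = sum;  // error: 'number' is not assignable to 'Die'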