
TypeScript does the latter.

The former is difficult to track through arithmetic operations.



Typescript also does the former, albeit with less friendly syntax.

   X: 1 | 2 | 3 // …
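Spelled out as a full declaration (a minimal sketch; `X` is just an illustrative name):

```typescript
// A union of numeric literal types: only 1, 2, or 3 are assignable.
type X = 1 | 2 | 3;

let x: X = 2; // OK: 2 is a member of the union
// x = 4;     // error: Type '4' is not assignable to type 'X'

console.log(x); // 2
```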


Kinda but not really. It does handle type inference through string concatenation, but it doesn't understand math.

  type N = 1 | 2

  const x1: N = 1
  const x2: N = 1

  const sum: N = x1 + x2 // error: Type 'number' is not assignable to type 'N'
TypeScript (the version running on my box) considers this an error.
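For what it's worth, the usual workaround is a runtime guard that narrows the widened `number` back into the union. A minimal sketch (the `asN` helper is my own name, not anything built in):

```typescript
type N = 1 | 2;

const x1: N = 1;
const x2: N = 1;

// `+` widens both operands to `number`, so `x1 + x2` is typed `number`,
// which is why the direct assignment `const sum: N = x1 + x2` is an error.
// Narrowing it back to N takes a runtime check:
function asN(n: number): N {
  if (n === 1 || n === 2) return n; // narrowed to N by the condition
  throw new Error(`${n} is outside 1 | 2`);
}

const sum: N = asN(x1 + x2);
console.log(sum); // 2
```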


There is nothing to suggest that 1..100 understands math.


I think the whole point of a type that fully understands the constraints of your data is that it ideally would.

TypeScript is able to do this with strings when using template literal types.

  type Suit = "C" | "D" | "H" | "S";
  type Rank = "A" | "2" | "3" | "4";
  type Card = `${Suit}${Rank}`;

  const club = "C";
  const ace = "A";

  const aceOfClubs: Card = `${club}${ace}`;
Even though I have not declared club or ace as a Suit or Rank, it still infers that the last line is a valid Card. This is a stronger form of typing than what you are settling for, where 1..100 doesn't know its own range.
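The flip side also holds: because the template literal expression is inferred against the union, an invalid combination is rejected at compile time. A minimal sketch (the `joker` line is my own addition):

```typescript
type Suit = "C" | "D" | "H" | "S";
type Rank = "A" | "2" | "3" | "4";
type Card = `${Suit}${Rank}`;

const club = "C"; // inferred as the literal type "C"
const ace = "A";  // inferred as the literal type "A"

// Inferred as the template literal type "CA", a member of Card.
const aceOfClubs: Card = `${club}${ace}`;

// const joker = "J";
// const bad: Card = `${joker}${ace}`; // error: '"JA"' is not assignable to 'Card'

console.log(aceOfClubs); // CA
```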

This is the difference I'm referring to.


The difference you are referring to was already asserted in your first comment. I'm afraid whatever you were trying to add here is being lost. Unless you were, strangely, straight up repeating yourself for no reason?


Let me say it in fewer words:

>There is nothing to suggest that 1..100 understands math.

That's just, like, your opinion, man. I'd like it.


You'd like what? For the type checker to understand math? We know that already. You said that in your first comment. And the second one. And the third one. And now seemingly the fourth one. But let it be known, whatever new information you are trying to add here is getting lost in translation.



