Hacker News new | past | comments | ask | show | jobs | submit | dwattttt's comments | login

By the by, code blocks on here are denoted by two leading spaces on each line

  like
  this

I've tried it recently; from memory, error inference wasn't that great with it.

That's exactly what's currently being fixed before stabilizing it.

Is it any more or less amusing, or perhaps tedious, watching the first Rust Linux kernel CVE be pounced on as evidence that "problems .. did not magically disappear"?

Does anyone involved in any of this work believe that a CVE in an unsafe block could not happen?


What should a type checker say about this code?

  x = []
  x.append(1)
  x[0] = "new"
  x[0] + "oops"

It's optionally typed, but I would credit both "type checks correctly" and "can't assign 'new' over a number" as valid type checker results.
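For reference, CPython itself accepts the snippet at runtime; a runnable sketch (the inline comments are my reading of what a checker might infer, not part of the original):

```python
# The snippet above, run under CPython's dynamic semantics: it succeeds,
# because x[0] holds the str "new" by the time the concatenation happens.
x = []
x.append(1)   # a checker might infer list[int] at this point
x[0] = "new"  # a checker holding list[int] would flag this assignment
result = x[0] + "oops"
print(result)  # newoops
```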

TypeScript widens the type of `x` to allow `number | string`; there are no type errors below:

    const x = []
    x.push(1)
    type t = typeof x
    //   ^? type t = number[]
    x[0] = "new"
    type t2 = typeof x
    //   ^? type t2 = (number | string)[]
    const y = x[0] + "oops"
    //    ^? const y: string
https://www.typescriptlang.org/play/?#code/GYVwdgxgLglg9mABA...


It depends on the semantics the language specifies. Whether or not the annotations are optional is irrelevant.

Either way, you didn't annotate the code so it's kind of pointless to discuss.

Also, fwiw, Python is typed regardless of the annotations; types are not optional in any sense, unless you're using BCPL or Forth or something like that.


> Either way, you didn't annotate the code so it's kind of pointless to discuss.

There are several literals in that code snippet; I could annotate them with their types, and this code would still be exactly as it is. You asked why there are competing type checkers, and the fact that the language is only optionally typed means ambiguity like that example exists, and should be a warning/bug/allowed; choose the type checker that most closely matches the semantics you want to impose.


> There are several literals in that code snippet; I could annotate them with their types, and this code would still be exactly as it is.

Well, no, there is one literal that has an ambiguous type, and if you annotated its type, it would resolve entirely the question of what a typechecker should say; literally the entire reason it is an open question is because that one literal is not annotated.


True, you could annotate 3 of the 4 literals in this without annotating the list, which is ambiguous. In the absence of an explicit annotation (because those are optional), type checkers are left to guess intent to determine whether you wanted a list[Any] or a list[int | str], or whether you wanted a list[int] or a list[str].

Right. And the fact that python doesn't specify the semantics of its type annotations is a super interesting experiment.

Optimally, this will result in a democratic consensus of semantics.

Pessimistically, this will result in dialects of semantics that result in dialects of runtime languages as folks adopt type checkers.


> And the fact that python doesn't specify the semantics of its type annotations is a super interesting experiment.

That hasn't been a fact for quite a while. Now, it does specify the semantics of its type annotations. It didn't when annotations were first created for Python 3.0 (PEP 3107), but it has progressively since, starting with Python 3.5 (PEP 484) and continuing through several subsequent PEPs, including the creation of the Python Typing Council (PEP 729).


So why do the type checkers differ in behavior?

The existence of a specification does not make all things striving to implement it compliant with the spec. As the history of web standards (especially back when there were more browsers and the specs weren't entirely controlled by the people making them) illustrates.

> I could annotate them with their types, and this code would still be exactly as it is.

Well, no, you didn't. Because it's not clear whether the list is a list of values of a single type or a list of values of distinct types. And there are many other ways you could quibble with this statement.


That's why I hexify all binary files, to make it easier to understand them.

You're thinking of cross platform codebases. There's nothing about cross compilation that stops the toolchain from knowing what APIs are present & not present on a target system.

Cross compilation and cross platform are synonymous in compiled languages, in regard to many of the issues one needs to care about.

Cross platform goes further in regard to UI, directory locations, user interactions, ...

Yeah, if you happen to have systemd Linux libraries on macOS to facilitate cross compilation for a compatible GNU/Linux system, then it works; that is how embedded development has worked for ages.

What doesn't work is pretending that isn't something to care about.


> Cross compilation and cross platform are synonymous in compiled languages

Err, no. Cross-platform means the code can be compiled natively on each platform. Cross-compilation is when you compile the binaries on one platform for a different platform.


Not at all, cross platform means executing the same application in many platforms, regardless of the hardware and OS specific features of each platform.

Cross-compilation is useless if you don't actually get to execute the created binaries on the target platform.

Now, how do you intend to compile from GNU/Linux to z/OS, so that we can execute the binary the C compiler generates (ingesting code written on the GNU/Linux platform) in the z/OS Language Environment, inside an enclave not configured in POSIX mode?

Instead of z/OS, if you're feeling more modern, it can be a UWP sandboxed application with identity on Windows.


> cross platform means executing the same application in many platforms, regardless of the hardware and OS specific features of each platform.

That is a better definition yes. But it's still not synonymous with cross-compilation, obviously. Most cross-platform apps are not cross-compiled because it's usually such a pain.


You just need a compiler & linker that understand the target + image format, and a sysroot for the target. I've cross compiled from Linux x86 clang/lld to macOS arm64, all it took was the target SDK & a couple of env vars.

Clang knows C, lld knows macho, and the SDK knows the target libraries.
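As a sketch of that setup (the SDK path is hypothetical, and you'd need clang, lld, and an extracted macOS SDK on the Linux host):

```shell
# Cross compile a C file from Linux x86-64 to macOS arm64.
# SDKROOT is a hypothetical path to an extracted macOS SDK;
# clang knows the target triple, lld knows Mach-O, and the SDK
# supplies the target headers and libraries.
export SDKROOT=/opt/sdks/MacOSX.sdk

clang --target=arm64-apple-macos12 \
      -isysroot "$SDKROOT" \
      -fuse-ld=lld \
      -o hello hello.c
```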


"A language that doesn't affect the way you think about programming is not worth knowing." ― Alan J. Perlis

Learning Rust definitely made me a better C programmer.


> The Python devs, by contrast, have actually put in a lot of work over the years to support a plethora of operating system

This pre-pep is proposed by core Python devs. The discussion includes comparing the supported targets of both Python and Rust, the people who are left worse off are devs external to both groups, who've made Python work on unsupported targets (e.g. Gentoo, who are represented in the thread).


> "if I spent the time, risk, effort, and money to develop the pre-eminent protocol and hardware used by most TV's in the world... would I want to give that work away for free?"

This is absolutely fine. But it should preclude them from becoming a public standard.


> This is absolutely fine. But it should preclude them from becoming a public standard.

Define "public standard". And how is HDMI one of them?

HDMI is a private bundle of IP that the license holders are free to give (or not give) to anyone. We're not talking about a statute by a government 'of the people' that should be public. No one is mandated by any government to implement it AFAICT; and even if they were, it would be up to the government to make sure laws only reference publicly available documents.


Devil's Advocate time. Would the result of that be better or worse quality public standards?

(I don't actually know what I think off the cuff - but it's the obvious follow-on question to your statement, and I don't think your statement can stand on its own without a well argued counter)


It's a fine question. I think the onus is on public regulatory bodies responsible for the standards; if they aren't able to pay for the work to be published as an open standard, it wasn't worth the cost.

Standards also benefit the industry as a whole, and it's generally in the interests of the companies involved to participate in the standardisation process anyway. Charging for the description of them is just a cherry on top (compared to e.g. licensing any relevant patents); I don't believe it's at all required to incentivise a standardisation process.

(this is of course looking at interoperation standards - regulatory bodies are going to be more concerned with e.g. safety standards)

