
There are a lot of things here, some about old-school type systems (Java, C), others about modern ones. I don't think most are fundamental, even though some are common experiences today.

#1 is fundamental. (Yet people somehow live with the JS ecosystem, which is slower than GHCi.) It should keep becoming a smaller problem, since computers keep getting faster; but I don't think we've pushed everything we can into types yet, so I expect it to get worse in the near future.

#2 and #3 are about old-school types.

#4 Oh, yeah, they can. But they can also help a lot with team coordination. Powerful tools enable you either way; whether you harm yourself or take advantage of them is your choice.

#5 Failures in type systems encourage code generation. Expect that to always improve, but always slowly.

#6 That's why there's always a parsing stage between input and processing. You deal with input errors at the parsing stage, and processing errors at the processing stage. Most communities of dynamic and old-school languages do the industry a great disservice by mixing those; they explode error handling into something intractably complex.
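To make that concrete, here's a minimal Haskell sketch of the idea (the names are made up for illustration): input errors are handled once, in the parsing function, and the processing code receives a type that can only hold valid data.

    import Text.Read (readMaybe)

    -- Age values are only built through parseAge, so downstream code can
    -- take their validity for granted.
    newtype Age = Age Int

    parseAge :: String -> Either String Age
    parseAge s = case readMaybe s of
      Just n | n >= 0 && n < 150 -> Right (Age n)
      _                          -> Left ("invalid age: " ++ s)

    -- Processing never re-checks the input; the type already rules out
    -- the bad cases.
    canVote :: Age -> Bool
    canVote (Age n) = n >= 18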

#7 Hum... you're holding it wrong. Don't state the variants in your types. Instead, use the type system to get every invariant out of the way, so the variants stand clear. (And yeah, there are plenty of libraries and frameworks out there that try to encode the environment into types. That deeply annoys me... But anyway, if you do that, treat the types as requirements on the environment, not as a description of it. Those are different in very subtle ways.)
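A small, hedged example of what I mean, in Haskell, using Data.List.NonEmpty from base: once the "at least one element" invariant lives in the type, the processing code carries no error path for the empty case, so what actually varies stands out.

    import qualified Data.List.NonEmpty as NE

    -- "There is at least one reading" is an invariant carried by the type,
    -- so this function needs no error path for the empty case.
    average :: NE.NonEmpty Double -> Double
    average xs = sum xs / fromIntegral (NE.length xs)

    -- The emptiness check happens once, at the boundary.
    readings :: [Double] -> Maybe (NE.NonEmpty Double)
    readings = NE.nonEmpty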

#8 This shouldn't be fundamental. AFAIK there are not many people trying this, and the few who do face a Sisyphean task of keeping their code up to date with mainstream changes. I do hope people make progress here, but I'm not optimistic.



> Yet people somehow live with the JS ecosystem that's slower than GHCi.

I think a lot of people, including the parent, equate ecosystem and iteration speed with web development and instant reload of web pages. When other systems allow fast iteration, it goes unnoticed unless it's for web dev. Luckily, a bunch of those 'impossible' systems have it now too, like [0].

[0] https://ihp.digitallyinduced.com/blog/2020-08-10-ihp-live-re...


Web development is an example. Fast iteration is the thing I like. I have so far associated fast iteration with dynamic languages, and in my experience most fast-iteration systems are indeed dynamic.

But maybe that simply reflects a concern of the relevant communities. If strongly typed languages start adding fast-iteration approaches and can achieve a similar level of quick iteration, that will definitely address one of the things I dislike about them. I haven't written significant amounts of Haskell since 2000, but back then what you could do interactively was very restricted.

At the end of the day, the compiler is doing a bunch more stuff in strongly typed languages. It's like taking a bunch of your verification infrastructure and saying 'these must run before you're allowed to see the result of what you wrote'. It will necessarily be slower, although with work maybe it won't be so much slower that it matters.


> Fast iteration is the thing that I like

> Haskell since 2000,

Things have changed a lot in 20 years.

Thanks for noting something about your age; I have always been a bit ageist about 'fast iteration', as I've never met someone close to my age (I've been devving professionally for 30 years this year) who cares much about it. I am not a very good programmer, but I am a very experienced one, and I'm consistently faster at delivering than my 'fast iterating' younger peers: I simply know what I'm going to type beforehand, I don't need many iterations to get it right, and I have enough experience to know that I'm close to what we need once it compiles. The people who just type/run 1000 times a minute get stuff done, but it's not the way I would ever like (or have liked) to work.

> It will necessarily be slower,

GHCi is fast, but other avenues can be explored as well, like creating a real interpreter just for development, as Miri is for Rust. Purely for faster iteration on logic you forgo some of the type benefits, but when you are done iterating, you compile and voilà. I guess the merging of incremental compilation, JITs, interpreters, etc. will evolve into something that might not run optimally but gives blazingly fast iteration, up to perfect performance after deployment. And anything in between.
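As a rough sketch of that loop (module and function names invented here), the interpreter-first workflow in GHCi already looks something like this: iterate in the interpreter, then compile when you're done.

    $ ghci Invoice.hs
    ghci> totalDue sampleInvoice
    42.5
    ghci> -- edit Invoice.hs in your editor, then reload and re-run:
    ghci> :reload
    ghci> totalDue sampleInvoice
    45.0
    ghci> :quit
    $ ghc -O2 Invoice.hs   # done iterating: build the optimised binary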


> #2 and #3 are about old-school types.

There absolutely are approaches to this that don't fall foul of my complaints, but when you say 'old-school types' I think you're talking about Java and non-inferred types.

I was including other, more modern languages in my criticism. Scala, for example, ends up with pretty hairy types very quickly for higher-level code. So much so that they made the documentation system lie about the types to make them easier to understand.

And most currently popular languages don't give you runtime access to types or let you treat them as first class.

The languages that let you manipulate types in the same language you write code in are not remotely mainstream. So unless by 'old school' you mean all mainstream languages, I disagree.


Yes, I meant type systems like Java's.

I was thinking about unusable code, caused by having to write far more down in brittle types than you ever save on coding. There are indeed problems with complex types.



