Type Systems Explained with Examples (thevaluable.dev)
114 points by thanato0s on Sept 28, 2020 | hide | past | favorite | 31 comments


I would pay good money for a comparison of type systems, delivered in the vein of Cow Economics.

In Self, you have a cow. You ask the cow to moo. It moos.

In Java, you have a cow. You construct a CowMooFactory to create a MooImpl. The MooImpl moos.

In Python, you have a cow. You ask it to moo. It quacks.

In C, you have a pointer to a cow. You ask it to moo. Segmentation fault (core dumped).


In Haskell mooing is an impure side effect. It has to be handled with a special design pattern - which recursively generates further side effects, including an infinite number of online tutorials which fail to explain what it does.

This pattern is called a Moonad.



In ML, your value moos. It is inferred to be a cow.

In Ruby, your value moos. Could be a cow; might be a MooDuck. Do you really care?


In Assembly, there is no cow, no mooing, there's a bunch of places where parts of cows and moos may or may not be stored. It's up to you to pick the parts of the cows and moos in the correct order, then send the numbers representing the places where the cow and moo parts are stored to a magic box that interprets your sequence of numbers and puts the cow into the correct order after which you jump over to the cow and tickle its belly until it moos.


In ML, it's inferred if it might, in any potential future, moo.


So what's the difference between Ruby and Python?


During Ruby's runtime, the cow might get monkeypatched into a bat.
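The same runtime flexibility exists in Python, for comparison. A rough sketch with a hypothetical Cow class (names are made up for illustration):

```python
class Cow:
    def speak(self):
        return "moo"

cow = Cow()
print(cow.speak())  # moo

# Monkeypatch the class at runtime: every Cow, including existing
# instances, now effectively behaves like a bat.
Cow.speak = lambda self: "screech"
print(cow.speak())  # screech
```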


In Ruby, the cow moos automatically.


About 3 seconds, depending on how many cows you have. ;)


In Swift, if cow is a reference type, you all share one cow that moos for whoever asks; but if cow is a value type, then you all have your own copy of a cow that only moos for you.
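Python only has reference semantics, but you can simulate the value-type behavior with an explicit copy. A minimal sketch, assuming a hypothetical Cow class (not from any real library):

```python
import copy

class Cow:
    def __init__(self):
        self.moos = 0
    def moo(self):
        self.moos += 1

# Reference semantics: both names share one cow.
shared = Cow()
alias = shared
alias.moo()
print(shared.moos)  # 1 -- the alias's moo is visible to everyone

# Value semantics (simulated): each name gets its own copy.
original = Cow()
mine = copy.copy(original)
mine.moo()
print(original.moos)  # 0 -- my copy's moo is mine alone
```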


That doesn't make any sense.


Didn't think Python had duck typing...


What do you think "duck typing" is? Historically, it feels like I've mostly heard the term applied to Python.

Edited to add: Please don't take my question as rhetorical attack - it was intended as inquiry. There may be different definitions floating around.


Well, there we are. I don't know Python so, both jokingly and literally, I didn't know Python had duck typing.
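For concreteness: "duck typing" in Python means a function accepts any object that has the right methods, with no declared interface. A small sketch with hypothetical Cow and Duck classes:

```python
class Cow:
    def speak(self):
        return "moo"

class Duck:
    def speak(self):
        return "quack"

def make_noise(animal):
    # No type check, no interface declaration: if it has .speak(), it works.
    return animal.speak()

print(make_noise(Cow()))   # moo
print(make_noise(Duck()))  # quack
```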


In Rust, you have a Cow. It's smart and allows you to clone borrowed things, including things which can moo.


But only clones on write, which is a useful optimization.
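A rough Python analogue of the clone-on-write idea, just to illustrate the optimization (this wrapper class is hypothetical, not Rust's actual `std::borrow::Cow` API):

```python
import copy

class ClonedOnWrite:
    """Share a borrowed value until someone writes;
    then clone it so the original stays untouched."""
    def __init__(self, borrowed):
        self._value = borrowed
        self._owned = False

    def read(self):
        return self._value

    def mutate(self, mutator):
        if not self._owned:
            self._value = copy.deepcopy(self._value)  # clone on first write
            self._owned = True
        mutator(self._value)

herd = ["cow", "cow"]
cow = ClonedOnWrite(herd)
print(cow.read() is herd)              # True -- no clone yet, still borrowing
cow.mutate(lambda v: v.append("bat"))
print(cow.read() is herd)              # False -- cloned on first write
print(herd)                            # ['cow', 'cow'] -- original untouched
```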


If anyone is interested in type systems and what a type is, I highly recommend watching the first 20 minutes of Sean Parent's presentation:

https://youtu.be/sWgDk-o-6ZE

It will change the way you think about types.


I watched it, very much looking for some illumination. But it was just an exhaustive and formal definition, not giving any insight (and not appearing to attempt to).

Maybe I missed something, or maybe I need to watch more. Can you explain why you recommend this segment in more detail?


That video is titled

  CppCon 2015: Sean Parent "Better Code: Data Structures"
Did you mean instead this video? https://www.youtube.com/watch?v=mYrbivnruYw

  Goals for Better Code - Implement Complete Types


Having watched both, I believe the parent comment meant the talk they linked. The first 20 minutes has some talk about representation and type and meaning.

Both talks are interesting. Both are from a very C++ perspective, particularly the one you suggest. That doesn't make them wrong, by any means. Much of it generalizes, although depending on your background it might take varying amounts of effort to understand the generalizations.

In the first talk in particular, I think he conflates (in a small way) type with representation in a way that's... not... ideal? But I don't think it does a lot of harm to the talk; he has very interesting things to say particularly about representation.

Some of the point of both talks overlaps with the common admonishment to "make illegal states unrepresentable", but I don't think either fully implies the other.

For a very different perspective on types, I highly recommend watching https://www.youtube.com/watch?v=3U3lV5VPmOU, although I don't highly recommend actually programming this way.


Having watched the first 20 minutes of the linked video, I'm sure that's what OP wanted to share, but I got next to nothing from it.

The entire talk is about memory layouts and how nothing has a meaning unless we ascribe a meaning to it.

I don't really know what this has to do with types, or type systems, which are an algebraic concern, as far as I'm concerned.


I ended up watching the two videos from Sean Parent mentioned here, and they are great! His explanations are simple but touch on important considerations we should have about types, especially when we begin to create an ADT. Thanks a lot for that!


yes, this video is excellent


I suspect the use of “semantic” where it should be “semantics” is explained by the French author translating “sémantique” a little too directly.


Why does English have all those random "s" endings on Greek-borrowed words? AFAIK, most other languages that borrowed the same words don't have them.


Found an interesting paper on this!

https://www.tandfonline.com/doi/pdf/10.1080/00437956.2007.11...

"The morphology of English words like dependence, linguistics, and news suggests that English has an /-s/ suffix forming certain kinds of abstract nouns not used as plurals. While dependence and dependents are homophones, the former takes singular verb agreement, as do linguistics and news, and the latter takes plural. We argue that English has a highly productive derivational suffix /-s/ that creates abstract nouns from adjectives, dependence from dependent + /-s/, linguistics from linguistic + /-s/, and news from new + /-s/. While this suffix originated in the Latin present participle, the sources of a modern /-s/ came into English through extensive borrowing from Latin and French as well as through loan translation and borrowing of Greek words, but then it combined with existing English /-s/ that marked plural or genitive. Orthographic, phonological, morphological, and semantic evidence suggests that this modern /-s/ arose from several distinct sources that came together as a single suffix in the early 17th century."


That's interesting... German still has the genitive case, and it is formed with an "s" too. That's a nice resource, thanks for that!


Nice! I indeed didn't know that. Thanks for that!


> or to perform “impossible” operations, like division by 1.

This looks like a mistake.


It is, and I fixed it yesterday. Of course I didn't push it... Thanks!



