> whenever something APL related comes up on Hacker News, it's always in the form of arcana you'll laugh at rather than fascinating work being done by very smart people. I find that offensive, but I'm not going to push back on that. I don't talk on Hacker News. But I've seen a lot of interesting things being posted about it, and the comments are almost universally, like: "what the hell is that for?". And not understanding that that language was once the best interactive language out there, and in many ways still is. And there's some phenomenal stuff being done with it. But it's hived off somehow from most of the rest of the computing world in a way that, for instance, Lisp (which is, I think, an almost contemporaneous language; Lisp was a little earlier, but similar) ... Lisp is sort of part of the universe of programming that people understand. They'll laugh at it, but they understand it, and they know what to do with it. And APL is something altogether different. I think that's a shame.
I will admit my interest in learning APL has been piqued. Though I do find it odd that they never once mention the most popular array programming language of all time, MATLAB.
Sure, you can learn to read and write what superficially looks like line noise, but the market has spoken: very few people want to.
Look at the success of Tensorflow and PyTorch. They’re also tensor / array programming systems. They’re wildly popular, with at least two orders of magnitude more users each than all array languages put together.
The difference is that they’re built on top of Python, which is famously user-friendly in its syntax.
More importantly: they were parallel and GPU-accelerated from the beginning.
What’s the point of an array-based language if it’s slower and harder to write than the equivalent in mainstream languages!?
Their current niche of “quants analysing time-series data” is tiny and shrinking all the time.
There was even a chap here advocating the speed and terseness of his preferred array-based language.
Meanwhile the equivalent in Rust was something like a thousand times faster and not much longer!
Array-based languages have been infected by a particular style largely unique to certain types of mathematicians: brevity over clarity, obscure syntax over English, idioms instead of identifiers, etc…
They’ll never be popular while they remain purposefully niche, appealing only to the type of developer that uses single-character file names: https://news.ycombinator.com/item?id=31363844
> What’s the point of an array-based language if it’s slower and harder to write than the equivalent in mainstream languages!?
How many times would you throw away a 1000 line program and start over with completely different data structures and algorithms, just to try a different approach? What about a 10 line program? Or a 1-line program?
When it's basically free to rewrite your entire program, you tend to explore more. That's one benefit terseness affords you.
With autocomplete I rarely type more than a character or two of any single thing. I doubt it takes me much longer to compose a traditional program than the equivalent in array-language syntax, given modern tooling.
The (human) memory footprint of the more verbose program is much larger though, requiring it to be thought of in smaller chunks, and thus in a slower manner. You can't keep the mental model of a 1000 line program entirely in your memory, but a 10 line program is different, even if the individual lines are more dense.
Indeed it does!
Not to try to be confrontational, but have you written any substantial programs in an array language like APL? I'm sure that any APL programmer will be the first to tell you that writing APL would be unbearable if those "unreadable" symbols were replaced with names! Why? Because in APL, each symbol is a unit of meaning, and there's simply no reason for each unit of meaning to be more than a single character. Why should I type `add folded divided_by count` when I can just write `+/÷≢`?
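For contrast, here's roughly what those four symbols expand to in a conventional language. This is a Go sketch of my own, not from any APL source; the function name is invented:

```go
package main

import "fmt"

// mean spells out the APL train +/÷≢:
// sum-reduce (+/), divided (÷) by the tally (≢).
func mean(xs []float64) float64 {
	sum := 0.0
	for _, x := range xs {
		sum += x
	}
	return sum / float64(len(xs))
}

func main() {
	fmt.Println(mean([]float64{1, 2, 3, 4})) // 2.5
}
```

Each unit of meaning that APL writes as a single character becomes a line or more here.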
> Sure, you can learn to read and write what superficially looks like line noise, but the market has spoken: very few people want to.
True! And it will surely always be true! But no one would want to write APL without the symbols. ("But isn't Tensorflow just like APL without the symbols?" No. Tensorflow is based on the array paradigm, but it is very, very different from an array __language__ like APL.)
> obscure syntax over English
This is effectively like berating the Chinese for inventing a writing system that looks nothing like the Latin script. Is it totally different? Yes. Does that make it inherently bad? No. (Can it still be bad? Yes! But I don't think APL and friends are as bad as people might think.)
Anyway, that's my two cents: the obscure syntax of array languages is a tool, not a problem. It will always limit the userbase, there's no doubt about that, but I couldn't imagine a world without them.
Probably because MATLAB is basically an imperative scripting language with a bunch of array features bolted on. It's been pretty successful for numerous reasons, although there's definitely a lot I dislike, like the crazy high costs (you pay $$$ for the language environment, $$$ for the libraries you want, and $$$ if you want to create a standalone executable). However, it's worth it to a lot of scientists for valid reasons.
APL is a lot cooler as a language, but it's missing a lot of built-ins that MATLAB has for numerical computing. The APL community would just tell you to write a few lines of code to implement what you need rather than call out to a massive black-box function. That's certainly a better approach if you have the time/ability, but I often don't. More recently, Dyalog APL and kdb+ (the two big commercial APL or APL-derived languages) have added support for calling Python libraries if you need that.
I wish the licensing of APLs were more free. Yes, GNU APL exists and related languages like J are free. However, if you want the APL symbol set your choices are few: GNU APL, Dyalog, or one-man hobby projects. As far as I can tell, GNU APL is more of a historical time capsule than a practical development language.
The array language community seems friendly and exceedingly competent. It also appears to have a strong "get it done" attitude which prioritizes engineering over freedom. The community is tightly entwined with proprietary software.
I can't bring myself to invest the time to meaningfully learn APL because it's hard to see it as a real investment—the community doesn't own its contributions. It looks like trading dollars for arcade tokens.
While I am the author of one of those one-man projects, I would like to point your attention towards BQN. It started as a one-man project, but it isn't anymore.
As much as I would like people to look at my implementation, if you want to play around with something more complete that also uses symbols the way APL does, then BQN is the one to look at.
You know, there's a reason why there's so many "one-man hobby project" implementations of, say, APL. It's actually not that hard! If you're willing to spend a week (yes, a week), you can have your own barebones implementation of APL. And from there, it's really sort of your playground, which is neat. Thought of a cool idea? Implement it!
Obviously, most people won't want to do this, investing time (even if it's not that much) in something they don't have any prior experience with. But it's an easier option than you might expect.
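To give a feel for how small the core is, here's a toy sketch (mine, not taken from any real implementation) of the heart of such an interpreter: APL's right-to-left evaluation of dyadic verbs over number vectors, in about forty lines of Go:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

type val []float64

// apply evaluates one dyadic verb with APL-style scalar extension:
// a length-1 operand is paired with every element of the other.
// Unrecognized verbs just yield zeros in this toy.
func apply(op string, a, b val) val {
	n := len(a)
	if len(b) > n {
		n = len(b)
	}
	out := make(val, n)
	for i := range out {
		x, y := a[i%len(a)], b[i%len(b)]
		switch op {
		case "+":
			out[i] = x + y
		case "-":
			out[i] = x - y
		case "×":
			out[i] = x * y
		case "÷":
			out[i] = x / y
		}
	}
	return out
}

// eval implements APL's right-to-left grammar with no operator
// precedence: read a vector of leading numbers, then everything to
// the right of the next verb is evaluated first as its right argument.
func eval(tokens []string) val {
	var left val
	i := 0
	for ; i < len(tokens); i++ {
		f, err := strconv.ParseFloat(tokens[i], 64)
		if err != nil {
			break
		}
		left = append(left, f)
	}
	if i == len(tokens) {
		return left
	}
	return apply(tokens[i], left, eval(tokens[i+1:]))
}

func main() {
	// No precedence: 2 × 3 + 4 is 2 × (3 + 4) = 14, just like APL.
	fmt.Println(eval(strings.Fields("2 × 3 + 4")))     // [14]
	fmt.Println(eval(strings.Fields("1 2 3 + 10 20 30"))) // [11 22 33]
}
```

Everything else, more verbs, operators like reduce, multidimensional arrays, layers on top of this skeleton.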
Shouldn't hurt most programmers to look at problems from a different angle. Or at least allow their brains to re-wire into considering other programming styles.
I listened to this. He talks about language diversity, but I never heard Rob Pike mention anything about Haskell or ML-style languages, or give his opinions on them. He never commented on functional programming languages; the closest he gets is mentioning Lisp.
The design of Go feels almost as if he doesn't even know about those ML languages, and as if he doesn't like FP. Of course he probably does know about ML-style languages. But I would be very interested to hear his take on FP, because I couldn't find anything via Google. Anybody know of any links to writings/videos where he elucidates his viewpoints?
Previously: "[Pike is] hardly the first hard-core hacker to be ignorant of the degree to which type theory has seen dramatic advances since the 1980s."
It's a comment on this quote from Pike:
https://news.ycombinator.com/item?id=6821389
I think Pike definitely had not, at that point, explored the way types work in ML-style languages.
I remember this quote. The context was actually a reference to OOP and design patterns. He started talking about nomenclature and referred to OOP design patterns as just spending time creating taxonomy.
FP wasn't really referenced at all in that quote. FP definitely has its own vocabulary, but nowhere near as extensive as OOP's.
Well, Lisp is just as functional as ML-family languages like OCaml and F#. Haskell's typed/pure FP is a branch off of this. Surely between Pike, Griesemer, and Thompson someone had some ML experience, but this doesn't matter much for the design of Go. More important is that newcomers can't be expected to have ML experience, and a central goal of Go is to be quickly accessible to new programmers. And there was also a focus on fast compile times, fast execution, and low-level control, which are very hard to achieve with immutable data structures. Overall Go's design is very restrained, which I think shows a lot of wisdom on the part of the authors. ML is not the only language slighted: as Rob says, the only bit of APL that made it in was the name "iota"!
I think it'd be a bit of a shame if everyone were pushed to be a polyglot with a finger in every currently popular paradigm. Connections between different approaches to programming are very useful, but the sort of effort made in Go, to refine and simplify one imperative/OO approach, also helps push our understanding of programming into new territory.
Maybe so. I guess you're talking about option types, although it's not obvious to me that these do any better given the requirement that errors are always explicitly shown in the code. So maybe your problem is with that requirement instead. But why are you listening to a podcast in the hopes the guest will admit his ignorance and tell you something you already know, instead of to learn new things?
I'm talking about a more general concept: a kind of type that can be either one thing or another. For example, an Int or an Error.
You have product types, which are two things at the same time (an int and an error), and sum types, which are one or the other (an int or an error).
To illustrate, say I have two types that each consist of a small finite set of values, type A and type B.
A = 1 | 2 (cardinality = 2, A can be a 1 or a 2)
and another type that consists of 3 values:
B = '1 | '2 | '3 (cardinality = 3, B can be a '1, '2 or '3)
A product type is like a tuple or a struct containing both A and B: (A, B). The cardinality of (A, B) is the product of the cardinalities of the individual types: 2 * 3 = 6
(1,'1), (1,'2), (1,'3), (2,'1), (2,'2), (2,'3)
In other words, cardinality is the total number of possible values that can be represented by the type.
A sum type is a type that consists of EITHER A or B: (A | B). The value can be one or the other. The cardinality becomes the sum of the cardinalities of the original types: 2 + 3 = 5. In this case the sum type A | B can be one of these values:
1, 2, '1, '2, '3
Go is missing the sum type. It's like the world of math with only multiplication and no addition. You are missing a critical piece of programming.
An option type is simply a sum type with two possible types: (Any | None).
But it goes far beyond just Options.
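To make the gap concrete: the standard Go workaround is a "sealed" interface whose implementations play the role of variants. A minimal sketch of my own (the type names are invented for illustration):

```go
package main

import "fmt"

// IntOrError stands in for the sum type (Int | Error).
// The unexported marker method "seals" the interface to this package.
type IntOrError interface{ isIntOrError() }

type Int struct{ Value int }
type Err struct{ Message string }

func (Int) isIntOrError() {}
func (Err) isIntOrError() {}

func describe(v IntOrError) string {
	switch v := v.(type) {
	case Int:
		return fmt.Sprintf("int: %d", v.Value)
	case Err:
		return fmt.Sprintf("error: %s", v.Message)
	default:
		// A true sum type would make this branch impossible;
		// Go's compiler cannot verify the switch is exhaustive.
		return "unknown variant"
	}
}

func main() {
	fmt.Println(describe(Int{Value: 42}))
	fmt.Println(describe(Err{Message: "boom"}))
}
```

The workaround mimics the shape of a sum type, but you lose the compiler-checked guarantee that every variant is handled, which is the part that matters.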
For example, JSON is not definable as a type in Go, not without some really awkward stuff (aka reflection, lol). You can define it in almost every other modern language.
You can define a type completely isomorphic to JSON in almost every language. You can't do that at all in Go. Literally, this popular data format cannot be defined in Go. How the heck is that supposed to be "practical"?
What does Go do when it comes to parsing JSON? I've looked at it, and it's some of the ugliest code I've ever seen. But that's another deep dive.
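For reference, this is roughly what decoding arbitrary JSON with the standard library looks like; without a sum type for "string | number | bool | null | array | object", `encoding/json` falls back to `any` plus runtime assertions:

```go
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// Unmarshal into any: the decoder produces map[string]any, []any,
	// float64, string, bool, or nil — a sum type in all but name.
	var v any
	if err := json.Unmarshal([]byte(`{"xs": [1, 2, 3]}`), &v); err != nil {
		panic(err)
	}
	// Every access is a runtime type assertion instead of a
	// compiler-checked case analysis.
	obj := v.(map[string]any)
	xs := obj["xs"].([]any)
	fmt.Println(xs[0].(float64) + xs[1].(float64)) // 3
}
```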
>But why are you listening to a podcast in the hopes the guest will admit his ignorance and tell you something you already know, instead of to learn new things?
I just got a job that involves golang, and it's now a big part of my life as my daily driver. I thought that, given the popularity of the language, it must be great.
What ended up happening was Go feels like a broken language. But maybe I'm wrong. Maybe Rob Pike had a good reason not to include sum types in his language, or maybe he just didn't know about it. Imagine that, my life and the lives of other people defined by the fact Rob Pike didn't know something.
That's what I want to find out. Is the popularity of his language really stemming from him and other people not knowing any better? Or is it me not knowing any better? Have I not seen the light? Or have you not?
Put it this way. If you read my post and you knew about everything I said here and you love golang... Then you know something I don't. If you learned something then maybe you haven't seen the light.
Sure, I've run into this when I went to do some language implementation in Go (dumped at [0]; didn't keep up with it just because I didn't have much reason to do it in the first place). I'd prefer ADTs, but I just frowned a bit and used interfaces. Your strong feelings here aren't because this is objectively inconvenient but because you think it's something you shouldn't have to put up with. If there's a factor you're missing—not saying there is—it's that types aren't supposed to tell the whole story in Go and it's fine to have data that's not fully described by a type.
I don't believe in a single "the light" to be seen. I know something you don't (the ability to create your own array DSL gives you none of the advantages of a particular array DSL developed over decades of hard work), you know something I don't. Go's popular because it got stuff right that other languages in the space like Dart and Swift missed.
> I know something you don't (the ability to create your own array DSL gives you none of the advantages of a particular array DSL developed over decades of hard work)
I don't see how this has anything to do with golang.
>you know something I don't
That go is missing sum types?
> Go's popular because it got stuff right that other languages in the space like Dart and Swift missed.
The only advantages I see are implementation-specific, not language-specific. Go has a better ecosystem, it's cross-platform, and it has extremely fast compilation times. The language itself is independent of these factors. You haven't mentioned any specific advantage of golang as a language here, which is my main gripe with it as of now. But if you meant the implementation-specific stuff, then I agree with you: those are big advantages. It's one of the reasons a language like Python hasn't fully taken over... if Python had the speed of C, it would likely even replace Go.
Because of this I'm going to have to assume you agree with me: Go, from a language design viewpoint, is fundamentally broken. You clearly still like it, but you also clearly can't articulate specific reasons why. That's normal; if you've liked something for years and then learn it's flawed, you're not going to flip around in seconds.
Anyway, maybe Go only appears fundamentally broken. There must be someone who knows why Rob Pike made these choices to leave out really fundamental primitives when designing Go. I don't see why yet, and even though you're a supporter of Go, you haven't clearly elucidated why either.
I'm still waiting on the reasoning for why Rob decided to leave out sum types yet make goroutines a first-class feature, when green threads can easily be provided as library subroutines. Not being snarky here; I think this reasoning exists, we just don't see it yet. Perhaps someone else does?
I got curious and decided to google around. I didn't find anything from Pike specifically, but my overall impression is that the maintainers simply refuse to see significant value in sum types. They acknowledge the benefits but not their importance. Interestingly, the overwhelming majority of Go users (at least as indicated by this[0] GH issue) do wish for Go to have sum types. My speculation about Pike in particular, though, is that he's no longer influential enough to significantly steer the evolution of Go.
They knew about it, they considered it, and they decided not to do it. Sometimes it's just that simple. It has been mentioned in the FAQ since day one: https://go.dev/doc/faq#variant_types
Is it? That seems like a very bold claim. MLs have been at the heart of a lot of language development for the last few decades, both as research languages for various concepts, but also practically: OCaml is fairly famous as a language to write programming languages in. FP isn't something wild and niche, it's discussed fairly regularly even amongst users of mainstream programming languages. Even Go has an FP library now, coming out of IBM of all places.
I don't think you can attribute this to ignorance, because it's very clear that Pike is not ignorant of programming language design; even if you disagree with his decisions, he has had tremendous success at implementing his vision. To me it speaks more of disinterest: FP doesn't seem to register on his radar as a useful mine of PL ideas. That's fine, although I agree with the previous poster that it's ironic to be such a proponent of APL and yet have such a blind spot for another very fascinating area of his field.
Would Haskell or some other functional language be a part of a typical CS curriculum?
I remember having a course about programming paradigms which introduced different languages and we wrote a small project based on functional languages.
Possibly. It's weird he's so enamored with APL, which just seems to me like a language focused on an algebra designed around arrays.
Algebra-based designs can be formed around many data structures, and many languages, like Haskell, generalize this concept. With Haskell you can create your own algebraic DSL around arrays or anything else you can think of. It seems he's enamored with the specific array instance of algebra-based design and unaware that it's only one case of a general concept.
APL wasn't originally a programming language at all. Ken Iverson designed it as a mathematical notation to express and reason about computer algorithms. "Notation as a tool of thought," is how he thought about it. It was implemented as a programming language years after he created the notation in 1960.
An APL-like Haskell DSL could be interesting, but to match APL's expressiveness you'd basically need to reimplement all of APL. For maximum generality, one could just skip both APL and Haskell and just use the lambda calculus. I found that a bit hard to work with, though.
As for NumPy and the rest, they are all directly descended from APL. The difference is that it takes 10 lines of Python to match 10 characters of APL. While the array languages' terseness can be excessive, doing it in Python is not very pleasant either.