The author claims that monkey-patching private methods to expose them publicly is representative of Ruby's beauty, and specifically:
"... later upgrades to the library don’t need to be re-fixed: the one-time patch will automatically get re-applied by application code automatically!"
Anyone who fails to realize how monkey-patching can and will break both subtly and catastrophically across library upgrades has no business advising anyone regarding language choices.
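To make the failure mode concrete, here is a minimal, hypothetical sketch (library and method names invented for illustration) of how an application-side monkey patch written against one library version can silently clobber a later version's implementation:

```ruby
# --- library v1 (imagine this ships in a gem) ---
class Parser
  def parse(text)
    text.split(",")            # v1 bug: doesn't strip whitespace
  end
end

# --- application-side monkey patch, written against v1 ---
class Parser
  def parse(text)              # silently replaces whatever the gem defines
    text.split(",").map(&:strip)
  end
end

# If v2 of the gem rewrites #parse to return structured tokens instead
# of strings, this patch still wins the method lookup and the upgrade's
# new behavior never takes effect -- no warning, no error.
p Parser.new.parse("a, b")     # => ["a", "b"]
```

The patch "automatically re-applies" exactly as the quoted article says, which is precisely the problem: it re-applies even when the assumptions it was written under no longer hold.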
[Edit] I hate April Fool's Day. Depressingly, the author's satire aligns so closely with common arguments that it fooled me completely.
Just because it's a satire doesn't mean the title is incorrect. I would argue that Ruby is, in fact, the future, for the same reason that Python is the future.
As for monkey patching, yes, it's ill-advised in general just as global scope and gotos are ill-advised for similar reasons. However, monkey patching is a powerful feature if used responsibly as Chad Fowler has noted ("The Virtues of Monkey Patching"):
Now admittedly, not every programmer is as competent or responsible as Chad Fowler, or the programming world would be a very different place, but the point is that there's a right way and a wrong way to do it.
Werner Schuster's article on InfoQ ("Ruby's Open Classes - Or: How Not To Patch Like A Monkey") has more good advice on how to do it the right way:
As noted in Werner Schuster's article, many languages (such as C# or Scala) do support type-safe, non-ambiguous, non-conflicting extension of existing classes, demonstrating that extension of existing classes can be safely implemented and does not require the use of dangerous monkey-patching.
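One defensive convention along the lines of that advice (my own sketch, not a quote from either article) is to fail loudly at load time if the patched method already exists, so a library upgrade that adds or changes the method surfaces a clear error instead of subtle misbehavior:

```ruby
# Defensive monkey patch: refuse to load if the target method
# already exists, forcing a re-audit after library upgrades.
# String#shout is a hypothetical example method.
class String
  if method_defined?(:shout)
    raise "String#shout already defined -- library changed, re-audit this patch"
  end

  def shout
    upcase + "!"
  end
end

puts "hello".shout   # => HELLO!
```

This doesn't make monkey patching safe, but it at least converts one class of silent breakage into an immediate, diagnosable failure.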
Dealing with the type system in Scala can be a pain and is a hit to productivity in the early stages. The question is whether the hit in the early stages leads to more productivity in the later stages. I'm not convinced that it does.
I've done a lot of Ruby, and I've had monkey patching bite me a few times. However, fixing the problem was never that hard (two days max) and is clearly overshadowed by the productivity gained from the power Ruby gave my teams.
How can one effectively argue against an entirely subjective judgment of difficulty in terms of dealing with a type system?
Anecdotally, I find that leveraging functional language programming features coupled with type inference and polymorphism leads to considerable productivity gains as code correctness can be ensured through judicious use of types -- without requiring extensive testing or programmer effort -- while excess verbosity can be eliminated through the use of FP features such as type-safe anonymous functions, currying, etc.
How do you deal with excess verbosity in C#? I know this is a separate argument from the monkey patching issue but for me the conciseness and elegance of the Ruby syntax versus C# is the primary reason to choose Ruby.
Hopefully a user of C# will chime in -- I've spent very little time with it, and only recently evaluated C#'s extension mechanism for the purpose of comparing it against other languages' implementations.
That said, elegant and concise syntax does not require eschewing an internally consistent, fully specified type system -- the two are not incompatible, as demonstrated (subjectively) by many existing FP languages. I argue that elegance/conciseness requires either FP language features and an advanced type system, or the abandonment of rigorous typing.
In C# 3, the verbosity level is much less than in C# 2, where it's slightly less than C# 1.
I don't see where Ruby syntax is particularly more concise than C# 3's.
Well, let me clarify "particularly". I don't find myself feeling like I'm suffering under C#'s syntax, relative to Haskell's. It has about the same amount of syntactic overhead: writing types for method parameters, and when defining data structures. Occasionally I have to explicitly write a type parameter to a function.
Of course, "UpTo" requires first extending int, which is trivial:
using System.Collections.Generic;

public static class IntExtensions {
    public static List<int> UpTo(this int start, int end) {
        var range = new List<int>();
        for (int i = start; i < end; i++) {
            range.Add(i);
        }
        return range;
    }
}
This is a fairly uninteresting example, however. Something more interesting would be, for instance, the use of structural types to implement type-checked duck typing.
Scala example:
// Declare a structural type for any class implementing close()
type Closable = { def close(): Any }

// Executes the provided function with the given closable
// generator, creating a new instance which will then be
// closed on completion. The provided function's value
// will be returned.
def withClosable[T, C <: Closable](c: => C)(f: (C) => T) = {
  val closable = c
  try {
    f(closable)
  } finally {
    closable.close()
  }
}

// Example usage
def usage = {
  val updated = withClosable(db.openConnection) { conn =>
    conn.update("INSERT INTO example VALUES (...")
  }
  System.out.println("Rows updated: " + updated)
}
This could be compared with Python's new 'with' syntax.
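Ruby has an idiomatic equivalent of the Scala withClosable above: a method that yields a resource to a block and closes it in an ensure clause (this is the pattern File.open uses). A minimal sketch, with an invented stand-in resource:

```ruby
# Block-based resource management: the resource is closed on
# both normal and exceptional exit, like Scala's try/finally
# or Python's 'with'.
def with_closable(closable)
  yield closable
ensure
  closable.close
end

# Hypothetical minimal resource for demonstration.
Resource = Struct.new(:closed) do
  def close
    self.closed = true
  end
end

r = Resource.new(false)
result = with_closable(r) { |res| "used #{res.class}" }
# r.closed is now true; result == "used Resource"
```

The duck-typing difference is the point of the comparison: Ruby accepts anything responding to close and fails at runtime otherwise, while the Scala structural type checks the close() requirement at compile time.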
I find monkey patching happens because the framework doesn't utilize proper baseline OO techniques. By "baseline" I'm excluding mixins. I have rarely found a case where mixins do a better job than well-factored inheritance. I don't think mixins are "evil" -- just overutilized, with inheritance heavily underutilized in Ruby.
You're lucky that in those cases you were able to recover in only two days. The time it takes to get back on track can be proportional to the size of the project, so imagine if you were dealing with an enterprise system written in, and monkey patched in, Ruby and you had a similar problem. Depending on your luck, you might get back on track, or you might end up having a lot of extra time to work on your resume.
Why? Because of monkey patching, syntax, performance, the community, or some other reason, such as preferring Python, or having an intuitive, ineffable, inexpressible dislike of the language, or because Matz is from Japan and they bombed Pearl Harbor?
I have very few issues with Ruby as a language; in general it lets me get from thought to working code quicker than any other language I've used. Honestly, the largest issues I've run into with Ruby typically deal with people trying to be too clever.
Ruby certainly provides a lot of ways for people who want to look incredibly clever to hang themselves and anyone using their code. I typically avoid libraries or frameworks where clever code is the norm (I'm looking at you Rails), and because of this I tend not to have too many problems.
It's certainly not perfect, and I've been frustrated by bad error messages, stupid type system mistakes, etc. But I've found that those issues simply replace issues in other languages.
I still feel that Ruby gets me from Thought to Code quicker, and for the kind of projects I typically do, that's worth dodging a couple of warts.
-> "the largest issues I've run into with Ruby typically deal with people trying to be too clever"
I agree with you about Thought-to-Code, and even though I argued in favor of monkey patching earlier, the willingness to use it without discipline raises a red flag for me and is the biggest sign of the "too clever" problem you're referring to (at least for me). If Ruby becomes the new Java, maintaining irresponsibly monkey-patched legacy systems could be a nightmare for enterprise developers in the years to come.
I love Ruby. I love that code can be so short and yet easily readable. It might be slower and perhaps more memory-hungry than other languages, but for my needs that's usually not an issue.
What I do miss are good, well documented, stable libraries. For example, I currently process RSS/Atom feeds with FeedTools, whose own creator says he's tired of maintaining it (I totally understand him and am extremely thankful for the time he put into it). I assume if there were more people coding Ruby, there would be a better chance someone would step up and take his place. So, having a community that is more welcoming to newcomers would eventually benefit us all.
By "open to newcomers" I don't mean that Ruby-celebs should stop posting snarky comments at each other in their blogs. This is pretty negligible. I mean stuff like having better documentation, for example, online, free, in googleable format (rather than screencasts or print books). IMHO PHP3's annotated online docs back in 1999 were significantly more usable than what Rails has to offer now.
I don't really know Python's community that well, but I would wager it's probably due to sheer size. I disagree with PG's "law of averages" regarding choosing environments - working in a language that has more users does have significant benefits in my opinion.
BTW another example that comes to mind regarding beginner-friendliness is how long it took to get Rails working with Apache. Before Phusion you were supposed to run your own mongrel cluster or similar fringe solutions - it was insane to ignore the world's most widely installed server. Personally I only started considering Rails production-worthy after mod_rails turned out to be fast and stable. Sure, I can run Thin/nginx or whatever, but I don't want to learn a new server (and wait for it to mature) just so I can use Rails.
I'm not qualified to respond to your points about Phusion and mongrel but on the previous point about the size of the community, are you also finding as a result that the job market for Ruby positions is sparse (forgetting about the economy for a second)? That's the impression I have from the job boards but you work in the field so maybe you have a better idea.
Actually, there's plenty of Rails work out there, and usually at higher rates than PHP (granted, you do accomplish more work per Rails hour).
It seems to me most Rails jobs are either for companies who chose Ruby because they really know their stuff, which is great, or companies who chose Rails because 2 years ago it was touted as some sort of Web app pixie dust in Wired Magazine, which is often less fun :)
I wasn't sure whether the latter firms had found some new pixie dust by now. Good to know. Obie Fernandez has a talk online somewhere about billing where he says something like no one working in Rails should bill less than $150/hour and that he bills himself and his developers at $250 but will occasionally give discounts for various reasons. Maybe I have numbers wrong - just found it - http://www.infoq.com/presentations/fernandez-sales-do-the-hu...
I recommend watching it. Good talk. Takeaway point - always err on the side of charging too much :)
I am not happy with them: recently I was looking for the API docs of the standard library for Ruby 1.9, and it turns out it is not available online anywhere. ruby-doc only has the 1.8 stuff. I spent hours trying to figure out how to create my own documentation with rdoc and ri but eventually gave up. So now I don't even have the documentation of the standard library for Ruby.
ruby-doc.org is nice, but there's a lot missing in the package level (FeedTools etc). Also, it could use a lot more code examples.
Here again I think it's interesting to compare to PHP's annotated docs ca. 2000, which were a great resource since they usually contained a few common usage examples for every function. The language itself was less consistent and mature than Ruby, but the user contributed (pre-Web 2.0! ;)) docs made up for it.
martoo on the equivalent Reddit post had the best thing to say about this:
"There needs to be a special term for an attempt at an April Fool's Day Joke, which is in all of its points true and then ends up looking like a joke made by an author at his own expense."