Monday, March 8, 2010

Ruby and Apple Fans Need to Settle Down

Mac Users Need to Settle Down

Recently, a Mac user and I had to decide on the order of the "OK" and "Cancel" buttons on a dialog in an Android application. The facts are that Windows puts the "OK" button on the left, and Mac puts the "OK" button on the right. The Mac user actually called the Windows way "bad design". I believe the rationale behind the Mac way is that the right side of the screen should correspond to forward and the left side should correspond to backward (which makes me think of Mac's placement of the close-window button). In the discussion we didn't really have time to argue about it much, and he was in a position of some authority over me, so it's probably good that I didn't push my side further at the time, but I'd like to do that now.

The text of a dialog box in every operating system is either left-aligned or centered, and the two are indistinguishable for some lengths of text (since the dialog resizes to fit the text). Your focus as you read the text starts at the left, averages out in the center, and ends at the right. I'd argue that the center is the main point of attention, so the button closest to the center is the first button the user sees and thinks about. Since the buttons are right-aligned (at least in Windows), the button closest to the center is the left one of the pair, and that should be the button the user most likely wants to click -- usually "OK". That's a good reason for Windows to put the "OK" button on the left instead of the right.

All that to say, Mac users seem to be a little too aggressive. They'll defend Mac to the death, which is just plain ridiculous. That much enthusiasm is really unhealthy for a topic like Windows vs Mac. Just look at the iPad. Any support for that at all is evidence of an unhealthy devotion to Mac products.

That being said, a counter example to my point is a very respectable Mac fan at the place where I work.

Ruby Developers Need to Settle Down

The same goes for Ruby developers (who seem to be the same people as Mac users. Hmmmmmm.). Ruby, just like Mac, is potentially beautiful, but completely non-standard. Ruby's syntax is not like any other language's syntax. And I don't mean that it's hip and fresh; I mean that it's difficult to learn and not really that exciting. Here's an example of Ruby's syntax:
file_contents = open("filename"){|f| f.read}
I know what the line does: it opens a file, reads all of the text in it, and assigns the text to the variable file_contents. But how in the world does that curly-bracket lambda function fit into the semantics of the open() call?? Obviously, I could find out with some research, but I thought I knew Ruby's semantics pretty well, and this totally stumped me. For comparison, here's the Python equivalent:
file_contents = open("filename").read()
It seems a lot simpler; in particular, you know that read() is a method of the object returned from the open() function. (It's possible that the Ruby code actually closes the file with some kind of with-/using-/try-finally-like semantics, which would make the Python code a bit more complex to match the functionality exactly.) One more argument against the Ruby language itself is that it was designed by one guy, designed for himself, and designed with no regard for anyone else's preferences (see Wikipedia).
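
To be fair, here's a rough sketch of how a block can fit into the semantics of something like open(). This is my own hypothetical my_open, not the real Kernel#open source, but the shape is the same: if a block is given, the method yields the open file to the block, returns whatever the block returns, and closes the file afterward.

def my_open(filename)
  file = File.new(filename)
  return file unless block_given?  # no block: the caller gets the raw file object
  begin
    yield file                     # the block's return value becomes my_open's return value
  ensure
    file.close                     # the file gets closed even if the block raises
  end
end

file_contents = my_open("filename") { |f| f.read }

So the curly-bracket block is basically a callback that open() invokes with the opened file, and the whole expression evaluates to whatever the block returned.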

But enough about Ruby; the point I'm trying to make here is that Ruby users need to settle down. The first search result for "C type casting parser" (as of the day of this post) is CAST -- Ruby's C parsing dog. Woof.. A couple of things to note: one is how silly the title is, but that's fine. The second is that it's on a website called rubyforge.org. What's wrong with sourceforge.net? The third thing to notice is about one screen down the page, in the section "The Parser".
Here's a quiz: what does "a * b;" do?

I bet you said "why you l4m3r n00b, that's a statement that multiplies a by b and throws away the answer -- now go take your meaningless snippetage to your computing 101 class and let me finish hurting this JavaTM programmer." Well, you'd be both mean and wrong. It was, of course, a trick question. I didn't say if any of a and b are types! If only a is a type, it's actually a declaration. And if b is a type, it's a syntax error.
Was the caricature really necessary? That bash on Java was completely unrelated. The real point the author is trying to make is actually a very good one, and it's the reason I searched for "C type casting parser" in the first place (it has to do with parser state).
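
For anyone who hasn't run into this before, the point is that a C parser can't classify "a * b;" from syntax alone; it has to carry state about which identifiers are currently typedef names. Here's a hypothetical toy sketch of that lookup (in Ruby, since that's the language at hand; it's my own illustration, not anything from CAST):

# Toy illustration of the parser-state problem: the meaning of "a * b;"
# depends on which identifiers the parser has already seen as typedef names.
def classify(stmt, typedef_names)
  a, _star, b = stmt.delete(";").split   # crude tokenization of "a * b;"
  if typedef_names.include?(a)
    "declaration of #{b} as a pointer to #{a}"
  elsif typedef_names.include?(b)
    "syntax error"
  else
    "expression statement: multiply #{a} by #{b} and discard the result"
  end
end

typedef_names = ["size_t"]                   # state built up from earlier declarations
puts classify("a * b;", typedef_names)       # expression statement
puts classify("size_t * b;", typedef_names)  # declaration
puts classify("a * size_t;", typedef_names)  # syntax error

A real C parser has to do the same kind of lookup, which is exactly the parser-state issue I was researching.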

Ruby on Rails developers are usually good examples of over-excited Ruby fans. Ruby on Rails is a FANTASTIC framework for website development compared to PHP, but there is a comparable framework that uses Python instead of Ruby: Django. Python, as I will eagerly argue to the death (wait a minute...), is a better programming language than Ruby. Everyone I know who knows both Rails and Django likes Django way better, but Rails developers will hear none of it. Python is arguably a much easier language to learn than Ruby, and, from what I hear, Django is a better framework than Rails. So why does Ruby on Rails maintain its popularity? Well, besides momentum, I'm sure there's some unhealthy fanaticism as well.

That being said, I am embracing a lot of the semantics of the Ruby programming language in my programming language Jax. However, I'm not acknowledging any causality or direct inspiration from Ruby; it's just the way programming languages should work.

In Conclusion

Mac sucks!! Up with Linux (and Windows)!!

Ruby sucks!! Up with Python (and Java)!!

2 comments:

Jose said...

Well said my friend. I have never used Ruby, but I heard that learning Ruby is more like learning an actual language...which is not standard like you mentioned.

thejoshwolfe said...

Here's more Ruby code from the same place:

## print all global variables
ast.entities.each do |node|
  node.Declaration? or next
  node.declarators.each do |decl|
    unless decl.type.Function?
      puts "#{decl.name}: #{decl.type}"
    end
  end
end

The node.Declaration? or next line reads kind of like "node had better be a Declaration, or else skip to the next one", which makes some sense in human-language terms. Similarly, t = 2.weeks is very human-readable. However, the real language semantics behind those two things are really counterintuitive. node.Declaration? returns a boolean indicating whether node is of type Declaration. Then the or operator short-circuits, so the right-hand side is only evaluated if the left-hand side is false. If the next expression is evaluated, it causes control flow to skip to the next iteration of the loop. For the 2.weeks code, weeks is a method on integer objects that returns a timespan object equal to the number of weeks specified by the integer.

The backend is way more complicated and counterintuitive for Ruby than for Python, but the resulting code is more human-readable. That doesn't mean anything to me. The average code reader is not an average human but a programmer who knows popular languages, and knowing popular languages doesn't help much at all for reading Ruby. The best non-Ruby-programmer candidates for reading Ruby code are people who don't know how to program at all. The code they read might make hazy sense, but they can't write any code themselves. And you know why? Because you have to know the subtleties of the backend in order to write Ruby code, and the backend doesn't make any sense. The only thing that makes sense is reading it without touching it. That makes Ruby shiny and unmaintainable.
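
To make that concrete, here's a rough sketch of the kind of machinery behind both idioms. The Integer#weeks method is my own toy version for illustration (in real Rails code it comes from ActiveSupport, not core Ruby, and returns a fancier duration object than a plain number of seconds):

# A toy version of the machinery behind 2.weeks: reopen Integer and add a
# method that converts a count of weeks into seconds.
class Integer
  def weeks
    self * 7 * 24 * 60 * 60
  end
end

t = 2.weeks
puts t   # => 1209600

# The "or next" idiom from the snippet above, spelled out both ways inside a
# loop (over plain integers here so the example actually runs):
[1, 2, 3, 4].each do |n|
  n.even? or next   # same control flow as: next unless n.even?
  puts "#{n} weeks = #{n.weeks} seconds"
end

None of that is hard once you know it, but it's exactly the kind of backend knowledge you need before you can write the pretty one-liners.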