I’m on the
of this debate. While this appears to be a purely philosophical concern,
this stuff matters. In any case, while Google may be the
first crawler of this sort, it most definitely won’t be the last.
It’s like a time warp, right down to people misinterpreting RFC 2119 and not grasping that “full implications should be understood” means “and responsibility accepted” rather than “but if other people do it then it’s not your fault, la la la.”
Your post, and the one you linked to, really are from October, and not posts from May that just got republished with new dates, right?
The framework knows which ActiveRecord operations are safe, right? It also knows what the incoming method is. The tutorials could also drop the notion of using link_to for destroy operations (section 4.8--the fragment identifiers don’t work).
I suppose Rails will have this problem as long as it keeps mapping “actions” to URI fragments instead of HTTP methods. I doubt they’ll ever change that. I’ve heard it said that Rails is opinionated software. Google seems to have a different opinion. A more melodramatic way to put it would be:
I cannot fathom that Ruby on Rails would not merely repeat the mistake from round 1, but actually tweak the offering to increase the chances and scope of hurt?!
Someone, somewhere, please tell me this is not so. That we accidentally got a bastard, mutant version of Ruby on Rails. That it’s not actually software that 37signals is allowing unknowing souls everywhere to download and rampage with.
The GWA is back and following GET links again. No big surprise there - the web is going to be rife with this kind of automation before the decade is out. Here’s the deal - if you are in the business of allowing the use of GET for something like...
Besides the run-of-the-mill morons, there are two factions of morons that are worth special mention. The first work from examples, and ship code, and get yelled at, just like all the other morons. But then when they finally bother to read the spec, they magically turn into assholes and argue that the spec is ambiguous, or misleading in some way, or ignoreable because nobody else implements it, or simply wrong. These people are called sociopaths. They will never write conformant code regardless of how good the spec is, so they can safely be ignored.
Mark may have been writing in another time, about another war, but his piece has proven again and again to transcend time and topic.
Robert: Rails is no more broken than any other web framework out there. No web framework that I’m aware of has the kind of coupling with lower-level components needed to tell whether some action is destructive and to then not route GETs to it. I’m not sure that would be a good idea even if you could. Like any other framework, it leaves it up to the developer to write conditionals that test for the expected verb around potentially non-safe/non-idempotent methods. I think Rails may even have some class-level method that allows you to block access to certain controller methods based on the verb, which is a little less cluttered than conditionals.
I have considered tweaking Rails to allow me to qualify routes with verb requirements. I don’t think it would be all that hard to implement to be honest. Right now this stuff goes into the actual controller but I’m fairly certain the routing could be enhanced so that you could do stuff like this:
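Something like this, say, where the `:require_verb` option is purely hypothetical (no such option existed in Rails routing at the time); it is just an illustration of what verb-qualified routes might look like:

```ruby
# Hypothetical syntax: :require_verb is not a real Rails routing option,
# just a sketch of qualifying routes with verb requirements.
map.connect 'foo/:id', :controller => 'foo', :action => 'get_foo',
            :require_verb => :get
map.connect 'foo/:id', :controller => 'foo', :action => 'delete_foo',
            :require_verb => :delete
```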
And now I remember why I haven’t hacked that in yet - it’s cumbersome. It’s no less cumbersome than just doing it in the controller, which is what I usually do today. I have my route map a URI to a single action method and then have a set of case statements in that method that dispatch out to protected methods:
case
when get?    then get_foo
when post?   then post_foo
when delete? then delete_foo
end
It would be very easy to write a small Rails plugin that did this type of routing for you. Consider a “has_uniform_interface” mixin that created the dispatching method exactly like the hand coded one above and marked each method as protected for you:
class FooController < ApplicationController
  has_uniform_interface :foo
end
That has_uniform_interface mixin would be trivially simple to implement and would put Rails far ahead of most other frameworks.
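A framework-free sketch of what such a mixin could look like. The `has_uniform_interface` name comes from the comment above; the request plumbing here is stubbed out with a plain accessor rather than Rails’s real request object:

```ruby
# Sketch of the proposed has_uniform_interface mixin, with no Rails
# dependency: the request verb is stubbed via an attr_accessor.
module UniformInterface
  VERBS = [:get, :post, :put, :delete]

  def has_uniform_interface(action)
    # Define the public dispatching method (e.g. foo) that routes to
    # get_foo / post_foo / delete_foo based on the incoming verb.
    define_method(action) do
      handler = "#{request_method}_#{action}"
      respond_to?(handler, true) ? send(handler) : method_not_allowed
    end
    # Mark each per-verb handler as protected, as the hand-coded
    # version above would do by hand.
    VERBS.each do |verb|
      handler = "#{verb}_#{action}"
      protected(handler) if method_defined?(handler)
    end
  end
end

class FooController
  extend UniformInterface

  attr_accessor :request_method   # stand-in for the framework's request

  def get_foo;    "rendered foo";  end
  def delete_foo; "destroyed foo"; end

  has_uniform_interface :foo

  def method_not_allowed
    "405 Method Not Allowed"
  end
end

c = FooController.new
c.request_method = :get
c.foo   # => "rendered foo"
c.request_method = :post
c.foo   # => "405 Method Not Allowed"
```

Unsupported verbs fall through to a 405 rather than silently invoking anything, which is the whole point of the exercise.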
There are numerous other mixin methods one could implement that would let you annotate controller methods with different bits of information for use in dispatching (content negotiation, for instance), but it doesn’t matter. People still have to understand the basic situation well enough (and they don’t) to employ these techniques. I don’t know how Rails can solve that problem. The only way to solve it is to mount an educational crusade the likes of which hasn’t been seen since “Designing with Web Standards”. The other option is to let something like GWA come along and break everything. I’ll sit here quietly with my hands folded.
Anyway, Rails isn’t any more broken than any other web framework I have experience with. Indeed, it’s considerably less broken than most. The only thing I can think of that could really be done at the framework level to mandate correct verb use is to make all controller methods GET-only by default. That is, if a request other than GET comes in, the dispatching code would return a 405 “Method Not Allowed” without invoking the method. In order to handle any other verb, you would have to explicitly mark the method as supporting it. Something like:
# non-safe / non-idempotent code here
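A framework-free sketch of that idea. The `allow_verbs` class method is hypothetical, not a real Rails API; the point is only that methods accept GET alone unless explicitly opened up, and everything else gets a 405 before the method runs:

```ruby
# Sketch: controller methods are GET-only by default; allow_verbs
# (a made-up API) explicitly opens a method to other verbs.
class BaseController
  def self.allowed_verbs
    @allowed_verbs ||= Hash.new([:get])   # default: GET only
  end

  def self.allow_verbs(action, *verbs)
    allowed_verbs[action] = verbs
  end

  # Dispatch entry point: returns 405 without invoking the method
  # when the incoming verb is not supported.
  def dispatch(verb, action)
    unless self.class.allowed_verbs[action].include?(verb)
      return [405, "Method Not Allowed"]
    end
    [200, send(action)]
  end
end

class TasksController < BaseController
  allow_verbs :destroy, :post, :delete

  def index
    "task list"          # safe: reachable via GET by default
  end

  def destroy
    "task destroyed"     # non-safe / non-idempotent code here
  end
end

c = TasksController.new
c.dispatch(:get, :index)      # => [200, "task list"]
c.dispatch(:get, :destroy)    # => [405, "Method Not Allowed"]
c.dispatch(:post, :destroy)   # => [200, "task destroyed"]
```

A GWA-style prefetcher issuing GETs against this controller could never reach `destroy`, which is exactly the default the comment is arguing for.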
Something like that... I imagine this would just annoy the masses. Perhaps not as much as GWA, however.
And now we begin the next chapter in which Pooh discovers that five months after the first time Google turned on GWA that standards still matter. HERE is Edward Bear, coming downstairs now, bump, bump, bump, on the back of his head, behind...
Google Web Accelerator vs. unsafe linking: Round Two!
The good folks at 37signals are once again up in arms about Google Web Accelerator (GWA). David Heinemeier Hansson (DHH), in particular, writes in a recent post to Signal vs. Noise that “[GWA] was evil enough the first time around, but this...
Similarly lots of other applications do things like...
Could we apply the DRY principle here?
class ResourcesController < ApplicationController
  confirmation_dialog "Are you REALLY sure?", :actions => %w(destroy)
  def destroy
    # destroy resource
  end
end
Figure out how to make that work, and change the generated scaffolding to take advantage of it, and we can put this behind us. Sure, those who wish to can delete this logic, and such people get what they deserve.
Stuff like the code in the ToDoList tutorial wouldn’t work by default, right?
There’s nothing you can do at the framework level that would make the ToDoList code, as it is today, not work by default. In order to do so, you would have to make a determination on whether the method to be invoked will result in any non-safe / non-idempotent behavior in the context of that resource, which is impossible. It’s almost the halting problem but harder.
Even if you could make a determination that something looking a lot like non-safe or non-idempotent behavior would occur, you would not then also know that it was non-safe / non-idempotent in the context of that resource. For example, using ActiveRecord to log GET requests to the database is perfectly okay. It’s both safe and idempotent in the context of the resource.
There’s just no way to distinguish whether something is non-safe / non-idempotent without a human annotating it as such, leading us to Sam’s solution, which seems to be the simplest thing that could possibly work.
People will always be able to perform non-safe / non-idempotent actions in the context of a GET. Every web framework is in this same situation.
I don’t disagree with your assessment. If URLs map to actions, that problem is not solvable. This is the same for any similar framework, Struts for example. I’m certainly not going to get on a soapbox and tell the Rails people to change their approach... it seems to work pretty well, they are awesome programmers, and it keeps getting better. But, you know, it’s not perfect. Their setup has drawbacks and tradeoffs. One of them is that their approach pushes the job of HTTP compliance down to app developers. It’s strange to me that they won’t admit their approach has flaws. It’s not the fault of any intermediary, including GWA.
I hear GWA is back. This is cool :-) And back with it is the polemic. The two usual sides are represented: Those who like specs: Bitworking, Bill de hÓra, Sam Ruby. And those who seem to have a problem reading specs in English: 37signals If you...
Yo! Sam Ruby: They’re baaaaaaack! The Google Web Accelerator is back with a vengeance (tags: web apps google http ) Blogs, threat or opportunity? (tags: blogging business ) Yet Another RSS History Great conclusion. "There was no RSS. No...
A game I sometimes play when inflicting my presentation skills upon developers is to ask the simple question “so when do you use HTTP POST as opposed to GET?” Depressingly the answers always involve the length of ‘URLs’, size and complexity of...
How did Amazon allow this design out into the wild? Apparently they’ve already had a private beta period with non-Amazon developers. No one suspected this use of HTTP GET to be a poor choice? It’s 2007. There are all kinds of basic REST reference...
[Ed: For anyone who’s living under a bush Microsoft have released a CTP of the new ASP.NET MVC framework. ] I’m reluctant to get involved in this - but I feel like there’s another side to this story which is worth considering. Since it...