SSL with Rails (collectiveidea.com)
65 points by danielmorrison on Nov 30, 2010 | hide | past | favorite | 22 comments


Thanks, yoinking this for Appointment Reminder.

In the spirit of creating wealth through exchange:

https://gist.github.com/721114

A frequent problem I have is using CSS files which include http:// linked background images, which will pass through every test with flying colors and then trigger prominent mixed-content security warnings in a browser I very seldom use. Slap this code in your initializers directory and you never have to worry about it again.

Just write your stylesheet_link_tag as normal, and then include :cache => request.ssl? ? "foo-secure" : "foo-standard"

Bonus: the included code also minifies CSS and JS files in production, cutting page load time.
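As a usage sketch of the cache-key trick above (bundle names like "screen" and "all-secure" are placeholders, and the SSL-aware caching behavior comes from the gist's initializer, not stock Rails):

```erb
<%# Pick a distinct cache bundle per scheme, so the concatenated CSS is
    built and served separately for SSL and non-SSL requests. %>
<%= stylesheet_link_tag "screen", "lightbox",
      :cache => request.ssl? ? "all-secure" : "all-standard" %>
```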


The minimizing is good, but doesn't using a path like //example.com/my-image.jpg instead of http://example.com/my-image.jpg fix the issue with http/https?


That works if you can enforce that 100% of the time, against all CSS loaded by your site. The first time you forget it in, e.g., a lightbox script added for use on a landing page whose implementation you didn't even glance at, your registration page will break. This has cost me thousands of dollars before, so I enforce assumptions like "there are no http:// URLs in my CSS files" in code.


How about part of your test suite that just greps for http:// in public/**/*.css and fails CI if it finds anything?
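A sketch of that check as plain Ruby (the glob path and the raise-on-failure style are assumptions; the core idea is just a grep over the compiled CSS):

```ruby
# Returns [line_number, line] pairs for every http:// reference in a CSS string.
def insecure_css_references(css)
  css.each_line.with_index(1)
     .select { |line, _| line.include?("http://") }
     .map    { |line, n| [n, line.strip] }
end

# In CI, fail loudly if any stylesheet under public/ references http://:
Dir.glob("public/**/*.css").each do |path|
  offenders = insecure_css_references(File.read(path))
  raise "insecure URLs in #{path}: #{offenders.inspect}" unless offenders.empty?
end
```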


Couldn't you just use protocol-relative URLs in your CSS files? (Or for that matter, absolute paths.)


Wow, that tip about using protocol relative URLs is going to clean up so much code for me.

For those who missed it, apparently this works across browsers to avoid mixed content warnings:

<script src="//ajax.googleapis.com/ajax/libs/prototype/1.6.1/prototype.js" type="text/javascript"></script>


Ach! Accidental downvote.

Thanks for the detail.


I usually just use whatever is the most robust, stable open source choice for a Rails app. Sphinx for search, etc., and Apache for the web server. Doing SSL with it is well documented, pretty much everything is possible, and it's supported by Webmin too. I have a virtual host sending all non-SSL requests to SSL.

It doesn't have to be Ruby, especially if I won't be programming for it anyway.


If you want a super simple way to require SSL in Rails 3, you can use route constraints: http://kyleslattery.com/entries/requiring-ssl-using-route-co...
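A rough sketch of that route-constraint approach (the constraint class name and the routed resources here are placeholders, not from the linked post; non-matching HTTP requests would need a redirect handled elsewhere):

```ruby
# config/routes.rb -- a constraint object just needs to respond to matches?
class SslConstraint
  def matches?(request)
    request.ssl?
  end
end

AppName::Application.routes.draw do
  # These routes only match requests made over SSL.
  constraints(SslConstraint.new) do
    resources :payments
    resource :account
  end
end
```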


This part was new to me: http://en.wikipedia.org/wiki/Strict_Transport_Security

This is an additional (draft status) header that tells the user agent to always (or for up to X seconds in the future) use HTTPS for that site and, it seems, to disallow any connections using bad certificates.

Currently only Chrome and latest dev version of Firefox seem to support it. Chrome apparently has a pre-loaded list of always-https domains, like www.paypal.com -- who are also the people behind this draft proposal.
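A minimal Rack middleware sketch of emitting that header (the class name and the one-year max-age are my own choices, not from the draft; real deployments would tune max-age carefully):

```ruby
# Adds the draft Strict-Transport-Security header to responses served
# over SSL, telling supporting browsers to use HTTPS for future requests.
class StrictTransportSecurity
  def initialize(app, max_age = 31_536_000)  # ~1 year, an arbitrary example
    @app, @max_age = app, max_age
  end

  def call(env)
    status, headers, body = @app.call(env)
    if env["rack.url_scheme"] == "https"
      headers["Strict-Transport-Security"] = "max-age=#{@max_age}"
    end
    [status, headers, body]
  end
end
```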


The one problem nobody talks about is that browsers won't cache content served over SSL, which means your website will be a lot slower for your users. That will directly cost you money.


Firefox does cache content - I just ran Firebug to confirm. I can only assume from how fast my site works in other browsers that the same applies to them. What browsers are you referring to?
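One knob I believe matters here (hedged, from memory of contemporary browser behavior): some browsers will only write HTTPS responses to the disk cache when they are explicitly marked Cache-Control: public. In Rails that's a one-liner on asset-like responses (the controller and duration below are placeholders):

```ruby
class AssetsController < ApplicationController
  def show
    # Emits "Cache-Control: max-age=31556952, public", opting the response
    # into shared/disk caching even though it was served over SSL.
    expires_in 1.year, :public => true
    # ...render the asset...
  end
end
```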


Interesting. I didn't know it was this easy with rack.

Am I right in thinking that if third party embeds/widgets don't answer at an HTTPS URL, I am stuck with mixed content warnings?


You could proxy the request through a back-end server that does support HTTPS.


A word of caution: it is very easy for naive attempts to do this to end up creating an open proxy. You don't want that.


You profoundly don't want that.


Thank you, your comment reminded me of why this is Severity: Critical. Here's where Thomas and I live in different worlds: I hear "open proxy" and I think "a malicious user can use your server to attack other services on the Internet or download kiddie porn." Thomas hears "open proxy" and starts to see visions of a host, possibly within the firewall, which can be used to make arbitrary HTTP requests against servers within the firewall. That lets you target the weak underbellies of e.g. intranet applications which have no security because all users are presumed to be employees, and where compromise can do interesting things like reading out all your source code through Trac, initiating wire transfers through the accounts payable application, etc.

Yeah, you profoundly do not want open proxies.


Sev: biblical. You're coughing up the lights-out management consoles on servers and storage appliances with them. I'm sure there's all sorts of funny business to be perpetrated through them with admin and account management applications. But I wouldn't know, because you're usually 5 minutes from command execution on the server once you find one.


Could I write a tiny one in rack and mount that in front of the rails stack using the new router in ActionDispatch 3? Hmmm...
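A rough sketch of such a tiny Rack app (the upstream host and path prefix are placeholders). The hard-coded upstream is the important part, given the open-proxy caution above: proxying only one fixed host keeps this from fetching arbitrary URLs on an attacker's behalf:

```ruby
require "net/http"
require "uri"

# Proxies /widget/* requests to a single, fixed third-party host so the
# widget can be served to the browser over your own HTTPS connection.
class WidgetProxy
  UPSTREAM = URI("http://widgets.example.com")  # placeholder upstream

  def call(env)
    path = env["PATH_INFO"]
    unless path.start_with?("/widget/")
      return [404, {"Content-Type" => "text/plain"}, ["not found"]]
    end

    response = Net::HTTP.get_response(UPSTREAM.host, path, UPSTREAM.port)
    [response.code.to_i,
     {"Content-Type" => response["Content-Type"].to_s},
     [response.body.to_s]]
  end
end
```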


Hmm, probably. Doesn't sound like a bad idea. Still, if the proxied request takes a long time, you might be locking out other requests from hitting the real app.


Would I be right in thinking that if I was serving my app using an evented web server (nginx) this would not be such a problem? Or would it still be tying up a Passenger instance anyway? If my request hits the router and then goes straight to a separate Rack app, does this still require a full-weight Passenger process and one copy of my full app stack in memory (albeit shared memory)?


Most 3rd party stuff is good and supports SSL. YouTube is the giant exception. Annoying.



