For a lot of web apps, including the one I'm working on, the JS interpreter isn't the bottleneck, or even close. Basically, anything involving rendering or the DOM is orders of magnitude slower than pure JS computation. So what's the point of optimizing function call and loop overhead? For what real-world apps will this make a noticeable difference?
When we ran our app in FF3 for the first time, we were dismayed to find that, after all the talk about browser performance improvements, it had actually gotten a lot slower (especially when scrolling) [~]. As far as I can tell, they're optimizing the wrong stuff. It's possible that our app is unique, but I doubt it. To judge by Google, many people appear to be experiencing the same thing.
[~] Safari 3, on the other hand, sped up our app considerably.
I like how this conversation has gone on for so long, but on the whole, everyone's still polite and detailed in their responses. :-) warm fuzzy feeling
I'm not sure this belongs on Hacker News, but the linked author isn't talking about the study's findings at all. He just uses it as a springboard to rant about a few mothers so bad that they made sensational tabloid headlines. Then he works in a few predictable jabs at the welfare state for producing parasitical monsters.
This may be the study that the author is discussing, although it was released in 2007.
Hopefully, no one on this board is afraid of real data. The picture seems mixed, and even the authors find it difficult to pin down the exact sources of well-being.
I actually try to actively fight the urge to seek out "interesting" things to read. The content on HN is of fairly high quality, and seeking out more stuff to consume outside of it leads to diminishing returns on time wasted.
I find if something is important enough, it will get to me one way or another. Or I'll see it passing by on HN.
It's been out for like 3 days. I agree that it seems dumb so far, but Google has released tons of stuff that seemed half baked and then got better over time.
I remember everyone hating Google Talk when it first came out. No one liked Google News when it first came out either.
There are no guarantees, but give it six months to a year before you're 100% sure it's dead.
Nice analysis, especially the highlighting of the effective veto power over acquisitions in the protective provisions.
To nitpick, I think Agarwal's wording about the liquidation preference gives the impression the term sheet implements what's called 'fully participating' preference, where even after collecting the preference amount, they also share in the remaining money.
IANAL, but I think the wording actually implies a 'non-participating' preference: series AA can take the preference amount, or convert to common and share pro rata with the common, but not both.
To picture the difference: take Agarwal's example of selling 10% for $10K. But then imagine the company is sold for only $110K. With fully-participating preferred, I believe the investor gets their $10K liquidation preference, then also 10% of the remaining money, for a total of $20K. With non-participating, they could take the $10K liquidation preference, but then they would share none of the rest with the common. Instead, they would choose to convert to common and take 10% of the total, $11K. (Someone please correct me if I'm wrong.)
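To make the arithmetic concrete, here's a small sketch using the hypothetical numbers from the example above (my own illustration, not terms from the actual term sheet):

```python
# Hypothetical: investor paid $10K for 10%, 1x liquidation preference,
# company sells for $110K.
sale_price = 110_000
preference = 10_000   # 1x the amount invested
ownership = 0.10      # investor's stake post-conversion to common

# Fully participating: take the preference, then also share in what's left.
participating = preference + ownership * (sale_price - preference)

# Non-participating: take the preference OR convert to common,
# whichever pays more -- but not both.
non_participating = max(preference, ownership * sale_price)

print(participating)      # 20000.0
print(non_participating)  # 11000.0
```

Note the crossover: at a high enough sale price, the non-participating investor always converts, so the preference only matters in low-value exits like this one.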
Yeah, but this is an area where both languages are just a mess. If python's lambda is castrated, ruby's notion of an anonymous function is tumorous. We've got blocks, procs, lambdas and methods? What's the difference?
Ruby mixes up the syntactic requirements of "passing" some code to a loop to iterate with and the data requirements of binding a function to a scope. It's just a huge mess.
LISP and Scheme figured all this out decades ago, and yet only Javascript (and, almost, Perl) among common scripting languages has managed to absorb the lesson.
Well, working extensively with a language means working with its community, as a reader of news and documentation if nothing else. The size, expertise, and helpfulness of the community are all important factors in deciding whether to adopt a language. If a programmer finds the community annoying, that could be a legitimate problem.
We actually did the same thing for Kongregate. Except we gave out an invite code to everyone who asked - usually same day. So the artificial scarcity was... artificial. But if we'd had a big problem we could have stopped the invites.
We did that for about two months and then took the invite barrier down. Worked for us.
edit: I forgot the second half of this story. A few months after we took the invite barrier down, we decided to do a "Hollywood" launch - really all we were doing was changing the "alpha" to "beta" and releasing some new features (earning cards by playing games). It worked, we got lots of press. Fox News even called and asked if I could go on their afternoon cable show. I did (here's the video: http://tinyurl.com/2gpz7x). Our site immediately cratered - it was a much, much bigger surge than Digg. We got it back up in about 20 minutes, and the follow-on traffic was good as well.
They're both impressive, modern languages. What I find interesting when comparing Python to Ruby is how much Python isn't like Lisp. Matz has said outright that he considers Ruby to be "MatzLisp" and it shows, whereas Guido has absolutely no problem chucking Lispy things overboard--like multi-line anonymous functions--if they do not fit the rest of the language.
Instead, Python finds another way, like powerful list comprehensions that can be used wherever Ruby would use blocks and maps.
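As a generic illustration (toy code, not from any particular codebase), where Ruby would reach for a block passed to map or select, a Python comprehension does the same job inline:

```python
words = ["spam", "eggs", "ham"]

# Where Ruby might write words.map { |w| w.upcase },
# Python uses a list comprehension:
shouted = [w.upper() for w in words]

# Comprehensions also fold in filtering, where Ruby would chain select + map:
long_shouted = [w.upper() for w in words if len(w) > 3]

print(shouted)       # ['SPAM', 'EGGS', 'HAM']
print(long_shouted)  # ['SPAM', 'EGGS']
```

The comprehension covers the common map/filter cases without needing a multi-line anonymous function at all, which is part of why Python gets away without one.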
So I would say you can learn some very interesting things from either language, and there's a good chance they will be different interesting things.
Can people please stop using "software" and "webapp" interchangeably?
I realize most members here are founders of startups that deal with webapps, but there are some startups that deal with traditional desktop applications - true "software" - whose founders share similar interests.
That said - I wonder which of these two approaches is better for desktop software development. I'd imagine the fact that a user has installed your software is a "hook" of sorts - they're easier to reel in than website visitors which can escape rather more easily.
You'll notice I did say my experience :-) I'm not trying to "win" an argument here, which would require "evidence", just pointing out that there are more important things in language selection than the features/characteristics/quirks of the language itself, especially given there's not much to call between Ruby and Python.
But, y'know, you could try quantifying it if you like, by looking at the libraries available for each language, and looking at how many jobs are "Ruby" as their main skill and how many are "10 years of whatever + Python".
I have to say that "graduated from MIT" is far more impressive to potential employers, investors and co-founders than is warranted, IMO. And I say this as an MIT grad. :) Being no dummy, I dig through my nightstand to find and wear my brass rat (MIT class ring) approximately once every 5 years, when I need a better shot at one of those things. Otherwise, no one I know cares, though half of our top technical leaders are from MIT.
I can also look at the recruiting efforts of my current company (a startup in early 200x, now decidedly not-startupy, public, and $500+K in annual revenue): we bias heavily towards "top" CS schools when doing college recruiting. We've gotten some absolutely fabulous candidates from "second tier" schools, and we naturally recruit there every year, but if you come from MIT, you get a full 25% (estimated) better chance right from the start of the interview.
For my own account (notwithstanding that the plural of anecdote is not data), despite being a pretty strong candidate, I strongly suspect that my current lot in life is based more than 50% on my attending MIT (and not even with spectacular results), and as a result, I give heartily every time they call for alumni donations.
Employers and founders: MIT isn't all that special. So called "second tier" schools also have some extremely capable potential employees, and individual talent, motivation and drive is FAR more important.
High-school students: If you have the choice between MIT plus $200K in debt and a state school with a full ride, and you're pretty sure you're going to work in engineering for 20 years, definitely go to MIT. Right or wrong, that name opens doors, and I believe I've done a lot more interesting and lucrative work than I'd have been able to do with the full ride I was offered at UMCP.
As an MIT grad, I'm a lot less impressed with the brass rat than most people who are in a position to help shape your lot in life. If you're going to found 5 startups in the hopes that one will be a solid success, save the $200K and invest it in your startups. If you're going to go work for someone else, even just for a while, you'll have no problem paying off your student loans from MIT, or any other top school, IME.
Good article, but you're wasting your breath. Sarah Lacy's article had so little substance and thought behind it that pointing out its errors is like picking apart Scientology.
Turn on ads immediately; that will give you time to play around with the various knobs associated with online ads. You'll also get some idea of how much money you can make from the ads, since that's going to be your primary source of revenue for this venture.
Don't worry about ads and early adopters. Most of them have adblock or some variant turned on, the rest probably don't click on the ads anyway.
I think it is more likely that you do not recognize your own flaws. It takes a real genius to be able to learn the little things about C++ in a week or two.
A language is not just its syntax; it's also the code style, the conventions, the best practices, the language-specific quirks, the APIs of the available libraries, and the best way of doing things with those libraries.
If you use python to write simple number crunching console applications, then switch to ruby, that is an easy switch.
But if you're writing a Web application using Python and Django and you switch to writing DirectSound using C++, and you tell me you can do this in a week, then you are a genius.
The syntax of a language is trivial, but knowing the syntax is like saying you can identify the colors used in a painting. That's the easy part. The difficult part is knowing how it all works. Let me list a few examples of technologies that I think differ strongly enough from each other to make switching difficult:
- MFC with C++
- Django with Python
- DirectShow with C++
- VHDL
- ASM
- Javascript with JQuery
- CSS with HTML
- Helix Framework with C++
- Quicktime API with C++
- OpenGL with .NET
- LISP
Have you ever had to switch between things on that level? Or are you talking more of a python to ruby switch?
Best part:
"He was a prankster, but because he didn't boast, he only got caught once in high school. While spending the night in a juvenile detention facility as a result, he taught the prisoners how to take the electrical leads from the ceiling fan, wire them to the jail cell, and shock the guards."
You say "both languages are just a mess" as if to equate the two. But there's no comparison.
Your Ruby problem with blocks and Proc objects is a nit about elegance and semantics. How often does the difference between an actual Proc object and a block actually affect you? More importantly, if the cost of fixing that nit is that we wind up with Lisp's or Javascript's lambda notation, who would accept that? Javascript has relatively consistent semantics for anonymous functions, but extremely clumsy syntax.
Python simply doesn't have real lambdas. It has the ability to pass something called a lambda in place of a function, and that something forces you to think about things like the difference between a statement and an expression. It then tells you that you don't in fact want anonymous functions, but instead you want to define tens of itty bitty named functions and use them instead. You also "want" that "self" argument to all your methods, and you "want" to tack two underscores to the front of your method names to make them private.
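A quick sketch of what I mean (toy code, obviously):

```python
# A lambda can hold exactly one expression -- no statements allowed.
double = lambda x: x * 2           # fine: a single expression

# Want a print, an assignment, or a loop in there? You can't. Python
# tells you to go define an itty bitty named function instead:
def double_and_log(x):
    result = x * 2                 # an assignment: impossible in a lambda
    print(result)
    return result

# And the other things you "want": explicit self on every method, and two
# leading underscores for attributes that are name-mangled, not truly private.
class Counter:
    def __init__(self):
        self.__count = 0           # actually stored as _Counter__count

    def increment(self):           # self, always
        self.__count += 1
        return self.__count
```

Note that the double-underscore "privacy" is just name mangling; the attribute is still reachable as `_Counter__count` from outside, which rather undercuts the point.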
So, I half agree with you. Yes, it would be nice if there was a (zero-cost) way of fixing Ruby's block semantics. Yes, I can rely on Lisp's lambda semantics slightly more than I can on Ruby's, which does start to matter when you start writing domain-specific language code. But I don't agree that this has any bearing on the Python vs. Ruby argument. Python lambdas suck.
Concrete example: the Ruby community is very focused on the Web, so there's Rails; but they aren't really interested in scientific computing, so there isn't a real equivalent to NumPy.
I'd mod this up more if I could, because this is the best answer to the original question.
The important thing is the community, not the language. Decide what your problem is, find the people who are working on that problem, and use whatever they use.
The importance of community is the reason why the number of people who praise Lisp as a language is so much larger than the number who use Lisp every day: The Lisp community is small, unfocused, and arguably broken. Meanwhile, I work as a professional Drupal developer and am (god help me) gradually becoming an expert on PHP, but that has nothing to do with my nigh-nonexistent respect for PHP as a language. I do it because the Drupal community is large and growing larger, it includes as many noncoders as coders, and it's focused on building websites rather than obsessing over tools. The result seems to be that customers like using Drupal for their sites. Rails and the like are focused on making programmers feel empowered, but Drupal is focused on making site admins feel empowered.
Edit to my FF3 comment above: the only person I know of who's raised this issue is John Resig: http://ejohn.org/blog/javascript-performance-stack/