But that was the point, that not all programming languages are created equal. I only have anecdotal and personal experience to back this up, but I believe that a vast majority of people familiar with both Python and C would much prefer the former for Web development (as you do).
Looking at the big picture, computers are so ubiquitous now because they provide significantly better tools for all kinds of businesses. It's fairly obvious that programmers themselves can also benefit from improved tools. And our main tools are programming languages - although there are obviously other parts to the puzzle as well (IDEs, compilers, build + deployment + CI systems). I think we are already better off in that some of our mainstream languages today are nicer to use than most of those from 30 years ago. However, the difference is not huge, and my impression is that language development has largely been driven by the personal experience of individual language creators, while largely ignoring the research advances made in the field of programming languages. Notable exceptions are garbage collection, and Clojure and Haskell (if we can consider them mainstream).
Now of course, there are other, non-technical considerations that are also affected by language choice. The availability of libraries, or of programmers, are obvious examples. But the real question is of course whether incorporating these advancements offers a real, measurable benefit, and I think that's what you're hinting at. The problem is that the benefits are hard to measure in a sensible way. Frankly, I don't think that the metric you propose (which, if I understand correctly, is "measure the cost/revenue ratio of projects in language X at company Y against the average ratio for all projects at company Y") can give reliable answers. How do you measure the revenue of something like Haxl? I think the language choice is going to remain a personal decision in the end.
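To make concrete what I understand the proposed metric to be, here is a minimal sketch in Python. All project names and numbers are invented for illustration; the point is just the shape of the computation (per-language cost/revenue ratio divided by the company-wide average), and note that it silently assumes every project has an attributable revenue, which is exactly where something like Haxl falls through the cracks.

```python
from statistics import mean

# Hypothetical project data: (language, cost, revenue). Entirely made up.
projects = [
    ("Haskell", 1.0, 4.0),
    ("Python",  2.0, 5.0),
    ("C",       3.0, 4.5),
    ("Python",  1.5, 6.0),
]

def cost_revenue_ratio(cost, revenue):
    # Lower is better: fewer dollars spent per dollar earned.
    return cost / revenue

def language_vs_company(lang):
    """Average cost/revenue ratio of `lang` projects, relative to the
    company-wide average ratio. Values below 1 would suggest `lang`
    projects are cheaper per unit of revenue than the typical project."""
    all_ratios = [cost_revenue_ratio(c, r) for _, c, r in projects]
    lang_ratios = [cost_revenue_ratio(c, r)
                   for l, c, r in projects if l == lang]
    return mean(lang_ratios) / mean(all_ratios)

print(language_vs_company("Haskell"))
print(language_vs_company("Python"))
```

Even in this toy form the fragility is visible: the result depends entirely on how costs and revenues are attributed to projects, and infrastructure code that only generates revenue indirectly has no sensible row in the table at all.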