I agree with everything you said. I never said optimization isn't part of ML; it was very much a part of my ML master's, in fact. I was just saying that, in this case, the OP doesn't have an explicit function to optimize, which is why he didn't need optimization...
Oh cool =) I don't really think in terms of classification vs. optimization anymore, and I don't see much distinction between minimizing a loss and minimizing an objective function. But I did rant on, because I'm really passionate about how much unnecessary complexity (hehe) there is in much of machine learning, and I'm always eager to talk about promising unifying approaches. If you haven't read up on semirings or computational algebraic geometry, you should. For an MLer, learning more math is like a programmer learning math: it's a lot of upfront work, but it makes a seemingly arbitrary list of rules much more cohesive and simpler once you've achieved what is basically a conceptual compression.
Yeah. It must be pretty energy-intensive for the brain, considering all the rewiring that occurs. I take the interlaced-GIF approach to learning hard things.