I've got a few thoughts.
1. Regression to the mean works in opposition to evolution. This was one of the central objections raised against Darwin's theory under blending inheritance: if traits could mix evenly, any variation would regress back to the mean, i.e., no net change over time. But in the discrete system of Mendelian genetics, variation cannot simply blend away (see the first sketch after point 6). I may be explaining this point poorly, so I encourage you to read about the history of the debate between the two models of inheritance.
2. I think macroscopic traits and complexity are often different things.
a. Evidence from the genome shows that we accumulate a lot of information in the genome, and the amount of information increases with time (just more junk in the genome). Evolution removes this sort of stuff, so in that sense it actually decreases the complexity of the species (or gene pool) over time, at least under a very strict definition of complexity.
b. Macroscopic traits like intelligence say very little about complexity. The way we ended up being intelligent is probably neither the simplest nor the most complex way to create a creature of equivalent intelligence. In other words, evolution leaves a lot of appendices. This is particularly true in cognition, where we have newer layers of brain (in the evolutionary sense) stacked on top of older ones, and it seems many of those underlying layers do very little processing in humans. So: a lot of extra complexity, but very little extra intelligence.
3. Randomness is absolutely not necessary for evolution; what you do need is variation (see the second sketch after point 6). To have evolution in the way we generally think of it (i.e., directed by nothing other than best fit to the environment), that variation should be a good approximation of uniform variation. Luckily, one of the constraints humans place on random variables (both intuitively and explicitly) is that those variables have a uniform distribution.
4. I personally think that the world is deterministic all the way down. As I understand quantum mechanics, the issue with hidden variables isn't that they are necessarily random; they can be deterministic as long as they are non-local. I'm fine with non-locality. Everything is connected; cool.
5. Given that, a fundamental question here is the definition of "random." I'd call things random if they appear random, even when they have underlying deterministic behavior. If you disagree, okay, but I think it's a matter of opinion, or convention if you will. Once we finally define our terms, this becomes a non-issue.
6. Finally, there is a more interesting question beyond this: does evolution necessarily converge on different forms? I can't say for any particular case, but what I can say for sure is that evolution is essentially a hill-climbing algorithm (a local search, related to but not the same as simulated annealing). That means it can only change from one form to another by small steps (local variation), which precludes certain possibilities (see the third sketch below). So despite the fact that our brain is "designed" (I say designed only in the sense that I am anthropomorphizing an inanimate, non-directed process) by piling on layers that now represent inefficiencies, it seems very likely that we couldn't have done it any other way. Any other intelligent evolved creature in our universe would necessarily have its history "built in" to its architecture, even if it got there by a different route. Evolution will never converge on the best way, because those intermediate steps are inseparable from the process (or algorithm). If you look at developmental biology, all creatures on earth share this developmental commonality (and, incidentally, a common ancestor).
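A few of the points above are really algorithmic claims, so here are some toy Python sketches. These are mine, not anyone's canonical model; every name and number is made up for illustration. First, point 1: under blending inheritance each offspring averages its parents, so trait variance collapses toward the mean; under Mendelian inheritance discrete alleles get reshuffled but never diluted, so the variance selection needs is preserved.

    import random

    def blending_generation(pop):
        # Offspring trait = mean of two randomly chosen parents, so trait
        # variance is cut roughly in half every generation.
        return [(random.choice(pop) + random.choice(pop)) / 2 for _ in pop]

    def mendelian_generation(pop):
        # Each individual is a pair of discrete 0/1 alleles; an offspring
        # draws one allele from each of two random parents, so allele
        # frequencies (and trait variance) are preserved.
        return [(random.choice(random.choice(pop)),
                 random.choice(random.choice(pop))) for _ in pop]

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    blenders = [random.random() for _ in range(1000)]  # continuous traits
    mendelians = [(random.randint(0, 1), random.randint(0, 1))
                  for _ in range(1000)]                # discrete alleles
    for _ in range(10):
        blenders = blending_generation(blenders)
        mendelians = mendelian_generation(mendelians)
    print(variance(blenders))                        # ~0: variation swamped
    print(variance([a + b for a, b in mendelians]))  # ~0.5: variation persists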
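Second, point 3: in this sketch the only source of variation is a linear congruential generator, which is completely deterministic, and selection still climbs to the optimum. No randomness anywhere, just well-spread variation.

    TARGET_LEN = 32

    def lcg(seed):
        # Classic linear congruential generator: the same seed always
        # produces the exact same "mutation" stream. Nothing here is random.
        state = seed
        while True:
            state = (1103515245 * state + 12345) % 2**31
            yield state

    def fitness(genome):
        return sum(genome)  # one-max: fitness = number of 1 bits

    stream = lcg(42)
    genome = [0] * TARGET_LEN
    while fitness(genome) < TARGET_LEN:
        child = list(genome)
        child[next(stream) % TARGET_LEN] ^= 1  # deterministic point mutation
        if fitness(child) > fitness(genome):   # selection keeps improvements
            genome = child
    print(genome)  # all ones: evolution worked with zero randomness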
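Third, point 6: a hill climber that can only take small local steps ends up on whichever peak is nearest its starting point, so its "history" is baked into the outcome and the global optimum may never be reached.

    def fitness(x):
        # Two peaks: a local one at x=2 (height 4) and the global one
        # at x=8 (height 9), with a flat valley between them.
        return max(4 - (x - 2) ** 2, 9 - (x - 8) ** 2, 0)

    def hill_climb(x, step=0.1):
        while True:
            best = max((x - step, x, x + step), key=fitness)
            if best == x:
                return x  # no neighbor is better: stuck on the nearest peak
            x = best

    print(hill_climb(1.0))  # ends near 2: the local peak, never finds 8
    print(hill_climb(6.0))  # ends near 8: a different history, different result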
tl;dr
I'd just like to conclude with an interesting, tangentially related thought. In A New Kind of Science, Wolfram claims there are actually three kinds of randomness rather than the two mentioned above. (This is actually the central claim of the book.)
Specifically, there are systems that don't have complexity built into their initial conditions (i.e., they're not chaotic systems) and don't contain any "true" randomness, but instead generate randomness themselves.
They have simple, homogeneous initial conditions and simple rules, but because they pass some incredibly low computational threshold, they generate randomness. It's quite amazing.
http://mathworld.wolfram.com/Rule30.html
An elementary cellular automaton on a simple 1-d binary grid (Rule 30, defined by an 8-entry lookup table) can generate a pattern that never repeats starting from a single black cell. A closely related rule, Rule 110, has even been proven to be a universal Turing machine. Wow.
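For the curious, here's a minimal Rule 30 sketch in Python (my own toy implementation, not Wolfram's code): start from a single black cell and an irregular triangle grows, looking statistically random despite fully deterministic rules and initial conditions.

    RULE = 30  # the bits of 30 give the new cell for neighborhoods 111..000

    def step(cells):
        # Each cell's next state is looked up from (left, center, right),
        # read as a 3-bit index into the rule number.
        n = len(cells)
        return [
            (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2
                      + cells[(i + 1) % n])) & 1
            for i in range(n)
        ]

    width = 63
    cells = [0] * width
    cells[width // 2] = 1  # a single black cell
    for _ in range(30):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells)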