Hey Nilton,
First of all, +1 for taking this discussion away from Slims' thread; it was getting too far away from his initial idea there :)
Now, this is a very, very complex matter, and I do like such tasks a lot.
Since you asked for it, let me try to take the whole thing apart for you.
The basis of our discussion is the feeling of "I might be missing some great stuff here", which led to the thought of "maybe a changed rating system could help me not miss good stuff".
You outlined one way to change the rating system above, Lairdy actually suggested removing the rating system altogether, and I believe the current rating system is good as it is.
Let me explain.
First of all, let's look at the feeling of "I might be missing some of the good stuff".
I would like to stress the point that no one said "I am having trouble finding any good stuff", so that does not seem to be our problem.
So, the problem is rather that we all know there is a lot of good stuff, and we know how to find it, but we have the feeling there might be more good stuff than what we found so far... ok, then.
Let's talk about the "good stuff", the "hidden gems" - what are their common aspects?
I believe a track with 120 thumbs and 23 remixes is no longer perceived as one of those "gems"; we want to stumble on the forgotten, unremixed and underrated ones: a magnificent track with just 4 thumbs, right?
Mind, these criteria are very, very much influenced by our knowledge of wikiloops - if the average track received 120 thumbs, our criteria would look different.
Besides that, our idea of what counts as "good stuff" is highly individual! Are we talking about "good stuff with vocals" (for the listener), "good stuff to improvise on" (ideally missing your instrument), or what?
So, whatever we do, we need to keep those personal preferences open for you to choose, or the tracks we offer will not match your desires.
Our gem-search would work like this:
Find [my desired instruments & genres] which are [not totally popular] and - most importantly - [not known to me so far].
These criteria are all met by the "order by date" function - the latest jams are the most likely to be unknown to you, they are not overly popular yet, and they are filterable in any way you like.
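Just to make those criteria concrete, here is a tiny Python sketch of what that search boils down to - mind, all names and the popularity cap are made up for illustration, this is of course not the actual wikiloops code:

```python
from dataclasses import dataclass

@dataclass
class Track:
    title: str
    genre: str
    instruments: set
    thumbs: int
    uploaded: int  # upload time, e.g. a Unix timestamp

def gem_candidates(tracks, genres, wanted, known_titles, popularity_cap=10):
    """Find [desired instruments & genres] which are [not too popular] and [unknown to me]."""
    hits = [t for t in tracks
            if t.genre in genres
            and t.instruments & wanted          # offers something I want to hear / play to
            and t.thumbs <= popularity_cap      # not (yet) popular
            and t.title not in known_titles]    # new to me
    # "order by date": newest jams first, since those are the least likely to be known
    return sorted(hits, key=lambda t: t.uploaded, reverse=True)
```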
I know this does not make you all happy, hang in there. I know it's complex.
Before going on, let me introduce my bold antithesis to our problem:
It goes like this: "If you invest some time spotting 'good stuff' in the list of latest contributions and follow some people you found worthy, you will not be missing (m)any gems at all."
Why?
Let's have ourselves an example.
So, we have track X which you have missed, and which for some wicked reason only got four thumbs while it was among the "latest 10".
You might think this track would vanish into the heap of tracks, but:
wikiloops has some kind of swarm intelligence going on.
It only takes one person with some followers who creates a remix of track X, and you will get to hear it when the remix shows up in your newsfeed.
I have discovered quite a lot of interesting tracks and musicians this way, and an "unknown template X" can end up with a huge remix tree as a result.
You are however correct in your observation that track X may be stuck with 4 thumbs, while it has remixes with 30+ thumbs. I will get back to that a little later.
Besides getting remixed by someone with a wider follower base, there is a second reason why track X will not vanish:
Given: our track is no longer among the latest jams, it "only" attracted a few thumbs and has not been remixed at all. Where would we find this track?
Possibly in the best rated tracks list. Surprise, surprise.
The label on the search filter does say "best rated", it does not say "most thumbed",
and the best-rated filter is based on a quite complicated algorithm which calculates an index value based on something like (thumbs + downloads + remixes) / listeners -
a completely "missed" track with 4 thumbs and 45 listeners will rank a lot better than a track with 12 thumbs and 3,000 listeners.
This explains why the "best rated" list is slowly but constantly changing - and the fact that an instrumental track without any lady singing is in position 1 today proves that the algorithm is doing quite a good job of evening out the "unfair" thumb effects some complain about.
What you need to understand is that the working rating system is already trying to show you some real jewels, but it is also designed to exchange them for others that deserve some attention, too. If there were no such mechanism at work, the top-listed tracks would stay there forever, simply because they are more likely to attract even more thumbs than the tracks on page 2.
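To see how that plays out in numbers, here is a toy version of the index - mind, I'm only using the rough shape given above, the real algorithm is more complicated and these weights are guesses:

```python
def rating_index(thumbs, downloads, remixes, listeners):
    # Rough shape only: (thumbs + downloads + remixes) / listeners.
    # The real wikiloops algorithm is more complicated than this.
    return (thumbs + downloads + remixes) / max(listeners, 1)

print(rating_index(4, 0, 0, 45))     # the "missed" track: ~0.089
print(rating_index(12, 0, 0, 3000))  # the much-listened track: 0.004, ranks far below

# And why the list keeps rotating: every listener who does not thumb, download
# or remix lowers the index, so front-page exposure slowly pushes a track down.
print(rating_index(4, 0, 0, 450))    # same "missed" track after 10x the listeners: ~0.0089
```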
Bottom line is - there we have the two layers of safety which help to fish out any gems we are afraid to lose. If they don't get remixed by one of your buddies, you will see them return via the "best rated" lists if they deserve to show up there.
Of course you are welcome to "dig" a little by browsing results page 100 and beyond, where you'll find some good stuff on its way to fame :)
Now, let me get back to your idea of a changed rating system once more before I end.
As you will hopefully understand now, the effect of user ratings / thumbs on a track's presentation is a lot smaller than you may have expected, so the need to change the system may not seem as urgent.
The problem with giving fractions of a thumb to prior mixing steps is that we'd definitely see the template tracks benefit from this, while later remixes would gain less or nothing at all.
Right now, templates and early mix-stages already benefit from getting more points for remixes and downloads, so I don't see why they should receive additional points. As you will see when looking at the search results, there are few tracks with many participants listed there; the "best funk" first page features results ranging from 2 to 5 participants, with only two tracks at 4 or more... that is quite balanced (mind, one-instrument templates are excluded on purpose!).
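To illustrate why the templates would run away with the points, here is a toy model of the fractional-thumb idea - the 50% share per mixing step is an arbitrary number I made up, it just shows the shape of the effect:

```python
# Toy model of "give a fraction of each thumb to the prior mixing steps".
# The 50% share per generation is an arbitrary assumption, just to show the shape.

def propagate(thumbs, scores, parent_of, track, share=0.5):
    """Credit a track with its thumbs, passing a fraction up to each ancestor."""
    credit = thumbs
    while track is not None:
        scores[track] = scores.get(track, 0.0) + credit
        track = parent_of.get(track)   # step up to the parent mix
        credit *= share                # each older step gets a diminishing cut

# A small remix chain: template -> mix1 -> mix2, each earning 10 thumbs itself
parent_of = {"mix1": "template", "mix2": "mix1", "template": None}
scores = {}
for track in ("template", "mix1", "mix2"):
    propagate(10, scores, parent_of, track)

print(scores)
# {'template': 17.5, 'mix1': 15.0, 'mix2': 10.0}
# the template ends up with 75% more credit than the newest remix
```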
I could go on at length telling you about the different special algorithms used to present the best possible selections of tracks; it is an interesting thing to think about, for sure.
IF I had to improve or change anything about the current system,
a filter showing me a list of new members' contributions only OR a filter showing the "best rated of the last three months" might be nice to have for gem-hunting -
changing the rating system would lead to such a huge scoring difference between older and newer tracks that it would be almost impossible to even that out by math... and I'm wary of fixing something that actually works quite decently :)