
ratings system ideas

creativelycommon
permalink   Mon, Feb 14, 2005 @ 1:19 PM
I had an idea regarding the rating system:

While most artists uploading here are quite talented, there are a few who are a bit, well... lacking (I serve as a great example here). Also, there isn’t anything stopping a guy from signing up and rating everyone down just for kicks.

I propose, as a solution to the spiteful and otherwise meaningless downrating, and as a measure for promoting truly exceptional work, a ‘weighted’ rating. It would work so that artists who have uploaded the best art and been rated highly would have more say in rating others’ art. For example, someone who has uploaded 3 or 4 mixes, all rated 4.5+, would have a 100% vote on every track he rates, while a user (such as me) with no substantial art might have a 25% vote. If I vote 1 for a track and a talented mixer votes 5, I assume the track would presently be rated a 3; under my proposed system it should land somewhere around 4.5. The longer this proposed system runs, the more clearly we would see truly awesome art separated from newbie mixes.
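
Roughly what I mean, just as a sketch (the 25% and 100% weights are the numbers from above, and nothing here is meant as actual site code):

```python
# Sketch of the 'weighted vote' idea: each vote counts in proportion to the
# voter's standing. The weights themselves are made up for illustration.
def weighted_rating(votes):
    """votes is a list of (score, voter_weight) pairs, with weights between 0.0 and 1.0."""
    total_weight = sum(weight for _, weight in votes)
    if total_weight == 0:
        return None  # nothing rated yet
    return sum(score * weight for score, weight in votes) / total_weight

# My 1 at a 25% vote versus a talented mixer's 5 at a 100% vote:
print(weighted_rating([(1, 0.25), (5, 1.0)]))  # 4.2, instead of the plain average of 3
```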

problems:

Accomplished mixers inflating their friends’ mixes regardless of quality; cliques would form, etc.

Also, during the contests the radio show ‘The Revolution’ was playing all tracks rated 4 or above and paying no attention to lower-rated tracks, even if the track rated 5 had only one vote while a good track had a 3.5 or so from 10+ votes, which obviously makes it the more popular one. A link next to the overall rating showing all the individual ratings might help here. Another idea: the highest a user may rate is 5, but the highest a song can be rated is 6. For every vote, a small amount is added to the overall rating, perhaps +.05 or +.1, or something more scalable as the community expands.
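
That second idea might look something like this (a sketch only; the +.05 per vote and the cap of 6 are just guesses):

```python
# Sketch of the 'bonus per vote' idea: users still rate 1-5, but every vote adds a
# small bonus to the displayed score, capped at 6. The +0.05 bonus is a placeholder.
def boosted_rating(scores, bonus_per_vote=0.05, cap=6.0):
    if not scores:
        return None
    average = sum(scores) / len(scores)
    return min(average + bonus_per_vote * len(scores), cap)

print(boosted_rating([5]))                             # 5.05 -- a single 5
print(boosted_rating([4, 3, 4, 3, 4, 3, 4, 3, 4, 3]))  # 4.0 -- ten votes lift a 3.5 average
```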

just ideas.
admin
permalink   Mon, Feb 14, 2005 @ 4:41 PM
Ratings are subjective and what you’re proposing (in the first part) seems to make them based on yet another subjective statistic. I’m not a statistician but it seems like the way to reduce the subjectivity would be to base the weight on some more concrete data about the reviewer: the number of uploads, the number of times that person has been remixed, etc.
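
Purely as a sketch of what I mean (the cutoffs and the formula are invented on the spot, not anything we’d actually ship):

```python
# Sketch: derive a reviewer's voting weight from concrete activity numbers.
# The 10-upload and 5-remix cutoffs are invented for illustration only.
def reviewer_weight(num_uploads, times_remixed):
    upload_score = min(num_uploads / 10, 1.0)  # maxes out at 10 uploads
    remix_score = min(times_remixed / 5, 1.0)  # maxes out at being remixed 5 times
    return 0.25 + 0.75 * (upload_score + remix_score) / 2  # everyone keeps at least a 25% vote

print(reviewer_weight(0, 0))    # 0.25 -- brand-new member
print(reviewer_weight(10, 5))   # 1.0  -- established, frequently remixed artist
```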

And you’re right, any algorithm would be subject to buddy-pats (trading 5’s) and angry drunks looking to get even. I don’t see a way to *really* curtail trading 5’s (and in the end, I’m not sure it’s such a devastating thing), and requiring a breathalyzer before login might be hard to implement.

But in the end, the amount of work (i.e. coding) required to do any of this doesn’t seem like it would make a real dent in the "fairness" issue. Amazon’s answer was to ‘rate the rater’ (the ‘I found this review helpful’ kind of thing). It’s not hopeless; there are just higher priorities. At the risk of sounding like "Yes, Minister", I would say "All in due time."

And that Revolution guy is a total hack. This week’s show is some clown talking for 45 min. straight. Sheeez.

Peace,
Victor
admin
permalink   Tue, Feb 15, 2005 @ 10:11 AM
(Um, since no one has busted me since I posted this, I’d better clear this up: Grant from The Revolution is actually a great, talented guy, a cool Mixter citizen (fgr), and I totally love the show. The clown in the interview I was talking about is me.)

Victor
indieish.com
permalink   Tue, Feb 15, 2005 @ 1:32 PM
No, I’m a complete hack.. but a complete hack with some bandwidth and a microphone :)

If it wasn’t an interesting conversation, it wouldn’t have lasted 45 minutes.. And really, it only stopped there because I was afraid I’d not get any music in otherwise.

For those interested..

http://indieish.com/sounds/...

Oh, and Short Faced Bear is coming on soon. That should be interesting.
Heuristics Inc.
permalink   Wed, Feb 16, 2005 @ 4:09 AM
I don’t think the rating system should be based on how many uploads a person’s had, because that would discourage music fans who aren’t musicians themselves. Don’t know if we have a lot of those, but I wouldn’t want to exclude them right off.
Automatic downrating can be watched for by the administrators… if they’re willing to put in the extra effort :)
-bill
Weird Polymer
permalink   Fri, Feb 25, 2005 @ 11:45 AM
Since I am new at this, I was surprised to find that things I rated a 4, which I thought meant I really liked them, actually lowered the overall rating of the piece I was reviewing. It made sense once I thought about it, but I wound up feeling conflicted, and I was disappointed when I realized my vote made the overall rating for the piece drop. What I really wanted to express by choosing 4 versus 5 was that I liked the piece too but wasn’t necessarily overwhelmed by it.

What might work is to display a bit more context. You could show the average rating alongside the number of reviewers. You could show the lowest and highest ratings given, etc.
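
For instance, a summary line like this (just a sketch; none of this reflects how the site actually stores things):

```python
# Sketch of a rating summary: the average, the number of reviewers, and the low/high spread.
def rating_summary(scores):
    if not scores:
        return "no ratings yet"
    return "%.1f average from %d ratings (low %d, high %d)" % (
        sum(scores) / len(scores), len(scores), min(scores), max(scores))

print(rating_summary([4, 5, 3, 4]))  # "4.0 average from 4 ratings (low 3, high 5)"
```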

It still bums me out that I think something is good, a 4, and this will wind up lowering the overall rating... Maybe matching the numbers with labels like

1 => "Your audio makes me ill"
2 => "Don’t like it/Not my style"
3 => "That was OK, I’d listen to it again"
4 => "That was fun/That was good"
5 => "That was great"

But then I guess it is all subjective anyway …
admin
permalink   Fri, Feb 25, 2005 @ 11:55 AM
I’m confused — it should only lower the rating if a 4 is below the current average, and the displayed rating is just the average of everyone who has rated it. (A 4 pulls down a piece that’s averaging 4.5, but pulls up one that’s averaging 3.5.)

and yea, life doesn’t get more subjective than rating art ;)
Weird Polymer
permalink   Sun, Feb 27, 2005 @ 8:06 AM
I think my problem is that I am conflicted between "me too" and a consistent subjective mapping onto 1-5. The solution I am happy with is to just pick the same number as the average if I don’t want it to change, and add my comment. I was recently subjected to a data visualization course at work, so I think I was looking at things in a more complex manner; ignore my earlier idea.
brisvegas1
permalink   Sun, Feb 27, 2005 @ 3:44 PM
You could always implement a rating method using a Bayesian estimator. The Bayesian method prevents object ratings from spiking based on a small number of initial ratings.
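
For example, a shrinkage-style Bayesian average along these lines (the prior mean of 3 and the prior weight of 5 ratings are made-up numbers):

```python
# Sketch of a Bayesian-average rating: blend a track's own ratings with a site-wide
# prior, so one or two early votes can't spike the score. Numbers are illustrative.
def bayesian_rating(scores, prior_mean=3.0, prior_weight=5):
    n = len(scores)
    return (prior_weight * prior_mean + sum(scores)) / (prior_weight + n)

print(bayesian_rating([5]))        # ~3.33 -- a single 5 barely moves the score
print(bayesian_rating([5] * 20))   # 4.6  -- twenty 5s pull it well up
```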
Pat Chilla The Beat Gorilla
permalink   Mon, Feb 28, 2005 @ 9:29 AM
Hmmm... maybe we’re becoming too sensitive about ratings and the implications of anything less than a 5. The way I see it, a 3.5 GPA is an A and a 4.0 is an A+. The same logic should apply to a 4 or 5 rating here. Also, I really haven’t seen the jaded "Intentional Downrater" cause any real chaos in the ratings as of yet. I don’t really have an ultimate point to make, but this was on my mind after reading the other responses.
teru
permalink   Mon, Feb 28, 2005 @ 12:37 PM
Let me try this again.

For me I found the first couple of reviews meant the most. Shoes gave me my very first review at this site. If he hadn’t, I probably would have run off scared. So I feel pretty strongly about commenting on newcomers especially.

BTW my first rating was a 3. I was so happy, told everyone I knew. Kind of a selfish perspective but I thought I’d share it.



Weird Polymer
permalink   Tue, Mar 1, 2005 @ 8:53 AM
I think ratings serve at least three useful roles:

1) Encouraging new contributors to stick around.

2) Showing you, as the author of a work, how much you’ve hit the target, and suggesting the level of interest in a given piece (the comments are most helpful here, rather than the number).

3) As a guide when you are trying to stretch your tastes in unfamiliar music territory.

I find that when I look at the ratings now I am pretty consistently looking at two numbers: the average AND the number of reviews. I also think labeling "3" as "good, I’d listen again" would help new folk feel more comfortable when doing their first couple of reviews. I’ll shut up now 8-)
admin
permalink   Tue, Mar 1, 2005 @ 9:03 AM
Quote: I also think labeling "3" as "good, I’d listen again" would help new folk feel more comfortable when doing their first couple of reviews.

Done.

Victor
admin
permalink   Mon, Mar 7, 2005 @ 8:49 AM
OK, hopefully this week we’ll debut a listing that incorporates a lot of the suggestions here.

Victor
admin
permalink   Thu, Mar 10, 2005 @ 9:22 AM
You guys know this is up and running, right? ;)

Click on ‘Latest Uploads’, then ‘Picks’.

Victor
Weird Polymer
permalink   Thu, Mar 10, 2005 @ 9:35 AM
Way cool! I rate the added feature a 5.0 8-)
Cezary Ostrowski
permalink   Wed, Apr 13, 2005 @ 9:58 PM
The rating scale is OK, but the descriptions that go with the numbers are a little... unfinished.
I think that Weird Polymer is right; it should be more like this:

1 => "Try gardening instead"
2 => "Not my style"
3 => "Proper"
4 => "Really good"
5 => "Outstanding"

Try to change it this way, admin, OK?
ASHWAN
permalink   Sat, Apr 16, 2005 @ 12:28 PM
i’m just concerned about the idea of needing to pass a breath test before using this site!
other than that, i would like the opportunity to pass on feedback/comments without having to leave a score.
it is definitely helpful and encouraging to see reviews; it shows someone has paid attention.
overall though, i would rate this site and the love getting spread around the world on it a 5.
well done.
ashwan
MarcoRaaphorst
permalink   Mon, Apr 18, 2005 @ 9:24 AM
I don’t think a rating system is such a cool idea. Smells like teen spirit and schools to me. You start comparing.

Why not a simpler way: a thumbs-up/thumbs-down thing. Only stuff that really sucks can be rated low: stuff with timing problems, out-of-tune music... stuff like that.
admin
permalink   Tue, Apr 19, 2005 @ 8:01 PM
(I’m not ignoring this thread, just let me know when you guys reach a consensus ;)
teru
permalink   Tue, Apr 19, 2005 @ 10:58 PM
Any way of incorporating collective picks? Keep the comments, of course. Ditch the stars and change to:

1 - not recommended (harsh, but to uphold site quality)
2 - fair, good, excellent, whatever
3 - highly recommended (get like 5-10 recommendations and get onto the picks page)

This way older stuff may creep back onto the picks page. Some songs may get 5-10 recommendations right away; some may take a while. Just an idea. I know there’s going to be a way of manipulating this system, but so far everyone here has been really cool, so I think trusting the community goes a long way. In any case, feedback please. :)
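
Something like this, just as a rough sketch (the threshold of 5 recommendations is a placeholder, not a real number):

```python
# Sketch of the 'collective picks' idea: a track lands on the picks page once it
# collects enough "highly recommended" votes. The threshold of 5 is a placeholder.
PICKS_THRESHOLD = 5

def is_pick(votes):
    """votes is a list of labels: 'not recommended', 'fair', or 'highly recommended'."""
    return votes.count("highly recommended") >= PICKS_THRESHOLD

print(is_pick(["highly recommended"] * 3 + ["fair"] * 4))  # False -- not there yet
print(is_pick(["highly recommended"] * 6))                  # True  -- onto the picks page
```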