Collaborative rating system?

Hey, bumping this to say I did not forget about it, life's just been a bit busier than expected.

I do have a very rough list together already; it's basically just an average rating with a minimum-ratings cutoff. But I think we'd probably all want something slightly more sophisticated, yeah? Some kind of weighting, maybe a mix of tenure, post count, a slight nudge for reviews: just start fiddling with those dials and see what looks right, even?

My inclination is to start very simple and very egalitarian and just weight a little extra for films listed as favorites in people's profiles.
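For concreteness, a minimal sketch of that starting point might look like the following. The field shapes, the cutoff, and the favorites bonus are all placeholder assumptions, not anything the site actually uses:

```python
# Minimal sketch: plain average with a minimum-ratings cutoff, plus a
# small nudge per favorites listing. Values below are illustrative.

MIN_RATINGS = 5        # minimum-ratings cutoff (assumed value)
FAVORITE_BONUS = 0.1   # bonus per favorites listing (assumed value)

def rank_films(films):
    """Rank films by average rating, skipping thinly rated titles
    and nudging films that appear in users' favorites."""
    ranked = []
    for title, ratings, favorite_count in films:
        if len(ratings) < MIN_RATINGS:
            continue  # too few ratings to trust the average
        score = sum(ratings) / len(ratings) + FAVORITE_BONUS * favorite_count
        ranked.append((title, round(score, 2)))
    return sorted(ranked, key=lambda pair: pair[1], reverse=True)

films = [
    ("Sideways", [4.0, 4.5, 4.0, 3.5, 4.5, 4.0], 3),
    ("Obscure Gem", [5.0, 5.0], 1),  # below the cutoff, excluded
]
print(rank_films(films))  # [('Sideways', 4.38)]
```

The two constants are the "dials" mentioned above; everything else is deliberately boring.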



I see that the Besteveralbums chart system is mentioned here.

I was so impressed by this system that I immediately started searching the web like crazy to see whether some film-oriented site uses it. It seems planet Earth is so small that there are probably 2-3 vital websites for music and 2-3 for films, can you imagine that. I think it was while searching for this that I came across MoFo, hoping to find something live here.
In the meantime, I started applying the BEA system on RYM manually, which is a crazy thing of course.

I think the Chart System (Year / Decade / All Time) is the only system that can possibly work.
Rating systems were corrupted long ago; see IMDb and RYM.

Is there any possibility of applying this here?
__________________
Society ennobler, last seen in the Medici's Florence.



I think the Chart System (Year / Decade / All Time) is the only system that can possibly work.
Rating systems were corrupted long ago; see IMDb and RYM.

Is there any possibility of applying this here?
Can you elaborate on it a bit? I'm not familiar with it.



Year / Decade / All Time chart system.



The Rank Score (168 in the example) is everything for a movie; it is its reputation on the site. The film has a place in the three main charts, Year / Decade / All Time (or in any potential secondary chart), in accordance with its rank score. For example: Sideways is placed #11 for 2004 because of its 168 pts, and this score also places it #123 in the 00's and #1,132 in the All Time rank table.

The Rank Score accumulates points every time a user includes a film in one of the official charts. Every registered user has a "wallet" in his/her profile (user CP) containing a pack of fixed official charts (lists) to be filled with movies. You get the idea, right?

In my view, the number of items per list in the personal pack of charts should be limited, for discipline. For example: Year chart (list), max 20 films; Decade chart, max 50 films; All Time list, max 150 films. Of course, the site system shows the global charts unlimited: the audience can see the entire rank list for 2004, 2005, 2006... the 2000's, etc.

Users entertain themselves filling up all these charts and following the dynamically changing rankings per year, per decade, and all time. As shown in the image, the movie rankings are clickable. By clicking on #11, the 2004 Year chart opens and you can see how all movies of 2004 are ranked; the same goes for decade and all time. By clicking on the number of charts that include the film, a list of all charts containing the movie opens. There, one can see that user X ranked the film #16 in a Year chart, user Y ranked it #37 in a Decade chart, etc.
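If I follow, the accumulation could be sketched roughly like this. The description above doesn't say how many points one chart entry is worth, so the position weight below (chart cap minus position, plus one) is purely an assumption for illustration:

```python
# Sketch of rank-score accumulation across user charts. The per-entry
# point formula is an assumption: an entry is worth (cap - position + 1),
# so #1 on a Year chart earns 20 points and #20 earns 1.

CHART_CAPS = {"year": 20, "decade": 50, "alltime": 150}

def rank_score(entries):
    """Sum points for one film across every user chart that lists it.
    `entries` is a list of (chart_kind, position) tuples."""
    total = 0
    for kind, position in entries:
        cap = CHART_CAPS[kind]
        if 1 <= position <= cap:  # ignore out-of-range positions
            total += cap - position + 1
    return total

# User X ranked the film #16 on a Year chart; user Y ranked it #37
# on a Decade chart:
print(rank_score([("year", 16), ("decade", 37)]))  # 5 + 14 = 19
```

Any monotone position weight would do; the point is only that each chart inclusion feeds the film's single global Rank Score.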

Hope the idea is clear and will be liked!

Cheers!



Wow, extremely thorough! Thank you.

Something like this should happen. The terminology of "charts" and "wallets" is still a little opaque to me, but I'll assume it's kind of like our Favorites here, in that it's sort of just a "highlight the things you like" kinda thing. That, along with the ratings we have here (and maybe reviews, not sure yet), would probably be the basis for any sort of overall ranking.



"How tall is King Kong?"
The main issue with these rankings is that 50 people will vote for Avengers: Endgame because 50 people saw it, and 3 people will vote for Elisa, vida mía because 3 people saw it. This says nothing about the quality of the films (maybe everyone who saw Elisa, vida mía prefers it to Avengers: Endgame, who knows), but it says something about the demographics of, say, the internet.

In other words, it measures the notoriety of a film more than its evaluation.

This website may have a tool to balance that, with all these nifty lists of marked and unmarked titles. Still, the default is seen/unmarked. It's a weak tool, but already a start.

But in practice, through all these systems, we all "judge negatively", by default, the movies we haven't seen (okay, in most rankings; there would also be the opposite vulnerability of letting a non-representative few votes determine a film's ranking, for instance for some obscure militant movie). And that distorts rankings to the point of making them more or less useless, except as a tool of socialization (which movies must be seen to join conversations).



The thing about the number of ratings is at least partially addressed by using an average rating, rather than giving films credit for sheer number of ratings. There's some distortion on the low end, in that we'd obviously need to have a minimum number or something, but we could make that very modest and just weight slightly at the extremes.

Anyway, we've got lots of knobs to turn, in that we can give extra emphasis to things that can't be flooded easily by the latest release, i.e., being placed on lists, being added to favorites, or being reviewed. There are some concerns there, already detailed earlier in the thread, but most of the most obvious issues have at least some potential solution.
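One standard knob for the low-end distortion is a damped (Bayesian) mean, the same shape as the formula IMDb has published for its Top 250: thinly rated films get pulled toward a global prior, while heavily rated films barely move. The prior weight and global mean below are illustrative values, not site settings:

```python
# Damped mean: weighted blend of a film's own average and a global
# prior, where the prior's influence shrinks as votes accumulate.

def damped_mean(ratings, global_mean=3.0, prior_weight=10):
    """Pull thin samples toward the global mean; heavily rated films
    are barely affected. prior_weight acts like `prior_weight`
    phantom votes at `global_mean`."""
    v = len(ratings)
    r = sum(ratings) / v
    return (v * r + prior_weight * global_mean) / (v + prior_weight)

# Three perfect scores barely budge the film above the prior...
print(round(damped_mean([5.0, 5.0, 5.0]), 2))   # 3.46
# ...while 200 votes at 4.2 keep nearly their raw average.
print(round(damped_mean([4.2] * 200), 2))       # 4.14
```

This replaces a hard minimum-ratings cutoff with a soft one: films below the cutoff aren't excluded, just heavily discounted.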



Mostly, I'm just gonna ask everyone not to overreact to whatever incarnation(s) we start with, since this is obviously going to require iteration and experimentation, so it'll be a little messy at first. Main thing is just gonna be to get something up and tweak from there.



The main issue with these rankings is that 50 people will vote for Avengers: Endgame because 50 people saw it, and 3 people will vote for Elisa, vida mía because 3 people saw it. This says nothing about the quality of the films (maybe everyone who saw Elisa, vida mía prefers it to Avengers: Endgame, who knows), but it says something about the demographics of, say, the internet.

In other words, it measures the notoriety of a film more than its evaluation.

This website may have a tool to balance that, with all these nifty lists of marked and unmarked titles. Still, the default is seen/unmarked. It's a weak tool, but already a start.

But in practice, through all these systems, we all "judge negatively", by default, the movies we haven't seen (okay, in most rankings; there would also be the opposite vulnerability of letting a non-representative few votes determine a film's ranking, for instance for some obscure militant movie). And that distorts rankings to the point of making them more or less useless, except as a tool of socialization (which movies must be seen to join conversations).
The notes above contain some of the thinking on which the List System I've suggested above is built.

A huge crowd floods sites like IMDb and RYM and, in just a couple of minutes, clicks star ratings in support of their mainstream favorites. You know, any teenager can uprate/downrate 30-40 films in a minute. The List System would stop this.

The List System considerably reduces the effect of this crowd because the user has to fill a list for the year or decade, and that is work, and it takes time. This way, I think, about 80-90% of the crowd won't do the work.
On the other hand, the devoted cinema supporter has no problem patiently preparing all his lists. As a result, the above-mentioned Elisa, vida mía will gain the support of all three people who have seen it while, at the same time, 47 of the 50 people who've seen Avengers: Endgame will leave, because they can't support it with a one-second click. This way, the site turns into a place for devoted, knowledgeable film fans, not a crowd fighting for its mass-media injections.



I'd like to insert an observation that can be corrected quickly, I hope.

Currently, the site shows ratings like this:
= 4.0 to 4.49
I think this is a bit visually misleading.

It should be adjusted to be like this:
= 3.75 to 4.24 and
= 4.25 to 4.74

You get the idea, right?
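To make the two schemes concrete, here is how the current truncating display and the suggested round-to-nearest-half display would differ; the function names are mine, not the site's code:

```python
# The two display schemes under discussion, side by side. The current
# site truncates to the half-step below; the suggestion rounds to the
# nearest half-step.

import math

def current_display(score):
    """Truncate: 4.0-4.49 all show as 4 boxes."""
    return math.floor(score * 2) / 2

def suggested_display(score):
    """Round to nearest half: 3.75-4.24 -> 4.0, 4.25-4.74 -> 4.5.
    (floor(x + 0.5) avoids Python's round-half-to-even behavior.)"""
    return math.floor(score * 2 + 0.5) / 2

print(current_display(4.40), suggested_display(4.40))  # 4.0 4.5
```

Under the current scheme a 4.40 average shows as four boxes; under the suggested one it shows as four and a half.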



That elusive hide-and-seek cow is at it again
Good point. In that case, take it very personally and fly off the handle.




it's kinda what I do.
__________________
"My Dionne Warwick understanding of your dream indicates that you are ambivalent on how you want life to eventually screw you."
- Joel

"Ever try to forcibly pin down a house cat? It's not easy."
- Captain Steel



I'd like to insert an observation that can be corrected quickly, I hope.

Currently, the site shows ratings like this:
= 4.0 to 4.49
I think this is a bit visually misleading.

It should be adjusted to be like this:
= 3.75 to 4.24 and
= 4.25 to 4.74

You get the idea, right?
I do, but this asymmetry was deliberate, simply because what's intuitive on a visual level and defensible on a mathematical one are, I think, sometimes two different things. It's something I can at least look at again when I get into the weeds on this stuff, though.



That elusive hide-and-seek cow is at it again
When I first joined here, I didn't feel that 5 half-increment scores were going to work for me. Too many movies fall into that 3-4 range, and there is a lot of space there that a half point just couldn't really express, IMO. The suggestion from regulars then was to use two ratings to create a 10 count rating.


That more or less addressed my concern at the time and may sort this one. That said, I don't think I could figure out how to create the rating graphic, so I just started writing my scores in text and never looked back. My reply? 2.75/5. I could even go as high as 2.87/5. Maybe.



When I first joined here, I didn't feel that 5 half-increment scores were going to work for me. Too many movies fall into that 3-4 range, and there is a lot of space there that a half point just couldn't really express, IMO. The suggestion from regulars then was to use two ratings to create a 10 count rating.


That more or less addressed my concern at the time and may sort this one. That said, I don't think I could figure out how to create the rating graphic, so I just started writing my scores in text and never looked back. My reply? 2.75/5. I could even go as high as 2.87/5. Maybe.
Just use the standard international symbols for 1/4 and 3/4 buckets:

1/4 bucket -
3/4 bucket -

Example: this response -
__________________
2016 2017 2018 2019 2020 2021+
Noms / Pre-1930 Countdown


Fashionably late to every party since 1473!




I do, but this asymmetry was deliberate, simply because what's intuitive on a visual level and defensible on a mathematical one are, I think, sometimes two different things. It's something I can at least look at again when I get into the weeds on this stuff, though.
It seems the system was adjusted back when there was no image for half a box.

Since that image is available (which actually makes this a ten-scale rating system), then both mathematically and visually 4.40 is four and a half boxes, not four boxes as shown now.

In addition, I would suggest the actual numbers to be shown next to the boxes.

Cheers!



We've had the half-box image as long as all the others, so it can't be that. I built this a while ago, so I don't remember my specific reasoning for each thing, though I know I put a lot of thought into it at the time.

I do remember a few issues with the sliding effect on hover, and I think that was related: some stuff where it kinda "felt bad" (which, no joke, is actually a guiding principle in UX design, however vague it may sound) at the low end since it changes in a matter of pixels.

Rounding 4.4 to 4.5 is probably reasonable, but I'll see how it looks (and feels).