I had a product idea I have yet to make where you replace ratings with rankings. Instead of giving something a 1-5 review, you just answer a few quick questions about whether something is better or worse than a listed alternative. You aggregate enough rankings and you can give everything a percentile score. The number is actually meaningful: a 70% means people, on average, rank it above 70% of the alternatives it has been compared to.
And you can't lie or influence a ranking as easily. "You think Rings of Power is a good show? Okay, but are you actually going to rank it above The Sopranos?"
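Roughly, the aggregation could work something like this; a minimal sketch, assuming votes come in as (winner, loser) pairs, and where the function name `percentile_scores` and the example shows are just illustrative:

```python
from collections import defaultdict

def percentile_scores(comparisons):
    """Turn raw pairwise votes into a 0-100 score per item.

    `comparisons` is an iterable of (winner, loser) pairs. Each item is
    scored by the average probability that it beats the alternatives it
    has actually been compared against, estimated from the vote counts.
    """
    wins = defaultdict(lambda: defaultdict(int))
    items = set()
    for winner, loser in comparisons:
        wins[winner][loser] += 1
        items.update((winner, loser))

    scores = {}
    for item in items:
        probs = []
        for other in items - {item}:
            w, l = wins[item][other], wins[other][item]
            if w + l:  # only count pairs that have actually been voted on
                probs.append(w / (w + l))
        scores[item] = 100 * sum(probs) / len(probs) if probs else None
    return scores


votes = [("The Sopranos", "Rings of Power"),
         ("The Sopranos", "Some Other Show"),
         ("Rings of Power", "Some Other Show")]
print(percentile_scores(votes))
# -> The Sopranos 100.0, Rings of Power 50.0, Some Other Show 0.0 (dict order may vary)
```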
Hasn't failed me yet, and if you do it as an org it helps with arguments down the line.
Are you aware of Metacritic? They take all kinds of ratings, scales, stars, grades, etc., from all kinds of critics and reviewers and turn them into nice 1-100 ratings to average.
Why not? It depends on what you are looking for at that moment.
If an entry creates a logical inconsistency (say A is ranked above B, B above C, but C above A), that is an opportunity to re-present the data in a different order and see if the user changes their ranking. This will actually help to keep the data fresh. And you can retain "fuzzy" rankings in certain areas without threatening the accuracy of the overall database.
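An inconsistency here is just a cycle in the user's preference graph, so spotting the entries worth re-asking about could be a plain cycle search; a sketch, where `find_preference_cycle` and the input format are my own assumptions:

```python
def find_preference_cycle(prefs):
    """Return one intransitive cycle (A > B > ... > A) in a user's pairwise
    preferences, or None if their rankings are consistent.

    `prefs` is an iterable of (preferred, other) pairs for a single user.
    """
    graph = {}
    for better, worse in prefs:
        graph.setdefault(better, set()).add(worse)
        graph.setdefault(worse, set())

    WHITE, GREY, BLACK = 0, 1, 2  # unvisited / on current DFS path / finished
    state = {node: WHITE for node in graph}

    def dfs(node, path):
        state[node] = GREY
        path.append(node)
        for nxt in graph[node]:
            if state[nxt] == GREY:  # walked back into our own path: a cycle
                return path[path.index(nxt):] + [nxt]
            if state[nxt] == WHITE:
                cycle = dfs(nxt, path)
                if cycle:
                    return cycle
        state[node] = BLACK
        path.pop()
        return None

    for node in graph:
        if state[node] == WHITE:
            cycle = dfs(node, [])
            if cycle:
                return cycle
    return None


# A user who says A > B and B > C, but also C > A:
print(find_preference_cycle([("A", "B"), ("B", "C"), ("C", "A")]))
# -> ['A', 'B', 'C', 'A']
```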
If you want a multivariable structure, users could rank more than one facet at once. So for a car, you could compare whether a Honda Civic is better or worse than a Toyota Corolla on handling, comfort, features, etc. Combine this with non-subjective data (price, 0-60, etc.) and users can choose whether they want an aggregate ranking or one weighted by their own criteria.
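The weighted option could be as simple as a weighted average of per-facet percentile scores; a sketch assuming each facet already has a 0-100 score from the pairwise votes, with the facet names, weights, and numbers made up:

```python
def weighted_score(facet_scores, weights=None):
    """Combine per-facet percentile scores into a single number.

    `facet_scores` maps facet name -> 0-100 score (from the pairwise votes).
    `weights` maps facet name -> relative importance; leaving it out gives
    every facet equal weight, i.e. the plain aggregate ranking.
    """
    if weights is None:
        weights = {facet: 1.0 for facet in facet_scores}
    total = sum(weights.get(f, 0.0) for f in facet_scores)
    return sum(s * weights.get(f, 0.0) for f, s in facet_scores.items()) / total


civic = {"handling": 72, "comfort": 64, "features": 58}  # made-up numbers

print(weighted_score(civic))                                                # ~64.7
print(weighted_score(civic, {"handling": 3, "comfort": 1, "features": 1}))  # 67.6
```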
A 7/10 from GameSpot is useless data because they give 7/10 to everything.
Is Schindler's List a better Comedy than Gone With the Wind?
Is Schindler's List a better Romance than Gone With the Wind?
Is Schindler's List a better Documentary than Gone With the Wind?
Imagine if you had the dataset to say "remove every reviewer who ranked God of War above Spiritfarer"; you would probably be left with an amazing set of recommendations.
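Assuming the dataset maps each reviewer to their list of (preferred, other) pairs, that filter is tiny; a sketch only, with all the names made up:

```python
def drop_reviewers_who_ranked(dataset, winner, loser):
    """Drop every reviewer whose votes include `winner` beating `loser`.

    `dataset` maps a reviewer id to their list of (preferred, other) pairs.
    The filtered dataset can then be re-aggregated into recommendations.
    """
    return {
        reviewer: prefs
        for reviewer, prefs in dataset.items()
        if (winner, loser) not in prefs
    }


reviews = {
    "alice": [("God of War", "Spiritfarer")],
    "bob":   [("Spiritfarer", "God of War")],
}
print(drop_reviewers_who_ranked(reviews, "God of War", "Spiritfarer"))
# -> only "bob" is left
```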
If, in the future, your tastes change, a few things simply get ranked "above" what formerly held your top slot. The top slot was never "200 absolute points"; it was just "the highest single ranking."
Although, I do see the coarseness of a new #1 bumping everything down … and forcing a reconsideration of whole blocks of rankings … arriving at "groups" … and basically a star system.
If you have enough people willing to mindlessly swipe on random comparisons during the day so they can see their own report, and you could properly sort and tag all the categories, you could have a truly bonkers data set. Like whether, dollar for dollar, consumers prefer the Barbie movie to owning a Porsche Cayenne.
The _Actors Who Love History But Not Accuracy_ list. Mel Gibson holding #1, but look out: Costner is on the rise!
- [0] my mood changes more often than my tastes do
If I've spent all day on calls and then proceed to watch, for example, anything by Aaron Sorkin, I'm likely to treat it less charitably (because I'm tired of flapping gums) than if I watched it after a week in the desert (when human contact is wonderful).
My mood would color ratings as well …
How would one flatten the effect of mood on an either-or ranking system? Is it possible?
I suppose the system would have to ask, for every movie watched, how you'd rank it given a particular mood? So it's "is A better than B when you want something with deep thinking to watch, is A better than B when you want something easy to follow?" But that has its own can of worms: sometimes I want to watch something with deep thinking even though I am in the mood to unwind...
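One way to keep mood in the picture is to tag every vote with the mood it was cast in and aggregate per mood; a sketch, where the class name, mood labels, and example titles are all made up:

```python
from collections import defaultdict

class MoodAwareRankings:
    """Keep each pairwise vote tagged with the mood it was cast in, so the
    same pair of titles can be ranked differently per mood."""

    def __init__(self):
        # mood -> (winner, loser) -> win count
        self.votes = defaultdict(lambda: defaultdict(int))

    def record(self, winner, loser, mood):
        self.votes[mood][(winner, loser)] += 1

    def preferred(self, a, b, mood):
        """Return whichever of `a` and `b` has more wins in this mood,
        or None if there is no data or it is a tie."""
        a_wins = self.votes[mood][(a, b)]
        b_wins = self.votes[mood][(b, a)]
        if a_wins == b_wins:
            return None
        return a if a_wins > b_wins else b


r = MoodAwareRankings()
r.record("The Social Network", "Moneyball", mood="deep thinking")
r.record("Moneyball", "The Social Network", mood="easy to follow")
print(r.preferred("The Social Network", "Moneyball", mood="deep thinking"))
# -> The Social Network
```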
All in all, I think it's a waste of time to catalogue our own tastes and try to build a personal recommendation system. I hope/think/want to believe that knowing ourselves gives a better reward.