A scheme for rating player skill/experience levels
Has anyone thought of a way of measuring a player's skill level?
It would be good to be able to state your level of skill in a recognised way as it would put forum comments into context.
If this community could agree on a way of doing it, it would also open up a way of ranking games that is consistent with player ranking. For example, if it became apparent that a number of players of a given skill rank (or range of ranks) had 1CC-ed a certain set of games, then the games could be grouped by difficulty, describing a path to improving your skill while avoiding unnecessary frustration.
If there was a survey that everyone completed, listing how far they progressed in each game they played (loop & level number, or ALL) and a high score, that could provide a basis for comparing skill levels.
For games with different characters, separate entries would have to be made for each combination; it might become apparent which characters require higher levels of skill.
To reduce complexity, game settings such as difficulty level, lives, bombs etc. would have to be standardised for each game, as they are in the high score tables.
A ranking calculated from the resulting dataset would need to average performances out, but at the same time give some weight to how many games someone has played. Games with fewer claimed 1CCs could also attract a weighting so that those who complete them get recognition for it, while games with more recorded 1CCs would add less to a player's rank.
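To make the weighting idea concrete, here is a minimal sketch (the game names, claim counts, and the inverse-count weighting are all invented for illustration, not a proposal of the final formula): each claimed 1CC adds points inversely proportional to how many players have claimed a 1CC of that game, so rarer clears count for more.

```python
# Hypothetical sketch: a player's rank is the sum, over their claimed 1CCs,
# of 1 / (number of players claiming a 1CC of that game). Rare clears add
# more to the rank; widely cleared games add less.
def player_rank(player_clears, all_clears):
    """player_clears: set of game names this player has 1CC-ed.
    all_clears: dict mapping game name -> total number of claimed 1CCs."""
    return sum(1.0 / all_clears[game] for game in player_clears)

clears = {"Game A": 40, "Game B": 4, "Game C": 1}   # made-up claim counts
print(player_rank({"Game A", "Game B"}, clears))    # 1/40 + 1/4 = 0.275
print(player_rank({"Game C"}, clears))              # a single rare clear: 1.0
```

Any number of refinements would be possible (partial progress, loops, score), but the shape stays the same: sum per-game contributions, weighted by rarity.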
What's the opinion on something like this?
Re: A scheme for rating player skill/experience levels
It's not really possible, and it's kind of pointless.
@trap0xf | daifukkat.su/blog | scores | FIRE LANCER
<S.Yagawa> I like the challenge of "doing the impossible" with older hardware, and pushing it as far as it can go.
Re: A scheme for rating player skill/experience levels
Well, for starters, it would require more people playing the games than we have here, so I don't think any ranking-based system would work.
Re: A scheme for rating player skill/experience levels
trap15 wrote: It's not really possible, and it's kind of pointless.
Would adding context to forum posts and formulating an evidence-based method of grading games for progression not be of value?
OK. Maybe we don't have enough people here to compile enough data to do it. If there are enough members to rank the top 25 games purely subjectively, though, I thought this might be worth trying. Surely you have to play a game to be able to rank it? Shrugs.
Last edited by davyK on Tue Mar 22, 2016 7:28 pm, edited 1 time in total.
Re: A scheme for rating player skill/experience levels
I find the idea appealing for the purpose of measuring your own progress beyond just the score in one game. It would be cool if you had some kind of "Level" which summarizes the whole calculation. Practicing a game would then be sort of like grinding for exp in an RPG. Of course that's already the case in the sense that you also practice common skills required by all STGs, but having a number go up is always fun.
But finding a scheme will be rather difficult. While it should actually be possible to design something which makes sense and is fair, it probably won't be possible to get everyone to agree on it.
Re: A scheme for rating player skill/experience levels
People have posted 1CC lists - even an analysis of those could be rather revealing...
Re: A scheme for rating player skill/experience levels
This is a fascinating question. Any answer to this is bound to be wrong, but it would still be informative.
Personally, I use two references: Perikles' excellent 16-bit difficulty ranking and the Japanese STG difficulty wiki. These are good rules of thumb. I disagree with some rankings, but that's only natural, considering everyone will have different skill sets. You should at least thumb through these. Look up the easiest game and the hardest game you've played. Going by Perikles, I'm about a 3.
It also strikes me that Restart Syndrome has tons of data. I'm assuming that you could query it to count how many 1CCs each game has. That would give you some idea of its difficulty, though of course more popular games will have more clears.
But wait! Restart Syndrome already has a player ranking.
CStarFlare
Re: A scheme for rating player skill/experience levels
RS's player ranking is quite iffy - I was just throwing lists out. 
If something comes out of this discussion I would be more than willing to lend the Restart Syndrome dataset for analysis.

Re: A scheme for rating player skill/experience levels
It'd only make much sense if you were to also categorize the different skills emphasized in different games. But even that's complicated by the fact that one game may have different ways to approach it depending on what skills you're more comfortable with. While it does make sense to compare the difficulty of similar games (like, say, comparing Touhou games with each other), doing so across the genre as a whole so far has only worked marginally better than "hardest video game ever" lists that compare the difficulty of a turn-based RPG with that of a 3D hack & slash, and it's unlikely to get much better than anecdotal evidence unless you've got a crapton of data to analyze.
If you want to get better, play games that you like. It's always easier to play a game that you like than one that you dislike.
Re: A scheme for rating player skill/experience levels
I wouldn't want to drill down too deep into what type of game requires what skills... I would propose that if you had enough info on how far players have progressed in a set of games, it would become self-evident which games are more difficult.
If you record how far people progress in a game even if they don't complete it, then you could look at the ratio of players who have played a game to those who have 1CC-ed it, which gives more insight into a game and would guard against a game's popularity skewing results.
You can of course talk about the rarity of a game and its impact etc., but no measure would be perfect. However, I believe it would provide, at the very least, an interesting indicator of player skill and game difficulty, and that's all anyone could hope for.
If the raw data was gathered and structured in some way to facilitate analysis and then made available to all, it could give rise to many different ways of evaluating it.
... just saying.
Maybe people don't want to admit to not being as good as others? It would be easy to anonymise the data by substituting a reference ID for the member name.
colour_thief
Re: A scheme for rating player skill/experience levels
trap15 wrote: It's not really possible, and it's kind of pointless.
Not debating that it would be anything more than a novelty, but it's totally feasible with a Netflix recommendations level of accuracy.
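For what it's worth, the "Netflix recommendations" approach would amount to collaborative filtering over a player-by-game results matrix. A toy rank-1 sketch (all data invented) that factorizes a 1CC matrix into a per-player skill factor and a per-game easiness factor via alternating least squares, the miniature version of what the Netflix Prize systems did at scale:

```python
# Toy collaborative-filtering sketch (all data invented): factorize a
# player-by-game 1CC matrix M (1 = cleared, 0 = not) as
# M[p][g] ~ skill[p] * easiness[g], using alternating least squares.
M = [
    [1, 1, 1, 0],   # strong player: clears almost everything
    [1, 1, 0, 0],   # middling player
    [1, 0, 0, 0],   # beginner: clears only the easiest game
]
P, G = len(M), len(M[0])
skill = [1.0] * P
easiness = [1.0] * G
for _ in range(20):
    # Fix easiness, solve for each player's skill in closed form...
    for p in range(P):
        num = sum(easiness[g] * M[p][g] for g in range(G))
        den = sum(easiness[g] ** 2 for g in range(G))
        skill[p] = num / den
    # ...then fix skill and solve for each game's easiness.
    for g in range(G):
        num = sum(skill[p] * M[p][g] for p in range(P))
        den = sum(skill[p] ** 2 for p in range(P))
        easiness[g] = num / den

# Players come out ordered by the breadth/difficulty of their clears.
print(sorted(range(P), key=lambda p: skill[p], reverse=True))  # → [0, 1, 2]
```

With real data you would want missing-entry handling and regularization, but the principle — jointly inferring player skill and game difficulty from who cleared what — is the same.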
Re: A scheme for rating player skill/experience levels
Fun idea, I suppose you could summarize the players' rankings in the high score threads and then weigh the game against how high it ranks in the complete Top Shmups of All Time list that we voted for.
Re: A scheme for rating player skill/experience levels
I think there's two parts of skill, one is the practiced puzzle it out memo skill, and the other is the sight-reading skill (to put it into musical terms).
ProjectAKo
Re: A scheme for rating player skill/experience levels
Yes, and the main one you're going to be using is the memo skill in scoring, so it really is pretty pointless. I don't think anyone's going to care if you go around saying "Yes, I am someone who routinely puts hundreds of hours into memorizing layouts, researching exploits, and drilling motions into my muscle memory." nor will they be able to glean much useful information from that.
Re: A scheme for rating player skill/experience levels
The Top 5 Pound-for-Pound Shmups.com Players
1. Jaimers
2. iconoclast
3. BOS
4. Erppo
5. saucykobold
If any one of these players shows up in your score thread, you're gonna have a bad time.
Bananamatic
Re: A scheme for rating player skill/experience levels
pazzy, x91, iconoclast, prometheus and 1 more random person
not listing any touhou people because you guys dont give a shit about them despite some amazing accomplishments
Re: A scheme for rating player skill/experience levels
NTSC-J wrote: 1. Jaimers
2. iconoclast
3. BOS
4. Erppo
5. saucykobold
Do any of them have any Raizing scores, besides BOS's Dimahoo?
"I've had quite a few pcbs of Fire Shark over time, and none of them cost me over £30 - so it won't break the bank by any standards." ~Malc
Bananamatic
Re: A scheme for rating player skill/experience levels
they all cheat by using savestates so it doesnt count
Re: A scheme for rating player skill/experience levels
Probably used game genie too. 

"I've had quite a few pcbs of Fire Shark over time, and none of them cost me over £30 - so it won't break the bank by any standards." ~Malc
Re: A scheme for rating player skill/experience levels
Bananamatic wrote: pazzy, x91, iconoclast, prometheus and 1 more random person
Pazzy's DOJ score is arguably the best score on the forum, but it's his only score. x91 and prom also only have a few (but they're very, very good). Zil is another who dominates in particular games.
chempop wrote: Do any of them have any Raizing scores, besides BOS's Dimahoo?
That's why I used "pound for pound". Our Raizing players tend to specialize (although Plasmo has good scores in a lot of their games).
The five I listed have many, many good scores. Jaimers in particular has shown proficiency in a wide variety, from Taito to Psikyo to Cave.
Edit: I hope no one takes my list too seriously. We have a lot of talented players here and it's great to see how much the skill level has risen over the past five to ten years.
Re: A scheme for rating player skill/experience levels
NTSC-J wrote: The Top 5 Pound-for-Pound Shmups.com Players
1. Jaimers
2. iconoclast
3. BOS
4. Erppo
5. saucykobold
If any one of these players shows up in your score thread, you're gonna have a bad time.
Didn't we already have that thread?
Re: A scheme for rating player skill/experience levels
People who play a lot of one game are going to have a huge memo advantage over people who don't! The only way to truly know who is the best is by letting everyone play a newly released shmup for a few hours, then compare scores! Memorization, strategy and watching replays is literally cheating.

Re: A scheme for rating player skill/experience levels
davyK wrote: If you record how far people progress in a game even if they don't complete it then you could look at the ratio of players who play a game to those who have 1CCed it - which gives more insight into a game and would be a measure against a game's popularity skewing results.
I have an idea.
Some kind of website listing a bunch of shmups. You login (with the Shmup Forums account, maybe, to prevent cheating?). "What game did you play?" You click on one of them, it adds up to the number of people who played it. "Did you clear it? Yes/No" "How many hours did you put into it until you cleared it/so far?" Doesn't matter if you have to input a precise number or click on one of many options (like 0~3, 3~10...).
This way you would know the difficulty of each shmup (clears divided by players or vice versa, with the average playtime to 1CC it factored in somewhere - I'm no algorithm master, do it yourself), and from that, you'd know the skill of each player.
You can't have something more accurate than this. Unless you can and I'm an arrogant a**hole.
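Under invented numbers, the arithmetic hand-waved above might look like this sketch (the weights and the particular blend of clear rate and playtime are assumptions, not a worked-out formula):

```python
# Rough sketch of the proposed site's difficulty arithmetic (all data and
# weights invented): a game is hard if few players who tried it cleared it,
# and also if the players who did clear it took a long time.
from statistics import median

def difficulty(players, clears, hours_to_clear):
    """players: how many reported playing; clears: how many reported a 1CC;
    hours_to_clear: hours reported by those who cleared it."""
    clear_rate = clears / players if players else 0.0
    grind = median(hours_to_clear) if hours_to_clear else 0.0
    return (1.0 - clear_rate) * 10.0 + grind / 10.0

easy = difficulty(players=100, clears=80, hours_to_clear=[2, 3, 5])
hard = difficulty(players=100, clears=5, hours_to_clear=[60, 90, 120])
print(easy, hard)  # the second game scores much harder
```

Player skill could then be derived from the difficulties of the games each player reported clearing, as the earlier posts suggest.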
Re: A scheme for rating player skill/experience levels
Judging the accuracy of a skill rating must start with the question of what you're going to use the rating for. Are you going to compare players' skill levels? Then you must take into account such factors as their practice methods, over what time period they spread their playtime, how much they knew about the game before playing it or how much they researched outside of the game, what experience they had with similar games, how much they tried scoring at the expense of survival (speaking of which I still haven't cleared Battle Bakraid's Advanced Course...), and a whole host of other factors that I doubt anybody wants to keep track of and report. Besides, it doesn't make sense to me to compare two players who don't play the same games (that's like comparing two musicians who don't play the same instruments), and if they do play the same games then you can already compare their scores.
Are you going to suggest games based on the skill rating? Then, again, an accurate recommendation should take into consideration whether they liked the game (which honestly is really all that should matter), what elements of the game gave them the most difficulty, etc.
Do you want a metric to see your improvement over time? If you are in fact improving your overall skill, I think you'll notice it without the help of a number, but at the same time it's hard to quantify, which is why I have qualms about a quantified rating system. I will admit that checking the ratings of games I've played on the Japanese STG difficulty wiki is fun, but it's also filled with caveats and attempting to correct for confounding factors without a clear understanding of them would risk making it even more misleading. Also, I think returning to games you've played before and seeing your improvement at specific games tells far more than any overall metric could possibly hope to. I do this from time to time with Touhou games, for instance.
I guess I just don't understand what people seek to gain out of this. The main problems are that there's a lot of variety in shmups and a ton of reasons why someone might learn a game faster or slower, such as interest, enjoyment, familiarity, desire to score, and general lifestyle (do you only play when drunk?). Just because two people found one game harder than another doesn't mean they think so for the same reason. One might be trying to do a lot of reactionary dodging and failing at that, the other might be trying to memorize the same patterns and failing the same amount. One might die a lot on stage 6, the other might lose critical resources on stage 3. All in all it makes about as much sense as a standardized rating for how good you are at games in general. If you think that makes sense, then I guess this would also make sense.
Nameschonvergeben
Re: A scheme for rating player skill/experience levels
Is the best western superplayer the person with the most western records?
Last edited by Nameschonvergeben on Wed Mar 23, 2016 1:13 pm, edited 1 time in total.
Re: A scheme for rating player skill/experience levels
You're right, Shepardus, but based on your criteria, no leaderboard can be created for anything.
It's not like people winning medals in track events are judged according to how often they trained, when, how, where, etc, all that counts is that they won the medal. How much time they spent to reach that goal is merely a bonus.
It's clearly much better to do this than look at a random wiki anyone can edit, or nothing at all.
...Are there really people who play for score before clearing it first? I understand doing it for extends, but...
I know there are, just a rhetorical question.
Re: A scheme for rating player skill/experience levels
AxelMill wrote: You're right, Shepardus, but based on your criteria, no leaderboard can be created for anything.
It's not like people winning medals in track events are judged according to how often they trained, when, how, where, etc, all that counts is that they won the medal. How much time they spent to reach that goal is merely a bonus.
It's clearly much better to do this than look at a random wiki anyone can edit, or nothing at all.
Leaderboards are possible for a single game because your goal is to compare the scores themselves, but that's a different problem from coming up with a scoring system that accurately reflects the "skill" you want it to measure. It's hard enough for the decathlon, where the list of events isn't changing all the time and all competitors have scores for all events.
AxelMill wrote: ...Are there really people who play for score before clearing it first? I understand doing it for extends, but...
I know there are, just a rhetorical question.
You know what they say, the journey matters more than the destination.

CStarFlare
Re: A scheme for rating player skill/experience levels
It occurs to me that beatmania IIDX has its DJ Point system, which is perhaps similar to what the OP might have in mind. It utilizes a formula with two main inputs (score and clear level) and adds each track's points into an overall point pool. It's one game but each song can be difficult in very different ways (speed, density, scratching, hold notes, etc), so it takes a couple metrics that are broadly applicable and builds something around them.
You could theoretically do the same thing with shmups - have score (perhaps relative to some base for each game) and clear (perhaps loops too?) give some numerical score for each game, and give each player a cumulative score. It won't tell you who's "best," but it will give people a figure they can use to approximate experience and maybe even progress.
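This is not the actual DJ Point formula, but a shmup analogue of its shape might look like the following sketch (the base scores, clear tiers, and multipliers are all invented for illustration):

```python
# Shape of a DJ-Point-like cumulative rating for shmups (not the real IIDX
# formula; tiers and numbers are hypothetical). Each game contributes points
# from two inputs: score relative to a per-game reference, and clear level.
CLEAR_MULT = {"no clear": 0.5, "1cc": 1.0, "2-all": 1.5}  # invented tiers

def game_points(score, base_score, clear_level):
    """Relative score, scaled by how far the player got, on a 0-100+ scale."""
    return (score / base_score) * CLEAR_MULT[clear_level] * 100.0

def player_points(results):
    """results: list of (score, base_score, clear_level) tuples per game."""
    return sum(game_points(s, b, c) for s, b, c in results)

pts = player_points([
    (12_000_000, 10_000_000, "1cc"),      # cleared, above the reference score
    (3_000_000, 10_000_000, "no clear"),  # partial credit for progress
])
print(round(pts, 1))  # → 135.0
```

As the post says, a cumulative pool like this approximates experience and progress rather than identifying who is "best."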