|
07-7-2018, 12:25 PM | #1 | |
✘ Forever OP✘
Join Date: Dec 2008
Location: Canada,Quebec
Age: 29
Posts: 4,171
|
Re: Entropy Gain for per-receptor NPS
Quote:
If you meant it as some kind of attribute to predict difficulty, could you please explain your reasoning? Otherwise, I'm sorry, I can't do that. |
|
07-7-2018, 12:29 PM | #2 |
I am leonid
Join Date: Oct 2008
Location: MOUNTAIN VIEW
Age: 35
Posts: 8,080
|
Re: Entropy Gain for per-receptor NPS
It gives a rough estimation of difficulty through general performances on the chart:
Low % = hard, high % = easy. But you need a server to log all the user scores, users have to be online, and the chart needs a large enough number of players. Using a neural network is like assigning one person to judge all the difficulties (since it's supposed to map human brains and whatnot), but what if you disagree with that person? |
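To make the score-log idea concrete, here's a rough Python sketch. Everything here (function name, the 0.93 goal threshold, the sample scores) is illustrative and made up, not FFR's actual system:

```python
# Toy sketch of difficulty-from-logged-scores: the fraction of plays that
# reach some goal accuracy. Lower pass rate = harder chart.
# The goal threshold and data below are invented for illustration.

def pass_rate(scores, goal=0.93):
    """Fraction of logged plays that reach `goal` accuracy; None if no data."""
    if not scores:
        return None  # chart doesn't have enough players yet
    passed = sum(1 for s in scores if s >= goal)
    return passed / len(scores)

hard_chart = [0.80, 0.85, 0.91, 0.95]  # only one play reaches the goal
easy_chart = [0.94, 0.96, 0.99, 0.97]  # every play reaches the goal
print(pass_rate(hard_chart))  # 0.25 -> reads as hard
print(pass_rate(easy_chart))  # 1.0  -> reads as easy
```

As the post notes, this only works once a chart has enough logged plays; with few players the estimate is noise.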
07-7-2018, 12:40 PM | #3 | |
✘ Forever OP✘
Join Date: Dec 2008
Location: Canada,Quebec
Age: 29
Posts: 4,171
|
Re: Entropy Gain for per-receptor NPS
Quote:
As for the neural net, I have no clue why people are all on it; I don't recall mentioning it in this thread. That being said, you wonder what happens if people don't agree with a neural net's output? Well, if the vast majority agrees with the net, then those who disagree should try to see if they're biased because of their skillset and understand what led to that output. If only a minority agrees with the output, then the possibility of it being wrong is greater and the chart would need closer inspection to see why that is the case. It's just how things go when you have no predefined output class or labeled input. |
|
07-7-2018, 01:01 PM | #4 |
FFR Veteran
Skill Rating Designer
Join Date: Jan 2016
Age: 28
Posts: 282
|
Re: Entropy Gain for per-receptor NPS
Just for the record, I've tried to produce a difficulty algorithm primarily based on "distance to last note on each arrow/hand".
I don't know if there's something inherently wrong with this approach or if I was just too inexperienced at programming to see it through to a satisfactory completion, but I was unable to get a result that was deemed usable by myself and the difficulty consultants with whom I discussed the results.

On the subject of neural nets, both Trumpet63 and I have attempted to use neural nets on FFR's song difficulties using extended level stats. Trumpet got his neural net closer than mine (his had a mean difference of 2.4 points from the actual value, whereas mine was 4-5 IIRC), but his used several features (such as note color) that could be cheesed by a clever stepfile artist to over- or under-represent the difficulty of their file (e.g. if white notes = high diff, throw in a lot of white grace notes that function identically to jumps in practice).
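For anyone curious what "distance to last note on each arrow" looks like as a feature, here's a minimal Python sketch (my own toy encoding of a chart as (time, column) pairs, not RenegadeLucien's actual code):

```python
# Illustrative sketch: per-note "time since the previous note in the same
# column" feature. Notes are (time_ms, column) pairs; columns 0-3 = L,D,U,R.

def gaps_per_column(notes, n_columns=4):
    """For each note, the time gap to the previous note on the same column,
    or None if it's the first note on that column."""
    last_seen = [None] * n_columns
    gaps = []
    for t, col in notes:
        prev = last_seen[col]
        gaps.append(None if prev is None else t - prev)
        last_seen[col] = t
    return gaps

# A fast left-arrow jack produces small same-column gaps:
jack = [(0, 0), (100, 0), (200, 0), (300, 1)]
print(gaps_per_column(jack))  # [None, 100, 100, None]
```

Small gaps on one column flag jacks/minijacks; how to aggregate them into a single rating is exactly the hard part discussed in this thread.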
__________________
|
07-7-2018, 01:07 PM | #5 | |
✘ Forever OP✘
Join Date: Dec 2008
Location: Canada,Quebec
Age: 29
Posts: 4,171
|
Re: Entropy Gain for per-receptor NPS
Quote:
What metrics did you use in relation to that distance? Was it min/max/avg/distribution/...? Because, just like NPS, it sounds like a solution that needs quite a few statistical values. Yes, note color is arbitrary. |
|
07-7-2018, 01:20 PM | #6 |
FFR Veteran
Skill Rating Designer
Join Date: Jan 2016
Age: 28
Posts: 282
|
Re: Entropy Gain for per-receptor NPS
It was not pure NPS (NPS was included in the algorithm, but only as a small factor). It was more like "give each note a value based on how close it is to the next one on the next arrow/hand/overall, then sum everything and take the highest consecutive X notes, add factors for stamina/consistency/NPS."
I did a bunch of playing around with the factors and scales, but I would always end up with either long streamy files being rated way too high or big spiky files being rated way too high (or both).
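The "value each note by closeness, then take the hardest consecutive window" idea can be sketched like this. The weights (1000/gap) and window size are made-up illustrative numbers, not the actual factors from that algorithm:

```python
# Sketch of "per-note value from the gap to the next note, then take the
# highest-sum window of consecutive notes". Times in ms; all constants
# are invented for illustration.

def note_values(times_ms):
    """Closer next note => higher value; the last note gets 0."""
    vals = []
    for a, b in zip(times_ms, times_ms[1:]):
        gap = b - a
        vals.append(1000.0 / gap if gap > 0 else 0.0)
    vals.append(0.0)
    return vals

def hardest_window(values, window=4):
    """Highest sum of `window` consecutive note values."""
    if len(values) < window:
        return sum(values)
    return max(sum(values[i:i + window]) for i in range(len(values) - window + 1))

burst_then_gap = [0, 100, 200, 300, 400, 1400, 1500]
print(hardest_window(note_values(burst_then_gap)))  # 40.0 (the dense burst)
```

This also makes the failure mode visible: one dense window dominates the rating, which is exactly the "spiky files rated way too high" problem described above.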
__________________
Last edited by RenegadeLucien; 07-7-2018 at 01:21 PM.. |
07-7-2018, 01:29 PM | #7 | |
✘ Forever OP✘
Join Date: Dec 2008
Location: Canada,Quebec
Age: 29
Posts: 4,171
|
Re: Entropy Gain for per-receptor NPS
Quote:
|
|
07-7-2018, 01:45 PM | #8 |
FFR Veteran
Skill Rating Designer
Join Date: Jan 2016
Age: 28
Posts: 282
|
Re: Entropy Gain for per-receptor NPS
I'd need to experiment with it to get a definitive answer. I can see the value in having something like that, but it would be difficult to separate actual spikes/bursts from just natural variance in patterns. Take a staircase, for example: there are gaps of 5 notes between every left arrow, but only 1 between (some) down or up arrows, so the down/up arrows look much harder than the left/right arrows, and this could produce odd results for a difficulty change rate value. You would probably have to look at average difficulty over a short period of notes and use that to determine the difficulty change rate.
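The staircase example is easy to verify in code. A quick sketch (my own encoding, columns 0-3 = L,D,U,R) counting how many notes fall between consecutive hits on each column:

```python
# Illustrative check of the per-receptor variance inside a plain staircase
# (1234321... pattern). Column indices 0-3 = left, down, up, right.

STAIRCASE = [0, 1, 2, 3, 2, 1, 0, 1, 2, 3]

def index_gaps(chart, column):
    """Number of notes between consecutive hits on one column."""
    hits = [i for i, c in enumerate(chart) if c == column]
    return [b - a - 1 for a, b in zip(hits, hits[1:])]

print(index_gaps(STAIRCASE, 0))  # left: [5]    -- 5 notes between hits
print(index_gaps(STAIRCASE, 2))  # up:   [1, 3] -- sometimes only 1 between
```

So a naive per-receptor metric really does see the up/down arrows as much "denser" than left/right, even though the staircase is a perfectly even pattern.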
__________________
|
07-7-2018, 04:57 PM | #9 | |
✘ Forever OP✘
Join Date: Dec 2008
Location: Canada,Quebec
Age: 29
Posts: 4,171
|
Re: Entropy Gain for per-receptor NPS
Quote:
up, __, up, __, __, __, up, __, up
 _, __,  0, __, __, __, -2, __,  2

vs

ri, __, __, __, ri, __, __, __, ri, __, __, __, ri
 _, __, __, __,  0, __, __, __,  0, __, __, __,  0

(changes between 0 and 1 have been normalized to the opposite of their inverse: 0.5 => 2 => -2)

It takes a minimum of 3 notes to have a variation in distance. While it's true that the average is the same (0), you could maybe take the range between the minimum negative value (biggest deceleration) and the maximum positive value (biggest acceleration). Deceleration doesn't affect difficulty; don't forget that this is a per-receptor metric. A file starts at 0 difficulty with 0 notes. If you put a jack at speed x, and then after a few notes its speed changes to x/2, the only problem is going from 0 speed to x speed, not from x to x/2. Gradual acceleration/deceleration isn't considered in this, but you can get a primitive for it using this same concept.

So, for the example of the staircase, if we discard the negative values, we get a max range of 2 on up and down, and a max range of 0 on left and right. And you don't aggregate those in any way, because the min/max on each receptor is important.

Does that cover the type of example you had in mind, Renegade? |
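That per-receptor acceleration idea, with decelerations discarded, can be sketched in a few lines of Python (my own encoding of the example above: each receptor is just its list of gaps between consecutive hits):

```python
# Sketch of per-receptor acceleration: how the gap between consecutive
# notes on ONE receptor changes. Ratios > 1 mean the receptor sped up;
# slowdowns are discarded, per the argument above.

def accelerations(gaps):
    """Ratio of each gap to the next; >1 = speed-up, <1 = slowdown."""
    return [a / b for a, b in zip(gaps, gaps[1:]) if b > 0]

def max_speedup(gaps):
    """Biggest acceleration on this receptor (0 if it never speeds up)."""
    ups = [r for r in accelerations(gaps) if r > 1]
    return max(ups) if ups else 0

# Up arrow from the example: gaps of 2, 4, 2 beats -> slows, then doubles.
print(max_speedup([2, 4, 2]))  # 2.0
# Right arrow: constant 4-beat gaps -> no speed-up at all.
print(max_speedup([4, 4, 4]))  # 0
```

This matches the chart above: up gets a max acceleration of 2 (the -2 deceleration is thrown away), right gets 0.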
|
07-7-2018, 05:23 PM | #10 |
FFR Veteran
Skill Rating Designer
Join Date: Jan 2016
Age: 28
Posts: 282
|
Re: Entropy Gain for per-receptor NPS
So, on spikes vs natural variance: what I mean by "spike", at least in the context of saying that my old algorithms would rate spiky files way too high, was files such as ABCDEath or TTE which have one disproportionately note-heavy section that overshadows everything else in the file. When I say "natural variance", I mean that some arrows in a long pattern like a stream, jumpstream, or staircase will be harder to hit than others.
What I'm trying to avoid is seeing a staircase, getting a max range of 2 on up or down as you described, and falsely claiming that the staircase is a spike when in reality it's just a staircase. Whatever metric is used to determine the rate would have to be able to tell the difference.
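One way to make "tell the difference" concrete is the smoothing idea mentioned earlier in the thread: average per-note difficulty over a short window, so single-pattern variance flattens out while a genuinely dense section still stands out. A toy sketch (invented numbers, not either poster's actual algorithm):

```python
# Sketch: a moving average over a few notes hides the alternating
# per-note variance of a staircase but preserves a real dense section.

def smoothed(values, window=4):
    """Moving average of per-note difficulty values over `window` notes."""
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

staircase_variance = [1, 2, 1, 2, 1, 2, 1, 2]  # alternating, no real spike
true_spike         = [1, 1, 1, 4, 4, 4, 1, 1]  # one genuinely dense section

print(max(smoothed(staircase_variance)) - min(smoothed(staircase_variance)))  # 0.0
print(max(smoothed(true_spike)) - min(smoothed(true_spike)))                  # 1.5
```

After smoothing, the staircase's difficulty-change range collapses to zero while the real spike keeps a large range, which is roughly the distinction being asked for.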
__________________
|
07-7-2018, 06:55 PM | #11 | |
✘ Forever OP✘
Join Date: Dec 2008
Location: Canada,Quebec
Age: 29
Posts: 4,171
|
Re: Entropy Gain for per-receptor NPS
Quote:
You have to have some mathematical definition of your concepts if they're not used for visualisation only. For example, a spike would be a sudden high density x of notes, at least to my understanding of your description. In a more formal way, you could say it's any section with high acceleration (let's use a trivial number like 4). Also, btw, my metric isn't totally correct for another reason; I'll post a fix to it.

So you then have a trivial definition of a spike. With that, you want to avoid cases where the spike is short (i.e. in a staircase, the two ups or two downs) and constant (the staircase goes on for some time, like 2 measures). The reason it's trivial is, first of all, that there is a trivial threshold to set, and also that the length of said spike is not well bounded.

You mention TTE. Take TTE's fastest spot (a rolly burst like 123412341234) and remove everything before it. The acceleration from nothing to that is equal for each receptor, so min = max = x. Now take a staircase 123432123432 with the distance between two up arrows being equal in this and the roll (from a per-receptor perspective, that is most definitely fair). From nothing to it, 2*min = max = x. It would seem that both are identical; however, for the comparison to hold, the total nps of the spike will be lower on the staircase than on the roll (the amount of notes between the fastest consecutive notes per-receptor being 1 for the staircase and 3 for the roll). Therefore, a distinction Should be made naturally, but the spikiness (again, per-receptor!) will be the same according to the trivial definition.

EDIT: Just to be extra clear, I'll point out that what you refer to as a spike, as we all know it, is easily defined when using all notes (not per-receptor): there's a quick increase and decrease in the nps of the section, and that's it. That metric can be useful, but it's not what I was explaining/arguing in the previous few posts.

Last edited by xXOpkillerXx; 07-7-2018 at 07:01 PM.. |
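The roll-vs-staircase comparison checks out numerically. A quick sketch (my own encoding; times in milliseconds, tuned so the gap between consecutive notes on the same receptor is equal in both charts):

```python
# Sketch of the 123412341234 roll vs 123432123432 staircase comparison:
# equal per-receptor speed, different total density. Times in ms.

def make_chart(pattern, note_gap_ms):
    """Equally spaced (time, column) notes following `pattern`."""
    return [(i * note_gap_ms, col) for i, col in enumerate(pattern)]

def min_receptor_gap(chart, col):
    """Fastest gap between two consecutive notes on one receptor."""
    times = [t for t, c in chart if c == col]
    return min(b - a for a, b in zip(times, times[1:]))

def total_nps(chart):
    span_s = (chart[-1][0] - chart[0][0]) / 1000
    return (len(chart) - 1) / span_s

roll      = make_chart([0, 1, 2, 3] * 3, 100)        # 1234 roll
staircase = make_chart([0, 1, 2, 3, 2, 1] * 2, 200)  # 123432, half speed

# Same per-receptor speed on column 2 ("up"): 400 ms in both...
print(min_receptor_gap(roll, 2), min_receptor_gap(staircase, 2))
# ...but the roll is twice as dense overall:
print(total_nps(roll), total_nps(staircase))
```

So the "trivial" per-receptor spikiness definition rates both the same, while total NPS (10 vs 5 here) is what actually separates them, as argued.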
|
07-7-2018, 10:14 PM | #12 |
FFR Veteran
Skill Rating Designer
Join Date: Jan 2016
Age: 28
Posts: 282
|
Re: Entropy Gain for per-receptor NPS
Yeah I think we're talking about totally different concepts here. Per-receptor spikiness isn't something I ever really considered in my algorithm, at least not beyond "this note is really close to the last note for this receptor, therefore it should have a high value".
I can't think of any files off the top of my head where per-receptor spikiness plays a major factor in the difficulty of the file, so I can't judge how well the simple "this note is close" factor covers it. I do think such a metric would be valuable to have.
__________________
|
07-8-2018, 11:16 AM | #13 | |
✘ Forever OP✘
Join Date: Dec 2008
Location: Canada,Quebec
Age: 29
Posts: 4,171
|
Re: Entropy Gain for per-receptor NPS
Quote:
|
|
07-8-2018, 11:36 AM | #14 |
✘ Forever OP✘
Join Date: Dec 2008
Location: Canada,Quebec
Age: 29
Posts: 4,171
|
Re: Entropy Gain for per-receptor NPS
Here's something I am wondering about per-hand stuff.
Is it safe to assume that any section with a high nps (x) on {3} and a lower nps on {4} is Harder than having x nps on both {3} and {4}? Any counterexample is welcome. More visually, I'm thinking that [34]4[34]4 is always harder than [34][34][34][34]. But only per-hand, so the same wouldn't apply with combinations of receptors like {2} and {3}, or {2} and {4}, etc. And by always I mean no matter what is before it, after it, and what's going on on the other receptors.

EDIT: I will even go as far as claiming that if x is the nps on {3} and y is the nps on {4}, the peak of that per-hand difficulty is reached when x = 2y or 2x = y. When you lower the smaller nps, you get things like [34]44[34]44[34]44, and when you raise it, you get [34][34]4[34][34]4, both of which I would argue are objectively easier than [34]4[34]4[34]4.

Last edited by xXOpkillerXx; 07-8-2018 at 12:29 PM.. Reason: oops meant easier in last sentence |
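The nps ratios behind the three patterns can be computed directly. A sketch with my own encoding ("[34]" = both right-hand receptors in one slot, "4" = the right arrow alone, equally spaced slots):

```python
# Sketch of the per-hand nps ratios for the patterns discussed above.
# Each slot is a set of receptors hit at that (equally spaced) moment.

def per_receptor_nps(pattern, col, slot_ms=100):
    """Notes per second on one receptor, given one pattern repetition."""
    hits = sum(1 for slot in pattern if col in slot)
    span_s = len(pattern) * slot_ms / 1000
    return hits / span_s

J, S = {3, 4}, {4}  # J = [34] jump, S = lone right arrow

claimed_peak = [J, S] * 3     # [34]4[34]4[34]4
sparser      = [J, S, S] * 2  # [34]44[34]44
denser       = [J, J, S] * 2  # [34][34]4[34][34]4

for name, p in [("peak", claimed_peak), ("sparser", sparser), ("denser", denser)]:
    x = per_receptor_nps(p, 3)  # nps on receptor {3}
    y = per_receptor_nps(p, 4)  # nps on receptor {4}
    print(name, round(y / x, 2))  # peak 2.0, sparser 3.0, denser 1.5
```

So [34]4[34]4 is exactly the 2x = y case of the claim, and the two variants called easier sit on either side of that ratio (3:1 and 1.5:1). Whether the 2:1 ratio really is the difficulty peak is the open question, of course.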
07-8-2018, 12:42 PM | #15 | ||
FFR Veteran
Skill Rating Designer
Join Date: Jan 2016
Age: 28
Posts: 282
|
Re: Entropy Gain for per-receptor NPS
Quote:
Quote:
__________________
|
||
07-8-2018, 01:09 PM | #16 | |
✘ Forever OP✘
Join Date: Dec 2008
Location: Canada,Quebec
Age: 29
Posts: 4,171
|
Re: Entropy Gain for per-receptor NPS
Quote:
How high does your algo rate DP compared to Undici? Those are really hard files and none have been AAA'd yet. Right now, they're only 2 points apart from each other, and I would Not consider the opposite to be an error, because it's only a single file (it's better to look at results as a whole first and then understand the difference between particular files; so, not having your complete results, I can only assume things). As for your AIM example, it would make no sense to put a long jack and a long jack with 1 jump in it at the same exact difficulty on a real-number scale. The one with the jump Has to be harder, even if it's by a very small amount. |
|
07-8-2018, 01:52 PM | #17 | ||
FFR Veteran
Skill Rating Designer
Join Date: Jan 2016
Age: 28
Posts: 282
|
Re: Entropy Gain for per-receptor NPS
Quote:
Quote:
__________________
|
||
07-8-2018, 01:11 PM | #18 |
I am leonid
Join Date: Oct 2008
Location: MOUNTAIN VIEW
Age: 35
Posts: 8,080
|
Re: Entropy Gain for per-receptor NPS
Does it also address the fact that difficulty differs based on what your goal is? Charts can be trivial to AA but impossible to AAA, or there's some stupid minefield that makes it really hard to pass but once you survive it's a guaranteed AA, etc
Last edited by leonid; 07-8-2018 at 01:12 PM.. |
07-8-2018, 01:25 PM | #19 | |
✘ Forever OP✘
Join Date: Dec 2008
Location: Canada,Quebec
Age: 29
Posts: 4,171
|
Re: Entropy Gain for per-receptor NPS
Quote:
EDIT: @leonid: Stepmania is different, obviously. What you describe as difficulty to AA, AAA, or pass are all very distinct values that may have a similar computing process but would have their own specific primitives. You can't possibly have a single metric for overall difficulty when your definition of difficulty is an undefined combination of 3 distinct aspects; otherwise you end up with obviously biased results that are very hard to interpret. A fair comparison can be made with Etterna's calculator: if overall difficulty is some aggregate (like avg or weighted avg) of the per-pattern difficulties (jack, stream, js, etc), then it's not a surprise that they have so many files to ban from leaderboards.

Last edited by xXOpkillerXx; 07-8-2018 at 01:50 PM.. |
|
07-8-2018, 02:47 PM | #20 |
Banned
|
Re: Entropy Gain for per-receptor NPS
I don't like difficulty being "one value".
It should vary in magnitude throughout the song, and there should be different kinds of difficulty. Like, how do scores change if you're only slightly less good at hitting something than another player? The difficulty might not differ by much, but if it means the difference between AAA'ing and good-rushing a difficult jumpstream, then the scores are highly sensitive to skill. Maybe that's a good measure: change in score vs. change in skill in a certain direction? Dunno. Thoughts aren't fleshed out at all, just food for thought. |
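That "change in score vs. change in skill" idea is basically a derivative, which is easy to sketch with a finite difference. The score model below is entirely made up (a toy logistic curve) just to show the shape of the idea:

```python
# Very rough sketch of score sensitivity to skill. The logistic score
# model and all numbers here are invented for illustration only.
import math

def expected_score(skill, chart_difficulty):
    """Toy model: score approaches 1 as skill exceeds the chart's level."""
    return 1 / (1 + math.exp(-(skill - chart_difficulty)))

def score_sensitivity(skill, chart_difficulty, eps=1e-4):
    """Finite-difference d(score)/d(skill) at a given skill level."""
    hi = expected_score(skill + eps, chart_difficulty)
    lo = expected_score(skill - eps, chart_difficulty)
    return (hi - lo) / (2 * eps)

# Sensitivity peaks when the player's skill sits right at the chart's level:
print(score_sensitivity(50, 50))  # ~0.25, the steepest point of the curve
print(score_sensitivity(70, 50))  # ~0, chart too easy to discriminate skill
```

Under this toy model, a chart "measures" skill best for players right at its level, which lines up with the AAA-vs-good-rush intuition: small skill differences only swing scores where the chart is on the edge of your ability.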