Hey,
I have been trying to figure out if there is any setting I can use to prevent gestures from toggling overly aggressively. I’ve noticed two things. The first is that you can gesture off/on/off/on seemingly non-stop with no gap between them, which means I will sometimes gesture on and accidentally go right back off, or try to gesture off and have it pop right back on. It seems you can trigger the on and off gestures as fast as you can wiggle the saber.
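To make it concrete, here's a rough C++ sketch of the kind of cooldown I'm imagining. Everything in it is a made-up stand-in (aside from the Arduino-style `millis()` clock), not actual ProffieOS code:

```cpp
#include <cstdint>
#include <cstdio>

// Everything here is a made-up stand-in, not actual ProffieOS code.
static uint32_t fake_clock_ms = 1000;        // pretend time source
uint32_t millis() { return fake_clock_ms; }  // Arduino-style clock

static bool blade_on = false;

// Minimum time that must pass after one power gesture before the
// opposite gesture is allowed to fire.
constexpr uint32_t kPowerGestureLockoutMs = 750;
static uint32_t last_power_toggle_ms = 0;

// Called whenever a twist-on / twist-off gesture is recognized.
void OnPowerGesture() {
  uint32_t now = millis();
  if (now - last_power_toggle_ms < kPowerGestureLockoutMs) {
    return;  // Too soon after the last toggle; ignore the re-trigger.
  }
  last_power_toggle_ms = now;
  blade_on = !blade_on;
  std::printf("t=%ums blade %s\n", (unsigned)now, blade_on ? "ON" : "OFF");
}

int main() {
  OnPowerGesture();      // ignites
  fake_clock_ms += 200;
  OnPowerGesture();      // 200 ms later: ignored, blade stays on
  fake_clock_ms += 800;
  OnPowerGesture();      // lockout expired: retracts normally
}
```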
The second and perhaps larger issue is with the wobble for the twist gesture: if you do left / right / left, it actually treats that as two combos, left/right and right/left. It doesn’t account for the fact that the right was already part of a gesture, which makes triggering the second gesture SUPER easy, and that is of course less than desirable.
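Just to illustrate the fix I have in mind (all names made up, not ProffieOS code), marking a twist segment as "consumed" once it completes a gesture would look roughly like this:

```cpp
#include <cstdio>

// Made-up sketch: a twist segment that completes a gesture is marked
// "consumed" so it can't also serve as the start of the next gesture.
enum class Dir { kNone, kLeft, kRight };

struct TwistDetector {
  Dir prev = Dir::kNone;
  bool prev_consumed = false;  // true if `prev` already finished a gesture

  // Feed each detected twist segment; returns true when a fresh
  // left/right (or right/left) pair should fire the twist gesture.
  bool Feed(Dir d) {
    bool fire = prev != Dir::kNone && d != prev && !prev_consumed;
    prev_consumed = fire;  // the segment that completed a gesture is used up
    prev = d;
    return fire;
  }
};

int main() {
  TwistDetector td;
  const char* names[] = {"none", "left", "right"};
  Dir seq[] = {Dir::kLeft, Dir::kRight, Dir::kLeft};
  for (Dir d : seq) {
    std::printf("%s -> %s\n", names[static_cast<int>(d)],
                td.Feed(d) ? "gesture fires" : "no gesture");
  }
  // Prints: left -> no gesture, right -> gesture fires, left -> no gesture.
}
```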
Are there any gesture-strength or min/max twist options as well? Also, is there a generic “the blade is moving faster than X” value computed anywhere? If so, would it be possible to disable power gestures while the blade is moving faster than some velocity? I don’t mind other gestures like stab, melt, and force push triggering, but it is also quite easy to toggle the blade off while spinning. I know there is a spinning mode, but as a software engineer I hate having manual toggles for things that should be computable.
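To be concrete about the "computable" part, something like this is what I'm picturing (made-up names, not the real gyro API): a smoothed gyro magnitude gating the power gesture.

```cpp
#include <cmath>
#include <cstdio>

// Made-up sketch, not the real gyro API: estimate whether the saber is
// spinning from the gyro magnitude and use that to veto power gestures.
struct Gyro { float x, y, z; };  // angular rates, deg/s

constexpr float kSpinThresholdDegPerSec = 360.0f;  // "spinning" above this
constexpr float kAlpha = 0.2f;                     // smoothing factor

static float smoothed_speed = 0.0f;

// Exponentially smoothed angular speed so one noisy sample can't
// flip the spinning state back and forth.
bool IsSpinning(const Gyro& g) {
  float speed = std::sqrt(g.x * g.x + g.y * g.y + g.z * g.z);
  smoothed_speed = kAlpha * speed + (1.0f - kAlpha) * smoothed_speed;
  return smoothed_speed > kSpinThresholdDegPerSec;
}

// The gesture handler would check this before honoring twist-on/off.
bool PowerGestureAllowed(const Gyro& g) { return !IsSpinning(g); }

int main() {
  Gyro slow{10, 20, 5};      // casual movement
  Gyro fast{500, 300, 100};  // mid-spin
  std::printf("slow: allowed=%d\n", PowerGestureAllowed(slow));
  for (int i = 0; i < 20; ++i) IsSpinning(fast);  // let the filter settle
  std::printf("fast: allowed=%d\n", PowerGestureAllowed(fast));
}
```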
If any or all of this already exists, great. If not, and even if you don’t have time, just pointing me to the files I’d need to poke at would be plenty; I’m happy to take a crack at it myself. My unmanaged-code experience has been limited to small changes in QMK for keyboards (my name here is also my GitHub).
One last question: is there documentation on what data can be sent to another device? I really want to experiment with transmitting gyro data, or more ideally recording it along with button presses, clashes, blocks, etc., and then using a second board with Bluetooth (for Bluetooth timecode like Atomos supports). I still have to figure out a way to do object triangulation with something like AirTags and UWB. I’ve actually considered just sticking a few of them on sabers/cameras/etc. and making an iPhone app that tracks them all and records it (again with timecode).
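Just to be concrete about what I mean by "record it": this is roughly the kind of timestamped record I'd want to stream or log. Purely hypothetical field names, nothing that exists today as far as I know:

```cpp
#include <cstddef>
#include <cstdint>
#include <cstdio>

// Hypothetical record layout for streaming hilt events to a second board,
// to be lined up against camera timecode later.
enum class EventType : uint8_t { kIgnite, kRetract, kClash, kBlock, kButton, kGyroSample };

#pragma pack(push, 1)
struct SaberEvent {
  uint32_t timestamp_ms;  // ms since boot, mapped to camera timecode later
  EventType type;
  int16_t gyro[3];        // raw gyro sample (only meaningful for kGyroSample)
};
#pragma pack(pop)

// Stand-in for a UART / Bluetooth serial write.
void SendBytes(const uint8_t* data, std::size_t len) {
  for (std::size_t i = 0; i < len; ++i) std::printf("%02x ", data[i]);
  std::printf("\n");
}

void SendEvent(const SaberEvent& e) {
  SendBytes(reinterpret_cast<const uint8_t*>(&e), sizeof(e));
}

int main() {
  SaberEvent clash{12345, EventType::kClash, {0, 0, 0}};
  SendEvent(clash);  // 11 bytes per event at this packing
}
```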
I also got an Azure Kinect DK, and I want to get a lidar too, but the overall goal is to take the data from all the hilt activity, sync it with the footage timecode, and then automatically draw the blades, the responses, and so on.
That would be useful both for initial ignitions, where there are no blades to begin with, AND for rotoscoping (or whatever) over the FX blades to make better-looking ones. I still want to shoot mostly with FX blades just because it makes the lighting SOOOO much easier to deal with in post, since you don’t have to manually render lighting from the blade onto people/shadows/objects/etc. (which I assume is why they use FX blades on set now).
I do wish there were more info about how they control theirs, but since they have umbilicals, I assume at least part of that is for power, and part is to pass DMX to the hilt and gyro data to the computer or whatever they connect to. I know there are pro solutions for object tracking like what I mentioned doing with AirTags, but my Komodo was already a stretch; there’s no way I can afford such a pro system >_<.
But I digress. My main concern is just finding a way to avoid the ignite-then-immediately-retract (or vice versa) behavior I seem to trigger far too often (and the less common but still frequent case where I spin the saber, twist in just one direction, and the blade turns off because the last part of the spin happened to twist the other way). I can’t imagine there is no way to add a delay or guard clause to avoid it without actually breaking or limiting gestures. But maybe I am overly optimistic.