Okay. Guess I am missing how the “base” is calculated.
I was thinking, okay we work in pixels so we need to have all the units comparable.
But maybe the pixels are transformed into “cm” for the base calc. Then we could of course use “cm” for arm and hilt length.
Why 0 / X though? Is the zero from the arm length? If yes, why arm length / hilt length?
I added pnmquantizedtorle to github.
It assumes that the input image has 256 colors or fewer.
What happens if they are more?
Also, what happens in single-color mode with an image that includes more than one color? Do the colors get reduced? If so, by which algorithm?
The “base” is the distance from the fulcrum/hub/swing point to the first pixel.
The code currently represents this as a number with no unit, and it uses a different number with no unit to specify the length of the blade. It then does auto-resizing to fit the image into that, then outputs the requested number of pixels.
If you make the base 10x, and also make the length of the blade 10x, you get the same result. For practical purposes, it’s best to stick with units that people know, like pixels, inches or centimeters, but since the scale is irrelevant, it doesn’t matter which one of those we pick.
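A quick sketch of why the units cancel out. The function and numbers here are made up for illustration, not the actual pov code:

```python
# Minimal sketch (hypothetical, not the real tooling): map a distance
# along the blade to an output pixel row, given a unitless base.
def pixel_row(distance_along_blade, base, blade_length, output_pixels=144):
    # Everything is measured from the fulcrum; only the fraction of the
    # total radius matters, and that fraction has no unit.
    radius = base + blade_length
    fraction = (base + distance_along_blade) / radius
    return round(fraction * (output_pixels - 1))

# 10x the base and 10x the blade length land on the same pixel:
print(pixel_row(50, base=10, blade_length=100))     # 78
print(pixel_row(500, base=100, blade_length=1000))  # 78
```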
The zero came from your example where you had a zero at the end. It doesn’t matter if it’s zero inches, zero centimeters or zero miles, it’s still zero.
Pnmquantizedtorle exits with an error if you give it too many colors.
However, the makefile will run pnmquant 256 on the image before sending it to pnmquantizedtorle, so there won’t be more than 256 colors. You can read the man page for pnmquant to see how it does the color reduction.
I should point out that I haven’t updated pov.h to handle 8-bit yet…
But how are two different units comparable with one another? Don’t we want the radius of that circle? Shouldn’t the radius be a sum in the same unit (pixels + pixels)? That is why I was trying to go for percent, so it can be measured in relation to the one unit we know: blade height in pixels.
Wasn’t the last zero the arm length? Why would arm length be divided by hilt length?
Shouldn’t it be
base = arm length + hilt length
radius = base + blade length?
The answer to both of these questions is: because the scale doesn’t matter.
The scale doesn’t matter, because we’re going to scale it to a specific number of pixels afterwards.
Basically, we can pick any one of those numbers and make it a constant. Like, we could proclaim that the radius is 1, and everything else has to change to fit.
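As a toy illustration of that normalization — the helper below is hypothetical, just reusing the arm/hilt/blade naming from above:

```python
# Hypothetical sketch: proclaim that the radius is 1 and rescale
# everything else to fit, using base = arm + hilt, radius = base + blade.
def normalize(arm, hilt, blade):
    base = arm + hilt
    radius = base + blade
    # Dividing by the radius makes the radius exactly 1.
    return {"base": base / radius, "blade": blade / radius, "radius": 1.0}

# Any consistent unit gives the same normalized numbers:
print(normalize(arm=0, hilt=2, blade=10))    # say, inches
print(normalize(arm=0, hilt=20, blade=100))  # same shape, 10x bigger
```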
I’m just concerned the ol’ drag and drop doesn’t get lost with these additional options.
I wrote a whole how-to and a script to simplify this all down to basically:
- add an image to the folder
- double click the script.
There will be defaults in place then, so if nothing is specified, the 144px version will be made and done?
It would make more sense to drop the image on the script, would it not?
Also, until the script is checked in, I cannot guarantee that I won’t break it.
Meanwhile I tried halving the angles (400 width for a 144px blade → 200x72 wiper image for my 72px blade).
This almost halved the size again: 23.8 kB, down from 44.5 kB.
It works, but I’m not sure if the quality is worse.
Updated pov.h to handle 8-bit images and make it use POV_DATA_HEIGHT. (on github, untested)
PR started, although it was based on last night’s progress. Can we work through things there?
I should also point out that there seems to be something wrong with github code reviews right now, so getting PRs in might be difficult. (Basically, I can’t seem to leave comments on people’s code right now…)
What does the info “X colors found” at the end of the windshield wiper script mean? There should only be 2 colors: black and white, and maybe some grey in between.
I don’t get this with more colorful images. It told me there were 252 or so.
Here are some more pictures with a wiper image of 200x72.
Need sleep. Let me know if i can do something tomorrow.
oof. Well, this is your forum, so if you want to communicate here about it for now, I’m game.
It’s probably pnmquant doing its thing.
pnmwindshieldwiper does all of its calculations at 10 times the width and 10 times the height, and when pnmscale scales it down, many pixels will contain a mix of different colors, which, when averaged together, become a lot of different colors. So even if you start with just two colors, you can end up with 255 different colors after the downscale.
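Here’s a toy version of that effect in plain Python (not the Netpbm code): start with only black and white, average blocks of 10 pixels the way a downscale does, and count the resulting gray levels.

```python
import random

random.seed(0)
# A 1000-pixel row that contains only black (0) and white (255).
hi_res = [random.choice([0, 255]) for _ in range(1000)]

# Downscale 10x by averaging each block of 10 pixels, roughly what
# pnmscale's averaging does when shrinking an image.
lo_res = [round(sum(hi_res[i:i + 10]) / 10) for i in range(0, 1000, 10)]

print(sorted(set(hi_res)))  # only two values: [0, 255]
print(len(set(lo_res)))     # many in-between gray levels
```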