I posted this over on HNI, but in case some of you don't check there as often: OK, although I made it through calculus in high school, my math skills are still sub-par. I shoot RH anyway, so this question is more or less for fun, and to finally understand a problem I've been racking my brain over for about 15 minutes now.

When fletching straight offset, there's often a degree recommendation (2-3°, 4-6°, etc.). My question is: how do you actually calculate the distance the front and/or rear of the fletch needs to sit from the centerline? At first I just tried basic trig. Given a 2" fletch (Blazer) offset 2 degrees: sin(2°) × 2. Then I realized you're not working on a 2D plane but a 3D surface (the fletch base begins to curve and follow the contour of the shaft as it's offset), so that formula wouldn't seem to work. I saw a poster over on AT mention that for a 2-degree offset on a 2" Blazer, you'd need to vary the front or rear of the vane 1/16" from the centerline. How is that calculated? It's probably very simple, but I can't for the life of me figure it out.
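For anyone who wants to check my math, here's a quick Python sketch of what I was attempting (the 0.300" shaft diameter and the function name are just my assumptions; plug in your own numbers). It compares the plain flat-plane trig to what you get when the sideways travel wraps around the curve of the shaft:

```python
import math

def end_offsets(vane_len_in, offset_deg, shaft_dia_in):
    """Distance each end of the vane sits from the shaft centerline
    when the vane is rotated offset_deg about its midpoint."""
    theta = math.radians(offset_deg)
    # Flat-plane trig: each end swings half the vane length sideways.
    flat = (vane_len_in / 2.0) * math.sin(theta)
    # On the shaft, that sideways travel follows the curve of the
    # cylinder (an arc); the straight-line (chord) distance you'd
    # actually measure with calipers is a hair shorter.
    r = shaft_dia_in / 2.0
    chord = 2.0 * r * math.sin(flat / (2.0 * r))
    return flat, chord

flat, chord = end_offsets(2.0, 2.0, 0.300)
print(f"per end, flat-plane trig: {flat:.4f} in")      # ~0.0349 (about 1/32")
print(f"per end, on the shaft:    {chord:.4f} in")     # ~0.0348, basically the same
print(f"front-to-rear skew:       {2 * flat:.4f} in")  # ~0.0698 (about 1/16")
```

If I've set that up right, the shaft's curvature only shaves off about a thousandth of an inch at these small angles, so the simple trig is close enough, and the ~1/16" figure from AT is just the total front-to-rear skew, vane length × sin(offset).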
Oh, and just to clarify: I wasn't sitting at home trying to calculate the true offset distance before I started fletching. I just adjust my jig and go as well. The reason for my question is that everyone is always talking about 2 degrees of offset here, 3 degrees there, etc. I was curious whether people were just throwing those numbers out arbitrarily or whether they really know how to calculate what front-to-back distance gives a specific degree of offset. For the most part, I'm guessing the former. :D
Totally agree with ya, Matt. I think most people are just guessing at their degree because that's what their jig says or something. Honestly, with Blazers, or now Max Hunters, I get as much offset as I can while still getting the vane to sit properly. As long as I do all of my arrows the same, I'm good to go.