improve g_bar output
The original Bennett paper never mentions a standard deviation; instead it calls the overlap measure the "variance of the two-ensemble estimate", expressed in the form of an overlap integral. The "stdev" column in the BAR output should therefore probably be renamed to sigma.
Alternatively, we could switch to reporting the variance itself, i.e. sigma^2 rather than sigma.
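Either way, the change is only in how the internally computed variance is presented. A minimal sketch of the two options (the function and its `mode` parameter are illustrative, not actual g_bar internals):

```python
import math

def error_column(var_estimate, mode="sigma"):
    """Hypothetical formatter for the BAR error column.

    var_estimate is Bennett's "variance of the two-ensemble
    estimate"; report either sigma (its square root) or the
    variance sigma^2 itself, depending on which naming we settle on.
    """
    if mode == "sigma":
        return math.sqrt(var_estimate)
    if mode == "sigma^2":
        return var_estimate
    raise ValueError(f"unknown mode: {mode}")
```

Printing sigma keeps the column in the same units as the free energy values, while sigma^2 matches Bennett's wording exactly.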
Not naming-related, but it would also be useful if g_bar respected the
-prec argument when printing the lambda values, not only the rest of the floating-point values in the results table.
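The fix amounts to applying the same precision-derived format string to the lambda column as to the other columns. A rough sketch of the intended behavior (row layout and names are illustrative, not the actual g_bar output code):

```python
def format_row(lam, dg, err, prec=2):
    """Hypothetical table-row formatter: build one printf-style
    format from the -prec value and apply it to the lambda value
    as well as to the free energy and error columns."""
    fmt = f"%.{prec}f"
    return "  ".join(fmt % v for v in (lam, dg, err))
```

With this, `-prec 3` would render lambda as e.g. `0.050` instead of a fixed default precision that disagrees with the rest of the table.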