But that Ruby float converted to a string is ALWAYS going to return "1.0", so it always shows "." even when the user's day-to-day decimal separator is ","
The issue is how user input like "1,0" gets correctly read as a float or a length.
In a UI.inputbox the type of the default value pretty much sorts that out.
Since 1.0.m displays as 1.000m or 1,000m depending on the user's locale [and of course the model's unit settings]
In a webdialog it's more awkward, because all input is a string that needs 'interpreting'.
So the earlier posts' trickery using lengths etc to get the real separator would help...
Certainly when initially populating the webdialog with decimal values...
Like sep = (begin;'1.0'.to_l;'.';rescue;',';end)
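Expanded, that one-liner becomes something like the sketch below. It leans on SketchUp's String#to_l (the locale-aware length parser); the stub at the top is purely a hypothetical ","-locale stand-in so the logic can be exercised outside SketchUp, and the helper name is mine...

```ruby
# Stub only so this runs outside SketchUp: pretend the locale's decimal
# separator is "," - so a "." in the string makes to_l fail, as it would
# in a real ","-locale SketchUp session.
unless String.method_defined?(:to_l)
  class String
    def to_l
      raise ArgumentError, "Cannot convert #{self} to Length" if include?(".")
      Float(tr(",", "."))
    end
  end
end

# "1.0".to_l only succeeds when "." is the locale's decimal separator;
# otherwise it raises, and we fall through to ",".
def decimal_separator
  "1.0".to_l
  "."
rescue StandardError
  ","
end
```

With a ","-locale (as the stub simulates) decimal_separator returns ","; in a "."-locale SketchUp session it returns ".".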
So if sep==',' we present decimal numbers differently using something like tr('.',',')?
But surely some leeway could be used...
What if a user first inputs x = 1.0 then x = 2,3 ?
Should BOTH be acceptable ?
So assuming they are expected as floats...
x = sep == '.' ? x.tr(',', '.') : x.tr('.', ',')  # non-bang tr, since tr! returns nil when nothing changes
For the display-side this makes either typed-in separator suit the 'locale', but on the Ruby-side, it's always x_float = x.tr(',','.').to_f
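A minimal pure-Ruby sketch of that two-sided handling (the helper names are mine; sep is assumed to come from the earlier to_l trick)...

```ruby
# Display side: echo the typed value back using the locale's separator.
def for_display(x, sep)
  sep == "." ? x.tr(",", ".") : x.tr(".", ",")
end

# Ruby side: always normalise to "." first, because to_f only
# understands "." as the decimal separator.
def to_float(x)
  x.tr(",", ".").to_f
end

for_display("2,3", ".")  # => "2.3"
for_display("2.3", ",")  # => "2,3"
to_float("2,3")          # => 2.3
```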
For inputted 'lengths' it is different, because the Ruby-side expects the string to use the locale's separator format...
The first sep==...tr... step still applies, to ensure it's locale-friendly... BUT then x_length = x.to_l must be used Ruby-side...
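That length flow could be sketched like this (again assuming SketchUp's String#to_l; the guarded stub is only a hypothetical ","-locale stand-in so the sketch runs outside SketchUp, where to_l would return a real Length)...

```ruby
# Stub only so this runs outside SketchUp: pretend the locale's decimal
# separator is ",", so to_l rejects strings containing ".".
unless String.method_defined?(:to_l)
  class String
    def to_l
      raise ArgumentError, "Cannot convert #{self} to Length" if include?(".")
      Float(tr(",", "."))
    end
  end
end

# First make the typed string locale-friendly, THEN hand it to to_l,
# which expects the locale's own separator.
def to_length(x, sep)
  x = sep == "." ? x.tr(",", ".") : x.tr(".", ",")
  x.to_l
end
```

So with a ","-locale, both "2.3" and "2,3" typed into the webdialog end up parsed as the same length.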