Status: Won't Fix
Votes: 0
Found in [Package]: 1.8.4
Issue ID: BUR-2399
Regression: No
Floating Point Precision degrades when Burst Compilation is enabled
How to reproduce:
1. Open the attached Project "burstTest" and load Scene "SampleScene"
2. In the Hierarchy, select the "Bug Test" GameObject
3. In the Inspector, change the value of "F" under "Bug Testing MB" to a value between 0.9999 and 0.999999
4. Observe the Console
Expected result: Values in the Console are less than or equal to 1
Actual result: Values in the Console are greater than 1
Reproducible with: 1.6.6 (2021.3.26f1), 1.8.4 (2022.3.0f1, 2023.1.0b18, 2023.2.0a16)
Notes:
- With Burst compilation disabled, all "F" values <= 1 produce results that are <= 1 (see the sketch below)
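
The attached project is not included here, so the following is a minimal, hypothetical sketch of what a script like "Bug Testing MB" might contain; the class and job names and the exact arithmetic are assumptions. The idea is that the same expression is evaluated with 32-bit float intermediates when the job is Burst-compiled, and with double intermediates when it falls back to Mono, so the logged value can creep above 1 only in the Burst case.

```csharp
// Hypothetical reconstruction of the repro script; not the actual project code.
using Unity.Burst;
using Unity.Collections;
using Unity.Jobs;
using UnityEngine;

public class BugTestingMB : MonoBehaviour
{
    [Range(0f, 1f)] public float F = 0.9999f;

    [BurstCompile]
    struct ScaleJob : IJob
    {
        public float Value;
        public NativeArray<float> Result;

        public void Execute()
        {
            // With Burst enabled, every intermediate here is a 32-bit float.
            // Under Mono, the intermediates are promoted to double and rounded
            // back to float only once at the end, so the last bit can differ
            // and the Burst result may land just above 1.
            Result[0] = Value * (1.0f / 0.9999f) * 0.9999f;
        }
    }

    void Update()
    {
        using (var result = new NativeArray<float>(1, Allocator.TempJob))
        {
            new ScaleJob { Value = F, Result = result }.Run();
            Debug.Log(result[0].ToString("R"));
        }
    }
}
```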
Resolution Note:
This is actually Mono being weird: the Mono runtime treats all intermediate floating-point calculations as doubles and only converts the result back to a float at the end.
Burst does the "right thing" here and performs the intermediate calculations as single-precision operations, not double.
Burst generates the same results as the equivalent C code, and if you run that C# code on .NET Core instead of Mono you will also get the same results.
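
To see the same effect outside Unity, here is a standalone sketch (not from the issue) contrasting the two evaluation strategies the resolution note describes. The constants are illustrative assumptions, and whether the two printed values actually differ depends on the specific inputs; the second expression emulates Mono's behaviour by promoting the intermediates to double explicitly and rounding to float once at the end.

```csharp
using System;

class IntermediatePrecision
{
    static void Main()
    {
        float f = 0.999999f;          // illustrative input, analogous to "F"
        float scale = 1.0f / 0.9999f;

        // Single-precision intermediates: the behaviour the note attributes to
        // Burst, the equivalent C code, and .NET Core.
        float singleIntermediates = f * scale * 0.9999f;

        // Double-precision intermediates, rounded to float once at the end:
        // an explicit emulation of how the Mono runtime evaluates the same
        // expression.
        float doubleIntermediates = (float)((double)f * (double)scale * 0.9999);

        Console.WriteLine($"float  intermediates: {singleIntermediates:R}");
        Console.WriteLine($"double intermediates: {doubleIntermediates:R}");
    }
}
```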