Third Party Issue
Votes: 2
Found in [Package]: 2019.4, 2020.3, 2020.3.24f1, 2021.2, 2022.1
Issue ID: 1386787
Regression: No
ARKit Body Tracking neck joint(s) report incorrect rotations when the scanned person rotates their head
Reproduction steps:
1. Download and open the "arfoundation-samples" project from https://github.com/Unity-Technologies/arfoundation-samples
2. Build and deploy the project to iOS
3. Select "Body Tracking" > "3D"
4. Scan a person
5. Observe the robot's head when the scanned person moves their head left/right
Expected result: The robot rotates its head left/right
Actual result: The robot tilts its head
Reproducible with: 2019.4.34f1, 2020.3.25f1, 2021.2.7f1, 2022.1.0b2
Reproduced with: iPhone 13 Pro (iOS 15.0.0), iPhone 12 Pro (iOS 14.2.1)
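The mismatch between the expected and actual results amounts to a rotation being applied about the wrong axis: a head turn is a yaw about the vertical axis, while a tilt is a roll about the forward axis. A minimal Python sketch illustrating the difference (purely illustrative, using Unity-style axes where +Y is up and +Z is forward; this is not tied to any ARKit or AR Foundation API):

```python
from math import cos, sin, radians, isclose

def quat(axis, deg):
    # Unit quaternion (w, x, y, z) for a rotation of `deg` degrees about `axis`.
    h = radians(deg) / 2.0
    s = sin(h)
    return (cos(h), axis[0] * s, axis[1] * s, axis[2] * s)

def qmul(a, b):
    # Hamilton product of two quaternions.
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rotate(q, v):
    # Rotate vector v by quaternion q: v' = q * v * q^-1 (q is unit-length).
    w, x, y, z = q
    p = qmul(qmul(q, (0.0,) + v), (w, -x, -y, -z))
    return p[1:]

forward, up = (0.0, 0.0, 1.0), (0.0, 1.0, 0.0)

yaw = quat((0, 1, 0), 30)   # head turn: rotation about the vertical axis
roll = quat((0, 0, 1), 30)  # head tilt: rotation about the forward axis

# A yaw moves the facing direction but leaves "up" alone;
# a roll leaves the facing direction alone and tilts "up".
print(rotate(yaw, forward), rotate(yaw, up))
print(rotate(roll, forward), rotate(roll, up))
```

A yaw leaves the up vector fixed and swings the facing direction, while a roll keeps the facing direction and tips the up vector, which matches the reported symptom: the robot tilts its head instead of turning it.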
Resolution Note:
We were able to reproduce visibly inaccurate neck tracking in a variety of body poses using ARKit; however, these values come from ARKit itself, and AR Foundation makes no modifications to the joint transforms. Please direct further questions about this issue to Apple.