c# - Can floating-point precision be thread-dependent?


I have a small 3D vector class in C# 3.0, based on a struct, which uses double as its basic unit.

An example: One vector's y-value is

  -20.0 exactly  

I subtract a vector whose y-value is

  10.094999999999965  

I expect the resulting y-value to be

  -30.094999999999963 (1)  

Instead I get

  -30.094999313354492 (2)  

When I redo the calculation with the same values I get (1). The debugger and the VS QuickWatch window also return (1). But when I run a few iterations in a thread and then call the function from a different thread, the result is (2). Now the debugger returns (2) too!

One has to keep in mind that the .NET JIT may write intermediate results back from the FPU registers to memory (register spills), which reduces precision from 80 bits (FPU) to 64 bits (double). However, the loss of accuracy in (2) is far too large to be explained by that.
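To see the scale of the discrepancy: value (2) is exactly what you get when the subtraction is rounded to single precision (32-bit float) instead of double. A minimal sketch using the numbers from the question (variable names are mine):

```csharp
using System;

class PrecisionDemo
{
    static void Main()
    {
        double a = -20.0;
        double b = 10.094999999999965;

        // Subtraction carried out entirely in double precision.
        double fullPrecision = a - b;

        // Same subtraction, but the result is rounded to a 32-bit float
        // before being widened back to double.
        double reducedPrecision = (float)(a - b);

        Console.WriteLine(fullPrecision.ToString("R"));    // -30.094999999999963  -> (1)
        Console.WriteLine(reducedPrecision.ToString("R")); // -30.094999313354492  -> (2)
    }
}
```

The float detour reproduces (2) bit-for-bit, which suggests the FPU is running in a reduced-precision mode on that thread rather than merely spilling 80-bit registers to 64-bit memory.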

The vector class basically looks like this:

  public struct Vector3D
  {
      private readonly double _x, _y, _z;
      ...
      public static Vector3D operator -(Vector3D v1, Vector3D v2)
      {
          return new Vector3D(v1._x - v2._x, v1._y - v2._y, v1._z - v2._z);
      }
  }

The calculation is as simple as this

  Vector3D pos41 = pos4 - pos1;  

Yes, floating-point precision can be thread-dependent.

My guess is that you are using DirectX somewhere in your code, and it changes the precision setting of the FPU; I believe it does so on a per-thread basis.

To fix this, use the D3DCREATE_FPU_PRESERVE flag when you call CreateDevice. Note that this potentially has a performance impact. The managed equivalent is CreateFlags.FpuPreserve.
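A sketch of what that looks like with Managed DirectX (Microsoft.DirectX.Direct3D); the adapter index, device type, render window, and present parameters below are placeholders for whatever your application already passes, and this fragment is not runnable without the DirectX assemblies:

```csharp
// Placeholder setup values; keep whatever your app already uses.
PresentParameters pp = new PresentParameters();
pp.Windowed = true;
pp.SwapEffect = SwapEffect.Discard;

Device device = new Device(
    0,                                    // adapter
    DeviceType.Hardware,
    renderForm,                           // your render window/control
    CreateFlags.SoftwareVertexProcessing
        | CreateFlags.FpuPreserve,        // keep the FPU in double precision
    pp);
```

Without FpuPreserve, Direct3D 9 switches the x87 FPU of the creating thread to single precision, which is exactly the kind of per-thread precision change described above.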

(I didn't suggest closing this as a duplicate, since the two questions at least look somewhat different on the surface. This should help, though.)

