Work around a compiler issue

It looks like clang-cl is clobbering a floating-point register
when passing a floating-point argument to a DLL: the first time
glUniform1f() is called, the uniform's value ends up as 0.0f,
while all subsequent glUniform1f() calls work properly.

I changed the glUniform1f() calls to glUniform1fv() calls in the
AtanCornerCases test to get it passing again, but this only masks
the underlying compiler issue.
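
A minimal sketch of the workaround pattern, assuming a GLES 3 header
and a hypothetical helper name (neither is part of this change):
passing the value by pointer via glUniform1fv() avoids handing a
floating-point argument directly to the DLL entry point.

    #include <GLES3/gl3.h>

    // Hypothetical helper illustrating the workaround: the float is
    // passed by pointer, so no floating-point register carries an
    // argument across the DLL boundary.
    static void SetUniform1f(GLint location, GLfloat value)
    {
        glUniform1fv(location, 1, &value);
    }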

Bug: chromium:1013375
Change-Id: I8363182f26c50cd50c07ccea65ab2eb640a7da76
Reviewed-on: https://swiftshader-review.googlesource.com/c/SwiftShader/+/37348
Presubmit-Ready: Alexis Hétu <sugoi@google.com>
Kokoro-Presubmit: kokoro <noreply+kokoro@google.com>
Reviewed-by: Alexis Hétu <sugoi@google.com>
Tested-by: Alexis Hétu <sugoi@google.com>
diff --git a/tests/GLESUnitTests/unittests.cpp b/tests/GLESUnitTests/unittests.cpp
index f4e1a86..208fbe3 100644
--- a/tests/GLESUnitTests/unittests.cpp
+++ b/tests/GLESUnitTests/unittests.cpp
@@ -1107,8 +1107,11 @@
 	ASSERT_NE(-1, positive_value);
 	GLint negative_value = glGetUniformLocation(ph.program, "negative_value");
 	ASSERT_NE(-1, negative_value);
-	glUniform1f(positive_value,  1.0);
-	glUniform1f(negative_value, -1.0);
+
+	float value = 1.0f;
+	glUniform1fv(positive_value, 1, &value);
+	value = -1.0f;
+	glUniform1fv(negative_value, 1, &value);
 
 	glClearColor(0.0, 0.0, 0.0, 0.0);
 	glClear(GL_COLOR_BUFFER_BIT);