Printing sizes properly

Sizes weren't being printed properly because of
their type: primarySize, for example, is an
unsigned char, so operator<< prints it as a
character rather than a number. Changing it to
"primarySize + '0'" also fails, because the
operands of operator+ are promoted to int, so
the character code (e.g. 52 for a size of 4)
gets printed instead of the digit. To make it
work, I needed either something like:
    static_cast<char>(primarySize + '0')
or
    static_cast<int>(primarySize)
I chose the latter.
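
For reference, a minimal standalone sketch of the
behavior (the value 4 is an assumption chosen for
illustration, not taken from the SwiftShader source):

    #include <iostream>

    int main()
    {
        unsigned char primarySize = 4;  // assumed example value

        // char overload of operator<<: writes the unprintable
        // control character with code 4, not the digit "4".
        std::cout << primarySize << '\n';

        // operator+ promotes both operands to int: prints 52.
        std::cout << primarySize + '0' << '\n';

        // Both of these print 4.
        std::cout << static_cast<char>(primarySize + '0') << '\n';
        std::cout << static_cast<int>(primarySize) << '\n';
        return 0;
    }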

Change-Id: Ib0718a7a86ef4314f886b653526240aa788fb3df
Reviewed-on: https://swiftshader-review.googlesource.com/3084
Reviewed-by: Nicolas Capens <capn@google.com>
Tested-by: Alexis Hétu <sugoi@google.com>
diff --git a/src/OpenGL/compiler/intermOut.cpp b/src/OpenGL/compiler/intermOut.cpp
index 81b088d..521a57d 100644
--- a/src/OpenGL/compiler/intermOut.cpp
+++ b/src/OpenGL/compiler/intermOut.cpp
@@ -45,9 +45,9 @@
     if (array)
         stream << "array of ";
     if (isMatrix())
-        stream << primarySize << "X" << secondarySize << " matrix of ";
+		stream << static_cast<int>(primarySize) << "X" << static_cast<int>(secondarySize) << " matrix of ";
 	else if(primarySize > 1)
-		stream << primarySize << "-component vector of ";
+		stream << static_cast<int>(primarySize) << "-component vector of ";
 
     stream << getBasicString();
     return stream.str();