In EGL, precision is the total number of digits a variable can use to express its value, not just the number of decimal places. The precision of an INT is 9. For floating-point numbers, the precision is the maximum number of significant digits the value can hold, which depends on the system on which the program is running.
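
The following sketch illustrates the distinction with a few hypothetical variable declarations (the names and values are illustrative, not taken from the text); the DECIMAL declaration shows total digits versus decimal places, and the INT and FLOAT declarations reflect the precision rules described above.

// Minimal sketch, assuming a simple EGL program for illustration.
program PrecisionDemo type BasicProgram

    function main()
        // DECIMAL(7,2): precision 7 is the total number of digits,
        // of which 2 are decimal places, so the largest value is 99999.99.
        price decimal(7,2) = 99999.99;

        // INT has a fixed precision of 9 digits.
        counter int = 999999999;

        // For a floating-point value, the precision (number of significant
        // digits retained) depends on the system running the program.
        measurement float = 3.14159265358979;
    end

end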