In addition to selecting a data type for each column descriptor, SQL-CREATE-VIEW selects a precision and, for numeric data, a scale.
The precision is the maximum number of digits or characters that are displayed for the data in that column; for nonnumeric data, the precision typically refers to the defined length of the column. The scale is the maximum number of digits that are displayed to the right of the decimal point. Note that the precision and scale of a column limit only how the data is displayed; they do not limit the input. For instance, if a column defined as VARCHAR(5) receives the string "turtle", which contains six characters, all six characters are stored, but the column produces the output "turtl" when queried.
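To make the distinction concrete, the following minimal sketch in Python (purely illustrative; this is not how SQL-CREATE-VIEW works internally, and the names are hypothetical) shows a value stored in full while the precision is applied only at display time:

    # Display-only precision: the stored value is kept intact; truncation
    # happens only when the value is formatted for output.
    stored_value = "turtle"   # six characters, all stored
    precision = 5             # column defined as VARCHAR(5)

    def display(value, precision):
        """Apply the column precision when displaying, not when storing."""
        return value[:precision]

    print(display(stored_value, precision))   # prints "turtl"
    print(len(stored_value))                  # prints 6: nothing was lost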
By default, SQL-CREATE-VIEW assigns precision (column length) and scale to column descriptors as described below; a sketch of this assignment logic follows the list:
If the data type selected for the column is VARCHAR, the column length is the greater of the column-width specified in the ADI and the length of the longest value found through data analysis.
For example, if the column-width is set to 10 and the file contains a value such as "this is longer than 10" (22 characters), the length of the column is set to 22. Note that this length is not adjusted if longer data is inserted into the file after SQL-CREATE-VIEW is run.
If the data type selected is NUMERIC, the precision of the column is the greater of the column-width specified in the ADI and the largest number of digits found through data analysis. The scale is the greater of the number of decimal places specified by the mr# processing code (if one is detected) and the largest number of decimal places found through data analysis.
If the data type selected for a column is INTEGER or DATE, the default precision is used for the column. The default for both INTEGER and DATE is 10.
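The rules above can be summarized in a small Python sketch (purely illustrative; the function, its parameters, and the sample values are hypothetical stand-ins for the ADI column-width, the mr# scale, and the file data):

    # Illustrative sketch of the default assignment rules described above.
    # Not actual SQL-CREATE-VIEW code.

    DEFAULT_PRECISION = {"INTEGER": 10, "DATE": 10}

    def assign_precision_and_scale(data_type, column_width, values, mr_scale=None):
        if data_type == "VARCHAR":
            # Length is the greater of the ADI column-width and the
            # longest value found through data analysis.
            length = max(column_width, max((len(v) for v in values), default=0))
            return length, None
        if data_type == "NUMERIC":
            # Precision: greater of the ADI column-width and the most digits seen.
            digits = max((len(v.replace("-", "").replace(".", "")) for v in values),
                         default=0)
            # Scale: greater of the mr# decimal count and the most decimals seen.
            decimals = max((len(v.split(".")[1]) if "." in v else 0 for v in values),
                           default=0)
            return max(column_width, digits), max(mr_scale or 0, decimals)
        # INTEGER and DATE use the default precision of 10.
        return DEFAULT_PRECISION[data_type], None

    # The VARCHAR example above: column-width 10, longest value 22 characters.
    print(assign_precision_and_scale("VARCHAR", 10,
                                     ["short", "this is longer than 10"]))
    # prints (22, None)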
If the precision and scale assigned by SQL-CREATE-VIEW using the basic syntax are not sufficient, see Manual data typing.