In databases and programming languages, data can be represented and stored in various ways. CHAR and VARCHAR are two such character data types. CHAR is short for "character", while VARCHAR is short for "variable character". CHAR is a fixed-length character type; in some systems, such as SQL Server, it stores non-Unicode character data. VARCHAR, on the other hand, is a variable-length character type that can store strings of differing lengths.
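The difference shows up directly in how columns are declared. The following is a minimal sketch; the table and column names are hypothetical, chosen only to contrast a value that is always the same length with one that varies:

```sql
-- Hypothetical table contrasting the two types
CREATE TABLE customer (
    country_code CHAR(2),      -- fixed length: always stored as exactly 2 characters
    full_name    VARCHAR(100)  -- variable length: stores only the characters used
);
```

A two-letter code like 'US' is a natural fit for CHAR, since every value has the same length; a name is a natural fit for VARCHAR, since lengths vary widely.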
However, there is a limit to the number of characters a VARCHAR field can store: the maximum is determined by the length declared for the column when the table is created. CHAR and VARCHAR also use memory differently. CHAR uses static allocation, because every value occupies the full declared length regardless of how many characters it actually contains, while VARCHAR uses dynamic allocation, occupying only the space the stored value needs (typically plus a small length prefix).
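The storage difference can be illustrated with the hypothetical table above. Exact behavior for over-length values varies by database (some reject the insert, others truncate, depending on strict-mode settings), so this is a sketch rather than a guarantee:

```sql
INSERT INTO customer (country_code, full_name)
VALUES ('US', 'Ada Lovelace');

-- 'US' occupies exactly 2 characters in the CHAR(2) column.
-- 'Ada Lovelace' occupies 12 characters in the VARCHAR(100) column,
-- not 100: only the actual string (plus a length prefix) is stored.

-- A value longer than the declared maximum, e.g. a 3-character
-- country code, is rejected or truncated depending on the database.
```

This is why CHAR can waste space when values are shorter than the declared length, while VARCHAR trades a small bookkeeping overhead for compact storage.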