What do you mean by C++ Tokens?


A token is the smallest element of a C++ program that is meaningful to the compiler. The C++ parser recognizes these kinds of tokens: identifiers, keywords, literals, operators, punctuators, and other separators. A stream of these tokens makes up a translation unit. Tokens are usually separated by white space.
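As a minimal sketch, the short program below annotates which token category each element of a statement falls into (the variable names are chosen here purely for illustration):

#include <iostream>

int main() {
    int count = 7;             // int (keyword) count (identifier) = (operator) 7 (literal) ; (punctuator)
    int total = count + 42;    // token stream: int  total  =  count  +  42  ;
                               //               keyword identifier operator identifier operator literal punctuator
    std::cout << total << '\n';   // << is an operator token, '\n' is a character literal
    return 0;
}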

The parser recognizes keywords, identifiers, literals, operators, and punctuators. Preprocessing tokens (such as those in #include, #define, #ifdef, and other directives) are used in the preprocessing phases to generate the token stream passed to the compiler. The preprocessing token categories are header names, identifiers, preprocessing numbers, character literals, string literals, preprocessing operators and punctuators, and single non-white-space characters that do not match one of the other categories. Character and string literals can be user-defined literals. Preprocessing tokens can be separated by white space or comments.
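The sketch below shows a few preprocessing tokens in context; the macro name GREETING is hypothetical and used only to illustrate how the directives are consumed before the compiler sees the final token stream:

#include <iostream>     // <iostream> is a header name

#define GREETING "Hello, tokens"   // GREETING is an identifier (macro name);
                                   // "Hello, tokens" is a string literal

#ifdef GREETING          // #ifdef tests whether the macro is defined
int main() {
    // After preprocessing, the compiler sees: std::cout << "Hello, tokens" << '\n';
    std::cout << GREETING << '\n';
    return 0;
}
#endif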

The parser separates tokens from the input stream by creating the longest token possible using the input characters in a left-to-right scan.
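This "longest token" rule explains, for example, why a+++b is scanned as a ++ + b rather than a + ++b. The variable names below are hypothetical and exist only to demonstrate the rule:

int main() {
    int a = 1, b = 2;
    int c = a+++b;   // tokenized left to right as: a  ++  +  b
                     // ++ is the longest token that can be formed first,
                     // so this means (a++) + b; c == 3, and a becomes 2 afterwards
    return c;
}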
