Analysis Software
Documentation for sPHENIX simulation software
cpp.tokenize Namespace Reference

Classes

class  Token
 

Functions

def _GetString
 
def _GetChar
 
def GetTokens
 
def main
 

Variables

string __author__ = 'nnorwitz@google.com (Neal Norwitz)'
 
string _letters = 'abcdefghijklmnopqrstuvwxyz'
 
set VALID_IDENTIFIER_CHARS = set(_letters + _letters.upper() + '_0123456789$')
 
set HEX_DIGITS = set('0123456789abcdefABCDEF')
 
set INT_OR_FLOAT_DIGITS = set('01234567890eE-+')
 
set _STR_PREFIXES = set(('R', 'u8', 'u8R', 'u', 'uR', 'U', 'UR', 'L', 'LR'))
 
string UNKNOWN = 'UNKNOWN'
 
string SYNTAX = 'SYNTAX'
 
string CONSTANT = 'CONSTANT'
 
string NAME = 'NAME'
 
string PREPROCESSOR = 'PREPROCESSOR'
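The character classes and token-type names above can be exercised directly. The sets below are reproduced exactly as documented; the `classify()` helper is illustrative only and is not part of cpp.tokenize:

```python
# Character classes as documented in cpp.tokenize, plus a toy classifier
# showing how a tokenizer can use such sets to pick a token type.
_letters = 'abcdefghijklmnopqrstuvwxyz'
VALID_IDENTIFIER_CHARS = set(_letters + _letters.upper() + '_0123456789$')
HEX_DIGITS = set('0123456789abcdefABCDEF')
INT_OR_FLOAT_DIGITS = set('01234567890eE-+')

def classify(ch):
    """Map a single character to a coarse token-type name (toy logic)."""
    if ch.isdigit():
        return 'CONSTANT'
    if ch in VALID_IDENTIFIER_CHARS:
        return 'NAME'
    return 'SYNTAX'

print(classify('a'), classify('7'), classify(';'))  # NAME CONSTANT SYNTAX
```

Note that membership tests against a `set` are O(1), which is why these classes are stored as sets rather than strings.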
 

Function Documentation

def cpp.tokenize._GetChar(source, start, i)  [private]

Definition at line 105 of file tokenize.py.

View newest version in sPHENIX GitHub at line 105 of file tokenize.py

Referenced by cpp.tokenize.GetTokens().


def cpp.tokenize._GetString(source, start, i)  [private]

Definition at line 89 of file tokenize.py.

View newest version in sPHENIX GitHub at line 89 of file tokenize.py

Referenced by cpp.tokenize.GetTokens().


def cpp.tokenize.GetTokens(source)
Returns a sequence of Tokens.

Args:
  source: string of C++ source code.

Yields:
  Token that represents the next token in the source.

Definition at line 119 of file tokenize.py.

View newest version in sPHENIX GitHub at line 119 of file tokenize.py

References cpp.tokenize._GetChar(), cpp.tokenize._GetString(), test_detectors.lower, Acts::UnitConstants.min, print(), and cpp.gmock_class.set.

Referenced by cpp.tokenize.main().

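The generator contract described above ("yields the next token in the source") can be sketched with a self-contained toy. This is NOT the real implementation: the real GetTokens yields Token objects and handles strings, comments, preprocessor lines, and more, while this toy yields plain (type, text) tuples for a tiny subset of C++:

```python
# Toy sketch of the GetTokens interface: a generator over C++-ish source.
LETTERS = set('abcdefghijklmnopqrstuvwxyz'
              'ABCDEFGHIJKLMNOPQRSTUVWXYZ' '_')
DIGITS = set('0123456789')

def toy_get_tokens(source):
    i, end = 0, len(source)
    while i < end:
        ch = source[i]
        if ch.isspace():          # whitespace separates tokens
            i += 1
            continue
        start = i
        if ch in LETTERS:         # identifier / keyword -> NAME
            while i < end and (source[i] in LETTERS or source[i] in DIGITS):
                i += 1
            yield ('NAME', source[start:i])
        elif ch in DIGITS:        # numeric literal -> CONSTANT
            while i < end and source[i] in DIGITS:
                i += 1
            yield ('CONSTANT', source[start:i])
        else:                     # everything else -> SYNTAX
            i += 1
            yield ('SYNTAX', ch)

print(list(toy_get_tokens('int x = 42;')))
# [('NAME', 'int'), ('NAME', 'x'), ('SYNTAX', '='),
#  ('CONSTANT', '42'), ('SYNTAX', ';')]
```

Because it is a generator, tokens are produced lazily; a caller such as main() can stop iterating at any point without tokenizing the rest of the source.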

def cpp.tokenize.main(argv)
Driver mostly for testing purposes.

Definition at line 274 of file tokenize.py.

View newest version in sPHENIX GitHub at line 274 of file tokenize.py

References cpp.tokenize.GetTokens(), and print().

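A driver of this shape can be sketched as follows, assuming (as the docstring and the reference to GetTokens() and print() suggest) that main(argv) tokenizes each file named on the command line and prints the result. `toy_tokens` is a crude whitespace splitter standing in for cpp.tokenize.GetTokens; the real tokenizer is far more precise:

```python
# Sketch of a main()-style test driver: tokenize each file in argv, print.
import sys

def toy_tokens(source):
    # NOT the real tokenizer: splits on whitespace only, for illustration.
    for word in source.split():
        yield word

def toy_main(argv):
    for filename in argv[1:]:
        with open(filename) as f:
            for token in toy_tokens(f.read()):
                print(token)

if __name__ == '__main__':
    toy_main(sys.argv)
```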

Variable Documentation

string cpp.tokenize.__author__ = 'nnorwitz@google.com (Neal Norwitz)'

Definition at line 20 of file tokenize.py.

View newest version in sPHENIX GitHub at line 20 of file tokenize.py

string cpp.tokenize._letters = 'abcdefghijklmnopqrstuvwxyz'

Definition at line 42 of file tokenize.py.

View newest version in sPHENIX GitHub at line 42 of file tokenize.py

set cpp.tokenize._STR_PREFIXES = set(('R', 'u8', 'u8R', 'u', 'uR', 'U', 'UR', 'L', 'LR'))

Definition at line 49 of file tokenize.py.

View newest version in sPHENIX GitHub at line 49 of file tokenize.py

string cpp.tokenize.CONSTANT = 'CONSTANT'

Definition at line 55 of file tokenize.py.

View newest version in sPHENIX GitHub at line 55 of file tokenize.py

set cpp.tokenize.HEX_DIGITS = set('0123456789abcdefABCDEF')

Definition at line 44 of file tokenize.py.

View newest version in sPHENIX GitHub at line 44 of file tokenize.py

set cpp.tokenize.INT_OR_FLOAT_DIGITS = set('01234567890eE-+')

Definition at line 45 of file tokenize.py.

View newest version in sPHENIX GitHub at line 45 of file tokenize.py

string cpp.tokenize.NAME = 'NAME'

Definition at line 56 of file tokenize.py.

View newest version in sPHENIX GitHub at line 56 of file tokenize.py

string cpp.tokenize.PREPROCESSOR = 'PREPROCESSOR'

Definition at line 57 of file tokenize.py.

View newest version in sPHENIX GitHub at line 57 of file tokenize.py

string cpp.tokenize.SYNTAX = 'SYNTAX'

Definition at line 54 of file tokenize.py.

View newest version in sPHENIX GitHub at line 54 of file tokenize.py

string cpp.tokenize.UNKNOWN = 'UNKNOWN'

Definition at line 53 of file tokenize.py.

View newest version in sPHENIX GitHub at line 53 of file tokenize.py

set cpp.tokenize.VALID_IDENTIFIER_CHARS = set(_letters + _letters.upper() + '_0123456789$')

Definition at line 43 of file tokenize.py.

View newest version in sPHENIX GitHub at line 43 of file tokenize.py