~pythonregexp2.7/python/issue2636-09-01+10


Viewing changes to Lib/tokenize.py

  • Committer: Jeffrey C. "The TimeHorse" Jacobs
  • Date: 2008-09-22 21:39:45 UTC
  • mfrom: (39055.1.33 Regexp-2.7)
  • Revision ID: darklord@timehorse.com-20080922213945-23717m5eiqpamcyn
Merged in changes from the Single-Loop Engine branch.


@@ -146,7 +146,9 @@
 
 class StopTokenizing(Exception): pass
 
-def printtoken(type, token, (srow, scol), (erow, ecol), line): # for testing
+def printtoken(type, token, srow_scol, erow_ecol, line): # for testing
+    srow, scol = srow_scol
+    erow, ecol = erow_ecol
     print "%d,%d-%d,%d:\t%s\t%s" % \
         (srow, scol, erow, ecol, tok_name[type], repr(token))
 
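The change above drops tuple parameter unpacking from the function signature, the Python 2 feature that PEP 3113 removed in Python 3, and unpacks the position tuples in the body instead. A minimal sketch of the same pattern, not the committed code: it returns the formatted string rather than using the Python 2 print statement so it runs on Python 3, and `tok_name` is a hypothetical stand-in for `token.tok_name`.

```python
# Hypothetical stand-in for the real token.tok_name mapping.
tok_name = {1: "NAME"}

def printtoken(type, token, srow_scol, erow_ecol, line):  # for testing
    # Explicit unpacking in the body replaces the old tuple-parameter
    # syntax: def printtoken(type, token, (srow, scol), (erow, ecol), line)
    srow, scol = srow_scol
    erow, ecol = erow_ecol
    return "%d,%d-%d,%d:\t%s\t%s" % (
        srow, scol, erow, ecol, tok_name[type], repr(token))

# Example call, mirroring how the tokenizer's debug helper is invoked:
print(printtoken(1, "spam", (1, 0), (1, 4), "spam = 1"))
```

Callers are unaffected: they still pass the start and end positions as `(row, col)` tuples, exactly as before.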