docs: c_lex.py: store logger on its data

Storing the logger on the tokenizer instance lets any code that uses
CTokenizer log messages through it.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Signed-off-by: Jonathan Corbet <corbet@lwn.net>
Message-ID: <467979dc18149e4b2a7113c178e0cb07919632f2.1774256269.git.mchehab+huawei@kernel.org>
Author:    Mauro Carvalho Chehab
Date:      2026-03-23 10:10:53 +01:00
Committer: Jonathan Corbet
Parent:    9c3911812b
Commit:    2ca0b54dca

@@ -177,7 +177,7 @@ class CTokenizer():
     # This class is inspired and follows the basic concepts of:
     # https://docs.python.org/3/library/re.html#writing-a-tokenizer
-    def __init__(self, source=None):
+    def __init__(self, source=None, log=None):
         """
         Create a regular expression to handle RE_SCANNER_LIST.
@@ -188,6 +188,12 @@ class CTokenizer():
         when matching a code via RE_SCANNER.
         """
+        #
+        # Store logger to allow parser classes to re-use it
+        #
+        if not log:
+            log = logging.getLogger(__name__)
+        self.log = log
         self.tokens = []
         if not source:
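For context, the pattern this patch introduces can be sketched as below. This is a minimal, self-contained illustration, not the kernel code: the `Parser` class, the `getLogger` fallback, and the constructor bodies are assumptions made for the example.

```python
import logging

class CTokenizer:
    """Hypothetical stand-in for the patched tokenizer class."""
    def __init__(self, source=None, log=None):
        # Store the logger so parser classes can re-use it; fall back
        # to a module logger when the caller does not supply one
        # (the fallback is an assumption for this sketch).
        if not log:
            log = logging.getLogger(__name__)
        self.log = log
        self.tokens = []
        self.source = source

class Parser:
    """Illustrative consumer that re-uses the tokenizer's logger."""
    def __init__(self, source):
        self.tokenizer = CTokenizer(source)
        # Re-use the stored logger instead of creating a new one
        self.log = self.tokenizer.log

shared = logging.getLogger("c_lex")
tok = CTokenizer("int x;", log=shared)
parser = Parser("int y;")
```

With this shape, a caller can hand one logger to the tokenizer and every class built on top of it logs to the same place.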