# tokenize — Tokenizer for Python source

**Source code:** [Lib/tokenize.py](https://github.com/python/cpython/tree/3.12/Lib/tokenize.py)

---

The `tokenize` module provides a lexical scanner for Python source code, implemented in Python. The scanner in this module returns comments as tokens as well, making it useful for implementing "pretty-printers", including colorizers for on-screen displays.

To simplify token stream handling, all operator and delimiter tokens and `Ellipsis` are returned using the generic `OP` token type. The exact type can be determined by checking the `exact_type` property on the named tuple returned from `tokenize.tokenize()`.

> **Warning:** The functions in this module are only designed to parse syntactically valid Python code (code that does not raise when parsed using `ast.parse()`). The behavior of the functions in this module is **undefined** when providing invalid Python code and it can change at any point.
## Tokenizing Input

The primary entry point is a generator:

### `tokenize.tokenize(readline)`

The `tokenize()` generator requires one argument, *readline*, which must be a callable object which provides the same interface as the `io.IOBase.readline()` method of file objects. Each call to the function should return one line of input as bytes.

The generator produces 5-tuples with these members: the token type; the token string; a 2-tuple `(srow, scol)` of ints specifying the row and column where the token begins in the source; a 2-tuple `(erow, ecol)` of ints specifying the row and column where the token ends in the source; and the line on which the token was found. The line passed (the last tuple item) is the *physical* line. The 5-tuple is returned as a named tuple with the field names `type string start end line`.

The returned named tuple has an additional property named `exact_type` that contains the exact operator type for `OP` tokens. For all other token types `exact_type` equals the named tuple `type` field.

*Changed in version 3.1:* Added support for named tuples.

*Changed in version 3.3:* Added support for `exact_type`.

`tokenize()` determines the source encoding of the file by looking for a UTF-8 BOM or encoding cookie, according to PEP 263.
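As a quick illustration of the `exact_type` property, the following minimal sketch tokenizes an in-memory byte string and prints the exact operator type of each `OP` token (the sample source and variable names are purely illustrative):

```python
from io import BytesIO
from tokenize import tokenize, tok_name, OP

source = b"x = (1 + 2) * 3\n"  # illustrative input

for tok in tokenize(BytesIO(source).readline):
    if tok.type == OP:
        # For OP tokens, exact_type names the specific operator or
        # delimiter, e.g. EQUAL, LPAR, PLUS, RPAR, STAR.
        print(tok_name[tok.exact_type], repr(tok.string))
```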
### `tokenize.generate_tokens(readline)`

Tokenize a source reading unicode strings instead of bytes.

Like `tokenize()`, the *readline* argument is a callable returning a single line of input. However, `generate_tokens()` expects *readline* to return a str object rather than bytes.

The result is an iterator yielding named tuples, exactly like `tokenize()`. It does not yield an `ENCODING` token.

All constants from the `token` module are also exported from `tokenize`.
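For example, the re-exported constants and the `tok_name` mapping can be used without importing `token` separately; a small sketch:

```python
import token
import tokenize

# The constants re-exported by tokenize are the same objects as in token.
assert tokenize.NAME == token.NAME
assert tokenize.OP == token.OP

# tok_name maps numeric token types back to their names.
print(tokenize.tok_name[tokenize.NAME])  # 'NAME'
```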
Another function is provided to reverse the tokenization process. This is useful for creating tools that tokenize a script, modify the token stream, and write back the modified script.

### `tokenize.untokenize(iterable)`

Converts tokens back into Python source code. The *iterable* must return sequences with at least two elements, the token type and the token string. Any additional sequence elements are ignored.

The result is guaranteed to tokenize back to match the input, so the conversion is lossless and round-trips are assured. The guarantee applies only to the token type and token string, as the spacing between tokens (column positions) may change.

The reconstructed script is returned as a single piece of source text: bytes, encoded using the `ENCODING` token, which is the first token sequence output by `tokenize()`. If there is no encoding token in the input, a str is returned instead.
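A minimal round-trip sketch, assuming the source is available as bytes (the sample source and variable names are illustrative):

```python
from io import BytesIO
from tokenize import tokenize, untokenize

source = b"total = 1 + 2\n"  # illustrative input
tokens = list(tokenize(BytesIO(source).readline))

# The stream starts with an ENCODING token, so the result is bytes.
rebuilt = untokenize(tokens)
assert isinstance(rebuilt, bytes)

# Token types and strings survive the round trip, even if spacing may differ.
assert [t[:2] for t in tokenize(BytesIO(rebuilt).readline)] == [t[:2] for t in tokens]
```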
`tokenize()` needs to detect the encoding of source files it tokenizes. The function it uses to do this is available:

### `tokenize.detect_encoding(readline)`

The `detect_encoding()` function is used to detect the encoding that should be used to decode a Python source file. It requires one argument, *readline*, in the same way as the `tokenize()` generator.

It will call *readline* a maximum of twice, and return the encoding used (as a string) and a list of any lines (not decoded from bytes) it has read in.

It detects the encoding from the presence of a UTF-8 BOM or an encoding cookie as specified in PEP 263. If both a BOM and a cookie are present, but disagree, a `SyntaxError` will be raised. Note that if the BOM is found, `'utf-8-sig'` will be returned as the encoding.

If no encoding is specified, then the default of `'utf-8'` will be returned.

Use `tokenize.open()` to open Python source files: it uses `detect_encoding()` to detect the file encoding.
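A small sketch of calling `detect_encoding()` on an in-memory source that carries a PEP 263 coding cookie (the sample source is illustrative):

```python
from io import BytesIO
from tokenize import detect_encoding

source = b"# -*- coding: utf-8 -*-\nprint('hello')\n"  # illustrative input

encoding, lines = detect_encoding(BytesIO(source).readline)
print(encoding)  # 'utf-8'
print(lines)     # the raw, undecoded lines read while detecting (at most two)
```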
### `tokenize.open(filename)`

Open a file in read-only mode using the encoding detected by `detect_encoding()`.

*New in version 3.2.*
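A minimal sketch, assuming a source file named `hello.py` exists next to the script (the filename is illustrative):

```python
import tokenize

# Opened in read-only text mode, decoded with the detected encoding.
with tokenize.open('hello.py') as f:
    print(f.encoding)    # encoding chosen by detect_encoding()
    print(f.readline())  # first line, already decoded to str
```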
### exception `tokenize.TokenError`

Raised when either a docstring or expression that may be split over several lines is not completed anywhere in the file, for example:

```python
"""Beginning of
docstring
```

or:

```python
[1,
 2,
 3
```
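As a sketch of how this surfaces in practice, tokenizing an input whose bracketed expression is never closed raises `TokenError` when the end of input is reached (the snippet is illustrative):

```python
from io import BytesIO
from tokenize import tokenize, TokenError

incomplete = b"values = [1,\n           2,\n           3\n"  # bracket never closed

try:
    list(tokenize(BytesIO(incomplete).readline))
except TokenError as exc:
    print("tokenization failed:", exc)
```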
## Command-Line Usage

*New in version 3.3.*

The `tokenize` module can be executed as a script from the command line. It is as simple as:

```sh
python -m tokenize [-e] [filename.py]
```
The following options are accepted:

**`-h`, `--help`**

Show this help message and exit.

**`-e`, `--exact`**

Display token names using the exact type.

If `filename.py` is specified, its contents are tokenized to stdout. Otherwise, tokenization is performed on stdin.
## Examples

Example of a script rewriter that transforms float literals into Decimal objects:

```python
from tokenize import tokenize, untokenize, NUMBER, STRING, NAME, OP
from io import BytesIO

def decistmt(s):
    """Substitute Decimals for floats in a string of statements.

    >>> from decimal import Decimal
    >>> s = 'print(+21.3e-5*-.1234/81.7)'
    >>> decistmt(s)
    "print (+Decimal ('21.3e-5')*-Decimal ('.1234')/Decimal ('81.7'))"

    The format of the exponent is inherited from the platform C library.
    Known cases are "e-007" (Windows) and "e-07" (not Windows).  Since
    we're only showing 12 digits, and the 13th isn't close to 5, the
    rest of the output should be platform-independent.

    >>> exec(s)  #doctest: +ELLIPSIS
    -3.21716034272e-0...7

    Output from calculations with Decimal should be identical across all
    platforms.

    >>> exec(decistmt(s))
    -3.217160342717258261933904529E-7
    """
    result = []
    g = tokenize(BytesIO(s.encode('utf-8')).readline)  # tokenize the string
    for toknum, tokval, _, _, _ in g:
        if toknum == NUMBER and '.' in tokval:  # replace NUMBER tokens
            result.extend([
                (NAME, 'Decimal'),
                (OP, '('),
                (STRING, repr(tokval)),
                (OP, ')')
            ])
        else:
            result.append((toknum, tokval))
    return untokenize(result).decode('utf-8')
```
Example of tokenizing from the command line. The script:

```python
def say_hello():
    print("Hello, World!")

say_hello()
```

will be tokenized to the following output, where the first column is the range of the line/column coordinates where the token is found, the second column is the name of the token, and the final column is the value of the token (if any):

```console
$ python -m tokenize hello.py
0,0-0,0:            ENCODING       'utf-8'
1,0-1,3:            NAME           'def'
1,4-1,13:           NAME           'say_hello'
1,13-1,14:          OP             '('
1,14-1,15:          OP             ')'
1,15-1,16:          OP             ':'
1,16-1,17:          NEWLINE        '\n'
2,0-2,4:            INDENT         '    '
2,4-2,9:            NAME           'print'
2,9-2,10:           OP             '('
2,10-2,25:          STRING         '"Hello, World!"'
2,25-2,26:          OP             ')'
2,26-2,27:          NEWLINE        '\n'
3,0-3,1:            NL             '\n'
4,0-4,0:            DEDENT         ''
4,0-4,9:            NAME           'say_hello'
4,9-4,10:           OP             '('
4,10-4,11:          OP             ')'
4,11-4,12:          NEWLINE        '\n'
5,0-5,0:            ENDMARKER      ''
```
The exact token type names can be displayed using the `-e` option:

```console
$ python -m tokenize -e hello.py
0,0-0,0:            ENCODING       'utf-8'
1,0-1,3:            NAME           'def'
1,4-1,13:           NAME           'say_hello'
1,13-1,14:          LPAR           '('
1,14-1,15:          RPAR           ')'
1,15-1,16:          COLON          ':'
1,16-1,17:          NEWLINE        '\n'
2,0-2,4:            INDENT         '    '
2,4-2,9:            NAME           'print'
2,9-2,10:           LPAR           '('
2,10-2,25:          STRING         '"Hello, World!"'
2,25-2,26:          RPAR           ')'
2,26-2,27:          NEWLINE        '\n'
3,0-3,1:            NL             '\n'
4,0-4,0:            DEDENT         ''
4,0-4,9:            NAME           'say_hello'
4,9-4,10:           LPAR           '('
4,10-4,11:          RPAR           ')'
4,11-4,12:          NEWLINE        '\n'
5,0-5,0:            ENDMARKER      ''
```
Example of tokenizing a file programmatically, reading unicode strings instead of bytes with `generate_tokens()`:

```python
import tokenize

with tokenize.open('hello.py') as f:
    tokens = tokenize.generate_tokens(f.readline)
    for token in tokens:
        print(token)
```
Or reading bytes directly with `tokenize()`:

```python
import tokenize

with open('hello.py', 'rb') as f:
    tokens = tokenize.tokenize(f.readline)
    for token in tokens:
        print(token)
```