"""
    pygments.token
    ~~~~~~~~~~~~~~

    Basic token types and the standard tokens.

    :copyright: Copyright 2006-2025 by the Pygments team, see AUTHORS.
    :license: BSD, see LICENSE for details.
"""


class _TokenType(tuple):
    parent = None

    def split(self):
        # Return the chain of token types from the root down to ``self``.
        buf = []
        node = self
        while node is not None:
            buf.append(node)
            node = node.parent
        buf.reverse()
        return buf

    def __init__(self, *args):
        # no need to call super.__init__
        self.subtypes = set()

    def __contains__(self, val):
        return self is val or (
            type(val) is self.__class__ and
            val[:len(self)] == self
        )

    def __getattr__(self, val):
        # Attribute access with a capitalized name creates (and caches) a
        # new subtype, so ``Token.Name.Function`` is a singleton.
        if not val or not val[0].isupper():
            return tuple.__getattribute__(self, val)
        new = _TokenType(self + (val,))
        setattr(self, val, new)
        self.subtypes.add(new)
        new.parent = self
        return new

    def __repr__(self):
        return 'Token' + (self and '.' or '') + '.'.join(self)

    def __copy__(self):
        # These instances are supposed to be singletons.
        return self

    def __deepcopy__(self, memo):
        # These instances are supposed to be singletons.
        return self


Token = _TokenType()

# Special token types
Text = Token.Text
Whitespace = Text.Whitespace
Escape = Token.Escape
Error = Token.Error
# Text that doesn't belong to this lexer (e.g. HTML in PHP)
Other = Token.Other

# Common token types for source code
Keyword = Token.Keyword
Name = Token.Name
Literal = Token.Literal
String = Literal.String
Number = Literal.Number
Punctuation = Token.Punctuation
Operator = Token.Operator
Comment = Token.Comment

# Generic types for non-source code
Generic = Token.Generic

# String and some others are not direct children of Token.
# alias them:
Token.Token = Token
Token.String = String
Token.Number = Number


def is_token_subtype(ttype, other):
    """
    Return True if ``ttype`` is a subtype of ``other``.

    exists for backwards compatibility. use ``ttype in other`` now.
    """
    return ttype in other


def string_to_tokentype(s):
    """
    Convert a string into a token type::

        >>> string_to_token('String.Double')
        Token.Literal.String.Double
        >>> string_to_token('Token.Literal.Number')
        Token.Literal.Number
        >>> string_to_token('')
        Token

    Tokens that are already tokens are returned unchanged:

        >>> string_to_token(String)
        Token.Literal.String
    """
    if isinstance(s, _TokenType):
        return s
    if not s:
        return Token
    node = Token
    for item in s.split('.'):
        node = getattr(node, item)
    return node


# Map standard token types to short names, used in CSS class naming.
STANDARD_TYPES = {
    Token:                  '',

    Text:                   '',
    Whitespace:             'w',
    Escape:                 'esc',
    Error:                  'err',
    Other:                  'x',

    Keyword:                'k',
    Keyword.Constant:       'kc',
    Keyword.Declaration:    'kd',
    Keyword.Namespace:      'kn',
    Keyword.Pseudo:         'kp',
    Keyword.Reserved:       'kr',
    Keyword.Type:           'kt',

    Name:                   'n',
    Name.Attribute:         'na',
    Name.Builtin:           'nb',
    Name.Builtin.Pseudo:    'bp',
    Name.Class:             'nc',
    Name.Constant:          'no',
    Name.Decorator:         'nd',
    Name.Entity:            'ni',
    Name.Exception:         'ne',
    Name.Function:          'nf',
    Name.Function.Magic:    'fm',
    Name.Property:          'py',
    Name.Label:             'nl',
    Name.Namespace:         'nn',
    Name.Other:             'nx',
    Name.Tag:               'nt',
    Name.Variable:          'nv',
    Name.Variable.Class:    'vc',
    Name.Variable.Global:   'vg',
    Name.Variable.Instance: 'vi',
    Name.Variable.Magic:    'vm',

    Literal:                'l',
    Literal.Date:           'ld',

    String:                 's',
    String.Affix:           'sa',
    String.Backtick:        'sb',
    String.Char:            'sc',
    String.Delimiter:       'dl',
    String.Doc:             'sd',
    String.Double:          's2',
    String.Escape:          'se',
    String.Heredoc:         'sh',
    String.Interpol:        'si',
    String.Other:           'sx',
    String.Regex:           'sr',
    String.Single:          's1',
    String.Symbol:          'ss',

    Number:                 'm',
    Number.Bin:             'mb',
    Number.Float:           'mf',
    Number.Hex:             'mh',
    Number.Integer:         'mi',
    Number.Integer.Long:    'il',
    Number.Oct:             'mo',

    Operator:               'o',
    Operator.Word:          'ow',

    Punctuation:            'p',
    Punctuation.Marker:     'pm',

    Comment:                'c',
    Comment.Hashbang:       'ch',
    Comment.Multiline:      'cm',
    Comment.Preproc:        'cp',
    Comment.PreprocFile:    'cpf',
    Comment.Single:         'c1',
    Comment.Special:        'cs',

    Generic:                'g',
    Generic.Deleted:        'gd',
    Generic.Emph:           'ge',
    Generic.Error:          'gr',
    Generic.Heading:        'gh',
    Generic.Inserted:       'gi',
    Generic.Output:         'go',
    Generic.Prompt:         'gp',
    Generic.Strong:         'gs',
    Generic.Subheading:     'gu',
    Generic.EmphStrong:     'ges',
    Generic.Traceback:      'gt',
}
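

# A minimal usage sketch, kept behind a __main__ guard so importing the module
# is unaffected. It is not part of the upstream pygments module; it only
# exercises the public names defined above (token singletons, subtype
# containment, string_to_tokentype, and the STANDARD_TYPES short names).
if __name__ == "__main__":
    # Attribute access creates and caches subtypes, so repeated lookups
    # return the same singleton object.
    assert Name.Function is Name.Function

    # Containment walks the tuple prefix: String.Double is a subtype of
    # both String and Literal.
    assert String.Double in String
    assert is_token_subtype(String.Double, Literal)

    # Dotted strings resolve to the same singletons.
    assert string_to_tokentype('Literal.Number') is Number

    # STANDARD_TYPES maps token types to the short CSS class names.
    assert STANDARD_TYPES[Name.Function] == 'nf'

    print('token sanity checks passed')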