"""Chain that just formats a prompt and calls an LLM."""

from __future__ import annotations

import warnings
from typing import Any, Dict, List, Optional, Sequence, Tuple, Union, cast

from langchain_core._api import deprecated
from langchain_core.callbacks import (
    AsyncCallbackManager,
    AsyncCallbackManagerForChainRun,
    CallbackManager,
    CallbackManagerForChainRun,
    Callbacks,
)
from langchain_core.language_models import BaseLanguageModel, LanguageModelInput
from langchain_core.messages import BaseMessage
from langchain_core.output_parsers import BaseLLMOutputParser, StrOutputParser
from langchain_core.outputs import ChatGeneration, Generation, LLMResult
from langchain_core.prompt_values import PromptValue
from langchain_core.prompts import BasePromptTemplate, PromptTemplate
from langchain_core.runnables import (
    Runnable,
    RunnableBinding,
    RunnableBranch,
    RunnableWithFallbacks,
)
from langchain_core.runnables.configurable import DynamicRunnable
from langchain_core.utils.input import get_colored_text
from pydantic import ConfigDict, Field

from langchain.chains.base import Chain


@deprecated(
    since="0.1.17",
    alternative="RunnableSequence, e.g., `prompt | llm`",
    removal="1.0",
)
class LLMChain(Chain):
    """Chain to run queries against LLMs.

    This class is deprecated. See below for an example implementation using
    LangChain runnables:

        .. code-block:: python

            from langchain_core.output_parsers import StrOutputParser
            from langchain_core.prompts import PromptTemplate
            from langchain_openai import OpenAI

            prompt_template = "Tell me a {adjective} joke"
            prompt = PromptTemplate(
                input_variables=["adjective"], template=prompt_template
            )
            llm = OpenAI()
            chain = prompt | llm | StrOutputParser()

            chain.invoke("your adjective here")

    Example:
        .. code-block:: python

            from langchain.chains import LLMChain
            from langchain_community.llms import OpenAI
            from langchain_core.prompts import PromptTemplate

            prompt_template = "Tell me a {adjective} joke"
            prompt = PromptTemplate(
                input_variables=["adjective"], template=prompt_template
            )
            llm = LLMChain(llm=OpenAI(), prompt=prompt)
    """

    @classmethod
    def is_lc_serializable(cls) -> bool:
        return True

    prompt: BasePromptTemplate
    """Prompt object to use."""
    llm: Union[
        Runnable[LanguageModelInput, str], Runnable[LanguageModelInput, BaseMessage]
    ]
    """Language model to call."""
    output_key: str = "text"  #: :meta private:
    output_parser: BaseLLMOutputParser = Field(default_factory=StrOutputParser)
    """Output parser to use. Defaults to one that takes the most likely string but
    does not change it otherwise."""
    return_final_only: bool = True
    """Whether to return only the final parsed result. If False, a
    ``full_generation`` key with extra generation information is also returned."""
    llm_kwargs: dict = Field(default_factory=dict)

    model_config = ConfigDict(
        arbitrary_types_allowed=True,
        extra="forbid",
    )

    @property
    def input_keys(self) -> List[str]:
        """Will be whatever keys the prompt expects.

        :meta private:
        """
        return self.prompt.input_variables

    @property
    def output_keys(self) -> List[str]:
        """Will always return text key.

        :meta private:
        """
        if self.return_final_only:
            return [self.output_key]
        else:
            return [self.output_key, "full_generation"]

    def _call(
        self,
        inputs: Dict[str, Any],
        run_manager: Optional[CallbackManagerForChainRun] = None,
    ) -> Dict[str, str]:
        response = self.generate([inputs], run_manager=run_manager)
        return self.create_outputs(response)[0]

    def generate(
        self,
        input_list: List[Dict[str, Any]],
        run_manager: Optional[CallbackManagerForChainRun] = None,
    ) -> LLMResult:
        """Generate LLM result from inputs."""
        prompts, stop = self.prep_prompts(input_list, run_manager=run_manager)
        callbacks = run_manager.get_child() if run_manager else None
        if isinstance(self.llm, BaseLanguageModel):
            return self.llm.generate_prompt(
                prompts,
                stop,
                callbacks=callbacks,
                **self.llm_kwargs,
            )
        else:
            results = self.llm.bind(stop=stop, **self.llm_kwargs).batch(
                cast(List, prompts), {"callbacks": callbacks}
            )
            generations: List[List[Generation]] = []
            for res in results:
                if isinstance(res, BaseMessage):
                    generations.append([ChatGeneration(message=res)])
                else:
                    generations.append([Generation(text=res)])
            return LLMResult(generations=generations)

    async def agenerate(
        self,
        input_list: List[Dict[str, Any]],
        run_manager: Optional[AsyncCallbackManagerForChainRun] = None,
    ) -> LLMResult:
        """Generate LLM result from inputs."""
        prompts, stop = await self.aprep_prompts(input_list, run_manager=run_manager)
        callbacks = run_manager.get_child() if run_manager else None
        if isinstance(self.llm, BaseLanguageModel):
            return await self.llm.agenerate_prompt(
                prompts,
                stop,
                callbacks=callbacks,
                **self.llm_kwargs,
            )
        else:
            results = await self.llm.bind(stop=stop, **self.llm_kwargs).abatch(
                cast(List, prompts), {"callbacks": callbacks}
            )
            generations: List[List[Generation]] = []
            for res in results:
                if isinstance(res, BaseMessage):
                    generations.append([ChatGeneration(message=res)])
                else:
                    generations.append([Generation(text=res)])
            return LLMResult(generations=generations)

    def prep_prompts(
        self,
        input_list: List[Dict[str, Any]],
        run_manager: Optional[CallbackManagerForChainRun] = None,
    ) -> Tuple[List[PromptValue], Optional[List[str]]]:
        """Prepare prompts from inputs."""
        stop = None
        if len(input_list) == 0:
            return [], stop
        if "stop" in input_list[0]:
            stop = input_list[0]["stop"]
        prompts = []
        for inputs in input_list:
            selected_inputs = {k: inputs[k] for k in self.prompt.input_variables}
            prompt = self.prompt.format_prompt(**selected_inputs)
            _colored_text = get_colored_text(prompt.to_string(), "green")
            _text = "Prompt after formatting:\n" + _colored_text
            if run_manager:
                run_manager.on_text(_text, end="\n", verbose=self.verbose)
            if "stop" in inputs and inputs["stop"] != stop:
                raise ValueError(
                    "If `stop` is present in any inputs, should be present in all."
                )
            prompts.append(prompt)
        return prompts, stop

    async def aprep_prompts(
        self,
        input_list: List[Dict[str, Any]],
        run_manager: Optional[AsyncCallbackManagerForChainRun] = None,
    ) -> Tuple[List[PromptValue], Optional[List[str]]]:
        """Prepare prompts from inputs."""
        stop = None
        if len(input_list) == 0:
            return [], stop
        if "stop" in input_list[0]:
            stop = input_list[0]["stop"]
        prompts = []
        for inputs in input_list:
            selected_inputs = {k: inputs[k] for k in self.prompt.input_variables}
            prompt = self.prompt.format_prompt(**selected_inputs)
            _colored_text = get_colored_text(prompt.to_string(), "green")
            _text = "Prompt after formatting:\n" + _colored_text
            if run_manager:
                await run_manager.on_text(_text, end="\n", verbose=self.verbose)
            if "stop" in inputs and inputs["stop"] != stop:
                raise ValueError(
                    "If `stop` is present in any inputs, should be present in all."
                )
            prompts.append(prompt)
        return prompts, stop

    def apply(
        self, input_list: List[Dict[str, Any]], callbacks: Callbacks = None
    ) -> List[Dict[str, str]]:
        """Utilize the LLM generate method for speed gains."""
        callback_manager = CallbackManager.configure(
            callbacks, self.callbacks, self.verbose
        )
        run_manager = callback_manager.on_chain_start(
            None,
            {"input_list": input_list},
            name=self.get_name(),
        )
        try:
            response = self.generate(input_list, run_manager=run_manager)
        except BaseException as e:
            run_manager.on_chain_error(e)
            raise e
        outputs = self.create_outputs(response)
        run_manager.on_chain_end({"outputs": outputs})
        return outputs

    async def aapply(
        self, input_list: List[Dict[str, Any]], callbacks: Callbacks = None
    ) -> List[Dict[str, str]]:
        """Utilize the LLM generate method for speed gains."""
        callback_manager = AsyncCallbackManager.configure(
            callbacks, self.callbacks, self.verbose
        )
        run_manager = await callback_manager.on_chain_start(
            None,
            {"input_list": input_list},
            name=self.get_name(),
        )
        try:
            response = await self.agenerate(input_list, run_manager=run_manager)
        except BaseException as e:
            await run_manager.on_chain_error(e)
            raise e
        outputs = self.create_outputs(response)
        await run_manager.on_chain_end({"outputs": outputs})
        return outputs

    @property
    def _run_output_key(self) -> str:
        return self.output_key

    def create_outputs(self, llm_result: LLMResult) -> List[Dict[str, Any]]:
        """Create outputs from response."""
        result = [
            # Get the text of the top generated string.
            {
                self.output_key: self.output_parser.parse_result(generation),
                "full_generation": generation,
            }
            for generation in llm_result.generations
        ]
        if self.return_final_only:
            result = [{self.output_key: r[self.output_key]} for r in result]
        return result

    async def _acall(
        self,
        inputs: Dict[str, Any],
        run_manager: Optional[AsyncCallbackManagerForChainRun] = None,
    ) -> Dict[str, str]:
        response = await self.agenerate([inputs], run_manager=run_manager)
        return self.create_outputs(response)[0]

    def predict(self, callbacks: Callbacks = None, **kwargs: Any) -> str:
        """Format prompt with kwargs and pass to LLM.

        Args:
            callbacks: Callbacks to pass to LLMChain
            **kwargs: Keys to pass to prompt template.

        Returns:
            Completion from LLM.

        Example:
            .. code-block:: python

                completion = llm.predict(adjective="funny")
        """
        return self(kwargs, callbacks=callbacks)[self.output_key]

    async def apredict(self, callbacks: Callbacks = None, **kwargs: Any) -> str:
        """Format prompt with kwargs and pass to LLM.

        Args:
            callbacks: Callbacks to pass to LLMChain
            **kwargs: Keys to pass to prompt template.

        Returns:
            Completion from LLM.

        Example:
            .. code-block:: python

                completion = llm.predict(adjective="funny")
        """
        return (await self.acall(kwargs, callbacks=callbacks))[self.output_key]

    def predict_and_parse(
        self, callbacks: Callbacks = None, **kwargs: Any
    ) -> Union[str, List[str], Dict[str, Any]]:
        """Call predict and then parse the results."""
        warnings.warn(
            "The predict_and_parse method is deprecated, "
            "instead pass an output parser directly to LLMChain."
        )
        result = self.predict(callbacks=callbacks, **kwargs)
        if self.prompt.output_parser is not None:
            return self.prompt.output_parser.parse(result)
        else:
            return result

    async def apredict_and_parse(
        self, callbacks: Callbacks = None, **kwargs: Any
    ) -> Union[str, List[str], Dict[str, str]]:
        """Call apredict and then parse the results."""
        warnings.warn(
            "The apredict_and_parse method is deprecated, "
            "instead pass an output parser directly to LLMChain."
        )
        result = await self.apredict(callbacks=callbacks, **kwargs)
        if self.prompt.output_parser is not None:
            return self.prompt.output_parser.parse(result)
        else:
            return result

    def apply_and_parse(
        self, input_list: List[Dict[str, Any]], callbacks: Callbacks = None
    ) -> Sequence[Union[str, List[str], Dict[str, str]]]:
        """Call apply and then parse the results."""
        warnings.warn(
            "The apply_and_parse method is deprecated, "
            "instead pass an output parser directly to LLMChain."
        )
        result = self.apply(input_list, callbacks=callbacks)
        return self._parse_generation(result)

    def _parse_generation(
        self, generation: List[Dict[str, str]]
    ) -> Sequence[Union[str, List[str], Dict[str, str]]]:
        if self.prompt.output_parser is not None:
            return [
                self.prompt.output_parser.parse(res[self.output_key])
                for res in generation
            ]
        else:
            return generation

    async def aapply_and_parse(
        self, input_list: List[Dict[str, Any]], callbacks: Callbacks = None
    ) -> Sequence[Union[str, List[str], Dict[str, str]]]:
        """Call apply and then parse the results."""
        warnings.warn(
            "The aapply_and_parse method is deprecated, "
            "instead pass an output parser directly to LLMChain."
        )
        result = await self.aapply(input_list, callbacks=callbacks)
        return self._parse_generation(result)

    @property
    def _chain_type(self) -> str:
        return "llm_chain"

    @classmethod
    def from_string(cls, llm: BaseLanguageModel, template: str) -> LLMChain:
        """Create LLMChain from LLM and template."""
        prompt_template = PromptTemplate.from_template(template)
        return cls(llm=llm, prompt=prompt_template)

    def _get_num_tokens(self, text: str) -> int:
        return _get_language_model(self.llm).get_num_tokens(text)


def _get_language_model(llm_like: Runnable) -> BaseLanguageModel:
    if isinstance(llm_like, BaseLanguageModel):
        return llm_like
    elif isinstance(llm_like, RunnableBinding):
        return _get_language_model(llm_like.bound)
    elif isinstance(llm_like, RunnableWithFallbacks):
        return _get_language_model(llm_like.runnable)
    elif isinstance(llm_like, (RunnableBranch, DynamicRunnable)):
        return _get_language_model(llm_like.default)
    else:
        raise ValueError(
            f"Unable to extract BaseLanguageModel from llm_like object of type "
            f"{type(llm_like)}"
        )
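The deprecation notice above points users at composing a prompt, a model, and an output parser directly (`prompt | llm | StrOutputParser()`). As a minimal sketch of what that composition reduces to — using a hypothetical `fake_llm` callable in place of a real provider, since none is assumed here — the three stages are plain function composition:

```python
from typing import Any, Dict

# Hypothetical stand-ins for PromptTemplate, the model, and StrOutputParser.
# This illustrates only the *shape* of `prompt | llm | StrOutputParser()`;
# it is not the LangChain implementation.

def format_prompt(inputs: Dict[str, Any]) -> str:
    """Fill the template, as PromptTemplate.format_prompt would."""
    return "Tell me a {adjective} joke".format(**inputs)

def fake_llm(prompt_text: str) -> str:
    """Echoing stand-in for a real model call such as OpenAI()."""
    return f"<completion for: {prompt_text}>"

def parse_output(raw: str) -> str:
    """StrOutputParser-style pass-through; here it just strips whitespace."""
    return raw.strip()

def chain(inputs: Dict[str, Any]) -> str:
    """Rough equivalent of (prompt | llm | parser).invoke(inputs)."""
    return parse_output(fake_llm(format_prompt(inputs)))

print(chain({"adjective": "funny"}))
```

With a real model the only change is swapping `fake_llm` for the model's invoke call; the prompt-format and parse stages keep the same shape.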