# Reconstructed from the compiled module: pyarrow/ipc.py
# (Arrow columnar IPC stream and file format readers/writers.)

import os

import pyarrow as pa

from pyarrow.lib import (IpcReadOptions, IpcWriteOptions, ReadStats,  # noqa
                         WriteStats, Message, MessageReader,
                         RecordBatchReader, _ReadPandasMixin,
                         MetadataVersion,
                         read_message, read_record_batch, read_schema,
                         read_tensor, write_tensor,
                         get_record_batch_size, get_tensor_size)
import pyarrow.lib as lib


class RecordBatchStreamReader(lib._RecordBatchStreamReader):
    """
    Reader for the Arrow streaming binary format.

    Parameters
    ----------
    source : bytes/buffer-like, pyarrow.NativeFile, or file-like Python object
        Either an in-memory buffer, or a readable file object.
        If you want to use memory map use MemoryMappedFile as source.
    options : pyarrow.ipc.IpcReadOptions
        Options for IPC deserialization.
        If None, default values will be used.
    memory_pool : MemoryPool, default None
        If None, default memory pool is used.
    """

    def __init__(self, source, *, options=None, memory_pool=None):
        options = _ensure_default_ipc_read_options(options)
        self._open(source, options=options, memory_pool=memory_pool)


_ipc_writer_class_doc = """\
Parameters
----------
sink : str, pyarrow.NativeFile, or file-like Python object
    Either a file path, or a writable file object.
schema : pyarrow.Schema
    The Arrow schema for data to be written to the file.
use_legacy_format : bool, default None
    Deprecated in favor of setting options. Cannot be provided with
    options.

    If None, False will be used unless this default is overridden by
    setting the environment variable ARROW_PRE_0_15_IPC_FORMAT=1
options : pyarrow.ipc.IpcWriteOptions
    Options for IPC serialization.

    If None, default values will be used: the legacy format will not
    be used unless overridden by setting the environment variable
    ARROW_PRE_0_15_IPC_FORMAT=1, and the V5 metadata version will be
    used unless overridden by setting the environment variable
    ARROW_PRE_1_0_METADATA_VERSION=1."""


class RecordBatchStreamWriter(lib._RecordBatchStreamWriter):
    __doc__ = """Writer for the Arrow streaming binary format

{}""".format(_ipc_writer_class_doc)

    def __init__(self, sink, schema, *, use_legacy_format=None, options=None):
        options = _get_legacy_format_default(use_legacy_format, options)
        self._open(sink, schema, options=options)


class RecordBatchFileReader(lib._RecordBatchFileReader):
    """
    Class for reading Arrow record batch data from the Arrow binary file
    format

    Parameters
    ----------
    source : bytes/buffer-like, pyarrow.NativeFile, or file-like Python object
        Either an in-memory buffer, or a readable file object.
        If you want to use memory map use MemoryMappedFile as source.
    footer_offset : int, default None
        If the file is embedded in some larger file, this is the byte offset
        to the very end of the file data
    options : pyarrow.ipc.IpcReadOptions
        Options for IPC serialization.
        If None, default values will be used.
    memory_pool : MemoryPool, default None
        If None, default memory pool is used.
    """

    def __init__(self, source, footer_offset=None, *, options=None,
                 memory_pool=None):
        options = _ensure_default_ipc_read_options(options)
        self._open(source, footer_offset=footer_offset,
                   options=options, memory_pool=memory_pool)


class RecordBatchFileWriter(lib._RecordBatchFileWriter):
    __doc__ = """Writer to create the Arrow binary file format

{}""".format(_ipc_writer_class_doc)

    def __init__(self, sink, schema, *, use_legacy_format=None, options=None):
        options = _get_legacy_format_default(use_legacy_format, options)
        self._open(sink, schema, options=options)


def _get_legacy_format_default(use_legacy_format, options):
    if use_legacy_format is not None and options is not None:
        raise ValueError(
            "Can provide at most one of options and use_legacy_format")
    elif options:
        if not isinstance(options, IpcWriteOptions):
            raise TypeError("expected IpcWriteOptions, got {}"
                            .format(type(options)))
        return options

    metadata_version = MetadataVersion.V5
    if use_legacy_format is None:
        use_legacy_format = \
            bool(int(os.environ.get('ARROW_PRE_0_15_IPC_FORMAT', '0')))
    if bool(int(os.environ.get('ARROW_PRE_1_0_METADATA_VERSION', '0'))):
        metadata_version = MetadataVersion.V4
    return IpcWriteOptions(use_legacy_format=use_legacy_format,
                           metadata_version=metadata_version)


def _ensure_default_ipc_read_options(options):
    if options and not isinstance(options, IpcReadOptions):
        raise TypeError(
            "expected IpcReadOptions, got {}".format(type(options)))
    return options or IpcReadOptions()


def new_stream(sink, schema, *, use_legacy_format=None, options=None):
    return RecordBatchStreamWriter(sink, schema,
                                   use_legacy_format=use_legacy_format,
                                   options=options)


new_stream.__doc__ = """\
Create an Arrow columnar IPC stream writer instance

{}

Returns
-------
writer : RecordBatchStreamWriter
    A writer for the given sink
""".format(_ipc_writer_class_doc)


def open_stream(source, *, options=None, memory_pool=None):
    """
    Create reader for Arrow streaming format.

    Parameters
    ----------
    source : bytes/buffer-like, pyarrow.NativeFile, or file-like Python object
        Either an in-memory buffer, or a readable file object.
    options : pyarrow.ipc.IpcReadOptions
        Options for IPC serialization.
        If None, default values will be used.
    memory_pool : MemoryPool, default None
        If None, default memory pool is used.

    Returns
    -------
    reader : RecordBatchStreamReader
        A reader for the given source
    """
    return RecordBatchStreamReader(source, options=options,
                                   memory_pool=memory_pool)


def new_file(sink, schema, *, use_legacy_format=None, options=None):
    return RecordBatchFileWriter(sink, schema,
                                 use_legacy_format=use_legacy_format,
                                 options=options)


new_file.__doc__ = """\
Create an Arrow columnar IPC file writer instance

{}

Returns
-------
writer : RecordBatchFileWriter
    A writer for the given sink
""".format(_ipc_writer_class_doc)


def open_file(source, footer_offset=None, *, options=None, memory_pool=None):
    """
    Create reader for Arrow file format.

    Parameters
    ----------
    source : bytes/buffer-like, pyarrow.NativeFile, or file-like Python object
        Either an in-memory buffer, or a readable file object.
    footer_offset : int, default None
        If the file is embedded in some larger file, this is the byte offset
        to the very end of the file data.
    options : pyarrow.ipc.IpcReadOptions
        Options for IPC serialization.
        If None, default values will be used.
    memory_pool : MemoryPool, default None
        If None, default memory pool is used.

    Returns
    -------
    reader : RecordBatchFileReader
        A reader for the given source
    """
    return RecordBatchFileReader(source, footer_offset=footer_offset,
                                 options=options, memory_pool=memory_pool)


def serialize_pandas(df, *, nthreads=None, preserve_index=None):
    """
    Serialize a pandas DataFrame into a buffer protocol compatible object.

    Parameters
    ----------
    df : pandas.DataFrame
    nthreads : int, default None
        Number of threads to use for conversion to Arrow, default all CPUs.
    preserve_index : bool, default None
        The default of None will store the index as a column, except for
        RangeIndex which is stored as metadata only. If True, always
        preserve the pandas index data as a column. If False, no index
        information is saved and the result will have a default RangeIndex.

    Returns
    -------
    buf : buffer
        An object compatible with the buffer protocol.
    """
    batch = pa.RecordBatch.from_pandas(df, nthreads=nthreads,
                                       preserve_index=preserve_index)
    sink = pa.BufferOutputStream()
    with RecordBatchStreamWriter(sink, batch.schema) as writer:
        writer.write_batch(batch)
    return sink.getvalue()


def deserialize_pandas(buf, *, use_threads=True):
    """Deserialize a buffer protocol compatible object into a pandas
    DataFrame.

    Parameters
    ----------
    buf : buffer
        An object compatible with the buffer protocol.
    use_threads : bool, default True
        Whether to parallelize the conversion using multiple threads.

    Returns
    -------
    df : pandas.DataFrame
        The buffer deserialized as pandas DataFrame
    """
    buffer_reader = pa.BufferReader(buf)
    with RecordBatchStreamReader(buffer_reader) as reader:
        table = reader.read_all()
    return table.to_pandas(use_threads=use_threads)