� ���g�l��>�ddlZddlZddlZddlZddlZddlZddlmZddl m Z m Z ddlm Z ddl mZddlmZddlmZmZddlmZmZmZmZmZddlZddlZddlZddlZ d d l!m"Z"d d l#m$Z$m%Z%d d l&m'Z'd d l(m)Z)m*Z*m+Z+m,Z,m-Z-m.Z.d dl/m0Z0m1Z1m2Z2m3Z3m4Z4m5Z5d dl6m7Z7d dl8m9Z9m:Z:d dl;m<Z<m=Z=m>Z>d dl?m@Z@d dlAmBZBd dlCmDZDmEZEmFZFmGZGerddlHZHe@eI��ZJeeKeLfZMd�ZNdeOdeOeLeLffd�ZPdeOdeKdeLdeQeOfd�ZRdzdeOeLeQfdee'de'fd �ZSd!eQeOeLefdeOeLeQffd"�ZTdeOeLeQfde eOeLeffd#�ZU d{d%e eVeMeOfd&eKd'eWde eVeMe jXffd(�ZYGd)�d*��ZZGd+�d,eZ��Z[Gd-�d.e[��Z\Gd/�d0eZ��Z]Gd1�d2e]��Z^Gd3�d4eZ��Z_Gd5�d6eZ��Z`Gd7�d8eZ��ZaGd9�d:eZ��ZbGd;�d<eZ��Zcd=eQeLfd>�ZdGd?�d@eZ��ZeGdA�dBeb��Zfde jXfdC�ZgGdD�dEeZ��ZhdFeeOe jXfdGeeWeQe jie jje jkfdHeLfdI�ZldJedFeeOe jXfdHeLfdK�ZmdJedFeeOe jXfdHeLfdL�ZnGdM�dNeh��ZoGdO�dPeZ��ZpGdQ�dReZ��ZqGdS�dTeZ��ZrGdU�dVeZ��ZsdeOdWe'dXeOeLeeLeWdffdeOfdY�ZtdeOdWe'dXeOeLeeLeWdffdeOfdZ�ZueGd[�d\����ZvGd]�d^eZ��ZweGd_�d`����ZxeGda�db����Zydc�ZzddeeKdefdeeKdeffdf�Z{Gdg�dhe%��Z| d|dieQe|djee7dkee9dleKde|f dm�Z} d}doeQe|dpeeQe~dqeeKdjee7dkee9dreBdsde|fdt�Zdue|dveKdweKde|fdx�Z�dy�Z�dS)~�N)�Counter)�Iterable�Iterator)�deepcopy)� dataclass)�partial)�cycle�islice)� TYPE_CHECKING�Any�Callable�Optional�Union�)�config)�Dataset�DatasetInfoMixin)�Features)� FeatureType�Value�_align_features�!_check_if_features_can_be_aligned�_visit�cast_to_python_objects)�ArrowFormatter�PythonFormatter�TableFormatter�TensorFormatter�get_format_type_from_alias� get_formatter)� DatasetInfo)� NamedSplit�Split)�cast_table_to_features�read_schema_from_file� table_cast)� get_logger)�Literal)�_merge_gen_kwargs�_number_of_shards_in_gen_kwargs�_shuffle_gen_kwargs�_split_gen_kwargsc��|S�N�)�xs �i/home/asafur/pinokio/api/open-webui.git/app/env/lib/python3.11/site-packages/datasets/iterable_dataset.py� identity_funcr27s�� �H��example�column_mappingc ���t�fd�|D����rbtdt|���dt|������dt |��t ���z �d����t�fd�|���D����rttdt|���dt|������dt ���t |�����z �d�����fd�|���D��S) Nc3� �K�|]}|�vV�� dSr.r/��.0�colr4s �r1� <genexpr>z%_rename_columns_fn.<locals>.<genexpr><s(����� 8� 8�#�3�g� � 8� 8� 8� 8� 8� 8r3zError when renaming z to z : columns z are not in the dataset.c3� �K�|]}|�vV�� dSr.r/r8s �r1r;z%_rename_columns_fn.<locals>.<genexpr>@s'����� =� =�c�3�'�>� =� =� =� =� =� =r3z are already in the dataset.c�(��i|]\}}|�|��Sr/r/)r9�original_column_name�new_column_namer4s �r1� <dictcomp>z&_rename_columns_fn.<locals>.<dictcomp>Ds6��� � � � 1� �/� ��!5�6� � � r3)�any� ValueError�list�values�set�items)r4r5s` r1�_rename_columns_fnrG;s���� � 8� 8� 8� 8�� 8� 8� 8�8�8� �� c�4��#7�#7� c� c�T�.�BW�BW�BY�BY�=Z�=Z� c� c�fi�jx�fy�fy�|�AH�}I�}I�gI� c� c� c� � � � � =� =� =� =�^�%:�%:�%<�%<� =� =� =�=�=� �� p�4��#7�#7� p� p�T�.�BW�BW�BY�BY�=Z�=Z� p� p�fi�jq�fr�fr�ux�zH�zO�zO�zQ�zQ�vR�vR�gR� p� p� p� � � � � � � �5C�5I�5I�5K�5K� � � �r3�idx�name�columnc�J�||vrtd|�d|�d����|||iS)NzError when adding z : column z is already in the dataset.)rB)r4rHrIrJs r1� add_column_fnrLJs<�� �w����^�d�^�^�T�^�^�^�_�_�_� �&��+� �r3�batch� try_features�returnc��tj�|��}|�P t|tj|j����}n'#t tjtjf$rYnwxYwtj |j��Sr.) 
�pa�Table� from_pydictr&�schema�type� TypeError� ArrowInvalid�ArrowNotImplementedErrorr�from_arrow_schema)rMrN�pa_tables r1�_infer_features_from_batchr[Ps���x�#�#�E�*�*�H��� �!�(�B�I�l�6G�,H�,H�I�I�H�H���2�?�B�,G�H� � � � �D� ���� � %�h�o� 6� 6�6s�'A � !A/�.A/�examplesc�r��d��D��}�fd�|D��}tt||����S)Nc��i|] }|D]}|d��� Sr.r/�r9r4r:s r1r@z&_examples_to_batch.<locals>.<dictcomp>]s'�� A� A� A�'�� A� A�#�C�� A� A� A� Ar3c�.���g|]��fd��D����S)c�:��g|]}|������Sr/)�getr_s �r1� <listcomp>z1_examples_to_batch.<locals>.<listcomp>.<listcomp>_s%���8�8�8�G�w�{�{�3���8�8�8r3r/)r9r:r\s @�r1rcz&_examples_to_batch.<locals>.<listcomp>_s/���� I� I� I�S�8�8�8�8�x�8�8�8� I� I� Ir3)�dict�zip)r\�cols�arrayss` r1�_examples_to_batchrhZsJ��� B� A�X� A� A� A�D� I� I� I� I�D� I� I� I�F� ��D�&�!�!� "� "�"r3c#���K�t|��dkrdn.t|tt|������}t|��D]$��fd�|���D��V��%dS)z3Convert a batch (dict of examples) to examples listrc�(��i|]\}}||���Sr/r/)r9r:�array�is �r1r@z&_batch_to_examples.<locals>.<dictcomp>gs#���=�=�=���e�s�E�!�H�=�=�=r3N)�len�next�iter�rangerF)rM� n_examplesrls @r1�_batch_to_examplesrrcs�������%�j�j�A�o�o���3�u�T�$�u�+�+�5F�5F�/G�+H�+H�J� �:� � �>�>��=�=�=�=�u�{�{�}�}�=�=�=�=�=�=�=�>�>r3F�iterable� batch_size�drop_last_batchc#��K�|�|dkr>dtj�td�|D��d�����fV�dSt |��}|D]�\}}t ||dz ��}||fgt |��z}t|��|kr|rdSt|�\}} d� d �|D����} | tj�t| d�����fV���dS) a�Convert and group examples in Arrow tables of size `batch_size`. Args: iterable (`Iterable[Tuple[Key, dict]]`): An examples iterable containing tuples (example_key, example) of type (int/str, dict) batch_size (`Optional[int]`): Size of each sub-table to yield. If None or <= 0, yields the full table. drop_last_batch (`bool`, defaults to `False`): Drop the last batch if it is smaller than `batch_size`. Nr�allc��g|]\}}|��Sr/r/)r9�_r4s r1rcz%_convert_to_arrow.<locals>.<listcomp>|s��8\�8\�8\�Z�Q���8\�8\�8\r3T)�only_1d_for_numpyrryc3�4K�|]}t|��V��dSr.��str�r9�keys r1r;z$_convert_to_arrow.<locals>.<genexpr>�s(����4�4��3�s�8�8�4�4�4�4�4�4r3) rQrR� from_pylistrror rCrmre�join) rsrtru�iteratorrr4�iterator_batch�key_examples_list�keysr\�new_keys r1�_convert_to_arrowr�jsA������Z�1�_�_� � �H� � �!7�8\�8\�S[�8\�8\�8\�pt�!u�!u�!u� v� v� � � � � ���H�~�~�H� �f�f� ��W���*�q�.�9�9��!�7�^�,�t�N�/C�/C�C�� � � !� !�J� .� .�?� .� �F�F��/�0���h��(�(�4�4�t�4�4�4�4�4���r�x�+�+�,B�8�_c�,d�,d�,d�e�e�e�e�e�e�e�f�fr3c ��eZdZdZdd�Zdeeeeffd�Z e de e geeee jfffd���Ze defd���Ze de efd���Zd ejjddfd �Zdd ed eddfd�Zdd ed edeefd�Ze defd���Zdefd�Zdedefd�Zdefd�ZdS)�_BaseExamplesIterablez?Base class for the examples iterable used by an IterableDatasetrONc��d|_dSr.�� _state_dict��selfs r1�__init__z_BaseExamplesIterable.__init__�s��8<����r3c�@�tt|���d����)zWAn examples iterable should yield tuples (example_key, example) of type (int/str, dict)z doesn't implement __iter__ yet��NotImplementedErrorrUr�s r1�__iter__z_BaseExamplesIterable.__iter__�s��!�T�$�Z�Z�"P�"P�"P�Q�Q�Qr3c��dSr.r/r�s r1� iter_arrowz _BaseExamplesIterable.iter_arrow�����tr3c��dS)NFr/r�s r1�is_typedz_BaseExamplesIterable.is_typed�s���ur3c��dSr.r/r�s r1�featuresz_BaseExamplesIterable.features�r�r3� generatorc�@�tt|���d����)z� Either shuffle the shards/sources of the dataset, or propagate the shuffling to the underlying iterable. If the order of the shards must stay fixed (when using .skip or .take for example), then this method returns self. 
z+ doesn't implement shuffle_data_sources yetr��r�r�s r1�shuffle_data_sourcesz*_BaseExamplesIterable.shuffle_data_sources�s!�� "�T�$�Z�Z�"\�"\�"\�]�]�]r3T� num_shards�indexc�@�tt|���d����)�ZEither keep only the requested shard, or propagate the request to the underlying iterable.z) doesn't implement shard_data_sources yetr��r�r�r�� contiguouss r1�shard_data_sourcesz(_BaseExamplesIterable.shard_data_sources�s��!�T�$�Z�Z�"Z�"Z�"Z�[�[�[r3c���|rW|j|z}|j|z}||zt||��z}||z||krdndz}tt||����Stt||j|����S)Nrr)r��minrCrp)r�r�r�r��div�mod�start�ends r1�split_shard_indices_by_workerz3_BaseExamplesIterable.split_shard_indices_by_worker�s��� � C��/�Z�/�C��/�J�.�C��%�K�#�e�S�/�/�1�E��#�+�e�c�k�k���q�9�C���e�S�)�)�*�*� *���e�T�_�j�A�A�B�B� Br3c�@�tt|���d����)Nz! doesn't implement num_shards yetr�r�s r1r�z _BaseExamplesIterable.num_shards�s��!�T�$�Z�Z�"R�"R�"R�S�S�Sr3c�@�tt|���d����)Nz' doesn't implement _init_state_dict yetr�r�s r1�_init_state_dictz&_BaseExamplesIterable._init_state_dict�s��!�T�$�Z�Z�"X�"X�"X�Y�Y�Yr3� state_dictc�0���fd���|j|��S)Nc� ��|�7t|t��r"|D]}�||||��||<�|S|�Qt|t��r<tt |����D]}�||||��||<�|S|Sr.)� isinstancerdrCrprm)�state� new_staterrl�_inner_load_state_dicts �r1r�zE_BaseExamplesIterable.load_state_dict.<locals>._inner_load_state_dict�s�����$��E�4�)@�)@�$�$�T�T�C�!7�!7��c� �I�c�N�!S�!S�E�#�J�J�� ��&�:�e�T�+B�+B�&��s�5�z�z�*�*�N�N�A�5�5�e�A�h� �!� �M�M�E�!�H�H�� �� r3r�)r�r�r�s @r1�load_state_dictz%_BaseExamplesIterable.load_state_dict�s4��� � � � � �&�%�d�&6� �C�C�Cr3c�`�|jrtj|j��Std���)NzPState dict is not initialized, please call ex_iterable._init_state_dict() first.)r��copyr� RuntimeErrorr�s r1r�z _BaseExamplesIterable.state_dict�s/�� � � 3��=��!1�2�2� 2��m�n�n�nr3)rON�T) �__name__� __module__� __qualname__�__doc__r�r�tuple�Keyrdr��propertyrr rQrRr��boolr�rr��np�random� Generatorr��intr�rCr�r�r�r�r�r/r3r1r�r��s-������I�I�=�=�=�=�R�(�5��d��#3�4�R�R�R�R���H�X�b�(�5��b�h��;O�2P�.P�%Q�R�����X����$�����X����(�8�,�����X��^�b�i�.A�^�F]�^�^�^�^�\�\�S�\��\�Ri�\�\�\�\�C�C��C�C�C�]a�be�]f�C�C�C�C��T�C�T�T�T��X�T�Z�$�Z�Z�Z�Z� D�$� D�4� D� D� D� D�o�D�o�o�o�o�o�or3r�c���eZdZdedeeeffdef�fd� Zdefd�Zd�Z de j j ddfd �Z dd ed eddfd �Zedefd���Z�xZS)�ExamplesIterable�generate_examples_fn.�kwargsc�d��t�����||_||_dSr.)�superr�r�r�)r�r�r�� __class__s �r1r�zExamplesIterable.__init__�s,��� ��������$8��!��� � � r3rOc�:�dd|jjd�|_|jS�Nr)� shard_idx�shard_example_idxrU�r�r�r�r�s r1r�z!ExamplesIterable._init_state_dict��"��)*��D�N�Lc�d�d�����r3c#�K�|jr |jdnd}tt|j|j���|d��D]}}|jr |jdnd}t|jdi|��|d��D]"}|jr|jdxxdz cc<|V��#|jr|jdxxdz cc<d|jd<�~dS�Nr�r�� max_num_jobsr�rr/)r�r r,r�r�r�)r��shard_idx_start� gen_kwags�shard_example_idx_start� key_examples r1r�zExamplesIterable.__iter__�s����;?�;K�R�$�*�;�7�7�QR��� 1�$�+�D�O� \� \� \�^m�os�t�t� :� :�I�OS�O_�&f�d�&6�7J�&K�&K�ef� #�%�&?�d�&?�&L�&L�)�&L�&L�Ne�gk�l�l� "� "� ��#�?��$�%8�9�9�9�Q�>�9�9�9�!�!�!�!�!��� :�� ��-�-�-��2�-�-�-�89�� �!4�5�� :� :r3r�c�8�t|j|j|��Sr.)�#ShuffledDataSourcesExamplesIterabler�r�r�s r1r�z%ExamplesIterable.shuffle_data_sources�s��2�4�3L�d�k�[d�e�e�er3Tr�r�c����t|j|j����|�|||���}t �fd�|D����}t |j|��S)�Keep only the requested shard.r��r�c� ��g|] }�|�� Sr/r/�r9rl�gen_kwargs_lists �r1rcz7ExamplesIterable.shard_data_sources.<locals>.<listcomp>�����1\�1\�1\��/�!�2D�1\�1\�1\r3)r,r�r�r�r)r�r��r�r�r�r�� shard_indices�requested_gen_kwargsr�s @r1r�z#ExamplesIterable.shard_data_sources�sk���+�D�K�d�o�V�V�V���:�:�:�u�Yc�:�d�d� �0�1\�1\�1\�1\�m�1\�1\�1\�]�]���� 9�;O�P�P�Pr3c�*�t|j��Sr.�r*r�r�s r1r�zExamplesIterable.num_shards����.�t�{�;�;�;r3r�)r�r�r�r 
r�r�rdr�r�r�r�r�r�r�r�r�r�r�� __classcell__�r�s@r1r�r��s��������X�c�5��d��;K�6K�-L��VZ�������  �$� � � � � :� :� :�f�b�i�.A�f�FX�f�f�f�f�Q�Q�S�Q��Q�Rd�Q�Q�Q�Q��<�C�<�<�<��X�<�<�<�<�<r3r�c���eZdZdedeeeffdedejj f�fd� Z defd�Z d�Z dd e d e dd fd �Z�xZS)r�r�.r�r�c�t��t���||��t|��|_dSr.�r�r�rr�)r�r�r�r�r�s �r1r�z,ShuffledDataSourcesExamplesIterable.__init__�s3��� �����-�v�6�6�6�!�)�,�,����r3rOc�:�dd|jjd�|_|jSr�r�r�s r1r�z4ShuffledDataSourcesExamplesIterable._init_state_dict�r�r3c#��K�t|j��}t||j��}|jr |jdnd}t t ||j���|d��D]}}|jr |jdnd}t |jdi|��|d��D]"}|jr|jdxxdz cc<|V��#|jr|jdxxdz cc<d|jd<�~dS)�*Shuffle the kwargs order to shuffle shardsr�rr�Nr�rr/) rr�r+r�r�r r,r�r�)r��rng�kwargs_with_shuffled_shardsr�r�r�r�s r1r�z,ShuffledDataSourcesExamplesIterable.__iter__sC�����t�~�&�&��&9�#�t�{�&K�&K�#�;?�;K�R�$�*�;�7�7�QR��� �9��� X� X� X�Zi�ko� � � :� :�I�PT�O_�&f�d�&6�7J�&K�&K�ef� #�%�&?�d�&?�&L�&L�)�&L�&L�Ne�gk�l�l� "� "� ��#�?��$�%8�9�9�9�Q�>�9�9�9�!�!�!�!�!��� :�� ��-�-�-��2�-�-�-�89�� �!4�5�� :� :r3Tr�r�r�c��t|j��}t||j��}t |j|���|||���S�r�r�)rr�r+r�r�r�r��r�r�r�r�r�r�s r1r�z6ShuffledDataSourcesExamplesIterable.shard_data_sourcessV���t�~�&�&��&9�#�t�{�&K�&K�#��� 9�;V�W�W�j�j� ��*�k� � � r3r�)r�r�r�r r�r�rdr�r�r�r�r�r�r�r�r�r�s@r1r�r��s��������-�$,�S�%��T� �2B�-B�$C�-�MQ�-�^`�^g�^q�-�-�-�-�-�-�  �$� � � � �:�:�:�" � �S� �� �Rd� � � � � � � � r3r�c����eZdZdedeeejffdef�fd� Z e d���Z defd�Z d�Z d �Zd ejjddfd �Zdd ededdfd�Ze defd���Z�xZS)�ArrowExamplesIterable�generate_tables_fn.r�c�d��t�����||_||_dSr.)r�r�r�r�)r�r�r�r�s �r1r�zArrowExamplesIterable.__init__s,��� ��������"4����� � � r3c��|jSr.�� _iter_arrowr�s r1r�z ArrowExamplesIterable.iter_arrow � ����r3rOc�:�dd|jjd�|_|jSr�r�r�s r1r�z&ArrowExamplesIterable._init_state_dict$r�r3c#�K�t��}|jr |jdnd}tt|j|j���|d��D]�}|jr |jdnd}d}|jdi|��D]�\}}|t|��z|kr|t|��z }�.|�tj ���D]V}|� |��} t| ��D]/} ||kr"|jr|jdxxdz cc<|| fV�|dz }�0�W��|jr|jdxxdz cc<d|jd<��dS)Nr�rr�r��� max_chunksizerr/) rr�r r,r�r�r�rm� to_readerr�'ARROW_READER_BATCH_SIZE_IN_DATASET_ITER� format_batchrr) r�� formatterr�r�r�r�rrZ� pa_subtable�formatted_batchr4s r1r�zArrowExamplesIterable.__iter__(s�����#�%�%� �;?�;K�R�$�*�;�7�7�QR��� 1�$�+�D�O� \� \� \�^m�os�t�t� :� :�I�OS�O_�&f�d�&6�7J�&K�&K�ef� #� !� �!8��!8�!E�!E�9�!E�!E� /� /� ��X�$�s�8�}�}�4�8O�O�O�%��X���6�%��#+�#5�#5�F�Dr�#5�#s�#s�/�/�K�&/�&<�&<�[�&I�&I�O�#5�o�#F�#F�/�/��,�0G�G�G�#�/�K� $� 0�1D� E� E� E�� J� E� E� E�"%�w�,�.�.�.�)�Q�.�)�)� /�/��� :�� ��-�-�-��2�-�-�-�89�� �!4�5��# :� :r3c#��K�|jr |jdnd}tt|j|j���|d��D]�}|jr |jdnd}d}|jdi|��D]M\}}|t |��z }||kr�|jr"|jdxxt |��z cc<||fV��N|jr|jdxxdz cc<d|jd<��dSr�)r�r r,r�r�r�rm)r�r�r�r�r�rrZs r1r�z!ArrowExamplesIterable._iter_arrow>sA����;?�;K�R�$�*�;�7�7�QR��� 1�$�+�D�O� \� \� \�^m�os�t�t� :� :�I�OS�O_�&f�d�&6�7J�&K�&K�ef� #� !� �!8��!8�!E�!E�9�!E�!E� $� $� ��X�!�S��]�]�2�!�$�(?�?�?���#�K��$�%8�9�9�9�S��]�]�J�9�9�9��8�m�#�#�#�#��� :�� ��-�-�-��2�-�-�-�89�� �!4�5�� :� :r3r�c�8�t|j|j|��Sr.)�(ShuffledDataSourcesArrowExamplesIterabler�r�r�s r1r�z*ArrowExamplesIterable.shuffle_data_sourcesNs��7��8O�QU�Q\�^g�h�h�hr3Tr�r�c����t|j|j����|�|||���}t �fd�|D����}t |j|��S)r�r�r�c� ��g|] }�|�� Sr/r/r�s �r1rcz<ArrowExamplesIterable.shard_data_sources.<locals>.<listcomp>Ur�r3)r,r�r�r�r)r�r�r�s @r1r�z(ArrowExamplesIterable.shard_data_sourcesQsk���+�D�K�d�o�V�V�V���:�:�:�u�Yc�:�d�d� �0�1\�1\�1\�1\�m�1\�1\�1\�]�]��$�T�%<�>R�S�S�Sr3c�*�t|j��Sr.r�r�s r1r�z ArrowExamplesIterable.num_shardsXr�r3r�)r�r�r�r r�r�rQrRrdr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�s@r1r�r�sA��������8�C��s�B�H�}�9M�4M�+N��X\������� � � ��X� � �$� � � � 
�:�:�:�,:�:�:� i�b�i�.A�i�F]�i�i�i�i�T�T�S�T��T�Ri�T�T�T�T��<�C�<�<�<��X�<�<�<�<�<r3r�c���eZdZdedeeejffdede j j f�fd� Z defd�Z d�Zd �Zdd ed edd fd�Z�xZS)r r�.r�r�c�t��t���||��t|��|_dSr.r�)r�r�r�r�r�s �r1r�z1ShuffledDataSourcesArrowExamplesIterable.__init__^s3��� �����+�V�4�4�4�!�)�,�,����r3rOc�:�dd|jjd�|_|jSr�r�r�s r1r�z9ShuffledDataSourcesArrowExamplesIterable._init_state_dictgr�r3c#��K�t|j��}t||j��}t ��}|jr |jdnd}t t||j���|d��D]�}|jr |jdnd}d}|j di|��D]�\}} |t| ��z|kr|t| ��z }�.| � tj ���D]V} |�| ��} t| ��D]/} ||kr"|jr|jdxxdz cc<|| fV�|dz }�0�W��|jr|jdxxdz cc<d|jd<��dS) r�r�rr�Nr�rrr/)rr�r+r�rr�r r,r�r�rmrrrrrr) r�r�r�rr�r�r�r�rrZr r r4s r1r�z1ShuffledDataSourcesArrowExamplesIterable.__iter__ks������t�~�&�&��&9�#�t�{�&K�&K�#�#�%�%� �;?�;K�R�$�*�;�7�7�QR��� �9��� X� X� X�Zi�ko� � � :� :�I�PT�O_�&f�d�&6�7J�&K�&K�ef� #� !� �!8��!8�!E�!E�9�!E�!E� /� /� ��X�$�s�8�}�}�4�8O�O�O�%��X���6�%��#+�#5�#5�F�Dr�#5�#s�#s�/�/�K�&/�&<�&<�[�&I�&I�O�#5�o�#F�#F�/�/��,�0G�G�G�#�/�K� $� 0�1D� E� E� E�� J� E� E� E�"%�w�,�.�.�.�)�Q�.�)�)� /�/��� :�� ��-�-�-��2�-�-�-�89�� �!4�5��' :� :r3c#� K�t|j��}t||j��}|jr |jdnd}t t ||j���|d��D]�}|jr |jdnd}d}|jdi|��D]M\}}|t|��z }||kr�|jr"|jdxxt|��z cc<||fV��N|jr|jdxxdz cc<d|jd<��dSr�) rr�r+r�r�r r,r�r�rm) r�r�r�r�r�r�r�rrZs r1r�z4ShuffledDataSourcesArrowExamplesIterable._iter_arrow�si�����t�~�&�&��&9�#�t�{�&K�&K�#�;?�;K�R�$�*�;�7�7�QR��� �9��� X� X� X�Zi�ko� � � :� :�I�PT�O_�&f�d�&6�7J�&K�&K�ef� #� !� �!8��!8�!E�!E�9�!E�!E� $� $� ��X�!�S��]�]�2�!�$�(?�?�?���#�K��$�%8�9�9�9�S��]�]�J�9�9�9��8�m�#�#�#�#��� :�� ��-�-�-��2�-�-�-�89�� �!4�5�� :� :r3Tr�r�r�c��t|j��}t||j��}t |j|���|||���Sr�)rr�r+r�r�r�r�r�s r1r�z;ShuffledDataSourcesArrowExamplesIterable.shard_data_sources�sV���t�~�&�&��&9�#�t�{�&K�&K�#�$�T�%<�>Y�Z�Z�m�m� ��*�n� � � r3r�)r�r�r�r r�r�rQrRrdr�r�r�r�r�r�r�r�r�r�r�s@r1r r ]s��������-�$�S�%��R�X� �*>�%>�?�-��-��9�&� -�-�-�-�-�-� �$� � � � �:�:�:�6:�:�:�( � �S� �� �Ri� � � � � � � � r3r c���eZdZddedeedef�fd� Zed���Z ed���Z ed���Z d e fd �Z d �Zd eeeejffd �Zd ejjd dfd�Zddeded dfd�Zed efd���Z�xZS)�RebatchedArrowExamplesIterableF� ex_iterablertruc�r��t�����||_||_||_dSr.)r�r�rrtru)r�rrtrur�s �r1r�z'RebatchedArrowExamplesIterable.__init__�s6��� ��������&���$���.����r3c��|jSr.r�r�s r1r�z)RebatchedArrowExamplesIterable.iter_arrow�rr3c��|jjSr.�rr�r�s r1r�z'RebatchedArrowExamplesIterable.is_typed������(�(r3c��|jjSr.�rr�r�s r1r�z'RebatchedArrowExamplesIterable.features�rr3rOc�n�|j���dddd|jjd�|_|jS)Nr)�examples_iterable�previous_state� batch_idx�num_chunks_since_previous_state�cropped_chunk_lengthrU�rr�r�r�r�r�s r1r�z/RebatchedArrowExamplesIterable._init_state_dict�sB��!%�!1�!B�!B�!D�!D�"��/0�$%��N�+�  � �����r3c#�$K�|jEd{V��dSr.)rr�s r1r�z'RebatchedArrowExamplesIterable.__iter__�s'�����#�#�#�#�#�#�#�#�#�#r3c#� K�|jr2|jdr%|j�|jd��|jjr|j���}nt |jd���}|j� |jdkrQ|jr|jddkrdSt jd�|D����}|jr d|jd<d|fV�dSg}g}d}|jr |jd nd}|jr |jd nd}|jr#|j���}||jd<|D�]@\} } t| � |j� ����D�]�\} } |dkr|dz}�|dkr |dkr|dz}�$|dkr0|dkr*| � |t| ��|z ��} d}d}t| ��dkr�n|t| ��z|jkr=|� | ��|� | ��|t| ��z }��|t| ��z|jkr�|� | ��|� | ��d �d �|D����} |jrA|jdxxdz cc<|jd xxt|��z cc<d|jd <| t j�|��fV�g}g}d}|jr||jd<| dz|jd <���|j|z }|� | �d|�d���|� | � d|����d �d�|D����} |jrA|jdxxdz cc<|jd xxt|��z cc<||jd <| t j�|��fV�| �d|�d�g}| � |t| ��|z ��g}t| ��|z }|jr||jd<| |jd <���|jr|j���}��B|js�|r�d �d�|D����} |jr3||jd<|jdxxdz cc<d|jd <d|jd <| t j�|��fV�dSdSdS)z-Iterate over sub-tables of size `batch_size`.r#r�rtNrr$c��g|]\}}|��Sr/r/)r9ryrZs 
r1rcz>RebatchedArrowExamplesIterable._iter_arrow.<locals>.<listcomp>�s��,R�,R�,R�+�!�X�X�,R�,R�,Rr3rwr%r&rryc3�4K�|]}t|��V��dSr.r|�r9�_keys r1r;z=RebatchedArrowExamplesIterable._iter_arrow.<locals>.<genexpr>��(����&I�&I�T�s�4�y�y�&I�&I�&I�&I�&I�&Ir3z[:�]c3�4K�|]}t|��V��dSr.r|r-s r1r;z=RebatchedArrowExamplesIterable._iter_arrow.<locals>.<genexpr>r/r3�[z:]c3�4K�|]}t|��V��dSr.r|r-s r1r;z=RebatchedArrowExamplesIterable._iter_arrow.<locals>.<genexpr>s(����A�A�T�s�4�y�y�A�A�A�A�A�Ar3)r�rr�r�r�rtrQ� concat_tablesr�� enumerater�slicerm�appendr�rR� from_batchesru)r�r�� all_pa_table� keys_buffer� chunks_buffer�chunks_buffer_size�num_chunks_to_skip�chunk_length_to_cropr#rrZr%�chunkr�r&s r1r�z*RebatchedArrowExamplesIterable._iter_arrow�s���� � � Q�� 0�1A� B� Q� � � ,� ,�T�-=�>N�-O� P� P� P� � � &� I��'�2�2�4�4�H�H�(��)9�a�H�H�H�H� �?� "�d�o��&:�&:��� �D�$4�[�$A�A�$E�$E����+�,R�,R��,R�,R�,R�S�S�L��� 2�01�� ��-���%� %� %� %� �F�� �� ���TX�Td�k�T�-�.O�P�P�jk��KO�K[�b�t�/�0F�G�G�ab�� � � @�!�-�8�8�:�:�N�1?�D� �-� .�%�4 ?�4 ?�M�C��:C�H�DV�DV�ei�et�DV�Du�Du�:v�:v�1 n�1 n�6�/��%��)�)�&�!�+�&��'�1�,�,�1E��1J�1J�&�!�+�&��'�1�,�,�1E��1I�1I�!�K�K�(<�c�%�j�j�K_�>_�`�`�E�)*�&�+,�(��u�:�:��?�?��%��E� � �2�T�_�D�D��&�&�s�+�+�+�!�(�(��/�/�/�&�#�e�*�*�4�&��'�#�e�*�*�4���G�G��&�&�s�+�+�+�!�(�(��/�/�/�!�h�h�&I�&I�[�&I�&I�&I�I�I�G��'�E��(��5�5�5��:�5�5�5��(�)J�K�K�K�s�S`�Oa�Oa�a�K�K�K�CD��(�)?�@�!�2�8�#8�#8��#G�#G�G�G�G�G�"$�K�$&�M�)*�&��'�r�=K��(�)9�:�Nm�pq�Nq��(�)J�K��+/�?�=O�+O�(��&�&�#�'H�'H�1E�'H�'H�'H�I�I�I�!�(�(����Q�8L�)M�)M�N�N�N�!�h�h�&I�&I�[�&I�&I�&I�I�I�G��'�X��(��5�5�5��:�5�5�5��(�)J�K�K�K�s�S`�Oa�Oa�a�K�K�K�CW��(�)?�@�!�2�8�#8�#8��#G�#G�G�G�G�G�&)�#D�#D�,@�#D�#D�#D�"E�K�%*�[�[�1E�s�5�z�z�Th�Gh�%i�%i�$j�M�),�U���6J�)J�&��'�n�=K��(�)9�:�Nm��(�)J�K���� ?�!%�!1�!<�!<�!>�!>����#� @� � @��h�h�A�A�[�A�A�A�A�A�G��� =�5C�� �!1�2�� ��-�-�-��2�-�-�-�FG�� �!B�C�;<�� �!7�8��2�8�0�0��?�?�?� ?� ?� ?� ?� ?� @� @� @� @r3r�c�h�t|j�|��|j|j��Sr.)rrr�rtrur�s r1r�z3RebatchedArrowExamplesIterable.shuffle_data_sourcess1��-� � � 1� 1�)� <� <�d�o�t�Oc� � � r3Tr�r�c�n�t|j�|||���|j|j��S�Nr�)rrr�rtrur�s r1r�z1RebatchedArrowExamplesIterable.shard_data_sourcess:��-� � � /� /� �E�j� /� Y� Y� �O� � � � � r3c��|jjSr.�rr�r�s r1r�z)RebatchedArrowExamplesIterable.num_shards&�����*�*r3�Fr�)r�r�r�r�rr�r�r�r�r�r�r�rdr�r�rr�r�rQrRr�r�r�r�r�r�r�r�r�s@r1rr�s��������/�/�$9�/�x�PS�}�/�gk�/�/�/�/�/�/� � � ��X� ��)�)��X�)��)�)��X�)�  �$�  �  �  �  �$�$�$�T@�X�e�C���M�&:�;�T@�T@�T@�T@�l �b�i�.A� �Ff� � � � �  � �S� �� �Rr� � � � ��+�C�+�+�+��X�+�+�+�+�+r3rc���eZdZdedeef�fd� Zed���Zed���Z ed���Z de fd�Z d �Z deeeejffd �Zd ejjddfd �Zddededdfd�Zedefd���Z�xZS)�SelectColumnsIterabler� column_namesc�d��t�����||_||_dSr.)r�r�rrI)r�rrIr�s �r1r�zSelectColumnsIterable.__init__,s/��� ��������&���(����r3c�,�|jjr|jSdSr.)rr�r�r�s r1r�z SelectColumnsIterable.iter_arrow1s#�� � � &� $��#� #� $� $r3c��|jjSr.rr�s r1r�zSelectColumnsIterable.is_typed6rr3c��|jjSr.r r�s r1r�zSelectColumnsIterable.features:rr3rOc�L�|j���|_|jSr.�rr�r�r�s r1r�z&SelectColumnsIterable._init_state_dict>�"���+�<�<�>�>�����r3c#�T�K�|jD]\}�|�fd�|jD��fV��dS)Nc�"��i|] }|�|�� Sr/r/)r9�c�rows �r1r@z2SelectColumnsIterable.__iter__.<locals>.<dictcomp>Ds���=�=�=�a��3�q�6�=�=�=r3)rrI)r�rHrTs @r1r�zSelectColumnsIterable.__iter__BsS������(� >� >�H�C���=�=�=�=�4�+<�=�=�=�=� =� =� =� =� >� >r3c#�K�|j���D]6\}}t|��dkr||�|j��fV��7dS�Nr)rr�rm�selectrI)r�rHrZs r1r�z!SelectColumnsIterable._iter_arrowFsd����!�-�8�8�:�:� >� >�M�C���8�}�}�q� � ��8�?�?�4�+<�=�=�=�=�=�=�� >� >r3r�c�\�t|j�|��|j��Sr.)rHrr�rIr�s 
r1r�z*SelectColumnsIterable.shuffle_data_sourcesKs'��$�T�%5�%J�%J�9�%U�%U�W[�Wh�i�i�ir3Tr�r�c�b�t|j�|||���|j��SrB)rHrr�rIr�s r1r�z(SelectColumnsIterable.shard_data_sourcesNs5��$� � � /� /� �E�j� /� Y� Y�[_�[l� � � r3c��|jjSr.rDr�s r1r�z SelectColumnsIterable.num_shardsSrEr3r�)r�r�r�r�rCr}r�r�r�r�r�rdr�r�rr�r�rQrRr�r�r�r�r�r�r�r�r�r�s@r1rHrH+sz�������)�$9�)��c��)�)�)�)�)�)� �$�$��X�$��)�)��X�)��)�)��X�)� �$� � � � �>�>�>�>�X�e�C���M�&:�;�>�>�>�>� j�b�i�.A�j�F]�j�j�j�j� � �S� �� �Ri� � � � � �+�C�+�+�+��X�+�+�+�+�+r3rHc����eZdZdededef�fd� Zed���Zed���Zde fd�Z d �Z d e j jddfd �Zdd ededdfd�Zedefd���Z�xZS)�StepExamplesIterabler�step�offsetc�r��t�����||_||_||_dSr.)r�r�rr]r^)r�rr]r^r�s �r1r�zStepExamplesIterable.__init__Ys3��� ��������&����� ��� � � r3c��|jjSr.rr�s r1r�zStepExamplesIterable.is_typed`rr3c��|jjSr.r r�s r1r�zStepExamplesIterable.featuresdrr3rOc�L�|j���|_|jSr.rOr�s r1r�z%StepExamplesIterable._init_state_dicthrPr3c#��K�t|j��} tt||j����}t |��|jkr||jV�ndS�Mr.)rorrCr r]rmr^)r�� ex_iteratorrMs r1r�zStepExamplesIterable.__iter__lsg�����4�+�,�,� � ��� �T�Y�7�7�8�8�E��5�z�z�D�K�'�'��D�K�(�(�(�(�(���  r3r�c�j�t|j�|��|j|j���S)N�r]r^)r\rr�r]r^r�s r1r�z)StepExamplesIterable.shuffle_data_sourcesus7��#� � � 1� 1�)� <� <�4�9�UY�U`� � � � r3Tr�r�c�p�t|j�|||���|j|j���S)Nr�rf)r\rr�r]r^r�s r1r�z'StepExamplesIterable.shard_data_sourceszs>��#� � � /� /� �E�j� /� Y� Y����;� � � � r3c��|jjSr.rDr�s r1r�zStepExamplesIterable.num_shards�rEr3r�)r�r�r�r�r�r�r�r�r�rdr�r�r�r�r�r�r�r�r�r�s@r1r\r\Xs1��������$9����c��������)�)��X�)��)�)��X�)� �$� � � � ���� �b�i�.A� �F\� � � � �  � �S� �� �Rh� � � � ��+�C�+�+�+��X�+�+�+�+�+r3r\c����eZdZ ddeededf�fd� Zed���Zed���Z d�Z d e fd �Z d �Z d ejjd dfd �Zed efd���Z ddeded dfd�Z�xZS)�#CyclingMultiSourcesExamplesIterable�first_exhausted� ex_iterables�stopping_strategy�rk� all_exhaustedc���t�����||_||_|dkr tjn tj|_dS)Nro)r�r�rlrmr�rwrA�bool_strategy_func)r�rlrmr�s �r1r�z,CyclingMultiSourcesExamplesIterable.__init__�sO��� ��������(���!2���.?�/�-Q�-Q�"�&�&�XZ�X^����r3c�&�|jdjSrV�rlr�r�s r1r�z,CyclingMultiSourcesExamplesIterable.is_typed����� ��#�,�,r3c�&�|jdjSrV�rlr�r�s r1r�z,CyclingMultiSourcesExamplesIterable.features�rtr3c #��K�|jr |jdnd}tttt |j������|dzd��D]}|jr ||jd<|V�|}�dS�N�ex_iterable_idxrr)r�r r rprmrl)r�ry�next_ex_iterable_idxs r1�_get_indices_iteratorz9CyclingMultiSourcesExamplesIterable._get_indices_iterator�s�����AE�AQ�X�$�*�+<�=�=�WX��$*�5��s�4�;L�7M�7M�1N�1N�+O�+O�Q`�cd�Qd�fj�$k�$k� 3� 3� ��� K�6J�� �!2�3�!� !� !� !�2�O�O�  3� 3r3rOc��dd�|jD��dgt|j��zdgt|j��z|jjd�|_|jS)Nrc�6�g|]}|�����Sr/�r��r9rs r1rczHCyclingMultiSourcesExamplesIterable._init_state_dict.<locals>.<listcomp>��$��a�a�a� �[�9�9�;�;�a�a�ar3F)ryrl�previous_states� is_exhaustedrU)rlrmr�r�r�r�s r1r�z4CyclingMultiSourcesExamplesIterable._init_state_dict�se�� �a�a�t�O`�a�a�a� $�v��D�,=�(>�(>�>�"�G�c�$�*;�&<�&<�<��N�+�  � �����r3c#�PK�dgt|j��z}|jrhtt|j����D]F}|jd|�1|j|�|jd|���Gd�|jD��}|���}|jrt j|jd��n&t jt|j��d��}|D�]&}|� |��rdS||�t||d��||<||}|jr.t|jd|��|jd|<t||d��||<||dur�d||<|jrd|jd|<d||<|jr=|j|� ��|jd|<d|jd|<t|j|��||<|dur|V���(dS)Nr�c�,�g|]}t|����Sr/�rors r1rcz@CyclingMultiSourcesExamplesIterable.__iter__.<locals>.<listcomp>�s ��L�L�L�;�T�+�&�&�L�L�Lr3r�FrlT)rmrlr�rpr�r{r�rk�fullrqrnrr�ro)r��nextsrl� iterators�indices_iteratorr��results r1r�z,CyclingMultiSourcesExamplesIterable.__iter__�s\�������T�.�/�/�/�� � � a��3�t�0�1�1�2�2� a� a���#�$5�6�q�9�E��%�a�(�8�8��9I�J[�9\�]^�9_�`�`�`��L�L�$�:K�L�L�L� ��5�5�7�7��;?�:J� v�B�H�T�%�n�5� 6� 6� 6�PR�PW�X[�\`�\m�Xn�Xn�pu�Pv�Pv� �"� � �A��&�&�|�4�4� �����Q�x��� 
�!� �e�4�4��a���1�X�F��� g�9A�$�BR�Sa�Bb�cd�Be�9f�9f�� �!2�3�A�6��I�a�L�%�0�0�E�!�H��Q�x�5� � �"&� �Q���#�?�:>�D�$�^�4�Q�7���a���#�B�:>�:K�A�:N�:_�:_�:a�:a�D�$�^�4�Q�7�=A�D�$�%6�7��:�#�D�$5�a�$8�9�9� �!� ��U�"�"�� � � ��3 � r3r�c�T���fd�|jD��}t||j��S)z*Shuffle each underlying examples iterable.c�:��g|]}|������Sr/�r��r9rr�s �r1rczLCyclingMultiSourcesExamplesIterable.shuffle_data_sources.<locals>.<listcomp>��'���i�i�i� � �8�8��C�C�i�i�ir3)rlrjrm�r�r�rls ` r1r�z8CyclingMultiSourcesExamplesIterable.shuffle_data_sources�s3���i�i�i�i�W[�Wh�i�i�i� �2�<��AW�X�X�Xr3c�>�td�|jD����S)Nc3�$K�|] }|jV�� dSr.�r�rs r1r;zACyclingMultiSourcesExamplesIterable.num_shards.<locals>.<genexpr>��%����O�O�k�;�)�O�O�O�O�O�Or3�r�rlr�s r1r�z.CyclingMultiSourcesExamplesIterable.num_shards��"���O�O�T�=N�O�O�O�O�O�Or3Tr�r�c�Z����t���fd�|jD��|j���S)r�c�@��g|]}|���������S�r��r��r9rsr�r�r�s ���r1rczJCyclingMultiSourcesExamplesIterable.shard_data_sources.<locals>.<listcomp>��/��� u� u� u�W_�X� (� (��U�z� (� R� R� u� u� ur3�rm)rjrlrmr�s ```r1r�z6CyclingMultiSourcesExamplesIterable.shard_data_sources�sE�����3� u� u� u� u� u� u�cg�ct� u� u� u�"�4� � � � r3)rkr�)r�r�r�rCr�r(r�r�r�r�r{rdr�r�r�r�r�r�r�r�r�r�r�s@r1rjrj�sp�������J[� _� _��0�1� _�#�#E�F� _� _� _� _� _� _��-�-��X�-��-�-��X�-�3�3�3� �$� � � � �(�(�(�TY�b�i�.A�Y�Fk�Y�Y�Y�Y� �P�C�P�P�P��X�P�7;� � �� �&)� � .� � � � � � � � r3rjc����eZdZdZdeef�fd� Zed���Zed���Z ed���Z de fd�Z d �Z d �Zd ejjddfd �Zedefd ���Z ddededdfd�Z�xZS)�2VerticallyConcatenatedMultiSourcesExamplesIterablea� VerticallyConcatenatedMultiSourcesExamplesIterable simply chains the input iterables. It doesn't require the examples iterables to always yield the same columns. Instead, this is handled by the `IterableDataset` class or `FormattedExamplesIterable`. For information, `IterableDataset` merges the features of all the datasets to concatenate into one. We use `IterableDataset._resolve_features` to obtain the features of all the datasets to concatenate. Then for each example, `IterableDataset` and `FormattedExamplesIterable` automatically fill missing columns with None. This is done with `_apply_feature_types_on_example`. 
rlc�V��t�����||_dSr.�r�r�rl�r�rlr�s �r1r�z;VerticallyConcatenatedMultiSourcesExamplesIterable.__init__��'��� ��������(����r3c�&�|jdjSrVrsr�s r1r�z;VerticallyConcatenatedMultiSourcesExamplesIterable.is_typed�rtr3c�&�|jdjSrVrvr�s r1r�z;VerticallyConcatenatedMultiSourcesExamplesIterable.featuresrtr3c�P�td�|jD����r|jSdS)Nc3�(K�|] }|jduV��dSr.)r�rs r1r;zPVerticallyConcatenatedMultiSourcesExamplesIterable.iter_arrow.<locals>.<genexpr>s*����W�W�k�{�%�T�1�W�W�W�W�W�Wr3)rwrlr�r�s r1r�z=VerticallyConcatenatedMultiSourcesExamplesIterable.iter_arrows7�� �W�W�T�EV�W�W�W� W� W� $��#� #� $� $r3rOc�X�dd�|jD��|jjd�|_|jS)Nrc�6�g|]}|�����Sr/r~rs r1rczWVerticallyConcatenatedMultiSourcesExamplesIterable._init_state_dict.<locals>.<listcomp>r�r3)ryrlrU�rlr�r�r�r�s r1r�zCVerticallyConcatenatedMultiSourcesExamplesIterable._init_state_dict s;�� �a�a�t�O`�a�a�a��N�+� � ��� ��r3c#�K�|jr |jdnd}t|j|d��D]&}|Ed{V��|jr|jdxxdz cc<�'dSrx)r�r rl�r��ex_iterable_idx_startrs r1r�z;VerticallyConcatenatedMultiSourcesExamplesIterable.__iter__s�����GK�GW� ^�� 0�1B� C� C�]^��!�$�"3�5J�D�Q�Q� 9� 9�K�"� "� "� "� "� "� "� "��� 9�� �!2�3�3�3�q�8�3�3�3�� 9� 9r3c#��K�|jr |jdnd}t|j|d��D]8}|���Ed{V��|jr|jdxxdz cc<�9dSrx)r�r rlr�r�s r1r�z>VerticallyConcatenatedMultiSourcesExamplesIterable._iter_arrows�����GK�GW� ^�� 0�1B� C� C�]^��!�$�"3�5J�D�Q�Q� 9� 9�K�"�-�-�/�/� /� /� /� /� /� /� /��� 9�� �!2�3�3�3�q�8�3�3�3�� 9� 9r3r�c���t���}t|j��}|�|���fd�|D��}t |��S)zTShuffle the list of examples iterable, as well as each underlying examples iterable.c�:��g|]}|������Sr/r�r�s �r1rcz[VerticallyConcatenatedMultiSourcesExamplesIterable.shuffle_data_sources.<locals>.<listcomp>(s'���d�d�d� � �8�8��C�C�d�d�dr3)rrCrl�shuffler�)r�r�r�rls ` r1r�zGVerticallyConcatenatedMultiSourcesExamplesIterable.shuffle_data_sources!s\����y�!�!���D�-�.�.� � � � �L�!�!�!�d�d�d�d�Wc�d�d�d� �A�,�O�O�Or3c�>�td�|jD����S)Nc3�$K�|] }|jV�� dSr.r�rs r1r;zPVerticallyConcatenatedMultiSourcesExamplesIterable.num_shards.<locals>.<genexpr>-r�r3r�r�s r1r�z=VerticallyConcatenatedMultiSourcesExamplesIterable.num_shards+r�r3Tr�r�c�L����t���fd�|jD����S)r�c�@��g|]}|���������Sr�r�r�s ���r1rczYVerticallyConcatenatedMultiSourcesExamplesIterable.shard_data_sources.<locals>.<listcomp>4r�r3)r�rlr�s ```r1r�zEVerticallyConcatenatedMultiSourcesExamplesIterable.shard_data_sources/s;�����B� u� u� u� u� u� u�cg�ct� u� u� u� � � r3r�)r�r�r�r�rCr�r�r�r�r�r�rdr�r�r�r�r�r�r�r�r�r�r�r�s@r1r�r��so������� � �)�T�*?�%@�)�)�)�)�)�)��-�-��X�-��-�-��X�-��$�$��X�$� �$� � � � �9�9�9�9�9�9�P���,�P� =�P�P�P�P��P�C�P�P�P��X�P�7;� � �� �&)� � =� � � � � � � � r3r�rIc���t|���td�����D����s!�fd��D��}td|�d����dS)zBCheck the column names to make sure they don't contain duplicates.c3�"K�|] }|dkV�� dS)rNr/)r9�counts r1r;z&_check_column_names.<locals>.<genexpr>;s&����8�8�e�u��z�8�8�8�8�8�8r3c�,��g|]}�|dk�|��S)rr/)r9r:�counters �r1rcz'_check_column_names.<locals>.<listcomp><s'���I�I�I�c��� �q�8H�8H�c�8H�8H�8Hr3zAThe examples iterables can't have duplicated columns but columns z are duplicated.N)rrwrDrB)rI�duplicated_columnsr�s @r1�_check_column_namesr�8s~����l�#�#�G� �8�8�w�~�~�'7�'7�8�8�8� 8� 8� �I�I�I�I�W�I�I�I��� t�Pb� t� t� t� � � � � r3c����eZdZdZdeef�fd� Zed���Zed���Z de fd�Z d�Z d e jjddfd �Zedefd ���Z dd ededdfd�Z�xZS)�4HorizontallyConcatenatedMultiSourcesExamplesIterablea5 HorizontallyConcatenatedMultiSourcesExamplesIterable merges examples together for the input list of iterables. It also checks that there are no duplicate columns (otherwise we don't know which one to keep). 
This check is done once when yielding the first example. However it doesn't fill missing columns with None. Instead, this is handled by the `IterableDataset` class or `FormattedExamplesIterable`. For information, `IterableDataset` merges the features of all the datasets to concatenate into one. We use `IterableDataset._resolve_features` to obtain the features of all the datasets to concatenate. Then for each example, `IterableDataset` and `FormattedExamplesIterable` automatically fill missing columns with None. This is done with `_apply_feature_types_on_example`. rlc�V��t�����||_dSr.r�r�s �r1r�z=HorizontallyConcatenatedMultiSourcesExamplesIterable.__init__Rr�r3c�&�|jdjSrVrsr�s r1r�z=HorizontallyConcatenatedMultiSourcesExamplesIterable.is_typedWrtr3c�&�|jdjSrVrvr�s r1r�z=HorizontallyConcatenatedMultiSourcesExamplesIterable.features[rtr3rOc�V�d�|jD��|jjd�|_|jS)Nc�6�g|]}|�����Sr/r~rs r1rczYHorizontallyConcatenatedMultiSourcesExamplesIterable._init_state_dict.<locals>.<listcomp>ar�r3)rlrUr�r�s r1r�zEHorizontallyConcatenatedMultiSourcesExamplesIterable._init_state_dict_s8��a�a�t�O`�a�a�a��N�+� � �����r3c#�K�d�|jD��}tj��D]�}g}g}t|��D]d} t |��\}}|�|��|�|���@#t $r|�|��Y�awxYw|ra|dkrtd�|D����i}|D]}|� |���d� d�|D����} | |fV���dSdS)Nc�,�g|]}t|����Sr/r�rs r1rczQHorizontallyConcatenatedMultiSourcesExamplesIterable.__iter__.<locals>.<listcomp>gs ��O�O�O�k��[�)�)�O�O�Or3rc��g|] }|D]}|��� Sr/r/)r9r4� column_names r1rczQHorizontallyConcatenatedMultiSourcesExamplesIterable.__iter__.<locals>.<listcomp>ts'��(h�(h�(h��`g�(h�(h�Q\��(h�(h�(h�(hr3ryc3�4K�|]}t|��V��dSr.r|r~s r1r;zPHorizontallyConcatenatedMultiSourcesExamplesIterable.__iter__.<locals>.<genexpr>xs(����"<�"<��3�s�8�8�"<�"<�"<�"<�"<�"<r3) rl� itertoolsr�rCrnr7� StopIteration�remover��updater�) r�� ex_iteratorsrlr�r\rdrr4� new_exampler�s r1r�z=HorizontallyConcatenatedMultiSourcesExamplesIterable.__iter__fsd����O�O�T�=N�O�O�O� ���"�"� � �A��D��H�#�L�1�1� 5� 5� �5�#'� �#4�#4�L�C���K�K��$�$�$��O�O�G�,�,�,�,��$�5�5�5� �'�'� �4�4�4�4�4�5����� ���6�6�'�(h�(h�H�(h�(h�(h�i�i�i� � �'�0�0�G��&�&�w�/�/�/�/��(�(�"<�"<�t�"<�"<�"<�<�<���{�*�*�*�*�*����' � s�<A<�<B�Br�c��|S)z^Doesn't shuffle the wrapped examples iterable since it would break the alignment between them.r/r�s r1r�zIHorizontallyConcatenatedMultiSourcesExamplesIterable.shuffle_data_sources}s ��� r3c��dS�Nrr/r�s r1r�z?HorizontallyConcatenatedMultiSourcesExamplesIterable.num_shards�s���qr3Tr�r�c�L����t���fd�|jD����S)r�c�@��g|]}|���������Sr�r�r�s ���r1rcz[HorizontallyConcatenatedMultiSourcesExamplesIterable.shard_data_sources.<locals>.<listcomp>�r�r3)r�rlr�s ```r1r�zGHorizontallyConcatenatedMultiSourcesExamplesIterable.shard_data_sources�s;�����D� u� u� u� u� u� u�cg�ct� u� u� u� � � r3r�)r�r�r�r�rCr�r�r�r�r�rdr�r�r�r�r�r�r�r�r�r�r�s@r1r�r�Bs:������� � �)�T�*?�%@�)�)�)�)�)�)� �-�-��X�-��-�-��X�-� �$� � � � ����.���,�� ?����� ��C�����X��7;� � �� �&)� � ?� � � � � � � � r3r�c ����eZdZ ddeedejjdeee de df�fd� Z e d ���Z e d ���Zd �Zd efd �Zdejjd dfd�Z ddeded dfd�Z�xZS)�+RandomlyCyclingMultiSourcesExamplesIterableNrkrlr�� probabilitiesrmrnc���t���||��t|��|_||_dSr.)r�r�rr�r�)r�rlr�r�rmr�s �r1r�z4RandomlyCyclingMultiSourcesExamplesIterable.__init__�s=��� ������'8�9�9�9�!�)�,�,���*����r3c�&�|jdjSrVrsr�s r1r�z4RandomlyCyclingMultiSourcesExamplesIterable.is_typed�rtr3c�&�|jdjSrVrvr�s r1r�z4RandomlyCyclingMultiSourcesExamplesIterable.features�rtr3c#�K�t|j��}t|j��}d}|jr |jdnd}|jr|jd|j_|j�p t|� d||���|d��D]F}|dz|z}|jr$||jd<|dkr|jj|jd<t|��V��G�o t|� |||j���|d��D]F}|dz|z}|jr$||jd<|dkr|jj|jd<t|��V��G�t) 
N���bit_generator_index_offsetr�bit_generator_stateT��sizer)r��p) rr�rmrlr�� bit_generatorr�r�r �integersr��choice)r�r�� num_sources�random_batch_size� index_offsetrls r1r{zARandomlyCyclingMultiSourcesExamplesIterable._get_indices_iterator�s������t�~�&�&���$�+�,�,� � ��IM�IY�`�t�'�(D�E�E�_`� � � � N�&*�&6�7L�&M�C� � #� � � %� !��� � �Q� �BS� � T� T�Vb�dh�i�i�!�!�A�$0�1�$4�8I�#I�L��'�^�IU��(�)E�F�'�1�,�,�FI�FW�F]�D�,�-B�C��a�&�&�L�L�L�L� !� !���J�J�{�1B�d�FX�J�Y�Y�[g�im���!�!�A�%1�1�$4�8I�#I�L��'�^�IU��(�)E�F�'�1�,�,�FI�FW�F]�D�,�-B�C��a�&�&�L�L�L�L� !r3rOc���|jjjdd�|jD��dgt |j��zdgt |j��z|jjd�|_|jS)Nrc�6�g|]}|�����Sr/r~rs r1rczPRandomlyCyclingMultiSourcesExamplesIterable._init_state_dict.<locals>.<listcomp>�r�r3F)r�r�rlr�r�rU)r�r�r�rlrmr�r�r�r�s r1r�z<RandomlyCyclingMultiSourcesExamplesIterable._init_state_dict�sp��#'�>�#?�#E�*+�a�a�t�O`�a�a�a� $�v��D�,=�(>�(>�>�"�G�c�$�*;�&<�&<�<��N�+�  � �����r3c�d���fd�|jD��}t|�|j|j���S)z;Shuffle the data sources of each wrapped examples iterable.c�:��g|]}|������Sr/r�r�s �r1rczTRandomlyCyclingMultiSourcesExamplesIterable.shuffle_data_sources.<locals>.<listcomp>�r�r3�r�r�rm)rlr�r�rmr�s ` r1r�z@RandomlyCyclingMultiSourcesExamplesIterable.shuffle_data_sources�sI���i�i�i�i�W[�Wh�i�i�i� �:� ���,�"�4�  � � � r3Tr�r�c�p����t���fd�|jD��|j|j|j��S)r�c�@��g|]}|���������Sr�r�r�s ���r1rczRRandomlyCyclingMultiSourcesExamplesIterable.shard_data_sources.<locals>.<listcomp>�r�r3)r�rlr�r�rmr�s ```r1r�z>RandomlyCyclingMultiSourcesExamplesIterable.shard_data_sources�sK�����;� u� u� u� u� u� u�cg�ct� u� u� u� �N� � � � "�  � � r3)Nrkr�)r�r�r�rCr�r�r�r�r�floatr(r�r�r�r�r{rdr�r�r�r�r�r�s@r1r�r��sH������� 04�IZ� +� +��0�1� +��9�&� +� ��U� �,� +� #�#E�F� +� +� +� +� +� +��-�-��X�-��-�-��X�-�!�!�!�:  �$�  �  �  �  � �b�i�.A� �Fs� � � � �7;�  �  ��  �&)�  � 6�  �  �  �  �  �  �  �  r3r�c�d�t|tj��r|St|tjtjf��rtj�|��StjrBdtj vr4ddl }t||j|jf��r|� ��S|S)N�polarsr) r�rQrR�pd� DataFrame�Series� from_pandasr�POLARS_AVAILABLE�sys�modulesr��to_arrow)�output�pls r1�_table_output_to_arrowr��s����&�"�(�#�#��� ��&�2�<���3�4�4�,��x�#�#�F�+�+�+� ��%�8�s�{�#:�#:����� �f�r�|�R�Y�7� 8� 8� %��?�?�$�$� $� �Mr3c����eZdZ d"dedededeeeded ee d ed eeed ee d eddee dee f�fd� Z e d���Ze d���Ze d���Zde fd�Zd�Zd�Zd#dee deeeejffd�Zdejjddfd�Zd$de de ddfd �Ze de fd!���Z�xZ S)%�MappedExamplesIterableFNr�r�function� with_indices� input_columns�batchedrtru�remove_columns� fn_kwargs� formatting�FormattingConfigr��/max_num_running_async_map_functions_in_parallelc ���t�����||_||_||_||_||_||_||_||_ | pi|_ | |_ | |_ | p tj|_| r�| jr�t#|t$��sTt'd| j����dt-|��j�dt-|��j�d����|j|r|ndkrNt'd| j����dt-|��j�d|r|nd�d|j�d� ���g|_dS) NzThe z -formatted z" has underlying iterablethat is a z- instead of a RebatchedArrowExamplesIterable.rz has batch_size=z/ which isdifferent from ex_iterable.batch_size=z from its underlying iterable.)r�r�rr�rrtrurr�rrr� _featuresr�/MAX_NUM_RUNNING_ASYNC_MAP_FUNCTIONS_IN_PARALLELr�is_tabler�rrB� format_type� capitalizerUr��_owned_loops_and_tasks)r�rr�r�rrrtrurrrr�rr�s �r1r�zMappedExamplesIterable.__init__�s���� ��������&��� �� ��� �$���.���,���(���*���"��b���$���!��� ;� u�v�?u� �<� � �*�-� ��k�+I�J�J� � �k�:�1�<�<�>�>�k�k�4�PT�:�:�K^�k�k�!%�k�!2�!2�!;�k�k�k�����'�'�,H�J�J�q�I�I� �^�:�1�<�<�>�>�^�^�4�PT�:�:�K^�^�^�F�qM�pz�pz�LM�^�^�&1�&<�^�^�^���� ce��#�#�#r3c�>�|jr|jjr |jSdSdSr.)rr r�r�s r1r�z!MappedExamplesIterable.iter_arrows5�� �?� $�t��7� $��#� #� $� $� $� $r3c��|jduSr.�r�r�s r1r�zMappedExamplesIterable.is_typed$s���}�D�(�(r3c��|jSr.�rr�s r1r�zMappedExamplesIterable.features(� 
���~�r3rOc�l�|j���ddd|jjd�|_|jS)Nr)r"r#�!num_examples_since_previous_state�previous_state_example_idxrUr'r�s r1r�z'MappedExamplesIterable._init_state_dict,s?��!%�!1�!B�!B�!D�!D�"�12�*+��N�+�  � �����r3c#��K�|jrQ|jjrEt��}|�d���D]\}}||�|��fV��dS|���Ed{V��dS)Nrr)rr rr�� format_row�_iter)r�rrrZs r1r�zMappedExamplesIterable.__iter__6s����� �?� $�t��7� $�'�)�)�I�!%�!1�!1��!1�!B�!B� :� :� ��X��9�/�/��9�9�9�9�9�9�9� :� :��z�z�|�|� #� #� #� #� #� #� #� #� #r3c#���� � � � � ������K��jr �jdnd� �jr@�jdr3�j��jd���jd}nd}t�j����jr8t �jj��}t|t��r|j nd� nd� � � ��fd�� � �fd�� �fd���fd����fd �����fd �����fd �� g�tj �j ��rU tj���n##t$rtj���YnwxYw�j���f��nd��� � � � ���fd �} |��}�jr d �|D��}|D]@\}}�jr"�jd��jdxxdz cc<|dkr|dz}�:||fV��AdS#t(t*f$r��r�t,�dt1����d����D]}|�d���� ��tj����n6#tjt:f$rt,�d��YnwxYw�wxYw)Nrrr#rc3� �K�� D�]\}}� j� � jdkr� nt� � jdz ��}||fgt|��z}t|�\}}d�d�|D����}� jr-� j�&� jdkrt |��� jkrdSt|��}� r � |��n|}�fd�tt |����D��}�t |��z �|||ffV���dS)Nrrryc3�4K�|]}t|��V��dSr.r|r~s r1r;zLMappedExamplesIterable._iter.<locals>.iter_batched_inputs.<locals>.<genexpr>\s(����8�8�C�s�3�x�x�8�8�8�8�8�8r3c���g|]}�|z��Sr/r/�r9rl� current_idxs �r1rczMMappedExamplesIterable._iter.<locals>.iter_batched_inputs.<locals>.<listcomp>gs���R�R�R�q�;��?�R�R�Rr3) rtr rCrer�rurmrhrp) rr4r�r�r�r\rM�indicesr� format_dictr�r�s ����r1�iter_batched_inputsz9MappedExamplesIterable._iter.<locals>.iter_batched_inputsPsW����� (� ,� ,� ��W���.�$�/�Q�2F�2F��H���$�/�A�*=�>�>�� '*�7�^�$4�t�N�7K�7K�$K�!�!$�&7�!8���h��h�h�8�8�4�8�8�8�8�8���(����3���!�+�+��H� � ���7�7��F�F�*�8�4�4��.9�D� � �E�*�*�*�u��R�R�R�R�E�#�>O�:P�:P�4Q�4Q�R�R�R���s�7�|�|�+� ���U�|�+�+�+�+�+�/ ,� ,r3c3�Z�K��D]$\}}t|��}�dz ��dz ||ffV��%dSr�)rd)rr4rr�s ��r1� iter_inputsz1MappedExamplesIterable._iter.<locals>.iter_inputsksW����� (� 6� 6� ��W��w�-�-���q� � �!�A�o��W�~�5�5�5�5�5�  6� 6r3c �������jrj�rjtt��������fd��D��}|r?td|�d�fd�|D���d��dt �����d� ���dSdSdS)Nc�l��g|]0}t�|��t����k�.|��1Sr/�rm)r9r:� first_col�processed_inputss ��r1rczRMappedExamplesIterable._iter.<locals>.validate_function_output.<locals>.<listcomp>xsH�������s�;K�C�;P�7Q�7Q�UX�Yi�js�Yt�Uu�Uu�7u�7u�C�7u�7u�7ur3z!Column lengths mismatch: columns z have length c�:��g|]}t�|����Sr/r')r9r:r)s �r1rczRMappedExamplesIterable._iter.<locals>.validate_function_output.<locals>.<listcomp>}sB���TD�TD�TD�sv�TW�Xh�il�Xm�Tn�Tn�TD�TD�TDr3z while z has length �.)rrnrorBrm)r)�bad_colsr(r�s` @�r1�validate_function_outputz>MappedExamplesIterable._iter.<locals>.validate_function_outputus������|� � 0� � ��&6�!7�!7�8�8� ������#3�������$�\�H�\�\�TD�TD�TD�TD�{C�TD�TD�TD�\�\�!*�\�\�8;�<L�Y�<W�8X�8X�\�\�\���� � � � � �r3c����|\}��j��gn�fd��jD��}d}�jr||fz }t���}|||�jfS)Nc� ��g|] }�|�� Sr/r/r8s �r1rczHMappedExamplesIterable._iter.<locals>.prepare_inputs.<locals>.<listcomp>�s���Co�Co�Co�UX�G�C�L�Co�Co�Cor3r/)rr�rdr)r�r r�fn_args�additional_args�inputsr4r�s @�r1�prepare_inputsz4MappedExamplesIterable._iter.<locals>.prepare_inputs�sq����&�L�C��#'�#5�#=�w�i�i�Co�Co�Co�Co�\`�\n�Co�Co�Co�G� �O�� � &��G�:�%���'�]�]�F��7�O�T�^�C� Cr3c�|���|���jr"�jD]}||vr||=||dur||vr||=�i|�|�}|Sr��r)r�r2r)rS�transformed_inputsr�r-s ��r1�prepare_outputsz5MappedExamplesIterable._iter.<locals>.prepare_outputs�s|��� $� $�%5� 6� 6� 6��"� 0��,�0�0�A��F�{�{�"�1�I�'�;�q�>�9�9�a�CS�>S�>S�,�Q�/��!?�F�!?�.>�!?� �%� %r3c�d���||��\}}}}� jg|�|�Ri|��}�|||��S)z8Utility to apply the function on a selection of columns.�r�� r�r r2r0r1rr)r3r7r�s ���r1�apply_functionz4MappedExamplesIterable._iter.<locals>.apply_function�sX���:H�.��V]�:^�:^� 7�F�G�_�i�,�t�}�U�g�U��U�U�U�9�U�U� 
�"�?�;��8H�I�I� Ir3c��t�K��||��\}}}}� jg|�|�Ri|���d{V��}�|||��S)zLUtility to apply the function on a selection of columns. Same code but asyncNr9r:s ���r1�async_apply_functionz:MappedExamplesIterable._iter.<locals>.async_apply_function�so�����:H�.��V]�:^�:^� 7�F�G�_�i�%2�T�]�%[�G�%[�o�%[�%[�%[�QZ�%[�%[�[�[�[�[�[�[� �"�?�;��8H�I�I� Ir3c 3�0�K��jr � ��n ���}tj�j���r��jr2�j���}|�jd<d}�jd}g}|D�]\}}|�|������� ||������t����j kr��� tj �tj�����\}}�rht|���j krP�� tj �tj�����\}}�rt|���j k�Pt���d�j zkr�� �d���r��d���r�|�d����d��} }|| ���fV��jr'| |ur#|�jd<d�jd<|�jd<d\}}�r�d������jr'|�%�r#�j���}�d}� }���rS|d�� �d��fV�|�d����d��f��QdSdS�jr<�jr5�j����jd<d�jd<� �jd<|D]p\}}�jr�js � �jd<|� ||��fV��jr<�jr5�j����jd<d�jd<� �jd<�qdS) Nr#r)� return_when� rr�NN�����)r�inspect�iscoroutinefunctionr�r�rr�r7� create_taskrmr�run_until_complete�asyncio�wait�FIRST_COMPLETED�done�popr�)�inputs_iteratorr#�previous_state_taskrr rlr�rJ�pending�taskr;r=rr"r$�loopr��taskss ��������r1� iter_outputsz2MappedExamplesIterable._iter.<locals>.iter_outputs�s'�����7;�|�V�1�1�3�3�3�����O��*�4�=�9�9�5 Y��#�`�%)�%5�%@�%@�%B�%B�N�9G�D�$�%5�6�*.�'�15�1A�B^�1_�.�=?��&5�A�A�N�A�{��N�N�1�%�%�%��L�L��!1�!1�2F�2F�{�TU�2V�2V�!W�!W�X�X�X��5�z�z�T�%Y�Y�Y�(,�(?�(?�#�L��G�<S�T�T�T�)�)� ��g�$���G� � ��8l�(l�(l�,0�,C�,C� '� �U��@W� X� X� X�-�-�M�D�'�$���G� � ��8l�(l�(l��5�z�z�R�$�*^�%^�^�^��/�/��a��9�9�9��M�E�!�H�M�M�O�O�M�")�+�+�a�.�.�%�)�)�A�,�,�4��������.�.�.�.��+�M��8K�0K�0K�AO�D�,�-=�>�TU�D�,�-P�Q�Mg�D�,�-I�J�BL�?�N�,?� �M�E�!�H�M�M�O�O�M��'�A�,?�,G�E�,G�)-�)9�)D�)D�)F�)F��.3�B�i�+�5@�2���1�!�!�*�d�&=�&=�e�A�h�&G�&G�G�G�G�G��K�K��N�N�E�I�I�a�L�L�0�0��1�1�1�1�1��#�U��|�U�=A�=M�=X�=X�=Z�=Z��(�)9�:�PQ��(�)L�M�IT��(�)E�F�&5� Y� Y�N�A�{��'�Y�#�|�Y�MX�D�,�-I�J��^�^�K��;�;�;�;�;�;��'�Y��<�Y�AE�AQ�A\�A\�A^�A^�D�,�-=�>�TU�D�,�-P�Q�MX�D�,�-I�J�� Y� Yr3c3�HK�|]\}}t|��D]}||fV�� �dSr.)rr)r9r�transformed_batch�transformed_examples r1r;z/MappedExamplesIterable._iter.<locals>.<genexpr>�sa������.��.�/A�BS�/T�/T���,��-�.�������r3rz Canceling z async tasks.�KeyboardInterrupt)�msgzTasks canceled.)r�rr�rorr r r�r�recursive_tensorizerCrDr�rG�get_running_loopr��new_event_loopr r7r� ExceptionrV�logger�debugrm�cancelrF�gather�CancelledErrorrB)r��num_examples_to_skiprrR�outputsrrUrOr;r=rr!r"r$r�rPr3r7rQr-s` @@@@@@@@@@@@r1rzMappedExamplesIterable._iter>s'�����������������HL�HX�_�d�&�'C�D�D�^_� � � � %�� 0�1A� B� %� � � ,� ,�T�-=�>N�-O� P� P� P�#'�#3�4W�#X� � �#$� ���(�)�)�� �?� �%�d�o�&A�B�B�I�;E�i�Q`�;a�;a�k�)�7�7�gk�K�K��K� ,� ,� ,� ,� ,� ,� ,� ,�6 6� 6� 6� 6� 6� 6� � � � � � D� D� D� D� D� &� &� &� &� &� &� J� J� J� J� J� J� J�  J� J� J� J� J� J� J� %'�� � &�t�}� 5� 5� � 0��/�1�1����� 0� 0� 0��-�/�/���� 0���� � '� .� .��e�}� =� =� =� =��D�8 Y�8 Y�8 Y�8 Y�8 Y�8 Y�8 Y�8 Y�8 Y�8 Y�8 Y�8 Y�t �"�l�n�n�G��|� ���29����� -4� /� /�(��(��#�O��(8�9I�(J�(V��$�%H�I�I�I�Q�N�I�I�I�'�!�+�+�(�A�-�(���.�.�.�.�.�.�  /� /���,�-� � � �� 4�� � �C�#�e�*�*�C�C�C�D�D�D�!�9�9�D��K�K�$7�K�8�8�8�8�4��+�+�G�N�E�,B�C�C�C�C���.� �;�4�4�4��L�L�!2�3�3�3�3�3�4���� � ���sD� D � E�?E�/A G�AJ�+!I � J� 0J�=J�?J�Jrc #�� � K�|jrt|jj��n t��}|jjr|j���}n*t |j|jr|jnd|j ���}|j r@|j dr3|j� |j d��|j d}nd}|j r-|�+|j� ��|j d<d|j d<|j r |j dnd� |D�]q\}� |jr)|j�"t� ��|jkr |j rdS|j�|�� ��gn� fd�|jD��}|jrX|jr<|�� fd�t%t� ����D����n|�� ��|j|i|j��}t+|��}t-|t.j��s0t3d |j�d t7|���d |j�d ����|jr@|jD]8} | |jvr-|�|j�| ����}�9|�C� t� ��z � |j r"|j dxxt� ��z cc<||fV����tA|�!|� ����D]=\} } � dz � |j r|j dxxdz cc<|dkr|dz}�2|�d| ��| fV��>|j rM|j� ��|j d<d|j d<|j dxxt� ��z cc<��sdS)Nr�rtrur#rrrc� ��g|] }�|�� Sr/r/)r9r:rZs 
�r1rcz6MappedExamplesIterable._iter_arrow.<locals>.<listcomp> s���B�B�B��h�s�m�B�B�Br3c���g|]}�|z��Sr/r/rs �r1rcz6MappedExamplesIterable._iter_arrow.<locals>.<listcomp>$s���)X�)X�)X�a�+��/�)X�)X�)Xr3z(Provided `function` which is applied to z returns a variable of type z*. Make sure provided `function` returns a z to update the dataset.rry)"rr r rrr�r�rrtrur�r�r�rmrrr�r7rpr�rr�r�rQrRrV� table_typerUrrI� remove_columnr�r5r)r�rrr�rar� function_argsr�� output_tablerJrlr rrZs @@r1r�z"MappedExamplesIterable._iter_arrowsl������RV�Ra�$w�M�$�/�2M�$N�$N�$N�gu�gw�gw� � � � &� ��'�2�2�4�4�H�H�(�� �.2�l�A�4�?�?�� $� 4����H� � � %�� 0�1A� B� %� � � ,� ,�T�-=�>N�-O� P� P� P�#'�#3�4W�#X� � �#$� � � � F� � 9�15�1A�1L�1L�1N�1N�D� �-� .�DE�D� �@� A�HL�HX�_�d�&�'C�D�D�^_� �%�3 T�3 T�M�C��� � ��O�/���M�M�D�O�3�3��(�4�����%�-��'�'��1�1�2�2�B�B�B�B�t�/A�B�B�B� � � � 6��<�6�!�(�(�)X�)X�)X�)X�5��X���CW�CW�)X�)X�)X�Y�Y�Y�Y�!�(�(��5�5�5�"�T�]�M�D�T�^�D�D�F�1�&�9�9�L��l�B�H�5�5� ��}�y�?S�}�}��F�|�|�}�}�OX�Oc�}�}�}���� �"� k�"�1�k�k�F���!:�:�:�'3�'A�'A�,�B[�Ba�Ba�bh�Bi�Bi�'j�'j� ���$��s�8�}�}�,� ��#�T��$�%A�B�B�B�c�(�m�m�S�B�B�B��<�'�'�'�'�'�&/� �0F�0F�Ub�0F�0c�0c�&d�&d�4�4�N�A�{��1�$�K��'�S��(�)L�M�M�M�QR�R�M�M�M�+�a�/�/�,��1�,� � �,�,�1�,�,� �3�3�3�3�3��#�T�9=�9I�9T�9T�9V�9V�D�$�%5�6�LM�D�$�%H�I��$�%A�B�B�B�c�(�m�m�S�B�B�B��g3 T�3 Tr3r�c���t|j�|��|j|j|j|j|j|j|j |j |j |j |j �� � S)�&Shuffle the wrapped examples iterable.� r�r�rrrtrurrrr�r)r�rr�r�r�rrrtrurrrr�rr�s r1r�z+MappedExamplesIterable.shuffle_data_sourcesIsi��%� � � 1� 1�)� <� <��]��*��,��L��� �0��.��n����]�<@�<p�  �  �  � r3Tr�r�c���t|j�|||���|j|j|j|j|j|j|j |j |j |j |j �� � S)r�r�rm)r�rr�r�r�rrrtrurrrr�rr�s r1r�z)MappedExamplesIterable.shard_data_sourcesZsp��%� � � /� /� �E�j� /� Y� Y��]��*��,��L��� �0��.��n����]�<@�<p�  �  �  � r3c��|jjSr.rDr�s r1r�z!MappedExamplesIterable.num_shardskrEr3) FNFr�FNNNNNr.r�)!r�r�r�r�r r�rrCr}r�rdrr�r�r�r�r�r�r�rrr�r�rQrRr�r�r�r�r�r�r�r�r�s@r1r�r��sp������� #�-1��$(� %�.2�$(�37�'+�IM�,e�,e�*�,e��,e�� ,e�  ��S� �*� ,e� � ,e��S�M�,e��,e�!��c��+�,e��D�>�,e��/�0�,e��8�$�,e�:B�#��,e�,e�,e�,e�,e�,e�\�$�$��X�$��)�)��X�)�����X�� �$� � � � �$�$�$�A�A�A�FFT�FT��#��FT�(�5�QT�VX�V^�Q^�K_�B`�FT�FT�FT�FT�P �b�i�.A� �F^� � � � �" � �S� �� �Rj� � � � �"�+�C�+�+�+��X�+�+�+�+�+r3r��input�mask�mask_column_namec��t|tj��rjt|ttjtjf��s(tj|gtj�����}|�||��S||iS)N)rU) r�rQrRrC�Array� ChunkedArrayrk�bool_� append_column)rprqrrs r1� _add_maskrxpss�� �%���"�"�(��$��r�x��� A�B�B� 5��8�T�F�����4�4�4�D��"�"�#3�T�:�:�:� �$�'�'r3� mask_functionc�<�||g|�Ri|��}t|||��Sr.�rx�ryrprr�argsr�rqs r1�add_maskr~}s4�� �=�� 0�� 0� 0� 0�� 0� 0�D� �U�D�"2� 3� 3�3r3c��LK�||g|�Ri|���d{V��}t|||��Sr.r{r|s r1�async_add_maskr��sL������u�6�t�6�6�6�v�6�6� 6� 6� 6� 6� 6� 6�D� �U�D�"2� 3� 3�3r3c���eZdZdZ ddedededeee d ed ee d ee d ed f�fd� Z �fd�Z ddee f�fd� Zdee ddfd�Zdde de ddfd�Zede fd���Z�xZS)�FilteredExamplesIterablez ===MASK===FNr�rr�r�rrrtrrrc �@��||_|jr-ti|j�|jt d��i���} nd} t ���|ttj |��rtnt||j���||||||| �� � dS)Nr�)rr) rr�r�rrrtrrr�) ryr�rr�rrrr�r�rrCrDr�r~) r�rr�r�rrrtrrr�r�s �r1r�z!FilteredExamplesIterable.__init__�s����&��� � � �� ^�;�#7� ^��9N�PU�V\�P]�P]� ^� ^�_�_�H�H��H� �����#��")�"=�h�"G�"G�U���X��!%�!6���� &�'��!��!�� � � � � � r3c#��K�t�����D]4\}}t|��}|�|j��r||fV��5dSr.)r�rrdrKrr)r�rr4r�s �r1rzFilteredExamplesIterable._iter�sc�����!�G�G�M�M�O�O� #� #�L�C���7�m�m�G��{�{�4�0�1�1� #��7�l�"�"�"�� #� #r3rc#���K�t���|���D]C\}}||j}||�|j���|��fV��DdS)Nr)r�r�rr�drop�filter)r�rrrZrqr�s �r1r�z$FilteredExamplesIterable._iter_arrow�s{�����"�W�W�0�0�}�0�M�M� I� 
I�M�C���D�1�2�D��x�}�}�T�%:�;�;�B�B�4�H�H�H� H� H� H� H� I� Ir3�seedrOc ��t|j�|��|j|j|j|j|j|j|j ���S)rl�r�r�rrrtrr) r�rr�ryr�rrrtrr)r�r�s r1r�z-FilteredExamplesIterable.shuffle_data_sources�sS��'� � � 1� 1�$� 7� 7��'��*��,��L����n���  �  �  � r3Tr�r�c ��t|j�|||���|j|j|j|j|j|j|j ���S)r�r�r�) r�rr�ryr�rrrtrrr�s r1r�z+FilteredExamplesIterable.shard_data_sources�sZ��'� � � /� /� �E�j� /� Y� Y��'��*��,��L����n���  �  �  � r3c��|jjSr.rDr�s r1r�z#FilteredExamplesIterable.num_shards�rEr3)FNFr�NNr.r�)r�r�r�rrr�r r�rrCr}r�rdr�rr�r�r�r�r�r�r�s@r1r�r��s��������#�� #�-1��$(�$(�37� � �*� �� ��  �  ��S� �*�  � �  ��S�M� ��D�>� ��/�0� � � � � � �@#�#�#�#�#� I�I��#��I�I�I�I�I�I�  ��#��  �;U�  �  �  �  �  �  �S�  ��  �Rl�  �  �  �  ��+�C�+�+�+��X�+�+�+�+�+r3r�c �.��eZdZdededejjf�fd� Ze d���Z e d���Z de fd�Z d e de f�fd � Zedd ejjdedeefd ���Zd�Zdejjddfd�Zddededdfd�Ze defd���Z�xZS)�BufferShuffledExamplesIterabler� buffer_sizer�c�r��t�����||_||_||_dSr.)r�r�rr�r�)r�rr�r�r�s �r1r�z'BufferShuffledExamplesIterable.__init__�s4��� ��������&���&���"����r3c��|jjSr.rr�s r1r�z'BufferShuffledExamplesIterable.is_typed�rr3c��|jjSr.r r�s r1r�z'BufferShuffledExamplesIterable.features�rr3rOc�~�|j���|_|���|_|jSr.)rr�r�r��_original_state_dictr�s r1r�z/BufferShuffledExamplesIterable._init_state_dict�s4���+�<�<�>�>���$(�O�O�$5�$5��!���r3r�c���|jr%||jkrt�d��t ���|��S)Nz�Loading a state dict of a shuffle buffer of a dataset without the buffer content.The shuffle buffer will be refilled before starting to yield new examples.)r�r�r\�warningr�r�)r�r�r�s �r1r�z.BufferShuffledExamplesIterable.load_state_dict�sR��� � � ��T�6�6�6����a�����w�w�&�&�z�2�2�2r3r�r�c#�ZK� d�|�d||���D��Ed{V���))NTc3�4K�|]}t|��V��dSr.)r�)r9rls r1r;zFBufferShuffledExamplesIterable._iter_random_indices.<locals>.<genexpr>�s(����]�]�1��A���]�]�]�]�]�]r3rr�)r�)r�r�r�s r1�_iter_random_indicesz3BufferShuffledExamplesIterable._iter_random_indices�sK���� ^�]�]�� � �Q� �J[� �(\�(\�]�]�]� ]� ]� ]� ]� ]� ]� ]� ^r3c#�LK�|j}t|j��}|�||��}g}|jD]I}t |��|krt |��}||V�|||<�4|�|���J|�|��|Ed{V��dSr.) 
r�rr�r�rrmrnr7r�)r�r�r�r�� mem_bufferr0rls r1r�z'BufferShuffledExamplesIterable.__iter__�s������&� ��t�~�&�&���4�4�S�+�F�F��� ��!� %� %�A��:���+�-�-��)�*�*�� ��m�#�#�#� !� �1� � ��!�!�!�$�$�$�$� � � �J�������������r3c�`�t|j�|��|j|���S)zFShuffle the wrapped examples iterable as well as the shuffling buffer.�r�r�)r�rr�r�r�s r1r�z3BufferShuffledExamplesIterable.shuffle_data_sources s5��-� � � 1� 1�)� <� <�$�JZ�fo� � � � r3Tr�r�c�p�t|j�|||���|j|j���S)r�r�r�)r�rr�r�r�r�s r1r�z1BufferShuffledExamplesIterable.shard_data_sourcess?��-� � � /� /� �E�j� /� Y� Y��(��n� � � � r3c��|jjSr.rDr�s r1r�z)BufferShuffledExamplesIterable.num_shardsrEr3)r�r�)r�r�r�r�r�r�r�r�r�r�r�r�rdr�r�� staticmethodrr�r�r�r�r�r�r�s@r1r�r��s��������#�$9�#��#�XZ�Xa�Xk�#�#�#�#�#�#��)�)��X�)��)�)��X�)� �$� � � � � 3�$�3�4�3�3�3�3�3�3��^�^�"�)�"5�^�C�^�dl�mp�dq�^�^�^��\�^����" �b�i�.A� �Ff� � � � �  � �S� �� �Rr� � � � ��+�C�+�+�+��X�+�+�+�+�+r3r�c ����eZdZ ddedededef�fd� Zed���Zed���Z d e fd �Z d �Z e d ���Zd ejjd dfd�Zddeded dfd�Zed efd���Z�xZS)�SkipExamplesIterableTr�n�"block_sources_order_when_shuffling�split_when_shardingc���t�����||_||_||_||_dSr.�r�r�rr�r�r��r�rr�r�r�r�s �r1r�zSkipExamplesIterable.__init__�@��� ��������&������2T��/�#6�� � � r3c��|jjSr.rr�s r1r�zSkipExamplesIterable.is_typed,rr3c��|jjSr.r r�s r1r�zSkipExamplesIterable.features0rr3rOc�h�d|j���|jjd�|_|jS)NF)�skippedr"rUr'r�s r1r�z%SkipExamplesIterable._init_state_dict4s9���!%�!1�!B�!B�!D�!D��N�+� � ��� ��r3c#�K�|jr|jdrdn|j}|jr d|jd<t|j|d��Ed{V��dS)Nr�rT)r�r�r r)r�r�s r1r�zSkipExamplesIterable.__iter__<so����%)�%5� a�$�:J�9�:U� a���[_�[a�� � � /�*.�D� �Y� '��$�*�,A�4�H�H�H�H�H�H�H�H�H�H�Hr3c�j�||z}||z}|g|z}t|��D]}||xxdz cc<�|Sr��rp��numr��quotient� remainderr�rls r1� split_numberz!SkipExamplesIterable.split_numberB�R���!�8���!�G� ���a����y�!�!� � �A� �1�I�I�I��N�I�I�I�I�� r3r�c��|jr|St|j�|��|j|j|j���S)zeMay not shuffle the wrapped examples iterable since it would skip examples from other shards instead.�r�r�r�)r�r�rr�r�r�r�s r1r�z)SkipExamplesIterable.shuffle_data_sourcesK�N�� � 2� ��K�'�� �5�5�i�@�@��&�37�3Z�$(�$<� ��� r3r�r�c���|jrWt|j�|||���|�|j|��||j|j���S|S�r�r�r�)r�r�rr�r�r�r�r�s r1r�z'SkipExamplesIterable.shard_data_sourcesWsm�� � #� �'�� �3�3�J��R\�3�]�]��#�#�D�F�J�7�7��>�37�3Z�$(�$<� ��� ��Kr3c��|jjSr.rDr�s r1r�zSkipExamplesIterable.num_shardscrEr3�TTr��r�r�r�r�r�r�r�r�r�r�rdr�r�r�r�r�r�r�r�r�r�r�r�s@r1r�r�sl������� 48�$(� 7� 7�*� 7� � 7�-1� 7� "� 7� 7� 7� 7� 7� 7��)�)��X�)��)�)��X�)� �$� � � � �I�I�I� ����\�� �b�i�.A� �F\� � � � � � �S� �� �Rh� � � � ��+�C�+�+�+��X�+�+�+�+�+r3r�c���eZdZdZdedeef�fd� Zdefd�Z d�Z de j j ddfd �Zd ed eddfd �Zedefd ���Z�xZS)�RepeatExamplesIterablezP Iterable that repeats the underlying iterable a given number of times. 
class RepeatExamplesIterable(_BaseExamplesIterable):
    """Iterate over the wrapped examples iterable `num_times` times (indefinitely if `num_times` is None)."""

    def __init__(self, ex_iterable: _BaseExamplesIterable, num_times: Optional[int]):
        super().__init__()
        self.ex_iterable = ex_iterable
        self.num_times = num_times

    def _init_state_dict(self) -> dict:
        self._state_dict = {"repeat_index": 0, "ex_iterable": self.ex_iterable._init_state_dict()}
        return self._state_dict

    def __iter__(self):
        # Re-yield the wrapped iterable, bumping "repeat_index" in the state dict after each full pass.
        ...

    def shuffle_data_sources(self, generator: np.random.Generator) -> "RepeatExamplesIterable":
        """Shuffle the underlying iterable, then repeat."""
        return RepeatExamplesIterable(self.ex_iterable.shuffle_data_sources(generator), num_times=self.num_times)

    def shard_data_sources(self, num_shards: int, index: int, contiguous: bool = True) -> "RepeatExamplesIterable":
        """Shard, then repeat shards."""
        ...

    @property
    def num_shards(self) -> int:
        return self.ex_iterable.num_shards


class TakeExamplesIterable(_BaseExamplesIterable):
    """Stop the wrapped examples iterable after `n` examples."""

    def __init__(
        self,
        ex_iterable: _BaseExamplesIterable,
        n: int,
        block_sources_order_when_shuffling: bool = True,
        split_when_sharding: bool = True,
    ):
        super().__init__()
        self.ex_iterable = ex_iterable
        self.n = n
        self.block_sources_order_when_shuffling = block_sources_order_when_shuffling
        self.split_when_sharding = split_when_sharding

    @property
    def is_typed(self):
        return self.ex_iterable.is_typed

    @property
    def features(self):
        return self.ex_iterable.features

    def _init_state_dict(self) -> dict:
        self._state_dict = {"num_taken": 0, "ex_iterable": self.ex_iterable._init_state_dict()}
        return self._state_dict

    def __iter__(self):
        # Yield at most `n` examples, tracking how many were already taken in the state dict.
        ...

    @staticmethod
    def split_number(num, n):
        quotient = num // n
        remainder = num % n
        result = [quotient] * n
        for i in range(remainder):
            result[i] += 1
        return result

    def shuffle_data_sources(self, generator: np.random.Generator) -> "TakeExamplesIterable":
        """May not shuffle the wrapped examples iterable since it would take examples from other shards instead."""
        ...

    def shard_data_sources(self, num_shards: int, index: int, contiguous: bool = True) -> "TakeExamplesIterable":
        # When `split_when_sharding` is set, `n` is split across the shards with `split_number`;
        # otherwise every shard keeps the full `n`.
        ...

    @property
    def num_shards(self) -> int:
        return self.ex_iterable.num_shards


def _apply_feature_types_on_example(example: dict, features: Features, token_per_repo_id: dict) -> dict:
    # Add missing columns as None, then encode and decode the example with the given feature types.
    example = dict(example)
    for column_name in features:
        if column_name not in example:
            example[column_name] = None
    encoded_example = features.encode_example(example)
    decoded_example = features.decode_example(encoded_example, token_per_repo_id=token_per_repo_id)
    return decoded_example


def _apply_feature_types_on_batch(batch: dict, features: Features, token_per_repo_id: dict) -> dict:
    # Same as above, but missing columns are filled with a list of None values of the batch length.
    batch = dict(batch)
    n_examples = len(batch[next(iter(batch))])
    for column_name in features:
        if column_name not in batch:
            batch[column_name] = [None] * n_examples
    encoded_batch = features.encode_batch(batch)
    decoded_batch = features.decode_batch(encoded_batch, token_per_repo_id=token_per_repo_id)
    return decoded_batch


@dataclass
class FormattingConfig:
    format_type: Optional[str]

    @property
    def is_table(self) -> bool:
        return isinstance(get_formatter(self.format_type), TableFormatter)

    @property
    def is_tensor(self) -> bool:
        return isinstance(get_formatter(self.format_type), TensorFormatter)


class FormattedExamplesIterable(_BaseExamplesIterable):
    """Apply a formatting config and/or feature types on the wrapped examples iterable."""

    def __init__(
        self,
        ex_iterable: _BaseExamplesIterable,
        formatting: Optional[FormattingConfig],
        features: Optional[Features],
        token_per_repo_id: dict,
    ):
        super().__init__()
        self.ex_iterable = ex_iterable
        self.formatting = formatting
        self._features = features
        self.token_per_repo_id = token_per_repo_id

    @property
    def iter_arrow(self):
        # Arrow-level iteration is only exposed when the underlying iterable supports it or the format is a table format.
        if self.formatting and (self.ex_iterable.iter_arrow or self.formatting.is_table):
            return self._iter_arrow

    @property
    def is_typed(self):
        return self.ex_iterable.is_typed or self._features is not None

    @property
    def features(self):
        return self._features

    def _init_state_dict(self) -> dict:
        self._state_dict = self.ex_iterable._init_state_dict()
        return self._state_dict

    def __iter__(self):
        # Decode the examples with the feature types (if any) and apply the formatting function.
        ...

    def _iter_arrow(self):
        # Yield pyarrow tables, adding missing columns as null arrays and casting to the target features.
        ...

    def shuffle_data_sources(self, generator: np.random.Generator) -> "FormattedExamplesIterable":
        ...

    def shard_data_sources(self, num_shards: int, index: int, contiguous: bool = True) -> "FormattedExamplesIterable":
        ...

    @property
    def num_shards(self) -> int:
        return self.ex_iterable.num_shards


@dataclass
class ShufflingConfig:
    generator: np.random.Generator
    _original_seed: Optional[int] = None


@dataclass
class DistributedConfig:
    rank: int
    world_size: int


def _maybe_add_torch_iterable_dataset_parent_class(cls):
    """Add torch.utils.data.IterableDataset as a parent class if 'torch' is available"""
    if config.TORCH_AVAILABLE:
        import torch.utils.data

        if torch.utils.data.IterableDataset not in cls.__bases__:
            cls.__bases__ += (torch.utils.data.IterableDataset,)


def _maybe_share_with_torch_persistent_workers(value):
    # Put the value in torch shared memory so that persistent dataloader workers see updates (e.g. the epoch).
    if config.TORCH_AVAILABLE:
        import torch

        if isinstance(value, torch.Tensor):
            return value.share_memory_()
        return torch.tensor(value).share_memory_()
    return value


class IterableDataset(DatasetInfoMixin):
    """A Dataset backed by an iterable."""

    def __init__(
        self,
        ex_iterable: _BaseExamplesIterable,
        info: Optional[DatasetInfo] = None,
        split: Optional[NamedSplit] = None,
        formatting: Optional[FormattingConfig] = None,
        shuffling: Optional[ShufflingConfig] = None,
        distributed: Optional[DistributedConfig] = None,
        token_per_repo_id: Optional[dict] = None,
    ):
        # A dataset that is both distributed and shuffled requires a fixed seed, otherwise it raises:
        # "The dataset doesn't have a fixed random seed across nodes to shuffle and split the list of dataset
        #  shards by node. Please pass e.g. `seed=42` in `.shuffle()` to make all the nodes use the same seed."
        ...

    def state_dict(self) -> dict:
        """Get the current state_dict of the dataset.
        It corresponds to the state at the latest example it yielded.

        Resuming returns exactly where the checkpoint was saved except in two cases:

        1. examples from shuffle buffers are lost when resuming and the buffers are refilled with new data
        2. combinations of `.with_format(arrow)` and batched `.map()` may skip one batch.

        Returns:
            `dict`

        Example:

        ```py
        >>> from datasets import Dataset, concatenate_datasets
        >>> ds = Dataset.from_dict({"a": range(6)}).to_iterable_dataset(num_shards=3)
        >>> for idx, example in enumerate(ds):
        ...     print(example)
        ...     if idx == 2:
        ...         state_dict = ds.state_dict()
        ...         print("checkpoint")
        ...         break
        >>> ds.load_state_dict(state_dict)
        >>> print(f"restart from checkpoint")
        >>> for example in ds:
        ...     print(example)
        ```

        which returns:

        ```
        {'a': 0}
        {'a': 1}
        {'a': 2}
        checkpoint
        restart from checkpoint
        {'a': 3}
        {'a': 4}
        {'a': 5}
        ```

        ```py
        >>> from torchdata.stateful_dataloader import StatefulDataLoader
        >>> ds = load_dataset("deepmind/code_contests", streaming=True, split="train")
        >>> dataloader = StatefulDataLoader(ds, batch_size=32, num_workers=4)
        >>> # checkpoint
        >>> state_dict = dataloader.state_dict()  # uses ds.state_dict() under the hood
        >>> # resume from checkpoint
        >>> dataloader.load_state_dict(state_dict)  # uses ds.load_state_dict() under the hood
        ```
        """
        # Returns a deep copy of the current state.
        ...

    def load_state_dict(self, state_dict: dict) -> None:
        """Load the state_dict of the dataset.
        The iteration will restart at the next example from when the state was saved.

        Resuming returns exactly where the checkpoint was saved except in two cases:

        1. examples from shuffle buffers are lost when resuming and the buffers are refilled with new data
        2. combinations of `.with_format(arrow)` and batched `.map()` may skip one batch.
        """
        self._starting_state_dict = state_dict
    def __repr__(self):
        features = list(self._info.features.keys()) if self._info.features is not None else "Unknown"
        return f"IterableDataset({{\n    features: {features},\n    num_shards: {self.num_shards}\n}})"

    def __getstate__(self):
        return self.__dict__

    def __setstate__(self, d):
        self.__dict__ = d
        # Re-register torch.utils.data.IterableDataset as a parent class after unpickling.
        _maybe_add_torch_iterable_dataset_parent_class(self.__class__)

    def _head(self, n=5):
        # Return the first `n` examples as a single batch (dict of lists).
        ...

    @property
    def epoch(self) -> int:
        return int(self._epoch)

    def _effective_generator(self):
        # The shuffling generator is re-derived from the original seed and the current epoch,
        # so that `set_epoch` re-shuffles the shards and the buffer at the next iteration.
        if self._shuffling and self.epoch == 0:
            return self._shuffling.generator
        elif self._shuffling:
            effective_seed = deepcopy(self._shuffling.generator).integers(0, 1 << 63) - self.epoch
            effective_seed = (1 << 63) + effective_seed if effective_seed < 0 else effective_seed
            return np.random.default_rng(effective_seed)
        else:
            raise ValueError("This dataset is not shuffled")

    @property
    def num_shards(self) -> int:
        if self._distributed and self._ex_iterable.num_shards % self._distributed.world_size == 0:
            return self._ex_iterable.num_shards // self._distributed.world_size
        return self._ex_iterable.num_shards

    @property
    def n_shards(self) -> int:
        return self.num_shards

    def _iter_pytorch(self):
        # Iteration path used inside a torch DataLoader worker: each worker keeps a subset of the
        # shards, and workers in excess of `dataset.num_shards` are stopped. Recovered log messages:
        #   "Too many dataloader workers: ... (max is dataset.num_shards=...). Stopping ... dataloader workers."
        #   "To parallelize data loading, we give each process some shards (or data sources) to process. ..."
        #   "dataloader worker#...: Starting to iterate over .../... shards."
        ...

    def _is_main_process(self):
        if self._distributed and self._distributed.rank > 0:
            return False
        if "torch" in sys.modules:
            import torch.utils.data

            worker_info = torch.utils.data.get_worker_info()
            if worker_info is not None and worker_info.id > 0:
                return False
        return True

    def _prepare_ex_iterable_for_iteration(
        self, batch_size: int = 1, drop_last_batch: bool = False
    ) -> _BaseExamplesIterable:
        # Build the examples iterable that is actually iterated on: apply formatting and feature decoding,
        # shuffle the shards with the effective (epoch-dependent) generator, and split the shards between
        # nodes when a DistributedConfig is set. When `num_shards` is not a multiple of `world_size`,
        # each node keeps 1 example out of `world_size` instead (logged, as it is less optimized).
        ...

    def __iter__(self):
        if "torch" in sys.modules:
            import torch.utils.data

            worker_info = torch.utils.data.get_worker_info()
            if isinstance(self, torch.utils.data.IterableDataset) and worker_info is not None:
                # We're a torch.utils.data.IterableDataset in a dataloader worker
                yield from self._iter_pytorch()
                return
        ex_iterable = self._prepare_ex_iterable_for_iteration()
        # Decode the examples with the dataset features (if known) and yield them.
        ...

    def iter(self, batch_size: int, drop_last_batch: bool = False):
        """Iterate through the batches of size `batch_size`.

        Args:
            batch_size (`int`): size of each batch to yield.
            drop_last_batch (`bool`, defaults to `False`): Whether a last batch smaller than the batch_size should be dropped.
        """
        ...

    @staticmethod
    def from_generator(
        generator: Callable,
        features: Optional[Features] = None,
        gen_kwargs: Optional[dict] = None,
        split: NamedSplit = Split.TRAIN,
    ) -> "IterableDataset":
        """Create an Iterable Dataset from a generator.

        Args:
            generator (`Callable`):
                A generator function that `yields` examples.
            features (`Features`, *optional*):
                Dataset features.
            gen_kwargs (`dict`, *optional*):
                Keyword arguments to be passed to the `generator` callable.
                You can define a sharded iterable dataset by passing the list of shards in `gen_kwargs`.
                This can be used to improve shuffling and when iterating over the dataset with multiple workers.
            split ([`NamedSplit`], defaults to `Split.TRAIN`):
                Split name to be assigned to the dataset.

                <Added version="2.21.0"/>

        Returns:
            `IterableDataset`

        Example:

        ```py
        >>> def gen():
        ...     yield {"text": "Good", "label": 0}
        ...     yield {"text": "Bad", "label": 1}
        ...
        >>> ds = IterableDataset.from_generator(gen)
        ```

        ```py
        >>> def gen(shards):
        ...     for shard in shards:
        ...         with open(shard) as f:
        ...             for line in f:
        ...                 yield {"line": line}
        ...
        >>> shards = [f"data{i}.txt" for i in range(32)]
        >>> ds = IterableDataset.from_generator(gen, gen_kwargs={"shards": shards})
        >>> ds = ds.shuffle(seed=42, buffer_size=10_000)  # shuffles the shards order + uses a shuffle buffer
        >>> from torch.utils.data import DataLoader
        >>> dataloader = DataLoader(ds.with_format("torch"), num_workers=4)  # give each worker a subset of 32/4=8 shards
        ```
        """
        from .io.generator import GeneratorDatasetInputStream

        return GeneratorDatasetInputStream(
            generator=generator, features=features, gen_kwargs=gen_kwargs, streaming=True, split=split
        ).read()
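    # Usage sketch (the repo id mirrors the docstring examples in this class): `iter(batch_size=...)`
    # above yields dicts of lists, one list per column, instead of single examples.
    #
    #     from datasets import load_dataset
    #
    #     ds = load_dataset("cornell-movie-review-data/rotten_tomatoes", split="train", streaming=True)
    #     for batch in ds.iter(batch_size=8, drop_last_batch=True):
    #         texts, labels = batch["text"], batch["label"]  # lists of 8 items each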
    @staticmethod
    def from_spark(
        df: "pyspark.sql.DataFrame",
        split: Optional[NamedSplit] = None,
        features: Optional[Features] = None,
        **kwargs,
    ) -> "IterableDataset":
        """Create an IterableDataset from Spark DataFrame. The dataset is streamed to the driver in batches.

        Args:
            df (`pyspark.sql.DataFrame`):
                The DataFrame containing the desired data.
            split (`NamedSplit`, *optional*):
                Split name to be assigned to the dataset.
            features (`Features`, *optional*):
                Dataset features.

        Returns:
            [`IterableDataset`]

        Example:

        ```py
        >>> df = spark.createDataFrame(
        >>>     data=[[1, "Elia"], [2, "Teo"], [3, "Fang"]],
        >>>     columns=["id", "name"],
        >>> )
        >>> ds = IterableDataset.from_spark(df)
        ```
        """
        from .io.spark import SparkDatasetReader

        if sys.platform == "win32":
            raise OSError("IterableDataset.from_spark is not currently supported on Windows")

        return SparkDatasetReader(df, split=split, features=features, streaming=True, **kwargs).read()

    @staticmethod
    def from_file(filename: str) -> "IterableDataset":
        """Instantiate a IterableDataset from Arrow table at filename.

        Args:
            filename (`str`):
                File name of the dataset.

        Returns:
            [`IterableDataset`]
        """
        pa_table_schema = read_schema_from_file(filename)
        inferred_features = Features.from_arrow_schema(pa_table_schema)
        # Stream the record batches of the Arrow file with `Dataset._generate_tables_from_cache_file`.
        ...

    def with_format(self, type: Optional[str] = None) -> "IterableDataset":
        """Return a dataset with the specified format.

        Args:
            type (`str`, *optional*):
                Either output type selected in `[None, 'numpy', 'torch', 'tensorflow', 'jax', 'arrow', 'pandas', 'polars']`.
                `None` means it returns python objects (default).

        Example:

        ```py
        >>> from datasets import load_dataset
        >>> from transformers import AutoTokenizer
        >>> ds = load_dataset("cornell-movie-review-data/rotten_tomatoes", split="validation", streaming=True)
        >>> tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
        >>> ds = ds.map(lambda x: tokenizer(x['text'], truncation=True, padding=True), batched=True)
        >>> ds = ds.with_format("torch")
        >>> next(iter(ds))
        {'text': 'compassionately explores the seemingly irreconcilable situation ...',
         'label': tensor(1),
         'input_ids': tensor([  101, 18027, 16310, ...]),
         'token_type_ids': tensor([0, 0, 0, ...]),
         'attention_mask': tensor([1, 1, 1, ...])}
        ```
        """
        type = get_format_type_from_alias(type)
        # Return a copy of the dataset with a FormattingConfig for the requested format type.
        ...
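    # Usage sketch (the .arrow path is hypothetical): `from_file` above streams examples straight
    # from an Arrow file on disk, inferring the features from the file schema.
    #
    #     from datasets import IterableDataset
    #
    #     ds = IterableDataset.from_file("path/to/data.arrow")
    #     print(ds.features)
    #     print(next(iter(ds)))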
    def map(
        self,
        function: Optional[Callable] = None,
        with_indices: bool = False,
        input_columns: Optional[Union[str, list]] = None,
        batched: bool = False,
        batch_size: Optional[int] = 1000,
        drop_last_batch: bool = False,
        remove_columns: Optional[Union[str, list]] = None,
        features: Optional[Features] = None,
        fn_kwargs: Optional[dict] = None,
    ) -> "IterableDataset":
        """
        Apply a function to all the examples in the iterable dataset (individually or in batches) and update them.
        If your function returns a column that already exists, then it overwrites it.
        The function is applied on-the-fly on the examples when iterating over the dataset.

        You can specify whether the function should be batched or not with the `batched` parameter:

        - If batched is `False`, then the function takes 1 example in and should return 1 example.
          An example is a dictionary, e.g. `{"text": "Hello there !"}`.
        - If batched is `True` and `batch_size` is 1, then the function takes a batch of 1 example as input and can return a batch with 1 or more examples.
          A batch is a dictionary, e.g. a batch of 1 example is `{"text": ["Hello there !"]}`.
        - If batched is `True` and `batch_size` is `n` > 1, then the function takes a batch of `n` examples as input and can return a batch with `n` examples, or with an arbitrary number of examples.
          Note that the last batch may have less than `n` examples.
          A batch is a dictionary, e.g. a batch of `n` examples is `{"text": ["Hello there !"] * n}`.

        If the function is asynchronous, then `map` will run your function in parallel, with up to one thousand simultaneous calls.
        It is recommended to use an `asyncio.Semaphore` in your function if you want to set a maximum number of operations that can run at the same time.
        Args:
            function (`Callable`, *optional*, defaults to `None`):
                Function applied on-the-fly on the examples when you iterate on the dataset.
                It must have one of the following signatures:

                - `function(example: Dict[str, Any]) -> Dict[str, Any]` if `batched=False` and `with_indices=False`
                - `function(example: Dict[str, Any], idx: int) -> Dict[str, Any]` if `batched=False` and `with_indices=True`
                - `function(batch: Dict[str, List]) -> Dict[str, List]` if `batched=True` and `with_indices=False`
                - `function(batch: Dict[str, List], indices: List[int]) -> Dict[str, List]` if `batched=True` and `with_indices=True`

                For advanced usage, the function can also return a `pyarrow.Table`.
                If the function is asynchronous, then `map` will run your function in parallel.
                Moreover if your function returns nothing (`None`), then `map` will run your function and return the dataset unchanged.
                If no function is provided, it defaults to the identity function: `lambda x: x`.
            with_indices (`bool`, defaults to `False`):
                Provide example indices to `function`. Note that in this case the signature of `function` should be `def function(example, idx[, rank]): ...`.
            input_columns (`Optional[Union[str, List[str]]]`, defaults to `None`):
                The columns to be passed into `function` as positional arguments. If `None`, a dict mapping to all formatted columns is passed as one argument.
            batched (`bool`, defaults to `False`):
                Provide batch of examples to `function`.
            batch_size (`int`, *optional*, defaults to `1000`):
                Number of examples per batch provided to `function` if `batched=True`.
                If `batch_size <= 0` or `batch_size is None`, the full dataset is provided as a single batch to `function`.
            drop_last_batch (`bool`, defaults to `False`):
                Whether a last batch smaller than `batch_size` should be dropped instead of being processed by the function.
            remove_columns (`List[str]`, *optional*, defaults to `None`):
                Remove a selection of columns while doing the mapping.
                Columns will be removed before updating the examples with the output of `function`,
                i.e. if `function` is adding columns with names in `remove_columns`, these columns will be kept.
            features (`Features`, *optional*, defaults to `None`):
                Feature types of the resulting dataset.
            fn_kwargs (`Dict`, *optional*, defaults to `None`):
                Keyword arguments to be passed to `function`.

        Example:

        ```py
        >>> from datasets import load_dataset
        >>> ds = load_dataset("cornell-movie-review-data/rotten_tomatoes", split="train", streaming=True)
        >>> def add_prefix(example):
        ...     example["text"] = "Review: " + example["text"]
        ...     return example
        >>> ds = ds.map(add_prefix)
        >>> list(ds.take(3))
        [{'label': 1, 'text': 'Review: the rock is destined to be the 21st century's new " conan " ...'},
         {'label': 1, 'text': 'Review: the gorgeously elaborate continuation of " the lord of the rings " trilogy ...'},
         {'label': 1, 'text': 'Review: effective but too-tepid biopic'}]
        ```
        """
        ...

    def filter(
        self,
        function: Optional[Callable] = None,
        with_indices: bool = False,
        input_columns: Optional[Union[str, list]] = None,
        batched: bool = False,
        batch_size: Optional[int] = 1000,
        fn_kwargs: Optional[dict] = None,
    ) -> "IterableDataset":
        """
        Apply a filter function to all the elements so that the dataset only includes examples according to the filter function.
        The filtering is done on-the-fly when iterating over the dataset.

        If the function is asynchronous, then `filter` will run your function in parallel, with up to one thousand simultaneous calls.
        It is recommended to use an `asyncio.Semaphore` in your function if you want to set a maximum number of operations that can run at the same time.

        Args:
            function (`Callable`):
                Callable with one of the following signatures:

                - `function(example: Dict[str, Any]) -> bool` if `with_indices=False, batched=False`
                - `function(example: Dict[str, Any], indices: int) -> bool` if `with_indices=True, batched=False`
                - `function(example: Dict[str, List]) -> List[bool]` if `with_indices=False, batched=True`
                - `function(example: Dict[str, List], indices: List[int]) -> List[bool]` if `with_indices=True, batched=True`

                If the function is asynchronous, then `filter` will run your function in parallel.
                If no function is provided, it defaults to an always-True function: `lambda x: True`.
            with_indices (`bool`, defaults to `False`):
                Provide example indices to `function`. Note that in this case the signature of `function` should be `def function(example, idx): ...`.
            input_columns (`str` or `List[str]`, *optional*):
                The columns to be passed into `function` as positional arguments. If `None`, a dict mapping to all formatted columns is passed as one argument.
            batched (`bool`, defaults to `False`):
                Provide batch of examples to `function`.
            batch_size (`int`, *optional*, defaults to `1000`):
                Number of examples per batch provided to `function` if `batched=True`.
            fn_kwargs (`Dict`, *optional*, defaults to `None`):
                Keyword arguments to be passed to `function`.

        Example:

        ```py
        >>> from datasets import load_dataset
        >>> ds = load_dataset("cornell-movie-review-data/rotten_tomatoes", split="train", streaming=True)
        >>> ds = ds.filter(lambda x: x["label"] == 0)
        >>> list(ds.take(3))
        [{'label': 0, 'movie_review': 'simplistic , silly and tedious .'},
         {'label': 0, 'movie_review': "it's so laddish and juvenile , only teenage boys could possibly find it funny ."},
         {'label': 0, 'movie_review': 'exploitative and largely devoid of the depth or sophistication ...'}]
        ```
        """
        ...
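    # Usage sketch (repo id and tokenizer name mirror the docstring examples; both are illustrative):
    # a small on-the-fly preprocessing pipeline combining batched `map` and `filter` on a streaming dataset.
    #
    #     from datasets import load_dataset
    #     from transformers import AutoTokenizer
    #
    #     tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
    #     ds = load_dataset("cornell-movie-review-data/rotten_tomatoes", split="train", streaming=True)
    #     ds = ds.filter(lambda x: len(x["text"]) > 0)
    #     ds = ds.map(
    #         lambda batch: tokenizer(batch["text"], truncation=True),
    #         batched=True,
    #         batch_size=256,
    #         remove_columns=["text"],
    #     )
    #     print(next(iter(ds)).keys())  # e.g. dict_keys(['label', 'input_ids', 'token_type_ids', 'attention_mask'])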
    def shuffle(
        self, seed=None, generator: Optional[np.random.Generator] = None, buffer_size: int = 1000
    ) -> "IterableDataset":
        """
        Randomly shuffles the elements of this dataset.

        This dataset fills a buffer with `buffer_size` elements, then randomly samples elements from this buffer,
        replacing the selected elements with new elements. For perfect shuffling, a buffer size greater than or
        equal to the full size of the dataset is required.

        For instance, if your dataset contains 10,000 elements but `buffer_size` is set to 1000, then `shuffle` will
        initially select a random element from only the first 1000 elements in the buffer. Once an element is
        selected, its space in the buffer is replaced by the next (i.e. 1,001-st) element, maintaining the
        1000-element buffer.

        If the dataset is made of several shards, it also shuffles the order of the shards.
        However, if the order has been fixed by using [`~datasets.IterableDataset.skip`] or
        [`~datasets.IterableDataset.take`], then the order of the shards is kept unchanged.

        Args:
            seed (`int`, *optional*, defaults to `None`):
                Random seed that will be used to shuffle the dataset.
                It is used to sample from the shuffle buffer and also to shuffle the data shards.
            generator (`numpy.random.Generator`, *optional*):
                Numpy random Generator to use to compute the permutation of the dataset rows.
                If `generator=None` (default), uses `np.random.default_rng` (the default BitGenerator (PCG64) of NumPy).
            buffer_size (`int`, defaults to `1000`):
                Size of the buffer.

        Example:

        ```py
        >>> from datasets import load_dataset
        >>> ds = load_dataset("cornell-movie-review-data/rotten_tomatoes", split="train", streaming=True)
        >>> shuffled_ds = ds.shuffle(seed=42)
        >>> list(shuffled_ds.take(3))
        [{'label': 1, 'text': "a sports movie with action that's exciting on the field and a story you care about off it ."},
         {'label': 1, 'text': 'at its best , the good girl is a refreshingly adult take on adultery . . .'},
         {'label': 1, 'text': "sam jones became a very lucky filmmaker the day wilco got dropped from their record label ..."}]
        ```
        """
        ...

    def set_epoch(self, epoch: int):
        # Changing the epoch changes the effective shuffling seed at the next iteration.
        self._epoch += epoch - self._epoch  # update torch value in shared memory in-place
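    # Usage sketch: when training for several epochs over a shuffled streaming dataset, calling
    # `set_epoch` between epochs changes the effective shuffle seed (derived from the original seed
    # and the epoch number), so each epoch sees a different shard and buffer order.
    #
    #     ds = ds.shuffle(seed=42, buffer_size=10_000)
    #     for epoch in range(3):
    #         ds.set_epoch(epoch)
    #         for example in ds:
    #             ...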
    def skip(self, n: int) -> "IterableDataset":
        """
        Create a new [`IterableDataset`] that skips the first `n` elements.

        Args:
            n (`int`):
                Number of elements to skip.

        Example:

        ```py
        >>> from datasets import load_dataset
        >>> ds = load_dataset("cornell-movie-review-data/rotten_tomatoes", split="train", streaming=True)
        >>> ds = ds.skip(1)
        >>> list(ds.take(3))
        [{'label': 1, 'text': 'the gorgeously elaborate continuation of " the lord of the rings " trilogy ...'},
         {'label': 1, 'text': 'effective but too-tepid biopic'},
         {'label': 1, 'text': 'if you sometimes like to go to the movies to have fun , wasabi is a good place to start .'}]
        ```
        """
        ...

    def repeat(self, num_times: Optional[int]) -> "IterableDataset":
        """
        Create a new [`IterableDataset`] that repeats the underlying dataset `num_times` times.

        N.B. The effect of calling shuffle after repeat depends significantly on buffer size.
        With buffer_size 1, duplicate data is never seen in the same iteration, even after shuffling:
        `ds.repeat(n).shuffle(seed=42, buffer_size=1)` is equivalent to `ds.shuffle(seed=42, buffer_size=1).repeat(n)`,
        and only shuffles shard orders within each iteration.
        With buffer size >= (num samples in the dataset * num_times), we get full shuffling of the repeated data,
        i.e. we can observe duplicates in the same iteration.

        Args:
            num_times (`int` or `None`):
                Number of times to repeat the dataset. If `None`, the dataset will be repeated indefinitely.

        Example:

        ```py
        >>> from datasets import load_dataset
        >>> ds = load_dataset("cornell-movie-review-data/rotten_tomatoes", split="train")
        >>> ds = ds.take(2).repeat(2)
        >>> list(ds)
        [{'label': 1, 'text': 'the rock is destined to be the 21st century's new " conan " ...'},
         {'label': 1, 'text': 'the gorgeously elaborate continuation of " the lord of the rings " trilogy ...'},
         {'label': 1, 'text': 'the rock is destined to be the 21st century's new " conan " ...'},
         {'label': 1, 'text': 'the gorgeously elaborate continuation of " the lord of the rings " trilogy ...'}]
        ```
        """
        ...
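    # Usage sketch (split sizes are illustrative): `take` and `skip` compose into a simple streaming
    # validation/train split. Both fix the shard order, so shuffle beforehand if needed.
    #
    #     ds = load_dataset("cornell-movie-review-data/rotten_tomatoes", split="train", streaming=True)
    #     ds = ds.shuffle(seed=42, buffer_size=10_000)
    #     val_ds = ds.take(1000)
    #     train_ds = ds.skip(1000)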
    def take(self, n: int) -> "IterableDataset":
        """
        Create a new [`IterableDataset`] with only the first `n` elements.

        Args:
            n (`int`):
                Number of elements to take.

        Example:

        ```py
        >>> from datasets import load_dataset
        >>> ds = load_dataset("cornell-movie-review-data/rotten_tomatoes", split="train", streaming=True)
        >>> small_ds = ds.take(2)
        >>> list(small_ds)
        [{'label': 1, 'text': 'the rock is destined to be the 21st century's new " conan " ...'},
         {'label': 1, 'text': 'the gorgeously elaborate continuation of " the lord of the rings " trilogy ...'}]
        ```
        """
        ...

    def shard(self, num_shards: int, index: int, contiguous: bool = True) -> "IterableDataset":
        """Return the `index`-nth shard from dataset split into `num_shards` pieces.

        This shards deterministically. `dataset.shard(n, i)` splits the dataset into contiguous chunks,
        so it can be easily concatenated back together after processing. If `dataset.num_shards % n == l`, then the
        first `l` datasets each have `(dataset.num_shards // n) + 1` shards, and the remaining datasets have
        `(dataset.num_shards // n)` shards.
        `datasets.concatenate_datasets([dset.shard(n, i) for i in range(n)])` returns a dataset with the same order
        as the original. In particular, `dataset.shard(dataset.num_shards, i)` returns a dataset with 1 shard.

        Note: n should be less than or equal to the number of shards in the dataset `dataset.num_shards`.

        On the other hand, `dataset.shard(n, i, contiguous=False)` contains all the shards of the dataset whose
        index mod `n` equals `i`.

        Be sure to shard before using any randomizing operator (such as `shuffle`).
        It is best if the shard operator is used early in the dataset pipeline.

        Args:
            num_shards (`int`):
                How many shards to split the dataset into.
            index (`int`):
                Which shard to select and return.
            contiguous (`bool`, defaults to `True`):
                Whether to select contiguous blocks of indices for shards.

        Example:

        ```py
        >>> from datasets import load_dataset
        >>> ds = load_dataset("amazon_polarity", split="train", streaming=True)
        >>> ds
        Dataset({
            features: ['label', 'title', 'content'],
            num_shards: 4
        })
        >>> ds.shard(num_shards=2, index=0)
        Dataset({
            features: ['label', 'title', 'content'],
            num_shards: 2
        })
        ```
        """
        ...

    @property
    def column_names(self) -> Optional[list]:
        """Names of the columns in the dataset.

        Example:

        ```py
        >>> from datasets import load_dataset
        >>> ds = load_dataset("cornell-movie-review-data/rotten_tomatoes", split="validation", streaming=True)
        >>> ds.column_names
        ['text', 'label']
        ```
        """
        return list(self._info.features.keys()) if self._info.features is not None else None

    def add_column(self, name: str, column: Union[list, np.array]) -> "IterableDataset":
        """Add column to Dataset.

        Args:
            name (`str`):
                Column name.
            column (`list` or `np.array`):
                Column data to be added.

        Returns:
            `IterableDataset`
        """
        return self.map(partial(add_column_fn, name=name, column=column), with_indices=True)
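    # Usage sketch (the rank/world_size values are illustrative): `shard` above can be used to carve
    # out one process's share of a sharded streaming dataset for embarrassingly parallel jobs.
    #
    #     world_size, rank = 8, 0
    #     ds_for_this_process = ds.shard(num_shards=world_size, index=rank)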
    def rename_column(self, original_column_name: str, new_column_name: str) -> "IterableDataset":
        """
        Rename a column in the dataset, and move the features associated to the original column under the new column
        name.

        Args:
            original_column_name (`str`):
                Name of the column to rename.
            new_column_name (`str`):
                New name for the column.

        Returns:
            `IterableDataset`: A copy of the dataset with a renamed column.

        Example:

        ```py
        >>> from datasets import load_dataset
        >>> ds = load_dataset("cornell-movie-review-data/rotten_tomatoes", split="train", streaming=True)
        >>> next(iter(ds))
        {'label': 1, 'text': 'the rock is destined to be the 21st century's new " conan " ...'}
        >>> ds = ds.rename_column("text", "movie_review")
        >>> next(iter(ds))
        {'label': 1, 'movie_review': 'the rock is destined to be the 21st century's new " conan " ...'}
        ```
        """
        return self.rename_columns({original_column_name: new_column_name})

    def rename_columns(self, column_mapping: dict) -> "IterableDataset":
        """
        Rename several columns in the dataset, and move the features associated to the original columns under
        the new column names.

        Args:
            column_mapping (`Dict[str, str]`):
                A mapping of columns to rename to their new names.

        Returns:
            `IterableDataset`: A copy of the dataset with renamed columns.
        """
        ...

    def remove_columns(self, column_names: Union[str, list]) -> "IterableDataset":
        """
        Remove one or several column(s) in the dataset and the features associated to them.
        The removal is done on-the-fly on the examples when iterating over the dataset.

        Args:
            column_names (`Union[str, List[str]]`):
                Name of the column(s) to remove.

        Returns:
            `IterableDataset`: A copy of the dataset object without the columns to remove.

        Example:

        ```py
        >>> from datasets import load_dataset
        >>> ds = load_dataset("cornell-movie-review-data/rotten_tomatoes", split="train", streaming=True)
        >>> ds = ds.remove_columns("label")
        >>> next(iter(ds))
        {'text': 'the rock is destined to be the 21st century's new " conan " ...'}
        ```
        """
        ...
    def select_columns(self, column_names: Union[str, list]) -> "IterableDataset":
        """Select one or several column(s) in the dataset and the features associated to them.
        The selection is done on-the-fly on the examples when iterating over the dataset.

        Args:
            column_names (`Union[str, List[str]]`):
                Name of the column(s) to select.

        Returns:
            `IterableDataset`: A copy of the dataset object with selected columns.

        Example:

        ```py
        >>> from datasets import load_dataset
        >>> ds = load_dataset("cornell-movie-review-data/rotten_tomatoes", split="train", streaming=True)
        >>> ds = ds.select_columns("text")
        >>> next(iter(ds))
        {'text': 'the rock is destined to be the 21st century's new " conan " ...'}
        ```
        """
        # Unknown column names raise:
        # "Column name ... not in the dataset. Columns in the dataset: ..."
        ...

    def cast_column(self, column: str, feature: FeatureType) -> "IterableDataset":
        """Cast column to feature for decoding.

        Args:
            column (`str`):
                Column name.
            feature (`Feature`):
                Target feature.

        Returns:
            `IterableDataset`

        Example:

        ```py
        >>> from datasets import load_dataset, Audio
        >>> ds = load_dataset("PolyAI/minds14", name="en-US", split="train", streaming=True)
        >>> ds.features
        {'audio': Audio(sampling_rate=8000, mono=True, decode=True, id=None), ...}
        >>> ds = ds.cast_column("audio", Audio(sampling_rate=16000))
        >>> ds.features
        {'audio': Audio(sampling_rate=16000, mono=True, decode=True, id=None), ...}
        ```
        """
        ...
    def cast(self, features: Features) -> "IterableDataset":
        """
        Cast the dataset to a new set of features.

        Args:
            features ([`Features`]):
                New features to cast the dataset to.
                The name of the fields in the features must match the current column names.
                The type of the data must also be convertible from one type to the other.
                For non-trivial conversion, e.g. `string` <-> `ClassLabel` you should use [`~Dataset.map`] to update the Dataset.

        Returns:
            `IterableDataset`: A copy of the dataset with casted features.

        Example:

        ```py
        >>> from datasets import load_dataset, ClassLabel, Value
        >>> ds = load_dataset("cornell-movie-review-data/rotten_tomatoes", split="train", streaming=True)
        >>> ds.features
        {'label': ClassLabel(names=['neg', 'pos'], id=None), 'text': Value(dtype='string', id=None)}
        >>> new_features = ds.features.copy()
        >>> new_features["label"] = ClassLabel(names=["bad", "good"])
        >>> new_features["text"] = Value("large_string")
        >>> ds = ds.cast(new_features)
        >>> ds.features
        {'label': ClassLabel(names=['bad', 'good'], id=None), 'text': Value(dtype='large_string', id=None)}
        ```
        """
        ...
    def decode(self, enable: bool = True, num_threads: int = 0) -> "IterableDataset":
        """
        Enable or disable the dataset features decoding for audio, image, video.

        When enabled (default), media types are decoded:

        * audio -> dict of "array" and "sampling_rate" and "path"
        * image -> PIL.Image
        * video -> torchvision.io.VideoReader

        You can enable multithreading using `num_threads`. This is especially useful to speed up remote
        data streaming. However it can be slower than `num_threads=0` for local data on fast disks.

        Disabling decoding is useful if you want to iterate on the paths or bytes of the media files
        without actually decoding their content. To disable decoding you can use `.decode(False)`, which
        is equivalent to calling `.cast()` or `.cast_column()` with all the Audio, Image and Video types
        set to `decode=False`.

        Args:
            enable (`bool`, defaults to `True`):
                Enable or disable features decoding.
            num_threads (`int`, defaults to `0`):
                Enable multithreading for features decoding.

        Returns:
            `IterableDataset`: A copy of the dataset with casted features.

        Examples:

        Disable decoding:

        ```py
        >>> from datasets import load_dataset
        >>> ds = load_dataset("sshh12/planet-textures", split="train", streaming=True)
        >>> ds = ds.decode(False)
        >>> ds.features
        {'image': Image(mode=None, decode=False, id=None), 'text': Value(dtype='string', id=None)}
        >>> next(iter(ds))
        {'image': {'path': 'hf://datasets/sshh12/planet-textures@69dc4cef7a5c4b2cfe387727ec8ea73d4bff7302/train/textures/0000.png', 'bytes': None},
         'text': 'A distant celestial object with an icy crust, displaying a light blue shade, covered with round pits and rugged terrains.'}
        ```

        Speed up streaming with multithreading:

        ```py
        >>> import os
        >>> from datasets import load_dataset
        >>> from tqdm import tqdm
        >>> ds = load_dataset("sshh12/planet-textures", split="train", streaming=True)
        >>> num_threads = min(32, (os.cpu_count() or 1) + 4)
        >>> ds = ds.decode(num_threads=num_threads)
        >>> for _ in tqdm(ds):  # 20 times faster !
        ...     ...
        ```
        """
        # Raises a ValueError when the features are unknown:
        # "Features decoding is only available for datasets with known features, but features are Unknown.
        #  Please set the datasets features with `ds = ds.cast(features)`."
        ...

    def _step(self, step: int, offset: int) -> "IterableDataset":
        # Keep 1 example out of `step`, starting at `offset` (used to split a dataset across nodes
        # when the number of shards is not a multiple of the world size).
        ...

    def _resolve_features(self) -> "IterableDataset":
        # Infer the features from the first examples if they are not known yet.
        ...

    def batch(self, batch_size: int, drop_last_batch: bool = False) -> "IterableDataset":
        """
        Group samples from the dataset into batches.

        Args:
            batch_size (`int`):
                The number of samples in each batch.
            drop_last_batch (`bool`, defaults to `False`):
                Whether to drop the last incomplete batch.

        Example:

        ```py
        >>> ds = load_dataset("some_dataset", streaming=True)
        >>> batched_ds = ds.batch(batch_size=32)
        ```
        """
        ...
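    # Usage sketch: after `.batch(batch_size=2)`, every yielded example is itself a batch,
    # i.e. a dict mapping each column name to a list of 2 values.
    #
    #     batched_ds = ds.batch(batch_size=2)
    #     first = next(iter(batched_ds))
    #     # first == {"text": [..., ...], "label": [..., ...]}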
def _concatenate_iterable_datasets(
    dsets: list,
    info: Optional[DatasetInfo] = None,
    split: Optional[NamedSplit] = None,
    axis: int = 0,
) -> IterableDataset:
    """
    Converts a list of `IterableDataset` with the same schema into a single `IterableDataset`.
    Missing data are filled with None values.

    <Added version="2.4.0"/>

    Args:
        dsets (`List[datasets.IterableDataset]`):
            List of Datasets to concatenate.
        info (`DatasetInfo`, *optional*):
            Dataset information, like description, citation, etc.
        split (`NamedSplit`, *optional*):
            Name of the dataset split.
        axis (`{0, 1}`, defaults to `0`, meaning over rows):
            Axis to concatenate over, where `0` means over rows (vertically) and `1` means over columns (horizontally).
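# A minimal usage sketch (the repo ids are illustrative): in user code the concatenation above is
# reached through the public `datasets.concatenate_datasets`, which dispatches to this function
# when the inputs are streaming datasets with the same schema.
def _usage_sketch_concatenate_streaming():
    from datasets import concatenate_datasets, load_dataset

    en = load_dataset("allenai/c4", "en", split="train", streaming=True)
    es = load_dataset("allenai/c4", "es", split="train", streaming=True)
    return concatenate_datasets([en, es])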
            *New in version 1.6.0*

    Example:

    ```py
    >>> ds3 = _concatenate_iterable_datasets([ds1, ds2])
    ```
    """
    # Resolve the features of every dataset, check that they can be aligned, merge the infos and the
    # token_per_repo_id mappings, then wrap everything in a single (vertically or horizontally
    # concatenated) examples iterable depending on `axis`.
    ...


def _interleave_iterable_datasets(
    datasets: list,
    probabilities: Optional[list] = None,
    seed: Optional[int] = None,
    info: Optional[DatasetInfo] = None,
    split: Optional[NamedSplit] = None,
    stopping_strategy: Literal["first_exhausted", "all_exhausted"] = "first_exhausted",
) -> IterableDataset:
    """
    Interleave several iterable datasets (sources) into a single iterable dataset.
    The new iterable dataset alternates between the sources to yield examples.
    If `probabilities` is `None` (default), the iterable dataset cycles through the sources in order for each next example in the iteration.
    If `probabilities` is not `None`, the iterable dataset samples a random source according to the provided probabilities for each next example in the iteration.

    <Added version="2.4.0"/>

    Args:
        datasets (`List[IterableDataset]`):
            List of datasets to interleave.
        probabilities (`List[float]`, *optional*, defaults to `None`):
            If specified, the new iterable dataset samples examples from one source at a time according to these probabilities.
        seed (`int`, *optional*, defaults to `None`):
            The random seed used to choose a source for each example.
        stopping_strategy (`str`, defaults to `first_exhausted`):
            Two strategies are proposed right now.
            By default, `first_exhausted` is an undersampling strategy, i.e. the dataset construction is stopped as soon as one dataset has run out of samples.
            If the strategy is `all_exhausted`, we use an oversampling strategy, i.e. the dataset construction is stopped as soon as every sample of every dataset has been added at least once.
            Note that if the strategy is `all_exhausted`, the interleaved dataset size can get enormous:
            - with no probabilities, the resulting dataset will have max_length_datasets*nb_dataset samples.
            - with given probabilities, the resulting dataset will have more samples if some datasets have really low probability of visiting.
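# A minimal usage sketch (repo ids and probabilities are illustrative): user code reaches the
# interleaving above through the public `datasets.interleave_datasets`.
def _usage_sketch_interleave_streaming():
    from datasets import interleave_datasets, load_dataset

    en = load_dataset("allenai/c4", "en", split="train", streaming=True)
    es = load_dataset("allenai/c4", "es", split="train", streaming=True)
    return interleave_datasets(
        [en, es], probabilities=[0.8, 0.2], seed=42, stopping_strategy="all_exhausted"
    )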
    Output:
        `datasets.IterableDataset`
    """
    # Resolve and align the features of the sources, then build either a cycling or a
    # randomly-sampling multi-sources examples iterable depending on whether `probabilities` is set.
    ...


def _split_by_node_iterable_dataset(dataset: IterableDataset, rank: int, world_size: int) -> IterableDataset:
    """
    Split an iterable dataset for the node at rank `rank` in a pool of nodes of size `world_size`.

    If the dataset has a number of shards that is a factor of `world_size` (i.e. if `dataset.num_shards % world_size == 0`),
    then the shards are evenly assigned across the nodes, which is the most optimized.
    Otherwise, each node keeps 1 example out of `world_size`, skipping the other examples.

    Args:
        dataset ([`IterableDataset`]):
            The iterable dataset to split by node.
        rank (`int`):
            Rank of the current node.
        world_size (`int`):
            Total number of nodes.

    Returns:
        [`IterableDataset`]: The iterable dataset to be used on the node at rank `rank`.
    """
    if dataset._distributed:
        # Compose the rank/world_size with any split-by-node that was already applied.
        rank = dataset._distributed.world_size * rank + dataset._distributed.rank
        world_size = dataset._distributed.world_size * world_size
    distributed = DistributedConfig(rank=rank, world_size=world_size)
    return IterableDataset(
        ex_iterable=dataset._ex_iterable,
        info=dataset._info.copy(),
        split=dataset._split,
        formatting=dataset._formatting,
        shuffling=copy.deepcopy(dataset._shuffling),
        distributed=distributed,
        token_per_repo_id=dataset._token_per_repo_id,
    )


async def _apply_async(pool, func, x):
    # Run `func(x)` on the multiprocessing pool and poll the result without blocking the event loop.
    future = pool.apply_async(func, (x,))
    while True:
        if future.ready():
            return future.get()
        else:
            await asyncio.sleep(0)
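# A minimal usage sketch (rank/world_size values are illustrative; in practice they come from the
# process group, e.g. torch.distributed): user code reaches the node splitting above through the
# public `datasets.distributed.split_dataset_by_node`.
def _usage_sketch_split_by_node():
    from datasets import load_dataset
    from datasets.distributed import split_dataset_by_node

    ds = load_dataset("allenai/c4", "en", split="train", streaming=True)
    return split_dataset_by_node(ds, rank=0, world_size=8)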