""" Google BigQuery support """
from __future__ import annotations

from typing import (
    TYPE_CHECKING,
    Any,
)
import warnings

from pandas.compat._optional import import_optional_dependency
from pandas.util._exceptions import find_stack_level

if TYPE_CHECKING:
    from google.auth.credentials import Credentials

    from pandas import DataFrame


def _try_import():
    # pandas is a dependency of pandas-gbq, so the import has to happen
    # lazily, on first use.
    msg = (
        "pandas-gbq is required to load data from Google BigQuery. "
        "See the docs: https://pandas-gbq.readthedocs.io."
    )
    pandas_gbq = import_optional_dependency("pandas_gbq", extra=msg)
    return pandas_gbq


def read_gbq(
    query: str,
    project_id: str | None = None,
    index_col: str | None = None,
    col_order: list[str] | None = None,
    reauth: bool = False,
    auth_local_webserver: bool = True,
    dialect: str | None = None,
    location: str | None = None,
    configuration: dict[str, Any] | None = None,
    credentials: Credentials | None = None,
    use_bqstorage_api: bool | None = None,
    max_results: int | None = None,
    progress_bar_type: str | None = None,
) -> DataFrame:
    """
    Load data from Google BigQuery.

    .. deprecated:: 2.2.0

       Please use ``pandas_gbq.read_gbq`` instead.

    This function requires the `pandas-gbq package
    <https://pandas-gbq.readthedocs.io>`__.

    See the `How to authenticate with Google BigQuery
    <https://pandas-gbq.readthedocs.io/en/latest/howto/authentication.html>`__
    guide for authentication instructions.

    Parameters
    ----------
    query : str
        SQL-like query to return data values.
    project_id : str, optional
        Google BigQuery Account project ID. Optional when available from
        the environment.
    index_col : str, optional
        Name of the result column to use for the index in the results
        DataFrame.
    col_order : list(str), optional
        List of BigQuery column names in the desired order for the results
        DataFrame.
    reauth : bool, default False
        Force Google BigQuery to re-authenticate the user. This is useful
        if multiple accounts are used.
    auth_local_webserver : bool, default True
        Use the `local webserver flow`_ instead of the `console flow`_
        when getting user credentials.

        .. _local webserver flow:
            https://google-auth-oauthlib.readthedocs.io/en/latest/reference/google_auth_oauthlib.flow.html#google_auth_oauthlib.flow.InstalledAppFlow.run_local_server
        .. _console flow:
            https://google-auth-oauthlib.readthedocs.io/en/latest/reference/google_auth_oauthlib.flow.html#google_auth_oauthlib.flow.InstalledAppFlow.run_console

        *New in version 0.2.0 of pandas-gbq*.

        .. versionchanged:: 1.5.0
           Default value is changed to ``True``. Google has deprecated the
           ``auth_local_webserver = False`` `"out of band" (copy-paste) flow
           <https://developers.googleblog.com/2022/02/making-oauth-flows-safer.html?m=1#disallowed-oob>`_.
    dialect : str, default 'legacy'
        Note: The default value is changing to 'standard' in a future version.

        SQL syntax dialect to use. Value can be one of:

        ``'legacy'``
            Use BigQuery's legacy SQL dialect. For more information see
            `BigQuery Legacy SQL Reference
            <https://cloud.google.com/bigquery/docs/reference/legacy-sql>`__.
        ``'standard'``
            Use BigQuery's standard SQL, which is compliant with the SQL 2011
            standard. For more information see `BigQuery Standard SQL Reference
            <https://cloud.google.com/bigquery/docs/reference/standard-sql/>`__.
    location : str, optional
        Location where the query job should run. See the `BigQuery locations
        documentation
        <https://cloud.google.com/bigquery/docs/dataset-locations>`__ for a
        list of available locations. The location must match that of any
        datasets used in the query.

        *New in version 0.5.0 of pandas-gbq*.
    configuration : dict, optional
        Query config parameters for job processing.
        For example:

            configuration = {'query': {'useQueryCache': False}}

        For more information see `BigQuery REST API Reference
        <https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs#configuration.query>`__.
    credentials : google.auth.credentials.Credentials, optional
        Credentials for accessing Google APIs. Use this parameter to override
        default credentials, such as to use Compute Engine
        :class:`google.auth.compute_engine.Credentials` or Service Account
        :class:`google.oauth2.service_account.Credentials` directly.

        *New in version 0.8.0 of pandas-gbq*.
    use_bqstorage_api : bool, default False
        Use the `BigQuery Storage API
        <https://cloud.google.com/bigquery/docs/reference/storage/>`__ to
        download query results quickly, but at an increased cost. To use this
        API, first `enable it in the Cloud Console
        <https://console.cloud.google.com/apis/library/bigquerystorage.googleapis.com>`__.
        You must also have the `bigquery.readsessions.create
        <https://cloud.google.com/bigquery/docs/access-control#roles>`__
        permission on the project you are billing queries to.

        This feature requires version 0.10.0 or later of the ``pandas-gbq``
        package. It also requires the ``google-cloud-bigquery-storage`` and
        ``fastavro`` packages.
    max_results : int, optional
        If set, limit the maximum number of rows to fetch from the query
        results.
    progress_bar_type : str, optional
        If set, use the `tqdm <https://tqdm.github.io/>`__ library to
        display a progress bar while the data downloads. Install the
        ``tqdm`` package to use this feature.

        Possible values of ``progress_bar_type`` include:

        ``None``
            No progress bar.
        ``'tqdm'``
            Use the :func:`tqdm.tqdm` function to print a progress bar
            to :data:`sys.stderr`.
        ``'tqdm_notebook'``
            Use the :func:`tqdm.tqdm_notebook` function to display a
            progress bar as a Jupyter notebook widget.
        ``'tqdm_gui'``
            Use the :func:`tqdm.tqdm_gui` function to display a
            progress bar as a graphical dialog box.

    Returns
    -------
    df: DataFrame
        DataFrame representing results of query.

    See Also
    --------
    pandas_gbq.read_gbq : This function in the pandas-gbq library.
    DataFrame.to_gbq : Write a DataFrame to Google BigQuery.

    Examples
    --------
    Example taken from `Google BigQuery documentation
    <https://cloud.google.com/bigquery/docs/pandas-gbq-migration>`_

    >>> sql = "SELECT name FROM table_name WHERE state = 'TX' LIMIT 100;"
    >>> df = pd.read_gbq(sql, dialect="standard")  # doctest: +SKIP
    >>> project_id = "your-project-id"  # doctest: +SKIP
    >>> df = pd.read_gbq(sql,
    ...                  project_id=project_id,
    ...                  dialect="standard"
    ...                  )  # doctest: +SKIP
    """
    warnings.warn(
        "read_gbq is deprecated and will be removed in a future version. "
        "Please use pandas_gbq.read_gbq instead: "
        "https://pandas-gbq.readthedocs.io/en/latest/api.html#pandas_gbq.read_gbq",
        FutureWarning,
        stacklevel=find_stack_level(),
    )
    pandas_gbq = _try_import()

    kwargs = {}

    # Only forward the newer pandas-gbq keyword arguments when they are
    # explicitly set, so that older pandas-gbq versions keep working.
    if use_bqstorage_api is not None:
        kwargs["use_bqstorage_api"] = use_bqstorage_api
    if max_results is not None:
        kwargs["max_results"] = max_results

    kwargs["progress_bar_type"] = progress_bar_type

    return pandas_gbq.read_gbq(
        query,
        project_id=project_id,
        index_col=index_col,
        col_order=col_order,
        reauth=reauth,
        auth_local_webserver=auth_local_webserver,
        dialect=dialect,
        location=location,
        configuration=configuration,
        credentials=credentials,
        **kwargs,
    )


def to_gbq(
    dataframe: DataFrame,
    destination_table: str,
    project_id: str | None = None,
    chunksize: int | None = None,
    reauth: bool = False,
    if_exists: str = "fail",
    auth_local_webserver: bool = True,
    table_schema: list[dict[str, str]] | None = None,
    location: str | None = None,
    progress_bar: bool = True,
    credentials: Credentials | None = None,
) -> None:
    warnings.warn(
        "to_gbq is deprecated and will be removed in a future version. "
        "Please use pandas_gbq.to_gbq instead: "
        "https://pandas-gbq.readthedocs.io/en/latest/api.html#pandas_gbq.to_gbq",
        FutureWarning,
        stacklevel=find_stack_level(),
    )
    pandas_gbq = _try_import()
    pandas_gbq.to_gbq(
        dataframe,
        destination_table,
        project_id=project_id,
        chunksize=chunksize,
        reauth=reauth,
        if_exists=if_exists,
        auth_local_webserver=auth_local_webserver,
        table_schema=table_schema,
        location=location,
        progress_bar=progress_bar,
        credentials=credentials,
    )
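
# ---------------------------------------------------------------------------
# Usage sketch (illustrative only, kept as comments so this module stays a
# plain library file): both functions above are thin, deprecated shims that
# emit a FutureWarning and then delegate to the external ``pandas_gbq``
# package.  The project id, dataset, table and query below are hypothetical
# placeholders, and either path requires valid Google Cloud credentials.
#
#     import pandas as pd
#     import pandas_gbq
#
#     sql = "SELECT name FROM my_dataset.my_table LIMIT 100"
#
#     # Deprecated wrapper -- warns, then forwards to pandas_gbq.read_gbq:
#     df = pd.read_gbq(sql, project_id="my-project", dialect="standard")
#
#     # Preferred, direct pandas-gbq calls going forward:
#     df = pandas_gbq.read_gbq(sql, project_id="my-project", dialect="standard")
#     pandas_gbq.to_gbq(
#         df, "my_dataset.my_table", project_id="my-project", if_exists="replace"
#     )
# ---------------------------------------------------------------------------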