repo (string, 7-90 chars) | file_url (string, 81-315 chars) | file_path (string, 4-228 chars) | content (string, 0-32.8k chars) | language (string, 1 class) | license (string, 7 classes) | commit_sha (string, 40 chars) | retrieved_at (date, 2026-01-04 14:38:15 to 2026-01-05 02:33:18) | truncated (bool, 2 classes) |
|---|---|---|---|---|---|---|---|---|
keras-team/keras | https://github.com/keras-team/keras/blob/c67eddb4ff8b615886893ca996dc216bc923d598/guides/distributed_training_with_jax.py | guides/distributed_training_with_jax.py | """
Title: Multi-GPU distributed training with JAX
Author: [fchollet](https://twitter.com/fchollet)
Date created: 2023/07/11
Last modified: 2023/07/11
Description: Guide to multi-GPU/TPU training for Keras models with JAX.
Accelerator: GPU
"""
"""
## Introduction
There are generally two ways to distribute computation... | python | Apache-2.0 | c67eddb4ff8b615886893ca996dc216bc923d598 | 2026-01-04T14:38:29.819962Z | false |
keras-team/keras | https://github.com/keras-team/keras/blob/c67eddb4ff8b615886893ca996dc216bc923d598/guides/transfer_learning.py | guides/transfer_learning.py | """
Title: Transfer learning & fine-tuning
Author: [fchollet](https://twitter.com/fchollet)
Date created: 2020/04/15
Last modified: 2023/06/25
Description: Complete guide to transfer learning & fine-tuning in Keras.
Accelerator: GPU
"""
"""
## Setup
"""
import numpy as np
import keras
from keras import layers
import ... | python | Apache-2.0 | c67eddb4ff8b615886893ca996dc216bc923d598 | 2026-01-04T14:38:29.819962Z | false |
keras-team/keras | https://github.com/keras-team/keras/blob/c67eddb4ff8b615886893ca996dc216bc923d598/guides/writing_your_own_callbacks.py | guides/writing_your_own_callbacks.py | """
Title: Writing your own callbacks
Authors: Rick Chao, Francois Chollet
Date created: 2019/03/20
Last modified: 2023/06/25
Description: Complete guide to writing new Keras callbacks.
Accelerator: GPU
"""
"""
## Introduction
A callback is a powerful tool to customize the behavior of a Keras model during
training, e... | python | Apache-2.0 | c67eddb4ff8b615886893ca996dc216bc923d598 | 2026-01-04T14:38:29.819962Z | false |
keras-team/keras | https://github.com/keras-team/keras/blob/c67eddb4ff8b615886893ca996dc216bc923d598/guides/custom_train_step_in_tensorflow.py | guides/custom_train_step_in_tensorflow.py | """
Title: Customizing what happens in `fit()` with TensorFlow
Author: [fchollet](https://twitter.com/fchollet)
Date created: 2020/04/15
Last modified: 2023/06/27
Description: Overriding the training step of the Model class with TensorFlow.
Accelerator: GPU
"""
"""
## Introduction
When you're doing supervised learnin... | python | Apache-2.0 | c67eddb4ff8b615886893ca996dc216bc923d598 | 2026-01-04T14:38:29.819962Z | false |
keras-team/keras | https://github.com/keras-team/keras/blob/c67eddb4ff8b615886893ca996dc216bc923d598/guides/training_with_built_in_methods.py | guides/training_with_built_in_methods.py | """
Title: Training & evaluation with the built-in methods
Author: [fchollet](https://twitter.com/fchollet)
Date created: 2019/03/01
Last modified: 2023/03/25
Description: Complete guide to training & evaluation with `fit()` and `evaluate()`.
Accelerator: GPU
"""
"""
## Setup
"""
# We import torch & TF so as to use t... | python | Apache-2.0 | c67eddb4ff8b615886893ca996dc216bc923d598 | 2026-01-04T14:38:29.819962Z | true |
keras-team/keras | https://github.com/keras-team/keras/blob/c67eddb4ff8b615886893ca996dc216bc923d598/guides/distributed_training_with_tensorflow.py | guides/distributed_training_with_tensorflow.py | """
Title: Multi-GPU distributed training with TensorFlow
Author: [fchollet](https://twitter.com/fchollet)
Date created: 2020/04/28
Last modified: 2023/06/29
Description: Guide to multi-GPU training for Keras models with TensorFlow.
Accelerator: GPU
"""
"""
## Introduction
There are generally two ways to distribute c... | python | Apache-2.0 | c67eddb4ff8b615886893ca996dc216bc923d598 | 2026-01-04T14:38:29.819962Z | false |
keras-team/keras | https://github.com/keras-team/keras/blob/c67eddb4ff8b615886893ca996dc216bc923d598/guides/writing_a_custom_training_loop_in_jax.py | guides/writing_a_custom_training_loop_in_jax.py | """
Title: Writing a training loop from scratch in JAX
Author: [fchollet](https://twitter.com/fchollet)
Date created: 2023/06/25
Last modified: 2023/06/25
Description: Writing low-level training & evaluation loops in JAX.
Accelerator: None
"""
"""
## Setup
"""
import os
# This guide can only be run with the jax back... | python | Apache-2.0 | c67eddb4ff8b615886893ca996dc216bc923d598 | 2026-01-04T14:38:29.819962Z | false |
keras-team/keras | https://github.com/keras-team/keras/blob/c67eddb4ff8b615886893ca996dc216bc923d598/guides/making_new_layers_and_models_via_subclassing.py | guides/making_new_layers_and_models_via_subclassing.py | """
Title: Making new layers and models via subclassing
Author: [fchollet](https://twitter.com/fchollet)
Date created: 2019/03/01
Last modified: 2023/06/25
Description: Complete guide to writing `Layer` and `Model` objects from scratch.
Accelerator: None
"""
"""
## Introduction
This guide will cover everything you ne... | python | Apache-2.0 | c67eddb4ff8b615886893ca996dc216bc923d598 | 2026-01-04T14:38:29.819962Z | false |
keras-team/keras | https://github.com/keras-team/keras/blob/c67eddb4ff8b615886893ca996dc216bc923d598/guides/custom_train_step_in_jax.py | guides/custom_train_step_in_jax.py | """
Title: Customizing what happens in `fit()` with JAX
Author: [fchollet](https://twitter.com/fchollet)
Date created: 2023/06/27
Last modified: 2023/06/27
Description: Overriding the training step of the Model class with JAX.
Accelerator: GPU
"""
"""
## Introduction
When you're doing supervised learning, you can use... | python | Apache-2.0 | c67eddb4ff8b615886893ca996dc216bc923d598 | 2026-01-04T14:38:29.819962Z | false |
keras-team/keras | https://github.com/keras-team/keras/blob/c67eddb4ff8b615886893ca996dc216bc923d598/guides/sequential_model.py | guides/sequential_model.py | """
Title: The Sequential model
Author: [fchollet](https://twitter.com/fchollet)
Date created: 2020/04/12
Last modified: 2023/06/25
Description: Complete guide to the Sequential model.
Accelerator: GPU
"""
"""
## Setup
"""
import keras
from keras import layers
from keras import ops
"""
## When to use a Sequential m... | python | Apache-2.0 | c67eddb4ff8b615886893ca996dc216bc923d598 | 2026-01-04T14:38:29.819962Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/conftest.py | conftest.py | from __future__ import annotations
import importlib
from pathlib import Path
from typing import TYPE_CHECKING
import pytest
from twisted.web.http import H2_ENABLED
from scrapy.utils.reactor import set_asyncio_event_loop_policy
from tests.keys import generate_keys
from tests.mockserver.http import MockServer
if TYPE... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/robotstxt.py | scrapy/robotstxt.py | from __future__ import annotations
import logging
import sys
from abc import ABCMeta, abstractmethod
from typing import TYPE_CHECKING
from urllib.robotparser import RobotFileParser
from protego import Protego
from scrapy.utils.python import to_unicode
if TYPE_CHECKING:
# typing.Self requires Python 3.11
fro... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/interfaces.py | scrapy/interfaces.py | # pylint: disable=no-method-argument,no-self-argument
from zope.interface import Interface
class ISpiderLoader(Interface):
def from_settings(settings):
"""Return an instance of the class for the given settings"""
def load(spider_name):
"""Return the Spider class for the given spider name. If... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/resolver.py | scrapy/resolver.py | from __future__ import annotations
from typing import TYPE_CHECKING, Any
from twisted.internet import defer
from twisted.internet.base import ReactorBase, ThreadedResolver
from twisted.internet.interfaces import (
IAddress,
IHostnameResolver,
IHostResolution,
IResolutionReceiver,
IResolverSimple,
... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/shell.py | scrapy/shell.py | """Scrapy Shell
See documentation in docs/topics/shell.rst
"""
from __future__ import annotations
import contextlib
import os
import signal
from typing import TYPE_CHECKING, Any
from itemadapter import is_item
from twisted.internet import defer, threads
from twisted.python import threadable
from w3lib.url import a... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/mail.py | scrapy/mail.py | """
Mail sending helpers
See documentation in docs/topics/email.rst
"""
from __future__ import annotations
import logging
from email import encoders as Encoders
from email.mime.base import MIMEBase
from email.mime.multipart import MIMEMultipart
from email.mime.nonmultipart import MIMENonMultipart
from email.mime.tex... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/spiderloader.py | scrapy/spiderloader.py | from __future__ import annotations
import traceback
import warnings
from collections import defaultdict
from typing import TYPE_CHECKING, Protocol, cast
from zope.interface import implementer
from zope.interface.verify import verifyClass
from scrapy.interfaces import ISpiderLoader
from scrapy.utils.misc import load_... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/addons.py | scrapy/addons.py | from __future__ import annotations
import logging
from typing import TYPE_CHECKING, Any
from scrapy.exceptions import NotConfigured
from scrapy.utils.conf import build_component_list
from scrapy.utils.misc import build_from_crawler, load_object
if TYPE_CHECKING:
from scrapy.crawler import Crawler
from scrapy... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/link.py | scrapy/link.py | """
This module defines the Link object used in Link extractors.
For actual link extractors implementation see scrapy.linkextractors, or
its documentation in: docs/topics/link-extractors.rst
"""
class Link:
"""Link objects represent an extracted link by the LinkExtractor.
Using the anchor tag sample below t... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/responsetypes.py | scrapy/responsetypes.py | """
This module implements a class which returns the appropriate Response class
based on different criteria.
"""
from __future__ import annotations
from io import StringIO
from mimetypes import MimeTypes
from pkgutil import get_data
from typing import TYPE_CHECKING
from scrapy.http import Response
from scrapy.utils.... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/middleware.py | scrapy/middleware.py | from __future__ import annotations
import logging
import pprint
import warnings
from abc import ABC, abstractmethod
from collections import defaultdict, deque
from typing import TYPE_CHECKING, Any, Concatenate, ParamSpec, TypeVar, cast
from scrapy.exceptions import NotConfigured, ScrapyDeprecationWarning
from scrapy.... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/pqueues.py | scrapy/pqueues.py | from __future__ import annotations
import hashlib
import logging
from typing import TYPE_CHECKING, Protocol, cast
from scrapy.utils.misc import build_from_crawler
if TYPE_CHECKING:
from collections.abc import Iterable
# typing.Self requires Python 3.11
from typing_extensions import Self
from scrapy... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/squeues.py | scrapy/squeues.py | """
Scheduler queues
"""
from __future__ import annotations
import marshal
import pickle
from pathlib import Path
from typing import TYPE_CHECKING, Any
from queuelib import queue
from scrapy.utils.request import request_from_dict
if TYPE_CHECKING:
from collections.abc import Callable
from os import PathLik... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/exporters.py | scrapy/exporters.py | """
Item Exporters are used to export/serialize items into different formats.
"""
from __future__ import annotations
import csv
import marshal
import pickle
import pprint
from abc import ABC, abstractmethod
from collections.abc import Callable, Iterable, Mapping
from io import BytesIO, TextIOWrapper
from typing impor... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/statscollectors.py | scrapy/statscollectors.py | """
Scrapy extension for collecting scraping stats
"""
from __future__ import annotations
import logging
import pprint
from typing import TYPE_CHECKING, Any
from scrapy.utils.decorators import _warn_spider_arg
if TYPE_CHECKING:
from scrapy import Spider
from scrapy.crawler import Crawler
logger = logging.... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/signals.py | scrapy/signals.py | """
Scrapy signals
These signals are documented in docs/topics/signals.rst. Please don't add new
signals here without documenting them there.
"""
engine_started = object()
engine_stopped = object()
scheduler_empty = object()
spider_opened = object()
spider_idle = object()
spider_closed = object()
spider_error = objec... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/exceptions.py | scrapy/exceptions.py | """
Scrapy core exceptions
These exceptions are documented in docs/topics/exceptions.rst. Please don't add
new exceptions here without documenting them there.
"""
from __future__ import annotations
from typing import Any
# Internal
class NotConfigured(Exception):
"""Indicates a missing configuration situation... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/signalmanager.py | scrapy/signalmanager.py | from __future__ import annotations
import warnings
from typing import Any
from pydispatch import dispatcher
from twisted.internet.defer import Deferred
from scrapy.exceptions import ScrapyDeprecationWarning
from scrapy.utils import signal as _signal
from scrapy.utils.defer import maybe_deferred_to_future
class Sig... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/__main__.py | scrapy/__main__.py | from scrapy.cmdline import execute
if __name__ == "__main__":
execute()
| python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/dupefilters.py | scrapy/dupefilters.py | from __future__ import annotations
import logging
from pathlib import Path
from typing import TYPE_CHECKING
from warnings import warn
from scrapy.exceptions import ScrapyDeprecationWarning
from scrapy.utils.job import job_dir
from scrapy.utils.request import (
RequestFingerprinter,
RequestFingerprinterProtoco... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/extension.py | scrapy/extension.py | """
The Extension Manager
See documentation in docs/topics/extensions.rst
"""
from __future__ import annotations
from typing import TYPE_CHECKING, Any
from scrapy.middleware import MiddlewareManager
from scrapy.utils.conf import build_component_list
if TYPE_CHECKING:
from scrapy.settings import Settings
clas... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/__init__.py | scrapy/__init__.py | """
Scrapy - a web crawling and web scraping framework written for Python
"""
import pkgutil
import sys
import warnings
# Declare top-level shortcuts
from scrapy.http import FormRequest, Request
from scrapy.item import Field, Item
from scrapy.selector import Selector
from scrapy.spiders import Spider
__all__ = [
... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/logformatter.py | scrapy/logformatter.py | from __future__ import annotations
import logging
import os
from typing import TYPE_CHECKING, Any, TypedDict
from twisted.python.failure import Failure
# working around https://github.com/sphinx-doc/sphinx/issues/10400
from scrapy import Request, Spider # noqa: TC001
from scrapy.http import Response # noqa: TC001
... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/item.py | scrapy/item.py | """
Scrapy Item
See documentation in docs/topics/item.rst
"""
from __future__ import annotations
from abc import ABCMeta
from collections.abc import MutableMapping
from copy import deepcopy
from pprint import pformat
from typing import TYPE_CHECKING, Any, NoReturn
from scrapy.utils.trackref import object_ref
if TY... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/crawler.py | scrapy/crawler.py | from __future__ import annotations
import asyncio
import contextlib
import logging
import pprint
import signal
import warnings
from abc import ABC, abstractmethod
from typing import TYPE_CHECKING, Any, TypeVar
from twisted.internet.defer import Deferred, DeferredList, inlineCallbacks
from scrapy import Spider
from s... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/cmdline.py | scrapy/cmdline.py | from __future__ import annotations
import argparse
import cProfile
import inspect
import os
import sys
from importlib.metadata import entry_points
from typing import TYPE_CHECKING, ParamSpec
import scrapy
from scrapy.commands import BaseRunSpiderCommand, ScrapyCommand, ScrapyHelpFormatter
from scrapy.crawler import A... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/loader/__init__.py | scrapy/loader/__init__.py | """
Item Loader
See documentation in docs/topics/loaders.rst
"""
from __future__ import annotations
from typing import TYPE_CHECKING, Any
import itemloaders
from scrapy.item import Item
from scrapy.selector import Selector
if TYPE_CHECKING:
from scrapy.http import TextResponse
class ItemLoader(itemloaders.I... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/settings/default_settings.py | scrapy/settings/default_settings.py | """This module contains the default values for all settings used by Scrapy.
For more information about these settings you can read the settings
documentation in docs/topics/settings.rst
Scrapy developers, if you add a setting here remember to:
* add it in alphabetical order, with the exception that enabling flags an... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/settings/__init__.py | scrapy/settings/__init__.py | from __future__ import annotations
import copy
import json
import warnings
from collections.abc import Iterable, Iterator, Mapping, MutableMapping
from importlib import import_module
from pprint import pformat
from typing import TYPE_CHECKING, Any, TypeAlias, cast
from scrapy.exceptions import ScrapyDeprecationWarnin... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/extensions/memdebug.py | scrapy/extensions/memdebug.py | """
MemoryDebugger extension
See documentation in docs/topics/extensions.rst
"""
from __future__ import annotations
import gc
from typing import TYPE_CHECKING
from scrapy import Spider, signals
from scrapy.exceptions import NotConfigured
from scrapy.utils.trackref import live_refs
if TYPE_CHECKING:
# typing.Se... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/extensions/memusage.py | scrapy/extensions/memusage.py | """
MemoryUsage extension
See documentation in docs/topics/extensions.rst
"""
from __future__ import annotations
import logging
import socket
import sys
from importlib import import_module
from pprint import pformat
from typing import TYPE_CHECKING
from scrapy import signals
from scrapy.exceptions import NotConfigu... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/extensions/logstats.py | scrapy/extensions/logstats.py | from __future__ import annotations
import logging
from typing import TYPE_CHECKING
from scrapy import Spider, signals
from scrapy.exceptions import NotConfigured
from scrapy.utils.asyncio import AsyncioLoopingCall, create_looping_call
if TYPE_CHECKING:
from twisted.internet.task import LoopingCall
# typing.... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/extensions/debug.py | scrapy/extensions/debug.py | """
Extensions for debugging Scrapy
See documentation in docs/topics/extensions.rst
"""
from __future__ import annotations
import contextlib
import logging
import signal
import sys
import threading
import traceback
from pdb import Pdb
from typing import TYPE_CHECKING
from scrapy.utils.engine import format_engine_st... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/extensions/feedexport.py | scrapy/extensions/feedexport.py | """
Feed Exports extension
See documentation in docs/topics/feed-exports.rst
"""
from __future__ import annotations
import asyncio
import contextlib
import logging
import re
import sys
import warnings
from abc import ABC, abstractmethod
from collections.abc import Callable, Coroutine
from datetime import datetime, t... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/extensions/throttle.py | scrapy/extensions/throttle.py | from __future__ import annotations
import logging
from typing import TYPE_CHECKING
from scrapy import Request, Spider, signals
from scrapy.exceptions import NotConfigured
if TYPE_CHECKING:
# typing.Self requires Python 3.11
from typing_extensions import Self
from scrapy.core.downloader import Slot
f... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/extensions/logcount.py | scrapy/extensions/logcount.py | from __future__ import annotations
import logging
from typing import TYPE_CHECKING
from scrapy import Spider, signals
from scrapy.utils.log import LogCounterHandler
if TYPE_CHECKING:
# typing.Self requires Python 3.11
from typing_extensions import Self
from scrapy.crawler import Crawler
logger = loggi... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/extensions/corestats.py | scrapy/extensions/corestats.py | """
Extension for collecting core stats like items scraped and start/finish times
"""
from __future__ import annotations
from datetime import datetime, timezone
from typing import TYPE_CHECKING, Any
from scrapy import Spider, signals
if TYPE_CHECKING:
# typing.Self requires Python 3.11
from typing_extension... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/extensions/periodic_log.py | scrapy/extensions/periodic_log.py | from __future__ import annotations
import logging
from datetime import datetime, timezone
from typing import TYPE_CHECKING, Any
from scrapy import Spider, signals
from scrapy.exceptions import NotConfigured
from scrapy.utils.asyncio import AsyncioLoopingCall, create_looping_call
from scrapy.utils.serialize import Scr... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/extensions/__init__.py | scrapy/extensions/__init__.py |  | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/extensions/closespider.py | scrapy/extensions/closespider.py | """CloseSpider is an extension that forces spiders to be closed after certain
conditions are met.
See documentation in docs/topics/extensions.rst
"""
from __future__ import annotations
import logging
from collections import defaultdict
from typing import TYPE_CHECKING, Any
from scrapy import Request, Spider, signal... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/extensions/spiderstate.py | scrapy/extensions/spiderstate.py | from __future__ import annotations
import pickle
from pathlib import Path
from typing import TYPE_CHECKING
from scrapy import Spider, signals
from scrapy.exceptions import NotConfigured
from scrapy.utils.job import job_dir
if TYPE_CHECKING:
# typing.Self requires Python 3.11
from typing_extensions import Sel... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/extensions/statsmailer.py | scrapy/extensions/statsmailer.py | """
StatsMailer extension sends an email when a spider finishes scraping.
Use STATSMAILER_RCPTS setting to enable and give the recipient mail address
"""
from __future__ import annotations
from typing import TYPE_CHECKING
from scrapy import Spider, signals
from scrapy.exceptions import NotConfigured
from scrapy.mai... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/extensions/telnet.py | scrapy/extensions/telnet.py | """
Scrapy Telnet Console extension
See documentation in docs/topics/telnetconsole.rst
"""
from __future__ import annotations
import binascii
import logging
import os
import pprint
from typing import TYPE_CHECKING, Any
from twisted.conch import telnet
from twisted.conch.insults import insults
from twisted.internet ... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/extensions/httpcache.py | scrapy/extensions/httpcache.py | from __future__ import annotations
import gzip
import logging
import pickle
from email.utils import mktime_tz, parsedate_tz
from importlib import import_module
from pathlib import Path
from time import time
from typing import IO, TYPE_CHECKING, Any, Concatenate, cast
from weakref import WeakKeyDictionary
from w3lib.h... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/extensions/postprocessing.py | scrapy/extensions/postprocessing.py | """
Extension for processing data before they are exported to feeds.
"""
from bz2 import BZ2File
from gzip import GzipFile
from io import IOBase
from lzma import LZMAFile
from typing import IO, Any, BinaryIO, cast
from scrapy.utils.misc import load_object
class GzipPlugin:
"""
Compresses received data using... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/commands/runspider.py | scrapy/commands/runspider.py | from __future__ import annotations
import sys
from importlib import import_module
from pathlib import Path
from typing import TYPE_CHECKING
from scrapy.commands import BaseRunSpiderCommand
from scrapy.exceptions import UsageError
from scrapy.spiderloader import DummySpiderLoader
from scrapy.utils.spider import iter_s... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/commands/shell.py | scrapy/commands/shell.py | """
Scrapy Shell
See documentation in docs/topics/shell.rst
"""
from __future__ import annotations
from threading import Thread
from typing import TYPE_CHECKING, Any
from scrapy.commands import ScrapyCommand
from scrapy.http import Request
from scrapy.shell import Shell
from scrapy.utils.defer import _schedule_coro... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/commands/edit.py | scrapy/commands/edit.py | import argparse
import os
import sys
from scrapy.commands import ScrapyCommand
from scrapy.exceptions import UsageError
from scrapy.spiderloader import get_spider_loader
class Command(ScrapyCommand):
requires_project = True
requires_crawler_process = False
default_settings = {"LOG_ENABLED": False}
d... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/commands/check.py | scrapy/commands/check.py | import argparse
import time
from collections import defaultdict
from unittest import TextTestResult as _TextTestResult
from unittest import TextTestRunner
from scrapy.commands import ScrapyCommand
from scrapy.contracts import ContractsManager
from scrapy.utils.conf import build_component_list
from scrapy.utils.misc im... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/commands/view.py | scrapy/commands/view.py | import argparse
import logging
from scrapy.commands import fetch
from scrapy.http import Response, TextResponse
from scrapy.utils.response import open_in_browser
logger = logging.getLogger(__name__)
class Command(fetch.Command):
def short_desc(self) -> str:
return "Open URL in browser, as seen by Scrapy... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/commands/genspider.py | scrapy/commands/genspider.py | from __future__ import annotations
import os
import shutil
import string
from importlib import import_module
from pathlib import Path
from typing import TYPE_CHECKING, Any, cast
from urllib.parse import urlparse
import scrapy
from scrapy.commands import ScrapyCommand
from scrapy.exceptions import UsageError
from scra... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/commands/crawl.py | scrapy/commands/crawl.py | from __future__ import annotations
from typing import TYPE_CHECKING
from scrapy.commands import BaseRunSpiderCommand
from scrapy.exceptions import UsageError
if TYPE_CHECKING:
import argparse
class Command(BaseRunSpiderCommand):
requires_project = True
def syntax(self) -> str:
return "[options... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/commands/version.py | scrapy/commands/version.py | import argparse
import scrapy
from scrapy.commands import ScrapyCommand
from scrapy.utils.versions import get_versions
class Command(ScrapyCommand):
requires_crawler_process = False
default_settings = {"LOG_ENABLED": False}
def syntax(self) -> str:
return "[-v]"
def short_desc(self) -> str:... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/commands/list.py | scrapy/commands/list.py | from __future__ import annotations
from typing import TYPE_CHECKING
from scrapy.commands import ScrapyCommand
from scrapy.spiderloader import get_spider_loader
if TYPE_CHECKING:
import argparse
class Command(ScrapyCommand):
requires_project = True
requires_crawler_process = False
default_settings =... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/commands/settings.py | scrapy/commands/settings.py | import argparse
import json
from scrapy.commands import ScrapyCommand
from scrapy.settings import BaseSettings
class Command(ScrapyCommand):
requires_crawler_process = False
default_settings = {"LOG_ENABLED": False}
def syntax(self) -> str:
return "[options]"
def short_desc(self) -> str:
... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/commands/parse.py | scrapy/commands/parse.py | from __future__ import annotations
import functools
import inspect
import json
import logging
from typing import TYPE_CHECKING, Any, TypeVar, overload
from itemadapter import ItemAdapter
from twisted.internet.defer import Deferred, maybeDeferred
from w3lib.url import is_url
from scrapy.commands import BaseRunSpiderC... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/commands/bench.py | scrapy/commands/bench.py | from __future__ import annotations
import subprocess
import sys
import time
from typing import TYPE_CHECKING, Any
from urllib.parse import urlencode
import scrapy
from scrapy.commands import ScrapyCommand
from scrapy.http import Response, TextResponse
from scrapy.linkextractors import LinkExtractor
from scrapy.utils.... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/commands/__init__.py | scrapy/commands/__init__.py | """
Base class for Scrapy commands
"""
from __future__ import annotations
import argparse
import builtins
import os
from abc import ABC, abstractmethod
from pathlib import Path
from typing import TYPE_CHECKING, Any
from twisted.python import failure
from scrapy.exceptions import UsageError
from scrapy.utils.conf im... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/commands/fetch.py | scrapy/commands/fetch.py | from __future__ import annotations
import sys
from argparse import Namespace # noqa: TC003
from typing import TYPE_CHECKING
from w3lib.url import is_url
from scrapy.commands import ScrapyCommand
from scrapy.exceptions import UsageError
from scrapy.http import Request, Response
from scrapy.utils.datatypes import Seq... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/commands/startproject.py | scrapy/commands/startproject.py | from __future__ import annotations
import re
import string
from importlib.util import find_spec
from pathlib import Path
from shutil import copy2, copystat, ignore_patterns, move
from stat import S_IWUSR as OWNER_WRITE_PERMISSION
from typing import TYPE_CHECKING
import scrapy
from scrapy.commands import ScrapyCommand... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/pipelines/files.py | scrapy/pipelines/files.py | """
Files Pipeline
See documentation in topics/media-pipeline.rst
"""
from __future__ import annotations
import base64
import functools
import hashlib
import logging
import mimetypes
import time
import warnings
from collections import defaultdict
from contextlib import suppress
from ftplib import FTP
from io import ... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/pipelines/__init__.py | scrapy/pipelines/__init__.py | """
Item pipeline
See documentation in docs/item-pipeline.rst
"""
from __future__ import annotations
import asyncio
import warnings
from typing import TYPE_CHECKING, Any, cast
from twisted.internet.defer import Deferred, DeferredList
from scrapy.exceptions import ScrapyDeprecationWarning
from scrapy.middleware imp... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/pipelines/images.py | scrapy/pipelines/images.py | """
Images Pipeline
See documentation in topics/media-pipeline.rst
"""
from __future__ import annotations
import functools
import hashlib
import warnings
from contextlib import suppress
from io import BytesIO
from typing import TYPE_CHECKING, Any
from itemadapter import ItemAdapter
from scrapy.exceptions import No... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/pipelines/media.py | scrapy/pipelines/media.py | from __future__ import annotations
import asyncio
import functools
import logging
import warnings
from abc import ABC, abstractmethod
from collections import defaultdict
from typing import TYPE_CHECKING, Any, Literal, TypeAlias, TypedDict, cast
from twisted import version as twisted_version
from twisted.internet.defe... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/gz.py | scrapy/utils/gz.py | from __future__ import annotations
import struct
from gzip import GzipFile
from io import BytesIO
from typing import TYPE_CHECKING
from ._compression import _CHUNK_SIZE, _check_max_size
if TYPE_CHECKING:
from scrapy.http import Response
def gunzip(data: bytes, *, max_size: int = 0) -> bytes:
"""Gunzip the ... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/testproc.py | scrapy/utils/testproc.py | from __future__ import annotations
import os
import sys
import warnings
from typing import TYPE_CHECKING, cast
from twisted.internet.defer import Deferred
from twisted.internet.protocol import ProcessProtocol
from scrapy.exceptions import ScrapyDeprecationWarning
if TYPE_CHECKING:
from collections.abc import It... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/sitemap.py | scrapy/utils/sitemap.py | """
Module for processing Sitemaps.
Note: The main purpose of this module is to provide support for the
SitemapSpider, its API is subject to change without notice.
"""
from __future__ import annotations
from typing import TYPE_CHECKING, Any
from urllib.parse import urljoin
import lxml.etree
if TYPE_CHECKING:
f... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
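The sitemap module above parses sitemap XML with lxml.etree. The core idea — pulling every `<loc>` URL out of a namespaced urlset or sitemapindex — can be illustrated with the stdlib parser instead (this is not Scrapy's Sitemap class, just the underlying extraction):

```python
from xml.etree import ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"


def iter_sitemap_locs(xml_text: str):
    """Yield every <loc> URL from a urlset or sitemapindex document."""
    root = ET.fromstring(xml_text)
    # Both <url> and <sitemap> entries carry a <loc> child, so match <loc> directly.
    for loc in root.iter(SITEMAP_NS + "loc"):
        if loc.text:
            yield loc.text.strip()
```

Matching on the fully qualified `{namespace}loc` tag is what makes this work: sitemap documents declare a default namespace, so a bare `loc` lookup would find nothing.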
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/spider.py | scrapy/utils/spider.py | from __future__ import annotations
import inspect
import logging
from typing import TYPE_CHECKING, Any, TypeVar, overload
from scrapy.spiders import Spider
from scrapy.utils.defer import deferred_from_coro
from scrapy.utils.misc import arg_to_iter
if TYPE_CHECKING:
from collections.abc import AsyncGenerator, Ite... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/ssl.py | scrapy/utils/ssl.py | from __future__ import annotations
from typing import TYPE_CHECKING, Any
import OpenSSL._util as pyOpenSSLutil
import OpenSSL.SSL
import OpenSSL.version
from scrapy.utils.python import to_unicode
if TYPE_CHECKING:
from OpenSSL.crypto import X509Name
def ffi_buf_to_string(buf: Any) -> str:
return to_unicod... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/url.py | scrapy/utils/url.py | """
This module contains general purpose URL functions not found in the standard
library.
"""
from __future__ import annotations
import re
import warnings
from importlib import import_module
from typing import TYPE_CHECKING, TypeAlias
from urllib.parse import ParseResult, urldefrag, urlparse, urlunparse
from warnings... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/response.py | scrapy/utils/response.py | """
This module provides some useful functions for working with
scrapy.http.Response objects
"""
from __future__ import annotations
import os
import re
import tempfile
import webbrowser
from typing import TYPE_CHECKING, Any
from weakref import WeakKeyDictionary
from twisted.web import http
from w3lib import html
fr... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/asyncio.py | scrapy/utils/asyncio.py | """Utilities related to asyncio and its support in Scrapy."""
from __future__ import annotations
import asyncio
import logging
import time
from collections.abc import AsyncIterator, Callable, Coroutine, Iterable
from typing import TYPE_CHECKING, Any, Concatenate, ParamSpec, TypeVar
from twisted.internet.defer import... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/iterators.py | scrapy/utils/iterators.py | from __future__ import annotations
import csv
import logging
import re
from io import StringIO
from typing import TYPE_CHECKING, Any, Literal, cast, overload
from warnings import warn
from lxml import etree
from scrapy.exceptions import ScrapyDeprecationWarning
from scrapy.http import Response, TextResponse
from scr... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/serialize.py | scrapy/utils/serialize.py | import datetime
import decimal
import json
from typing import Any
from itemadapter import ItemAdapter, is_item
from twisted.internet import defer
from scrapy.http import Request, Response
class ScrapyJSONEncoder(json.JSONEncoder):
DATE_FORMAT = "%Y-%m-%d"
TIME_FORMAT = "%H:%M:%S"
def default(self, o: A... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/template.py | scrapy/utils/template.py | """Helper functions for working with templates"""
from __future__ import annotations
import re
import string
from pathlib import Path
from typing import TYPE_CHECKING, Any
if TYPE_CHECKING:
from os import PathLike
def render_templatefile(path: str | PathLike, **kwargs: Any) -> None:
path_obj = Path(path)
... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/ftp.py | scrapy/utils/ftp.py | import posixpath
from ftplib import FTP, error_perm
from posixpath import dirname
from typing import IO
def ftp_makedirs_cwd(ftp: FTP, path: str, first_call: bool = True) -> None:
"""Set the current directory of the FTP connection given in the ``ftp``
argument (as a ftplib.FTP object), creating all parent dir... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
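The docstring above describes changing into an FTP directory while creating any missing parents. The recursive pattern can be sketched against an in-memory stand-in for ftplib.FTP (FakeFTP below is purely for illustration; the real helper talks to a live server):

```python
import posixpath
from ftplib import error_perm


def makedirs_cwd(ftp, path: str, first_call: bool = True) -> None:
    """cwd into *path*; on a permanent error, create the parent chain first."""
    try:
        ftp.cwd(path)
    except error_perm:
        # The parent may not exist yet: ensure it does, then create *path*.
        makedirs_cwd(ftp, posixpath.dirname(path), first_call=False)
        ftp.mkd(path)
        if first_call:
            ftp.cwd(path)


class FakeFTP:
    """In-memory stand-in for ftplib.FTP, enough to exercise the helper."""

    def __init__(self) -> None:
        self.dirs = {"/"}
        self.cwd_path = "/"

    def cwd(self, path: str) -> None:
        if path not in self.dirs:
            raise error_perm(f"550 {path}: no such directory")
        self.cwd_path = path

    def mkd(self, path: str) -> str:
        self.dirs.add(path)
        return path
```

The recursion bottoms out when `cwd` finally succeeds (at worst at `/`), after which each frame creates its own level on the way back up; only the outermost call leaves the connection positioned in the target directory.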
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/defer.py | scrapy/utils/defer.py | """
Helper functions for dealing with Twisted deferreds
"""
from __future__ import annotations
import asyncio
import inspect
import warnings
from asyncio import Future
from collections.abc import Awaitable, Coroutine, Iterable, Iterator
from functools import wraps
from typing import (
TYPE_CHECKING,
Any,
... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/ossignal.py | scrapy/utils/ossignal.py | from __future__ import annotations
import signal
from collections.abc import Callable
from types import FrameType
from typing import Any, TypeAlias
# copy of _HANDLER from typeshed/stdlib/signal.pyi
SignalHandlerT: TypeAlias = (
Callable[[int, FrameType | None], Any] | int | signal.Handlers | None
)
signal_names... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/decorators.py | scrapy/utils/decorators.py | from __future__ import annotations
import inspect
import warnings
from functools import wraps
from typing import TYPE_CHECKING, Any, ParamSpec, TypeVar, overload
from twisted.internet.defer import Deferred, maybeDeferred
from twisted.internet.threads import deferToThread
from scrapy.exceptions import ScrapyDeprecati... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/versions.py | scrapy/utils/versions.py | from __future__ import annotations
import platform
import sys
from importlib.metadata import version
from warnings import warn
import lxml.etree
from scrapy.exceptions import ScrapyDeprecationWarning
from scrapy.settings.default_settings import LOG_VERSIONS
from scrapy.utils.ssl import get_openssl_version
_DEFAULT_... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/httpobj.py | scrapy/utils/httpobj.py | """Helper functions for scrapy.http objects (Request, Response)"""
from __future__ import annotations
from typing import TYPE_CHECKING
from urllib.parse import ParseResult, urlparse
from weakref import WeakKeyDictionary
if TYPE_CHECKING:
from scrapy.http import Request, Response
_urlparse_cache: WeakKeyDiction... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/console.py | scrapy/utils/console.py | from __future__ import annotations
import code
from collections.abc import Callable
from functools import wraps
from typing import TYPE_CHECKING, Any
if TYPE_CHECKING:
from collections.abc import Iterable
EmbedFuncT = Callable[..., None]
KnownShellsT = dict[str, Callable[..., EmbedFuncT]]
def _embed_ipython_sh... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/request.py | scrapy/utils/request.py | """
This module provides some useful functions for working with
scrapy.Request objects
"""
from __future__ import annotations
import hashlib
import json
from typing import TYPE_CHECKING, Any, Protocol
from urllib.parse import urlunparse
from weakref import WeakKeyDictionary
from w3lib.url import canonicalize_url
fr... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/log.py | scrapy/utils/log.py | from __future__ import annotations
import logging
import pprint
import sys
from collections.abc import MutableMapping
from logging.config import dictConfig
from typing import TYPE_CHECKING, Any, cast
from twisted.internet import asyncioreactor
from twisted.python import log as twisted_log
from twisted.python.failure ... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/deprecate.py | scrapy/utils/deprecate.py | """Some helpers for deprecation messages"""
from __future__ import annotations
import inspect
import warnings
from typing import TYPE_CHECKING, Any, overload
from scrapy.exceptions import ScrapyDeprecationWarning
from scrapy.utils.python import get_func_args_dict
if TYPE_CHECKING:
from collections.abc import Ca... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/signal.py | scrapy/utils/signal.py | """Helper functions for working with signals"""
from __future__ import annotations
import asyncio
import logging
import warnings
from collections.abc import Awaitable, Callable, Generator, Sequence
from typing import Any as TypingAny
from pydispatch.dispatcher import (
Anonymous,
Any,
disconnect,
get... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/_compression.py | scrapy/utils/_compression.py | import contextlib
import zlib
from io import BytesIO
with contextlib.suppress(ImportError):
try:
import brotli
except ImportError:
import brotlicffi as brotli
with contextlib.suppress(ImportError):
import zstandard
_CHUNK_SIZE = 65536 # 64 KiB
class _DecompressionMaxSizeExceeded(Value... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/python.py | scrapy/utils/python.py | """
This module contains essential stuff that should've come with Python itself ;)
"""
from __future__ import annotations
import gc
import inspect
import re
import sys
import weakref
from collections.abc import AsyncIterator, Iterable, Mapping
from functools import partial, wraps
from itertools import chain
from typi... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/curl.py | scrapy/utils/curl.py | from __future__ import annotations
import argparse
import warnings
from http.cookies import SimpleCookie
from shlex import split
from typing import TYPE_CHECKING, Any, NoReturn
from urllib.parse import urlparse
from w3lib.http import basic_auth_header
if TYPE_CHECKING:
from collections.abc import Sequence
clas... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
scrapy/scrapy | https://github.com/scrapy/scrapy/blob/d1bd8eb49f7aba9289e4ff692006cead8bcd9080/scrapy/utils/trackref.py | scrapy/utils/trackref.py | """This module provides some functions and classes to record and report
references to live object instances.
If you want live objects for a particular class to be tracked, you only have to
subclass from object_ref (instead of object).
About performance: This library has a minimal performance impact when enabled,
and ... | python | BSD-3-Clause | d1bd8eb49f7aba9289e4ff692006cead8bcd9080 | 2026-01-04T14:38:41.023839Z | false |
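The trackref description above — opt-in live-object tracking via a base class, with minimal overhead — can be approximated with weak references from the stdlib (the names below are illustrative, not the real trackref API):

```python
import gc
from collections import defaultdict
from weakref import WeakSet

# Class -> weakly-held set of its live instances.
live_refs: "defaultdict[type, WeakSet]" = defaultdict(WeakSet)


class ObjectRefSketch:
    """Subclass this to have live instances counted without keeping them alive."""

    def __new__(cls, *args, **kwargs):
        obj = super().__new__(cls)
        live_refs[cls].add(obj)
        return obj


def live_count(cls: type) -> int:
    """Number of instances of *cls* that are still alive."""
    return len(live_refs[cls])
```

Since the WeakSet holds only weak references, registration in `__new__` never extends an object's lifetime: once an instance is collected, it silently drops out of the count.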