chunkipy.text_splitters.semantic.sentences.spacy_sentence_text_splitter

Classes

SpacySentenceTextSplitter([models_map, ...])

Sentence splitter using spaCy for semantic text splitting.

class chunkipy.text_splitters.semantic.sentences.spacy_sentence_text_splitter.SpacySentenceTextSplitter(models_map=None, text_limit=None, language_detector=None)[source]

Bases: BaseSemanticTextSplitter

Sentence splitter using spaCy for semantic text splitting. This class uses spaCy to split text into sentences based on the language detected in the text. It supports multiple languages by loading a different spaCy model for each detected language; if the detected language is not supported, it falls back to English.

Parameters:
models_map

A dictionary mapping language codes to spaCy model names.

Type:

Dict[str, str]

text_limit

The maximum length of text to process at once. If None, the base class's DEFAULT_LIMIT is used.

Type:

int, optional

DEFAULT_LANG = 'en'
DEFAULT_MODELS_MAP = {'en': 'en_core_web_sm'}