chunkipy.text_splitters.semantic.sentences.spacy_sentence_text_splitter
Classes

| SpacySentenceTextSplitter | Sentence splitter using spaCy for semantic text splitting. |
- class chunkipy.text_splitters.semantic.sentences.spacy_sentence_text_splitter.SpacySentenceTextSplitter(models_map=None, text_limit=None, language_detector=None)[source]
Bases: BaseSemanticTextSplitter

Sentence splitter using spaCy for semantic text splitting. This class uses spaCy to split text into sentences based on the language detected in the text. It supports multiple languages by loading different spaCy models depending on the detected language. If the language is not supported, it defaults to English.
- Parameters:
models_map (dict | None)
text_limit (int | None)
language_detector (BaseLanguageDetector | None)
- text_limit
The maximum length of text to process at once. If None, DEFAULT_LIMIT from the base class is applied.
- Type:
int
- DEFAULT_LANG = 'en'
- DEFAULT_MODELS_MAP = {'en': 'en_core_web_sm'}
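The fallback described above (pick the spaCy model for the detected language, default to English when the language is unsupported) can be sketched as follows. This is a minimal illustration of the documented behavior, not chunkipy's actual implementation; the helper name `resolve_model` is hypothetical.

```python
# Module-level defaults mirroring the documented class attributes.
DEFAULT_LANG = "en"
DEFAULT_MODELS_MAP = {"en": "en_core_web_sm"}


def resolve_model(detected_lang, models_map=None):
    """Return the spaCy model name for a detected language code.

    Sketch of the documented lookup: use the entry for the detected
    language if present, otherwise fall back to the English default.
    (`resolve_model` is an illustrative helper, not part of chunkipy.)
    """
    models_map = models_map or DEFAULT_MODELS_MAP
    # Unsupported language -> default to the English model.
    return models_map.get(
        detected_lang,
        models_map.get(DEFAULT_LANG, DEFAULT_MODELS_MAP[DEFAULT_LANG]),
    )
```

For example, with a custom `models_map` such as `{"de": "de_core_news_sm", "en": "en_core_web_sm"}`, a detected language of `"de"` resolves to `"de_core_news_sm"`, while an unsupported code like `"xx"` falls back to `"en_core_web_sm"`.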