langchain_nvidia_ai_endpoints.embeddings.NVIDIAEmbeddings¶
- class langchain_nvidia_ai_endpoints.embeddings.NVIDIAEmbeddings[source]¶
Bases: _NVIDIAClient, Embeddings

Client to NVIDIA embeddings models.
Fields:
- model: str, the name of the model to use
- truncate: “NONE”, “START”, or “END”; truncate input text if it exceeds the model’s maximum token length. Default is “NONE”, which raises an error if an input is too long.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
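The calling pattern for the two embedding pathways can be sketched with a stand-in class that mirrors the `embed_documents` / `embed_query` signatures documented below. This is an illustrative sketch only, so it runs without network access or an API key; the real `NVIDIAEmbeddings` POSTs to the NVIDIA endpoint, and the dummy vectors here are placeholders.

```python
# Illustrative stand-in mirroring the NVIDIAEmbeddings call signatures.
# The real class sends requests to '{base_url}/embeddings'; the vector
# contents below are dummies, not real embeddings.
from typing import List


class StandInNVIDIAEmbeddings:
    """Sketch of the NVIDIAEmbeddings interface (not the real client)."""

    def __init__(self, model: str = "ai-embed-qa-4", truncate: str = "NONE"):
        self.model = model
        self.truncate = truncate

    def embed_documents(self, texts: List[str]) -> List[List[float]]:
        # Real client embeds each text as a 'passage'; dummy 4-dim vectors here.
        return [[float(len(t))] * 4 for t in texts]

    def embed_query(self, text: str) -> List[float]:
        # Real client embeds the text as a 'query'; dummy 4-dim vector here.
        return [float(len(text))] * 4


embedder = StandInNVIDIAEmbeddings(truncate="END")
docs = embedder.embed_documents(["first passage", "second passage"])
query_vec = embedder.embed_query("what is the first passage?")
print(len(docs), len(query_vec))  # → 2 4
```

The split between a document (passage) pathway and a query pathway matches the `model_type` field below, which distinguishes 'passage' from 'query' embeddings.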
- param client: NVEModel = <class 'langchain_nvidia_ai_endpoints._common.NVEModel'>¶
- param curr_mode: _MODE_TYPE = 'nvidia'¶
- param infer_endpoint: str = '{base_url}/embeddings'¶
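The `infer_endpoint` value is a path template filled in with the client's base URL. A minimal sketch of that resolution (the trailing-slash handling is an assumption, not taken from the source):

```python
# Sketch: resolve the '{base_url}/embeddings' template against a base URL.
# Stripping a trailing slash is an assumption for illustration.
def resolve_infer_endpoint(base_url: str,
                           infer_path: str = "{base_url}/embeddings") -> str:
    return infer_path.format(base_url=base_url.rstrip("/"))


url = resolve_infer_endpoint("http://localhost:8000/v1")
print(url)  # → http://localhost:8000/v1/embeddings
```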
- param max_batch_size: int = 50¶
- param max_length: int = 2048¶
- Constraints
minimum = 1
maximum = 2048
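A `max_batch_size` of 50 implies that a large list of texts is split into chunks of at most 50 before being sent to the endpoint. A minimal sketch of that chunking (the helper name is hypothetical, not part of the library):

```python
# Sketch of the batching implied by max_batch_size = 50: split the input
# list into chunks of at most batch_size items.
from typing import List

MAX_BATCH_SIZE = 50  # default from this reference


def batch_texts(texts: List[str],
                batch_size: int = MAX_BATCH_SIZE) -> List[List[str]]:
    return [texts[i:i + batch_size] for i in range(0, len(texts), batch_size)]


batches = batch_texts([f"doc {i}" for i in range(120)])
print([len(b) for b in batches])  # → [50, 50, 20]
```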
- param model: str = 'ai-embed-qa-4'¶
Name of the model to invoke
- param model_type: Optional[Literal['passage', 'query']] = None¶
The type of text to be embedded.
- param truncate: Literal['NONE', 'START', 'END'] = 'NONE'¶
Truncate input text if it exceeds the model’s maximum token length. Default is ‘NONE’, which raises an error if an input is too long.
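One plausible reading of the three `truncate` modes, sketched over a token list: 'NONE' raises on over-long input, 'START' truncates from the start (keeping the end), and 'END' truncates from the end (keeping the beginning). The exact server-side behavior is not specified here, so treat this as an assumption:

```python
# Sketch of the truncate semantics: 'NONE' raises, 'START' keeps the tail,
# 'END' keeps the head. Which side 'START'/'END' keep is an assumption.
from typing import List


def truncate_tokens(tokens: List[int], max_length: int,
                    mode: str = "NONE") -> List[int]:
    if len(tokens) <= max_length:
        return tokens
    if mode == "NONE":
        raise ValueError(
            f"input of {len(tokens)} tokens exceeds max_length={max_length}")
    if mode == "START":
        return tokens[-max_length:]  # drop from the start, keep the end
    if mode == "END":
        return tokens[:max_length]   # drop from the end, keep the beginning
    raise ValueError(f"unknown truncate mode: {mode}")


print(truncate_tokens(list(range(6)), 4, "END"))    # → [0, 1, 2, 3]
print(truncate_tokens(list(range(6)), 4, "START"))  # → [2, 3, 4, 5]
```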
- async aembed_documents(texts: List[str]) List[List[float]]¶
Asynchronously embed search docs.
- Parameters
texts (List[str]) –
- Return type
List[List[float]]
- async aembed_query(text: str) List[float]¶
Asynchronously embed query text.
- Parameters
text (str) –
- Return type
List[float]
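The async pathway mirrors the sync one and lets document and query embedding run concurrently with `asyncio.gather`. Sketched below with a dummy embedder (not the real client) so the block runs without network access:

```python
# Sketch of the async pathway: aembed_documents / aembed_query are
# awaitable counterparts of the sync methods. DummyAsyncEmbedder is a
# stand-in; the real client awaits HTTP calls to the NVIDIA endpoint.
import asyncio
from typing import List, Tuple


class DummyAsyncEmbedder:
    async def aembed_documents(self, texts: List[str]) -> List[List[float]]:
        return [[float(len(t))] for t in texts]

    async def aembed_query(self, text: str) -> List[float]:
        return [float(len(text))]


async def main() -> Tuple[List[List[float]], List[float]]:
    emb = DummyAsyncEmbedder()
    # Run both embedding calls concurrently.
    docs, query = await asyncio.gather(
        emb.aembed_documents(["a", "bb"]),
        emb.aembed_query("ccc"),
    )
    return docs, query


docs, query = asyncio.run(main())
print(docs, query)  # → [[1.0], [2.0]] [3.0]
```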
- classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) Model¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = ‘allow’ was set since it adds all passed values
- Parameters
_fields_set (Optional[SetStr]) –
values (Any) –
- Return type
Model
- copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) Model¶
Duplicate a model, optionally choose which fields to include, exclude and change.
- Parameters
include (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to include in new model
exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to exclude from new model, as with values this takes precedence over include
update (Optional[DictStrAny]) – values to change/add in the new model. Note: the data is not validated before creating the new model: you should trust this data
deep (bool) – set to True to make a deep copy of the model
self (Model) –
- Returns
new model instance
- Return type
Model
- dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) DictStrAny¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
- Parameters
include (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) –
exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) –
by_alias (bool) –
skip_defaults (Optional[bool]) –
exclude_unset (bool) –
exclude_defaults (bool) –
exclude_none (bool) –
- Return type
DictStrAny
- embed_documents(texts: List[str]) List[List[float]][source]¶
Input pathway for document embeddings.
- Parameters
texts (List[str]) –
- Return type
List[List[float]]
- embed_query(text: str) List[float][source]¶
Input pathway for query embeddings.
- Parameters
text (str) –
- Return type
List[float]
- classmethod from_orm(obj: Any) Model¶
- Parameters
obj (Any) –
- Return type
Model
- classmethod get_available_functions(mode: Optional[Literal['catalog', 'nvidia', 'nim', 'open', 'openai']] = None, client: Optional[_NVIDIAClient] = None, **kwargs: Any) List[dict]¶
Map the available functions that can be invoked. Callable from the class.
- Parameters
mode (Optional[Literal['catalog', 'nvidia', 'nim', 'open', 'openai']]) –
client (Optional[_NVIDIAClient]) –
kwargs (Any) –
- Return type
List[dict]
- classmethod get_available_models(mode: Optional[Literal['catalog', 'nvidia', 'nim', 'open', 'openai']] = None, client: Optional[_NVIDIAClient] = None, list_all: bool = False, filter: Optional[str] = None, **kwargs: Any) List[Model]¶
Map the available models that can be invoked. Callable from the class.
- Parameters
mode (Optional[Literal['catalog', 'nvidia', 'nim', 'open', 'openai']]) –
client (Optional[_NVIDIAClient]) –
list_all (bool) –
filter (Optional[str]) –
kwargs (Any) –
- Return type
List[Model]
- get_binding_model() Optional[str]¶
Get the model to bind to the client as the default payload argument.
- Return type
Optional[str]
- get_model_details(model: Optional[str] = None) dict¶
Get more meta-details about a model retrieved by a given name.
- Parameters
model (Optional[str]) –
- Return type
dict
- classmethod is_lc_serializable() bool¶
- Return type
bool
- json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) unicode¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
- Parameters
include (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) –
exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) –
by_alias (bool) –
skip_defaults (Optional[bool]) –
exclude_unset (bool) –
exclude_defaults (bool) –
exclude_none (bool) –
encoder (Optional[Callable[[Any], Any]]) –
models_as_dict (bool) –
dumps_kwargs (Any) –
- Return type
unicode
- mode(mode: Optional[Literal['catalog', 'nvidia', 'nim', 'open', 'openai']] = 'nvidia', base_url: Optional[str] = None, model: Optional[str] = None, api_key: Optional[str] = None, infer_path: Optional[str] = None, models_path: Optional[str] = '{base_url}/models', force_mode: bool = False, force_clone: bool = True, **kwargs: Any) Any¶
Return a client swapped to a different mode.
- Parameters
mode (Optional[Literal['catalog', 'nvidia', 'nim', 'open', 'openai']]) –
base_url (Optional[str]) –
model (Optional[str]) –
api_key (Optional[str]) –
infer_path (Optional[str]) –
models_path (Optional[str]) –
force_mode (bool) –
force_clone (bool) –
kwargs (Any) –
- Return type
Any
- classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model¶
- Parameters
path (Union[str, Path]) –
content_type (unicode) –
encoding (unicode) –
proto (Protocol) –
allow_pickle (bool) –
- Return type
Model
- classmethod parse_obj(obj: Any) Model¶
- Parameters
obj (Any) –
- Return type
Model
- classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model¶
- Parameters
b (Union[str, bytes]) –
content_type (unicode) –
encoding (unicode) –
proto (Protocol) –
allow_pickle (bool) –
- Return type
Model
- classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') DictStrAny¶
- Parameters
by_alias (bool) –
ref_template (unicode) –
- Return type
DictStrAny
- classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) unicode¶
- Parameters
by_alias (bool) –
ref_template (unicode) –
dumps_kwargs (Any) –
- Return type
unicode
- classmethod update_forward_refs(**localns: Any) None¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
- Parameters
localns (Any) –
- Return type
None
- classmethod validate(value: Any) Model¶
- Parameters
value (Any) –
- Return type
Model
- property available_functions: List[dict]¶
Map the available functions that can be invoked.
- property available_models: List[Model]¶
Map the available models that can be invoked.
- property lc_attributes: Dict[str, Any]¶
- property lc_secrets: Dict[str, str]¶