Model
This module contains all models implemented to perform Named Entity Recognition (NER). Author: Lucas Pavanelli.
- class model.BERTSlotFilling(hidden_dim, num_classes)
BERTimbau model to predict NER classes.
Parameters:
- hidden_dim: int
Hidden layer dimension.
- num_classes: int
Number of NER classes.
Attributes:
- device: torch.device
Device on which the model runs.
- hidden_dim: int
Hidden layer dimension.
- num_classes: int
Number of NER classes.
- bert: AutoModel
BERTimbau model.
- Wb: nn.Linear
Linear layer.
- softmax: nn.Softmax
Softmax layer.
- forward(token_ids, subword_ids)
Computes probabilities for each NER class.
Parameters:
- token_ids: torch.Tensor
List of token indexes.
- subword_ids: torch.Tensor
List of subword indexes.
Returns:
- torch.Tensor
Probabilities for each NER class.
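A minimal sketch of the classification head this class documents: a linear layer (`Wb`) followed by a softmax over the NER classes. The BERTimbau encoder is stood in for by a random hidden-state tensor, and all dimensions here are illustrative assumptions, not values from the module.

```python
import torch
import torch.nn as nn

# Illustrative sizes (assumptions): BERT-base hidden size, 9 NER classes,
# a single sentence of 12 tokens.
hidden_dim, num_classes, seq_len = 768, 9, 12

# Stand-in for the BERTimbau encoder output: one hidden vector per token.
hidden_states = torch.randn(1, seq_len, hidden_dim)

# Head sketched from the documented attributes.
Wb = nn.Linear(hidden_dim, num_classes)   # hidden vector -> class scores
softmax = nn.Softmax(dim=-1)              # scores -> probabilities per token

probs = softmax(Wb(hidden_states))        # shape: (1, seq_len, num_classes)
```

Each row of `probs` is a distribution over the NER classes for one token, so the rows sum to 1.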
- class model.LinearLayerCRF(num_classes, vocab_size, out_w2id)
Linear layer + CRF model that returns probabilities for each NER class.
Parameters:
- vocab_size: int
Vocabulary size.
- num_classes: int
Number of NER classes.
- out_w2id: dict
Map from word to id for output vocabulary.
Attributes:
- device: torch.device
Device on which the model runs.
- linear: SimpleLinear
Linear layer.
- crf: CRF
PyTorch CRF layer.
- forward(token_ids)
Computes probabilities for each NER class.
Parameters:
- token_ids: torch.Tensor
List of token indexes.
Returns:
- torch.Tensor
Probabilities for each NER class.
- loss(token_ids, tag_ids)
Computes model’s loss.
Parameters:
- token_ids: torch.Tensor
List of token indexes.
- tag_ids: torch.Tensor
List of tag indexes.
Returns:
- torch.Tensor
Model's loss.
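The loss of a linear-chain CRF is the negative log-likelihood of the gold tag sequence given the per-token emission scores from the linear layer. The class above delegates this to its `crf` layer; the following is only a hand-rolled sketch of that quantity for a single sentence, with illustrative names and sizes.

```python
import torch

def crf_nll(emissions, tags, transitions):
    """Negative log-likelihood of `tags` under a linear-chain CRF.

    emissions: (seq_len, num_tags) per-token class scores,
    tags: (seq_len,) gold tag ids,
    transitions: (num_tags, num_tags) tag-to-tag transition scores.
    """
    seq_len, _ = emissions.shape
    # Score of the gold path: emission terms plus transition terms.
    score = emissions[0, tags[0]]
    for t in range(1, seq_len):
        score = score + transitions[tags[t - 1], tags[t]] + emissions[t, tags[t]]
    # Log partition function via the forward algorithm:
    # alpha[j] = log sum over all paths ending in tag j at step t.
    alpha = emissions[0]
    for t in range(1, seq_len):
        alpha = torch.logsumexp(alpha.unsqueeze(1) + transitions, dim=0) + emissions[t]
    return torch.logsumexp(alpha, dim=0) - score

emissions = torch.randn(5, 4)          # 5 tokens, 4 NER classes (illustrative)
tags = torch.tensor([0, 1, 1, 2, 0])   # gold tag sequence
transitions = torch.zeros(4, 4)        # zero transitions: CRF reduces to
loss = crf_nll(emissions, tags, transitions)  # per-token softmax NLL
```

With all-zero transition scores the path score factorizes per token, so the NLL equals the summed per-token cross-entropy, which makes the sketch easy to sanity-check.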
- class model.SimpleLinear(vocab_size, num_classes, out_w2id, emb_dim=10)
Simple linear model that creates embeddings and returns entity logits.
Parameters:
- vocab_size: int
Vocabulary size.
- num_classes: int
Number of NER classes.
- out_w2id: dict
Map from word to id for output vocabulary.
- emb_dim: int
Embedding layer dimension.
Attributes:
- emb: nn.Embedding
Embedding layer.
- emb2tag: nn.Linear
Linear layer.
- forward(token_ids)
Computes entity logits.
Parameters:
- token_ids: torch.Tensor
List of token indexes.
Returns:
- torch.Tensor
Entity logits.
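A self-contained sketch of this class built from its documented attributes: an embedding lookup followed by a linear projection to entity logits. How `out_w2id` is used internally is not stated above, so it is kept only as metadata here (an assumption); all sizes are illustrative.

```python
import torch
import torch.nn as nn

class SimpleLinearSketch(nn.Module):
    """Embedding lookup followed by a linear projection to entity logits.

    A sketch of the documented SimpleLinear, not the original implementation.
    """

    def __init__(self, vocab_size, num_classes, out_w2id, emb_dim=10):
        super().__init__()
        self.out_w2id = out_w2id                        # output vocab map (kept as metadata)
        self.emb = nn.Embedding(vocab_size, emb_dim)    # token id -> embedding vector
        self.emb2tag = nn.Linear(emb_dim, num_classes)  # embedding -> entity logits

    def forward(self, token_ids):
        return self.emb2tag(self.emb(token_ids))        # shape: (..., num_classes)

model = SimpleLinearSketch(vocab_size=100, num_classes=5, out_w2id={"O": 0})
logits = model(torch.tensor([[3, 17, 42]]))             # batch of one 3-token sentence
```

The output is raw logits; a softmax (or, in `LinearLayerCRF`, a CRF layer over these emissions) turns them into per-token class decisions.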