Sequence model
ESM2FineTuneSeqConfig
dataclass
Bases: ESM2GenericConfig[ESM2FineTuneSeqModel, BERTMLMLossWithReduction], IOMixinWithGettersSetters
ESM2FineTuneSeqConfig is a dataclass used to configure ESM2FineTuneSeqModel for sequence-level fine-tuning. Timers from ModelParallelConfig are required for Megatron forward compatibility.
Source code in bionemo/esm2/model/finetune/sequence_model.py
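In practice the config is instantiated with a pretrained checkpoint and handed to the training entry point. A minimal sketch, assuming the initial_ckpt_path field is inherited from ESM2GenericConfig (the checkpoint path is hypothetical):

```python
from bionemo.esm2.model.finetune.sequence_model import ESM2FineTuneSeqConfig

# Minimal sketch; initial_ckpt_path is assumed to come from ESM2GenericConfig,
# and the checkpoint path below is a placeholder.
config = ESM2FineTuneSeqConfig(
    initial_ckpt_path="/path/to/pretrained/esm2",
)
```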
get_loss_reduction_class()
Returns the RegressorLossReduction class.
Source code in bionemo/esm2/model/finetune/sequence_model.py
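The training harness calls this hook to obtain the loss class rather than an instance. A sketch of that usage (constructor arguments, if any, are an assumption here):

```python
loss_cls = config.get_loss_reduction_class()  # returns RegressorLossReduction
loss_fn = loss_cls()  # instantiated by the training loop; args are assumed
```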
ESM2FineTuneSeqModel
Bases: ESM2Model
ESM2 model that is suitable for fine-tuning on downstream tasks.
Source code in bionemo/esm2/model/finetune/sequence_model.py
__init__(config, *args, post_process=True, include_embeddings=False, **kwargs)
Constructs an instance of the ESM2 model suitable for fine-tuning.
Source code in bionemo/esm2/model/finetune/sequence_model.py
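In BioNeMo the model is normally produced by the config rather than constructed by hand; the direct call below is a sketch that simply mirrors the documented signature (the config wiring is assumed):

```python
# Sketch mirroring the documented signature; in practice the model is usually
# built through the config (e.g. a configure_model hook, an assumption here).
model = ESM2FineTuneSeqModel(
    config,                    # a configured ESM2FineTuneSeqConfig
    post_process=True,         # attach the task head on the final pipeline stage
    include_embeddings=False,  # skip returning intermediate embeddings
)
```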
forward(*args, **kwargs)
Runs the forward pass; with post_process=True, the sequence-level task-head output is added to the base model outputs.
Source code in bionemo/esm2/model/finetune/sequence_model.py
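A hedged inference sketch; the keyword names follow the usual Megatron BERT forward signature, and the vocabulary size and output key for the head are assumptions:

```python
import torch

input_ids = torch.randint(0, 33, (2, 128))   # toy token IDs (vocab size assumed)
attention_mask = torch.ones_like(input_ids)

outputs = model(input_ids=input_ids, attention_mask=attention_mask)
# With post_process=True, outputs is expected to also carry the sequence-level
# regression value produced by MegatronMLPHead (exact key name not verified).
```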
MegatronMLPHead
Bases: MegatronModule
An MLP class for sequence-level regression.
Source code in bionemo/esm2/model/finetune/sequence_model.py
__init__(config)
Constructs the MLP head from the provided Megatron config.
Source code in bionemo/esm2/model/finetune/sequence_model.py
forward(hidden_states)
Applies the MLP head to the given hidden states and returns the regression output.
Source code in bionemo/esm2/model/finetune/sequence_model.py
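For intuition, here is an illustrative plain-PyTorch analogue of a sequence-level regression head. It is not the BioNeMo implementation: the layer sizes and dropout rate are assumptions, since the real class derives its shapes from the Megatron config.

```python
import torch

class MLPHeadSketch(torch.nn.Module):
    """Illustrative analogue of MegatronMLPHead; not the BioNeMo implementation."""

    def __init__(self, hidden_size: int, dropout: float = 0.1):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(hidden_size, hidden_size // 2),
            torch.nn.ReLU(),
            torch.nn.Dropout(dropout),
            torch.nn.Linear(hidden_size // 2, 1),  # single regression target
        )

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # hidden_states: pooled per-sequence representation, shape [batch, hidden]
        return self.net(hidden_states)
```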