I followed the guide here to train a RoBERTa-like model from scratch (using my own tokenizer and dataset). When loading, I get the message: "All model checkpoint weights were used when initializing RobertaForMaskedLM. All the weights of RobertaForMaskedLM were initialized from the model checkpoint at roberta-base." I would like to know whether this means I am not really training from scratch but starting from the roberta-base weights.
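For comparison, here is a minimal sketch of the difference (hypothetical config values; assumes `transformers` and `torch` are installed): constructing the model class from a fresh `RobertaConfig` gives randomly initialized weights and touches no checkpoint, whereas `from_pretrained("roberta-base")` loads the checkpoint, which is when the quoted message is printed.

```python
from transformers import RobertaConfig, RobertaForMaskedLM

# Hypothetical small config for a from-scratch model; pair vocab_size
# with the vocabulary of your own tokenizer.
config = RobertaConfig(
    vocab_size=32_000,
    hidden_size=256,
    num_hidden_layers=6,
    num_attention_heads=4,
    intermediate_size=1024,
)

# Random initialization -- no checkpoint is loaded, truly from scratch.
model = RobertaForMaskedLM(config)
print(f"{model.num_parameters():,} randomly initialized parameters")

# By contrast, this line (commented out here) would load the pretrained
# weights and print the "initialized from the model checkpoint at
# roberta-base" message you quoted:
# model = RobertaForMaskedLM.from_pretrained("roberta-base")
```

So if you see that message, somewhere in your script `from_pretrained` is being called with the `roberta-base` checkpoint rather than building the model from a config.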
I am using a RoBERTa question-answering model on Google Colab for the Tweet Sentiment Extraction problem, but the model fails to train because I get a resource-exhausted error: ResourceExhaustedError: OOM when allocating tensor with shape [32,16,128,64] ... roberta/encoder/layer_._17/attention/self/transpose (defined at
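A quick back-of-envelope check of the tensor that failed to allocate (a sketch; reading the [32,16,128,64] shape as [batch, heads, seq_len, head_dim] is an assumption, though 16 heads × 64 dims = hidden size 1024 matches a roberta-large-sized encoder, consistent with `layer_._17`):

```python
from math import prod

# Shape of the tensor from the OOM message: batch, heads, seq_len, head_dim.
shape = (32, 16, 128, 64)
bytes_fp32 = prod(shape) * 4            # float32 = 4 bytes per element
print(bytes_fp32 / 2**20, "MiB per activation tensor")

# One such tensor is only 16 MiB, but it recurs per layer (24 in a
# roberta-large-sized model) and several times per attention block, plus
# gradients and optimizer state. Halving the batch size roughly halves
# activation memory:
smaller = (16, 16, 128, 64)
print(prod(smaller) * 4 / 2**20, "MiB")
```

The usual fixes on Colab are to reduce the batch size (e.g. 32 → 8), shorten the maximum sequence length if your tweets allow it, or keep the effective batch size via gradient accumulation.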
I am trying to fine-tune "RobertaForQuestionAnswering" on my custom dataset, and I am confused about the input arguments it requires. Here is the example code: >>> from transformers import RobertaTokenizer, RobertaForQuestionAnswering >>> model = RobertaForQuestionAnswering.from_pretrained(
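The arguments that usually cause the confusion are `start_positions` and `end_positions`: during fine-tuning they are token indices of the answer span, not character offsets into the context. A minimal pure-Python sketch of that mapping, using a hypothetical per-token offset list like the one a fast tokenizer returns with `return_offsets_mapping=True`:

```python
# Map a character-level answer span [answer_start, answer_end) to
# token-level start/end positions. `offsets` is a hypothetical list of
# (char_start, char_end) pairs, one per token.
def char_span_to_token_span(offsets, answer_start, answer_end):
    start_token = end_token = None
    for i, (s, e) in enumerate(offsets):
        if start_token is None and s <= answer_start < e:
            start_token = i
        if s < answer_end <= e:
            end_token = i
    return start_token, end_token

# Toy example: context "the cat sat", answer "cat" at characters [4, 7).
offsets = [(0, 3), (4, 7), (8, 11)]   # one (start, end) pair per token
print(char_span_to_token_span(offsets, 4, 7))  # -> (1, 1)
```

Those token indices are what you feed as `start_positions`/`end_positions` (as tensors) alongside `input_ids` and `attention_mask` when computing the training loss; at inference you omit them and the model returns `start_logits`/`end_logits` instead.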