How can I run some inference on the MPT-7B language model?


Question

I wonder how I can run inference with the MPT-7B language model. The MPT-7B documentation page on Hugging Face doesn't mention how to run inference (i.e., given a few words, predict the next few words).

Answer 1

Score: 0


https://huggingface.co/mosaicml/mpt-30b provides example code for inference:

import torch
import transformers
from transformers import pipeline

# Load the model; trust_remote_code is required because MPT ships custom model code.
model = transformers.AutoModelForCausalLM.from_pretrained(
  'mosaicml/mpt-30b',
  trust_remote_code=True
)
# The tokenizer is stored alongside the model weights.
tokenizer = transformers.AutoTokenizer.from_pretrained('mosaicml/mpt-30b')
# Move the model to the GPU used by the 'cuda' calls below.
model.to('cuda')

# Option 1: tokenize the prompt and call model.generate directly.
with torch.autocast('cuda', dtype=torch.bfloat16):
    inputs = tokenizer('Here is a recipe for vegan banana bread:\n', return_tensors="pt").to('cuda')
    outputs = model.generate(**inputs, max_new_tokens=100)
    print(tokenizer.batch_decode(outputs, skip_special_tokens=True))

# Option 2: or using the HF pipeline
pipe = pipeline('text-generation', model=model, tokenizer=tokenizer, device='cuda:0')
with torch.autocast('cuda', dtype=torch.bfloat16):
    print(
        pipe('Here is a recipe for vegan banana bread:\n',
            max_new_tokens=100,
            do_sample=True,
            use_cache=True))

Just replace mpt-30b with mpt-7b if you wish to use MPT-7B.
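
For MPT-7B itself, here is a minimal sketch along the same lines. It assumes a single CUDA GPU, that the tokenizer files are bundled with the mosaicml/mpt-7b repo (otherwise the model card points to the EleutherAI/gpt-neox-20b tokenizer), and the torch_dtype=torch.bfloat16 argument is an optional memory-saving addition that is not part of the snippet above:

import torch
import transformers

# Standalone MPT-7B example: only the repo name changes from the snippet above,
# plus an optional bfloat16 load to reduce GPU memory use.
model = transformers.AutoModelForCausalLM.from_pretrained(
  'mosaicml/mpt-7b',
  torch_dtype=torch.bfloat16,  # optional: load weights in bfloat16
  trust_remote_code=True
).to('cuda')
tokenizer = transformers.AutoTokenizer.from_pretrained('mosaicml/mpt-7b')

inputs = tokenizer('Here is a recipe for vegan banana bread:\n', return_tensors='pt').to('cuda')
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=True)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])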
