LLM comparison site: comprehensive evaluation comparison of large models | overall leaderboard of current mainstream LLMs across evaluation datasets | DataLearner (数据学习)
LLM downloads: 互链高科
ClueAI/PromptCLUE-base-v1-5 at main (huggingface.co): supports multi-task generation and Chinese, but not multi-turn dialogue. Online demo: ClueAI (cluebenchmarks.com)
A model further trained on top of PromptCLUE-base: ClueAI/ChatYuan-large-v1 at main (huggingface.co): supports multi-task generation and Chinese, and handles simple dialogue.
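For reference, loading ChatYuan-large-v1 in Python looks roughly like the sketch below. It assumes the model exposes the standard T5 interface in transformers (PromptCLUE/ChatYuan are T5-based); the prompt text and generation settings are only illustrations, not the model card's exact recipe.
# Rough sketch: load ChatYuan-large-v1 through the standard T5 classes in transformers.
from transformers import T5Tokenizer, T5ForConditionalGeneration

MODEL = "ClueAI/ChatYuan-large-v1"
tokenizer = T5Tokenizer.from_pretrained(MODEL)
model = T5ForConditionalGeneration.from_pretrained(MODEL)

# Illustrative single-turn Chinese instruction prompt.
prompt = "用户:帮我写一首关于春天的诗\n小元:"
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(inputs["input_ids"], max_new_tokens=128, do_sample=True, top_p=0.9)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))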
Downloading models from Hugging Face:
Manual download: https://mirrors.tuna.tsinghua.edu.cn/hugging-face-models/hfl/
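If you prefer to script the manual download, something like the sketch below works. The repo name and file list here are only examples (the hfl/ path comes from the mirror URL above); check the mirror's directory listing in a browser first and adjust to the files actually present.
# Rough sketch: fetch individual model files from the Tsinghua mirror.
import os
import urllib.request

MIRROR = "https://mirrors.tuna.tsinghua.edu.cn/hugging-face-models"
REPO = "hfl/chinese-roberta-wwm-ext"                        # example repo under hfl/
FILES = ["config.json", "vocab.txt", "pytorch_model.bin"]   # typical file names, may differ per repo

os.makedirs(REPO, exist_ok=True)
for name in FILES:
    url = f"{MIRROR}/{REPO}/{name}"
    print("downloading", url)
    urllib.request.urlretrieve(url, os.path.join(REPO, name))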
Download via code:
import llama  # unofficial LLaMA port exposing LLaMATokenizer / LLaMAForCausalLM

# Pick a model size; the local path is an alternative if the weights are already on disk.
# MODEL = '/home/guo/llama_test/llama_model'
MODEL = 'decapoda-research/llama-7b-hf'
# MODEL = 'decapoda-research/llama-13b-hf'
# MODEL = 'decapoda-research/llama-30b-hf'
# MODEL = 'decapoda-research/llama-65b-hf'

# Load tokenizer and model through the Tsinghua mirror; mirror='tuna' is the shorthand form.
# tokenizer = llama.LLaMATokenizer.from_pretrained(MODEL, mirror='tuna')
# model = llama.LLaMAForCausalLM.from_pretrained(MODEL, mirror='tuna', low_cpu_mem_usage=True)
tokenizer = llama.LLaMATokenizer.from_pretrained(MODEL, mirror='https://mirrors.tuna.tsinghua.edu.cn/hugging-face-models')
model = llama.LLaMAForCausalLM.from_pretrained(MODEL, mirror='https://mirrors.tuna.tsinghua.edu.cn/hugging-face-models', low_cpu_mem_usage=True)

# Run on CPU and print a short completion from a test prompt.
model.to('cpu')
batch = tokenizer("Yo mama", return_tensors="pt")
print(tokenizer.decode(model.generate(batch["input_ids"], max_length=100)[0]))
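A side note: the mirror= keyword in the snippet above comes from older transformers releases and may not be accepted by current ones. On a recent transformers (4.28 or later) the same flow can be written with the built-in Llama classes; a minimal sketch, assuming the converted weights are already on disk (the local path reuses the placeholder from the commented line above):
# Sketch: load a converted LLaMA checkpoint with the Llama classes shipped in transformers >= 4.28.
from transformers import LlamaTokenizer, LlamaForCausalLM

MODEL_DIR = '/home/guo/llama_test/llama_model'  # local copy of the weights (placeholder path)
tokenizer = LlamaTokenizer.from_pretrained(MODEL_DIR)
model = LlamaForCausalLM.from_pretrained(MODEL_DIR, low_cpu_mem_usage=True)

batch = tokenizer("Yo mama", return_tensors="pt")
print(tokenizer.decode(model.generate(batch["input_ids"], max_length=100)[0]))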
For pointing git clone at a GitHub mirror accessible from mainland China, see: git clone 换源 / GitHub 国内镜像 (面里多加汤, CSDN blog).
https://gitclone.com
# Server located in Hangzhou (currently working)
Usage: original git URL: https://github.com/junegunn/vim-plug
Mirrored clone URL: https://gitclone.com/github.com/junegunn/vim-plug (a small script automating this rewrite follows below)
# Hong Kong server https://doc.fastgit.org is currently unavailable
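To script the rewrite, prepend the mirror host to the original GitHub URL before cloning. A small sketch (the helper name is my own; it just reproduces the URL pattern shown above):
# Sketch: clone a GitHub repo through the gitclone.com mirror by rewriting the URL.
import subprocess

def clone_via_mirror(github_url: str, mirror_host: str = "https://gitclone.com") -> None:
    # https://github.com/junegunn/vim-plug -> https://gitclone.com/github.com/junegunn/vim-plug
    mirrored = github_url.replace("https://github.com/", f"{mirror_host}/github.com/")
    subprocess.run(["git", "clone", mirrored], check=True)

clone_via_mirror("https://github.com/junegunn/vim-plug")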
That concludes this comparison of already open-sourced Chinese large language models; the list will continue to be updated.