Academic Paper

RakutenAI-7B: Extending Large Language Models for Japanese
Document Type: Working Paper
Source:
Subject: Computer Science - Computation and Language; Computer Science - Machine Learning
Language: English
Abstract:
We introduce RakutenAI-7B, a suite of Japanese-oriented large language models that achieve the best performance on the Japanese LM Harness benchmarks among the open 7B models. Along with the foundation model, we release instruction- and chat-tuned models, RakutenAI-7B-instruct and RakutenAI-7B-chat, respectively, under the Apache 2.0 license.
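
As a minimal sketch of how the released checkpoints might be used, the snippet below loads the instruction-tuned model with Hugging Face transformers and generates a short Japanese completion. The repository ID Rakuten/RakutenAI-7B-instruct and the sample prompt are assumptions for illustration, not details taken from the abstract.

```python
# Sketch: loading a released RakutenAI-7B checkpoint with Hugging Face transformers.
# The repo ID below is an assumption for illustration purposes.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Rakuten/RakutenAI-7B-instruct"  # assumed Hugging Face repository ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half-precision to fit a 7B model on a single GPU
    device_map="auto",
)

# Generate a short Japanese completion with the instruction-tuned model.
prompt = "日本の首都はどこですか？"  # "What is the capital of Japan?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```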