Riiid has announced two new AI research studies as it continues to extend its AI capabilities with NLP technology to build a broad range of new educational content-aware AI models.
Riiid is currently focused on creating dialogue systems, such as chatbots, that can deliver meaningful results even with a limited amount of labelled and structured dialogue data, by leveraging very large language models. Using only 1 per cent of the training data, Riiid’s new method achieved 66 per cent of the full-data performance, whereas other approaches achieved less than 60 per cent.
“Dialogue State Tracking (DST) is an essential element of task-oriented dialogue systems but is infamous for its expensive and difficult data collection process. Our study proposes a new method that reformulates DST as dialogue summarisation, minimising the discrepancy between pre-training and fine-tuning that typically occurs. This study can serve as a proof of concept that Riiid can develop new educational features, such as AI Tutors with chatting capabilities, at a much lower cost and with much higher efficiency and accuracy,” said Jay Shin, AI Research Scientist, Riiid.
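In broad terms, treating DST as summarisation means asking a pre-trained sequence-to-sequence model to produce a natural-language summary of the conversation and then reading the dialogue state off that summary. The sketch below illustrates the idea with the Hugging Face transformers library; the checkpoint name, the example dialogue and the decoding settings are illustrative assumptions rather than details from Riiid’s study.

```python
# Sketch: dialogue state tracking reframed as dialogue summarisation.
# Assumes the Hugging Face `transformers` library; the checkpoint and the
# example dialogue are placeholders, not those used in Riiid's research.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "facebook/bart-large-cnn"  # a generic summarisation checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

dialogue = (
    "User: I need a cheap hotel in the city centre for two nights. "
    "System: The Alpha Guesthouse is a cheap hotel in the centre. "
    "User: Great, please book it for Friday."
)

# Encode the dialogue and generate a summary. In the reformulated task the
# model would be fine-tuned so that its summaries follow a fixed template,
# from which the dialogue state can later be extracted heuristically.
inputs = tokenizer(dialogue, return_tensors="pt", truncation=True)
summary_ids = model.generate(**inputs, max_length=60, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

Because the task is expressed as summarisation, fine-tuning reuses the same text-generation objective the model saw during pre-training, which is where the reduced pre-train/fine-tune discrepancy comes from.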
Riiid researchers derived rule-based summary templates from dialogue states and guided the summarisation model to conform to these templates. By applying heuristic dialogue state extraction to the generated summaries, the researchers were able to build the strongest DST model in the limited-label scenario, using only 1 per cent of the training data.
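The approach therefore has two moving parts: a rule-based template that renders a dialogue state as a summary (used as the fine-tuning target) and a heuristic that extracts the state back out of a generated summary. The sketch below shows both directions; the slot names, template wording and extraction pattern are hypothetical and only illustrate the mechanism, not the templates used in the study.

```python
import re

# Hypothetical slots and a rule-based summary template. The study's actual
# templates and slot ontology may differ; this only illustrates the idea.
TEMPLATE = "The user is looking for a {price} hotel in the {area} for {stay} nights."
PATTERN = re.compile(
    r"looking for a (?P<price>\w+) hotel in the (?P<area>[\w ]+) for (?P<stay>\w+) nights"
)

def state_to_summary(state: dict) -> str:
    """Render a dialogue state as a template-conforming summary (training target)."""
    return TEMPLATE.format(**state)

def summary_to_state(summary: str) -> dict:
    """Heuristically extract the dialogue state back out of a generated summary."""
    match = PATTERN.search(summary)
    return match.groupdict() if match else {}

state = {"price": "cheap", "area": "city centre", "stay": "two"}
summary = state_to_summary(state)
print(summary)                    # the template summary used to fine-tune the summariser
print(summary_to_state(summary))  # {'price': 'cheap', 'area': 'city centre', 'stay': 'two'}
```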
GRAM, the model from the company’s second study, will be deployed on the A/B testing platform of Santa, Riiid’s English proficiency test (TOEIC) prep solution and the best-selling AI-based smartphone application in Japan and Korea. With 300 new questions added to Santa every month, the company expects its latest model to deliver a new level of personalised learning experience for users.