Automatic Essay Scoring

Lingnan Dai presented his Part II project on Automatic Essay Scoring.

Abstract: One interesting application of Natural Language Processing today is Automatic Essay Scoring (AES), the technology that automatically evaluates and scores the quality of writing. The goal of AES is to build models that evaluate writing as reliably as human readers, while offering advantages over manual marking such as consistent application of marking criteria and faster assessment. I'll briefly introduce the usual procedure underlying such a system, covering textual parsing, feature extraction and machine learning, and give examples of different approaches currently deployed in commercial and academic applications. This will be followed by a more in-depth look at the entity-based coherence model I'm developing in my Part II project and the evaluation schemes used to test an AES system.
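A common formulation of entity-based coherence is the entity grid, which records each entity's grammatical role (subject, object, other mention, or absent) in each sentence and uses role-transition probabilities as features for a scorer. The following is a minimal sketch of that idea, not the speaker's actual implementation; the role-tagged input format and the toy `doc` example are illustrative assumptions, since in practice roles would come from a syntactic parser.

```python
from collections import Counter
from itertools import product

ROLES = ["S", "O", "X", "-"]  # subject, object, other mention, absent

def entity_grid(sentences):
    """Build an entity grid: one row per sentence, one column per entity.
    `sentences` is a list of {entity: role} dicts, assumed to be produced
    by an upstream parser (roles in S/O/X)."""
    entities = sorted({e for sent in sentences for e in sent})
    return [[sent.get(e, "-") for e in entities] for sent in sentences]

def transition_features(grid):
    """Relative frequency of each role-to-role transition between adjacent
    sentences, pooled over all entity columns -- the feature vector an
    entity-grid coherence model feeds to a classifier or regressor."""
    counts = Counter()
    total = 0
    for prev, cur in zip(grid, grid[1:]):
        for a, b in zip(prev, cur):
            counts[(a, b)] += 1
            total += 1
    return {t: (counts[t] / total if total else 0.0)
            for t in product(ROLES, repeat=2)}

# Toy three-sentence document (hypothetical parse output).
doc = [
    {"Microsoft": "S", "software": "O"},
    {"Microsoft": "S", "markets": "X"},
    {"products": "S"},
]
grid = entity_grid(doc)
feats = transition_features(grid)
```

A coherent essay tends to keep salient entities in prominent roles across adjacent sentences, so transitions like S-to-S carry more weight than, say, absent-to-absent; the feature vector makes that pattern available to a machine-learned scorer.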