This project lays the foundations for automated analysis of argumentation schemes, supporting the identification and classification of the arguments made in a text, with the goal of scoring the quality of written analyses of arguments. We developed annotation protocols for 20 argument prompts from a college-level test under the framework of argumentation scheme theory, which defines reasoning patterns in argumentation. The protocols list the critical questions associated with each argumentation scheme, making the argument structure in a text explicit and classifiable. We then annotated 200 student essays across four selected argument prompts to test whether the protocols can be applied reliably by human annotators. Preliminary results indicate that this method of analyzing argument structure is reliable and promising.