CSC148 Spring 1996: Comments from the Markers on Assignment #2
===========================================================================

Read this feedback carefully, and try to use it to improve your future
assignments.

Q1. Defining the specs:
------------------------
Generally well done, although some students' specs did not contribute at
all to distinguishing between the mazes in the two figures. In general,
the simpler the spec was, the fewer problems it had.

Q2. Maze checker program:
--------------------------
Some people included no internal documentation (what important loops were
doing, etc.), so at times it was impossible to tell quickly what exactly
was going on. The most annoying part was unannotated output: often there
were up to 50 pages of output with one printed line on each page and no
indication of what it was for. Many students ran a lot of tests but
forgot to test the original cases from Q1.

*** SUGGESTION: Use the "Printing Turing Assignment" command in the OOT
environment. It produces a nice package with all of your test runs, saves
paper, and uses smaller fonts that avoid line wrapping.

A handful of people did not discuss the limitations of their program.

Q3. Ambiguity:
---------------
This was in general poorly done. Most students' answers were still
ambiguous, and a lot of people did not even refer to the specs in their
answers. Suggestion: Be sure to look carefully at this part of our
solution, and ask us if you don't understand it.

Q4. Maze builder program:
--------------------------
Same as Q2. Those who did the best did a thorough job of convincing me
that their program worked. Others did not have a complete test suite and
were marked accordingly.

Q5. Data structure:
--------------------
Well done in general. Some students did not understand what "data
structure" means -- they described their algorithms instead.

Q6. Users' guide:
------------------
Well done in general. For this program, a step-by-step guide seems to be
the clearest and most effective way to guide the user; few were able to
articulate the instructions clearly in prose. Most people forgot to
discuss what happens with invalid input. In many cases, the User's Guide
was not written for the "user", which is the appropriate style for this
document; instead it was a reiteration of question 1.

Quality of Writing:
-------------------
This could be improved. Simply proofreading would help immensely --
obviously, many people didn't proofread. Some students also had
grammatical problems.