While reminiscing about the good old days at university, for some unknown reason I felt a sudden urge to rummage through my PC files containing stuff related to my studies. Thus I stumbled upon my lengthy MA thesis and realized how distant the work appeared – almost as if written by somebody else. I caught myself judging the writing style, which seemed so terribly alien. I remembered all the rules they had told us to stick to when writing a piece of academic text, and I also recalled how desperate I had sometimes felt when searching for a suitable synonym or linking word.
The aim of my thesis was to analyze a short story by Ernest Hemingway called Cat in the Rain through the lens of an EFL teacher. Long story short, among other things I had used several online readability tools to see whether the story was suitable for an intermediate English learner. When scanning the paper, I couldn’t help revisiting some of the online tools I had once looked at from a teacher’s perspective, and it occurred to me that they might well be used by language learners too. I’d like to show that online readability tools can not only assess the quality of one’s writing but also help to improve it.
Just for fun, I pasted in a few samples of my blog posts to see the reading and grade level of the texts (see Figure 1). I chose the Text Readability Consensus Calculator, which uses seven popular readability formulas to calculate the average grade level, reading age, and text difficulty of my sample text.
Figure 1
Figure 2 shows that the Flesch Reading Ease score of the sample is 63.1, which is labelled as standard (or average). According to the authors of the website, anything between 60 and 70 is considered an acceptably demanding piece of text. However, my text was scored as “hard to read” by the Gunning Fog index. A fog index of 12 corresponds to the reading level of a U.S. high school senior (about 18 years old); a text which scores above 12 is simply too hard for most people to read. The Flesch-Kincaid Grade Level shows that the text can be easily read by an average U.S. student in 9th grade.
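For the curious, the three formulas behind these scores are published and easy to compute by hand. Below is a minimal Python sketch of them; the function names are my own, and the counts (words, sentences, syllables, complex words) are supplied directly, since syllable counting is the part real tools disagree on.

```python
def flesch_reading_ease(words, sentences, syllables):
    """Flesch Reading Ease: higher means easier; 60-70 is 'standard'."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words, sentences, syllables):
    """Flesch-Kincaid Grade Level: an approximate U.S. school grade."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

def gunning_fog(words, sentences, complex_words):
    """Gunning Fog index; 'complex' words have three or more syllables."""
    return 0.4 * ((words / sentences) + 100 * (complex_words / words))

# A hypothetical 100-word sample: 5 sentences, 150 syllables, 10 complex words.
print(round(flesch_reading_ease(100, 5, 150), 1))   # 59.6 -- just below the 60-70 'standard' band
print(round(flesch_kincaid_grade(100, 5, 150), 2))  # 9.91 -- roughly 9th/10th grade
print(round(gunning_fog(100, 5, 10), 1))            # 12.0 -- U.S. high school senior level
```

Note how the same text lands at different difficulty levels depending on the formula, which is exactly why the Consensus Calculator averages across several of them.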
Enough of numbers. Now on to the practical part: how can this knowledge help me improve my writing? What first comes to mind is the question: Who am I actually writing for? Am I writing for kids, for like-minded people, for a general audience, for an academic audience, etc.? This should determine where the readability scores should fall. One thing I learnt about Ernest Hemingway was that his style was, at first sight, fairly plain – he favoured short words and sentences, and he used a lot of lexical and grammatical repetition. His writing style felt like that of a child, even though he was clearly writing for an adult audience. That’s why his way of writing was sometimes criticized, but it was also appreciated for its simplicity and uniqueness. One thing is certain: it met his aim. Does my own writing exhibit the same effectiveness and focus? If not, I should probably start working on changes.
Figure 2
As Figure 3 shows, there are other tools embedded in the Text Readability Consensus Calculator which can help me analyze my text. I can look at the Word Statistics overview, which reveals several things that I deem very important. Put simply, the statistics show 1) how many unique words I used, 2) how many and which words I used repeatedly, and 3) how long the words were. Again, if I’m writing for kids, I’m not going to use twenty 4-syllable words per sentence. On the other hand, my academic sample should undoubtedly look a bit more ‘sophisticated’.
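Statistics like these are easy to reproduce yourself. Here is a rough Python sketch (my own simplified tokenizer, not the site’s actual one) that reports total and unique word counts, repeated words, and average word length:

```python
import re
from collections import Counter

def word_stats(text):
    """Rough word statistics for a text: total and unique counts,
    repeated words, and average word length."""
    # A naive tokenizer: lowercase letters and apostrophes only.
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    return {
        "total": len(words),
        "unique": len(counts),
        "repeated": [w for w, n in counts.most_common() if n > 1],
        "avg_length": sum(len(w) for w in words) / len(words),
    }

stats = word_stats("The cat sat on the mat.")
print(stats["total"], stats["unique"], stats["repeated"])  # 6 5 ['the']
```

Even this crude version is enough to spot whether a draft leans on a handful of words or spreads its vocabulary around.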
Figure 3
Back at school, I remember we were discouraged from repeating the same words again and again in our written assignments. I can’t say I disagree; learners should be motivated to look for synonyms and other alternative ways of expressing the same idea, which ultimately helps them enlarge their vocabulary. However, repetition is often beneficial and even vital if we want our text to sound cohesive. This doesn’t mean repeating the same noun or adjective whenever we refer to the same thing, but occasional repetition can serve our purpose. For example, writers should be trained to refer back to something they have already said, but also forward to what they are planning to say later. This helps to make their writing well-structured and thus more readable, and, in the case of L2 learners, it can also demonstrate the necessity of planning one’s writing.
Figure 5
All of the above leads me to believe that students should keep a record of their written assignments. Blogs and online journals seem ideal for this purpose, enabling students to compare what they wrote some time ago with what they’ve just produced. Students can see how much progress they have made, which automatically increases their motivation. They can also compare their writing with somebody else’s piece of work of the same genre and look at the differences. Also, when you are a non-native speaker of the language constantly working on your vocabulary, the statistics generator is a great tool for revising words which you once used – perhaps in order to sound more ‘native-like’ – but which you’ve forgotten since. But more importantly, and no matter whether you are a native or non-native speaker of the language, the tool can draw attention to words which you tend to overuse. We all have our favourites which we can’t get rid of, don’t we?
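As a sketch of what such an ‘overuse detector’ might look like (the stop list and function name here are my own invention, not a feature of the site), a simple frequency count with common function words filtered out already gets you most of the way:

```python
import re
from collections import Counter

# A tiny illustrative stop list; a real tool would use a much fuller one.
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "it",
             "that", "i", "my", "for", "on", "with", "was", "be"}

def overused_words(text, top=5):
    """Return the most frequent content words -- candidates for pruning."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return counts.most_common(top)

sample = ("I really enjoyed the film. It was really gripping, "
          "and the ending was really, really clever.")
print(overused_words(sample, top=1))  # [('really', 4)]
```

Running a few of your own posts through something like this is a quick, slightly humbling way to find the favourites you can’t get rid of.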
Although this may seem like a post aimed at fairly advanced L2 learners, I believe it could work at lower levels as well. Measuring the readability scores of written assignments produced by pre-intermediate students, for example, can clearly demonstrate that it takes little to spice up one’s writing. By extending short sentences and adding adjectives and adverbs where they were previously missing, we can show our students how to make their writing more ‘advanced’ or native-like, if that is their aim.