[clug] Man pages.

Lana Brindley mail at lanabrindley.com
Mon Jul 27 22:38:03 UTC 2015


On 28/07/15 06:26, Mike Carden wrote:
> On Tue, Jul 28, 2015 at 12:04 AM, Scott Ferguson <
> scott.ferguson.clug at gmail.com> wrote:
>
>> On 27/07/15 12:49, Lana Brindley wrote:
>>> On 27/07/15 11:40, Scott Ferguson wrote:
>> At best readability tests ensure you avoid common grammar mistake and
>> help keep documents free of unnecessarily long words out of
>> documentation.
>
> There has t be some law of the Internetz that requires you to make those
> two errors in a sentence discussing the difficulties encountered when
> trying to write properly.  :)
>

It's called Muphry's Law*

I have read all the responses with interest. Thank you to everyone who
wrote to expand on the topic; I really appreciate it :)

The way I see it is very similar to what Scott said in his
second-to-last post:

> To generalise - those readability tests determine how easy it is to
> understand based on the size of words.
>
> Which is a good indicator until applied to writing that uses acronyms,
> program names, products, and processes. The use of those increase in
> technical areas. 

Readability tests are great for general audience writing. If you were
writing a YA novel, or a blog post about fishing, or have translated
something from another language into English, I'd say go right ahead.
For tech comms (including man pages), they're largely irrelevant.
Technical communicators use a process called 'audience analysis' to
determine how they should write each piece of documentation they
create. This involves determining their audience's age, education
level, base level of technical proficiency, and other demographic info.
It also involves determining what tasks the audience is likely to be
performing, and under what conditions they're performing them.

One of my favourite examples of getting this wrong comes from JoAnn
Hackos (a respected writer in this field). A previous contractor was
asked to create a manual for a team of electrical tradesmen whose work
was primarily carried out at the top of telegraph poles. That contractor
produced a beautifully written and laid-out manual, in the form of a
large binder. Of course, when you're working up a telegraph pole, a
large binder isn't particularly accessible. JoAnn eventually produced an
electronic version of the document to resolve the issue. The point here
being: it doesn't matter how good your writing is, you still need to
consider how real people will use it.

So, I guess this is probably just professional pride getting in the way,
more than anything else. We spend a lot of time determining exactly how
to write to our readers, and a readability score is never going to
accurately reflect what we do. If you are doing some writing and you're
*not* a professional, you can certainly use Flesch-Kincaid or something
similar as an indicator if you wish, but don't put too much stock in it.
To me, it's no more useful a tool than the Word grammar checker; both
are unlikely to make your writing better, and in many cases can make it
worse.
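For anyone curious why these scores track word size so closely: the
Flesch Reading Ease formula is just arithmetic over average sentence
length and average syllables per word. Here's a rough sketch in Python
(the naive vowel-group syllable counter is my own approximation, not
part of any official test):

```python
import re

def count_syllables(word):
    # Crude approximation: one syllable per run of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    # Flesch Reading Ease:
    #   206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words)
    # Higher scores mean "easier" text.
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / sentences)
            - 84.6 * (syllables / len(words)))

# A short, monosyllabic sentence scores as very "easy"...
print(flesch_reading_ease("The cat sat on the mat."))
# ...while jargon-heavy prose scores as "hard", even if its
# audience reads it effortlessly.
print(flesch_reading_ease(
    "Virtualisation infrastructure necessitates "
    "comprehensive interoperability documentation."))
```

You can see the problem for tech comms straight away: a sentence full
of unavoidable product names and acronyms tanks the score regardless of
how clearly it's written.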

One free internet to every person who picks out a typo in that little
screed :P

Interestingly, a friend of mine has just started blogging on this exact
topic too:
http://deenacooper.com/index.php/2015/06/25/preschoolers-on-tech-writing/

L

* https://en.wikipedia.org/wiki/Muphry%27s_law

-- 
Lana Brindley
writer:speaker:blogger
http://lanabrindley.com

"Quantum mechanics: The dreams stuff is made of."



