The Reality War



A new time travel, action-adventure series begins with The Reality War Book 1: The Slough of Despond, out now.

Kindle (Free*) | ePub: Kobo (free) | iTunes (free) | Smashwords (free) | Nook (free) | Sony (free)

Paperback: US | UK   *Book 1 free on 31 Jan 2013. Check the price before purchase.

Click here for further details.

Book 2 is now available for Kindle, and out now in paperback and ePUB eBook formats. Click here for further details.


Tips for self-publishers: How to publish your back catalog

I’ve worked with a number of authors who have a back catalog of traditionally printed books for which the rights have now reverted to them. This throws up a number of problems when they wish to republish their books, whether as eBooks, paperbacks or both.

If you have your original Word or other word processing document, then your task is much easier, but take care with copy editing. Most commonly the document file you have is what was sent to the publisher before final copy editing. In other words, you need to go through copy editing again. Even if the bulk of copy editing was contained within your Word document, it’s common for a few last-minute changes to have been made at the publisher’s end.

If your publisher gave you a PDF of the finished book, then this carries its own problems. You can convert a PDF to Word format, but the result is not pretty. Unless you are publishing a paperback of the same trim size (page dimensions) as the publisher’s PDF, this will require a lot of work to knock into an acceptable state before publishing. Still, it gives a better result than the last resort: scanning.

Scanning is the most common way to re-publish an old book in my professional experience. Take a paperback, scan the pages using an OCR scanner (Optical Character Recognition), assemble the scans into a single Word document, tidy, format, republish.

This sounds simple; after all, most inkjet printers can do OCR scanning these days. But scanning isn’t as easy as it looks, and even the best scanning will leave many difficult-to-spot errors.

The first scanning task is to turn the printed version of your book into a single Word document (or other word processor). The best way is to pay a professional to do this. Google for ‘book scanning’ services in your country and get some quotes. You’re looking here for a service provided by a printing company. If you have the time and patience you can do this yourself with a cheap multi-function printer, but expect the professionals to do a faster and better job of it.

Even a professional job will still be loaded with errors. For example, suppose you have a character called ‘Saul’. The OCR software will have a very hard time telling the difference between ‘Saul’ and ‘Soul’; most likely you will get a random mixture of both. Your spell checker will not complain about either, so that won’t help. Most problems can be identified by reading aloud, or by converting to an eBook format and getting your iPad or Kindle or whatever to read to you. But in the case of ‘Saul’/‘Soul’, even that won’t help.
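One way to hunt down this kind of error is to count each member of a confusable pair and eyeball the odd ones out. Here is a minimal Python sketch of my own (not part of any scanning tool); the word pairs are illustrative only, so build your own list from the names and vocabulary of your book:

```python
import re

# Count each member of a confusable pair so the odd ones stand out.
# The pairs below are illustrative; OCR often confuses 'rn' and 'm', too.
CONFUSABLE_PAIRS = [("Saul", "Soul"), ("modem", "modern")]

def report_confusables(text, pairs=CONFUSABLE_PAIRS):
    """Return a count of every spelling in every confusable pair."""
    counts = {}
    for pair in pairs:
        for word in pair:
            counts[word] = len(re.findall(r"\b%s\b" % re.escape(word), text))
    return counts

print(report_confusables("Saul sighed. Soul music played. Saul smiled.",
                         [("Saul", "Soul")]))
# -> {'Saul': 2, 'Soul': 1}
```

If one spelling should never appear in your book at all, any non-zero count is a place to investigate.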


Jeff Noon is one of the authors I worked with on their back catalog. Jeff proofread very thoroughly before I saw the manuscripts, which helped enormously. Click on the image to see Jeff’s new eBook editions.

And a more amateurish job will be laden with spelling errors for you to address. The OCR software will struggle to differentiate between the number ‘1’ and the lower-case letter ‘l’. ‘6’ and ‘b’ may look the same, depending on the font. It may decide an opening double quote is actually a superscript ‘m’ and a closing double quote is a superscript ‘3’.

The best approach to publishing a scanned-in book is to accept right from the beginning that tidying it is a lengthy task. Use a variety of proofing techniques at the start, and then expect to use beta or proof readers to pick up the few examples you missed.

Here are some techniques to use:

  • Distrust every use of a numeral. Check every use of the number ‘1’, then search for every use of ‘2’, and so on.
  • Investigate every single spelling error reported by your word processor (usually shown with a red underline in Word). If you are certain the word is correct but isn’t in Word’s dictionary, click the option to ‘add to dictionary’.
  • Investigate every grammar error (usually shown with a green underline in Word). Yes, I know this is tedious: Word will report scores of grammar errors where you know better. Sometimes when there is an error in your manuscript, Word can’t identify it directly, but knows something isn’t right and so flags a grammar error anyway. In other words, when the grammar checker finds a genuine error, the rule it claims has been broken is usually nonsense, but if you look deeper into the sentence, there is a real error lurking underneath. Remember the example above of ‘Saul’/‘Soul’? There is a good chance that the grammar checker will spot this.
  • Investigate every suggested word. Word 2007 started putting blue squiggly lines under words it thinks you might have mistaken and suggests what you should have used instead. For example it’s and its. With each new edition of Word this seems to get more accurate. You should be looking at these in any case, but if you’re scanning in a book, go through all the blue squiggles now.
  • Get a computer to read the results back. There are various ways to do this. The easiest is on an eReader such as Kindle or iPad/iPhone. Check your manual to see whether your device manages text-to-speech. To transfer your book to your eReader, do this:
    • On your computer, download a free eBook management tool called Calibre.
    • Save your manuscript from Word as html format.
    • From Calibre, Add Book. Browse to the html file you saved and add that.
    • Convert the book to the required format. MOBI for Kindles and ePUB for everything else.
    • Connect your device to your computer using your USB cable.
    • Once Calibre has detected your device, right click the book on your Calibre library screen and ‘send to main memory’ on your device.
    • From Calibre, eject your device.
    • Disconnect your eReader and set your device’s text-to-speech option running.
    • Doing this with Apple tablets and phones doesn’t always work: Apple wants you to do everything through iTunes. I use Dropbox to send files to my iPad, but you could email them to yourself.
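The numeral checks in the list above can be partly automated. Here is a Python sketch of my own (not a Word feature) that flags ‘words’ mixing digits with letters, which in a scanned book is usually OCR damage such as ‘1’ read for ‘l’ or ‘6’ read for ‘b’:

```python
import re

# Flag tokens that mix digits with letters - likely OCR damage.
# (Legitimate forms such as '2nd' will also be flagged for review.)
SUSPECT = re.compile(r"\b\w*[A-Za-z]\d\w*\b|\b\d[A-Za-z]\w*\b")

def suspicious_words(text):
    """Return candidate OCR errors: tokens mixing letters and digits."""
    return [m.group() for m in SUSPECT.finditer(text)]

print(suspicious_words("The o1d man had 2 hats and a 6oat."))
# -> ['o1d', '6oat']
```

A standalone numeral like the ‘2’ above is left alone; only mixed tokens are reported, so the list to review stays short.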

The Takeaway from this Post

  • If you have the rights, then taking back control of your back catalog and self-publishing can be tremendously fulfilling.
  • However, don’t underestimate the work required to get your old books up to scratch, especially if you have to scan them in.

Follow this link to my other writing and publishing tips

This article was adapted from ‘Format Your Print Book for Createspace: 2nd Edition’, available now as a Kindle eBook, and as a 296-page paperback:

eBook: |

Paperback |


Grey DeLisle: Interview with a Voice Actress Extraordinaire!


Wow! Can’t believe I missed Armand getting to interview the marvelous Grey DeLisle, the voice of Daphne Blake in the 21st century.

Originally posted on Inezian's Notes.:

Grey with the extraordinary cast of characters that she’s voiced.

For my third Granite State Comicon interview, I am starstruck to be hosting the interview with voice actress Grey DeLisle! Grey began her entertainment career as a stand up comic and singer (releasing a number of albums), but has more recently turned her talents to voice acting, with a stunning list of credits. If you or your children watch cartoons, chances are good that you’ve seen (or heard) some of her work in shows like Fairly Odd Parents, Scooby Doo, Handy Manny, and the Penguins of Madagascar. She has also done extensive voice work for major video game releases.

Q: For many of us, the careers of voice actors are a bit of mystery. It’s sort of like trapeze artists or cruise ship captains. Obviously, someone does those jobs, but we often wonder: How did they get there in the…

View original 531 more words


Snot Wizards and alien editors. I’ve been interviewed!

Just to prove I do write fiction books too sometimes, I’ll point your way to Inezian’s Notes where I’ve been interviewed about my YA books.  Scooby Doo gets a mention, somewhat unexpectedly, and I think 2000AD comes up at some point too (it usually does). Oh yes, and there’s the snot wizard.

Here’s some Wattpad advertising I’ve dug up to give you a feel for the books. Yes, I know they don’t look exactly slick, but I was just having a little fun.






Kindle support for Unicode pt2: how to use Unicode

Last time I posted about how Kindles can support Unicode, despite rumors to the contrary. This time I’m going to give some practical advice on what Unicode is, and how, if you are self-publishing eBooks, you can use it in your books.

It’s a fairly long post, so here’s a:

Cut-out-and-keep executive summary

  • Kindle support for Unicode is very good, although weaker with the earlier models (I mean specifically Kindle 1, Kindle 2 and Kindle DX; Kindle 3 support is excellent).
  • You need to state somewhere in the file you upload that it is encoded in UTF-8. I explain how to do this when saving an html file from Word and uploading to KDP, and also for people working directly with html or xhtml.
  • Support for Unicode with ePUB readers is so much weaker than with Kindle devices that if you use Unicode glyphs, you need to assume that they won’t show up for some people.
  • The world of eBooks is moving fast. I wrote this post in December 2013. I expect UTF-8 to remain a dominant encoding system for many years to come, but the idiosyncrasies of Amazon KDP and ePUB support are likely to change more rapidly, so this post will date.

What is Unicode and why should you care?

Since the earliest days of computers and telecommunication there’s been a need for a standard way to encode text. Documents are stored digitally as bits and bytes: numbers, essentially. So what number or numbers represent an upper-case ‘A’, and what number or numbers represent a dollar symbol? If Computer A wants to send a document to Computer B, then both computers need to agree on the same encoding system, otherwise what looks good on Computer A will look like gibberish on Computer B.

What we require is an independent standards body to define the encoding system and the codes within it. One of the important early standards (from the 1960s) was ASCII 127. The coding system represented each character as a seven-bit binary number, giving 128 possible ‘code points’ (0 to 127), and a published table defined which code point corresponded to which character. With so few code points, and some of them used for control characters (like one to ring the bell: ASCII 127 was used with teletypes), there wasn’t room for some common characters. We get upper-case and lower-case ‘A’ through ‘Z’, but we don’t get any accented characters. We get numbers and basic mathematical operators. We get a dollar symbol, but we don’t get a pound (£) sign. We get a basic ‘typewriter’ apostrophe and quotes, but we don’t get the proper curled versions (what Microsoft calls smart quotes) that have always been the norm in books and magazines.
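You can poke at the old ASCII range directly from Python, since Unicode kept the first 128 code points identical to ASCII:

```python
# The first 128 Unicode code points are the old 7-bit ASCII set,
# so Python's ord() and chr() show the ASCII codes directly.
print(ord('A'))   # 65
print(ord('$'))   # 36
print(chr(66))    # B

# The pound sign lies outside the 7-bit range, so ASCII cannot encode it.
try:
    '£'.encode('ascii')
except UnicodeEncodeError:
    print('no ASCII code for the pound sign')
```

That failed encode is exactly the problem this post is about: anything beyond the basic set needs a richer encoding.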

The need to limit character encoding to seven bits is a constraint that has long since become obsolete, so people have naturally wanted more characters. But the same problem persists: everyone needs to agree on the rules for how those characters should be stored in computer files. ASCII 127 was pretty dominant for many years. Today there are many rival schemes for encoding characters, which is why I’ve no doubt you’ve seen examples where encoding has gone wrong. The most popular standard at the moment is a form of Unicode called UTF-8. Most websites you see are now using UTF-8, so even if you’ve not heard of it, you have definitely used it.

The reason you should care is that if you have any character not in the ASCII 127 character set, then you need to find a way to encode it safely. If the protagonist in your novel speaks some words in Spanish or Polish or Hebrew or whatever, then you need something better than ASCII 127.

It’s not just the threat of getting things wrong; there’s opportunity too. If you want a hammer and sickle symbol for your Cold War spy thriller, use ☭ (U+262D). Try an arrow symbol for your pirate map (U+27AA), a heart for your ‘I love New York’ t-shirt, or perhaps a fancy scene break character for your historical romance (U+2767). Unicode supplies the answer. (By the way, I’ve double-checked, and all those symbols render fine on my Kindles.)
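If you want to experiment with code points like these, Python’s chr() turns a code point number into its character. Whether you actually see the glyph on screen depends on the fonts installed, but the code point assignments themselves are standard:

```python
# Turn the code point numbers mentioned above into their characters.
for cp in (0x262D, 0x27AA, 0x2767):
    print('U+%04X -> %s' % (cp, chr(cp)))

print(chr(0x2665))  # the 'black suit heart' that comes up later in this post
```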

You don’t need to know how UTF-8 works internally in order to make an eBook (or webpage) in UTF-8. Here’s all you do need to know.

Suppose you have a webpage that needs something more than the old ASCII 127 characters. Here’s what you do.

Your webpage is written in a coding language called html.

At the top of your html document is a statement that says ‘I am encoded using UTF-8’. That’s not something human readers get to see; the statement is put in a special place for other software to find.

Now suppose someone opens up your webpage in a browser. That might be on an Android phone, iPad, Mac, PC or something else. Doesn’t matter. The browser looks at your html page and looks for the statement that tells it how you encoded all your characters.

The browser recognizes UTF-8 as one of the encoding systems it understands. Now it can separate out all the characters in your webpage and knows which Unicode code point each one represents.

What’s a Unicode code point? Take a look at the screenshot I showed you in my last post on Unicode.

The screen is listing separate Unicode code points. So the code point for the ‘black suit heart’ symbol I used in the Jack Fish book (with all the ‘I ♥ New York’ T-shirts I mentioned last time) is U+2665.

So when I enter the heart symbol into my html code, it is saved in the file as U+2665. When your browser sees that Unicode code point, it knows it has to go away and look up that code number in the current font file and display the pattern of dots it finds for that code. That pattern of dots is called a glyph, and will almost certainly be defined as a vector graphic, so it is sharp at any size.
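That storage step is easy to see from Python: UTF-8 stores code point U+2665 as three bytes, and decoding those bytes recovers the code point, just as the browser does:

```python
heart = '\u2665'                        # code point U+2665, 'black suit heart'
stored = heart.encode('utf-8')          # the bytes actually written to the file
print(stored)                           # b'\xe2\x99\xa5' - three bytes
print(stored.decode('utf-8') == heart)  # True: decoding recovers U+2665
```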

Now, as you might imagine, not every font has every Unicode code point defined. For example, I don’t think the Times New Roman font has U+2665 defined. If your web page is set in Times New Roman,  what decent browser software will say to itself is this: “Hmm, I don’t have anything for U+2665 in the current font. I could display an empty box, question mark, or some other gobbledygook. But that’s a last resort. What I’ll do first is check whether I have a fall back font that does know how to display U+2665. I’ll look in my Arial Unicode MS font first, because that has thousands of glyphs (your magic talking browser might try another font, such as Lucida Grande, on a Mac). Ah, yes. There we are!”

That’s really all you need to do: declare that your file is encoded using UTF-8, and hope that the browser reading your webpage has each glyph defined in the font you have specified, or in a fall back font.

How your glyph displays depends on how the font designers have decided it should look. I’ve worked a little with Hebrew glyphs in Kindle books and find that there is a big variation for the same Unicode code point. In my own example of the I love NY t-shirts, the Unicode code point is named by (I presume) the Unicode Consortium as ‘black suit heart’. But on Kindle for iPad, the heart actually comes up red. The same Kindle book on a Kindle Fire (a color device) will show the heart symbol as black.

I’ve used a web page as an example here. But eBooks in Kindle or ePUB format are essentially web pages. Each section of the book is an html page (or a variant called xhtml). There is software embedded in your Sony Reader, Nook, Kindle or whatever that tells the device how to display each page, just the same as Chrome, Safari or Internet Explorer tells a computer or other device how to display a web page.

How to insert Unicode characters using Microsoft Word

Open up the character map (Insert | Symbol) and pick your Unicode symbol, then press the Insert button at the bottom to put it into your text. One problem, though: check that the little box at bottom-right says ‘Unicode’ or can be set to Unicode. If you click on the little down arrow and Unicode isn’t an option, then the symbol isn’t a glyph for a Unicode code point. You can still use the symbol in a paperback if you embed the font in the PDF, but it will come out as gobbledygook in an eBook (unless that same font is embedded in the eBook, something I don’t advise).

To see the widest range of Unicode glyphs in the character map, you will want a font specially designed to have glyphs for Unicode. In the screenshot below, I’ve selected Arial Unicode MS, which is provided by Microsoft and available for Windows, and for Macs if you have the right Office installation. This has a huge number of Unicode glyphs. For Macs without Arial Unicode MS, try Lucida Grande. There are some free Unicode fonts around, though the purpose behind most is to provide fonts for many languages rather than fancy characters such as heart symbols. Try looking here:

How to make a Kindle book with UTF-8 if you upload a Word or html file to Amazon KDP or some other auto-converter

For this approach, you need to be able to upload an html file to whatever service makes your Kindle book for you. The simplest way to do this is to save your Word document as html (from the Save As… menu in Word) and then upload the resulting html file directly to Amazon KDP (though you’ll need to read my note in a moment if you include images).

When you save to html, you must set the encoding to UTF-8 as in the following screenshot.

Here I am Saving As… and changing the format (Save as type) to Web Page, Filtered (which is a slightly more streamlined version of what you would get if selecting save to html). I click on the ‘Tools’ button right at the bottom and pick ‘Web Options’. Then I pick ‘Encoding’ and save the document as Unicode (UTF-8). What this does is put a statement at the top of the html file that says ‘I am encoded using UTF-8’.

I’ve just tested this out myself to double-check it works. I’m writing this post in Word 2013. I’ve saved as html, zipped the result (see next section for why) and uploaded that to Amazon KDP. The result looks great with all my heart symbols and other Unicode fanciness coming out perfectly in the resulting Kindle file.

In fact, here’s a screenshot of my previous blog post saved to html, uploaded to Amazon KDP, downloaded and then sent to my iPad as a Kindle book. I started all this lengthy post about Unicode because I’d read someone post online that Kindle books don’t support Unicode. There’s my Unicode heart symbol to prove that isn’t so.

Html and images

This is going a little off-topic, but I can’t talk about uploading html files without a little explanation about images. If I went through saving the Word document for this post as html and uploading to Amazon KDP, then all the screenshot images will be missing. It’s easy to fix (so long as you aren’t too bothered about image quality).

Suppose you have a Word document called (naturally) MyDoc.docx. If MyDoc contains images, then when you save you will find a file has been created called MyDoc.html. So far, so simple, but Word will also create a subfolder called ‘MyDoc’ and in there it will place compressed versions of your images, saved as separate files and numbered (e.g. image0001.jpg). For Amazon KDP, what you need to do is create a zip file of your html file and the folder of images. Upload that zip file to Amazon KDP and it will look fine. [Here's how to zip on Windows (and don't worry if you don't have Windows7 as it's worked this way for a long time) and on Mac.]
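If you prefer to script the zipping step, here is a small sketch using Python’s standard library. ‘MyDoc’ is the hypothetical document name from above; the key point is that the images keep their folder name inside the zip so the html file’s relative links still work:

```python
import zipfile
from pathlib import Path

def zip_for_kdp(html_file, image_folder, out_zip):
    """Zip the html file plus Word's image subfolder for upload to KDP."""
    with zipfile.ZipFile(out_zip, 'w', zipfile.ZIP_DEFLATED) as zf:
        # The html file sits at the top level of the zip.
        zf.write(html_file, Path(html_file).name)
        for img in sorted(Path(image_folder).iterdir()):
            # Keep the folder name, so relative links like
            # <img src="MyDoc/image0001.jpg"> still resolve inside the zip.
            zf.write(img, '%s/%s' % (Path(image_folder).name, img.name))

# Example: zip_for_kdp('MyDoc.html', 'MyDoc', 'MyDoc.zip')
```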

The only problem with this approach is that Word always tries to compress your images when saving to html format. You have some limited control through the ‘Pictures’ tab (to the left of the ‘Encoding’ tab in the Web Options screenshot above), but the normal Word Options setting that allows you to turn off image compression doesn’t apply to saving as html, at least not in my Word 2013. When Amazon KDP builds your Kindle book (or you do it yourself through Kindlegen), it will compress your images anyway, so it might not make a lot of difference for large images. Just be aware that when Word saves to html, it quietly changes your images.

How to make a Kindle book with UTF-8 if you code your own html

This is the way I make eBooks.

Html files should have the following in the <head> section

<meta http-equiv="Content-Type" content="text/html; charset=utf-8"/>

[this is the statement that Word adds when you save to html and set the encoding to utf-8 in Word]

For xhtml files you want

<?xml version="1.0" encoding="utf-8" ?>

How to make an ePUB book support UTF-8

I’ve concentrated on Kindle books so far; ePUB format uses xhtml files to store its book content, and these files need the encoding statement I’ve just given (<?xml version="1.0" encoding="utf-8" ?>).
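If you build many content files, a quick sanity check that each one carries a UTF-8 declaration can save grief later. Here is a rough Python helper of my own (not any standard tool); it looks for either the html meta charset or the xml encoding attribute near the top of a file:

```python
import re

# Match either charset=utf-8 (html <meta>) or encoding="utf-8" (xml).
DECL = re.compile(r'charset\s*=\s*["\']?utf-8|encoding\s*=\s*["\']utf-8',
                  re.IGNORECASE)

def declares_utf8(head_text):
    """Pass the first chunk of the file's text; True if UTF-8 is declared."""
    return bool(DECL.search(head_text))

print(declares_utf8('<?xml version="1.0" encoding="utf-8" ?>'))  # True
print(declares_utf8('<meta http-equiv="Content-Type" '
                    'content="text/html; charset=utf-8"/>'))     # True
print(declares_utf8('<html><head></head>'))                      # False
```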

In general, ePUB books are trickier to give guidance for than Kindle because when it comes down to the fine details, there is much more variation in the way in which the ePUB format is implemented by the various ePUB reader devices and the various firmware versions that sit upon them. I’ve read people suggest that because ePUB is an open standard, all you need to do is write one ePUB file and it is guaranteed to work the same way on every device that can show ePUB books. I’m afraid that is far from the truth, but that’s for another post.

When it comes specifically to Unicode support on ePUB, I find support on my Nook Glow, iPad iBooks, and Adobe Digital Editions is good. My Kobo Mini isn’t so good. In my last post I showed some East European Latin extensions implemented as Unicode UTF-8, and in that previous post they looked good on my Kobo Mini. But I was cheating! Here’s another screenshot where the Kobo can’t find the right glyph and gives a box with a cross through it, what I call a ‘huh?’ symbol.

So what’s gone wrong?

The problem with the Kobo is that it doesn’t seem capable of working with fall back fonts. If the Kobo comes across a Unicode code point, it looks in the current font to see whether it has a glyph defined for it. If it doesn’t, it gives up. What it doesn’t do is go looking in a fall back font. Which is a shame because Kobos have good Unicode support in their Georgia font, which can display those characters perfectly.

You could try forcing the font to Georgia, but that’s easily overridden by the user.

So when I build eBooks for clients and they have requirements outside of a basic Latin character set, I have a conversation about portability – which basically means how confident we can be that the book will behave as we intend across a range of platforms. In some cases this means we have a higher-spec version of the eBook for the Kindle format, and produce a dumbed-down version for ePUB.

What’s the take-away from this post?

Well, it’s really the same as in my previous post.

The first take-away is that Kindles do have excellent Unicode support, despite what you might read elsewhere. What’s more, an occasional use of Unicode can lift your book out of the ordinary. If you are coding your Kindle book directly with html, or through a tool such as Sigil, then all you need to do is ensure the encoding is correct in the <meta> tag, as I’ve shown you. If you upload a doc or html file directly to KDP, then you could try setting the encoding as I’ve suggested in Word’s Save As, or simply make do with basic characters.

The second take-away is to beware of what people post on the internet about how to make eBooks because there is a lot out there that isn’t accurate. Treat whatever you find with suspicion, test your books thoroughly, and try to get multiple opinions. That advice, of course, goes for my posts too, every bit as much as anyone else’s.

Click here for part 1 of my Unicode posts.

Follow this link to my other writing and publishing tips

‘Format Your Print Book for Createspace: 2nd Edition’ is available now as a Kindle eBook, and as a 296-page paperback:

eBook: |

Paperback |


Kindle support for Unicode pt1: dispelling a myth

I seem to be working through a mountain of jobs to format and edit eBooks and paperbacks at the moment. It’s gratifying that people trust me with their precious writing, but because I work with so many books, I get to find all the dark secrets of eBook design. In fact the darkness isn’t with eBook design at all, it is with the design of the tablets and eReaders and the software they use to display eBooks. If there’s a secret to eBook design, it is to know the current set of idiosyncrasies and bugs in reading devices, and know how to code your way around them. One day these inconsistencies and workarounds will be a thing of the past, and we will be able to concentrate on good typography and layout. But I think we’re still years away from that.

Since I have to face these problems head on, I’ll share some of what I find on the website. After all, I can’t only post about Lego creations. Can I?

A problem I faced recently was with the ePUB version of an anthology called Looking Landwards (in association with the Institute of Agricultural Engineers, no less). One of the stories had extracts in Polish, which meant accented Eastern European characters, and that means Unicode.

I was inspired to write this post after coming across a post from someone who provides a freelance eBook production service. In her post, the author explains that Kindles only support the basic ASCII character set; in other words, pretty much what you can type on a basic keyboard. To be fair to the person who wrote that, she was probably influenced by advice from Amazon themselves. Now, in comparison with Barnes & Noble, Sony and Kobo, Amazon are by far the best of the bunch at documenting how to write books for their eReader devices. Not great, but better than the others, so I’m not having a go at Amazon.

What this blogger had probably read was Amazon’s simple guidance for self-publishers who upload a Word document (possibly saved as an html file) and get Amazon to automatically convert that to the Kindle format. That approach makes sense for many self-publishers, but sometimes the simple guide for beginners says things that contradict the Amazon Coding Guide PDF. I’m sure Amazon are trying to simplify things for beginners and shield them from topics, such as Unicode, where it is easy for the unwary to become ensnared. But because some self-publishers, and even freelance book designers, will blog about how to make eBooks without ever getting as far as reading the coding guidelines, various myths and half-truths sometimes spread.

One of these is that eBooks, and Kindles in particular, don’t support Unicode. In fact, Unicode support is very good with Kindles and other eReaders, though not perfect (for example, see the Kindle screenshots of Devanagari support here).

I almost have some slight sympathy for the mistaken view that Kindles only support ASCII, because if you go to the Amazon KDP site, you will see this comment:

In case the text is too small to read on your browser, I’ve ringed passages that say: ‘avoid UTF-8 as the encoding type’, ‘Amazon Kindle Direct Publishing supports text in the Latin-1 format’, and ‘all characters from that set not currently supported are: spades, clubs, hearts, up-arrow, down-arrow’. In other words: don’t use Unicode!

But hold on, I’ve just said I’ve made a book with passages in Polish, and some of the characters used are not in the Latin-1 format. And I have most definitely encoded the content files for my eBook in UTF-8 character encoding that Amazon says you should not use. In fact, I always use UTF-8 encoding and this is far from the first time I have used characters not in the Latin-1 character set.

I’ve also just worked on a book called Jack the Fish, set in New York. There are plenty of references to ‘I ♥ New York’ t-shirts. Plenty of I ♥… all sorts of things; it’s a running gag. Now in internet-world, I can’t be sure those last two sentences displayed correctly: you should have seen a heart symbol. That’s the same heart symbol that Amazon explicitly says you shouldn’t use. But having used the Unicode black suit heart symbol (Unicode code point U+2665), it displays fine on Kindle devices from Kindle 3 onwards.

I worked in the software industry for twenty years before I came to make eBooks. There’s a common saying there: you don’t know what you don’t know. Sounds a bit Donald Rumsfeld, I know, but it reflects the fact that in such a fast-changing world as software coding, your knowledge is constantly becoming outdated, and one of the biggest risks is that there is no cheap and easy way to know what new things have come along that means you are now out of date. The same is true with making eBooks.

But I’m not working with the beginners’ guide. I’m referring to the next level up of Amazon coding guidelines. You can find these on the Amazon KDP site, but the easiest way is from the Amazon Kindle Previewer from the Help menu as in this screenshot.

(By the way, if you are a self-publisher and don’t have Kindle Previewer, then you really should get it. The only reason not to is if you already own all the devices. There used to be problems with the accuracy of the rendering; there still are, but the previewer is vastly improved on earlier versions.)

And what do the Kindle Publishing Guidelines say? Unicode is fine (see section 3.1.4). There’s no reference whatsoever to there being anything special about ‘Latin-1’. After all, if you look through the Unicode glyphs (a glyph is a single character or symbol in a font) defined in the fonts on a Kindle, you will discover there are thousands of them. Why would Amazon put thousands of glyphs onto their Kindles and then tell you never to use them?

Thousands of glyphs? An exaggeration? Actually, pick up this free book (here) and load it onto your Kindle. It simply lists all the Unicode code points in turn. Any code point supported on your Kindle will display nicely. Any that isn’t will get the little ‘huh?’ box. (A code point is basically the serial number of a glyph, defined by the Unicode Consortium. If you want to know more about Unicode, this post is a good place to start, and there’s more from me in my part 2 follow-up post.)
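You can generate a miniature version of that listing yourself with Python’s unicodedata module, which knows the Unicode Consortium’s official name for each code point:

```python
import unicodedata

# List a small run of code points with their official Unicode names,
# the same information that book walks through for the whole range.
for cp in range(0x2660, 0x2668):   # the card-suit block
    ch = chr(cp)
    print('U+%04X %s %s' % (cp, ch, unicodedata.name(ch, '<unnamed>')))
```

Any code point your device shows as the ‘huh?’ box still has a perfectly good name here; the glyph is simply missing from the device’s fonts.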

Here’s a shot of that book on my iPad, which is running the Kindle app for iOS. I’d love to credit the author, by the way, but I can’t find his or her name anywhere. These glyphs also display correctly on my Kindle 3, Kindle Touch, Kindle Fire, Kindle for PC, iBooks on the iPad, Nook Glow, and very nearly on my Kobo Mini, though there’s a problem with the Kobo I’ll come to in my next post.

Here’s another shot, showing our Looking Landwards Polish characters looking fine on a Nook Glow, Kobo Mini, and Kindle Touch. Look closely and you will see there are also proper double quotes, not the ‘straight up’ typewriter quotes. There are other Unicode symbols elsewhere in this book, such as some fancy right arrows.

The take-aways from this post

Hopefully I’ve dispelled the myth that ‘Kindle can’t handle Unicode’.

On the way, I’ve also pointed out that Amazon seems to give Kindle book formatting advice at two levels: one for beginners who upload a Word .doc file or one saved to html, and a more detailed one for those who build the eBook themselves and upload the resulting file. If you want to take the second approach, you do need to read the Kindle Publishing Guidelines that I showed you.

The third take-away is to beware of what people post on the internet about how to make eBooks, because a lot of it isn’t accurate. Treat whatever you find with suspicion, test your books thoroughly, and try to get multiple opinions. I’m not setting myself up as the definitive expert, though; the advice to treat what you find online with suspicion goes for my posts too, every bit as much as anyone else’s. I might have taken the time to actually read Amazon’s documentation on how to make Kindle books (something some people who post advice and sell formatting how-to books have clearly never bothered to do), but I still suffer from the same Rumsfeld-esque weakness as everyone else.

You don’t know what you don’t know.

I first heard that 20 years ago. It’s just as true today.

What I haven’t done is do anything more than hint at what Unicode is and why you would want to use it. Nor have I explained how you can use it. Well, we’ve stretched this post out long enough, so I’ll give you all those goodies in the next post.

Take care and beware.


Click here for part 2 of my Unicode posts

Follow this link to my other writing and publishing tips

Available now: ‘Format Your Print Book for Createspace: 2nd Edition’, as a Kindle eBook and as a 296-page paperback:


Posted in Writing Tips | 2 Comments

Tips for self-publishers: Table of Contents in Word

For anthologies and most non-fiction, a table of contents (TOC) is essential for readers to find their way around your paperback book. It’s also a way of showing off the contents of your book for potential browsers, and that includes the ‘Look Inside’ image of your Createspace book that Amazon will put onto your book’s page on Amazon.

Other than anthologies and compilations, most fiction does not benefit from a TOC.

For most purposes, Microsoft’s tool for generating tables of contents automatically does a good job and is probably more flexible than you realize. It’s what I used for the print edition of the book this post is extracted from.

The beauty of inserting an automatic table of contents (from the Ribbon, pick REFERENCES | Table of Contents) is that Word does all the hard work of finding all your headings, automatically adding them into your TOC, and correctly adding your page numbers. If your book changes, you can tell your TOC to update itself and all the page numbers and headings will recalculate themselves. You can update the TOC in several ways: the simplest is from the Ribbon, REFERENCES | Table of Contents | Update Table.

Microsoft’s online help on tables of contents is extensive. One of the best places to start is with Microsoft’s Word 2013 training videos on this subject. I’m not aware of any changes to tables of contents since Word 2007, so the 2013 video should apply at least as far back as 2007. These training courses use Microsoft PowerPoint .pptx format. If you can’t read that straight away, you can download a free viewer from Microsoft or try opening it with QuickTime.

The course I have in mind is called “Advanced tables of contents”. You can Google for it.

The key to Microsoft’s automatic TOC is your use of styles. By default, Word will look through your text, pick out your heading styles (for example, ‘Heading 1’, ‘Heading 2’), and use those as TOC entries.

Once you’ve created a TOC, you will be able to modify TOC styles in the Styles Pane. The first level TOC entries are styled by the TOC1 style. The next level by a style called TOC2, and so on.

The heading of the TOC has its own style called TOC Heading.


Using styles to drive the automatic table of contents

Look up similar books to yours in the library or your bookshelf to get ideas on styling to aim for. Also, make good use of the Look Inside feature on Amazon to see how others do it, though make sure you are looking at the print edition.

As well as the automatic table of contents, Word allows a Custom TOC from the same part of the Ribbon. This lets you fine-tune the design and fill the TOC using styles other than heading styles.

There’s plenty of further guidance on this topic online. If you need such sophistication as multiple TOCs, sub-headings and summaries, then Word can do this if you learn a few field codes.
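As a taste of those field codes: the automatic TOC is itself just a TOC field. Press Alt+F9 to toggle field codes visible and you’ll see something along these lines (the exact switches depend on how the TOC was set up):

```
{ TOC \o "1-3" \h \z \u }
```

Here \o "1-3" gathers headings at levels 1 to 3, \h turns the entries into hyperlinks, \z hides tab leaders and page numbers in Web view, and \u includes paragraphs by their applied outline level. A switch such as \t "MyChapterStyle,1" (the style name here is my own invention) is how you pull styles other than heading styles into the TOC.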

Alternatively, if you need flexibility, it could be easiest to abandon Microsoft’s TOC generator altogether and write and style the TOC yourself, something I’ve done myself for several poetry and short story anthologies.

I’ve reproduced below a table of contents I made for Dead to Rights: A Circularity of Glosas, a poetry anthology by Alain C. Dexter. Typing in the entries and styling the TOC yourself is easy, but typing in the page numbers by hand is dangerous: it would be so easy to get them wrong, especially if the correct page numbers change after editing.

The trick is to use cross-references, which you can access from the Ribbon under REFERENCES | Captions | Cross-reference. For our purposes, we want reference type: Heading and insert reference to: Page number, as in the screenshot below.

Using cross-references to reference page numbers

When you hit the Insert button, the correct page number appears in the TOC. If the page number changes at a later date, the cross-reference doesn’t update automatically to match (the same is true of Word’s automatic TOC). To make sure all the page numbers are accurate, do this:

Save your document

Select the entire contents of your document (Ctrl+A if you’re running Windows, Cmd+A if on a Mac)

Update field codes (F9; if you’re on a Mac and that doesn’t update field codes, you must first turn off the Exposé keyboard shortcut for this key: on the Apple menu, click System Preferences, then under Personal, click Exposé & Spaces)

If you were reading the paperback edition of the book this post was taken from, you would see entries inside the text such as

For more information on cross-references, see [p218 of Format Your Print Book for Createspace: 2nd Edition].

Those page numbers are cross-references and use the technique I’ve just explained. Where I want to refer to a specific page where there isn’t a heading, I insert a bookmark (Ribbon: INSERT | Links | Bookmark) and then in the cross-reference dialog I pick Bookmark as the reference type.

If you’re writing non-fiction then cross-references can be a powerful tool. If you’ve not used them before, open up a new document in Word and start playing with them. I’ve just run through a whole lot of dialogs and key presses, which might leave you feeling a little dizzy. Once you’ve used them a few times, though, you’ll find them easy, I promise! So go off and try these techniques for yourself.

A final word about your TOC. Word TOCs use tabs and page numbers; neither has any place in eBooks. Never allow an automatic tool to convert your paperback table of contents directly into your eBook, as it will look awful.

Microsoft Word’s table of contents customization allows you to set table of contents entries as hyperlinks. Depending on how you create your eBook, this might be all you need to set the hyperlinks that your eBook TOC must have in place of page numbers.
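Under the hood an eBook is HTML, so a TOC entry is nothing more than an internal hyperlink. Here’s a minimal sketch, in Python for illustration, of what those hyperlinked entries amount to (the chapter titles, anchor names, and class name are all invented):

```python
# An eBook TOC is a list of internal links: no tabs, no page numbers.
# Build one from (title, anchor) pairs; each anchor matches an id in the text.
chapters = [("Chapter One", "ch1"), ("Chapter Two", "ch2")]

entries = "\n".join(
    f'<p class="toc1"><a href="#{anchor}">{title}</a></p>'
    for title, anchor in chapters
)
toc_html = "<h2>Contents</h2>\n" + entries
print(toc_html)
```

However your conversion tool gets there, this is the shape your eBook TOC must end up in: each entry a link to an anchor in the body text.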

Follow this link to my other writing and publishing tips

This article was adapted from ‘Format Your Print Book for Createspace: 2nd Edition’, available now as a Kindle eBook and as a 296-page paperback:

eBook: $2.99 | £2.00

Paperback: $10.80 | £8.50


Posted in Writing Tips | 2 Comments

I have two new YA books out today!

Running a publishing business and doing freelance book design doesn’t leave me much time to write any fiction. That’s a shame, because I love to write. This year I’ve made the time, and have been scribbling away in the background, as Monday night regulars at the Swan at Bromham can confirm. Rather than write to hit market demographics and all that, I’ve written for pleasure and relaxation, taking as my inspiration my teenage joys of discovering science fiction and fantasy in such sources as 2000AD, Blake’s 7, Doctor Who, and I guess a little from Dungeons & Dragons too.

I’ve written before about how 2000AD influenced me. I didn’t realize at the time that the writers at the comic were mining every cliché of golden age science fiction from the 40s & 50s, adding their own spin, and taking it in new directions in such series as Tharg’s Future Shocks. When I grew older and started to read the golden age science fiction (and much besides) for myself, I was glad 2000AD had mined that storytelling gold, because it would have been denied me otherwise. Sometimes I think contemporary adult short science fiction can be too fixated on trying something new, on needing to be clever. Other than an addiction to making jokey references to contemporary culture and politics, the cleverness of the 2000AD that I loved in the 1970s and 80s was that the writers were (largely) unconstrained: they were allowed to let their imaginations soar in pursuit of storytelling in a way that I suspect short story writers of that time often felt they were not.

And that has been my approach in these stories. There’s Treasure of the Last Dragon, for instance, which is styled a little like the Famous Five stories I was reading to my son at bedtime, except the children are hexapod aliens. I just thought that was a fun thing to do.

The idea of the playthings of super beings being left around the galaxy was a common wonder in early 2000AD (and a cliché long before then). I combined the essence of that sense of wonder, although not the abandoned plaything idea, with a little teenage angst for my story Collision Course.

There was a craze for Dungeons & Dragons while I was at school. We had a strange spread-out school site built up over many decades with gardens that linked various buildings. There was even a gardener named Arthur who let us in his shed at breaktimes to play D&D or just have a chat. The combination of D&D and Arthur’s shed led me to create St. Rushby’s Home for the Un-parented and write a story about a young forensic sorcerer in The Snot Wizard.

Partly the inspiration for these stories came from Gill Shutt’s Alien Legends. Gill was a talented author and source of creative energy who proofread most of these stories before her untimely loss earlier this year. It was Gill who came up with the initial concept of the Repository of Imagination, and who gamely let me mess around with her creation to arrive at the form the Repository takes today. The Senior Repositarian of the Earth branch is one Crustias Scattermush, and I’ve written these books under the Crustias pen name. Actually, as my friend Elaine Stirling would explain, Crustias is more than a pen name; he is taking on a heteronymic life of his own. So I take no responsibility for his comments at the end of each story; they are purely his own.

I’ve put these stories out today in two short Kindle books, each costing less than a dollar or pound.

Click here for more info on: Tales from the Repository of Imagination #1 - Treasure of the Last Dragon 

Click here for more info on: Tales from the Repository of Imagination #2 - The Ultimate Green Energy

I’m quite pleased with the Photoshopped stamp I created for the series, and so since I’m in a mood to show off today, here’s a bonus stamp. I’ve several more stories sketched out or nearly written, so I hope to be using this stamp again in 2014.



Crustias portrait © RATOCA
Posted in Book launch | 2 Comments

Self-publishing paperbacks: faux fonts and small caps

There’s a wealth of guidance online to help you lay out your book manuscript. One of the terms you will come across is ‘faux’, meaning your word processor (or typesetting software) can’t access a properly designed glyph and so has to fake one, possibly with ugly results. (A ‘glyph’, by the way, is simply one character or symbol in a font.) In this extract from ‘Format Your Print Book for Createspace: 2nd Edition’ I explain more about faux fonts, dispel a few rumors, and explain how to use true small caps in Microsoft Word (although you might howl in protest when you read my solution).

A faux font is one that is faked. Faux bold text, for example, would be the normal-weight typeface crudely fattened up. With a slightly more sophisticated typeface, there will be a version of the font that is specifically designed to be bold. There will be another that is specifically designed to be italic, maybe another for small caps, and possibly even more variations designed to be used at different sizes. The basic fonts you will be familiar with (such as Garamond, Times New Roman, Arial, Verdana, Tahoma) all have proper bold and italic fonts, so you are in no danger of faux fonts there.

Are faux fonts really such an evil? It depends on the typefaces. These days all common typefaces I’m aware of come with italic and bold versions, and all of them have something called font style-linking (which means that Word knows that when you tell it to set text in italics, for example, that it has to go pick the glyphs from the italic font). So the typefaces that don’t do all this font-linking for you are obscure, amateur or specialist — designed to perform a specific task. So, yes, if you force a typeface to use faux fonts, it probably means you’re forcing it to do something it’s not designed for, and that will look bad.

I do sometimes come across the suggestion that under the Windows platform, Word always uses faux italics and bolds. This isn’t true for any of the common fonts that come with your operating system or with Word, nor any fonts I have purchased from Adobe, although I have found one obscure font for which this is true. I think the rumor stems from the fact that Windows and Macs have a different approach to displaying fonts in their standard font dialogs.

So now that we’ve understood that faux fonts are ones that have been made by altering the normal font, rather than being designed specifically for that purpose, we come to faux small caps.

With small caps, Word lets us down. It’s easy enough to set text to use Small Caps (from the standard font dialog), but these are faux glyphs. Even if you have fonts that contain small cap glyphs, Microsoft still can’t access them …

… at least, not easily. If you are desperate for genuine small caps, you can insert them from INSERT | Symbol if they exist in the font. If they are there then they will most likely appear in the section called ‘Private Use Area’. (It’s generally the more fancy or paid-for fonts that have small cap glyphs). Inserting characters one by one is not my idea of a good time, but is perfectly practical for a title or footer, for instance.

Also, a few typefaces have a specific font for small caps, such as Fontin (a free typeface you can download legally here). If you have Fontin installed, then to switch to small caps, select the text and change font from ‘Fontin’ to ‘Fontin Smallcaps’.

So what is all the fuss? Below I’ve written out text in real and faux small caps using a paid-for font called Adobe Garamond Pro.


The lower-case letters come out much larger with the faux small caps, so underneath I’ve reduced their font size by a point to match the real small caps on the left. There is certainly still a difference. The strokes are narrower; in fact, at these fairly small point sizes (10pt, with the faux small caps reduced to 9pt), the faux version isn’t easy to read.

If I raise the font size to 11pt (10pt for the reduced small caps) then we get…


All of them are easy to read, but the default Word small caps is too ‘shouty’ for my tastes, especially when mixed in with normal text (rather than used in titles and headings), and reducing the font size of the faux small caps makes the strokes look slightly weedy. All of this, of course, applies to just one typeface; others may look better or worse.

So the issue of faux small caps is genuine and not something that exists only in the minds of overly conservative typographers, but whether that means you should never use them is something else entirely. My view on typesetting is that you should always be guided by your eye. If it looks good and is clear then use them. I use faux small caps extensively in the paperback version of Format Your Print Book for Createspace: 2nd Edition because I feel the end result is clearer for you to read than not using them. When I do use real small caps it is normally for title pages.

Kindles and other eReader devices currently use faux small caps unless you start mucking around with installing font files, which I would generally advise against for books you’re publishing (although installing fonts on a Kindle can be fun). Where I use small caps, I normally up the font size of the lower-case letters too, and span them with a CSS class (font-variant: small-caps; font-size: 110%;).
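To make that concrete, here’s the sort of stylesheet rule and markup I mean; the class name and sample text are my own invention, and the exact percentage is a matter of taste:

```
/* in the eBook's stylesheet */
span.smallcaps { font-variant: small-caps; font-size: 110%; }

<!-- in the XHTML body -->
<p><span class="smallcaps">Call me Ishmael.</span> Some years ago…</p>
```

The slight size boost compensates for the faux small caps coming out a little light against the surrounding body text.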

The purpose of small caps within body text is to make words and phrases stand out without overwhelming the other body text, as you would if you went all caps. That’s all; there’s nothing magical about small caps that means you have to use them. They only exist to do a job for you. Whether in your printed book or eBook, if your faux small caps or any other faux glyphs look bad to your eye, then they are not delivering the effect they are intended for. Scrap the small caps and achieve the same effect some other way.

For example, it’s common to have the first few words of a chapter in small caps. If your small caps look awful, use bold characters instead. I’ve seen this work well on my Kobo, Kindle, and Nook.

— follow this link to read my other writing tips —

This article was adapted from ‘Format Your Print Book for Createspace: 2nd Edition’, available now as a Kindle eBook and as a 296-page paperback:

eBook: $2.99 | £2.00

Paperback: $10.80 | £8.50

Posted in Writing Tips | 1 Comment

Out Now! Format Your Print Book for Createspace: 2nd Edition

Out now is the second edition of my book on how to produce your book for Createspace using Microsoft Word. The book is twice as big, has many more images, and has been updated to cover changes to Createspace and Word versions up to Word 2013.

If you have already purchased the first edition for Kindle, then please hold tight as I’m working with Amazon to get you an automatic update. It’s not there yet.

Click here for purchase information and more…

To whet your appetite, here are some review comments from the first edition followed by a quick tour of the contents page.

Praise for the first edition:

Here’s a selection of comments from 5-star reviews on Amazon

“It helped me get my book ready much, much, MUCH faster than I could have done it on my own.”

“There is a TON of information here, all laid out with step-by-step explanations and screen captures.”

“As the owner of a small publishing company and an author myself, this book has become my ‘Createspace Bible.’ “

“…the best instructional seminar you can read.”

“I have some other books about formatting for print, but none of them even come close to providing the amount of information I found in this book.”

“Tells you all the “little” things you need to know to format.”

“…very easy to read and understand.”

Topics covered in the book are:

image © Kirril M


Why bother with a paperback edition? — How to use Createspace to print and sell your book — 1. Let Amazon sell them for you — 2. Sell them yourself — 3. Set yourself  up as a traditional small publisher — 3a. Using Createspace’s ‘Expanded Distribution’ option — 4.  And still more ways to sell your book … — Conclusion: How YOU should sell YOUR book! — Common Questions about Createspace — What is an ISBN and do I need one? — Should I use my own ISBNs or use one from Createspace? — Can I use the same ISBN for my eBook and my paperback? — If I print my book with Createspace, do I still own my book? — If I print my book with Createspace, can I also print it with another printer? — If I make changes to my book, do I need to use a new ISBN? — Do I really have to upload my book to Createspace as a PDF? — Copyright, Library of Congress, and mandatory deposit — How much royalty/margin will I make? — Should I choose cream or white paper? — Which trim size should I choose? — If I change my trim size, should I lay out my book differently? — Can I make hardback books with Createspace? — How can I sell to Canada? — How can I sell to Europe? — If I’m not a US citizen, why am I paying US tax? — Why can’t I have spine text? — Is it worth selecting Createspace Expanded Distribution? — How can I link my paperback and Kindle edition on Amazon? — Should I use Createspace to build my Kindle edition? — Amazon has reduced my retail price. Can they do that? — Help! My paperback book is being sold by someone else. Are they pirates?


One-page Summary: how to self-publish with Createspace — A typical Createspace workflow: DETAIL — Set up your Createspace account — Register your book with the US Copyright Office — Set up dummy projects — Organize cover art — Choose trim size — Create your main Createspace book project — Set correct paragraph and page/section breaks — Set headers and footers — Set page layout options in Word — Set font and paragraph styling — Set section breaks — Consider use of ornaments — Set front and back matter — Curl your quotes — Perform other final checks — Upload paperback interior — Check results in Interior Reviewer — Upload Cover — Print proof/ beta copies    — Finalize book description — Run your beta-reader program — Create your eBook version


Writing Robust Paragraphs — Manual line feed. — How to set page breaks correctly — How to fix badly marked paragraphs


Plan the layout strategy for your pages — Copy the page setup from the Createspace template — Headers and footers — Link to previous — Which, what, huh? — Sections — What’s a section? — Smashwords, Kindle editions, ePub and section breaks — Page numbers — Make sure you have entered page numbers for ALL your headers/footers — Formatting page numbers — Working with page numbers in Word 2003 — Introducing styles — Don’t start from here! Start formatting as you write your book — Defining Styles — Defining Styles#1 — direct formatting into styles — Defining Styles#2 — create a style family — Why styles again? — Editing your styles — Sorting out a style soup — Sorting out style soup… without losing your italics — Justification — Justification and eBook formatting — Paragraph indentation — Exceptions – other types of books — Exceptions – scene breaks and letters — Paragraph spacing — Consistency — Maintainability — eBooks — The paragraph spacing gotcha — A starting set of styles — Fonts and Typefaces — Leading — eBooks and leading — Super style sets — Using style sets with your book — Creating your own style sets — Older versions of Word


Images: a great opportunity but Word lets you down — WHERE TO USE IMAGES — Ideas for images: fiction — Ideas for images: non-fiction — Typographer’s Ornaments Revisited — COLOR IMAGES, BLEED AND FULL PAGE IMAGES — HOW TO INSERT IMAGES — 1. Keep the images separately in a safe place, then insert them — 2. Set gridlines on — 3. Use the size and position menu to place the image — 4. How to set captions — 5. Using frames, shadows and special effects — HOW MICROSOFT WORD RUINS YOUR IMAGES PART#2  — Working with images that aren’t 300dpi   — Image basics… and image resizing


Dinkus asterisms, and how print and eBooks are very different — You cannot tell how your eBook will look — The final tidies — Do I really need to upload a PDF? — Embedding fonts in PDFs — Front Matter — What to put in your front matter — Front matter for eBooks — Adding blank pages at the end of the book — Widows and orphans   — Table of Contents — Drop Caps and Small Caps — Small caps and page header styles


Typography 101 — Special Effects with OpenType  — Kerning & Spacing — Typography 909 — Faux Fonts — How to tell what your font can do — Font style-linking — How to republish your back catalog — Cover art tips — More on Margins — Createspace vs. Lulu & LSI. Which is best? — Advanced Colorspaces and Dot gain — Beyond Microsoft Word — is it worth paying for anything else? — A note about Adobe products — Adobe Acrobat Standard/ Pro  — Adobe InDesign — Adobe Photoshop — CutePDF Writer — doPDF — Nitro Pro — NovaPDF — Scrivener — yWriter5 — Text boxes and Wordart — Wordart & Text Effects   — Where to find further information

Appendix A — How this book was formatted

Posted in Book launch | 3 Comments

Windows 8.1 hung my Photoshop CS6, but here’s the solution

The Windows 8.1 update arrived at my work desktop today. I’ve just read that Microsoft have halted the rollout due to problems. Well, I had one straight away. I was doing some book cover typography in Adobe Photoshop CS6. Suddenly I couldn’t pick a color. Trying to pick or swap colors gave me the standard Adobe error message that translates to ‘I’ve completely given up and I don’t even know what went wrong’. (I’ve seen and ahem… written a few of those in my time):

“Could not complete your request because of a program error”

That meant Photoshop was completely broken as far as I was concerned, and having run the Windows 8.1 update an hour earlier was too much of a coincidence.

A little rooting around Adobe support suggested a two-part fix that got me working straight away.

The problem is to do with the Photoshop preferences file, something that seems to be the canary that squawks and topples over whenever life gets too much for Photoshop.

The usual fix is to scrap your Photoshop preferences file and reset to factory defaults. Unfortunately, on its own that didn’t work for me, because the preferences file was read-only (which, I suspect, caused the fault in the first place).

So the first step is to go to the Adobe preferences folder in File Explorer and set it to be write-enabled (turn off the ‘Read-only’ setting in the folder properties). The location is explained by Adobe here.

Having made the folder write-enabled, close Photoshop and make the secret Adobe sign on your keyboard. It’s a bit like the Masonic handshake, but you hold Ctrl + Alt + Shift simultaneously. Keeping all those buttons pressed (you’ll probably want to use the left Shift key for this…), launch Photoshop. You should be asked if you want to create a new preferences file. Answer ‘yes’.

For me, at least, this solved my problem. If Windows 8.1 has done any more damage, I’ve yet to see it.

I don’t want anyone to experience the same problem as me. But if you do, hopefully this post will get you started again.

Posted in Comments | Leave a comment