Earlier today I got a support email from a customer who had found a bug in Codepoints that few people are likely to hit, one that causes it to render characters small and black:
Both of those font rendering characteristics, size and color, are actually hardcoded in the app to be fairly large and white, which led me to believe there was something funny about the fonts on the system.
As it turns out, the issue is caused by a font that the user had intentionally disabled. By chance, it was the one font that Codepoints (and OS X itself) relies on to render a large number of Unicode glyphs, because it is the only font shipping with OS X that can render many of them: Arial Unicode MS.
I'm preparing to update Codepoints to deal with situations like that, but I thought I would write a little about how font rendering in OS X actually works, and how Codepoints handles it.
The Unicode 6.1 specification has 24,428 records in the database; however, no single preinstalled font on OS X can render all of them. In fact, even taken as a whole, OS X does not have font glyphs to render them all, and you need to install a third-party Unicode font to get full coverage (GNU Unifont is a common, free choice).
To deal with this, most operating systems use fallback behavior: they first try to render a character with the default or user-selected font, then cycle through other installed fonts until they either find a suitable glyph, fall back to the Last Resort font, or give up entirely, which tends to look like a question mark or a black box.
(Codepoints actually detects when the Last Resort font is about to be used and explicitly rejects it on a case-by-case basis, because most of the time the glyph returned by Last Resort is simply not useful as a preview.)
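The cascade described above can be sketched in a few lines. This is a minimal model, not real rendering code: font coverage is represented as plain sets of code points, while a real system would consult each font's character map. All font names and coverage data here are hypothetical.

```python
# Hypothetical model of a font-fallback cascade with Last Resort handling.
LAST_RESORT = "Last Resort"

FONT_COVERAGE = {
    "Helvetica":        {0x41, 0x42, 0x43},          # A, B, C
    "Arial Unicode MS": {0x41, 0x42, 0x43, 0x4E2D},  # A, B, C, and 中
    LAST_RESORT:        None,                        # claims everything
}

def pick_font(codepoint, fonts, reject_last_resort=True):
    """Return the first font in `fonts` that can render `codepoint`,
    or None if no useful candidate is found."""
    for font in fonts:
        coverage = FONT_COVERAGE[font]
        if coverage is None:
            # Last Resort "renders" everything, but its generic boxes
            # are not useful as previews, so optionally skip it.
            if reject_last_resort:
                continue
            return font
        if codepoint in coverage:
            return font
    return None
```

With a cascade of `["Helvetica", "Arial Unicode MS", "Last Resort"]`, an ASCII letter is rendered by Helvetica, 中 falls through to Arial Unicode MS, and a code point no real font covers either yields None or, if Last Resort is allowed, Last Resort's generic box.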
When automatic font selection is turned on in Codepoints, what the app does as of 1.0.2 is select Arial Unicode MS as the rendering font, while still allowing OS X to use its standard fallback mechanisms for characters Arial can't handle. This has the side effect of ensuring that most previews are rendered by the same font, giving the list at least some visual consistency when scrolling.
Selecting a specific font in the drop-down menu is when things get interesting. Instead of merely asking OS X to render a character with a specific font (which would trigger fallback behavior), Codepoints takes it upon itself to determine whether or not the selected font can render a given character, which makes it possible to disable the fallback behavior completely. If you select Helvetica, you will not see a character at all if Helvetica cannot render it.
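The strict check amounts to asking the selected font alone whether it covers the code point, and showing nothing otherwise. Here is a sketch with hypothetical coverage sets; on OS X the real question would be asked through Core Text (CTFontGetGlyphsForCharacters fails when the font has no glyph for a character), but the logic is the same.

```python
# Hypothetical coverage data for two fonts.
FONT_COVERAGE = {
    "Helvetica":        {0x41, 0x42, 0x43},          # A, B, C only
    "Arial Unicode MS": {0x41, 0x42, 0x43, 0x4E2D},  # also covers 中
}

def strict_preview(font_name, codepoint):
    """Return the character if `font_name` itself can render it, else None.
    No other font is ever consulted, so no fallback can occur."""
    coverage = FONT_COVERAGE.get(font_name, set())
    return chr(codepoint) if codepoint in coverage else None
```

So with Helvetica selected, 中 (U+4E2D) produces no preview at all, while switching to Arial Unicode MS makes it appear.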
The goal there is to keep the interface from “lying” to the user: if you select a specific font but Codepoints lets a character be rendered by something else, that would be misleading.
If you want to know why a specific character was or wasn't rendered, you can mouse over the right side of the row and a tooltip will appear. Most of the time this tooltip is accurate; however, there are some edge cases where a font claims to be able to render a character and Codepoints trusts it, but the font doesn't actually return a glyph. You can sometimes see this in the higher Emoji range when the Apple Color Emoji font gets a little overzealous, but also with some obscure symbols in the lower range.
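That edge case can be modeled by distinguishing what a font claims from what it actually delivers. The sketch below assumes each font exposes a claimed character map plus an actual glyph lookup; both the data and the font names are invented for illustration, not taken from any real font.

```python
# Hypothetical model: a font whose character map claims more than
# its glyph table can actually deliver.
class Font:
    def __init__(self, name, claimed, glyphs):
        self.name = name
        self.claimed = claimed  # code points the font claims to cover
        self.glyphs = glyphs    # code points that really yield a glyph

def preview_with_reason(font, codepoint):
    """Return (glyph_or_None, tooltip_reason), never falling back."""
    if codepoint not in font.claimed:
        return None, "font does not claim this character"
    glyph = font.glyphs.get(codepoint)
    if glyph is None:
        return None, "font claims this character but returned no glyph"
    return glyph, "rendered by " + font.name
```

An honest font either renders a character or is reported as not claiming it; an overzealous font can claim a code point, pass the first check, and still come back empty-handed, which is exactly when the tooltip and the preview disagree.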
I haven’t decided whether to actually switch the default font yet, as I need to do some testing to see exactly what happens when you disable major fonts like this; however, I do plan to deal with it in 1.0.x :)