Early February this year I suddenly noticed a flashing behaviour of some of the pages of my website. I have fixed it in the meantime, but I deliberately left part of the old situation intact in one case, so I can still demonstrate the phenomenon.
Compare this: after the fix, a new list of random page links appears immediately after clicking the refresh button, or pressing F5 (that’s Fn-F5 on some laptop or desktop computers, depending on BIOS configuration).
But when called like this, you should see the flashes. Or maybe you don’t.
What I see under Linux Mint 20.1 (Ulyssa) using Firefox 109.0.1 is this:
Briefly, but long enough to be recognisable, the page appears as black letters on a white background, using system fonts and taking the full width of the screen. So apparently, my CSS is not yet in force. Then when the CSS kicks in, the text is redisplayed in green on a brown background (nature’s colours), only a limited width is used so paragraphs do not become unreadably wide, and the font is what I chose it to be: Gentium Book Plus.
Especially because of the colour difference, there is a flashing effect, which I find quite annoying.
On an old smartphone, obsolete Android version, Google Chrome: doesn’t flash, the old way or the new way alike. Same, but using the standard browser pre-installed by Samsung: always flashes.
Same test on a more recent Samsung Galaxy tablet: does not flash, not with Chrome and not with Samsung’s browser.
Back to Linux Mint 20.1, now using what I thought was Google Chrome, but it identifies itself as Chromium, version 109.0.5414.119: does not flash either way. Strange, because I distinctly remember the problem was there in that browser too, a few days ago.
Windows 10, Firefox 109.0.1: no flash whatever I do. Windows 10, Edge (also version 109, and based on Chromium; are all browsers the same nowadays?): same thing, the problem is not reproducible.
So I fixed it for nothing? Well, I’m still glad I did it, otherwise the problem would still be noticeable in the situation I am in most of the time: Firefox and Linux Mint.
How can this happen? I suppose it’s because the CSS loads slowly. Apparently Firefox under Linux Mint 20 first loads, interprets and displays the HTML, even before the CSS is handled, and only then redisplays everything taking the CSS into account. Doesn’t seem very sensible, because you know beforehand that the first rendering will have to be overwritten soon.
Perhaps they do it, so that if the CSS is extremely slow (slow line? I have Fiber to the Home here), the user at least gets to see SOMEthing, ANYthing, even if the layout isn’t optimal, and she won’t have to wait until all the work is done.
Or maybe it has to do with optimally utilising the now almost ubiquitous multi-core processors?
If the CSS is slow, what can cause it? And why did I never notice it before, but only recently, the last couple of days or weeks?
A recent change I made is downloaded fonts. As a heritage of when I still used Windows, before August 2019, in my CSS I mentioned lists of Windows fonts that I like, followed by a generic serif, sans-serif, or monospace. It works, but as a webmaster you are not in full control of what users will get to see: Windows fonts are generally not available on Apple Mac or Linux systems (the latter including Android), so those platforms will present some similar font from their own repertoires.
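For illustration, such an old-style font stack might look roughly like this (the font names here are just an example, not my actual lists):

/* Old approach (illustrative only): name fonts that happen to be
   installed on Windows, and end with a generic fallback so other
   platforms can substitute something similar of their own. */
body {
  font-family: "Palatino Linotype", Georgia, serif;
}
code, pre {
  font-family: Consolas, "Courier New", monospace;
}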
I knew there was a better way, so late last year and early 2023 I implemented and tested that: fonts are no longer assumed to be on the visitor’s system, but are downloaded using @font-face in CSS. Downloading from a public site like Google Fonts is a possibility, but I preferred to host the selected fonts (all with an appropriate free licence, of course) on my own site instead.
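A minimal sketch of what such a self-hosted rule can look like (the path and file name here are made up for the example, not necessarily what is really on my server):

/* Sketch of a self-hosted, downloadable font; path and file name are hypothetical. */
@font-face {
  font-family: "Gentium Book Plus";
  src: url("/fonts/GentiumBookPlus-Regular.woff2") format("woff2");
  font-weight: normal;
  font-style: normal;
}
body {
  font-family: "Gentium Book Plus", serif;
}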
The upside is that all users will get to see the pages in exactly the same layout and font. Or virtually the same, as for example the width of a scrollbar may vary slightly. The downside of course is the extra time for downloading the font files.
But what I hoped and expected is true: this isn’t the cause of any slowness. If you clear the browser cache (it can be done using Ctrl-Shift-Del in many browsers), you see a delay the next time, but just once. The font files are cached, by the browser and perhaps also close-by by the access provider, and no further delay is noticeable.
Myself being in Europe, I used a temporary American VPS for testing this, hired from the French company Virtua Cloud. The font files having to come from California to the Netherlands all the way over the American continent, and under or over the ocean, does create a noticeable delay. But only that one time, just after clearing the cache. Not any next time. So it is never a problem in practice.
So what IS the cause of the flashing then?
After systematically excluding possible causes, I found out with certainty that it must be in the way I cascade my CSS files (CSS = Cascading Style Sheets).
It has to do with my preference for crazy colour combinations, as described here. I don’t want to force my readers to accept my craziness, so I offer buttons to get normal, neutral colours instead. To implement that, I wrote a program called ‘colschem.c’, which after compiling results in colschem.cgi.
The initial idea was that the program would filter CSS content read from files, meaning it would change the colour settings in the existing files. To that end, instead of reading the CSS from files using <link rel="stylesheet" href="…">, I let a CGI program, written in C, do the reading, so I can manipulate whatever I want in any way I want.

For security reasons, CGI ‘scripts’ (written in any scripting language or programming language, in my case usually sh or C) are in a directory (/cgi-bin) that cannot be accessed directly from the web, and the scripts in turn can only access data in subdirectories of directory /cgi-bin. So all my CSS files were in /cgi-bin/css.
The consequence of that was that any @import url("…"); statements also could not directly access a file, but had to call colschem.cgi again.
Later on, instead of filtering, i.e. manipulating the CSS read, I used the cascading idea: any CSS setting you add overwrites and overrides any earlier setting of the same property (in the same context). So I don’t filter, I just add. Much simpler.
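In CSS terms, that just means appending a later rule with the same selector and properties. A sketch of the mechanism (the colour values are only an illustration of the idea, not my exact style sheets):

/* Earlier, in the ordinary style sheets: the ‘crazy’ colours. */
body {
  color: green;
  background-color: brown;
}

/* Later, appended by colschem.cgi when neutral colours are wanted:
   same selector and properties, so these declarations win. */
body {
  color: black;
  background-color: white;
}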
This setup is inefficient, because things that need only be done once, like checking cookies, checking time conditions, and adding the colour settings if necessary, are done multiple times. Often two times, sometimes three times or more.
For example in foneport.htm, I link in foneport.css, which imports fonegnrl.css, which in turn imports textfont.css. And in Random view (itself also a CGI program) I link in mainmenu.css, which imports menu.css, which imports textfont.css.
This is the web page where I first noticed the flashing.
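In the old, CGI-based setup such a chain looked roughly like this (simplified; the real files of course contain more):

/* In foneport.css, as served through the CGI program: */
@import url("../cgi-bin/colschem.cgi?url=fonegnrl.css");

/* In fonegnrl.css, in turn: */
@import url("../cgi-bin/colschem.cgi?url=textfont.css");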
To verify my assumption that this multiple, nested calling was the cause of the strange flashing screens, I temporarily added four or five extra levels, each adding yet another call of cgi-bin/colschem.cgi. And yes, that made the problem worse. And it was less noticeable in the case of two-level nesting than when there were three levels.
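Such a temporary extra level can be as simple as a CSS file that does nothing but import the next one through the CGI program again; roughly like this (the names extra1.css and extra2.css are made up, just to show the idea):

/* extra1.css (hypothetical test file): adds one more nesting level. */
@import url("../cgi-bin/colschem.cgi?url=extra2.css");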
Does this explain why I noticed the problem only recently?
Is the problem in fcgiwrap, which I use in combination with nginx? Do similar solutions suitable for use with apache not have the problem? But my noticing is much more recent than my transition from apache to nginx, which went into production in February 2022.
And as we saw in recent tests, the problem does not occur in all circumstances.
The solution of course is simple. If colschem.cgi, based on cookies or timing, sometimes overrides colours, it suffices to call it once, after the other CSS files have been read and imported. Those can then be served from the web server in the normal way, reading them from static files, not using CGI.
A small-scale test indeed showed that this eliminated the flashing problem. It requires only small changes:

1. Copy the CSS files from /var/www/cgi-bin/css to /var/www/html/css.

2. In the copies, change for example:
@import url("../cgi-bin/colschem.cgi?url=menu.css");
to:
@import url("menu.css");
so the import is done in the same directory as where the importing CSS resides.

3. Change colschem.c such that if no url for the CSS is specified, the step of serving that CSS is skipped, but the checks (involving cookies, and other conditions) whether the weird or the neutral colours are to be used, are still done.

4. In the HTML files, change for example:
<link rel=stylesheet type="text/css" href="../cgi-bin/colschem.cgi?url=foneport.css">
to:
<link rel=stylesheet href="../css/foneport.css">
<link rel=stylesheet href="../cgi-bin/colschem.cgi">
The part type="text/css" does no harm, but it is not necessary in HTML5, although perhaps it was, more or less, in HTML4. So I removed it, now that I had to make changes anyway.

5. Remove the CSS files from /var/www/cgi-bin/css, but leave three of them there, unmodified, so the old behaviour can still be shown: mainmenu.css, menu.css, textfont.css.
The changes in step 4 above are straightforward and simple, except that the name of the CSS file can be different each time. And the number of levels in ../cgi-bin etc. varies with the level in the directory tree where the HTML resides. So sometimes it has to be ../../cgi-bin or ../../../cgi-bin etc., and the same for the new ../css, etc.
Also, trivial changes are still an awful lot of work if they have to be made, as in the case of my website, in over 1700 files! So I did not do that by hand, but automated it. I tried the stream editor sed, couldn’t get it to work, then I turned to the good old Unix editor ed, which is very old, probably from the 1970s, and it certainly existed in 1985. But in 2023 it is still there, now smart and modern enough to handle Unicode in UTF-8, where older editions of course could only do US-ASCII.
After some fruitless experimenting, I developed this simple script:
/^.link..*stylesheet..*cgi-bin.colschem.cgi.url..*[a-z][a-z]*\.css..*$/d
x
x
-1s@cgi-bin/colschem\.cgi?url=@css/@
s@type="text/css" @@
+1s@colschem\.cgi?url=[a-z][a-z]*\.css@colschem.cgi@
s@type="text/css" @@
w
q
The idea is that ed finds the old link line, deletes it into the cut buffer, puts it back twice (the two x commands), and then rewrites the first copy into the link to the static CSS file and the second copy into the call of colschem.cgi without a url parameter. I put this in a file called ed.in, which is where a script I wrote years ago expects it. I fed the script with the paths of my HTML files, generated by find. The script looks like this:
#! /bin/sh
#
# Multi-ed, apply ed to several files in one go.
# Copyright (C) 2008 by R. Harmsen
#
# Reads filenames (paths) from stdin. An ed script is expected
# in a file called ed.in in the current directory.
#
while read F
do
   B=`echo $F | sed 's/\(.*\)/\1\.bak/'`
   cp $F $B
   echo "Editing file $F ..."
   ed $F < ed.in
   echo "Done editing file $F\n"
done
This worked well. But I had overlooked the fact that the filenames of some of my CSS files do not only contain lowercase alphabetic letters, but also a digit or a dash. So I ran a modified script – after replacing [a-z][a-z]* by [a-z][a-z0-9-]* – to cover those too.
Then I found I had made some assumptions about my HTML which did not always hold true, so the script had introduced mistakes. I started correcting these by hand (using nano, not ed), but realised it was too much work, so I wrote another script to make the corrections automatically. In some cases that didn’t work correctly either, so it introduced yet different mistakes, but only in a small number of cases, so it was feasible to get those right in nano.
Then I was done. Easy! And efficient.
Copyright © 2023 by R. Harmsen, all rights reserved.