Doh! Hitting the docs again (RTFM!), I discovered perl's <> globbing is now (as of 5.something) handled internally by File::Glob. Poking around, I started to see the light. Globbing can be used in list *and* scalar context (as with most perl functions). Perl wasn't really "guessing" at all: because I was immediately negating the result, the glob was evaluated in scalar context, when what I really wanted was list context. Doh.
In scalar context, globbing magically acts as an iterator. As long as the glob pattern doesn't change, it maintains an internal pointer, returns the next matching file on each call, and finishes by returning undef once it's out of files.
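For anyone else who trips over this, here's a minimal sketch of the two behaviors; the *.out pattern is just a stand-in, not what my script actually globs:

    use strict;
    use warnings;

    # List context: every match comes back at once, so this is what you
    # want for a "did we get any files?" check.
    my @files = glob('*.out');
    print "found ", scalar(@files), " file(s)\n";

    # Scalar context: the same pattern becomes an iterator, handing back
    # one filename per call and undef when the cached results run out.
    while ( my $file = glob('*.out') ) {
        print "next: $file\n";
    }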
So my program was really foobar in how it used the glob to check for the error case of no files being output by my other script.
Since my other script creates between 1 and 8 output files on each run, that explains the "random" behavior I was seeing. It probably ran the glob the first time, say on 5 files, cached the glob results, and simply returned the next cached file entry on each call. Once all 5 were exhausted, it returned undef and did the "try #2". At least I think that's what was happening. Just yesterday I saw a "try #3", and I can't really think how that would ever occur, since a glob that has already returned undef should "reload" on the next call.
The bigger problem was that this meant my sanity check that >0 files were output would not work some percentage of the time.
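The fix, then, is just to force list context before testing. A rough sketch of what I ended up with (the pattern and message are placeholders, not my actual script's):

    my @outputs = glob('run_*.dat');    # list context: all matches at once
    unless (@outputs) {
        # the "try #2" re-run logic would go here
        die "other script produced no output files\n";
    }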
Thanks all. As usual, talking through the problem leads one to the solution.
Aside: as for backticks vs system, I didn't need the output so I used system. As I get older, I hate backticks more and more, as they can be susceptible to injection attacks unless you're very, very careful about the variable interpolation going on. I really love the list form of system, which involves no shell interpolation and is thus guaranteed shell-safe: system 'ps', '-f', 1;
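To illustrate the difference (the hostile $pid value here is made up):

    my $pid = '1; rm -rf /tmp/junk';        # imagine this came from user input

    # String form: handed to the shell, so the injected command would run.
    # system("ps -f $pid");                 # don't do this

    # List form: no shell involved; each element is a literal argument,
    # so the nasty string is just a bogus PID as far as ps is concerned.
    system('ps', '-f', $pid) == 0
        or warn "ps exited with status $?\n";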