On two occasions I have been asked [by members of Parliament!]: 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.
-- Charles Babbage

Introduction

One of my (maybe bad) pet peeves is compiler warnings. I try to compile cleanly at the highest warning level, go through all the warnings one by one to really understand what they are fussing about, and then fix them. The reasoning behind this is that the compiler writers are usually much, much better at the language than I am. If they bothered enough to put in a warning message, then there is probably a catch somewhere. Dismissing it out of hand as stupid is done at your own peril. Every now and then there is some subtle gotcha that the compiler writers understand but you have yet to grasp about the language. And boy do you feel stupid after a monster debug session only to find that the bug you were searching for was already flagged by the compiler in a warning that you ignored!

Open source

Every now and then I download new versions of open source libraries and compile them locally for the different configurations I have here at home. Unlike most, I tend to generate the buildfiles locally with my Badger Config program to have full control over the compiler settings. This is in part to avoid running into issues like linking against different runtime libraries (which is a horrible idea; never ever ignore link error LNK4098[1]). The downside to this is that I get to see all the warnings in the source code as well. It's usually a grim picture. Warnings flash by faster than you can count, and your confidence in the code quickly sinks. Now, the zlib maintainers, for example, claim that they just ignore the warnings and write code that works. Fine. That might actually work for the C library they're maintaining, but honestly I think it's a horrible idea. As I already stated, the compiler writers are usually experts at the language. You might think you are too, but let's step back. Compiler writers work every day on a product that deals with the transformation of the language. Many of us ordinary programmers are just using it as a tool to solve some other problem; mostly the language is just a way to express our idea of the solution. If someone is nice enough to step in and point out potential errors in our expressions, we should listen.

The issue of portability comes up. We can't possibly support all the compilers out there, now can we? I don't know, but the most common front ends out there are probably Microsoft's Visual C++ and gcc. Frankly, neither is nearly as rigorous as a decent lint program. Both compilers are free of charge, so there is no real excuse not to clean up the code to at least compile cleanly on both of them.

Professional source

That's open source. Ok, we might forgive them; the code is free after all, and the maintainers are usually managing a large number of volunteers who are mostly interested in getting their feature in and not in the overall code quality. Understandable. But in an environment where you have professional programmers working on a common product and source base, is there any excuse not to compile cleanly at the highest warning level? I don't think so. I've heard numerous arguments, but none that I've found compelling or convincing. The weakest one I've heard was that there was no time. That was on a codebase of around 1M lines of code. I inherited a branch of this where they compiled at a very low warning level on Visual Studio 6. We had just migrated to 7 and cranked the warning level all the way up. And boy, were there warnings. A lot of them. After a couple of days of mind-numbing, drudging, boring pressing of F7 and F8 to find and fix them all, we compiled everything cleanly. That was a couple of days of one person's time. I even think I fixed a couple of multiple inheritance casting bugs at the same time. That time is nothing if you count the agony of debugging bugs that don't need to be there. It's bad enough that we code logical bugs; when the language itself also fights you, there is no chance to win and keep your sanity.

64 bit

Warnings for pointer casts and truncation of integers come up pretty frequently whenever we consider the 64 bit platforms. This might seem silly, but even now it's not out of the realm of possibility to require a 64 bit address space. Porting might be a monumental task, close to impossible if you have a large codebase with bad practices. Visual Studio has an option to turn on 64 bit portability warnings. For a start, turn it on and see if you have any warnings in the codebase. If you have none, that's fantastic; you're already pretty safe. Most of us will, however, have problems where programmers have assumed (maybe not even consciously) a 32 bit platform, maybe even an Intel 386+ platform.

All these assumptions are of course wrong, and code that relies on them is actually in violation of the language. C programmers love to do this, though. Especially the pointer magic[2]. It does not work in C++[3]. No. Really. And forget about enabling strict aliasing after this.

Pragmas

Ok. Fine. But Visual Studio complains about a lot of bogus things as well. Sometimes you just have to work around that as the price for getting the warnings; I feel that it's well worth it. But beware of the pragmas controlling warnings in Visual Studio. It's very easy to just disable most of the warnings! That gives you the worst of both worlds: you think you compile cleanly at high warning levels, but you don't see the warnings and thus have the potential bugs anyway. If you find that you really cannot fix a warning, here is a preferred way to disable that specific warning.


#ifdef _MSC_VER
	#pragma warning (push)
	#pragma warning (disable : 4018)
#endif

void myBadFunction()
{
	// ...
}

#ifdef _MSC_VER
	#pragma warning (pop)
#endif
Listing 1: Sample showing how to disable specific warnings under Visual Studio.

Somewhat convoluted, but it guards against other compilers and it doesn't assume anything about the translation unit being compiled. There might be a lot of code in the rest of the translation unit that triggers the same warning but could be fixed, and we don't want to stomp over it. We also need to consider future code that might introduce the warning: without the pop, disabling the warning here would silence it for all the code past this snippet as well.

In closing

Warnings. We love them and hate them. These are of course just the ones that are built into the compiler. Since the big lint/compiler split[4] we also have to manually run lint on our source, and that's sure to produce even more warnings than the compiler can find. For all you open source developers out there, please make sure that your code compiles cleanly on at least one of gcc[5] or msvc at the highest warning level.

Footnotes

[1] Linking against two different runtimes in Visual Studio might cause an allocation from the debug heap and then a free from the release heap. The two heaps have different headers before the memory area, and you will get memory corruption and possibly a crash. MSDN entry.

[2] The correct way is to use the standard-defined typedef uintptr_t, which is guaranteed to be able to hold a void*.

[3] If ints are 32 bits, the following cast will clear the upper 32 bits: (void*)(uintptr_t)(int)(uintptr_t)(void*)&foobar;

[4] Fairly early on, lint and the C compiler were split into two separate applications; the theory was that the lint process slowed down the compilation and that the compiler could be made simpler if it only focused on compiling the code. True enough, but the corollary was that you had to trust the programmer to run lint every now and then and fix the warnings there. Well, that didn't happen (I've even met programmers who didn't know what lint was!). Modern compilers are slowly incorporating more and more features from lint into their parsing engines.

[5] For gcc, -Wall -Wextra -pedantic -ansi should be a good starting point for warnings.
