New projects treat compiler warnings as errors by default
All new project templates will set the "treat warnings as errors" flag to true by default.
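For C# projects, this flag corresponds to the MSBuild `TreatWarningsAsErrors` property in the project file. A minimal sketch of what a template-generated `.csproj` fragment might look like (the property names are real MSBuild properties; the surrounding content and the `CS0618` exemption are illustrative):

```xml
<!-- Illustrative .csproj fragment: the flag the new templates would set -->
<PropertyGroup>
  <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  <!-- Individual warnings can still be demoted back to warnings, e.g.: -->
  <WarningsNotAsErrors>CS0618</WarningsNotAsErrors>
</PropertyGroup>
```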
James World commented
I have to agree with Mark Russinovich (16:30 here: http://channel9.msdn.com/Events/Build/2014/3-615) on this one. Lost count of how many times I've consulted on projects that had hundreds of warnings in the build - no way of knowing which ones are important. A warning is a warning purely because the compiler requires human intervention to determine if it is an error or not. Suppress the non-errors, fix the errors. It's sloppy development to not have warnings treated as errors.
Matteo Tontini commented
Complete nonsense! In my world, the definition of an error is different from the definition of a warning (error != warning).
We should stop putting the blame on imaginary stupid developers who don't look at warnings.
Redefining words is a job for politicians. If you think warnings should be treated as errors, just make them errors!
Usually, projects follow the broken-window theory: if there are two warnings, there will soon be dozens and then hundreds of them, with all the respective consequences. So I usually force all warnings to be errors. On the other hand, there are still legitimate cases where a warning is acceptable, and there are tools for suppressing those messages. So having all new projects created with "warnings as errors" is an obvious benefit for me and the codebase I interact with.
Even more, to prevent warnings in new projects, builds are run on our CI server with TreatWarningsAsErrors=true passed manually.
Sure, we can't require everyone to adhere to this practice, so I'd consider adding a switch to the options, even one that is off by default (though setting it on would be more correct from an educational point of view).
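Passing the property on the command line, as described above, might look like this (a sketch; the solution name is illustrative, but `TreatWarningsAsErrors` is a real MSBuild property that the C# build targets honor):

```shell
# Illustrative CI build step: force warnings-as-errors without editing project files
msbuild MySolution.sln /p:TreatWarningsAsErrors=true
```

Setting it at the command line overrides whatever the individual project files specify, which is why it works as a CI-side backstop even for projects that don't opt in themselves.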
Dear Scott, both your suggestions are ridiculous. I'm very sorry, but that's a fact.
That makes no sense.
Floele: In my experience, leaving warnings as mere warnings defeats the point of having warnings. Developers do not generally pay attention to them, and many warnings are actually logical errors. Every warning has a way of saying, "I am consciously doing this unusual thing and I affirm that it is not a problem." By making all warnings errors by default, you force developers to actually address them in the code, either by fixing the problem they highlight or by proactively noting that they're OK.
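The "proactively noting they're OK" mechanism mentioned above is, in C#, the `#pragma warning` directive. A minimal sketch (the obsolete method and its caller are hypothetical; `CS0618` is the real warning ID the compiler emits for calls to obsolete members):

```csharp
// Sketch: explicitly acknowledging a warning instead of letting it linger.
// With warnings-as-errors enabled, this call would otherwise fail the build.
[Obsolete("Use NewApi instead")]
static void OldApi() { }

static void Caller()
{
#pragma warning disable CS0618 // Type or member is obsolete
    // Consciously calling the obsolete API; migration is tracked elsewhere.
    OldApi();
#pragma warning restore CS0618
}
```

The disable/restore pair keeps the suppression scoped to the one call that was deliberately reviewed, so new occurrences of the same warning elsewhere still break the build.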
As I said, I don't agree. But there could be options to decide whether the application should run when there are warnings, whether tests fail, and whether creating a setup is prevented in such cases. Maybe some easily accessible options to control when the build stops and when a confirmation is displayed.
Why? Warnings are called "warnings" because they are not errors. Making them errors by default defeats the point of having warnings at all, doesn't it?