C++ does not permit treating enum values as individual bits combined with
the bitwise operators: the result of such an operation is an int, which
cannot be implicitly converted back to the enum type. For types that are a
mask of flags, switch the typedef to an unsigned int and use preprocessor
macros for the flag constants.
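
As a rough sketch of the change being proposed (the flag names here are
hypothetical, not the actual CAM identifiers):

/*
 * Before: a typedef'd enum used as a mask of flags.  This is fine in C,
 * but in C++ the bitwise OR yields an int, and an int cannot be
 * implicitly converted back to the enum type.
 */
typedef enum {
	EXAMPLE_FLAG_OPEN    = 0x01,
	EXAMPLE_FLAG_BUSY    = 0x02,
	EXAMPLE_FLAG_INVALID = 0x04
} example_flags;

/*
 * After: an unsigned int typedef with preprocessor macros for the flag
 * constants, so the bitwise operators behave the same in C and C++.
 */
typedef unsigned int example_flags;
#define EXAMPLE_FLAG_OPEN	0x01u
#define EXAMPLE_FLAG_BUSY	0x02u
#define EXAMPLE_FLAG_INVALID	0x04u

A call site such as flags |= EXAMPLE_FLAG_BUSY; is valid C with either
definition, but only compiles as C++ with the second.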
Diff Detail
- Repository: rG FreeBSD src repository
- Lint: Lint Skipped
- Unit: Tests Skipped
- Build Status: Buildable 62636, Build 59520: arc lint + arc unit
Event Timeline
This is a common pattern in CAM to make things more debuggable.
Why #define rather than some variation on const int foo = 0x1000;?
Oh, I just did #define since that is the most common way of doing constants in the tree. I would not be opposed to using const int.
The only reason we do the funky ENUM bit dance is so gdb prints symbolic values better.
If const int gives us that, that's a reason to use it. Otherwise, I'm neutral: I'm not dogmatically
opposed to cpp, and const int doesn't solve a real problem in this case when cpp dogma is
factored out. Maybe 'C++ custom' would be reason enough to do it if we had a larger C++
developer community that adhered to that custom.
tl;dr: I'm good either way unless const int gives better debugging, which I'm skeptical of now that I'm typing this out.
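
For reference, the alternative raised above would look roughly like the
following (the flag name is hypothetical):

/*
 * A const object instead of a macro.  Note that in C a const-qualified
 * int is not an integer constant expression, so it cannot be used in
 * case labels or in initializers for objects with static storage
 * duration, which is one practical reason #define stays the common
 * idiom in the tree.
 */
static const unsigned int EXAMPLE_FLAG_QUEUED = 0x1000u;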