Tuesday, March 27, 2012

Are Bit Flags Bad?

Is it bad practice to use integer data types to represent multiple values via bit flags? It seems to go against the rules of normalization, in that a single field can represent multiple values. On the other hand, since these values can be tested for via bitwise operations, it doesn't seem entirely bad.
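For concreteness, here is a minimal sketch of the pattern I mean, in T-SQL (the table and flag names are made up):

-- Hypothetical flag values, each a distinct power of two:
-- 1 = Active, 2 = Verified, 4 = Locked, 8 = Admin
CREATE TABLE Users (
    UserId      int NOT NULL PRIMARY KEY,
    StatusFlags int NOT NULL DEFAULT 0   -- several booleans packed into one column
);

-- Test a single flag with bitwise AND: find all verified users
SELECT UserId
FROM Users
WHERE StatusFlags & 2 = 2;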

Any insight would be appreciated.

Thanks|||I've used integers to store multiple bit states on occasion and found it very useful.

For instance, you can have multiple boolean security states, and if you think of them as separate boolean values, then separate bit fields would seem appropriate. But if you think of them as a single value describing the overall state of the security settings, then you aren't really violating any principles by storing them as a single field.
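As a hedged sketch, assuming an Accounts table with an int SecurityState column (the table and flag names are hypothetical), setting and clearing individual flags looks like this:

-- Hypothetical permission flags: 1 = CanRead, 2 = CanWrite, 4 = CanDelete
UPDATE Accounts
SET SecurityState = SecurityState | 2    -- grant write with bitwise OR
WHERE AccountId = 42;

UPDATE Accounts
SET SecurityState = SecurityState & ~4   -- revoke delete by masking the bit off
WHERE AccountId = 42;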

At a practical level, SQL Server will combine up to eight bit fields and store them internally as a single byte anyway.
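For comparison, a hypothetical version of the same idea using separate bit columns; since SQL Server packs these together, the storage cost works out the same either way:

CREATE TABLE AccountFlags (
    AccountId int NOT NULL PRIMARY KEY,
    CanRead   bit NOT NULL DEFAULT 0,
    CanWrite  bit NOT NULL DEFAULT 0,
    CanDelete bit NOT NULL DEFAULT 0   -- up to eight bit columns share one byte of storage
);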

Normalization is good practice because it leads to efficient and functional design. But if you avoid an instance where violating it would actually improve an application, then you are letting the cart lead the horse. Normalization is a principle, not dogma.|||There is a multitude of examples where M$ itself used ...smallint... as a datatype for this type of scenario. I think int would be overkill, while following that trend would save you storage space while doing the job.
