What is exactly meant by "flag" in computer programming?

Word "flag" is used in computer programming almost on a daily basis.  A example of usage of this word is as follows:

Some car object property might be a flag representing whether or not the motor is running. What exactly is meant by "flag"?

jkr Commented:
See http://en.wikipedia.org/wiki/Flag_%28computing%29 - I could not sum that up better: "In computer programming, flag refers to one or more bits that are used to store a binary value or code that has an assigned meaning."
Compare it to the expression "to flag something", as in marking it for some given purpose. A flag is just a mark that is either set or not, for whatever reason you need it.
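A minimal sketch in Python, using a hypothetical Car class like the one in the question (the class and attribute names are illustrative, not from any particular library): the flag is just a boolean attribute that marks whether the motor is running.

class Car:
    def __init__(self):
        self.motor_running = False   # the flag: not set initially

    def start(self):
        self.motor_running = True    # set the flag

    def stop(self):
        self.motor_running = False   # clear the flag

car = Car()
car.start()
if car.motor_running:                # test the flag
    print("Motor is running")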

Usually it's only 1 bit - several flags can be grouped together (and often are) into one multi-bit value.
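To illustrate the grouping, here is a small sketch (in Python, with made-up flag names) where each flag occupies one bit and several are packed into a single integer using bitwise operators:

MOTOR_RUNNING = 0b0001   # bit 0
LIGHTS_ON     = 0b0010   # bit 1
DOORS_LOCKED  = 0b0100   # bit 2

state = 0                    # all flags cleared
state |= MOTOR_RUNNING       # set a flag
state |= DOORS_LOCKED        # set another flag
state &= ~LIGHTS_ON          # clear a flag

if state & MOTOR_RUNNING:    # test a flag
    print("Motor is running")

print(bin(state))            # 0b101: motor running and doors locked

This bit-packing style is common in C-family APIs (e.g. file-open modes or status registers), where one integer carries many independent yes/no values.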