84

What is the point of #define in C++? I've only seen examples where it's used in place of a "magic number" but I don't see the point in just giving that value to a variable instead.

9 Answers

164

The #define is part of the preprocessor language for C and C++. When defined names appear in code, the preprocessor replaces them with whatever you specified. For example, if you're sick of writing for (int i=0; i<=10; i++) all the time, you can do the following:

#define fori10 for (int i=0; i<=10; i++)

// some code...

fori10 {
    // do stuff to i
}

If you want something more generic, you can create preprocessor macros:

#define fori(x) for (int i=0; i<=x; i++)
// the x will be replaced by whatever is put into the parentheses, such as
// 20 here
fori(20) {
    // do more stuff to i
}

It's also very useful for conditional compilation (the other major use for #define) if you only want certain code used in some particular build:

// compile the following if debugging is turned on and defined
#ifdef DEBUG
// some code
#endif

Most compilers will allow you to define a macro from the command line (e.g. g++ -DDEBUG something.cpp), but you can also just put a define in your code like so:

#define DEBUG

Some resources:

  1. Wikipedia article
  2. C++ specific site
  3. Documentation on GCC's preprocessor
  4. Microsoft reference
  5. C specific site (I don't think it's different from the C++ version though)

6 Comments

And, by the way, there is no good reason to use a #define for a constant in C++ these days. In fact, it's considered un-C++ or something like that.
Those who do not fear BOURNEGOL are doomed to repeat it.
Just as a note, these are all really contrived examples just to show what the preprocessor can do. I do not recommend anyone actually use any of these.
Note that #define fori(x) for (int i=0; i<=x; i++) should be written #define fori(x) for (int i=0; i<=(x); i++) instead. (Note the additional brackets.) The use of brackets may not be obvious in this example, but something will go wrong if someone puts a bitwise operation in argument x.
This should have been the chosen answer.
61

Mostly stylistic these days. When C was young, there was no such thing as a const variable. So if you used a variable instead of a #define, you had no guarantee that somebody somewhere wouldn't change the value of it, causing havoc throughout your program.

In the old days, FORTRAN passed even constants to subroutines by reference, and it was possible (and headache inducing) to change the value of a constant like '2' to be something different. One time, this happened in a program I was working on, and the only hint we had that something was wrong was we'd get an ABEND (abnormal end) when the program hit the STOP 999 that was supposed to end it normally.

2 Comments

@farhangdon "Mostly stylistic these days." is the answer to the question. Basically, use it if your style allows it. supercheetah's answer is better.
Mostly stylistic these days? Nope. It is better to use defines to avoid a constant that consumes memory, which is especially important when programming microcontrollers with limited resources available.
39

I got in trouble at work one time. I was accused of using "magic numbers" in array declarations.

Like this:

int Marylyn[256], Ann[1024];

The company policy was to avoid these magic numbers because, it was explained to me, that these numbers were not portable; that they impeded easy maintenance. I argued that when I am reading the code, I want to know exactly how big the array is. I lost the argument and so, on a Friday afternoon I replaced the offending "magic numbers" with #defines, like this:

 #define TWO_FIFTY_SIX 256
 #define TEN_TWENTY_FOUR 1024

 int Marylyn[TWO_FIFTY_SIX], Ann[TEN_TWENTY_FOUR];

On the following Monday afternoon I was called in and accused of having passive defiant tendencies.

11 Comments

They were very right. #define MARYLYN_SIZE 256 is what you want.
The point is to separate the meaning. Why is 256 VS 1024 important? It's a 'magic number' because there's no indication of why the value is important. When you know why the value is important, the literal value becomes less important.
And then you were fired, right? This was not a very professional way to behave. You were out of line and did not perform the simple task assigned to you. The company policy is correct - just dumping in 256 and 1024 is not maintainable, yet when confronted with this you decided to make it worse. If this behaviour continued in my place you'd be out.
@JAMESBRYANB.Juventud How many Marylyns and Anns should there be? What decision drives that? Name the constants after that decision. e.g. static constexpr const size_t MAX_HOMES_AVAILABLE = 1024; int Ann[MAX_HOMES_AVAILABLE];. If the choice is really obvious and localised then using 1024 directly isn't too bad, but doing what Harpee did is just trolling.
@JAMESBRYANB.Juventud Note that this doesn't have much to do with portability
8

#define can accomplish some jobs that normal C++ cannot, like guarding headers and other tasks. However, it definitely should not be used as a magic number- a static const should be used instead.

Comments

7

C didn't use to have consts, so #defines were the only way of providing constant values. Both C and C++ do have them now, so there is no point in using them, except when they are going to be tested with #ifdef/#ifndef.

4 Comments

They are still handy if you are using templates, you need to know the value in compile-time.
@Erbureth consts are known at compile time.
@Erbureth: I believe const values at namespace scope have internal linkage by default, so you are also not using a global variable. That may only apply to integral constants, though.
And had I read the comments further down the page, I'd have seen that this point has already been made. Oh well.
4

Most common use (other than to declare constants) is an include guard.

Comments

1

Define is evaluated before compilation by the pre-processor, while variables are referenced at run-time. This means you control how your application is built (not how it runs).

Here are a couple examples that use define which cannot be replaced by a variable:

  1. #define min(i, j) (((i) < (j)) ? (i) : (j))
    note this is evaluated by the pre-processor, not during runtime

  2. http://msdn.microsoft.com/en-us/library/8fskxacy.aspx

3 Comments

The define can easily be replaced by an inline function.
That define is "expanded" by the pre-processor into a C expression which may or may not be evaluated at compile time, e.g. min(1,2) vs min(a,b) where a and b are variables.
@Bo Persson: a template function is needed, inline or not, since it should work on all types.
1

The true use of #define should be for controlling the preprocessor stage, to manipulate the source code, before the compiler runs.

Include guards are the most common example:

#ifndef HEADER_FILE_H
#define HEADER_FILE_H
...
#endif

(Note: names like _HEADER_FILE_H, with a leading underscore followed by a capital letter, are reserved for the implementation, so the guard symbol here avoids that form.)

The code within the guard is only included in the source code once, through the logic that the preprocessor will only include it when HEADER_FILE_H is not defined and, upon including it, this preprocessor symbol is defined. So any further attempt to include this code will fail the #ifndef and the code will be excluded by the preprocessor.

But another use is to conditionally include code, such as only compiling code for a particular compiler or feature (as all compilers define symbols to allow for their detection).

#if !defined __cplusplus
...[code to include when NOT compiling as C++]...
#endif

#if __cplusplus >= 201103L
...[code that's only included if compiling to at least the C++11 standard]...
#endif

#if defined(__clang__)
...[code for Clang compiler]...
#elif defined(__GNUC__) || defined(__GNUG__)
...[code for GCC]...
#elif defined(_MSC_VER)
...[code for MSC]...
#endif

Another benefit of #define is that preprocessor symbols can be defined on the command line, so you can add, for example, -DDEBUG to the compiler arguments, which effectively adds a #define DEBUG before running the preprocessor.

So you can turn on / turn off, for example, debugging directives at the command line (or within your makefile or whatever).

#ifdef DEBUG
   #define Log(msg)    fprintf(stderr, msg);
#else
   #define Log(msg)
#endif

Using the above code, you can sprinkle Log("entering main loop..."); throughout your code to provide debug logging while it runs.

But it'll only be included in the executable if DEBUG is defined. Otherwise, the #else applies and all those Log macros equate to absolutely nothing and vanish.

This is really what #define is for. It creates a preprocessor symbol and you can use that to control the preprocessor in various ways to manipulate the source code before it even reaches the compiler.

In this context, it's extremely useful and that's what you're supposed to be using it for.

But, yes, one trivial way it can manipulate the source code is just a straight substitution that you can use to create 'magic numbers':

#define ARRAY_SIZE 256

int Array[ARRAY_SIZE];

Which works. But that's not really what it's for. And, yes, a const probably makes more sense for defining your constants.

#define is a preprocessor directive, meant for the preprocessor stage before compilation: it lets you manipulate the source code BEFORE you send it to the compiler.

So I wouldn't say that there's no point to it or compare it to const, but rather point out that it's supposed to be for a different stage of compilation. Like, #define is perfect for conditional compilation, const is probably what you want for defining constants.

It's not really that one is better or worse than the other. It's more that they serve different purposes in different stages of compilation - but, sure, you can do "magic numbers" at any stage. But if that's what you want to do, almost certainly better to use const for that sort of thing.

Comments

0

The #define allows you to establish a value in a header that would otherwise compile to a size greater than zero. Your headers should not compile to anything of size greater than zero.

// File:  MyFile.h

// This header will compile to size-zero.
#define TAX_RATE 0.625

// NO:  static const double TAX_RATE = 0.625;
// NO:  extern const double TAX_RATE;  // WHAT IS THE VALUE?

EDIT: As Neil points out in the comment to this post, the explicit definition-with-value in the header would work for C++, but not C.

2 Comments

Um, wrong. const double TAX_RATE = 0.625. consts have translation unit scope by default.
In C constants have external linkage and thus should be defined in .c files. In C++ they have internal linkage and thus could be defined in header. So, agree with @Neil if C++ and not C. (The question relates to magic numbers in header, so I disagree with other tangents for #define as used for include guards or any non-magic-number discussion.)
