

How to solve this problem?

#define x y;
#define z  x;
#define y x;
main()
{
// statements;
}

Does this statement execute?
Asked by: ganeshkumar_cse
4 Solutions
 
Let_Me_Be commented:
You have a bad prototype for the main function. It has to be either int main() or int main(int argc, char* argv[]).

But otherwise there is no problem with your code. You didn't specify what you want it to do, therefore we can't check that.
 
evilrix (Senior Software Engineer, Avast) commented:
>> does this statement execute?

Short answer: this is invalid C code, since the main function is malformed, so it shouldn't even compile. main should return an int, so the definition should be:

int main()
{
   return 0;
}

Now, assuming you correct this issue, there is no code in main, so although the main function would execute it would do nothing of any use. It would just terminate immediately.
 
evilrix (Senior Software Engineer, Avast) commented:
>> how to solve this problem?
And that problem would be what?

 
pgnatyuk commented:
No.
Probably you wanted something like:
template<class T>
void swap(T& x, T& y)
{
   T z = x;
   x = y;
   y = z;
}
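
Since this question is in the C zone, the plain-C counterpart of such a template would normally be a swap macro. A minimal sketch (the SWAP name and the explicit type argument are illustrative, not from the original code):

#define SWAP(T, a, b) do { T tmp_ = (a); (a) = (b); (b) = tmp_; } while (0)

/* Usage:
   int i = 1, j = 2;
   SWAP(int, i, j);   -- i is now 2, j is now 1
*/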
 
 
phoffric commented:
Maybe it depends on which compiler you are using, but for ANSI C circa 1988,
main() {
...
}
is ok because by default it returns an int.

This question is listed in the C Programming zone, so why bring up templates?
 
Superdave commented:
I tried this:

 gcc -E -
#define x y;
#define z  x;
#define y x;
main()
{
// statements;
f(x,y,z);  
}

and got:

# 1 "<stdin>"
# 1 "<built-in>"
# 1 "<command-line>"
# 1 "<stdin>"



main()
{

f(x;;,y;;,x;;;);
}

So, x is x, y is y, and z is x.  But you probably don't want the semicolons at the end of your #defines.
 
pgnatyuk commented:
You don't need a semicolon in a #define.
The code below will print 2 instead of 1.
If you need to swap the values, you do not need this define at all.

#include <stdio.h>   /* for printf */

int x = 1;
int y = 2;

#define x y

void f(int num)
{
	printf("num = %d\n", num);
}

int main()
{
	f(x);
	return 0;
}

 
Let_Me_Be commented:
> Maybe it depends on which compiler you are using, but for ANSI C circa 1988,

That's why there are standards. And according to both C standards, not specifying the return type is invalid.
 
phoffric commented:
re: "So, x is x, y is y, and z is x.  But you probably don't want the semicolons"

I believe the semicolons serve a purpose: not a functional one, but a teaching one. My take is that the preprocessor keeps resolving macro tokens until it detects a cycle.

x is "y;", which becomes "x;;" (since y is "x;"), and we must stop due to the circular definition
y is "x;", which becomes "y;;" (since x is "y;"), and we must stop due to the circular definition
z is "x;", which becomes "y;;" (since x is "y;"), which becomes "x;;;" (since y is "x;"), and we must stop due to the circular definition

Now, if we remove the semicolons, then we are able to initialize x and y, but not z (since z is x).
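
To make that concrete, here is a minimal sketch of the no-semicolon version (the variable names and initializers are just illustrative): the circular x/y definitions expand back to themselves, while z still ends up as x.

#define x y
#define z x
#define y x

int main(void)
{
    int x = 1;   /* x -> y -> x: the cycle is detected, so this stays "int x = 1;" */
    int y = 2;   /* y -> x -> y: likewise stays "int y = 2;" */
    int z = 3;   /* z -> x -> y -> x: becomes "int x = 3;", a redefinition error */
    return 0;
}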
 
phoffric commented:
The C99 standard requires main() to have an explicit int return type, and students should be advised to use the standard prototype. But there are compilers that do not require this (e.g., cygwin cc), so the asker's real problem may be unrelated to the main() prototype.
 
Kent Olsen (Data Warehouse Architect / DBA) commented:
>> You have bad prototype of the main function. It has to be either int main() or int main(int argc, char* argv[]).

Not 100% accurate.

The standard says:


The function called at program startup is named main. The implementation declares no prototype for this function. It shall be defined with a return type of int and with no parameters:

  int main(void) { /* ... */ }

or with two parameters (referred to here as argc and argv, though any names may be used, as they are local to the function in which they are declared):

  int main(int argc, char *argv[]) { /* ... */ }


Since the default return type is integer, letting main default to a return type of integer is perfectly legal.


Kent
 
phoffric commented:
re: "It shall be defined with a return type of int"
  -- that means to me that, per this standard, you will get an error if you do not explicitly have int main() {...}.

Since there is much legacy C code that just has main() without the explicit int return type, do compilers have a switch for backward compatibility?
 
Let_Me_Be commented:
> Since the default return type is integer, letting main default to a return type of integer is perfectly legal.

No, there is no default, and omitting the return type is illegal in both standards.

> do compilers have a switch for backward compatibility

Yes, most compilers in their "quirk" mode will compile something like this, but there is absolutely no guarantee what they will actually do with this code. They may use void main(), int main(), they may ignore return values...
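
As a concrete illustration (compiler-specific, and exact diagnostics vary by gcc version, so treat this as a sketch; prog.c is an assumed file holding the original main() { } definition):

gcc -std=c89 -pedantic prog.c          # implicit int is part of C89/C90: accepted
gcc -std=c99 -pedantic prog.c          # C99 dropped implicit int: gcc warns
gcc -std=c99 -pedantic-errors prog.c   # ...and here the warning becomes a hard error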
 
ganeshkumar_cse (Author) commented:
Fine, guys!

Sorry for the late reply!

Actually, what happens with something like this:
 #define y x
#define z x
#define x y
int main()
{
int x=1;
{
int y=2;
{
z=x+y;
}
}
printf("%d",z);
}

Are those macros taken into account, or simply not considered?
 
Let_Me_Be commented:
Yes, they will be taken into account, but they might not do what you expect. The preprocessed code is:


int main()
{
int x=1;
{
int y=2;
{
x=x+y;
}
}
printf("%d",x);
}
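
For completeness, a minimal compileable sketch of that snippet (with the missing #include and return added, which the original omits); it prints 3, because z = x + y expands to x = x + y and modifies the outer x:

#include <stdio.h>

#define y x
#define z x
#define x y

int main(void)
{
    int x = 1;           /* circular x/y expansion: stays "int x = 1" */
    {
        int y = 2;       /* likewise stays "int y = 2" */
        {
            z = x + y;   /* expands to: x = x + y;  the outer x becomes 3 */
        }
    }
    printf("%d\n", z);   /* z expands to x: prints 3 */
    return 0;
}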

 
phoffric commented:
Yes, recall that "Now, if we remove the semicolons, then we are able to initialize x and y, but not z (since z is x)."
The point of this exercise, I believe, is to understand how preprocessor circular definitions are handled.
 
ganeshkumar_cse (Author) commented:
Why is z alone replaced by the macros?

Why not x and y?
 
phoffric commented:
When you had the semicolons on the end, it was more apparent that x and y were being replaced by macros; not once, but twice:

An "x" is replaced by "y;". But the preprocessor sees that the "y" should then be replaced by "x;".
This results in "x" becoming first "y;", and then "x;;"
(that is, "x;" followed by the ";").
The preprocessor stops at this point, since if it were to replace the "x" in "x;;" with "y;", we would have "y;;;", which becomes "x;;;;", which becomes "y;;;;;", ad infinitum.
So the preprocessor stops when it detects the circular definition.

The preprocessor recognizes when a token replacement results in a token it has already seen during the current macro substitution. If it were to continue following the circular definition, it would never stop; it would loop forever.

Preprocessor successive substitutions for x:  x --> y --> x
Preprocessor successive substitutions for y:  y --> x --> y
Preprocessor successive substitutions for z:  z --> x --> y --> x

If it were not for the semicolons, one might think that for z we simply had: z --> x
So, thanks to the circular definitions of x and y, we had the illusion that there was no macro replacement for them. Since z does not wrap back onto itself, it was more apparent that a replacement took place.
