In C programming we can use both %d and %i for an integer variable.
That is, both of these are the same:
printf(" %d", x);
printf(" %i" , x);
What is the difference between %d and %i in C programming?
With printf there is no difference: %d and %i both print an int in decimal (base 10), so printf("%d", x) and printf("%i", x) produce identical output. Neither specifier is for floating-point values; for a float or double you would use %f. The two only differ in the scanf family: %d reads a base-10 integer, while %i also accepts a 0x prefix for hexadecimal and a leading 0 for octal, so reading the text "010" with %i gives 8, not 10.
Reply: Right, for output they are interchangeable; the distinction only matters when reading input with scanf, where %i auto-detects the base from the prefix.
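A small self-contained sketch of both points (it uses sscanf on string literals so no user input is needed):

#include <stdio.h>

int main(void)
{
    int x = 42;

    /* printf: %d and %i behave identically, both print 42 */
    printf("%d\n", x);
    printf("%i\n", x);

    /* scanf family: %d reads base 10 only, %i auto-detects the base,
       so the same text "010" is read as 10 by %d but as octal 8 by %i */
    int a = 0, b = 0;
    if (sscanf("010", "%d", &a) == 1 && sscanf("010", "%i", &b) == 1)
        printf("%%d read %d, %%i read %d\n", a, b);  /* prints: %d read 10, %i read 8 */

    return 0;
}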