Difference between the %d and %i format specifiers in the C language


Format Specifiers

In the C programming language, %d and %i are both format specifiers for signed integers: %d treats the value as a decimal integer, while %i treats it as an integer in any supported base. In printf() there is no difference in output between %d and %i, but in scanf() the difference matters: %i detects the base from the input (a leading 0 means octal, a leading 0x means hexadecimal), whereas %d always assumes base 10.

Example (C)


#include <stdio.h>

int main() {
   int num1, num2;
   int num3, num4;
   scanf("%i%d", &num1, &num2);
   printf("%i\t%d\n", num1, num2);
   num3 = 010;
   num4 = 010;
   printf("%i\t%d\n", num3, num4);
   return 0;
}

Output

32767	-498932064
8	8

Here 010 is an octal literal. When reading input such as 010, scanf() interprets it as decimal 10 with %d but as octal 8 with %i. In printf(), both specifiers behave identically: the compiler has already converted the octal constant 010 to the value 8, so num3 and num4 both print as 8.
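The same base detection applies to hexadecimal input. The following sketch (an additional example, not part of the original program) assumes the user types the token 0x1A twice; %i recognizes the 0x prefix, while %d stops at the first non-decimal character.

#include <stdio.h>

/* Assumed input: 0x1A 0x1A
   %i detects the 0x prefix and stores 26 (hexadecimal 0x1A).
   %d reads only the leading 0 as decimal and stops at 'x', storing 0. */
int main(void) {
   int a = 0, b = 0;
   int read = scanf("%i %d", &a, &b);
   printf("items read: %d, a = %d, b = %d\n", read, a, b);
   return 0;
}

With that input, the program prints: items read: 2, a = 26, b = 0.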
