Yes, but only if we're trying to copy more bytes than the length of src, i.e. if n > sizeof(src), then n - sizeof(src) zeros will be padded onto the end of dest.
Actually no, this is wrong. The manual says:
The strncpy() function is similar, except that at most n bytes of src are copied. Warning: If there is no null byte among the first n bytes of src, the string placed in dest will not be null-terminated.
If the length of src is less than n, strncpy() writes additional null bytes to dest to ensure that a total of n bytes are written.
sizeof() only tells you how many bytes an object is. The length of a string stored in a very large array can be very different:
Code:
// c99:
#include <stdio.h>
#include <string.h>

int main(void)
{
    char foo[1000] = "&#$";
    printf("length of %s = %zu, sizeof(foo) = %zu\n", foo, strlen(foo), sizeof(foo));
}
Could it be that in Sample 2 the array was initialized with zeros, and in Sample 3 (where I'm actually accessing elements that are out of bounds) it just so happens that there are zeros stored at those specific spots in memory?
It is possible. If you learn anything from this experience, it should be that uninitialized variables are unpredictable and that they can impact the correctness of the program.
If we look at Sample 2 again, we see that for some reason there is @@ in the string, which is not correct. The elements after that happen to be 0. In Sample 3 you happen to see what you expect to see.
When I tested it, I got completely different results from you, and only fun1() was consistently correct.
Code:
#include <stdio.h>
#include <string.h>
#include <stdlib.h>

#define NUMBERS "/tmp/numbers"

void fun1(void);
void fun2(void);
void fun3(void);

int main(void)
{
    fun1();
    fun2();
    fun3();
    return 0;
}

void fun1(void)
{
    char path[108];
    strncpy(path, NUMBERS, sizeof(path) - 1); /* src is shorter than n, so strncpy() null-pads */
    printf("%s\n", path); /* I forgot to include this line initially */
}

void fun2(void)
{
    char path[20];
    strncpy(path, NUMBERS, sizeof(NUMBERS) - 5); /* copies 8 bytes, no terminator: path is not a string */
    printf("length of path after: %lu\n", (unsigned long)strlen(path));
    printf("%s\n", path);
    printf("%d\n", path[strlen(path)]);
    printf("%d\n", path[strlen(path)+1]);
    printf("%d\n", path[strlen(path)+2]);
    printf("%d\n", path[strlen(path)+3]);
}

void fun3(void)
{
    char path[10];
    printf("size of NUMBERS: %lu\n", (unsigned long)sizeof(NUMBERS));
    strncpy(path, NUMBERS, sizeof(NUMBERS)); /* writes 13 bytes into a 10-byte array: overflow */
    printf("length of path after: %lu\n", (unsigned long)strlen(path));
    printf("%s\n", path);
    printf("%d\n", path[strlen(path)]);
    printf("%d\n", path[strlen(path)+1]);
    printf("%d\n", path[strlen(path)+2]);
    printf("%d\n", path[strlen(path)+3]);
}
C:\Users\jk\Desktop>stringsfun
length of path after: 9
/tmp/num$
0
0
0
-124
size of NUMBERS: 13
length of path after: 12
/tmp/numbers
0
0
-128
18
C:\Users\jk\Desktop>stringsfun
/tmp/numbers
length of path after: 8
/tmp/num
0
0
0
0
size of NUMBERS: 13
length of path after: 12
/tmp/numbers
0
0
20
0
I mean, you might assume or even expect memory to be zero, but just look at how different it is for me.
I won't lie, I probably handled compiling differently from you:
gcc -ansi -O3 -s -o stringsfun.exe stringsfun.c
But in a perfect world, with robust code, it isn't supposed to make a difference whether I optimize or not, and here it does.