The problem is programs that use limits for the size of data but do not
force-limit the data to that size. Many such programs are written in
C and C++, but that reflects bad programmers rather than bad languages.
Most problems can be avoided by using ‘n’ functions in place of non-‘n’
functions:
Sure, but using the ‘n’ functions means a loss in performance.
But very little compared to the cost of using a scripting language.
Incidentally, snprintf is actually rather painful to use safely. I
originally wrote some code like this:
char buf[256];
int len = 0;
len += snprintf(buf+len, sizeof(buf)-len, ...);
len += snprintf(buf+len, sizeof(buf)-len, ...); /* etc */
You’d think that would work? It doesn’t. In the event of the output being
truncated, snprintf actually returns the number of characters it would
have written to the string, if it had been given unlimited space. Hence
‘len’ still goes off the end of the buffer, and subsequent snprintf calls
will write ‘\0’ to bits of memory which they shouldn’t.
Sure, there are lots of examples of secure code in C (qmail, postfix).
Generally, such code ends up building a lot of its own scaffolding for
memory management. I’m not familiar with qmail or postfix, but I have looked
at the memory management code in exim. There’s quite a lot of it.
I think the point is, if you want to write safe code in C (which takes input
from an untrusted source, such as over the Internet), then you need to
realise that you must write this scaffolding as well. The standard C library
doesn’t help you much. However, in a decent scripting language, it’s all
taken care of.
But anyway, it is always worth considering the right language for the job.
Absolutely!
Regards,
Brian.
···
On Mon, Mar 31, 2003 at 06:10:46AM +0900, gabriele renzi wrote:
On Mon, 31 Mar 2003 03:06:52 +0900, “Josef ‘Jupp’ Schugt” <jupp@gmx.de> wrote: