Long and OT: A selection of languages works best (why is the world full of idiots ?)
- To: misc_(_at_)_openbsd_(_dot_)_org
- Subject: Long and OT: A selection of languages works best (why is the world full of idiots ?)
- From: Chuck Yerkes <chuck+misc_(_at_)_2003_(_dot_)_snew_(_dot_)_com>
- Date: Fri, 28 Mar 2003 15:14:40 -0500
- Mail-followup-to: Chuck Yerkes <chuck+misc_(_at_)_2003_(_dot_)_snew_(_dot_)_com>, misc_(_at_)_openbsd_(_dot_)_org
Quoting Peter Galbavy (peter_(_dot_)_galbavy_(_at_)_knowtion_(_dot_)_net):
> > I'm using several pieces of software written in Java for a security
> > sensitive environment (banking), and I've never seen a buffer overflow
> > in any of that code. No format string error. No double free either.
> > That's just a question of using the right tool for the right application,
> > and as much as I like C, it is certainly not the right solution
> > everywhere..
> OK, maybe my point wasn't clear either. I suppose an implied opinion of mine
> is 'what does the language matter?'. If you have the right tools, great,
> but the article was written (with the exception of one, maybe two,
> paragraphs) along the lines of 'C/C++ is bad'. You can wrap C code (see
> systrace) in a runtime environment that protects you against some problems,
> and I think that that is great, but I am still trying to figure out what the
> sales pitch of the article is...
The Register is pretty good, actually.
Languages have different reasons for being.
C was great because it was low level, or close enough.
A friend worked once with a bunch of assembly lang folks. They
needed assembly because it was fast and small. They were completely
committed to Z80(?) stuff because they had libraries of routines
written in it. As other, perhaps better stuff came along, they
used Z80s - that's what the libraries did. He introduced C.
"Ha ha ha! A high level language? WAY too slow. I can craft this
routine down to 85 clock cycles. We'll never use it."
A new, experienced programmer there took a program they had, and
its libs, and rewrote it in C. On his terminal on the big computer
(VMS). He wrote stub routines to make it work well enough (calls
like "write to LEDs" and "turn on motor" were caught by his helper
lib and spewed text to the terminal next to him).
Spent about a week (partly understanding the routines there) and
got it working.
Pulled it to the Z80 dev system and compiled it. Burned it to
PROM. It ran 80-90% the speed of assembly. Fast enough.
Pulled it to a dev system for a 16bit processor and compiled it.
It ran. With about a day of work. New system.
They wrote a new app in C as an experiment (with the old hands
cracking wise and looking nervous). It took half the time
and ran a little slower than assembly.
But routines were readable and portable. And good enough.
I've used Forth for similar reasons (and danced when the SPARCstations
came with Forth in boot PROM). Interactive low level debugging (write
this to THAT memory location, write a 5 there and, hey!, the motor is on).
Now, let's move forward: C was written to fit onto a machine with
very little RAM and run fast. It's got abhorrent string handling.
Almost an afterthought. It's been our nightmare.
The "Object Oriented" phrase got way over marketed, but the notion of
clean interfaces between parts is neither new nor bad. Calling
(3rd party?) routines that Do Things without affecting our space
is desired. We do this at the machine level all the time. If
"sort" affected my environment, or changed my RAM in ways
I didn't expect, it would be gone. If I sent data to a SQL
server over there and it affected my machine, it would be gone.
Yet we write programs that have to link in libraries that, with
overflows and what not, affect our programs.
Smalltalk had some very interesting ideas.
Java showed/shows a lot of promise, but it's forked.
A developer I know wrote a CAD program for SGIs in 1985 for a small
company. (Old code, private.) He's talked about moving it to Java.
"Isn't Java slow?" I asked. He replied, "It's in C and a little
C++ and ran ok on a 12MHz machine. If Java is 80% the speed (or
10% even), then it will be fine on a 1GHz machine. And portable."
He learned Java and makes a living at it. It's a little slower
than C. On the other hand, when you write an app that will be used
for 9-12 months and C needs 2+ months to develop it and Java needs 2
weeks, the company gets it quicker and gets 6 more weeks of use
from it. Is it still slower?
Those of us who have studied CS have written simple compilers. Simple
languages. I wrote a lightboard language (read sensors, control lights,
allow macros to be programmed for "scenes"). You could run it from
a console full of sliders or a keyboard.
We were all amused to learn "YACC" and what it stands for (Yet
Another Compiler Compiler). It's indicative of how many languages
there are, and how quickly people need to make a syntax and build
a program.
> > So I guess the author is saying that as long as so many people (which
> > are not expert in auditing) use C for everything, there will be security
> > problems... and I tend to agree to that.
> Anyone who uses a washing machine without using it 'right' will also have
> problems. Selling me a different brand of washing machine will not instantly
> fix my stupidity - even one with fewer buttons to press.
And if I sell you a washing machine that can catch fire if you press
buttons on the front (oh, you set it to spew water on the wires), then
*I'm* responsible and it gets recalled, and perhaps I am held liable.
I can remove a bolt by hammering off the top of it. Or I can use
a wrench. Why would I need a router (not the IP kind) when I have my chisels?
If you can't handle chisels right it's YOUR fault.
The value and quality of people's work is a combination of the tools
they have and their skills.
The value and quality of people's work is a combination of the tools
they know and their skills.
Systrace is a patch to fix a bad idea (catching bugs as they run
is FAR worse than not letting the bugs happen in the first place).
Systrace is a cool way to jail a language that easily runs rampant.
I think it's important when we run code from strangers and untrusted sources.
Since C, and often C++, don't have the tools people need to do this
work well (buffer management for network and character streams),
we have bugs. Weekly. Part of it is bad programmer training, but
I'd shift a bunch of that blame to people using languages that are the
wrong tool for the job.
Kernels want to be fast. C and assembly (in small bits) are the
right tool. For now. Closely controlled and audited, they can be
kept safe. But for end user apps, languages that have primitives
for the windowing system, threading, string and complex data
structure handling, and network handling are better and used more.
I got out of C programming in a major way because I was tired of:
OK, get the string ... wait, we need to allocate memory for that
and check that it's only passing us STRINGBUFLEN - 1 characters.
(how many "off by one" buffer problems come through every year
in a rigidly watched OS like OpenBSD?)
There were several PC languages in the '80s that let me do things
like "read in $ThisType" and pass a "prototype". dBase II and III
(and I suppose later) had picture clauses to ensure data types:
you couldn't pass in non-numbers, or more than 8 digits (it
displayed the /'s for you).
Perhaps C could be cleaned up by REMOVING those long-dangerous
routines that don't count input. But they are faster, and when your
strcat is working with already-checked strings, that's what you want.
C handles everything as a stream. Sometimes a stream is made of piss.
Perhaps this article might point stubborn fools to look at other
languages, to push vendors to support them.
There are about 50 thousand languages out there. The world is not C.
VBasic took off because, again, as with dBase, a non-professional
programmer can write code.
A sampling of things I've played near recently (I'm a system admin,
not a professional coder):
There's an interesting paper on Erlang. Erlang was developed,
like many langs, by a phone company. Criteria there are "30
or 30,000 machines working together that must be up for 10
years and let me reboot parts of it with complete redundancy."
(Needed to reboot your phone lately? I can count the number of
times on one hand where I picked up my phone and had it fail, barring
big trees knocking wires. AT&T had a monopoly based on protecting
that reliability for 9 decades.)
The paper talks about prototyping apps really quickly that run on
multiple machines (not just SMP; distributed systems).
A MUSH OF TOOLS TO GET THE JOB DONE QUICKLY
My cousin, a very smart PhD in archaeology, uses MySQL, Perl and
Visual Basic to do detailed mapping of archaeology sites. The VB
stuff means that windows laptops can be used to gather data and he
can write happy little Windowing programs (UI) in an evening. FAST
prototypes where the code often never lasts longer than a prototype.
Field work. He can swap SIMMs, but he can't talk about the value
of ECC RAM in servers; he doesn't know about garbage collection
or memory allocation. He reboots the Perl server at night because
after 4-5 days the machine runs out of RAM (many of the db perl routines
don't actually free() right). But it doesn't matter. It just means
that someone gets asked: "did you restart the server this morning?"
I write a lot in Perl because it's flexible and quick, I can write
something fairly complex in a short time, there are a bunch of
people writing "routines" for it. I can toss up something to parse
syslog, or be an ftp or IMAP client in a matter of minutes. It's
slower than C, but when I need something for a week, it's not
really. 3 hrs and a week of running is faster than 3-4 (to 15)
days of development, trying to get regex's to work in C and spending
80% of the time dealing with trivia. (read c-client some time.)
I've also been working on some really blecherous perl code written
by someone with weird coding style habits. Still low level crap.
I don't write things that run thousands of times/minute in perl.
Mail filters (ala procmail) would destroy my machine with startup costs.
Perl is very easy to code in "write only" mode. The festival
of syntaxes allows spaghetti programs to go wild. The OO stuff of Perl5
helped "contain" a lot of addons, but bad code is still common.
RUBY has taken off (another scripting language)
Fairly readable if you know C or Perl or other langs.
Basic threading (handle async connections from 50 clients and do stuff).
Kinda low level (more than Perl), higher than C.
I tend towards scripting languages for dev speed. Java stands out as
a good language to write daemons in - you don't easily treat a pointer
as a number, you don't easily shove 40k into an 8 char string, and you
can thread and handle multiple requests quite nicely.
Java and apache are creating interesting offspring.