
Secure ObjC programming


Lorenzo Thurman

Feb 23, 2007, 4:02:17 PM
I strive to write secure code. I was never taught that things like printf
could be bad; it's something I've learned on my own. I'm no security
expert, but I do what I can. I've recently run into this:
http://www.dwheeler.com/secure-programs/Secure-Programs-HOWTO/index.html

But this is just for C/C++. Is there similar info for ObjC? Doesn't have
to be online or even free, just thorough.

Michael Ash

Feb 23, 2007, 10:02:00 PM

Not that I know of. However, ObjC is such a thin layer that all of the
tips for C will be valid for ObjC (and the ones for C++ for ObjC++), and
you don't really need much, if anything, more.

Some obvious stuff jumps to mind, like not using user input to construct
selectors, but for the most part the mistakes and best practices are the
same. One nice thing is that if you can use Foundation, NSString will
allow you to avoid a lot of the string handling nastiness in C that has
historically created a large number of security holes.
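
For the curious, here is a quick sketch of the selector pitfall (the
function and variable names are made up for illustration):

    #import <Foundation/Foundation.h>

    // 'command' stands in for a string that came from the user or the network.
    void dispatchCommand(id target, NSString *command)
    {
        // Dangerous: the user can name any method the target responds to.
        // [target performSelector:NSSelectorFromString(command)];

        // Safer: only accept commands from an explicit whitelist.
        NSSet *allowed = [NSSet setWithObjects:@"refresh", @"save", nil];
        if ([allowed containsObject:command])
            [target performSelector:NSSelectorFromString(command)];
    }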

--
Michael Ash
Rogue Amoeba Software

Lorenzo

Feb 25, 2007, 1:40:27 AM
In article <11722861...@nfs-db1.segnet.com>,
Michael Ash <mi...@mikeash.com> wrote:

Thanks for the reply. I kinda figured the ObjC layer would help insulate
from C's deficiencies, but it's good to know just how much.

--
"My Break-Dancing days are over, but there's always the Funky Chicken"
--The Full Monty

Michael Ash

Feb 25, 2007, 2:34:13 PM
Lorenzo <lor...@excitement.com> wrote:
> Thanks for the reply. I kinda figured the ObjC layer would help insulate
> from C's deficiencies, but it's good to know just how much.

ObjC-the-language doesn't do anything itself, because it provides so very
little. The main advantage is in providing a nice environment for writing
reusable libraries. I used NSString as an example before; that's just
using ObjC to create a string handling library that insulates you from the
evil that is a NUL-terminated char *. Of course you could easily write a
string library in pure C, but for some reason few people do. ObjC provides
an environment conducive to writing and reusing libraries, so you can
choose from many hopefully-secure implementations that already exist
instead of being forced to write your own, or use libc's horrible horrible
stuff.
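
A trivial sketch of the difference ('name' below stands in for untrusted
input):

    #import <Foundation/Foundation.h>
    #include <stdio.h>

    void greet(const char *name)
    {
        // Classic C: the caller has to get the buffer size right every time.
        char buf[32];
        // sprintf(buf, "Hello, %s", name);              // overflows on long input
        snprintf(buf, sizeof(buf), "Hello, %s", name);   // bounded, but truncates silently

        // Foundation: there is no fixed-size buffer to overflow in the first place.
        NSString *greeting = [NSString stringWithFormat:@"Hello, %s", name];
        NSLog(@"%@", greeting);
    }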

Lefty Bigfoot

Mar 2, 2007, 2:12:36 AM
On Sun, 25 Feb 2007 00:40:27 -0600, Lorenzo wrote
(in article <lorenzo-2DBD39.004...@hedley.internal.thethurmans.com>):

With the exception of gets(), the usual "C deficiencies" really
amount to deficiencies in those that call themselves C
programmers, but in point of fact are not qualified to use that
descriptor on themselves.
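
To be fair to that exception: gets() can't even be told how big its
destination is, so there is no correct way to call it, while fgets()
at least takes the size. A quick sketch:

    #include <stdio.h>

    void read_line(void)
    {
        char line[64];
        // gets(line);                      // no length limit; long input overflows
        fgets(line, sizeof(line), stdin);   // reads at most 63 bytes plus the NUL
    }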


--
Lefty
All of God's creatures have a place..........
.........right next to the potatoes and gravy.
See also: http://www.gizmodo.com/gadgets/images/iProduct.gif

Michael Ash

Mar 2, 2007, 11:56:14 AM
Lefty Bigfoot <nu...@busyness.info> wrote:
> With the exception of gets(), the usual "C deficiencies" really
> amount to deficiencies in those that call themselves C
> programmers, but in point of fact are not qualified to use that
> descriptor on themselves.

It is a point of fact that people are not perfect. Any construct which
does not take this into account is inherently flawed. Yet, when imperfect
people interact with something that requires them to be perfect, and then
subsequently fail, most will blame the people involved rather than the
thing they were interacting with.

It is basically impossible to write a large program using certain C
constructs, in particular but not limited to standard string handling, and
remain secure. The proof of this statement is in the many open source
projects which are extremely security minded and have a large number of
extremely smart people working on them which still have security
vulnerabilities of a type which other languages can't even experience.

There's an entire class of security vulnerabilities, buffer overflows,
that most other languages can't even experience. In theory they can be
avoided by careful programming, but in practice it seems that they can
only be avoided by either severely limiting your program's functionality
or by pushing off all direct buffer manipulations into small and
rigorously tested libraries (which is effectively what these other
languages do). Objective-C can help by providing or making it easy to
write these small and rigorously tested libraries.
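
As a rough sketch of what that looks like in practice (the names below
are just stand-ins for data arriving from outside):

    #import <Foundation/Foundation.h>

    void collect(NSMutableData *received, const void *chunk, size_t chunkLen)
    {
        // The raw C version needs a fixed buffer and manual bookkeeping:
        //     char buf[128];
        //     memcpy(buf + used, chunk, chunkLen);   // nothing checks chunkLen
        //
        // NSMutableData tracks its own length and grows as needed.
        [received appendBytes:chunk length:chunkLen];
    }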

Lefty Bigfoot

Mar 2, 2007, 9:47:32 PM
On Fri, 2 Mar 2007 10:56:14 -0600, Michael Ash wrote
(in article <11728545...@nfs-db1.segnet.com>):

> Lefty Bigfoot <nu...@busyness.info> wrote:
>> With the exception of gets(), the usual "C deficiencies" really
>> amount to deficiencies in those that call themselves C
>> programmers, but in point of fact are not qualified to use that
>> descriptor on themselves.
>
> It is a point of fact that people are not perfect. Any construct which
> does not take this into account is inherently flawed.

No, it isn't. Not everything in the world needs to be designed
for the lowest common denominator. C has historically been (and
remains) primarily a systems programming language. It's not for
noobs. Just as Visual Basic is worthless for writing operating
systems, C is pretty worthless for writing video games, although
it is quite often used for the lower-level non-display
components of same.

> Yet, when imperfect
> people interact with something that requires them to be perfect, and then
> subsequently fail, most will blame the people involved rather than the
> thing they were interacting with.

C does not require perfection. All it really requires is the
ability to read K&R2 and man pages. Plenty of non-perfect
people have managed to do that and be highly successful in their
efforts at C programming.

> It is basically impossible to write a large program using certain C
> constructs, in particular but not limited to standard string handling, and
> remain secure.

No, it isn't. It's not easy, I'll grant you. There are
tradeoffs within any design, and C is in many ways far safer
than assembly programming, for which it is, in reality, basically
a higher-level syntax. But it won't hold your hand, and it
doesn't come with safety goggles.

> The proof of this statement is in the many open source
> projects which are extremely security minded and have a large number of
> extremely smart people working on them which still have security
> vulnerabilities of a type which other languages can't even experience.

If you think any language precludes security vulnerabilities,
you're confused. That's not what you said, but some might read
it as such. Many open source projects suffer from the too many
cooks problem: you can have a lot of gourmet chefs working on
the same meal, and still wind up with bad food if they work at
cross purposes.

> There's an entire class of security vulnerabilities, buffer overflows,
> that most other languages can't even experience.

Proof? "Most" is a strong term. I look forward to the
evidence.

> In theory they can be avoided by careful programming,

Yes of course they can.

> but in practice it seems that they can
> only be avoided by either severely limiting your program's functionality
> or by pushing off all direct buffer manipulations into small and
> rigorously tested libraries (which is effectively what these other
> languages do).

Library code serves a useful purpose, particularly for the
programmer that hasn't been in practice long enough to develop a
suitable and well-tested library of their own, or whose job
changes preclude proper code-reuse. Commercial libraries, for
any language, serve a useful purpose, or can; some of them are
crap, just as with anything else.

> Objective-C can help by providing or making it easy to
> write these small and rigorously tested libraries.

So can C, with the addition of a small, rigorously tested
library, by the development of your own, run through a similar
grinder.
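
Such a library doesn't have to be large, either. A sketch of the kind of
helper it might contain (the name is invented; the behavior mirrors BSD's
strlcpy):

    #include <string.h>

    /* Copy at most dstsize-1 bytes and always NUL-terminate. The return
       value is strlen(src), so the caller can detect truncation. */
    size_t safe_copy(char *dst, size_t dstsize, const char *src)
    {
        size_t len = strlen(src);
        if (dstsize == 0)
            return len;
        size_t n = (len < dstsize) ? len : dstsize - 1;
        memcpy(dst, src, n);
        dst[n] = '\0';
        return len;
    }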

Michael Ash

Mar 3, 2007, 1:20:24 AM
Lefty Bigfoot <nu...@busyness.info> wrote:
> On Fri, 2 Mar 2007 10:56:14 -0600, Michael Ash wrote
> (in article <11728545...@nfs-db1.segnet.com>):
>
>> Lefty Bigfoot <nu...@busyness.info> wrote:
>>> With the exception of gets(), the usual "C deficiencies" really
>>> amount to deficiencies in those that call themselves C
>>> programmers, but in point of fact are not qualified to use that
>>> descriptor on themselves.
>>
>> It is a point of fact that people are not perfect. Any construct which
>> does not take this into account is inherently flawed.
>
> No, it isn't. Not everything in the world needs to be designed
> for the lowest common denominator.

Are you perfect? Do you know anyone who is?

I said nothing about designing for the lowest common denominator. I only
state that people are not perfect, which is obvious.

If your system has a user, then people are part of your system. If your
system assumes that one of its components is perfect when it is not, then
your system is flawed.

> C has historically been (and
> remains) primarily a systems programming language. It's not for
> noobs. Just as Visual Basic is worthless for writing operating
> systems, C is pretty worthless for writing video games, although
> it is quite often used for the lower-level non-display
> components of same.

Imperfection has nothing to do with "noobs".

>> Yet, when imperfect
>> people interact with something that requires them to be perfect, and then
>> subsequently fail, most will blame the people involved rather than the
>> thing they were interacting with.
>
> C does not require perfection. All it really requires is the
> ability to read K&R2 and man pages.

If you want to write working code, this is true. Good code, even. But if
you want to write secure code which can be safely handed arbitrary data
from malicious people, it's clearly not enough. Or do you claim that the
people who work on Apache, OpenSSL, etc. are unable to read K&R2 and man
pages?

> Plenty of non-perfect
> people have managed to do that and be highly successful in their
> efforts at C programming.

I never said you couldn't be successful. I said that C has inherent flaws
which make it extremely difficult to avoid certain classes of security
vulnerabilities.

>> It is basically impossible to write a large program using certain C
>> constructs, in particular but not limited to standard string handling, and
>> remain secure.
>
> No, it isn't. It's not easy, I'll grant you. There are
> tradeoffs within any design, and C is in many ways far safer
> than assembly programming, for which it is, in reality, basically
> a higher-level syntax. But it won't hold your hand, and it
> doesn't come with safety goggles.

If it's not impossible then I would like proof. Show me a large, secure
program written in C. By "secure", I mean one that hasn't had major
vulnerabilities since it was released. If it's possible then somebody
must have done it.

>> The proof of this statement is in the many open source
>> projects which are extremely security minded and have a large number of
>> extremely smart people working on them which still have security
>> vulnerabilities of a type which other languages can't even experience.
>
> If you think any language precludes security vulnerabilities,
> you're confused. That's not what you said, but some might read
> it as such.

Perhaps. So let me make it clear: any application written in any language
can have security problems. But there are major classes of security
vulnerabilities in C programs which programs written in other languages
cannot experience.

> Many open source projects suffer from the too many
> cooks problem: you can have a lot of gourmet chefs working on
> the same meal, and still wind up with bad food if they work at
> cross purposes.

Closed source projects have the same problem. You need multiple people to
create sizeable systems. So once again we have a case of inherent
imperfection in one of the components, which must be taken into account.

>> There's an entire class of security vulnerabilities, buffer overflows,
>> that most other languages can't even experience.
>
> Proof? "Most" is a strong term. I look forward to the
> evidence.

Would you be happy with "many"?

In any case, this isn't proof, but here is a list of serious languages
that I know of which either cannot experience buffer overflows, or which
force the programmer to go to great lengths to bypass the range checking
which prevents them from happening:

- Java
- Common Lisp
- Scheme
- Python
- Perl
- PHP (not that I'm claiming PHP as a remotely secure language...)
- Smalltalk
- JavaScript
- Haskell
- ML
- Prolog
- AppleScript
- Ruby

And here are the ones I know of which can experience buffer overflows:

- C
- C++
- Objective-C
- D

I'm also guessing that FORTRAN should go on the second list but I don't
know it well enough to say. I deliberately left off assembly because
assembly simply isn't used to directly process potentially hostile data
anymore.
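
To make the range-checking point concrete (Foundation's collections behave
like those languages' built-ins in this respect; the code is only a sketch):

    #import <Foundation/Foundation.h>

    void outOfBounds(void)
    {
        // C: this compiles and silently stomps on whatever sits past the array.
        int values[4];
        values[10] = 42;    // undefined behavior, no diagnostic required

        // The same mistake against an NSArray raises NSRangeException instead.
        NSArray *array = [NSArray arrayWithObjects:@"a", @"b", nil];
        [array objectAtIndex:10];
    }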

>> In theory they can be avoided by careful programming,
>
> Yes of course they can.
>
>> but in practice it seems that they can
>> only be avoided by either severely limiting your program's functionality
>> or by pushing off all direct buffer manipulations into small and
>> rigorously tested libraries (which is effectively what these other
>> languages do).
>
> Library code serves a useful purpose, particularly for the
> programmer that hasn't been in practice long enough to develop a
> suitable and well-tested library of their own, or whose job
> changes preclude proper code-reuse. Commercial libraries, for
> any language, serve a useful purpose, or can; some of them are
> crap, just as with anything else.

I honestly have no idea what you're trying to say here. You seem to be
saying "libraries can be good, or they can be bad", which is pretty much
meaningless.

>> Objective-C can help by providing or making it easy to
>> write these small and rigorously tested libraries.
>
> So can C, with the addition of a small, rigorously tested
> library, by the development of your own, run through a similar
> grinder.

Certainly. And yet, how many C applications avoid all direct use of char *
in favor of a robust string manipulation library? How many C applications
avoid all direct buffer manipulation in favor of a robust range-checked
array library?

The language allows it, but it does not appear to be conducive to it. Most
Objective-C applications end up using a string library and an array
library.

I'll be the first to admit that you can do the same things in C. I am in
fact quite a fan of doing OO-style programming in C by the use of opaque
structs and method-like functions to operate on them. But simply from
looking at how the language is commonly used, it's clear that the
language, for whatever reason, does not encourage this sort of thing.
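
For anyone curious, that opaque-struct style looks roughly like this
(names invented, allocation-failure handling omitted for brevity):

    #include <stdlib.h>
    #include <string.h>

    /* In real code the struct definition would live in the .c file, so
       callers could only go through the functions below. */
    typedef struct SafeString {
        char   *bytes;
        size_t  length;
        size_t  capacity;
    } SafeString;

    SafeString *SafeStringCreate(void)
    {
        SafeString *s = calloc(1, sizeof *s);
        s->capacity = 16;
        s->bytes = calloc(1, s->capacity);
        return s;
    }

    /* All of the bounds arithmetic lives in this one function. */
    void SafeStringAppend(SafeString *s, const char *cstr)
    {
        size_t extra = strlen(cstr);
        if (s->length + extra + 1 > s->capacity) {
            while (s->length + extra + 1 > s->capacity)
                s->capacity *= 2;
            s->bytes = realloc(s->bytes, s->capacity);
        }
        memcpy(s->bytes + s->length, cstr, extra + 1);
        s->length += extra;
    }

    const char *SafeStringCString(const SafeString *s) { return s->bytes; }

    void SafeStringDestroy(SafeString *s) { free(s->bytes); free(s); }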

Just look at Apache and OpenSSL. These are both extremely popular products
with highly skilled programmers whose security is important to huge
portions of the internet. Despite all of this, they experience security
vulnerabilities. And these are not always complex, unforeseeable interactions
between modules due to multiple people working on the projects; many of these
vulnerabilities are simple overflows that exist within a single function.

I mean no disrespect whatsoever to the Apache or OpenSSL projects. Quite
the contrary: I respect them greatly, and when they still have these
problems that are impossible to even have in various higher level
languages, it tells me that there is an inherent flaw in the language.

Inherent flaws are nothing new in the language world. But when your
inherent flaw is security related, it makes me wonder why people still
insist on using it to write critical software which deals with potentially
malicious data.
