The SMTP proxy

Timothy Mieszkowski

Aug 15, 2006, 12:24:21 AM
to okopi...@googlegroups.com
Ok, I see what you're saying about the SMTP proxy, but I don't think it's a good idea.
I think it not only adds too much complexity, but that it adds it too close to the user. I'm hearing things about reconfiguring the mail client to send mail to localhost, and I'm seeing images of Joe Blow scratching his head. Even if that's not true, it's still an added complexity for us.
  Writing a separate script for each mail client might seem more complex, but it really takes the burden off the user and puts it on the developers. And it really wouldn't be that hard anyway; you are just fetching some messages. (As for the part about hashing the messages and not sending them in plain text and all that, that should be handled by the okopipi client anyway; all the scripts would do is deposit the messages for the client to 'pick up'.)

  And Kevin, as far as not using the actual GNUnet, after thinking about it I agree with you. You were dead on about it being nice but probably more trouble than it's worth.
  If for no other reason than to spare those guys the shitstorm that's sure to occur when this gets off the ground.

-Tim

Kevin Winter

Aug 15, 2006, 12:35:02 AM
to okopi...@googlegroups.com
On 8/15/06, Timothy Mieszkowski <miesz...@gmail.com> wrote:
> Ok I see what you're saying about the smtp proxy but I don't think its a
> good idea.
> I think it not only adds too much complexity, but I think that it adds it
> too close to the
> user. I'm hearing things about reconfiguring the mail client to send mail
> to localhost and I'm seeing images of joe blow scratching his head. Even if
> thats not true, its still an added complexity for us.
> Writing a seperate script for each mail client might seem more complex,
> but it is really taking the burden off the user and putting it on the
> developers. And it really wouldn't be that hard anyway, you are just
> fetching some messages. (and as for the part about hashing them and not
> sending the messages in text and all that, that should be handled by the
> okopipi client anyway, all the scripts would do is deposit the messages for
> the client to 'pick up'.)

Well, assuming that the okopipi client is a standalone application
(i.e. not solely a plugin), how would you get the email from a mail
client to the okopipi client?

>
> And Kevin as far as not using the actual GNUnet, after thinking about it I
> agree with you. You were dead on about it being nice but probably more
> trouble than its worth.
> If for no other reason than to not bring on these guys the shitstorm
> that's sure to occur when this gets off the ground.

Yeah. Hopefully patching will work. If not, we'll just grab their
source tarball and start from there.

~Kevin
--
Open Source, Open Mind

Timothy Mieszkowski

Aug 15, 2006, 2:00:14 AM
to okopi...@googlegroups.com
On 8/15/06, Kevin Winter <therebe...@gmail.com> wrote:
>
> Well, assuming that the okopipi client is a standalone application
> (i.e. not solely a plugin), how would you get the email from a mail
> client to the okopipi client?

I'm sorry, I know I'm not too coherent sometimes.
Think of it like this: the okopipi client would have a configuration directory, located wherever. Inside that directory would be:

a) /scripts -- the mail retrieval script directory. This could contain all the scripts that exist, and the only configuration the user would have to do is change the execute permissions on the ones he needs. These scripts would download the emails from the folder labeled 'spam' to...

b) /spam -- the directory for the actual spam. The okopipi client would be responsible for retrieving the spam from here.

The okopipi client would be responsible for running any scripts with execute permission at some interval, and the user would have to put their spam in a folder called 'spam' -- which, obviously, is already done for you with Gmail. Using the libgmail Python library this would probably be no more than two dozen lines (rough sketch of the client's side of it below).

And while I'm at it, it would also include:

c) /web_scripts -- the directory for the spam response scripts. Which I guess would only be used by supernodes?
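To make it concrete, the client's side of this would look roughly like the following. Just a Python sketch to show the shape of it; the paths, the interval, and process_message() are all made up, not a final design:

#!/usr/bin/env python
# Rough sketch of the "pick up" loop described above.  All directory names,
# the interval, and process_message() are placeholders.
import os
import subprocess
import time

CONF_DIR = os.path.expanduser("~/.okopipi")
SCRIPT_DIR = os.path.join(CONF_DIR, "scripts")
SPAM_DIR = os.path.join(CONF_DIR, "spam")
INTERVAL = 15 * 60          # run the retrieval scripts every 15 minutes

def run_retrieval_scripts():
    # Run every script in /scripts that the user has marked executable.
    for name in sorted(os.listdir(SCRIPT_DIR)):
        path = os.path.join(SCRIPT_DIR, name)
        if os.path.isfile(path) and os.access(path, os.X_OK):
            subprocess.call([path, SPAM_DIR])   # script deposits mail into /spam

def process_message(raw):
    pass                                        # hashing/reporting lives in the real client

def pick_up_spam():
    # Hand each deposited message to the client proper, then remove it.
    for name in os.listdir(SPAM_DIR):
        path = os.path.join(SPAM_DIR, name)
        with open(path, "rb") as f:
            process_message(f.read())
        os.remove(path)

if __name__ == "__main__":
    for d in (SCRIPT_DIR, SPAM_DIR):
        if not os.path.isdir(d):
            os.makedirs(d)
    while True:
        run_retrieval_scripts()
        pick_up_spam()
        time.sleep(INTERVAL)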

I hope this clears up what I was trying to say.

--Tim

Kevin Winter

Aug 15, 2006, 2:05:40 AM
to okopi...@googlegroups.com
On 8/15/06, Timothy Mieszkowski <miesz...@gmail.com> wrote:
> I'm sorry, I know i'm not too coherent sometimes.
> Think of it like this: the okopipi client would have a configuration
> directory, located wherever.
> Inside that directory would be:
> a) /scripts -- the mail retrieval script directory
> This could contain all scripts that exist and the only
> configuration the user would have
> to do would be to change the execute permissions on the ones he
> needs. These
> scripts would download the emails from the folder labeled 'spam' to...
> b) /spam -- the directory for the actual spam
> The okopipi client would be responsible for retrieving the
> spams from here
> the okopipi client would be responsible for running any scripts with
> execute permission at an
> interval, and the user would have to put their spam in a folder called
> 'spam'. Which, obviously is already done for you with gmail.
> Using the libgmail python library this would probably be no more
> than 2 dozen lines.

Oh. So basically, the user is responsible for putting their spam in a
folder marked "Spam" (or some user-configurable name), and then the
client automatically runs a script every interval to retrieve said spam?

Unfortunately, that will only work for POP accounts and libgmail, as
you said. It won't work with IMAP, Exchange, or most other webmail.
The reason I suggested the SMTP proxy with forwarding is that it should
work for any non-webmail client with minimal configuration.

Still not sure how to handle those pesky non-Gmail clients, though...

By the way, is Gmail only accessible with the help of Python? I'd
like to make this as dependency-light as possible.

Kevin Winter

Aug 15, 2006, 2:10:47 AM
to okopi...@googlegroups.com
On 8/15/06, Kevin Winter <therebe...@gmail.com> wrote:

Sorry, I responded too quickly (and hence too negatively). I like
what you're getting at: using helper scripts to accomplish tasks, and
organizing the directories in a sane manner. I'd suggest we place the
okopipi configuration directory in the user's home directory (so it's
more easily portable), i.e. ".okopipi" or "documents and
settings\myusername\application data\okopipi".

Hrmmm. On second thought, maybe we can retrieve mail for non-POP
accounts. Perhaps we can implement a simple mail client that connects
to the user's email account, downloads a copy of every message in
the SPAM folder, and then deletes the contents. That way all the user
need do is place all spam in a folder called SPAM on their mail server, run
okopipi, and hit "retrieve mail" or some such. That just might work
(albeit increasing the complexity).
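For IMAP accounts, at least, that mail-client piece is pretty small. Something like this would do it (a rough Python sketch; the host, credentials and folder name are obviously placeholders):

# Sketch: download every message in the user's SPAM folder over IMAP, then
# delete them.  Host, credentials and folder name are placeholders.
import imaplib

def fetch_spam_folder(host, user, password, folder="SPAM"):
    imap = imaplib.IMAP4_SSL(host)
    imap.login(user, password)
    imap.select(folder)
    _typ, data = imap.search(None, "ALL")
    messages = []
    for num in data[0].split():
        _typ, parts = imap.fetch(num, "(RFC822)")
        messages.append(parts[0][1])            # raw RFC822 bytes
        imap.store(num, "+FLAGS", "\\Deleted")  # mark for removal
    imap.expunge()
    imap.logout()
    return messages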

Timothy Mieszkowski

Aug 15, 2006, 2:38:17 AM
to okopi...@googlegroups.com
On 8/15/06, Kevin Winter <therebe...@gmail.com> wrote:

> By the way, is gmail only accessible with the help of python? I'd
> like to make this as dependancy-light as possible.
>

Oh yeah, I forgot that Gmail has POP access anyway, so no Python would be
necessary, but you still have to use something.
Bash is out if you want to be multiplatform; I guess Perl would be the
way to go if you had to force a choice. Personally I say anything
executable, but I'm a Unix user.
This might be a point against this approach.
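For what it's worth, the POP side really is tiny. Something like this in Python (the account details are placeholders, and a real version would want to leave mail on the server and track UIDLs rather than delete):

# Sketch: pull everything from a spam-only POP account.  This naive version
# deletes what it downloads, so it should never be pointed at a real inbox.
import poplib

def pop_fetch(host, user, password):
    pop = poplib.POP3_SSL(host)
    pop.user(user)
    pop.pass_(password)
    count, _size = pop.stat()
    messages = []
    for i in range(1, count + 1):
        _resp, lines, _octets = pop.retr(i)
        messages.append(b"\r\n".join(lines))
        pop.dele(i)
    pop.quit()
    return messages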

Ahh damn... I didn't even think about usernames/passwords.

-Tim

crookedfoot

Aug 15, 2006, 5:54:54 AM
to okopipi-dev
I agree that an SMTP proxy is overkill. Mail client plugins are the way
to go, IMHO. I would think the best way to go about getting the
messages to the okopipi client is to start simply: begin with a
cross-platform command-line tool that pipes the message into the
already-running client process. Then each mail client-specific script
could just be built upon that tool.
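Something along these lines, say (a Python sketch only; the port number and the one-connection-per-message "protocol" are made up for illustration):

#!/usr/bin/env python
# okopipi-submit (hypothetical name): read one raw message on stdin and hand
# it to the already-running client daemon.  Port and protocol are placeholders.
import socket
import sys

DAEMON_ADDR = ("127.0.0.1", 9123)

def main():
    raw = sys.stdin.buffer.read()
    if not raw:
        sys.exit("okopipi-submit: no message on stdin")
    conn = socket.create_connection(DAEMON_ADDR)
    conn.sendall(raw)
    conn.close()

if __name__ == "__main__":
    main()

Then a mutt macro, a procmail rule, or a client plugin only has to pipe the selected message into that.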

How did Bluefrog handle this? No need to reinvent the wheel.

-Aaron

Kevin Winter

Aug 15, 2006, 9:08:15 AM
to okopi...@googlegroups.com
On 8/15/06, crookedfoot <aaron...@gmail.com> wrote:
>
> I agree that a smtp proxy is overkill. Mail client plugins are the way
> to go IMHO. I would the think the best way to go about getting the
> messages to the okopipi client would be to start simply. Begin with a
> cross-platform command line tool that pipes the message into the
> already-running client process. Then, each mail client-specific script
> could just be built upon that tool.

That's an excellent point. How about this:

Make the okopipi client a daemon running in the background. This is
the true client; everything else is an interface (look at mpd/mpc and gmpc
for a great example). Then we can start with the command-line tools
as you suggest (like mpc) and keep them around for the command-line
junkies like me; that also makes it scriptable. Then we make plugins for
various clients. These plugins just follow a pre-established
protocol to talk to the daemon client, which is just a slightly
modified GNUnet client.
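The daemon end of that protocol could start out as dumb as this (a Python sketch; the same made-up "one connection = one raw message" idea as the submit tool above, and the real thing would of course sit alongside the GNUnet code):

# Sketch of the daemon side: accept local connections, treat each connection
# as one raw message, and queue it for hashing/reporting.  Port and protocol
# are placeholders, not a spec.
import socket

LISTEN_ADDR = ("127.0.0.1", 9123)

def handle_message(raw):
    pass                     # hashing / reporting / GNUnet hand-off goes here

def serve():
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(LISTEN_ADDR)
    srv.listen(5)
    while True:
        conn, _peer = srv.accept()
        chunks = []
        while True:
            data = conn.recv(65536)
            if not data:
                break
            chunks.append(data)
        conn.close()
        handle_message(b"".join(chunks))

if __name__ == "__main__":
    serve()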

We can even take Mosinu's current project and use that as an
interface. His code also does a LOT more filtering, and may be able
to automatically flag messages as spam.

For those who want to use scripts to take mail from their SPAM folder
and send it to the client - that works fine as well.

As to Gmail POP: yes, it has that capability. But unless you leave
the messages on the server, it's very easy to download all of someone's email
by mistake. I've known too many people who POPped their email to some
random machine and lost access to it, so I'd rather treat Gmail POP
as a last resort. A better solution would be something like a Firefox
plugin covering the various webmail clients.

Mosinu

Aug 15, 2006, 9:39:43 AM
to okopipi-dev
Actually, you're missing something here... Bluefrog had a central point
to send this out to. They had an e-mail address to forward the e-mail
to, and they had a central site the webmail client did an HTTP POST to so
you could report spam. We don't have that; we do not have a central
server to accept all this.

Writing plugins for every client is a LOT of work, and you have to
maintain them. Don't forget the different versions of every client out
there, too. It also means dealing with the different ways ISPs set up
e-mail (MSN, AOL, Earthlink, universities). As for a client connecting
to my mail accounts: my ibiblio account does mail differently than my
Earthlink one. Do you really want to try to sort all those out? If you POP
Gmail... what about Yahoo and Hotmail?

A transparent proxy is the easiest method in my opinion. We could catch
messages on the way in and grab most spam before it ever hits the inbox,
and the user can tag what we miss and report it back out, which will in
turn be caught and reported. We can easily add something in the proxy to grab
webmail from Yahoo, Hotmail and Gmail. We could even take the Bluefrog
Firefox extension, use it as it is, and let it report to the proxy.

Kevin Winter

Aug 15, 2006, 9:47:19 AM
to okopi...@googlegroups.com
On 8/15/06, Mosinu <mos...@gmail.com> wrote:
> A transparent proxy is the easiest method in my opinion. We could catch
> the message incoming and grab most spam before it ever hits the inbox,
> and the user can tag what we miss and report it back out which will be
> caught and reported. We can easily add something in the proxy to grab
> webmail from Yahoo, Hotmail and Gmail. We could even take the bluefrog
> firefox extension and use it as it is and let it report to the proxy.

I quite agree with this. Along with this, we should still have the
okopipi client remain a daemon in the background. Then whichever
method of submitting spam is used, we can modularize development so
that if someone wants to use a script with mutt, or a plugin with
Firefox, or a transparent proxy (which, I have to admit, is starting
to look like my first choice given the alternatives), they all
interface with the same backend client.

~Kevin

Aaron Askew

Aug 15, 2006, 10:06:15 AM
to okopipi-dev
Kevin Winter wrote:
> I quite agree with this. Along with this, we should still have the
> okopipi client remain as daemon in the background. Then whichever
> method of submitting spam is used, we can modularize development so
> that if someone wants to use a script with mutt, or a plugin with
> firefox, or a transparent proxy (which, i have to admit, is starting
> to look like my first choice given the alternative), they all
> interface with the same backend client.

I'm sold. At first, the idea of a layman having to change email server
settings seemed horrific (and I still think it will turn some away),
but it seems the lesser of some evils at this point. I also agree that
the proxy should be a module that connects to the underlying daemon.

I would think that the "modules" would be responsible for transmitting
the unparsed message to the daemon, which then hashes and processes it
(rough sketch of what I mean below). This would allow for maximum
modularity, so that the proxy could be replaced in the future if so desired.
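For instance, the daemon could boil each submission down to a digest along these lines (sketch only; the normalization step here is a guess at what we would actually settle on):

# Sketch: the module hands over the raw, unparsed message; the daemon parses
# it and works with a digest rather than the full text.  The normalization
# (strip whitespace, lowercase the body) is only an example.
import email
import hashlib

def digest_message(raw_bytes):
    msg = email.message_from_bytes(raw_bytes)
    if msg.is_multipart():
        body = msg.get_payload(0).get_payload(decode=True) or b""
    else:
        body = msg.get_payload(decode=True) or b""
    normalized = b" ".join(body.split()).lower()
    return hashlib.sha1(normalized).hexdigest()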

By the way, I think the proxy is a superb notion. My only concern with
it is appealing to the lowest common denominator.

-Aaron

Mosinu

Aug 15, 2006, 10:12:59 AM
to okopipi-dev
If done right, the proxy and okopipi could be the same daemon. The
proxy piece would catch the spam and hand it off to a module inside that
handles reporting. Those who are "super nodes" or "admin nodes"
could get an extra module or plugin for that role.

If we do the proxy correctly, the user won't have to change anything in
the mail client (i.e., a genuinely TRANSPARENT proxy). My main concern with
client plugins is the clients that are not always backwards compatible.
Think Outlook 2000 vs. Outlook 2003 vs. Outlook Express.

With the code I sent you, it is already modular and has an API, so
anyone could write plugins to easily expand its capabilities. We could
make a reporting plugin that reports rather than just filters, then
create one for opt-out campaigns for those willing to go that route.
This way we have something for everyone, but those who question the
opt-out tactic don't have to participate or even install the module.
This would give us a much broader user base than Bluefrog was able to
gain, because a lot of people questioned the way they did opt-out.
It also lets the user use RBLs even if their ISP does not. We are then
only dealing with standard protocols, not e-mail clients.

Kevin Winter

Aug 15, 2006, 10:24:29 AM
to okopi...@googlegroups.com
On 8/15/06, Mosinu <mos...@gmail.com> wrote:
> If done right, the proxy and okopipi could be the same daemon. The
> proxy piece would catch the spam and hand it off to a module inside to
> handle reporting. For those who are "super nodes" or "admin nodes" they
> could get an extra module or plugin for that role.

The main reason I want to make it modular is to not force the user to
use the transparent proxy. The way I'm thinking, we'll likely package
the transparent proxy/GUI along with the daemon, but give the user a
choice of how to access it. Besides which, as Aaron suggested, this
makes the transparent proxy replaceable if needed. This modularity
will make development a tad easier than one monolithic application would.

Think about it this way: if we need to make a change, we'd only have
to update one, and not the other. It's been my experience that GUIs
introduce more bugs than standalone daemons. I'd like to make the
okopipi client daemon a stable, solid base to build off of.

>
> If we do the proxy correctly, the user won't have to change anything in
> the mail client (eg. TRANSPARENT proxy). My main concern with client
> plugins is for the ones that are not always backwards compatiable.
> Think Outlook 2000 vs Outlook 2003 vs Outlook express.

I agree. The transparent proxy will certainly solve this issue. I
just don't want to make the users _have_ to use it.

> With the code I sent you, it is already modular, and has an API. So
> anyone could write plugins to easily expand the capablities. We could
> make a reporting plugin, that reports rather than just filter. Then we
> create one for opt-out campaigns for those willing to go this route.
> This way we have something for everyone, but those who question the
> tactic of opt-out do not have to particpate or even install the module.
> This would give us a much broader user base than Bluefrog was able to
> gain because a lot of people question the use of the opt-out the way
> they did it. It also lets the user use the RBL's even if their ISP does
> not. We are then only dealing with standard protocols, not e-mail
> clients.

I'd like to fold some of that functionality into the daemon. I'd like
to see the transparent proxy use greylisting/RBLs/Bayesian filtering and
then pass the email off to the daemon. If we do this right, a few
bash scripts and cron jobs should make automatic reporting and
opt-outs a real possibility.
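The RBL part, at least, is nearly free. Something like this (Python sketch; the zone is just the usual example, not a decision):

# Sketch of a DNSBL/RBL check: reverse the connecting IP's octets and look the
# result up under a blacklist zone.  Any A record back means "listed".
import socket

def listed_in_rbl(ip, zone="zen.spamhaus.org"):
    query = ".".join(reversed(ip.split("."))) + "." + zone
    try:
        socket.gethostbyname(query)
        return True
    except socket.gaierror:
        return False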

I'd like the transparent proxy/frontend to configure and manipulate
everything the client does; however, I want the actual functionality
in the client daemon.

Timothy Mieszkowski

Aug 15, 2006, 1:55:18 PM
to okopi...@googlegroups.com
I'm not exactly convinced about the transparent proxy, but I agree
that it does look best. If the interfaces are properly modularized,
it wouldn't be much trouble to switch over to handling each mail
client separately. I don't think they even need to be separate
processes or pieces of code; I just don't want us to get locked into
any one way of doing anything.

As for the GUI thing, there is NO way that a GUI of any kind should be
part of this. This is just an extension of the reasoning above. It's
easy to make a GUI that interfaces to a daemon, but to include them
together is madness if you ask me.

Overall, I think we need to concentrate on getting something simple
out the door after the 1st. There's no sense getting bogged down in
details like this, because once coding starts, who knows what could
change.

Mosinu

Aug 15, 2006, 2:02:59 PM
to okopipi-dev

Kevin Winter wrote:
>
> The main reason i want to make it modular is to not force the user to
> use the transparent proxy. The way i'm thinking, we'll likely package
> the transparent proxy/gui along with the daemon, but give the user a
> choice on how to access it. Besides which, as Aaron suggested, this
> makes the transparency proxy replaceable if needed. This modularity
> will make development a tad easier than if we have one monolithic
> application.
>
> Think about it this way: if we need to make a change, we'd only have
> to update one, and not the other. It's been my experience that GUIs
> introduce more bugs than standalone daemons. I'd like to make the
> okopipi client daemon a stable, solid base to build off of.
>
>
> I agree. The transparent proxy will certainly solve this issue. I
> just don't want to make the users _have_ to use it.

That works for me.


>
> I'd like to fold some of the functionality into the daemon. I'd like
> to see the transparent proxy use greylists/RBL/baysian filtering, and
> then pass the email off to the daemon. If we do this right, a few
> bash scripts and cron jobs should make automatic reporting and
> opt-outs a real possibility.

The proxy already has whitelist/blacklist/RBL/Bayesian filtering, so that
isn't a problem. But I am curious about the bash and cron scripts. While
that's great for those of us on Linux and Mac OS X, what about the windoze
users? Do you expect them to install Cygwin or something similar? Why
not have the daemon report as soon as it determines a message is
spam, and remove the need for cron/bash/Python/Perl/C#?

>
> I'd like the transparent proxy/frontend configure and manipulate
> everything the client does; however, i want the actual functionality
> in the client daemon.
>

It can handle this, with no problem.

Kevin Winter

Aug 15, 2006, 2:16:58 PM
to okopi...@googlegroups.com
On 8/15/06, Mosinu <mos...@gmail.com> wrote:
> The proxy has whitelist/blacklist/RBL/Bayseian filter already, so that
> isn't a problem. But I am curious about bash and cron scripts. While it
> is great for those of us on Linux and Mac OS X; what about the windoze
> users? Do you expect them to install cygwin or something similar? Why
> not have the daemon report as soon as it determines the message is
> spam? Remove the need for cron/bash/python/perl/c#.

Well, I don't want to make it a need so much as an option.
If we have a standalone client daemon listening on a port, it can
handle connections from your proxy, from a plugin, or from a script,
all using the same protocol. This makes it ultimately modular, allowing
the end user to use whichever interface they choose.

I also do not intend to inflict a GUI upon the user. My preference
would be a config file the daemon reads when run; any user could
simply edit this file in a text editor, or they could use a GUI to
change the settings. I DO feel it is important to include _a_ GUI, or else
this won't appeal to the average user. They won't have to use it; it
will just be there.
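Something as dull as an ini-style file would do; the daemon parses it on startup and a GUI just writes to the same file. Everything below is a made-up example, not an agreed format:

# Sketch: a plain-text config the daemon reads at startup.  Every key and the
# file location are examples only.
#
#   ~/.okopipi/okopipi.conf
#   [daemon]
#   listen_port = 9123
#   spam_dir = ~/.okopipi/spam
#   [proxy]
#   enabled = yes
#   use_rbl = yes

import configparser
import os

def load_config(path="~/.okopipi/okopipi.conf"):
    cfg = configparser.ConfigParser()
    cfg.read(os.path.expanduser(path))
    listen_port = cfg.getint("daemon", "listen_port", fallback=9123)
    proxy_enabled = cfg.getboolean("proxy", "enabled", fallback=True)
    return listen_port, proxy_enabled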
