Dedalus homogeneity tolerance


evan...@colorado.edu

Aug 18, 2015, 4:42:19 PM
to Dedalus Development
Hi all,

I'm moving this discussion to the Dev group, as I think it's a more appropriate place than the user group.

I've just pushed up a first-pass implementation of a homogeneity tolerance (basically, accepting the RHS for the EVP if it's essentially zero).  The logic of what I've implemented is this:

1.) Evaluate the expression
2.) Pick out the max(abs(evaluated expression))
3.) Pick out the max(abs(all of the NCCs that went into calculating that expression))
4.) compare max(evaluated) / max(NCCs) to a tolerance (right now 1e-10 default or user-specified by kwarg)
5.) If the ratio from #4 is < tol, the expression is considered homogeneous.

As it currently stands, what this means is that if you have a problem with bigger parameters (let's say order 1e5), then the expression only has to have a max of 1e-5 in order to fit within a tol of 1e-10.  If you have maximum NCCs of 1e-3, then you need the expression maximum to be 1e-13 to fit within a tol of 1e-10.  This isn't a perfect implementation, and I'm definitely open to suggestions.
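Those steps can be sketched in plain NumPy (a hedged sketch: the function name and the way NCC data is passed in are illustrative, not the PR's actual code):

```python
import numpy as np

def is_homogeneous(evaluated, ncc_arrays, tol=1e-10):
    """Steps 1-5 above: compare the max of the evaluated expression
    to the max over all NCCs that entered it."""
    max_expr = np.max(np.abs(evaluated))                    # step 2
    max_ncc = max(np.max(np.abs(a)) for a in ncc_arrays)    # step 3
    return (max_expr / max_ncc) < tol                       # steps 4-5

# NCCs of order 1e5: an expression max of 1e-6 easily fits tol=1e-10 ...
print(is_homogeneous(np.full(8, 1e-6), [np.full(8, 1e5)]))    # True
# ... while with NCCs of order 1e-3, even 1e-12 does not:
print(is_homogeneous(np.full(8, 1e-12), [np.full(8, 1e-3)]))  # False
```

This makes the relative nature of the test explicit: the same absolute residual can pass or fail depending on the scale of the NCCs.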

Other things to note: I'm not sure where all of the functionality of dedalus is hidden and it's highly possible that I re-implemented something that it can already do.  If so, point me in the direction of those functions and I'll clean stuff up.

I'm also happy to clean up comments and logger info, but right now I'm just trying to make things fairly obvious so that it's clear what I'm doing.

(I'm about to copy this to the Pull request, too, but I wanted to make sure the discussion could happen easily via e-mail)

Keaton Burns

Aug 19, 2015, 11:08:44 AM
to dedal...@googlegroups.com
Hi Evan,

This looks like a good general approach and a good start at an implementation. Here are a couple suggestions:

1) Instead of trying to extract parameters by re-parsing the RHS expressions, you can use the ‘atoms’ method to get the set of leaves in the operator tree, i.e. all fields and scalars present in the expression, and you could then directly examine the size of all of these to get a measure of the amplitude you want to compare to. Note this will also include numbers directly entered in the equations, which won’t be listed as parameters but whose amplitudes we still may want to check.
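A hedged sketch of what this could look like (the `atoms()` call follows Keaton's description of the operator-tree API; the `Leaf` class and its `.data` attribute are stand-ins, not real Dedalus types):

```python
import numbers
import numpy as np

class Leaf:
    """Stand-in for a Dedalus field leaf carrying grid data."""
    def __init__(self, data):
        self.data = np.asarray(data)

def reference_amplitude(atoms):
    """Given the set of leaves from expr.atoms(), return the largest
    amplitude among them; bare numbers typed into the equations are
    included, as noted above."""
    amps = []
    for leaf in atoms:
        if isinstance(leaf, numbers.Number):
            amps.append(abs(leaf))
        else:
            amps.append(float(np.max(np.abs(leaf.data))))
    return max(amps)

# A field of order 1e-3 plus a bare constant 2.0 in the equation:
print(reference_amplitude({Leaf([1e-3, -5e-4]), 2.0}))  # 2.0
```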

2) One thing to be really careful about is deadlocking under parallelization: specifically, making sure that different processes make the same decision about accepting or rejecting an expression. Otherwise you may have some processes raising an error while others just continue and wait at the next MPI block. In particular, since the parameter data may be distributed in grid space, you'll need to do an allreduce on the local maximum that each process computes for its local portion of the parameter data, test the local portion of each expression against this global amplitude, and then do another allreduce so that if any process finds data violating the tolerance, everyone else will be on the same page about raising the error. Note that this wasn't necessary before, since the test "expr != 0" is just a test on the scalar value of the RHS, which, unlike grid data, is constant across processes.
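A sketch of the two-allreduce pattern (hedged: a one-rank stub stands in for the communicator so the example runs without MPI; with mpi4py the calls would be `comm.allreduce(x, op=MPI.MAX)` on the appropriate communicator):

```python
import numpy as np

class SerialComm:
    """One-rank stand-in for an MPI communicator, so the sketch runs
    without mpi4py."""
    def allreduce_max(self, value):
        return value

def accept_rhs(local_expr, local_params, tol, comm):
    # Allreduce 1: global amplitude of the distributed parameter data.
    global_param_max = comm.allreduce_max(float(np.max(np.abs(local_params))))
    # Allreduce 2: global maximum over the local portions of the expression.
    global_expr_max = comm.allreduce_max(float(np.max(np.abs(local_expr))))
    # Every rank now holds identical global maxima, so all ranks make the
    # same decision, and any error is raised everywhere at once rather than
    # leaving some ranks waiting at the next MPI call.
    return global_expr_max <= tol * global_param_max

print(accept_rhs(np.array([1e-14]), np.array([1.0, -2.0]), 1e-10, SerialComm()))  # True
```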

GENERAL QUESTION: Does anyone see issues with this tolerance-relative-to-RHS-parameters approach to doing the zero testing? Seems ok to me, but I haven't been working with the complex RHSs some of you guys have, where perhaps something funny could happen.

Again, looking good, and let me know if you’d like me to round out a few of these corners with parsing, MPI checks, etc., as I’d be happy to.

-Keaton
> --
> You received this message because you are subscribed to the Google Groups "Dedalus Development" group.
> To unsubscribe from this group and stop receiving emails from it, send an email to dedalus-dev...@googlegroups.com.
> To post to this group, send email to dedal...@googlegroups.com.
> To view this discussion on the web visit https://groups.google.com/d/msgid/dedalus-dev/15ed8689-4bb0-430f-92c9-864bdf5fdd3b%40googlegroups.com.
> For more options, visit https://groups.google.com/d/optout.

evan...@colorado.edu

Aug 19, 2015, 4:13:47 PM
to Dedalus Development
Hi Keaton (& All),

Good points!  I went back and un-reinvented the wheel.  Now Dedalus handles parsing and evaluating the expression, which dropped about 20 lines of code.  I also implemented the parallelization you suggested.  I had been testing this exclusively in serial and managed to forget entirely that parallelism is a core part of Dedalus.  I've tested it over 4 processors on my PC and it seems to be working as intended.

If there's anything else stylistically you'd like me to address (or if anyone has suggestions for improving the logic), I'm happy to improve or polish it.  It seems to be working well for my application, but it would be good to get it tested on other problems (if anyone else has a similar set of equations with NCCs on the RHS).

Side note: The /extras/EVP.py file is still lingering.  Should I drop it out of the PR?  I have a copy of it in my polytrope directory for my own personal use (while this homogenization business gets worked out), but I don't know what everyone thinks about its usefulness as something in /extras.  Feelings certainly won't be hurt if we want to drop it.

-Evan

Ben Brown

Aug 19, 2015, 11:50:36 PM
to dedal...@googlegroups.com
Keaton & Evan & all,

On Wed, Aug 19, 2015 at 9:08 AM, Keaton Burns <keaton...@gmail.com> wrote:

GENERAL QUESTION:  Does anyone see issues with this tolerance-relative-to-RHS-parameters approach to doing the zero testing?  Seems ok to me, but I haven’t been working with the complex RHS’s some of you guys have, where perhaps something funny could happen.


Seems like one possible edge case is when several RHS parameters cancel out to zero or near-zero.  For example, three terms that are each larger than tol but collectively add up to less than tol.  This is in fact the exact non-homogeneous term in the atmosphere Evan's working with.  Does the parsing structure catch this already?  Seems like it must, since Evan's getting decent results?

A related, even rarer edge case: say we again have several RHS parameters that are intended to cancel to zero.  One of them is above tol, but the rest are below tol.  The below-tol ones are dropped, leaving the above-tol one still remaining and non-zero.  This seems plausible but pretty unlikely.

These edge cases may already be caught by Evan's test above, but I'm uncertain.

--Ben
 



Keaton Burns

Aug 20, 2015, 6:26:53 AM
to dedal...@googlegroups.com
Hi Ben,

What Evan is doing here is examining all the RHS parameters, taking the maximum value attained by any of them, and then making sure their sum is below some tolerance relative to that maximum value. He’s not throwing out individual terms based on a tolerance — he’s using the tolerance to determine how close to zero the cancellation between the terms needs to be, so it should be exactly covering the case you describe.
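A small numeric illustration of the cancellation case (made-up numbers, not from Evan's atmosphere):

```python
import numpy as np

# Three O(1) terms, each far above tol individually, that cancel in sum.
terms = [np.array([1.0]), np.array([0.5]), np.array([-1.5 + 1e-14])]
rhs = sum(terms)                                   # the evaluated expression
max_param = max(np.abs(t).max() for t in terms)    # ~1.5, the reference scale
tol = 1e-10
# Only the sum is tested against tol * max_param, so the near-cancellation
# is accepted even though every individual term is well above tol.
homogeneous = bool(np.abs(rhs).max() <= tol * max_param)
print(homogeneous)  # True
```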

-Keaton

evan...@colorado.edu

Aug 20, 2015, 2:52:25 PM
to Dedalus Development
Hi Ben,

Keaton summed it up nicely.  The two edge cases you mentioned should be handled by my implementation, since we're not dropping individual parameters but rather comparing the result of the evaluated expression, which combines all those parameters.

My concern is the case of a large constant in the equation (e.g. 2) multiplying a combination of parameters which are all small (say of order 1e-6).

In this case, my "max param" will pick out 2 as the value that sets the tolerance.  Thus, for tol=1e-10, as long as the RHS is below 2 * tol = 2e-10, it's considered homogeneous.  Unfortunately, when our 'big' parameters are of order 1e-6, only requiring that their evaluated result contain values smaller than 2e-10 might not be rigorous enough.  Of course, the logger.info does put out a 'warning' message letting the user know how homogeneous their equation is, and they can adjust the tolerance parameter, but that might require more know-how than someone who just wants to enter their equations and run the EVP solve is likely to have.  It might get overlooked.
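To make that concern concrete (a hedged illustration with made-up numbers):

```python
# A bare constant 2 dominates the reference amplitude even though the
# physically meaningful parameters are O(1e-6).
params = [2.0, 1e-6, -1e-6]
max_param = max(abs(p) for p in params)   # 2.0, set by the constant
tol = 1e-10
rhs_max = 2e-10                           # leftover from the small parameters
accepted = rhs_max <= tol * max_param     # accepted as homogeneous
relative_error = rhs_max / 1e-6           # but 2e-4 relative to the small scale
print(accepted, relative_error)
```

The residual passes the test, yet relative to the 1e-6 parameters it is only good to about four digits, which may not be negligible.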

If anyone has suggestions for how to be careful in a case like this, I'm happy to hear it.

-Evan

Daniel Lecoanet

Aug 20, 2015, 3:04:58 PM
to dedal...@googlegroups.com
I've had a bit of experience with this: in one case, 10^-4 errors in the NCCs were creating an order-1 error in my flux balance.  There's no way to get around these "edge cases."  Whenever the user picks a tolerance, they're always at risk, and it is up to them to make sure the tolerance is appropriate for their problem.

Daniel

Keaton Burns

Aug 25, 2015, 2:22:16 PM
to dedal...@googlegroups.com
Hi Evan,

Yeah, I think that issue is unavoidable, and not necessarily related to constants.  For instance, you could have a parameter that only appears on the RHS with exponents greater than 1, so the magnitude of the parameter isn't really a correct measure of the amplitude of the inhomogeneous terms.  I guess a solution would be to expand the RHS, take the magnitudes over each additive term rather than over each atom, and compare the cancellation to those... but that may be more trouble than it's really worth.

Ok now on to some of the details in the PR:

1) Instead of using MPI.COMM_WORLD, it would be better to use domain.dist.comm, which is the communicator that a given domain/problem is distributed over, and isn’t necessarily the same as comm_world. The domain is available as an attribute on the solver object.

2) I think removing the extras/EVP file is probably appropriate, since the current approach should subsume its functionality. That is, the code should drop any truly inhomogeneous RHS terms if you set tolerance=np.inf.

Some other minor points:

1) The tolerance probably doesn’t need to be passed through as an argument to either _require_homogeneous or _check_if_zero, since those methods can both access it as an attribute off the solver (self).

2) You could simplify the comparison to just be: homogeneous = (global_max <= tolerance*max_param)

3) In _find_max_param, you can just use 0 as the starting point for max_val rather than -1e99. The logic about "if params == 'all'" can probably also be removed.
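A hedged sketch of those three points together (the names echo the thread's mentions of `_find_max_param` and the solver's tolerance attribute, but this is illustrative rather than the PR's actual code):

```python
import numpy as np

def find_max_param(param_arrays):
    # 0 is a natural floor for a maximum of absolute values; no -1e99 sentinel.
    max_val = 0.0
    for data in param_arrays:
        max_val = max(max_val, float(np.max(np.abs(data))))
    return max_val

def is_homogeneous(global_max, max_param, tolerance):
    # The suggested single-expression comparison.
    return global_max <= tolerance * max_param

print(find_max_param([np.array([-3.0, 2.0]), np.array([0.5])]))  # 3.0
print(is_homogeneous(1e-12, 3.0, 1e-10))  # True
```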

-Keaton

evan...@colorado.edu

Aug 25, 2015, 4:40:30 PM
to Dedalus Development
Hi Keaton,

Yeah, that sounds like more effort than is worth it.  If someone manages to come up with a use case where they run into such a problem, I'm happy to try to modify it then.  For now, this should work.

All of your comments should now be incorporated into the PR.  Let me know if anything looks amiss or confusing, and I'll try to fix it.

-Evan

Keaton Burns

Sep 29, 2015, 5:11:25 PM
to dedal...@googlegroups.com
Hi Evan,

Is there anything else you're thinking of adding to the PR? If not, I think it would be a good idea to rebase & collapse these changesets onto the more recent main-repo commits, and then it'll probably be ready to merge. Rebasing can be a little tricky but really helps clean things up when examining the main repo development — let me know if you want a quick outline of how to go about this.

Best,
-Keaton

Jeffrey S. Oishi

Sep 29, 2015, 5:13:06 PM
to dedal...@googlegroups.com
Actually, I just started writing up a tutorial on this as part of a larger "how to contribute to Dedalus" document I'm working on. Evan, if you'd like to see my WIP on that, you're welcome to it.

j

Ben Brown

Sep 29, 2015, 5:13:40 PM
to dedal...@googlegroups.com
Keaton,
     I think a walkthrough of rebasing and collapsing changes would be generally useful.  Do you mind writing up a short tutorial and sending it out to dedalus-dev as a new thread?  Then we'd all have a spot to point others to when this comes up in the future.

--Ben

Jeffrey S. Oishi

Sep 29, 2015, 5:14:58 PM
to dedal...@googlegroups.com
OK, I'll send something out in the morning. Right now I have to go teach...

Evan H. Anders

Sep 29, 2015, 5:21:51 PM
to dedal...@googlegroups.com
All,

I'm not thinking of adding anything else to the PR, so I'm happy to rebase it.  I'll look into how to do that now.  If I can't figure it out from a little google-fu, I'll await Jeff's mail in the morning!

-Evan


Jeffrey S. Oishi

Sep 29, 2015, 5:22:36 PM
to dedal...@googlegroups.com
OK, cool. It's not super hard, but is made tricky by the fact that your changes are already public, since you've pushed them to bitbucket.

Evan H. Anders

Sep 29, 2015, 11:18:30 PM
to dedal...@googlegroups.com
All,

As far as I can tell, I've successfully squashed my commit history.  Let me know if everything looks OK! (Or just merge in the PR)

-Evan

Jeffrey S. Oishi

Sep 30, 2015, 12:29:46 PM
to dedal...@googlegroups.com

Hi Evan,

You sure did. Everything looks great. I've sent the promised notes on collapsing changesets to this list in a separate thread.

Jeff
