validation of tk_gui


Xunchen Liu

Jul 3, 2024, 3:05:57 PM
to openpiv-users
Dear Alex and other OpenPIV developers:

Thank you for developing the OpenPIV package. I was a LaVision DaVis user, but I have decided to switch to OpenPIV Python for more flexibility and automation of the process.

I mainly measure high-repetition-rate PIV images, so later I would like to try the time-smoothing function.

But first I need to learn the package. Are the tutorial and the windef example good starting points?

Also, I tried tk_gui and studied the YouTube video of the GUI. I am not sure what the red vectors mean after validation: after applying a simple signal-to-noise filter with the recommended value of 1.06, all vectors are marked red, and I am not sure what that means.
These are calculations for the test example; the issue is the same with my own data.

[Two screenshots attached]

Can I export all the settings from the GUI, so that I can automate the subsequent calculations?

Thanks again

Ivan Nepomnyashchikh

Jul 3, 2024, 5:58:50 PM
to openpiv-users
According to your screenshots, you chose to validate your velocity vectors based only on the signal-to-noise ratio, for which you picked a threshold of 1.06. In that case, every vector whose correlation has a signal-to-noise ratio below your threshold is deemed invalid and marked red; otherwise it is marked blue. Since all of your vectors are red, they are all invalid, i.e., their signal-to-noise ratio is less than 1.06.
It's good that, having validated your field, you didn't choose to remove the invalid vectors - your entire field would've been removed.
Try lowering your s2n threshold till you get about 95% of the blue (i.e., valid) vectors.
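That search can be sketched in a few lines of numpy (the s2n values, the starting threshold, and the step size here are invented for illustration; only the thresholding logic matters):

```python
import numpy as np

def fraction_valid(s2n, threshold):
    # Fraction of vectors whose signal-to-noise ratio meets the threshold;
    # these are the vectors the GUI would draw in blue.
    s2n = np.asarray(s2n, dtype=float)
    return np.count_nonzero(s2n >= threshold) / s2n.size

# Hypothetical s2n values taken from one PIV pass.
s2n = np.array([1.02, 1.04, 1.05, 1.08, 1.10, 1.15, 1.20, 1.30])

# Step the threshold down until about 95% of the vectors are valid.
threshold = 1.10
while fraction_valid(s2n, threshold) < 0.95:
    threshold -= 0.01
```

With real data you would run this once per field (or over the pooled s2n values of several fields) rather than hard-coding a threshold up front.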
I personally don't use GUI and I don't know how to work with it. If - as you claim - you want more flexibility and automation options, I suggest you should switch to Python. Personally, I use the Python version of OpenPIV and it gives me all the flexibility I can dream of.
Ivan

Xunchen Liu

Jul 4, 2024, 10:22:57 AM
to openpiv-users
Thanks, Ivan, for your valuable suggestion. You are right: I need to figure out why all the sig2noise values are low. I use Python too, with windef.py as my starting point; I think all the settings are there to tune.
I tested with data I measured myself: about 100 PIV images of a swirling flame.
Figure_2.png
Then I used a 32/16 window size with 16/8 overlap to calculate the correlation. I like the option in the windef function to plot all figures.
Figure_23.png
Figure_24.png
But after tuning some other options, all vectors are marked red, so I guess I have a signal-to-noise problem. There are also some histogram plots that I cannot identify just by looking at windef.py.
Figure_6.png
Figure_13.png

I looked at windef.py and modified it to save the signal-to-noise ratio of the first pass and at the end of the multipass.
My signal-to-noise values look like this; there are many points with s2n = 0.
I thought sig2noise should be around 1.05-1.1?
Why are these values so far off?
Figure_1st.png
Figure_multipass.png

Anyway, I also added a function to plot streamlines in tools.py. I think this would be a good option to add.

Figure_1.png

Ivan Nepomnyashchikh

Jul 4, 2024, 1:07:17 PM
to openpiv-users
In fact, multipass is the most advanced PIV procedure, and I don't think starting with it is a good idea. It is actually more difficult to tune, even though that sounds counterintuitive. Multipass has a lot of steps and doesn't give you a good means to debug each one. Instead, it asks you to supply the settings, performs all the steps with those settings, and gives you the result. You could do the same in LaVision or Dantec; there would not be much difference. That's why I, personally, don't like how multipass is implemented.

That being said, you did a good job with multipass: your s2n histograms are pretty nice, and visually there are not that many outliers in your velocity field. And yes, s2n should preferably be about 1.2 or so, but that can be difficult to achieve. You need to tune your multipass settings more, which in my experience can be tedious. Just so you know, the most difficult part is finding the right setting for the median filter. That can be such a pain!

I strongly suggest you mask out the portion of the field without the flow (that black part on the right-hand side of your PIV images). Those kinds of things create problems, especially with advanced procedures like the window deformation used in multipass, and they give you red vectors, which lowers your cumulative s2n. Masking is easy, but figuring it out is tricky. Luckily, the part of your images without the flow looks almost like a vertical rectangle. In this case, instead of masking, just use the numpy trick: every image is a numpy matrix, so use numpy slices to cut off the right part of your image and supply the result to your OpenPIV code.
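The slicing trick can be sketched like this (the image size and the column where the flow ends are invented for illustration):

```python
import numpy as np

# Stand-in for a PIV frame that would normally be loaded from disk,
# e.g. with openpiv.tools.imread; here it is just a blank 8-bit image.
frame_a = np.zeros((1024, 1280), dtype=np.uint8)

# Suppose the right-most 256 pixel columns contain no flow:
# keep every row, keep only columns 0..1023.
frame_a_cropped = frame_a[:, :1024]

# The cropped array is what you would pass to the OpenPIV routines
# instead of the full frame (do the same slicing to frame_b).
```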

To summarize, I think starting with windef.py is not the best idea. I would start with simple PIV. You could either use the simple PIV function from the OpenPIV package or do what I did: find the correlations first, then convert them to displacements. That is the most basic PIV procedure you could think of, and it gives you a means to debug each step.

Below is my code (for educational purposes, to show you how to use the correlation and displacement functions separately). I have bubbles in the flow, which is why I use masks (to mask out the bubbles). Just so you know, I also have issues with signal-to-noise ratio, but that's because my original images are not ideal. Also, pay attention that OpenPIV uses the word "mask" to refer to invalid vectors; when I say "mask" I mean a real mask, something you put on the image to cover part of it.

# Imports this function relies on (added here for completeness):
import numpy as np
from openpiv import filters, preprocess, pyprocess, scaling, tools, validation

def PIVanalysis(dataArray,
                imageShape,
                searchAreaSize,
                overlap,
                scalingFactor,
                deltaT,
                mask,
                saveToDir,
                counter,
                threshold = 0.8                
               ):
   
    """
    See ensemble_correlation.ipynb OpenPIV tutorial.
    Parameters: dataArray (numpy array) - an array of PIV images that contains only
                                          one PIV pair, its structure is
                                          dataArray = np.asarray([frameA, frameB])
    Returns: None
    """
    # The name of the OpenPIV txt file.
    nameTXT = saveToDir / 'OpenPIVtxtFileForPair{}.txt'.format(counter)

    # Find the correlations map.
    corrs = pyprocess.fft_correlate_images(
        pyprocess.moving_window_array(dataArray[0],searchAreaSize,overlap),
        pyprocess.moving_window_array(dataArray[1],searchAreaSize,overlap),
        normalized_correlation=False
    )
   
    mesh = pyprocess.get_field_shape(imageShape,
                                     search_area_size = searchAreaSize,
                                     overlap = overlap
    )

    nrows, ncols = mesh[0], mesh[1]
   
    # Having found correlations, we can find corresponding displacements in pix.
    # IMPORTANT: they use u and v for displacements as well as for velocities in
    # OpenPIV. Since that's confusing, I decided to use xDisp and yDisp for
    # displacements and u and v for velocities.
    xDisp, yDisp = pyprocess.correlation_to_displacement(corrs, nrows, ncols)

    # Having found displacements in pix, we can find corresponding velocities in
    # pix/time
    u = xDisp/deltaT # pix/s
    v = yDisp/deltaT # pix/s

    x, y = pyprocess.get_coordinates(imageShape,
                                     search_area_size = searchAreaSize,
                                     overlap = overlap
    )

    # Mask out the areas without the flow right now (bubbles or corners).
    grid_mask = preprocess.prepare_mask_on_grid(x, y, mask[:,::-1])
    masked_u = np.ma.masked_array(u, mask=grid_mask)
    masked_v = np.ma.masked_array(v, mask=grid_mask)

    # Now let's do validation. Pay attention, that validation doesn't mean automatic
    # replacement of the outliers, it means that we are going to detect the outliers
    # and mark them as outliers. Their replacement is a separate procedure and will
    # be done next. Important: I don't remember why, below, I do median test first
    # and signal to noise ratio second. But I remember I did it on purpose. I am
    # going to just go with it.
   
    # Do the validation based on the median test (as described in the
    # German PIV book on p.185).
    invalid_mask_median = validation.local_median_val(masked_u,
                                                      masked_v,
                                                      u_threshold=50, # abs difference with the median
                                                      v_threshold=50, # abs difference with the median
                                                      size=2 # size=2 is a 5x5 kernel
    )
    # Now we can replace outliers flagged by the invalid_mask.
    masked_u, masked_v = filters.replace_outliers(masked_u.flatten(), masked_v.flatten(),
                                                  invalid_mask_median.flatten(),
                                                  method = 'localmean',
                                                  max_iter = 3,
                                                  kernel_size = 1
    ) # kernel_size=1 is the best; IMPORTANT to flatten u and v

    # Do the validation of the correlation peaks.
    corrs = corrs.astype('float64')
    s2n = pyprocess.sig2noise_ratio(corrs, "peak2mean")
    invalid_mask_s2n = validation.sig2noise_val(s2n, threshold = threshold)
    # Now we can replace outliers flagged by the invalid_mask.
    masked_u, masked_v = filters.replace_outliers(masked_u.flatten(), masked_v.flatten(),
                                                  invalid_mask_s2n,
                                                  method = 'localmean',
                                                  max_iter = 3,
                                                  kernel_size = 1 # used to be 3
    ) # kernel_size=1 is the best; IMPORTANT to flatten u and v

    # Combine the two masks (the way they are combined is copied from the code
    # for OpenPIV.validation.typical_validation line 295). These are, actually,
    # flags, not masks.
    invalid_mask = invalid_mask_s2n | invalid_mask_median.flatten()

    x, y, masked_u, masked_v = scaling.uniform(x, y, masked_u, masked_v, scaling_factor = scalingFactor)

    # MASK OUT VELOCITY FIELD.
    # Before saving the field to a .txt file, give zeros to those vectors that lie in the masked regions.
    # Right now, x and y are in the image system of coordinates: x is to the right, y is downwards, (0,0)
    # is in the top left corner. I learnt that from the GitHub code of tools.transform_coordinates.
    # Since my x and y are in mm, I'm going to use the scaling factor to convert them to pix. Then I'm going
    # to use them to identify whether or not their place on an example masked image is masked.
    xFlat = x.flatten()
    yFlat = y.flatten()
    for i in range(xFlat.size):
        if mask[int(yFlat[i]*scalingFactor), int(xFlat[i]*scalingFactor)] == 255:
                masked_u[i] = 0
                masked_v[i] = 0
   
    x, y, masked_u, masked_v = tools.transform_coordinates(x, y, masked_u, masked_v)

    tools.save(str(nameTXT), x, y, masked_u, masked_v, invalid_mask)
   
    # FOR DEBUGGING PURPOSE USE THIS CODE TO PLOT VELOCITY FIELDS INSTEAD OF SAVING THEM:
    # figure, _ = tools.display_vector_field(str(nameTXT),
    #                                        scaling_factor = scalingFactor,
    #                                        scale = 100,
    #                                        on_img = False,
    #                                        show_invalid = True
    # )
    # plt.show()

Xunchen Liu

Jul 5, 2024, 5:57:25 AM
to openpiv-users
Thanks, Ivan, for sharing your nice function. I understand your point that windef has many layers and that I should look at the procedure of each step. You are right about the mask, too.

Returning to my original question about tk_gui: I am still trying to understand what is going on in postprocessing. Today I tried again with the ideal test data and found that in the postprocessing step, all vectors are marked invalid (flag = 1) in the .vec file. How is the signal-to-noise ratio calculated or read out? According to your code, is it from a stored correlation file?
[Two screenshots attached]

Ivan Nepomnyashchikh

Jul 5, 2024, 1:34:43 PM
to openpiv-users
Again, I think you are going the hard way. Your screenshots mean that you are using a rather advanced PIV procedure. Therefore, if you want to return to your original question and try to understand what's going on in the postprocessing, you will have a hard time. I suggest you abandon this GUI thing and do a very simple PIV procedure - you will get good results. There is nothing wrong with the simple PIV.

But if you want to proceed with your GUI question, then, like I mentioned above, I don't know how GUI works because I have never used it. But I know how to approach such problems: you will have to go to the source code on GitHub, read it and try to understand what it is doing. That's how I learnt OpenPIV. That's what I, personally, do all the time.

Here is the GitHub folder with the source code for the GUI: https://github.com/OpenPIV/openpiv_tk_gui/tree/master/openpivgui.

I bet your GUI postprocessing is based on the openpiv-python's windef.py function. Here is the GitHub folder with the source code for the windef.py function: https://github.com/OpenPIV/openpiv-python/blob/master/openpiv/windef.py

Also, here is what I know from my experience with regards to your last two questions.

You asked how the signal-to-noise ratio is calculated or read out. I think the GUI is based on openpiv-python and uses its function
pyprocess.sig2noise_ratio(correlations). Here is its source code: https://github.com/OpenPIV/openpiv-python/blob/master/openpiv/pyprocess.py (see line 480). It uses an algorithm to find correlation peaks; the correlations themselves are found beforehand and supplied to the algorithm as an argument in the form of a numpy array (not a file, as you are guessing). Once the peaks are found, your GUI compares them to the threshold you provide. Pay attention: that is the simplest case. Your GUI uses this algorithm in several passes, adjusting the parameters at every pass. You will have to look at the windef.py function linked above to see how exactly the parameters are adjusted.
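The core idea of the "peak2mean" ratio can be sketched as follows. This is a deliberately naive version, not the library's implementation, which additionally masks the peak region and handles degenerate maps:

```python
import numpy as np

def peak2mean(corr):
    # Naive peak-to-mean signal-to-noise ratio for a single
    # correlation map: a sharp, dominant peak gives a large ratio,
    # a flat or noisy map gives a ratio close to 1.
    corr = np.asarray(corr, dtype=float)
    mean = corr.mean()
    return corr.max() / mean if mean > 0 else 0.0

# Toy 3x3 correlation map with one clear peak in the centre.
peaky = np.array([[0.1, 0.1, 0.1],
                  [0.1, 0.9, 0.1],
                  [0.1, 0.1, 0.1]])

# Nearly flat map: no trustworthy displacement peak.
flat = np.full((3, 3), 0.1)
```

A validation threshold of, say, 1.06 then simply asks whether the peak stands at least 6% above the map's mean level.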

When I was trying to adjust windef.py in my work, I had a hard time. I would give it the signal-to-noise ratio threshold and the median filter threshold, and it would start making the passes, adjusting my initial parameters and giving me some strange stuff at the end.

Also, very important: according to your screenshots, you do two validations, not one! You do a validation based on the signal-to-noise ratio (SNR) and a local median validation. SNR validation compares the correlation peaks to the threshold you provided; it may well be perfectly fine (i.e., it gives you all valid vectors). But then you do the local median validation. This is a very tricky one! It is the local median validation that could result in all of your vectors being invalid. Local median validation compares each velocity vector to several surrounding vectors; if the difference between the vector and the median of its surroundings exceeds the threshold you provided (you picked 1.2 according to your screenshots), the vector is considered invalid. I think 1.2 is a very small threshold, and that this is why all your vectors are red. Make this threshold, say, 25 (that is what I am using right now in my own work) and see how it goes. From my experience, getting local median validation right is the biggest pain; I can easily spend a day tuning its parameters.
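The effect described above can be seen in a toy version of the median test (1-D and simplified; the real validation.local_median_val works on the 2-D field with a configurable kernel, and the numbers below are invented):

```python
import numpy as np

def local_median_invalid(u, threshold):
    # Flag each vector whose deviation from the median of its
    # immediate neighbours exceeds the threshold (1-D toy version).
    u = np.asarray(u, dtype=float)
    invalid = np.zeros(u.size, dtype=bool)
    for i in range(u.size):
        lo, hi = max(i - 1, 0), min(i + 2, u.size)
        neighbours = np.concatenate([u[lo:i], u[i + 1:hi]])
        invalid[i] = abs(u[i] - np.median(neighbours)) > threshold
    return invalid

# A smooth velocity field with one obvious outlier.
u = np.array([10.0, 10.5, 11.0, 55.0, 11.5, 12.0])

tight = local_median_invalid(u, threshold=1.2)   # also flags good vectors
                                                 # next to the outlier
loose = local_median_invalid(u, threshold=25.0)  # flags only the outlier
```

A too-tight threshold flags valid vectors whose neighbourhood happens to contain an outlier, which is how an entire field can end up red.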

Ivan