I was looking for a function in Python which can shrink labels. It is kind of like erosion, but of each labelled region. Let's say I have two touching labels, 1 and 2, belonging to two circles. Now I want to shrink both labels / erode pixels so that they no longer touch, with a gap of d pixels between them.
I am not quite sure I understand you correctly.
Do you want to shrink each label by n pixels? Or do you want to shrink until touching labels are separated by at least n pixels and leave non-touching labels untouched?
In the latter case you could find the gradients in the label image, i.e. basically every pixel where the label value changes. Then dilate this line by n pixels and use it to mask out the labels.
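A minimal sketch of that boundary-dilation idea (my own code, not the snippet posted in the thread), assuming an integer label image and scikit-image/SciPy. Only the interface between two different non-zero labels is dilated, so non-touching labels stay untouched:

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import find_boundaries

def separate_touching_labels(labels, d):
    """Erase a gap of roughly d pixels wherever two different labels touch (d >= 1)."""
    labels = np.asarray(labels)
    # Pixels adjacent to any change in label value (label-label or label-background).
    all_edges = find_boundaries(labels, mode='thick')
    # Pixels adjacent to an object/background change only.
    bg_edges = find_boundaries(labels > 0, mode='thick')
    # Approximate interface where two different non-zero labels meet.
    touching = all_edges & ~bg_edges
    # Grow that interface so the final gap is about d pixels, then erase it.
    gap = ndi.binary_dilation(touching, iterations=d)
    out = labels.copy()
    out[gap] = 0
    return out
```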
I was just reading your code a bit. Labels can disappear, yes, but if I am eroding only the border pixels they cannot be split into two, right? I need some time to read and understand your code, and then I will be happy to do a PR.
@VolkerH
What I wanted is the former: shrink each label by n pixels. I thought of the brute-force way too; I was just wondering whether I could do something along the lines of the expand_labels implementation, or whether someone already has a function like that.
However, it does not take care of the scenarios described by @haesleinhuepf (disappearing objects and cutting a label in two; in the second case, the separated pieces will both keep the original label).
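For reference, a brute-force sketch of the former option (shrink every label by n pixels) using plain SciPy; the function name is mine. As discussed above, small objects can disappear entirely, and erosion can split one region into several pieces that all keep the original label value:

```python
import numpy as np
from scipy import ndimage as ndi

def shrink_labels(labels, n):
    """Erode every labelled region by n pixels, independently of its neighbours."""
    out = np.zeros_like(labels)
    for lab in np.unique(labels):
        if lab == 0:
            continue  # 0 is treated as background
        eroded = ndi.binary_erosion(labels == lab, iterations=n)
        out[eroded] = lab  # a region that erodes away entirely simply vanishes
    return out
```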
I did, but I did not post the file back since it is so easy to do according to what I described. But apparently this is not the case? Attached is the result, joined without naked edges at the surface seams.
trimshrink_fixed.3dm (93.7 KB)
I am new to Grasshopper and I am struggling with shrink-wrapping an object. If someone can help me with this issue, it would be great! Since I do not know how to write my own definition, I found two definitions that give me results that look close, but they both still have some problems. In the image I attached, the one on the very left is closest to what I want, although I need the edges to be sharper; that is what was included in the Rhino file I downloaded from the forum along with the definition. However, when I use the same definition (1.gh) and bake it, the middle one in the image is what I am getting, which is way too tensile. I also found another definition (2.gh) and could get the result on the very right. This one, on the other hand, is not tensile at all, but I could get sharp edges. Since I am completely new to Grasshopper, I could not figure out the problems. If someone knows how to solve this, please let me know! @laurent_delrieu
Thank you very much for the reply. But I am getting an error message next to the rightmost Mesh component saying that the data conversion failed from Line to Mesh. Also, do I have to set my objects in the Brep parameter? Once I do that, Grasshopper goes Not Responding. I am trying to Set Multiple Breps that are complex and large. Is that maybe why Grasshopper goes not responding? My final purpose is to wrap my building, which is complex and made of multiple objects. Please let me know what I should do. Thank you very much!
Capture.JPG (838×604, 18.6 KB)
That makes sense, Daniel, thanks for the explanation. Having a solution where no points/edges touch the input mesh would be ideal. Even having a nominal offset amount, so that instead of the shrinkwrap touching, it always stays away by a keep-out distance. That can also be controlled by offsetting the input mesh, I guess, but when those inputs are complex, non-manifold, or have self-intersections, the mesh offset fails.
Daniel, thank you for the amazing script and custom goal.
Do you think there is some way to make it work for this set of branches in 3d space?
Basically I have joined the branches and an interior mesh, but for some reason the branches are not detected. Is that because the subdivision of the wrapper is too coarse? Or is there something I can do in code to enhance the detection of these relatively small volumes compared to the bigger mesh of the interior?
0E5363B1-AA5E-4084-A23F-18AC5035D19F (1132×1524, 182 KB)
The meshes were processed and are all closed; they went through Mesh Repair, but some could still contain a very few (4-10) non-manifold edges or other problems even after the repair, and I had no time to fix them all manually. Scale is real size (mm). I can send a sample file if you think that would help.
Wow thanks for the fast answer Daniel!
Somehow it gives an error on the TriRemesh - "Invalid cast: Point to GeometryBase" - which is quite strange, as all the points are valid and the convex hull works without any errors. I tried to convert to the Point class that derives from GeometryBase, or even cast to it, but it still shows the same error. It also gives the same error on your sample, so it must be some library mismatch? If anyone has ideas, please share.
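For reference, the conversion I tried looked roughly like this in a GhPython component (`pts` is a placeholder for the input list of Point3d):

```python
import Rhino.Geometry as rg

# Wrap each Point3d in a Rhino.Geometry.Point, which does derive from GeometryBase,
# before passing the list to TriRemesh -- this still raised the same cast error.
geometry = [rg.Point(pt) for pt in pts]
```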
Any NURBS surface ALWAYS has a rectangular underlying (untrimmed) surface. By shrinking it, the control points get as close to the trim curve as possible, but unless the trim curve runs exactly along the point grid, there will always be points sticking out past the trimmed edges.
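A tiny rhinoscriptsyntax sketch that illustrates the point (assuming an already-trimmed surface in the document; this is an illustration, not a fix for the trimming behaviour):

```python
import rhinoscriptsyntax as rs

srf = rs.GetObject("Select a trimmed surface", rs.filter.surface)
rs.ShrinkTrimmedSurface(srf)       # pulls the underlying surface toward the trim curves
u, v = rs.SurfacePointCount(srf)   # the control-point grid is still u x v, i.e. rectangular
print("Untrimmed control grid: {} x {} points".format(u, v))
```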
I'm getting charged per GB for SQL Server storage, so I have an incentive to use just as much storage as I need and no more. There are enough GB involved to create an attractive benefit to using less storage. I immediately think, "shrink the database." Then I think, "uh oh."
Nope, no progress. I got all tangled up in COVID-19 support for our agency, and am only fully back to my day job now. Actually, this item dropped through the cracks, and I need to get back around to it. I think I have some kind of trashy databases lying around that I can test on without concern. Thanks for the reminder. I'll post my results here when I have some.
Generally speaking, doing this with an Enterprise database is a huge no-no. While the process used does accomplish what you are looking to achieve, you're going to cause performance issues with larger and more heavily used databases by doing this.
@John_Spence - Are there any other ways around this problem that you may know of? Our database is similar to @TimMinter's in size, and we are up against the same problem. I'm having other database maintenance issues come up as a result of the size issue. I'm tempted to go ahead with a shrink since it worked for Tim. Tim, did you ever end up seeing any problems arise from the shrink process? And did you end up rebuilding the index? Thanks!
If you really really have no other option, you can shrink by releasing unused space after you compress the database (via catalog DB Admin tools). At that point, you have done all you can short of doing the unthinkable.
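If it helps, the "release unused space" step on its own can be scripted. This is only a hedged sketch: the connection string and the logical file name MyGDB_Data are placeholders, not values from this thread, and TRUNCATEONLY only gives back free space at the end of the file, so it avoids the page movement (and index fragmentation) of a full shrink:

```python
import pyodbc

# Placeholder server, database, and logical file name -- substitute your own.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=MyGDB;Trusted_Connection=yes;",
    autocommit=True,
)
# Release unused space at the end of the data file without relocating pages.
conn.execute("DBCC SHRINKFILE (N'MyGDB_Data', TRUNCATEONLY);")
conn.close()
```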
Even worse, the PostgreSQL WAL for this subscription ended up representing 4 TiB within a few days, and we had to stop it. Once things were back to normal, the extra data space was released, but the reserved space still lives there. Paying for 7 TiB instead of 3 TiB is really expensive and, as you can guess, adds no value to our business.
Unfortunately, directly shrinking a Cloud SQL database disk is not possible. While storage size can be increased, decreasing it is challenging due to the inherent limitations of the underlying storage system. However, there are several approaches you can take to optimize your data and migrate to a more efficient storage size:
Modified Use of Database Migration Service (DMS): Although DMS faced challenges with the PostgreSQL WAL in your initial attempt, a staged migration approach might be more effective. This involves using DMS to gradually migrate specific table data in batches to a new Cloud SQL instance with a smaller disk size. However, given the frequent schema changes and high data modification rate in your database, the success of this approach would depend heavily on these specific dynamics.
Hot Backup & Recovery with Additional Optimization: After performing a hot backup and restoring it to a new instance, consider using the VACUUM FULL command in PostgreSQL. This command can help reclaim unused space by defragmenting the database (a short sketch of this step follows these options). Be aware that VACUUM FULL can be time-consuming and requires significant downtime, as it locks the tables during the process.
Exploring Alternative Backup Tools: Tools like pg_basebackup or third-party solutions might offer faster backup and restore capabilities compared to pg_dump. While these tools can potentially reduce the downtime, the overall time required will still largely depend on the database size and network bandwidth. Additionally, these methods may not directly address the disk size reduction.
Cold Defragmentation Approach: Cold defragmentation using external tools like pg_repack involves exporting the data, defragmenting it offline, and then uploading it to a new Cloud SQL instance with a smaller disk size. This process is complex and requires a deep understanding of PostgreSQL. Its effectiveness in reducing disk size also varies based on the database's specific characteristics.
Creating a New Instance with Desired Disk Size: One effective way to reduce disk size is to create a new Cloud SQL instance with the desired smaller disk size and migrate your data to this new instance. This method involves backing up your data and restoring it to the new instance, which can help in achieving the disk size reduction you're aiming for.
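A minimal sketch of the VACUUM FULL step from the "Hot Backup & Recovery" option above, assuming psycopg2 and a placeholder connection string. VACUUM cannot run inside a transaction block, so autocommit is required:

```python
import psycopg2

# Placeholder DSN -- point it at the restored instance.
conn = psycopg2.connect("host=<new-instance-ip> dbname=mydb user=postgres")
conn.autocommit = True  # VACUUM must run outside an explicit transaction
with conn.cursor() as cur:
    # Rewrites tables to reclaim dead space; takes exclusive locks, so plan for downtime.
    cur.execute("VACUUM FULL VERBOSE;")
conn.close()
```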
Shrink is a 2009 American independent black comedy-drama film about a psychiatrist who treats members of the entertainment industry in Los Angeles, California. It was directed by Jonas Pate, written by Thomas Moffett, and stars Kevin Spacey with an ensemble cast.[3] The film premiered at the 2009 Sundance Film Festival[4] and includes music by Jackson Browne. Shrink received negative reviews from critics, who praised Spacey's performance but were critical of the film's scripting and directing.
In Hollywood, psychiatrist Dr. Henry Carter treats mostly luminaries in the film industry, each undergoing their own life crisis. Carter lives in a large, luxurious house overlooking the Hollywood Hills and has published a hugely successful self-help book. However, he is disheveled and lives alone. He smokes marijuana at home, in his car and behind his office when not seeing patients. Carter routinely drinks himself to sleep around his house, waking up in his clothes, but never enters his bedroom. Despite his own problems, Carter continues psychotherapy with his patients, maintaining his incisiveness, compassion and strong doctor-patient relationships.