# How can things be sped up for a high-resolution mesh?


### Maia R.

Feb 18, 2021, 8:46:10 AM
to scalismo
Hi,
What is the difference between:
```scala
val sampler = RandomMeshSampler3D(referenceMesh, numberOfPoints = 300, seed = 42)
val lowRankGP = LowRankGaussianProcess.approximateGP(gp, sampler, numBasisFunctions = 100)
val shapeModel = StatisticalMeshModel(referenceMesh, lowRankGP)
```
and
```scala
val lowRankGP = LowRankGaussianProcess.approximateGPCholesky(
  referenceMesh,
  gp,
  relativeTolerance = 0.01,
  interpolator = TriangleMeshInterpolator3D[EuclideanVector[_3D]]()
)
```

I have meshes with almost 70000 nodes, and when I apply a non-rigid registration (as in Tutorial 12) with a reference mesh of those 70000 points, it takes a very long time (several hours!). My steps are:
- create a multiscale low-rank GP with approximateGPCholesky
- create the PDM with the 70000-point reference mesh
- create a posterior model posteriorGP from some landmarks: this posterior model has a LowRankGaussianProcess and NOT a DiscreteLowRankGaussianProcess
- apply the doRegistration function with this posteriorGP, using
```scala
val registrationParameters = Seq(
  RegistrationParameters(regularizationWeight = 1e-1, numberOfIterations = 20, numberOfSampledPoints = 1000),
  RegistrationParameters(regularizationWeight = 1e-2, numberOfIterations = 30, numberOfSampledPoints = 1000),
  RegistrationParameters(regularizationWeight = 1e-4, numberOfIterations = 40, numberOfSampledPoints = 5000),
  RegistrationParameters(regularizationWeight = 1e-6, numberOfIterations = 50, numberOfSampledPoints = 10000)
)
```
I am wondering if I have to discretize the posteriorGP on the reference mesh and then interpolate it in order to reduce the computation time.

Thank you very much
Best regards,
Maia

### Marcel Luethi

Feb 23, 2021, 3:22:04 PM
to Maia R., scalismo
Hi Maia

Most of the operations in Scalismo scale linearly in the number of vertex points of the mesh, so if you can reduce the resolution of the mesh, you will see a large speedup. I usually work with meshes that have only between 2000 and 10000 vertices; I rarely encounter shapes that demand a higher resolution. So my first advice would be to downsample your reference mesh. The StatisticalMeshModel class has a method decimate, which can even do that on the fly. So you can build a model using the high-resolution mesh, decimate it before the registration, and then use the resulting coefficients to get the corresponding high-resolution surface. This will save you a lot of time.
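A rough sketch of this decimate-then-fit idea (the name `fullResModel` and the target vertex count are illustrative, not from this thread):

```scala
// fullResModel: a StatisticalMeshModel built on the 70000-vertex reference mesh.
// Decimate only for the expensive registration step.
val lowResModel = fullResModel.decimate(targetNumberOfVertices = 5000)

// ... run the registration against lowResModel; the optimization yields
// model coefficients (a DenseVector[Double]) ...

// The coefficients refer to the same low-rank basis, so the final surface
// can be generated from the full-resolution model:
// val highResFit = fullResModel.instance(coefficients)
```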

Also, you should run the computation of the model as a preprocessing step that is done independently of the registration. Compute the model, save it to disk (using StatisticalModelIO), and load it again in the registration program. This way you avoid having to compute the model multiple times.
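A possible sketch of that save/load split (the file name is illustrative; `StatisticalModelIO` lives in `scalismo.io`):

```scala
import java.io.File
import scalismo.io.StatisticalModelIO

// Preprocessing program: build the model once, then persist it.
StatisticalModelIO.writeStatisticalMeshModel(model, new File("shapeModel.h5")).get

// Registration program: load the precomputed model instead of rebuilding it.
val loadedModel = StatisticalModelIO.readStatisticalMeshModel(new File("shapeModel.h5")).get
```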

Best regards,

Marcel


### Maia R.

Feb 23, 2021, 5:26:01 PM
to scalismo
Hi Marcel,
Thank you very much, I will try this.
Best regards,
Maia

### Maia R.

Feb 23, 2021, 5:40:44 PM
to scalismo
Hi Marcel,
When downsampling the reference mesh before registration, how should I handle the landmarks for the posterior part? Or does this not matter?
Thank you very much,
Best regards
Maia


### Andreas Morel-Forster

Feb 24, 2021, 8:18:03 AM

Hi Maia

You could first calculate the posterior based on the landmarks and afterwards do the downsampling. This way you do not have to worry whether the landmark points are still present in the downsampled version, but your registration can still be guided by them.
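A minimal sketch of that ordering (the names, target vertex count, and noise variance are illustrative; `posterior` takes landmark constraints as `(PointId, Point[_3D])` pairs plus an observation-noise variance):

```scala
// landmarkData: IndexedSeq[(PointId, Point[_3D])] on the full-resolution reference
// sigma2: assumed landmark observation noise (the value here is illustrative)
val posteriorModel = fullResModel.posterior(landmarkData, sigma2 = 1.0)

// Decimate only afterwards: the landmark constraints are already baked
// into the posterior GP, so they need not survive the decimation.
val lowResPosterior = posteriorModel.decimate(targetNumberOfVertices = 5000)
```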

Best regards
Andreas

### Maia R.

Feb 24, 2021, 8:25:20 AM
to scalismo
Can I use a reference mesh decimated in external software (e.g. MeshLab), load it, and use it as the new reference mesh for registration?

### Andreas Morel-Forster

Feb 24, 2021, 8:39:13 AM

If you have a new enough version of Scalismo, such a function is already included, e.g. in StatisticalMeshModel:

```scala
/**
 * Changes the number of vertices on which the model is defined
 * @param targetNumberOfVertices The desired number of vertices
 * @return The new model
 */
def decimate(targetNumberOfVertices: Int): StatisticalMeshModel = {
  val newReference = referenceMesh.operations.decimate(targetNumberOfVertices)
  val interpolator = TriangleMeshInterpolator3D[EuclideanVector[_3D]]()
  val newGp = gp.interpolate(interpolator)

  StatisticalMeshModel(newReference, newGp)
}
```
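Calling it on an existing model is then a one-liner; a hedged usage example (the model name and target count are illustrative):

```scala
// model: an already-built StatisticalMeshModel
val lowResModel = model.decimate(targetNumberOfVertices = 5000)
```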

Best regards
Andreas

### Maia R.

Feb 24, 2021, 8:42:07 AM
to scalismo
Yes, I have the newest version.
Thank you very much.
Best regards.

### Maia R.

Mar 10, 2021, 2:05:13 PM
to scalismo
Hi Marcel,
"and then use the resulting coefficients to get the corresponding high-resolution surface. This will save you a lot of time."
How can I obtain and apply those coefficients from ICP?
For https://scalismo.org/docs/tutorials/tutorial12, as far as I understand, I apply the registrationTransformation to the high-resolution reference mesh:
```scala
val registrationTransformation = transformationSpace.transformationForParameters(registrationResult.parameters)
val fittedMesh = referenceMesh.transform(registrationTransformation)
```
But how about the non-rigid ICP?

Thanks a lot,
Best regards,
Maia
