SummerSim 2016: Investigating the Fidelity of an Improvement-Assessment Tool After One Vacuum Bell Treatment Session - Response for Reviewers


Jacob Barhak

unread,
Jun 8, 2016, 4:54:19 PM
to public-scien...@googlegroups.com
The response below was delivered by email by the author:

#####################

Dear Reviewer #1,

Thank you for your comments and questions.

Please see responses to your questions below:


The authors should add a few words on the topic of pre- and post-processing.

>>Scans are exported in .ply format from ReconstructMe and imported as-is into the chest surface comparison system (CSCS) software. After comparisons in the software, results are exported as .jpg images for display to the clinician, patient, and parents. This information was added to the second paragraph in the METHODS AND MATERIALS section.
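
For illustration only, the I/O side of this pipeline can be sketched in a few lines of Python. The trimesh/scipy/matplotlib stack and the file names below are assumptions made for the sketch; the actual CSCS implementation is not shown here.

# Minimal sketch (illustration only, not the CSCS code): load two .ply
# scans, compute per-vertex distances, and save a .jpg color map.
import numpy as np
import trimesh
from scipy.spatial import cKDTree
import matplotlib.pyplot as plt

pre = trimesh.load("scan_pre.ply")     # exported from ReconstructMe
post = trimesh.load("scan_post.ply")   # hypothetical file names

# Distance from each pre-treatment vertex to its nearest post-treatment vertex.
dist, _ = cKDTree(post.vertices).query(pre.vertices)

# Render the distance map and export as .jpg for clinician/patient display.
fig = plt.figure()
ax = fig.add_subplot(projection="3d")
sc = ax.scatter(*np.asarray(pre.vertices).T, c=dist, cmap="jet", s=1)
fig.colorbar(sc, label="surface distance (mm)")
fig.savefig("comparison.jpg", dpi=150)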

How do you handle outliers, which are quite common in 3D scanning?

>> Outliers are handled by averaging the closest 10 points around the deepest point. This information was added to the paper under the statistical test discussion in the STUDY DESIGN section.
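
As an editorial illustration of this averaging step, a short Python sketch follows. It assumes depth is measured along the z axis and that the scan is an (n, 3) numpy array; both conventions are assumptions, not details taken from the paper.

# Sketch of the stated outlier handling: average the 10 points closest to
# the deepest point to reduce scanner noise.
import numpy as np
from scipy.spatial import cKDTree

def smoothed_deepest_point(points, k=10):
    deepest = points[np.argmin(points[:, 2])]     # lowest z = deepest point
    _, idx = cKDTree(points).query(deepest, k=k)  # its k nearest neighbours
    return points[idx].mean(axis=0)               # average to suppress outliers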

How do you crop the scans? How do you select initial positioning before registration?

>>Scans are cropped based on the positions of manually selected anatomical landmarks (nipples, sternal notch, and navel), which allows for automatic positioning. This information was clarified in the Procedure discussion in the METHODS AND MATERIALS section.
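
A hedged sketch of landmark-based cropping is shown below: keep only the points inside a bounding box around the picked landmarks. The box-with-margin approach and the 20 mm margin are illustrative assumptions, not the published CSCS logic.

# Illustrative sketch: crop a scan to a bounding box around the manually
# picked landmarks (nipples, sternal notch, navel), plus a margin.
import numpy as np

def crop_to_landmarks(points, landmarks, margin=20.0):
    lo = landmarks.min(axis=0) - margin   # box corners from the landmarks
    hi = landmarks.max(axis=0) + margin
    keep = np.all((points >= lo) & (points <= hi), axis=1)
    return points[keep]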

Is it fully automated or is there much human intervention?

>>Clinicians need to select the scan files to compare, select the four anatomical landmarks on each scan, and choose a registration and comparison method or utilize the default values. The manual and automatic tasks were described more clearly in the added text.

Also, did you try to perform multiple separate scans and compare the results to establish repeatability?

>> Such comparisons were performed utilizing objects of known size and published at the 2016 Medicine Meets Virtual Reality conference; that publication is referenced in the paper.

Looking at Figure 2, it seems that the differences become larger on the left and smaller on the right, as if the scans are tilted a bit. Is it possible that registration failed in this case? Or are you showing only part of the distance color map while registration took into account many more points not shown?

>>PE patients frequently have an asymmetric pattern associated with the chest deformity. It is likely that this shift in color reflects the asymmetry in this patient. This information was clarified in the Obtained Results portion of the RESULTS AND DISCUSSION section.

Your conclusions are interesting, stating that there is not much difference between the new method and the human measurement. However, you show the difference in one patient that seems significant. Does this imply that the human ground truth measurement is incorrect? Is it reasonable to deduce that there was human error in the ground truth measurement? Or is this a machine algorithm error in this case?

>> In the first paragraph of the STUDY DESIGN section, we state an inherent error in the human ground truth measurement of about ±0.5 to 1 mm due to the coarse scales on the dowel measurement tool. Moreover, additional errors may creep in depending upon the pressure applied by the dowel when placed on the skin, as this could cause a slight depression in the skin surface. There is also machine error involved in the resolution of the scanning as well as noise. Machine error will be a subject of an upcoming paper in which we compare different versions of the scanning sensor.

If you eventually decide to use the new method in practice, will it complicate the process for the human handling it and require special setup and training? And what will be the benefit of the extra effort in case the process is fully automated?

>> The system is currently deployed and in use in a clinical environment at CHKD, where training was provided. The system provides several default values, so minimal manual interaction is needed. The most significant effort is in the selection of the surface landmarks for cropping.

 

 

Frederic (Rick) D. McKenzie, Ph.D.

Professor and Chair

Modeling, Simulation, & Visualization Engineering Dept.

Joint Appointment, Electrical & Computer Engineering Dept.

Old Dominion University

1307 ECSB, 4700 Elkhorn Avenue

Norfolk, VA 23529

(757) 683-5590

(757) 683-3200 (fax)

rdmc...@odu.edu

www.odu.edu/msve

Jacob Barhak

unread,
Jun 10, 2016, 5:42:11 PM
to public-scien...@googlegroups.com
In addition to my review, the following conversation took place between the author and the second reviewer. It is added here in reverse chronological order for the record.

##################

On Fri, Jun 10, 2016 at 10:41 AM, Matt Jacobson <matt.w....@gmail.com> wrote:

OK, I've modified my review to "Strong Accept". I didn't nominate for Best Paper because there was no option to abstain. As a new reviewer to SummerSim, I'm not familiar enough with the standard for that.

Matt

On Fri, Jun 10, 2016 at 11:16 AM, Jacob Barhak <jacob....@gmail.com> wrote:

So Mathew,

Your review should be recorded in the START system where you uploaded your initial review.

If you are ready to accept, mark it in the review form and you can mention the conversation. I will make sure this transcript is uploaded to the system and made public.

Once accepted, I need to mark it in the system and Rick will receive a message asking to upload the camera ready version and a few forms.

Many thanks for being available on short notice and for the rapid responses.

Jacob

On Jun 10, 2016 10:08 AM, "Matt Jacobson" <matt.w....@gmail.com> wrote:

I'm happy with the revision, but am not sure if I should be doing anything until Rick manages to upload it to START.

Matt

On Thu, Jun 9, 2016 at 6:06 PM, Mckenzie, Rick <rdmc...@odu.edu> wrote:

Dear Jacob, Matthew,

START is not allowing me to see options for submitting a revision, but please see the enclosed updated paper.

Best Regards,

Rick


From: Mckenzie, Rick
Sent: Thursday, June 09, 2016 5:10 PM
To: 'Jacob Barhak'; Matt Jacobson
Subject: RE: Late Submission SummerSim Paper

Ok. Will do, Jacob.

Hopefully, I have the capability to submit the new revision now.

We did not speak on the phone.

Best Regards,

Rick


From: Jacob Barhak [mailto:jacob....@gmail.com]
Sent: Thursday, June 09, 2016 5:08 PM
To: Matt Jacobson
Cc: Mckenzie, Rick
Subject: Re: Late Submission SummerSim Paper

So Rick,

It seems it is worthwhile to create a new revision of the paper incorporating all those changes and upload it to the START system.

Matthew can then inspect the revised product and, if satisfied, update his review through the START system.

You are still welcome to converse, yet please keep me in the loop since this conversation should be public. Unless you record the phone call and later send me the recording, please avoid phone conversations until the end of the review.

If you already talked on the phone, please write down minutes and send those to me.

I will handle making this process public.

Hopefully the process can be completed quickly before the weekend.

Jacob

On Jun 9, 2016 3:53 PM, "Matt Jacobson" <matt.w....@gmail.com> wrote:

I think that will work.

On Thu, Jun 9, 2016 at 4:38 PM, Mckenzie, Rick <rdmc...@odu.edu> wrote:

Yes. That is a good point, and we can update exactly that in the intro, i.e., “the current standard of practice (dowel measurement) has similar accuracy at a single point but offers no surface profile and suffers from dramatic errors due to user inconsistencies.”

If this works, we can update the paper with it and the items mentioned below.

Best Regards,

Rick


From: Matt Jacobson [mailto:matt.w....@gmail.com]
Sent: Thursday, June 09, 2016 4:24 PM
To: Mckenzie, Rick
Cc: Jacob Barhak
Subject: Re: Late Submission SummerSim Paper

Hi Rick,

This sounds reasonable and would largely address my criticisms. One other thing I would mention is that, as a lay reader of this clinical application, it was a little unclear to me what level of accuracy is "accurate enough". I get vague hints in the paper that you are striving for 0.5 mm to 1 mm accuracy, but I think it would help to quantify it up front in the Introduction. Furthermore, if you're saying that the dowel has this accuracy at a single point, but offers no surface profile and suffers from drawbacks x, y, and z, then having that in the Intro would make it very plain for me why it was used as the standard for comparison.

Matt

On Wed, Jun 8, 2016 at 6:43 PM, Mckenzie, Rick <rdmc...@odu.edu> wrote:

Dear Matthew,

Sorry for contacting you directly but I am not yet able to submit the reply within the review system.

Hopefully, we can address your concerns.

Please see comments below:

1. The authors need to carefully justify selecting the method of manual measurement with cylindrical rulers as the ground truth against which their system is tested.

>> Please note that this is the main method in use at the clinic we collaborate with. We hope to have them transition to this new method completely once we have shown evidence of its effectiveness.

This method is listed in their Introduction as having drawbacks in accuracy - drawbacks they seek to outperform. It reads very strangely, therefore, that they would strive to perform equivalently to this method in the tests described later in the paper, as opposed to Haller Index measurements, which the authors cite as the Gold Standard.

>> Yes. The dowel measurement provides one value at the deepest point of the chest, while our method provides a surface map of improvement over the target pectus region. This allows the surgeon to compare the deformity as a whole, and allows the patient and parents to see a visualization of the improvement. We hope this will contribute to understanding of the treatment and, in the vacuum bell case, encourage patients to continue to faithfully utilize the device, which they need to do every day for at least eighteen months. We can add this to the paper to better explain this aspect.

It would indeed have been interesting to see if their method did as well as HI, without the need for ionizing CT radiation.

>> It is known that HI does not perform well post-surgery, and we are now performing studies that may provide an alternative to HI.

Finally, since the Conclusion of the paper states "there exists no significant difference between the hand-measured and surface scan-measured values", the reader is left wondering why not prefer the status quo hand-measurement technique, since this must be far cheaper than the surface-scanning equipment proposed by the authors.

>> This is a good observation. The discussion above, which indicates the additional features provided as well as the additional errors introduced by the pressure of the dowel on the patient's skin, can be added to the conclusion to show that the new method is at least as good in fidelity while adding more capability and reducing the possibility of user error.

2. Earlier work on optical scanning methods cited by the authors (Glinkowski et al. and Poncet et al.) makes it difficult to gauge the originality of the present paper. The authors do comment on some drawbacks of the earlier work ("stationarity", "shaded-colorless models"), but the import of these drawbacks is not clear. Since the paper is about measurement fidelity, why not compare on that basis? There must be something to say because the authors mention that the earlier papers compared themselves to HI.

>> Yes. In this paper, we are looking at an improvement measurement and not at a measure of severity as provided by the HI. Perhaps we can better clarify this in the paper.

3. The description of the registration technique "non-rigid registration for alignment and iterative closest point for refinement" was not clear. I think they mean that iterative closest point was used with a non-rigid deformation model. If so, it would be better to say that.

>> We would be happy to clarify this in the paper.
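
For readers unfamiliar with the refinement step, a minimal rigid ICP iteration is sketched below in Python (numpy/scipy). This shows only the standard rigid variant as an editorial illustration; the non-rigid deformation model the paper pairs it with is not reproduced here.

# Minimal rigid ICP sketch: alternate closest-point matching with a
# best-fit rigid transform (Kabsch algorithm) until convergence.
import numpy as np
from scipy.spatial import cKDTree

def icp_rigid(src, dst, iters=50):
    tree = cKDTree(dst)
    for _ in range(iters):
        _, idx = tree.query(src)           # closest-point correspondences
        matched = dst[idx]
        sc, mc = src.mean(axis=0), matched.mean(axis=0)
        H = (src - sc).T @ (matched - mc)  # cross-covariance matrix
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T                     # best-fit rotation
        if np.linalg.det(R) < 0:           # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        src = (src - sc) @ R.T + mc        # rotate about centroid, translate
    return src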

Best Regards,

Rick

